I’m enjoying a new app that my friend, Simon, shared with the CLMOOC community. It’s called Fused, and it allows you to blend images together (and maybe blend videos onto images? I need to explore that, but I think Simon did it).
We have a postcard project going in CLMOOC, too, and this week, I received two different postcards on two different days (a total of four postcards in three days!). I tried the blending technique with Fused to pull together the pair of postcards from each day, and the result is pretty lovely.
This is from postcards that arrived from Karen and Stephanie
I’ve had the Science Journal app on my phone (Android) for some time now, and every so often, I pull it out to play with it. But last night, as my new/old band began to play for the first time in over a year with a PA system and guitar amps (long story short: we lost our singer and bass player and practice space, went acoustic, found a new practice space, and are looking for a singer and bass player), I wondered what the sound levels were.
Right before we started our first song of the night in our new practice space — Love Potion Number 9 — I put the Science Journal app into motion, capturing and recording the decibel levels in the room. Yeah, it was loud. Our drummer has been waiting a long time to pound on his skins (as opposed to the electronic drums he has been using). He pounded away.
But it was neat to see the spikes of the song in Science Journal later on. I could see where the solos were, and where the song dipped into the break part, and more. I could see where the decibels clipped maybe a bit too high.
It made me wonder about that 85 dB range that we hit. So I tracked down this chart. No wonder our lead guitar player wears special “in-ear” plugs. We hit 737 sounds!
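For a rough sense of what those readings mean: decibels are a logarithmic scale, so every 20 dB step is a tenfold jump in sound pressure. Here is a small sketch of the standard formula (the 20 micropascal reference is the usual threshold of hearing; the function name is my own):

```python
import math

def spl_db(p_rms, p_ref=20e-6):
    """Sound pressure level in dB, relative to the 20 uPa threshold of hearing."""
    return 20 * math.log10(p_rms / p_ref)

# Ten times the sound pressure adds 20 dB:
print(round(spl_db(0.002)))   # -> 40
print(round(spl_db(0.02)))    # -> 60
```

Which helps explain why a loud band practice around 85 dB feels so much louder than a conversation: the scale compresses huge pressure differences into small-looking numbers.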
This was sort of fun. I guess. I heard about a music video mobile app called Chosen that is becoming popular with young people (though, typically, not being used in the way it was built for). Kids use it for lip-syncing videos, and the company just got rights to millions of songs, I guess.
I figure it’s always a good idea to get a handle on what is becoming popular (still have yet to do Snapchat, though, so maybe take my pronouncements with a grain of salt) and so I dove into Chosen.
It’s pretty simple to use. You can record your voice or music, or choose music (this is the lip-sync method). There are some funny overlays you can choose. You hit “record” and do your thing. The video gets saved to your device and you can share it out. Or you can share it within the Chosen ecosystem. (Note: the folks might want to, eh, choose a new name. I could not shake a religious theme from my mind when hearing the name of the app.)
Here I am, grooving to Justin Timberlake’s song of the summer on pop radio:
Give it a try. See what you think. Maybe we can do a lip-sync competition during CLMOOC?
Well, now … this is some sort of magic. I had a songwriting friend who urged me to check out this new free app by Apple called Music Memos, and I finally got around to it yesterday. Yes, it is pretty nifty. You can record ideas, and not only will the app record (not all that special, really), it will lay out the chord structure (see image), and allow you to add in automated bass and drums.
So, the musical idea takes shape in the app. Sort of. It’s not perfect (the drums start off and end rather awkwardly, and the bass doesn’t always want to be in tune with the song), but it is a great way to “jot down” musical ideas and at least hear them begin to come into formation. I jangled in a few chords and was pretty impressed with the results.
Did I mention this one is for free?
Here is a song I did in the app, moved to GarageBand for a slight mix, then into SoundCloud, and then into Zeega …
Yesterday, I shared out a bit about the twists and turns of a poetry project that involved friends from around the world, postcards sent and found and lost, and an original poem that I had broken apart and parsed out, with an invitation to friends to reconstruct the words and phrases on a digital wall.
Today, I wanted to find a way to share out the original poem, particularly now that we have opened up the wall and others are jumping in, adding words and media and more. The poem is receding in that space as collaborative sharing surfaces. I love that movement, but I still want to keep the original poem intact.
The video above is that poem, told as a digital poem, and with a bit of App Smashing on the iPad, too. I used the Legend app to make the animated text pieces (with background images as screenshots of the wall), and then I moved those short videos into the iMovie app to create a single video. Meanwhile, the guitar music is a piece that I wrote and recorded with the new Music Memos app the other night, mixed in the GarageBand app, and then moved into the iMovie app as a soundtrack. From there, I uploaded the video to YouTube. Phew. It seems like a lot to juggle, but it all worked rather seamlessly to create what I had in mind.
I debated whether to narrate the poem with voice and then decided against it, letting the words speak for themselves on the theme of poetry writing as collaborative endeavor, and using the music to create an emotional underpinning of the poem. (I also am considering asking folks to record their word or phrase, and then stitching our voices together like a quilt …)
This is not a review, per se, but a sharing of my various interpretations of the theme (as I understand it) behind Nick Sousanis’ interesting graphic novel/dissertation Unflattening. (This book was suggested by my friend Ron in the Rhizomatic Learning space, and then Susan mentioned she had read it and so did Wendy, and then Terry got the book and began doing his own interpretations and then Greg just got the book but knew of the work and … meanwhile, Sousanis himself has been engaged in the conversations on Twitter about our observations of his work … all pretty fascinating in and of itself)
Honestly, I will need to read Unflattening again, and maybe a few more times, to gather up all of the nuances of thinking, but Sousanis puts forth ideas about how to break free of a narrow vision of the world and art and meaning by reminding us that we need to better see how image and art and other perceptions come into play when navigating the world. His use of the comic/graphic story format is incredibly engaging and interesting, and perfectly suited for this kind of philosophical journey.
While reading, I kept wondering how to represent my own thinking as the reader (following Terry’s lead) in non-traditional ways. How could I “unflatten” my own experiences with the book?
“Unflattening is a simultaneous engagement of multiple vantage points from which to engender new ways of seeing.” — Sousanis, page 32
I began, as I usually do but which seemed very appropriate here, with a comic and a remix. I took a page from Unflattening and added my own layer of comic characters, making commentary on the content of the page. My idea was not to lessen Sousanis’ message, but to strengthen it by showing how a reader can interact with text.
Still, the remix comic exists in flat space.
I started thinking, Sousanis should have an Augmented Reality layer to the book, which would create an invisible layer of information and maybe more insights on top of the book as it exists. If we all had Google Glass, we might read books in a different way …
This led me back to the Aurasma app, which I have toyed around with before, to see if I could add a layer of commentary via video on top of the book itself. I was quickly reminded how complicated it is to share “auras” (as the app calls them), but I finally figured it out. (The app is native to your device; if you want to share auras you create, you need to set up a folder at the website, load your project there, and then share out the link. Those who have the app can use the link, which opens up the app on their device and sets off the “aura” when they point their camera at the object, which in this case is Unflattening.)
Here, then (I hope) is the link you can use to get to my “aura” of Unflattening. Don’t have the book? No problem. Use the image of the book’s cover here as your object for launching the aura. On your mobile device, click on the link below, which should launch the app, and then point your camera at the image in this post (OK, so that might require some device juggling. Be safe out there, people.) Ideally, a video of me should emerge in the augmented layer of the book’s cover. I hope it works for you. It did for me, when I tested it. If not, the above screenshot is pretty nifty, with the illustration web of footprints running through my face (and what’s up with my eyes? I must be in the midst of some keen perceptions there).
It also occurred to me that I could use a nifty tool in the Firefox browser that lets you get a 3D look at websites, and that I could use that tool to look at Sousanis’ own website where he writes about the writing of Unflattening. I love how he uses the last part of his book to talk about what influenced individual pages. I am a sucker for “behind the scenes” of writers. In using the 3D view tool in Firefox, I would be making the leap from the book to the author writing about the book that I was reading, and I would be using yet another lens to see what he was writing about. Maybe. I’m not sure it succeeded on that level, but it is still an intriguing look at how to use “multiple engagement points” to look at the web. I took a tour.
Meanwhile, Terry and Greg and I and a few others are working on a media annotation of a page in Unflattening, with Sousanis’ permission (although, to be frank, we would have done it anyway, as that is the reader’s prerogative, but we let Sousanis pick the page from his book he would like us to annotate because the relationship between reader and writer is always an interesting one to explore. I wonder how Nick feels about all this.)
… from one of the folks (Biz Stone) who launched Twitter comes Super … which is sort of a collage/media app, merging words with images. The free app is easy to use — you start with a list of starter words/prompts, write what you want to write, choose an image from its recommended files (via keywords), tinker with the image and words, and then publish.
Finished works on Super can be easily shared in other social media, and the visual element makes it worth checking out. Like Twitter, this is “short form” writing — every word counts, or else the page gets cluttered. Most of the time I spent with Super was figuring out the right image to go with my words, and then worrying about the visual design element of the piece. I have no idea where Super might be heading, in terms of its flow and sense of possibilities. It does seem a little artsy-whacky right now, but I am fine with that.
They also added a new feature called Strips, which allows you to tie together a few Supers into a sort of comic strip narrative. I have not yet given that a go, but I will.
(Note: This is a Slice of Life, facilitated by Two Writing Teachers. Slice of Life is a weekly writing activity. You write, too.)
It’s not that I didn’t have plenty of shoveling to do yesterday. I did. I did. But a snow day yesterday also gave me time to play around with an app that I had put on my iPad the other day, thanks to Paul Hamilton. Adventure Creator is a “make your own adventure” interactive fiction maker and I am still working to figure it out (Paul helped with a short video tutorial).
I’ve worked with Twine (which is free) and played around with some other “make your own adventure” — or interactive fiction — creators, such as Inklewriter. This Adventure Creator app seems intriguing, although it costs almost 4 bucks so I am not sure it is reasonable for an entire classroom.
Still, I dove in, played around and began making an interactive story about Rhizomatic Learning, as I gear up for the upcoming Rhizo15 online gathering that is slated to begin in March (I think). My idea is to create an interactive story, exploring a bit of rhizomatic thinking. Mostly, I hope it helps me better understand the concept. Even a year after Rhizo14, and the continued connections all year, I am still a bit fuzzy on this kind of interlacing and inter-tangent thinking of learning practices, although I know enough about it to know there is something there.
Adventure Creator allows you to build out a story, and I think you can add objects, but I have a lot to learn. I am grateful for Paul’s video guidance, and now I need to dig into the app and find tutorials. Constructing a text-style “make your own adventure” story requires planning and thinking, but I think it could be cool.
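To make that planning concrete: under the hood, a branching story is really just a graph of passages connected by named choices, which is roughly how tools like Twine model it. A minimal sketch (the passages and choice names here are my own invention, not from Twine or Adventure Creator):

```python
# Each node maps to (passage text, {choice name: next node}).
story = {
    "start": ("You stand at the edge of the rhizome.",
              {"dig": "roots", "walk": "path"}),
    "roots": ("The roots tangle in every direction. The end.", {}),
    "path":  ("The path loops back on itself. The end.", {}),
}

def follow(choices, node="start"):
    """Walk the story graph along a sequence of choices, collecting passages."""
    text, options = story[node]
    passages = [text]
    for choice in choices:
        node = options[choice]  # KeyError if the choice isn't offered here
        text, options = story[node]
        passages.append(text)
    return passages

print(follow(["dig"]))  # the start passage, then the "roots" ending
```

Mapping a story out as nodes and choices like this, on paper first, is most of the planning work before touching the app itself.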
I posted this comic yesterday to the Walk My World twitter stream because a series of tweets had me laughing.
Greg, over at Walk My World, then asked if I might create a tutorial on the comic strip app that I use quite a bit these days — Comics Head. Sure, I thought, and then realized it could be a bit subversive, too. So the tutorial is a comic making fun of making a tutorial of the making of a comic.
Head spinning? Yeah.
Then, in the spirit of the YouShow15 project and its emphasis on the Director’s Cut of making media, I used the audio feature in the app (which is a cool new function) to create a fake “Director’s Cut” of the making of the comic … I won’t do the whole recursive thing again.
At the National Writing Project Annual Meeting in November (this post has been in my draft box for a bit of time), I attended a session by a representative of GlassLab Games, which has been working in a partnership with NWP folks to develop a video game app designed to teach elements of argument to middle school students.
The game is called Mars Gen One: Argubot Academy, and it is a free app from the Apple Store. Mat Frenz, of GlassLabs, was very knowledgeable about game mechanics, and about why games are a natural way to pique the curiosity of students. He notes that good games can be an “engagement bridge” for students to learn difficult material, and the hope for Argubot Academy is that players “will master the mechanics of argument with the same passion as mastering the mechanics of Pokemon.” The game developers built some of the mechanics and look/feel/design of the game with echoes from the Pokemon universe.
Mars Gen One: Argubot Academy has a narrative of science, as the player is on a discovery mission and is forced to create “argubots” that are powered by the strands of strong argument claims and evidence. The player asks questions, explores the spaceship, and then goes into “battle” against others with their argubots, seeing if their claims and evidence are strong enough to hold up to scrutiny. A teacher account allows you to track the progress of students, and it charts out the strengths and weaknesses of each individual player/student. That is all handy information.
I played the game a bit over the summer, when it was first released and promoted via NWP and Educator Innovator, and then again during the session, as Mat gave us an overview and tour of the game itself. I know a lot of teachers in the room were excited about it. I have my slight reservations. First of all, my classroom does not have iPads, so for all practical purposes, the game is not in our future. I also found the game a bit too wordy, knowing my students as I do, although when I mentioned this in conversation with other teachers in the session, they disagreed with me. So, maybe it is my own perception. I am also not sure it would engage my students over multiple sessions, although Mat shared testimonials from teachers using the app, praising it as a tool for engagement.
But, don’t listen to me. Give the app a try. It’s free, and a lot of thought has gone into the development. It might just work for you, particularly as we shift into higher gear away from persuasion and deep into argument. The game might be just the hook for your students.