A few weeks ago, for the #CLMOOC DigiWriMo Pop Up Make Cycle, the focus was on animation. There are all sorts of apps that allow you to animate now, and StickNodes is one of my favorites (I paid the $1.99 for the Pro version). It’s an updated take on an old piece of freeware that I used to use with students called Pivot Animator. When we shifted to Macs, I had to move away from Pivot (it is PC-only freeware) and tried Stykz for a bit.
StickNodes Pro is pretty easy to use, and has a lot of powerful features for animating stick figures. It’s also pretty darn fun to use. You can create and then export your animation as video or gif files, which can be hosted elsewhere.
Here is one of my early experiments: Stickman Walking. (I had uploaded it into Vine, which you can no longer do)
No surprise that there are tutorial videos on YouTube for using the app. Here is the first in a series done by this person.
Give it a try. Or try some other app, and let us know. We’re animating this week!
My 11 year old son has really enjoyed making videos with the Musical.ly app this summer. If you don’t know what that is, Musical.ly is a lip-syncing app, which provides short bursts of song for which the user creates an equally short video. Most people lip-sync the song and share within the community. Tons of kids are using it.
(Note: I later asked the boys about it. It was done in the app. So much for hacking the app for creative video editing.)
I also get a little antsy, though, about how my son, entering his middle school years, gets caught up in the number of viewers and likes and all of the other baggage of social media (thumbs up, plus, etc.) on his work (not just here in this app but also on other platforms) that doesn’t really signify much in reality. He sees it as “someone is watching what I am making,” but I suspect it is an ego thing, too. He wants to be popular, and he sees technology as one of the ways to be “cool.”
We’re already overhearing conversations about “how many views do you have?” and “how many videos have you made?” as if it were all a cold numbers game.
We’re trying to temper that impulse for “likes without context” with discussions at home, as best we can. He’s always enjoyed making movies, and has regularly written and produced videos himself with friends. (In fact, he is finishing up the editing of a video that he and this same friend shot over the weekend.)
This Musical.ly app is designed for short videos, and I hope that it doesn’t suck dry the creative fountain for his desire to make longer video productions. I hope, instead, it gives him more ideas that he can use elsewhere.
And sure, he’s having fun with it. That’s important, too.
I am intrigued by ways in which social media and technology bridge the gaps between real space and virtual space. So, augmented reality apps are interesting (if still a bit complicated to use). Virtual reality ideas pique my interest. So when I saw someone share out about the app, Gum, I thought I would give it a shot.
Gum is an app that uses the bar codes on products (such as food) as a means to leave comments and texts related to those products. So, for example, if I open up the Gum app and connect it to the bar code of my older son’s favorite food — Ramen Noodles, chicken-flavor — I can leave some thoughts about the noodles. Anyone who uses the Gum app and scans in the bar code of Ramen Noodles (chicken) will now see the comment I left there about alternative uses for Ramen noodles.
And they can leave their own comment, too.
(Try scanning the bar code here of my Ramen Noodle package)
I left a question on the other kid-fave food in our house — Annie’s Mac and Cheese (purple box). Maybe someone will answer it in the future, using the Gum app. I’ll get a message if someone does. (I hope someone does).
(Try scanning the bar code of my Annie’s Mac and Cheese box)
I had thought, initially, that I could maybe upload an image with my text (and even crafted a comic for the noodles), but I guess it is just text connections across social media space with real objects. I’m still interested, and wonder how a network like CLMOOC might tap and hack into this kind of app for some connected project. How might we riff together with a connection to physical space with Gum?
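As a thought experiment for how CLMOOC might riff on this, the core idea behind Gum can be sketched in a few lines: the product’s barcode number acts as a shared key, and anyone scanning the same code sees the same thread of comments. (This is a minimal sketch of the concept only — the names, and the barcode, are hypothetical, not Gum’s actual API.)

```python
from collections import defaultdict

# Each barcode number maps to a shared list of (author, comment) pairs.
comments = defaultdict(list)

def leave_comment(barcode, author, text):
    """Attach a comment to a physical product via its barcode."""
    comments[barcode].append((author, text))

def scan(barcode):
    """Everyone who scans the same barcode sees the same thread."""
    return comments[barcode]

# Hypothetical barcode standing in for the Ramen Noodles package:
leave_comment("012345678905", "Kevin",
              "Try crushing the noodles for a crunchy salad topping.")
for author, text in scan("012345678905"):
    print(f"{author}: {text}")
```

The interesting design choice is that the barcode itself is the only “address” — no account linking or product database is needed for the simplest version of the idea.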
I’m enjoying a new app that my friend, Simon, shared with the CLMOOC community. It’s called Fused, and it allows you to blend together images (and maybe blend videos onto images? I need to explore that, but I think Simon did it).
We have a postcard project going in CLMOOC, too, and this week, I received two different postcards on two different days (a total of four postcards in three days!). I tried the blending technique with Fused to pull together the pair of postcards on each day, and the result is pretty lovely.
This is from postcards that arrived from Karen and Stephanie
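Under the hood, “blending” two images is conceptually simple: each output pixel is a weighted average of the corresponding pixels from the two sources. (Fused’s actual method isn’t documented here — this is just the simplest form of the idea, with images stored as plain lists of RGB tuples.)

```python
def blend_pixel(p1, p2, alpha=0.5):
    """Blend two RGB pixels; alpha=0 gives p1, alpha=1 gives p2."""
    return tuple(round((1 - alpha) * a + alpha * b) for a, b in zip(p1, p2))

def blend_image(img1, img2, alpha=0.5):
    """Blend two same-sized images stored as lists of RGB tuples."""
    return [blend_pixel(a, b, alpha) for a, b in zip(img1, img2)]

# Two one-pixel "postcards": pure red and pure blue blend to purple.
print(blend_image([(255, 0, 0)], [(0, 0, 255)]))  # [(128, 0, 128)]
```

Sliding alpha toward 0 or 1 is what lets one postcard “show through” the other more strongly.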
I’ve had the Science Journal app on my phone (Android) for some time now, and every so often, I pull it out to play with it. But last night, as my new/old band began to play for the first time in over a year with a PA system and guitar amps (long story short: we lost our singer and bass player and practice space, went acoustic, found new practice space, looking for singer and bass player), I wondered what the sound levels were.
Right before we started our first song of the night in our new practice space — Love Potion Number 9 — I put the Science Journal app into motion, capturing and recording the decibel levels in the room. Yeah, it was loud. Our drummer has been waiting a long time to pound on his skins (as opposed to the electronic drums he has been using). He pounded away.
But it was neat to see the spikes of the song in Science Journal later on. I could see where the solos were, and where the song dipped into the break part, and more. I could see where the decibels clipped maybe a bit too high.
It made me wonder about that 85 dB range that we hit. So I tracked down this chart. No wonder our lead guitar player wears special “in ear” plugs. We hit 737 sounds!
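That 85 dB figure is worth pausing on: decibels are logarithmic, so each +10 dB is roughly a tenfold jump in sound intensity (and 85 dB is commonly cited as the threshold where prolonged exposure starts to risk hearing damage). A quick back-of-envelope calculation shows why the earplugs make sense — the 60 dB comparison level here is just a typical conversation, an assumption for illustration:

```python
# Decibels are logarithmic: every +10 dB is about 10x the sound intensity.
def intensity_ratio(db_a, db_b):
    """How many times more intense is level db_a than db_b?"""
    return 10 ** ((db_a - db_b) / 10)

# A typical conversation sits near 60 dB; our practice space hit ~85 dB.
print(round(intensity_ratio(85, 60)))  # ~316x the intensity of conversation
```

So a 25 dB difference on the chart is not “a bit louder” — it is hundreds of times more acoustic intensity hitting the ear.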
This was sort of fun. I guess. I heard about a music video mobile app called Chosen that is becoming popular with young people (though, typically, not necessarily in the way it was built for). Kids use it for lip-syncing videos, and the company just got rights to millions of songs, I guess.
I figure it’s always a good idea to get a handle on what is becoming popular (still have yet to do Snapchat, though, so maybe take my pronouncements with a grain of salt) and so I dove into Chosen.
It’s pretty simple to use. You can record your voice or music, or choose music (this is the lip-sync method). There are some funny overlays you can choose. You hit “record” and do your thing. The video gets saved to your device, and you can share it out. Or you can share it within the Chosen ecosystem. (Note: the folks might want to, eh, choose a new name. I could not shake a religious theme from my mind when hearing the name of the app.)
Here I am, grooving to Justin Timberlake’s song of the summer on pop radio:
Give it a try. See what you think. Maybe we can do a lip-sync competition during CLMOOC?
Well, now … this is some sort of magic. I had a songwriting friend who urged me to check out this new free app by Apple called Music Memos, and I finally got around to it yesterday. Yes, it is pretty nifty. You can record ideas, and not only will the app record (not all that special, really), it will lay out the chord structure (see image), and allow you to add in automated bass and drums.
So, the musical idea takes shape in the app. Sort of. It’s not perfect (the drums start and end rather awkwardly, and the bass doesn’t always want to be in tune with the song), but it is a great way to “jot down” musical ideas and at least hear them begin to come into formation. I jangled in a few chords and was pretty impressed with the results.
Did I mention this one is free?
Here is a song I did in the app, moved to Garageband for a slight mix, then into Soundcloud, and then into Zeega …
Yesterday, I shared out a bit about the twists and turns of a poetry project that involved friends from around the world, postcards sent and found and lost, and an original poem that I had broken apart and parsed out, with an invitation to friends to reconstruct the words and phrases on a digital wall.
Today, I wanted to find a way to share out the original poem, particularly now that we have opened up the wall and others are jumping in, adding words and media and more. The poem is receding in that space as collaborative sharing surfaces. I love that movement, but I still want to keep the original poem intact.
The video above is that poem, told as digital poem, and with a bit of App Smashing on the iPad, too. I used the Legend app to make the animated text pieces (with background images as screenshots of the wall), and then I moved those short videos into the iMovie app to create a single video. Meanwhile, the guitar music is a piece that I wrote and recorded with the new Music Memos app the other night, mixed in the Garageband app, and then moved into the iMovie app as the soundtrack. From there, I uploaded the video to YouTube. Phew. It seems like a lot to juggle, but it all worked rather seamlessly to create what I had in mind.
I debated whether to narrate the poem with voice and then decided against it, letting the words speak for themselves on the theme of poetry writing as collaborative endeavor, and using the music to create an emotional underpinning of the poem. (I also am considering asking folks to record their word or phrase, and then stitching our voices together like a quilt …)
This is not a review, per se, but a sharing of my various interpretations of the theme (as I understand it) behind Nick Sousanis’ interesting graphic novel/dissertation Unflattening. (This book was suggested by my friend Ron in the Rhizomatic Learning space, and then Susan mentioned she had read it and so did Wendy, and then Terry got the book and began doing his own interpretations and then Greg just got the book but knew of the work and … meanwhile, Sousanis himself has been engaged in the conversations on Twitter about our observations of his work … all pretty fascinating in and of itself)
Honestly, I will need to read Unflattening again, and maybe a few more times, to gather up all of the nuances of thinking, but Sousanis puts forth ideas about how to break free of a narrow vision of the world and art and meaning by reminding us that we need to better see how image and art and other perceptions come into play when navigating the world. His use of the comic/graphic story format is incredibly engaging and interesting, and perfectly suited for this kind of philosophical journey.
While reading, I kept wondering how to represent my own thinking as the reader (following Terry’s lead) in non-traditional ways. How could I “unflatten” my own experiences with the book?
“Unflattening is a simultaneous engagement of multiple vantage points from which to engender new ways of seeing.” — Sousanis, page 32
I began, as I usually do but which seemed very appropriate here, with a comic and a remix. I took a page from Unflattening and added my own layer of comic characters, making commentary on the content of the page. My idea was not to lessen Sousanis’ message, but to strengthen it by showing how a reader can interact with text.
Still, the remix comic exists in flat space.
I started thinking that Unflattening should have an Augmented Reality layer, which would create an invisible layer of information, and maybe more insights, on top of the book as it exists. If we all had Google Glass, we might read books in a different way …
This led me back to the Aurasma app, which I have toyed around with before, to see if I could add a layer of commentary via video on top of the book itself. I was quickly reminded how complicated it is to share “auras” (as the app calls them), but I finally figured it out. (The app is native to your device; if you want to share auras you create, you need to set up a folder at the website, load your project there, and then share out the link. Those who have the app can use the link, which opens up the app on their device and sets off the “aura” when they point their camera at the object, which in this case is Unflattening.)
Here, then (I hope) is the link you can use to get to my “aura” of Unflattening. Don’t have the book? No problem. Use the image of the book’s cover here as your object for launching the aura. On your mobile device, click on the link below, which should launch the app, and then point your camera at the image in this post (OK, so that might require some device juggling. Be safe out there, people.) Ideally, a video of me should emerge in the augmented layer of the book’s cover. I hope it works for you. It did for me, when I tested it. If not, the above screenshot is pretty nifty, with the illustration web of footprints running through my face (and what’s up with my eyes? I must be in the midst of some keen perceptions there).
It also occurred to me that I could use a nifty tool in the Firefox browser that lets you get a 3D look at websites, and that I could use that tool to look at Sousanis’ own website where he writes about the writing of Unflattening. I love how he uses the last part of his book to talk about what influenced individual pages. I am a sucker for “behind the scenes” of writers. In using the 3D view tool in Firefox, I would be making the leap from the book to the author writing about the book that I was reading, and I would be using yet another lens to see what he was writing about. Maybe. I’m not sure it succeeded on that level, but it is still an intriguing look at how to use “multiple engagement points” to look at the web. I took a tour.
Meanwhile, Terry and Greg and I and a few others are working on a media annotation of a page in Unflattening, with Sousanis’ permission (although, to be frank, we would have done it anyway, as that is the reader’s prerogative, but we let Sousanis pick the page from his book he would like us to annotate because the relationship between reader and writer is always an interesting one to explore. I wonder how Nick feels about all this.)
… from one of the folks (Biz Stone) who launched Twitter comes Super … which is sort of a collage/media app, merging words with images. The free app is easy to use — you start with a list of starter words/prompts, write what you want to write, choose an image from its recommended files (via keywords), tinker with the image and words, and then publish.
Finished works on Super can be easily shared on other social media, and the visual element makes it worth checking out. Like Twitter, this is “short form” writing — every word counts, or else the page gets cluttered. The most time I spent with Super was figuring out the right image to go with my words, and then worrying about the visual design of the piece. I have no idea where Super might be heading, in terms of its flow and sense of possibilities. It does seem a little artsy-whacky right now, but I am fine with that.
They also added a new feature called Strips, which allows you to tie together a few Supers into a sort of comic strip narrative. I have not yet given that a go, but I will.