Grappling with Algorithms and Justice (Oh, the Humanity)

Last night was the second online session of an inquiry group project called The Grapple Series — hosted by the National Writing Project, the Western Pennsylvania Writing Project and the CMU CREATE Lab — that is looking at the impact of AI and technology on our lives. The theme last night was algorithms and justice, a pairing that made for interesting conversations about how blind trust in both often leads to disastrous consequences.

We worked through some interesting readings and video pieces before gathering in our online session. The articles explored the issue from multiple angles, but the overall connecting concepts are clear: algorithms are created by people, and people have bias, so algorithms have bias, too — and when algorithms are embedded with bias, it affects our notions of justice in the world.

Sometimes, this is literal — as in the case of computer software being used to determine the length of parole. Sometimes, it is more nuanced — the way search engines bring racial stereotypes to the surface. Sometimes, it is not yet known — the way facial recognition is changing our sense of privacy in the public sphere.

The Grapple gathering began with a large-group discussion and some writing about justice and algorithms, and then broke into smaller groups, where we engaged in deeper debate about the role of algorithms in society.

We also teamed up to create our own paper “algorithm” for fighting off the common cold, and while our group went a sort of silly route (Should a teacher call in sick or not?), the short flowchart activity reminded us how often we can fall into Yes/No binary decisions that leave the human element out. Another small group did integrate ideas of humanity into their algorithm, and I found that quite interesting.
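(For fun, here is a rough sketch of what our paper flowchart might look like if it were written as code. The questions and wording are my own invention, not exactly what our group drew, but it shows how quickly a chain of Yes/No branches flattens a very human decision.)

```python
# A rough sketch of our "Should a teacher call in sick?" paper flowchart as code.
# The questions and answers are invented for illustration; the point is how a
# chain of Yes/No branches leaves the human element out of the decision.

def should_call_in_sick(has_fever: bool, can_talk_without_coughing: bool,
                        sub_plans_ready: bool) -> str:
    if has_fever:
        return "Stay home"      # binary rule: fever always wins
    if not can_talk_without_coughing:
        return "Stay home"
    if not sub_plans_ready:
        return "Go to school"   # nothing here asks how the teacher actually feels
    return "Go to school"

print(should_call_in_sick(has_fever=False,
                          can_talk_without_coughing=True,
                          sub_plans_ready=False))
# prints "Go to school" -- no branch ever considers exhaustion, family, or judgment
```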

I appreciate being able to work through and “grapple” with these complex questions rippling through society. There is no real solution — the algorithmic genie is long gone from its bottle. But we can be aware, and make some decisions about what information we share and how we are being manipulated by technology.

Here are resources shared before our session, if you are interested:

Peace (ain’t no code for that),
Kevin

Grapple Session: An Inquiry into AI and Ethics

Grapple Session One poem

Last night, I joined an online gathering of folks in The Grapple Series, hosted by the National Writing Project’s Western Pennsylvania Writing Project and a group out of Carnegie Mellon called the CREATE Lab. This was the first of four scheduled sessions on AI and Ethics, and it was a fascinating start to the conversation and inquiry.

One of the guiding inquiry questions revolves around the dual wonder of whether we humans are making our machines more human or whether machines are humanizing us. Or some variation of that question. Essentially, it has us critically looking at the rise of AI in our society, and in education and writing. We were a mix of technology doubters and evangelists, I think, which made the discussion all the richer.

If ever there was a time to pause and look more closely at Artificial Intelligence and humanity, now is the time. And for us teachers, this kind of inquiry is critical, not just for our profession (where Big Tech is pushing AI as the solution for problems of accountability and teaching time) but also for our students, and the social world they are inhabiting now and beyond.

I didn’t have this inquiry question formulated last night, but it is starting to come together for me …

How do we teach students about the impact of Artificial Intelligence on our lives with the urgency of NOW, the present, as opposed to some futuristic notion of the Rise of the Machines from science fiction?

We played a fun game of Bot or Not, which had us looking at poetry and trying to decide whether it was created by a human hand/mind/soul or by a machine. I did a fair job, mostly through luck and instinct and not through any real insight into knowing whether a piece of writing is bot or not. (My morning poem, above, was inspired by thinking further this morning about last night’s session.)

The hosts — Michelle King, Laura Roop, and Beatrice Dias — were fantastic, guiding the discussion and opening the Zoom space for conversations (which is difficult when you have a lot of people in the space). I’m looking forward to the next session, when the conversation will turn to Algorithms and Ethical Design (I think that was the title, but I could be wrong …).

Peace (in a human world),
Kevin

When A Stumble Goes Viral


Fail flickr photo by clasesdeperiodismo shared under a Creative Commons (BY-NC-SA) license

My son was running an event he had not run before at his high school indoor track meet the other day. We were cheering him on — he’s fast — when he took a turn and began to stumble. He fell to the track but then muscled his way back to his feet and crossed the finish line, unhurt but very frustrated.

The next day, he told us that a friend on the track team had been shooting video of his race, caught the stumble, and had remixed the footage for TikTok. My son said he was fine with it. The video clip does not show my son’s face or any other identifying features. It’s shot from the back. Strangely, the friend edited the opening frame to indicate to the audience that it was him (the friend) in the video. Maybe this was to protect my son’s privacy.

“It has over 120,000 views,” my son told us that first night, the day after the race, showing us the clip on TikTok. Yesterday, the second day, I asked about it. “It has over 250,000 views and 5,000 likes,” he told me.

He’s proud that this TikTok is trending, but I’m not so sure about it.

First of all, there’s the “asking permission” factor in this whole story: the friend posted the video and only later showed it to my son. Permission should be asked before posting, not after, even if the track meet was in a public space. That’s just wrong.

Second, there’s this fascination with views and likes that drives me batty, as if that were social capital that has tangible value (it really doesn’t, unless you are creating a company that needs eyeballs for advertising and exposure). I’ve written about this before, quite a bit, and noted how this aspect of social media is really a way for the companies to sell advertising and to track user data.

Third, this whole notion that Fail Videos are what capture most of our collective attention bothers me to no end — that we mock the stumbles of others for entertainment. I realize that Nice Videos don’t have the same impact on our brains — sort of how we notice and remember only the bad in the world, not the good. But to see that part of our shared humanity on display so vividly, and with such popularity, is a particularly negative reflection on who we are, as a people.

My son is unconcerned with all of this. (Note: I have scanned through the TikTok trending videos the last few days — while I have viewed some questionable content and some strange things, and yes, some amusing clips, I have not seen the video of my son’s track fail on the public trending page.)

While I’m proud that he is so resilient — that the frustration on the track did not spill over to seeing his mistake play out on social media — I still wish he and his whole generation would shake loose the notion that viral videos are something worth striving for. It may be like shouting into the void, but we have to keep our warning voices loud anyway.

Someday, they’ll hear us.

Peace (gone viral),
Kevin

The Meanest Place on the Internet (YouTube’s Toxicity Problem)

Whenever I talk to my sixth graders about decorum and trolling in online spaces, one platform consistently rises to the surface as their prime example of the “meanest place on the Internet”: YouTube — and, more specifically, the comment sections of videos. No other platform even comes close for them. Year after year, they share their surprise and disgust at what people will write, and get away with, and how commenters will openly attack others, including the most vulnerable video makers.

As YouTube is where my young students spend most of their online time — for some, a few hours a day — it always strikes me as frustrating that they are exposed both to potentially great videos (and there certainly are great videos on YouTube, for any kind of interest, topic or niche learning) and to humanity behaving so plainly badly that it makes me embarrassed on our collective behalf.

Maybe YouTube (aka Google) is finally understanding this.

Along with the changes to its platform to bring it into federal compliance for young viewers (all YouTube channel operators must now designate whether or not their channel is made for an audience of children, which mandates certain settings for video uploads), YouTube seems to be making more visible its efforts to root out the negativity.

We know that the comment section is an important place for fans to engage with creators and each other. At the same time, we heard feedback that comments are often where creators and viewers encounter harassment. – from YouTube Blog

YouTube folks claim in a new post that they are now beefing up the way comments are filtered and giving more flexibility to YouTube creators, as well as rolling out more algorithms to catch toxic comments before they even reach the comment bin. (See Comment Settings for YouTube, too.)
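(To picture what that might mean in practice, here is a generic sketch of a “hold for review” filter. This is not YouTube’s actual system, and the word list and threshold are invented; it just illustrates the idea of scoring an incoming comment and parking anything that looks toxic before it ever reaches the comment bin.)

```python
# Not YouTube's actual system: just a generic illustration of holding
# potentially toxic comments for review instead of publishing them right away.

TOXIC_WORDS = {"idiot", "loser", "stupid"}   # placeholder block list for illustration
HOLD_THRESHOLD = 0.5                         # fraction of flagged words that triggers a hold

def toxicity_score(comment: str) -> float:
    """Crude stand-in for a real classifier: share of words on the block list."""
    words = [w.strip(".,!?").lower() for w in comment.split()]
    if not words:
        return 0.0
    return sum(w in TOXIC_WORDS for w in words) / len(words)

def route_comment(comment: str) -> str:
    """Decide whether a comment is published or held for a human to review."""
    return "held for review" if toxicity_score(comment) >= HOLD_THRESHOLD else "published"

print(route_comment("Great video, thanks for sharing!"))  # published
print(route_comment("You idiot, stupid loser"))           # held for review
```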

There is a link to a Transparency Report that shows how many videos have been removed, along with some other data that is sort of fascinating to look at. For example, it seems to indicate that 500 million comments were removed from July through September alone. Sheesh.

Well, we’ll see if it all works to make YouTube a more positive place while still protecting free speech (I acknowledge this is a juggling act, but figure it out, people). My students will tell me if it’s working or not, I am sure.

Peace (everywhere),
Kevin

Book Review: Tubes (A Journey to the Center of the Internet)

Andrew Blum’s journey to the center of the Internet, as he calls it, begins when a squirrel nibbles the wires of his house, shutting his online access off. This event sparks a years-long journey of curiosity to figure out how the wires all connect, and how data flows through the physical space of the world.

In Tubes: A Journey to the Center of the Internet, Blum brings us along with him. It’s a pretty fascinating ride, if a bit technical at times, as he researches, investigates, and visits some of the main hubs of the dispersed Internet, from data centers to undersea cables to spaces below buildings in urban centers to isolated rural places — all forging different kinds of connection so that when I hit “publish” on this blog post and you click to read what I wrote, the data flows rather seamlessly (or so it appears) through fibers, wires, and yes, tubes of light.

There are moments where Blum geeks out a bit too much for my tastes, but I understand why he goes into such descriptions of routers, packets and fibers. What interested me more was how he frames the flow of information within the physical aspects of the world — the way we can imagine data moving along the contours of our Earth, and the ways in which those same contours provide barriers to access, too.

Overall, though, Tubes gives the reader a fuller sense of the digital world — sparking some appreciation for the original design of a distributed networked space and for the rather fragile elements that make up what we mostly take for granted. Some hubs are monumentally important, and yet, as Blum describes them, neither as secure as one would expect nor as reliable as they could be.

I really appreciated these final thoughts of Blum, who seeks to humanize his research, and ground it in the world we live in, not the virtual one we imagine when we use our technology.

“What I understood when I arrived home was that the Internet wasn’t a physical world or a virtual world, but a human world. The Internet’s physical infrastructure has many centers, but from a certain vantage point there is really only one: You. Me. The lowercase i. Wherever I am, and wherever you are.”

— from Tubes by Andrew Blum, page 268

Peace (flowing through us all),
Kevin

John Reminds Us: Sometimes, Old School is Still Cool

John Spencer often shares interesting videos and resources and insights about innovative practice, and he often reminds us that advanced technology and the newest gadgets aren’t always what’s needed. Here, he reminds us that “vintage” works, too, and might have more and different value than online or technology-based activities.

He also lists a bunch of possibilities, from duct tape and cardboard to hacking board games to sketchnoting on paper (and it appears this whole concept will form elements of an upcoming book on the topic).

Peace (reminding us of its reach),
Kevin

Changes Afoot for YouTube (What Kids Can and Cannot See)

If you are a teacher or school that oversees its own YouTube channel (like I do), you need to know that changes are coming for how YouTube deals with videos and children. This comes after YouTube and Google were at the center of legal action around children’s access to videos, and I think the changes will be helpful.

Read more: Jeff Bradbury does a good job of explaining these changes for educators (thanks to Sheri for sharing Jeff’s post).

There’s been a bunch of pushback from YouTube content creators — those who make their money off advertising inside videos — about the changes, which are part of the US Children’s Online Privacy Protection Act (COPPA) requirements, but I am all for deeper protections for viewers under the age of 13. If that’s going to be your main audience, then you had better be doing your job of protecting those viewers.

The Federal Trade Commission has released some information about what kind of material is “made for kids” or not.

Peace (what we see is what we do),
Kevin

Book Review: Because Internet (Understanding the New Rules of Language)

Sometimes, when you come across a linguist — even if you love words and language — the insider-speak gets to be a little too much to bear. Not so with Gretchen McCulloch, whose book Because Internet (Understanding the New Rules of Language) is infused with focused curiosity, a sense of fun and academic research. Yes, it’s possible.

And what she is looking at is our fascinating moment of ever-expanding, elastic language — where the immersive and social qualities of technology seem to be altering the ways in which we write, speak and communicate. As teachers, many of us know this just by listening to and reading our students.

McCulloch notes a few times in her book that her examination here is merely a snapshot of the present, not a prediction of where language is going.

To the people who make internet language. You are the territory, this is merely a map. — from the dedication page, by Gretchen McCulloch

Still, it’s a fascinating dip into rippling waters.

What interested me the most was her look at the explosion of informal writing — particularly as she notes how social media and technology connections are tearing down the rules of formal writing for informal communications (while formal rules still apply to formal writing) — and what she calls “typographical tone of voice,” a term that I love for its poetry.

In this section, McCulloch explores the expanded use of punctuation for meaning making, the use of font styles (no caps/all caps, etc.), repeating letters for emotive resonance, abbreviations to connote kindness, the echoes of coding in our writing, the use of space between words and passages, and the ways we project emotion and feeling into our writing when confronted with limited means.

I mean, wow. That’s a lot of intriguing lenses on writing, and McCulloch navigates them all with a personable voice, a linguist’s ear for language, and a sense of both celebration and skepticism about what might or might not be happening with our language.

Later, she also explores memes and emoticons, and the way visual language works alongside written language, often in complementary and complicated ways. This book covers a lot of ground, but McCulloch is an able tour guide, pointing out the funny quirks as well as the emerging patterns.

Peace (written out),
Kevin

My Students and How They Use Technology: Survey Results

tech survey collage

Each year, for the past eight or nine years, I have given my sixth grade students a survey at the start of our Digital Life unit — as much to inform our discussions as to give me some insight into trends over time with an 11-year-old audience.

This year, for example, it’s a growing TikTok trend and a further devaluing of Facebook, with Instagram’s popularity also on the decline. Also, there are fewer reported negative experiences, even as more students report adults talking to them about how best to use technology and digital media.

All this also helps me send forward resources to families and parents, as an encouragement to talk about and monitor technology use with their children.

This leads us to the first activity — The Internet Mapping Project by Kevin Kelly — and students are planning to share out today their artistic interpretations of how they envision their interaction with technology. I am always curious to see how they approach this prompt. Some go literal. Others, symbolic.

Internet Mapping Project template

Peace (becoming aware),
Kevin

#WriteOut: Giving Kids A Camera In Order to Capture The Wild

As the Write Out project kicks off today (and goes for the next two weeks, with the National Day on Writing right in the middle of it all), I wanted to share out a project I have had underway for a few weeks now, in which my sixth grade students have been going about their small suburban town “capturing the wild” with photographs. We aim to use the photos as part of a connection with another school, and for some writing this week.

You can view my podcast video here (via SoundSlides)

Peace (thinking it through),
Kevin