Supplementary Materials for Gallery Page

A web tool for reflecting on the COVID-19 pandemic through performing gestures captured as silhouettes.

Tools used: BodyPix, OpenCV, LowDB, Glitch.com
Live demo: https://glitch.com/~bodypix-covid-grief 

During the COVID-19 pandemic, we find ourselves experiencing a period of profound collective grief. Grief hurts. One way we can process, heal, and/or cope in this moment of grief is by listening to our own hurt, and listening to the hurt of others in solidarity. Through the invitation of anonymously performing our felt experience with gesture for a webcam, this piece aims to create a lens for observation of our felt experience during the pandemic, one that opens up space for participation and solidarity. It is built with browser tools (Ml5.js, BodyPix) to make this lens of observation as widely accessible as possible. 

Silhouettes of COVID-19

 

A BodyPix-driven tool for reflecting on the COVID-19 pandemic through performing gestures captured as silhouettes.

Tools used: BodyPix, OpenCV, LowDB, Glitch.com
Live demo: https://glitch.com/~bodypix-covid-grief 

Some context

We are experiencing a period of profound collective grief. In a recent interview with Harvard Business Review, David Kessler, an expert on grief and co-author (with Elisabeth Kübler-Ross) of On Grief and Grieving: Finding the Meaning of Grief Through the Five Stages of Loss, says that we are feeling a sense of grief during the COVID-19 pandemic:

“…we’re feeling a number of different griefs. We feel the world has changed, and it has. We know this is temporary, but it doesn’t feel that way, and we realize things will be different…This is hitting us and we’re grieving. Collectively. We are not used to this kind of collective grief in the air.”

Grief hurts. One way we can process, heal, and/or cope in this moment of grief is by listening to our own hurt, and listening to the hurt of others in solidarity.

“healing happens when a place of trauma or pain is given full attention, really listened to.”

— Adrienne Maree Brown

The most profound way to process grief with others is being physically present with them. This is because, according to scholars and musicians Stephen Neely (professor of Eurhythmics at CMU) and Émile Jaques-Dalcroze, our bodies are the first instrument: we come to know our world through immediate, tangible interactions with our environment.

“before any experience is understood in the mind, it has to first resound through and be felt in the first experiencing instrument, the body.”

— Jaques-Dalcroze

But grieving with others becomes profoundly hard when we must be apart, when being close to loved ones means we might make them sick.

It’s hard to carve out a moment to take stock of how we’re feeling, and even more difficult to share feelings with others beyond text, video, and our own limited networks. In the words of artist Jenny Odell, art can serve as a kind of “attentional prosthesis.” For example, work such as Nina Katchadourian’s Sorted Books invites us to dig into our own libraries to create our own sculptural book phrases.

From Nina Katchadourian’s Sorted Books project

Taking a leaf from Nina Katchadourian’s book, I’m hoping to create a lens for observation of our felt experience during the pandemic, one that opens up space for participation and solidarity.

How might I create a capture system for expressing grief as a display of solidarity?

The general workflow

The subject: a body in motion

I chose to abstract the body in motion into an animated silhouette. It not only protects participants’ identities but also amplifies our sense of solidarity with them: we can see ourselves in more abstract bodies. They also invite play.

From Scott McCloud’s Understanding Comics

 

The capture system

What tools like Ml5.js offer us is a way to make these lenses of observation more widely accessible to non-experts. Platforms like Glitch made deployment frictionless.
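The live tool does its segmentation in the browser with ml5.js’s BodyPix, which returns a per-pixel estimate of where a person is in the frame. As a minimal sketch of the core idea, here’s the thresholding step in Python/NumPy (the threshold value and black-on-white palette are illustrative assumptions, not the tool’s exact settings):

```python
import numpy as np

def mask_to_silhouette(person_prob, threshold=0.5):
    """Turn a per-pixel person-probability map (H x W, values in [0, 1])
    into a black-on-white silhouette image (H x W, uint8)."""
    silhouette = np.where(person_prob > threshold, 0, 255)
    return silhouette.astype(np.uint8)

# Tiny fake "segmentation": a 4x4 map where the center pixels are a person.
prob = np.zeros((4, 4))
prob[1:3, 1:3] = 0.9
sil = mask_to_silhouette(prob)
```

Everything about a participant’s appearance is discarded at this step; only the shape of the body survives into the capture.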

The workflow

At the end of a session, you can download the capture you’ve created; submission to the gallery is optional.

The captures

I opened up the tool to my cohort for a trial run, collecting several captures over 1–2 days. I processed their submissions into a collection of GIFs:

We may find that we have lost a lot during this pandemic. 😢
Expressing sadness can be an act of release. How do you feel?

There will be a moment when the pandemic is over. Imagine it 😀  How will you dance?

What piqued my interest

A lens for reflection. I’ve created a tool for easily (and privately) expressing emotion through gesture, particularly in a time when body-centered reflection is needed. The project is less about the gallery of captures and more about the opportunity to explore with the body. Nevertheless, I am most excited about capturing (and remembering) the pandemic feels in a visceral way.

BodyPix. This is a relatively new port to Ml5.js. The browser-based tool makes capturing poetic movement in time easier to use and more accessible.

Glitches & opportunities

Tone setting / framing. I’m used to holding these kinds of reflective conversations in person. How I could set a reflective tone with no control over the tool’s context of use was largely a mystery to me. Early feedback suggested the tool didn’t provoke people to reflect deeply.

Glitchiness in GIFs. Inconsistency across machines produced wildly inconsistent (and glitchy) GIFs. How quickly different machines could run the BodyPix model and draw frames in P5.js varied wildly.

Seamless workflow for storage of GIFs. My server (temporarily) broke. LowDB didn’t readily store the created GIFs in an image format, so my server filled up after only ~20 submissions. After that I opted to store only silhouette frames, which I then post-processed into GIFs by manually downloading the frames from Glitch. Ideally, the pipeline from capture to gallery would be fully automated.
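The manual post-processing step (downloaded frames in, one GIF out) can be sketched in a few lines of Python. This is an illustrative reconstruction using Pillow, not necessarily the library I used; the frame duration is an assumption:

```python
import os
import tempfile
from PIL import Image

def frames_to_gif(frame_paths, out_path, ms_per_frame=100):
    """Stitch a sequence of downloaded silhouette frames into a looping GIF."""
    frames = [Image.open(p).convert("P") for p in frame_paths]
    frames[0].save(
        out_path,
        save_all=True,
        append_images=frames[1:],
        duration=ms_per_frame,  # milliseconds per frame
        loop=0,                 # loop forever
    )

# Demo with synthetic grayscale frames instead of real downloads.
tmp = tempfile.mkdtemp()
paths = []
for i in range(3):
    p = os.path.join(tmp, f"frame_{i}.png")
    Image.new("L", (32, 32), color=i * 80).save(p)
    paths.append(p)

out = os.path.join(tmp, "capture.gif")
frames_to_gif(paths, out)
n_frames = Image.open(out).n_frames
```

Automating exactly this step server-side (instead of storing GIF blobs in LowDB) is what a fully automated capture-to-gallery pipeline would need.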

Project update

Done

  • Added opencv to find / smooth contours
  • Updated Ml5.js to use the most recent BodyPix model from TensorFlow (the new model has pose estimation built in and more customization options)
  • Played around with video size / queuing results to speed performance

Left to do

  • create interface that prompts people to respond
  • save / play responses

 

Poor Images: Screen Selfie

April plan: Reflect together while social distancing

https://bodypix-covid-grief.glitch.me

That discomfort you’re feeling is grief. In an article that shares the same name, David Kessler talks about how naming our experience as grief begins to give us tools to talk about it. After we acknowledge the presence of grief, the next, and most critical step, is processing it.

But it’s hard to take a moment and take stock of how we’re feeling, and even more difficult to share feeling with others beyond text, video, and our own limited networks. Taking a leaf from Nina’s book, I’m hoping to create a lens for observation that opens up space for participation.

I want to create a lens on our own grief (and other pandemic feelings) as a way to begin to process them. By using browser-based body-detection, I’m hoping to open up the possibilities for observing how our bodies are doing in light of this (traumatic) new way of life, and sharing that experience (in a safe + meaningful way).

My plan for the rest of the semester is to work on a capture system in Ml5 for processing and recording our pandemic experience through physically “performing” (in front of our webcams) how we are feeling.

Our likeness from the webcam footage is abstracted into a silhouette. By focusing on the silhouette of a body as the capture, the tool helps us focus on the shapes our bodies create, and not our specific appearance. Further, when we look at others’ responses, we can more viscerally “feel” their presence.

By developing this project for the browser, I’m hoping to open up participation to as many people as possible.

These silhouettes will be aggregated and displayed back so people can appreciate everyone’s responses and possibly feel some solidarity.

Tools: ML5.js (bodyPix), webcam

 

Testing and breaking the Super SloMo video interpolation algorithm

Using Super SloMo by Jiang et al. (Python, run on Windows 10):

I. Tests with a GoPro Hero8 in slow motion mode:

Notes: I had to reduce video quality to 720p to stop the algorithm from crashing. A 2x slow factor works pretty well; 4x produces some jumpy/distortion effects.

In general, the algorithm produces a little distortion on the corners of videos.
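For contrast with what Super SloMo actually does (estimate optical flow and warp pixels along it), here’s the naive alternative it improves on: a plain per-pixel cross-fade between two frames. This blend is what makes the "breaking" experiments below interesting; when objects enter or leave the frame, there is no coherent flow to follow, and both approaches produce ghostly artifacts:

```python
import numpy as np

def blend_midframe(frame_a, frame_b, t=0.5):
    """Naive intermediate frame: per-pixel linear cross-fade.
    Flow-based interpolators like Super SloMo instead move pixels
    along estimated motion, which avoids the double-exposure ghosting
    this simple blend produces on anything that moves."""
    a = frame_a.astype(np.float32)
    b = frame_b.astype(np.float32)
    return ((1 - t) * a + t * b).astype(np.uint8)

f0 = np.zeros((4, 4), dtype=np.uint8)       # dark frame
f1 = np.full((4, 4), 200, dtype=np.uint8)   # bright frame
mid = blend_midframe(f0, f1)                # halfway cross-fade
```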

II. “Breaking” the algorithm using some timelapse footage

Notes: This weirdness works best with footage that has people / objects entering and leaving a space.

III. Transition between two images (using two friends I’ve met before)

IRL Distortion: Face in chrome coffee cup straw

Every day, like I’ve done for the last two years of grad school at CMU, I wake up and make coffee. It is what makes me feel “normal.”

At my cousin’s house, the same is true. Except I use his glass cup, and his chrome straw. The pink sliver in the straw is me, trying to feel normal by drinking this iced coffee as I finish my thesis.

Pollinators: temporal capture idea

Inspired by David Rokeby’s Plot Against Time series, I thought it would be interesting to apply this “long exposure” (or what looks like video processing) idea towards the subject of pollinators in my community farm’s herb garden (Garfield Community Farm).

In particular, I’m interested in a (slightly) more micro scale than Rokeby’s typical camera position. One photographer’s work, David Liittschwager’s “biocubes,” provides a compelling new form factor to study the motion of pollinators in one space. The diversity of pollinators in an herb garden is critical to that garden’s health, and an abundance of pollinators (bees, flies, other insects) can be best appreciated up close; a ton of movement and activity happens in the herb garden in a very small space.

Slightly closer than this framing would be an optimal scale for observing pollinators in the garden. The pollinator activity in this beebalm patch would be off the charts!

Liittschwager’s study of life in one cubic foot, coupled with Rokeby’s video processing for movement, could provide an interesting look into the pollinators of Garfield Community Farm’s herb garden.
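One simple way to approximate that “long exposure” look from video is to average a stack of frames: the static garden stays sharp while anything that moves smears into a trail. This is only a rough sketch of the effect, not Rokeby’s actual technique:

```python
import numpy as np

def long_exposure(frames):
    """Average a stack of frames: static background stays sharp,
    moving subjects (e.g. pollinators) smear into faint trails."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    return stack.mean(axis=0).astype(np.uint8)

# Synthetic example: one bright "pollinator" pixel crossing a dark field.
frames = []
for x in range(4):
    f = np.zeros((4, 4), dtype=np.uint8)
    f[1, x] = 255
    frames.append(f)

composite = long_exposure(frames)  # the moving pixel leaves a dim trail
```

Variants like taking the per-pixel maximum instead of the mean keep the trail at full brightness, which may read better for small, fast insects.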