lemonbear-FinalProject

Project Description

My final project for this class was a set of 12 generative love-letter postcards, with the images on the front created by composing elements from the Google Quick, Draw! dataset into a “living room”, and the text on the back generated by Markov chains trained on ~33,000 messages I’ve sent to my partner over the past year.

Inspirations & Ideation

At the onset of this project, I knew I wanted to create some kind of data-driven small-multiples work inspired by the care I have for my partner.

I was really touched by Lukas’ work for our mid-semester critique (2021), where he plotted a massive number of zigzagging lines reflecting texts between him and his girlfriend. I had also enjoyed Caroline Hermans’ project exploring attention dynamics with linked scarves (2019), and “Dear Data” by Giorgia Lupi and Stefanie Posavec (2014–2015). All three projects exhibited the power of data to tell a story.

“dear data”

These projects helped convince me that data visualization is uniquely suited to tell the story of a longer-term, committed relationship—not in the manner of poets, with their manic, myopic focus on the singular rapturous moments, but in a fashion that underscores how a healthy relationship is built up of small, consistent gestures, day in and day out. (I hope my saying this doesn’t constitute an “interpretation” of my work; I only state it to illuminate why I decided on the tools I used to make the project.)

Since my project was composed of essentially two sections (the front and back of the postcards), I first explored how and why to generate text from a corpus of text messages. To aid me in my discovery of the why, Golan recommended Allison Parrish’s Eyeo 2015 talk, which I thoroughly enjoyed. I liked the analogy she drew between unmanned travel and exploration of strange language spaces. She posited that, in the same way a weather balloon can go higher and explore longer without the weight of a human, generative language can go into novel spaces without the weight of our intrinsic biases towards familiar meanings and concepts. I then learned the how from one of Parrish’s Jupyter Notebooks, where I gained the technical skills to train Markov chains on an accumulated corpus of text.

Parrish’s diagram of “unexplored language spaces”
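
As a rough illustration of that how: with a library like markovify, training and sampling a chain from a corpus can be as short as the sketch below (the filename is a placeholder, and this isn’t the exact code from Parrish’s notebook or my project):

```python
import markovify

# Placeholder path: one cleaned message per line.
with open("messages.txt") as f:
    corpus = f.read()

# state_size controls how many preceding words the chain conditions on.
model = markovify.NewlineText(corpus, state_size=2)

for _ in range(5):
    sentence = model.make_sentence(tries=100)  # returns None if no sentence is found
    if sentence:
        print(sentence)
```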

In my journey to brainstorm fronts for the postcards, Golan introduced me to the Google Quick, Draw! dataset, where users doodle different categories of objects (couches, house plants, etc.) in order to help researchers train ML classification models. I found the dataset oddly charming in its visual imperfections, and I loved how it provided a survey of how users think of different objects (how round a couch is to them, what kind of house plant they choose to draw). Additionally, I’ve always been intrigued by the concept of building a home and understanding the components (people, objects, light) that define a home. I thought generation from such a widely sourced dataset might be an interesting way to explore home construction without being weighed down by past personal connotations of home, or even traumas surrounding home. And so I decided on creating little living rooms as the fronts of the postcards.

potential parts of a home

Process

For the text of the postcards, I requested my message history data from Discord (yeah, yeah), which provided me with large .csvs of every message I’d ever sent through the platform. I picked out the .csv corresponding to my partner’s and my DMs and manipulated it using Pandas, a process which included cleaning out non-ASCII characters and sensitive information and depositing lines of text into twelve different .txt files: the first held only messages from November 2020, the second held messages from November–December 2020, and so on, until the twelfth held all messages from November 2020–October 2021. I processed the data in this manner to create twelve different paragraphs, each generated from a model trained on progressively more data and composed of sentences that held words such as “love”, “sweet”, and “want”, to give them the “love-letter” characteristic.
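
A minimal sketch of that cleaning-and-splitting step (the column names, filenames, and timestamp handling are placeholders; the real Discord export and my actual script differ in the details):

```python
import pandas as pd

# Hypothetical column names and filename; the real Discord export may differ.
df = pd.read_csv("dms.csv")
df["Timestamp"] = pd.to_datetime(df["Timestamp"], utc=True).dt.tz_localize(None)

# Strip non-ASCII characters and drop empty messages.
df["Contents"] = (df["Contents"].astype(str)
                  .str.encode("ascii", errors="ignore")
                  .str.decode("ascii"))
df = df[df["Contents"].str.strip() != ""]

# Write twelve cumulative corpora: Nov 2020, Nov-Dec 2020, ..., Nov 2020-Oct 2021.
start = pd.Timestamp("2020-11-01")
for i in range(1, 13):
    end = start + pd.DateOffset(months=i)
    chunk = df[(df["Timestamp"] >= start) & (df["Timestamp"] < end)]
    with open(f"corpus_{i:02d}.txt", "w") as f:
        f.write("\n".join(chunk["Contents"]))
```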

The progression of the twelve cards isn’t as clear as I hoped it’d be, but the first few postcards are still distinct in some ways from the last few, influenced by how much information each postcard had access to:

from 2 months
from 12 months

For the generative living rooms, I downloaded large .ndjson files of objects from the Quick, Draw! dataset that I thought would go nicely in a living room, and randomly picked instances to place at random locations and sizes (both constrained to certain ranges). This was done in Python with the vsketch and vpype modules:
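
Roughly, the pipeline looks like the following sketch (a reconstruction with placeholder filenames, limits, and placement ranges, assuming the simplified .ndjson format, where each line’s "drawing" field is a list of [x-list, y-list] strokes in 0–255 coordinates):

```python
import json
import random
import vsketch

def load_drawings(path, limit=1000):
    """Load up to `limit` doodles from a simplified Quick, Draw! .ndjson file."""
    drawings = []
    with open(path) as f:
        for i, line in enumerate(f):
            if i >= limit:
                break
            drawings.append(json.loads(line)["drawing"])
    return drawings

def place(vsk, drawing, x, y, scale):
    """Draw one doodle at (x, y), scaled down from its 0-255 coordinate space."""
    for xs, ys in drawing:  # each stroke is a pair of x- and y-coordinate lists
        vsk.polygon([(x + px * scale, y + py * scale) for px, py in zip(xs, ys)])

vsk = vsketch.Vsketch()
vsk.size("a6", landscape=True)

# Scatter one randomly chosen doodle per category at a random position and size.
for path in ("couch.ndjson", "house_plant.ndjson", "lamp.ndjson"):
    drawing = random.choice(load_drawings(path))
    place(vsk, drawing,
          x=random.uniform(10, 80), y=random.uniform(10, 60),
          scale=random.uniform(0.1, 0.3))

vsk.save("living_room.svg")
```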

I then spent a few hours plotting double-sided postcards, as shown below:

Further Documentation

(.svgs of all the text sides of the postcards are posted under the cut)

Takeaways

Though I ultimately like what I created, I am unsatisfied with some elements of the final product. I found it difficult to create living rooms with what I felt was an appropriate balance of charm and order—the ones I ended up with feel overly chaotic and uneven in composition. Furthermore, I am disappointed that the progression of the paragraphs isn’t as clear as I hoped it’d be. I think to make it clearer I’d have to understand a little more about different models for text generation.

Overall, this project taught me how to use numerous tools/workflows (manipulation of .csv and .ndjson/.json files, data management and cleaning, generative image toolkits like vsketch, learning about naïve ML models, etc.). However, I am most thankful for the ideation process and how the project made me think about technology as a tool to express concepts/emotions that might be otherwise difficult to convey. I want to continue using the creative and technical skills I gained from this project to go forth and make new pieces like it!


lemonbear-FinalWIP

For my final project, I’m creating a series of 12 postcards generated by a neural net trained on a year of messages I’ve sent to my partner. The model behind each postcard is trained on progressively more data (i.e., the first card is representative of Nov. 2020 only, the second of Nov. & Dec. 2020, and so on).

Additionally (though I’m still workshopping this portion), the fronts of the cards are living rooms generated from the Google Quick, Draw! dataset. As the series progresses, the rooms also become progressively more lived-in and elaborate.

One of my inspirations was Nick Bantock’s Griffin & Sabine series, which is told in epistolary form:

Inside pages of the book “The Pharos Gate, Griffin & Sabine’s Lost Correspondence” – Written and Illustrated by Nick Bantock

I enjoyed these books immensely as a kid; there was something thrilling about the voyeuristic quality of opening someone else’s mail, amplified by the lush illustrations & calligraphy. I wanted to emulate the intimacy of the narrative.

I also enjoyed various creative datavis projects, including Dear Data by Giorgia Lupi and Stefanie Posavec. The ways they bridged data, storytelling, and beauty/visual interest were very compelling to me.

All 12 postcards in .svg form are below the cut:


lemonbear—MidSemester

I decided to add color to revamp my tiling project from a couple weeks ago (where I coded a pattern to laser cut onto some fiberboard). Because I was dissatisfied with how strongly the grid dictated the piece, I thought of ways to make it more free-flowing while still maintaining the original rules (edges want to match up). I devised a simple logic for the coloring, where the smaller shapes were colored warmly and the larger swaths of color were cooler. I made sure none of the colors continued over edges, but I disregarded the tile boundaries, opting to create a composition defined only by the drawn arcs.

Here’s the mockup for the coloring scheme:

I elected to color only a chunk of it, as the original piece was somewhat overwhelming in scale. It came out like this:

The finished product.
Another fun doodle with the complementary colors.

lemonbear-Proposal

I want my final project to be a data-driven small-multiples work. I was really touched by Lukas’ work for the mid-semester critique with the zigzagging lines reflecting texts between him and his girlfriend, and I had also seen Caroline Hermans’ project exploring attention dynamics with linked scarves, so I was considering a project reflecting in some way the history of my messages with my partner…! As for the output, I’m still sort of waffling between some ideas—generative poems constructed in a “mad-libs” manner; generative living rooms with different elements like couches, tables, lamps, etc.; or generative letters between fictional figures, with text of varying readability (delving into asemic-writing territory) corresponding to varying levels of understanding between said figures.

Golan recommended Allison Parrish’s Eyeo 2015 talk, which I thoroughly enjoyed. I liked the analogy she drew between unmanned travel and exploration of strange language spaces. In the same way a weather balloon can go higher and explore longer without the weight of a human, generative language can go into novel spaces without the weight of our intrinsic biases or bents towards meaning. I also liked the piece at the bottom of this lecture, Sissy Marley’s “My House Wallpaper” (2020), because I’ve always been obsessed with the concept of building a home and understanding the components (people, objects, light) that define a home. I thought generation might be an interesting way to explore home construction without being weighed down by past connotations of home, or traumas surrounding home.

Lukas also recommended “Dear Data” (putting this here for my future reference).

potential parts of a home
mad libs brainstorming
i got distracted in 213 and thought about light and place instead

11.03	Wed	-- Due: #10 (Research/Proposal/Tests); DISCUSSION.
11.08	Mon	-- Have parsed the data to get something meaningful out of it
11.10	Wed	-- Come to class with a chunk of code that is a prototype for the output (have done some research regarding language generation and/or small multiples)
11.15	Mon	-- Create 3-4 small pieces that *exist* with aforementioned data & code; continue revising codebase
11.17	Wed	-- Due: #11 (Major Milestone); CRITIQUE.
11.22	Mon	-- Revise work based on critique; make 3-4 small pieces again that *exist*
11.24	Wed	-- NO SESSION (Thanksgiving).
11.29	Mon	-- Create final plotted project; by Wednesday have made all the necessary hand-drawn changes
12.01	Wed	-- Due: #12 (Final Project); EXHIBITION.

lemonbear—FieldComposition

SVGs:

Photographs:

Reflection:

My offering for this prompt wasn’t especially adventurous, but from Golan’s suggestions in class the other day, I learned a bit more about Perlin noise and the various parameters I could tune to get the piece to come across as a field when plotted. I also experimented a little bit with color, hacking my way into multicolor by taping multiple pens together; I think this method could be more impactful if I devised a less shoddy way to do it (more consistent pressure, more control over the distance between pens) and had an actual SVG that produced some kind of optical illusion when plotted with multiple pens at once.
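
To make those tunable parameters concrete, here is a throwaway flow-field sketch in vsketch; the frequency, step size, and counts below are placeholder values to fiddle with, not the settings from my piece:

```python
import math
import vsketch

vsk = vsketch.Vsketch()
vsk.size("a5", landscape=True)
vsk.noiseSeed(42)
vsk.noiseDetail(3)  # fewer octaves -> a smoother, calmer field

FREQ = 0.02   # noise frequency: smaller values give broader, gentler swells
STEP = 2.0    # distance each streamline advances per step
STEPS = 60    # segments per streamline
LINES = 250   # number of streamlines

for _ in range(LINES):
    # start each streamline at a random point on the page
    x, y = vsk.random(0, 280), vsk.random(0, 190)
    for _ in range(STEPS):
        # noise() returns roughly [0, 1]; map it to a heading angle
        angle = vsk.noise(x * FREQ, y * FREQ) * math.tau
        nx, ny = x + STEP * math.cos(angle), y + STEP * math.sin(angle)
        vsk.line(x, y, nx, ny)
        x, y = nx, ny

vsk.save("field.svg")
```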

lemonbear—FieldReading

I enjoyed Tyler Hobbs’ article on flow fields; the methodology felt approachable and at the same time produced a myriad of beautiful images. I liked how he experimented with color and line length to produce a wide variety of emotion in the final products.

lemonbear—PatternReading

Computed art lends itself to elegance in pattern. I liked how the 10 PRINT passage had a ton of examples of old one-line code bits that produced serendipitous arrangements of lines and blocks, and I tried to write code for this project that would surprise me similarly. For example, I had no idea the leaf-like structures in my piece would arise, but they did, and that was a lovely discovery.
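
For reference, the namesake of that passage is the Commodore 64 maze one-liner; a rough Python re-creation of it:

```python
import random

# Rough re-creation of: 10 PRINT CHR$(205.5+RND(1)); : GOTO 10
# (40 columns to echo the C64 screen width; 20 rows instead of looping forever)
for _ in range(20):
    print("".join(random.choice("╱╲") for _ in range(40)))
```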

an in progress shot of my piece

lemonbear—TilingPattern

I thought it would be fun to create an interactive tiling game, so I brainstormed some algorithmic rules vaguely following the spirit of Wang Tiles (connections have to flow across tile boundaries):

I learned how to use vsketch/vpype (s/o to Perry for the help) for this project. I’ve been meaning to try a project in Python for a while, because I can write the most haphazard bullshit code in Python (I’m like halfway comfortable with JS, I don’t know Java that well, anything C-based I have to be really careful in). The chaotic energy of Python has always felt most akin to the chaos of art making to me.  Here is some of the code. It is terrible:
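
Stripped of the terrible parts, the edge-matching rule boils down to something like this generic Truchet-style arc grid in vsketch (a reconstruction for illustration, not my actual project code, with made-up tile sizes):

```python
import math
import random
import vsketch

TILE = 20   # tile size (arbitrary units)
GRID = 8    # 8 x 8 grid of tiles

def quarter_arc(cx, cy, r, a0, segments=16):
    """Points along a quarter circle of radius r starting at angle a0."""
    step = (math.pi / 2) / segments
    return [(cx + r * math.cos(a0 + k * step), cy + r * math.sin(a0 + k * step))
            for k in range(segments + 1)]

vsk = vsketch.Vsketch()
vsk.size("a5")

for i in range(GRID):
    for j in range(GRID):
        x, y = i * TILE, j * TILE
        if random.random() < 0.5:
            # arcs joining the top-to-left and bottom-to-right edge midpoints
            vsk.polygon(quarter_arc(x, y, TILE / 2, 0))
            vsk.polygon(quarter_arc(x + TILE, y + TILE, TILE / 2, math.pi))
        else:
            # mirrored orientation: top-to-right and bottom-to-left
            vsk.polygon(quarter_arc(x + TILE, y, TILE / 2, math.pi / 2))
            vsk.polygon(quarter_arc(x, y + TILE, TILE / 2, -math.pi / 2))

vsk.save("tiles.svg")
```

Because every arc starts and ends at an edge midpoint, the connections keep flowing no matter how the tiles are shuffled.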

The final SVG looked like this:

I then spent half an hour laser cutting:

I brought these to the STUDIO and people got to play with them, which made me really happy. The numbers produced mixed reactions—a lot of people thought they contextualized each tile piece too heavily and were too obtrusive, which I somewhat agree with. I do think the ability to move the pieces around and have clear evidence that they’re not in their designated place is interesting—”the rules are obviously broken, but the beauty persists” kind of deal.

lemonbear—BlobFamily

I have a somewhat meager offering for this week. I came up with another algorithm that combined circles with circular arcs by matching the slopes of the tangents, but I had difficulty implementing it. I was trying to mimic the forms of water bubbles, and it was going to look something like this:

I ended up pivoting to something simpler: an implementation of the algorithm that Golan mentioned in class, which maps 2D Perlin noise to radial alteration of circles. I played a little bit with using the blobs to fill a space at different densities. Some of the different iterations looked like this:
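
The core move is just a circle whose radius is perturbed by noise sampled on a loop; here is a minimal vsketch sketch of that idea (a reconstruction, with made-up spacing and wobble parameters rather than my exact code):

```python
import math
import vsketch

vsk = vsketch.Vsketch()
vsk.size("a5")
vsk.noiseSeed(7)

def blob(vsk, cx, cy, base_r, wobble=0.4, noise_scale=0.8, samples=120):
    """A circle whose radius is modulated by 2D Perlin noise sampled on a loop."""
    points = []
    for i in range(samples + 1):
        theta = math.tau * i / samples
        # sampling noise around a circle in noise space keeps the outline closed
        n = vsk.noise(cx + noise_scale * math.cos(theta),
                      cy + noise_scale * math.sin(theta))
        r = base_r * (1 + wobble * (n - 0.5))
        points.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
    vsk.polygon(points)

# fill a patch of the page with blobs; tweak the grid spacing for density
for row in range(4):
    for col in range(6):
        blob(vsk, 20 + col * 22, 20 + row * 22, base_r=8 + vsk.random(-2, 2))

vsk.save("blobs.svg")
```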

 

I liked the poetry with which Laura Kim spoke about blobs, and how they saw blob-making as a form of alternative communication, a strange and novel language.