Final Project: Running Companion!

This project uses blob detection code to animate a character that runs with you. By wearing bands of a certain color, you tell the computer what pace goal you want to set, and it can track how fast you’re running. Using the difference between these two paces, it determines where the animal companion sits on the screen. If your pace is slower than the goal, the animal moves to the left of the screen, as if it were running faster than you. If your pace is faster than the goal, the animal moves to the right of the screen, as if it were running slower than you.

The number of bands that you wear determines which animal companion you get and what your pace goal is. The animal runs at the pace goal.
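The position logic above can be sketched in plain JavaScript. This is a hypothetical illustration, not the actual project code; the function name, the steps-per-minute units, and the 30-SPM "max deviation" are all my own assumptions:

```javascript
// Map the runner's pace relative to the goal to a horizontal screen
// position for the companion. Paces are in steps per minute (SPM).
// Slower than goal -> animal pulls ahead (left, smaller x);
// faster than goal -> animal falls behind (right, larger x).
function companionX(currentPace, paceGoal, screenWidth, maxDeviation = 30) {
  // Positive diff means the runner is slower than the goal.
  const diff = paceGoal - currentPace;
  // Clamp so the animal never leaves the screen.
  const clamped = Math.max(-maxDeviation, Math.min(maxDeviation, diff));
  // Center of the screen means the runner is exactly on pace.
  const center = screenWidth / 2;
  return center - (clamped / maxDeviation) * center;
}
```

With an 800-pixel screen and a 180 SPM goal, running at 180 puts the animal dead center; dropping to 150 sends it all the way to the left edge.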

This sort of pace-keeper lets runners be more conscious of their cadence, and makes correcting it easier.


The project in use:

At the final presentation:

Here’s my final code; you’ll need to open it in Processing:


Proposal: Pace Yourself

Title: Pace Yourself

Summary: A projected animal companion assists you on your runs and logs your pace.

What is it

This projection is meant to be an entertaining way to help runners keep their pace. As an ex-cross country runner, I often find myself trying to run at a pace I can’t maintain anymore. Last year, I got shin splints halfway through training for a half marathon because I was pushing myself too hard. In cases like mine, or for runners who are training alone, without others to help set the pace, this projection helps the runner set goals and gives them a visual pace-keeper.

This allows runners to increase their pace constructively by analyzing their steps-per-minute count for every minute of their run, and comparing it to their previous runs and their long-term target pace. The mechanism can then suggest a target pace for their next run. The visual projection of a running animal lets the runner keep their short-term target pace in a simple way: by keeping up with their animal friend.
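One simple way to implement that suggestion step is to nudge the average of recent runs toward the long-term target by a fixed amount. This is just a sketch of the idea; the function name and the 5-SPM step size are assumptions, not the project’s actual calculator:

```javascript
// Suggest a short-term pace goal (steps per minute): take the average
// of recent runs and move it a small, safe step toward the long-term
// target, so the runner isn't asked to jump straight to the target.
function suggestGoal(recentPaces, longTermTarget, step = 5) {
  const avg = recentPaces.reduce((a, b) => a + b, 0) / recentPaces.length;
  // Already within one step of the target: just use the target.
  if (Math.abs(longTermTarget - avg) <= step) return longTermTarget;
  return avg + Math.sign(longTermTarget - avg) * step;
}
```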


  1. draw a couple of running animal animations, for different paces
  2. use geolocation tracking to determine pace (on run)
  3. code pace graph viewer (3-line graph) (off run)
  4. code short-term pace goal calculator (off run)
  5. animal selector based on short-term goal pace (on run)
  6. sync animal run pace and short-term goal pace (on run)
  7. code animal noise for excessive pace deviation (on run)
  8. combine on-run and off-run modes — based on device motion
  9. adjust for phone use and display
  10. find a portable projector
  11. animal projection
  12. create armband including projector and phone holder
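For step 2, pace can be estimated from two GPS fixes using the haversine distance between them. A rough sketch under my own naming assumptions (fixes are `{lat, lon, t}` objects with timestamps in milliseconds), not the project’s implementation:

```javascript
// Great-circle distance between two lat/lon points, in kilometers,
// via the haversine formula.
function haversineKm(lat1, lon1, lat2, lon2) {
  const R = 6371; // mean Earth radius in km
  const toRad = (d) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Pace in minutes per kilometer between two GPS fixes.
function paceMinPerKm(fixA, fixB) {
  const km = haversineKm(fixA.lat, fixA.lon, fixB.lat, fixB.lon);
  const minutes = (fixB.t - fixA.t) / 60000;
  return minutes / km;
}
```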


This project is meant to be lightweight and usable for high-intensity physical activity. I would need a portable projector and materials to create an armband that could carry the projector and a phone while still allowing the animal image to be projected and the phone screen to be visible (fabric, needle, thread).


What I need: Place for laptop, dark hallway or area to walk around in (so people can see the projection)


Poster or storyboard to explain functionality of the equipment and/or wifi connection instructions so that people can work the basic non-projection code on their phones.


It’s a legitimate story, because I wasn’t sure how we were supposed to do these:


It was 00:07. That’s when I usually go for my run, because I don’t really like it when all of my neighbors can watch me try to exercise. That, and I never run into any cars or anything, which is always a bonus. I heard a huge screech at exactly 00:07.23, and I turned around really sharply, like how my karate instructor taught me. That’s when I saw you. You saw me, and you started laughing maniacally before sprinting past me. You were all sweaty and your hair was a stringy mess, and I remember asking myself what you were doing, and then asking you. “Hey, you, what are you doing?” You looked harmless enough, but just in case, I took my fighting stance, one fist to guard, one ready to punch, knees apart. You pulled your earbuds out. “I’m running with Marie Antoinette.” Your voice sounded sweet. But then you put your earbuds back in and started screaming again. At 00:09 I lost sight of you. You had just ducked into the bushes off of Main Street, after nervously looking back over your shoulder a few times.


You don’t have to explain yourself. I saw your watch-gadget. My friend Marc has one of them, he says it’s the best thing that’s ever happened to him, which is saying something. He definitely put on some more muscle since he got it, he says it’s better than playing a video game. He’s totally obsessed. He tried to suck me in, too, he thinks I’d be a natural, because I like history and stuff. I don’t know, the whole ghost thing sort of freaks me out. I’d rather just learn history from books. And I don’t really want the ghost of Benjamin Franklin telling me I need to be going at a faster pace. Plus, I don’t know, I feel like you’d get less checkpoints in the country. There are more creepy things, but they’re sort of spread out, you know? It would be good for sprinters, but not for people who actually want to run long distance. If you’re in the city, and you just walk by a skyscraper you could get, like, five different ghosts right there. They’d all be some plain old minor building celebrity or whatever, not very interesting, but at least it’d be something. And they’d just talk to you instead of chasing you. City folk are lazier than you might think. Marc was saying he wanted to add one. Apparently there was this guy, Keith, who lived in his building who loved birds, like 50 years ago. Local legend says he would squawk at the other residents in the halls, and I think Marc thinks it would be a funny joke for other runners. Like, if they run past his building, they run into Keith who squawks at them if they’re not running fast enough. But he’s too lazy to actually go ahead and get all the community votes to add him in.


You stopped right in front of my house and screamed before running diagonally across the road, really fast. I think you said something like “I’ve got to escape the Leatherman ghost.” I almost called the police, but then I saw you suddenly stop and look at your wrist. It looked like you were dialing a watch, which sounds a little weird, but I don’t think much could’ve been weirder than that screaming bit you had just done. You kind of stopped making noise after you’d played around on your wrist. Anyways, you seemed to enjoy it, because you smiled before leaving my neighbor’s yard, like you were content about something. And you were talking to yourself in low tones. I don’t know, I felt kind of bad. I figured, so what if you’re a little crazy, no reason to put you in jail or anything. I don’t know, I haven’t seen many crazy people in my life, maybe you weren’t actually crazy. Everyone has weird days, right? And some people do believe in ghosts. I don’t know, they’re pretty scary, I’d scream if I thought one was following me.

Exercise 8

For this assignment, I wanted to go back to object programming, because that confuses me more than most things, and I tend to be a generally confused person.

I also decided I was going to use Processing, because I’ve been watching a ton of coding train videos and I got tired of seeing suggestions come up on the side of my screen that look super cool but use Processing and had me thinking “that’s probably going to take me a while, I can’t learn how to use that right now.”


I used a blob detection code that allowed me to track a colour using my computer camera. I then decided I wanted to detect someone’s facial expression by using the shape of their lips. I could then use my millennial/Gen X social media skills to create emojis that reflect typical facial expressions, and flash the related emoji at the user. Using my stellar Illustrator skills and some photo bits and pieces off of the internet, I collaged a couple of emojis. I then used my a-ma-zing object programming skills to decode the code I was using and integrate an analysis of blob sizes and numbers. This took an insane amount of time, because there are just so many objects, but I have it working! It’s super jittery, though, especially between the “shock” face and “happy” face, and you have to be at just the right distance from the camera.
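The two core pieces here can be sketched in a few lines. Everything below is illustrative (my own function names, thresholds, and aspect-ratio cutoffs), not the code I actually used: colour-based blob detection boils down to a per-pixel colour-distance test, and a very crude expression guess can come from the lip blob’s bounding-box shape.

```javascript
// Does a pixel match the tracked colour? Euclidean distance in RGB
// against a threshold, the standard blob-detection test.
function matchesTrackedColor(px, tracked, threshold = 40) {
  const d = Math.sqrt(
    (px.r - tracked.r) ** 2 + (px.g - tracked.g) ** 2 + (px.b - tracked.b) ** 2
  );
  return d < threshold;
}

// Guess an expression from the lip blob's bounding box:
// a wide, thin blob reads as a smile, a tall one as an open mouth.
function guessExpression(blob) {
  const aspect = blob.width / blob.height;
  if (aspect > 3) return "happy";
  if (aspect < 1.5) return "shock";
  return "neutral";
}
```

The jitter I mentioned lives right at those aspect-ratio cutoffs: a blob hovering near one of the boundaries flips between faces frame to frame.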

(I also started thinking: hey, if I can detect colors, maybe I’ll be able to detect light colors my Arduino gives off and have an elementary replacement for my malfunctioning serial control.)


If anyone wants to test this code out: your skin and lip colors will be too similar, and if you mess with the code’s colour threshold you will definitely make it worse, so here’s the easiest fix: put on some bright lipstick. If you’re reluctant to do so, you’re going to have to figure out a way to make your lips change colour. Once you’ve defied the laws of nature (or used someone’s makeup kit), hit the ‘i’ key and click on your lips, then hit the ‘i’ key again.

It’s preset to a bright red, so if you already have a bright red lipstick on, you’re all set.


I sincerely apologize for the weird faces.

code and image files:


Assignment 7: Musical Runs

For this assignment, I chose to make a script that allows a user to select a running pace (in steps per minute), which, in turn, selects a preloaded song whose BPM matches the steps per minute of the user. I then used a pedometer code to run a servo that would rotate at a certain speed based on the steps per minute of the user. In a more thoroughly developed version of this design, the servo would run a music-box-esque display.

My original idea for this was to have the pedometer run all aspects of the project: it would determine the song choice and the display. I also wanted to use the Spotify API to somehow allow a wider range of song choices that would match the pace goal perfectly. Unfortunately, I got too frustrated trying to find a way to use Spotify, and my serial connection didn’t work, so I had to split my project into a computer half and an Arduino half, and give up on the song range. What I ended up creating uses the computer to set a goal, and the Arduino to give the user a visual platform to match the set goal. This way, a user sets their goal via the p5 platform, which sets the song beat, and then has to run at a pace that ensures that the servo rotation matches the song beat. It’s fun to play with. I also added a graphic display that allows the user to see the beat of the song as well as hear it.
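The song-selection half is essentially a nearest-BPM lookup. A minimal sketch, with a made-up song list and function name (my assumptions, not my actual sketch):

```javascript
// Pick the preloaded song whose BPM is closest to the runner's
// steps-per-minute goal.
function pickSong(spmGoal, songs) {
  return songs.reduce((best, s) =>
    Math.abs(s.bpm - spmGoal) < Math.abs(best.bpm - spmGoal) ? s : best
  );
}

// Hypothetical preloaded library.
const songs = [
  { title: "slow jog", bpm: 140 },
  { title: "steady", bpm: 160 },
  { title: "tempo run", bpm: 180 },
];
```

A goal of 165 SPM lands on the 160 BPM track; 175 SPM lands on the 180 BPM one.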

Here are my codes:


Music box movement

And my fritzing sketch:

Exercise 3/23


So essentially, Tactum is software that scans a body part, like your arm, and allows you to 3-D model an object on it using your finger. (Check out the video, it’ll explain things better than I ever could.) It’s a way of using technology and physical interaction as a means of artistic personalization, with functional application.

I found information about Tactum in an article that mentioned it in conjunction with a new 3-D printer developed by the Disney Research team that prints soft fabrics. I think the idea of pairing them is to insinuate that maybe one day we could all just step into our closets in the morning and get a whole new, personally created, perfectly tailored outfit. (Think: Cinderella’s fairy godmother, but 3-D printer style.) This project takes a process that is usually reserved for either huge clothing companies or for select skilled individuals, and hands it to whoever can use this machine. I think that this hand-off is what makes this an interesting project, and what compelled me to immediately think “Ooh, this is cool, I want to try this.”

Assignment 6: Cornflower Field

For this assignment, I wanted to create a sort of minefield using p5 that you could navigate (as a blue dot with a trail) using a gyroscope. I turned the minefield into a calming blue field of dots that don’t explode when you run over them, mostly because I was having enough difficulty creating the player and chose to ignore adding too much detail to the scene.

My main issues with this assignment arose with my serial connection, naturally. I tried to make a complex connection, but after spending ages attempting to make it work, I decided that maybe I needed to try making a more basic interaction and work upwards from there. I was still hopeful that my serial control issues were just due to my own oversights. So instead of trying to create a mouse using a gyro, I chose to start by getting data from a photosensor, which I had used before, and using those numbers to drive the distance that the dot travels every time the player hits the space bar. So essentially, the game would be to manipulate light and shadow on the sensor accurately enough that the player doesn’t run into a friendly blue flower (or mine) in their travels from one side of the screen to the other. The player would also use their arrow keys (up and down) to determine the direction of the player.

Obviously, my problem actually didn’t lie in the complexity of my code, and my use of the photosensor didn’t work out. Instead, I collaged together different random walker codes I found to create one that moves the player at random speed intervals in quick succession. The game is to use the up/down/side keys to navigate the board vertically. This took me a very long time to successfully create, and the result is a little disappointing, but kind of weirdly fun to play with.
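The random-interval walker idea boils down to re-rolling a random speed every few frames and holding it for a short burst. A stripped-down sketch with made-up names and numbers (not my actual collage of codes):

```javascript
// Returns a step function that advances a player's x position by a
// random speed, re-rolled every short burst of frames.
function makeWalker(x = 0, minStep = 2, maxStep = 12) {
  let speed = 0;
  let framesLeft = 0;
  return function step() {
    if (framesLeft <= 0) {
      // Pick a new random speed and hold it for 5-14 frames.
      speed = minStep + Math.random() * (maxStep - minStep);
      framesLeft = 5 + Math.floor(Math.random() * 10);
    }
    framesLeft--;
    x += speed;
    return x;
  };
}
```

Calling `step()` once per draw frame gives the jerky, fast-even-at-its-slowest motion described above: the position always increases, but the rate lurches every time the burst expires.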

Be warned, the player moves very fast, even at its slowest.

Here’s the code:


Assignment 5

My plan was to use the wind turbines in the United States as my data input, and display their locations on a map of the world. I was then going to have the little location dots change colors when clicked (just to get some more object programming in).
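Placing each turbine on the map comes down to projecting its longitude/latitude onto canvas pixels. A hypothetical sketch using the simplest (equirectangular) projection; the map library handles this differently, and the function name is mine:

```javascript
// Project longitude/latitude onto a canvas of the given size using a
// plain equirectangular mapping: lon -180..180 spans the width,
// lat 90..-90 spans the height (top of the canvas is the north pole).
function lonLatToPixel(lon, lat, width, height) {
  const x = ((lon + 180) / 360) * width;
  const y = ((90 - lat) / 180) * height;
  return { x, y };
}
```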

I got the map to appear:

and I can zoom in and out and that’s kind of fun:

but whenever I try to load data onto the page I get this lovely display:

forever and ever.

So here’s my code.

Maps turbines

Assignment 2

It’s running smoothly, so here it is.

I wanted to make a ghost chat which would allow the user to interact with “ghosts” that I programmed to be “detected” by a photosensor. By pushing the button, the user can ask questions and get an answer. I also initially wanted to program a theme song for each ghost that could run in the background or when someone asks for the ghosts’ favorite song, but that took a backseat when I realized how confused I actually was by interrupts. If I were to really make this perfect, I’d probably find a different sensor, and figure out a way to have the ghost answer questions that the user defines, as opposed to pre-programmed answers to pre-programmed questions.
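The detection-and-question loop can be sketched like this, here in JavaScript rather than Arduino code, with made-up names, answers, and threshold values (my illustration of the idea, not the sketch itself):

```javascript
// Pre-programmed answers, cycled through one per button press.
const answers = [
  "I am the ghost of this hallway.",
  "I have haunted it for two hundred years.",
  "My favorite song? The wind in the eaves.",
];

// A ghost is "detected" when the photosensor reading drops below a
// darkness threshold; each button press asks the next question.
function makeGhostChat(darkThreshold = 300) {
  let nextQuestion = 0;
  return {
    ghostPresent: (sensorValue) => sensorValue < darkThreshold,
    ask: () => answers[nextQuestion++ % answers.length],
  };
}
```

On the Arduino side, the button press that drives `ask()` is where the interrupt handling I struggled with would live.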


Assignment 3

It’s still not working, but this is what I have.

I decided to integrate mouse clicking into my assignment 2 ghost chat, and bring the push-button interrupt into the p5js part of the equation, as opposed to leaving it with the arduino. This means that the user should be able to click around on the screen to ask questions. My issue could be that my code is wonky, because I haven’t yet been able to make p5js actually take information from the arduino and print it successfully using my own sketch, or it could be technical:

I thought the problem had solved itself, but it seems that the Beats have spawned more beats, and p5.serialcontrol likes to open that port instead of the usbmodem even after I’ve closed the weird beats port and opened the other one instead. Sometimes it stays with the usb modem for a few minutes, and I can run a few sketches, and sometimes it doesn’t feel the need to do anything I ask, and I wrestle with it for an hour or two.

So here are my sketches:

ghost_chat_p5js 2

and the fritzing: