I won’t lie, this particular assignment didn’t really speak to me, hence the basic choice of capture techniques. I more or less tried to integrate capture into my pre-existing plans for the weekend, most clearly visible in my time lapses, which are also the most boring. With the panos, my hope was to walk around an object in a circle and get the real-world item’s “texture,” in the video-game sense of the word. My phone would not allow me to do this, so I just tried to abuse the software as hard as I could. Finally, for the slow-mos, I wanted to capture the ubiquitous (I hope) experience of dropping your phone on your face when you’re drowsy. I have always struggled to give candid reactions while aware of being filmed, so I actually quite enjoyed making this capture: the panic of Object Coming at Face overrides my camera-shyness, allowing for far more candid film.
I wanted to choose subjects that focused on constant movement—not just capturing my own movement, but also the motion of the world around me. Specifically, I aimed to document the flow of everyday life using different techniques. There’s something fascinating about speeding up or slowing down the mundane—the things we do every day without often appreciating them. Every interaction and emotion we experience becomes part of that unnoticed routine.
Time Lapse – Roundtrip walk to CVS for Cough Medication
The first capture is a time-lapse of my roundtrip walk to CVS to pick up some cough medication. I thought it was interesting because, with each new street, a different scene unfolds. I chose this subject because I find time-lapse tricky to keep engaging—if it doesn’t feel like the scenery is changing, the footage can seem stagnant, which becomes boring quickly. While I enjoyed the overall result and the various scenes along the way, the camera focus was slightly off throughout the video, and I couldn’t adjust it, so the footage wasn’t as sharp as I’d hoped. I also realized I’m quite a bumpy, uneven walker!
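As my simplified mental model of what a time-lapse mode is doing (this is just a sketch with synthetic NumPy frame arrays, not the phone app’s actual code): it keeps only every Nth frame, so the kept frames played back at normal speed run N times faster.

```python
import numpy as np

def timelapse(frames, keep_every=30):
    """Time-lapse as temporal subsampling: keep every Nth frame.

    frames: array of shape (num_frames, H, W, 3).
    With keep_every=30, 30 fps footage played back at 30 fps
    appears sped up 30x.
    """
    return frames[::keep_every]

# Tiny synthetic "video": 300 frames of 4x4 RGB noise.
video = np.random.randint(0, 256, size=(300, 4, 4, 3), dtype=np.uint8)
lapse = timelapse(video, keep_every=30)
print(lapse.shape[0])  # 300 frames reduced to 10
```

Slow-mo is the same idea in reverse: record at a high frame rate, then play the frames back at a normal rate.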
Slo Mo – Monday Morning CMU Rush Hour
The second capture is a slow-mo of people walking around me. After watching life sped up, I wanted to see how it looked slowed down. We’re always living life at such a fast pace that we rarely slow down to notice the details—exactly what I wanted to capture. I filmed during the busiest part of the day, when students rush to their next class. Slow-mo is great for picking up small, intimate details, but I forgot just how slow it really is. I caught a funny moment of a freshman struggling with his backpack strap, clearly wrestling with it in slow motion (0:25). Another moment, where the sun completely washes out the person in front of me, made it look like they were walking into heaven (2:10). I loved observing each person’s reactions and interactions—there’s something fascinating about seeing these small, often unnoticed moments up close. It highlights actions we might hope go unseen, but in slow motion these movements become more pronounced and scrutinized. As a critique, my footage was a bit shaky, and I focused too much on the person in front of me, missing some great interactions happening elsewhere.
Skeletal Tracking – Monday Morning CMU Rush Hour
Finally, sticking with the theme of people and movement, I experimented with the Ghost Vision app, which uses skeletal tracking to detect human figures in real time. I was curious to see how well it could capture motion and wanted to test its limits in terms of the number of individuals it could track. When I tried recording horizontally, it didn’t work, but switching to vertical mode captured almost everyone around me impressively well. It was fun to see how accurately it outlined people walking past. Although the heat-sensing features and other gadgets cost extra, I’d consider paying the $1.99 if I needed those functions. But even without them, I was really impressed—it was very accurate with its tracking!
iNaturalist is a mobile app that uses image recognition technology to identify the plants and animals around you. I snapped a photo of two plants in my apartment: a basil plant and what I learned is known as a Mother-in-Law’s Tongue! The app identified the plants and offered information about them.
Ghost Vision uses machine vision to detect human figures in real time, providing skeletal data for the person in view. It can even detect multiple people at once. I snapped a picture of myself in the mirror to try this out!
ZIG SIM exposes data from many of your phone’s different sensors, letting you obtain a wide range of metrics. You can measure things like touch radius, pressure, mic level, GPS, etc. Below are screenshots of me trying out the gravity and compass features!
This was the app I wrote about in Looking Outward 3, and I had used it to make some incomplete scans of myself. The scans below of stuffed animals are still not great conventionally, but I found that rotating the object itself works better than moving the phone around it if you’re scanning alone. The app works for human figures if there’s someone else there to scan you, and for small objects in a clean environment. The app does a decent job of capturing the fur. In the second scan of the fox, I was holding it by the head, and the scanner captured some of my hand and clothing. It makes the object look like a planet with satellites floating around it. For projects, we may want cleaner scans.
Slow Shutter
I used the camera’s two different modes: motion blur and light trail. Good subjects would be fast-moving objects (or just moving ones, since the shutter speed and blur strength can be changed); for the light trail mode, glowing objects are the optimal choice.
The method really depends on whether the subject is fixed or in motion. If the object is moving, then a fixed camera can produce the effect of only the subject being in motion while everything else stands still (as in the shots of me shaking my hair and the bear moving around). In the light trail mode, where I photographed my light, I moved the camera instead, since the light itself doesn’t move.
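A rough sketch of how I understand the two modes to differ under the hood (this is just my own simplified model using synthetic NumPy frames, not the app’s actual code): motion blur is close to averaging a stack of frames, while a light trail is closer to a per-pixel maximum, so a bright light leaves its full trace instead of being averaged away.

```python
import numpy as np

def motion_blur(frames):
    """Motion-blur mode: average the frame stack, like a long exposure
    of an evenly lit scene -- moving things smear, static things stay sharp."""
    return np.mean(frames, axis=0).astype(np.uint8)

def light_trail(frames):
    """Light-trail mode: per-pixel maximum, so a glowing subject keeps
    full brightness everywhere it passed."""
    return np.max(frames, axis=0)

# Synthetic clip: a single bright pixel sweeping across a dark 1x5 strip.
frames = np.zeros((5, 1, 5, 3), dtype=np.uint8)
for t in range(5):
    frames[t, 0, t] = 255  # the "light" sits at position t in frame t

print(light_trail(frames)[0, :, 0])  # full-brightness trail: [255 255 255 255 255]
print(motion_blur(frames)[0, :, 0])  # dim smear: [51 51 51 51 51]
```

This is also why glowing objects suit the light-trail mode: the max-blend only preserves things that are brighter than the background.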
Ghost Vision
I think the SLS camera estimates how far away an object is (LIDAR) and detects person-shaped things using an AI model (the SLS part?), marking them with stick figures. It is slightly creepy, but the ultimate effect is that it kept marking my shelf and other things in my room as a “ghost.” It would be interesting if the stick figures could be manipulated, but I did not find such a function in the app. Perhaps a better LIDAR camera that could record more details would be good for creating nuanced, shaded recordings/photos of messy environments.
I decided to experiment with the slit scan, long exposure, and panorama apps. I used the same subject for all the experiments, the face on this dog toy. I thought it would be fun to use a dog toy since I was trying to get trippy pictures.
For the slit scan camera, I learned that the subject needed to be moving to capture it properly. I experimented with moving it in different directions and learned that if I went the opposite direction of the scan, I could get a clearer picture of my subject.
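This makes more sense when you see what a slit scan is doing: each vertical strip of the output comes from a different moment in time, so a static subject just repeats and a moving one gets warped. Here’s a minimal sketch with synthetic NumPy frames (my own simplification, not the app’s code):

```python
import numpy as np

def slit_scan(frames):
    """Slit scan: output column x is copied from frame x's column x,
    so time is laid out left-to-right across the final image."""
    n, h, w, c = frames.shape
    cols = min(n, w)
    out = np.zeros((h, cols, c), dtype=frames.dtype)
    for x in range(cols):
        out[:, x] = frames[x, :, x]
    return out

# Synthetic clip: frame t is a solid gray level t*10, so the scan
# becomes a left-to-right gradient -- each column is a different instant.
frames = np.stack([np.full((4, 6, 3), t * 10, dtype=np.uint8) for t in range(6)])
scan = slit_scan(frames)
print(scan[0, :, 0])  # [ 0 10 20 30 40 50]
```

Moving the subject against the scan direction changes how quickly it sweeps past the slit, which is why the direction of motion changes how stretched or compressed it looks.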
I also tried the panorama camera, but I didn’t really like the results when I went left to right, so I ended up turning my phone during the pano instead.
Finally, I tried a long exposure cam, and I moved the mushroom toward the camera to get the final image (which is honestly my favorite of the bunch).
I experimented with slow mo, timelapse, and light painting (long exposure) techniques to capture fire in different ways. For the slow mo video, I placed my phone on a stand, held a lighter, and waved it in front of the camera. In the timelapse, I lit a candle and recorded the wax melting into liquid over 30 minutes. For the light painting with long exposure, I kept my phone on a stand and moved the lit candle around, with each photo having an exposure time of about 30 seconds.
I was interested in trying the “slit scan” app, which I quickly realized makes for interesting effects when capturing something in motion (not as interesting when there isn’t movement). What’s interesting about capturing motion is that, as a photographer, you have to decide whether to move the camera, keep the camera still and capture something that is moving, or both. I had been working on group projects on campus all day and was having trouble thinking of a subject or set of subjects whose motion I could capture. I tried capturing people moving around me, but that didn’t seem like a cohesive set, so I decided instead to capture myself in movement, since I could be a clear and consistent subject. In the slit scan selfies below, I took selfies while walking around different places on campus. While I’m not the most avid selfie-taker, I do think that taking selfies this way allowed for a wider display of my different sorts of dispositions—of course my face and its expression are distorted in these, but I think the variety of expressions you can see here is greater than it would have been if I had just taken regular selfies. I’m planning to try slit scan again on other objects in motion. Maybe tomorrow if I go for a run, I’ll take a pause and use it to capture some of the runners around me from a fixed spot.
Tomorrow I am also planning to take some panoramic photos as well as timelapse photos. For the timelapse photos, my first idea is to capture drinks or dishes—foods and liquids being consumed. Not sure if this is a subject I can capture in a day, but I will try! For the panoramic photos, I’m thinking my subject could be the streets I walk along on my way to campus. ~till tomorrow!
While walking to class, I became interested in the motion of various shadows. The wind was subtle, so I wanted to exaggerate and speed up the motion by taking a timelapse. These photos were more interesting to me than the panoramic photos I took (where I was trying to catch the same car from two views). Above is a sample of the flickering plant shadows, and below are a couple of my failed car panoramas.