Motion Capture Final Project

Objective

I aimed to explore capturing dance movements through a Motion Capture (Mocap) system, focusing on understanding its setup and workflow while creating an animation from the captured data.

Process

System Testing:

I used the Mocap Lab in the Hunt Library basement, where ten motion capture cameras are mounted around the space to track movement.

  • Challenges and Adjustments:
    • Calibration was essential for accurate capture; it involved waving a wand with sensors so the system could determine the cameras’ locations.
    • The initial calibration was poor because the system had been neglected.
      • Solution: Adjusted camera positions to improve calibration.
      • Result: Calibration accuracy improved, but hardware issues persisted, making complex motion capture difficult.

     

  • https://drive.google.com/file/d/1kRd9X2ERyBjxDxj7PbBXGGeyQ92Uary9/view?usp=sharing

 

Recording:

I invited a friend, a costume design student from the School of Drama, to perform a short ballet routine for capture.

  • Challenges:
    • Hardware instability led to unreliable data capture.
    • Export of the ballet data was unsuccessful due to system restrictions.
    • Recorded video of the session was preserved for reference.

 

Rigging:

First Attempt:

https://drive.google.com/file/d/117iZ76MnFCeKrLIg_D61js7rksitm501/view?usp=drive_link

Second Attempt:

  • Model: Downloaded a pre-rigged character from Mixamo.
  • Data: Used test data because the ballet motion file failed to export.
  • Outcome: Successfully animated the pre-rigged model with the test data.

Next Steps
  1. Locate the original ballet motion data and reattempt the export process.
  2. Redo rigging and animation with the captured dance motion.
  3. Explore finding a model that better aligns with my conceptual design and, ideally, build a small scene.

 

 

Special Thanks to Sukie Wang – Ballet Performer

Person in Time – Interactive Dance Installation

Introduction:

For this project, I used TouchDesigner to create an interactive installation. Spring-simulated lines are projected onto a wall, and a dancer performs in front of the projection; her movements, especially her hand motions, cause the lines to respond and move.

Workflow:

I used TouchDesigner to create an interactive installation that captures and responds to a dancer’s movements. I created two invisible balls in TouchDesigner that follow her hands. Each ball has a specific size and weight, acting as the force that moves the projected lines on the wall.

(TouchDesigner patch)

To capture her movements accurately in low light, I used the Azure Kinect’s depth-image feature. This depth data feeds into TouchDesigner, where it tracks her hand movements and drives the invisible balls. These balls, in turn, cause the lines to respond dynamically. Each line is made up of multiple points, which I carefully adjusted to create smooth, natural interactions. I also tuned the spring constant so the lines feel fluid as they move.
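The hand-driven spring behavior described above can be sketched as a toy physics loop: a chain of points, each pulled back to its rest position by a spring and pushed away by an invisible ball that follows a hand. This is a minimal sketch in plain Python with illustrative names and constants, not the actual TouchDesigner patch:

```python
# Toy mass-spring line nudged by an invisible "ball" following a hand.
# Illustrative only -- the real piece uses TouchDesigner's spring simulation.

SPRING_K = 0.1      # spring constant: higher = stiffer, snappier lines
DAMPING = 0.9       # velocity damping for a fluid, settling feel
BALL_RADIUS = 3.0   # the invisible ball's radius of influence
BALL_PUSH = 0.2     # force the ball exerts on points inside its radius

def step(points, velocities, rest, ball_x, ball_y):
    """Advance every point on the line by one simulation step."""
    for i, ((x, y), (vx, vy), (rx, ry)) in enumerate(zip(points, velocities, rest)):
        # spring force pulling the point back toward its rest position
        fx = SPRING_K * (rx - x)
        fy = SPRING_K * (ry - y)
        # repulsion from the invisible ball tracking the dancer's hand
        dx, dy = x - ball_x, y - ball_y
        dist = (dx * dx + dy * dy) ** 0.5
        if 0 < dist < BALL_RADIUS:
            fx += BALL_PUSH * dx / dist
            fy += BALL_PUSH * dy / dist
        vx = (vx + fx) * DAMPING
        vy = (vy + fy) * DAMPING
        points[i] = (x + vx, y + vy)
        velocities[i] = (vx, vy)

# a horizontal line of 10 points resting along y = 0
rest = [(float(i), 0.0) for i in range(10)]
points = list(rest)
velocities = [(0.0, 0.0)] * 10
for _ in range(30):
    step(points, velocities, rest, ball_x=5.0, ball_y=0.5)  # hand hovers near the middle
```

Points near the ball are displaced and spring back when the hand moves away, while points outside the ball’s radius stay at rest, which is roughly the interaction the projected lines exhibit.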

     

(System Diagram)

     

(Set-up Picture)

For the final piece, I filmed the interaction with a DSLR camera, capturing how the lines and dancer create a unique, expressive performance.

 

Final Work:

 

Discussion:

I chose an interactive projection for this project because it brings the dancer’s essence—the unique energy and fluidity of her movements—into a visible, tangible form. The interactive element responds in real-time, allowing her movements to directly shape and influence the projected lines. This immediate response captures the intangible qualities of her performance, making the interaction between the dancer and the projection feel almost like a conversation.

This interaction between the physical and virtual spaces adds an extra layer of depth to the performance, as the projected lines become an extension of her movements. The projection also allows viewers to experience the dancer’s expressive style beyond what a simple recording could convey, as her motions actively sculpt the environment around her.

 

Inspiration:

 

Special Thanks:

Performer – Meixin Yu

Assistant – Elvin Jiang

 

 

Interactive Installation – Work in Progress

Concept

I’m planning to make an interactive installation that combines projection and live movement.

 

Workflow

The setup uses TouchDesigner to create an interactive patch, which will be projected onto the wall. A dancer will perform in front of the projection, and I’ll use a Kinect or thermal camera to capture the dancer’s movements in real time, feeding that data back into TouchDesigner so the visuals respond dynamically to the dancer’s motion.

 

Questions for improvement

 

How can I make it more engaging and unique, and more related to the prompt?

I’m still working on elevating the idea: how can I use the captured movement to create a deeper, more immersive interaction between the dancer and the projection? And to relate more closely to the prompt, should the projection respond to specific types of movement, emotional intensity, or a particular body part? I’m considering how the visuals could evolve based on the dancer’s movement patterns, creating a temporal “portrait” that reveals something hidden or essential about them.

 

I’d love suggestions on how to push this concept further.

Looking Outwards 4: Person Over Time

  1. “Boyhood” by Richard Linklater

“Boyhood” is a coming-of-age drama directed by Richard Linklater, filmed over 12 years (2002–2013). It follows Mason Evans Jr. (Ellar Coltrane) as he grows from age six to eighteen, raised by his divorced parents.

Linklater wrote the script year by year, incorporating the actors’ real-life changes into the story. Watching the characters grow up in real time was striking: the film highlights how small, everyday moments shape a person’s identity, making it a powerful reflection on the passage of time.

2. “Following Piece” – Vito Acconci

(Documentation: photos, notes, and a map)

This is a month-long performance art piece by Acconci in 1969 in which he randomly followed strangers through the streets of New York City until they entered a private space. Acconci described the experience as losing his sense of self, becoming almost an extension of the person he was following. It’s an exploration of human behavior over time, with a focus on the mundane and transient nature of public and private space.

 

3. “Underground Circuit” – Yuge Zhou

Zhou has created several pieces centered on the theme of temporal changes, and this is one of my favorites. It’s a collage of hundreds of video clips shot in New York subway stations. “Station to station, the movement of commuters in the outer rings suggests the repetitive cycle of life and urban theatricality and texture.”

 

Temporal Decay Slit-Scanner

Objective:

To compress the decay of flowers into a single image using the slit-scan technique, creating a typology that visually reconstructs the process of decay over time.

I’ve always been fascinated by the passage of time and how we can visually represent it. Typically, slit-scan photography is used to capture fast motion, often with quirky or distorted effects. My goal, however, was to adapt this technique for a slower process—decay. By using a slit-scan on time-lapse footage, each “slit” represents a longer period, and when compiled together, they reconstruct an object as it changes over time.

Process

Why Flowers?
I chose flowers as my subject because their relatively short lifespan makes them ideal for capturing visible transformations within a short period. The changes in their shape and contour as they decay fit perfectly with my goal of visualizing time through decay. Initially, I considered using food but opted for flowers to avoid insect issues in my recording space.

Time-Lapse Filming

The setup required a stable environment with constant lighting, a still camera, and no interruptions. I found an unused room in an off-campus drama building, which was perfect as it had once been a darkroom. The ceiling had collapsed in the spring, so the room is rarely used, ensuring my setup could remain undisturbed for days.

I used Panasonic G7s, which I sourced from the VMD department. These cameras have built-in time-lapse functionality, allowing me to customize the intervals. I connected the cameras to continuous power and set consistent settings across them—shutter speed, white balance, etc.

The cameras were set to take a picture every 15 minutes over a 7-day period, resulting in 672 images. Not all recordings were perfect, as some flowers shifted during the decay process.

Making Time-Lapse Videos

I imported the images into Adobe Premiere, set each image to a duration of 12 frames, and compiled them into a video at 24 frames per second. This frame rate and duration gave me flexibility in controlling the slit-scan speed. I shot the images in a 4:3 aspect ratio at 4K resolution but resized them to 1200×900 to match the canvas size.
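The timing above can be sanity-checked with a little arithmetic (a quick sketch, not part of the original workflow): 672 images at 12 frames each, played back at 24 fps, yield a 336-second time-lapse, with each second of video covering 30 minutes of real time.

```python
# Sanity check of the time-lapse timing described above.
NUM_IMAGES = 672            # one photo every 15 minutes for 7 days
FRAMES_PER_IMAGE = 12       # duration assigned to each still in Premiere
FPS = 24                    # playback frame rate
MINUTES_PER_IMAGE = 15      # real-world interval between captures

total_frames = NUM_IMAGES * FRAMES_PER_IMAGE             # 8064 frames
video_seconds = total_frames / FPS                       # 336 s (5.6 min)
real_minutes_per_video_second = (FPS / FRAMES_PER_IMAGE) * MINUTES_PER_IMAGE  # 30 min
```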

 

Slit-Scan

Using Processing, I automated the slit-scan process. Special thanks to Golan for helping with the coding.

Key Variables in the Code:

  • nFramesToGrab: Controls the number of frames skipped before grabbing the next slit (set to 12 frames in this case, equating to 15 minutes).
  • sourceX: The starting X-coordinate in the video, determining where the slit is pulled from.
  • X: The position where the slit is drawn on the canvas.

For the first scan, I set the direction from right to left. As the X and sourceX coordinates decrease, the image is reconstructed from the decay sequence, with each slit representing a 15-minute interval in the flowers’ lifecycle. In this case, the final scan used approximately 3,144 frames, capturing about 131 hours of the flower’s decay over 5.5 days.
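The grab-and-copy loop can be sketched in plain Python using the variable names above. The frames here are synthetic 2D lists standing in for video frames; in the actual Processing sketch they come from the video playback:

```python
# Slit-scan over a frame sequence: every nFramesToGrab-th frame contributes
# one 1-pixel-wide column, drawn right to left across the output canvas.
WIDTH, HEIGHT = 8, 4
nFramesToGrab = 12   # grab a slit every 12 frames (= one 15-minute interval)

def slit_scan(frames):
    canvas = [[None] * WIDTH for _ in range(HEIGHT)]
    x = WIDTH - 1            # draw position on the canvas, moving right to left
    sourceX = WIDTH - 1      # column pulled from each grabbed frame
    for i, frame in enumerate(frames):
        if i % nFramesToGrab != 0 or x < 0:
            continue         # skip frames between grabs, stop when canvas is full
        for y in range(HEIGHT):
            canvas[y][x] = frame[y][sourceX]
        x -= 1
        sourceX -= 1
    return canvas

# synthetic "video": every pixel of frame i holds the value i
frames = [[[i] * WIDTH for _ in range(HEIGHT)] for i in range(WIDTH * nFramesToGrab)]
result = slit_scan(frames)
```

Reading the output left to right gives the latest grabbed frame first and the earliest last, which is how the right-to-left scan reconstructs the decay sequence.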

Slit-Scan Result:

  • Hydrangea Right-to-Left: The scan proceeds from right to left, grabbing one slit every 12 frames of the video, each pulling a moment from the flower’s decay. The subtle, gradual transformation is captured in a single frame, offering a timeline of the flower’s life compressed into one image.

Expanding the Typology

I experimented with different scan directions and speeds to see how they changed the visual outcome. Beyond right-to-left scans, I tested left-to-right scans, as well as center-out scans, where the slits expand from the middle of the image toward the edges, creating new ways to compress time into form.

  • Hydrangea Left-to-Right/ every 6 frames

  • Hydrangea Center Out: This version creates a visual expansion from the flower’s center as it decays, offering an interesting play between symmetry and time. The top image scans every 30 frames and the bottom image every 36 frames, making for an interesting comparison between the two scanning speeds.

  • Sunflower Center Out/ every 30 frames
  • The sunflower fell out of the frame, creating these stretching, warping effects.

  • Roses Left-to-Right/ every 12 frames

  • Roses Right-to-Left/ every 18 frames
  • While filming the time-lapse of the roses, they gradually leaned toward the camera over time. Due to limited space and lighting, I had set up the camera with a wide aperture and shallow depth of field, and I didn’t anticipate the fall, which resulted in some blurry footage. However, I think this unintended blur adds a unique artistic quality to the final result.
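The center-out scans above change only where each grabbed slit is drawn: two draw positions start at the canvas midpoint and step toward opposite edges. A minimal sketch of that indexing (a hypothetical helper, not the actual Processing code):

```python
# Center-out slit placement: each grabbed frame fills a pair of columns,
# starting at the middle of the canvas and expanding toward both edges.
WIDTH = 8

def center_out_positions(width):
    """Yield (left_x, right_x) draw positions expanding from the middle."""
    left = width // 2 - 1
    right = width // 2
    while left >= 0:
        yield left, right
        left -= 1
        right += 1

positions = list(center_out_positions(WIDTH))
```

The earliest grabbed slits land in the middle of the image and the latest at the edges, which is why the flower appears to expand outward from its center as it decays.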

Conclusion

By adjusting the scanning speed and direction, I was able to create a variety of effects, turning the decay process into a typology of time-compressed visuals. This method can be applied to any time-lapse video to reveal the subtle, gradual changes in the subject matter. Additionally, by incorporating a Y-coordinate, I could extend this typology even further, allowing for multidirectional scans or custom-shaped images.

One challenge was keeping the flowers in the same position for 7 days, as any movement would result in distortions during the scan. Finding the right scanning speed also took some experimentation—it depended heavily on the decay speed and the changes in the flowers’ shapes over time.

Despite these challenges, the slit-scan process succeeded in capturing a beautiful, visual timeline. It condenses time into a single frame, transforming the subtle decay of flowers into an artistic representation of life fading away. This project not only visualizes time but also reshapes it into a new typology—a series of compressed images that track the natural decay of organic forms.

Typology Machine Work-in-Progress

Concept:

I’m interested in recording fruits or food decaying in time-lapse and transforming them into a slit-scan format for the final works.

 

Slit-Scan:

Definition: Slit-scan is a photographic technique that captures a subject through a narrow moving slit over time, resulting in a variety of image effects.

I’m choosing this imaging technique because I believe it’s a creative and effective way to represent the concept of time; slit-scanning is often used to capture motion or time-lapses.

Slit scan - Interactive installation :: Behance Myrto Amorgianou

Experiments in Slit-scan Photography : 7 Steps (with Pictures) - Instructables

 Alvaro & Ishikawa

James (Jung-Hoon) Seo

 Martin Hilpoltsteiner

 

Process Plan:

Set up a shooting booth with multiple iPhones to capture time-lapse videos or images of various fruits and foods decaying over a 1-2 week period. Then, apply post-production processes to create the slit-scan effect.

 

Questions to resolve:

  1. The post-production technique: Coding, After Effects, or 3D photogrammetry? This will also determine whether I should capture the process using video or still image formats.
  2. The type of slit-scan effect: Strip, video cube, or subject-shaped?

 

Thanks: https://flong.com/archive/texts/lists/slit_scan/index.html

Pocket Postulating

Three different captures with my phone: a 3D scan of the Nittany Lion using “Reality Composer,” a time-lapse documentation of a design meeting, and a panoramic shot of the CFA 1st floor.

 

https://drive.google.com/drive/folders/1S4NhZlCdrzQiO-TE5a7NmOMKjlGyWXks?usp=sharing

Raman spectroscopy

I find C.V. Raman’s observation and measurement of inelastic light scattering, known as Raman spectroscopy, really interesting. This technique shows how light interacts with materials at a molecular level by picking up small shifts in the light’s wavelength as it scatters off a surface. While researching the method, I found that it’s mainly used in science to study molecular composition. However, like the “bar code” it creates (as shown in the article), I believe it has many potential creative and artistic applications. The method is especially fascinating because it measures not the direct reflection but the scattering of the light. By capturing the unique “color shifts” or scattering patterns that different materials produce, we could turn these small changes into visual art, revealing a new way to see the physical properties of different materials.

Star-Finding App and Camera Lens Simulation App!

I’d like to share two of my favorite apps that enhance your phone’s capture capabilities:

Stellarium: This is an AR app that helps you identify stars and constellations in real time by simply pointing your camera at the sky. It’s particularly enjoyable when you’re camping or at the beach at night, gazing at the vast expanse of sky and wondering about the stars you’re seeing.

Cadrage Director’s Viewfinder: This app simulates various cameras and lenses using only your phone’s camera. It’s very helpful for pre-production work, allowing you to preview frames before actual on-set filming. You can also create shot lists with the app.

Dan Hoopert – What Does a Tree Sound Like?

Audio Synthesis: What Does a Tree Sound Like?

“Here a single beam of light scans from left to right, creating a 2D cross section of the shape. Using the area of this cross section MIDI data is generated using Houdini and CHOPS. This can be fed into any DAW providing a base for content driven sound design. Field recordings from the object’s natural environment triggered by this data allows a close relationship between the light and sounds that are created, completely unique to the chosen object.  ”

This artwork was created by London-based 3D artist Dan Hoopert. It’s a data-driven piece exploring the relationship between visuals and sound, using photogrammetry to recreate a tree in 3D space. Silent objects from the real world are given voices in the virtual realm, yet with unnatural, electronic sounds that create an intriguing sense of conflict. Data from the scanning is also visualized through particles that follow the beam of light. I’m especially captivated by the surrealist imagery of the work.