shrugbread- microscopic scanning pipeline

For my final project I created a pipeline for obtaining super high-res, stitched-together images from the studio’s Unitron zoom microscope.

The main components of my capture setup are the AxiDraw A3, a custom 3D-printed build plate, and the Unitron Zoom HD microscope.

These items are all connected to my laptop, which runs a Processing script that moves the AxiDraw build plate along a predetermined path, stopping to capture the microscope image at each point. With this code I have full control over the variables that move the AxiDraw arm and can make scans with a specific number of images, image density, and capture interval. (This code was given to me by Golan Levin and was edited in great collaboration with Himalini Gururaj and Angelica Bonilla.)

The counter above displays the progress through the scan, and below is an output window for checking the focus and placement of the mounting plate.

 

This is a simplified version of how the robot path is shaped.
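That serpentine path is easy to sketch in code. Below is a minimal Python sketch of the idea (my own illustration, not the actual Processing script): generate a zigzag grid of stop points, then move and pause at each one before grabbing a frame. `move_to()` and `capture_frame()` are hypothetical stand-ins for the AxiDraw and microscope interfaces.

```python
import time

def move_to(x_mm, y_mm):
    # Hypothetical stand-in for the AxiDraw move command; the real
    # script drives the plotter from Processing.
    pass

def capture_frame(index):
    # Hypothetical stand-in for saving the microscope's current frame.
    pass

def serpentine_path(cols, rows, step_mm):
    """Return (x, y) stop points in boustrophedon (zigzag) order so the
    bed never makes a long return sweep between rows."""
    points = []
    for row in range(rows):
        xs = range(cols)
        if row % 2 == 1:  # reverse travel direction on odd rows
            xs = reversed(xs)
        for col in xs:
            points.append((col * step_mm, row * step_mm))
    return points

def run_scan(cols, rows, step_mm, settle_s=1.0):
    for i, (x, y) in enumerate(serpentine_path(cols, rows, step_mm)):
        move_to(x, y)
        time.sleep(settle_s)  # let vibration settle before capturing
        capture_frame(i)
```

The pause before each capture matters: the bed vibrates briefly after every move, and capturing too soon blurs the frame.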

From here I put the output images from the Processing program into a piece of software called GigaPan Stitch, which aligns and merges the edges of each photograph. Each photograph needs some amount of overlap for the software to be able to detect and match up edges.
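As a rough sketch of the overlap math (my own illustration, not part of GigaPan): if each capture covers a known field of view, the step between captures shrinks by the overlap fraction, which in turn tells you how many captures a plate of a given size needs along each axis.

```python
import math

def step_size(fov_mm, overlap_frac):
    """Distance to move between captures so that adjacent images
    share overlap_frac of their width for the stitcher to match."""
    if not 0 <= overlap_frac < 1:
        raise ValueError("overlap fraction must be in [0, 1)")
    return fov_mm * (1.0 - overlap_frac)

def captures_needed(plate_mm, fov_mm, overlap_frac):
    """Number of captures along one axis needed to cover plate_mm."""
    step = step_size(fov_mm, overlap_frac)
    return max(1, math.ceil((plate_mm - fov_mm) / step) + 1)
```

For example, covering a 100 mm plate with a 10 mm field of view at 50% overlap takes 19 captures per axis; more overlap means more images, but it does not by itself add detail.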

The stitched images then go straight to the GigaPan website, where they are hosted and accessible to all.

Here are some I have made so far with this method.

http://www.gigapan.com/gigapans/231258

http://www.gigapan.com/gigapans/231255 

http://www.gigapan.com/gigapans/231253

http://www.gigapan.com/gigapans/231250

http://www.gigapan.com/gigapans/231241

http://www.gigapan.com/gigapans/231256

http://www.gigapan.com/gigapans/231259

Here’s a timelapse of a scan I couldn’t use:

This process opens up scanning to a host of subjects that a flatbed scanner may not be able to capture, such as objects with unusual topology and subjects that should not be dried out. I also believe that with proper stitching software I will be able to record images with density and DPI beyond a typical flatbed scanner.

I have a lot I would change about my project if I were given a chance to restart or work on it further. I didn’t have enough time to give software similar to GigaPan a shot, and I think in many ways GigaPan is a less-efficient, data-capped method for image stitching than what I need. I also wish the range of objects I could capture did not have so many stipulations. At the moment I am only able to get a small sliver of the plane of my subject in focus, where I would have preferred to stitch together different focal planes. The subjects also have to be rigid enough to not deform or slosh around during AxiDraw bed transit. With this project I am super excited to get a chance to photograph live mushroom cultures I have been growing and other biomaterials that do not scan well in a flatbed scanner. I want to utilize this setup to its fullest potential and be able to use the microscope near maximum magnification while keeping the result viewable.

This project really tested my observational, and hence problem-solving, skills. There were many moments where, after viewing my exported images, I was able to detect an issue in the way I was capturing that I could experiment with, such as realizing that increasing the amount of overlap between images doesn’t increase DPI unless I am actually increasing my magnification strength, or realizing that with a zigzag path half of my images were being stored in the wrong order.
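The zigzag-ordering problem has a simple fix that can be applied before stitching: reverse every other row of frames so they end up in plain row-major order. A minimal sketch (my own, assuming the frames are listed in capture order):

```python
def unzigzag(frames, cols):
    """Put frames captured on a zigzag path back into row-major order
    by reversing every odd-numbered row before stitching."""
    rows = [frames[i:i + cols] for i in range(0, len(frames), cols)]
    for r, row in enumerate(rows):
        if r % 2 == 1:  # these rows were captured right-to-left
            row.reverse()
    return [f for row in rows for f in row]
```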

Once again, a very big thanks to all of the people who helped me every step of the way with this project. Y’all rock.

 

Scanning microscope pipeline proposal

For my final project, I plan on creating a pipeline for capturing and stitching hi-res image segments of the contents of a 100mm x 100mm petri dish. After I record a video from the studio’s HD microscope, I plan to use panorama-stitching software to create a large image that you can zoom in on to view each segment of the petri dish in HD, and to host that on my website so it can be viewed and appreciated by anyone remotely. For reference, check out: http://gigapan.com/

Visibility Through Voice

Visibility Through Voice is a Kinect-based capture setup that draws the silhouette of a participant when they speak. I used TouchDesigner as the interface between Kinect, camera, and microphone, and explored it as a tool for manipulating the relationship between vocalization and visibility. I spoke lines from a random text generator and walked in and out of the bounds of the Kinect to make these paintings.

Videos with camera view!

 

Videos without camera view

 

I pulled off this effect mainly with the CHOP reference feature in TouchDesigner. It allowed me to take volume and pitch data from the microphone I was using and have it update the brightness of my Kinect data in real time. Once I had the audio-reactive capability set up, I created a feedback loop that retains the brightness from the prior frame.
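The logic of that network can be sketched outside TouchDesigner. This is a hypothetical Python analogue of the CHOP-driven feedback loop, not the actual network: pixels inside the Kinect silhouette are lit in proportion to mic volume, and each pixel keeps the brightest value it has ever held.

```python
def update_frame(prev, silhouette, volume, decay=1.0):
    """One step of the feedback loop: silhouette pixels light up in
    proportion to mic volume, and each pixel keeps the brightest value
    it has held (no fading when decay=1.0)."""
    out = []
    for p, inside in zip(prev, silhouette):
        lit = volume if inside else 0.0
        out.append(max(p * decay, lit))
    return out
```

Setting decay below 1.0 would make old traces fade instead of accumulating forever, which is the long-exposure look the piece is after.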

While I was very interested in the silhouette aspect of my project, I was mainly inspired by the visuals of long-exposure video and the work of David Rokeby, who uses a similar technique in “Plot Against Time.”

I started this project knowing I wanted to work and grow in TouchDesigner. The conceptual framework grew around tests in audio reactivity. This project was mainly the result of a handful of experiments: I tried many different types of projection, recording, and methods for producing things in TouchDesigner, as well as getting lots of help from the people around me. The only thing I wish I had time to fix is to connect the audio reactivity to opacity rather than brightness, so that I don’t need to interact with new pixels to create color.

 

 

shrugbread-Person in Shadow

For my Person in Time project, I plan on creating a silhouette capture that only records your shadow when you’re producing a noise.

I have really been inspired lately by KN on YouTube. The video ‘ghosty’ is my favorite and the biggest inspiration for this project.

ig: @frog_spit_simulation

TouchDesigner has multiple ways of doing audio-reactive visuals, so my main idea right now is to project the Azure Kinect’s point cloud and have the scale of the point cloud controlled by a microphone waveform. This idea is very subject to change or be redone entirely. I am very open to adding other elements that support the conceptual lenses I am using in this piece.

 

shrugbread- Cemetery Mushrooms

In this project I created a typology of spore prints from wild mushrooms foraged in Homewood cemetery, and displayed them with the names of the buried they were found closest to.

My question was to see if I could capture the environmental impact of human practices around death. This question became a typology of how to represent life coming out of death.

Mushroom life cycle — Science Learning Hub

In the life cycle of a mushroom, the purpose of the fruitbody stem/cap shape is to drop spores from gills on the underside of the cap. Almost all mushrooms spread hundreds of thousands of small spores to be carried by the wind and reproduce away from the fruiting body. Spore printing takes advantage of this unique method of reproduction.

How to: Make Spore Prints - Milkwood

Spore printing is done by cutting off the stem, covering the mushroom cap, and letting it sit for 2-24 hours as the spores fall. Spore printing is mainly used by foragers for mushroom identification, as the color of a mushroom’s spores may be one of the only differences between an edible mushroom and a poisonous lookalike. Spore printing is also a tool for archiving and preservation, as the spores can lie dormant for years in a well-maintained spore print, ready to restart the life cycle of the fungus.

I took inspiration from the cyanotype photography of Anna Atkins and the snowflake photomicrography of Wilson “Snowflake” Bentley. Both deal with very delicate natural phenomena. I find a specific kinship with Bentley because of the delicate nature of both spore prints and snowflakes: in order to capture them you have to disturb them as little as possible, and a single finger smudge can completely ruin any image you’re trying to create. I also find kinship with Anna Atkins’s photography, as it is specifically about the relationship to ecology and setting. Mushroom and plant species vary across the world, but only in western Pennsylvania will you find this specific set of mushrooms.

Anna Atkins | Spiraea aruncus (Tyrol) | The Metropolitan Museum of Art

Snowflakes: Wilson Bentley’s Civil War | Harvard Art Museums

The hardest part of this project was organizing my assets. I lost track of which prints were produced by which mushrooms multiple times; had I set up a system of documentation early on, I could have saved myself a lot of pain later. Another challenge with the spore-printing method is that it depends on the age of the mushroom collected. A mushroom picked too young won’t drop its full set of spores, while a mushroom picked too old won’t show up on the paper at all. I ended up gathering twice as many mushrooms as spore prints.

 

I missed many opportunities for a more refined and full typology by being almost entirely focused on foraging and scanning. I didn’t have enough time to consider presentation and storytelling fully, and finding a connection between the prints and the gravesites remains a challenge. Despite this, I gained a strong connection to the land in Pittsburgh, as well as connections with other mushroom hunters here.

 

 

shrugbread- Typologies Proposal

I plan on capturing spore prints from mushrooms found in Homewood cemetery. Spore prints are an almost self-capturing method, used mainly to identify mushrooms by the color of the spores dropped from the cap. Spore prints require no extra developing and reflect information directly from the mushroom.

I plan on using the spore prints in conjunction with data collected at the site of retrieval to contextualize where each mushroom was growing in relation to gravesites and the trees it was growing on: trying to capture the final reproductive stage of one organism next to a place of rest.

shrugbread-SEM

For this trip to CBI’s scanning electron microscope I brought in a couple of flakes of fish food, hoping to see tiny bits of organic matter preserved in the flat flake. This project is how I learned that fish food is only partly finely powdered “fish meal,” with the rest of the concoction just being a soup of wheat, lecithin, and water.

I opened with this image, and it was promising to find really interesting caves and craters to explore.

But due to how thin and shallow the craters are, they stop showing much clarity. This was still incredibly insightful! The most fun part was using the controller that sets your framing and zoom; it’s a very different frame of mind from digital photography.

shrugbread- Reading 1

I have studied the history of photography a bit, but had never come across photography’s involvement with the School of Military Engineering at Chatham and with Abney. Learning about the extensive specialization and attention to detail that many photographers went through alongside various other kinds of scientists helps me wrap my head around the idea of photography as a measurement tool rather than photography as expression. Both in the US and internationally, militaries have been some of the most influential and leading entities when it comes to researching and inventing new tech, so it does not surprise me that the British military put so much effort into photography as a recording tool for expeditions like recording the transit of Venus.

I am interested in the artistic potential of using population and statistical data with visuals created or processed in TouchDesigner. I have experience in storytelling through worldbuilding and imagining futures for the contemporary world, but I am more interested to see if I can build a framework or tool that gives information to predict and contextualize changes in human life. I am still figuring out what different types of capture I can engage in.

shrugbread-Looking Outwards

Pigeon Blog (2006)

Pigeon Blog is a bioart project by Beatriz da Costa that attempts to retrieve and map accurate air-pollution data by attaching air-quality sensors to the backs of homing pigeons. The data was then plotted on a map that showed flight paths as well as changes in air quality. The project was launched to bring more awareness to the issue of air pollution in southern California, as well as to inspire more people to take science into their own hands with DIY tech and grassroots activism.

I found that this project was one of the closest things in my CMU education so far to the prompt of experimental capture. What I like most about it is that the choice to use pigeons is incredibly important to the data collection: their mobility through cities and forgiving nature as trainable animals make them the perfect vehicle for the air-quality sensors, whereas other means would be too slow or call attention to themselves.