Many devices we saw Monday in class were ones I had never thought of as capture devices for creating art, including medical equipment like the ultrasound transducer we experimented with. This inspired me to research medical equipment, and here’s a short list of common technologies we have for scanning brain images, which I found very interesting:
CT (Computed Tomography): An X-ray-based scan that beams X-rays through the head, producing a picture that looks like a horizontal slice of the brain.
MRI (Magnetic Resonance Imaging): These scans construct an image of the brain by passing a magnetic field over the head. Hydrogen nuclei in the brain react to this magnetic field by resonating and sending a signal back. The scanner records these signals and turns them into a highly accurate image of the brain.
PET (Positron Emission Tomography): PET involves injecting a small amount of radioactive material into the body, which then accumulates in the brain. The scanner detects this radiation to create images that highlight areas of functional activity, producing a multi-color image of the brain that resembles a heat map.
PET technology is particularly interesting due to its ability to visualize brain activity, which I think could be used to create dynamic, time-lapse pieces representing changes in brain activity over time. For example, changes in brain activity during different emotional states could be visualized and translated into a series of animations.
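As a tiny illustration of the heat-map idea, here is a minimal sketch (entirely my own, with made-up values) of the false-color step such a piece would need: mapping a normalized activity level onto a blue-to-red scale, the way PET renderings assign color to activity.

```python
def activity_to_heat_rgb(activity, max_activity=1.0):
    """Map a normalized activity value to a simple blue-to-red
    heat-map color: low activity = blue, high activity = red."""
    # Clamp to [0, 1] so out-of-range readings don't wrap colors.
    a = max(0.0, min(activity / max_activity, 1.0))
    red = int(255 * a)
    blue = int(255 * (1.0 - a))
    return (red, 0, blue)

# A hypothetical frame of activity values, one per brain region:
frame = [0.1, 0.5, 0.9]
colors = [activity_to_heat_rgb(v) for v in frame]
```

A time-lapse piece would simply recompute these colors frame by frame as the activity values change.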
The project Breeze is a device created by the artist to capture and visualize wind using wind data and a robotic arm. Developed during the COVID-19 lockdown, the artist sought to bring the outdoor weather into an isolated indoor space. I find the idea of visualizing wind through the swaying of plants particularly compelling. The use of a typically indoor plant to represent outdoor wind is an interesting contradiction, reflecting the artist's experience of isolation during the lockdown. This parallel between the indoor plant and the outdoor wind highlights the tension between lockdown and the natural world. Since this project involves just a single device, I think it would be even more impactful if multiple devices were used together, potentially forming an indoor garden. This might create a more immersive experience. This project kind of reminds me of Janet Echelman's installations, as they both attempt to capture the forms and directions of air and wind.
The project “Eternal Blue” by Richard Vijgen visualizes malicious packets caught by the university’s firewall in real time, inspired by a significant cyberattack that the university experienced. I find this project fascinating because it reveals the often unseen danger of cyberattacks, which can have severe consequences without our awareness. The concept of making the invisible visible is powerful and thought-provoking.
The artist visualizes the country of origin of the attacks using different colored pixels, with each intercepted packet logged as a single colored pixel. While the article does not specify how the colors were chosen to represent each origin, I see potential in further exploring the color representation aspect, which could add another layer of meaning to the project. This project shares similarities with the wifi signal visualization project we discussed in class, as both reveal the continuous unseen processes happening around us.
Camille is utilizing the concept of psychoacoustics to build a site-specific music installation. She found the resonant frequencies of the room and used them to pitch the bell tones heard in the room. In the video she goes over the historical significance of the bell and why it appears in this work. There are also a lot of other things happening, but I think this piece is really fantastic and effective in building on a space that can’t be replicated anywhere else. It’s kind of capturing a life in the room that wouldn’t otherwise exist, or claiming the space for whatever period the piece was up.
Stereographs are really effective at observing the small details of an image that I might otherwise miss. I never had an interest in them until I slowed down and looked through a series of photographs of different minorities laboring in the fields. Certain parts of the image felt more solidified, or calcified looking through the stereoscope versus looking with the naked eye. The photograph almost became a sculpture to me. I’m not necessarily interested in using a stereoscope, but I would be interested in exploring this idea with various lenses and playing with the sense of sight more abstractly.
Below is a link to Keystone View Company – a Pennsylvania-based company that holds a large archive of curious imagery available to the public.
Their blog – https://stereoscopy.blog/2021/01/03/keystone-view-company/ Archives – https://archives.lib.utrgv.edu/repositories/2/resources/427
Outwards from June – Report 2: Joshua Ellingson’s Oscilloscope Clips
Here’s a super fun project by Joshua Ellingson titled Oscilloscope Clips for April-May 2022, where oscilloscope art is combined with Pepper’s Ghost illusions (really makes me want to try it!!). Basically, oscilloscope art is produced using an oscilloscope playing oscilloscope music (explained further with links below). The display from the oscilloscope is then projected using the Pepper’s Ghost technique so that it looks 3-D, as if it’s a hologram dancing to music inside of a glass shell.
I’ve found a lot of ‘sound visualizers.’ The simplest, most home-made (or high-school-science-classroom-made) project I found is Sound Visualizer & Chladni Patterns Formed on a Plastic Bucket // Homemade Science with Bruce Yeany. I’m not sure if this is technically an oscilloscope, but it is definitely visualizing sound waves in a similar way. I think what’s cool about Bruce Yeany’s sound visualizer is that it’s hand-held and easy to make! He also just seems like a super chill, nice guy and explains how the science and the set-up work in a way that’s easy to understand. There’s this other guy, Steve Mould, who made a similar sound-visualizing device, but for his device to work he needs to put a Bluetooth speaker in a bowl, which is a more expensive set-up and requires all sounds being visualized to come from that Bluetooth speaker (as opposed to being able to capture live sounds and yell into it, like Bruce does). Finally, there’s a super cool video called This is Music On An Oscilloscope – (Drawing with Sound), which shows how you can use an oscilloscope to draw things with sound and explains that you can actually make music to be enjoyed as oscilloscope music! In the video they first show how a track called “blocks” looks through the oscilloscope, and they explain that this kind of music is created to be both audibly AND visually interesting.
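Out of curiosity about how oscilloscope music works, here is a minimal sketch of my own (not from any of the videos above) using only Python’s standard library. In a scope’s X-Y mode the left audio channel drives the horizontal axis and the right channel the vertical, so writing a sine to one channel and a cosine to the other produces a stereo WAV file that traces a circle on the screen:

```python
import math
import struct
import wave

SAMPLE_RATE = 44100
FREQ = 440.0      # tone frequency in Hz
DURATION = 2.0    # seconds

def lissajous_samples(freq, duration, rate=SAMPLE_RATE):
    """Left channel = sin, right channel = cos: a circle in X-Y mode."""
    n = int(duration * rate)
    frames = []
    for i in range(n):
        t = i / rate
        left = math.sin(2 * math.pi * freq * t)
        right = math.cos(2 * math.pi * freq * t)
        frames.append((left, right))
    return frames

def write_stereo_wav(path, frames, rate=SAMPLE_RATE):
    with wave.open(path, "wb") as w:
        w.setnchannels(2)
        w.setsampwidth(2)  # 16-bit samples
        w.setframerate(rate)
        for left, right in frames:
            w.writeframes(struct.pack("<hh",
                                      int(left * 32767),
                                      int(right * 32767)))

frames = lissajous_samples(FREQ, DURATION)
write_stereo_wav("circle.wav", frames)
```

Using different frequency ratios or phase offsets between the two channels draws other Lissajous figures, which is the basic trick behind tracks like “blocks.”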
I think all these projects are super cool, but I would be interested in getting them to visualize or capture vibrations in the environment that are just below our level of audio perception. Thinking about ‘the music of the spheres’ for example, from the previous post, I’d be interested in having an oscilloscope display not only making the invisible (sound) visually perceptible, but also making the inaudible (at least to human ears) AND invisible perceptible.
As for the sources that inspired Joshua Ellingson for this project, he states that he’s learned about the work of Jerobeam Fenderson & Hansi Raber and their OsciStudio utilities. He got the oscilloscope from a friend.
Outwards from June – Report 1: Intriguing Capture Device – Consider the Oscilloscope
A few weeks ago, I learned about oscilloscopes (I think I had heard about them years before in a physics class, but at the time quickly forgot about them). Technically, they are electronic test instruments that display voltage variation over time. Oscilloscopes capture variation in electric signals using various methods (which could be a post for another time). Oscilloscopes are useful in many different situations as tools that help people understand how electric signals are changing: people use oscilloscopes for troubleshooting automotive systems or even to display heartbeats as electrocardiograms.
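To make the “voltage variation over time” idea concrete, here is a toy sketch (my own illustration, not how any real scope is implemented) of the discretization a digital oscilloscope performs: sampling a signal at fixed time steps and rendering the trace, here as crude ASCII with one row per sample.

```python
import math

def sample_signal(freq_hz, sample_rate_hz, n_samples):
    """Record n_samples of a sine 'voltage' at fixed time steps,
    the same kind of discretization a digital oscilloscope performs."""
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate_hz)
            for i in range(n_samples)]

def ascii_trace(samples, width=40):
    """Render each sample as a '*' whose column tracks the voltage,
    so reading down the rows shows the waveform over time."""
    lines = []
    for v in samples:
        col = int((v + 1) / 2 * (width - 1))  # map [-1, 1] -> [0, width-1]
        lines.append(" " * col + "*")
    return "\n".join(lines)

# One full cycle of a 1 Hz sine sampled 16 times per second:
print(ascii_trace(sample_signal(1.0, 16.0, 16)))
```

A real scope does this with an analog-to-digital converter and sweeps the trace across a screen instead of down a page, but the sample-then-plot idea is the same.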
My personal interest in oscilloscopes is related to visualizing sound waves, ‘sound’ being famously hard to ‘see,’ besides the fact that human ears, or auditory receptors, are not attuned to perceive all the sounds, or rather, vibrations that exist in our surroundings.
I really enjoy the idea of finding ways of translating sounds and vibrations into something visual. Of course, we have ways of perceiving vibrations, but it’s interesting to use a tool to get another perspective, a vantage point that better illuminates how those vibrations behave. How might oscilloscopes be more effective? What the creators of the oscilloscope got right is that it’s an amazing analytical tool, both to tune and to troubleshoot with; I think its underexplored potential is as a creative tool. I was reading about William Duddell, who’s credited as the inventor of ‘the moving coil oscillograph,’ a precursor to the oscilloscope. Apparently little William was some kind of prodigy who at only 4 years old transformed a toy mouse into an automaton by embedding clockwork within it. Duddell is also one of the first people to have created electronic music, using a device called the ‘singing arc,’ one of the first electronic oscillators. Duddell used the singing arc creatively to play God Save the Queen, and in doing so demonstrated that he could determine the precise conditions required to produce oscillations (and produce them in the sequence of a tune by wiring them to a keyboard). Some of Duddell’s motivations for making his oscillograph are linked to the fact that the arc lamps which lit the streets made ‘audible humming, hissing, or even howling sounds,’ which was not ideal, so measuring the instability in their current was helpful, and Duddell’s oscillograph could do just that.
I became interested in visualizing sound recently, after being captivated by a passage in the book When We Cease to Understand the World by Benjamin Labatut. In it, there’s a mention of Johannes Kepler, who believed that there’s a melody produced by planets in motion around the sun. He called it the music of the spheres and thought that though human ears couldn’t hear this music, the human mind would be able to understand it. I thought the idea of the cosmic music of the spheres was so beautiful that I investigated it further and found that tons of artists and musicians have referenced it. One of my favorite passages is from a Lovecraft story — “His ears seemed at times to catch a faint, elusive susurrus which could not quite be identified with the nocturnal hum of the squalid streets outside, and he thought of vague, irrelevant things like the music of the spheres and the unknown, inaccessible life of alien dimensions pressing on our own.”
I’ll leave this at that for now!
P.S. please forgive me for violating the new media rules of the ‘looking outwards’ report as I have technically seen this tool before, but it was only a few weeks ago and I’ve been very interested in it since and have wanted to share it with people who it might also inspire (maybe we can someday work on a project together!)
I really like the EMF sonifier, although when I searched online the most similar results were “EMF amplifiers.” It basically senses the electromagnetic fields created by different voltages and electrical currents and outputs a sound corresponding to their frequencies. It’s a great way to acutely experience how much of our lives are surrounded by invisible currents flowing around us. I’m thinking of combining the sound experience with visualizations of the currents, and possibly re-recording the audio from EMF sonifiers with the binaural mic to make it a much more lived and intimate experience. (The robot arm is cool, so I have it here, but I’m thinking more of site-based recordings in different types of locations.)
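The mapping step of a sonifier could be sketched like this. Everything here is a hypothetical assumption of mine (real devices may simply amplify the induced signal directly into a speaker rather than remapping it): a normalized field magnitude from some sensor is linearly mapped into an audible frequency range.

```python
# Hypothetical sonification sketch: map EMF sensor magnitudes
# (e.g. readings from a coil pickup via an ADC) onto audible pitches.

AUDIBLE_LOW_HZ = 100.0
AUDIBLE_HIGH_HZ = 4000.0

def magnitude_to_pitch(magnitude, max_magnitude=1.0):
    """Linearly map a normalized field magnitude to an audible pitch,
    clamping out-of-range readings to the ends of the scale."""
    m = max(0.0, min(magnitude / max_magnitude, 1.0))
    return AUDIBLE_LOW_HZ + m * (AUDIBLE_HIGH_HZ - AUDIBLE_LOW_HZ)

readings = [0.02, 0.4, 0.85, 1.3]   # made-up sensor samples
pitches = [magnitude_to_pitch(r) for r in readings]
```

The same pitch stream could drive a synth for the audio layer and a plotter or projection for the visual layer, which is roughly the combination I have in mind.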
I knew immediately I was going to talk about Joe Pease with this prompt because I went through a minor obsession phase with his work. His work is pretty much entirely illusory video edits and overlays. My understanding is that he takes pretty average stock-photo-passing-video and overlays a set of them to create false interaction between the subjects. At some point, false camera movement and grain are added which makes the fake interactions look more realistic, along with adding a CCTV or candid quality to the piece. I’m going to link to his Instagram for the videos which are better than the lazy still I’m including:
Paragraphica is an innovative camera that utilizes AI and location data to generate a photographic representation of a place and moment based on descriptive text. It gathers data such as the address, weather, and nearby places to create a paragraph that encapsulates the current environment, which is then converted into a unique image using a text-to-image API. The camera includes dials that allow users to control the radius of data collection, the noise in the AI’s image generation, and how closely the image follows the descriptive paragraph. This project’s use of AI with direct, real-time interaction with environments is what I find so inspiring. What I think the creators got right here is their ability to capture and enhance the visuals of real environments in real time. I think the creators could take it a step further by allowing users to adjust the descriptive text that is used to generate the image; it would be even cooler to give the user the liberty to imagine their space in a new way, in real time. I also think it could be cool to make a similar technology with a more specific transformation step, i.e. one that uses the AI to enhance the image in a specific way rather than producing a general recreation of the photo. The image produced is limited by the descriptive text that is generated.
Related Technologies: Paragraphica was created with a Raspberry Pi 4, a 15-inch touchscreen, 3D-printed housing, and custom electronics, using Noodl, Python, and the Stable Diffusion API.
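Based on the description above, the paragraph-building step might look something like this sketch. The function name, template wording, and inputs are my own assumptions, not the creators’ actual code; the real device fills a template like this from live location and weather data before sending it to the text-to-image API.

```python
def compose_scene_paragraph(address, weather, time_of_day, nearby_places):
    """Assemble a descriptive prompt from location data: a hypothetical
    stand-in for the paragraph Paragraphica builds before calling the
    text-to-image API."""
    places = ", ".join(nearby_places)
    return (f"A photo taken at {address} in the {time_of_day}. "
            f"The weather is {weather}. Nearby there are {places}.")

# Made-up example inputs:
prompt = compose_scene_paragraph(
    "123 Main Street",
    "overcast and cool",
    "early morning",
    ["a bakery", "a bus stop"],
)
```

Letting the user edit this string before it is sent to the image model is exactly the kind of intervention point I was imagining above.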