Comments on: Exercise 2: Physical Computing as Foundation
https://courses.ideate.cmu.edu/60-223/f2017/exercise-2-physical-computing-as-foundation/
60-223 Fall 2017

By: soojins@andrew.cmu.edu (Tue, 12 Sep 2017 07:01:15 +0000)

Interactive Particle Print Dress by Shenova Fashion

https://www.youtube.com/watch?v=guALfLoto10

1. Basic description, elaborate on a title or use
This dress was designed by Shenova Fashion. It is printed with a particle-physics pattern, senses the model’s heart rate, and lights up in time with the heartbeat rhythm.

2. What is it supposed to do?
The LEDs sewn under the dress flash on and off according to the measured fluctuations of the model’s heartbeat.

3. How does it do this?
With the sponsorship of IBM, the designers built a Node.js application on the Bluemix platform that syncs the heart rate sensor data with the LEDs’ brightness and timing.
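As a hedged illustration of that sensing-to-lighting link (the real system routes data through a Node.js app on Bluemix; the pins, threshold, and fade rate below are assumptions), an Arduino-style sketch of the beat-to-brightness mapping might look like this:

```cpp
// Hypothetical sketch: pulse an LED in time with a heartbeat sensor.
// Pin numbers and the beat-detection threshold are assumptions, not
// details from the Shenova project (which syncs through a Node.js app).
const int PULSE_PIN = A0;   // analog pulse sensor
const int LED_PIN   = 9;    // PWM-capable LED pin
const int THRESHOLD = 550;  // assumed beat-detection level (0-1023)

int brightness = 0;
bool aboveThreshold = false;

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  int reading = analogRead(PULSE_PIN);

  // A rising edge past the threshold counts as one heartbeat.
  if (reading > THRESHOLD && !aboveThreshold) {
    aboveThreshold = true;
    brightness = 255;                    // flash to full brightness on each beat
  } else if (reading < THRESHOLD) {
    aboveThreshold = false;
  }

  analogWrite(LED_PIN, brightness);
  brightness = max(0, brightness - 5);   // fade out between beats
  delay(10);
}
```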

4. Does it work?
As shown in the video, the LEDs flicker in accordance with the rhythm of the measured heartbeat.

5. How would you change it?
I would explore the aesthetic and technical capabilities of the LED dress further, using not only heart rate data but also thermal or brainwave sensors to expand the possibilities of wearable sensing.

By: ianf@andrew.cmu.edu (Sat, 09 Sep 2017 00:23:41 +0000)

http://deeplocal.com/projects/netflix-socks.html
http://makeit.netflix.com/projects/socks

Netflix socks is an open-source project by Pittsburgh’s own DeepLocal. The premise is this: people often fall asleep while watching their favorite show on Netflix. These socks detect when this happens and pause your show so you don’t miss a single minute of it. They use an accelerometer to measure movement; when you are completely still, the assumption is that you have fallen asleep. When the sock thinks you’re asleep, it flashes an LED to warn that it is about to pause your show (if you notice, you can simply move to tell it you are actually awake). To stop the show, an Arduino sends an infrared signal to your television telling it to pause.
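A hedged sketch of that stillness-then-pause logic (the pin numbers, stillness threshold, timings, and the NEC remote code are all assumptions, using the IRremote library’s v2-style API):

```cpp
// Hypothetical sketch of the Netflix-socks logic. Pins, thresholds,
// timings, and the NEC pause code are assumptions, not DeepLocal's design.
#include <IRremote.h>

const int ACCEL_PIN = A0;                 // one axis of an analog accelerometer
const int LED_PIN = 13;
const int STILL_THRESHOLD = 8;            // max reading change that counts as "still"
const unsigned long SLEEP_MS = 60000UL;   // one minute of stillness = asleep
const unsigned long WARN_MS  = 10000UL;   // LED warning period before pausing

IRsend irsend;                            // IR LED on pin 3 on most boards (v2 API)
int lastReading = 0;
unsigned long lastMovement = 0;

void setup() {
  pinMode(LED_PIN, OUTPUT);
  lastReading = analogRead(ACCEL_PIN);
  lastMovement = millis();
}

void loop() {
  int reading = analogRead(ACCEL_PIN);
  if (abs(reading - lastReading) > STILL_THRESHOLD) {
    lastMovement = millis();              // any movement resets the timer
  }
  lastReading = reading;

  unsigned long still = millis() - lastMovement;
  if (still > SLEEP_MS + WARN_MS) {
    irsend.sendNEC(0x20DF10EF, 32);       // hypothetical pause code for the TV
    lastMovement = millis();              // don't spam the signal
  } else if (still > SLEEP_MS) {
    digitalWrite(LED_PIN, (millis() / 250) % 2);  // flash the warning LED
  } else {
    digitalWrite(LED_PIN, LOW);
  }
  delay(50);
}
```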

It seems to me that this would work, but it most likely triggers often when you don’t want it to. That said, it doesn’t really need to work: this was all just a marketing campaign. If I were to change the design, I would probably use a pulse sensor to get a more reliable read on when the person is asleep. I would also program the Arduino to turn off the television after it pauses Netflix, which would conserve energy and make for a nice, dark sleeping environment.

By: tmustako@andrew.cmu.edu (Fri, 08 Sep 2017 16:40:20 +0000)

http://www.cardiffmiller.com/artworks/inst/cabinet_of_curiousness.html

The Cabinet of Curiousness is a 20-drawer cabinet that plays a different recording when you open each drawer (singing, opera, radio, news, animal noises, etc.).

It does this by using sensors to tell whether a specific drawer is open and, if it is, playing the speaker in that drawer. I think the interesting moments occur when people open multiple drawers and make a sort of remix, either picking several similar sounds to create a harmonious mix or combining very different ones.
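A minimal sketch of that per-drawer trigger logic, assuming simple drawer switches and a sound board with one trigger line per channel (the piece’s actual electronics aren’t documented in detail):

```cpp
// Hypothetical per-drawer trigger logic. Pin assignments and the use of
// simple switches are assumptions, scaled down from 20 drawers to 4.
const int NUM_DRAWERS = 4;
const int SWITCH_PINS[NUM_DRAWERS] = {2, 3, 4, 5};   // one switch per drawer
const int PLAY_PINS[NUM_DRAWERS]   = {8, 9, 10, 11}; // trigger inputs on a sound board

void setup() {
  for (int i = 0; i < NUM_DRAWERS; i++) {
    pinMode(SWITCH_PINS[i], INPUT_PULLUP);  // switch closes to ground when drawer opens
    pinMode(PLAY_PINS[i], OUTPUT);
  }
}

void loop() {
  for (int i = 0; i < NUM_DRAWERS; i++) {
    // While a drawer is open, hold its sound channel's play line high.
    bool open = (digitalRead(SWITCH_PINS[i]) == LOW);
    digitalWrite(PLAY_PINS[i], open ? HIGH : LOW);
  }
  delay(20);
}
```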

I think this piece is strong; I don’t think the artist should have altered it, as this was their vision. But if I were to make something similar, I would include a visual element in the drawers (for example, if the sound in a drawer is an opera piece, I would include a diorama of an opera scene).

By: etbrenna@andrew.cmu.edu (Thu, 07 Sep 2017 21:49:29 +0000)

The artist Hamilton Mestizo has created an interactive artifact called the Echolocalizator. The piece was made as a reflection on technology and how it is integrated into our lives. Digital-virtual technology has become an integral tool that influences our perception of the real world and its phenomena, and this piece stems from the question of how we develop our own sense of reality. Computing can now create a virtual reality in a virtualized environment that completely removes the real environment in which that virtual reality is perceived. The Echolocalizator is a leather helmet that allows the wearer to perceive space through sound, based on the concept of echolocation.

The artifact functions using sonar sensors embedded in the headpiece, driven by a simple electronic circuit inside. An Arduino Pro Mini runs a program that translates each sonar signal into a distance in centimeters between the sensor and the nearest object, and associates that distance with an audio file (WAV) stored on a microSD card. The range extends from 10 to 650 cm. The result is a binaural sound atmosphere that describes the surrounding space.
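A rough sketch of that distance-to-sound mapping, assuming an HC-SR04-style sonar and treating the WAV playback as a placeholder (the original’s exact sensor and playback code aren’t published here):

```cpp
// Hypothetical distance-to-track mapping. The HC-SR04-style sensor, pins,
// and the idea of mapping distance bands to track numbers are assumptions.
const int TRIG_PIN = 7;
const int ECHO_PIN = 8;

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long us = pulseIn(ECHO_PIN, HIGH, 40000UL);  // timeout covers ~6.5 m round trip
  return us / 58;                              // microseconds -> centimeters
}

void loop() {
  long cm = readDistanceCm();
  if (cm >= 10 && cm <= 650) {
    // Split the 10-650 cm range into 8 bands and pick a WAV track per band.
    int track = map(cm, 10, 650, 0, 7);
    Serial.print("play track ");   // stand-in for the actual WAV playback call
    Serial.println(track);
  }
  delay(100);
}
```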

I would make the piece involve two people, or a group of people. This would be an interesting social response that translates the private landscape created in the mind into a public space, harnessing the idea of being alone together. I would also test making the range smaller, so there are fewer sound inputs and the user can focus more on the subtle changes and nuances of the sound landscape.

Website: http://librepensante.org/
Process Blog: http://ecolocalizador.blogspot.com/

By: rnarayan@andrew.cmu.edu (Thu, 07 Sep 2017 21:22:27 +0000)

https://www.media.mit.edu/projects/animastage/overview/

The project that I would like to discuss is called Animastage. It was developed by the Tangible Media Group at the MIT Media Lab, and it allows you to animate physical crafts on an actuated stage. Animastage lets you place your craft (paper, 3D-doodled, or otherwise) on the stage and manipulate it; additionally, you can create a variable landscape to add scenery. I think this project is really interesting because, after developing animated characters in the virtual realm for so long, we are now pivoting toward using technology as an aid to prototyping in the physical world. It’s kind of goofy in that you’re basically prototyping puppets, but you no longer have to hold them; the stage does it for you. I’m a big fan of using technology in unexpected ways to bring back older art forms. I think Animastage does a good job of making animation and puppetry accessible and fun.

With that being said, the prototype is definitely in its early stages. It could use some work in making the animations seem more seamless; at this point, a human could do a better job of getting a narrative across through craft and puppetry than this installation. I think it could benefit from being a little quieter and from somehow facilitating translational movement across the stage. I really commend the group for a great idea and would love to see how it develops in further iterations.

By: lkendric@andrew.cmu.edu (Thu, 07 Sep 2017 19:31:15 +0000)

A visiting lecturer came to Carnegie Mellon two years ago to speak to my pre-college architecture class about artistic and technological research. This researcher at Disney told us about one of the company’s projects, Botanicus Interacticus, which uses their newly developed interactive plant technology.
Botanicus Interacticus involves artificial and living plants as mediums for users to interact and play with. With a computing device placed in virtually “any” plant’s soil, the plant turns into a touch pad for music, mouse-like computer functions (selecting, navigating to a new webpage, etc.), and other interactions. The computing device, in conjunction with the user’s chosen plant medium, detects human gestures, the location of touch, the proximity between the plant and the user, and the amount of contact. Researchers on the team analyzed the electrical properties of plants and replicated them in the computing device, making these functions possible; a simplified sketch of the idea follows.
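Disney’s actual system uses swept-frequency capacitive sensing; as a far simpler stand-in, plant touch sensing can be sketched with the hobbyist CapacitiveSensor library (pins and thresholds are assumptions):

```cpp
// Much simpler stand-in for Disney's swept-frequency sensing: single-value
// capacitive touch on a plant via Paul Badger's CapacitiveSensor library.
// Pin choices and all thresholds are assumptions.
#include <CapacitiveSensor.h>

// Send pin 4, receive pin 2; the receive side connects to a wire in the soil.
CapacitiveSensor plant = CapacitiveSensor(4, 2);

void setup() {
  Serial.begin(9600);
}

void loop() {
  long reading = plant.capacitiveSensor(30);  // 30 samples per reading

  // Coarse gesture classes from the raw value: higher = more contact.
  if (reading > 2000) {
    Serial.println("grasp");
  } else if (reading > 500) {
    Serial.println("touch");
  } else if (reading > 100) {
    Serial.println("proximity");
  }
  delay(100);
}
```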
This device is incredibly entertaining to its users; implementing such a design in a Disney park would attract curious children and adults alike. Today, as the line between nature and technology begins to fade, this type of interactive design may stand at the forefront where these realms of our environment collide. If I were to improve this device, I would begin by thinking about what other types of plants or living materials found in nature could become mediums. For example, on a broader scale, one might wire a larger computing device into the ground near a bed of flowers or a field of corn; if humans brushed up against these things, it could trigger not only music but lights in buildings or on sidewalks as they approach. Perhaps human interaction doesn’t have to be the only trigger, either: sun, rain, or wind might play a factor in the output of the system.
https://www.disneyresearch.com/project/botanicus-interacticus-interactive-plant-technology/

By: noell@andrew.cmu.edu (Thu, 07 Sep 2017 17:32:40 +0000)

An orthopedic drill that has sensors on it to tell you bone density and depth, and possibly offer some diagnostic features?

Orthopedic surgery often involves drilling into bone, and we obviously can’t see into the bone we are drilling into. We can use x-rays, do different scans, and use biomarkers and biopsies (the latter of which are very invasive), and we try to understand what we are getting into before we open the patient up. It’d be great to have a record of how the drill was operating and interacting with the bone. Different layers have different densities, and being able to collect data like density, composition, mineral content, etc. can give a greater picture of what the patient is experiencing and the state of their body, as well as contribute additional data to the medical literature.

I think it would include pressure sensors and estimate density from how easily the drill advances and how much drag the bone exerts on it. Maybe it would have a small pocket in a disposable drill head to collect material from the hole being made. What additional data and biometrics could be taken would have to be researched.
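As an entirely hypothetical sketch of that density-estimation idea, drag could be approximated from the drill motor’s current draw (the sensor, calibration constant, and layer thresholds below are all invented for illustration):

```cpp
// Entirely hypothetical: drag approximated from the drill motor's current
// draw via a current sensor on A0. All constants are invented.
const int CURRENT_PIN = A0;
const float AMPS_PER_COUNT = 0.02;   // assumed current-sensor calibration

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Average a burst of samples to smooth out commutation noise.
  long sum = 0;
  for (int i = 0; i < 50; i++) {
    sum += analogRead(CURRENT_PIN);
    delay(1);
  }
  float amps = (sum / 50.0) * AMPS_PER_COUNT;

  // Higher current = more drag = denser material (invented thresholds).
  if (amps > 1.5) {
    Serial.println("dense cortical layer");
  } else if (amps > 0.6) {
    Serial.println("spongy cancellous layer");
  } else {
    Serial.println("low resistance / breakthrough");
  }
}
```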

A drill like this does exist to some extent: it detects bone layers with thresholding technology, associating applied torque and drill mechanics with bone characteristics. It currently does not, however, contribute to sample acquisition or biometrics; it focuses on perfecting surgical technique and performance. Having these biometrics would contribute to understanding the greater picture of the patient’s wellbeing, as well as to research.

By: dpetroul@andrew.cmu.edu (Thu, 07 Sep 2017 15:59:02 +0000)

The project I have chosen to write about is called Sun Spot. It was made in the Fall 2015 Physical Computing class by Alex Palatucci, Katelyn Smith, and Rehan Butt. It is a small yellow box that attaches to clothes or bags with a string.

It is supposed to notify the holder or wearer when they have been exposed to excessive amounts of ultraviolet radiation, thus protecting them from harmful rays.

Sun Spot works with a UV sensor, a microcontroller, and a motor. It records UV radiation once per second; when readings stay at a peak for an extended time, it buzzes once every hour for up to four hours of UV exposure, at which point it buzzes continuously until the UV exposure is cut off.
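A hedged reconstruction of that timing logic (the UV pin, the “peak” threshold, and the motor pin are assumptions; the one-second sampling, hourly buzz, and four-hour cutoff come from the write-up):

```cpp
// Hypothetical reconstruction of the Sun Spot timing logic. Pins and the
// peak threshold are assumptions; timings follow the project write-up.
const int UV_PIN = A0;
const int MOTOR_PIN = 6;
const int UV_PEAK = 700;                       // assumed "high UV" reading
const unsigned long HOUR_MS = 3600000UL;

unsigned long exposureMs = 0;                  // accumulated time at peak UV
unsigned long lastBuzz = 0;

void buzz(unsigned long ms) {
  digitalWrite(MOTOR_PIN, HIGH);
  delay(ms);
  digitalWrite(MOTOR_PIN, LOW);
}

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  bool atPeak = analogRead(UV_PIN) > UV_PEAK;  // sampled roughly once per second
  if (atPeak) {
    exposureMs += 1000;
    if (exposureMs >= 4 * HOUR_MS) {
      buzz(500);                               // near-continuous buzzing past 4 hours
    } else if (exposureMs - lastBuzz >= HOUR_MS) {
      lastBuzz = exposureMs;
      buzz(1000);                              // one buzz per hour of exposure
    }
  } else {
    exposureMs = 0;                            // cutting off exposure resets the count
    lastBuzz = 0;
  }
  delay(1000);
}
```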

According to the write-up, this project was “extremely successful” in achieving its purpose of notifying its user of extended UV radiation exposure.

The creators also had some ideas for improving their project. I would ideally attempt to make it smaller and integrate it into a pair of sunglasses or a hat that shields the sun. That way, besides being more convenient to carry, wear, and remember, and more aesthetically pleasing, it would be more functional: if worn on the eyes or head, the areas Sun Spot picks up radiation from are only shielded when the wearer is away from radiation. On the other hand, if it is attached to a bag, shade cast on Sun Spot by the wearer’s body or clothes might produce inaccurate readings and data. If it cannot be made smaller, I would make sure it was waterproof so that it still functions during water sports or while sweating, both of which are often associated with sun. I would also make it more easily attachable to different equipment: winter sports such as skiing and snowboarding in sunny states like Colorado are known to cause skin damage because of the sun, so I would make it attachable to a snowboard, surfboard, helmet, bike, or skis. In that case, I would also keep a small part of its box as a casing for a small tube of emergency sunscreen in case exposure is unavoidable. It could also record UV exposure times and locations and let you know which activities result in the largest amount of exposure.

By: mrquinn@andrew.cmu.edu (Thu, 07 Sep 2017 15:47:10 +0000)

Haile is a drum-playing robot designed at the Georgia Institute of Technology by Professor Gil Weinberg. It can analyze beats played on a drum by a human and respond with its own beat, played on a physical drum with a mechanical arm.
One other cool thing about Haile is that all of its non-electrical components are made of wood.

https://www.youtube.com/watch?v=veQS6tsogAA

Haile makes decisions about how to play and what to play in real time, responding to what it hears. This is a robot you can have a jam session with.

Haile is equipped with a microphone and detects the pitch of each drum hit, beat patterns across successions of hits, the amplitude of hits, and the frequency of hits.

From the pitch and beat patterns, Haile generates a harmonious matching beat, and from the amplitude and frequency of hits it determines whether to take a leading role (making its own drumming louder and more densely packed with beats) or a following role (quieter drumming with less dense beats).

Haile has two arms that extend forward and back, with small batons at the ends that can swing up and down (one slower and harder than the other). Haile’s mounting is adjusted to the proper height of a drum placed in front of it and calibrated so that its arm can reach both near the middle of the drum and near the edge. Haile and the user are meant to play on opposite sides of the same large drum. Haile controls the beat and volume by swinging the baton that strikes the drum, and the pitch by extending or retracting its arm to choose where on the drum to strike. Each arm has a linear motor controlling the extension and retraction, and a solenoid motor controlling the baton.
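As a very rough sketch of the lead/follow decision described above (Haile’s real listening system does much richer onset and pitch analysis; every threshold here is invented):

```cpp
// Hypothetical lead/follow decision: loudness from a smoothed envelope of
// the microphone input, density from counting recent onsets. All thresholds
// are invented; Haile's real analysis is far richer.
const int MIC_PIN = A0;
const int ONSET_LEVEL = 600;     // assumed hit-detection threshold
const float LOUD_AVG = 300.0;    // assumed "human is playing loudly" level

unsigned long onsetTimes[16];    // ring buffer of recent hit times
int onsetIndex = 0;
bool wasAbove = false;
float envelope = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int level = analogRead(MIC_PIN);
  envelope = 0.95 * envelope + 0.05 * level;   // smoothed loudness

  // Count a new onset on each rising edge past the threshold.
  if (level > ONSET_LEVEL && !wasAbove) {
    wasAbove = true;
    onsetTimes[onsetIndex] = millis();
    onsetIndex = (onsetIndex + 1) % 16;
  } else if (level < ONSET_LEVEL) {
    wasAbove = false;
  }

  // Density: how many of the last 16 hits fell within the last 4 seconds.
  int recent = 0;
  for (int i = 0; i < 16; i++) {
    if (millis() - onsetTimes[i] < 4000UL) recent++;
  }

  // Loud, dense human playing -> take the following role; otherwise lead.
  if (envelope > LOUD_AVG && recent > 8) {
    Serial.println("follow: play quieter, sparser beats");
  } else {
    Serial.println("lead: play louder, denser beats");
  }
  delay(5);
}
```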

It works really well, passing the Turing test most of the time. Lots of drummers find Haile to be an engaging jamming partner.

I would give Haile more arms and copy or alter the basic drumming code to accommodate other kinds of percussion instruments. It would be especially interesting to have Haile hit two drums with different arm sets, responding to and generating new beats from its own playing: a jam session with itself. I’d try to make a roomful of Hailes all playing different instruments, all responding to the sounds of all the other Hailes around them. An all-robot band!!

By: stnorman@andrew.cmu.edu (Thu, 07 Sep 2017 15:33:42 +0000)

Link: https://www.youtube.com/watch?v=h5n0rw8wo14

The piece of physical computing that I have chosen to analyze is an interactive proximity-sensing table created by YouTuber Graham Monahan. He has created a device that responds to motion by lighting up: if you hold your hand above the table and move it across the board, the LEDs follow the path of your hand, turning on and off along it. A secondary aspect of this design is that the lights not only illuminate in response to movement but also dim or brighten depending on how close you place your hand. The closer your hand, the brighter they are, and vice versa.

The prototype is broken down into three modules, each controlled by its own circuit board, to create a seamless flow. Detection is infrared, in order to properly sense the human hand; the sensors control the outputs, modulating the LEDs by brightening and dimming as the hand passes. Once turned on, it assesses the environment (current heat map, etc.) and calibrates itself for further use. In use, the surface can sense movement up to one foot away, and from one foot and closer it adjusts the level of illumination accordingly.
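A hedged sketch of one sensing cell, assuming an analog IR distance sensor (such as a Sharp GP2Y0A21-style part) driving a single LED’s brightness; the pins and the mapping range are assumptions:

```cpp
// Hypothetical sketch of one cell of the table: an analog IR distance
// sensor driving one LED's brightness. Pins and ranges are assumptions.
const int IR_PIN = A0;
const int LED_PIN = 9;                 // PWM-capable pin

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  int reading = analogRead(IR_PIN);    // higher reading = closer hand

  // Map the sensor's useful range onto LED brightness: closer = brighter.
  int brightness = map(constrain(reading, 100, 600), 100, 600, 0, 255);
  analogWrite(LED_PIN, brightness);
  delay(20);
}
```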

What is remarkable is that this is a simple prototype for the many iterations that came after, all still documented on YouTube. Its construction is not necessarily crude, but it is certainly not refined. When prompted with a hand, the tabletop follows the movement with only a slight delay and successfully mirrors back an LED path of your hand, without the manhandling I have seen in some other physical computing projects on YouTube. Interestingly, many of my initial critiques are addressed in later iterations of the prototype. I initially thought that implementing color would give it a greater level of complexity; next, I thought that a higher-fidelity LED response would give the interaction that little kick of intrigue. In this specific prototype, the LEDs responded in chunks that followed the path of your hand but not its shape. Fortunately, the designer implemented the critiques above and in the end made something highly successful. One thing that could add a sense of dimension to the piece is a layer of sound: the closer you are, the louder the feedback, and vice versa. This could create a very immersive device that plays with multiple senses.
