https://www.youtube.com/watch?v=guALfLoto10
1. Basic description, elaborate on a title or use
This dress was designed by Shenova Fashion. It is a dress printed with a particle-physics pattern that senses the model's heart rate and lights up in time with the heartbeat rhythm.
2. What is it supposed to do?
The LEDs sewn under the dress flash on and off in response to the measured fluctuations of the model's heartbeat.
3. How does it do this?
With sponsorship from IBM's Bluemix platform, the designers built an application in Node.js that syncs the heart rate sensor readings with the LEDs' brightness and lighting.
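A minimal Arduino-style sketch of that sensor-to-LED idea might look like the following; the analog pulse sensor on A0 and the PWM LED on pin 9 are my assumptions, and the real dress routed this logic through a Node.js app on Bluemix instead:

```cpp
// Sketch of the pulse-to-brightness mapping, assuming an analog pulse
// sensor on pin A0 and a PWM-dimmable LED on pin 9.
const int PULSE_PIN = A0;
const int LED_PIN = 9;

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  int raw = analogRead(PULSE_PIN);              // 0-1023 pulse waveform sample
  int brightness = map(raw, 0, 1023, 0, 255);   // scale to PWM range
  analogWrite(LED_PIN, brightness);             // LED pulses with the waveform
  delay(20);                                    // ~50 samples per second
}
```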
4. Does it work?
As shown in the video, the LEDs flicker in time with the rhythm of a typical heartbeat.
5. How would you change it?
I would explore the aesthetic and technical capabilities of the LED dress further: I would use not only heart-rate data but also thermal or brainwave sensors to expand the possibilities for wearable sensing.
Netflix Socks is an open-source project by Pittsburgh's own Deeplocal. The premise was this: people often fall asleep while watching their favorite show on Netflix. These socks detect when this happens and pause your show so you don't miss a single minute of it. This is done by using an accelerometer to measure movement: when you are completely still, the assumption is that you have fallen asleep. When the sock thinks you're asleep, it flashes an LED to warn that it is about to pause your show (if you notice, you can just move to tell it you are actually awake). To stop the show, an Arduino sends an infrared signal to your television telling it to pause.
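A rough sketch of that stillness-detection logic is below. The analog accelerometer axis on A0, the thresholds, and the sendPauseCode() helper are all hypothetical stand-ins, since Deeplocal's actual code isn't reproduced here:

```cpp
// Stillness detection: any big change on the accelerometer axis resets
// the "asleep" timer; after a minute of stillness, warn, then pause.
const int ACCEL_PIN = A0;
const int LED_PIN = 13;
const int MOTION_THRESHOLD = 10;          // ADC counts of change = "moved"
const unsigned long STILL_MS = 60000UL;   // 1 minute of stillness = asleep

int lastReading = 0;
unsigned long lastMotion = 0;

void sendPauseCode() {
  // Hypothetical: replace with an IR-library call for your TV's pause code.
}

void setup() {
  pinMode(LED_PIN, OUTPUT);
  lastReading = analogRead(ACCEL_PIN);
  lastMotion = millis();
}

void loop() {
  int reading = analogRead(ACCEL_PIN);
  if (abs(reading - lastReading) > MOTION_THRESHOLD) {
    lastMotion = millis();                // any movement resets the timer
  }
  lastReading = reading;

  if (millis() - lastMotion > STILL_MS) {
    for (int i = 0; i < 10; i++) {        // warning flash: move to cancel
      digitalWrite(LED_PIN, !digitalRead(LED_PIN));
      delay(250);
      if (abs(analogRead(ACCEL_PIN) - lastReading) > MOTION_THRESHOLD) return;
    }
    sendPauseCode();
    lastMotion = millis();                // don't re-trigger immediately
  }
  delay(50);
}
```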
It seems to me that this would work, but it most likely gets triggered a lot when you don't actually want it to. That being said, it doesn't really need to work: this was all just a marketing campaign. If I were to change the design, I would probably use a pulse sensor to get a more reliable read on when the person is asleep. I would also program the Arduino to turn off the television when it pauses Netflix, which would conserve energy and make a nice, dark sleeping environment.
The Cabinet of Curiousness is a 20-drawer cabinet that plays a different recording (singing, opera, radio, news, animal noises, etc.) whenever you open a drawer.
It does this by using sensors to tell whether a specific drawer is open and, if it is, playing sound through the speaker in that drawer. I think the interesting moments occur when people open multiple drawers and make a sort of remix, either picking multiple similar sounds to create a harmonious audio mix or mixing very different songs.
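One plausible way to wire a single drawer, assuming a reed switch on pin 2 (held closed while the drawer is shut) and the TMRpcm Arduino library playing WAV files from an SD card; the file name is hypothetical:

```cpp
// One drawer of the cabinet: open it and its sound plays; shut it and
// the speaker goes quiet. Each drawer would get its own file.
#include <SD.h>
#include <TMRpcm.h>

const int DRAWER_PIN = 2;   // reed switch, LOW = drawer closed
const int SD_CS_PIN = 4;    // SD card chip select

TMRpcm audio;

void setup() {
  pinMode(DRAWER_PIN, INPUT_PULLUP);
  SD.begin(SD_CS_PIN);
  audio.speakerPin = 9;     // PWM speaker output
}

void loop() {
  bool drawerOpen = (digitalRead(DRAWER_PIN) == HIGH);
  if (drawerOpen && !audio.isPlaying()) {
    audio.play("opera.wav");   // hypothetical file for this drawer
  } else if (!drawerOpen && audio.isPlaying()) {
    audio.stopPlayback();      // close the drawer, silence the speaker
  }
}
```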
I think this piece is strong; I don't think the author should really have altered it, as this was their vision. But if I were to make something similar, I would include a visual element when opening the drawers (for example, if the sound in a drawer were an opera piece, I would include a diorama of an opera scene).
The artifact functions by using sonar sensors embedded in the headpiece. Inside is a simple electric circuit. An Arduino Pro Mini runs a program that associates the sonar signal with an audio file (WAV) stored on a microSD card. The microcontroller translates the sonar signals into centimeters based on the distance between the object and the sensor. The range extends from 10 to 650 cm. The result is a binaural sound atmosphere that describes the surrounding space.
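A minimal sketch of that distance-to-sound mapping, assuming an HC-SR04-style sonar on pins 7 and 8; the playSoundForZone() helper is a hypothetical stand-in for the piece's actual WAV playback:

```cpp
// Read a sonar distance and pick one of several sounds based on which
// distance band (10-650 cm) the nearest object falls into.
const int TRIG_PIN = 8;
const int ECHO_PIN = 7;

void playSoundForZone(int zone) {
  // Hypothetical: trigger one of several WAV files by zone index.
}

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long us = pulseIn(ECHO_PIN, HIGH, 40000UL);  // timeout covers ~650 cm
  return us / 58;                              // microseconds -> centimeters
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  long cm = readDistanceCm();
  if (cm >= 10 && cm <= 650) {
    int zone = map(cm, 10, 650, 0, 7);   // 8 distance bands, 8 sounds
    playSoundForZone(zone);
  }
  delay(100);
}
```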
I would make the piece involve two people, or a group of people. This would be an interesting social response that would translate the private landscape created in the mind into a public space, harnessing the idea of being alone, together. I would also try making the range smaller so there are fewer sound inputs and the user can focus more on the subtle changes and nuances of the sound landscape.
Website: http://librepensante.org/
Process Blog: http://ecolocalizador.blogspot.com/
The project that I would like to discuss is called AnimaStage. It was developed by the Tangible Media Group at the MIT Media Lab. It allows you to animate physical crafts on an actuated stage: you place your craft (paper, 3D-doodled, or otherwise) on the stage, and it lets you manipulate it. Additionally, you can create a variable landscape to set the scene. I think this project is really interesting because after developing animated characters in the virtual realm for so long, we are now pivoting toward using technology as an aid to prototyping in the physical world. It's kind of goofy in that you're basically prototyping puppets, but you no longer have to hold them; the stage does it itself. I'm a big fan of using technology in unexpected ways to bring back older art forms. I think AnimaStage does a good job of making animation and puppetry accessible and fun.
With that being said, the prototype is definitely in its early stages. It could use some work in making the animations seem more seamless. At this point, a human could do a better job of getting a narrative across through craft and puppetry than this installation. I think it could benefit from being a little quieter and from somehow facilitating translational movement across the stage. I really commend the group for a great idea and would love to see how it develops in further iterations.
Orthopedic surgery often involves drilling into bone, and we obviously can't see into the bone medium we are drilling into. We can use X-rays, run different scans, and use biomarkers and biopsies (the latter of which are very invasive) as we try to understand what we are getting into before we open the patient up. It would be great to have a record of how the drill was operating and interacting with the bone. Different layers have different densities, and being able to collect data like density, composition, and mineral content can give a greater image of what the patient is experiencing and the state of their body, as well as contribute additional data to the medical literature.
I think it would include pressure sensors and estimate density from how easily the drill moves and how much drag the bone exerts on it. Maybe it would have a small pocket in a disposable drill head to collect material from the hole being made. What additional data and biometrics could be taken would have to be researched.
A drill like this does exist to some extent: it detects bone layers with thresholding technology based on associating applied torque and drill mechanics with bone characteristics. It currently does not, however, contribute to the acquisition of biometric samples, but rather focuses on perfecting surgical technique and performance. Having these biometrics would contribute to understanding the greater picture of the patient's wellbeing, as well as contribute to research.
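A purely illustrative sketch of that thresholding idea, assuming an analog torque (motor-current) signal on A0; a real surgical system is of course far more sophisticated:

```cpp
// Watch the torque signal and flag sudden jumps or drops, which would
// indicate the drill crossing into bone of a different density.
const int TORQUE_PIN = A0;
const int LAYER_DELTA = 80;    // ADC jump treated as a density change

int baseline = 0;

void setup() {
  Serial.begin(9600);
  baseline = analogRead(TORQUE_PIN);
}

void loop() {
  int torque = analogRead(TORQUE_PIN);
  if (abs(torque - baseline) > LAYER_DELTA) {
    Serial.println("Layer transition detected");  // e.g. cortical -> cancellous
    baseline = torque;          // re-baseline within the new layer
  }
  delay(10);                    // ~100 samples per second
}
```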
SunSpot is supposed to notify its holder/wearer when they have been exposed to excessive amounts of ultraviolet radiation, thus protecting them from harmful rays.
SunSpot works with a UV sensor, a microcontroller, and a motor. It records UV radiation once per second; when readings stay at a peak for an extended time, it buzzes once every hour for up to 4 hours of UV exposure, at which point it buzzes continuously until the UV exposure is cut off.
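A rough sketch of that timing logic, assuming an analog UV sensor on A0 and a vibration motor on pin 5; the threshold and buzz durations are my guesses, not SunSpot's published values:

```cpp
// Sample UV once per second; buzz once per hour of sustained exposure,
// then buzz continuously after 4 hours until the exposure ends.
const int UV_PIN = A0;
const int MOTOR_PIN = 5;
const int UV_THRESHOLD = 600;               // "peak" UV, in ADC counts
const unsigned long HOUR_MS = 3600000UL;

unsigned long exposedMs = 0;

void buzz(unsigned long ms) {
  digitalWrite(MOTOR_PIN, HIGH);
  delay(ms);
  digitalWrite(MOTOR_PIN, LOW);
}

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  bool inSun = analogRead(UV_PIN) > UV_THRESHOLD;
  if (inSun) {
    exposedMs += 1000;
    if (exposedMs >= 4 * HOUR_MS) {
      buzz(900);                            // past 4 hours: near-continuous
    } else if (exposedMs % HOUR_MS == 0) {
      buzz(500);                            // one buzz per hour of exposure
    }
  } else {
    exposedMs = 0;                          // exposure cut off: reset
  }
  delay(1000);                              // sample once per second
}
```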
According to the write-up, this project was "extremely successful" in achieving its purpose of notifying its user of extended UV radiation exposure.
The creators also had some ideas for improving their project. I would ideally attempt to make it smaller and integrate it into a pair of sunglasses or a hat that shields the wearer from the sun. That way, besides being more convenient to carry, wear, and remember, and more aesthetically pleasing, it would be more functional. If worn on the eyes or head, the areas SunSpot is picking up radiation from are only shielded when the wearer is actually shielded from radiation. On the other hand, if it is attached to a bag, shade cast onto SunSpot by the wearer's body or clothes might produce inaccurate readings and data. If it cannot be made smaller, I would make sure it was waterproof, so that it still functions during water sports or while sweating, both often associated with sun. I would also make it more easily attachable to different equipment. Winter sports such as skiing and snowboarding in sunny states like Colorado are known to cause skin damage because of the sun. I would make it attachable to a snowboard, surfboard, helmet, bike, or skis. In that case, I would also keep a small part of its box as a casing for a small tube of emergency sunscreen, in case exposure is unavoidable. It could also record UV radiation exposure times and locations and let you know which activities result in the largest amount of exposure.
https://www.youtube.com/watch?v=veQS6tsogAA
Haile makes decisions about how to play and what to play in real time, responding to what it hears. This is a robot you can have a jam session with.
Haile is equipped with a microphone and detects the pitch of each drum hit, beat patterns in successions of drum hits, and the amplitude and frequency of hits.
From the pitch and beat patterns, Haile generates a harmonious matching beat, and from the amplitude and frequency of hits it determines whether to take a leading role (making its own drumming louder and more densely packed with beats) or a following role (quieter drumming with less dense beats).
Haile has two arms that extend forward and back, with small batons at the ends that can swing up and down (one slower and harder than the other). Haile's mounting is adjusted to the proper height of a drum placed in front of it, and calibrated so that its arms can reach both near the middle of the drum and near the edge. Haile and the user are meant to play on opposite sides of the same large drum. Haile controls the beat and volume by swinging the batons that strike the drum, and the pitch by extending or retracting its arms to decide where on the drum to strike. Each arm has a linear motor controlling the extending and retracting, and a solenoid controlling the baton.
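A purely illustrative sketch of the lead/follow decision, assuming the listening front end has already reduced the microphone signal to an amplitude and a beat-density value; the thresholds and scaling here are invented for illustration:

```cpp
// Decide whether Haile leads or follows, given the human's playing
// amplitude (0-1023) and beat density already extracted from the mic.
const int LOUD_THRESHOLD = 700;
const int DENSE_THRESHOLD = 400;

int outVolume = 0;    // how hard to swing the baton
int outDensity = 0;   // how many strikes per bar to generate

// Loud, dense human playing -> follow quietly; otherwise take the lead.
void chooseRole(int inAmplitude, int inDensity) {
  if (inAmplitude > LOUD_THRESHOLD && inDensity > DENSE_THRESHOLD) {
    outVolume = inAmplitude / 2;               // following: quieter...
    outDensity = inDensity / 2;                // ...and sparser
  } else {
    outVolume = min(inAmplitude * 2, 1023);    // leading: louder...
    outDensity = min(inDensity * 2, 1023);     // ...and denser
  }
}

void setup() {}

void loop() {
  chooseRole(800, 500);   // example: loud, dense human input -> follow
  // Haile would use outVolume to drive the solenoid strike strength and
  // outDensity to decide how many beats to generate per bar.
  delay(500);
}
```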
It works really well, passing the Turing test most of the time. Lots of drummers find Haile to be an engaging jamming partner.
I would give Haile more arms and copy/alter the basic drumming code to accommodate other kinds of percussion instruments. It would be especially interesting to have Haile hitting two drums with different arm sets, responding to and generating new beats from its own playing, having a jam session with itself. I'd try to make a roomful of Hailes all playing different instruments, all responding to the sounds of all the other Hailes around them. An all-robot band!!
The piece of physical computing that I have chosen to analyze is an interactive proximity-sensing table created by YouTuber Graham Monahan. He has created a device that responds to motion by lighting up. If you hold your hand above the table and move it across the board, the LEDs follow the path of your hand, turning on and off along the way. A secondary aspect of this design is that it not only illuminates in response to movement, but the lights also dim or brighten depending on how close or far you place your hand: the closer your hand, the brighter they are, and vice versa.
This prototype is broken down into three modules, with a circuit board controlling each one to create a seamless flow. To properly sense the human hand, the form of detection is infrared. The sensors control the outputs and modulate the LEDs, brightening and dimming them as the hand passes. Once turned on, it assesses the environment, the current heat map, etc., and calibrates itself for further use. Once in use, the surface can sense movement up to one foot away, and from one foot and closer it modulates the level of illumination accordingly.
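A minimal sketch of the proximity-to-brightness idea, assuming a single analog IR distance sensor (e.g. a Sharp GP2Y-style module, which reads higher when an object is closer) on A0 driving one PWM LED on pin 9, with a startup calibration loosely mirroring the table's own:

```cpp
// One sensor/LED cell of the table: brightness tracks how far the
// reading rises above the empty-table baseline captured at startup.
const int IR_PIN = A0;
const int LED_PIN = 9;

int ambient = 0;   // baseline reading of the empty table

void setup() {
  pinMode(LED_PIN, OUTPUT);
  long sum = 0;
  for (int i = 0; i < 32; i++) {   // average a few readings to calibrate
    sum += analogRead(IR_PIN);
    delay(10);
  }
  ambient = sum / 32;
}

void loop() {
  int reading = analogRead(IR_PIN);
  int delta = max(reading - ambient, 0);             // signal above baseline
  int brightness = map(min(delta, 500), 0, 500, 0, 255);
  analogWrite(LED_PIN, brightness);                  // closer hand = brighter
  delay(20);
}
```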
What is remarkable is that this is a simple prototype for the many iterations that came after, all still documented on YouTube. Its construction is not necessarily crude, but it is certainly not refined. When prompted with a hand, the tabletop follows the movement with only a slight delay and successfully mimics back an LED path of your hand, without the manhandling I have seen with some other physical computing projects on YouTube. Interestingly, many of my initial critiques are addressed in later iterations of the prototype. I initially thought that implementing color would give it a greater level of complexity. Next, I thought that a higher-fidelity response via the LEDs would give the interaction that little kick of intrigue. In this specific prototype, the LEDs responded in chunks that followed the path of your hand but not its shape. Fortunately, the designer implemented the critiques above and in the end made something highly successful. Something that could add a sense of dimension to the piece is a layer of sound: the closer you are, the louder the feedback, and vice versa. This could create a very immersive device that plays with multiple senses.