For our project, we created two robots programmed to move randomly until they touch, after which they spin together forever. The piece is an exploration of partnership and how partnerships are formed. When the project runs, the movement is often sharp, and the robots frequently veer very close to each other without actually touching. This reflects how difficult finding a partner can be, and how two people may pass very near each other without ever meeting. Overall, however, I think the project takes on an explorative tone rather than conveying an explicit message.
Personally, I worked mainly on the software side, in particular on getting the robots to recognize when they touched.
We used an mbed, a 3pi, and conductive film to create this project.
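The control logic described above can be sketched roughly as follows. This is a minimal Python sketch of the behavior only; the actual project ran on an mbed driving a 3pi, and the sensor and motor helpers here are hypothetical stand-ins for the hardware interface.

```python
import random

# Sketch of the robots' behavior: wander randomly until the conductive
# film registers contact, then spin together forever. The
# read_touch_sensor() and set_motors() callables are hypothetical
# stand-ins for the mbed/3pi hardware interface.

def next_motor_command(touched, rng=random):
    """Return (left_speed, right_speed) in [-1.0, 1.0].

    Before contact: sharp random wandering (independent wheel speeds).
    After contact: spin in place (opposite wheel speeds).
    """
    if touched:
        return (1.0, -1.0)  # spin in place
    return (rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0))

def run(read_touch_sensor, set_motors, steps):
    """Drive the robot for a number of control-loop steps."""
    latched = False  # once the robots touch, they spin together forever
    for _ in range(steps):
        latched = latched or read_touch_sensor()
        set_motors(*next_motor_command(latched))
```

The `latched` flag is what makes the spin permanent: a single moment of contact is enough, so the robots never go back to wandering.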
Video: https://vimeo.com/143955313
The code can be found here:
Our group built a MIDI synthesizer that, after the user plays a few notes, begins to play a harmony alongside them. The user can then play over this harmony as well, creating the impression of a more complete piece of music.
Our intention was to build a kids’ toy. Children have little exposure to music and huge imaginations. The generated harmony is thus analogous to the music children may be hearing in their heads, and is intended to bridge the gap between their imaginations and the sounds they can actually play. This applies beyond children: many adults with little or no musical training find a huge gap between their musical imaginations and what they are actually capable of playing. Since the harmonies on this synthesizer are automatically generated from preset chords and intervals, the instrument is meant both to bridge the gap between imagination and execution and to show that much of music follows the same guiding principles.
The synthesizer works by detecting changes in voltage from each pressure sensor, sending a byte signal mapped to that sensor, translating the byte signal into MIDI, and playing the result in GarageBand. The harmony is looped, and is generated by randomly choosing intervals and chords that make sense in the current context.
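The sensor-to-MIDI step can be sketched as follows. This is an illustrative Python sketch, not our actual firmware; the voltage threshold, base note, and velocity are hypothetical values, and the one-to-one note mapping follows the description above.

```python
# Sketch of the sensor-to-MIDI pipeline: each pressure sensor index maps
# one-to-one to a MIDI note, and changes in pressed state become MIDI
# note-on / note-off messages. PRESS_THRESHOLD, BASE_NOTE, and the
# velocity of 100 are assumptions for illustration.

PRESS_THRESHOLD = 0.5   # normalized voltage above which a key counts as pressed
BASE_NOTE = 60          # MIDI note for sensor 0 (middle C)

def sensor_to_midi_events(voltages, previously_pressed):
    """Turn one frame of sensor voltages into MIDI messages.

    voltages: list of normalized sensor readings (0.0 to 1.0)
    previously_pressed: set of sensor indices pressed on the last frame
    Returns (events, now_pressed); each event is a 3-byte MIDI message.
    """
    now_pressed = {i for i, v in enumerate(voltages) if v > PRESS_THRESHOLD}
    events = []
    for i in now_pressed - previously_pressed:            # newly pressed keys
        events.append(bytes([0x90, BASE_NOTE + i, 100]))  # note-on, velocity 100
    for i in previously_pressed - now_pressed:            # newly released keys
        events.append(bytes([0x80, BASE_NOTE + i, 0]))    # note-off
    return events, now_pressed
```

Tracking the previously pressed set is what turns a stream of raw voltages into discrete note-on and note-off events, which is the form a MIDI consumer like GarageBand expects.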
This project has much that can be improved. In particular, the harmony generation is very rudimentary. Some basic key detection and machine learning could make it better.
The code simply translates and forwards the data from the keyboard as a MIDI signal, then starts playing a looped harmony. The MIDI signal is generated using a one-to-one mapping between keys and MIDI notes. When a key is pressed, its note plays for a set amount of time.
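The looped harmony generation can be sketched like this. The specific interval set and loop length here are assumptions for illustration; the project's actual preset chords and intervals may differ.

```python
import random

# Sketch of the looped harmony: pick random consonant intervals over a
# root note drawn from what the user played. The CONSONANT_INTERVALS set
# and the default loop length are illustrative assumptions, not the
# project's actual presets.

CONSONANT_INTERVALS = [3, 4, 5, 7, 12]  # minor/major third, fourth, fifth, octave

def generate_harmony_loop(root_note, length=4, rng=random):
    """Return a list of MIDI notes forming a simple looped harmony."""
    return [root_note + rng.choice(CONSONANT_INTERVALS) for _ in range(length)]
```

Restricting the random choice to consonant intervals is what makes the harmony "make sense in the current context" even though it is generated randomly; the resulting list is then played back on repeat.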
Here is a diagram of our hardware:
The code can be found here:
The creators of Egg Minder claim that you will, “never be in a scramble for a good egg again.” After all, Egg Minder comes equipped with LED lights to indicate the age of eggs, push notifications to tell you when eggs are going bad or when you’re running low, and a mobile app that allows you to check the status of your eggs on the go.
Egg Minder was designed with the consumer in mind, only sending push notifications when you are in dire need of more eggs. Egg Minder is relatively cheap and connected, working best alongside your smartphone. As a plus, it looks nice as well.
Batband is a product that promises to deliver studio-quality sound without the intrusion of in-ear earphones. Batband works by emitting sound waves that reverberate through the skull and are then picked up by the inner ear, in a process the creators call “bone conduction technology”. Because Batband leaves your ear canal free, you can simultaneously process sound coming from the device linked to Batband and sound coming from the outside world. This means that you can listen to music, answer phone calls, or use any other sound-incorporating function on your phone without losing track of what’s going on in your immediate surroundings.
Batband combines portability and functionality.