bPolite consists of a chalkboard attached to a scrolling display showing a question. Members of the community can use the chalkboard space to answer the question and build a response together. Questions could be anything from how to deal with a neighborhood issue to what people's favorite ice cream flavor is. If a member of the community has an idea for a question, they can text the display from any phone (they don't have to have a smartphone) and their question will be posted to the display after being checked by a moderator who also lives in that community.
Additionally, photos of the responses are taken using a small camera that is pointed at the board. The pictures are posted to a website so that members of the community can see the responses to different questions (this was only partially implemented in our prototype).
We used a variety of tools and techniques to create our prototype, including: Arduino, Raspberry Pi, Twilio (for SMS), vacuum forming, and basic fabrication techniques. A full technical writeup can be found here.
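As a rough illustration (not our actual code), here is how the text-a-question flow could work in Java: Twilio can be configured to POST each incoming SMS to a small web server, which holds the question until the moderator approves it. The QuestionInbox class, the /sms endpoint, and port 8080 are all hypothetical names for this sketch:

    // Minimal sketch of the SMS moderation queue. Twilio delivers incoming
    // texts as an HTTP POST with form fields, including "Body" (message text).
    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.net.URLDecoder;
    import java.nio.charset.StandardCharsets;
    import java.util.Queue;
    import java.util.concurrent.ConcurrentLinkedQueue;

    public class QuestionInbox {
        // Questions wait here until the neighborhood moderator approves one.
        static final Queue<String> pending = new ConcurrentLinkedQueue<>();

        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/sms", exchange -> {
                String form = new String(exchange.getRequestBody().readAllBytes(),
                        StandardCharsets.UTF_8);
                String question = null;
                for (String pair : form.split("&")) {   // parse form-encoded body
                    String[] kv = pair.split("=", 2);
                    if (kv.length == 2 && kv[0].equals("Body")) {
                        question = URLDecoder.decode(kv[1], StandardCharsets.UTF_8);
                    }
                }
                if (question != null) pending.add(question); // hold for moderation
                // An empty TwiML response tells Twilio not to text anything back.
                byte[] reply = "<Response></Response>".getBytes(StandardCharsets.UTF_8);
                exchange.getResponseHeaders().set("Content-Type", "application/xml");
                exchange.sendResponseHeaders(200, reply.length);
                try (OutputStream os = exchange.getResponseBody()) { os.write(reply); }
            });
            server.start();
        }
    }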
Mock-up website:
Progress Photos:
by: Roberto Andaya
I have always tried taking naps in between classes, but they are never as comfortable as I would like. I always wish I had some sort of travel pillow system that didn't take up too much space. My NapJac idea does just that and more!
Reference:
http://people.csail.mit.edu/mrub/vidmag/
We live in an era of constant monitoring and electronic intervention. Mobile technologies have enabled fast computing on our persons in all aspects of our lives, including education, biometric monitoring, social media correspondence, and much more. With this in mind, we approach our final project with a focus on wearable devices, biometric monitoring, fitness tracking, and any other type of on-body sensing or actuation. This may include VR/AR, watches, e-textiles, etc.

The focus of this proposal is not to describe exactly how to build your project, but how a user would interact with it. Consider this an exercise in describing an experience, not a technology. The proposal will be in the form of a short video (between 15 and 30 seconds), and may use live action, stop motion, animation, or any other technique to describe your proposed experience. You have only two days to generate this video, so the focus is on the idea more than the craft of the video. A rough animation of a great idea will be more convincing than a polished but less developed proposal. You will work alone, and we will break into teams of two on Thursday to move forward. Good luck!
Particle's documentation can be found at: https://docs.particle.io/reference/javascript/
My updated files are at: https://github.com/arathorn593/IDeATePhysCompProject03-Cardboard
The Particle documentation says to install the library with npm install particle-api-js, but since our code runs in the browser from a script.js file, you must instead add this line to your index.html file:

    <script type="text/javascript" src="//cdn.jsdelivr.net/particle-api-js/5/particle.min.js"></script>

This line needs to go before you import the script.js file, so you should have your new line followed by the existing line that includes the script:

    <!-- this is the added line -->
    <script type="text/javascript" src="//cdn.jsdelivr.net/particle-api-js/5/particle.min.js"></script>
    <!-- this is the existing line -->
    <script src="js/script.js"></script>
Also, because the library is loaded from index.html, you do not need the

    var Particle = require('particle-api-js');

line that the Particle documentation says to include. Thus, the top of your init file should look something like:

    var particle;
    var token;

    function init() {
        particle = new Particle();
        //log in with your Particle account credentials (user and pass are
        //defined elsewhere in the file)
        particle.login({username: user, password: pass}).then(
            function(data) {
                console.log('API call completed on promise resolve: ', data.body.access_token);
                token = data.body.access_token;
            },
            function(err) {
                console.log('API call completed on promise fail: ', err);
            }
        );
        //....
    }
On the Photon side, the firmware registers a function with the Particle cloud so the webpage can call it:

    bool state = false;

    void setup() {
        pinMode(D7, OUTPUT);
        //register the function with the cloud
        Particle.function("light", light);
    }

    void loop() {
        //don't need to do anything here
    }

    //this function is called from the script.js file when
    //the stool is selected
    int light(String str) {
        //toggle the state of the light
        state = !state;
        //write to the light
        if (state) {
            digitalWrite(D7, HIGH);
        } else {
            digitalWrite(D7, LOW);
        }
        return 3; //return any number
    }
Then, from the script.js file, you can call that function on the Photon (deviceID is your Photon's device ID):

    var fnPr = particle.callFunction({
        deviceId: deviceID,
        name: 'light',
        argument: 'hi',
        auth: token
    });

    fnPr.then(
        function(data) {
            console.log('Function called successfully:', data);
        },
        function(err) {
            console.log('An error occurred:', err);
        }
    );
Let me know if I need to add anything to this post.
‘Einstein, Gravity & 101 years’
The initial seeds and direction of Einstein, Gravity & 101 years (E.G.101.Y) grew from my search for an elemental physical action that creates an underappreciated or surprising response and sound.
Technologies involved include: Processing programming tools and direction, which were fundamental to accomplishing the practical digital applications creating the audible and visual responses integrated into project E.G.101.Y. The iPhone Particle app (formerly Spark) was used to connect to and utilize the Photon, a Wi-Fi enabled circuit board, attached to a small breadboard; the adjustable-gain electret mic (VCC: 2.4-5.5V) held up famously after countless direct and indirect little glass ball strikes.
Stage 2: velocity banana
An Apple MacBook Pro facilitated near-infinite calculations and processes to help write this vital documentation; additionally, it helped send said code and visual signals to a ceiling-mounted video projector and receive signals from the aforementioned Photon. Miscellaneous power tools were utilized for presentation and prototype construction. Lastly, the grandmother of all technologies: 500,000-year-old fire.
Note: diagrams, photos, and sketches of progress are also integrated into the video attached below. The MacBook Pro powered the Photon over USB through its micro-USB input; the Photon was then connected to the electret mic by three wires, running ground, A0, and power on the Photon to GND, OUT, and VCC on the mic.
Photon wired for the electret mic connection
Adjustable-gain electret mic (VCC: 2.4-5.5V) connected to the Photon (above)
~~~~~~~~
VIDEO:
‘Einstein, Gravity & 101 years’ (a process-inclusive video presentation; click the link)
Einstein, Gravity & 101 years 1a
~~~~~~~~
Link to a GitHub repo or gist containing the code:
http://github.com/eonpcs/Working-Best-Rockstar-Einstein-Gravity-101-years.git
Inspiration was drawn from a few people and ideas for this sound-based project. Here is a short list of those thoughtful, inspired projects and people.
• John White Alexander's paintings have given me pause since I first experienced them, especially the pieces in the grand staircase at the Carnegie Museum of Natural History. I am not sure how I feel about the layers of humanity starting on the first floor as laborers, then on up to the third floor, where people are drifting up to the sunlit future built on the toil of those before them. This project allowed space to explore these ideas. Here is a link to the Museum's Grand Staircase page on this impressive three-floor mural.
• Sarah Sze is a site-specific artist I had the privilege and pleasure to assist during the Carnegie International art exhibition. She has shifted the idea of an art space: what is its place, and what are the rules? Sarah has been changing what thousands used to think was art, breaking through barriers in both the traditional constructs of the field and their physical responses. Sarah Sze's art materials are found anywhere she finds inspiration.
• Einstein's theory of general relativity is now confirmed: gravitational waves were recently detected by us humans! Einstein's 101-year-old theory has been tested for decades; the gravity ripple from two black holes created a proton-width blip detected by a set of man-made gravitational-wave detectors. Here is a great New York Times link to the ‘music of the cosmos’.
• James Turrell's Roden Crater, ‘a gateway to observe light, time, and space,’ is a massive crater that has been changed into a land art installation. It ‘engages with the sun, moon, stars and planets.’ Turrell is a light and large-scale interactive artist who often relates directly with the sky, stars, and light. Here is a link to J. Turrell's official website.
(Night photo of the Roden Crater site-specific art installation by James Turrell)
– Photon, electret mic, and Processing code, imagined into this projected, multi-layered, active visual response to sound.
Lessons learned include:
This slower-moving, inspirational Project 2 brought many insights. Computer code was utilized for interactive visual ellipses that moved with changing color, a thoughtful, relevant application. The sound-input visualization tool brought an unexpected response to the surprising thud and bell-ring finale of the presentation. Fortunately, the healthy pause in our daily grind from Einstein, Gravity & 101 Years gave satisfying emotional results and responses from those who experienced it.
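As a loose illustration of that ellipse idea (not the project's actual code), a minimal Processing sketch might look like the following; the input level is simulated here, where the real piece drove it from the Photon and the mic:

    // Sound-reactive ellipses: size and color follow an input level.
    float level = 0;

    void setup() {
      size(640, 480);
      noStroke();
    }

    void draw() {
      background(0);
      level = abs(sin(frameCount * 0.02));   // stand-in for the mic amplitude
      for (int i = 0; i < 12; i++) {
        // noise() gives each ellipse a slow, wandering path
        float x = width  * noise(i, frameCount * 0.005);
        float y = height * noise(i + 100, frameCount * 0.005);
        fill(lerpColor(color(30, 60, 200), color(240, 200, 40), level), 180);
        ellipse(x, y, 20 + 80 * level, 20 + 80 * level);
      }
    }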
The early-stage prototype in a glass box brought to light physical, sound, and visual access points to be reconsidered. Re-framing restrictions brought focused energy and clarity to resolving the physical, electrical, audio, light, and conceptual challenges.
Video link to glass box prototype:
PCStudio prj_ 2 glass box – HD 720p
Next stages in this project could bring more audio cues and projected responses: multiple physical reactions to unexpected, thoughtful twists, including analogies and references to our time here in relation to the greater, humbling hope for a universal understanding of how and where we came from.
Can our inspired incremental projects provide power filled distractions pivoting us toward the next great impactful invention?
Yes.
Thank you for reading; it was a privilege and a pleasure to share.
Overview
Bike Buddy is a bike computer that uses sound to generate its data. Made using minimal components, Bike Buddy uses a simple contact mic that plugs directly into your phone for maximum convenience.
Inspiration
I bike a lot, so I like to keep track of the miles I ride and how fast I am going. This semester I also built a bike computer with a group of friends for Build18, so I had this in mind when I approached this project. However, unlike the bike computer I helped make for Build18 (seen below), Bike Buddy is minimal and uses an Android phone to process and display the data from the wheel.
My Build18 bike computer used rare earth magnets mounted on the wheel to trigger a hall-effect sensor mounted on the fork. This information was then processed and displayed by a LightBlue Bean microcontroller.
Once I had the idea of using sound as the input to a bike computer, I was also inspired by the childhood practice of sticking a card into the spokes of a bike wheel to make lots of sound. I started from this concept of having something stationary on the frame hit all of the spokes, but after several iterations, I settled on what became Bike Buddy.
Technology
I used a piezo contact mic to pick up the sound of a zip tie on the wheel hitting a piece of wood mounted on the front fork. I then plugged this mic (with very minimal circuitry) into an Android phone running a custom app.
Process
My initial idea was to put a zip tie around the fork of my bike and have it stick into the spokes. I would then pick up the ticks with an electret microphone. I had intended to mount a LightBlue Bean on the fork in a laser-cut enclosure; the Bean would do all of the data processing needed to get speed and distance. However, this proved to be overly complicated in several respects. First, many spokes would be hit every revolution, and the spoke pattern on my wheels is not completely even. Additionally, electret mics pick up a significant amount of noise (especially at high speeds), which would make detecting the spoke hits difficult. The Bean also introduced the complex problem of visualizing the data after the sound was processed. Because of this, I settled on mounting a piece of wood onto the fork and putting a zip tie around the air valve on my wheel. This way, there would be only one tick per rotation, and I would get a great place to mount a contact mic. The contact mic rejected almost all outside noise, so I was just hearing the ticks of the zip tie hitting the wood piece.
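With one tick per rotation, the math becomes simple: if the wheel circumference is C and the time between ticks is t, the speed is v = C / t, and the distance is just C times the total tick count. For example, assuming a roughly 2.1 m circumference (typical for a 700c road wheel), a tick every 0.35 s gives v = 2.1 / 0.35 = 6 m/s, or about 13.4 mph.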
To overcome the issue of visualizing data from the LightBlue Bean, I just cut it out entirely and plugged the mic directly into the phone via a TRRS plug. In order to get good data, I had to wire the piezo up to a capacitor and a pull-down resistor. I also added a 100-ohm resistor between the right and left audio output channels and ground so that the phone thought it was a pair of earbuds.
I soldered up the circuit on some perfboard and then used heat shrink to protect it and the piezo.
After I had the physical and electronic hardware sorted out, I had to write an app that read the mic and processed the data. I used the official Android documentation and lots of googling to solve the many problems that came up while making the app (I have omitted these since many were specific to my setup). To actually read the audio input in the app, I used the AudioRecord class in a separate thread.
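As a condensed sketch of that approach (not the app's exact code), the recording thread can look something like this; the wheel circumference and amplitude threshold are assumed values you would tune, and the app needs the RECORD_AUDIO permission:

    // Read the mic on a background thread and count zip-tie ticks with a
    // simple amplitude threshold; one tick = one wheel revolution.
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    public class TickReader implements Runnable {
        static final int SAMPLE_RATE = 44100;
        static final double WHEEL_CIRCUMFERENCE_M = 2.1; // assumed 700c wheel
        static final int THRESHOLD = 8000;               // assumed tick cutoff
        volatile boolean running = true;

        @Override
        public void run() {
            int bufSize = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, bufSize);
            short[] buf = new short[bufSize];
            long totalSamples = 0;
            long lastTick = -1;                  // sample index of the last tick
            rec.startRecording();
            while (running) {
                int n = rec.read(buf, 0, buf.length);
                for (int i = 0; i < n; i++, totalSamples++) {
                    // Debounce: ignore anything within ~50 ms of the last tick.
                    if (Math.abs(buf[i]) > THRESHOLD
                            && (lastTick < 0 || totalSamples - lastTick > SAMPLE_RATE / 20)) {
                        if (lastTick >= 0) {
                            double dt = (totalSamples - lastTick) / (double) SAMPLE_RATE;
                            double speedMps = WHEEL_CIRCUMFERENCE_M / dt; // m/s
                            // ...hand speedMps and accumulated distance to the UI thread
                        }
                        lastTick = totalSamples;
                    }
                }
            }
            rec.stop();
            rec.release();
        }
    }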
Code
The code for the app is on github: https://github.com/arathorn593/Bike-Buddy
Reflection
While working on this project, I learned how a simple project can still have lots of interesting tech and design problems. However, I am most excited about the future possibilities of this device. Since the processing is done on the phone, the speed and distance data could easily be linked to GPS data or hooked into a quantified-self ecosystem.
Additionally, I am very interested in a potential variant of this device where the mic is mounted directly on the front fork. Then potholes could be detected and linked to GPS data from the phone. This would provide a way to map road conditions for bikes in real time.
What started as a sound-generating glove mechanism turned into a gestural puppet controller.
Video (that will be replaced with better video shortly):
Idea genesis:
This project began with my fascination with gestural technology. Artists like Laetitia Sonami have been making waves in the world of unlikely sound generation. Sonami's project “Lady's Glove” features a sound-generating glove controlled by finger movement. The result is a cohesive performance in which Sonami combines simple finger flexion with an array of sound effects.
My objective was to do something similar, but instead of embedding the glove with an arsenal of noise, I was more interested in creating a simple sound gradient, with the angle of flexion of each finger calculated and mapped to the sound.
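To make the sound-gradient idea concrete, here is a loose sketch in plain Java (the actual project used a Max patch, linked below): a flexion angle, assumed to run 0-90 degrees, maps to a continuous two-octave pitch sweep, here simulated and played as a sine tone:

    // Map finger flexion to a continuous pitch and play it as a sine wave.
    import javax.sound.sampled.AudioFormat;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.SourceDataLine;

    public class FlexTone {
        // Map a flexion angle (assumed 0-90 degree sensor range) to 220-880 Hz.
        static double angleToFreq(double angleDeg) {
            double t = Math.max(0, Math.min(1, angleDeg / 90.0));
            return 220.0 * Math.pow(2, 2 * t);   // two octaves of gradient
        }

        public static void main(String[] args) throws Exception {
            AudioFormat fmt = new AudioFormat(44100, 16, 1, true, true);
            SourceDataLine line = AudioSystem.getSourceDataLine(fmt);
            line.open(fmt);
            line.start();
            double phase = 0;
            byte[] buf = new byte[4410 * 2];                 // 0.1 s chunks
            for (double angle = 0; angle <= 90; angle += 3) { // simulated bend
                double freq = angleToFreq(angle);
                for (int i = 0; i < buf.length / 2; i++) {
                    short s = (short) (Math.sin(phase) * 12000);
                    buf[2 * i]     = (byte) (s >> 8);        // big-endian sample
                    buf[2 * i + 1] = (byte) s;
                    phase += 2 * Math.PI * freq / 44100;
                }
                line.write(buf, 0, buf.length);
            }
            line.drain();
            line.close();
        }
    }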
How did that turn out?
Not exactly what I described.
Technologies used.
Photos:
coming soon!!!
Gist Link to Max Patch:
https://gist.github.com/LValley/e4b7d09429ce168d6ad0
Inspiration Links:
Lady’s Glove
http://sonami.net/ladys-glove/
(and very loosely) Stelarc’s third hand:
http://stelarc.org/?catID=20265
After thoughts:
While this project was somewhat of a wild ride, I am glad that I ended up with a functioning project that made sense.
In the future: definitely more finite planning.
Inspiration
At first I wanted to have this exist out in the world, preferably out over one of the rivers. But due to the scope of the project, the time we had, and some technical issues, I scaled it down significantly. I decided on a light and silly output: a running animation. I wanted to get more familiar with Processing and figured this would be a good project to start doing that.
Technologies Used
I used a Particle Photon board to do the initial signal processing from the piezo microphone on the instrument. Then I used Processing to control the animation with a serial input.
Photos
Here are some photos of earlier prototypes for the propellers
I even tried to make my own propellers so I wouldn’t have to use spoons, but unfortunately I couldn’t get a good form from the vacuum former.
Here's a sketch for the final animation. Hopefully I can redo the animation so that it's more than a woman running.
Code
https://github.com/dcamposzamora/windmillanimation
The file named “Switching_animations.pde” is the first code I showed for the critique; it switched between two different animations. The second file, “Slow_still_frames.pde”, is the one in the video, where the animation moves depending on the serial input: if the wind hits faster, the images of the animation switch faster.
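As a simplified illustration of that behavior (not the actual Slow_still_frames.pde), a Processing sketch can advance frames faster as the serial value rises; the frame filenames and the 0-1023 reading range from the Photon are assumptions here:

    // Advance animation frames in proportion to the latest serial reading.
    import processing.serial.*;

    Serial port;
    PImage[] frames;
    float frameIndex = 0;
    int level = 0;                       // latest wind/mic reading from serial

    void setup() {
      size(640, 480);
      frames = new PImage[8];
      for (int i = 0; i < frames.length; i++) {
        frames[i] = loadImage("run" + i + ".png");   // hypothetical frame files
      }
      port = new Serial(this, Serial.list()[0], 9600);
      port.bufferUntil('\n');
    }

    void serialEvent(Serial p) {
      String s = p.readStringUntil('\n');
      if (s != null) level = int(trim(s));
    }

    void draw() {
      background(255);
      // Harder wind -> bigger step -> faster animation.
      frameIndex = (frameIndex + map(level, 0, 1023, 0, 1.5)) % frames.length;
      image(frames[int(frameIndex)], 0, 0, width, height);
    }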
External Libraries
I used examples from the Processing reference libraries for this project.
Conclusion
I had a lot of difficulty with this project, so the final product is far from what I envisioned. Since I was caught up with conceptual roadblocks, I had less time to troubleshoot the technical difficulties that arose. But in the end, I'm glad I got the microphone and animation working. Initially I thought of this as just a kind of dumb, fun project to get to know some of the software and hardware better, but during the critique, the suggestion that something like this could be applied to children's toys or books was really interesting to me. It makes me wonder about the possibilities of using interactive technologies to expand on children's books (Goosebumps choose-your-own-adventures ×100) or cartoons and short animations.