Uncategorized – F15 60-223: Intro to Physical Computing
https://courses.ideate.cmu.edu/60-223/f2015
Carnegie Mellon University, IDEATE
Thu, 17 Dec 2015 20:19:25 +0000

Final Project: Spark Wand v2
https://courses.ideate.cmu.edu/60-223/f2015/final-project-spark-wand-v2/
Thu, 17 Dec 2015 17:37:17 +0000

[Image: SAM_0026]

For my final project for physical computing, I decided to continue developing the Spark Wand. To see how the project evolved from its last major iteration, see the earlier post "Wearables & Out in the World: Spark Wand" below.

The Spark Wand is part of a multidisciplinary theatrical project integrating quadcopters, a controller (the Spark Wand), motion-capture technology, and an actor controlling the system. The pieces are tied together using the Robot Operating System (ROS), which does a great job of passing signals and operations between multiple pieces of software and hardware so that they work together cohesively. My part of the project had to communicate with ROS, which I did through rospy; the results look like this:
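The wand-to-ROS wiring can be sketched in a few lines of Python. This is not the project's actual code: the topic name, command names, and button-to-command mapping below are all hypothetical, and in the real system a rospy node would do the publishing.

```python
# Hypothetical sketch: mapping wand button presses to ROS topic messages.
# The topic and command names are invented for illustration.

def wand_event(button_index):
    """Map one of the wand's 10 buttons to a (topic, message) pair."""
    commands = {
        0: "takeoff", 1: "land", 2: "forward", 3: "back",
        4: "left", 5: "right", 6: "up", 7: "down",
        8: "hover", 9: "emergency_stop",
    }
    if button_index not in commands:
        raise ValueError("unknown button: %d" % button_index)
    return ("/spark_wand/command", commands[button_index])

# In the real system a rospy node would publish these, roughly:
#   pub = rospy.Publisher("/spark_wand/command", String, queue_size=10)
#   pub.publish(wand_event(idx)[1])

print(wand_event(0))  # ('/spark_wand/command', 'takeoff')
```

The point of routing everything through one topic is that ROS subscribers (the motion-capture node, the quadcopter controllers) can all listen to the same command stream.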

[Demo: imgur post]

Working with the aerial robotics lab and some of the members involved in the project, I was able to create the first prototype of the Spark Wand. The name comes from a component called the Spark/Particle Photon. The Photon is an Arduino-like microcontroller with an integrated WiFi module; it is programmed in the same language as the Arduino and adds features such as web-based development and browser-based testing. The primary reason I used this board was its convenient form factor: its dimensions are 1.44 in. x 0.8 in. x 0.27 in. with headers on, which is very convenient for making an easily portable, lightweight controller/wand. The Photon's small footprint, however, was only one of several choices that kept the controller itself compact.

[Image: SAM_0022]

The controller has 10 buttons plus a laser button that relay information into a "dashboard" included with Particle's development tools, and also send information through ROS. The Dashboard lets me monitor the signals: it counts how many times they come in, records the time each signal is received, and gives a graphical representation. An example is shown below:

[Image: device-management-logs-98d34920]

The wand is made to be comfortable, easy to use, and lightweight, with no external wires or cables. Even on a merely decent WiFi connection the latency is not noticeable, and the results are reliable. In this revision I focused primarily on making the form factor sleeker and more comfortable. The most noticeable change is that there are no exposed wires: a 3D-printed case houses all of the components and buttons while leaving the Photon exposed for feedback and access.

[Images: SAM_0028, SAM_0030]

[Images: SAM_0029, SAM_0031]

The main features are: an on/off switch for power control; a 1C LiPo battery that provides power in a sleek package; custom grippers that house the customized laser pointer, saving space while keeping the laser in a constant position; and a protoboard that gathers all of the wires and buttons into one compact form factor.

[Images: controllerassembly1, controllerassembly2]

Final Project: Telemouths Take 2
https://courses.ideate.cmu.edu/60-223/f2015/final-project-telemouths-take-2/
Thu, 17 Dec 2015 09:28:17 +0000

Telemouths is an experiment in creating tools for video-game-style storytelling in a physical setting. Version 1 allowed a remote user to speak through a mask on another participant's face; version 2 attempts to incorporate the beginnings of game logic by automating speech, triggered by a user's proximity to points of interest.

[Video: YouTube / Dan Sakamoto]

Version 2 is a mixed success. Still to be resolved are the issues of getting a tone in the desired frequency range to be omnidirectional, and of getting the iPhone microphone to pick up those frequencies. Finally, the signal-producing circuit drifted in pitch when generating sufficiently high tones, and had to be set aside for now.

Speech snippets in this version are taken from "The Sims 2".

Documentation of iteration 1 can be found here.
The signal-generating circuit is based on this tutorial, just with one of the resistors changed in order to change the frequency.
Unity code is on GitHub. Note that it's the same repository as the last iteration, with the new version of the Unity file added.
The dog whistle app (used in place of the signal-generating circuit) was downloaded from the Apple App Store here.

Final Project: Bacterial Record Player
https://courses.ideate.cmu.edu/60-223/f2015/final-project-bacterial-record-player/
Wed, 16 Dec 2015 21:13:10 +0000

The bacterial record player is a home audio device that sonifies bacterial colonies. 'Records' are produced by collecting bacterial samples from the body and environment and incubating them on agar plates. Each culture sounds unique.

[Video: Vimeo / Thomas Eliot]

[Images: IMG_6062, IMG_6057]

How it works:

Agar plates are inserted into the viewing chamber, where photographs are taken with a Canon DSLR. A Python backend uses gphoto2 to control the camera and OpenCV to process the image. A 4-connectivity algorithm detects the areas and locations of bacterial colonies, and the colony data is transmitted to Processing via OpenOSC. The colonies are drawn in Processing and can be interacted with: clicking on a colony causes a wave to propagate outward from it, creating a tone whenever the wave intersects another colony. Tones are created by stretching or shortening a 2-second 440 Hz sinusoid clip; the pitch is based on the proportions of the areas of the expanding and colliding colonies.
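The 4-connectivity colony detection can be sketched in plain Python. This is an illustration of the idea rather than the project's actual code: the real backend runs OpenCV on DSLR photographs, while the binary grid and function names here are invented for the example (1 = bacteria, 0 = agar).

```python
# Sketch of 4-connectivity labeling: flood fill that only visits the four
# orthogonal neighbors (no diagonals), yielding each colony's area and centroid.
from collections import deque

def label_colonies(grid):
    """Return a list of colonies; each colony is (area, (row, col) centroid)."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    colonies = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                queue, cells = deque([(r, c)]), []
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and grid[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                area = len(cells)
                cy = sum(y for y, _ in cells) / area
                cx = sum(x for _, x in cells) / area
                colonies.append((area, (cy, cx)))
    return colonies

img = [[1, 1, 0, 0],
       [1, 0, 0, 1],
       [0, 0, 0, 1]]
print(label_colonies(img))  # two colonies: area 3 and area 2
```

The area values are what the pitch mapping would consume: under the description above, the playback rate of the 440 Hz clip would be scaled by the ratio of the two colliding colonies' areas.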

code

 

The setup also produced some nice photographs of the colonies themselves:

[Image: 2145]

Wearables & Out in the World: Spark Wand
https://courses.ideate.cmu.edu/60-223/f2015/wearables-out-in-the-world-spark-wand/
Sat, 05 Dec 2015 06:40:27 +0000

by Kevin Apolo

[Video: YouTube / musiczheir]

[Image: IMG_3264]

The Spark Wand is part of a multidisciplinary theatrical project integrating quadcopters, a controller (or wand), motion-capture technology, and an actor controlling the system. Working with the aerial robotics lab and some of the members involved in the project, I was able to create the first prototype of the Spark Wand. The name comes from a component called the Spark/Particle Photon. The Photon is an Arduino-like microcontroller with an integrated WiFi module; it is programmed in the same language as the Arduino and adds features such as web-based development and browser-based testing. The primary reason I used this board was its convenient form factor: its dimensions are 1.44 in x 0.8 in x 0.27 in with headers on, which is very convenient for making an easily portable, lightweight controller/wand.

[Image: spark6]

The controller has 10 buttons that relay information (for the time being) into a "dashboard" included with Particle's development tools. The dashboard lets me monitor the signals: it counts how many times they come in, records the time each signal is received, and gives a graphical representation. An example is shown below:

[Image: device-management-logs-98d34920]

The controllers will interface with ROS (the Robot Operating System), which takes multiple inputs and communicates between the system, the wand, the motion-capture system, and the quadcopters.

[Images: IMG_3267, spark3]

The wand is made to be comfortable, easy to use, and lightweight, with no external wires or cables. Even on a merely decent WiFi connection the latency is not noticeable, and the results are reliable.

[Image: sparkcontrollerSW1]

D3b: Police Gun
https://courses.ideate.cmu.edu/60-223/f2015/d3b-police-gun/
Fri, 04 Dec 2015 02:40:09 +0000

This gun does not throw bullets where it is aimed. Instead it projects the names of innocents murdered by police. It forces the wielder to reflect on the damage of which they are capable, and the mistakes made by those before them.

An iPhone runs OpenCV on the camera image to identify faces. It generates a name and date of death for each body and fixes it to an interpolated chest position. This is projected back onto the person with an Optoma ML750 picoprojector connected to the iPhone via a Lightning-to-HDMI adapter. The iPhone and projector are aligned and given the form of a handgun using an OpenBeam chassis. 'Firing' the gun by recoiling it upwards displays another name, using the iPhone gyroscope.
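The recoil gesture can be sketched as a simple threshold on the gyroscope's pitch rate with a debounce so one recoil counts once. This is a hypothetical sketch, not the project's Objective-C code: the class name, threshold values, and axis choice are all assumptions.

```python
# Hypothetical sketch of the 'firing' gesture: advance to the next name
# when the gyroscope reports a sharp upward recoil. Values are illustrative.
class RecoilTrigger:
    def __init__(self, names, threshold=3.0):
        self.names = names          # list of names to cycle through
        self.threshold = threshold  # pitch rate (rad/s) treated as a recoil
        self.index = 0
        self.armed = True           # re-arm only after motion settles

    def update(self, pitch_rate):
        """Feed one gyroscope pitch-rate sample; return the current name."""
        if self.armed and pitch_rate > self.threshold:
            self.index = (self.index + 1) % len(self.names)
            self.armed = False      # debounce: ignore until wand is steady
        elif pitch_rate < 0.5:
            self.armed = True
        return self.names[self.index]

gun = RecoilTrigger(["Name A", "Name B", "Name C"])
gun.update(0.1)   # steady: still "Name A"
gun.update(4.0)   # recoil: advances to "Name B"
```

The debounce matters: without it, a single recoil spanning several gyroscope samples would skip through multiple names.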

Previously I had received feedback on the first iteration of the project. The cardboard phone attachment was replaced with an aluminum construction, making the device more durable and professional-looking. Another criticism was that projecting just a heart onto a person is something a simple flashlight can do, so the content of the projection was expanded.

Taking the project to the next step would mean strengthening the emotional connection to the victims portrayed, as a name and date is a relatively clinical representation of a life. The device could be made into a weapon attachment for use in empathy training of police forces. Users reported that the light from the projector was too bright when pointed directly at them; I reduced the brightness and found that the effect was improved and the face recognition functioned better.

Code:

myFaceRec

Wearable and Out-in-the-world: Crash Helmet
https://courses.ideate.cmu.edu/60-223/f2015/wearable-and-out-in-the-world-crash-helmet/
Fri, 04 Dec 2015 02:14:08 +0000

By Varun Gupta (varung1) and Craig Morey (cmorey)

The Crash Helmet is a bike helmet that marks out locations where people have been killed while cycling in Pittsburgh, both for the biker wearing the helmet and for the rest of the community. The helmet currently uses the location of the recent death of cyclist Susan Hicks near Forbes and Bellefield.

The Crash Helmet uses a LightBlue Bean to power four red LEDs. The Bean connects to an iPhone over Bluetooth, and when the iPhone comes within 20 meters of a crash site, the LEDs on the helmet go dark. Additionally, the Bean is connected to a vibration motor in the front of the helmet to signal crash locations to the biker.
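The 20-meter proximity check could look roughly like this on the phone side. This is a sketch under assumptions: the coordinate shown is only an approximation of the Forbes and Bellefield area, and the function names are invented, since the actual app code is not published here.

```python
# Sketch of a geofence check: great-circle distance between the phone's GPS
# fix and each stored crash site, with a 20 m trigger radius.
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters (haversine formula)."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

CRASH_SITES = [(40.4445, -79.9503)]  # approximate, for illustration only

def near_crash_site(lat, lon, radius_m=20.0):
    """True when the rider is within radius_m of any stored crash site."""
    return any(distance_m(lat, lon, s[0], s[1]) <= radius_m for s in CRASH_SITES)
```

When `near_crash_site` flips to true, the app would tell the Bean to darken the LEDs and pulse the vibration motor.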

Below is a close-up of the LightBlue Bean, the LEDs, and the vibration motor on the inside of the helmet.

[Images: DSC_0015, DSC_0016]

 

Below is the Crash Helmet in action at the site of the crash.

[Video: YouTube / Varun Gupta]

Below is a Fritzing diagram of the circuit.

[Image: crash_helmet_v2]

Wearable Game Design: Journey Car
https://courses.ideate.cmu.edu/60-223/f2015/wearable-game-design-journey-car/
Thu, 03 Dec 2015 11:59:14 +0000

By: Roberto Andaya and Steven Ji

[Video: YouTube / Michelle Ma]

Journey Car is an RC car game in which two players work together to knock down beacons located throughout a specified course. One player drives the vehicle while the other controls the wireless camera view. Both players watch the same wireless camera feed and communicate with each other to knock the beacons down.

The general idea is based on the popular PlayStation video game "Journey", in which two players work together to reach a mountain with no formal communication except cryptic noises. This inspired us to give players an experience in which they cannot physically see each other and must instead look out from the perspective of the car.

Here is the car system:

[Image: DSC_0463]

The Journey Car is wearable in the sense that the player carries the entire control system on their person. Everything is packaged and powered by batteries. We made the experience as ready-to-go as possible, so that people can simply pick it up and play.

Steven Ji was in charge of creating this packaging system, studying how Google Cardboard enclosures are constructed. He went through half a dozen iterations to reach the final design.

[Images: DSC_0477, DSC_0474]

The wearable gear contains batteries that power both the receiver for the wireless camera and the Fatshark goggles, which let the user see the video from the wireless camera. Below are videos that show the packaged system in action.

Here is how the mobile viewer and control systems function:

[Video: YouTube / Chia Chi]

[Video: YouTube / Jenna Choo]

Below is a picture of the car system with the wireless camera mounted on the front. Roberto Andaya was in charge of making sure the car system worked and was built solidly enough to survive use.

[Image: DSC_0234]

The camera mount is laser-cut acrylic, specifically designed to attach to the servo beneath it so that it is both secure and able to rotate.

[Image: DSC_0243]

A problem with the initial designs was the sheer force generated by the RC car. Cardboard would not do, and neither would acrylic. We created a polycarbonate base-and-top system roughly 20 times stronger than acrylic: the base holds the system together, and after some experience with the vehicle we found that the RC car could tip over, so a top piece was added to protect the equipment.

Below is the 2.4 GHz radio control system used to drive the RC vehicle and turn the camera servo. It consists of an HK-GT2 transmitter sending signals to an HK-GT2B receiver, which passes the commands on to the controlled units.

[Images: controller, receiver]

Feedback:

During our first prototype presentation, other members of the class were excited about the project, but their main issue was that it felt too limited by its environment. We had settled on the idea of the RC car solving an indoor puzzle, but the class was more interested in the maximum speed the car could reach, which was not possible in a room as small as the presentation space. After that we decided to create an outdoor speed game.

 

Journey Car: Prototype
https://courses.ideate.cmu.edu/60-223/f2015/journey-car-prototype/
Sat, 21 Nov 2015 00:24:25 +0000

The link below has all the information regarding the prototype:

The Game.compressed

 

D3a: Body Canvas Projection Prototype
https://courses.ideate.cmu.edu/60-223/f2015/d3a-body-canvas-projection-prototype/
Fri, 20 Nov 2015 20:07:57 +0000

I created a handheld device that, when pointed at a person, projects a heart onto their chest. It makes the recipient vulnerable by revealing to the public the location of this vital organ. OpenCV runs on an iPhone attached to a projector; the heart location is extrapolated from the head position.

[Image: IMG_4255]

video:

iPhone Obj-C:

FaceRec

Wearable and Out In the World (Prototype) – Repman
https://courses.ideate.cmu.edu/60-223/f2015/wearable-and-out-in-the-world-prototype-repman/
Tue, 17 Nov 2015 19:54:21 +0000

by Jonathan Dyer, Anatol Liu and Kiran Matharu

[Image]

When you hear the word "wearable", what's the first company you think of? Most likely your answer is "Fitbit", the $300 million wearable company that just filed for an IPO. Fitbit has made its money by promising to improve your workout by tracking your heart rate and telling you how effective you've been. But how useful is information that someone would have to teach you how to use? Wearables should complement and improve our current workout routines, not force us to adopt entirely new behavior. That was our thinking when we designed "Repman", a rep-tracking wristband.

Repman is a wristband that uses accelerometers to count your reps as you go and vibrates when you've finished your set. There is no need to interact with an app or an Apple Watch: simply press the button and begin your workout.

The wristband works by collecting Y-axis acceleration data from the LightBlue Bean. It first smooths the data using a smoothing filter found on the Arduino website, then uses a peak-detection algorithm and counts each peak as a rep. Any rep that is too close to another is discarded as noise. The code and circuit diagram can be found here:
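The smoothing-plus-peak-detection pipeline can be sketched in a few lines of Python. The Bean itself runs Arduino code, so this is only an illustration of the algorithm; the window size, peak threshold, and minimum gap below are chosen for the example rather than taken from the project.

```python
# Sketch of the rep-counting logic: moving-average smoothing followed by
# peak detection, where peaks too close to the previous rep count as noise.
def smooth(samples, window=5):
    """Simple moving average, in the spirit of the Arduino 'Smoothing' example."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        chunk = samples[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def count_reps(samples, threshold=1.0, min_gap=10):
    """Count peaks in smoothed Y-acceleration; a peak closer than min_gap
    samples to the previous counted rep is discarded as noise."""
    data = smooth(samples)
    reps, last_peak = 0, -min_gap
    for i in range(1, len(data) - 1):
        is_peak = (data[i] > threshold
                   and data[i] >= data[i - 1]
                   and data[i] > data[i + 1])
        if is_peak and i - last_peak >= min_gap:
            reps += 1
            last_peak = i
    return reps
```

Feeding in a trace with two well-separated acceleration bumps yields two reps, while two bumps closer together than the minimum gap count as one.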

[Image]
