Joseph Paetz – Physical Computing Studio
CMU | Spring 2016 | 48-390
https://courses.ideate.cmu.edu/48-390/s2016

Final Project – bPolite
Sun, 02 Oct 2016
https://courses.ideate.cmu.edu/48-390/s2016/2016/10/02/final-project-bpolite/

bPolite is a prototype community-curated message board intended to create a place where community discussions can happen in the physical world, thus increasing the sense of community within a neighborhood.

[image: copy-of-img_4050]

bPolite consists of a chalkboard attached to a scrolling display that shows a question. Members of the community can use the chalkboard space to answer the question and build a response together. Questions can range from how to deal with a neighborhood issue to what people's favorite ice cream flavor is. If a member of the community has an idea for a question, they can text it to the display from any phone (no smartphone required), and their question will be posted to the display after being checked by a moderator who also lives in that community.
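The text-in/moderate/display flow above can be sketched as a small queue. This is an illustrative sketch only, in plain JavaScript with hypothetical names — the actual prototype received the texts through Twilio, but the queue logic is the same idea:

```javascript
// Hypothetical sketch of bPolite's question-moderation queue.
// In the real prototype, receive() would be fed by a Twilio SMS webhook.
class QuestionQueue {
  constructor() {
    this.pending = [];   // questions awaiting moderator review
    this.approved = [];  // questions cleared for the display
  }

  // Called when a text message arrives.
  receive(phoneNumber, text) {
    this.pending.push({ from: phoneNumber, text: text });
  }

  // The community moderator approves or rejects the oldest question.
  moderate(approve) {
    const q = this.pending.shift();
    if (q && approve) this.approved.push(q.text);
    return q;
  }

  // The scrolling display shows the most recently approved question.
  current() {
    return this.approved.length > 0
      ? this.approved[this.approved.length - 1]
      : "Text a question to this board!";
  }
}
```

Rejected questions simply never reach the display, which is what keeps the board community-curated rather than a free-for-all.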

Additionally, photos of the responses are taken using a small camera that is pointed at the board. The pictures are posted to a website so that members of the community can see the responses to different questions (this was only partially implemented in our prototype).

We used a variety of tools and techniques to create our prototype, including an Arduino, a Raspberry Pi, Twilio (for SMS), vacuum forming, and basic fabrication techniques. A full technical writeup can be found here.

Mock-up website:

[image: copy-of-1-beacon-home-page]

Progress Photos:

[image: copy-of-img_3973]

[image: copy-of-img_3967]

[image: copy-of-img_3943]

[image: copy-of-img_3936]

[image: copy-of-img_3933]

[image: copy-of-img_3784]

Final Project Proposal – Joseph Paetz
Thu, 31 Mar 2016
https://courses.ideate.cmu.edu/48-390/s2016/2016/03/31/final-project-proposal-joseph-paetz/

Calling Photon Functions from JavaScript
Mon, 21 Mar 2016
https://courses.ideate.cmu.edu/48-390/s2016/2016/03/21/calling-photon-functions-from-javascript/

Sky showed me Particle's JavaScript library, and I thought I would post what I had to do to get it working.

Particle’s documentation can be found at: https://docs.particle.io/reference/javascript/

My updated files are at: https://github.com/arathorn593/IDeATePhysCompProject03-Cardboard

  1. From the terminal, run:   npm install particle-api-js
  2. To use the library in your main script.js file, you must add this line to your index.html file: <script type="text/javascript" src="//cdn.jsdelivr.net/particle-api-js/5/particle.min.js"></script>. This line needs to go before the line that imports script.js, so you should have your new line followed by the existing one:
    <!-- this is the added line -->
    <script type="text/javascript" src="//cdn.jsdelivr.net/particle-api-js/5/particle.min.js"></script>

    <!-- this is the existing line -->
    <script src="js/script.js"></script>
  3. Next, add the login code at the top of your init function. Make sure to save your token to a global variable, and don't include the var Particle = require('particle-api-js'); line that the Particle documentation says to include (that line is for Node.js; in the browser, the script tag above already makes Particle available globally). The top of your init function should look something like:
    var particle;
    var token;

    function init() {
        particle = new Particle();

        // user and pass hold your Particle account credentials
        // (defined elsewhere)
        particle.login({username: user, password: pass}).then(
            function(data) {
                console.log('API call completed on promise resolve: ', data.body.access_token);
                token = data.body.access_token;
            },
            function(err) {
                console.log('API call completed on promise fail: ', err);
            }
        );

        //....
    }
  4. Now you are set up and can use the functions in the library. The only one I am using right now is callFunction. I call it whenever the default stool object is selected (check out the picker function in script.js). To call a function on your Photon, you also need to register the function with the cloud in your Photon code (see https://docs.particle.io/reference/firmware/photon/).
    • My photon code:
      bool state = false;
      
      void setup() {
          pinMode(D7, OUTPUT);
          
          //register the function with the cloud
          Particle.function("light", light);
      }
      
      void loop() {
          //don't need to do anything here
      }
      
      //this function is called from the script.js file when
      //the stool is selected
      int light (String str) {
          //toggle the state of the light 
          state = !state;
          
          //write to the light
          if (state) {
              digitalWrite(D7, HIGH);
          } else {
              digitalWrite(D7, LOW);
          }
          return 3;   //return any number
      }
    • The code to call the light function:
      var fnPr = particle.callFunction({ deviceId: deviceID, name: 'light', argument: 'hi', auth: token });
      
      fnPr.then(
      	function(data) {
      		console.log('Function called successfully:', data);
      	}, function(err) {
      		console.log('An error occurred:', err);
      });

       

Let me know if I need to add anything to this post.

Project 2: Bike Buddy – Joseph Paetz
Fri, 26 Feb 2016
https://courses.ideate.cmu.edu/48-390/s2016/2016/02/26/project-2-bike-buddy-joseph-paetz/

Video

Overview

Bike Buddy is a bike computer that uses sound to measure your speed and distance. Made with minimal components, Bike Buddy uses a simple contact mic that plugs directly into your phone for maximum convenience.

Inspiration

I bike a lot, so I like to keep track of the miles I ride and how fast I am going. This semester I also built a bike computer with a group of friends for Build18, so I had that in mind when I approached this project. However, unlike the bike computer I helped make for Build18 (seen below), Bike Buddy is minimal and uses an Android phone to process and display the data from the wheel.

[image: DSC_0700]

My Build18 bike computer used rare-earth magnets mounted on the wheel to trigger a Hall-effect sensor mounted on the fork. This information was then processed and displayed by a LightBlue Bean microcontroller.

Once I had the idea of using sound as the input to a bike computer, I was also inspired by the childhood practice of sticking a card into the spokes of a bike wheel to make lots of sound. I started from this concept of having something stationary on the frame hit all of the spokes, but after several iterations, I settled on what became Bike Buddy.

Technology

I used a piezo contact mic to pick up the sound of a zip tie on the wheel hitting a piece of wood mounted on the front fork. I then plugged this mic (with very minimal circuitry) into an Android phone running a custom app.

Process

My initial idea was to put a zip tie around the fork of my bike so that it stuck into the spokes, then pick up the ticks with an electret microphone. I had intended to mount a LightBlue Bean on the fork in a laser-cut enclosure; the Bean would do all of the data processing needed to get speed and distance. However, this proved to be overly complicated in several respects. First, many spokes would be hit every revolution, and the spoke pattern on my wheels is not completely even. Additionally, electret mics pick up a significant amount of noise (especially at high speeds), which would make detecting the spoke hits difficult. The Bean also introduced the complex problem of visualizing the data after the sound was processed.

Because of this, I settled on mounting a piece of wood on the fork and putting a zip tie around the air valve on my wheel. This way there would be only one tick per rotation, and the wood gave me a great place to mount a contact mic. The contact mic rejected almost all outside noise, so I was just hearing the ticks of the zip tie hitting the wood piece.

[image: DSC_1220] [image: DSC_1208]

To overcome the issue of visualizing data from the LightBlue Bean, I cut it out entirely and plugged the mic directly into the phone via a TRRS plug. To get good data, I had to wire the piezo up to a capacitor and a pull-down resistor. I also added a 100 Ohm resistor from each of the right and left audio output channels to ground so that the phone would think a pair of earbuds was plugged in.

Circuit Diagram

Circuit laid out on perfboard

I soldered up the circuit on some perfboard and then used heat shrink to protect it and the piezo.

Soldered Circuit

Cable after heat shrink

After I had the physical and electronic hardware sorted out, I had to write an app that reads the mic and processes the data. I used the official Android documentation and lots of googling to solve the many problems that came up while making the app (I have omitted these since most were specific to my setup). To actually read the audio input in the app, I used the AudioRecord class in a separate thread.
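Since there is one tick per wheel revolution, the processing itself boils down to simple math: distance is revolutions times wheel circumference, and speed is circumference divided by the time between the last two ticks. A minimal sketch of that calculation (in JavaScript for readability — the app does the equivalent in Java, and the circumference value and function name here are illustrative, not from the app):

```javascript
// Illustrative sketch of Bike Buddy's core math. One tick = one wheel
// revolution. The circumference is an assumed value for a road wheel.
const WHEEL_CIRCUMFERENCE_M = 2.1;

function rideStats(tickTimesMs) {
  // n ticks bound n - 1 complete revolutions
  const revolutions = Math.max(0, tickTimesMs.length - 1);
  const distanceM = revolutions * WHEEL_CIRCUMFERENCE_M;

  // current speed comes from the gap between the last two ticks
  let speedMps = 0;
  if (tickTimesMs.length >= 2) {
    const lastGapS =
      (tickTimesMs[tickTimesMs.length - 1] -
       tickTimesMs[tickTimesMs.length - 2]) / 1000;
    speedMps = WHEEL_CIRCUMFERENCE_M / lastGapS;
  }
  return { distanceM: distanceM, speedMps: speedMps };
}
```

The hard part in practice is not this math but reliably extracting the tick timestamps from the noisy audio stream, which is what the AudioRecord thread spends its time doing.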

Code

The code for the app is on GitHub: https://github.com/arathorn593/Bike-Buddy

Reflection

While working on this project, I learned how a simple project can still pose lots of interesting tech and design problems. However, I am most excited about the future possibilities of this device. Since the processing is done on the phone, the speed and distance data could easily be linked to GPS data or hooked into a quantified-self ecosystem.

Additionally, I am very interested in a potential variant of this device in which the mic is mounted directly on the front fork. Potholes could then be detected and linked to GPS data from the phone, providing a way to map road conditions for bikes in real time.

Looking Outwards: C3D4
Thu, 28 Jan 2016
https://courses.ideate.cmu.edu/48-390/s2016/2016/01/28/looking-outwards-c3d4/

EXPLAINED: A sumo robot that uses extreme speed to knock its opponent out of the ring.

CHOSEN: I chose this robot because it has very good documentation and because it is one of the many sumo bots that use extreme speed as their main weapon. Additionally, everything on the robot is mounted to a single base plate, which is our intended sumo bot construction technique.

CRITIQUED: The designer made a very effective sumo robot that did very well in several competitions, and the design was steadily iterated over time to achieve this bot. They also included multiple sensors to find both opponents and the edge of the arena. However, the robot seems to only smash straight forward at the beginning of the match, which may allow random factors, such as traction in different parts of the arena, to affect the outcome. It would have been nice to see more use of the sensors intended to track the other robot.

RELATED: Like many robots of this size, it uses extreme speed and a low-angled blade to knock its opponents out of the ring. Other examples are:

  • The two white robots in this video: https://www.youtube.com/watch?v=30sbXfiHrqw
  • Senju Fast Sumo Robot: https://www.youtube.com/watch?v=RbA1rMRJNl0

LINK: http://web.mit.edu/jlramos/www/sumo_robots.html

Looking Outwards: Autonomous Sumo Combat Robot with Pneumatic Flipper
Thu, 28 Jan 2016
https://courses.ideate.cmu.edu/48-390/s2016/2016/01/28/looking-outwards-autonomous-sumo-combat-robot-with-pneumatic-flipper/

EXPLAINED: A sumo robot that uses a flipper to help get its opponents out of the ring.

CHOSEN: I chose this robot because it is one of the few small sumo robots to use an attack besides ramming to get its opponent out of the ring. Additionally, the designer was able to fit a lot into a small package (including an air tank, a large LiPo battery, motors, etc.), all while keeping it under 1 kg. This could help us figure out how to keep our sumo bots small as well.

CRITIQUED: The designer did a very good job of fitting a lot into a small space while staying beneath the 1 kg weight limit of the competition this robot was entered in. The choice of flipper actuation (pneumatics) also proved very strong and effective in the ring. However, this robot's main weakness was its sides: ultrasonic sensors on the front and back let it detect enemies from those directions, and although the enclosure has holes for side sensors, those sensors were unfortunately never added.

RELATED: This robot uses a lifting mechanism similar to those of larger combat robots. Some examples are:

  • Bronco: http://battlebots.wikia.com/wiki/Bronco; https://www.youtube.com/watch?v=mgY0BRrEsxw
  • Bigger Brother: http://battlebots.wikia.com/wiki/Bigger_Brother; https://www.youtube.com/watch?v=6VmoZHtzR1k

LINK: http://www.instructables.com/id/Autonomous-Sumo-Battle-Bot-with-Pneumatic-Flipper-/?ALLSTEPS

Looking Outward: Mr. Cube: One Cubic Inch Micro-Sumo Robot
Sat, 23 Jan 2016
https://courses.ideate.cmu.edu/48-390/s2016/2016/01/23/looking-outward-sumo-bots-joseph-paetz/

EXPLAINED: An autonomous sumo robot that is only one cubic inch in volume.

CHOSEN: I chose this robot because it is incredibly small. One of our ideas was to build one of our sumo bots smaller, though not nearly this small. To make the robot this small, the designer used lots of clever soldering and electronics tricks.

CRITIQUED: The designer did an amazing job of achieving their main goal: creating an autonomous robot that is as small as possible. Additionally, I like that the robot can respond to touch and to user input via an IR remote (for non-autonomous use). While the soldering required to fit everything into this space is incredible, I would be interested to see how much smaller it could get with a surface-mount PCB.

RELATED: This robot is one of several very tiny robots. Some examples are:

  • Mr. Cube Two: a 1/3 cubic inch robot (at the end of the explanation of Mr. Cube)
  • Picobot: https://www.youtube.com/watch?v=oKE9KEYHJOo

LINK: http://www.instructables.com/id/Building-Small-Robots-Making-One-Cubic-Inch-Micro/?ALLSTEPS
