Assignment 8 – Arduino Timer

#include <Adafruit_NeoPixel.h>

#ifdef __AVR__
#include <avr/power.h>
#endif

static const int PIN = 3;         // NeoPixel data pin
static const int NUMPIXELS = 2;   // number of pixels on the strip
int incomingByte;                 // raw byte read from serial
char incomingLetter;              // the byte interpreted as a command letter
Adafruit_NeoPixel pixels = Adafruit_NeoPixel(NUMPIXELS, PIN, NEO_GRB + NEO_KHZ800);
unsigned long interval = 10000;   // timer length in milliseconds
unsigned long previousMillis = 0; // last time the timer was reset

void setup() {
  Serial.begin(9600);  // needed so loop() can read commands from the serial port
  pixels.begin();
  pixels.setPixelColor(0, pixels.Color(0, 0, 0));  // start with both pixels off
  pixels.setPixelColor(1, pixels.Color(0, 0, 0));
  pixels.show();
}

void loop() {

  unsigned long currentMillis = millis();

  if (Serial.available() > 0) {
    incomingByte = Serial.read();
    incomingLetter = (char) incomingByte;

    if (incomingLetter == 'S') {
      // Wait until the interval has elapsed, then light pixel 0 red.
      while (currentMillis - previousMillis < interval) {
        currentMillis = millis();
      }
      previousMillis = currentMillis;
      pixels.setPixelColor(0, pixels.Color(255, 0, 0));
      pixels.show();
    }

    else if (incomingLetter == 'B') {
      // Hold pixel 1 green for the full interval, then switch it off.
      while (currentMillis - previousMillis < interval) {
        pixels.setPixelColor(1, pixels.Color(0, 255, 0));
        pixels.show();
        currentMillis = millis();
      }
      previousMillis = currentMillis;
      pixels.setPixelColor(1, pixels.Color(0, 0, 0));
      pixels.show();
    }

    else {
      // Any other character clears both pixels.
      pixels.setPixelColor(0, pixels.Color(0, 0, 0));
      pixels.setPixelColor(1, pixels.Color(0, 0, 0));
      pixels.show();
    }
  }
}
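
One caveat about the sketch above: both while loops block, so the board cannot react to new serial input while a timer is running, which defeats the usual point of millis(). A minimal non-blocking rewrite of the 'B' branch, reusing the same globals, could look like this (the greenActive flag is my own addition for illustration, not part of the original code):

// Hypothetical non-blocking variant of the 'B' branch; reuses pixels,
// interval, and previousMillis from the sketch above. greenActive is a
// new state flag added for this example.
bool greenActive = false;

void loop() {
  unsigned long currentMillis = millis();

  if (Serial.available() > 0 && (char) Serial.read() == 'B') {
    previousMillis = currentMillis;  // start the timer
    greenActive = true;
    pixels.setPixelColor(1, pixels.Color(0, 255, 0));
    pixels.show();
  }

  // Check the timer on every pass instead of spinning in a while loop.
  if (greenActive && currentMillis - previousMillis >= interval) {
    greenActive = false;
    pixels.setPixelColor(1, pixels.Color(0, 0, 0));
    pixels.show();
  }
}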

Final project storyboarding – storytelling music box

The general idea of the interaction goes like this:

  1. Place some figurines on top of the box; assume the sequence is figurine 1 (f1 for short), f2, f3, …
  2. Hit the play button to play music (a song that every figurine on the box appears in, starting from the very first story).
  3. Remove a figurine, add more figurines, or do nothing to play more music:
    1. If nothing changes, go to the next song that all of them appear in, following the story sequence.
    2. If a new figurine is added, play the next song they are all in.
    3. If a figurine is removed (assume f1, f2, and f3 were there):
      1. If f2 or f3 is removed, handle it the same as a new figurine being added.
      2. If f1 is removed, f2 becomes the pivot*.

  • *Pivot here means the selection favours the pivot figurine whenever no song really fits everyone (e.g. the remaining candidates are strictly solo or duet pieces); a rough sketch of this logic follows below.
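
To make that selection rule concrete, here is a rough sketch of how the song picking could work. Every name in it (Song, pickNextSong, onBox, and so on) is a hypothetical placeholder for illustration, not code from the project:

#include <set>
#include <string>
#include <vector>

// Hypothetical types for illustration; the real project may differ.
struct Song {
  std::string title;
  std::set<std::string> figurines;  // figurines featured in this song
  int storyIndex;                   // position in the story sequence
};

// Pick the next song after lastIndex that features every figurine on the
// box; if none fits, fall back to the next song featuring the pivot.
// Assumes the story vector is sorted by storyIndex.
const Song* pickNextSong(const std::vector<Song>& story,
                         const std::set<std::string>& onBox,
                         const std::string& pivot,
                         int lastIndex) {
  const Song* fallback = nullptr;
  for (const Song& s : story) {
    if (s.storyIndex <= lastIndex) continue;
    bool allPresent = true;
    for (const std::string& f : onBox) {
      if (!s.figurines.count(f)) { allPresent = false; break; }
    }
    if (allPresent) return &s;  // ideal: everyone on the box is in the song
    if (fallback == nullptr && s.figurines.count(pivot)) {
      fallback = &s;            // otherwise favour the pivot figurine
    }
  }
  return fallback;  // may be null if nothing matches at all
}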

 

Assignment 8 – Tabletop Haptic Interaction for AR

I challenged myself to work with an array of mini vibration motors and wireless local networking. The goal of this project is to create immersive AR interactivity by generating haptic feedback when virtual objects interact with a physical desk. The user feels the vibration of an AR ball bouncing on the table through the haptic device. Every time the ball bounces, a signal is sent wirelessly to the main PC, which forwards it serially to an Arduino that activates the motors. A future step is to activate the array of motors more precisely by computing the level of vibration from the positions of the AR objects relative to the device.
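
As a rough sketch of the Arduino end of that chain (the pin number, the 'b' bounce byte, and the pulse length are my assumptions for illustration, not the project's actual protocol):

// Hypothetical Arduino side: pulse one vibration motor whenever a
// 'bounce' byte arrives over serial. Pin choice and the one-byte
// protocol are assumptions, not the project's real code.
const int MOTOR_PIN = 5;             // PWM pin driving the motor (via a transistor)
const unsigned long PULSE_MS = 60;   // length of one haptic tick
unsigned long pulseStart = 0;

void setup() {
  Serial.begin(9600);
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  if (Serial.available() > 0 && Serial.read() == 'b') {  // 'b' = bounce event
    analogWrite(MOTOR_PIN, 200);     // motor on at partial strength
    pulseStart = millis();
  }
  if (millis() - pulseStart >= PULSE_MS) {
    analogWrite(MOTOR_PIN, 0);       // motor off once the pulse ends
  }
}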

Exercise 8

For this assignment, I wanted to go back to object-oriented programming, because it confuses me more than most things, and I tend to be a generally confused person.

I also decided I was going to use Processing, because I’ve been watching a ton of Coding Train videos and I got tired of seeing suggestions pop up on the side of my screen that look super cool but use Processing, leaving me thinking, “that’s probably going to take me a while; I can’t learn how to use that right now.”

WHAT I MADE:

I used blob detection code that let me track a colour with my computer’s camera. I then decided I wanted to detect someone’s facial expression from the shape of their lips. I could then use my millennial/gen X social media skills to create emojis that reflect typical facial expressions, and flash the related emoji at the user. Using my stellar Illustrator skills and some photo bits and pieces off the internet, I collaged a couple of emojis. I then used my a-ma-zing object programming skills to decode the code I was using and integrate an analysis of blob sizes and counts. This took an insane amount of time, because there are just so many objects, but I have it working! It’s super jittery though, especially between the “shock” face and “happy” face, and you have to be at just the right distance from the camera.
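
For the curious, the emoji decision boils down to a few threshold checks on the tracked blobs. Here is a toy sketch of that logic; the function name and every threshold are made up for illustration (the real version lives inside my Processing sketch):

#include <string>

// Toy classifier: guess an expression from the bounding box of the
// tracked lip blob. All numbers are made-up example thresholds.
std::string classifyExpression(int blobCount, float width, float height) {
  if (blobCount == 0 || width <= 0) return "none";  // no lips found in the frame
  float aspect = height / width;        // tall mouth suggests an open mouth
  if (aspect > 0.8f) return "shock";    // wide-open mouth
  if (aspect < 0.3f) return "happy";    // thin, stretched lips (a smile)
  return "neutral";
}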

(I also started thinking: hey, if I can detect colors, maybe I can detect the light colors my Arduino gives off and have a rudimentary replacement for my malfunctioning serial control.)

——————————————————–

If anyone wants to test this code out: your skin and lip colour will be too similar, and messing with the code’s colour threshold will definitely make it worse. So here’s the easiest fix: put on some bright lipstick. If you’re reluctant to do so, you’re going to have to figure out another way to make your lips change colour. Once you’ve defied the laws of nature (or used someone’s makeup kit), hit the ‘i’ key and click on your lips, then hit the ‘i’ key again.

It’s preset to a bright red, so if you already have a bright red lipstick on, you’re all set.

———————————————————

I sincerely apologize for the weird faces.

code and image files:

use_your_lipstick