Project no. 2 – 62-362 Fall 2019 https://courses.ideate.cmu.edu/62-362/f2019 Activating the Body: Physical Computing and Technology in Performance

Staircase Encounters https://courses.ideate.cmu.edu/62-362/f2019/staircase-encounters/ Tue, 12 Nov 2019 16:09:58 +0000

Unity 3D, OpenPose, Kinect for Windows v2, Speakers, Acrylic Plastic
Ambient Auditory Experience Emanating Sculpture
Installed at the Basement Stairwell of Hunt Library, Carnegie Mellon University

2019
Individual Project
Rong Kang Chew

As Hunt Library guests enter the main stairwell, they are greeted with a quiet hum. Something’s changed, but they can’t really see what. The hum changes as they walk along the staircase – they are amused but still curious. The sound becomes lower in pitch as they walk down to the basement. Someone else enters the stairwell and notices the noise too – there is brief eye contact between them on the staircase – did they hear that too?

As they reach the bottom and approach the door, they hear another sound – a chime as if they have reached their destination. Some of those not in a hurry notice a sleek machine, draped in smooth black plastic, next to the doorway. It is watching them, and seems to be the source of the sounds. Some try to experiment with the machine. Either way, the guests still leave the staircase, only to return sometime soon.


Process

Staircases are usually shared, cramped spaces that are sometimes uncomfortable – we have to squeeze past people, make uneasy eye contact, or ask sheepishly if we could get help with the door. How can we become aware of our state of mind, and that of other people, as we move through staircases – and could this make being on a staircase a better experience?

After learning about the underlying concept for the FLOW theme – transduction, and the changes between various forms and states – I knew I wanted to apply that concept to an installation that occupied the space of a room. In this case, that room was a stairwell leading to the basement level of the Hunt Library at CMU. This space was close enough to the rest of the installations in our show, WEB OF WUBS.

I had to seek permission from CMU Libraries in order to have my installation sited in the stairwell, and therefore had to come up with a proposal detailing my plans and installation date. Due to the nature and siting of the installation, safety and privacy were key points of emphasis in the proposal. I would like to thank my instructor Heidi for helping me get the proposal across to the right people.

Placement in the stairwell was tricky, as I had to ensure that cabling and the positions of objects were safe and would not cause any tripping. I iterated through various placements of the camera, computer, and speakers to find out what would work well for the experience. Eventually, I settled on consolidating the entire installation into a single unit instead of trying to conceal its elements. Some of my earlier onsite testing showed that people didn’t really react to the sound if there was no visual element. This, and Heidi’s advice, encouraged me to put the installation “out there” so that people could see, interact, and perhaps play with it.

The final enclosure for the sculpture was laser cut out of 1/8″ black acrylic plastic and glued together. Speaker holes were included for the computer speakers used. Unfortunately, I ran out of material and decided to leave the wiring exposed on the sides. The glue used does allow disassembly, and with it an opportunity to improve this in the future.

As for the software aspects of the implementation, I used the OpenPose library from the CMU Perceptual Computing lab, which let me figure out where humans are in a particular scene. However, it only detects poses in 2D, so I had to limit myself to working with the height and width of where people were in the frame. I used the Unity 3D game engine to process this information, and used the average horizontal and vertical positions of people’s heads to adjust the pitch in two “zones” of the staircase (see end of post for some code).

X,Y position in zone <==> pitch of sounds for that zone 
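OpenPose reports each person as a flat array of 2D keypoints, and the Unity side only needs one averaged control value per zone. Here is a minimal sketch of that reduction in plain JavaScript – the nose-at-index-0 keypoint layout and the bridge between OpenPose and Unity are my assumptions; only the linear map mirrors the `map()` helper in the Zone class at the end of this post:

```javascript
// Sketch only: reduce OpenPose-style 2D people data to one pitch value
// per zone. OpenPose's JSON output lists, per person, a flat
// [x0, y0, c0, x1, y1, c1, ...] keypoint array; index 0 is assumed to
// be the nose here. The linear map mirrors map() in the Unity Zone class.

function map(x, inMin, inMax, outMin, outMax) {
  return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Average head (nose) x-position across all detected people.
function averageHeadX(people) {
  if (people.length === 0) return null;
  const sum = people.reduce((acc, p) => acc + p.pose_keypoints_2d[0], 0);
  return sum / people.length;
}

// Map an average position within a zone to a playback pitch.
function zonePitch(avgX, zone) {
  return map(avgX, zone.min, zone.max, zone.minPitch, zone.maxPitch);
}

const zone = { min: 0, max: 1280, minPitch: 1.5, maxPitch: -0.5 };
const people = [
  { pose_keypoints_2d: [320, 200, 0.9] },
  { pose_keypoints_2d: [960, 210, 0.8] },
];
console.log(zonePitch(averageHeadX(people), zone)); // avg x 640 → pitch 0.5
```

Note that `minPitch` can exceed `maxPitch`, as in the Zone class: the map simply inverts, so moving down the stairs lowers the pitch.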

The sounds used by the experience included those from the Listen to Wikipedia experience by Hatnote and some verbal phrases spoken by Google Cloud Text-to-Speech.

Reflection & Improvements

A lot of the learning from this project came from testing on site, and even so, I think I did not arrive at where I actually wanted to be for the installation of Staircase Encounters.

Hidden, Surreal |————————————X——| Explicit, Playful

The key issue was something I mentioned earlier: how noticeable and how interactive did I want my installation to be? In my first tests, it seemed like no one was paying attention to the sounds. But by the end, I think I perhaps made the installation too interactive. I received a lot of feedback from guests who were expecting the sounds to react more to their movements, especially since they were able to see all their limbs being tracked.

I guess, given more time, I could have added more parameters to how the music reacts to users, e.g. speed of movement, “excitedness” of limbs, and encounters with other guests. As it was, the visual element led to engagement that was not followed up on, which in itself was a little disappointing – like a broken toy.

My key learning from Staircase Encounters is to test and think clearly about the experience – it is easy to be fixated on the building, but not easy to be objective, unemotional, and measured about the experience and the end product, especially when the building is rushed.

Code

Here is some code for the pink and blue “zones”, which track people as they enter and move through them and update the sounds accordingly.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using System.Linq;

public class Zone : MonoBehaviour
{
    public enum Axis
    {
        X, Y
    }

    public List<GameObject> soundsToSpawn;
    public float min = -60;
    public float max = 30;
    public float minPitch = 1.5f;
    public float maxPitch = -0.5f;
    public Axis axisToUse = Axis.X;

    private Queue<float> positions;
    private int queueSize = 20;

    private AudioSource sound;
    private float timeStarted = 0;
    private bool played;

    private int soundMode = 0;

    // Start is called before the first frame update
    void Start()
    {
        positions = new Queue<float>();
    }

    // Update is called once per frame
    void Update()
    {
        if (sound != null && played && !sound.isPlaying && Time.time - timeStarted > 1)
        {
            Destroy(sound.gameObject);
            played = false;
            sound = null;
        }

        if (Input.GetKeyDown("0"))
        {
            soundMode = 0;
        }

        if (Input.GetKeyDown("1"))
        {
            soundMode = 1;
        }
    }

    void OnTriggerEnter2D(Collider2D col)
    {
        //Debug.Log(gameObject.name + " entered: " + col.gameObject.name + " : " + Time.time);
        
        if (sound == null)
        {
            timeStarted = Time.time;
            sound = Instantiate(this.soundsToSpawn[soundMode]).GetComponent<AudioSource>();
            sound.Play();
            played = true;
        }
    }

    void OnTriggerStay2D(Collider2D col)
    {
        if (sound != null)
        {
            RectTransform rTransform = col.gameObject.GetComponent<RectTransform>();
            float point = 0;

            switch (this.axisToUse)
            {
                case Zone.Axis.X:
                    point = rTransform.position.x;
                    break;
                case Zone.Axis.Y:
                    point = rTransform.position.y;
                    break;
                default:
                    break;
            }

            while (positions.Count >= queueSize)
            {
                positions.Dequeue();
            }

            positions.Enqueue(point);

            // Use the smoothed (averaged) position, not the raw point,
            // when computing the target pitch.
            float avgPoint = positions.Average();

            //Debug.Log("Avg value of " + this.gameObject.name + " to " + avgPoint + " : " + Time.time);
            float targetPitch = map(avgPoint, this.min, this.max, this.minPitch, this.maxPitch);
            sound.pitch = targetPitch;
        }
    }

    static float map(float x, float in_min, float in_max, float out_min, float out_max)
    {
        return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min;
    }
}

 

It Never Made Any Sense https://courses.ideate.cmu.edu/62-362/f2019/it-never-made-any-sense/ Mon, 11 Nov 2019 17:12:16 +0000

A multi-visualizer for a place that was once home.

I was born in Portland, OR in 1988, at Bess Kaiser Hospital, during a nurse’s strike. My older brother had been a traumatic and difficult birth, and so it was decided that a planned Caesarian section would be the best way to ensure a smooth delivery for me. As an added benefit, this meant that the birth could be scheduled around the strike. And so, that July, on a hot day weeks before my projected due date, my folks got a spot in the parking lot on Interstate Avenue, and went home in the mid-afternoon with a baby, feet and hands rubbed and warmed to make the lingering blue coloration fade. The hospital closed within the next couple of years, and was razed to the ground, replaced by a scrub embankment between the actual interstate highway and the street named for it.

This site is now the national headquarters for Adidas, who design shoes for human feet and shirts for human chests and backs, to be fabricated elsewhere by human hands and shoulders. It’s about two hours drive from there to Mount Hood, and about two hours from there to the low-class beach towns of Tillamook County I’ve been going to since shortly after finally opening my eyes, a few weeks after returning from the hospital. From there to the Media Lab at CMU, travel time depends on your means of travel.

Waves

Here at CMU, there is a live projection of a wave-like shape on the wall. The waves are as big as the waves are in Manzanita, OR, just outside the mouth of Tillamook Bay. The height of the waves on the wall is the height of the tide in Garibaldi, OR, just inside that bay. The scale of the waterline is 1:1; if you imagine the floor of this room to be “sea level”, these waves are as high above that as the water is in Oregon, right now. “Sea Level” is of course a relative term– the level of the actual sea is anywhere from +8.0′ to -2.0′ above Sea Level. There’s a certain similarity between this and the notion of a birth time, or a birth location: if not here, where? If not now, whenever.
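That 1:1 claim is just a feet-to-pixels conversion. An idealized sketch of it, using constants borrowed from the p5.js wave modeler later in this post (an 800 px canvas standing for 10 ft of wall above the floor, plus 3 ft of “wall” below sea level; the actual on-site calibration may have differed):

```javascript
// Idealized feet-to-pixels mapping for the wall projection.
// Assumed constants mirror the p5.js sketch later in this post.
const canvasHeight = 800;
const wallHeight = 10.0; // feet of wall above the floor ("sea level")
const wallBottom = 3;    // feet of "wall" below sea level

// One foot of real water == this many projected pixels.
const pixelsPerFoot = canvasHeight / (wallHeight + wallBottom);

// Convert a NOAA tide reading (feet above the datum) to a canvas
// y-coordinate, where y grows downward as in p5.js.
function tideToY(tideFeet) {
  return canvasHeight - (tideFeet + wallBottom) * pixelsPerFoot;
}

console.log(tideToY(0));   // the "sea level" line, partway up the canvas
console.log(tideToY(8.0)); // a high tide: further up the wall (smaller y)
```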

There is a tiny screen in front of the big wave that shows you what is happening at Manzanita right now. It’s what’s above, here. The waves you’re seeing in the video are the same waves you’re seeing in the other video, translated through a National Weather Service API and fifteen years of living on the East Coast. The relative scale between these things is about right, don’t you think? Some things are hard to let go, even when they’ve been reduced by time to a tiny picture-in-picture in your memory. I’m reconstructing this using random sinusoidal functions. I’m reconstructing this using a prejudicial feeling that ascetic minimalism, only tighter, only more monochrome, is the only ethically correct, and accurate, way to model a place like this. I’m reconstructing this memory using another memory. Of something I was never here for. Poorly.

There is also a basin of water, with a bright light pointing into it, reflecting the surface of the water onto the wall. There is a contact speaker attached to the bottom of the basin, playing a filter-swept noise generator that sounds like waves. When the sound gets loud enough, it makes a little pump drip water into the basin from above, and ripples show up on the wall. You’re going to have to trust me here. The pictures didn’t come out. The pictures didn’t come out. But the idea was for it to look something like this. It, of course, didn’t look like this. This is a theater piece I saw maybe ten years ago, with an old friend I don’t get to see much any more– a person whose name starts with D, and for that reason or maybe some other, I would occasionally accidentally call him “dad”. It was very snowy– we made it to the theater at the last second, far up on the Upper East Side, far from everything. I’ve been thinking about it ever since. My friend said it was one of those singular experiences you’ll never forget, that you only get a handful of, and that made me feel proud to have been able to get him the ticket. The key thing about this show is that it’s about memory, but, more crucially, that there are no people in it.

But, unfortunately, there are people in every room. The concept of “room” implies people, and so you find them there. Speaking of which: the drips from the pump. During critique for this piece, a professor asked me if the timing of the waves and drips was related in any way to the timing and the structure of the video. It’s not. But he asked it in such an aggressive, dismissive way that I lied about it– now, I lied in a way that didn’t make the piece any more conceptually coherent, of course, making some additional claim about the API that really didn’t make any difference. But it felt good to lie to him.

Here’s the code for the Max patch:

<pre><code>
----------begin_max5_patcher----------
1342.3ocyZ0zbhaCF9L7qviO1kFrrrMvdnW5N6zCcu0sW5rSFgQPzFgkqkbR
n6D9sWYIYBjXqnrHfbHFirhezy6GOuuxleLbP3b1CXdXvGC9mfAC9wvACTC0
Lv.y2GDtF8PNEwUSKrjh1PIbw1vQ5qVTulTPwB0kAOMHqVzNZpYzRjH+FRwp
qqv4BMnooWEMJHY1jlO.oQMe.kGC9l4+grPAKa92+U.D1Bp9lK1Th02lPNYU
AhFNp6yZ9aAIWPXEnpMg6t4bwFp5ND1diWfDHC6MzePXNkTtyFs2EjWZIghK
Pq02iufxIEBF+lf+3Seb7W43J9X7bbwMrkKwUi+DKudMtPvG+6e4qAelQWTo
OCQoAfIi+a4BjUsY7mYL4bP0KHrw+I69f+hUfu5dzclkXKr2RJzlF0LaFYuI
PYrx87GJVvJDRzulKPBbKE2mLRyAstRPVaro64ETWcNhKWg0ERGndBIIGNgk
LJkc+JJaNhJvqKYl6yylT0ZTgHmU0DDHcHcMo+sFQIhMFeqB3vClPIQFIwug
rzDFAd1ZkUQVQjNeo2Yk3l07tYzZ1Bb+XvKw3Ece6aRB5Zc2X73hJrbw00kO
bUsaMICNEj7a4cPQ6lIisztIXOOAHVYAZmyiCG1dhYP8HpiONbjiJBRJyQqZ
i9BE3GTqHYZCii6TkHtSUBP+pDvoPkLQlRl.Nq4Xbb2pDS6WjnyDeOxXVItv
SDNBnHb5TEgmXivYSO4DVvVshheCJ9Vn1Lk2KQSv3j1icwrz9YFoIr+a6PoR
pAKvUWKEimq4YjWHdA9d4B4EN5eSl1NyOdZs4.JirSkeL0liNMwMywwQ4kTl
bw+F70wuJ4fph75h68RNKosiZzmap57Fc3lQTxjpUTl+BAJwnaQqK2FDmF40
HgHkT2L6FqndMVR2GxawB8vcYmMDDMXdvrrnH+Do.lk0P3rDUFvDqJ7I1kEF
4OcuyG+0N+3rTGn+jKL8+PSSK.+DzCln67OKVUUHwJwgmon950ywUdJtVWmy
vO6Rfvo9WBzmdcoRmmj5ZsJ55cIVq5AyNGU85gzKppKtUtbihjG5j6veVtGo
RxmZs0NX5Ej6iU79J+lma731Y8Yp5lO6zokeSm8544wydm1piOay2XPLdZqs
4GO4cZa9RvpXMYAdMGPWcWWou2Pj9S7OL73zp5Ej5IMOS1gVyah0p7fKol2G
j8y6K2cJPKzOK40KxARdsmr4IVqWxa+xZnpgGHvJqgWVVK81.+x5HUVMzZtM
H9bwZO1GahtvFLF9502l8dtM1eYqbK19NE2zcer0GS2kMVOeSNEusYmaodk6
Zw7IVod5kk576Vt8MTGKwxikMQ8TJh0jVlEjZY2p+juqJ+ZRv+2BT912fGOp
e1GCMpaY5mMc6wN6gO7T5QKXDNdqeZTs0mBLuhAqR2GYfr5xgTRwyemqp0Uy
3GZK3r5p7VHZe4FO8RfBWf4BRA5o2OztmOVf4877B6sqHkM8rgjC.0zezQCT
5YiRoINfTpWPxk.hTuX8hbAInOPxk.hXO.ThKFulcqe7H4hs6PCLqZgr4sFc
5SNxvtAFbV.F7RfiNJfgtjlmoQ93.xEkKnOxxgtjRztbNNjbIKOwGY4s4U1y
I7APSbIK2KH4jaJ8Tjs4Dzfdf93x2.Nw5HOXeAtnoj3gLa0uzLWkM8.R.2zI
ORjbTV7noTrKghdw34X3vg3naBGUVdGthalrBB4VO9NSkPLYj5qjB8WU6SLr
BeGoc9peQignJ4lIDxcRTWo60+go5c4q9kkUUTSLI1Rjeb3+CYljYjC
-----------end_max5_patcher-----------
</code></pre>

And while we’re at it, here’s the code for the p5.js wave modeler:

var weather;
const tideURL = 'https://www.tidesandcurrents.noaa.gov/api/datagetter?date=latest&station=9437540&product=one_minute_water_level&datum=MLLW&time_zone=lst&units=english&format=json';
const waveURL = 'https://api.weather.gov/gridpoints/PQR/46,124';
const canvasHeight = 800;
const canvasWidth = 1280;
const wallHeight = 10.0;
const wallBottom = 3;
const numWaves = 4;
const zeroPoint = canvasHeight - (canvasHeight/(wallHeight+wallBottom)*wallBottom);
var ampCtr = 10;
var newAmpCtr = 10;
var waterHeight;
var newWaterHeight;
var xSpacing;

var serial;
var portName = '/dev/tty.usbmodem14201'; 
var inData;                            
var outByte = 0;  

let waves = [];
let period = 500.0;
let fontsize = 20;
var time = "";
var name = "";

function preload(){
    askNOAA();
}

function setup() {
  waterHeight = newWaterHeight;
  ampCtr = newAmpCtr;
  createCanvas(canvasWidth,canvasHeight);
  serial = new p5.SerialPort();
  //serial.on('data', serialEvent);
  //serial.on('error', serialError);
  serial.open(portName); 
  setInterval(askNOAA, 60000);
  setInterval(serialDrip, 1000);
  for (let i=0; i<numWaves; i++) {waves[i] = new wave();}
  textSize(fontsize);
}

function wave(){
  this.size = Math.floor(Math.random()*10+7);
  this.offset = +(Math.random()*6.3).toFixed(2);
  this.speed = +(Math.random()/100).toFixed(5)+0.001;
  this.shade = Math.floor(Math.random()*75)+180;
  this.yVals = [];
  this.theta= 0;
  this.ampVar = Math.random()*ampCtr-ampCtr/2;
  this.amp = ampCtr + this.ampVar;
}

function askNOAA(){
  loadJSON(tideURL, gotTide);
  loadJSON(waveURL, gotWave);
}


function gotTide(data){
  weather = data;
  if (weather){
    newWaterHeight = (weather.data[0].v);
    newWaterHeight = map(newWaterHeight, 0.0, wallHeight, 0, canvasHeight);
    newWaterHeight = newWaterHeight.toFixed(2);
    name = weather.metadata.name;
    time = weather.data[0].t;
  }
}

function gotWave(data){
  weather = data;
  if (weather){
    let waveHeight = weather.properties.waveHeight.values[0].value;
    waveHeight = waveHeight * 3.28; // the NWS reports meters; convert to feet
    if (waveHeight != 0){
      newAmpCtr = Math.floor(waveHeight*(wallHeight+wallBottom));
    }
  }
}

function serialDrip(){
  serial.write("hello");
 // s = ['A','B','C','D'];
  //i = Math.floor(Math.random()*4)
  //Serial.write(s[i]);
}

function draw() {
  increment();

  background(0);
  for (let wave of waves){
    calcWave(wave);
    drawWave(wave);
  }
  drawText();
}

function increment(){
  if (waterHeight < newWaterHeight) waterHeight += .01;
  else if (waterHeight > newWaterHeight) waterHeight -= .01;
  if (ampCtr < newAmpCtr) ampCtr++;
  else if (ampCtr > newAmpCtr) ampCtr--;
  for (let wave of waves){
    wave.amp = ampCtr + wave.ampVar;
  }
}


function calcWave(wave){
  const w = canvasWidth + wave.size;
  const dx = (TWO_PI / period) * wave.size;
  wave.theta += wave.speed;
  let x = wave.theta;
  for (let i = 0; i <= w / wave.size; i++){
    wave.yVals[i] = sin(x + wave.offset) * wave.amp + (zeroPoint - waterHeight);
    x += dx;
  }
}

function drawWave(wave){
  noStroke();
  fill(wave.shade);
  for (let x = 0; x < wave.yVals.length; x++){
    ellipse(x * wave.size, wave.yVals[x], wave.size, wave.size);
  }
}

function drawText(){
  textAlign(RIGHT);
  text(name + " | " + time, canvasWidth, canvasHeight-10);
  textAlign(LEFT);
  let divisions = canvasHeight/(wallHeight+wallBottom);
  let n = -wallBottom;
  for (let i=canvasHeight;i>0;i-=divisions){
    text('> '+n, 0, i);
    n++;
  }
}

And as for the Arduino that controlled the pump, it was just a single transistor switching a 12v power supply, triggered over the serial port from Max, with some basic filtering to make it seem less like the sky was peeing into the basin. Code:

const int buttonPin = 2;
const int pumpPin = 9;
const int ledPin = 13;
int newByte;
int lastByte;

void setup() {
  Serial.begin(9600);
  pinMode(buttonPin, INPUT);
  pinMode(pumpPin, OUTPUT);
  pinMode(ledPin, OUTPUT);

}

void loop() {
  if (Serial.available()){
    newByte = Serial.read();

    // basic filtering: a repeated byte reads as 0, so a steady stream of
    // identical trigger bytes becomes alternating on/off drips
    if (newByte == lastByte) newByte = 0;
    lastByte = newByte;
  }

  // checked every pass, so the manual button works even with no serial data
  if (digitalRead(buttonPin) == HIGH || newByte == 1){
    digitalWrite(ledPin, HIGH);
    digitalWrite(pumpPin, HIGH);
  } else {
    digitalWrite(ledPin, LOW);
    digitalWrite(pumpPin, LOW);
  }
}

They say you can never go home, because it’s always different. Which is of course true: people change and times change and cities undergo massive gentrification driven by an insane belief that unyielding and constant growth is an unimpeachable positive that will continue to be sustainable for the same amount of time that human civilization has existed thus far. And besides, your parents get old and die, and your friends move, or lose their minds, or slowly disappear into themselves; and maybe you realize that nothing ever made any sense, and indeed that’s the one constant through all of this.

On the other hand, for all its environmentalist overtones, isn’t that a remarkably anthropocentric way to look at the world? Decay is a constant, and you’re part of it. Standing at the upper lip of a waterfall watching the freezing white water whip past, one notices that the water is always different, and the rock always smoothing and decaying, the gorge filled with fallen trees that decay and rejoin, your feet the same stupid feet standing and supporting your towering eyes, the feet that won’t stop walking despite mounting evidence that the contrary would be a better move– what makes these things is constantly in flux, never the same, always immaterial, and yet we try and call it by the same name as long as we possibly can.

Life is, properly, a curse: inescapable, defining, visited upon you when you weren’t paying attention, destined to define your ultimate end. Comforting in its familiarity, becoming ‘you’ through lack of any other serious offers.
The water comes and goes, and comes, and goes.

Hyphen https://courses.ideate.cmu.edu/62-362/f2019/8991-2/ Thu, 07 Nov 2019 19:25:23 +0000

Description – There is a tent structure covered in charcoal drawings of old organisms, cellular structures, and spirals. This tented structure is made from large branches found in the park. On the inside, I am there, with a prompt to state a personal truth. A computer program transcribes this audio into a sentence, and the volume of one’s voice is reflected by a circle that changes size (the louder, the larger). I observe the circle and document the sentence the program returns. I write down a number of repetitions of one’s statement, based on how loudly it was stated (the louder, the more repetitions). I share the truth statements and how many repetitions they received at the end of the performance.

Process Reflection – In the future I definitely want to work with Arduino, because it was hard to work with a new library where I didn’t know many people who could support me on the technology side. Because p5.js and Arduino are both very new to me, I really could have benefited from more guidance from someone who had worked with speech recognition before, but I was unsure who to reach out to who would make time to help me. I think in the future it would be better to work on something that I knew I could get support with in the class, as this makes it easier to finish the technology well and on time.

In the critique, one thing I learned is that you cannot just consider something you make within the context of one person interacting with it; rather, you must consider the context of someone watching another person interact with the thing you made, and what that experience as an onlooker is like. I think what could help with this is staging practice performances ahead of time that imitate the interaction I would like to happen in my piece, and recording them so I could gain more perspective on this.

One thing that I am very happy with is how the space turned out! I definitely planned my time better than on the last piece, spacing out the work for myself. I am in studio art, so I feel comfortable making things with my hands, but this is the first time I have made an installation someone could walk inside. I was really proud of how this space turned out, and of the intimacy of the natural elements and the traces of drawing.

I am a bit critical of myself about the conceptual consideration of this piece, even though I think some parts were strong. The more I think about the prompt I had, to assert a personal truth, the less powerful or interesting I think it is. I like the idea of being biased toward what is loud, in a space that celebrates silence, but I think this prompt was distracting from that exploration. I dislike the question too, because it implies that there is a kind of stable personal truth, or feels like it is pressuring someone to establish a personal truth, when in reality, I believe we are all in flux. I dislike confessional work because it feels as though the artist is attempting to create this space that is transformative instead of actually creating something interesting that derives from exploration / the loss of ego.

 

marking the sticks with string to record their position 

drawing very old plants onto the fabric of the installation tent

reference image

brainstorming self-supporting structure.

drawing on the backside of the fabric, to create some color

creating some rubbings of plants on the fabric

inspiration for the creation of my tent

image of interior of the tent, showing the wooden beams, drawing, and plants

view looking up inside the structure

image of projected voice recording text

me, inside the space

the installation entrance

detail of marked fabric

outer view

outer view of installation

p5.js Code –

/// project title – hyphen
// this code recognizes one's speech and returns one's sentence.
// it also draws a circle which varies in size according to how loud one speaks.
/// code credit goes to the p5.SpeechRec examples as well as p5.Sound examples

let speechRec;
let mic;

function setup() {
  createCanvas(400, 400);

  let lang = navigator.language || 'en-US';
  speechRec = new p5.SpeechRec(lang, gotSpeech);
  speechRec.continuous = true;
  speechRec.start();

  // mic information
  mic = new p5.AudioIn();
  mic.start();
}

function gotSpeech() {
  console.log("I got a result, the result is:");
  console.log(speechRec.resultString);
  console.log("----------------------------");
}

function draw() {
  background(250, 230, 230);
  if (speechRec.hasOwnProperty("resultString")) {
    let micLevel = mic.getLevel();
    text(speechRec.resultString, 100, 100);
    ellipse(width / 2, 3 * height / 4, micLevel * 500, micLevel * 500);
  }
}

function onresult() {
  console.log("this is working!!!!!");
  console.log("________________");
}

Project 2: Walking Backwards https://courses.ideate.cmu.edu/62-362/f2019/project-2-walking-backwards/ Thu, 07 Nov 2019 19:05:01 +0000

[62-362] Activating the Body | Project 2: Flow | Fall 2019 | Hugh Lee

Side view of the device

Front view of the device

Close up view of the device

Close view of device attached on arm

Both arms with device attached

Device not attached

How the device works

Project Description:

Walking Backwards is a project that explores the uncomfortable feeling of walking backwards. This project was inspired by a small exercise with Slow Danger, who taught us several physical exercises, one of which was walking backwards. They instructed us to walk and imagine our backs as the front of our body. This resulted in very interesting movements, and explored the uncomfortable feeling of walking without seeing. The device that I created measures the distance between the device and any surrounding objects and transduces it into vibration. The vibration indicates how close one is to any obstacles behind.

Process Reflection:

In the middle of the process, I quickly realized the limitations of the project. At the final critique, I came up with two devices, each strapped to one arm, but it would have been more successful if I had more devices to put on different parts of the body. At the final critique, when I put the devices on Alex’s arms, his movement was very timid, as if he didn’t trust the devices. More devices would have led to a more accurate measurement and a greater variation of movements.

The device would also have benefited if I had not used a breadboard and had used fewer components. The jumper wires continued to fall out unexpectedly. Also, having the Arduino board and breadboard as two large separate pieces made the device unnecessarily large.

Flow Diagram:

 

Arduino Code:

//Project no.2: Walking Backwards
//Hugh Lee
//The ultrasonic sensor measures the distance between the sensor and any object nearby and converts into vibration. There are three different types of vibration, indicating how close the object is to the sensor. The different types of vibration are pulled from the "Adafruit Drv2605.h" library. 
#include <NewPing.h>
#include <Wire.h>
#include "Adafruit_DRV2605.h"

#define TRIGGER_PIN  12  // Arduino pin tied to trigger pin on the ultrasonic sensor.
#define ECHO_PIN     11  // Arduino pin tied to echo pin on the ultrasonic sensor.
#define MAX_DISTANCE 400 // Maximum distance we want to ping for (in centimeters). Maximum sensor distance is rated at 400-500cm.

Adafruit_DRV2605 drv;


NewPing sonar(TRIGGER_PIN, ECHO_PIN, MAX_DISTANCE); // NewPing setup of pins and maximum distance.


void setup() {
  Serial.begin(9600);
  Serial.println("DRV test");
  drv.begin();

  drv.selectLibrary(1);

  // I2C trigger by sending 'go' command
  // default, internal trigger when sending GO command
  drv.setMode(DRV2605_MODE_INTTRIG);
}

uint8_t effect = 1;

void loop() {
  int ping = sonar.ping_cm();
  delay(50);                     // Wait 50ms between pings (about 20 pings/sec). 29ms should be the shortest delay between pings.
  Serial.print("Ping: ");
  Serial.print(ping); // Send ping, get distance in cm and print result (0 = outside set distance range)
  Serial.println("cm");


  if (ping > 0 && ping < 5) {
    effect = 118; //if ping is smaller than 5, use vibration 118 from the "Adafruit_DRV2605.h" library
  }
  else if (ping > 0 && ping < 25) {
    effect = 48; //if ping is smaller than 25, use vibration 48 from the "Adafruit_DRV2605.h" library
  }
  else if (ping > 0 && ping < 50) {
    effect = 44; //if ping is smaller than 50, use vibration 44 from the "Adafruit_DRV2605.h" library
  }
  else {
    drv.stop(); //stop vibrating for any other values (including 0, i.e. no echo)
    return;
  }

  drv.setWaveform(0, effect);  // queue the effect before triggering it
  drv.setWaveform(1, 0);       // end waveform
  drv.go();                    // play the effect!
}
Blind Understanding https://courses.ideate.cmu.edu/62-362/f2019/blind-understanding/ Thu, 07 Nov 2019 18:40:35 +0000

[62-362] Activating the Body | Project 2: Flow | Fall 2019 | Alex Lin

Long Exposure Shot of User Movements

Overall Project Technology Arrangement

Overall Technology Set-Up

Project Calibration Set-Up

Overall Calibration Set-Up with Garment Display

Project Installation

Installing the Project onto a User | Photo Credit: Scarlet Tong

Project Wiring Overview (HUB)

Project Main Wiring Hub Close-Up

Project Wiring Overview (EMBEDDED)

Project User Embedded Wiring Close-Up

Project Suit-Up Process

Process of Installing the Project On Site | Photo Credit: Dave Choi

Process of Installing Project Collage

Project Demonstration

Project Detail Highlights

Adjustability Highlight

Stack of Embedded Accelerometer, Speaker, and Neopixel LED Circle

Description
Abstract Description
Two people stand on each side of a curtain and move. Their movements cause lights on them to shine. The movements also cause music on the other person to play louder or softer and cause vibration motors to vibrate based on which way they lean.
Narrative Description

Blind Understanding is a performance piece in which two people stand on opposite sides of a curtain, equipped with accelerometers that monitor and capture their movement. This information is then sent, as digital signals, to create a combination of vibration and music for the opposing person to experience, who can then process that information and let it impact their own movement. Sited in the Media Lab to provide an intimate space for the dancers to perform and communicate through movement, the project aims to allow the users to speak through a different medium and feel the presence of the other individual. The dancers are blocked from each other’s view by the position of the curtain, but are free to move as they please (within the confines of the reach of the wires). The goal of the exhibit is to convey, and to exercise, the reading of movement through technology, which may reveal information and emotion that otherwise may not be within our frame of perception.

This concept was derived from John Berger's Ways of Seeing, in which he describes how photographs are inherently subject to the control of the photographer. I applied this to the idea of how people see past one another, attempting to create an experience that brings people together through the senses rather than language. The concept relates to my interest in the ways people explore and interact with space and with other people. Introducing a visual limitation changes the ways in which people move and act within a space. Communicating in an intimate environment while separated from the other person also ties into ideas of technology and the distortion of distance it allows: the concept proposes an alternate future in which one can sense, through a call, the physical movements of the person one is communicating with. The project is also a projection of how I interact with others as someone more introverted and reserved, and it has become constructive criticism, as well as encouragement, for me to express myself and to confide in others more often.

Process Images

Initial Concept Sketches:

Elevation Concept Diagram

Data Flow Diagram

The initial idea was, as explained above, to have two lines of communication between two users, each abstracting information about the other person's movements in a way that would influence the user's own movement without either being able to see the other. Most of the concept survived development, except that the lights became feedback for the dancers themselves, signalling that their movements were causing a fluctuation in the data being sent to the other person. The main open question was the form the project would take in latching onto each user while fulfilling all of its necessary functions.
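The feedback mapping can be sketched as two pure functions. This is a hypothetical C++ extraction (not the installation code itself), using the same thresholds as the Arduino sketch further down: roll below 30° steps the volume down, roll above 90° steps it up, and pitch selects which vibration motor fires.

```cpp
#include <cassert>

// Hypothetical extraction of the movement-to-feedback mapping.
// Angles are in degrees in the 0-360 range, as produced by the sketch below.

// Returns -1 (volume down), +1 (volume up), or 0 (hold),
// respecting the [minVol, maxVol] bounds tracked by the sketch.
int volumeStep(double roll, int volume, int minVol, int maxVol) {
    if (roll < 30.0 && volume > minVol) return -1;
    if (roll > 90.0 && volume < maxVol) return +1;
    return 0;
}

// Returns 'L' if the left motor should vibrate, 'R' for the right.
// (Exactly 120 degrees falls to 'R' here; in the sketch neither branch fires.)
char vibrationSide(double pitch) {
    return (pitch > 120.0) ? 'L' : 'R';
}
```

In the installation this logic runs once per loop iteration, so leaning steadily in one direction ratchets the partner's volume up or down one step at a time.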

Wiring Regrets Timelapse:

This timelapse is a snippet of the effort that went into securing all of the wires needed to make this project possible. Because the system is not wireless and I wanted the dancers to have a reasonable freedom of movement, the wires had to hang above them so that no one would trip, and I had to overestimate the amount of wire to provide enough slack for them to move around.

Wiring Regrets:

Workspace Update and Wiring Regrets

At each stage of the wiring process (soldering, wiring to the Arduino board, and mounting the accelerometers, speakers, LEDs, and vibration motors to the garment), I needed to ensure that all of the connections still worked, which meant troubleshooting multiple times (e.g. when a wire popped out of an Arduino pin).

Garment Design & Sewing:

It has been a while since I last sewed anything, but getting back into it wasn't as terrible as I had feared. I used canvas as the more flexible fabric and a plastic material as the more rigid one, which worked out fine, but sewing the two together took a while, especially when making two of them. A key factor in the design of the garment was ensuring that it could fit anyone who wanted to try it, so I took extra care to build in that adaptability (which also took more time than I had originally intended).

Garment Folding:

My apologies for the beginning of the timelapse being out of focus. Folding the tessellations took a while to get the initial grid pattern set, but the tessellating itself wasn't terribly difficult. I had originally planned on using CNC-ed wood for this project, but various issues with photogrammetry and CNC-ing forced me to scale down and redesign an origami piece that was feasible within the time constraints. I chose this tessellation to match the scale of the project's LEDs, while the patterning was intended to symbolize the scales/armor we put up to protect ourselves, representing the hesitation to fully communicate with others.

Project Layout:

Deconstructed Layout View of the Project

Laying out the project after the initial fabrication was complete made it interesting to see the bundles of wire and the layers of material and information embedded in the piece. It was also a way of organizing the system and visualizing it before the absolute chaos of wiring and undoing the wire bundles.

Wire Securing:

Wire Securing Process

To ensure that the wiring connected directly to the Arduinos would not get pulled out by the dancers' movements, I used a piece of wood to secure the breadboards and Arduinos and zip-ties to secure the wires, protecting the wiring as best I could. The zip-ties were threaded through holes in the wood and tightened until the wires wouldn't budge.

Process Reflection

The process of this project was, honestly, a bit bumpy. I was fairly confident throughout that I could get the electronics working, but I took too much time to assess fabrication processes. The photogrammetry program ran into issues with the images I was feeding it, and the modeling for the CNC-ing relied heavily on the 3D model the photogrammetry software would output; I burned a lot of time trying to make the initial idea work and took too long to bail. The garment design was also a huge undertaking given my ambitions. Since I didn't have two designated performers, I had to make sure the final design could fit anyone (although that would have been my intent either way, I suppose). Many of the solutions that allow for this flexibility can only be described as “jank”, but they functioned as intended and the piece seems to perform well in that context. Although it worked out in the end, the final project wasn't as integrated between the technology band and the origami sculptural piece as I had hoped; in some of the imagery you can clearly see wires lifting the origami piece or threaded awkwardly under the dancers' arms.

In terms of technology, the DFRobot Mini MP3 Player was a massive asset for playing the Exploded Ensemble's music, but it was also very finicky at times, with some functions not working and frequent general troubleshooting. For the LED feedback, I got it to the point where different movements clearly produced different light patterns and colors, but I would have liked to develop it further so that people could better understand what data was being collected from their movements. The vibration could also have been composed better: the motors were definitely felt, but because the two motors on each garment were attached to the same piece of plastic, the sensation was more easily confused or lost than I would prefer. Lastly, the general use of wires felt really wasteful. Shortening the lengths would have made the concept weaker and more fragile, but in the future it might be better to consider ways to reduce that sort of excessive expenditure of materials.

Conceptually, the review and feedback were very helpful for understanding how others saw the project, both in how it was presented and in how it functioned. The project would have been more compelling if I had introduced its rules beforehand, or if I had choreographed a dance that would provide an intriguing performance for the audience. The question of a goal, or of stakes, was frustrating because one of my original ideas was to give the dancers a prompt (a piece of music or a text) that one would try to convey to the other; with more planning, this part would have added much more depth and incentivized the dancers to reach for a specific goal. Looking forward, I want to consider implementing myself into the final project and becoming a performer. Designing the performance sat in the backseat this time, mostly because I was overly concerned with the fabrication, functionality, and aesthetics of the project. Moving forward, I want to expedite the fabrication portion to leave more time for the other intriguing performance aspects of the work, or stay diligent about implementing these ideas earlier in the process.

Arduino Code
//Blind Understanding by Alex Lin 
//This code uses the input of an accelerometer to control several outputs, including a Neopixel LED circle, two vibration motors, and a speaker based on various data (ex. pitch and roll) processed by the accelerometer. 
//Code requires two Arduinos for the two users, but should be able to be condensed into one file of code and run through one Arduino. 

//Accelerometer Code From: https://www.electronicwings.com/arduino/adxl335-accelerometer-interfacing-with-arduino-uno
#include <math.h>
#define x_out A1 /* connect x_out of module to A1 of UNO board */
#define y_out A2 /* connect y_out of module to A2 of UNO board */
#define z_out A3 /* connect z_out of module to A3 of UNO board */

#include <Adafruit_NeoPixel.h>
#define LED_PIN 6
#define LED_COUNT 16

#define vibrationLeft A4
#define vibrationRight A5

int red = 0;
int green = 0;
int blue = 0;

int together;

int ledLoop = 0;

Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_KHZ800);

//DFPlayer Mini Code From: https://wiki.dfrobot.com/DFPlayer_Mini_SKU_DFR0299
#include "Arduino.h"
#include "SoftwareSerial.h"
#include "DFRobotDFPlayerMini.h"

SoftwareSerial mySoftwareSerial(10, 11); // RX, TX
DFRobotDFPlayerMini myDFPlayer;
void printDetail(uint8_t type, int value);

int volume = 10;
int max_vol = 30;
int min_vol = 5;

void setup()
{
  strip.begin();
  strip.show(); // Initialize all pixels to 'off'

  mySoftwareSerial.begin(9600);
  Serial.begin(115200);

  pinMode(vibrationLeft, OUTPUT);
  pinMode(vibrationRight, OUTPUT);

  Serial.println();
  Serial.println(F("DFRobot DFPlayer Mini Demo"));
  Serial.println(F("Initializing DFPlayer ... (May take 3~5 seconds)"));

  if (!myDFPlayer.begin(mySoftwareSerial)) {  //Use softwareSerial to communicate with mp3.
    Serial.println(F("Unable to begin:"));
    Serial.println(F("1.Please recheck the connection!"));
    Serial.println(F("2.Please insert the SD card!"));
    while(true);
  }
  Serial.println(F("DFPlayer Mini online."));

  myDFPlayer.volume(volume);  //Set volume value. From 0 to 30
//  myDFPlayer.loopFolder(15); //loop all mp3 files in folder SD:/05.
  myDFPlayer.play(1);  //Play the first mp3
}

void loop(){
  static unsigned long timer = millis();

  if (myDFPlayer.available()) {
    printDetail(myDFPlayer.readType(), myDFPlayer.read()); //Print the detail message from DFPlayer to handle different errors and states.
  }
  
  int x_adc_value, y_adc_value, z_adc_value; 
  double x_g_value, y_g_value, z_g_value;
  double roll, pitch;
  x_adc_value = analogRead(x_out); /* Digital value of voltage on x_out pin */ 
  y_adc_value = analogRead(y_out); /* Digital value of voltage on y_out pin */ 
  z_adc_value = analogRead(z_out); /* Digital value of voltage on z_out pin */ 
//  Serial.print("x = ");
//  Serial.print(x_adc_value);
//  Serial.print("\t\t");
//  Serial.print("y = ");
//  Serial.print(y_adc_value);
//  Serial.print("\t\t");
//  Serial.print("z = ");
//  Serial.print(z_adc_value);
//  Serial.print("\t\t");

  x_g_value = ( ( ( (double)(x_adc_value * 5)/1024) - 1.65 ) / 0.330 ); /* Acceleration in x-direction in g units */ 
  y_g_value = ( ( ( (double)(y_adc_value * 5)/1024) - 1.65 ) / 0.330 ); /* Acceleration in y-direction in g units */ 
  z_g_value = ( ( ( (double)(z_adc_value * 5)/1024) - 1.80 ) / 0.330 ); /* Acceleration in z-direction in g units */ 

  roll = ( ( (atan2(y_g_value,z_g_value) * 180) / 3.14 ) + 180 ); /* Formula for roll */
  pitch = ( ( (atan2(z_g_value,x_g_value) * 180) / 3.14 ) + 180 ); /* Formula for pitch */
  //yaw = ( ( (atan2(x_g_value,y_g_value) * 180) / 3.14 ) + 180 ); /* Formula for yaw */
  /* Not possible to measure yaw using accelerometer. Gyroscope must be used if yaw is also required */

  Serial.print("Roll = ");
  Serial.print(roll);
  Serial.print("\t");

  Serial.print("Pitch = ");
  Serial.print(pitch);
  Serial.print("\t");

  if(roll < 30 && volume > min_vol) {
      volume--;
      myDFPlayer.volumeDown(); //Volume Down
  }

  if(roll > 90 && volume < max_vol) {
    volume++;
    myDFPlayer.volumeUp(); //Volume Up
  }

  if(roll < 60 && red < 255) {
    red++;
  }

  if(roll > 60 && red > 0) {
    red--;
  }

  if(pitch < 150 && green < 255) {
    green++;
  }

  if(pitch > 150 && green > 0) {
    green--;
  }

  together = pitch + roll;
 
  if(together < 200 && blue < 255) {
    blue++;
  }

  if(together > 200 && blue > 0) {
    blue--;
  }

  if(pitch > 120) {
    analogWrite(vibrationLeft, 255);
    analogWrite(vibrationRight, 0);
  }

  if(pitch < 120) { 
    analogWrite(vibrationLeft, 0);
    analogWrite(vibrationRight, 255);
  }
  
  Serial.print("red = ");
  Serial.print(red);
  Serial.print("\t\t");
  Serial.print("green = ");
  Serial.print(green);
  Serial.print("\t\t");
  Serial.print("blue = ");
  Serial.print(blue);
  Serial.println("\t\t");
    
  strip.setPixelColor(ledLoop, red, green, blue);
  ledLoop++;
  if(ledLoop == 16) {
    ledLoop = 0;
  }
  strip.show();
}


void printDetail(uint8_t type, int value){
  switch (type) {
    case TimeOut:
      Serial.println(F("Time Out!"));
      break;
    case WrongStack:
      Serial.println(F("Stack Wrong!"));
      break;
    case DFPlayerCardInserted:
      Serial.println(F("Card Inserted!"));
      break;
    case DFPlayerCardRemoved:
      Serial.println(F("Card Removed!"));
      break;
    case DFPlayerCardOnline:
      Serial.println(F("Card Online!"));
      break;
    case DFPlayerPlayFinished:
      Serial.print(F("Number:"));
      Serial.print(value);
      Serial.println(F(" Play Finished!"));
      break;
    case DFPlayerError:
      Serial.print(F("DFPlayerError:"));
      switch (value) {
        case Busy:
          Serial.println(F("Card not found"));
          break;
        case Sleeping:
          Serial.println(F("Sleeping"));
          break;
        case SerialWrongStack:
          Serial.println(F("Get Wrong Stack"));
          break;
        case CheckSumNotMatch:
          Serial.println(F("Check Sum Not Match"));
          break;
        case FileIndexOut:
          Serial.println(F("File Index Out of Bound"));
          break;
        case FileMismatch:
          Serial.println(F("Cannot Find File"));
          break;
        case Advertise:
          Serial.println(F("In Advertise"));
          break;
        default:
          break;
      }
      break;
    default:
      break;
  }
}
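The accelerometer math above can be checked off-device. Below is a minimal C++ restatement (a sketch, not the installation code), assuming the same 5 V reference, 10-bit ADC, 1.65 V zero-g bias (1.80 V for the z axis in the sketch), and 330 mV/g sensitivity used in the Arduino code.

```cpp
#include <cassert>
#include <cmath>

// Convert a 10-bit ADC reading (5 V reference) to acceleration in g,
// assuming a 1.65 V zero-g bias and 330 mV/g sensitivity (ADXL335-style values).
double adcToG(int adc) {
    return (((double)(adc * 5) / 1024.0) - 1.65) / 0.330;
}

// Roll in degrees, shifted into the 0-360 range,
// using the same 3.14 approximation of pi as the Arduino sketch.
double rollDegrees(double yG, double zG) {
    return ((atan2(yG, zG) * 180.0) / 3.14) + 180.0;
}
```

An ADC value near 338 corresponds to the 1.65 V zero-g bias, so it maps to roughly 0 g; lying flat (y = 0 g, z = 1 g) yields a roll of 180 degrees.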
]]>
Seeing the Wubs https://courses.ideate.cmu.edu/62-362/f2019/seeing-the-wubs/ Thu, 07 Nov 2019 01:39:03 +0000 https://courses.ideate.cmu.edu/62-362/f2019/?p=8831 [62-362] Activating the Body | Project 2: Flow| Fall 2019 | Scarlet Tong

Music is played through water-filled speakers, generating ripples that are reflected onto a wall. Changes in frequency and amplitude cause variations in the amount of movement on the water's surface.

The project proposes another way to push how far music can create an atmosphere, influencing the audience in ways beyond the acoustic.

Final Installation 

Overview of installation Water reflection Installation interface

 

Here are gifs highlighting the different elements of the project.

Disassembly of speaker element (and also making a mess along the way)

 

 

Process

For this project, I struggled a lot during the conceptual development phase because I did not have a clear idea of the physical form of the final product. Finding Dagny Rewera's project Invisible Acoustics was really helpful as a reference for thinking about how different materials and mediums can visualize the effects of sound. The hardest part of the project was the constant trial and error needed to find an optimal way to show the vibrations caused by the music. I had a lot of assistance in procuring materials and speaker woofers, so I could begin testing early on. This was the first time I have used water in a project, and it was hard to make sure the woofers were protected from it. Fabrication was minimal: acrylic rings that fit around the woofers and hold the cling wrap in place. I used p5.js to make an interface that lets me play different frequencies of sound based on where a finger touches the screen, as a way for other users to interact with the piece after the performance.

Wiring Material exploration First attempt to put water into woofer Experimenting with different vibration mediums

Testing the water reflection effect
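The interface's core mapping (finger position to frequency and amplitude) is a linear interpolation. A sketch of the same math outside p5.js, using the 40-200 Hz and 0.5-down-to-0.01 ranges from the code below:

```cpp
#include <cassert>
#include <cmath>

// Equivalent of p5.js map(): linearly remap v from [inLo, inHi] to [outLo, outHi].
double mapRange(double v, double inLo, double inHi, double outLo, double outHi) {
    return outLo + (v - inLo) * (outHi - outLo) / (inHi - inLo);
}

// Finger x across the canvas width sweeps the oscillator from 40 to 200 Hz.
double touchToFreq(double x, double width) { return mapRange(x, 0.0, width, 40.0, 200.0); }

// Finger y sweeps amplitude from 0.5 at the top down to 0.01 at the bottom.
double touchToAmp(double y, double height) { return mapRange(y, 0.0, height, 0.5, 0.01); }
```

So touching the left edge of the screen produces the lowest tone, and sliding a finger downward makes the tone quieter, which is how the performer controls the water's vibration.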

//Scarlet Tong
//sntong@andrew.cmu.edu
//Activating the body
//Gradient Sound Interface

let osc; // p5 oscillator, created in setup()
let fft; // FFT analyser (instantiated but not used below)

function setup() {
  createCanvas(windowWidth-20, windowHeight-20);

  osc = new p5.TriOsc(); // set frequency and type
  fft = new p5.FFT();
  frameRate(80);
}

function draw() {
    let millisecond = millis();
    var color1 = color(mouseX, 233, 210,200);
    var color2 = color(mouseY, 75, 95,100);
    setGradient(0, 0, windowWidth, windowHeight, color1, color2, "Y",255);
    noStroke();
    fill(255,255,255,100);
    ellipse(mouseX, mouseY, 100, 100);  

  // change oscillator frequency based on mouseX
  
  if (mouseIsPressed) {
    osc.start();
    let freq = map(mouseX, 0, width, 40, 200);
    osc.freq(freq);

    let amp = map(mouseY, 0, height, 0.5, 0.01);
    osc.amp(amp);
  }
  else {
    osc.stop();
  }
  

}

// Helper used by draw() above; this definition was missing from the original
// listing. It is the standard p5.js linear-gradient helper (adapted from the
// p5.js "Linear Gradient" example), drawing the gradient line by line.
// The alpha argument is accepted but unused in this sketch.
function setGradient(x, y, w, h, c1, c2, axis, alpha) {
  noFill();
  if (axis === "Y") {
    for (let i = y; i <= y + h; i++) {
      stroke(lerpColor(c1, c2, map(i, y, y + h, 0, 1)));
      line(x, i, x + w, i);
    }
  } else {
    for (let i = x; i <= x + w; i++) {
      stroke(lerpColor(c1, c2, map(i, x, x + w, 0, 1)));
      line(i, y, i, y + h);
    }
  }
}

 

]]>
Project no. 2 – Sample Theremin https://courses.ideate.cmu.edu/62-362/f2019/project-no-2-sample-theremin/ Tue, 05 Nov 2019 19:15:02 +0000 https://courses.ideate.cmu.edu/62-362/f2019/?p=8857

The piece. Inside each of the glass fixtures is an infrared sensor.

These sensors can be manipulated with any object, including hands. (See: hands)

Description

    • Hanging from the ceiling is a utilitarian-looking chandelier made out of rectangular pieces of metal. The chandelier points in four directions, with small bowls of frosted glass at the ends. Inside these bowls are infrared distance sensors, wired through the arms to an Arduino Uno in the base of the light fixture, which connects to a laptop. As the laptop sends sound to four speakers surrounding the audience, the four sensors can be manipulated by any object that blocks light, affecting both volume and a combination of pitch and playback speed.
    • In this performance, my wallet and jackets became essential tools for interacting with the sensors.
    • In this particular performance, the playback was a sample from Hildegard von Bingen's “Voices of Angels”.

 

Progress Reflection

In addition to Activating the Body, I am also taking “Twisted Signals”, a class centered around the Max program, taught by Jesse Stiles. In an effort to combine the two workloads into one (considering I had no prior experience with Arduino, Max, or gutting found chandeliers), this performance is the first iteration of a result. To give myself enough time to create something functional, I front-loaded the coding and sensor-testing aspects of this piece, which I thanked myself for as the presentation drew closer. Shown in the code below is a delineator (“!”) meant to funnel each piece of sensor data into its proper place, an aspect which took a grueling week to figure out how to implement, as no one was available to show me and no answer already existed online. This became one of the most rewarding takeaways from the project: suffering, and then figuring out the answer to my question on my own. With that experience, I feel far more confident approaching aspects of physical computing I am less familiar with.
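As an illustration of the delineator idea (this is a sketch in C++, not the actual Max patch), a serial line such as "312!87!455!102" can be split on "!" into the four sensor readings, one per chandelier arm:

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// Split one serial line on the given delimiter and parse each field as an int.
// Mirrors what the Max patch does when it funnels each sensor value
// into its proper place.
std::vector<int> splitReadings(const std::string &line, char delim = '!') {
    std::vector<int> readings;
    std::stringstream ss(line);
    std::string field;
    while (std::getline(ss, field, delim)) {
        readings.push_back(std::stoi(field));
    }
    return readings;
}
```

Each parsed value can then drive its own parameter (volume, pitch, playback speed) independently, which is exactly why the Arduino prints all four readings on one delimited line.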

The accursed @seperator attribute, key to all knowledge and wisdom.

This process also let me explore just how much I can manipulate found objects. Whereas in the first project the electronics were “embedded” within the headlight and a cardboard box, here I went to far greater lengths to gut the chandelier before rewiring it with the sensors.

The original wiring for the chandelier.

The guts, the gore, and all else that was inside this thing.

There are some portions of the project that didn’t turn out… ‘as planned’ is harsh, but I would have liked to make more headway on several aspects of the piece. In particular, I was beginning to build functionality to move through any given sample. In an earlier version of the Max patch, I created a metronome that could also derive a BPM, so as to make loops of a certain number of measures. However, I still have a lot to learn about working with buffered audio in Max before the theremin can reliably play with this.

Pressing and releasing the + button in rhythm would end up giving you a BPM, and you can select between 2, 4, 8, and 16 bars to loop in the waveform viewer further below. Also pictured is a way to record through a microphone into the buffer, named friend, because it is my friend.

I may actually continue working on this, though, which makes me happy. The feedback from the critique session allowed me to think of this in far more interesting performance contexts, and combining furniture/home appliances with music has started to pop up in other ideas I’ve been having recently. Maybe this is the start of something!

Arduino Code

/*Project no. 2 - Sample Theremin
Padra Crisafulli

Description: This code allows for the four infrared sensors to communicate 
with the program Max to manipulate sound. Max reads the printed serial data
and delineates the exclamation marks out to funnel each piece of sensor data 
into its desired place within Max itself.

The four pins are named after different colors for convenience's sake only,
as the chandelier they were wired through was so tight that tracing the wiring
without color coding would be a task meant for the devil's enjoyment.
*/

int analogWhite = A0;
int analogBlue = A1;
int analogYellow = A3;
int analogGreen = A4;

void setup() {
  pinMode(analogYellow, INPUT);
  pinMode(analogWhite, INPUT);
  pinMode(analogBlue, INPUT);
  pinMode(analogGreen, INPUT);
  Serial.begin(9600);
}

void loop(){
  Serial.print(analogRead(analogWhite));
  Serial.print("!");
  Serial.print(analogRead(analogBlue));
  Serial.print("!");
  Serial.print(analogRead(analogYellow));
  Serial.print("!");
  Serial.println(analogRead(analogGreen));
}

 

Max 8 Code (Compressed)

<pre><code>
----------begin_max5_patcher----------
5637.3oc68rscihrcO69qPKk7Pt3wKp6Px4gj7Ej2mLKuvRXYFi.E.Y28Yxo
+1CT6BTgDfJfBI59L8ZFaWbceq12pM65O9xCqeI4qAYqW8us5WW8vC+wWd3A
4gJOvCpwOrdu+W2D4mIur06Cxx72Er9Q3b4AeMWd7r7jCUGL4XdTPd92ND.O
40qW8apScvOeyagw6dNMXSNbVpK9ImGWwb7J+EAU9Sbwgpum3i6CiKdfR..e
5fvqQdTj5ngak.SxK+9ufbDqKO3e6Keo7GOZH9EE9QvS67Ci+dE17geZr+9f
yN6ux9sNQ2rvcw9QqeT+uJ++WiR7yK+invr7Szjjzvf3b+7vjXcboMRkCSRp
Dtk+BQ3k+hJznUE3Qwc4GGGDsI4Xr79bLmNxZiNRWWCQoEDh7fzmCh8eIJPG
Zy7+HX6y944ogubLO3zekoH1JpcI8L5XPxqUGt535O98E.Z44+EgySJnuw4i
Rh20KO4rmlujiya8Yk8VRZt4OrJ1baOpiwg4Y4eCHLT37RwupeMPYwtmq4ml
OxIaX4rLpKQNYCasIajwLYKN3yh69B76Pj+299pWKmVrEYvrrW7i2YBVyHx4
MvrmtwZTqXMtMr1YVUwPuCpX3HoRERoVkBULft4YVEif+CiFFpE0vbuUuzwr
uzUYa7iBHiSAC20QN6BjeXWwZtioJXDtqsNFRGGFJnNOwJlZfXVECcQVDC+W
990zbzKF5xJwPhijGVn6jYEiDtTKhgE541VZebEZjxoBv1mPx+DNiwjPa3Ha
wvE4EnybvE4qspqLiMrAdgFlxogf0bqE1fqXss8TKJYz3HQNQDQA2WX1BGso
xzBIwuuBMI0MJEpDv0TjslJ5YQrbWZRxGAUNkhW8eTxTWgL2+LCXyXrjAinW
QYDwT+S8bV2v2ohiU7N6wwoW7yB2jW3mQo+zE9XPqbCY8qIQQIetKJ4E+n7f
8GRzcRo3ro68iy2jjVhTJmLqOaIopfnA3pD2VuOqlf.mGBooqqXexVv4II.V
42ToWs6BKHtQAw6xeS+tyC27t9Cn5JqAcD9jeeqO7leVPe29gvBdVGX2+6Q+
nv7u0B3kGVnVHOMn3dUO7U+138Vy3nFH2kDSHMubSiZ3GmzRP96lfFFoK0TA
Qp.zxAMvVuThJhQnxIF1NlgEiylLAj8VK6rIwYAEx.kClosbHCD7hgKRUI+w
1bQ5xHjAJ3aosCYfvVPgLnR0osCYfvWTgLToO0xgLPI+nDxfhMa8PFnr+Ljg
+9KjA7cHjALBcyWnAWmeXhY.+yTLCG2+RPZ21yLYI7PtRWS3PoBXoEvS30m7
fy8BUQhY.UEVGU6MPoQ5iMlKlkkOBY+PAwiCCIpb5Z4PAE30KEdHpv4r4.Cc
VL7PNeNDR41Lg0YmkPl1jDg.7fngXjwnnwoErPXerfzCVT5B7LfEb6iE3dvh
RmKmArfYer.0GVvmCjvlKarDEJ7pl33T33dw+wFoVbXtCDkMx4J3J2XU.nYC
WYqPOMRaxvLLqiqjESduHtNywRkSEKnrWV4YkkydIyYwvEwHwbvEYnkQ1KwT
7bTvCL7BJ6kXhybTvCL5hJ6kUg.X4rWx9gofGTrYqm8R9eVvC+7j8xNDLe43
quFjVIX18BNWmfxqluEFCeprvgzLYgDtPsoqbMQZhEPZJX.wxHMwa1PZr83z
f0E6gzixMuB8Q4Eyr5BotJt3gkNhyf0AhV8yoZCgX6zDlmraWTPm3YXbtIno
miUQSD6FkMzo4QKR3MKwkfWREhMeVpC6kSvkPRQrdYXae21+LLdaxmiLmuTo
Kspx.RXquqLNd1xGBgL5DhnRVmJgHHGKkPDpiMYooA9aGYwpAVIUqOKwVUPx
nJVsMI62GDmeA18ut5+pg8yN9JVvkxjvmWL8IQ4LOmImORpUqd9fnB.dEG0u
i.OdEVlKkqiqhwvyZ6CbzlwM+dv2NdnW+.drseYBZibUqUJa3KDC8RzFiWdb
XADqrkYvd1k+ZWtqBkgUC2ZLW9rYQAO9TrC9HXaCJjERYWhli7VRHKo7Vhmi
zVRFUJDdQBemJBkOBC97ivrvWBqRaS0qJ40WyBzSBktwwnjMuGrcap+trMoI
QQ5Sx1DEt4872RSNt6M8iCgP81k2.bhOt7DurqJUW0GIIcaAvqcjg9gch.l.
DXLDVg2jWbdbcfwA6eIX6EE2F.x54V70vBLNHMqYov8vZ+CGzNby5Fau+umH
ePt0kkUAvlj1rRsRCJ4mv8e5B8SKHE4EzgioPtC+JmdpxuJIyowGCkfRUIcU
ARRgmxBGK6f+FUhGKjwpSl3gfX8fRUVe.uDcgto.Qqx8JYqf3SiDGV9XBiOj
Fj0RIB9v5sAu5eLJ+4WShyyB+qPI4gOkzR8y+pBLa8j0U.2+YZXYJoUWxtzv
sIwk.QiWa4gqdcEHGXxgoiLxqH1+PK2Ljf4NNYgNk7iYu3mJm8AoV.Wmqzjj
nlmp99hBdMWc5CgwwmQEKTC28ISC28VO26KIEmbeeOa4Yxd9XLb1mKqAymKS
ldyqyOJRYRr4i+q9wg68yCJyDrDccpO4E5FPMOyGsblsEB5aB9LbKjraPVq9
bE2P3gJwn0074sg6Bxxadr7BsXMORc8MpcniUpNetLs4QE3QyKnQyRRedqtJ
4FG+LUyGBKDVhB2VqbFzenMiP2m.PibocEHnZtlPoT6z9BKY68ydWmlcM2uZ
7D5ZtXsnjr.Wan7o78F6uIO7i.46+7S1Z8fo50LciJEB0s9pd48cwaKoZRx2
muVpEK6oCk9Ke5hZy3p7DuGm7h1MWN776ElQT+tIMerW577EQHgZvIaKYhZm
eexGAukjF9WKzR4GUmpQs5qsxP+XjtZ3j5YcTmUEnev2xzOYC8rEtn4wPtbu
p+0m3TifUpeVsq1sKYBPOKVth8LZko5FOz1BOe.bF1EfvIazMIicZqtK60mc
6sY2tCa2ca+951v6xNdCa4FXOWACmrmqr4AFx8fXowmMEsca58XWuwkzpscs
Bh+Z12MzFeO14MyV+Ur2eUa9W0t+Ur8ec6+W0G.C7CvDeAFh+.83SvU8Knee
C52+f17Q3Bgp18RnKOEZ2agN7XvHuFZ2ygy0fbtN9KNeu4E8RE9+SYGB1TLy
XU9a9wumsJOY0+cgNk+4yugSH0aAQGJUCVvheIH57qq0EmRtvbv2gm5i34ro
R8oEu6js1l1bLjIxGN2ZYSKlSkpBvhA3tpxUnBXwqXU+blvdD8bXph+etjF3
e11.IViuAjrlIT4RAwrf7U+inyOcGw30sGGWPX6goPIOwDHJB4.+CgKO344j
4ptO1cdAZkEQtEBnW3hWSh8joymGgv0ctqyj2BcmIv0NBalo9pLkceo93kC0
WVrW2NpuyBf5WVGBvJoLMlPiBuYZbAHXWnoixQlwDPikIf8V.LgrOKKbP6qs
ebj+Bk9XjqbYegbPvcdxC6wcEmBy0.VBYzrD2k.KIHZ0d+ME5VRxdaETiGYV
YRxisrzkVQsElCNUZnM694Ojd3OrE.+Ic0+fyy6J.1ukUD+vBYpihQ3.8KEG
tYLBmQOQAudA3p5+9+SZw0uplWzKiIJLNntqDfWHJ7j1a3PqQQ0guYylUe7R
vmqvKiA9940klAGpi6kAgbY6V21Agr.3H6S1F9ZXPZ1px1itE3MOZP4iLQ9F
FAkQD3uLxg8Difvro6uFqGd0Rvo4MkYTapJudT6+s+Do1hlmisgyzzdzscY9
Vhhddqet+Yz9yH9kWnxlvYO8GVWc2+Zi6+hmP45VE7sVLmV2vYTzcYgS7XUA
lsV8o1zkPTuuIjYuIY8wLk2C1r2S4xjMo2Cwr2y1T+Oa68b1gZr1JF7wk0kn
woBG4rG+8WOv+2hzHJ1UbGLhdSRjmDNWOJpZqci.YwQ50hOGlla5Q66duol9
yhYJkeqjeeU4btUJOvy5Li0n6+Tg+xeYEZIkHOwSBWvWDUcTvmY4+kPPwks8
sKiTaLdlXemEOoehTDv0ES3XVIQe8j0B9xHoRkE0tisxizLvaX3K4Mdd0Sll
37Gb27GwBf+bv+8EXRXYs3J+rqNycQsvw5Q1biV7XuQs1w8YItY4xTlIrqTa
CRBQ400OIMK4X5lJ4Of283pKIKaCxxCiqKznesxssGq62.CkYOV3ST7NQFBe
3Vwk5hcGsXAbDoeH24lC4lJRftChD7A.eh6.7wF.74BRI2T3itv4uDz..Pr6
8.BcV7P3..P9c.9vdCgBdOjBwtCbZ7MG.YCvBCAcufPictv4d.g3gHFxtGP3
..Pu6.7gFhlFD8d.gkBVTCgP18B.wCwZB5d.gjg.g3aMDNj4wKovDFBfOkvD
z3G5K1hIKzxeL0h8uuOIEn8+oJu+5Tezw2JvWtD4rvG00EYVnmulJnxOHNU+
bLeMUNs+0Tg0WQztxfPKYOPiBn0+COOiAmHJsIopQZtTB8Bq3cMuB2T4Yurl
QAEHCfhxOWvytv8gaOjDFmqHHLYChpjUxpaEDMGA8Z.LU0OLXyENgLEmXcRY
mR6erqN+iJ0y90Etv09FVZeK1P8kGBM4HUw4N8dCDVr19nZP71U6ylB15fIx
tejBc8nVBcQNy.5JaEGSCg8fJVz1nKFYSzcaX1gH+usBZlqSCgEBOcFLBYMA
Z1bfxVfEy4BMVrpRUsABa0s2bmU+BZU4OpNSCuMHOYPKNoyUcoC8ZR6GLnM6
n5STjI2IZP1sY6n+MgYz2gt0HPdtx9THmIUPnZ7znqQedKb61FenuFPwD+jP
wbc8tQTL2eRnXBOxsghgc9Ighw8D2HJlUkwNFuM4FoXmC62mTnp9UTnoqWGO
pF.3qQIEO4Ai4WzgtGloMGJzBOqrsg6mFfpeIo68y6tu40RWQj0e2gtRLHMT
1++ftNf2OJzQOL33z7SE4+DSEcc7fFt57SFE+DSF4dnmtMTQ2aCUrtfWGJgr
dSLnm+PtU61nQwVXytp+IVQbTakG0cVn0eHCkJ6JaGuUbBtCrCIH+kmNq.1N
FpYsvtxPEZV1ybjevKN0oPsIui0Jmh2R+YcTlnMdqNFcG1pi8Dx13FPSQPEu
My6zwHueX1oiQVbmNtiG1McyNt8VEr+1Me2NadRHG024Y681gQtIlf3KpsHq
p8PVKuEYgr5tjavljzsU6OMSAaUJHPdRCftjwrcX0F1NeaAQH6sEDY4ciGqt
8yFT5+xujm5GmE4mmjZ2FbNz0ukHt2n1Dgaq20uL542kaW6k3FragZsl9Mc9
1h6PVdKtyaV1g6nz+bGtq4U9C7NbmoNsd68XUsmEAhsHW7svi0eTbX0ddqtP
cUcq+luuBuhthuxc801yCXf2Jhq3GJ030eqwVNi7g0rdQ.D67pYPgdWTECdM
JF5tJP1FEs+4UjAmB3FrLxb02erm9lPWyxy5BtfcAV5..VN9xQDvlrviUO5R
DwYZHhA3QyJ24b7P.9+6RgzPfqF.paGMf4ZBj4oUrui+M4Nf2Tu7Rj.3dZil
JrIlNroVZYnjjpj5PNSG13F.ah9mvBSQIPgvwgtI1YinXXDCrygmNbyLUQCp
S31EA6TfPupC1tu6X9I5F.rTyAVWGVO.6zTlblBudjH5DXqz0Ax.BWWIvRrf
zJwDBItWXioQGI74imaBnRMGTQddyGKGahAj9zAzXRu1foxuQlvu850xFSkF
XWvuoSilJrYh2CX5oJzeZuIzM4MYhLKoOxMiRfDIBaqAZilHnILwUBZuFQQd
pITfHfVcaOMHy0BlhfzWHbzLEIGYa0RlArzA.rbXernwnafYJgINU0qnZYeO
sVbPoyxBRCbSUKPrwaBcSdSlXef0qZXhxcZPuf9noBalXhfg5SfFK3ZBzkas
Zy1rOS.VpX..KmnM6SMRM6i53Ley9Lw3GB26zOXOBPIQnOZhRDbOK4jnhpR3
yl7fQfJ0bPsgzPCgg4z8QtI1TZ9IT2RwDdNdAkXNG5QnBX6bWNZphGtln1zY
fPagqEXGs+QY0vtqJ.SaA63qMsafT5xJNl3o8Og3Dr64YUXmbMXWLLX2yk7D
WmtiI0vtGDEpUfcSBli2aVRvPoCHT0bU8foBYXSc2naEwvFnqZsCU5iaNhoz
DhqGLUv1DCgbQudT.0pfPk7IOaQPcL0ehtY0pkeT8gPAE.5rjAai.VpX..KS
b4npjk34LaYv1Dml8LHA1PdF0FLQoAlIFn4NF34qJ0sUjaKj5VlIlcOy1b6F
YYfF.8QSE1DlBac5XiKxEp+b.3ZLbpPG2TnC2Ez4AdEp.N8QSE1XlBajNUn6
TXxCeB5ZNdpvGcxyHTyATKzh9noBajIuPXZq8kMWHLFd5TMU4Zae8HFkj09g
MvFWU9egcA9yFQUMRZGasDPLGaE4ux.mJx+YY4.XN1Jx+aPz8lnBhzaNBUe3
TpHQoptN5bDBsQvpy.fUNRKfZ0HEMmAQCMKzbp2zcj2iCqx5ocJT8+lI7zch
2axS.oBSUbzIsuxmRUx3cI1ZkBoFKC2I4jRbg17tzV.EV65YwMdpwBwlBrL1
kipDhUKG6rHDSMM0ac6trmd.H7YKzIi.Up4fZi.mZD2DxcF0ZXzpLyrPA1PM
cMh6btdEcPs9aZilJnYhsUdut8oVMPHYBZClJjYpNcrEdQnawKhXRPm8lvAl
xjJ7YYQcpFb4TCx7CoTjAfJD.OC9vIZL.jWTWl.a+42FgDDOyo2p04Te.fDJ
NQ6HA9FfDlKznMv1VEH7AnOsWKwpna0GM0odroCaLISVECYkOCVHFRhQVS6E
zT0FHn6kh7ZYDDBIUsO+ZgPHIFY9rW3Vs+i.TTkGA1ffhmdLXBl9pAS383EB
5F.rNlCryYfVDzj44PHTpY3.csw.rRB0qdvTkFbLb5UmzWlVQXVFC.1VNeYz
JiQudsCpVXLtk77B6M4jiyA0iLJDUsKzGOZNx5INGahsR1fWKcX0yEvR6wZY
TYk5oikMFgc0wR7zKQCLY3bmqhkbWowMNwSGKaLZF3WFEVTux+tEfgndsqzG
MUXynDjNvUbupZF.QC2BScxcWmyGyfsz55UsAOcMMFkBcx00zP0WBdJ19lCM
BPMobpczVZahi8MEhmbI0pnnLsvloL6GAAdp05oNAUuTOsaHmHi9Pr5ky64p
Wt55il3rGIrglDrgbTcgIjPnCcBwzgNiL6wGnpJUwKApl7PhmzzTUMTonpZE
bsfdJjIoE+rk485E5jpXl.fG4foOwwrVNfBeNslu1.iLIpXjW+xNXo2Tfa9J
IGAyNfFxDPq67RV1o6pAMjimvVvFa5pCfUfdFlvYR.6LhI9snVqbjTiUiAL8
UQ2BkgGxX2FocxrUs3lpRi.oRgmEbEDgmbfkppf15YS.4XhNoB86VXgBJlNM
7UxzVFfM+cirsOJC3cissGmC3cSrsSjv6FcG42n6H+FcG42n6D+lbC0kXz2Y
hSGBWna1KGYcp7.zgN4WDZ3uHnSg3e3vGAoYpqV9NVu2+2SRqaHlEV3hggxl
Ox5zfOBqtd3B7S27VXdvl7ioPCa4qbnoKI6YQowGCUrR4VUzZY6Tor4tjcve
ipkFk700e4u8k+efWpRlU
-----------end_max5_patcher-----------
</code></pre>

 

]]>