Bryt 2.0: A Music Visualizer (Project Proposal)

Concept: 

A box-like music visualizer and music player that uses LEDs and mirrors to create a visual representation of music. Users will be able to interact with the device by changing the colors of the lights, and can plug in any headphone-jack-enabled device to select the music being visualized.

LEDs will be attached to the edges of the mirrors to recreate the infinity-mirror effect, except that four mirrors will be used to create a 3-dimensional cube full of endless reflections!

Hardware:

  • 3D printed frame for the mirrors (box-like for the device)
  • Mirrors (4 or 5)
  • LED strips
  • Power source (battery)
  • Headphone adapter
  • Sliding potentiometer (to change the colors of the lights)

Software:

  • Library for sound interpretation (getting input from the music)
  • RGB LED control (the actual visualization)
  • Library for playing sounds (enabling the device to play music)

Order of Construction and Testing:

  1. RGB light control
  2. Sliding potentiometer control
  3. RGB control with potentiometer
  4. Play music with the Arduino from audio-jack input
  5. Interpret the music currently being played
  6. Visualize the interpreted music
  7. 3D print the box frame
  8. Attach mirrors
  9. Attach speakers
  10. Attach the Arduino, audio jack, and other inputs
  11. Polish
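
As a concrete starting point for steps 1–3, the sliding potentiometer's ADC reading can be mapped onto the color wheel. This is a minimal sketch in plain C++ (on the Arduino the reading would come from `analogRead` and the result would be written to the strip); the math is standard HSV-to-RGB with full saturation and brightness assumed:

```cpp
#include <array>
#include <cstdint>

// Map a 10-bit ADC reading (0-1023) from the sliding potentiometer to a hue,
// then convert that hue to an RGB triple for the LED strip.
// Hue spans the full color wheel; saturation and value are fixed at maximum.
std::array<uint8_t, 3> potToRgb(int adcReading) {
    float hue = (adcReading / 1023.0f) * 360.0f;  // degrees on the color wheel
    float h = hue / 60.0f;
    int sector = static_cast<int>(h) % 6;         // which sextant of the wheel
    float f = h - static_cast<int>(h);            // fractional position in it
    uint8_t p = 0;
    uint8_t q = static_cast<uint8_t>(255 * (1.0f - f));
    uint8_t t = static_cast<uint8_t>(255 * f);
    switch (sector) {
        case 0: return {255, t, p};
        case 1: return {q, 255, p};
        case 2: return {p, 255, t};
        case 3: return {p, q, 255};
        case 4: return {t, p, 255};
        default: return {255, p, q};
    }
}
```

Sliding the pot from one end to the other sweeps the strip smoothly through red, green, blue, and back to red.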

The Indexical Mark Machine

I want to make a drawing machine. What interests me about machines drawing is rhythm in mark making, rather than accuracy and depiction. I think what’s beautiful about mechanical drawing is the pure abstraction of endless uniform marks made in a pattern, simple or complex, that is evidence of the same motion done over and over again.
I feel that what’s most beautiful about all art is the presence of the indexical mark: the grain of a brush stroke, the edge and slight vibrations in a line of ink that prove it was drawn by a human hand, or the fingerprints in a clay sculpture. I make the case that the difference between artistic media is defined by indexical marks. Do two works have different indexical marks? Then they are different forms of art entirely, showing us different aspects of compositional potential.

So I want to invent new indexical marks, ones that the human hand is not capable of producing.  I want to see patterns fall out of a mechanical gesture that I built, but didn’t anticipate all the behaviors of, and to capture a map of these patterns on paper.


I don’t care whether the machine can make a representational image; rather, I want to make a series of nodes and attachments that each produce a unique pattern, can each be held by mechanical arms over a drawing surface, can each hold a variety of drawing tools, and can be programmed to “dance” together.

Hardware

  • 5 V stepper motors
  • 12 V Stepper motors
  • 12 V DC motors
  • Sliding potentiometers; light and sound sensors (I want the frequencies of the mark-making mechanisms to be adjustable both by controlled factors and by factors influenced by the environment.)
  • Controller frame
  • Cardboard for prototyping the structure of the machine
  • Acrylic to be laser cut for the final structure

 

Software

  • Built from the ground up. The most complex programming will be that of the arms which position the drawing attachments over different places on the drawing surface. I may use a coordinate-positioning library for a configuration of motors that pushes and pulls a node into various positions with crossing “X and Y” arms.
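
The coordinate positioning described above can be sketched as a small calibration layer between target positions on the paper and step counts for the two crossing arms. The steps-per-millimeter constant here is an assumption (a 200-step motor on a 20-tooth GT2 pulley gives about 5 steps/mm) to be measured on the real mechanism:

```cpp
#include <cmath>
#include <utility>

// Assumed calibration: how many motor steps move an arm one millimeter.
constexpr double STEPS_PER_MM = 5.0;

// Convert a target position on the drawing surface (in millimeters) into
// absolute step counts for the X and Y arm motors.
std::pair<long, long> positionToSteps(double xMm, double yMm) {
    long xSteps = std::lround(xMm * STEPS_PER_MM);
    long ySteps = std::lround(yMm * STEPS_PER_MM);
    return {xSteps, ySteps};
}

// Signed steps needed to move from the current position to the target,
// so each motor knows both distance and direction.
std::pair<long, long> stepsToTarget(std::pair<long, long> current,
                                    std::pair<long, long> target) {
    return {target.first - current.first, target.second - current.second};
}
```

A coordinate-positioning library would wrap the same math in acceleration ramps; this is just the bare conversion.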

 

Timeline

 

  • Weeks 1 and 2

Make several attachable drawing tool mechanisms which each hold a drawing tool differently, and move it about in a different pattern.

 

  • Week 3

Build a structure that holds the attachable nodes over a drawing surface, with arms capable of moving the nodes across different areas of the surface.

 

  • Week 4

Build the control board and sensory responders that can be used to change the patterns of the arms and the nodes.

 

  • Week 5

Program built-in patterns whose parameters the controls will influence.

  • Week 6

Make some more nodes, and make some drawings!

Project Proposal: Fishies

Concept statement:

I plan on making an automatic fish feeder/pump system that responds to texts (or emails, or some similar interaction) – certain key phrases will trigger specific responses in the system. I want to use this project to synthesize a more human interaction between people and their fish — while texting isn’t the most intimate form of communication, it’s such a casual means of talking to other people that I think it will be useful in creating an artificial sense of intimacy.

Hardware: some sort of feeding mechanism (motor-based?), a small submersible pump, lights (LEDs), fish tank, fish (I already have the last two, don’t worry)… I’m not sure what I’d need to connect an Arduino via SMS or over Wi-Fi.

Software: I’ll need software to make the Arduino respond to texting (or something similar), and then perform fairly straightforward mechanical outputs.

Order of construction and testing: first I need to get the Arduino response working reliably, since the project largely hinges on that; creating a feeding mechanism will be the next priority… everything after that will largely be “frills”/things that aren’t crucial to the project. As I add components, I’ll need to figure out how to display them non-ratchetly. I’m also definitely going to need constant reminders to document my process.
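
The key-phrase matching itself is simple once a message reaches the Arduino. A sketch in plain C++ (the trigger words “feed”, “bubbles”, and “lights” are placeholders, not decided phrases):

```cpp
#include <algorithm>
#include <cctype>
#include <string>

// Possible actions the fish-tank system can take.
enum class FishAction { Feed, PumpOn, LightsOn, None };

// Lowercase a copy of the incoming message so matching is case-insensitive.
std::string toLower(std::string s) {
    std::transform(s.begin(), s.end(), s.begin(),
                   [](unsigned char c) { return std::tolower(c); });
    return s;
}

// Scan an incoming text for key phrases and pick the matching action.
// First match wins; unknown messages do nothing.
FishAction parseMessage(const std::string& message) {
    std::string m = toLower(message);
    if (m.find("feed") != std::string::npos)    return FishAction::Feed;
    if (m.find("bubbles") != std::string::npos) return FishAction::PumpOn;
    if (m.find("lights") != std::string::npos)  return FishAction::LightsOn;
    return FishAction::None;
}
```

Whatever service delivers the text (SMS gateway, email poll, etc.) would hand the raw message string to `parseMessage`, and the returned action would drive the motor, pump, or LEDs.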

Final Project Proposal

Abstract

I’d like to make an interactive 3D drawing box. Users can draw an object in 3D space and see their drawing projected onto an interactive cube in real time. It will use Unity, Arduino, a projector, and the Leap Motion sensor. It is heavily inspired by Ralf Breninek’s project: https://vimeo.com/173940321

As well as Leap Motion’s Pinch Draw:


Unfortunately, Pinch Draw is currently only compatible with VR headsets, so it won’t translate directly to my project idea. That’s where I think some of the technical complexity comes in: I will probably have to write my own custom program.

Hardware

  • Projector
  • Cube (made from white foam core)
  • Stand for cube
  • Stepper motor
  • Arduino
  • Leap Motion Sensor
  • Power supply

Software

  • Uniduino
  • Unity
  • Firmata for Arduino
  • Arduino

Order of Construction and Testing

  1. Order supplies and follow 3D drawing tutorials for Unity
  2. Connect projector to computer and figure out dimensions/projection logistics for program
  3. Build projection cube
  4. Use Firmata and Uniduino to control Arduino and motor based on Unity output
  5. Put whole project together: project Unity game onto cube, have cube respond to hand gesture commands, finalize user interface
  6. Information poster and artist’s statement for final show
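
One piece of step 4 is converting an orientation sent from Unity into motion for the stepper that turns the cube's stand. A minimal sketch of the angle math, assuming a common 200-step (1.8°) motor without microstepping:

```cpp
#include <cmath>

// Assumed motor: 200 full steps per revolution (1.8 degrees per step).
constexpr int STEPS_PER_REV = 200;

// Convert an angle in degrees into the nearest whole number of steps.
long angleToSteps(double degrees) {
    return std::lround(degrees / 360.0 * STEPS_PER_REV);
}

// Shortest signed rotation (in degrees) from the current to the target angle,
// so the cube never spins the long way around.
double shortestDelta(double currentDeg, double targetDeg) {
    double d = std::fmod(targetDeg - currentDeg, 360.0);
    if (d > 180.0)  d -= 360.0;
    if (d < -180.0) d += 360.0;
    return d;
}
```

Unity (via Uniduino/Firmata) would send the target angle, and the Arduino would step `angleToSteps(shortestDelta(current, target))` steps in the indicated direction.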

Final Project Proposal: Transistor Prop

Project fundamentals

My final project will be a cosplay prop that uses phys comp fundamentals to bring the prop closer to its functions/behaviors in the original work, enhance otherwise static features, and draw others to engage with the great works these props come from.

Phys Comp Components: Light sensors, IR sensors, mp3 shields, momentary switches, LED light strips

Things that a prop can be modified to do: light up (different patterns of lighting, color, and intensity), make sounds/play dialogue, and change configuration (physical changes, like adding/removing armor, the Psycho-Pass gun, etc.)

Besides adding to the list of things the prop does, I also want to think more about making a meaningful interaction between user and prop, perhaps through symbolism, theme, addition of a custom mode/feature.

Transistor Sword:

Red’s sword already has several tutorials that incorporate physical computing elements. I really love this game, and it means a lot to me, so I want to move forward with this proposal, though I understand it has already been done well.

Here’s one good example of a tutorial for making a phys comp version of the Transistor sword: http://chrixdesign.blogspot.com/2016/06/transistor-sword-closer-look-at.html

Chris’s version of the sword has two glow modes and is sound-reactive. However, she doesn’t use an Arduino, and instead hacked the project together using circuits and components from other devices.

Input = sound

Output = LED light states

Proposal:

Interaction 1: Install a pulse sensor in the handle, and outline where a person’s hand should go.

Input = pulse sensor

Output = LED light strips PULSE in time with heartbeat detection
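
A simple way to make the strips pulse with the heartbeat is a flash-and-decay brightness envelope driven by the time since the last detected beat. A sketch in plain C++; the decay constant is a guess to be tuned on the real strip:

```cpp
#include <cmath>
#include <cstdint>

// How quickly each flash fades, in milliseconds (assumed; tune by eye).
constexpr double DECAY_MS = 300.0;

// Brightness for the LED strips: full brightness at the instant a beat is
// detected, decaying exponentially until the next beat resets the timer.
uint8_t beatBrightness(double msSinceBeat) {
    double level = std::exp(-msSinceBeat / DECAY_MS);  // 1.0 at the beat
    return static_cast<uint8_t>(255.0 * level);
}
```

The main loop would reset `msSinceBeat` to zero whenever the pulse sensor fires, so the sword visibly throbs at the wearer's heart rate.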

 

Interaction 2: Corrupt Mode via button

Input: downward pressure (pressing the sword into the ground)

Output: LED strip color change into corrupt mode (green is normal, red is corrupt), maybe pulse red

A state machine will recognize whether the sensor or the switch is activated and change the sword’s light color to show its mode.
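
That state machine could be sketched like this (plain C++ rather than a full Arduino sketch; the rising-edge check is an added detail so that holding the sword against the ground doesn't re-toggle the mode every loop):

```cpp
// Two-mode state machine for the sword: Normal (green) vs Corrupt (red).
// Pressing the sword into the ground (the downward-pressure switch) toggles
// into Corrupt mode; pressing again toggles back. The pulse sensor only
// drives the pulsing animation, not the mode.
enum class SwordMode { Normal, Corrupt };

struct SwordState {
    SwordMode mode = SwordMode::Normal;
    bool lastSwitchReading = false;  // previous reading, for edge detection

    // Call once per loop with the current switch reading; returns the mode.
    SwordMode update(bool switchPressed) {
        if (switchPressed && !lastSwitchReading) {  // rising edge: new press
            mode = (mode == SwordMode::Normal) ? SwordMode::Corrupt
                                               : SwordMode::Normal;
        }
        lastSwitchReading = switchPressed;
        return mode;
    }
};
```

The returned mode would then select the LED color (green for Normal, red for Corrupt) each time the strips are refreshed.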

 

I will try to think of ways to make this more interactive, but the purpose of this project is purely entertainment-focused: a cosplay prop. As a prop, its function can remain purely aesthetic, since most props are created to replicate the original work as closely as possible and to supplement a person’s cosplay. Giving it the ability to respond to outside stimuli will bring it closer to its in-game function, further pushing its resemblance to the original work.

 

Other Alternatives Considered:

Overwatch Heroes

League of Legend Heroes

Psycho-Pass gun: yes, but lots of Dremel work

Assignment 6 Updated- Proposal: Elton Spektor

For my final project I want to make a personal assistant that doesn’t actually communicate direct information, like weather on a screen or traffic on a map. I want the personal assistant to articulate its messages through color and sound, so that the user and the device understand each other without any words, digital or printed.

I want the enclosure to look like a mini arcade box like this:

There will be a few buttons used for controls like power, time of day, weather, traffic, and playing music. I am thinking of using MIT’s App Inventor to control the device over Bluetooth using an HC-06 module and an Android phone. This app will have the buttons needed to control the device’s different options.

I will laser-cut the weather and traffic designs and install them, with the NeoPixel ring behind tinted acrylic, so they are only visible when they need to be on. The Android phone will be near the device and connected by Bluetooth, so a user can walk around and control the device. If I run into difficulty with the app, I will just use physical buttons to control the modes.

Sensors/Materials

  • 12 LED NeoPixel ring
  • Power switch
  • 4 momentary push buttons or an Android
  • RGB LED strip
  • Individual 5 mm RGB bulbs
  • SD card reader
  • Speaker

Steps of My Project

  1. Get the code and wiring for the startup LED strip working.
  2. Get the NeoPixel ring coded and wired.
  3. Make the Android app.
  4. Get an HC-06.
  5. Get the modes communicating with the Arduino.
  6. Get the Bluetooth app to change between a few modes.
  7. Incorporate that with the weather, traffic, and music.
  8. Get the speaker playing from the SD card.
  9. Laser-cut the weather, traffic, and NeoPixel ring enclosures.
  10. Build the full enclosure and wire everything into it.
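
Steps 5–7 hinge on a very small command protocol between the app and the Arduino; one character per button press over the HC-06 serial link would be enough. A sketch, with the command letters as placeholder assumptions rather than a finalized protocol:

```cpp
// Modes the assistant can be in, one per app button.
enum class AssistantMode { Off, Time, Weather, Traffic, Music };

// Map a single command byte received over the HC-06 serial link to a mode.
// The letters are assumed: P = power off, T = time, W = weather,
// R = traffic ("road"), M = music. Unknown bytes fail safe to Off.
AssistantMode parseCommand(char c) {
    switch (c) {
        case 'P': return AssistantMode::Off;
        case 'T': return AssistantMode::Time;
        case 'W': return AssistantMode::Weather;
        case 'R': return AssistantMode::Traffic;
        case 'M': return AssistantMode::Music;
        default:  return AssistantMode::Off;
    }
}
```

On the Arduino, the loop would read one byte at a time from the Bluetooth serial port, pass it through `parseCommand`, and light the matching laser-cut design through the NeoPixels.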