The concept I set out to convey with my project was distractions and apathy preventing you from achieving what you want or need.
I set about doing this by creating a robot that meanders toward various targets, and you can interfere with and distract it along the way. In the end there is some unexpectedly interesting behavior, but for the most part it works.
The bot was controlled by a Raspberry Pi running OpenCV, and each target had an Arduino Pro Mini with a NeoPixel and an IR range sensor.
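The real vision code isn't included here, but the core of the tracking job is finding where a target sits in the camera frame. As a rough illustration of the idea (a pure-Python stand-in for what OpenCV's thresholding and moments would do; the function name and threshold value are made up):

```python
def find_target(frame, threshold=200):
    """Return the (row, col) centroid of pixels at or above `threshold`,
    or None if no pixel qualifies.

    `frame` is a 2-D list of grayscale values -- a stand-in for the
    camera image the Raspberry Pi would actually process with OpenCV.
    """
    row_sum, col_sum, count = 0, 0, 0
    for r, row in enumerate(frame):
        for c, val in enumerate(row):
            if val >= threshold:
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None  # target not visible in this frame
    return (row_sum / count, col_sum / count)
```

In practice the same thing falls out of `cv2.threshold` plus `cv2.moments`, but the logic is just "average the coordinates of the pixels that look like the target."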
While it mostly worked, there were still a couple of issues that I ran into. The internals of the bot are a mess: most of the space is allocated to a battery that I'm still having power-regulation issues with, so the bot may remain tethered to a power supply.
The idea I want to portray with my project is general apathy and undirectedness being a detriment. The way I want to show this is by creating a robot that has clear goals to move toward, but without outside intervention it moves very lethargically, making random detours or stopping outright, and is generally not very motivated.
However, motivation is not impossible to find, so if a viewer were to wave near the robot or nudge it in the right direction, it gets where it needs to be faster.
- A frame of some sort, probably laser cut in the end
- Motors, probably continuous rotation servos
- Various proximity sensors, PIR is right out.
- At least one raspi, with servo hat
- raspi camera
- A couple of Arduinos, lots of LEDs
Obviously, as I prototype, this is all very subject to change.
- OpenCV on the raspi
- Will need to be able to identify its targets, and then not only go to them, but also deliberately not go to them (the lethargic wandering).
- Also needs to react to stimuli in a convincing and understandable way.
The first thing I'll need to get working is tracking the targets, because that's the most complex part, and if it doesn't work well I'll need time to come up with some other strategy. After that comes building the robot, making it move to the targets, integrating the stimuli, and then allowing for multiple targets and signifying which one is active.
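The planned behavior can be sketched as a toy model: the bot mostly stalls or wanders, and a viewer's wave or nudge temporarily restores its motivation. Everything here (the class name, decay rate, step sizes) is illustrative, not real firmware:

```python
import random

class ApatheticBot:
    """Toy model of the planned behavior: the bot drifts toward a target,
    but slowly and with random detours, unless a viewer 'motivates' it."""

    def __init__(self, target=100.0):
        self.position = 0.0
        self.target = target
        self.motivation = 0.2   # 0..1; decays back toward apathy over time

    def nudge(self):
        """A viewer waving near the bot or nudging it restores motivation."""
        self.motivation = 1.0

    def step(self):
        if random.random() > self.motivation:
            # Unmotivated: stall or take a small random detour.
            self.position += random.uniform(-0.5, 0.5)
        else:
            # Motivated: take a real step toward the target.
            self.position += 1.0 if self.position < self.target else -1.0
        # Apathy creeps back in regardless.
        self.motivation = max(0.05, self.motivation * 0.98)
```

With `motivation` at 1.0 the bot moves purposefully every step; as it decays, the random detours dominate and progress slows to a crawl, which is exactly the effect I'm after.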
For this assignment, I set out to create a two-voice synth/sequencer. A major goal of mine was to be able to create each of the basic waveforms (I ended up with sine, square, triangle, and noise), so I made two R2R ladder DACs, each with a controlling Arduino Pro Mini. The two Pro Minis were in turn controlled by an Uno that told them what waveform to use and what note to play. The Uno had arrays of notes that it would tell each Pro Mini to play, and I threw together an interface for inputting them.
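The firmware itself isn't shown here, but the usual way to drive an R2R ladder DAC from a small microcontroller is to step through a precomputed one-period lookup table at a rate proportional to the note's frequency. A hedged sketch of how the four tables might be generated (table size and names are my assumptions, shown in Python rather than the Arduino C it would really live in):

```python
import math
import random

TABLE_SIZE = 256  # one waveform period per table
MAX_VAL = 255     # 8-bit R2R ladder -> output codes 0..255

def make_table(waveform):
    """Return one period of the named waveform as 8-bit DAC values."""
    table = []
    for i in range(TABLE_SIZE):
        phase = i / TABLE_SIZE  # 0.0 .. 1.0 through one period
        if waveform == "sine":
            v = (math.sin(2 * math.pi * phase) + 1) / 2
        elif waveform == "square":
            v = 1.0 if phase < 0.5 else 0.0
        elif waveform == "triangle":
            v = 2 * phase if phase < 0.5 else 2 * (1 - phase)
        elif waveform == "noise":
            v = random.random()
        else:
            raise ValueError(waveform)
        table.append(round(v * MAX_VAL))
    return table

# To play a note, the firmware steps through the table at
# frequency * TABLE_SIZE samples per second, writing each value
# to the port pins feeding the R2R ladder.
```

Swapping waveforms is then just swapping which table the playback loop reads from, which maps neatly onto the "Uno tells each Pro Mini what waveform and note" scheme.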
I've essentially learned that analog circuitry is really hard, especially on breadboards. I tried to do more thorough unit testing as I went, trying different values for the filters and making sure that I was getting the correct waveforms at the correct frequencies, but when it was all put together, it just kinda shat the bed. There were likely issues in the final signal combining and amplification, but then again, there were likely issues at every step. In the future I think I'll step away from using so much electronics and do something more material. This kind of project could have turned out well if I'd been able to put more time into it, but sadly I need to graduate.
At the very least, the sounds it makes are kinda spooky so it’s fitting for Halloween.
This tells a tiny bit of a story about a bull that hates everything, and so it rams into everything it sees. Oftentimes its hotheadedness is to its own detriment. It essentially has two states: looking for something to ram, and ramming.
I had a couple of issues that kept this from turning out the way I had hoped. I couldn't find a way to make controlled, repeatable motions with the motors, so any kind of story using the bot's motion was right out. The motors also don't have enough torque to move slowly, so what motion I could get out of them had to be quite fast and erratic. I had planned for two bots that would interact with each other, but I realized that we didn't have any distance sensors that don't interfere with each other, the way the IR and ping ones do. Overall, this turned out OK.
For my project, I made a dog/cat/some four-legged thing, up to the viewer's interpretation. The emotion portrayed is fear.
It starts off on alert, with pulsing orange lights. Once it calms down, the lights turn off. At this point, if it detects any motion it goes crazy until it wears itself out, and the cycle continues.
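That cycle is essentially a three-state machine. A minimal sketch of it, with made-up state names and tick counts standing in for whatever the real Arduino sketch uses:

```python
class FearfulPet:
    """Sketch of the described cycle:
    ALERT (pulsing orange lights) -> CALM (lights off) ->
    PANIC on motion -> wears itself out -> back to ALERT."""

    def __init__(self, alert_ticks=5, panic_ticks=8):
        self.state = "ALERT"
        self.alert_ticks = alert_ticks
        self.panic_ticks = panic_ticks
        self.timer = alert_ticks

    def tick(self, motion_detected=False):
        if self.state == "ALERT":
            self.timer -= 1
            if self.timer <= 0:
                self.state = "CALM"          # calmed down: lights turn off
        elif self.state == "CALM":
            if motion_detected:
                self.state = "PANIC"         # goes crazy
                self.timer = self.panic_ticks
        elif self.state == "PANIC":
            self.timer -= 1
            if self.timer <= 0:              # wears itself out
                self.state = "ALERT"
                self.timer = self.alert_ticks
        return self.state
```

On the real hardware each state would also drive the servos and NeoPixels (pulsing orange in ALERT, off in CALM, flailing in PANIC), but the state logic is this simple.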
There were a couple of technical issues that had to be solved, like the servos not playing nice with the NeoPixels on the Arduino at the same time (the NeoPixel library disables interrupts while updating the LEDs, which throws off the Servo library's timing), but nothing super serious.
video – https://drive.google.com/file/d/0B6bFUuOZ0DHJRjI0amFveDNjUnc/view?usp=sharing
code – https://drive.google.com/file/d/0B6bFUuOZ0DHJeXFSZGVZLTNXa1k/view?usp=sharing