Acrobotic gives you the power of telekinesis as you control a full-size acrobatic robot using your own body movements. The robot tracks your movements with a Microsoft Kinect v2 and uses that data to dance and sway with you. With specific hand gestures and the sway of your body, you can control how Acrobotic moves: try to rotate it as fast as you can, or try to balance it, it's your choice. The best part: you can do all of this without ever touching the robot.
Our previous flipping robots had an intriguing behavior that was difficult to interact with. As they flipped, you felt compelled to reach out and play with them, but there was no real way to do so. With Acrobotic, you can engage with that same intriguing motion without actually touching the robot.
Acrobotic is constructed from beams of 80/20 extruded aluminum and stands roughly four and a half feet tall at rest. A smaller robot is mounted on one of its arms and can rotate and flip independently of the rest of the structure. The main structure's rotation is driven by a steel plate mass that slides toward and away from the axis of rotation; because the mass can travel past the center of balance, it can drive the structure in either direction.
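To make the mechanism concrete, the torque gravity exerts on the structure can be written roughly as (a sketch only; the mass, offset, and angle are not values measured here)

\tau = m \, g \, d \, \sin\theta

where m is the moving steel mass, d is its signed distance from the axis of rotation, and \theta is the structure's angle from vertical. Sliding the mass across the axis flips the sign of d, and with it the direction of the torque, which is what lets the structure be driven either way.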
The structure's motion is controlled wirelessly from a computer with a Kinect sensor. The Kinect tracks the Y coordinate of the person's shoulders and infers from it how far the person is swaying. Holding up two fingers on your left hand extends the mass away from the center, while two fingers on your right hand pulls it back toward the center. Two Arduinos are required, one for each independent body of rotation, and the data is streamed from the Kinect through the computer to two Wixel wireless modules, one per axis of rotation.
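On the robot side, each Arduino simply sees whatever bytes its Wixel receives, since the Wixel acts as a transparent wireless serial link. A minimal sketch of one such receiver is below; the single-character commands ('o' to extend the mass, 'i' to pull it in), the servo on pin 9, and the positions written are illustrative assumptions, not the code used in the build.

#include <Servo.h>

// Hypothetical receiver for one axis of rotation. Commands sent from the
// Kinect computer arrive over the Wixel as ordinary serial bytes.
Servo massServo;              // assumed actuator that moves the steel mass

void setup()
{
  Serial.begin(9600);         // must match the Wixel's configured baud rate
  massServo.attach(9);
  massServo.write(90);        // start with the mass near the center of balance
}

void loop()
{
  if (Serial.available() > 0)
  {
    char command = Serial.read();
    if (command == 'o')       // e.g. two fingers on the left hand: extend the mass
    {
      massServo.write(150);
    }
    else if (command == 'i')  // e.g. two fingers on the right hand: pull the mass in
    {
      massServo.write(30);
    }
  }
}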
Independent Rotating Robot
One major problem in the world of physical computing and intelligent environments is the fact that most environments are not intelligent. We attempt to sense and interact with rooms and objects that cannot think for themselves, and we try to get around this by using a wide swath of external sensing to do most of the legwork.
Our project involves the use of external sensors to track people's locations within a room and extract information from their positions. Our case study uses a Microsoft Kinect to track people's positions in a computer cluster and build a list of available computers. A single Kinect maps the room, and machine learning is used to learn which seats/positions correspond to which computer.
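One simple way to picture that seat-mapping step is a nearest-centroid approach: seat positions learned from labeled Kinect observations are stored as centroids, and a tracked person is assigned to the closest one. The coordinates, threshold, and names below are purely illustrative; the project has not committed to a specific learning method.

#include <cmath>
#include <cstdio>
#include <vector>

// Hypothetical sketch: map a tracked (x, z) floor position from the Kinect
// to the nearest learned seat centroid, or report no seat if none is close.
struct Seat { int id; float x; float z; };   // centroid learned from labeled samples

int nearestSeat(const std::vector<Seat>& seats, float x, float z, float maxDist)
{
  int best = -1;
  float bestDist = maxDist;
  for (const Seat& s : seats)
  {
    float d = std::hypot(s.x - x, s.z - z);
    if (d < bestDist) { bestDist = d; best = s.id; }
  }
  return best;   // -1 means the person is not at any known seat
}

int main()
{
  // Example centroids (meters in the Kinect's floor plane); values are made up.
  std::vector<Seat> seats = { {0, 1.0f, 2.0f}, {1, 1.8f, 2.1f}, {2, 2.6f, 2.0f} };
  int seat = nearestSeat(seats, 1.7f, 2.2f, 0.5f);
  std::printf("occupied seat: %d\n", seat);    // prints 1 for this example
  return 0;
}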
The power of robotics is in creating autonomous robots that exhibit a behavior and respond to their environment. This is a study in creating an autonomous behavior. The Acrobot is a robot that understands its own position and responds accordingly by waving its arm back and forth. The Acrobot is also human-like in that it gets tired and requires a break once in a while. The Acrobot continues to exhibit the same behavior in various environments, autonomously or with human interaction.
In a choreographed dance, the leader Acrobot guides the follower Acrobot in its movement, creating a mesmerizing visual display. Because of the differences between the two Acrobots, the performance is not perfectly synced and the follower tends to step on the leader's toes. However, when they sync up, the results are fascinating.
The leader Acrobot uses a variety of sensors to achieve its behavior. An accelerometer mounted on the center beam of the device keeps track of the pitch of the robot, which is fed to a servo that controls the pivoting arm. The pivoting arm creates additive motion, allowing the robot's spin to accelerate. As the Acrobot flips, a hall-effect sensor counts rotations each time it passes over a neodymium magnet mounted on the stand. The Arduino program sets a random rotation target, and once those rotations have completed, the robot rests. The amount of rest is proportional to the number of rotations it just completed.
The follower Acrobot does not have any sensors attached to it; instead, it receives all of its data wirelessly from the leader Acrobot. The sensor data is transmitted over wireless serial between the two Acrobots using Wixel wireless transmitters. This lets the program run simultaneously on both Acrobots, producing some interesting results.
The Arduino gets its power from a 9V battery and supplies power to all of the sensors and the servo. The structure is made entirely out of OpenBeam, providing a very rigid and sturdy frame that lets the Acrobot spin freely without worrying about the structural integrity of the base.
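Since the Wixels appear to each Arduino as an ordinary serial port, the follower's side can stay very small. The sketch below shows one way its receive loop could look, assuming the leader writes its current arm angle as a single byte each time through its loop (e.g. Serial.write(95 + (int)pitch)); this message format is our assumption for illustration, not necessarily the one used.

#include <Servo.h>

// Hypothetical follower sketch: read the leader's arm angle from the
// Wixel serial link and mirror it on the local arm servo.
Servo myservo;

void setup()
{
  Serial.begin(9600);          // must match the Wixel's configured baud rate
  myservo.attach(9);
}

void loop()
{
  if (Serial.available() > 0)
  {
    int angle = Serial.read(); // one byte per update: the leader's servo angle
    myservo.write(angle);      // mirror the leader's arm position
  }
}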
Leader Acrobot Arduino Source Code
#include <Servo.h>            // servo that drives the pivoting arm

Servo myservo;                // arm servo on pin 9
int randRot;                  // random rotation target before the robot rests
int rotations = 0;            // rotations counted by the hall-effect sensor
int magnetState = 0;          // last hall-effect reading from A0
int x, y, z, gx, gy, gz;      // raw accelerometer and gyro readings
float pitch;                  // filtered pitch, updated by complementaryFilter()

// accel_t_gyro and complementaryFilter() are provided by the MPU-6050
// reading code that accompanies this sketch (not shown here).

void setup()
{
  myservo.attach(9);
  randRot = random(3, 20);
}

void loop()
{
  // Read the raw accelerometer and gyro values.
  y = accel_t_gyro.value.y_accel;
  z = accel_t_gyro.value.z_accel;
  x = accel_t_gyro.value.x_accel;
  gx = accel_t_gyro.value.x_gyro;
  gy = accel_t_gyro.value.y_gyro;
  gz = accel_t_gyro.value.z_gyro;

  complementaryFilter(x, y, z, gy, gz); // Filter pitch data from accelerometer

  if (analogRead(A0) == 0)              // Hall effect sensor is over the magnet
  {
    checkPrevious();
  }
  magnetState = analogRead(A0);

  if (rotations > randRot)              // Reached the rotation target: rest
  {
    myservo.write(95);                  // Hold the arm vertical
    delay(randRot * 1000);              // Rest proportional to rotations completed
    randRot = random(3, 20);            // Pick a new rotation target
    rotations = 0;
  }

  if (abs((int)pitch - 0) > 10)
  {
    myservo.write(95);                  // Vertical position
  }
  myservo.write(95 + (int)pitch);       // Swing the arm with the robot's pitch
  delay(10);
}

// Count a rotation only if the previous reading was away from the magnet,
// so one pass over the magnet is not counted more than once.
void checkPrevious()
{
  if (magnetState > 500)
  {
    rotations = rotations + 1;
    magnetState = analogRead(A0);
  }
  else
  {
    magnetState = analogRead(A0);
  }
}
As a group, we set out to create a physical representation of attraction. This representation took the form of two speakers that, when kept apart, create a fast, whiny, high-pitched sound. When the speakers are brought together and allowed to unite, they create a calmer, more peaceful atmosphere. Together, the speakers' sound resembles a calm heartbeat; pulled apart, the whiny sound returns.
It took us a while to get to this concept. We always knew we wanted to work with sound and music, but had a hard time fitting that into a one-in, one-out system that would be purely analog. Most of our ideas were too complicated and would have needed an Arduino to be built. We eventually settled on an idea that involved playing music from an iPod and passing a hand through a laser to start and stop the music. Although we thought this concept was pretty snazzy, we realized that it lacked a deeper meaning. From our previous brainstorming, we knew we wanted to keep music as our output and to continue incorporating movement in our project. At this point we had already invested in design materials for the casing, so we also knew we would have to keep a casing shape similar to our initial idea. We started playing around with different sensors to think of a new input and kept bouncing around ideas until we landed on our final project concept.
Considered a "One-In-One-Out" system, the input is the human touch either bringing the speakers together or pulling them apart, while the output is the sound created by the speakers. Because the sound produced does not feed back into the system, it is a direct pipeline from proximity, to electrical representation, and out as sound.
Our system consists of two speaker housings made from oak. Each box contains a circuit that makes an unpleasant noise when the housings are separated. Pieces of conductive material lie on the outside of each housing; when the two speakers come into contact, a bridge forms between the two circuits, ending the unpleasant noise.
Our circuit consists of two identical mini circuits that we soldered and housed in separate boxes. Each circuit contains an LM555 timer, a 5V step-up/step-down regulator, an audio amplifier, and a speaker. When the speakers are apart, the 5V supply (regulated from a 9V battery) powers the 555, which outputs a high-frequency tone. When the circuits are connected, each amplifier input (Ain) is tied to the other speaker's ground, and the speakers go silent.
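For reference, if the LM555 in each box is wired in the standard astable configuration (an assumption; the exact schematic is not reproduced here), the pitch of the whine is set by the two timing resistors and the timing capacitor:

f = \frac{1.44}{(R_1 + 2 R_2) \, C}

Smaller resistance and capacitance values push the tone higher and whinier.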