Heartbeat Pup is a soft good built for overly anxious children. It senses the pulse of its human companion and recreates it as a mirrored heartbeat through a physical simulation inside the stuffed animal. If the child's heart rate is detected to be above resting, the Pup's own heartbeat gradually slows until both the Pup's and the child's heart rates are at rest.
The Pup becomes a grounding mechanism for managing anxiety by creating a movement or feeling that anchors its companion's awareness back to the environment.
How it works:
The child's pulse is sensed through the pulse sensor in the Pup's nose. Based on the readings, the Pup's heartbeat (created by the movement of a speaker's cone as electricity pulses through it) will either operate at a gradually lowering rate or mirror the child's heartbeat.
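The calming behavior described above could be sketched roughly like this. The resting rate and step size here are illustrative assumptions, not the project's actual values:

```cpp
#include <cassert>

// Hypothetical sketch of the Pup's calming logic. RESTING_BPM and
// STEP_BPM are assumed numbers chosen for illustration.
const int RESTING_BPM = 70;  // target resting heart rate
const int STEP_BPM    = 2;   // how much the Pup slows per update

// Returns the Pup's next playback BPM. If the child is above resting,
// the Pup beats slightly slower than its current rate, easing toward
// RESTING_BPM; otherwise it simply mirrors the child's pulse.
int nextPupBpm(int childBpm, int currentPupBpm) {
    if (childBpm > RESTING_BPM) {
        int slower = currentPupBpm - STEP_BPM;
        return (slower > RESTING_BPM) ? slower : RESTING_BPM;
    }
    return childBpm;  // mirror directly at or below resting rate
}
```

On an Arduino, the returned BPM would set the interval at which the speaker cone is pulsed.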
This was a project I'd been thinking about for quite a few months. Overall, I'm very happy with the execution and craft of both the program (coming from a beginner level) and the design. The biggest issue I faced was how to clean and process the values from the sensor to get what I wanted. Luckily, I was able to find well-documented resources at https://pulsesensor.com/ that worked through the signal processing in an interrupt file, which was exactly what I needed (linked in the program as well).
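A very loose sketch of the beat-detection idea behind that interrupt-driven signal processing: detect when the waveform rises through a threshold, measure the inter-beat interval (IBI) in milliseconds, and convert it to BPM. The threshold value below is illustrative, not taken from the real library:

```cpp
#include <cassert>

// Illustrative threshold near the midpoint of the 0-1023 analog range.
const int THRESHOLD = 550;

// A "beat" begins when the pulse waveform rises through the threshold.
bool crossedThreshold(int prevSample, int sample) {
    return prevSample < THRESHOLD && sample >= THRESHOLD;
}

// 60,000 ms per minute divided by the ms between beats gives BPM.
int bpmFromIbi(int ibiMs) {
    return 60000 / ibiMs;
}
```

The real pulsesensor.com code runs this kind of check inside a timer interrupt and also smooths the IBI over several beats, which helps with the sensitivity issues mentioned below.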
All files, documentation, and process:
To re-iterate the project intention:
The intention of this project is to develop a children’s soft good that serves as a grounding mechanism for young children who are overly anxious. Grounding mechanisms are used to manage anxiety by creating a movement or feeling to anchor someone back to the environment.
I am using a pulse sensor and a speaker for this project at the moment; I became interested in the tangible movement the speaker generates when electricity runs periodically through it, and used it as the output for the pulse readings.
Overall, this checkpoint was a success in getting a basic program and functionality to work. I experienced a few problems reading the pulse sensor, as the data is overly sensitive, and am still working on making this more fluid. I am also experiencing a slight inconsistency with the beating; this is likely due to code troubles that will need a bit more time debugging.
Buggy heart-beat reading; using an LED
Buggy heart-beat reading; this one’s just fun
Next steps, after cleaning the program and establishing something more robust, would be to start housing the electronics and placing them inside a stuffed creature. I am unsure whether I will purchase a stuffed animal or create my own; I'd first like to define where to place the pulse sensor so that it would be most effective.
video 1 (apologies for background noise)
For this project, I integrated different musical melodies with objects around us. I wanted to use a set of objects with different weights, categorize them, and then produce a separate musical identity for each object. While there is a range of objects, only three sensors are available, so different combinations of objects produce slightly different experiences.
For the practice assignment, I used the playMelody Arduino tutorial as the tune set off by the ball and as the original backbone for the program (the weight is captured by an FSR). However, I needed to change the program heavily to adapt it for three sensors. The biggest challenge I faced was wiring the FSRs appropriately to get solid and consistent readings.
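One way the three-sensor adaptation might be structured (the threshold and the first-pressed-wins rule here are my own illustrative assumptions, not the actual program):

```cpp
#include <cassert>

// Raw analog reading that counts as an object "pressed" onto an FSR.
// This threshold is an assumed value for illustration.
const int FSR_THRESHOLD = 200;

// Given the three FSR readings, return the index (0-2) of the first
// sensor with an object on it, or -1 if none. Each index would map to
// one of the three melodies.
int melodyForReadings(const int readings[3]) {
    for (int i = 0; i < 3; ++i) {
        if (readings[i] > FSR_THRESHOLD) return i;
    }
    return -1;  // no object placed on any sensor
}
```

In the Arduino loop, `analogRead()` on each FSR pin would supply the readings, and the returned index would select which melody to play.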
Overall, I’m very happy with how the project turned out and learned a lot about sensors as well as programming this time around. If I were to improve, I would like to fine-tune the thresholds of the sensors and weights and also develop the program to play melodies in parallel with one another.
documentation, fritz, and code
For this upcoming project, I want to integrate the experience of music with objects around us. I want to use a set of objects with different weights, categorize them, and then produce separate musical identities of each object. I think I’ll do a ‘musical chairs’ situation (though not finite) where maybe 4 objects are categorized, but only 3 sensors are available. So, different combinations using the 4 objects produce slightly different beats.
For this assignment, I used the melody provided in the playMelody Arduino tutorial to be the tune set off by the ball (the weight is captured by an FSR). In the future, I’d like to find or produce different melodies (maybe one can be a bass, and another a higher toned sound) that could all potentially fit together well, and create a set of three sensors that could register the information.
One problem I came across was that the FSR I used was not sensitive enough. Even with the ball being a pretty weighty object, the sensor could barely register it. I think I'll need to find either a better sensor or much heavier objects.
For this assignment, I wanted to tackle the story of the Loneliest Whale and create a children’s mobile prototype that gives the whale friends!
By matching the tones of the whale's call using a potentiometer, you also match the level at which all the whales swim; matching the tones concludes the story of the Loneliest Whale by finally giving it companionship.
To create this project, I connected the servos that move together (the companion whales) so that they sit at their own level, controlled by a potentiometer, while the Loneliest Whale moves at a different level (on a separate servo). The entire mobile sits on a laser-cut disk so that when I hold the top of the motor, the motor's body (and thus the whole mobile) rotates instead; I couldn't figure out how to turn the whales without tangling the wires any other way. Finally, the potentiometer used for the companion whales is also connected to the output of one speaker (which emits a range of tones). The intention is that the Lonely Whale emits a single tone from another speaker, and the level at which the companion whales swim together corresponds to the tone of the Lonely Whale, so that their sounds match.
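The tone-matching part of this could be sketched as follows; the frequency range, the whale's call frequency, and the tolerance are all assumed numbers for illustration:

```cpp
#include <cassert>

const int LONELY_TONE_HZ = 440;  // illustrative frequency for the whale's call
const int TOLERANCE_HZ   = 15;   // how close counts as a "match"

// Rescale a 0-1023 potentiometer reading into a 200-800 Hz tone range
// (the same arithmetic as Arduino's map() function).
int potToFrequency(int potReading) {
    return 200 + (long)potReading * (800 - 200) / 1023;
}

// The player has matched the Lonely Whale when their tone is within
// the tolerance of the whale's fixed call.
bool tonesMatch(int playerHz) {
    int diff = playerHz - LONELY_TONE_HZ;
    if (diff < 0) diff = -diff;
    return diff <= TOLERANCE_HZ;
}
```

The same potentiometer reading would also drive the companion whales' servo level, which is what ties the pitch to the swimming height.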
The above was the ideal, and I can say that what I achieved was quite close, but it would require a bit more tweaking to reach higher fidelity.
The biggest problem I experienced was programming with millis() to avoid delay(). Overall, my process was to program each individual part (motor, servos, and sound) separately and then mesh them together at the end. Doing so meant I needed to avoid delay() altogether to have everything run smoothly.
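The non-blocking pattern this describes can be sketched as a small reusable helper: instead of delay(), each part fires only when enough time has passed since it last ran. Here `now` stands in for Arduino's millis(); the struct name and interval are my own illustration:

```cpp
#include <cassert>

// One independently timed task (e.g., the servo sweep or the tone).
struct Task {
    unsigned long lastRun;   // timestamp of the last run, in ms
    unsigned long interval;  // how often the task should fire, in ms
};

// Returns true (and records the time) when the interval has elapsed.
// Called every pass through loop(), so nothing ever blocks.
bool ready(Task &t, unsigned long now) {
    if (now - t.lastRun >= t.interval) {
        t.lastRun = now;
        return true;
    }
    return false;
}
```

With one Task per component (motor, servos, sound), the main loop just checks each in turn, which is how the separately written parts can mesh without delay().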
Another problem worth mentioning was not getting the motor to work toward the end: the tone() function I used caused problems on the PWM pins I had attached my motor to. This was a small but documented problem that I solved (with a lot of happiness)!
Confused but Irritated Faces
For this assignment, I wanted to transition a face from a neutral expression to an irritated expression by changing the states of each motor (6 in total, 2 for the mouth, 2 for the eyes, and 2 for the eyebrows). If the audience were to come near the photosensor, the motors would go to new positions to scrunch and look towards the direction of the sensor (maybe to warn the audience not to come near).
This project was relatively straightforward, but, interestingly, over time I ran into a few problems (still unresolved). My general process was to take each servo one by one to ensure each motor would respond to the sensor. The first time I put everything together, the servos worked correctly (see video), but after half an hour or so they started fidgeting. At first I thought it was the way I had programmed it, but over time I came to realize that the power needed to control six servos was too much. The second time around, whenever I assembled more than two together, they would fidget and not work correctly (with no changes made to the program). If I were to refine this, I would play with an alternate power source to avoid the fidgeting.
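The sensor-to-expression logic described above could look something like this; the light threshold and the angle values are illustrative guesses, not the project's actual numbers:

```cpp
#include <cassert>

// A photosensor reading darker than this means someone is close
// (blocking the light). Assumed value for illustration.
const int NEAR_THRESHOLD = 300;

// Each of the six servos has a neutral angle and an "irritated" angle;
// the expression scrunches when the audience comes near the sensor.
int servoAngle(int lightReading, int neutralDeg, int irritatedDeg) {
    return (lightReading < NEAR_THRESHOLD) ? irritatedDeg : neutralDeg;
}
```

Each servo (mouth, eyes, eyebrows) would get its own neutral/irritated pair, so all six can snap to the new expression from one sensor reading.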
Zip File: https://cmu.box.com/s/y8exprgw8plgt1gtg83e3b6j8i0cwl3i