Problem: This turned out to be a hard assignment for me, since I had difficulty coming up with a problem to solve. I ended up considering “fun” in general, and how play would be different in a blind world, where sound would be the primary way to playfully engage and communicate with each other. That train of thought drew me to the floor-piano scene from the movie Big. The floor piano has no tactile feedback, lying flat on the ground, and is fun largely because multiple people can be on it at once. So my problem to solve became how to offer a different kind of creative fun using that musical structure. Probably too tall an order on my end.
Solution: Essentially, collaborative music visualization. Nothing novel, but it pushed me to understand the whole pipeline and actually learn p5.js, something I needed to do. Users interact with a visualization that responds to their keyboard inputs, reacting differently depending on any of several internal states. The goal was to capture the feeling of creating ripples in an existing system, where the same input matters differently at different times.
Proof of Concept:
A microswitch “keyboard” was built to handle inputs. Compared to a floor keyboard it is relatively miniature: foam keys rest on lever microswitches that report their state back to the board. In a final build, I envision RFID readers embedded in each key to determine who is pressing which key, highlighting the actual collaboration that could take place.
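On the browser side, the sketch needs to turn whatever the board sends into "which keys are down, and how many." As a sketch of that step, assume a hypothetical serial protocol in which the Arduino sends one line per update with one '0'/'1' character per microswitch (the function name and format are illustrative, not the project's actual code):

```javascript
// Hypothetical protocol: one line per update, e.g. "01001000",
// one '0'/'1' character per key, in pin order.
// Returns which key indices are pressed and how many.
function parseKeyStates(line) {
  const pressed = [];
  line.trim().split("").forEach((bit, i) => {
    if (bit === "1") pressed.push(i); // key index i is currently down
  });
  return { pressed, count: pressed.length };
}

// Example: two keys held down at once.
parseKeyStates("01001000"); // → { pressed: [1, 4], count: 2 }
```

The per-frame draw loop can then use `count` to scale the overall response and `pressed` to decide which part of the pattern each key drives.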
The keys then affect a pattern in a browser window based on which keys, and how many, are pressed. The visualization pushes back on the user(s) by becoming more “resistant” to input the more it receives, and less resistant the longer it sits unused. It also has time-dependent elements, such as color and frequency, that shift with how users interact with it. The code is partially based on an existing p5.js library, wavemaker.
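The resistance mechanic can be sketched as a tiny state update run once per animation frame. This is a minimal illustration of the idea, not the project's actual code; the names and constants are assumptions:

```javascript
// Illustrative "resistance" dynamics: each press adds energy but also
// raises resistance, so repeated presses make smaller ripples; idle
// frames let resistance decay back toward zero.
const GAIN = 1.0;    // base ripple strength of a single press (assumed)
const BUILDUP = 0.5; // how much each press raises resistance (assumed)
const DECAY = 0.9;   // per-frame decay factor while idle (assumed)

function makeSystem() {
  return { resistance: 0 };
}

// Called once per frame with the number of keys currently down.
// Returns the ripple amplitude to feed the wave visualization.
function step(system, keysDown) {
  if (keysDown > 0) {
    const amplitude = (GAIN * keysDown) / (1 + system.resistance);
    system.resistance += BUILDUP * keysDown; // input makes it more resistant
    return amplitude;
  }
  system.resistance *= DECAY; // an unused system relaxes over time
  return 0;
}
```

With these constants, the first press of a single key produces amplitude 1, an immediate second press produces 1/1.5, and so on, while a stretch of idle frames restores full responsiveness.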
With the RFID or other tracking mentioned above, this system could be extended to lean further into the feeling of collaborative creation I’m trying to capture: different users could apply different “heat” signatures to the waveforms, different speeds, or different interactions with each key or section.
The Arduino code is extremely simple, basically just reading and passing values; the bulk of the logic lives in the p5.js files. Code: lytle_crit1.
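The "reading and passing values" loop is simple enough to mock off-hardware. Below is a JavaScript stand-in for that logic (the real sketch would call `digitalRead` on each switch pin and `Serial.println` the result each `loop()`; `readPin` and the one-character-per-key format here are assumptions for illustration):

```javascript
// JS mock of the Arduino loop's logic: poll each microswitch and build
// the line that would be sent over serial. readPin is injected so the
// logic can be exercised without hardware.
function encodeKeyStates(readPin, numKeys) {
  let line = "";
  for (let pin = 0; pin < numKeys; pin++) {
    line += readPin(pin) ? "1" : "0"; // one character per key, in pin order
  }
  return line; // what Serial.println would emit each loop()
}

// Example: keys on pins 1 and 4 are held down.
encodeKeyStates((pin) => pin === 1 || pin === 4, 8); // → "01001000"
```

Keeping the board's job this dumb means all tuning of the interaction lives in the browser, where it can be iterated on without reflashing the Arduino.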