sweetcorn-SoliSandbox

Dog Lullaby

I have created a lullaby generator for children to feel loved and lulled as they fall quickly asleep. It features an animated dog, generated lullaby lyrics with rhymes, and a generated melody that children can alter using the special Soli sensor.

Above is a concise demo of the app in use. It normally runs for a random number of lines, from 35 to 45, but for the purposes of a short video, only a few are shown.

Swiping left increases the pitch of the left note, swiping up the pitch of the center note, and swiping right the pitch of the right note. A tap mutes the song, and a second tap un-mutes it. If you feel inclined to test the app on a desktop, the swipes map to the arrow keys (a left key press is equivalent to a left swipe, etc.).
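Below is a minimal sketch of that desktop fallback in p5.js, with hypothetical variable and handler names rather than the project's actual code:

```js
// Hypothetical state: MIDI pitches for the left, center, and right notes.
let pitches = [60, 64, 67];
let muted = false;

function keyPressed() {
  if (keyCode === LEFT_ARROW)  pitches[0]++; // left swipe raises the left note
  if (keyCode === UP_ARROW)    pitches[1]++; // up swipe raises the center note
  if (keyCode === RIGHT_ARROW) pitches[2]++; // right swipe raises the right note
}

function mousePressed() {
  muted = !muted; // tap gesture toggles the mute state
}
```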

The dog and border are animated using p5.js, the lyrics are generated using RiTa.js, and the song itself is generated using Tone.js. It was a pleasure becoming more familiar with the latter two. RiTa.js is an excellent tool for generative poetry, and I’ve only just begun to experiment with it. I had originally been using p5.sound to generate the song, but found the quality terrible and switched to Tone.js. My life instantly became easier, and I look forward to experimenting more with it as well.
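As a flavor of what RiTa.js makes easy, here is a minimal sketch of rhymed line endings (assuming the RiTa v2 API; the project's actual grammar is more elaborate):

```js
// Pick a random noun, find its rhymes, and build a couplet around the pair.
const ending = RiTa.randomWord({ pos: "nn" }); // e.g. "moon"
const rhymes = RiTa.rhymes(ending);            // words that rhyme with it
const rhyme  = rhymes[Math.floor(Math.random() * rhymes.length)];
const couplet = `Sleep now beneath the ${ending},\nand dream about the ${rhyme}.`;
```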

The song is composed of one enveloped oscillator that creates a bell-like melody, one modulated oscillator that plays a random harmony based on the melody, and a synthesizer that plays a random bass sequence, also based on the melody. It runs in 3/4 time at 90 bpm, as close to a lullaby as I could get.
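A minimal sketch of the bell-like melody voice, assuming Tone.js v14 (not the project's actual code):

```js
// A percussive sine envelope reads as a bell; the melody loops in 3/4 at 90 bpm.
const bell = new Tone.Synth({
  oscillator: { type: "sine" },
  envelope: { attack: 0.005, decay: 0.6, sustain: 0, release: 1.4 }
}).toDestination();

Tone.Transport.bpm.value = 90;
Tone.Transport.timeSignature = 3; // three beats per measure

const melody = new Tone.Sequence(
  (time, note) => { if (note) bell.triggerAttackRelease(note, "8n", time); },
  ["C5", "E5", "G5", null, "E5", "G5"], // placeholder notes; the app randomizes these
  "4n"
);
melody.start(0);
Tone.Transport.start();
```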

In the future I hope to animate the tail and eyes of the dog more specifically. I also hope to refine the sounds—adding more dimension and nuance to something that should sound simple and lovely. A lovely thing I feel obligated to note: I experimented with this app around my dog (his name is Scout) and he immediately ran toward it. I suppose my “whine” instrument was more accurate than I had expected ;~0. I have yet to test the app with cats and children, but I have high hopes!

Good night,

sleep tight!

Don’t let the bed bugs bite!

xoxoxo

~sweetcorn

yanwen-SoliSandbox

DAGWOOD is a sandwich generator that creates random Dagwood sandwiches with layers of food or non-edible stuff. Using the two Soli interaction methods, swipe and tap, users can add ingredients to the sandwich or swap out the last ingredient if they don’t want it.
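The core interaction boils down to a stack. Here is a minimal sketch in p5.js, with hypothetical handler names rather than the project's actual code:

```js
let sandwich = [];
const ingredients = ["🍅", "🧀", "🥓", "🥬", "🧦"]; // edible and otherwise

function onSwipe() {
  sandwich.push(random(ingredients)); // swipe adds a layer; p5's random() picks from an array
}

function onTap() {
  sandwich.pop();                     // tap swaps out the last layer
  sandwich.push(random(ingredients));
}
```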

After generating their sandwiches, users can click on the “TWEET” button to let the Dagwood bot send out an emoji version of the sandwich, or click “RESTART” to build a new sandwich.

I tried to keep the gameplay minimal and gradually added features to the project. One feature I eventually gave up on was an image bot: it would have taken quite some time to figure out image hosting and tweeting with images, and p5’s saveCanvas() function doesn’t work very well on the Pixel 4. I switched to the emoji bot instead, and also tried to add a retweet-when-mentioned applet that I found on IFTTT (it worked twice and then somehow broke :’ ))).

Code | Demo – works on Pixel 4 and laptops (view at the Pixel 4 screen size; ingredient sizes are a bit off)

🤖 ENJOY MAKING YOUR DAGWOOD SANDWICHES : D! 🥪

lampsauce-SoliSandbox (3/3)

Breadcrumb Trails

This project uses Soli swipe events to navigate a filepath and see the breadcrumbs. I initially wanted to pursue this idea using d3.js, but I was unable to configure it correctly, so this was made with p5. The data comes from d3.js's examples (flare.json). The up and down swipes change the current branch to a sibling branch, whereas the left and right swipes move to the parent or a child branch, respectively. I also wanted to connect this filepath navigation to p5.speech() so users could change the entries in the selected field. However, given my struggles getting p5.speech() to work in another Soli project, I decided to leave this project as it is.
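A minimal sketch of the navigation logic, assuming each node of flare.json has been wrapped with parent/children references (hypothetical field names):

```js
let current; // the currently selected node

function keyPressed() {
  const siblings = current.parent ? current.parent.children : [current];
  const i = siblings.indexOf(current);
  if (keyCode === DOWN_ARROW) current = siblings[(i + 1) % siblings.length];                    // next sibling
  if (keyCode === UP_ARROW)   current = siblings[(i - 1 + siblings.length) % siblings.length];  // previous sibling
  if (keyCode === LEFT_ARROW  && current.parent)           current = current.parent;            // up to the parent
  if (keyCode === RIGHT_ARROW && current.children?.length) current = current.children[0];       // down to the first child
}
```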

Project => ( app | code )
*use arrow keys in place of swipe events

lampsauce-SoliSandbox (2/3)

3D Construction Toy

This project allows users to use Soli swipe events to describe various transformations on a cube in 3D space. The project was built using three.js, and I learned a lot about the THREE.Matrix4 object. I initially copied some collision-detection code from 2013, but it was really buggy, so I wrote my own, which computes each cube's axis-aligned bounding box (AABB) and tests the boxes for overlap. Since each cube must pass through 6 transformation filters, the program requires a lot of swipe gestures. To make the program easier to use and more toy-like, the number of cubes is limited to 4. Personally, I could not get the Soli tap event to work on my phone, which is why I relied only on swipe events. Perhaps using tap to drop a cube could make the interaction less awkward.
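For reference, three.js has built-ins that make the AABB test short. A minimal sketch (the post's own implementation may differ in detail; cubeA and cubeB stand in for two of the cube meshes):

```js
// Compute world-space bounding boxes for two cube meshes and test for overlap.
const boxA = new THREE.Box3().setFromObject(cubeA);
const boxB = new THREE.Box3().setFromObject(cubeB);
if (boxA.intersectsBox(boxB)) {
  // collision: reject or undo the transformation that caused the overlap
}
```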

Project => ( app | code )
*use arrow keys in place of swipe events

Versions / Aesthetics I tried out

lampsauce-SoliSandbox (1/3)

Tile World Builder

Example of creative possibilities with tile world builder
This project allows users to create a world using Everest Pipkin's city game tileset. It was built with p5.js and p5.speech(). Soli swipe events let users move the cursor, and parsed speech from p5.speech() lets them place custom tiles. However, at the time of making this, Soli Sandbox did not support microphone access. To get the filmed version working, I used two devices: the Pixel 4 updated the cursor, and my laptop's microphone updated the world. The two devices were networked via socket.io.
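A minimal sketch of that two-device relay, with hypothetical event and function names rather than the project's actual code:

```js
// server (Node.js): relay recognized words between clients
const io = require("socket.io")(3000);
io.on("connection", (socket) => {
  socket.on("tileWord", (word) => socket.broadcast.emit("tileWord", word));
});

// laptop client: forward speech parsed by p5.speech
speechRec.onResult = () => socket.emit("tileWord", speechRec.resultString);

// phone client: place the named tile at the Soli-controlled cursor
socket.on("tileWord", (word) => placeTile(word, cursorX, cursorY));
```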

Browser version => ( app | code )
*use arrow keys in place of swipe events
Phone version => ( app | code )
^^ this version is really jank

marimonda – SoliSandbox

This project is a landscape generator: the idea is that a person can traverse the environment and take images, much as a sightseer would. I am very happy with the improvements I made to this project. I was able to get a variety of gestures working to personalize these environments (adding parallax, choosing colors, playing around with iterations, etc.), and I am really into the idea of making it into an actual application that lets you take photos! So that was fun.

The main thing I learned about my process in making this: it seems I am still a toddler without developed object permanence, because when I don’t see the console logs, I simply forget they exist. That was my poor attempt at a joke, but I did realize I have bad debugging habits, and that cost me a lot of time while making this. Soli Sandbox also has some weird quirks in how it deals with data (images) downloaded from the app: you can’t use save() or saveCanvas() to download an image (everything seems to go into the app’s data folder, and I didn’t want to mess with folders I don’t have permission to view). To get around this, I created a Twitter bot to post some images, though dealing with the Imgur API has been painful.

Play with it here

Project link

EDITS:

– I added subtitles to the videos above. I might re-edit them so that the subtitles are baked into the video rather than just a YouTube feature.

– I made the tap gesture take the snapshot of the image, and made the screen tappable as well, so it feels more like taking a picture with a camera. 😀

– I removed the Twitter API from the documentation/code until I figure out how to make it work properly with Soli in terms of images. I will eventually update this with the working code!

Check the Twitter to see images generated by the swipe-down gesture:


tale-SoliSandbox

New Media Experience: Virtual Squash

( code | app )

New Media Experience: Virtual Squash is an app that recreates the experience of practicing squash in a squash court, using the Soli sensor on the Google Pixel 4.

I made this app to motivate myself to work out in the COVID-19 era.

Overview:

I made a virtual squash game that can be played with the Soli sensor. Left and right swipes mimic squash swings, and a tap reclaims the ball. Every time the ball hits the front wall, the number of hits in the top left corner increments by one. Similarly, every time the ball is reclaimed, the number of resets below the number of hits increments by one.
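A minimal sketch of that scoring logic (hypothetical names, not the project's code):

```js
let hits = 0, resets = 0;

function onFrontWallHit() { hits++; }                  // called from the ball's physics update
function onTap()          { resets++; respawnBall(); } // tap reclaims the ball

function drawHUD() {
  text(`hits: ${hits}`, 10, 20);     // top left corner
  text(`resets: ${resets}`, 10, 40); // directly below the hit count
}
```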

Reflection:

Through this project, I learned how to create space with WEBGL in p5.js, manipulate the camera() function to change the viewpoint, and play a sound file every time a certain action takes place.
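A minimal p5.js sketch of those three techniques together (the sound file and handler names are placeholders):

```js
let thud;
function preload() { thud = loadSound("thud.mp3"); } // hypothetical sound cue

function setup() { createCanvas(400, 400, WEBGL); }

function draw() {
  background(200);
  camera(0, -100, 300, 0, 0, 0, 0, 1, 0); // eye position, look-at point, up vector
  box(50); // stand-in for the court geometry
}

function onWallHit() { thud.play(); } // cue the sound when the ball hits the wall
```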

Quick Demo Video:

~35-minute-long video of playing squash with Soli:


* I used an aluminum bottle so that Soli could detect my motion better. As you can see throughout the video, I noticed Soli recognized almost every move when the bottle was in a certain position and moving across the screen at a certain speed, so I tried to swipe at that fixed position and speed.

Fast-forwarded version of the video above:

mokka – checkin

Whack-a-Mole: It might be pretty self-explanatory, but I wanted the tap movement to emulate the idea of “whacking” the mole. Or perhaps I may keep the whack-a-mole concept but metaphorize it into something else, like whacking a cockroach/bug.

Problems:

  • Wanting a title/start page
  • Mole holes don’t center or resize along with the canvas (see the sketch below for one possible fix)
  • How to implement an image while also being able to resize it (I want to make my own pixel characters as moles)
  • Overall, visually stunted.
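One way to address the centering/resizing problem is to derive positions and sizes from width and height instead of fixed pixels. A minimal sketch (not the project's code):

```js
function draw() {
  background(120);
  const d = width * 0.15;                // hole diameter scales with the canvas
  for (let i = 0; i < 3; i++) {
    const x = width * (0.25 + i * 0.25); // evenly spaced across the middle
    ellipse(x, height / 2, d, d);
  }
}

function windowResized() {
  resizeCanvas(windowWidth, windowHeight); // re-fit the canvas; positions follow
}
```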

gregariosa – checkin

Code

For this project, I wanted to create an alarm that reacts to the range of conscious states you have in the morning.

First, the unconscious: you wake up in the morning to a phone spewing information. Nothing registers in your head because you’re still trying to wake up. Hence, the phone spews out meaningless information as well.

Second, the conscious: you grab your phone for some new information. Things start to make sense as you expose yourself to the news of the world. Hence, the phone reacts by reciting the most recent news of the day.

Lastly, the dismissal: you go about your day. Hence, when you leave the room, the phone dismisses itself.

(Made with the New York Times API, RiTa.js, and p5.speech.js. Still need to create better transitions and visual interest…)
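A minimal sketch of the news-reciting step (the API key is a placeholder; the actual app does more with RiTa.js in between):

```js
// Fetch the NYT top stories and speak the most recent headline with p5.speech.
const speech = new p5.Speech();
const url = "https://api.nytimes.com/svc/topstories/v2/home.json?api-key=YOUR_KEY";

fetch(url)
  .then((res) => res.json())
  .then((data) => speech.speak(data.results[0].title));
```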

Toad2 – soli-checkin

Concept: Allow the user to experience the joy of turning over rocks in a garden, finding beetles, and watching them scuttle to a nearby shadow (the user’s hand or back under the rock they came from).

  • Reach In – beetles scuttle to the user’s hand-shadow location
  • Reach In Too Fast – beetles scatter off screen (they think they’re being crushed)
  • No Presence – beetles wander idly across the screen
  • Swipe – roll the rock according to the swipe direction
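A minimal sketch of the scuttle behavior in p5.js (hypothetical names and thresholds, not the project's code):

```js
class Beetle {
  constructor() { this.pos = createVector(random(width), random(height)); }

  update(shadow, handSpeed) {
    if (shadow && handSpeed > 20) {
      this.pos.add(p5.Vector.sub(this.pos, shadow).setMag(8)); // reached in too fast: flee
    } else if (shadow) {
      this.pos.add(p5.Vector.sub(shadow, this.pos).setMag(2)); // scuttle toward the shadow
    } else {
      this.pos.add(p5.Vector.random2D());                      // no presence: wander idly
    }
  }
}
```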

In progress: Turning the rock over, implementing a way to show multiple species of the same beetle, allowing the beetles to hide under the rock instead of the user’s hand, and connecting it to Soli

Future: Improving beetle physical and movement details, adding more details to the environment, and adding a trail of mud where the rock used to be?