Student Area

gregariosa-SoliSandbox

Ambient News

‘Ambient News’ is a speculative news appliance for your home. In its passive state, it shows a nonsensical news headline. Only when you reach out with your hand does it reveal the real news hidden underneath.

The appliance is a reflection of our digital consumption habits. By layering nonsensical information over the real news, it tries to disrupt the passive media browsing we have grown used to.

Code

*Nonsensical news is generated using RiTa.js and the New York Times Newswire API.

‘Reach’ (or Up Arrow) to reveal the real news. ‘Reach out’ (or Down Arrow) to return to the fake version.

‘Swipe’ (or Left/Right Arrow) to browse other content.
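To make the arrow-key fallback concrete, here is a minimal p5.js sketch of the reveal/hide toggle. It is an illustration under assumptions, not the project's actual code: RiTa.randomWord() stands in for the real nonsense-generation logic, and realHeadline is a hypothetical placeholder for a headline fetched from the Times Newswire API.

```js
// Hypothetical stand-in for a headline fetched from the Times Newswire API.
let realHeadline = "Loading headline...";
let fakeHeadline = "";
let revealed = false;

function setup() {
  createCanvas(windowWidth, windowHeight);
  textAlign(CENTER, CENTER);
  textSize(24);
  fakeHeadline = scramble(realHeadline);
}

function draw() {
  background(revealed ? 255 : 20);
  fill(revealed ? 20 : 255);
  text(revealed ? realHeadline : fakeHeadline, width / 2, height / 2);
}

// Replace some words with random RiTa words to make the headline nonsensical.
function scramble(headline) {
  let words = headline.split(" ");
  for (let i = 0; i < words.length; i++) {
    if (random() < 0.5) words[i] = RiTa.randomWord();
  }
  return words.join(" ");
}

function keyPressed() {
  if (keyCode === UP_ARROW) revealed = true;    // 'Reach': show the real news
  if (keyCode === DOWN_ARROW) revealed = false; // return to the fake version
}
```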

Previous Iteration:

Screenshots (Before and After ‘Reach’)


axol-SoliSandbox

(backup link: https://vimeo.com/467470357)

Glitch Project Link: https://glitch.com/~a-nervous-chicken

Presentation mode: https://glitch.com/~a-nervous-chicken

It’s a chicken; it gets nervous when you get too close to it.

The chicken is constructed from particles and springs, which gives it its bouncy quality. Upon sensing human presence (the Soli presence event), the chicken perks up and becomes alert. You can slap the chicken (the Soli swipe event), and it will get shoved around and scream.

It’s a simple concept, and I wanted to focus on giving some character to the chicken and making it chubby and lovable. I created a throwaway program that I can use to design the chicken and export it as a JSON file, and then there’s the actual program executing all the particle and spring motion. I’ve learned a lot about particles and timing animations while making the project, and it was super interesting to work with the Soli system. When I told my roommate to walk close to the phone and the chicken perked up, she was super surprised at how it was able to know, so that was really fun 😀
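For readers curious what a particle-and-spring setup like this can look like in p5.js, here is a generic sketch (not the project's actual code; the constants are arbitrary): each spring nudges its two particles toward a rest length, and damping keeps the bounce from blowing up.

```js
// Generic particle-spring sketch, intended to run inside a p5.js sketch.
class Particle {
  constructor(x, y) {
    this.pos = createVector(x, y);
    this.vel = createVector(0, 0);
  }
  update() {
    this.vel.mult(0.98); // damping so the motion settles
    this.pos.add(this.vel);
  }
}

class Spring {
  constructor(a, b, restLength, k = 0.1) {
    this.a = a;
    this.b = b;
    this.restLength = restLength;
    this.k = k; // stiffness
  }
  update() {
    // Force proportional to how far the spring is stretched or compressed.
    let dir = p5.Vector.sub(this.b.pos, this.a.pos);
    let stretch = dir.mag() - this.restLength;
    dir.setMag(this.k * stretch);
    this.a.vel.add(dir); // pull a toward b when stretched
    this.b.vel.sub(dir); // and b toward a
  }
}
```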

(More documentation) Spring system:

Chicken Maker (different colors are used to encode different body parts):

Toad2-SoliSandbox

Shy Beetles

Video Link  

View Project | View Code

This project is about letting the user feel the joy of turning over a stone and watching beetles scuttle about. The beetles react much like real beetles would: they run to the shadows when the rock is moved and run away from the user’s hand when it moves in. Additionally, when no presence is detected, the beetles explore their environment, then go back to hiding under the rock when the user is close to the device again.

For this project, I wanted to instill a feeling of joy from seeing the beetles, so I focused on creating realistic but cute, bug-like movements. However, since the beetles are so detailed, it was difficult to create a compelling environment for them to inhabit. In this project, I learned how to determine whether a point is inside an n-sided polygon (though my algorithm was too inefficient to be used in real time) and how to create compelling generative creatures. Hopefully, this project brings other people the same joy I feel from seeing beetles crawl.
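For reference, a standard way to do the point-in-polygon test mentioned above is ray casting; the sketch below shows that common approach and is not necessarily the version used in the project.

```js
// Ray-casting point-in-polygon test. vertices is an array of {x, y}
// objects describing the polygon (e.g. the rock's outline).
function pointInPolygon(px, py, vertices) {
  let inside = false;
  for (let i = 0, j = vertices.length - 1; i < vertices.length; j = i++) {
    const xi = vertices[i].x, yi = vertices[i].y;
    const xj = vertices[j].x, yj = vertices[j].y;
    // Does a horizontal ray from (px, py) cross this edge?
    const crosses =
      (yi > py) !== (yj > py) &&
      px < ((xj - xi) * (py - yi)) / (yj - yi) + xi;
    if (crosses) inside = !inside;
  }
  return inside; // odd number of crossings means the point is inside
}
```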

OodBird-SoliSandbox

TAP-TAP CAT

This app allows the user to interact with an animated cat and give it instructions using hand movements like swipe and reach. When the app is on and no presence is detected, the title screen plays. Once Soli detects a presence, the cat appears lying down. The user can then swipe left and right to make the cat roll in that respective direction. Swiping up makes the cat sit. If the cat is sitting and the user reaches for the phone, the cat reacts. If the user swipes up while the cat is sitting, the cat stands on its hind legs and then returns to a sitting position.
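The interactions above amount to a small state machine; here is a rough sketch of that idea using arrow keys as desktop stand-ins for the Soli swipes. The state names, the onPresence() hook, and rollCat() are simplified, hypothetical readings of the description, not the project's actual code.

```js
// Simplified pose state machine for the cat.
let state = "title"; // "title" | "lying" | "sitting" | "standing"

function onPresence() {
  // Called when Soli (or a stand-in trigger) detects someone nearby.
  if (state === "title") state = "lying";
}

function keyPressed() {
  if (keyCode === LEFT_ARROW || keyCode === RIGHT_ARROW) {
    if (state === "lying") rollCat(keyCode === LEFT_ARROW ? -1 : 1);
  } else if (keyCode === UP_ARROW) {
    if (state === "lying") state = "sitting";
    else if (state === "sitting") state = "standing"; // then settles back to sitting
  }
}

function rollCat(direction) {
  // Play the roll animation in the given direction (placeholder).
}
```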

video

code

Title screen:

Roll motion:


Sit motion:

Stand and alert motions:

miniverse-SoliSandbox

This is a project linking the Soli swipe motion to unleashing plagues on an unsuspecting group of believers. The currently enabled plagues: the plague of frogs, hail, and the death of the firstborn.

I learned how to use physics and collision handling to implement game dynamics that feel true to the real world.
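As a rough illustration of that kind of falling-object physics, here is a generic p5.js sketch: gravity pulls each hailstone or frog down, and a simple ground check makes it bounce and lose energy. The constants are arbitrary and not taken from the project.

```js
// Generic gravity-plus-ground-collision sketch.
const GRAVITY = 0.4;
const BOUNCE = 0.6;
let drops = [];

function setup() {
  createCanvas(400, 600);
}

function draw() {
  background(30);
  // Spawn a new falling object every few frames.
  if (frameCount % 10 === 0) {
    drops.push({ x: random(width), y: -10, vy: 0 });
  }
  for (const d of drops) {
    d.vy += GRAVITY;          // accelerate downward
    d.y += d.vy;
    if (d.y > height - 10) {  // hit the ground
      d.y = height - 10;
      d.vy *= -BOUNCE;        // bounce with energy loss
    }
    circle(d.x, d.y, 10);
  }
}
```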

code:

https://glitch.com/~plagues

(The citations for the sounds are at the top of the code.)

For some reason, I can’t export my screen recording off of the Pixel; it keeps failing. So here’s a video made from the desktop version of the app:

thumbpin-SoliSandbox

The app has storm, summer night, ocean, forest, library, party, and driving soundscapes that respond to reach and swipe (up, down, right, left) interactions. Sound begins when Soli senses your presence. Sounds are associated with certain colors, so the phone can serve as a colorful lamp as well. Reaching for the phone at least three times adds another layer of ambiance. Swiping up and down adjusts the volume. Swiping right adds embellished sounds to the soundscape, and swiping left generates a new soundscape and color. I learned to isolate individual parts of the app while developing it, and I learned to simplify my ideas.
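A rough sketch of that gesture mapping is below, using arrow keys as desktop stand-ins for the Soli swipes and p5.sound for playback. The soundscape list, file name, and addEmbellishment() helper are hypothetical placeholders; switching the loaded sound when the soundscape changes is omitted.

```js
// Hypothetical gesture-to-soundscape mapping (p5.js + p5.sound).
let soundscapes = ["storm", "ocean", "forest"]; // placeholder list
let current = 0;
let volume = 0.5;
let ambience;

function preload() {
  ambience = loadSound("storm.mp3"); // placeholder asset
}

function keyPressed() {
  if (keyCode === UP_ARROW) volume = min(volume + 0.1, 1);      // swipe up
  if (keyCode === DOWN_ARROW) volume = max(volume - 0.1, 0);    // swipe down
  if (keyCode === RIGHT_ARROW) addEmbellishment();              // swipe right
  if (keyCode === LEFT_ARROW) {
    current = (current + 1) % soundscapes.length;               // swipe left: new soundscape
  }
  ambience.setVolume(volume);
}

function addEmbellishment() {
  // Layer an extra sound on top of the current soundscape (placeholder).
}
```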


shoez-SoliSandbox

I created a DNA-editor visualizer for my project. When you open the app, you’ll have a standard DNA strand with 20 nucleotides. You can remove nucleotides by touching a specific nucleotide, and you can continuously add nucleotides by tapping outside the DNA strand. You’ll see that after a set amount of time, the strand mends itself regardless of whether more nucleotides are added. If you swipe up, you delete the current strand you’re working on. If you swipe down, you get a new standard strand. Swiping left and right is analogous to a zoom effect, and you can see the inside of the DNA the more you zoom in. A lot of this project was problem solving and trying to find a way to manipulate objects in an interesting way. I found that timers are very powerful when I want to achieve certain looks.
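The mend-after-a-delay behavior is a natural fit for millis()-based timers in p5.js. Here is a minimal sketch of that idea; MEND_DELAY and mendStrand() are hypothetical stand-ins, not the project's actual values.

```js
// Timer-driven self-mending: after MEND_DELAY ms with edits pending,
// snap the strand back together.
const MEND_DELAY = 5000; // ms (arbitrary)
let lastEdit = 0;
let mended = true;

function nucleotideEdited() { // call this on every add/remove
  lastEdit = millis();
  mended = false;
}

function draw() {
  if (!mended && millis() - lastEdit > MEND_DELAY) {
    mendStrand();
    mended = true;
  }
}

function mendStrand() {
  // Re-space and re-pair the remaining nucleotides (placeholder).
}
```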

Code

YouTube Video Demo

pinkkk-SoliSandbox

The Windshield Wiper

For full documentation of my struggles, please see this:

https://medium.com/@xinrant/soli-project-a230e872fddf

In this project I wanted to create a humorous simulation of driving without a destination while random nasty objects keep getting thrown onto your windshield, to which users can respond with the hand-swipe gesture to activate the windshield wipers.

It's meant to be something fun and entertaining that users can engage with to release some stress from being inside ~the whole time~.
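For a sense of how the wiper mechanic might be wired up, here is a small p5.js sketch: a swipe (left/right arrow on desktop) starts a sweep, and the arm oscillates until the sweep finishes. The angles and timing are arbitrary, not the project's.

```js
// Swipe-triggered windshield-wiper sweep.
let wiping = false;
let wiperAngle = 0;
let sweep = 0;

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(120, 160, 200);
  if (wiping) {
    sweep += 0.08;
    wiperAngle = (PI / 3) * sin(sweep); // oscillate the arm
    if (sweep > TWO_PI) { wiping = false; sweep = 0; }
  }
  push();
  translate(width / 2, height);
  rotate(wiperAngle);
  rect(-5, -250, 10, 250); // the wiper arm
  pop();
}

function keyPressed() {
  if (keyCode === LEFT_ARROW || keyCode === RIGHT_ARROW) wiping = true;
}
```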

Splash

Just driving

Static

I learned a lot in this project, mainly how I can struggle forever to implement the simplest things. My lack of control over the medium shook me.

Project on Glitch


sweetcorn-SoliSandbox

Dog Lullaby

I have created a lullaby generator for children to feel loved and lulled as they fall quickly asleep. It features an animated dog, generated lullaby lyrics with rhymes, and a generated melody that children can alter using the special Soli sensor.

Above is a concise demo of the app in use. It normally runs for a random number of lyric lines, between 35 and 45, but for the purposes of a short video, only a few are shown.

Swiping left will increase the pitch of the left note, swiping up the pitch of the center note, and swiping right the pitch of the right note. A tapping gesture will mute the song, and another will unmute it. If you feel inclined to test the app on desktop, the swipes map to arrow keys (a left key press is equivalent to a left swipe, etc.).

The dog and border are animated using p5.js, the lyrics are generated using RiTa.js, and the song itself is generated using Tone.js. It was a pleasure becoming more familiar with the latter two. RiTa.js is an excellent tool for generative poetry, and I’ve only just begun to experiment with it. I had originally been using p5.Sound to generate the song, but found the quality terrible and switched to Tone.js. My life instantly became easier, and I look forward to experimenting more with it as well.
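As a tiny example of the kind of rhyme lookup RiTa.js makes easy (not the project's actual lyric generator), RiTa.rhymes() returns lexicon words that rhyme with an input word, which can be used to end paired lines.

```js
// Pick an end word and a rhyming partner for it.
const endWord = RiTa.randomWord();       // e.g. "night"
const candidates = RiTa.rhymes(endWord); // e.g. ["bite", "light", ...]
const rhymeWord = candidates.length
  ? candidates[Math.floor(Math.random() * candidates.length)]
  : endWord; // fall back to repeating the word if no rhyme is found
console.log(`...${endWord} / ...${rhymeWord}`);
```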

The song is composed of one enveloped oscillator that creates a bell-like melody, one modulated oscillator that plays a random harmony based on the melody, and a synthesizer that plays a random bass sequence based on the melody as well. It runs in 3/4 time at 90 bpm, as similar to a lullaby as I could get.
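A hedged sketch of that Tone.js setup is below: 3/4 time at 90 bpm with a looping melody and bass. The instrument choices, note pool, and durations are illustrative, not the project's exact configuration.

```js
// Illustrative Tone.js transport and loop setup.
// In a real page, audio must first be unlocked with Tone.start()
// after a user gesture.
Tone.Transport.bpm.value = 90;
Tone.Transport.timeSignature = [3, 4];

const melody = new Tone.Synth().toDestination();
const bass = new Tone.Synth({ oscillator: { type: "triangle" } }).toDestination();

const notes = ["C4", "E4", "G4"]; // placeholder note pool
const loop = new Tone.Loop((time) => {
  const note = notes[Math.floor(Math.random() * notes.length)];
  melody.triggerAttackRelease(note, "8n", time);
  bass.triggerAttackRelease("C2", "4n", time);
}, "4n");

loop.start(0);
Tone.Transport.start();
```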

In the future I hope to animate the tail and eyes of the dog more specifically. I also hope to refine the sounds—adding more dimension and nuance to something that should sound simple and lovely. A lovely thing I feel obligated to note: I experimented with this app around my dog (his name is Scout) and he immediately ran toward it. I suppose my “whine” instrument was more accurate than I had expected ;~0. I have yet to test the app with cats and children, but I have high hopes!

Good night,

sleep tight!

Don’t let the bed bugs bite!

xoxoxo

~sweetcorn

yanwen-SoliSandbox

DAGWOOD is a sandwich generator that creates random Dagwood sandwiches with layers of food or non-edible stuff. Using the two Soli interaction methods, swipe and tap, users can choose to add ingredients to the sandwich or swap out the last ingredient if they don’t want it.
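A minimal sketch of that add/swap mechanic, plus the emoji rendering used for the bot, is below. The ingredient list, emoji, and handler names are placeholders, not the project's data or code.

```js
// Hypothetical sandwich stack: swipe adds a layer, tap replaces the top one.
const INGREDIENTS = [
  { name: "lettuce", emoji: "🥬" },
  { name: "cheese", emoji: "🧀" },
  { name: "tomato", emoji: "🍅" },
  { name: "rubber duck", emoji: "🐥" },
];
let layers = [];

function randomIngredient() {
  return INGREDIENTS[Math.floor(Math.random() * INGREDIENTS.length)];
}

function onSwipe() { // add a layer
  layers.push(randomIngredient());
}

function onTap() { // swap out the last layer
  if (layers.length) layers[layers.length - 1] = randomIngredient();
}

// Emoji version of the sandwich for the Dagwood bot's tweet.
function sandwichEmoji() {
  return "🍞" + layers.map((l) => l.emoji).join("") + "🍞";
}
```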

After generating their sandwiches, users can click on the “TWEET” button to let the Dagwood bot send out an emoji version of the sandwich, or click “RESTART” to build a new sandwich.

I tried to keep the gameplay minimal and gradually added features to the project. One feature I eventually gave up on was building an image bot, since it would take quite a bit of time to figure out image hosting and tweeting with images, and the saveCanvas() function in p5 doesn’t work very well on the Pixel 4. I switched to the emoji bot instead, and also tried to add a retweet-when-mentioned applet that I found on IFTTT (it worked twice and then somehow broke :’ ))).

Code | Demo – works on Pixel 4 and laptops (view with the Pixel 4 screen display size, though ingredient sizes are a bit off)

🤖 ENJOY MAKING YOUR DAGWOOD SANDWICHES : D! 🥪