Student Area

sweetcorn-soli-checkin

Dog Lullaby

On desktop, use the arrow keys. Otherwise, use Soli swipe gestures to change the three notes used to generate the melody and a Soli tap gesture to mute and unmute the lullaby.

I’m using RiTa.js for lyric generation: a markov chain trained on a text file (lyrics.txt on glitch) I’ve filled with lullabies, love notes, children’s songs, bedtime prayers, and nursery rhymes. I hope to improve the rhyming, which currently just replaces the last word of a line with a random word that rhymes with the last word of the line before. This results in nonsense that isn’t thematically consistent with the markov-generated nonsense. More complex rhyme schemes than AABB could draw attention away from this fact.
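The AABB pass described above can be sketched as a pure function. Here `rhymesOf` is a tiny stub standing in for a real rhyme lookup (in the project this would be RiTa.js); the function and word lists are illustrative, not from the project.

```javascript
// Stub rhyme dictionary; the project would use RiTa's rhyme lookup instead.
const rhymesOf = (word) =>
  ({
    bright: ["light", "night"],
    sheep: ["deep", "sleep"],
  }[word] || []);

// Replace the last word of every second line with a random rhyme for the
// last word of the line before it, producing an AABB scheme.
function applyAABB(lines) {
  return lines.map((line, i) => {
    if (i % 2 === 0) return line; // first line of each pair sets the rhyme
    const words = line.split(" ");
    const prevWords = lines[i - 1].split(" ");
    const candidates = rhymesOf(prevWords[prevWords.length - 1]);
    if (candidates.length === 0) return line; // no rhyme found, keep as-is
    words[words.length - 1] =
      candidates[Math.floor(Math.random() * candidates.length)];
    return words.join(" ");
  });
}
```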

I’m using Tone.js for the song aspects. I was initially using p5.Sound, but the quality was terrible in general and especially terrible on the device I am designing for, so last night I switched everything over to Tone.js. Due to that late change, I still have to refine the oscillators/synthesizers I’m using and explore other Tone.js functionality. I’m very pleased with the Tone.js quality compared to p5.Sound and wish I had used it from the beginning.
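The melody side of this can be sketched as a pure function: pick from the three swipe-selected notes to build a phrase. In the project each note would then be sent to a Tone.js synth (e.g. via `triggerAttackRelease`); that part is omitted here since the voicing is still being refined.

```javascript
// Build a melody of `length` notes drawn at random from the three
// swipe-selected notes. Note names are illustrative.
function makeMelody(notes, length) {
  const melody = [];
  for (let i = 0; i < length; i++) {
    melody.push(notes[Math.floor(Math.random() * notes.length)]);
  }
  return melody;
}
```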

I also have to refine the animation, as it was based around the previous drum pattern, which ran on p5.Sound. I specifically mean the dog (Joe <3), as there are plenty of opportunities to animate him beyond what I currently have. I might also ease the stars in and out beyond the simple fade I get from setting the background to a low alpha value, but this would require creating and storing stars in an array, drawn in draw() instead of in playBass(), where the drawing currently happens. That’s a relatively straightforward change; the most complicated aspect is disposing of the elements in the array that no longer need to be drawn. This will be based around a timer, which will hopefully be simpler to work with using Tone.js, possibly relating to ticks.value instead of counting the number of beats manually, as I do now.
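The star bookkeeping described above could look like this: each star records the tick it was born on, and expired stars are filtered out before drawing. The lifetime constant and field names are assumptions; the tick count would come from the Tone.js transport rather than a manual beat counter.

```javascript
const STAR_LIFETIME = 960; // ticks a star stays visible (arbitrary choice)

// Drop stars whose age (in ticks) has exceeded the lifetime;
// the survivors are the ones draw() should render this frame.
function pruneStars(stars, nowTicks) {
  return stars.filter((s) => nowTicks - s.bornAt < STAR_LIFETIME);
}

// In draw(), something like:
//   stars = pruneStars(stars, /* current transport tick count */);
//   for (const s of stars) drawStar(s.x, s.y);
```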

I also have to implement functions for the beginning and end of the song. It currently runs for a random number of lyric lines between 15 and 35, then ends on a scope error.

~xoxo

sweetcorn

OodBird – checkin

CAT

In the game, the user gets to interact with an animated cat by swiping to make it do different actions. With the current animations, the user would swipe right or left to make the cat roll over in that direction and tap to make it stand up. However, they will be doing no such thing, as I broke my code last night. Instead, the user can click the right half of the screen to make the cat roll right, click the left half to make the cat roll left, and drag to make the cat stand up.

miniverse-soli-checkin

https://www.youtube.com/watch?v=B64Xl1yx_M8

soli check in

 

 

The user is “god”.  If they swipe, they unleash a plague onto their believers.

Right now I have the collision physics worked out, but I need to import some assets to make the plagues functional and the background pretty (replacing the black circles with hail, frogs, locusts, and flies).
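The core of that circle collision check can be sketched as below: two circles overlap when the distance between their centers is less than the sum of their radii. The object shape here (`x`, `y`, `r`) is illustrative, not taken from the project.

```javascript
// True when circles a and b overlap. Compares squared distances to
// avoid an unnecessary square root.
function circlesCollide(a, b) {
  const dx = a.x - b.x;
  const dy = a.y - b.y;
  const rSum = a.r + b.r;
  return dx * dx + dy * dy < rSum * rSum;
}
```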

Laundry list of tasks in order of importance:

  • make perspective less western
  • make/import assets for: hail, frogs, locusts, flies
  • switch out circles for hail and frogs
  • cut springs to make the “death” of characters
  • hook up swarming demo for locusts and flies plague
  • make a sun, let it break for the darkness plague
  • import clothing assets and more traditionally female looking body parts
  • make animals? for the animal pestilence plague

yanwen-soli-checkin

demo version

DAGWOOD is a sandwich generator that creates random sandwiches using the two Soli interaction methods, swipe and tap. Users can tap to add ingredients to the sandwich and swipe to remove the last ingredient if they don’t want it.
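The tap/swipe interaction maps naturally onto a stack. A minimal sketch, with placeholder ingredient names (the project's actual ingredient set isn't shown here):

```javascript
// Placeholder ingredient pool.
const PANTRY = ["lettuce", "tomato", "ham", "cheese", "pickle"];

// Tap: push a random ingredient onto the sandwich.
function onTap(sandwich) {
  sandwich.push(PANTRY[Math.floor(Math.random() * PANTRY.length)]);
  return sandwich;
}

// Swipe: pop the most recently added ingredient, if any.
function onSwipe(sandwich) {
  sandwich.pop();
  return sandwich;
}
```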

Currently working on connecting p5 to Twitter for auto posting screenshots to the Twitter bot.

marimonda-soli-checkin

Soli Landscapes

For this project, I am using the Soli sensor data as a way to traverse a generative landscape and look for specific objects in it. The text quoted on the screen is a statement that references a specific thing to look for in the environment. In the case of the video below, the excerpt references eyes, so the player needs to traverse the landscape to find them.

This project is not finished; while the main elements of the game are complete, I still have a few things left to implement:

  1. I am still working on making the objects move smoothly using linear interpolation and parallax. I think this would greatly improve the visual presentation of the piece.
  2. I am working on using recursive tree structures and animations to make this environment more dynamic. I also want to add more variety to the objects and characters a person looks for in the landscapes, and to the text that references them. I’d also like more movement and responsive qualities in the background, such as birds appearing when a specific gesture is made; a current example of this type of responsiveness is the moon-eye, which closes and opens depending on whether human presence is detected.
  3. Varied texts and possibly generative poems that come from the objects being looked at in the landscape. The current excerpt comes from a piece of writing my friend and I worked on, but I would like to make it generative.
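The smoothing in item 1 can be sketched with two small helpers: a `lerp` (mirroring p5's built-in) that eases an object toward its target each frame, and a parallax offset that scrolls deep objects less than near ones. The depth factors are illustrative.

```javascript
// Linear interpolation: move fraction t of the way from a to b.
function lerp(a, b, t) {
  return a + (b - a) * t;
}

// Parallax: objects with a smaller depth factor scroll less than the
// camera, so they appear farther away.
function parallaxX(objectX, cameraX, depth) {
  return objectX - cameraX * depth;
}
```

Each frame the object's drawn position would be updated with something like `x = lerp(x, targetX, 0.1)`, then offset by `parallaxX` before rendering.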

Here is an example of how the environments look when swiping:

 

Here is another landscape:

lampsauce-soli-checkin

This demo does not work with the microphone yet (I’m working on a jank workaround). You can check out a web version here. Open the console to see the possible verbal commands. Use the arrow keys in place of swipe events.

 

This demo can be viewed in a browser here. Use the arrow keys in place of swipe events. The collision detection is pretty jank ¯\_(ツ)_/¯ . The main issue is that I still have to get the shadows & lighting fixed.

tale-soli-checkin

Screen Recording

(using left/right arrow keys for swipes, enter key for tap)

Google Pixel 4 w. Soli sensor

Concept Decision:

I decided to create an app that helps me “work out” since I haven’t done any exercise for a while due to Covid 🙁

I’ve been playing squash for years and I really love squash, but I couldn’t play since March and I really miss being on court ( big sad 🙁 ). So for this project, I recreated squash practicing experience!

Current Problem:

One thing I didn’t anticipate, for some unknown reason, is that the phone is vertical, which cuts off the sides of the court. Considering that rail shots (hitting straight and deep to make the ball bounce parallel to the side wall) are the most commonly practiced shots in squash, not being able to see the sides makes the experience not quite complete.

Potential Changes to Make:

My goal for the next deliverable is to make it playable horizontally as well. I could also add a sound effect every time the ball bounces off the wall, if I can find a nice clean sound file of a squash ball hitting the front wall. (Oftentimes the court echoes a lot, so it’s hard to record one without other noises.)
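One simple way to support both orientations is to scale the court uniformly so it fits whichever way the canvas is oriented, letterboxing the leftover space. A sketch, with all dimensions as placeholders:

```javascript
// Return the uniform scale factor that fits a courtW x courtH scene
// inside a canvasW x canvasH canvas without cutting off either side.
function fitScale(canvasW, canvasH, courtW, courtH) {
  // pick the smaller ratio so neither dimension overflows
  return Math.min(canvasW / courtW, canvasH / courtH);
}
```

Drawing would then use `scale(fitScale(width, height, COURT_W, COURT_H))` before rendering the court, so the side walls stay visible even on a vertical screen.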

 

 

thumbpin-soli-checkin

The app has storm, summer night, ocean, forest, library, party, and driving ambiances that are somewhat interactive through reach and swipe (up, down, right, left) interactions. Sound begins when Soli senses your presence. Sounds are associated with certain colors, so the phone can serve as a colorful lamp as well.

Desktop version, because the phone app is super buggy

The videos show that up and down swipes adjust the volume and that swiping left changes the ambiance and color
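The up/down volume control can be sketched as a clamped step function; the step size here is a guess, not the app's actual value.

```javascript
const VOLUME_STEP = 0.1; // per-swipe volume change (assumed)

// Nudge a normalized volume up or down and clamp it to [0, 1].
function adjustVolume(volume, direction) {
  const next = volume + (direction === "up" ? VOLUME_STEP : -VOLUME_STEP);
  return Math.min(1, Math.max(0, next));
}
```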

Updates to come:

  • fade between colors does not work – need to fix (done)
  • adjust volume of sounds in music editing software for appropriate relative balance (done)
  • use a different function than preload (done)
  • auto change color instead of swipe? (based on feedback, will not implement)
  • make rain and ocean sound more distinctive (done)

Additional ideas from feedback: simulate glowing light (will not implement), unique color and sound combos

pinkkk-soli-checkin

This aims to simulate the illusion of driving while random stuff comes into your path; users can activate the windshield wiper with a hand-swipe gesture.


Currently I'm battling with the swiping effect. Also, this whole thing is just super ugly. I'm planning to add a lot more looping details to the background.
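One way to drive the wiper once a swipe activates it is a triangle wave: the blade sweeps from one bound to the other over the first half of a cycle, then back. The angle bounds and cycle parameterization here are illustrative.

```javascript
const WIPER_MIN = -1.2; // leftmost blade angle in radians (assumed)
const WIPER_MAX = 1.2;  // rightmost blade angle in radians (assumed)

// t is progress through one sweep cycle in [0, 1].
// Triangle wave: phase rises 0 -> 1 in the first half, falls 1 -> 0 after.
function wiperAngle(t) {
  const phase = t < 0.5 ? t * 2 : 2 - t * 2;
  return WIPER_MIN + (WIPER_MAX - WIPER_MIN) * phase;
}
```

In draw(), `t` could be `(millis() % cycleMs) / cycleMs` while the wiper is active, with the blade rotated by `wiperAngle(t)`.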


This is what I'm going for, aesthetically speaking.