axol-ARSculpture

Chalk Shark

As a kid I drew on the sidewalk a lot and indulged in my fantasy world. So for this project, I wanted to make such a virtual world come alive for me through AR.

I wanted to make a 3D chalk shark that surfaces from a chalk drawing on the ground. However, I’ve only been testing in my apartment hallway rather than outside, and it really didn’t look as good as I wanted it to 🙁

For the programming portion, I modified the AR Controller’s code so that the mesh is always placed sideways (rotated 90 degrees) relative to the camera.
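The core of that placement trick is just a yaw angle: point the object at the camera, then add a 90-degree offset so its profile shows. A minimal sketch in plain vector math (a hypothetical helper; the actual project patched the Unity AR Controller rather than calling a function like this):

```python
import math

def sideways_yaw(obj_pos, cam_pos):
    """Yaw (radians) that faces obj toward the camera on the ground
    plane, plus a 90-degree offset so the mesh is seen side-on.
    Positions are (x, y, z) tuples; only x and z matter for yaw."""
    dx = cam_pos[0] - obj_pos[0]
    dz = cam_pos[2] - obj_pos[2]
    yaw_to_camera = math.atan2(dx, dz)   # yaw that points +Z at the camera
    return yaw_to_camera + math.pi / 2   # extra 90 degrees: show the profile
```

With the camera straight ahead of the object, the returned yaw is exactly 90 degrees, i.e. the shark faces fully sideways.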

I made the 3D model and hand-painted the texture (in Substance Painter and Photoshop) to give it a hand-drawn quality. I planned to use particle systems and flipbook textures to animate the water, but I didn’t get that to work, so I’m just animating the two meshes that represent the water splashes by bobbing them up and down.
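A bob like that is typically just a sine-wave offset on the mesh’s vertical position each frame. A minimal sketch (the amplitude, frequency, and phase values here are illustrative, not the ones used in the project):

```python
import math

def bob_offset(t, amplitude=0.05, frequency=1.5, phase=0.0):
    """Vertical offset for a splash mesh at time t (seconds).
    Giving each of the two meshes a different phase keeps their
    bobbing slightly out of sync, which reads more naturally."""
    return amplitude * math.sin(2 * math.pi * frequency * t + phase)
```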

Hallway Test Vid:

More Documentation:


axol-ARSketch

I really want to “spice up” my working space through AR, so all of my ideas are based on my current desk setup.

(None of my videos recorded in JustALine will upload to Vimeo for some reason; I’ll attach them later)

Desktop Shark

A shark lurking and swimming around on my desktop; it would consist of a 3D model and some VFX to accompany it.

Snow

I like snow 🙂 Make it snow in my bedroom!

Talk to the hand!

Perhaps an extra hand can help brighten your day? I imagine a hand coming out of a portal and probably saying some dumb things.

https://drive.google.com/file/d/10fm20yNf6uZK3VmrcTicgdw1S7nyG2pg/view?usp=sharing

axol-PhazeroVideo

I think it’s interesting to see someone approach game creation from a non-game-designer perspective, as there’s much less emphasis on gameplay and more on how the game is experienced and the emotional impact it brings.

I certainly resonated with the part where she said she had to teach herself everything.

axol-LookingOutwards05

What I like most about Lucid Trips is how the game takes advantage of VR hand tracking to drive its movement. In an ideal world, VR experiences would take place in an infinite space where you’re free to walk around, but that’s obviously not the case for most people. Lucid Trips simulates an “underwater feel” and makes the primary movement directed by the player’s upper-body gestures. Compared with VR experiences that rely on body movement or controller input, I believe this clever design creates a more immersive experience.

axol-UnityEssentials

I completed all the lectures, but after setting up the high-definition lighting and particle system my scene was running at a steady 1 frame per second, so I didn’t bother with rendering and exporting.

I also encountered some problems with animating inside the scene: it kept giving me a “dereferencing NULL” error and I can’t spot what’s going wrong 🙁

axol-GPT2

After playing around with GPT-2, I noticed that with certain phrases it can generate full-on news-report-style articles (even with authors and proper formatting), which I thought was pretty interesting. I fed it some political slogans and common sentence starters, and it generated this sci-fi showdown announcement, which is hilarious.

This Saturday, the first ever Octagon women’s featherweight bout will go down in the main event of UFC 216. Cris ‘Cyborg’ Justino will make her long-awaited Octagon debut, as she faces bantamweight champion Amanda Nunes. Amanda Nunes vs. Cris Cyborg — Title Bout Nunes will likely be fighting for the UFC bantamweight title when she enters the Octagon with Cyborg, but she wants to defend the belt against both her nemesis, Ronda Rousey, and Cyborg.
-----------------------------------------------------------------------
Trump's departure from the white house will happen this Friday at around 3:00 pm. Instead of being in Washington, he will be in New Jersey, on a working vacation. 

WHCD: Who should replace @POTUS at the White House Correspondents Dinner? — The White House (@WhiteHouse) September 17, 2017 

[Video via Fox Business]
-----------------------------------------------------------------------
This is New York Times reporter Johnny Sharpie, bring you this report from Pittsburgh, PA.  (BEGIN VIDEOTAPE) 
JOHNNY SHARPIE, REPORTER: We're walking towards this massive firehouse. 
UNIDENTIFIED MALE: What's that sound? 
SHARPIE: Oh, that's fire, sir. 
UNIDENTIFIED MALE: Oh, thank you, sir. 
SHARPIE: Back here. 
UNIDENTIFIED MALE: Come on, everybody, come on. 
SHARPIE: OK, well, just a second, sir. We're coming from a second floor, sir. The fire just went back there. 
UNIDENTIFIED MALE: Alright. Yeah, back here. 
UNIDENTIFIED MALE: All right, all right, we're all the way around

axol-LookingOutwards04

http://www.aiartonline.com/art/holly-grimm/

https://hollygrimm.com/acan_final

hollygrimm_ny_dieb201_3840 (1)

hollygrimm_pilar_bn178_3840 (1)

The project was done with additional constraints from a neural network trained on art-composition attributes. It attempts to take traditional fine-art concepts (variety of texture, size, color, shape, contrast, …) and embody them as part of the constraints.

What fascinates me most about this project is the documentation in the second link. It shows every step and the different passes of images through the model, along with the different dimensions of values that were modified to produce the look, which I thought was really cool. It’s interesting to see how traditional concepts like color theory and texture get replicated/translated into a digital, machine-learning space, and also how the categorization of human artwork fits into the listed categories.

axol-soliSandbox

(backup link: https://vimeo.com/467470357)

Glitch Project Link: https://glitch.com/~a-nervous-chicken

Presentation mode: https://glitch.com/~a-nervous-chicken

It’s a chicken, and it gets nervous when you get too close to it.

The chicken is constructed from particles and springs, which gives it its bouncy quality. Upon sensing human presence (the Soli presence event), the chicken perks up and becomes alert. You can slap the chicken (the Soli swipe event), and it will get shoved around and scream.

It’s a simple concept, and I wanted to focus on giving the chicken some character, making it chubby and lovable. I created a throwaway program that I use to design the chicken and export it as a JSON file; a separate program then executes all the particle and spring motion. I learned a lot about particles and timing animations while making this project, and it was super interesting to work with the Soli system. When I told my roommate to walk close to the phone and the chicken perked up, she was super surprised that it was able to know, so that was really fun 😀
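The bounce in a particle-and-spring setup like this comes from Hooke’s law: each spring pushes or pulls its endpoints back toward a rest length. A minimal Python sketch of that force (the actual project runs in JavaScript on Glitch; the names and constants here are illustrative):

```python
import math

def spring_force(p1, p2, rest_length, k):
    """Hooke's-law force on particle p1 from a spring to p2 (2D).

    p1, p2: (x, y) positions; rest_length: the spring's natural length;
    k: stiffness. Returns the (fx, fy) force vector acting on p1."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0)            # coincident particles: no defined direction
    stretch = dist - rest_length     # positive = stretched, negative = compressed
    scale = k * stretch / dist       # force magnitude per unit of (dx, dy)
    return (scale * dx, scale * dy)
```

Each simulation step sums these forces per particle, adds damping, and integrates velocity and position, which is what makes the whole body jiggle when the chicken gets slapped.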

(More documentation) Spring system:

Chicken Maker (different colors used to encode different body parts):