Generating interactive 3D content on a living room table using AR and a haptic puck.
Living room tables have rich stories behind them: conversations with family, doing homework, eating dinner, kids playing with LEGO, and so on. What if these tables could provide us with interactive and engaging content such as weather forecasting and entertainment for kids? This project allows the user to engage with AR information and artifacts that are coupled to a physical desk through a haptic interface consisting of an array of small vibrators and a heat pad. The goal is to design text-less and immersive AR interaction techniques that enrich the everyday living room table experience.
Test BLE on Unity and iOS
Test IR tracking using a webcam (or depth camera)
Battery check (drive 4 vibrators and BLE board)
Get all tracking, wireless, and actuation working
AR content programming
3D print a package for haptics
Prepare a table and a webcam installation
LiPo battery (x5 for backups)
Webcam setup tools
Deliverables for show:
Space for a table, a PC, and a webcam installed above the table
What would it be like to feel realistic vibrations and temperature changes from digital information on these tables? Building on the concept above, this project couples AR information and artifacts to a physical desk through a haptic interface consisting of an array of small vibrators and a heat pad, aiming at text-less and immersive AR interaction techniques that enrich the everyday living room table experience.
The first stage of the experience is selecting a content mode. The experience uniquely belongs to the living room table, so it triggers when the table is recognized through a mobile device. There are two categories of content: weather forecasting and an interactive animation for kids. In weather forecasting mode, imagine an animated 3D sun and cloud appearing on the table while the user feels the vibration of rain or warmth through the haptic device. This enables a text-less, intuitive form of transmitting information. The second scenario is an interactive animation for kids, in which the user plays with a simulated dinosaur or other character on the table. As the user moves the device closer to the character, the device produces larger vibrations from the walking dinosaur, and when it is very close, the character starts acting toward the device. This would be a perfect interaction for playing Pokémon on a table.
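The proximity behavior of the kids' mode can be sketched as a simple distance-to-intensity mapping. This is only an illustration of the idea, not the project's implementation; the range thresholds and the 8-bit intensity scale are assumptions.

```python
# Hypothetical mapping from puck-to-character distance to vibration intensity.
# max_range_m and react_range_m are illustrative values, not measured ones.

def vibration_intensity(distance_m, max_range_m=0.5):
    """Return a 0-255 intensity: stronger as the puck nears the character."""
    if distance_m >= max_range_m:
        return 0  # character is out of haptic range
    # Linear falloff; real footsteps could instead pulse with the walk cycle.
    return round(255 * (1 - distance_m / max_range_m))

def character_reacts(distance_m, react_range_m=0.1):
    """The character starts acting toward the device when very close."""
    return distance_m < react_range_m
```

A step- or rhythm-synchronized pulse pattern would likely feel more like a walking dinosaur than a constant level, but the falloff captures the "closer means stronger" rule described above.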
I challenged myself to work with an array of mini vibration motors and wireless local networking. The goal of this project is to create immersive AR interactivity by generating haptic feedback from virtual objects interacting with a physical desk. The user feels the vibration of an AR ball bouncing on a table through the haptic device. Every time the ball bounces, a signal is sent wirelessly to the main PC, which forwards it serially to an Arduino to activate the motors. The next step is to activate the motor array more precisely by computing the level of vibration from the locations of the AR objects relative to the device.
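The "future step" above could be sketched as a per-motor falloff computation: given the ball's impact point and the positions of the four motors on the puck, each motor gets an intensity that decreases with distance. The motor layout, coordinate convention, and falloff constant below are all assumptions for illustration.

```python
import math

# Hypothetical 2x2 motor layout on the puck, offsets in meters from its center.
MOTOR_POSITIONS = [(-0.02, -0.02), (0.02, -0.02), (-0.02, 0.02), (0.02, 0.02)]

def motor_levels(impact_xy, puck_xy, falloff_m=0.3):
    """Per-motor 0-255 levels for an AR ball impact, in table coordinates."""
    levels = []
    for mx, my in MOTOR_POSITIONS:
        # Motor position in table coordinates.
        px, py = puck_xy[0] + mx, puck_xy[1] + my
        d = math.hypot(impact_xy[0] - px, impact_xy[1] - py)
        level = max(0.0, 1.0 - d / falloff_m)  # linear falloff to zero
        levels.append(round(255 * level))      # 8-bit value for PWM on Arduino
    return levels
```

The resulting four values could then be serialized (e.g. as a comma-separated line) and sent over the existing PC-to-Arduino serial link.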
For this assignment, I worked with ARKit to turn a mobile phone into an effective physical-assistive device. Using ARKit's plane recognition feature, I created a virtual "stick" whose length the user can adjust and which vibrates when it hits the ground plane. From this haptic feedback, the user can navigate a path correctly. This is a more portable and customizable solution than the existing physical stick.
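The hit test behind the vibration can be sketched as a height comparison: extend the stick from the phone along its pointing direction and trigger haptics when the tip reaches the detected ground plane. The function names and inputs below are illustrative stand-ins, not ARKit API.

```python
# Sketch of the stick-hit logic, assuming we already have the phone's world
# position, a unit pointing direction, and the ground plane's height (Y)
# from plane detection. None of these names are real ARKit identifiers.

def stick_tip_y(phone_pos, direction, stick_length):
    """World-space height of the virtual stick's tip."""
    return phone_pos[1] + direction[1] * stick_length

def should_vibrate(phone_pos, direction, stick_length, ground_y, eps=0.01):
    """Trigger haptics when the stick tip reaches (or passes) the ground plane."""
    return stick_tip_y(phone_pos, direction, stick_length) <= ground_y + eps
```

In the actual app this check would run every frame, with the adjustable stick length as the user-facing parameter.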
SketchSpace: Designing Interactive Behaviors with Passive Materials
A common disadvantage of tangible interaction is that the behavior of an interface is limited by its physical form and characteristics and is usually not generally applicable. SketchSpace tackles this problem by turning any passive object into a control input device using camera recognition and 3D projection mapping, so that the user can take advantage of the unique tangibility of each object. It would be interesting to build a massive collection of tangible interactivities on top of an object-image database, letting users customize tangible interaction at scale.
This is a preliminary experiment for my project, which aims to visualize the sensor network data of a room or building in three dimensions. For this first test, I read temperature data over I2C and visualized particles in Unity whose color alpha changes with the sensor readings. The demo is a bit unintuitive since only a single sensor is used, but the intention is to take temperature data from each part of a room, manually map each reading's location in Unity, and visualize the entire room.
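The reading-to-alpha step can be sketched as a clamped linear mapping from a temperature range to a 0..1 alpha. The range endpoints here are assumptions; the actual Unity script may normalize differently.

```python
# Hypothetical mapping from a temperature reading (Celsius) to particle alpha.
# t_min/t_max are illustrative bounds for an indoor room.

def temp_to_alpha(temp_c, t_min=15.0, t_max=35.0):
    """Map a temperature reading to a 0..1 alpha, clamped at the range ends."""
    t = (temp_c - t_min) / (t_max - t_min)
    return min(1.0, max(0.0, t))
```

With multiple sensors, each particle cluster would apply this mapping to the reading from its own mapped location in the room.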
I worked with Mappa.js to visualize a dataset on Google Maps. I found Mappa.js useful for quickly getting interactive online maps running. I tweaked one of the example sketches so that the objects get updated on the user's mouse click.
I used NASA's open data portal and took the meteorite dataset as input. Each row is a record of a meteorite landing with latitude/longitude information. Every time the user clicks, the sketch reads a new row and visualizes where the meteorite landed on Google Maps. It calls the Google Maps API to map latitude/longitude to a city and creates an object for each city. If a landing in a previously seen city is observed, it increments the size of that city's circle. Since there are cases where the city is not included in the API response, I used try/catch to ignore unidentifiable data.
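The per-click aggregation can be sketched as follows: each geocoded city grows a circle, and rows whose geocoding response lacks a city are skipped, mirroring the try/catch in the original sketch. The radius constants and the response shape are assumptions.

```python
# Hypothetical per-click update: circles maps city name -> circle radius.
# BASE_RADIUS and GROWTH are illustrative pixel values.

BASE_RADIUS = 5
GROWTH = 3

def update_circles(circles, geocode_result):
    """Grow (or create) the circle for the city in a geocoding response."""
    try:
        city = geocode_result["city"]  # raises KeyError if no city was found
    except KeyError:
        return circles  # ignore unidentifiable data, as in the original
    if city in circles:
        circles[city] += GROWTH  # repeat landing: grow the existing circle
    else:
        circles[city] = BASE_RADIUS
    return circles
```

In the P5JS version the same logic runs inside the mouse-click handler, with the circle radii driving the drawing calls.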
When I think of human assistance, a topic that comes to mind relates to music and rhythm, since I'm poor at singing and playing instruments. In this project, I explored an approach that extracts sound information from an instrument and provides real-time visual feedback so the player is always making an ideal sound.
Using the knock-sensing interface from Assignment 2, I built an assistive interface for drummers. A piezo sensor attached to a drum captures the strength of each hit, which is sent to P5JS to visualize whether or not the strength is within an appropriate range. The red line indicates the ideal strength and the gray bar is the actual hit value. The red line continuously changes as the music progresses.
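The check behind the visualization can be sketched as a tolerance band around the moving ideal strength: a hit counts as "good" when the piezo reading lands within the band. The tolerance value is an assumption; the actual sketch's acceptable range may be defined differently.

```python
# Hypothetical range check on normalized (0..1) hit strengths.
# tolerance is an illustrative half-width of the acceptable band.

def hit_in_range(hit_strength, ideal_strength, tolerance=0.15):
    """True when the piezo reading is within +/- tolerance of the ideal."""
    return abs(hit_strength - ideal_strength) <= tolerance
```

In the P5JS sketch this boolean would choose the bar's color or trigger the "good hit" feedback while the ideal value is updated along the music's timeline.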
I made a simple animation that responds to knocks, illustrating falling bouncy balls as the user knocks. The animation is mostly based on P5JS example code, and the circuitry is the same as in the previous assignment.