sophiaq – LookingOutwards – 1

Soft Sound uses textiles as transducers. The cloth is woven with flat copper and silver coils, and an alternating current runs through them, turning the fabric into soft speakers. The coils are connected to an amplifier, and a magnet placed close to them forces the coils to move back and forth, producing sound waves. This inspires me because I love to sew and create soft sculpture, but I never imagined putting sound or technology into fabric. The project is effective in integrating the coils into the fabric: the coils are designed into patterns rather than simply placed on top. However, I'd like to see it become more elaborate, with songs and clothing built out of this innovative technology. The project is by EJTECH, founded by Esteban de la Torre and Judit Eszter Karpati, who wanted to innovate for contemporary interior design.
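To make the mechanism concrete for myself, here is a minimal TypeScript sketch that drives a coil the way any ordinary speaker is driven, using the Web Audio API to supply the alternating signal. The setup (a coil simply wired to the computer's audio output) is my own assumption for illustration, not EJTECH's actual circuit.

```
// Minimal sketch: drive a coil speaker like any other audio output.
// Assumption: the textile coil is wired to the amplifier/audio output;
// the oscillator supplies the alternating current that pushes the coil
// back and forth next to the magnet.
const ctx = new AudioContext();

const osc = ctx.createOscillator();
osc.type = "sine";              // smooth alternating signal
osc.frequency.value = 220;      // 220 Hz, an audible low A

const gain = ctx.createGain();
gain.gain.value = 0.2;          // keep the coil's excursion (and volume) modest

osc.connect(gain).connect(ctx.destination);
osc.start();
osc.stop(ctx.currentTime + 2);  // play the tone for two seconds
```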

hdw – Looking Outwards – Week 1

Material Equilibria is an installation by Sean Ahlquist in collaboration with Achim Menges, Bum Suk Ko, Ali Tabatbai, Bettina Woerner, Helene Jensen, Vibeke Riisberg, and Mette Ramsgard Thomsen. The artists used a spring-based environment coded in Processing. The creators did not cite any work as inspiration, but the piece grew out of Ahlquist's research on articulated material behavior and differentiated structural forms. The techniques could be applied to clothing, products, interiors, or other industrial designs in the future. Their work can be viewed in the article "Material Equilibria: Variegated Surface Structures" from the University of Stuttgart.
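Since the piece is described as a spring-based environment, here is a minimal sketch of the kind of particle-spring relaxation such a system relies on, written in TypeScript rather than Processing. The data structures and constants are my own for illustration, not the artists' code.

```
// Minimal particle-spring relaxation step (illustrative constants only).
// Each spring pulls its two endpoints toward a rest length; repeating the
// step lets the mesh settle into an equilibrium form.
type Vec = { x: number; y: number };
type Particle = { pos: Vec; vel: Vec; pinned: boolean };
type Spring = { a: number; b: number; rest: number; k: number };

function step(particles: Particle[], springs: Spring[], dt: number, damping = 0.98): void {
  for (const s of springs) {
    const pa = particles[s.a];
    const pb = particles[s.b];
    const dx = pb.pos.x - pa.pos.x;
    const dy = pb.pos.y - pa.pos.y;
    const dist = Math.hypot(dx, dy) || 1e-9;
    const f = s.k * (dist - s.rest);              // Hooke's law
    const fx = (f * dx) / dist;
    const fy = (f * dy) / dist;
    if (!pa.pinned) { pa.vel.x += fx * dt; pa.vel.y += fy * dt; }
    if (!pb.pinned) { pb.vel.x -= fx * dt; pb.vel.y -= fy * dt; }
  }
  for (const p of particles) {
    if (p.pinned) continue;
    p.vel.x *= damping;                           // dissipate energy so the mesh settles
    p.vel.y *= damping;
    p.pos.x += p.vel.x * dt;
    p.pos.y += p.vel.y * dt;
  }
}
```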

LookingOutward-01

SixthSense site

One interactive project that really stuck with me is the SixthSense project. I first came across it in a TEDx video. SixthSense is a prototype consisting of a projector, a mirror, and a camera. It gives the user an interactive projection/screen on various surfaces in everyday life, controlled with simple hand gestures.

I find this project very inspiring because of its simplicity. The prototype relies on simple hand gestures and physical surfaces that can be found anywhere. It brings high-tech concepts and software into people's everyday lives. The website even has instructions for building your own prototype device.

It is also very interesting to see the potential of the prototype and to envision where this technology will end up. Although a lot of software and hardware has been developed for it, the prototype is still a long way from becoming a refined product.

akluk_LookingOutwards01

https://experiments.withgoogle.com/ai/quick-draw

Quick, Draw! is a very simple game in which the user doodles on an empty canvas while an artificial intelligence tries to guess what the drawing is. As the user adds more to the doodle, the AI's guesses become more accurate.

This application was developed by Jonas Jongejan, Henry Rowley, Takashi Kawashima, Jongmin Kim, Nick Fox-Gieg, and a few other individuals at Google Creative Lab. The creators were first inspired during a hackathon, where they wanted to create a Pictionary-esque game powered by artificial intelligence. It builds on the handwriting-recognition engine developed for Google Translate (which most likely also inspired it), along with other machine learning techniques. The model was trained to identify doodles better and better as more people contributed drawings for it to analyze. It could be further developed into, for example, a picture keyboard.
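To picture the game loop, here is a rough TypeScript sketch of re-classifying the doodle every time a stroke is added, so the guess sharpens as the drawing gains detail. The classifyDoodle function is hypothetical, standing in for whatever recognition model Google actually uses.

```
// Sketch of the core game idea (classifyDoodle is hypothetical, not Google's API):
// re-run the classifier on all strokes so far after each new stroke.
type Point = { x: number; y: number; t: number };
type Stroke = Point[];

// Hypothetical classifier: returns candidate labels with confidence scores.
declare function classifyDoodle(strokes: Stroke[]): { label: string; score: number }[];

const strokes: Stroke[] = [];

function onStrokeFinished(stroke: Stroke): string | null {
  strokes.push(stroke);
  const guesses = classifyDoodle(strokes).sort((a, b) => b.score - a.score);
  const best = guesses[0];
  // Only announce a guess once the model is reasonably confident.
  return best && best.score > 0.8 ? best.label : null;
}
```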

I want to do something with computer vision eventually, so this is close to what I strive to learn and understand. It is also a very creative and fun way to apply machine learning and AI.

afukuda_LookingOutwards01

Over the summer I visited the ArtScience Museum in Singapore, where they were holding the FUTURE WORLD exhibition. The exhibition included teamLab's Sketch Town project, designed to engage children and inspire them to meet the challenges of the urban environment as a shared space. I usually feel that learning through technology has negative consequences, but teamLab's project showed technology's capacity to foster children's imagination, creativity, and growth, which was an inspirational experience. The key component of the project is the PFU ScanSnap, a scanner that converts 2D drawings into 3D objects on a virtual screen, giving children a platform to visualize how their drawings can take part in a larger urban system.

Link to work: https://www.teamlab.art/w/sketchtown (Author: teamLab Title: Sketch Town)


mecha-lookingoutwards-01

Daily Tous Les Jours

Every year, the Quartier des spectacles in Montréal is decorated with installation art meant to bring the surrounding community together. In 2011, the Canadian design firm Daily Tous Les Jours led a team of over forty individuals, including specialists in design, technical direction, music, and animal/human behavior, to create the urban instrument 21 Balançoires.

Each swing represents a specific instrument (piano, guitar, harp, or vibraphone) and plays a note based on how high the user swings. Depending on how users interact with the swings and with each other, the instrument can play scales and melodies. To play the note corresponding to a swing's height, the installation relies on sensors hidden in the swings that report each swing's position to a central computer.
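As a small illustration of that height-to-note mapping, here is a TypeScript sketch that turns a sensed swing angle into a note from a scale. The scale, the maximum angle, and the MIDI numbers are my own guesses, not the installation's actual mapping.

```
// Illustrative mapping from swing amplitude to a note (assumed values).
const C_MAJOR_MIDI = [60, 62, 64, 65, 67, 69, 71, 72]; // C4 .. C5

// angleDeg: peak swing angle reported by the hidden sensor, in degrees from rest.
function noteForSwing(angleDeg: number, maxAngleDeg = 60): number {
  const normalized = Math.min(Math.max(angleDeg / maxAngleDeg, 0), 1);
  const index = Math.min(
    Math.floor(normalized * C_MAJOR_MIDI.length),
    C_MAJOR_MIDI.length - 1
  );
  return C_MAJOR_MIDI[index]; // MIDI note the central computer would trigger
}
```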

What inspired me most about this project was the design firm's ability to bring life to an area that had previously been closed off and essentially abandoned. 21 Swings became so popular that the installation has been scheduled to reappear every spring since. Through its playful nature, the piece attracts people of all ages and demographics.

BrandonHyun-LookingOutwards-01


Videos: Motion Phone at Ars Electronica, 1996; Motion Phone at SIGGRAPH 1995; Motion Sketch at Brown University, 1989-1994

Scott Snibbe is a pioneer in augmented reality, gesture-based interfaces, digital video, and interactive art. He started the Motion Phone project in 1995; it evolved out of an exploration of how to make cinema out of one's body.

Motion Phone was created by Scott Snibbe himself. He explains that the program is a "networked version" of Motion Sketch: rather than running on a single machine, several different computers are connected to run the program together.

Motion Phone builds on Motion Sketch, which Snibbe also created; it attaches the movements of one's hand to the movements of abstract forms.
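Here is a rough TypeScript sketch of that idea of attaching hand movement to an abstract form: record the pointer's path, then let a shape retrace it in a loop. This is only my own illustration, not Snibbe's code.

```
// Record the hand's (mouse's) path and loop it back as the motion of a form.
type Pt = { x: number; y: number };

const canvas = document.querySelector("canvas")!;
const g = canvas.getContext("2d")!;
const path: Pt[] = [];
let frame = 0;

// While the hand moves, remember each position.
canvas.addEventListener("mousemove", (e) => path.push({ x: e.offsetX, y: e.offsetY }));

// Every frame, an abstract form retraces the recorded gesture.
function animate(): void {
  g.clearRect(0, 0, canvas.width, canvas.height);
  if (path.length > 0) {
    const p = path[frame % path.length];
    g.beginPath();
    g.arc(p.x, p.y, 20, 0, Math.PI * 2);
    g.fill();
    frame++;
  }
  requestAnimationFrame(animate);
}
animate();
```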

Scott Snibbe had been inspired by two experimental animation pioneers. The first, Oskar Fischinger, pioneered a cinema of pure abstraction. His earliest films are simple black and white forms, drawn frame-by-frame in charcoal. Yet the resulting movements, such as in Study Number 7 (1931), have incredible emotional power. The second pioneer, Len Lye, pioneered “direct cinema,” created by marking directly on the film surface with pens, inks, or by scratching emulsion off of black leader.

This project is especially interesting because it is an early form of human-computer interaction. Since we are familiar with augmented reality today, it is fascinating to see how Snibbe was able to pioneer in that area.

For reference

https://www.snibbe.com/


rgroves – Looking Outwards 01


The Light Clock is a project that has been on display at the Carnegie Museum of Art by the new media collective the Innovation Studio. Outside the museum, there is a clock with a hand that makes a rotation every 5 minutes, and when it reaches the top it takes a 360-degree photograph of its surroundings. In the lobby of the museum, the photographs feed an interactive display that allows viewers to spin their bodies to the left to change the point of view and to the right to change the time. The project required both new software and new hardware, as nothing similar had been done before. They don't elaborate on this in the blog post I found, but apparently the sensor that detected when and in what direction a viewer was spinning was particularly challenging.

The artists were all particularly inspired by a quote from critical theorist Roland Barthes: "…cameras, in short, were clocks for seeing…" The project is as much about the clock as it is about the camera. It is meant to explore how we perceive space and time as deeply, intrinsically related when in many ways they are not. Seeing them so blatantly ripped apart is jarring to the viewer. The display curves around you, increasing the sense that you've been transported to a universe where you cannot navigate space and time simultaneously.
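Here is a speculative TypeScript sketch of the lobby interaction as I understand it from the blog post: spinning in one direction pans the panorama, spinning in the other scrubs through time. The variable names, thresholds, and which direction does what are assumptions on my part.

```
// Speculative sketch of the display logic (my assumptions, not the Studio's code).
let viewAngle = 0;   // where in the current 360° photo the viewer is looking
let photoIndex = 0;  // which 5-minute capture is on screen

function onViewerRotation(deltaDeg: number, photoCount: number): void {
  if (deltaDeg > 0) {
    // Spinning one way: change the point of view within the current panorama.
    viewAngle = (viewAngle + deltaDeg) % 360;
  } else {
    // Spinning the other way: step to a photograph from a different time.
    photoIndex = (photoIndex + 1) % photoCount;
  }
}
```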

On their blog, the artists describe this project as the ultimate new media project. They claim that the success of this artwork has forced the museum to "consider where art objects can come from" and to ponder whether perhaps the Light Clock isn't art but something else entirely. I'm not sure it's quite as groundbreaking as they say, but in the context of the Richard Serra and Henry Moore sculptures that neighbor it, it is certainly a unique project.

https://studio.carnegiemuseums.org/clock-9aa6da28a4e5

katieche – looking outwards 01

https://www.ingenie.com/one-four-nine/

One Four Nine is an interactive web- and smartphone-based game in which the player has to solve a case surrounding a car crash. Designed by Epiphany for Ingenie, an insurance brand for young drivers, the site uses first-person interactive video along with sound to place the user in the perspective of an investigator at the crash site. Not only is the video beautifully made, the graphics and the movement flow of the entire project were stunning and inspired me to look further into interaction design.

Epiphany used GSAP (a JavaScript animation library), Hammer.js, and Zepto.js to create One Four Nine. As a company that has received several awards for its other websites, I'm sure it has a well-rounded team of designers and coders; however, I could not find exactly how they organized themselves or how many people were involved. The project serves primarily as an educational site to raise awareness about distracted driving, a common stereotype of young drivers. I believe the use of gamification in education will continue to grow, as it has proven successful in being both entertaining and informative for our short-attention-spanned youth.
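Out of curiosity, here is a speculative TypeScript sketch of how those libraries commonly fit together: Hammer.js recognizes a gesture and GSAP animates the scene in response. This is not Epiphany's code; the element IDs and animation values are made up, and I'm using the global builds of both libraries.

```
// Speculative pairing of Hammer.js (gesture) with GSAP (animation).
declare const Hammer: any;  // hammer.js global build
declare const gsap: any;    // GSAP global API

const scene = document.getElementById("scene")!;  // hypothetical container
const hammer = new Hammer(scene);

// Swiping left slides the first-person view toward the next hotspot.
hammer.on("swipeleft", () => {
  gsap.to(scene, { x: -window.innerWidth, duration: 0.6, ease: "power2.out" });
});
```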


jennyzha – looking outwards 01

Karolina Kurkova attends the 'Manus x Machina: Fashion In An Age Of Technology' Costume Institute Gala at Metropolitan Museum of Art on May 2, 2016 in New York City. (Photo by Larry Busacca/Getty Images)

A particular computational project that I found inspiring was the Marchesa x IBM Watson dress collaboration for the Met Gala. While technology isn't typically something one would associate with fashion or art, Marchesa and IBM were able to harmoniously showcase their inspiration and last year's Met Gala theme, "Manus x Machina: Fashion in an Age of Technology."

Cognitive Marchesa dress lights up the night

The "cognitive dress," as it was called, was covered in fabric flowers embedded with LED lights that changed color. While the dress physically stayed true to Marchesa's signature elegance, it also stayed true to Watson's analytical power. Prior to the gala, Watson analyzed social media conversation around Marchesa, assigning the emotions of the posts to different colors (rose for joy, coral for passion, aqua for excitement, lavender for curiosity, and butter for encouragement).
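That emotion-to-color pairing is easy to picture as a simple lookup table. Here is a minimal TypeScript sketch; only the pairings come from the coverage of the project, while the RGB values and names are my own guesses.

```
// Emotion-to-LED-color lookup (pairings from the article; RGB values assumed).
const EMOTION_COLORS: Record<string, [number, number, number]> = {
  joy:           [255, 102, 153], // rose
  passion:       [255, 127, 80],  // coral
  excitement:    [0, 255, 255],   // aqua
  curiosity:     [200, 162, 200], // lavender
  encouragement: [255, 255, 153], // butter
};

// Given the dominant emotion Watson finds in recent posts, pick the LED color.
function ledColorFor(emotion: string): [number, number, number] {
  return EMOTION_COLORS[emotion] ?? [255, 255, 255]; // default: white
}
```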

The team included a portion of the IBM Watson team, as well as Georgina Chapman and Keren Craig of Marchesa, and they began their process just five weeks before the gala. There were no similar prior projects, so they were pioneers, opening up opportunities for further innovation between fashion and technology.