created by Phoebe Lin, Julia Solano, Yakshu Madaan, Tomas Vega

Short description

  1. Creator’s vision:
    1. WHAT?   An affordable and efficient method to help visually impaired people navigate in space and avoid getting lost.
    2. HOW?   By wearing a technological artefact that indicates walking paths and detects obstacles along them. The information is translated into haptic output (vibrations).
    3. WHY?   To replace existing aids (white cane, guide dogs) that are expensive, difficult to handle, and unfashionable, and that can lead to head injuries, with a non-obtrusive way of navigating.
  2. Steps:
    • Conducting research based on real people’s experiences
    • First-hand experimentation
    • Design of the computational loop
    • Fabrication of the device
  3. Innovation:
    • Two basic functions at the same time: (i) obstacle detection and (ii) navigation assistance
    • Ultrasonic waves for the detection of spatial obstacles
    • GPS activation through voice command
    • All visual information is encoded into haptic feedback
    • Wearable, easy to carry
    • Fashionable accessories (necklace, glove)
  4. Details:

Necklace

Arduino mini

Ultrasonic sensor (2 meters range)

Vibrating motor at the nape of the neck (high sensitivity)
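The necklace's obstacle-detection loop (ultrasonic sensor plus vibrating motor) can be sketched roughly as follows. This is a minimal illustration, not the creators' code: the speed-of-sound constant and the closer-obstacle-means-stronger-vibration mapping are my assumptions; only the 2-meter range is stated above.

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at ~20 °C
MAX_RANGE_M = 2.0           # sensor range stated in the notes above

def echo_time_to_distance(echo_time_s: float) -> float:
    """Convert an ultrasonic round-trip echo time (seconds) to distance in meters."""
    # The pulse travels out and back, so halve the total path length.
    return echo_time_s * SPEED_OF_SOUND_M_S / 2.0

def vibration_intensity(distance_m: float) -> float:
    """Map obstacle distance to a 0..1 motor intensity (assumed mapping):
    the closer the obstacle, the stronger the vibration."""
    if distance_m >= MAX_RANGE_M:
        return 0.0  # nothing within range: motor stays off
    return 1.0 - distance_m / MAX_RANGE_M
```

On an actual Arduino mini the echo time would come from timing the sensor's echo pin and the intensity would drive the motor via PWM.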

Glove

GPS with button (sends current location via Wi-Fi)

Triple-axis Accelerometer + Magnetometer (Compass) Board

Vibrating motor

On a press of the button, the GPS sends the user's current location to the Google Maps Directions API along with the destination coordinates. The API returns an array of steps to the destination. The software computes the direction vector as an angle from 0 to 360 degrees. Finally, when the user swings an arm and it falls within the range of the correct direction (+/- 15 degrees), the device delivers haptic feedback indicating that he/she is pointing in the correct direction.
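The direction-vector and +/- 15-degree check described above could be sketched like this. A minimal sketch under my own assumptions: the great-circle bearing formula and function names are illustrative, and the real device gets its target from the Google Maps Directions API steps and its arm heading from the glove's magnetometer.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), 0-360 degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def should_vibrate(arm_heading_deg, target_bearing_deg, tolerance_deg=15.0):
    """True when the arm points within +/- tolerance of the target bearing,
    handling the wrap-around at 0/360 degrees."""
    diff = abs(arm_heading_deg - target_bearing_deg) % 360.0
    return min(diff, 360.0 - diff) <= tolerance_deg
```

For example, with a target bearing of 90 degrees (due east), an arm heading of 100 degrees triggers the motor, while 110 degrees does not.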

  5. Contributions:
    1. Accessories with integrated technologies for visual impairment
    2. Easier navigation
    3. An experiential example of sensory substitution and intermodal transfer of information
    4. Low-cost/affordable

https://youtu.be/KK4kdUO4hAU

My thoughts:

  1. Good points:
    1. Interviewing and hands-on experimentation
    2. The placement of a sensor at a higher body level, such as the neck
    3. The customized GPS system and the idea that a visual path can be decoded into haptic feedback
    4. The device's integration into accessories
  2. Possible Improvements:
    • Multiple accessories with integrated sensors
    • The arm swing in the air would be exhausting and impractical for the user. In my experience, visually impaired people do not want to expose themselves with superfluous movements that emphasize their sensory impairment. Personally, I would use the body's four limbs to turn the whole body into a remote control: (1) right arm → turn right, (2) right leg → go ahead, (3) left leg → go back, (4) left arm → turn left, (5) combinations among them. I would apply integrated accessories on the four limbs, and according to the distribution of haptic feedback across them, the user would develop an inner ability to orient himself/herself in space.
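My four-limb suggestion amounts to a simple mapping from navigation commands to limb-mounted motors. The sketch below is purely hypothetical (the command names, the "+" syntax for combinations, and the motor labels are all my own invention), but it shows how little logic the dispatch would need.

```python
# Hypothetical mapping following the four-limb scheme suggested above.
LIMB_FOR_COMMAND = {
    "turn_right": "right_arm",
    "go_ahead":   "right_leg",
    "go_back":    "left_leg",
    "turn_left":  "left_arm",
}

def motors_to_pulse(command: str):
    """Return the limb motor(s) to pulse for a navigation command.
    Combinations are joined with '+', e.g. 'go_ahead+turn_left'."""
    return [LIMB_FOR_COMMAND[part] for part in command.split("+")]
```

For example, "go_ahead+turn_left" would pulse the right leg and the left arm together.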