Using advanced technologies such as lidar (also found in autonomous vehicles), Strap's assistive device aims to help visually impaired people navigate their environment.
Lots of companies are pouring resources into teaching cars to see the world around them. Now a startup called Strap Technologies is developing a wearable pod that uses some of the same kinds of sensors to give blind people a clearer sense of their surroundings.
“Each sensor has a different resolution, has a different threshold,” says founder and CEO Diego Roel. “We use the best of each sensor and we combine them.”
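Strap has not published how its device combines its sensors, but Roel's description suggests some form of sensor fusion. A minimal, purely hypothetical sketch of one common approach (a confidence-weighted average, where each sensor's estimate is weighted by how reliable that sensor is at a given range) might look like this; the `Reading` type and `fuse` function are illustrative names, not Strap's API:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    distance_m: float   # one sensor's estimated distance to the obstacle
    confidence: float   # 0..1, how reliable this sensor is at this range

def fuse(readings):
    """Confidence-weighted average of distance estimates.

    A generic fusion scheme for illustration only; Strap has not
    disclosed how its device actually combines sensor data.
    """
    total = sum(r.confidence for r in readings)
    if total == 0:
        return None  # no sensor produced a usable reading
    return sum(r.distance_m * r.confidence for r in readings) / total

# e.g. a precise short-range sensor disagreeing slightly with a
# lower-confidence long-range one
print(fuse([Reading(2.0, 0.9), Reading(2.4, 0.3)]))
```

The point of weighting by confidence is the one Roel makes: each sensor has a different threshold, so the combination leans on whichever sensor is most trustworthy in the current conditions.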
Strap’s chest-worn device weighs less than half a pound and is scheduled to go on sale next summer for $750. (It’s currently available for $500 on pre-order.) It calculates the proximity of such hazards as walls, steps, nearby people, and bumps in a sidewalk. Then it conveys this information to the user via haptic feedback: its four straps vibrate according to a grammar that users have to learn.
“The pattern and the strength of this means where the obstacle is, how to avoid it, and how far away it is,” Roel says, adding that the two most expensive components are the device’s radar sensors and microcontrollers. He notes that the device is designed to run for 72 hours on a single charge.
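The "grammar" itself is unpublished, but the idea Roel describes (which strap vibrates tells you where the obstacle is, and how strongly it vibrates tells you how close it is) can be sketched with a hypothetical encoding. Everything here, including the four-quadrant mapping and the 4-meter range, is an assumption for illustration, not Strap's actual scheme:

```python
def haptic_pattern(bearing_deg, distance_m, max_range_m=4.0):
    """Map an obstacle to (strap index, vibration strength 0..1).

    Hypothetical encoding: one of four straps is chosen by the
    obstacle's bearing quadrant, and closer obstacles produce a
    stronger vibration. The real device's grammar is not public.
    """
    strap = int((bearing_deg % 360) // 90)            # 4 straps, one per quadrant
    strength = max(0.0, 1.0 - distance_m / max_range_m)  # nearer = stronger
    return strap, round(strength, 2)

print(haptic_pattern(10, 1.0))   # obstacle roughly ahead, 1 m away
```

Whatever the real mapping looks like, the trade-off is the same: a richer grammar conveys more about the scene, but takes users longer to learn.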
Strap began work on its device about three years ago. “We underestimated the technology complexity we needed to [make] this device,” Roel says of the journey since. Some 250 people are currently testing the device.