Soon we might all see with sound

Bats use echolocation to see. Could wearable tech allow humans to do so too?

Kathryn Nave | April/May 2020

Rattling around the hills of Orange County, Brian Bushway looks like any other Californian mountain biker. The steady clicking of his tongue is the only sign of something unusual. But this is no nervous tic. Bushway is an echolocator: because he is blind, he has learned to use the sound of his tongue to work out his location.

Echoes are a constant source of information about the environment around us. Though this subtly shapes our experience, it usually registers only as a vague sense of open space or claustrophobic confinement. Consciously using that information requires considerable training. Even then, most echolocators can “see” only in fuzzy resolution: objects smaller than around 10cm typically go undetected unless they’re very close by. Bats, by contrast, can detect the contours of a human hair with their high-frequency echolocation calls.

To make us more bat-like, Fernando Albertorio, a sight-impaired entrepreneur based in America, dreamed up a solution: the Sunu band. A silicone strap roughly the size of a fitness tracker emits ultrasound waves that are closer to a bat's frequency than human tongue-clicks. It senses how far away objects are by measuring how quickly these waves bounce back off them. The user receives this information through vibrations: the closer an object, the stronger the strap buzzes.
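The principle behind the band is straightforward time-of-flight ranging. A minimal sketch of the idea in Python, assuming a hypothetical sensor readout of the echo's round-trip time and an assumed 5-metre maximum range (none of these names or parameters come from the Sunu band's actual firmware):

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 °C


def echo_distance_m(round_trip_s: float) -> float:
    """Distance to an object from an ultrasound pulse's round-trip time.

    The pulse travels to the object and back, so divide the path by two.
    """
    return SPEED_OF_SOUND_M_S * round_trip_s / 2


def vibration_intensity(distance_m: float, max_range_m: float = 5.0) -> float:
    """Map distance to a 0-1 buzz strength: closer objects buzz harder.

    A linear ramp is an illustrative assumption, not the device's real curve.
    """
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - distance_m / max_range_m


# An echo returning after 10 ms implies an object about 1.7 m away.
distance = echo_distance_m(0.010)
buzz = vibration_intensity(distance)
```

The division by two is the crucial step: the measured time covers the pulse's journey out and back, so the object sits at half the total path length.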

The $299 band can detect objects the size of a coin. Albertorio hopes soon to link the device up with Google Maps data and provide vibration-based directions.

For the estimated 285m visually impaired people worldwide, Sunu offers greater independence. For the merely distracted, it may also reduce the number of smartphone-induced lamppost collisions.