As bats swoop around objects, they send out high-pitched sound waves that then bounce back to them at different time intervals. This helps the tiny mammals learn more about the geometry, texture, or movement of an object.
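The underlying arithmetic is simple: sound travels at a roughly fixed speed, so the round-trip delay of an echo encodes the distance to whatever reflected it. The snippet below is a minimal Python sketch of that echo-ranging idea using illustrative numbers; it is not code from the study.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 °C

def echo_distance(round_trip_delay_s: float) -> float:
    """Distance to a reflector, given the round-trip delay of its echo."""
    return SPEED_OF_SOUND * round_trip_delay_s / 2.0  # halve: sound goes out and back

# Example: an object 1.5 m away returns an echo after roughly 8.7 ms.
delay = 2 * 1.5 / SPEED_OF_SOUND
print(f"round-trip delay: {delay * 1000:.1f} ms")
print(f"recovered distance: {echo_distance(delay):.2f} m")
```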
If humans can similarly recognize these three-dimensional acoustic patterns, it could literally expand how we see the world, says study author Miwa Sumiya, Ph.D., a researcher at the Center for Information and Neural Networks in Osaka, Japan.
“Examining how humans acquire new sensing abilities to recognize environments using sounds, or echolocation, may lead to the understanding of the flexibility of human brains,” says Sumiya. “We may also be able to gain insights into sensing strategies of other species by comparing with knowledge gained in studies on human echolocation.”
This study is not the first to demonstrate echolocation in humans—previous work has shown that people who are blind can use mouth clicking sounds to “see” two-dimensional shapes. But Sumiya says that this study is the first to explore a particular kind of echolocation called time-varying echolocation. Beyond simply locating an object, time-varying echolocation would let users perceive its shape and movement as well.
To test participants’ ability to sense objects through echolocation, Sumiya’s team gave them headphones and two tablets—one to generate their synthetic echolocation signal, and the other to listen to the recorded echoes. In a second room not visible to participants, two oddly shaped cylinders would either rotate or stand still. The cross-section of these cylinders resembled a bike wheel with either four or eight spokes.
When prompted, the 15 participants initiated their echolocation signals through the tablet. The signals were emitted in pulses that traveled into the second room and bounced off the cylinders.
It took a bit of creativity to transform the sound waves back into something the human participants could recognize. “The synthetic echolocation signal used in this study included high-frequency signals up to 41 kHz that humans cannot hear,” Sumiya explains. For comparison, bat echolocation signals in the wild range from 9 kHz all the way to 200 kHz, much of it well beyond our hearing range of 20 Hz to 20 kHz.
The researchers employed a one-seventh scale dummy head with a microphone in each ear to record the sounds in the second room before transmitting them back to the human participants.
The microphones rendered the echoes binaural, like the surround-sound you might experience at a movie theater or while watching an autonomous sensory meridian response (ASMR) video recorded using a binaural mic. The signals received by the miniature head were also pitched down to one-eighth of their original frequency so the human participants could hear them “with the sensation of listening to real spatial sounds in a 3D space,” says Sumiya.
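One common way to make ultrasound audible is time expansion: replay the recorded samples at a lower rate, and every frequency component drops by the same factor. The study’s exact processing isn’t described here, so the following is only a rough Python/SciPy sketch of that general idea using the one-eighth figure above; the 41 kHz tone burst, sample rate, and file name are illustrative stand-ins.

```python
import numpy as np
from scipy.io import wavfile

FS = 192_000   # recording sample rate, high enough to capture a 41 kHz component
SCALE = 8      # slow-down factor, matching the one-eighth figure in the article

# A stand-in ultrasonic echo: a 10 ms tone burst at 41 kHz (not the study's actual signal).
t = np.arange(int(0.01 * FS)) / FS
echo = 0.5 * np.sin(2 * np.pi * 41_000 * t)

# Writing the same samples with a sample rate of FS/SCALE ("time expansion")
# stretches the sound 8x in time and lowers every frequency 8x, so the
# 41 kHz component plays back at about 5.1 kHz, well inside human hearing.
wavfile.write("echo_time_expanded.wav", FS // SCALE, echo.astype(np.float32))
print(f"41 kHz component now plays back at {41_000 / SCALE / 1000:.3f} kHz")
```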
Finally, the researchers asked participants to determine whether the echoes they heard were from a rotating or a stationary object. In the end, participants could reliably tell the two rotating cylinders apart by listening to the pitch of the time-varying echoes bouncing off them.
They were less adept at identifying the shapes from the stationary cylinders. Nevertheless, the researchers say that this is evidence that humans are capable of interpreting time-varying echolocation.
Sumiya hopes the technique could one day help people perceive their spatial surroundings in a different way; for example, by helping visually impaired users better sense the shape and features of objects around them.
The next step for this research is to give participants freedom to move around when they’re interpreting these echolocation signals, Sumiya says. That will more closely mimic the action bats might take when using echolocation “because echolocation is ‘active’ sensing.”