Decoding the Source of Sound

Researchers at Massachusetts Eye and Ear unravel part of the mystery

As baby boomers age, many experience difficulty hearing and understanding conversations in noisy environments, such as restaurants. These situations are especially challenging for people who are hearing-impaired and who wear hearing aids or cochlear implants.

Researchers know that the ability to locate the source of a sound is vital to hearing well in these situations, but designing devices that work better in noisy environments requires a much deeper understanding of how hearing works.

Researchers from the Eaton-Peabody Laboratories of Massachusetts Eye and Ear, Harvard Medical School, and the Research Laboratory of Electronics at the Massachusetts Institute of Technology have gained new insight into how localized hearing works in the brain. Their research was published Oct. 2 in the Journal of Neuroscience.

“Most people are able to locate the source of a sound with ease, for example, a snapping twig on the left, or a honking horn on the right. However, this is actually a difficult problem for the brain to solve,” said study co-author Mitchell Day, instructor of Otology and Laryngology at Harvard Medical School and investigator in the Eaton-Peabody Laboratories at Mass. Eye and Ear.

“The higher levels of the brain that decide the direction a sound is coming from do not have access to the actual sound, but only the representation of that sound in the electrical activity of neurons at lower levels in the brain. How higher levels of the brain use information contained in the electrical activity of these lower-level neurons to create the perception of sound location is not known,” Day said.

In this experiment, researchers recorded the electrical activity of individual neurons in an essential lower-level auditory brain area called the inferior colliculus (IC) while an animal listened to sounds coming from different directions.

They found that the location of a sound source could be accurately predicted from the pattern of activation across a population of fewer than 100 IC neurons. That is, a particular pattern of IC activation indicated a particular location in space. The researchers further found that the pattern of IC activation could correctly distinguish whether a single sound source was present or two sources coming from different directions.
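To make the idea of population pattern decoding concrete, here is a minimal, hypothetical sketch in Python. It simulates firing rates for a small population of IC-like neurons with assumed azimuth tuning, then classifies the direction of each test response by matching its population pattern to the nearest trained template. The tuning curves, Poisson noise, and nearest-template classifier are illustrative assumptions, not the decoder used in the study.

import numpy as np

rng = np.random.default_rng(0)

n_neurons = 80                       # small population, fewer than 100 neurons
azimuths = np.linspace(-90, 90, 13)  # candidate source directions (degrees)

# Hypothetical tuning: each neuron's mean rate is a sigmoid of azimuth,
# with a random slope, sign, and midpoint.
midpoints = rng.uniform(-60, 60, n_neurons)
slopes = rng.uniform(0.05, 0.2, n_neurons) * rng.choice([-1, 1], n_neurons)

def mean_rates(az):
    """Mean firing rate (spikes/s) of every neuron for a source at azimuth az."""
    return 10 + 40 / (1 + np.exp(-slopes * (az - midpoints)))

def simulate(az, n_trials):
    """Simulate noisy spike counts (Poisson) for n_trials at a given azimuth."""
    return rng.poisson(mean_rates(az), size=(n_trials, n_neurons))

# Build an average "template" pattern per azimuth from training trials.
templates = np.stack([simulate(az, 50).mean(axis=0) for az in azimuths])

# Decode held-out trials: pick the azimuth whose template is closest
# to the observed population pattern.
correct, n_test = 0, 200
for _ in range(n_test):
    true_idx = rng.integers(len(azimuths))
    response = simulate(azimuths[true_idx], 1)[0]
    decoded_idx = np.argmin(np.linalg.norm(templates - response, axis=1))
    correct += decoded_idx == true_idx

print(f"decoded {correct}/{n_test} test trials correctly")

Even this toy decoder recovers source direction well above chance from the joint pattern of activity, which is the intuition behind reading location out of a modest population of neurons rather than from any single cell.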

“Our results show that higher levels of the brain may be able to accurately segregate and localize sound sources based on the detection of patterns in a relatively small population of IC neurons,” said Day.

It is not yet known why individuals with hearing impairment have difficulty in noisy environments, such as understanding speech in a restaurant, even when the sound they are trying to hear is audible, Day said.

The current study showed that the presence of a single sound source versus two sources at different locations can be reliably decoded from the pattern of activation across a relatively small population of neurons.

This “pattern decoder” may be a useful tool in understanding what deficits occur in the brains of those with hearing impairment. Knowledge of such neural effects could guide the development of treatment and the design of new prosthetic devices.

“We hope to learn more so that someday we can design devices that work better in noisy environments,” Day said.

This work was funded by National Institute on Deafness and Other Communication Disorders grants R01 DC002258 and P30 DC005209.

Adapted from a Massachusetts Eye and Ear Infirmary news release.