(Credit: F.A. Giovanella/Unsplash)

If You Have Trouble Listening in a Crowded Room, Your Hearing May Not Be to Blame

Scientists have found that individuals with higher IQ scores have an increased ability to discern speech when other voices are talking simultaneously, compared to people with similar hearing ability but a lower IQ.

The research team behind the discovery suggested the findings could benefit individuals with lower cognitive abilities by prompting educators to assess the environmental conditions that may “challenge their complex listening thresholds” and to make adjustments accordingly, such as changing a student’s seating position to improve attention and hearing.

According to a statement from the team, people with autism and those with fetal alcohol syndrome often complain of an inability to discern one voice from another when multiple people are talking at the same time. The researchers also noted that these two groups tend to show a wider range of IQ scores than the general population, which led them to wonder whether the ability to separate voices could be connected to cognitive processing rather than diminished hearing.

To conduct the study, 10 people with autism and 12 with fetal alcohol syndrome were enlisted as volunteers. A third group of 27 “age- and biological sex-matched” neurotypical individuals was also recruited to function as a control. The ages of the volunteers across all groups ranged from 13 to 47 years.

Next, the study’s lead investigator, Bonnie Lau, a research assistant professor in otolaryngology-head and neck surgery at the University of Washington School of Medicine and a director of laboratory studies of auditory brain development, had her team test each volunteer’s hearing to confirm they were clinically normal. Each volunteer was then given a computer screen and headphones and told to follow the complex listening challenge on the screen.

During the challenge, the listeners were told to focus on one male voice as two other voices “emerged” from the background. While the target speaker’s voice was always male, the two background voices could be either one male and one female or both male. Notably, the team did not test the ability to distinguish one male voice from two female voices speaking simultaneously.

Rather than speaking randomly, all the voices stated a single sentence that started with a call sign and then included a color and number, such as “Ready, Eagle, go to green five now.” While the voices were speaking, the volunteers were asked to periodically check a box on the screen that corresponded to the color and number spoken by the target male voice, and not the different options mentioned by the “background” voices. Finally, each volunteer was administered standardized tests of intelligence that included assessments of verbal and nonverbal abilities, as well as perceptual reasoning.

After comparing the data, the team found a clear pattern across the study’s 49 volunteers: those with lower cognitive ability had a more difficult time separating the main speaker’s voice from the competing voices. The correlation held across the full range of IQ scores, suggesting that a listener’s cognitive processing ability, rather than hearing, may determine how hard it is to separate simultaneous speakers.

“The relationship between cognitive ability and speech-perception performance transcended diagnostic categories,” Lau explained. “That finding was consistent across all three groups.”

When discussing the findings, Lau said the results made sense from a cognitive standpoint. For example, the professor noted that the first task in listening in a crowd involves simply separating the “streams of speech” from one another to focus on the intended speaker, “and part of that is suppressing the competing noise characteristics.”

After that, Lau explained, the listener must comprehend the speaker’s words “from a linguistic standpoint” by “coding each phoneme, discerning syllables and words.” She also emphasized the social demands of conversation, such as smiling and nodding in response.

“All these factors increase the cognitive load of communicating when it is noisy,” Lau explained.

In their conclusion, the authors highlighted the need for further research given the study’s relatively small size and other limitations. They also noted that understanding the challenges lower-IQ individuals may face when listening in a crowd could help educators and other professionals design strategies to limit the cognitive load placed on them, especially when discerning between voices is required. For Lau, the work offers hope to people who thought they were experiencing hearing loss by identifying another factor that may make listening in a crowd harder for some than others, and by pointing toward strategies to mitigate the problem.

“You don’t have to have a hearing loss to have a hard time listening in a restaurant or any other challenging real-world situation,” she concluded.

The study, “The relationship between intellectual ability and auditory multitalker speech perception in neurodivergent individuals,” was published in PLOS One.

Christopher Plain is a Science Fiction and Fantasy novelist and Head Science Writer at The Debrief. Follow and connect with him on X, learn about his books at plainfiction.com, or email him directly at christopher@thedebrief.org.