Illustration of individuals struggling to converse in a crowded, noisy restaurant, representing a study on cognitive ability and speech comprehension in noisy environments.

Cognitive ability tied to understanding speech in noisy settings, study finds


Among people with clinically normal hearing, intellectual ability strongly predicted how well they understood speech amid competing voices, according to a peer-reviewed study from University of Washington researchers.

Researchers at the University of Washington School of Medicine report a strong association between general intellectual ability and the capacity to follow speech in noisy environments. The paper, published in PLOS One on September 24, 2025, found the link held across diagnostic groups. (doi.org)

The study enrolled 49 participants: 12 with autism, 10 with fetal alcohol spectrum disorder (FASD) and 27 age- and sex-matched comparison participants. Ages spanned roughly 13 to 47 years. All participants met criteria for typical hearing after audiology screening; one autistic participant who did not pass was excluded. (journals.plos.org)

Participants completed a computer-based “multitalker” listening task built from Coordinate Response Measure sentences of the form “Ready [callsign], go to [color] [number] now.” The target voice was always male and identified by the callsign “Charlie,” while two competing talkers (“maskers”) were presented from different simulated spatial locations. After each trial, listeners selected the matching color and number on a screen. (doi.org)
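The trial structure described above can be sketched in code. This is a minimal illustrative simulation, not the study's actual software: the callsign, color, and number sets below are assumptions for demonstration, and a trial is scored correct only when both the color and the number match the target sentence.

```python
import random

# Illustrative sketch of a Coordinate Response Measure (CRM) trial:
# "Ready [callsign], go to [color] [number] now."
# The callsigns, colors, and numbers here are assumed examples,
# not the study's exact stimulus set.
CALLSIGNS = ["Charlie", "Ringo", "Laker", "Hopper"]
COLORS = ["red", "green", "blue", "white"]
NUMBERS = list(range(1, 9))

def make_trial(target_callsign="Charlie", n_maskers=2, rng=random):
    """Build one trial: a target sentence plus competing masker sentences."""
    target = {
        "callsign": target_callsign,
        "color": rng.choice(COLORS),
        "number": rng.choice(NUMBERS),
    }
    # Maskers use different callsigns so only the target carries "Charlie".
    maskers = [
        {
            "callsign": rng.choice([c for c in CALLSIGNS if c != target_callsign]),
            "color": rng.choice(COLORS),
            "number": rng.choice(NUMBERS),
        }
        for _ in range(n_maskers)
    ]
    return target, maskers

def score_response(target, chosen_color, chosen_number):
    """A trial counts as correct only if both color and number match the target."""
    return chosen_color == target["color"] and chosen_number == target["number"]

target, maskers = make_trial()
print(f'Ready {target["callsign"]}, go to {target["color"]} {target["number"]} now.')
```

In the actual study the maskers were presented from different simulated spatial locations and the target-to-masker level was varied to estimate a listening threshold; that psychoacoustic machinery is omitted here.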

Intellectual ability was measured with the Wechsler Abbreviated Scale of Intelligence–Second Edition (WASI‑II), combining verbal and nonverbal/perceptual reasoning subtests. Lower IQ scores were linked to poorer thresholds on the multitalker task in the full sample and within each group. Lead author Bonnie Lau said the relationship “transcended diagnostic categories,” a point echoed in the university’s release; the paper reports “a highly significant relationship” between directly assessed intellectual ability and multitalker speech perception. (newsroom.uw.edu)

Lau also noted that real-world listening in noise draws heavily on cognitive processes such as stream segregation, selective attention and language comprehension—not solely on the ears. “You don’t have to have a hearing loss to have a hard time listening in a restaurant or any other challenging real-world situation,” she said. (sciencedaily.com)

Because the study sample was under 50, the authors call for larger replications. They suggest practical accommodations—such as preferential seating or hearing-assistive tools—may help neurodivergent students or those with lower cognitive ability in busy classrooms. Co-authors are affiliated with several University of Washington departments and the University of Michigan. (sciencedaily.com)

Related articles


Pauses and filler words in picture descriptions were linked to executive function in a Baycrest-led study


Small speech timing habits—such as silent pauses, “um” and “uh,” and difficulty finding words—were associated with performance on standard executive-function tests in a study by researchers at Baycrest, the University of Toronto and York University.

A new study challenges the belief that closing one's eyes improves hearing in noise, finding it actually hinders detection of faint sounds. Researchers from Shanghai Jiao Tong University showed that relevant visual cues enhance auditory sensitivity instead. The findings were published in The Journal of the Acoustical Society of America.


University of Notre Dame researchers report evidence that general intelligence is associated with how efficiently and flexibly brain networks coordinate across the whole connectome, rather than being localized to a single “smart” region. The findings, published in Nature Communications, are based on neuroimaging and cognitive data from 831 Human Connectome Project participants and an additional 145 adults from the INSIGHT Study.

Researchers from the University of Pennsylvania have identified "cognitive surrender," in which people outsource reasoning to AI without verification. In experiments involving 1,372 participants, people accepted incorrect AI responses 73.2 percent of the time, and factors such as time pressure increased reliance on flawed outputs.


A small controlled experiment reported in Frontiers in Behavioral Neuroscience found that exposure to infrasound—ultra-low-frequency vibration below the range of human hearing—was associated with higher salivary cortisol and more negative mood ratings, even though participants could not reliably detect when the infrasound was present.
