Blinking decreases during effortful listening in noisy settings

Researchers at Concordia University have discovered that people blink less when concentrating on speech amid background noise, highlighting a link between eye behavior and cognitive effort. This pattern persists regardless of lighting conditions, suggesting it's driven by mental demands rather than visual factors. The findings, published in Trends in Hearing, could offer a simple, low-cost way to gauge cognitive effort during listening tasks.

Blinking, an automatic reflex like breathing, plays a subtle role in how the brain processes information, according to a new study from Concordia University. Published in the journal Trends in Hearing in 2025, the research explores how eye blinks relate to cognitive processing, particularly the effort of filtering speech from noisy environments.

The study involved nearly 50 adult participants in a soundproof room, where they listened to short sentences through headphones while fixating on a cross displayed on a screen. Eye-tracking glasses recorded blinks as background noise levels varied, producing signal-to-noise ratios that ranged from easy to highly challenging. Blink rates dropped significantly during the sentences themselves, especially when noise made comprehension hardest, compared with the periods before and after playback.

"We wanted to know if blinking was impacted by environmental factors and how it related to executive function," said lead author Pénélope Coupal, an Honours student at the Laboratory for Hearing and Cognition. "For instance, is there a strategic timing of a person's blinks so they would not miss out on what is being said?"

A second experiment tested lighting variations—dark, medium, and bright rooms—across similar noise levels. The blink suppression pattern remained consistent, indicating cognitive load, not light exposure, as the driver. Participants varied widely in baseline blink rates, from 10 to 70 times per minute, but the trend was statistically significant.

"We don't just blink randomly," Coupal noted. "In fact, we blink systematically less when salient information is presented."

Co-author Mickael Deroche, an associate professor in the Department of Psychology, emphasized the implications: "Our study suggests that blinking is associated with losing information, both visual and auditory. That is presumably why we suppress blinking when important information is coming."

Unlike prior work that dismissed blinks in favor of pupil dilation measures, this research treats them as indicators of mental effort. Yue Zhang also contributed to the paper, titled "Reduced Eye Blinking During Sentence Listening Reflects Increased Cognitive Load in Challenging Auditory Conditions." The authors propose blinks as a low-effort tool for assessing cognition both in the lab and in everyday settings, and follow-up work led by postdoctoral fellow Charlotte Bigras is mapping how much information is lost during blinks.

Related articles


Cognitive ability tied to understanding speech in noisy settings, study finds


Among people with clinically normal hearing, intellectual ability strongly predicted how well they understood speech amid competing voices, according to a peer-reviewed study from University of Washington researchers.

New research from MIT reveals that when sleep-deprived individuals experience attention lapses, their brains trigger waves of cerebrospinal fluid to clear waste, mimicking a sleep-like process. This compensation disrupts focus temporarily but may help maintain brain health. The findings, published in Nature Neuroscience, highlight the brain's adaptive response to missed rest.


New research shows that everyday sights and sounds can trap some people in harmful choices by influencing their brains through associative learning. Those highly sensitive to these cues struggle to update their responses when outcomes turn negative, leading to persistent risky behavior. The findings, led by Giuseppe di Pellegrino at the University of Bologna, highlight implications for addictions and anxiety.

Scientists at The Ohio State University have charted how patterns of brain wiring can predict activity linked to many mental functions across the entire brain. Each region shows a distinct “connectivity fingerprint” tied to roles such as language and memory. The peer‑reviewed findings in Network Neuroscience offer a baseline for studying healthy young adult brains and for comparisons with neurological or psychiatric conditions.


Researchers at Karolinska Institutet have identified how alpha oscillations in the brain help distinguish the body from the surroundings. Faster alpha rhythms enable precise integration of visual and tactile signals, strengthening the feeling of bodily self. The findings, published in Nature Communications, could inform treatments for conditions like schizophrenia and improve prosthetic designs.

A Finnish startup, ixi eyewear, has raised more than $40 million to create smart glasses with lenses that adjust focus based on eye movements. The lightweight prototype, weighing 22 grams, uses sensors and liquid crystal technology for instant adaptation. The company plans to launch the product within the next year, targeting the high-end eyewear market.


Neuroscientists have identified eight body-like maps in the visual cortex that mirror the organization of touch sensations, enabling the brain to physically feel what it sees in others. This discovery, based on brain scans during movie viewing, enhances understanding of empathy and holds promise for treatments in autism and advancements in AI. The findings were published in Nature.
