EEG study links schizophrenia voice-hearing to disrupted prediction of inner speech

Researchers at UNSW Sydney report evidence that auditory verbal hallucinations in schizophrenia-spectrum disorders may involve a breakdown in the brain’s normal ability to dampen responses to self-generated inner speech, causing internally generated thoughts to be processed more like external sounds.

A study led by psychologists at UNSW Sydney reports evidence that hearing voices in schizophrenia-spectrum disorders may be tied to a disruption in how the brain distinguishes internally generated “inner speech” from sounds coming from the outside world.

Published in Schizophrenia Bulletin, the research tested a long-debated idea in psychiatry: that some auditory verbal hallucinations (AVH) may occur when a person’s inner speech is misperceived as external speech.

Professor Thomas Whitford of the UNSW School of Psychology described inner speech as “the voice in your head that silently narrates your thoughts – what you’re doing, planning, or noticing.” The researchers say that, in typical brain function, the auditory system shows a reduced response to predicted, self-generated speech-like signals. In people currently experiencing hallucinations, the study found an opposite pattern.

How the experiment worked

The team used electroencephalography (EEG) to measure participants’ brain responses while they listened to brief syllables through headphones and, at specific moments, imagined producing syllables silently.

Participants were divided into three groups:

  • 55 people with schizophrenia-spectrum disorders who had experienced AVH within the past week,
  • 44 people with schizophrenia-spectrum disorders who either had no history of AVH or had not experienced them recently,
  • 43 healthy control participants with no history of schizophrenia.

During the task, participants were asked to imagine saying a syllable such as “bah” or “bih” while hearing a syllable played aloud. Sometimes the imagined and audible syllables matched; other times they did not.

What researchers observed

In healthy controls, EEG responses showed a reduction in early auditory processing when the imagined syllable matched the sound played aloud—an effect consistent with the brain correctly predicting and suppressing responses to expected speech-like input.

Among participants who had recently experienced AVH, the researchers observed the opposite: brain responses were stronger when imagined and heard syllables matched.

“Their brains reacted more strongly to inner speech that matched the external sound, which was the exact opposite of what we found in the healthy participants,” Whitford said.

The group without recent hallucinations showed a pattern distinct from both the recently hallucinating group and the healthy controls, with responses that the researchers reported as falling between those of the other two groups overall.
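The core comparison described above, a suppression index contrasting brain responses to matching versus mismatching syllables, can be illustrated with a small simulation. This is not the study’s analysis or data: the effect sizes, trial counts, and noise level below are invented purely to show the logic of the match/mismatch contrast.

```python
import numpy as np

rng = np.random.default_rng(0)

def n1_amplitude(base, effect, n_trials):
    """Simulated single-trial auditory response amplitudes (arbitrary units)."""
    return base + effect + rng.normal(0.0, 0.5, n_trials)

def suppression_index(match, mismatch):
    """Mean mismatch response minus mean match response.

    Positive -> the response to the predicted (matching) syllable is
    dampened, the pattern reported for healthy controls."""
    return mismatch.mean() - match.mean()

# Hypothetical effect directions mirroring the pattern the article
# describes (not the study's actual numbers).
controls = suppression_index(
    match=n1_amplitude(base=2.0, effect=-0.8, n_trials=200),  # dampened
    mismatch=n1_amplitude(base=2.0, effect=0.0, n_trials=200),
)
recent_avh = suppression_index(
    match=n1_amplitude(base=2.0, effect=+0.8, n_trials=200),  # enhanced
    mismatch=n1_amplitude(base=2.0, effect=0.0, n_trials=200),
)

print(controls > 0)    # healthy pattern: prediction dampens the response
print(recent_avh < 0)  # reversed pattern in recent hallucinators
```

In this toy version, a positive index for controls captures the dampening of predicted input, while a negative index for the recently hallucinating group captures the reversal the researchers reported.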

Implications and next steps

Whitford said the findings support a theory that has been discussed for decades but has been difficult to test because inner speech is private and cannot be directly observed. The study’s results suggest an EEG-based measure of this “inner speech” prediction mechanism could be explored as a potential biological marker related to psychosis risk.

“This sort of measure has great potential to be a biomarker for the development of psychosis,” Whitford said.

The researchers said they plan further work to assess whether these brain-response patterns could help predict who may later develop psychosis—an approach that, if validated, could support earlier identification and intervention.
