Split-image illustration showing subtle differences in facial expressions of anger, happiness, and sadness between non-autistic and autistic adults, captured with motion capture markers, for a University of Birmingham study.
AI-generated image

Study maps subtle differences in facial expressions of emotion in autistic and non-autistic adults

Fact-checked

Researchers at the University of Birmingham used facial motion capture to compare how autistic and non-autistic adults produce facial expressions of anger, happiness and sadness, finding consistent differences in which facial features were emphasized. The work, published in *Autism Research*, suggests some misunderstandings about emotion may stem from mismatched expressive “styles” across groups rather than a one-sided problem.

A study led by researchers at the University of Birmingham has detailed how autistic and non-autistic adults move their faces when expressing basic emotions, identifying differences that could contribute to miscommunication.

Published in *Autism Research*, the study recorded facial motion capture data from 25 autistic adults and 26 non-autistic adults. Participants produced 4,896 expressions of anger, happiness and sadness in total, split evenly between two contexts: 2,448 "cued" expressions, in which participants matched their facial movements to sounds, and 2,448 expressions produced while speaking. The researchers reported extracting more than 265 million data points to build a high-resolution library of facial movements.

The analysis found emotion-specific differences in how expressions were produced. For anger, autistic participants relied more on the mouth and less on the eyebrows than non-autistic participants. For happiness, autistic participants showed a less exaggerated smile that did not "reach the eyes." For sadness, autistic participants produced a downturned look by raising the upper lip more than their non-autistic peers did. The research team also reported that autistic participants produced a wider range of unique expressions.

The study additionally examined alexithymia—often described as difficulty identifying and describing one’s own emotions—and found that higher alexithymia was associated with less clearly differentiated facial expressions for anger and happiness, which could make those emotions appear more ambiguous.

Dr. Connor Keating, who led the work at the University of Birmingham and is now based at the University of Oxford, said the differences extended beyond the “shape” of expressions to how they unfold over time: “Our findings suggest autistic and non-autistic people differ not only in the appearance of facial expressions, but also in how smoothly these expressions are formed. These mismatches in facial expressions may help to explain why autistic people struggle to recognize non-autistic expressions and vice versa.”

Professor Jennifer Cook, the senior author at the University of Birmingham, said the results support a view of emotional expression differences as potentially reciprocal rather than inherently deficient: “Autistic and non-autistic people may express emotions in ways that are different but equally meaningful—almost like speaking different languages. What has sometimes been interpreted as difficulties for autistic people might instead reflect a two-way challenge in understanding each other’s expressions.”

According to the University of Birmingham, the project was funded by the UK Medical Research Council and the European Union’s Horizon 2020 Research and Innovation Programme. The paper is titled “Mismatching Expressions: Spatiotemporal and Kinematic Differences in Autistic and Non-Autistic Facial Expressions” (DOI: 10.1002/aur.70157).

What people are saying

Initial reactions on X to the University of Birmingham study focus on differences in facial expressions of anger, happiness, and sadness between autistic and non-autistic adults. Lead researcher Connor Keating shared a detailed thread explaining methods, findings, and implications for mutual misunderstandings. Autistic researcher Michelle Dawson noted the study used posed rather than spontaneous expressions. Autism advocacy accounts highlighted more individualized autistic expressions and two-way communication issues. Reactions are mostly neutral and informative, with shares from researchers and autism communities.

Related news

Illustration of a brain connectivity map from an Ohio State University study, showing neural patterns predicting cognitive activities, for a news article on neuroscience findings.

Study maps how brain connectivity predicts activity across cognitive functions

AI-generated report

Scientists at The Ohio State University have charted how patterns of brain wiring can predict activity linked to many mental functions across the entire brain. Each region shows a distinct “connectivity fingerprint” tied to roles such as language and memory. The peer‑reviewed findings in Network Neuroscience offer a baseline for studying healthy young adult brains and for comparisons with neurological or psychiatric conditions.

A new study has shown that the brain regions controlling facial expressions in macaques work together in unexpected ways, challenging prior assumptions about their division of labor. Researchers led by Geena Ianni at the University of Pennsylvania used advanced neural recordings to reveal how these gestures are encoded. The findings could pave the way for future brain-computer interfaces that decode facial signals for patients with neurological impairments.


Neuroscientists have identified eight body-like maps in the visual cortex that mirror the organization of touch sensations, enabling the brain to physically feel what it sees in others. This discovery, based on brain scans during movie viewing, enhances understanding of empathy and holds promise for treatments in autism and advancements in AI. The findings were published in Nature.

A study in PLOS Biology reports that synchronizing activity between frontal and parietal brain regions using noninvasive electrical stimulation slightly increased participants’ willingness to share money in a standard economics task, including in choices that reduced their own payoff.


Researchers analyzing brain-imaging and treatment data from hundreds of people report that Parkinson’s disease is associated with abnormal connectivity involving the somato-cognitive action network (SCAN), a motor-cortex network described in 2023. In a small trial, stimulation aimed at this network produced a higher response rate than stimulation of nearby motor areas, raising the possibility of more targeted noninvasive treatments.

Neuroscientists at Trinity College Dublin have found that babies as young as two months old can already sort visual information into categories like animals and toys. Using brain scans and AI, the study reveals early foundations of perception. This challenges previous assumptions about infant cognition.


Researchers at Karolinska Institutet have identified how alpha oscillations in the brain help distinguish the body from the surroundings. Faster alpha rhythms enable precise integration of visual and tactile signals, strengthening the feeling of bodily self. The findings, published in Nature Communications, could inform treatments for conditions like schizophrenia and improve prosthetic designs.
