Study uncovers neural basis of macaque facial gestures

A new study has shown that the brain regions controlling facial expressions in macaques work together in unexpected ways, challenging prior assumptions about their division of labor. Researchers led by Geena Ianni at the University of Pennsylvania used advanced neural recordings to reveal how these gestures are encoded. The findings could pave the way for future brain-computer interfaces that decode facial signals for patients with neurological impairments.

Neuroscientists have long puzzled over how the brain generates facial expressions, assuming a clear split between areas handling emotional signals and those managing deliberate movements like speaking. However, a study published in Science on January 20, 2026, upends this view through experiments on macaques, primates with facial musculature similar to humans.

Geena Ianni and her team at the University of Pennsylvania began by scanning the macaques' brains with fMRI while filming their faces during social interactions. The animals viewed videos of other macaques, interactive avatars, or live companions, prompting natural expressions such as lipsmacking to show submission, threat faces to deter rivals, and neutral chewing.

Using these scans, the researchers pinpointed key brain areas: the primary motor cortex, ventral premotor cortex, primary somatosensory cortex, and cingulate motor cortex. They then implanted microelectrode arrays with sub-millimeter precision into these regions—the first effort to record from multiple neurons simultaneously during facial gesture production.

Contrary to expectations, all four areas activated for every gesture, from social signals to chewing, in a coordinated pattern. "We expected a division where the cingulate cortex governs social signals, while the motor cortex is specialized in chewing," Ianni noted, but the data showed otherwise.

Further analysis revealed distinct neural codes. The cingulate cortex employs a static pattern, persistent for up to 0.8 seconds, likely integrating social context and sensory input. In contrast, the motor and somatosensory cortices use dynamic codes with rapidly shifting firing rates to control precise muscle movements, such as subtle lip twitches.

"The static means the firing pattern of neurons is persistent across both multiple repetitions... and across time," Ianni explained, suggesting it stabilizes the gesture's intent while dynamic areas execute the details.

This foundational work, detailed in the journal (doi.org/10.1126/science.aea0890), builds toward neural prostheses for restoring facial communication in stroke or paralysis patients. Ianni remains optimistic: "I hope our work goes towards enabling... more naturalistic and rich communication designs that will improve lives." Yet, she cautions that reliable devices remain years away, akin to early speech-decoding tech from the 1990s.

Related articles


Human brain’s voice area shows selective response to chimpanzee calls


Researchers at the University of Geneva have found that specific regions of the human auditory cortex respond particularly strongly to chimpanzee vocalizations compared with those of other primates, including bonobos and macaques. The work, published as a reviewed preprint in eLife, suggests that human brain areas involved in voice processing are also tuned to certain nonhuman primate calls, reflecting shared evolutionary and acoustic roots.

Neuroscientists have identified eight body-like maps in the visual cortex that mirror the organization of touch sensations, enabling the brain to physically feel what it sees in others. This discovery, based on brain scans during movie viewing, enhances understanding of empathy and holds promise for treatments in autism and advancements in AI. The findings were published in Nature.


Researchers at the University of Birmingham used facial motion capture to compare how autistic and non-autistic adults produce facial expressions of anger, happiness and sadness, finding consistent differences in which facial features were emphasized. The work, published in Autism Research, suggests some misunderstandings about emotion may stem from mismatched expressive "styles" across groups rather than a one-sided problem.

Researchers have developed a noninvasive method using EEG brain scans to detect movement intentions in people with spinal cord injuries. By capturing signals from the brain and potentially routing them to spinal stimulators, the approach aims to bypass damaged nerves. While promising, the technology still struggles with precise control, especially for lower limbs.


A new study reports that as people listen to a spoken story, neural activity in key language regions unfolds over time in a way that mirrors the layer-by-layer computations inside large language models. The researchers, who analyzed electrocorticography recordings from epilepsy patients during a 30-minute podcast, also released an open dataset intended to help other scientists test competing theories of how meaning is built in the brain.

Humans are the only primates with a chin, a feature that has puzzled biologists. A new analysis suggests it emerged not for a specific purpose but as a side effect of other evolutionary changes. Researchers examined hundreds of ape skulls to reach this conclusion.


Scientists from the Allen Institute and Japan’s University of Electro-Communications have built one of the most detailed virtual models of the mouse cortex to date, simulating roughly 9 million neurons and 26 billion synapses across 86 regions on the Fugaku supercomputer.
