Researchers uncover eight brain maps linking vision to touch

Neuroscientists have identified eight body-like maps in the visual cortex that mirror the organization of touch sensations, enabling the brain to physically feel what it sees in others. This discovery, based on brain scans taken while participants watched movies, enhances understanding of empathy and holds promise for autism research and for advances in AI. The findings were published in Nature.

A team of researchers, led by Nicholas Hedger from the University of Reading and Tomas Knapen from the Netherlands Institute for Neuroscience (NIN, KNAW) and Vrije Universiteit Amsterdam, explored how the brain translates visual cues into tactile sensations. Together with collaborators in the UK and the USA, they analyzed functional MRI data from participants watching clips from films such as The Social Network and Inception. This approach captured natural brain responses, revealing how visual processing integrates with bodily feelings.

The study pinpointed eight distinct maps in the visual cortex that align with the somatosensory cortex's head-to-toe layout for touch. These maps allow the brain to interpret others' actions, injuries, or emotions as if experiencing them directly. "We found not one, or two, but eight remarkably similar maps in the visual cortex!" Knapen said. "Finding so many shows how strongly the visual brain speaks the language of touch."

Each map likely serves a distinct role, such as identifying body parts or their spatial positions, with different maps activating depending on what the viewer attends to. For instance, observing a hand action might engage one map, while assessing posture or expressions activates another. "Every time you look at a person, there are many different bodily translations that need to be conducted visually," Knapen explained. "We think that these maps are a fundamental ingredient in that exact process."

The overlapping maps enable flexible information processing. "This allows the brain to have many types of information in a single space, and make a translation in any way that is relevant in that moment," Knapen noted.

Implications extend to clinical and technological fields. These maps could aid research into social psychology and autism, where such processing may falter. "People with autism can struggle with this sort of processing," Knapen said. "Having this information could help us better identify effective treatments."

In neurotechnology, the findings suggest that brain-computer interfaces could be trained on a broader range of signals than simple movements. For AI, incorporating this bodily dimension of perception could enrich systems that currently rely on text and video. "This aspect of human experience is a fantastic area for AI development," Knapen emphasized, highlighting synergies between neuroscience and artificial intelligence.

The research, detailed in Nature (DOI: 10.1038/s41586-025-09796-0), underscores a core element of human empathy.

Related Articles

Study maps how brain connectivity predicts activity across cognitive functions

Scientists at The Ohio State University have charted how patterns of brain wiring can predict activity linked to many mental functions across the entire brain. Each region shows a distinct “connectivity fingerprint” tied to roles such as language and memory. The peer‑reviewed findings in Network Neuroscience offer a baseline for studying healthy young adult brains and for comparisons with neurological or psychiatric conditions.

Researchers at Karolinska Institutet have identified how alpha oscillations in the brain help distinguish the body from the surroundings. Faster alpha rhythms enable precise integration of visual and tactile signals, strengthening the feeling of bodily self. The findings, published in Nature Communications, could inform treatments for conditions like schizophrenia and improve prosthetic designs.

A new study has shown that the brain regions controlling facial expressions in macaques work together in unexpected ways, challenging prior assumptions about their division of labor. Researchers led by Geena Ianni at the University of Pennsylvania used advanced neural recordings to reveal how these gestures are encoded. The findings could pave the way for future brain-computer interfaces that decode facial signals for patients with neurological impairments.

Researchers behind a new review in Frontiers in Science argue that rapid progress in artificial intelligence and brain technologies is outpacing scientific understanding of consciousness, raising the risk of ethical and legal mistakes. They say developing evidence-based tests for detecting awareness—whether in patients, animals or emerging artificial and lab-grown systems—could reshape medicine, welfare debates and technology governance.

Researchers have developed a noninvasive method using EEG brain scans to detect movement intentions in people with spinal cord injuries. By capturing signals from the brain and potentially routing them to spinal stimulators, the approach aims to bypass damaged nerves. While promising, the technology still struggles with precise control, especially for lower limbs.

Researchers at the University of Geneva have found that specific regions of the human auditory cortex respond particularly strongly to chimpanzee vocalizations compared with those of other primates, including bonobos and macaques. The work, published as a reviewed preprint in eLife, suggests that human brain areas involved in voice processing are also tuned to certain nonhuman primate calls, reflecting shared evolutionary and acoustic roots.

Scientists are on the verge of simulating a human brain using the world's most powerful supercomputers, aiming to unlock secrets of brain function. Led by researchers at Germany's Jülich Research Centre, the project leverages the JUPITER supercomputer to model 20 billion neurons. This breakthrough could enable testing of theories on memory and drug effects that smaller models cannot achieve.
