Neuroscientists have identified eight body-like maps in the visual cortex that mirror the organization of touch sensations, enabling the brain to feel, in a bodily sense, what it sees in others. This discovery, based on brain scans recorded during movie viewing, deepens understanding of empathy and could inform autism research and AI development. The findings were published in Nature.
A team of researchers, led by Nicholas Hedger of the University of Reading and Tomas Knapen of the Netherlands Institute for Neuroscience (NIN, KNAW) and Vrije Universiteit Amsterdam, explored how the brain translates visual cues into tactile sensations. Working with collaborators in the UK and the USA, they analyzed functional MRI data from participants watching clips from films such as The Social Network and Inception. This approach captured natural brain responses, revealing how visual processing integrates with bodily feelings.
The study pinpointed eight distinct maps in the visual cortex that align with the somatosensory cortex's head-to-toe layout for touch. These maps allow the brain to interpret others' actions, injuries, or emotions as if experiencing them directly. "We found not one, or two, but eight remarkably similar maps in the visual cortex!" Knapen said. "Finding so many shows how strongly the visual brain speaks the language of touch."
Each map likely serves a distinct role, such as identifying body parts or their spatial positions, with different maps engaged depending on where attention is directed. For instance, observing a hand action might recruit one map, while assessing posture or facial expressions activates another. "Every time you look at a person, there are many different bodily translations that need to be conducted visually," Knapen explained. "We think that these maps are a fundamental ingredient in that exact process."
The overlapping maps enable flexible information processing. "This allows the brain to have many types of information in a single space, and make a translation in any way that is relevant in that moment," Knapen noted.
Implications extend to clinical and technological fields. These maps could aid research into social psychology and autism, where such processing may falter. "People with autism can struggle with this sort of processing," Knapen said. "Having this information could help us better identify effective treatments."
In neurotechnology, the findings suggest that brain-computer interfaces could be trained on a broader range of bodily signals than simple movements. For AI, incorporating this bodily dimension could enrich systems that currently rely mainly on text and video. "This aspect of human experience is a fantastic area for AI development," Knapen emphasized, highlighting synergies between neuroscience and artificial intelligence.
The research, detailed in Nature (DOI: 10.1038/s41586-025-09796-0), underscores a core element of human empathy.