Researchers gear up to simulate human brain on supercomputer

Scientists are on the verge of simulating a human brain on the world's most powerful supercomputers, aiming to unlock secrets of brain function. Led by researchers at Germany's Jülich Research Centre, the project will use the JUPITER supercomputer to model 20 billion neurons. A simulation at this scale could allow theories of memory and the effects of drugs to be tested in ways that smaller models cannot support.

Advances in computing power are enabling researchers to simulate the human brain at an unprecedented scale. The most powerful of today's supercomputers have reached exascale performance, a billion billion operations per second, enough to handle simulations of billions of neurons; the Top500 list identifies only four such machines worldwide.

Markus Diesmann at the Jülich Research Centre in Germany explained the shift: “We have never been able to bring them all together into one place, into one larger brain model where we can check whether these ideas are at all consistent. This is now changing.” His team plans to use JUPITER, the Joint Undertaking Pioneer for Innovative and Transformative Exascale Research, based in Germany. Last month, they demonstrated that a spiking neural network—a simple model of neurons and synapses—could scale to run on JUPITER's thousands of graphics processing units, reaching 20 billion neurons and 100 trillion connections. This matches the size of the human cerebral cortex, the hub of higher brain functions.
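As a rough illustration of what "a simple model of neurons and synapses" computes, the sketch below steps a toy spiking network in Python with NumPy, assuming leaky integrate-and-fire neurons connected by random synapses. The neuron model, network size and every constant here are illustrative assumptions, not the Jülich team's code or parameters.

    # A minimal sketch of a spiking neural network: leaky integrate-and-fire
    # neurons coupled by static synapses, stepped in discrete time.
    # All sizes and constants are illustrative (a 1,000-neuron toy, not 20 billion).
    import numpy as np

    rng = np.random.default_rng(seed=1)

    N = 1_000              # neurons in this toy network
    dt = 0.1               # timestep in ms
    tau = 20.0             # membrane time constant in ms
    v_th, v_reset = 1.0, 0.0
    steps = 5_000          # 500 ms of simulated biological time

    # Sparse random connectivity: ~10% of all possible synapses, small weights.
    connected = rng.random((N, N)) < 0.1
    weights = connected * rng.normal(0.002, 0.0005, size=(N, N))

    v = np.zeros(N)                    # membrane potentials
    spiked = np.zeros(N, dtype=bool)   # neurons that fired on the previous step
    spike_count = 0

    for _ in range(steps):
        # External background drive plus recurrent input from last step's spikes.
        drive = rng.normal(0.006, 0.002, size=N)
        recurrent = weights @ spiked.astype(float)
        # Leaky integration: the potential decays towards rest and sums its inputs.
        v += dt * (-v / tau) + drive + recurrent
        # Crossing threshold emits a spike; the neuron is then reset.
        spiked = v >= v_th
        v[spiked] = v_reset
        spike_count += spiked.sum()

    seconds = steps * dt / 1000.0
    print(f"mean rate: {spike_count / (N * seconds):.1f} spikes per neuron per second")

At the scale reported for JUPITER, the same kind of state update and spike exchange has to be distributed across thousands of GPUs to reach 20 billion neurons and 100 trillion connections.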

Diesmann emphasized the value of scale: “We know now that large networks can do qualitatively different things than small ones. It’s clear the large networks are different.” Previous simulations, like those of a fruit fly brain, lack features emerging only in larger systems, similar to how large language models outperform smaller ones.

Thomas Nowotny at the University of Sussex in the UK stressed the need for full-scale efforts: “Downscaling is not just simplifying it a little bit, or making it a bit coarser, it means actually giving up certain properties altogether. It’s really important that eventually we can do full-scale [simulations], because otherwise we’re never going to get the real thing.”

The model draws on real data from human brain experiments, including synapse counts and activity levels, as noted by collaborator Johanna Senk at the University of Sussex. Diesmann added: “We now have these anatomical data as constraints, but also the computer power.”

Such simulations could be used to test theories of memory formation, for example by feeding in images and observing the model's response, or to evaluate drugs for conditions such as epilepsy, which is characterized by bursts of abnormal brain activity. Greater computing power also means faster runs, making it possible to study slow processes such as learning, and leaves room to incorporate more detailed neuron behaviour.
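For a rough sense of why runs like these call for exascale machines, the back-of-envelope below combines the neuron and synapse counts quoted above with an assumed timestep and assumed per-update costs. Both assumptions are illustrative; real simulators cut the synapse cost sharply by updating a synapse only when a spike actually arrives at it.

    # Back-of-envelope only: the timestep and per-update costs are assumptions,
    # not figures from the article or the Jülich team. Event-driven synapse
    # updates make the practical cost far lower than this worst case.
    neurons = 20e9            # 20 billion neurons (from the article)
    synapses = 100e12         # 100 trillion connections (from the article)
    dt_ms = 0.1               # assumed simulation timestep
    steps_per_bio_second = 1_000 / dt_ms      # 10,000 steps per biological second
    ops_per_neuron_step = 20                  # assumed cost to update one neuron
    ops_per_synapse_step = 10                 # assumed cost to update one synapse

    neuron_ops = neurons * steps_per_bio_second * ops_per_neuron_step
    synapse_ops = synapses * steps_per_bio_second * ops_per_synapse_step
    exascale = 1e18                           # a billion billion operations per second

    print(f"neuron updates per simulated second:  {neuron_ops:.1e} ops")
    print(f"synapse updates per simulated second: {synapse_ops:.1e} ops")
    print(f"worst case vs. one exascale second:   {(neuron_ops + synapse_ops) / exascale:.0f}x")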

However, challenges remain. Nowotny cautioned that even brain-sized simulations lack real-world inputs and cannot fully replicate animal behavior. “We can’t actually build brains. Even if we can make simulations of the size of a brain, we can’t make simulations of the brain.”

Related news


Researchers run detailed mouse cortex simulation on Japan’s Fugaku supercomputer


Scientists from the Allen Institute and Japan’s University of Electro-Communications have built one of the most detailed virtual models of the mouse cortex to date, simulating roughly 9 million neurons and 26 billion synapses across 86 regions on the Fugaku supercomputer.

Researchers from Purdue University and the Georgia Institute of Technology have proposed a new computer architecture for AI models inspired by the human brain. This approach aims to address the energy-intensive 'memory wall' problem in current systems. The study, published in Frontiers in Science, highlights potential for more efficient AI in everyday devices.


Researchers at Nagoya University in Japan have developed miniature brain models using stem cells to study interactions between the thalamus and cortex. Their work reveals the thalamus's key role in maturing cortical neural networks. The findings could advance research into neurological disorders like autism.

A new analysis indicates that certain designs for fault-tolerant quantum computers could consume far more energy than the world's most powerful supercomputers. Presented at a recent conference, the estimates highlight a wide range of potential power needs, from modest to enormous. This variation stems from different technologies used to build and operate these machines.


Researchers anticipate that 2026 could mark the beginning of practical applications for quantum computers in chemistry, leveraging their inherent quantum nature to tackle complex molecular calculations. Advances in 2025 have laid the groundwork, with larger machines expected to enable more sophisticated simulations. This progress could benefit industrial and medical fields by improving predictions of molecular structures and reactivities.

Duke-NUS Medical School researchers, working with the University of Sydney, have developed BrainSTEM—a two-tier, single-cell atlas of the developing human brain that profiles nearly 680,000 cells. Published online in Science Advances on October 31, 2025, the resource focuses on midbrain dopaminergic neurons, flags off‑target cell types in lab-grown models, and will be released openly for the research community.


Researchers have developed the most detailed simulations yet of how matter accretes around black holes, incorporating full general relativity and radiation effects. Led by Lizhong Zhang from the Institute for Advanced Study and the Flatiron Institute, the study matches real astronomical observations. Published in The Astrophysical Journal, it focuses on stellar-mass black holes and uses powerful supercomputers.
