Study suggests brain-inspired algorithms to cut AI energy use

Researchers from Purdue University and the Georgia Institute of Technology have proposed a new computer architecture for AI models inspired by the human brain. This approach aims to address the energy-intensive 'memory wall' problem in current systems. The study, published in Frontiers in Science, highlights potential for more efficient AI in everyday devices.

The rapid growth of AI has exacerbated challenges in computer design, particularly the separation of processing and memory in traditional systems. A study published on Monday in the journal Frontiers in Science outlines a brain-inspired solution to this issue. Led by Kaushik Roy, a computer engineering professor at Purdue University, the research argues for rethinking AI architecture to make it more energy-efficient.

Current computers follow the von Neumann architecture, developed in 1945, which keeps memory and processing separate. This design creates a bottleneck known as the 'memory wall,' a term coined by University of Virginia researchers in the 1990s. As AI models, especially language processors, have expanded 5,000-fold in size over the past four years, the disparity between memory speed and processing power has become more pressing. IBM recently emphasized this problem in a report.

The proposed solution draws from how the brain operates, using spiking neural networks (SNNs). These algorithms, once criticized for being slow and inaccurate, have improved significantly in recent years. The researchers advocate for 'compute-in-memory' (CIM), which integrates computing directly into the memory system. As stated in the paper's abstract, "CIM offers a promising solution to the memory wall problem by integrating computing capabilities directly into the memory system."
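To make the idea of a spiking neural network concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic unit commonly used in SNNs. This is an illustrative toy model, not code from the study; the parameter values (`tau`, `threshold`) are arbitrary assumptions.

```python
# Minimal leaky integrate-and-fire (LIF) spiking neuron -- an illustrative
# sketch of the kind of unit used in spiking neural networks (SNNs).
# Parameter values are assumptions, not taken from the paper.

def lif_neuron(inputs, tau=0.9, threshold=1.0):
    """Simulate one LIF neuron over a sequence of input currents.

    Each step, the membrane potential decays by the leak factor `tau`,
    integrates the incoming current, and emits a spike (1) when it
    crosses `threshold`, after which the potential resets to zero.
    """
    v = 0.0
    spikes = []
    for current in inputs:
        v = tau * v + current      # leaky integration of input
        if v >= threshold:
            spikes.append(1)       # fire a spike
            v = 0.0                # reset membrane potential
        else:
            spikes.append(0)       # stay silent
    return spikes

print(lif_neuron([0.4, 0.4, 0.4, 0.0, 0.9, 0.9]))
```

The key property this sketch illustrates is sparsity: the neuron is silent most of the time and only produces discrete spike events, which is one reason brain-inspired hardware can consume far less energy than architectures that compute dense activations at every step.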

Roy noted, "Language processing models have grown 5,000-fold in size over the last four years. This alarmingly rapid expansion makes it crucial that AI is as efficient as possible. That means fundamentally rethinking how computers are designed."

Co-author Tanvi Sharma, a Purdue researcher, added, "AI is one of the most transformative technologies of the 21st century. However, to move it out of data centers and into the real world, we need to dramatically reduce its energy use." She explained that this could enable AI in compact devices like medical tools, vehicles, and drones, with longer battery life and less data transfer.

By minimizing energy waste, the approach could make AI more accessible beyond large data centers, supporting broader applications in resource-constrained environments.

Related articles

Princeton study reveals brain’s reusable ‘cognitive Legos’ for flexible learning

Neuroscientists at Princeton University report that the brain achieves flexible learning by reusing modular cognitive components across tasks. In experiments with rhesus macaques, researchers found that the prefrontal cortex assembles these reusable “cognitive Legos” to adapt behaviors quickly. The findings, published November 26 in Nature, underscore differences from current AI systems and could eventually inform treatments for disorders that impair flexible thinking.

Researchers at Korea University have developed a dual-output artificial synapse to boost the energy efficiency of multitasking AI systems, the university announced. The device emits electrical and optical signals simultaneously to enable parallel processing. Tests showed computation up to 47 percent faster and energy use cut by a factor of up to 32 compared with conventional GPU hardware.

Australia-based start-up Cortical Labs has announced plans to construct two data centres using neuron-filled chips. The facilities in Melbourne and Singapore will house its CL1 biological computers, which have demonstrated the ability to play video games like Doom. The initiative aims to scale up cloud-based brain-computing services while reducing energy consumption.

A global shortage of RAM, driven by AI data center demands, has caused PC memory prices to surge by 40 to 70 percent in 2025, leading to higher costs and lower specs for computers in 2026. This development is dampening the hype around so-called AI PCs, as manufacturers shift focus amid waning consumer interest. Analysts predict volatility in PC sales this year, with shortages persisting beyond 2026.

A CNET commentary argues that describing AI as having human-like qualities such as souls or confessions misleads the public and erodes trust in the technology. It highlights how companies like OpenAI and Anthropic use such language, which obscures real issues like bias and safety. The piece calls for more precise terminology to foster accurate understanding.

At the India AI Impact Summit, Prime Minister Narendra Modi described artificial intelligence as a turning point in human history that could redirect the course of civilization. He voiced concerns about the form of AI that will be passed on to future generations and stressed that it must be designed to be human-centered and responsible. Experts warned of risks such as data privacy, deepfakes, and autonomous weapons.

Chinese researchers have introduced photonic AI chips that promise significant speed advantages in specific generative tasks. These chips use photons instead of electrons, enabling greater parallelism through optical interference. The development could mark a step forward in AI hardware, though claims are limited to narrowly defined applications.

Tuesday, 3 March 2026, 13:21

AI emerges as key player in modern warfare

Sunday, 1 March 2026, 16:33

Elon Musk praises efficiency of Tesla's AI4 computer

Friday, 20 February 2026, 09:08

Wired discusses potential for AI data centers in space

Saturday, 7 February 2026, 04:40

IndiaAI chief outlines pragmatic roadmap ahead of AI summit

Wednesday, 21 January 2026, 11:06

AI boom set to increase US carbon emissions

Monday, 12 January 2026, 01:59

Researchers gear up to simulate human brain on supercomputer

Wednesday, 7 January 2026, 07:17

2026 predicted as year of world models in AI

Friday, 26 December 2025, 09:57

AI processing moves to devices for speed and privacy

Wednesday, 24 December 2025, 07:26

Paper argues consciousness depends on “biological-style” computation, not abstract code

Monday, 22 December 2025, 22:33

Duke AI uncovers simple rules in complex systems
