Researchers from Purdue University and the Georgia Institute of Technology have proposed a new computer architecture for AI models inspired by the human brain. This approach aims to address the energy-intensive 'memory wall' problem in current systems. The study, published in Frontiers in Science, highlights potential for more efficient AI in everyday devices.
The rapid growth of AI has exacerbated challenges in computer design, particularly the separation of processing and memory in traditional systems. A study published on Monday in the journal Frontiers in Science outlines a brain-inspired solution to this issue. Led by Kaushik Roy, a computer engineering professor at Purdue University, the research argues for rethinking AI architecture to make it more energy-efficient.
Current computers follow the von Neumann architecture, developed in 1945, which keeps memory and processing separate. This design creates a bottleneck known as the 'memory wall,' a term coined by University of Virginia researchers in the 1990s. As AI models, especially language processing models, have expanded 5,000-fold in size over the past four years, the disparity between memory speed and processing power has become more pressing. IBM recently emphasized this problem in a report.
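The scale of that disparity is often illustrated with per-operation energy estimates from the circuits literature: on a 45 nm process, a 32-bit integer add costs on the order of 0.1 picojoules, while fetching a 32-bit word from off-chip DRAM costs roughly 640 picojoules. These figures are rough and process-dependent, used here only as a back-of-envelope sketch, not numbers from the study:

```python
# Back-of-envelope view of the memory wall: moving data costs far more
# energy than computing on it. Values are order-of-magnitude estimates
# commonly cited for a 45 nm process, used purely for illustration.
ADD_32BIT_PJ = 0.1          # ~energy of one 32-bit integer add (picojoules)
DRAM_READ_32BIT_PJ = 640.0  # ~energy of one 32-bit off-chip DRAM read

ratio = DRAM_READ_32BIT_PJ / ADD_32BIT_PJ
print(f"One DRAM fetch costs ~{ratio:.0f}x the energy of one add")
# → One DRAM fetch costs ~6400x the energy of one add
```

In a von Neumann machine, every operand must cross this gap between memory and processor, so as models grow, data movement rather than arithmetic comes to dominate the energy budget.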
The proposed solution draws from how the brain operates, using spiking neural networks (SNNs). These algorithms, once criticized as slow and inaccurate, have improved significantly in recent years. The researchers pair this with 'compute-in-memory' (CIM): as the paper's abstract states, "CIM offers a promising solution to the memory wall problem by integrating computing capabilities directly into the memory system."
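Unlike conventional artificial neurons, spiking neurons communicate through sparse, event-driven pulses, and energy is spent mainly when a spike actually fires. A minimal sketch of a leaky integrate-and-fire neuron, the standard textbook building block of an SNN (the threshold and leak parameters here are illustrative, not taken from the study):

```python
# Minimal leaky integrate-and-fire (LIF) spiking neuron sketch.
# Illustrative parameters only; not drawn from the paper under discussion.

def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Integrate input current over time steps; emit a spike (1) when the
    membrane potential crosses the threshold, then reset to zero."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # fire a spike
            potential = 0.0    # reset after firing
        else:
            spikes.append(0)   # stay silent: no event, little energy spent
    return spikes

print(lif_neuron([0.3, 0.3, 0.3, 0.0, 0.9, 0.9]))
# → [0, 0, 0, 0, 1, 0]
```

The output is mostly zeros: the neuron is silent until accumulated input crosses its threshold. This sparsity is what makes event-driven SNN hardware attractive for low-power settings.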
Roy noted, "Language processing models have grown 5,000-fold in size over the last four years. This alarmingly rapid expansion makes it crucial that AI is as efficient as possible. That means fundamentally rethinking how computers are designed."
Co-author Tanvi Sharma, a Purdue researcher, added, "AI is one of the most transformative technologies of the 21st century. However, to move it out of data centers and into the real world, we need to dramatically reduce its energy use." She explained that this could enable AI in compact devices like medical tools, vehicles, and drones, with longer battery life and less data transfer.
By minimizing energy waste, the approach could make AI more accessible beyond large data centers, supporting broader applications in resource-constrained environments.