Researchers have developed a mathematical framework showing that quantum computers could efficiently process large datasets for AI tasks. By streaming data into the machine in small batches, the method sidesteps the need for massive quantum memory. A machine with just 60 logical qubits could outperform classical systems by the end of the decade.
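The batching idea has a familiar classical analogue. The sketch below is purely illustrative and makes no claim about the researchers' quantum protocol: a hypothetical `stream_batches` helper feeds a running computation one fixed-size batch at a time, so the full dataset never has to sit in memory at once.

```python
from typing import Iterable, Iterator

def stream_batches(source: Iterable[float], batch_size: int) -> Iterator[list[float]]:
    """Yield fixed-size batches from a data source without loading it all at once."""
    batch: list[float] = []
    for item in source:
        batch.append(item)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # emit any final partial batch
        yield batch

# Toy stand-in for a learning task: maintain a running mean one batch at a time,
# analogous to feeding data into a processor piece by piece rather than all at once.
total, count = 0.0, 0
for batch in stream_batches((0.5 * x for x in range(1_000_000)), batch_size=1024):
    total += sum(batch)
    count += len(batch)
print(total / count)  # same result as if the whole dataset were processed in one go
```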
Hsin-Yuan Huang at quantum firm Oratomic and colleagues argue that their work lays the foundations for quantum advantages in machine learning. Conventional skepticism held that loading real-world data, such as restaurant reviews or RNA sequences, into quantum superposition states would require impossibly large amounts of memory. The team's solution streams the data in smaller batches, processing it without storing the full dataset first, much as one watches a film online rather than downloading it before pressing play.

Haimeng Zhao at the California Institute of Technology notes that this yields a memory advantage so vast that 300 logical qubits would surpass a classical computer built from every atom in the observable universe (a rough calculation at the end of this article illustrates the scale). Huang emphasizes machine learning's ubiquity: “Machine learning is really utilised everywhere in science and technology and also everyday life. In a world where we can build this [quantum computing] architecture, I feel like it can be applied whenever there’s massive datasets available.”

Experts praise the innovation but urge caution. Adrián Pérez-Salinas at ETH Zurich calls it a promising way to feed quantum machines data bit by bit, yet stresses that it must be tested against dequantisation, in which classical algorithms are found that match a quantum algorithm's performance, erasing its edge. Vedran Dunjko at Leiden University sees it as a good fit for data-heavy experiments such as those at the Large Hadron Collider, though not for all AI workloads.

The researchers plan to broaden the range of applicable algorithms and optimize quantum hardware speed. A 60-logical-qubit system looks feasible by 2030, offering early advantages for big-data AI processing.
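The scale behind Zhao's comparison follows from how qubits encode information: n logical qubits span a state space of 2^n amplitudes, while a standard order-of-magnitude estimate puts the number of atoms in the observable universe at around 10^80. A quick check, assuming only those two textbook figures:

```python
# n logical qubits span a state space of 2**n amplitudes.
n_qubits = 300
state_space = 2 ** n_qubits  # exact integer, roughly 2.0e90

# Common order-of-magnitude estimate for atoms in the observable universe.
atoms_in_universe = 10 ** 80

print(f"2^{n_qubits} ≈ {state_space:.1e}")                             # about 2.0e+90
print(f"state space / atoms ≈ {state_space / atoms_in_universe:.1e}")  # about 2e+10
```

Even a classical memory that stored one amplitude per atom would fall short by a factor of about twenty billion, which is the sense in which the quoted comparison is meant.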