New method promises an AI boost from quantum computers

Researchers have developed a mathematical approach showing quantum computers could efficiently process large datasets for AI tasks. By loading data in batches, much like streaming a video, the method avoids the massive memory requirements previously thought necessary. A machine with just 60 logical qubits could outperform classical systems at such tasks by the end of the decade.

Hsin-Yuan Huang at quantum firm Oratomic and colleagues argue their work lays the foundations for quantum advantages in machine learning. The conventional scepticism was that loading real-world data, such as restaurant reviews or RNA sequences, into quantum superposition states would require impossibly large memory. The team's solution streams the data in smaller batches, processing it without storing the full dataset first, much as you might watch a film online rather than downloading it before pressing play.

Haimeng Zhao at the California Institute of Technology notes that this yields a memory advantage so vast that 300 logical qubits would surpass a classical computer built from every atom in the observable universe. Huang emphasises how widespread machine learning has become: “Machine learning is really utilised everywhere in science and technology and also everyday life. In a world where we can build this [quantum computing] architecture, I feel like it can be applied whenever there’s massive datasets available.”

Other experts praise the innovation but urge caution. Adrián Pérez-Salinas at ETH Zurich calls it a promising way to feed quantum machines data bit by bit, but stresses the need to test it against dequantisation, in which classical algorithms are found that match a quantum algorithm's performance and erase its advantage. Vedran Dunjko at Leiden University sees a fit for data-heavy experiments such as the Large Hadron Collider, though not for all AI workloads.

The researchers plan to broaden the range of applicable algorithms and to optimise the speed of quantum hardware. They suggest a 60-logical-qubit system could be feasible by 2030, offering early advantages for big-data AI processing.
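To make the scale comparison and the streaming idea concrete, here is a rough classical sketch; it is not the team's quantum protocol. The first part is a back-of-the-envelope check of the 300-qubit claim, assuming the standard counting that n logical qubits describe a state with 2^n amplitudes and the common estimate of roughly 10^80 atoms in the observable universe. The second part is a plain-Python analogy in which data is processed in fixed-size batches so that memory scales with the batch, not the dataset; the names `batches` and `streaming_mean` are purely illustrative.

```python
# Illustrative classical sketch only, not the authors' quantum method.

# 1. Scale check for the "every atom in the universe" comparison.
#    n logical qubits describe a state with 2**n amplitudes; the observable
#    universe is commonly estimated to hold roughly 10**80 atoms.
state_space_300_qubits = 2 ** 300
atoms_in_universe = 10 ** 80
print(f"2^300 amplitudes ~ {state_space_300_qubits:.2e}")              # ~2.04e+90
print(f"ratio to atoms   ~ {state_space_300_qubits / atoms_in_universe:.2e}")  # ~2e+10

# 2. Streaming analogy: process a large dataset in small batches so memory
#    use scales with the batch size rather than the dataset size.
from itertools import islice
from typing import Iterable, Iterator, List


def batches(stream: Iterable[float], batch_size: int) -> Iterator[List[float]]:
    """Yield fixed-size batches from a (possibly huge) data stream."""
    it = iter(stream)
    while chunk := list(islice(it, batch_size)):
        yield chunk


def streaming_mean(stream: Iterable[float], batch_size: int = 1024) -> float:
    """Compute a running mean without ever holding the full dataset."""
    total, count = 0.0, 0
    for chunk in batches(stream, batch_size):
        total += sum(chunk)
        count += len(chunk)
    return total / count


# A generator stands in for data arriving piece by piece, like frames of a
# streamed film rather than a full download.
data = (x * 0.5 for x in range(1_000_000))
print(streaming_mean(data))  # 249999.75
```

The design point of the analogy is simply that the whole dataset never needs to sit in memory at once, which mirrors the article's claim that batched loading sidesteps the impossibly large quantum memory once thought necessary.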

Related articles

Quantum computers face significant challenges from errors that limit their usefulness, but recent breakthroughs in error correction are offering hope. Innovations involve creating logical qubits from fewer physical ones and enhancing reliability through entanglement and additional protections. Experts describe this as an exciting time where theory and practice are converging.

Reported by AI

Researchers have reduced the quantum computing power required to break the widely used RSA encryption algorithm by a factor of ten, to about 100,000 qubits. This advancement builds on prior work and highlights growing vulnerabilities in current security systems. However, significant engineering challenges persist in building such machines.

Scientists in Australia have developed the largest quantum simulator to date, using 15,000 qubits to model exotic quantum materials. This device, known as Quantum Twins, could help optimize superconductors and other advanced substances. Built by embedding phosphorus atoms in silicon chips, it offers unprecedented control over electron properties.

Reported by AI

Scientists have identified a method to create multiple copies of quantum information by encrypting them with a one-time decryption key, sidestepping the no-cloning theorem. This approach, developed by Achim Kempf and colleagues at the University of Waterloo, was tested on an IBM quantum processor. The technique could enhance redundancy in quantum computing and storage systems.

Building on 2026 qubit reductions such as Iceberg Quantum's qLDPC breakthrough, recent studies project quantum computers cracking RSA-2048 and ECDLP-256 by 2029. Google and cybersecurity experts warn of an imminent Q-Day and are pushing post-quantum cryptography to avert a crisis worse than Y2K, with businesses ramping up quantum-safe migrations.
