Computer language spots error in widely cited physics paper

A researcher using the Lean formalisation language has uncovered a fundamental flaw in an influential 2006 physics paper on the two Higgs doublet model. Joseph Tooby-Smith at the University of Bath made the discovery while building a library of verified physics theorems. The original authors have acknowledged the error and plan to issue an erratum.

Joseph Tooby-Smith, a researcher at the University of Bath in the UK, applied the Lean computer language, which is designed to verify mathematical proofs, to a 2006 physics paper examining the stability of the two Higgs doublet model (2HDM) potential. The paper, which has been widely cited since its publication, claimed that a specific condition, denoted as C, was sufficient to ensure a stable solution. However, Tooby-Smith's formalisation process revealed a counterexample in which condition C failed to produce stability, undermining the theorem at its core.

Tooby-Smith described his effort as a routine step to incorporate the result into PhysLib, a growing database of formalised physics research modeled after the mathematics library MathsLib. "We're not going out there to disprove papers; we're going out there to build results that everyone can use," he said. While the error significantly affects the original paper, Tooby-Smith noted it is unlikely to impact subsequent studies that cited it. He notified the authors, who confirmed the issue and intend to publish an erratum.

This marks the first time such software has identified an error in a physics paper, prompting concerns about potential flaws in other works. "Because a lot of physicists aren't interested in these nitty-gritty details, sometimes they miss them, and that's where you get an error," Tooby-Smith explained.

Kevin Buzzard at Imperial College London endorsed extending formalisation to theoretical physics, highlighting its benefits for building reliable theorem libraries and training AI models. He noted that creating a substantial body of formalised physics results will require initial manual effort before machines can assist more effectively. Tooby-Smith's findings appear in a preprint on arXiv (DOI: 10.48550/arXiv.2603.08139).
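To give a flavour of how formalisation catches this kind of mistake, the following is a minimal Lean 4 sketch (assuming Mathlib, and using toy definitions that stand in for the actual 2HDM conditions, not the real PhysLib code): a claimed theorem of the form "condition C implies stability" is refuted by exhibiting a single concrete counterexample, which the proof checker then verifies mechanically.

```lean
import Mathlib

-- Illustrative stand-ins only; the real 2HDM conditions are far more involved.
def conditionC (x : ℝ) : Prop := x ^ 2 ≥ 0   -- holds for every real x
def stable (x : ℝ) : Prop := x ≥ 0           -- fails for negative x

-- The purported theorem "∀ x, conditionC x → stable x" is disproved:
-- x = -1 satisfies conditionC but not stability, and Lean checks this.
example : ¬ ∀ x : ℝ, conditionC x → stable x := by
  intro h
  have hneg := h (-1) (by norm_num)  -- conditionC (-1) holds: (-1)^2 ≥ 0
  norm_num at hneg                   -- but stable (-1), i.e. -1 ≥ 0, is false
```

The point is that Lean refuses to accept the universal claim until every case is covered; a counterexample that a human referee might overlook becomes a hard failure the proof assistant cannot ignore.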
