Researchers behind a new review in Frontiers in Science argue that rapid progress in artificial intelligence and brain technologies is outpacing scientific understanding of consciousness, raising the risk of ethical and legal mistakes. They say developing evidence-based tests for detecting awareness—whether in patients, animals or emerging artificial and lab-grown systems—could reshape medicine, welfare debates and technology governance.
The rapid development of artificial intelligence and neurotechnology is intensifying calls from consciousness researchers to clarify what it means to be conscious—and how to detect it.
In a review published in Frontiers in Science, Prof. Axel Cleeremans of Université Libre de Bruxelles, Prof. Liad Mudrik of Tel Aviv University, and Prof. Anil Seth of the University of Sussex argue that advances in these technologies are moving faster than scientific agreement on how consciousness arises. They describe consciousness in broadly familiar terms—as awareness of the world and of oneself—while noting that science still lacks consensus on how subjective experience emerges from physical processes.
The authors point to ongoing competition among major scientific theories of consciousness, including global workspace approaches, higher-order theories, integrated information theory and predictive processing frameworks. They argue that progress depends in part on developing stronger methods to test these ideas, including “adversarial collaborations” in which proponents of rival theories jointly design experiments intended to distinguish between them.
A key goal, the review argues, is the development of evidence-based tests for consciousness that can be applied beyond healthy adult humans. Such tools could affect clinical care by helping clinicians detect covert awareness in some patients who appear unresponsive, and by refining assessments in conditions such as coma, advanced dementia and anesthesia, where judgments about awareness can influence treatment planning and end-of-life decisions.
The review also outlines potential implications for mental health research. The authors argue that a better scientific account of subjective experience could help bridge the gap between findings in animal models and patients' lived experience of symptoms, with possible relevance for conditions including depression, anxiety and schizophrenia.
Beyond medicine, the authors say improved ways of identifying consciousness could reshape debates over animal welfare and ethical obligations. If society gains clearer evidence about which animals are sentient, they suggest, practices in research, agriculture and conservation could change accordingly.
They also highlight potential legal consequences. The review notes that neuroscience findings about unconscious influences on behavior could pressure legal systems to revisit how they interpret responsibility and concepts such as mens rea, the mental element traditionally required for criminal liability.
In technology, the authors argue that emerging systems—from advanced AI to brain organoids and brain–computer interfaces—raise new questions about whether consciousness could be created, altered, or convincingly simulated, and what moral and regulatory obligations might follow.
Cleeremans warned that the unintended creation of consciousness would pose “immense ethical challenges and even existential risk.” Seth said that advances in the science of consciousness are likely to reshape how humans understand themselves and their relationship to both AI and the natural world. Mudrik argued that a clearer understanding of consciousness in animals could transform how humans treat them and emerging biological systems.
To move the field forward, the authors call for more coordinated, collaborative research that combines careful theory testing with greater attention to phenomenology—the qualities of experience itself—alongside functional and neural measures.
They argue that such work is needed not only to advance basic science, but also to prepare society for the medical, ethical and technological consequences of being able to detect—or potentially create—consciousness.