Monkeys control virtual worlds with brain implants

Three rhesus macaque monkeys equipped with brain-computer interfaces navigated virtual environments using only their thoughts. Researchers implanted around 300 electrodes in motor and premotor cortex areas to enable this control. The experiments aim to improve intuitive control for people with paralysis.

Peter Janssen and colleagues at KU Leuven in Belgium implanted three rhesus macaque monkeys with brain-computer interfaces. Each monkey received three arrays of 96 electrodes, totaling about 300, placed in the primary motor cortex, dorsal premotor cortex, and ventral premotor cortex. These areas relate to movement execution and higher-level planning.

An AI model decoded the neural signals to steer virtual reality avatars on a 3D monitor the monkeys viewed. The animals controlled a sphere across virtual landscapes from a fixed viewpoint, animated monkey avatars from a third-person perspective like in video games, and even navigated virtual buildings by opening doors and moving between rooms.

Janssen described the method as more intuitive than prior BCIs, which often require imagining specific physical actions like finger movements. “We cannot ask these monkeys, of course, but we just think that it’s a more intuitive way of controlling a computer, basically,” Janssen said. He noted that users of existing systems sometimes liken them to “trying to move your ears,” a skill that can take weeks to master.

Janssen believes the approach could help humans with paralysis navigate virtual worlds or wheelchairs more naturally, though human implant locations need further study. “There’s a bit of work necessary to know exactly where to implant a human... But once we figure that out, it should be possible. It should actually be easier because you can explain to the human what they are supposed to do,” he added.

Andrew Jackson at Newcastle University praised the monkeys' ability to adapt control across viewpoints and contexts, suggesting the implants tap into abstract movement representations in the brain. The findings appear in Science Advances.
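The paper's actual decoding model is not detailed in this article. Purely as illustration of the general idea — mapping activity from roughly 300 electrodes (three 96-channel arrays) to an avatar's intended movement — here is a minimal sketch of a linear ridge decoder fit on synthetic data. All dimensions, tuning assumptions, and the choice of a linear model are illustrative, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: 3 Utah-style arrays x 96 channels = 288 electrodes,
# in line with the ~300 electrodes the article describes.
n_channels = 3 * 96
n_samples = 2000  # time bins of binned firing rates

# Synthetic stand-in for recorded data: assume each channel is linearly
# tuned to 2-D intended velocity, plus noise (an assumption, not the paper's model).
true_tuning = rng.normal(size=(n_channels, 2))
velocity = rng.normal(size=(n_samples, 2))  # intended avatar velocity
rates = velocity @ true_tuning.T + rng.normal(scale=0.5, size=(n_samples, n_channels))

# Fit a ridge-regularized linear decoder:
#   W = (R^T R + lam * I)^-1 R^T V,  then  v_hat = R @ W
lam = 1.0
W = np.linalg.solve(rates.T @ rates + lam * np.eye(n_channels), rates.T @ velocity)

decoded = rates @ W
corr = np.corrcoef(decoded[:, 0], velocity[:, 0])[0, 1]
print(f"decoded-vs-true x-velocity correlation: {corr:.2f}")
```

In a real BCI pipeline the decoded velocity would drive the avatar in closed loop, and the animal's own learning would compensate for decoder error; this sketch only shows the open-loop fitting step.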

Related articles


VR study finds AI prosthetic arms feel most natural when a reach takes about one second


AI-driven prosthetic arms may feel most like part of the body when their autonomous reaching motion lasts about one second, a virtual reality experiment reported in *Scientific Reports* suggests. In the study, that mid-range speed produced the highest ratings of body ownership, sense of control and usability, while very fast and very slow movements reduced acceptance and increased discomfort.

A new study has shown that the brain regions controlling facial expressions in macaques work together in unexpected ways, challenging prior assumptions about their division of labor. Researchers led by Geena Ianni at the University of Pennsylvania used advanced neural recordings to reveal how these gestures are encoded. The findings could pave the way for future brain-computer interfaces that decode facial signals for patients with neurological impairments.


Chinese scientists have drawn inspiration from the Japanese paper-cutting art of kirigami to develop stretchable microelectrode arrays, aiming to overcome limitations in electrode technology such as that used by Neuralink. These arrays were implanted into macaque monkeys, where they flexed with brain tissue to record hundreds of neurons simultaneously. The research was published in the February 5 issue of Nature Electronics.

A bonobo named Kanzi has demonstrated the ability to engage in make-believe play, a cognitive skill previously unseen in non-human primates. In experiments conducted shortly before his death, Kanzi participated in a pretend tea party involving imaginary juice and grapes. The findings suggest that our closest primate relatives possess the capacity for imagination.


Researchers at Korea University have developed a dual-output artificial synapse to boost the energy efficiency of multitasking AI systems, the university announced. The device emits electrical and optical signals simultaneously to enable parallel processing. Tests showed up to 47 percent faster computation and energy consumption reduced by up to a factor of 32 compared to conventional GPU hardware.

Neuroscientists at Trinity College Dublin have found that babies as young as two months old can already sort visual information into categories like animals and toys. Using brain scans and AI, the study reveals early foundations of perception. This challenges previous assumptions about infant cognition.


Researchers at Johns Hopkins University report that Kanzi, a language-trained bonobo, followed pretend “tea party” scenarios by pointing to where an experimenter had acted as if imaginary juice and grapes were located. The work, published in Science, adds experimental evidence to a long-running debate over whether elements of pretense and imagination are unique to humans.

