Brain-Computer Interfaces

Illustration of Northwestern University's wireless micro-LED brain implant delivering light patterns to mouse neurons for sensory signaling.


Northwestern team develops wireless implant that ‘speaks’ to the brain with light

Scientists at Northwestern University have created a soft, wireless brain implant that delivers patterned light directly to neurons, enabling mice to interpret these signals as meaningful cues without relying on sight, sound or touch. The fully implantable device uses an array of up to 64 micro-LEDs to generate complex activity patterns across the cortex, a development that could advance next-generation prosthetics and sensory therapies, according to Northwestern and Nature Neuroscience.

A new study has shown that the brain regions controlling facial expressions in macaques work together in unexpected ways, challenging prior assumptions about a strict division of labor between them. Researchers led by Geena Ianni at the University of Pennsylvania used advanced neural recordings to reveal how these gestures are encoded. The findings could pave the way for future brain-computer interfaces that decode facial signals for patients with neurological impairments.

OpenAI CEO Sam Altman is launching a new brain-computer interface startup called Merge Labs. The venture, which aims to read brain activity using ultrasound, is being spun out of the Los Angeles-based nonprofit Forest Neurotech, according to a source familiar with the plans.
