Language Processing

Illustration of a patient undergoing brain monitoring while listening to a podcast, with neural activity layers mirroring AI language model processing.
AI-generated image

Study links step-by-step brain responses during speech to layered processing in large language models


A new study reports that as people listen to a spoken story, neural activity in key language regions unfolds over time in a way that mirrors the layer-by-layer computations inside large language models. The researchers, who analyzed electrocorticography recordings from epilepsy patients during a 30-minute podcast, also released an open dataset intended to help other scientists test competing theories of how meaning is built in the brain.
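
A common way to test this kind of correspondence is to extract the hidden states of each layer of a language model for the stimulus transcript and fit an encoding model that predicts the recorded neural signal from each layer's representations. The sketch below illustrates that general approach only; it is not the study's actual pipeline, and the model name, the toy transcript, and the random stand-in for the ECoG response are all illustrative assumptions.

```python
# Hypothetical sketch of a layer-wise encoding analysis: extract per-layer
# hidden states from a language model for a stimulus transcript, then fit a
# ridge regression predicting a neural time series from each layer. The
# transcript and "neural_response" below are toy stand-ins, not data from
# the study described above.
import numpy as np
import torch
from transformers import AutoModel, AutoTokenizer
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

model_name = "gpt2"  # any model that exposes hidden states would do
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, output_hidden_states=True)
model.eval()

# Stand-in for the podcast transcript, tokenized once for the whole segment.
text = "the quick brown fox jumps over the lazy dog"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Tuple of (num_layers + 1) tensors, each shaped (1, seq_len, hidden_dim):
# the embedding output followed by every transformer block.
hidden_states = outputs.hidden_states

# Toy neural target, one value per token. In a real analysis this would be
# ECoG activity (e.g., high-gamma power) aligned to word onsets.
rng = np.random.default_rng(0)
n_tokens = inputs["input_ids"].shape[1]
neural_response = rng.standard_normal(n_tokens)

# Fit one encoding model per layer and compare cross-validated fit,
# asking which layer's representations best predict the neural signal.
for layer_idx, layer in enumerate(hidden_states):
    X = layer.squeeze(0).numpy()  # (seq_len, hidden_dim)
    scores = cross_val_score(Ridge(alpha=1.0), X, neural_response, cv=3)
    print(f"layer {layer_idx}: mean cross-validated R^2 = {scores.mean():.3f}")
```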
