Linguistics
Study explores why human language isn’t compressed like computer code
A new model from linguists Richard Futrell and Michael Hahn suggests that many hallmark features of human language—such as familiar words, predictable ordering and meaning built up step by step—reflect constraints on sequential information processing rather than a drive for maximum data compression. The work was published in Nature Human Behaviour.