Linguistics

Illustration depicting linguists studying why human language resists compression like computer code, contrasting brain processing with digital efficiency.
Image generated by AI

Study explores why human language isn’t compressed like computer code


A new model from linguists Richard Futrell and Michael Hahn suggests that many hallmark features of human language—such as familiar words, predictable ordering and meaning built up step by step—reflect constraints on sequential information processing rather than a drive for maximum data compression. The work was published in Nature Human Behaviour.
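The contrast the study draws can be made concrete: a general-purpose compressor eliminates exactly the redundancy (repeated words, predictable ordering) that human language retains. A minimal illustrative sketch, not taken from the study, using Python's standard `zlib` module:

```python
import zlib

# Illustrative sketch (not from the study): language-like text is full of
# redundancy -- repeated words, predictable ordering -- which a pure
# compression objective would strip away.
sentence = b"the cat sat on the mat and the dog sat on the rug " * 20

compressed = zlib.compress(sentence)
ratio = len(compressed) / len(sentence)

print(f"original: {len(sentence)} bytes, compressed: {len(compressed)} bytes")
print(f"ratio: {ratio:.2f}")
```

The compressed output is a small fraction of the original, showing how much predictability the text carries. The study's point is that language keeps this redundancy because it eases sequential, real-time processing, even though it is inefficient from a pure data-compression standpoint.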
