Entropy Is Universal Rule of Language | Wired Science

Read "Entropy Is Universal Rule of Language" by Lisa Grossman (Wired)
The amount of information carried in the arrangement of words is the same across all languages, even languages that aren't related to each other. This consistency could hint at a single common ancestral language, or universal features of how human brains process speech. "It doesn't matter what language or style you take," said systems biologist…
The research this article is based on is quite interesting for anyone working on language.
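The excerpt doesn't show the calculation, but the flavor of "information carried in the arrangement of words" can be illustrated by comparing the entropy of words taken in isolation with the conditional entropy of a word given the one before it; the gap is the information contributed by adjacent word order. The sketch below is a minimal illustration in Python, using a toy corpus and simple bigram estimates rather than the method used in the actual study, which (as I understand it) compared real texts against shuffled versions of themselves.

```python
from collections import Counter
from math import log2

def unigram_entropy(words):
    """Shannon entropy of words considered in isolation (order ignored)."""
    counts = Counter(words)
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values())

def conditional_entropy(words):
    """H(next word | current word), estimated from bigram counts."""
    bigrams = Counter(zip(words, words[1:]))
    contexts = Counter(words[:-1])
    total = sum(bigrams.values())
    h = 0.0
    for (w1, w2), count in bigrams.items():
        p_joint = count / total           # P(w1, w2)
        p_cond = count / contexts[w1]     # P(w2 | w1)
        h -= p_joint * log2(p_cond)
    return h

# Toy corpus; any real comparison would use long texts in several languages.
text = "the cat sat on the mat and the dog sat on the rug"
words = text.split()

h_word = unigram_entropy(words)
h_cond = conditional_entropy(words)
print(f"H(word)            = {h_word:.3f} bits")
print(f"H(word | previous) = {h_cond:.3f} bits")
print(f"info in word order = {h_word - h_cond:.3f} bits per word (approx.)")
```

Run on large corpora, the difference between the unordered and order-aware estimates is the kind of quantity the study claims stays roughly constant across unrelated languages.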

