Entropy Is Universal Rule of Language | Wired Science

Read Entropy Is Universal Rule of Language by Lisa Grossman (Wired)
The amount of information carried in the arrangement of words is the same across all languages, even languages that aren't related to each other. This consistency could hint at a single common ancestral language, or universal features of how human brains process speech. "It doesn't matter what language or style you take," said systems biologist…
The research behind this article should be of real interest to anyone doing language research.
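The core idea is easy to play with. Below is a minimal sketch (plain Python, standard library only) of one way to estimate how much information word ordering carries: compress the original word sequence, compress random shufflings of the same words, and take the gap. To be clear, this is my own illustration, not the study's method; the zlib-compression proxy for entropy and the function names are assumptions for the sake of example, and the researchers used more careful entropy estimators on long corpora.

```python
import random
import zlib

def compressed_bits_per_word(words):
    """Rough entropy proxy: bits of zlib-compressed output per word.
    Compressed length only upper-bounds entropy, so this is a crude estimate."""
    data = " ".join(words).encode("utf-8")
    return 8 * len(zlib.compress(data, 9)) / len(words)

def word_order_information(text, trials=20, seed=0):
    """Estimate the information carried by word ordering alone: the average
    entropy proxy over random shufflings of the words, minus the proxy for
    the original sequence. A positive value means the original ordering is
    more predictable than chance, i.e. the ordering itself carries information."""
    words = text.split()
    original = compressed_bits_per_word(words)
    rng = random.Random(seed)
    shuffled = [
        compressed_bits_per_word(rng.sample(words, len(words)))
        for _ in range(trials)
    ]
    return sum(shuffled) / trials - original

# Toy usage; meaningful cross-language comparisons need long natural-language corpora.
sample = "the quick brown fox jumps over the lazy dog " * 200
print(f"ordering information ~ {word_order_information(sample):.2f} bits/word")
```

On long, real texts the gap comes out positive, and the article's claim is that, measured properly, it is roughly the same across languages; the toy repeated sentence above mostly just demonstrates the machinery.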

