Entropy Is Universal Rule of Language | Wired Science

Read Entropy Is Universal Rule of Language by Lisa Grossman (Wired)
The amount of information carried in the arrangement of words is the same across all languages, even languages that aren't related to each other. This consistency could hint at a single common ancestral language, or universal features of how human brains process speech. "It doesn't matter what language or style you take," said systems biologist…
The research behind this article should be quite interesting to anyone doing language research.
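The quantity in question is the information carried by word order: how much the arrangement of words, beyond their raw frequencies, contributes to a text's entropy. As a toy illustration (not the paper's method), one can compare the unigram entropy of a text with its bigram conditional entropy; the drop is a rough estimate of the information carried by adjacent word order. The sample sentence below is mine, chosen only to have repeated words:

```python
import math
from collections import Counter

def unigram_entropy(words):
    """Shannon entropy (bits/word) of the word-frequency distribution."""
    n = len(words)
    counts = Counter(words)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def bigram_conditional_entropy(words):
    """Entropy (bits/word) of each word given the word before it."""
    n = len(words) - 1
    pair_counts = Counter(zip(words, words[1:]))
    prev_counts = Counter(words[:-1])
    h = 0.0
    for (prev, _), c in pair_counts.items():
        p_pair = c / n               # joint probability of the pair
        p_cond = c / prev_counts[prev]  # probability of word given predecessor
        h -= p_pair * math.log2(p_cond)
    return h

text = ("the quick brown fox jumps over the lazy dog and "
        "the lazy dog sleeps while the quick fox runs").split()
h_uni = unigram_entropy(text)
h_bi = bigram_conditional_entropy(text)
print(f"unigram entropy:      {h_uni:.3f} bits/word")
print(f"bigram cond. entropy: {h_bi:.3f} bits/word")
print(f"order information:    {h_uni - h_bi:.3f} bits/word")
```

On any text with repeated words, conditioning on the previous word lowers the entropy, and the study's finding is that this kind of order-carried information comes out roughly the same across unrelated languages.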

Phrase of the Week: “Disconfirmation Bias”

Wordspy definition of disconfirmation bias


Dictionary: A Malevolent Literary Device

Ambrose Bierce, American editorialist, journalist, short story writer, fabulist, and satirist
in The Devil’s Dictionary


HARASS SARAH is a PALINdrome, as well as a popular left-wing sport.

This is definitely the quote of the week.

Sol Golomb, mathematician and information theorist
via personal communication while discussing a palindromic word puzzle