I love that by following certain people, my timeline has become a stream of interesting and entertaining information. I love that sometimes I am able to fit my little publication just so into the 140 characters given to me.
When it was published, I read Kevin Hartnett’s article and interview with Christoph Adami, The Information Theory of Life, in Quanta Magazine. I recently revisited it, read through the commentary, and stumbled upon an interesting quote relating to the history of information in biology:
These two historical references predate Claude Shannon’s mathematical formalization of information in A Mathematical Theory of Communication (The Bell System Technical Journal, 1948) and even Erwin Schrödinger’s lectures (1943) and subsequent book What is Life? (1944).
For those interested in reading more on this historical tidbit, I’ve dug up a copy of the primary Forsdyke reference which first appeared on arXiv (prior to its ultimate publication in History of Psychiatry [.pdf]):
🔖 [1406.1391] ‘A Vehicle of Symbols and Nothing More.’ George Romanes, Theory of Mind, Information, and Samuel Butler by Donald R. Forsdyke 
Submitted on 4 Jun 2014 (v1), last revised 13 Nov 2014 (this version, v2)
Abstract: Today’s ‘theory of mind’ (ToM) concept is rooted in the distinction of nineteenth century philosopher William Clifford between ‘objects’ that can be directly perceived, and ‘ejects,’ such as the mind of another person, which are inferred from one’s subjective knowledge of one’s own mind. A founder, with Charles Darwin, of the discipline of comparative psychology, George Romanes considered the minds of animals as ejects, an idea that could be generalized to ‘society as eject’ and, ultimately, ‘the world as an eject’ – mind in the universe. Yet, Romanes and Clifford only vaguely connected mind with the abstraction we call ‘information,’ which needs ‘a vehicle of symbols’ – a material transporting medium. However, Samuel Butler was able to address, in informational terms depleted of theological trappings, both organic evolution and mind in the universe. This view harmonizes with insights arising from modern DNA research, the relative immortality of ‘selfish’ genes, and some startling recent developments in brain research.
Comments: Accepted for publication in History of Psychiatry. 31 pages including 3 footnotes. Based on a lecture given at Santa Clara University, February 28th 2014, at a Bannan Institute Symposium on ‘Science and Seeking: Rethinking the God Question in the Lab, Cosmos, and Classroom.’
The original arXiv article also referenced two lectures which are appended below:
[Original Draft of this was written on December 14, 2015.]
How a mathematical breakthrough from the 1960s now powers everything from spacecraft to cell phones.
Concurrent with the recent Pluto flyby, Alex Riley has a great popular science article on PBS that helps put the application of information theory to biology into perspective for the general reader. Like a science version of “The Princess Bride”, this story has a little bit of everything that could be good and entertaining: information theory, biology, DNA, Reed-Solomon codes, fossils, interplanetary exploration, mathematics, music, genetics, computers, and even paleontology. Fans of Big History are sure to love the interconnections presented here.
Yesterday, I saw an interesting linguistic exercise:
I have to imagine that once the conceptualization of language and some basic grammar existed, word generation was a much more common thing than it is now. It’s only been since the time of Noah Webster that humans have actively standardized things like spelling. If we use Papua New Guinea as a model of pre-agrarian society and consider that almost 12% of extant languages on Earth are spoken there, in an area about the size of Texas (and with about one-fifth of Texas’s population), then modern societies are actually severely limiting language creation, growth, diversity, and creativity. [cross reference: A World of Languages – and How Many Speak Them (Infographic)]
Consider that languages are currently going extinct at a rate of about one every 14 days, which puts us on course to lose about half of the 7,100 languages on the planet before the end of the century. Collective learning has potentially been growing at the expense of a shrinking body of diverse language! In the paper “Global distribution and drivers of language extinction risk” the authors indicate that, of all the variables tested, economic growth was most strongly linked to language loss.
To help put this exercise into perspective, we can look at the corpus of extant written Latin (a technically dead language):
These numbers become even smaller when considering ancient Greek texts.
Another interesting benchmark is the vocabulary of a modern two-year-old, who typically has 50–75 words, while a four-year-old has 250–500 words, which is about the level of the exercise.
As a contrast, consider the message in this TED Youth Talk from last year by Erin McKean, which students should be able to relate to:
And of course, there’s the dog Chaser, who 60 Minutes recently reported has a vocabulary of over 1,000 words. (Are we now destroying variants of “dog language” in favor of English too?!)
Hopefully the evolutionary value lost with these languages will be more than balanced out by the power of collective learning in the long run.
I’ve long been a student of the humanities (and particularly the classics) and have recently begun reviewing my very old and decrepit knowledge of Latin. It’s been two decades since I made a significant study of classical languages, and lately (as the result of conversations with friends like Dave Harris, Jim Houser, Larry Richardson, and John Kountouris) I’ve been drawn back to them in order to read a variety of classical texts in their original languages. Fortunately, in the intervening years, quite a lot has changed in the tools relating to pedagogy for language acquisition.
The biggest change in the intervening time is the spread of the internet, which supplies a broad variety of related websites with not only interesting resources for basic reading and writing, but even audio sources, apparently including the nightly news in Latin. There are a variety of blogs on Latin as well as online courseware, podcasts, pronunciation recordings, and even free textbooks. I’ve written briefly about the RapGenius platform before, but I feel compelled to mention it as a potentially powerful resource as well (Julius Caesar, Seneca, Ovid, Cicero, et al.). These sources are scarce in comparison with those for other modern languages, but given the size of the niche, there is quite a lot out there, and certainly a mountain in comparison to what existed only twenty years ago.
There has also been a spread of pedagogic aids like flashcard software, including Anki and Mnemosyne, with desktop, web-based, and even mobile versions making learning available in almost any situation. The psychology and learning research behind these technologies has come a long way toward helping students make the best use of their study time and retain what they’ve learned in long-term memory. Simple mobile applications like Duolingo exist for a variety of languages, though one doesn’t exist for classical Latin (yet).
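Both Anki and Mnemosyne descend from the SM-2 spaced-repetition algorithm, which stretches review intervals out each time a card is answered correctly. A minimal sketch of the idea (my own simplified version, not either program’s actual scheduler):

```python
def sm2_update(repetitions, interval, easiness, quality):
    """One review of a flashcard under the classic SM-2 scheme.

    quality: self-rated recall from 0 (blackout) to 5 (perfect).
    Returns the updated (repetitions, interval in days, easiness factor).
    """
    if quality < 3:
        # Failed recall: relearn from scratch, but keep the easiness factor.
        return 0, 1, easiness
    # Easiness drifts up for confident answers, down for shaky ones (floor 1.3).
    easiness = max(1.3, easiness + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    repetitions += 1
    if repetitions == 1:
        interval = 1
    elif repetitions == 2:
        interval = 6
    else:
        interval = round(interval * easiness)
    return repetitions, interval, easiness

# A new card (easiness starts at 2.5) answered perfectly three times:
state = (0, 0, 2.5)
for quality in (5, 5, 5):
    state = sm2_update(*state, quality)
print(state)  # review intervals grow: 1 day, 6 days, then roughly 17 days
```

The exponential growth of the intervals is the whole trick: each successful recall pushes the next review out just before the card would otherwise be forgotten.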
The other great change is the advancement of the digital humanities, which allows for many interesting applications of knowledge acquisition. One that I ran across this week was the Dickinson College Commentaries (DCC). Specifically, a handful of scholars have compiled and documented a list of the most common core vocabulary words in Latin (and in Greek) based on their frequency of appearance in extant works. This very specific data is of interest to me in relation to my work in information theory, but it also becomes a tremendously handy tool when attempting to learn and master a language. It is a truly impressive fact that mastering about 250 words of Latin allows one to read and understand 50% of most written Latin, while knowledge of 1,500 Latin words puts one at the 80% level of vocabulary mastery for most texts. Mastering even a very small list of vocabulary allows one to read a large variety of texts very comfortably. I can only think of the old concept of a concordance (generally limited to heavily studied texts like the Bible or possibly Shakespeare) which has now been put on some serious steroids for entire cultures. Another half step and one arrives at the Google Ngram Viewer.
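Those coverage figures fall out of the Zipfian distribution of word frequencies, and the computation behind them is simple enough to sketch. The toy corpus and function below are purely illustrative (this is not the DCC’s data or methodology); run against a real tokenized Latin corpus, the same function yields the kind of 250-word/50% numbers the DCC reports:

```python
from collections import Counter

def coverage(tokens, n):
    """Fraction of all running words accounted for by the n most frequent word types."""
    counts = Counter(tokens)
    total = sum(counts.values())
    return sum(c for _, c in counts.most_common(n)) / total

# Toy corpus of 14 tokens; a real measurement would use millions of words.
corpus = "et in arbitrium et non et in pace et bellum in et non sed".split()
print(coverage(corpus, 2))  # "et" (5) and "in" (3) cover 8/14 of the running text
```

Because frequencies decay so steeply, the coverage curve rises very fast at first and then flattens, which is exactly why a 250-word core list pays off so disproportionately.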
The best part is that one can, with very little technical knowledge, easily download the DCC Core Latin Vocabulary (itself a huge research undertaking) and upload and share it through the Anki platform, for example, to benefit a fairly large community of other scholars, learners, and teachers. With a variety of easy-to-use tools, it may shortly be that much easier to learn a language like Latin, potentially to the point that it is no longer a dead language. For those interested, you can find my version of the shared DCC Core Latin Vocabulary for Anki online; the DCC’s Chris Francese has posted details and a version for Mnemosyne already.
[Editor’s note: Anki’s web service occasionally clears decks of cards from their servers, so if you find that the Anki link to the DCC Core Latin is not working, please leave a comment below, and we’ll re-upload the deck for shared use.]
What tools and tricks do you use for language study and pedagogy?
Overall, James Gleick’s book The Information: a History, a Theory, a Flood is an excellent read. Given that it’s an area in which I’m intimately interested, I’m not too surprised that most of it is “review”, but I’d highly recommend it to the general public as a way to learn more about some of the excellent history, philosophy, and theory which Gleick so nicely summarizes throughout the book.
There are one or two references in the back which I’ll have to chase down and read, and one or two which, after many years, seem worth revisiting now that I’ve completed this.
Even for the specialist, Gleick manages to tie together some disparate thoughts into an excellent whole, which makes it a very worthwhile read. I found that toward the last several chapters Gleick’s style becomes much more flowery and less concrete, but most of this results from covering the “humanities” perspective of information, as opposed to the earlier parts of the text, which were more specific to the history and scientific theories he covered.
Computer pioneer who helped create the first spreadsheet, Bob Frankston, is this week's guest.
On a recent episode of Leo Laporte and Tom Merritt’s show Triangulation, they interviewed Bob Frankston of VisiCalc fame. It’s a great discussion of the current state of broadband in the U.S. and how it might be much better. They get just a bit technical in places, but it’s a fantastic and very accessible discussion of a communications topic every American should be aware of.
2011 Andrew Viterbi Lecture
Ming Hsieh Department of Electrical Engineering
“Adventures in Coding Theory”
Professor Elwyn Berlekamp
University of California, Berkeley
Gerontology Auditorium, Thursday, March 3, 4:30 to 5:30 p.m.
The inventors of error-correcting codes were initially motivated by problems in communications engineering. But coding theory has since also influenced several other fields, including memory technology, theoretical computer science, game theory, portfolio theory, and symbolic manipulation. This talk will recall some forays into these subjects.
Elwyn Berlekamp has been professor of mathematics and of electrical engineering and computer science at UC Berkeley since 1971; halftime since 1983, and Emeritus since 2002. He also has been active in several small companies in the sectors of computers-communications and finance. He is now chairman of Berkeley Quantitative LP, a small money-management company. He was chairman of the Board of Trustees of MSRI from 1994-1998, and was at the International Computer Science Institute from 2001-2003. He is a member of the National Academy of Sciences, the National Academy of Engineering, and the American Academy of Arts and Sciences. Berlekamp has 12 patented inventions, some of which were co-authored with USC Professor Emeritus Lloyd Welch. Some of Berlekamp’s algorithms for decoding Reed-Solomon codes are widely used on compact discs; others are NASA standards for deep space communications. He has more than 100 publications, including two books on algebraic coding theory and seven books on the mathematical theory of combinatorial games, including the popular Dots-and-Boxes Game: Sophisticated Child’s Play.
I wish I could be at this lecture in person today, but I’ll have to live with the live webcast.
A new study by Noam Sobel of the Olfaction Research Group at the Weizmann Institute of Science in Rehovot, Israel, and others, reported in the journal Science this week, finds that men who sniffed the tears of crying women produced less testosterone and found female faces less arousing.
Previous studies in animals such as mice and mole rats have shown that tears convey important chemical messages which are used to attract or repel others of the same species, providing good evidence for an interesting means of higher-level chemical communication. These previous studies also incidentally show that “emotional” tears are chemically distinct from “eye-protecting” tears.
Scientific American’s “60 Second Science” (via link or listen below) podcast has a good audio overview of the study for those without the time to read the paper.
In press reports, Adam Anderson, a University of Toronto psychologist who was not involved with the study, posited that the results may imply that “tears have some influence on sexual selection, and that’s not something we associate with sadness.” He continued, “It could be a way of warding off unwanted advances.”
This study provides a new hypothesis for the evolution of crying in humans. (Now if only we could find some better reasons for laughter…)
The take-home message may be that guys should not take their dates to weepy chick flicks, or, alternately, that women reluctantly accepting “pity dates” should force their suitors to sit through exactly these types of testosterone-damping films.