The quintessential poolside summer reading: A Mind at Play

Instagram filter used: Clarendon

Photo taken at: Gerrish Swim & Tennis Club

📗 Started reading A Mind at Play by Jimmy Soni & Rob Goodman

📖 Read pages i-16 of A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni & Rob Goodman

A great little introduction and start to what promises to be the science biography of the year. The book opens with a story I’d heard Sol Golomb tell several times. It’s actually a bittersweet memory for me, as the last recounting I came across appeared on the occasion of Shannon’s 100th birthday celebration in The New Yorker:

In 1985, at the International Symposium in Brighton, England, the Shannon Award went to the University of Southern California’s Solomon Golomb. As the story goes, Golomb began his lecture by recounting a terrifying nightmare from the night before: he’d dreamed that he was about to deliver his presentation, and who should turn up in the front row but Claude Shannon. And then, there before Golomb in the flesh, and in the front row, was Shannon. His reappearance (including a bit of juggling at the banquet) was the talk of the symposium, but he never attended again.

I had emailed Sol about the story and became concerned when I didn’t hear back. I discovered shortly afterward that he had passed away the following day.

nota bene: I’m currently reading an advance reader copy of this; the book won’t be out until mid-July 2017.

🔖 A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni, Rob Goodman

Bookmarked A Mind at Play: How Claude Shannon Invented the Information Age (Simon & Schuster)
The life and times of one of the foremost intellects of the twentieth century: Claude Shannon—the neglected architect of the Information Age, whose insights stand behind every computer built, email sent, video streamed, and webpage loaded.

Claude Shannon was a groundbreaking polymath, a brilliant tinkerer, and a digital pioneer. He constructed a fleet of customized unicycles and a flamethrowing trumpet, outfoxed Vegas casinos, and built juggling robots. He also wrote the seminal text of the digital revolution, which has been called “the Magna Carta of the Information Age.” His discoveries would lead contemporaries to compare him to Albert Einstein and Isaac Newton. His work anticipated by decades the world we’d be living in today—and gave mathematicians and engineers the tools to bring that world to pass.

In this elegantly written, exhaustively researched biography, Jimmy Soni and Rob Goodman reveal Claude Shannon’s full story for the first time. It’s the story of a small-town Michigan boy whose career stretched from the era of room-sized computers powered by gears and string to the age of Apple. It’s the story of the origins of our digital world in the tunnels of MIT and the “idea factory” of Bell Labs, in the “scientists’ war” with Nazi Germany, and in the work of Shannon’s collaborators and rivals, thinkers like Alan Turing, John von Neumann, Vannevar Bush, and Norbert Wiener. And it’s the story of Shannon’s life as an often reclusive, always playful genius. With access to Shannon’s family and friends, A Mind at Play brings this singular innovator and creative genius to life.
I can’t wait to read this new biography about Claude Shannon! The bio/summer read I’ve been waiting for.

With any luck an advance reader copy is speeding its way to me! (Sorry, you can’t surprise me with a belated copy for my birthday.) A review is forthcoming.

You have to love the cover art by Lauren Peters-Collaer.

Warren Weaver Bot!

Liked Someone has built a Warren Weaver Bot! by Weaverbot (Twitter)
This is the signal for the second.
How can you not follow this twitter account?!

Now I’m waiting for a Shannon bot and a Wiener bot. Maybe a John McCarthy bot would be apropos too?!

Happy 100th Birthday Claude Shannon

Many regular readers here are sure to know who Claude Shannon is, but sadly most of the rest of the world is in the dark. To give you an idea of his importance in society and even a bit in pop culture, today’s Google doodle celebrates Shannon’s life and work.

Overview of Shannon’s Work

Most importantly, in his 1937 master’s thesis at the Massachusetts Institute of Technology, Shannon applied George Boole’s algebra (better known now as Boolean algebra) to electric circuits, thereby making the modern digital revolution possible. To give you an idea of how far we’ve come, the typical high school student can now read and understand all of its content. If you’d like to give it a try, you can download it from MIT’s website.
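To make the idea concrete, here’s a tiny sketch of my own (not drawn from the thesis itself): switches wired in series behave like a Boolean AND, switches wired in parallel behave like an OR, so any relay network can be written down and simplified as a Boolean expression.

```python
# Toy illustration (mine, not Shannon's): relay circuits as Boolean algebra.

def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed (AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed (OR)."""
    return a or b

def circuit(x: bool, y: bool, z: bool) -> bool:
    """A composite circuit: x in series with (y in parallel with z)."""
    return series(x, parallel(y, z))

# Enumerate the truth table, just as one would analyze the physical circuit.
for x in (False, True):
    for y in (False, True):
        for z in (False, True):
            print(int(x), int(y), int(z), "->", int(circuit(x, y, z)))
```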

His other huge accomplishment was a journal article he wrote in 1948 entitled “A Mathematical Theory of Communication,” published in the Bell System Technical Journal. When it was republished a year later, one of the most notable changes was the new title: “The Mathematical Theory of Communication.” While copies of the original article are freely available on the internet, the more casual reader will appreciate the more recent edition from MIT Press, which also includes a fabulous, elucidative, and extensive introduction written by Warren Weaver. This paper contains the theoretical underpinning that allowed the efflorescence of all modern digital communication to occur. It ranks as one of the most influential and far-reaching documents in human history, rivaling even the Bible.
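For a taste of what the paper formalizes, here’s a minimal sketch of my own (not taken from the paper) computing its central quantity, the entropy of a discrete source, H = -sum(p_i * log2(p_i)), measured in bits per symbol.

```python
# A minimal sketch (mine) of Shannon entropy: H = -sum(p_i * log2(p_i)) bits/symbol.
from math import log2

def entropy(probabilities):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))    # a fair coin: 1.0 bit per toss
print(entropy([0.99, 0.01]))  # a heavily biased coin: ~0.08 bits per toss
```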

Further, my own excitement in Shannon stems in part from his Ph.D. thesis “An Algebra for Theoretical Genetics” (1940) which has inspired most of the theoretical material I’m always contemplating.

Google Doodle Art animated by artist Nate Swinehart celebrates Claude Shannon’s 100th Birthday

Additional Sources:

For those looking for more information, try some of the following (non-technical) sources:

Claude Elwood Shannon smoking

Donald Forsdyke Indicates the Concept of Information in Biology Predates Claude Shannon

When it was first published, I read Kevin Hartnett’s article and interview with Christoph Adami, The Information Theory of Life, in Quanta Magazine. I recently revisited it, read through the commentary, and stumbled upon an interesting quote relating to the history of information in biology:

Polymath Adami has ‘looked at so many fields of science’ and has correctly indicated the underlying importance of information theory, to which he has made important contributions. However, perhaps because the interview was concerned with the origin of life and was edited and condensed, many readers may get the impression that IT is only a few decades old. However, information ideas in biology can be traced back to at least 19th century sources. In the 1870s Ewald Hering in Prague and Samuel Butler in London laid the foundations. Butler’s work was later taken up by Richard Semon in Munich, whose writings inspired the young Erwin Schrodinger in the early decades of the 20th century. The emergence of his text – “What is Life” – from Dublin in the 1940s, inspired those who gave us DNA structure and the associated information concepts in “the classic period” of molecular biology. For more please see: Forsdyke, D. R. (2015) History of Psychiatry 26 (3), 270-287.

Donald Forsdyke, bioinformatician and theoretical biologist,
in response to The Information Theory of Life in Quanta Magazine

These two historical references predate Claude Shannon’s mathematical formalization of information in A Mathematical Theory of Communication (The Bell System Technical Journal, 1948) and even Erwin Schrödinger’s lecture (1943) and subsequent book What is Life? (1944).

For those interested in reading more on this historical tidbit, I’ve dug up a copy of the primary Forsdyke reference which first appeared on arXiv (prior to its ultimate publication in History of Psychiatry [.pdf]):

🔖 [1406.1391] ‘A Vehicle of Symbols and Nothing More.’ George Romanes, Theory of Mind, Information, and Samuel Butler by Donald R. Forsdyke  [1]
Submitted on 4 Jun 2014 (v1), last revised 13 Nov 2014 (this version, v2)

Abstract: Today’s ‘theory of mind’ (ToM) concept is rooted in the distinction of nineteenth century philosopher William Clifford between ‘objects’ that can be directly perceived, and ‘ejects,’ such as the mind of another person, which are inferred from one’s subjective knowledge of one’s own mind. A founder, with Charles Darwin, of the discipline of comparative psychology, George Romanes considered the minds of animals as ejects, an idea that could be generalized to ‘society as eject’ and, ultimately, ‘the world as an eject’ – mind in the universe. Yet, Romanes and Clifford only vaguely connected mind with the abstraction we call ‘information,’ which needs ‘a vehicle of symbols’ – a material transporting medium. However, Samuel Butler was able to address, in informational terms depleted of theological trappings, both organic evolution and mind in the universe. This view harmonizes with insights arising from modern DNA research, the relative immortality of ‘selfish’ genes, and some startling recent developments in brain research.

Comments: Accepted for publication in History of Psychiatry. 31 pages including 3 footnotes. Based on a lecture given at Santa Clara University, February 28th 2014, at a Bannan Institute Symposium on ‘Science and Seeking: Rethinking the God Question in the Lab, Cosmos, and Classroom.’

The original arXiv article also referenced two lectures which are appended below:

http://www.youtube.com/watch?v=a3yNbTUCPd4

[Original Draft of this was written on December 14, 2015.]

References

[1]
D. R. Forsdyke, “‘A vehicle of symbols and nothing more’. George Romanes, theory of mind, information, and Samuel Butler,” History of Psychiatry, vol. 26, no. 3, pp. 270-287, Aug. 2015 [Online]. Available: http://journals.sagepub.com/doi/abs/10.1177/0957154X14562755

Popular Science Books on Information Theory, Biology, and Complexity

Previously, I made a large and somewhat random list of books which lie at the intersection of information theory, physics, and engineering practice as applied to biology. Below I’ll begin to do a somewhat better job of providing a finer gradation of technical level for both the hobbyist and the aspiring student who wish to bring themselves to a higher level of understanding of these areas. In future posts, I’ll try to begin classifying other texts into graduated strata as well. The final list will be maintained here: Books at the Intersection of Information Theory and Biology.

Introductory / General Readership / Popular Science Books

These books are written at a generally non-technical level and give a broad overview of their topics, with occasional forays into interesting or intriguing subtopics. They include few, if any, mathematical equations or formal conceptualizations. Typically, any high school student should be able to read, follow, and understand the broad concepts behind these books. Though often non-technical, these texts can give some useful insight into the topics at hand, even for the most advanced researchers.

Complexity: A Guided Tour by Melanie Mitchell (review)

Possibly one of the best places to start, this text gives a great overview of most of the major areas of study related to these fields.

Entropy Demystified: The Second Law Reduced to Plain Common Sense by Arieh Ben-Naim

One of the best books on the concept of entropy out there.  It can be read even by middle school students with no exposure to algebra and does a fantastic job of laying out the conceptualization of how entropy underlies large areas of the broader subject. Even those with Ph.D.’s in statistical thermodynamics can gain something useful from this lovely volume.

The Information: A History, a Theory, a Flood by James Gleick (review)

A relatively recent popular science volume covering various conceptualizations of what information is and how it’s been dealt with in science and engineering. Though it has its flaws, it’s certainly a good introduction for the beginner, particularly with regard to history.

The Origin of Species by Charles Darwin

One of the most influential pieces of writing known to man, this classic text is the basis from which major strides in biology have been made. A must-read for everyone on the planet.

Information, Entropy, Life and the Universe: What We Know and What We Do Not Know by Arieh Ben-Naim

Information Theory and Evolution by John Avery

The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life by Werner R. Loewenstein (review)

Information Theory, Evolution, and the Origin of Life by Hubert P. Yockey

The four books above have a significant amount of overlap. Though one could read all of them, I recommend that those pressed for time choose Ben-Naim first. As I write this, I’ll note that Ben-Naim’s book is scheduled for release on May 30, 2015, but he’s been kind enough to allow me to read an advance copy while it was in process; it gets my highest recommendation in its class. Loewenstein covers a bit more than Avery, who also has a more basic presentation. Most who continue with the subject will later come across Yockey’s Information Theory and Molecular Biology, which is similar to his text here but written at a slightly higher level of sophistication. Those who plan to stop at this level might want to try Yockey third instead.

The Red Queen: Sex and the Evolution of Human Nature by Matt Ridley

Grammatical Man: Information, Entropy, Language, and Life  by Jeremy Campbell

Life’s Ratchet: How Molecular Machines Extract Order from Chaos by Peter M. Hoffmann

Complexity: The Emerging Science at the Edge of Order and Chaos by M. Mitchell Waldrop

The Big Picture: On the Origins of Life, Meaning, and the Universe Itself by Sean Carroll (Dutton, May 10, 2016)

In the coming weeks and months, I’ll try to continue adding recommended books across the rest of the spectrum, the balance of which follows in outline form below. As always, I welcome suggestions and recommendations based on others’ experiences as well. If you’d like to suggest additional resources for any of the sections below, please do so via our suggestion box. For those interested in additional resources, please take a look at the ITBio Resources page, which includes information about related research groups; references and journal articles; academic and research institutes, societies, groups, and organizations; and conferences, workshops, and symposia.

Lower Level Undergraduate

These books are written at a level that can be grasped and understood by most students at the freshman or sophomore university level. Coursework in math, science, and engineering will usually presume knowledge of calculus, basic probability theory, introductory physics, chemistry, and basic biology.

Upper Level Undergraduate

These books are written at a level that can be grasped and understood by those at the junior or senior university level. Coursework in math, science, and engineering may presume knowledge of probability theory, differential equations, linear algebra, complex analysis, abstract algebra, signal processing, organic chemistry, molecular biology, evolutionary theory, thermodynamics, advanced physics, and basic information theory.

Graduate Level

These books are written at a level that can be grasped and understood by most working at the master’s level at most universities. Coursework presumes all the previously mentioned classes, though it may require a higher level of sub-specialization in one or more areas of mathematics, physics, biology, or engineering practice. Because of the depth and breadth of disciplines covered here, many may feel the need to delve into areas outside of their particular specialization.

Information Theory is the New Central Discipline

Replied to Information Theory is the new central discipline. by Nassim Nicholas Taleb (facebook.com)

INFORMATION THEORY is the new central discipline. This graph was from 20y ago in the seminal book Cover and Thomas, as the field was starting to be defined. Now Information Theory has been expanded to swallow even more fields.

Born in, of all disciplines, Electrical Engineering, the field has progressively infiltrating probability theory, computer science, statistical physics, data science, gambling theory, ruin problems, complexity, even how one deals with knowledge, epistemology. It defines noise/signal, order/disorder, etc. It studies cellular automata. You can use it in theology (FREE WILL & algorithmic complexity). As I said, it is the MOTHER discipline.

I am certain much of Medicine will naturally grow to be a subset of it, both operationally, and in studying how the human body works: the latter is an information machine. Same with linguistics. Same with political “science”, same with… everything.

I am saying this because I figured out what the long 5th volume of the INCERTO will be. Cannot say now with any precision but it has to do with a variant of entropy as the core natural generator of Antifragility.

[Revised to explain that it is not *replacing* other disciplines, just infiltrating them as the point was initially misunderstood…]

Nassim Nicholas Taleb via Facebook

[My comments posted to the original Facebook post follow below.]

I’m coming to this post a bit late as I’m playing a bit of catch up, but agree with it wholeheartedly.

In particular, applications to molecular biology and medicine have really begun to come to a heavy boil in just the past five years. This year appears to mark the biggest renaissance for the application of information theory to the area of biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s Symposium on Information Theory in Biology at Gatlinburg, Tennessee in 1956.

Upcoming/recent conferences/workshops on information theory in biology include:

At the beginning of September, Christoph Adami posted an awesome and very sound paper on arXiv entitled “Information-theoretic considerations concerning the origin of life,” which promises to turn the science of the origin of life on its head.

I’ll note in passing, for those interested, that Claude Shannon’s famous master’s thesis at MIT (in which he applied Boolean algebra to electric circuits, allowing the digital revolution to occur) and his subsequent “A Mathematical Theory of Communication” were so revolutionary that nearly everyone forgets his MIT Ph.D. thesis, “An Algebra for Theoretical Genetics,” which presaged the area of cybernetics and the current applications of information theory to microbiology, and which is probably as seminal as Sir R. A. Fisher’s applications of statistics to science in general and biology in particular.

For those commenting on the post who are interested in a layman’s introduction to information theory, I recommend John Robinson Pierce’s An Introduction to Information Theory: Symbols, Signals and Noise (Dover has a very inexpensive edition). After this, one should take a look at Claude Shannon’s original paper. (The MIT Press printing includes an excellent overview by Warren Weaver along with the paper itself.) The mathematics in the paper really isn’t too technical, and most of it should be comprehensible to most advanced high school students.

For those who don’t understand the concept of entropy, I HIGHLY recommend Arieh Ben-Naim’s book Entropy Demystified: The Second Law Reduced to Plain Common Sense with Seven Simulated Games. He really does tear the concept down into its most basic form in a way I haven’t seen others come remotely close to, and which even my mother can comprehend (with no mathematics at all). (I recommend this presentation even to those with Ph.D.’s in physics because it is so truly fundamental.)

For the more advanced mathematicians, physicists, and engineers Arieh Ben-Naim does a truly spectacular job of extending ET Jaynes’ work on information theory and statistical mechanics and comes up with a more coherent mathematical theory to conjoin the entropy of physics/statistical mechanics with that of Shannon’s information theory in A Farewell to Entropy: Statistical Thermodynamics Based on Information.

For the advanced readers/researchers interested in more at the intersection of information theory and biology, I’ll also mention that I maintain a list of references, books, and journal articles in a Mendeley group entitled “ITBio: Information Theory, Microbiology, Evolution, and Complexity.”

Venn diagram of how information theory relates to other fields: Figure 1.1 [page 2] from Thomas M. Cover and Joy A. Thomas’s textbook Elements of Information Theory, Second Edition (John Wiley & Sons, Inc., 2006) [First Edition, 1991]

Workshop Announcement: Entropy and Information in Biological Systems

I rarely, if ever, reblog anything here, but this particular post from John Baez’s blog Azimuth is so on-topic, that attempting to embellish it seems silly.

Entropy and Information in Biological Systems (Part 2)

John Harte, Marc Harper and I are running a workshop! Now you can apply here to attend:

Information and entropy in biological systems, National Institute for Mathematical and Biological Synthesis, Knoxville, Tennessee, Wednesday-Friday, 8-10 April 2015.

Click the link, read the stuff and scroll down to “CLICK HERE” to apply. The deadline is 12 November 2014.

Financial support for travel, meals, and lodging is available for workshop attendees who need it. We will choose among the applicants and invite 10-15 of them.

The idea
Information theory and entropy methods are becoming powerful tools in biology, from the level of individual cells, to whole ecosystems, to experimental design, model-building, and the measurement of biodiversity. The aim of this investigative workshop is to synthesize different ways of applying these concepts to help systematize and unify work in biological systems. Early attempts at “grand syntheses” often misfired, but applications of information theory and entropy to specific highly focused topics in biology have been increasingly successful. In ecology, entropy maximization methods have proven successful in predicting the distribution and abundance of species. Entropy is also widely used as a measure of biodiversity. Work on the role of information in game theory has shed new light on evolution. As a population evolves, it can be seen as gaining information about its environment. The principle of maximum entropy production has emerged as a fascinating yet controversial approach to predicting the behavior of biological systems, from individual organisms to whole ecosystems. This investigative workshop will bring together top researchers from these diverse fields to share insights and methods and address some long-standing conceptual problems.

So, here are the goals of our workshop:

  •  To study the validity of the principle of Maximum Entropy Production (MEP), which states that biological systems – and indeed all open, non-equilibrium systems – act to produce entropy at the maximum rate.
  • To familiarize all the participants with applications to ecology of the MaxEnt method: choosing the probabilistic hypothesis with the highest entropy subject to the constraints of our data. We will compare MaxEnt with competing approaches and examine whether MaxEnt provides a sufficient justification for the principle of MEP.
  • To clarify relations between known characterizations of entropy, the use of entropy as a measure of biodiversity, and the use of MaxEnt methods in ecology.
  • To develop the concept of evolutionary games as “learning” processes in which information is gained over time.
  • To study the interplay between information theory and the thermodynamics of individual cells and organelles.

For more details, go here.

If you’ve got colleagues who might be interested in this, please let them know. You can download a PDF suitable for printing and putting on a bulletin board by clicking on this:
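[A brief aside from me, not part of Baez’s announcement: for anyone unfamiliar with the MaxEnt method named in the goals above, here’s a minimal numerical sketch of my own. Among all distributions over a fixed set of outcomes that satisfy a given constraint (here, a prescribed mean), the maximum-entropy choice takes a Gibbs/exponential form, and the single Lagrange multiplier can be found by a simple bisection search. The die constrained to an average of 4.5 is Jaynes’s classic illustration.]

```python
# A minimal MaxEnt sketch (mine, not from the workshop): among all distributions p
# over the outcomes 1..6 with a prescribed mean, the maximum-entropy choice has the
# Gibbs form p_i proportional to exp(lam * i). We find lam by bisection.
from math import exp

OUTCOMES = [1, 2, 3, 4, 5, 6]

def gibbs(lam):
    """Gibbs weights exp(lam * i) over the outcomes, normalized to a distribution."""
    weights = [exp(lam * i) for i in OUTCOMES]
    z = sum(weights)
    return [w / z for w in weights]

def mean(p):
    return sum(i * pi for i, pi in zip(OUTCOMES, p))

def maxent_with_mean(target, lo=-10.0, hi=10.0, iters=100):
    """Bisect on lam until the Gibbs distribution matches the target mean."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean(gibbs(mid)) < target:
            lo = mid
        else:
            hi = mid
    return gibbs((lo + hi) / 2)

# A "die" constrained to average 4.5 instead of the fair die's 3.5.
print([round(p, 3) for p in maxent_with_mean(4.5)])
```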

Book Review: Gregory Chaitin’s “Proving Darwin: Making Biology Mathematical”

Gregory Chaitin’s book Proving Darwin: Making Biology Mathematical, combining biology, microbiology, mathematics, evolution, and even information theory, is directly in my wheelhouse. I had delayed reading it following a few initial poor reviews, and sadly I must confirm that I’m ultimately disappointed in the effort shown here, though there is some very significant value buried within. Unfortunately, that value is buried so deeply that very few, if any, will actually make the concerted effort to find it.


This effort does seem to make a more high-minded and noble attempt than what I would call the “Brian Greene method” in which an academic seemingly gives up on serious science to publish multiple texts on a popular topic to cash in on public interest in that topic through sales of books. In this respect Chaitin is closer to Neil deGrasse Tyson in his effort to expound an interesting theory to the broader public and improve the public discourse, though I would admit he’s probably a bit more (self-) interested in pushing his own theory and selling books (or giving him the benefit of the doubt, perhaps the publisher has pushed him to this).

Though there is a reasonable stab at providing some philosophical background to fit the topic into the broader fabric of science and theory in the later chapters, most of it is rather poorly motivated and is covered far better in other non-technical works. While it is nice to have some semblance of Chaitin’s philosophy and feelings, the inclusion of this type of material only tends to soften the blow of his theoretical work and makes the text read more like pseudo-science or simple base philosophy without any actual rigorous underpinning.

I’m assuming that his purpose in writing the book is to make the theories he’s come up with in his primary paper on the topic more accessible to the broader community of science as well as the public itself. It’s easy for a groundbreaking piece of work to be hidden in the broader scientific literature, but Chaitin seems to be using his pedestal as a reasonably popular science writer to increase the visibility of his work here. He admittedly mentions that this effort stems from a hobby, as his primary area is algorithmic information theory and computer science, not biology or evolution, though his meager references in the text do at least indicate some facility with some of the “right” sources in these latter areas.

Speaking from a broad public perspective, there is certainly enough interest in this general topic to warrant such a book, though based on the reviews of others via Amazon, Goodreads, etc., the book has sadly missed its mark. He unfortunately sticks too closely to the rule that the inclusion of mathematical equations is detrimental to the sale of one’s books. Sadly, his broader point is seemingly lost on the general public, as his ability to analogize his work isn’t as strong as that of Brian Greene with respect to theoretical physics (string theory).

From the higher perspective of a researcher who works in all of the relevant areas related to the topic, I was even more underwhelmed with the present text, aside from the single URL link to the original, much more technical paper which Chaitin wrote in 2010. To me this was the most valuable part of the entire text, though he did provide some small amount of reasonable detail in his appendix.

I can certainly appreciate Chaitin’s enthusiastic following of John von Neumann, but I’m disappointed in his lack of acknowledgement of Norbert Wiener or Claude Shannon, who all collaborated in the mid part of the 20th century. I’m sure Chaitin is more than well aware of the father of information theory, but I’ll be willing to bet that although he’s probably read his famous master’s thesis and his highly influential Bell Labs article “A/The Mathematical Theory of Communication,” he is, like most, shamefully and wholly unaware of Shannon’s MIT doctoral thesis.

Given Chaitin’s own personal aim to further the acceptance of his own theories and work and the goal of the publisher to sell more copies, I would mention a few recommendations for future potential editions:

The greater majority of his broader audience will have at least a passably reasonable understanding of biology and evolution, but very little, if any, understanding of algorithmic information theory. He would be better off expounding upon this subject to bring people up to speed so they can better understand his viewpoint and his subsequent proof. Though I understand the need to be relatively light in regard to the number of equations and technicalities included, Chaitin could follow some of his heroes of mathematical exposition and do a slightly better job of explaining what is going on here. He could also go a long way toward adding some significant material to the appendices to help higher-end general readers, and specifically biologists, understand more of the technicalities of algorithmic information theory and better follow his proof, which should appear in intricate glory in the appendix as well. I might also recommend excising some of the more philosophical material, which tends to undermine his scientific “weight.” Though I found it interesting that he gives a mathematical definition of “intelligent design,” I have a feeling its intricacies were lost on most of his readership; this point alone could go a long way toward solidifying the position of evolution amongst non-scientists, particularly in America, and win the support of heavyweights like Dawkins himself.

I’ll agree wholeheartedly with one reviewer who said that Chaitin tends to “state small ideas repeatedly, and every time at the same shallow level with astonishing amount of redundancy (mostly consisting of chit-chat and self congratulations)”. This certainly detracted from my enjoyment of the work. Chaitin also includes an awful lot of name dropping of significant scientific figures tangential to the subject at hand. This may have been more impressive if he included the results of his discussions with them about the subject, but I’m left with the impression that he simply said hello, shook their hands, and at best was simply inspired by his having met them. It’s nice that he’s had these experiences, but it doesn’t help me to believe or follow his own work.

For the technically interested reader, save yourself some time and simply skim through chapter five and a portion of the appendix relating to his proof and then move on to his actual paper. For the non-technical reader, I expect you’ll get more out of reading Richard Dawkins’ early work (The Selfish Gene) or possibly Werner R. Loewenstein’s The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life.

Though I would certainly agree that we could use a mathematical proof of evolution, and that Chaitin has made a reasonable theoretical stab at one, this book sadly wasn’t the best vehicle to motivate broader interest in such an effort. I’ll give him five stars for effort, three for general content, but in the end, for most readers it will have to be at most a two-star work overall.

This review was originally published on June 17, 2013.

‘The Information’ by James Gleick – Book Review by Janet Maslin | New York Times

Reposted ‘The Information’ by James Gleick - Review (nytimes.com)
“The Information,” by James Gleick, is to the nature, history and significance of data what the beach is to sand.
This book is assuredly going to have to skip up to the top of my current reading list.

“The Information” is so ambitious, illuminating and sexily theoretical that it will amount to aspirational reading for many of those who have the mettle to tackle it. Don’t make the mistake of reading it quickly. Imagine luxuriating on a Wi-Fi-equipped desert island with Mr. Gleick’s book, a search engine and no distractions. “The Information” is to the nature, history and significance of data what the beach is to sand.

In this relaxed setting, take the time to differentiate among the Brownian (motion), Bodleian (library) and Boolean (logic) while following Mr. Gleick’s version of what Einstein called “spukhafte Fernwirkung,” or “spooky action at a distance.” Einstein wasn’t precise about what this meant, and Mr. Gleick isn’t always precise either. His ambitions for this book are diffuse and far flung, to the point where providing a thumbnail description of “The Information” is impossible.

So this book’s prologue is its most slippery section. It does not exactly outline a unifying thesis. Instead it hints at the amalgam of logic, philosophy, linguistics, research, appraisal and anecdotal wisdom that will follow. If Mr. Gleick has one overriding goal it is to provide an animated history of scientific progress, specifically the progress of the technology that allows information to be recorded, transmitted and analyzed. This study’s range extends from communication by drumbeat to cognitive assault by e-mail.

As an illustration of Mr. Gleick’s versatility, consider what he has to say about the telegraph. He describes the mechanical key that made telegraphic transmission possible; the compression of language that this new medium encouraged; that it literally was a medium, a midway point between fully verbal messages and coded ones; the damaging effect its forced brevity had on civility; the confusion it created as to what a message actually was (could a mother send her son a dish of sauerkraut?) and the new conceptual thinking that it helped implement. The weather, which had been understood on a place-by-place basis, was suddenly much more than a collection of local events.

Beyond all this Mr. Gleick’s telegraph chapter, titled “A Nervous System for the Earth,” finds time to consider the kind of binary code that began to make sense in the telegraph era. It examines the way letters came to be treated like numbers, the way systems of ciphers emerged. It cites the various uses to which ciphers might be put by businessmen, governments or fiction writers (Lewis Carroll, Jules Verne and Edgar Allan Poe). Most of all it shows how this phase of communication anticipated the immense complexities of our own information age.

Although “The Information” unfolds in a roughly chronological way, Mr. Gleick is no slave to linearity. He freely embarks on colorful digressions. Some are included just for the sake of introducing the great eccentrics whose seemingly marginal inventions would prove to be prophetic. Like Richard Holmes’s “Age of Wonder” this book invests scientists with big, eccentric personalities. Augusta Ada Lovelace, the daughter of Lord Byron, may have been spectacularly arrogant about what she called “my immense reasoning faculties,” claiming that her brain was “something more than merely mortal.” But her contribution to the writing of algorithms can, in the right geeky circles, be mentioned in the same breath as her father’s contribution to poetry.

The segments of “The Information” vary in levels of difficulty. Grappling with entropy, randomness and quantum teleportation is the price of enjoying Mr. Gleick’s simple, entertaining riffs on the Oxford English Dictionary’s methodology, which has yielded 30-odd spellings of “mackerel” and an enchantingly tongue-tied definition of “bada-bing” and on the cyber-battles waged via Wikipedia. (As he notes, there are people who have bothered to fight over Wikipedia’s use of the word “cute” to accompany a picture of a young polar bear.) That Amazon boasts of being able to download a book called “Data Smog” in less than a minute does not escape his keen sense of the absurd.

As it traces our route to information overload, “The Information” pays tribute to the places that made it possible. He cites and honors the great cogitation hives of yore. In addition to the Institute for Advanced Study in Princeton, N.J., the Mount Rushmore of theoretical science, he acknowledges the achievements of corporate facilities like Bell Labs and I.B.M.’s Watson Research Center in the halcyon days when many innovations had not found practical applications and progress was its own reward.

“The Information” also lauds the heroics of mathematicians, physicists and computer pioneers like Claude Shannon, who is revered in the computer-science realm for his information theory but not yet treated as a subject for full-length, mainstream biography. Mr. Shannon’s interest in circuitry using “if … then” choices conducting arithmetic in a binary system had novelty when he began formulating his thoughts in 1937. “Here in a master’s thesis by a research assistant,” Mr. Gleick writes, “was the essence of the computer revolution yet to come.”

Among its many other virtues “The Information” has the rare capacity to work as a time machine. It goes back much further than Shannon’s breakthroughs. And with each step backward Mr. Gleick must erase what his readers already know. He casts new light on the verbal flourishes of the Greek poetry that preceded the written word: these turns of phrase could be as useful for their mnemonic power as for their art. He explains why the Greeks arranged things in terms of events, not categories; how one Babylonian text that ends with “this is the procedure” is essentially an algorithm; and why the telephone and the skyscraper go hand in hand. Once the telephone eliminated the need for hand-delivered messages, the sky was the limit.

In the opinion of “The Information” the world of information still has room for expansion. We may be drowning in spam, but the sky’s still the limit today.

Bookmarked Information theory in living systems, methods, applications, and challenges. by R. A. Gatenby and B. R. Frieden (Bull Math Biol. 2007 Feb;69(2):635-57. Epub 2006 Nov 3.)

Living systems are distinguished in nature by their ability to maintain stable, ordered states far from equilibrium. This is despite constant buffeting by thermodynamic forces that, if unopposed, will inevitably increase disorder. Cells maintain a steep transmembrane entropy gradient by continuous application of information that permits cellular components to carry out highly specific tasks that import energy and export entropy. Thus, the study of information storage, flow and utilization is critical for understanding first principles that govern the dynamics of life. Initial biological applications of information theory (IT) used Shannon's methods to measure the information content in strings of monomers such as genes, RNA, and proteins. Recent work has used bioinformatic and dynamical systems to provide remarkable insights into the topology and dynamics of intracellular information networks. Novel applications of Fisher-, Shannon-, and Kullback-Leibler informations are promoting increased understanding of the mechanisms by which genetic information is converted to work and order. Insights into evolution may be gained by analysis of the fitness contributions from specific segments of genetic information as well as the optimization process in which the fitness are constrained by the substrate cost for its storage and utilization. Recent IT applications have recognized the possible role of nontraditional information storage structures including lipids and ion gradients as well as information transmission by molecular flux across cell membranes. Many fascinating challenges remain, including defining the intercellular information dynamics of multicellular organisms and the role of disordered information storage and flow in disease.

PMID: 17083004 DOI: 10.1007/s11538-006-9141-5

Brief Thoughts on the Google/Verizon Compromise and Net Neutrality in the Mobile Space

This last week there’s been a lot of interesting discussion about net neutrality as it relates particularly to the mobile space.  Though there has been some generally good discussion and interesting debate on the topic, I’ve found the best spirited discussion to be that held by Leo Laporte, Gina Trapani, Jeff Jarvis, and guest Stacey Higginbotham on this week’s episode of This Week in Google.

http://www.youtube.com/watch?v=jJQy2R6UT5U

What I’ve found most interesting in many of these debates, including this one, is that though there is occasional discussion of building out additional infrastructure to provide additional capacity, there is generally never discussion of utilizing information theory to improve bandwidth either mathematically or from an engineering perspective.  Claude Shannon is rolling in his grave.

Apparently, despite last year’s great “digital switch” of television from analog frequencies, which freed up additional broadcast capacity, and the subsequent auction of the 700 MHz spectrum, everyone forgets that engineering additional capacity is often cheaper and easier than just physically building more. We are still far from reaching Shannon’s original limit, so we know there’s much room for improvement here, particularly because most of the progress toward that limit in the past two decades has come about as a result of research in, and the growth of, the mobile communications industry.
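For a rough sense of what “engineering additional capacity” means, here’s a small back-of-the-envelope sketch (my own, with purely illustrative numbers) of the Shannon-Hartley limit, C = B log2(1 + S/N), the maximum error-free rate a bandwidth-limited channel with a given signal-to-noise ratio can support.

```python
# Back-of-the-envelope sketch (illustrative numbers only) of the Shannon-Hartley limit.
from math import log2

def shannon_capacity(bandwidth_hz, snr_linear):
    """Maximum error-free data rate (bits/s) for the given bandwidth and linear SNR."""
    return bandwidth_hz * log2(1 + snr_linear)

# Hypothetical example: a 10 MHz mobile channel at 20 dB SNR (linear SNR = 100).
print(f"{shannon_capacity(10e6, 100) / 1e6:.1f} Mbit/s")  # ~66.6 Mbit/s
```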

Perhaps our leaders could borrow a page from JFK’s launch of the space race in the ’60s: instead of focusing on space, might they look to science and mathematics to make our communications infrastructure more robust and guarantee free and open internet access to all Americans?