
Acquired: Symposium on Information Theory in Biology, Gatlinburg, Tennessee, October 29-31, 1956

The next [major] thrust [in biology], the one that may bring us closer to the Archimedean ideal of science, we may expect to come from information theory.
At the dawn of the twentieth century, it was already clear that, chemically speaking, you and I are not much different from cans of soup. And yet we can do many complex and even fun things we do not usually see cans of soup doing.
Though there is a paucity of equations, particularly on the information-theoretic side, Loewenstein does a fantastic job of discussing the theory and philosophy of what is going on in the overlapping fields of information theory and microbiology. (I will note that it is commonly held wisdom within publishing, particularly for books aimed at the broader public, that the number of equations in a text is inversely proportional to its sales; I’m sure this is the reason for the lack of mathematical substantiation, which he could easily have supplied.)
This is a much more specific – and therefore, to my mind, much better – book than John Avery’s Information Theory and Evolution, which covers some similar ground. Loewenstein has, in my opinion, a much better and more specific grasp of the material. Those who feel overwhelmed by Loewenstein may prefer to step back to Avery’s more facile presentation.
Loewenstein has a deft ability to describe what is going on, giving both an up-close view with many examples and a spectacular view of the broader picture – something which is often missing in general science books of this sort. Readers with no mathematical or microbiology background can benefit from it as much as those with more experience.
One thing which sets it apart from much of its competition, even in the broader general-science area of non-fiction, is the author’s quirky but adept ability to add some flowery language and analogy to clarify his points. Though many will find this off-putting, it really does add flavor to what might otherwise read as dry and dull explication. His range of background knowledge, philosophy, and vocabulary is second only to that of Simon Winchester (and is possibly on par with, or even exceeds, it in some cases).
I’d highly recommend this book to people prior to their academic studies of biochemistry or molecular cell biology, or to budding biomedical engineers prior to their junior year of study. I truly wish I had read this in 1994 myself, but alas it didn’t exist until a few years later. I lament that I hadn’t picked it up and read it thoroughly until now.
For my part, his drastically different viewpoint on how biology should be approached moving forward is spot on. I am firmly a member of this new “school”. His final chapter on this concept is truly illuminating from a philosophical and theoretical view, and I encourage people to read it first instead of last.
I’ll also note briefly that I’ve seen some reviews of this book which mention creationism or intelligent design and whether proponents of those philosophies feel that Loewenstein’s work here supports them, particularly since Loewenstein appeared on a panel with Dembski once. For those who take a purely scientific viewpoint of things, I will state that this book is written in full support of evolution and microbiology and doesn’t use the concept of “information” to muddy the waters the way many ID arguments typically do.
This pressure can be calculated by minimizing the Helmholtz function of the system. Details can be found in Fermi’s textbook on thermodynamics (Fermi 1956). But why does osmosis explain the behavior of a salted cucumber? This question is left to the reader as a parting gift.
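For the curious, here is a minimal sketch of where that minimization leads, assuming the ideal dilute-solution limit that Fermi treats (this is a sketch, not Fermi’s full derivation). The Helmholtz free energy is

\[
F = U - TS,
\]

and minimizing it with respect to the exchange of solvent across a semipermeable membrane yields van ’t Hoff’s relation for the osmotic pressure \(\pi\),

\[
\pi V = nRT \quad\Longleftrightarrow\quad \pi = cRT,
\]

where \(n\) is the number of moles of solute in volume \(V\), \(c = n/V\) is the molar concentration, \(R\) is the gas constant, and \(T\) is the temperature. The cucumber question, of course, remains yours to answer.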
How do we search for alien life if it's nothing like the life that we know? Christoph Adami shows how he uses his research into artificial life -- self-replicating computer programs -- to find a signature, a "biomarker," that is free of our preconceptions of what life is.
The central dogma of molecular biology deals with the detailed residue-by-residue transfer of sequential information. It states that information cannot be transferred back from protein to either protein or nucleic acid.
This highly interdisciplinary book discusses the phenomenon of life, including its origin and evolution (and also human cultural evolution), against the background of thermodynamics, statistical mechanics, and information theory. Among the central themes is the seeming contradiction between the second law of thermodynamics and the high degree of order and complexity produced by living systems. This paradox has its resolution in the information content of the Gibbs free energy that enters the biosphere from outside sources, as the author shows. The role of information in human cultural evolution is another focus of the book. One of the final chapters discusses the merging of information technology and biotechnology into a new discipline — bio-information technology.
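A hedged one-formula gloss on that resolution (my own summary in standard notation, not Avery’s): the Gibbs free energy

\[
G = H - TS
\]

flowing into the biosphere (ultimately from sunlight) can pay for a local decrease in entropy \(S\), i.e. a local increase in order, while the total entropy of the system plus its surroundings still increases, so the second law is never violated.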
This is a fantastic book to which, for the majority of people, I’d give a five-star review. For my own purposes, however, I was expecting far more on the theoretical side of information theory and statistical mechanics as applied to microbiology than it delivered, so I’m giving it three stars from a purely personal perspective.
I do wish that someone had placed it in my hands and forced me to read it when I was a freshman in college entering the study of biomedical and electrical engineering. It is a far more impressive book at that level, and for those in the general public who are interested in the general history of science and the philosophy of these topics. The general reader may be somewhat scared off by a small amount of mathematics in chapter 4, but there is really no loss of continuity in skimming through most of it. For those looking for a bit more rigor, Avery provides some additional details in appendix A, but for the specialist the presentation is heavily lacking.
The book opens with a facile but acceptable overview of the history of the development of the theory of evolution, whereas most other texts would simply begin with Darwin’s work and completely skip the important philosophical and scientific contributions of Aristotle, Averroes, Condorcet, Linnaeus, Erasmus Darwin, Lamarck, or the debates between Cuvier and St. Hilaire.
For me, the meat of the book was chapters 3-5 and appendix A, which collectively covered molecular biology, evolution, statistical mechanics, and a bit of information theory, albeit from a very big-picture point of view. Unfortunately, the rigor of the presentation and the underlying mathematics were skimmed over all too quickly for me to gain what I had hoped from the text. On the other hand, the individual sections of “suggestions for further reading” throughout the book seem well researched and offer an acceptable launching pad for delving into topics in places where they may be covered more thoroughly.
The final several chapters become a bit more of an overview of the philosophy surrounding cultural evolution and information technology, topics which are covered and discussed much better in James Gleick’s recent book The Information.
Overall, Avery has laid out a well-structured outline of a broad array of subjects and covers it all fairly well in an easy-to-read and engaging style.
It may sound ridiculous to say that [Alexander Graham] Bell and his successors were the fathers of modern commercial architecture – of the skyscraper. But wait a minute. Take the Singer Building, the Flatiron Building, the Broad Exchange, the Trinity, or any of the giant office buildings. How many messages do you suppose go in and out of those buildings every day? Suppose there was no telephone and every message had to be carried by a personal messenger? How much room do you think the necessary elevators would leave for offices? Such structures would be an economic impossibility.
John Battelle recently posted a review of James Gleick’s latest book The Information: A History, A Theory, A Flood. It reminds me that I find it almost laughable when the vast majority of the technology press and the digerati bloviate about their beats when, at root, they know almost nothing about how technology truly works or the mathematical and theoretical underpinnings of what is happening — and, even worse, they don’t seem to really care.
I’ve seen hundreds of reviews and thousands of mentions of Steven Levy’s book In the Plex: How Google Thinks, Works, and Shapes Our Lives in the past few months — in fact, Battelle reviewed it just before Gleick’s book — but I’ve seen few, if any, reviews of Gleick’s, which I honestly think is a much more worthwhile read about what is going on in the world and has farther-reaching implications about where we are headed.
I’ll give a BIG tip of my hat to John for his efforts in reading Gleick and posting his commentary, and for continuing to push the boundary further by inviting Gleick to speak at Web 2.0 Summit in the fall. I hope his efforts will bring the topic to the much larger tech community. I further hope he and others might take the time to read Claude Shannon’s original paper [.pdf download], and for anyone further interested in the concept of thermodynamic entropy, I can recommend Andre Thess’s text The Entropy Principle: Thermodynamics for Unsatisfied Minds, which I’ve recently discovered and which does a good and logically consistent job of defining the concept at a level accessible to the general public.
Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available.
It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.
DOI: https://doi.org/10.1103/PhysRev.106.620
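For readers who want the mechanics behind the abstract, here is the standard maximum-entropy calculation Jaynes is referring to (my sketch in conventional notation, not a quotation from the paper): maximize the entropy

\[
S = -\sum_i p_i \ln p_i
\]

subject to normalization \(\sum_i p_i = 1\) and a known expectation value \(\sum_i p_i E_i = \langle E \rangle\). Introducing Lagrange multipliers for the two constraints gives

\[
p_i = \frac{e^{-\beta E_i}}{Z(\beta)}, \qquad Z(\beta) = \sum_i e^{-\beta E_i},
\]

which is the canonical distribution, with the partition function \(Z\) appearing directly as the normalization constant. This is exactly the sense in which the usual computational rules, “starting with the determination of the partition function,” follow immediately from the maximum-entropy principle.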
Nominated for quote of the week, which I encountered while reading Matt Ridley’s The Rational Optimist:
He who receives an idea from me, receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me.
The amount of information carried in the arrangement of words is the same across all languages, even languages that aren't related to each other. This consistency could hint at a single common ancestral language, or universal features of how human brains process speech. "It doesn't matter what language or style you take," said systems biologist…
Bob Frankston, the computer pioneer who helped create the first spreadsheet, is this week’s guest.