Acquired Symposium on Information Theory in Biology Gatlinburg, Tennessee, October 29-31, 1956

Acquired Symposium on Information Theory in Biology, Gatlinburg, Tennessee, October 29-31, 1956 by Hubert P. Yockey, Robert P. Platzman, Henry Quastler (editors) (Pergamon Press; 1st edition, 1958)

The next major thrust in biology

Werner R. Loewenstein, biologist and physiologist
in The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life, Oxford University Press, 1999

 

The Touchstone of Life (Book Cover)

You and I Are Not Much Different from Cans of Soup

Philip Nelson, American physicist
in Biological Physics: Energy, Information, Life

 

Biological Physics: Energy, Information, Life written by Philip Nelson

Book Review: Werner Loewenstein’s “The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life”

The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life by Werner R. Loewenstein

Though there is a paucity of equations, particularly on the information theoretic side, Loewenstein does a fantastic job of discussing the theory and philosophy of what is going on in the overlapping fields of information theory and microbiology. (I will note that it is commonly held wisdom within publishing, particularly for books aimed at the broader public, that the number of equations in a text is inversely proportional to its sales, and I’m sure this is the reason for the lack of mathematical substantiation, which he could easily have supplied.)

The Touchstone of Life (Book Cover)

This is a much more specific and therefore much better – in my mind – book than John Avery’s Information Theory and Evolution which covers some similar ground. Loewenstein has a much better and more specific grasp of the material in my opinion. Those who feel overwhelmed by Loewenstein may prefer to take a step back to Avery’s more facile presentation.

Loewenstein has a deft ability to describe what is going on and to give both an up-close view with many examples and a spectacular view of the broader picture – something which is often missing in general science books of this sort. Readers with no mathematical or microbiology background can benefit from it as much as those with more experience.

One thing which sets it apart from much of its competition, even in the broader general science area of non-fiction, is that the author has a quirky but adept ability to add some flowery language and analogy to clarify his points. Though many will find this off-putting, it really does add some additional flavor to what might otherwise be dry and dull explication. His range of background knowledge, philosophy, and vocabulary is second only to (and in some cases on par with or even exceeding) that of Simon Winchester.

I’d highly recommend this book to people prior to their academic studies of biochemistry or molecular cell biology, or to budding biomedical engineers prior to their junior year of study. I truly wish I had read this in 1994 myself, but alas it wasn’t published until a few years later. I lament that I didn’t pick it up and read it thoroughly until now.

For my part, his drastically different viewpoint on the way in which biology should be approached moving forward is spot on. I am firmly a member of this new “school”. His final chapter on this concept is truly illuminating from a philosophical and theoretical point of view, and I encourage people to read it first instead of last.

I’ll also note briefly that I’ve seen some reviews of this book which make mention of creationism or intelligent design and whether proponents of those philosophies feel that Loewenstein’s work here supports them, particularly since Loewenstein appeared on a panel with Dembski once. I will state, for those who take a purely scientific viewpoint of things, that this book is written in full support of evolution and microbiology and doesn’t use the concept of “information” to muddy the waters the way many ID arguments typically do.

Original review posted to GoodReads.com on 9/4/12

Rod, Can You Tell Our Contestant What She’s Won?

Possibly one of the oddest closing sentences of a technical book – and a very good one at that – I’ve ever read:

This pressure can be calculated by minimizing the Helmholtz function of the system. Details can be found in Fermi’s textbook on thermodynamics (Fermi 1956). But why does osmosis explain the behavior of a salted cucumber? This question is left to the reader as a parting gift.

André Thess in The Entropy Principle: Thermodynamics for the Unsatisfied (Springer, 2011)
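
For readers curious about the calculation Thess alludes to, here is a standard dilute-solution sketch (my own summary, not a derivation quoted from his book): minimizing the Helmholtz function at fixed temperature amounts to equating the solvent’s chemical potential on the two sides of the semipermeable membrane, which for an ideal dilute solution gives the van ’t Hoff relation.

```latex
% Equilibrium condition from minimizing the Helmholtz function at fixed T:
% the solvent's chemical potential must be equal on both sides of the membrane.
\mu_{\mathrm{solvent}}^{\mathrm{pure}}(T, p) \;=\; \mu_{\mathrm{solvent}}^{\mathrm{solution}}(T, p + \Pi)
% For an ideal dilute solution with molar solute concentration c, this reduces to
% the van 't Hoff relation for the osmotic pressure:
\Pi \;\approx\; c\,R\,T
```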

 

Featured image by KTRYNA on Unsplash

Christoph Adami: Finding Life We Can’t Imagine | TEDx

Watched Finding life we can't imagine by Christoph Adami from ted.com
How do we search for alien life if it's nothing like the life that we know? Christoph Adami shows how he uses his research into artificial life -- self-replicating computer programs -- to find a signature, a "biomarker," that is free of our preconceptions of what life is.
Adami’s work is along similar lines to some of my own research. This short video gives an intriguing look into some of the basics of how to define life so that one can recognize it when one sees it.
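
As a toy illustration of the statistical kind of biomarker Adami describes (my own sketch, not his code or data): life that reuses a small set of building blocks heavily leaves a skewed, lower-entropy abundance distribution, while an abiotic mixture looks much closer to uniform.

```python
from collections import Counter
from math import log2

def abundance_entropy(sample: str) -> float:
    """Shannon entropy (bits/symbol) of a sample's monomer-abundance distribution."""
    counts = Counter(sample)
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Hypothetical samples over a four-letter alphabet:
abiotic = "ACGT" * 25                             # roughly uniform usage
biotic = "A" * 70 + "C" * 20 + "G" * 8 + "T" * 2  # heavily reused monomers

print(abundance_entropy(abiotic))  # ~2.0 bits: no preference among monomers
print(abundance_entropy(biotic))   # ~1.2 bits: skewed usage, a crude "biomarker"
```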

Meaning according to Humpty Dumpty

Humpty Dumpty (in a rather scornful tone): When I use a word, it means just what I choose it to mean – neither more nor less.
Alice: The question is, whether you can make a word mean so many different things?
Humpty Dumpty: The question is, which is to be master – that’s all.
Alice: (Too much puzzled to say anything, so after a minute Humpty Dumpty began again)
Humpty Dumpty: They’ve a temper, some of them – particularly verbs, they’re the proudest – adjectives you can do anything with, but not verbs – however, I can manage the whole of them! Impenetrability! That’s what I say!
Alice: Would you tell me, please, what that means?
Humpty Dumpty (looking very much pleased): Now you talk like a reasonable child. I meant by impenetrability that we have had enough of that subject, and it would be just as well if you’d mention what you mean to do next, as I suppose you don’t mean to stop here all the rest of your life.
Alice (in a thoughtful tone): That’s a great deal to make one word mean.
Humpty Dumpty: When I make a word do a lot of work like that, I always pay it extra.
Alice (too much puzzled to make any other remark): Oh!

The Central Dogma of Molecular Biology

Francis Crick, OM, FRS (1916 – 2004), a British molecular biologist, biophysicist, and neuroscientist
first articulated in 1958 and restated in August 1970
“Central dogma of molecular biology.” Nature 227 (5258): 561–563.
Bibcode: 1970Natur.227..561C. doi:10.1038/227561a0. PMID: 4913914.
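
As a reminder of what the dogma asserts, sequence information flows one way, from DNA through RNA to protein, and never back from protein. A minimal, purely illustrative sketch of that flow (with a deliberately truncated codon table of my own choosing) might look like this:

```python
# Illustrative sketch of the one-way information flow DNA -> mRNA -> protein.
# The codon table is deliberately truncated to just the codons used below.
CODON_TABLE = {"AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP"}

def transcribe(coding_strand: str) -> str:
    """Transcription: the mRNA mirrors the coding strand with U in place of T."""
    return coding_strand.replace("T", "U")

def translate(mrna: str) -> list:
    """Translation: read the message three bases at a time until a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE[mrna[i:i + 3]]
        if residue == "STOP":
            break
        protein.append(residue)
    return protein

mrna = transcribe("ATGTTTGGCTAA")   # -> "AUGUUUGGCUAA"
print(translate(mrna))              # -> ['Met', 'Phe', 'Gly']
```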

Book Review: John Avery’s “Information Theory and Evolution”

Information Theory and Evolution (Book Cover)
Information Theory and Evolution by John Avery
Genre: Non-fiction, Popular Science
Publisher: World Scientific
Published: January 1, 2003
Format: Paperback, 217 pages

This highly interdisciplinary book discusses the phenomenon of life, including its origin and evolution (and also human cultural evolution), against the background of thermodynamics, statistical mechanics, and information theory. Among the central themes is the seeming contradiction between the second law of thermodynamics and the high degree of order and complexity produced by living systems. This paradox has its resolution in the information content of the Gibbs free energy that enters the biosphere from outside sources, as the author shows. The role of information in human cultural evolution is another focus of the book. One of the final chapters discusses the merging of information technology and biotechnology into a new discipline — bio-information technology.
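
The resolution the blurb alludes to can be stated compactly in standard textbook form (my own summary, not a quotation from Avery): local order can increase so long as the free energy flowing through the biosphere lets the system export at least as much entropy to its surroundings.

```latex
% Second law applied to the system together with its surroundings:
\Delta S_{\mathrm{univ}} \;=\; \Delta S_{\mathrm{sys}} + \Delta S_{\mathrm{surr}} \;\geq\; 0
% At constant T and p this is equivalent to a non-increasing Gibbs free energy,
\Delta G \;=\; \Delta H - T\,\Delta S_{\mathrm{sys}} \;\leq\; 0 ,
% so \Delta S_{\mathrm{sys}} may be negative (order increases locally) provided
% enough heat, -\Delta H \geq -T\,\Delta S_{\mathrm{sys}}, is exported outward.
```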

Information Theory and Evolution by John Avery
My rating: 3 of 5 stars

This is a fantastic book to which, for the majority of people, I’d give a five-star review. For my own purposes, however, I was expecting far more on the theoretical side of information theory and statistical mechanics as applied to microbiology, and the book didn’t live up to that, so I’m giving it three stars from a purely personal perspective.

I do wish that someone had placed it in my hands and forced me to read it when I was a freshman in college entering the study of biomedical and electrical engineering. It is a far more impressive book at that level, and for those in the general public who are interested in the general history of science and the philosophy of these topics. The general reader may be somewhat scared off by the small amount of mathematics in chapter 4, but there is really no loss of continuity in skimming through most of it. For those looking for a bit more rigor, Avery provides some additional details in appendix A, but for the specialist the presentation is heavily lacking.

The book opens with a facile but acceptable overview of the history of the development of the theory of evolution, whereas most other texts would simply begin with Darwin’s work and completely skip the important philosophical and scientific contributions of Aristotle, Averroes, Condorcet, Linnaeus, Erasmus Darwin, Lamarck, or the debates between Cuvier and St. Hilaire.

For me, the meat of the book was chapters 3-5 and appendix A, which collectively covered molecular biology, evolution, statistical mechanics, and a bit of information theory, albeit from a very big-picture point of view. Unfortunately, the rigor of the presentation and the underlying mathematics were skimmed over all too quickly to accomplish what I had hoped to gain from the text. On the other hand, the individual sections of “suggestions for further reading” throughout the book seem well researched and offer an acceptable launching pad for delving into topics in places where they may be covered more thoroughly.

The final several chapters become a bit more of an overview of the philosophy surrounding cultural evolution and information technology, topics which are much better covered and discussed in James Gleick’s recent book The Information.

Overall, Avery presents a well-laid-out outline of the broad array of subjects and covers it all fairly well in an easy-to-read and engaging style.

View all my reviews

Reading Progress
  • Started book on 07/11/11
  • Finished book on 08/14/11

On Telephones and Architecture

John J. Carty, first head of Bell Laboratories, 1908

 

John Battelle Review of James Gleick’s “The Information” and Why It’s a Good Thing

John Battelle recently posted a review of James Gleick’s latest book The Information: A History, A Theory, A Flood. It reminds me that I find it almost laughable when the vast majority of the technology press and the digerati bloviate about their beats when, at root, they know almost nothing about how technology truly works or the mathematical and theoretical underpinnings of what is happening — and, even worse, they don’t seem to really care.

I’ve seen hundreds of reviews and thousands of mentions of Steven Levy’s book In the Plex: How Google Thinks, Works, and Shapes Our Lives in the past few months — in fact, Battelle reviewed it just before Gleick’s book — but I’ve seen few, if any, reviews of Gleick’s, which I honestly think is a much more worthwhile read about what is going on in the world and has farther-reaching implications about where we are headed.

I’ll give a BIG tip of my hat to John for his efforts in reading Gleick and posting his commentary, and for continuing to push the boundary further as he invites Gleick to speak at Web 2.0 Summit in the fall. I hope his efforts will bring the topic to the much larger tech community. I further hope he and others might take the time to read Claude Shannon’s original paper [.pdf download], and if he’s further interested in the concept of thermodynamic entropy, I can recommend André Thess’s text The Entropy Principle: Thermodynamics for the Unsatisfied, which I’ve recently discovered and think does a good (and logically consistent) job of defining the concept at a level accessible to the general public.

Bookmarked Information Theory and Statistical Mechanics by E. T. Jaynes (Physical Review, 106, 620 – Published 15 May 1957)

Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available.

It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.

DOI: https://doi.org/10.1103/PhysRev.106.620
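
Jaynes’s point that the partition function is an immediate consequence of the maximum-entropy principle can be seen in a few lines; the sketch below is the standard derivation rather than a quotation from the paper.

```latex
% Maximize the entropy of the distribution \{p_i\},
H \;=\; -\sum_i p_i \ln p_i ,
% subject to normalization and a known mean energy,
\sum_i p_i = 1 , \qquad \sum_i p_i E_i = \langle E \rangle .
% Introducing Lagrange multipliers and setting the variation to zero yields
p_i \;=\; \frac{e^{-\beta E_i}}{Z(\beta)} , \qquad Z(\beta) \;=\; \sum_i e^{-\beta E_i} ,
% the canonical distribution, with the partition function Z appearing as
% nothing more than the normalization constant fixed by the constraints.
```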

On the Fallacy of Diminishing Returns

Nominated for quote of the week, which I encountered while reading Matt Ridley’s The Rational Optimist:

Thomas Jefferson, American Founding Father and the principal author of the Declaration of Independence (1776)
in a letter to Isaac McPherson

 

Entropy Is Universal Rule of Language | Wired Science

Read Entropy Is Universal Rule of Language by Lisa Grossman (Wired)
The amount of information carried in the arrangement of words is the same across all languages, even languages that aren't related to each other. This consistency could hint at a single common ancestral language, or universal features of how human brains process speech. "It doesn't matter what language or style you take," said systems biologist…
The research this article is based on is quite interesting for those doing language research.
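
To get a feel for what “information carried in the arrangement of words” means, here is a crude toy estimate of my own (not the estimator used in the study): compare the entropy of individual words with the entropy of a word given its predecessor; the gap is information contributed by word order.

```python
from collections import Counter
from math import log2

def unigram_entropy(words):
    """H(X): entropy of the word distribution, ignoring order (bits per word)."""
    counts = Counter(words)
    n = len(words)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def conditional_entropy(words):
    """H(X_t | X_{t-1}): entropy of a word given the preceding word (bits per word)."""
    pair_counts = Counter(zip(words, words[1:]))
    prev_counts = Counter(words[:-1])
    n = len(words) - 1
    return -sum((c / n) * log2(c / prev_counts[w1])
                for (w1, _w2), c in pair_counts.items())

text = "the cat sat on the mat and the dog sat on the rug".split()
gap = unigram_entropy(text) - conditional_entropy(text)
print(f"word-order information: {gap:.2f} bits per word")  # crude, biased on tiny samples
```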

Bob Frankston on Communications

Watched Triangulation 4: Bob Frankston by Leo Laporte and Tom Merritt from TWiT Network
Computer pioneer who helped create the first spreadsheet, Bob Frankston, is this week's guest.
On a recent episode of Leo Laporte and Tom Merritt’s show Triangulation, they interviewed Bob Frankston of VisiCalc fame and had a great discussion of the current state of broadband in the U.S. and how it might be much better. They get just a bit technical in places, but it’s a fantastic and very accessible discussion of communications that every American should be aware of.