Book Review: Gregory Chaitin’s “Proving Darwin: Making Biology Mathematical”

Gregory Chaitin’s book Proving Darwin: Making Biology Mathematical, combining biology, microbiology, mathematics, evolution, and even information theory, is directly in my wheelhouse. I had delayed reading it after a few initial poor reviews, and sadly I must confirm that I’m ultimately disappointed in the effort, though there is some very significant value buried within. Unfortunately that value is buried so deeply that very few, if any, readers will make the concerted effort to find it.

Book Cover: Proving Darwin

This effort does seem more high-minded and noble than what I would call the “Brian Greene method,” in which an academic seemingly gives up on serious science to publish multiple books on a popular topic and cash in on public interest through book sales. In this respect Chaitin is closer to Neil deGrasse Tyson in his effort to expound an interesting theory to the broader public and improve the public discourse, though I would admit he’s probably a bit more (self-)interested in pushing his own theory and selling books (or, giving him the benefit of the doubt, perhaps his publisher pushed him to it).

Though the later chapters make a reasonable stab at providing some philosophical background to fit the topic into the broader fabric of science and theory, most of it is rather poorly motivated and is covered far better in other non-technical works. While it is nice to have some semblance of Chaitin’s philosophy and feelings, this material only softens the blow of his theoretical work and makes the text read more like pseudo-science or bare philosophy without any rigorous underpinning.

I assume his purpose in writing the book is to make the theories from his primary paper on the topic more accessible to the broader scientific community as well as the public. It’s easy for a groundbreaking piece of work to be hidden in the broader scientific literature, and Chaitin seems to be using his pedestal as a reasonably popular science writer to increase the visibility of his work here. He admits that the effort stems from a hobby, as his primary area is algorithmic information theory and computer science rather than biology or evolution, though his meager references in the text do at least indicate some facility with the “right” sources in those areas.

Speaking from a broad public perspective, there is certainly enough interest in the general topic to warrant such a book, though judging by reviews on Amazon, Goodreads, etc., it has sadly missed its mark. He unfortunately sticks too closely to the publishing rule that the inclusion of mathematical equations is detrimental to a book’s sales. Sadly, his broader point is seemingly lost on the public, as his ability to analogize his work isn’t as strong as Brian Greene’s is for theoretical physics (string theory).

From the higher perspective of a researcher who works in the relevant areas, I was even more underwhelmed with the text, aside from the single URL pointing to the original, much more technical paper which Chaitin wrote in 2010. To me this was the most valuable part of the entire book, though he does provide a small amount of reasonable detail in the appendix.

I can certainly appreciate Chaitin’s enthusiastic following of John von Neumann, but I’m disappointed in his lack of acknowledgement of Norbert Wiener or Claude Shannon, both of whom collaborated with von Neumann in the middle of the 20th century. I’m sure Chaitin is more than well aware of the father of information theory, but I’m willing to bet that although he has probably read Shannon’s famous master’s thesis and his highly influential Bell Labs article “A/The Mathematical Theory of Communication,” he is, like most, shamefully and wholly unaware of Shannon’s MIT doctoral thesis.

Given Chaitin’s personal aim to further the acceptance of his theories and the publisher’s goal of selling more copies, I would offer a few recommendations for potential future editions:

  • The greater majority of his audience will have at least a passable understanding of biology and evolution, but very little, if any, understanding of algorithmic information theory. He would be better off expounding upon that subject to bring people up to speed to understand his viewpoint and his subsequent proof.
  • Though I understand the need to be relatively light on equations and technicalities, Chaitin could follow some of his heroes of mathematical exposition and do a slightly better job of explaining what is going on.
  • He could add significant material to the appendices to help the higher-end general readers, and specifically the biologists, understand more of the technicalities of algorithmic information theory and better follow his proof, which should also appear there in its intricate glory.
  • I might also recommend excising some of the more philosophical material, which tends to undermine his scientific “weight.”
  • Though I found it interesting that he gives a mathematical definition of “intelligent design,” I have a feeling its intricacies were lost on most of his readership. Made clearer, this point alone could go a long way toward solidifying the position of evolution amongst non-scientists, particularly in America, and win the support of heavyweights like Dawkins himself.

I’ll agree wholeheartedly with one reviewer who said that Chaitin tends to “state small ideas repeatedly, and every time at the same shallow level with astonishing amount of redundancy (mostly consisting of chit-chat and self congratulations)”. This certainly detracted from my enjoyment of the work. Chaitin also does an awful lot of name-dropping of significant scientific figures tangential to the subject at hand. This might have been more impressive had he included the results of his discussions with them about the subject, but I’m left with the impression that he simply said hello, shook their hands, and at best was inspired by having met them. It’s nice that he’s had these experiences, but they don’t help me believe or follow his own work.

For the technically interested reader, save yourself some time and simply skim through chapter five and a portion of the appendix relating to his proof and then move on to his actual paper. For the non-technical reader, I expect you’ll get more out of reading Richard Dawkins’ early work (The Selfish Gene) or possibly Werner R. Loewenstein’s The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life.

Though I certainly agree that we could use a mathematical proof of evolution, and that Chaitin has made a reasonable theoretical stab at one, this book sadly wasn’t the best vehicle to motivate broader interest in such an effort. I’ll give him five stars for effort and three for general content, but in the end, for most readers, it will be at most a two-star work overall.

This review was originally published on June 17, 2013.


Regard the World as Made of Information

John Archibald Wheeler (1911-2008), American theoretical physicist
[attributed by Jacob Bekenstein in “Information in the Holographic Universe” (Scientific American, 2003)]

 


Brief Book Review: James Gleick’s “The Information: a History, a Theory, a Flood”

Overall, James Gleick’s book The Information: a History, a Theory, a Flood is an excellent read. Given that it covers an area in which I’m intimately interested, I’m not too surprised that most of it is “review,” but I’d highly recommend it to the general public as a way into the excellent history, philosophy, and theory which Gleick so nicely summarizes throughout the book.

Book Cover: The Information

There are one or two references in the back which I’ll have to chase down and read, and one or two others which, after many years, seem worth a second visit now that I’ve completed this.

Even for the specialist, Gleick manages to tie together some disparate thoughts into an excellent whole, which makes it a very worthwhile read. In the last several chapters his style becomes much more flowery and less concrete, but most of this is a result of covering the “humanities” perspective of information, as opposed to the earlier parts of the text, which stick more closely to the history and the scientific theories he covers.

Review originally posted at GoodReads.com.


The next major thrust in biology

Werner R. Loewenstein, biologist and physiologist
in The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life, Oxford University Press, 1999

 

The Touchstone of Life (Book Cover)


You and I Are Not Much Different from Cans of Soup

Philip Nelson, American physicist
in Biological Physics: Energy, Information, Life

 

Biological Physics: Energy, Information, Life written by Philip Nelson

Book Review: Werner Loewenstein’s “The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life”

The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life by Werner R. Loewenstein

Though there is a paucity of equations, particularly on the information-theoretic side, Loewenstein does a fantastic job of discussing the theory and philosophy of what is going on in the overlapping fields of information theory and microbiology. (I will note that it is commonly held wisdom within publishing, particularly for books aimed at the broader public, that the number of equations in a text is inversely proportional to its sales, and I’m sure this is the reason for the lack of mathematical substantiation, which he could easily have supplied.)

The Touchstone of Life (Book Cover)

This is a much more specific and therefore, to my mind, much better book than John Avery’s Information Theory and Evolution, which covers some similar ground. Loewenstein has a far better and more specific grasp of the material, in my opinion. Those who feel overwhelmed by Loewenstein may prefer to take a step back to Avery’s more facile presentation.

Loewenstein has a deft ability to describe what is going on, giving both an up-close view with many examples and a spectacular view of the broader picture – something often missing in general science books of this sort. Readers with no mathematical or microbiology background can benefit from it as much as those with more experience.

One thing which sets it apart from much of its competition, even in the broader general-science area of non-fiction, is the author’s quirky but adept ability to add some flowery language and analogy to clarify his points. Though some will find this off-putting, it really does add flavor to what might otherwise be dry and dull explication. His range of background knowledge, philosophy, and vocabulary is second only to that of Simon Winchester (and possibly on par with, or exceeding, it in some cases).

I’d highly recommend this book to people prior to their academic studies of biochemistry or molecular cell biology, or to budding biomedical engineers prior to their junior year of study. I truly wish I had read it in 1994 myself, but alas it didn’t exist until a few years later. I lament that I hadn’t picked it up and read it thoroughly until now.

For my part, his drastically differing viewpoint on the way in which biology should be viewed moving forward is spot on. I am firmly a member of this new “school.” His final chapter on the concept is truly illuminating from a philosophical and theoretical view, and I encourage people to read it first instead of last.

I’ll also note briefly that I’ve seen some reviews of this book which mention creationism or intelligent design and whether proponents of those philosophies feel that Loewenstein’s work supports them, particularly since Loewenstein once appeared on a panel with Dembski. I will state, for those who take a purely scientific viewpoint of things, that this book is written in full support of evolution and microbiology and doesn’t use the concept of “information” to muddy the waters the way many ID arguments typically do.

Original review posted to GoodReads.com on 9/4/12


Rod, Can You Tell Our Contestant What She’s Won?

Possibly one of the oddest closing sentences I’ve ever read in a technical book – and a very good one at that:

This pressure can be calculated by minimizing the Helmholtz function of the system. Details can be found in Fermi’s textbook on thermodynamics (Fermi 1956). But why does osmosis explain the behavior of a salted cucumber? This question is left to the reader as a parting gift.

André Thess in The Entropy Principle: Thermodynamics for the Unsatisfied (Springer, 2011)
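
For the reader who wants the parting gift unwrapped: in the dilute limit, minimizing the Helmholtz free energy leads to van ’t Hoff’s relation for the osmotic pressure – a textbook result, not anything unique to Thess’s treatment:

    Π = cRT

where c is the molar concentration of dissolved solute, R the gas constant, and T the absolute temperature. Salting a cucumber puts the higher solute concentration outside its cell membranes, so water flows out of the cucumber to equalize chemical potentials, and the cucumber wilts.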

 

salted cucumber
 

Christoph Adami: Finding Life We Can’t Imagine | TEDx


– via TED.com
 
Adami’s work is along similar lines to some of my own research. This short video gives an intriguing look into some of the basics of how to define life so that one can recognize it when one sees it.


Meaning according to Humpty Dumpty

Humpty Dumpty (in a rather scornful tone): When I use a word, it means just what I choose it to mean – neither more nor less.
Alice: The question is, whether you can make a word mean so many different things?
Humpty Dumpty: The question is, which is to be master – that’s all.
Alice: (Too much puzzled to say anything, so after a minute Humpty Dumpty began again)
Humpty Dumpty: They’ve a temper, some of them – particularly verbs, they’re the proudest – adjectives you can do anything with, but not verbs – however, I can manage the whole of them! Impenetrability! That’s what I say!
Alice: Would you tell me, please, what that means?
Humpty Dumpty (looking very much pleased): Now you talk like a reasonable child. I meant by impenetrability that we have had enough of that subject, and it would be just as well if you’d mention what you mean to do next, as I suppose you don’t mean to stop here all the rest of your life.
Alice (in a thoughtful tone): That’s a great deal to make one word mean.
Humpty Dumpty: When I make a word do a lot of work like that, I always pay it extra.
Alice (too much puzzled to make any other remark): Oh!

The Central Dogma of Molecular Biology

Francis Crick, OM, FRS (1916–2004), British molecular biologist, biophysicist, and neuroscientist
first articulated in 1958 and restated in August 1970 in
“Central dogma of molecular biology.” Nature 227 (5258): 561–563.
Bibcode 1970Natur.227..561C, doi:10.1038/227561a0, PMID 4913914

 

Central Dogma of Molecular Biology

 


Book Review: John Avery’s “Information Theory and Evolution”

Book Cover: Information Theory and Evolution
Information Theory and Evolution
by John Avery
Non-fiction, Popular Science
World Scientific, January 1, 2003
Paperback, 217 pages

This highly interdisciplinary book discusses the phenomenon of life, including its origin and evolution (and also human cultural evolution), against the background of thermodynamics, statistical mechanics, and information theory. Among the central themes is the seeming contradiction between the second law of thermodynamics and the high degree of order and complexity produced by living systems. This paradox has its resolution in the information content of the Gibbs free energy that enters the biosphere from outside sources, as the author shows. The role of information in human cultural evolution is another focus of the book. One of the final chapters discusses the merging of information technology and biotechnology into a new discipline — bio-information technology.

Information Theory and Evolution by John Avery
My rating: 3 of 5 stars

This is a fantastic book which, for the majority of people, I’d give a five-star review. For my own purposes, however, I was expecting far more on the theoretical side of information theory and statistical mechanics as applied to microbiology than it delivered, so I’m giving it three stars from a purely personal perspective.

I do wish that someone had placed it in my hands and forced me to read it when I was a freshman in college entering the study of biomedical and electrical engineering. It is a far more impressive book at that level, and for those in the general public who are interested in the general history of science and the philosophy of these topics. The general reader may be somewhat scared off by the small amount of mathematics in chapter 4, but there is really no loss of continuity in skimming through most of it. For those looking for a bit more rigor, Avery provides some additional details in appendix A, but for the specialist the presentation is heavily lacking.

The book opens with a facile but acceptable overview of the history of the development of the theory of evolution, whereas most other texts would simply begin with Darwin’s work and completely skip the important philosophical and scientific contributions of Aristotle, Averroes, Condorcet, Linnaeus, Erasmus Darwin, and Lamarck, or the debates between Cuvier and St. Hilaire.

For me, the meat of the book was chapters 3–5 and appendix A, which collectively covered molecular biology, evolution, statistical mechanics, and a bit of information theory, albeit from a very big-picture point of view. Unfortunately the rigor of the presentation and the underlying mathematics were skimmed over all too quickly for me to gain what I had hoped from the text. On the other hand, the individual sections of “suggestions for further reading” throughout the book seem well researched and offer an acceptable launching pad for delving into topics where they are covered more thoroughly.

The final several chapters become more of an overview of the philosophy surrounding cultural evolution and information technology, subjects which are much better covered and discussed in James Gleick’s recent book The Information.

Overall, Avery lays out a well-organized outline of a broad array of subjects and covers it all fairly well in an easy-to-read and engaging style.


Reading Progress
  • Started book on 07/11/11
  • Finished book on 08/14/11

On Telephones and Architecture

John J. Carty, first head of Bell Laboratories, 1908

 


John Battelle Review of James Gleick’s “The Information” and Why It’s a Good Thing

John Battelle recently posted a review of James Gleick’s latest book The Information: A History, A Theory, A Flood. It reminds me that I find it almost laughable when the vast majority of the technology press and the digerati bloviate about their beats when, at root, they know almost nothing about how technology truly works or the mathematical and theoretical underpinnings of what is happening, and, even worse, that they don’t seem to really care.

I’ve seen hundreds of reviews and thousands of mentions of Steven Levy’s book In the Plex: How Google Thinks, Works, and Shapes Our Lives in the past few months (in fact, Battelle reviewed it just before Gleick’s book), but I’ve seen few, if any, reviews of Gleick’s, which I honestly think is a much more worthwhile read about what is going on in the world and has farther-reaching implications about where we are headed.

I’ll give a BIG tip of my hat to John for his efforts in reading Gleick and posting his commentary, and for continuing to push the boundary further by inviting Gleick to speak at the Web 2.0 Summit in the fall. I hope his efforts will bring the topic to the much larger tech community. I further hope he and others might take the time to read Claude Shannon’s original paper [.pdf download], and for those further interested in the concept of thermodynamic entropy, I can recommend André Thess’s text The Entropy Principle: Thermodynamics for the Unsatisfied, which I’ve recently discovered and think does a good and logically consistent job of defining the concept at a level accessible to the general public.
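
For anyone opening Shannon’s paper for the first time, the central quantity is compact enough to sketch here. A minimal Python illustration of the entropy of a symbol distribution, in bits per symbol (my own toy example, not drawn from any of the texts mentioned above):

    import math
    from collections import Counter

    def shannon_entropy(symbols):
        # H = -sum over x of p(x) * log2(p(x)), in bits per symbol.
        counts = Counter(symbols)
        total = sum(counts.values())
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    # Entropy of the character distribution of a short English sentence.
    print(shannon_entropy("the quick brown fox jumps over the lazy dog"))

Character-level English comes in well under the log2(27) ≈ 4.75 bits a uniform alphabet-plus-space would allow, and that gap – redundancy – is exactly what Shannon’s theory quantifies.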

On the Fallacy of Diminishing Returns

Nominated for quote of the week, which I encountered while reading Matt Ridley’s The Rational Optimist:

Thomas Jefferson, American Founding Father and the principal author of the Declaration of Independence (1776)
in a letter to Isaac McPherson

 


Entropy Is Universal Rule of Language | Wired Science

Reposted Entropy Is Universal Rule of Language (Wired)
The amount of information carried in the arrangement of words is the same across all languages, even languages that aren't related to each other. This consistency could hint at a single common ancestral language, or universal features of how human brains process speech. "It doesn't matter what language or style you take," said systems biologist…

The research this article is based on is quite interesting for those doing language research.

The amount of information carried in the arrangement of words is the same across all languages, even languages that aren’t related to each other. This consistency could hint at a single common ancestral language, or universal features of how human brains process speech.

“It doesn’t matter what language or style you take,” said systems biologist Marcelo Montemurro of England’s University of Manchester, lead author of a study May 13 in PLoS ONE. “In languages as diverse as Chinese, English and Sumerian, a measure of the linguistic order, in the way words are arranged, is something that seems to be a universal of languages.”

Language carries meaning both in the words we choose, and the order we put them in. Some languages, like Finnish, carry most of their meaning in tags on the words themselves, and are fairly free-form in how words are arranged. Others, like English, are more strict: “John loves Mary” means something different from “Mary loves John.”

Montemurro realized that he could quantify the amount of information encoded in word order by computing a text’s “entropy,” or a measure of how evenly distributed the words are. Drawing on methods from information theory, Montemurro and co-author Damián Zanette of the National Atomic Energy Commission in Argentina calculated the entropy of thousands of texts in eight different languages: English, French, German, Finnish, Tagalog, Sumerian, Old Egyptian and Chinese.

Then the researchers randomly rearranged all the words in the texts, which ranged from the complete works of Shakespeare to The Origin of Species to prayers written on Sumerian tablets.

“If we destroy the original text by scrambling all the words, we are preserving the vocabulary,” Montemurro said. “What we are destroying is the linguistic order, the patterns that we use to encode information.”

The researchers found that the original texts spanned a variety of entropy values in different languages, reflecting differences in grammar and structure.

But strangely, the difference in entropy between the original, ordered text and the randomly scrambled text was constant across languages. This difference is a way to measure the amount of information encoded in word order, Montemurro says. The amount of information lost when they scrambled the text was about 3.5 bits per word.

“We found, very interestingly, that for all languages we got almost exactly the same value,” he said. “For some reason these languages evolved to be constrained in this framework, in these patterns of word ordering.”

This consistency could reflect some cognitive constraints that all human brains run up against, or give insight into the evolution of language, Montemurro suggests.

Cognitive scientists are still debating whether languages have universal features. Some pioneering linguists suggested that languages should evolve according to a limited set of rules, which would produce similar features of grammar and structure. But a study published last month that looked at the structure and syntax of thousands of languages found no such rules.

It may be that universal properties of language show up only at a higher level of organization, suggests linguist Kenny Smith of the University of Edinburgh.

“Maybe these broad-brushed features get down to what’s really essential” about language, he said. “Having words, and having rules for how the words are ordered, maybe those are the things that help you do the really basic functions of language. And the places where linguists traditionally look to see universals are not where the fundamentals of language are.”

Image: James Morrison/Flickr.

Citation:”Universal Entropy of Word Ordering Across Linguistic Families.” Marcelo A. Montemurro and Damián H. Zanette. PLoS ONE, Vol. 6, Issue 5, May 13, 2011. DOI: 10.1371/journal.pone.0019875.

via Wired.com
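
To make the scramble-and-compare procedure described above concrete, here is a rough Python sketch of the idea. It uses compressed size as a crude stand-in for the block-entropy estimator Montemurro and Zanette actually used, so it won’t reproduce the paper’s 3.5 bits per word, but the logic is the same: estimate the entropy rate, destroy the word order, and measure what was lost. The input filename is, of course, hypothetical.

    import random
    import zlib

    def entropy_rate_estimate(words):
        # Crude entropy-rate proxy: compressed size in bits per word.
        data = " ".join(words).encode("utf-8")
        return 8 * len(zlib.compress(data, 9)) / len(words)

    def word_order_information(text, seed=0):
        # Scramble-and-compare: positive values mean word order
        # was supplying predictable structure to the text.
        words = text.split()
        shuffled = words[:]
        random.Random(seed).shuffle(shuffled)
        return entropy_rate_estimate(shuffled) - entropy_rate_estimate(words)

    text = open("sample_text.txt").read()  # hypothetical input file
    print(f"~{word_order_information(text):.2f} bits/word encoded in word order")

The scrambled version compresses worse because ordered text is predictable; that gap, measured in bits per word, is the quantity the paper finds to be nearly constant across languages.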

 
