Christoph Adami: Finding Life We Can’t Imagine | TEDx

Watched Finding life we can't imagine by Christoph Adami from ted.com
How do we search for alien life if it's nothing like the life that we know? Christoph Adami shows how he uses his research into artificial life -- self-replicating computer programs -- to find a signature, a "biomarker," that is free of our preconceptions of what life is.
Adami’s work is along similar lines to some of my own research. This short video gives an intriguing look into some of the basics of how to define life so that one can recognize it when one sees it.

A Cosmologically Centered Definition of Hydrogen

An anonymous wit defining hydrogen in light of the Big Bang Theory
As relayed by David Christian in his book Maps of Time: An Introduction to Big History

 

Book cover of Maps of Time: An Introduction to Big History

Meaning according to Humpty Dumpty

Humpty Dumpty (in a rather scornful tone): When I use a word, it means just what I choose it to mean – neither more nor less.
Alice: The question is, whether you can make a word mean so many different things?
Humpty Dumpty: The question is, which is to be master – that’s all.
Alice: (Too much puzzled to say anything, so after a minute Humpty Dumpty began again)
Humpty Dumpty: They’ve a temper, some of them – particularly verbs, they’re the proudest – adjectives you can do anything with, but not verbs – however, I can manage the whole lot of them! Impenetrability! That’s what I say!
Alice: Would you tell me, please, what that means?
Humpty Dumpty (looking very much pleased): Now you talk like a reasonable child. I meant by impenetrability that we have had enough of that subject, and it would be just as well if you’d mention what you mean to do next, as I suppose you don’t mean to stop here all the rest of your life.
Alice (in a thoughtful tone): That’s a great deal to make one word mean.
Humpty Dumpty: When I make a word do a lot of work like that, I always pay it extra.
Alice (too much puzzled to make any other remark): Oh!

The Central Dogma of Molecular Biology

Francis Crick, OM, FRS (1916 – 2004), a British molecular biologist, biophysicist, and neuroscientist
first articulated in 1958 and restated in August 1970
“Central dogma of molecular biology.” Nature 227 (5258): 561–563.
Bibcode: 1970Natur.227..561C. doi:10.1038/227561a0. PMID: 4913914.

Book Review: John Avery’s “Information Theory and Evolution”

Information Theory and Evolution
John Avery
Non-fiction, Popular Science
World Scientific
January 1, 2003
Paperback, 217 pages

This highly interdisciplinary book discusses the phenomenon of life, including its origin and evolution (and also human cultural evolution), against the background of thermodynamics, statistical mechanics, and information theory. Among the central themes is the seeming contradiction between the second law of thermodynamics and the high degree of order and complexity produced by living systems. This paradox has its resolution in the information content of the Gibbs free energy that enters the biosphere from outside sources, as the author shows. The role of information in human cultural evolution is another focus of the book. One of the final chapters discusses the merging of information technology and biotechnology into a new discipline — bio-information technology.
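
As a rough back-of-the-envelope version of the resolution the blurb alludes to (the steady energy throughput \(\dot{Q}\) and the temperatures below are my own illustrative assumptions, not figures from Avery’s text): the biosphere is an open system, so its entropy balance is

\[
\frac{dS_{\mathrm{bio}}}{dt} \;=\; \frac{\dot{Q}}{T_{\mathrm{sun}}} \;-\; \frac{\dot{Q}}{T_{\mathrm{earth}}} \;+\; \sigma, \qquad \sigma \ge 0,
\]

where the first term is the entropy carried in by sunlight at roughly \(T_{\mathrm{sun}} \approx 5800\ \mathrm{K}\), the second is the entropy radiated away as heat at roughly \(T_{\mathrm{earth}} \approx 290\ \mathrm{K}\), and \(\sigma\) is the entropy produced internally. Because the outgoing term is about twenty times the incoming one, \(dS_{\mathrm{bio}}/dt\) can be negative (order and complexity can accumulate locally) without any violation of the second law for the biosphere plus its surroundings.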

Information Theory and Evolution by John Avery
My rating: 3 of 5 stars

This is a fantastic book to which, for the majority of people, I’d give a five-star review. For my own purposes, however, I was expecting far more on the theoretical side of information theory and statistical mechanics as applied to microbiology, and it didn’t live up to that, so I’m giving it three stars from a purely personal perspective.

I do wish that someone had placed it in my hands and forced me to read it when I was a freshman in college entering the study of biomedical and electrical engineering. It is a far more impressive book at that level, and for those in the general public who are interested in the general history of science and the philosophy of these topics. The general reader may be somewhat scared off by the small amount of mathematics in chapter 4, but there is really no loss of continuity in skimming through most of it. For those looking for a bit more rigor, Avery provides some additional details in appendix A, but for the specialist the presentation is sorely lacking.

The book opens with a facile but acceptable overview of the history of the development of the theory of evolution, whereas most other texts would simply begin with Darwin’s work and completely skip the important philosophical and scientific contributions of Aristotle, Averroes, Condorcet, Linnaeus, Erasmus Darwin, and Lamarck, or the debates between Cuvier and St. Hilaire.

For me, the meat of the book was chapters 3–5 and appendix A, which collectively covered molecular biology, evolution, statistical mechanics, and a bit of information theory, albeit from a very big-picture point of view. Unfortunately, the rigor of the presentation and the underlying mathematics were skimmed over all too quickly to accomplish what I had hoped to gain from the text. On the other hand, the individual sections of “suggestions for further reading” throughout the book seem well researched and offer an acceptable launching pad for delving into topics where they are covered more thoroughly elsewhere.

The final several chapters become more of an overview of the philosophy surrounding cultural evolution and information technology, topics which are much better covered and discussed in James Gleick’s recent book The Information.

Overall, Avery lays out a well-organized outline of a broad array of subjects and covers them all fairly well in an easy-to-read and engaging style.

View all my reviews

Reading Progress
  • Started book on 07/11/11
  • Finished book on 08/14/11

On Telephones and Architecture

John J. Carty (1861 – 1932), first head of Bell Laboratories, 1908

 

John Battelle Review of James Gleick’s “The Information” and Why It’s a Good Thing

John Battelle recently posted a review of James Gleick’s latest book The Information: A History, A Theory, A Flood. It reminds me that I find it almost laughable when the vast majority of the technology press and the digerati bloviate about their beats when, at root, they know almost nothing about how technology truly works or about the mathematical and theoretical underpinnings of what is happening, and, even worse, they don’t seem to really care.

I’ve seen hundreds of reviews and thousands of mentions of Steven Levy’s book In the Plex: How Google Thinks, Works, and Shapes Our Lives in the past few months (in fact, Battelle reviewed it just before Gleick’s book), but I’ve seen few, if any, reviews of Gleick’s book, which I honestly think is a much more worthwhile read about what is going on in the world and has farther-reaching implications about where we are headed.

I’ll give a BIG tip of my hat to John for his efforts in reading Gleick and posting his commentary, and for continuing to push the boundary further by inviting Gleick to speak at the Web 2.0 Summit in the fall. I hope his efforts will bring the topic to the much larger tech community. I further hope he and others might take the time to read Claude Shannon’s original paper [.pdf download], and if he’s further interested in the concept of thermodynamic entropy, I can recommend Andre Thess’s text The Entropy Principle: Thermodynamics for the Unsatisfied, which I’ve recently discovered and think does a good (and logically consistent) job of defining the concept at a level accessible to the general public.

Book Review: Matt Ridley’s “The Rational Optimist: How Prosperity Evolves”

Matt Ridley’s The Rational Optimist: How Prosperity Evolves is going to be my new bible. This is certainly bound to be one of the most influential books I’ve read since Jared Diamond’s Guns, Germs, and Steel — what a spectacular thesis!

I am now going to recommend it to everyone I meet and have already begun proselytizing its thesis. It is certainly worth a second, third, and successive rereads given the broad array of topics it covers in such a cohesive way. Simply and truly SPECTACULAR!

Dare to be an optimist…

For those interested in a short, tangential video related to the broader thesis, take a look at Matt Ridley’s related TEDx talk:

Reading Progress
  • 06/05/11 marked as: currently reading
  • 06/06/11 10:37 pm – page 98 (22.0%): “I love the thought of ideas having sex! Evolution in a whole different framework…”
  • Finished book on 07/05/11

Bookmarked Information Theory and Statistical Mechanics by E. T. Jaynes (Physical Review 106, 620 – Published 15 May 1957)

Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available.

It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.

DOI: https://doi.org/10.1103/PhysRev.106.620
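
To make the recipe in the abstract concrete, here is a minimal sketch in Python of the maximum-entropy calculation it describes (the three energy levels and the target mean energy are made-up numbers for illustration, not anything from Jaynes’s paper). Maximizing the Shannon entropy subject to a fixed mean energy gives the Boltzmann form p_i ∝ exp(−βE_i), so the only remaining work is solving for β and reading off the partition function Z:

```python
import math

def boltzmann(energies, beta):
    """Maximum-entropy distribution for a fixed mean energy: p_i ∝ exp(-beta * E_i)."""
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)  # the partition function
    return [w / z for w in weights]

def mean_energy(energies, beta):
    return sum(p * e for p, e in zip(boltzmann(energies, beta), energies))

def solve_beta(energies, target_mean, lo=-50.0, hi=50.0):
    """Bisect for the beta whose Boltzmann distribution hits the target mean energy.

    The mean energy is monotonically decreasing in beta, so plain bisection works.
    """
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_energy(energies, mid) > target_mean:
            lo = mid  # mean is still too high, so beta must grow
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical three-level system constrained to an average energy of 0.7.
energies = [0.0, 1.0, 2.0]
beta = solve_beta(energies, target_mean=0.7)
p = boltzmann(energies, beta)
entropy = -sum(pi * math.log(pi) for pi in p)  # in nats
print(f"beta = {beta:.4f}, p = {[round(pi, 4) for pi in p]}, S = {entropy:.4f}")
```

This is the “subjective statistical mechanics” of the abstract in miniature: the physics enters only through the enumeration of states and their energies, while the partition function and the Boltzmann weights fall out of the inference step.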

On the Fallacy of Diminishing Returns

Nominated for quote of the week, which I encountered while reading Matt Ridley’s The Rational Optimist:

Thomas Jefferson (1743 – 1826), American Founding Father and the principal author of the Declaration of Independence (1776)
in a letter to Isaac McPherson

 

Darwin Library, Now Online, Reveals Mind of 19th-Century Naturalist | The Chronicle

Bookmarked Darwin Library, Now Online, Reveals Mind of 19th-Century Naturalist by Jie Jenny Zou (The Chronicle of Higher Education)

A portion of Charles Darwin’s vast scientific library—including handwritten notes that the 19th-century English naturalist scribbled in the margins of his books—has been digitized and is available online. Readers can now get a firsthand look into the mind of the man behind the theory of evolution.

The project to digitize Darwin’s extensive library, which includes 1,480 scientific books, was a joint effort with the University of Cambridge, the Darwin Manuscripts Project at the American Museum of Natural History, the Natural History Museum in Britain, and the Biodiversity Heritage Library.

The digital library, which includes 330 of the most heavily annotated books in the collection, is fully indexed—allowing readers to search through transcriptions of the naturalist’s handwritten notes that were compiled by the Darwin scholars Mario A. Di Gregorio and Nick Gill in 1990.

Charles Darwin’s Library from the Biodiversity Heritage Library

📅 18th International C. elegans Meeting, 22nd-26th June 2011

RSVPed Attending 18th International C. elegans Meeting
The Organizing Committee invites you to attend the 18th International C. elegans Meeting, sponsored by the Genetics Society of America. The meeting will be held June 22 – 26, 2011 at the University of California, Los Angeles campus. The meeting will begin on Wednesday evening, June 22 at 7:00 pm and will end on Sunday, June 26 at 12:00 noon. On Friday, June 24 at 5:00 pm there will be a Keynote Address by Joseph Culotti, Samuel Lunenfeld Research Institute, Toronto, Canada.

Entropy Is Universal Rule of Language | Wired Science

Read Entropy Is Universal Rule of Language by Lisa Grossman (Wired)
The amount of information carried in the arrangement of words is the same across all languages, even languages that aren't related to each other. This consistency could hint at a single common ancestral language, or universal features of how human brains process speech. "It doesn't matter what language or style you take," said systems biologist…
The research this article is based on is quite interesting for those doing language research.

The Science of Why We Don’t Believe Science | Mother Jones

Read The Science of Why We Don't Believe Science by Chris Mooney (Mother Jones)
How our brains fool us on climate, creationism, and the vaccine-autism link.
This is a fantastic article that everyone should read and take some serious time to absorb!

Bob Frankston on Communications

Watched Triangulation 4: Bob Frankston by Leo Laporte and Tom Merritt from TWiT Network
Computer pioneer who helped create the first spreadsheet, Bob Frankston, is this week's guest.
On a recent episode of Leo Laporte and Tom Merritt’s show Triangulation, they interviewed Bob Frankston of VisiCalc fame. They gave a great discussion of the current state of broadband in the U.S. and how it might be made much better. They get just a bit technical in places, but it’s a fantastic and very accessible discussion of communications that every American should be aware of.