👓 The First Species to Have Every Individual’s Genome Sequenced | The Atlantic

The First Species to Have Every Individual’s Genome Sequenced by Ed Yong (The Atlantic)
It’s an endearing, giant, flightless, New Zealand parrot, and it’s a poster child for the quantified-self movement.

Kakapo


👓 White nationalists are flocking to genetic ancestry tests — but many don’t like their results | Stat News

White nationalists are flocking to genetic ancestry tests. Some don’t like what they find by Eric Boodman (Stat News)
It was a strange moment of triumph against racism: The gun-slinging white supremacist Craig Cobb, dressed up for daytime TV in a dark suit and red tie, hearing that his DNA testing revealed his ancestry to be only “86 percent European, and … 14 percent Sub-Saharan African.” The studio audience whooped and laughed and cheered. And Cobb — who was, in 2013, charged with terrorizing people while trying to create an all-white enclave in North Dakota — reacted like a sore loser in the schoolyard. “Wait a minute, wait a minute, hold on, just wait a minute,” he said, trying to put on an all-knowing smile. “This is called statistical noise.”

👓 EXCLUSIVE: First human embryos edited in U.S., using CRISPR | MIT Technology Review

EXCLUSIVE: First human embryos edited in U.S., using CRISPR by Steve Connor (MIT Technology Review)
Researchers have demonstrated they can efficiently improve the DNA of human embryos.

IPAM Workshop on Regulatory and Epigenetic Stochasticity in Development and Disease, March 1-3

IPAM Workshop on Regulatory and Epigenetic Stochasticity in Development and Disease (Institute for Pure and Applied Mathematics, UCLA | March 1-3, 2017)
Epigenetics refers to information transmitted during cell division other than the DNA sequence per se, and it is the language that distinguishes stem cells from somatic cells, one organ from another, and even identical twins from each other. In contrast to the DNA sequence, the epigenome is relatively susceptible to modification by the environment as well as stochastic perturbations over time, adding to phenotypic diversity in the population. Despite its strong ties to the environment, epigenetics has never been well reconciled to evolutionary thinking, and in fact there is now strong evidence against the transmission of so-called “epi-alleles,” i.e. epigenetic modifications that pass through the germline.

However, genetic variants that regulate stochastic fluctuation of gene expression and phenotypes in the offspring appear to be transmitted as an epigenetic or even Lamarckian trait. Furthermore, even the normal process of cellular differentiation from a single cell to a complex organism is not understood well from a mathematical point of view. There is increasingly strong evidence that stem cells are highly heterogeneous and in fact stochasticity is necessary for pluripotency. This process appears to be tightly regulated through the epigenome in development. Moreover, in these biological contexts, “stochasticity” is hardly synonymous with “noise”, which often refers to variation which obscures a “true signal” (e.g., measurement error) or which is structural, as in physics (e.g., quantum noise). In contrast, “stochastic regulation” refers to purposeful, programmed variation; the fluctuations are random but there is no true signal to mask.

This workshop will serve as a forum for scientists and engineers with an interest in computational biology to explore the role of stochasticity in regulation, development and evolution, and its epigenetic basis. Just as thinking about stochasticity was transformative in physics and in some areas of biology, it promises to fundamentally transform modern genetics and help to explain phase transitions such as differentiation and cancer.

This workshop will include a poster session; a request for poster titles will be sent to registered participants in advance of the workshop.

Speaker List:
Adam Arkin (Lawrence Berkeley Laboratory)
Gábor Balázsi (SUNY Stony Brook)
Domitilla Del Vecchio (Massachusetts Institute of Technology)
Michael Elowitz (California Institute of Technology)
Andrew Feinberg (Johns Hopkins University)
Don Geman (Johns Hopkins University)
Anita Göndör (Karolinska Institutet)
John Goutsias (Johns Hopkins University)
Garrett Jenkinson (Johns Hopkins University)
Andre Levchenko (Yale University)
Olgica Milenkovic (University of Illinois)
Johan Paulsson (Harvard University)
Leor Weinberger (University of California, San Francisco (UCSF))
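The distinction the organizers draw above between “noise” and “stochastic regulation” is easy to make concrete with a toy model. Below is a minimal sketch (in Python, with illustrative parameters of my own choosing rather than anything taken from the workshop) of constitutive gene expression treated as a birth-death process and simulated with the Gillespie algorithm: even with fixed rates, the transcript count fluctuates, and those fluctuations are a property of the process itself rather than measurement error obscuring some “true signal.”

```python
# Minimal Gillespie (stochastic simulation algorithm) sketch of constitutive
# gene expression as a birth-death process: transcripts are produced at rate k
# and each is degraded at rate gamma. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def gillespie_birth_death(k=10.0, gamma=1.0, n0=0, t_max=50.0):
    """Simulate the transcript count n(t) and return (times, counts)."""
    t, n = 0.0, n0
    times, counts = [t], [n]
    while t < t_max:
        a_birth = k              # propensity of producing one transcript
        a_death = gamma * n      # propensity of degrading one transcript
        a_total = a_birth + a_death
        t += rng.exponential(1.0 / a_total)                  # waiting time to next event
        n += 1 if rng.random() < a_birth / a_total else -1   # which reaction fired
        times.append(t)
        counts.append(n)
    return np.array(times), np.array(counts)

times, counts = gillespie_birth_death()
# For this model the steady-state copy number is Poisson with mean k/gamma = 10,
# so mean and variance should come out roughly equal.
tail = counts[len(counts) // 2:]
print(f"mean ≈ {tail.mean():.1f}, variance ≈ {tail.var():.1f}")
```

The fluctuations here are not an error term layered on top of a deterministic signal; they are the behavior of the system, which is the sense of “stochastic regulation” the workshop description is after.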


NIMBioS Tutorial: Evolutionary Quantitative Genetics 2016

NIMBioS Tutorial: Evolutionary Quantitative Genetics 2016 by NIMBioS (nimbios.org)
This tutorial will review the basics of theory in the field of evolutionary quantitative genetics and its connections to evolution observed at various time scales. Quantitative genetics deals with the inheritance of measurements of traits that are affected by many genes. Quantitative genetic theory for natural populations was developed considerably in the period from 1970 to 1990 and up to the present, and it has been applied to a wide range of phenomena including the evolution of differences between the sexes, sexual preferences, life history traits, plasticity of traits, as well as the evolution of body size and other morphological measurements. Textbooks have not kept pace with these developments, and currently few universities offer courses in this subject aimed at evolutionary biologists. There is a need for evolutionary biologists to understand this field because of the ability to collect large amounts of data by computer, the development of statistical methods for changes of traits on evolutionary trees and for changes in a single species through time, and the realization that quantitative characters will not soon be fully explained by genomics. This tutorial aims to fill this need by reviewing basic aspects of theory and illustrating how that theory can be tested with data, both from single species and with multiple-species phylogenies. Participants will learn to use R, an open-source statistical programming language, to build and test evolutionary models. The intended participants for this tutorial are graduate students, postdocs, and junior faculty members in evolutionary biology.
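For readers who haven’t met the field before, the single-trait result most of this theory builds on is the breeder’s equation, with Lande’s multivariate generalization doing the work in the comparative settings mentioned above. These are standard textbook relations rather than anything specific to the tutorial, but they give a flavor of the models participants would be building and testing:

```latex
% Breeder's equation: the response to selection R equals the narrow-sense
% heritability h^2 (additive genetic variance over total phenotypic variance)
% times the selection differential S.
R = h^2 S, \qquad h^2 = \frac{V_A}{V_P}

% Lande's multivariate generalization: the change in the vector of trait
% means equals the additive genetic covariance matrix G times the
% selection gradient \beta.
\Delta \bar{z} = G \beta
```

Estimating quantities like G and β from data, and tracking how trait means change along a phylogeny or through time within a single species, is presumably the kind of exercise the tutorial’s R sessions are aimed at.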


Nick Lane and Philip Ball Discuss Mitochondria, Sex, and How to Live Longer

Nick Lane and Philip Ball Discuss Mitochondria, Sex, and How to Live Longer by Philip Ball (Nautil.us)
In his 2010 book, Life Ascending: The Ten Great Inventions of Evolution, Nick Lane, a biochemist at University College London, explores with eloquence and clarity the big questions of life: how it began, why we age and die, and why we have sex. Lane has been steadily constructing an alternative view of evolution to the one in which genes explain it all. He argues that some of the major events during evolutionary history, including the origin of life itself, are best understood by considering where the energy comes from and how it is used. Lane describes these ideas in his 2015 book, The Vital Question: Why Is Life the Way It Is?. Recently Bill Gates called it “an amazing inquiry into the origins of life,” adding that Lane “is one of those original thinkers who make you say: More people should know about this guy’s work.” Nautilus caught up with Lane in his laboratory in London and asked him about his ideas on aging, sex, and death.

Biochemist Nick Lane explains the elements of life, sex, and aging in an engaging popular science interview.



Popular Science Books on Information Theory, Biology, and Complexity

The beginning of a four-part series in which I provide a gradation of books and texts that lie at the intersection of the application of information theory, physics, and engineering practice to the area of biology.

Previously, I had made a large and somewhat random list of books which lie at the intersection of the application of information theory, physics, and engineering practice to the area of biology. Below I’ll begin to do a somewhat better job of providing a finer gradation of technical level for both the hobbyist and the aspiring student who wish to bring themselves to a higher level of understanding of these areas. In future posts, I’ll try to begin classifying other texts into graduated strata as well. The final list will be maintained here: Books at the Intersection of Information Theory and Biology.

Introductory / General Readership / Popular Science Books

These books are written on a generally non-technical level and give a broad overview of their topics with occasional forays into interesting or intriguing subtopics. They include few, if any, mathematical equations or formal conceptualization. Typically, any high school student should be able to read, follow, and understand the broad concepts behind these books. Though often non-technical, these texts can give some useful insight into the topics at hand, even for the most advanced researchers.

Complexity: A Guided Tour by Melanie Mitchell (review)

Possibly one of the best places to start, this text gives a great overview of most of the major areas of study related to these fields.

Entropy Demystified: The Second Law Reduced to Plain Common Sense by Arieh Ben-Naim

One of the best books on the concept of entropy out there.  It can be read even by middle school students with no exposure to algebra and does a fantastic job of laying out the conceptualization of how entropy underlies large areas of the broader subject. Even those with Ph.D.’s in statistical thermodynamics can gain something useful from this lovely volume.

The Information: A History, a Theory, a Flood by James Gleick (review)

A relatively recent popular science volume covering various conceptualizations of what information is and how it’s been dealt with in science and engineering. Though it has its flaws, it’s certainly a good introduction for the beginner, particularly with regard to history.

The Origin of Species by Charles Darwin

One of the most influential pieces of writing known to man, this classic text is the basis from which major strides in biology have since been made. A must-read for everyone on the planet.

Information, Entropy, Life and the Universe: What We Know and What We Do Not Know by Arieh Ben-Naim

Information Theory and Evolution by John Avery

The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life by Werner R. Loewenstein (review)

Information Theory, Evolution, and the Origin of Life by Hubert P. Yockey

The four books above have a significant amount of overlap. Though one could read all of them, I recommend that those pressed for time choose Ben-Naim first. As I write this I’ll note that Ben-Naim’s book is scheduled for release on May 30, 2015, but he’s been kind enough to allow me to read an advance copy while it was in process; it gets my highest recommendation in its class. Loewenstein covers a bit more than Avery, who also has a more basic presentation. Most who continue with the subject will later come across Yockey’s Information Theory and Molecular Biology, which is similar to his text here but written at a slightly higher level of sophistication. Those who plan to stop at this level of sophistication might want to try Yockey third instead.

The Red Queen: Sex and the Evolution of Human Nature by Matt Ridley

Grammatical Man: Information, Entropy, Language, and Life  by Jeremy Campbell

Life’s Ratchet: How Molecular Machines Extract Order from Chaos by Peter M. Hoffmann

Complexity: The Emerging Science at the Edge of Order and Chaos by M. Mitchell Waldrop

The Big Picture: On the Origins of Life, Meaning, and the Universe Itself by Sean Carroll (Dutton, May 10, 2016)

In the coming weeks/months, I’ll try to continue putting recommended books on the rest of the spectrum, the balance of which follows in outline form below. As always, I welcome suggestions and recommendations based on others’ experiences as well. If you’d like to suggest additional resources in any of the sections below, please do so via our suggestion box. For those interested in additional resources, please take a look at the ITBio Resources page which includes information about related research groups; references and journal articles; academic and research institutes, societies, groups, and organizations; and conferences, workshops, and symposia.

Lower Level Undergraduate

These books are written at a level that can be grasped and understood by most students at a freshman or sophomore university level. Coursework in math, science, and engineering will usually presume knowledge of calculus, basic probability theory, introductory physics, chemistry, and basic biology.

Upper Level Undergraduate

These books are written at a level that can be grasped and understood by those at a junior or senior university level. Coursework in math, science, and engineering may presume knowledge of probability theory, differential equations, linear algebra, complex analysis, abstract algebra, signal processing, organic chemistry, molecular biology, evolutionary theory, thermodynamics, advanced physics, and basic information theory.

Graduate Level

These books are written at a level that can be grasped and understood by most working at the master’s level at most universities. Coursework presumes all the previously mentioned classes, though it may require a higher level of sub-specialization in one or more areas of mathematics, physics, biology, or engineering practice. Because of the depth and breadth of disciplines covered here, many may feel the need to delve into areas outside of their particular specialization.


Probability Models for DNA Sequence Evolution

Probability Models for DNA Sequence Evolution (Springer, 2008, 2nd Edition) by Rick Durrett (math.duke.edu)

While browsing through some textbooks and researchers’ websites today, I came across a fantastic-looking title: Probability Models for DNA Sequence Evolution by Rick Durrett (Springer, 2008). While searching his website at Duke, I noticed that he’s made a .pdf copy of a LaTeX version of the 2nd edition available for download. I hope others find it as interesting and useful as I do.

I’ll also give him a shout out for being a mathematician with a fledgling blog: Rick’s Ramblings.

Probability Models for DNA Sequence Evolution by Richard Durrett (book cover)

BIRS Workshop on Biological and Bio-Inspired Information Theory | Storify Stream

Over the span of the coming week, I'll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.


Announcement: Special issue of Entropy: “Information Theoretic Incentives for Cognitive Systems”

Dr. Christoph Salge asked me to cross-post this notice from the Entropy site here.

Editor’s Note: Dr. Christoph Salge asked me to cross-post the following notice from the Entropy site here. I’m sure many of the readers of the blog will find the topic to be of interest.


 


 

Dear Colleagues,

In recent years, ideas such as “life is information processing” or “information holds the key to understanding life” have become more common. However, how can information, or more formally Information Theory, increase our understanding of life, or life-like systems?

Information Theory not only has a profound mathematical basis, but also typically provides an intuitive understanding of processes such as learning, behavior, and evolution in terms of information processing.

In this special issue, we are interested in both:

  1. the information-theoretic formalization and quantification of different aspects of life, such as driving forces of learning and behavior generation, information flows between neurons, swarm members and social agents, and information theoretic aspects of evolution and adaptation, and
  2. the simulation and creation of life-like systems with previously identified principles and incentives.

Topics relating to artificial and natural systems:

  • information theoretic intrinsic motivations
  • information theoretic quantification of behavior
  • information theoretic guidance of artificial evolution
  • information theoretic guidance of self-organization
  • information theoretic driving forces behind learning
  • information theoretic driving forces behind behavior
  • information theory in swarms
  • information theory in social behavior
  • information theory in evolution
  • information theory in the brain
  • information theory in system-environment distinction
  • information theory in the perception action loop
  • information theoretic definitions of life

Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1200 CHF (Swiss Francs).

Deadline for manuscript submissions: 28 February 2015

Special Issue Editors

Guest Editor
Dr. Christoph Salge
Adaptive Systems Research Group, University of Hertfordshire, College Lane, AL10 9AB Hatfield, UK
Website: http://homepages.stca.herts.ac.uk/~cs08abi
E-Mail: c.salge@herts.ac.uk
Phone: +44 1707 28 4490
Interests: Intrinsic Motivation (Empowerment); Self-Organization; Guided Self-Organization; Information-Theoretic Incentives for Social Interaction; Information-Theoretic Incentives for Swarms; Information Theory and Computer Game AI

Guest Editor
Dr. Georg Martius
Cognition and Neurosciences, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
Website: http://www.mis.mpg.de/jjost/members/georg-martius.html
E-Mail: martius@mis.mpg.de
Phone: +49 341 9959 545
Interests: Autonomous Robots; Self-Organization; Guided Self-Organization; Information Theory; Dynamical Systems; Machine Learning; Neuroscience of Learning; Optimal Control

Guest Editor
Dr. Keyan Ghazi-Zahedi
Information Theory of Cognitive Systems, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
Website: http://personal-homepages.mis.mpg.de/zahedi
E-Mail: zahedi@mis.mpg.de
Phone: +49 341 9959 535
Interests: Embodied Artificial Intelligence; Information Theory of the Sensorimotor Loop; Dynamical Systems; Cybernetics; Self-organisation; Synaptic plasticity; Evolutionary Robotics

Guest Editor
Dr. Daniel Polani
Department of Computer Science, University of Hertfordshire, Hatfield AL10 9AB, UK
Website: http://homepages.feis.herts.ac.uk/~comqdp1/
E-Mail: d.polani@herts.ac.uk
Interests: artificial intelligence; artificial life; information theory for intelligent information processing; sensor evolution; collective and multiagent systems
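To give a concrete flavor of what “information theoretic quantification of behavior” can mean in practice, here is a minimal sketch (in Python, with made-up numbers, not anything drawn from the call itself) that computes the mutual information between an agent’s action and its subsequent sensor reading from a small joint probability table:

```python
# Toy illustration of an information-theoretic quantity of the kind the call
# mentions: the mutual information I(A; S) between an agent's action A and the
# resulting sensor value S, computed from a joint probability table.
# The probabilities below are invented purely for illustration.
import numpy as np

# Rows: actions (e.g., "left", "right"); columns: sensor readings (0, 1, 2).
p_joint = np.array([[0.25, 0.15, 0.10],
                    [0.05, 0.15, 0.30]])

p_a = p_joint.sum(axis=1, keepdims=True)   # marginal over actions
p_s = p_joint.sum(axis=0, keepdims=True)   # marginal over sensor values

# I(A; S) = sum_{a,s} p(a,s) * log2( p(a,s) / (p(a) * p(s)) )
mi_bits = float(np.sum(p_joint * np.log2(p_joint / (p_a * p_s))))
print(f"I(A; S) = {mi_bits:.3f} bits")
```

Quantities like empowerment, one of the intrinsic motivations listed in the interests above, are built from mutual informations and channel capacities of exactly this sort, measured over an agent’s perception-action loop rather than a single table.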


Workshop Announcement: Entropy and Information in Biological Systems

I rarely, if ever, reblog anything here, but this particular post from John Baez’s blog Azimuth is so on-topic that attempting to embellish it seems silly.

Entropy and Information in Biological Systems (Part 2)

John Harte, Marc Harper and I are running a workshop! Now you can apply here to attend:

Information and entropy in biological systems, National Institute for Mathematical and Biological Synthesis, Knoxville, Tennessee, Wednesday-Friday, 8-10 April 2015.

Click the link, read the stuff and scroll down to “CLICK HERE” to apply. The deadline is 12 November 2014.

Financial support for travel, meals, and lodging is available for workshop attendees who need it. We will choose among the applicants and invite 10-15 of them.

The idea
Information theory and entropy methods are becoming powerful tools in biology, from the level of individual cells, to whole ecosystems, to experimental design, model-building, and the measurement of biodiversity. The aim of this investigative workshop is to synthesize different ways of applying these concepts to help systematize and unify work in biological systems. Early attempts at “grand syntheses” often misfired, but applications of information theory and entropy to specific highly focused topics in biology have been increasingly successful. In ecology, entropy maximization methods have proven successful in predicting the distribution and abundance of species. Entropy is also widely used as a measure of biodiversity. Work on the role of information in game theory has shed new light on evolution. As a population evolves, it can be seen as gaining information about its environment. The principle of maximum entropy production has emerged as a fascinating yet controversial approach to predicting the behavior of biological systems, from individual organisms to whole ecosystems. This investigative workshop will bring together top researchers from these diverse fields to share insights and methods and address some long-standing conceptual problems.

So, here are the goals of our workshop:

  •  To study the validity of the principle of Maximum Entropy Production (MEP), which states that biological systems – and indeed all open, non-equilibrium systems – act to produce entropy at the maximum rate.
  • To familiarize all the participants with applications to ecology of the MaxEnt method: choosing the probabilistic hypothesis with the highest entropy subject to the constraints of our data. We will compare MaxEnt with competing approaches and examine whether MaxEnt provides a sufficient justification for the principle of MEP.
  • To clarify relations between known characterizations of entropy, the use of entropy as a measure of biodiversity, and the use of MaxEnt methods in ecology.
  • To develop the concept of evolutionary games as “learning” processes in which information is gained over time.
  • To study the interplay between information theory and the thermodynamics of individual cells and organelles.

For more details, go here.

If you’ve got colleagues who might be interested in this, please let them know. You can download a PDF suitable for printing and putting on a bulletin board by clicking on this:
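As an aside for readers who haven’t run across it before, the MaxEnt method mentioned in the goals above amounts to picking, from all probability distributions consistent with the data, the one with the largest Shannon entropy. Here is a minimal sketch (Jaynes’s classic loaded-die example, in Python; the example and numbers are mine, not the workshop’s) that solves for the maximum entropy distribution subject to a single mean constraint:

```python
# MaxEnt in miniature: find the distribution over die faces 1..6 with the
# largest Shannon entropy whose mean equals 4.5 (Jaynes's Brandeis dice
# problem). The solution has the exponential-family form p_i ∝ exp(-lam * i),
# with the Lagrange multiplier lam chosen to satisfy the mean constraint.
import numpy as np
from scipy.optimize import brentq

faces = np.arange(1, 7)
target_mean = 4.5

def mean_given_lambda(lam):
    """Mean of the distribution p_i proportional to exp(-lam * i)."""
    w = np.exp(-lam * faces)
    p = w / w.sum()
    return float(p @ faces)

# Solve for the multiplier that hits the target mean, then build p.
lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -5.0, 5.0)
w = np.exp(-lam * faces)
p = w / w.sum()

entropy_nats = float(-(p * np.log(p)).sum())
print("MaxEnt probabilities:", np.round(p, 3))
print(f"mean = {p @ faces:.2f}, entropy = {entropy_nats:.3f} nats")
```

The ecological applications mentioned in the announcement use essentially the same machinery, just with species abundances in place of die faces and measured ecological totals in place of the single mean constraint.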

CECAM Workshop: “Entropy in Biomolecular Systems”

On Friday, I had an excellent and stimulating conversation with Arieh Ben-Naim about his recent writing and work, and he mentioned in passing that he had been invited to a conference relating to entropy and biology in Vienna. A quick web search turned it up, and since I hadn’t heard about it myself yet, I thought I’d pass it along to others who are regular readers and interested in the area.

The workshop on “Entropy in Biomolecular Systems” is being hosted by the Centre Européen de Calcul Atomique et Moléculaire (CECAM).


Location: DACAM, Max F. Perutz Laboratories, University of Vienna, Dr. Bohrgasse 9, A-1030, Vienna, Austria
Dates: May 14, 2014 to May 17, 2014

The workshop is being organized by:

  • Richard Henchman (University of Manchester, United Kingdom)
  • Bojan Zagrovic (University of Vienna, Austria)
  • Michel Cuendet (Swiss Institute of Bioinformatics, Lausanne, Switzerland and Weill Cornell Medical College, New York, USA)
  • Chris Oostenbrink (University of Natural Resources and Life Sciences, Austria)

It’s being supported by CECAM, the European Research Council, and the Royal Society of Chemistry’s Statistical Mechanics and Thermodynamics Group.

I’ll note that the registration deadline is on April 21 with a payment deadline of April 30, so check in quickly if you haven’t already.

The summary from the workshop website states:

This workshop brings together the world’s experts to address the challenges of determining the entropy of biomolecular systems, either by experiment or computer simulation. Entropy is one of the main driving forces for any biological process such as binding, folding, partitioning and reacting. Our deficient understanding of entropy, however, means that such important processes remain controversial and only partially understood. Contributions of water, ions, cofactors, and biomolecular flexibility are actively examined but yet to be resolved. The state-of-the-art of each entropy method will be presented and explained, highlighting its capabilities and deficiencies. This will be followed by intensive discussion on the main areas that need improving, leading to suitable actions and collaborations to address the main biological and industrial questions.

Further details on the workshop can be found on the CECAM website.

 

As always, details on other upcoming workshops and conferences relating to information theory and biology can be found on our ITBio Conferences/Workshops page.

 

 

Information Theory and Paleoanthropology

A few weeks ago I had communicated a bit with paleoanthropologist John Hawks.  I wanted to take a moment to highlight the fact that he maintains an excellent blog primarily concerning his areas of research which include anthropology, genetics and evolution.  Even more specifically, he is one of the few people in these areas with at least a passing interest in the topic of information theory as it relates to his work. I recommend everyone take a look at his information theory specific posts.


I’ve previously written a brief review of “Major Transitions in Evolution,” a course John Hawks taught in collaboration with Anthony Martin for The Learning Company as part of their Great Courses series of lectures. Given my interest in the MOOC revolution in higher education, I’ll also mention that Dr. Hawks has recently begun a free Coursera class entitled “Human Evolution: Past and Future”. I’m sure his current course focuses more on the area of human evolution compared with the prior course, which dedicated only a short segment to this time period. Given Hawks’ excellent prior teaching work, I’m sure this will be of general interest to readers interested in information theory as it relates to evolution, biology, and big history.

I’d love to hear from others in the area of anthropology who are interested in information theoretical applications.

 


Renaissance for Information Theory in Biology

This year is seeing what appears to be the biggest renaissance in the application of information theory to the area of biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s “Symposium on Information Theory in Biology” at Gatlinburg, Tennessee in 1956. (I might argue it’s possibly even bigger than Claude Shannon’s Ph.D. thesis.) It certainly portends a movement that will rapidly build upon and far surpass Norbert Wiener’s concept of Cybernetics and Ludwig von Bertalanffy’s concept of General Systems Theory.

This week John Baez announced an upcoming three-day workshop on “Entropy and Information in Biological Systems” to be hosted by the National Institute for Mathematical and Biological Synthesis in Knoxville, TN, tentatively scheduled for October 22-24, 2014.

Apparently unbeknownst to Baez, earlier this year Andrew Eckford, Toby Berger, and Peter Thomas announced a six-day workshop on “Biological and Bio-Inspired Information Theory” to be hosted by the Banff International Research Station for Mathematical Innovation and Discovery scheduled for October 26-31, 2014 – just two days later!

What a bonanza!!

The BIRS workshop will be a bit more general in its approach while the NIMBioS workshop has a slightly tighter view specifically on maximum entropy as applied to biology.

Even more telling (and perhaps most promising) about the two workshops is the very heavy mathematical bent both intend to make their focus. I have a theory that the bounds of science are held below the high water level of mathematics (aka are “bounded by” in mathematics-speak), so there is nothing more exciting than to see groups attempting to push the mathematics and its application further. It was both the lack of mathematical rigor and the general youth of biology (and specifically genetics and microbiology) in the 1950s that heavily hampered the early growth of cybernetics as a movement. Fortunately this is no longer the case on either count. Now we just need more researchers who are readily conversant in the two realms simultaneously.

Book Review: Werner Loewenstein’s “The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life”

The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life by Werner R. Loewenstein

Though there is a paucity of equations, particularly on the information theoretic side, Loewenstein does a fantastic job of discussing the theory and philosophy of what is going on in the overlapping fields of information theory and microbiology. (I will note that it is commonly held wisdom within publishing, particularly for books aimed at the broader public, that the number of equations in a text is inversely proportional to its sales, and I’m sure this is the reason for the lack of mathematical substantiation, which he could easily have supplied.)

The Touchstone of Life (Book Cover)

This is a much more specific and therefore much better – in my mind – book than John Avery’s Information Theory and Evolution, which covers some similar ground. Loewenstein has a much better and more specific grasp of the material, in my opinion. Those who feel overwhelmed by Loewenstein may prefer to take a step back to Avery’s more facile presentation.

Loewenstein has a deft ability to describe what is going on and give both an up-close view with many examples as well as a spectacular view of the broader picture – something which is often missing in general science books of this sort. Readers with no mathematical or microbiology background can benefit from it as much as those with more experience.

One thing which sets it apart from much of its competition, even in the broader general science area of non-fiction, is that the author has a quirky but adept ability to add some flowery language and analogy to clarify his points. Though many will find this off-putting, it really does add some additional flavor to what might otherwise be dry and dull explication. His range of background knowledge, philosophy, and vocabulary is second only to (and possibly on par with, or even exceeding in some cases) that of Simon Winchester.

I’d highly recommend this book to people prior to their academic studies of biochemistry or molecular cell biology or to budding biomedical engineers prior to their junior year of study. I truly wish I had read this in 1994 myself, but alas it didn’t exist until a few years after. I lament that I hadn’t picked it up and been able to read it thoroughly until now.

For my part, his drastically different view of how biology should be approached moving forward is spot on. I am firmly a member of this new “school”. His final chapter on this concept is truly illuminating from a philosophical and theoretical perspective, and I encourage people to read it first instead of last.

I’ll also note briefly that I’ve seen some reviews of this book which make mention of creationism or intelligent design and whether or not proponents of those philosophies feel that Loewenstein’s work here supports them, particularly since Loewenstein appeared on a panel with Dembski once. I will state, for those who take a purely scientific viewpoint of things, that this book is written in full support of evolution and microbiology and doesn’t use the concept of “information” to muddy the waters in the way many ID arguments typically do.

Original review posted to GoodReads.com on 9/4/12
