Bookmarked Selective pressures on genomes in molecular evolution by Charles Ofria, Christoph Adami, Travis C. Collier (arXiv.org, 15 Jan 2003)
We describe the evolution of macromolecules as an information transmission process and apply tools from Shannon information theory to it. This allows us to isolate three independent, competing selective pressures that we term compression, transmission, and neutrality selection. The first two affect genome length: the pressure to conserve resources by compressing the code, and the pressure to acquire additional information that improves the channel, increasing the rate of information transmission into each offspring. Noisy transmission channels (replication with mutations) give rise to a third pressure that acts on the actual encoding of information; it maximizes the fraction of mutations that are neutral with respect to the phenotype. This neutrality selection has important implications for the evolution of evolvability. We demonstrate each selective pressure in experiments with digital organisms.
To be published in J. theor. Biology 222 (2003) 477-483
DOI: 10.1016/S0022-5193(03)00062-6
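
The neutrality selection described here acts on a concrete quantity: the fraction of mutations that leave the phenotype unchanged. As a rough illustration (a toy sketch of my own, not code from the paper, whose experiments use digital organisms rather than nucleotide strings), here is one way to estimate that fraction for a short sequence under a made-up fitness function:

```python
ALPHABET = "ACGT"  # illustrative nucleotide alphabet

def neutral_fraction(genome, fitness):
    """Fraction of all one-point mutations that leave fitness unchanged.

    `fitness` stands in for the genotype-phenotype map: any callable
    taking a sequence and returning a number.
    """
    base = fitness(genome)
    neutral = total = 0
    for i, original in enumerate(genome):
        for symbol in ALPHABET:
            if symbol == original:
                continue
            mutant = genome[:i] + symbol + genome[i + 1:]
            total += 1
            neutral += (fitness(mutant) == base)
    return neutral / total

# Purely hypothetical fitness: count occurrences of the motif "AT".
print(neutral_fraction("ATGCATGC", lambda g: g.count("AT")))
```

In the paper's framing, neutrality selection favors encodings for which this fraction is high at a given mutation rate, since more of each organism's offspring then inherit the parent's full information content.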

Synthetic Biology’s Hunt for the Genetic Transistor | IEEE Spectrum

Read Synthetic Biology's Hunt for the Genetic Transistor (spectrum.ieee.org)
How genetic circuits will unlock the true potential of bioengineering 
This is a great short article on bioengineering and synthetic biology written for the layperson. It’s also one of the best crash courses I’ve read on genetics in a while.

Bill Davenhall at TEDMED 2009 on Geomedicine: How Your Environment May Affect Your Health

Watched TEDMED 2009 on Geomedicine: How Your Environment May Affect Your Health by Bill Davenhall from TEDMED
Does where you live have an impact on your overall health? Bill Davenhall believes that the location of our homes is critical to our medical history.
This is a great thing to think about the next time your doctor asks for your medical history. Perhaps more data and better visualization of it would help bring home the messages of pollution and global warming.

‘The Information’ by James Gleick – Book Review by Janet Maslin | New York Times

Reposted ‘The Information’ by James Gleick - Review (nytimes.com)
“The Information,” by James Gleick, is to the nature, history and significance of data what the beach is to sand.
This book is assuredly going to have to skip up to the top of my current reading list.

“The Information” is so ambitious, illuminating and sexily theoretical that it will amount to aspirational reading for many of those who have the mettle to tackle it. Don’t make the mistake of reading it quickly. Imagine luxuriating on a Wi-Fi-equipped desert island with Mr. Gleick’s book, a search engine and no distractions. “The Information” is to the nature, history and significance of data what the beach is to sand.

In this relaxed setting, take the time to differentiate among the Brownian (motion), Bodleian (library) and Boolean (logic) while following Mr. Gleick’s version of what Einstein called “spukhafte Fernwirkung,” or “spooky action at a distance.” Einstein wasn’t precise about what this meant, and Mr. Gleick isn’t always precise either. His ambitions for this book are diffuse and far flung, to the point where providing a thumbnail description of “The Information” is impossible.

So this book’s prologue is its most slippery section. It does not exactly outline a unifying thesis. Instead it hints at the amalgam of logic, philosophy, linguistics, research, appraisal and anecdotal wisdom that will follow. If Mr. Gleick has one overriding goal it is to provide an animated history of scientific progress, specifically the progress of the technology that allows information to be recorded, transmitted and analyzed. This study’s range extends from communication by drumbeat to cognitive assault by e-mail.

As an illustration of Mr. Gleick’s versatility, consider what he has to say about the telegraph. He describes the mechanical key that made telegraphic transmission possible; the compression of language that this new medium encouraged; that it literally was a medium, a midway point between fully verbal messages and coded ones; the damaging effect its forced brevity had on civility; the confusion it created as to what a message actually was (could a mother send her son a dish of sauerkraut?) and the new conceptual thinking that it helped implement. The weather, which had been understood on a place-by-place basis, was suddenly much more than a collection of local events.

Beyond all this Mr. Gleick’s telegraph chapter, titled “A Nervous System for the Earth,” finds time to consider the kind of binary code that began to make sense in the telegraph era. It examines the way letters came to be treated like numbers, the way systems of ciphers emerged. It cites the various uses to which ciphers might be put by businessmen, governments or fiction writers (Lewis Carroll, Jules Verne and Edgar Allan Poe). Most of all it shows how this phase of communication anticipated the immense complexities of our own information age.

Although “The Information” unfolds in a roughly chronological way, Mr. Gleick is no slave to linearity. He freely embarks on colorful digressions. Some are included just for the sake of introducing the great eccentrics whose seemingly marginal inventions would prove to be prophetic. Like Richard Holmes’s “Age of Wonder” this book invests scientists with big, eccentric personalities. Augusta Ada Lovelace, the daughter of Lord Byron, may have been spectacularly arrogant about what she called “my immense reasoning faculties,” claiming that her brain was “something more than merely mortal.” But her contribution to the writing of algorithms can, in the right geeky circles, be mentioned in the same breath as her father’s contribution to poetry.

The segments of “The Information” vary in levels of difficulty. Grappling with entropy, randomness and quantum teleportation is the price of enjoying Mr. Gleick’s simple, entertaining riffs on the Oxford English Dictionary’s methodology, which has yielded 30-odd spellings of “mackerel” and an enchantingly tongue-tied definition of “bada-bing,” and on the cyber-battles waged via Wikipedia. (As he notes, there are people who have bothered to fight over Wikipedia’s use of the word “cute” to accompany a picture of a young polar bear.) That Amazon boasts of being able to download a book called “Data Smog” in less than a minute does not escape his keen sense of the absurd.

As it traces our route to information overload, “The Information” pays tribute to the places that made it possible. He cites and honors the great cogitation hives of yore. In addition to the Institute for Advanced Study in Princeton, N.J., the Mount Rushmore of theoretical science, he acknowledges the achievements of corporate facilities like Bell Labs and I.B.M.’s Watson Research Center in the halcyon days when many innovations had not found practical applications and progress was its own reward.

“The Information” also lauds the heroics of mathematicians, physicists and computer pioneers like Claude Shannon, who is revered in the computer-science realm for his information theory but not yet treated as a subject for full-length, mainstream biography. Mr. Shannon’s interest in circuitry using “if … then” choices conducting arithmetic in a binary system had novelty when he began formulating his thoughts in 1937. “Here in a master’s thesis by a research assistant,” Mr. Gleick writes, “was the essence of the computer revolution yet to come.”

Among its many other virtues “The Information” has the rare capacity to work as a time machine. It goes back much further than Shannon’s breakthroughs. And with each step backward Mr. Gleick must erase what his readers already know. He casts new light on the verbal flourishes of the Greek poetry that preceded the written word: these turns of phrase could be as useful for their mnemonic power as for their art. He explains why the Greeks arranged things in terms of events, not categories; how one Babylonian text that ends with “this is the procedure” is essentially an algorithm; and why the telephone and the skyscraper go hand in hand. Once the telephone eliminated the need for hand-delivered messages, the sky was the limit.

In the opinion of “The Information” the world of information still has room for expansion. We may be drowning in spam, but the sky’s still the limit today.

2011 USC Viterbi Lecture “Adventures in Coding Theory” by Elwyn Berlekamp

Bookmarked 2011 Andrew Viterbi Lecture, Ming Hsieh Department of Electrical Engineering (USC Viterbi School of Engineering)

"Adventures in Coding Theory"

Professor Elwyn Berlekamp
University of California, Berkeley

Gerontology Auditorium, Thursday, March 3, 4:30 to 5:30 p.m.

>> Click here for live webcast

Abstract
The inventors of error-correcting codes were initially motivated by problems in communications engineering. But coding theory has since also influenced several other fields, including memory technology, theoretical computer science, game theory, portfolio theory, and symbolic manipulation. This talk will recall some forays into these subjects.

I wish I could be at this lecture in person today, but I’ll have to live with the live webcast.
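
For readers who have never met an error-correcting code, here is a minimal sketch (my own illustration, not material from the lecture) of the classic Hamming(7,4) code: four data bits are padded with three parity bits chosen so that any single flipped bit can be located and corrected. Berlekamp's own contributions concern far more powerful algebraic codes and their decoding, but the basic idea starts here.

```python
import numpy as np

# Parity-check matrix for Hamming(7,4): column i (1-indexed) holds the binary
# digits of i, so the syndrome of a single-bit error spells out its position.
H = np.array([[(i >> k) & 1 for i in range(1, 8)] for k in range(3)])

DATA_POS = [2, 4, 5, 6]  # codeword positions 3, 5, 6, 7 carry the data bits
PARITY_POS = [0, 1, 3]   # positions 1, 2, 4 carry the parity bits

def encode(bits4):
    """Encode four data bits into a seven-bit Hamming codeword."""
    word = np.zeros(7, dtype=int)
    word[DATA_POS] = bits4
    for row, p in zip(H, PARITY_POS):  # set parity so that H @ word == 0 (mod 2)
        word[p] = (row @ word) % 2
    return word

def decode(word):
    """Correct at most one flipped bit and return the four data bits."""
    syndrome = (H @ word) % 2
    pos = int(syndrome[0] + 2 * syndrome[1] + 4 * syndrome[2])  # 0 means no error
    if pos:
        word = word.copy()
        word[pos - 1] ^= 1
    return word[DATA_POS]

data = np.array([1, 0, 1, 1])
received = encode(data)
received[5] ^= 1                               # flip one bit in transit
assert np.array_equal(decode(received), data)  # the single error is corrected
```
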
Bookmarked Thermodynamics of natural selection III: Landauer's principle in computation and chemistry by Eric Smith (Journal of Theoretical Biology Volume 252, Issue 2, 21 May 2008, Pages 213-220)
This is the third in a series of three papers devoted to energy flow and entropy changes in chemical and biological processes, and their relations to the thermodynamics of computation. The previous two papers have developed reversible chemical transformations as idealizations for studying physiology and natural selection, and derived bounds from the second law of thermodynamics, between information gain in an ensemble and the chemical work required to produce it. This paper concerns the explicit mapping of chemistry to computation, and particularly the Landauer decomposition of irreversible computations, in which reversible logical operations generating no heat are separated from heat-generating erasure steps which are logically irreversible but thermodynamically reversible. The Landauer arrangement of computation is shown to produce the same entropy-flow diagram as that of the chemical Carnot cycles used in the second paper of the series to idealize physiological cycles. The specific application of computation to data compression and error-correcting encoding also makes possible a Landauer analysis of the somewhat different problem of optimal molecular recognition, which has been considered as an information theory problem. It is shown here that bounds on maximum sequence discrimination from the enthalpy of complex formation, although derived from the same logical model as the Shannon theorem for channel capacity, arise from exactly the opposite model for erasure.
https://doi.org/10.1016/j.jtbi.2008.02.013
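For reference, the quantitative statement behind those "heat-generating erasure steps" is Landauer's principle in its usual textbook form (my gloss, not a quotation from the paper): logically reversible operations have no minimum energy cost, but erasing information in surroundings at temperature T does.

```latex
% Landauer's bound: erasing n bits at temperature T dissipates at least
% this much heat (k_B is Boltzmann's constant).
Q_{\text{erase}} \;\ge\; n \, k_B T \ln 2
```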
Bookmarked Thermodynamics of natural selection II: Chemical Carnot cycles by Eric Smith (Journal of Theoretical Biology Volume 252, Issue 2, 21 May 2008, Pages 198-212)
This is the second in a series of three papers devoted to energy flow and entropy changes in chemical and biological processes, and to their relations to the thermodynamics of computation. In the first paper of the series, it was shown that a general-form dimensional argument from the second law of thermodynamics captures a number of scaling relations governing growth and development across many domains of life. It was also argued that models of physiology based on reversible transformations provide sensible approximations within which the second-law scaling is realized. This paper provides a formal basis for decomposing general cyclic, fixed-temperature chemical reactions, in terms of the chemical equivalent of Carnot's cycle for heat engines. It is shown that the second law relates the minimal chemical work required to perform a cycle to the Kullback–Leibler divergence produced in its chemical output ensemble from that of a Gibbs equilibrium. Reversible models of physiology are used to create reversible models of natural selection, which relate metabolic energy requirements to information gain under optimal conditions. When dissipation is added to models of selection, the second-law constraint is generalized to a relation between metabolic work and the combined energies of growth and maintenance.
https://doi.org/10.1016/j.jtbi.2008.02.008
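The work–information bound mentioned in this abstract has a compact general form in stochastic thermodynamics. Roughly, and in my own notation rather than the paper's (the exact statement and conditions there may differ), the minimal isothermal work needed to drive an ensemble from the Gibbs equilibrium distribution p^eq to a target distribution p scales with their Kullback–Leibler divergence:

```latex
% Minimal (reversible) work to produce distribution p from equilibrium p^{eq}
% at temperature T; any irreversibility only adds to this.
W_{\min} \;=\; k_B T \, D_{\mathrm{KL}}\!\left(p \,\middle\|\, p^{\mathrm{eq}}\right)
        \;=\; k_B T \sum_i p_i \ln \frac{p_i}{p_i^{\mathrm{eq}}}
```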
Bookmarked Thermodynamics of natural selection I: Energy flow and the limits on organization by Eric Smith (Journal of Theoretical Biology, Volume 252, Issue 2, 21 May 2008, Pages 185-197)
This is the first of three papers analyzing the representation of information in the biosphere, and the energetic constraints limiting the imposition or maintenance of that information. Biological information is inherently a chemical property, but is equally an aspect of control flow and a result of processes equivalent to computation. The current paper develops the constraints on a theory of biological information capable of incorporating these three characterizations and their quantitative consequences. The paper illustrates the need for a theory linking energy and information by considering the problem of existence and reslience of the biosphere, and presents empirical evidence from growth and development at the organismal level suggesting that the theory developed will capture relevant constraints on real systems. The main result of the paper is that the limits on the minimal energetic cost of information flow will be tractable and universal whereas the assembly of more literal process models into a system-level description often is not. The second paper in the series then goes on to construct reversible models of energy and information flow in chemistry which achieve the idealized limits, and the third paper relates these to fundamental operations of computation.
https://doi.org/10.1016/j.jtbi.2008.02.010
Bookmarked Information and Meaning in Evolutionary Processes by William F. Harms (Cambridge University Press)
The most significant legacy of philosophical skepticism is the realization that our concepts, beliefs and theories are social constructs. This belief has led to epistemological relativism, or the thesis that since there is no ultimate truth about the world, theory preferences are only a matter of opinion. In this book, William Harms seeks to develop the conceptual foundations and tools for a science of knowledge through the application of evolutionary theory, thus allowing us to acknowledge the legacy of skepticism while denying its relativistic offspring.
Bookmarked The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life by Werner R. Loewenstein (Oxford University Press)
No one can escape a sense of wonder when looking at an organism from within. From the humblest amoeba to man, from the smallest cell organelle to the amazing human brain, life presents us with example after example of highly ordered cellular matter, precisely organized and shaped to perform coordinated functions. But where does this order spring from? How does a living organism manage to do what nonliving things cannot do--bring forth and maintain all that order against the unrelenting, disordering pressures of the universe? In The Touchstone of Life, world-renowned biophysicist Werner Loewenstein seeks answers to these ancient riddles by applying information theory to recent discoveries in molecular biology. Taking us into a fascinating microscopic world, he lays bare an all-pervading communication network inside and between our cells--a web of extraordinary beauty, where molecular information flows in gracefully interlaced circles. Loewenstein then takes us on an exhilarating journey along that web and we meet its leading actors, the macromolecules, and see how they extract order out of the erratic quantum world; and through the powerful lens of information theory, we are let in on their trick, the most dazzling of magician's acts, whereby they steal form out of formlessness. The Touchstone of Life flashes with fresh insights into the mystery of life. Boldly straddling the line between biology and physics, the book offers a breathtaking view of that hidden world where molecular information turns the wheels of life. Loewenstein makes these complex scientific subjects lucid and fascinating, as he sheds light on the most fundamental aspects of our existence.
Bookmarked Information Theory and Evolution by John S. Avery (World Scientific)
This highly interdisciplinary book discusses the phenomenon of life, including its origin and evolution (and also human cultural evolution), against the background of thermodynamics, statistical mechanics, and information theory. Among the central themes is the seeming contradiction between the second law of thermodynamics and the high degree of order and complexity produced by living systems. This paradox has its resolution in the information content of the Gibbs free energy that enters the biosphere from outside sources, as the author shows. The role of information in human cultural evolution is another focus of the book. One of the final chapters discusses the merging of information technology and biotechnology into a new discipline — bio-information technology.
Bookmarked Information Theory, Evolution, and the Origin of Life by Hubert P. Yockey (Cambridge University Press)
Information Theory, Evolution and the Origin of Life presents a timely introduction to the use of information theory and coding theory in molecular biology. The genetical information system, because it is linear and digital, resembles the algorithmic language of computers. George Gamow pointed out that the application of Shannon's information theory breaks genetics and molecular biology out of the descriptive mode into the quantitative mode and Dr Yockey develops this theme, discussing how information theory and coding theory can be applied to molecular biology. He discusses how these tools for measuring the information in the sequences of the genome and the proteome are essential for our complete understanding of the nature and origin of life. The author writes for the computer competent reader who is interested in evolution and the origins of life.

HARASS SARAH is a PALINdrome, as well as a popular left-wing sport.

This is definitely the quote of the week:

Sol Golomb, mathematician and information theorist
via personal communication while discussing a palindromic word puzzle

Another Reason to Shun Chick Flicks: Crying = Less Sex!?

In a new study reported in the journal Science this week, Noam Sobel of the Olfaction Research Group at the Weizmann Institute of Science in Rehovot, Israel, and others found that men who sniffed the tears of crying women produced less testosterone and found female faces less arousing.

Previous studies in animals such as mice and mole rats have shown that tears convey important chemical messages which are used to attract or repel others of the same species. There is good evidence here for an interesting means of higher-level chemical communication. These previous studies also incidentally show that “emotional” tears are chemically distinct from “eye-protecting” types of tears.

Scientific American’s “60 Second Science” podcast (link below) has a good audio overview of the study for those without the time to read the paper.

In press reports, Adam Anderson, a University of Toronto psychologist who was not involved with the study, posited that the results may imply that “tears have some influence on sexual selection, and that’s not something we associate with sadness.” He continued, “It could be a way of warding off unwanted advances.”

This study provides a new hypothesis for the evolution of crying in humans. (Now if only we could find some better reasons for laughter…)

The take-home message may be that guys should not take their dates out to weepy chick flicks, or alternatively that women reluctantly accepting “pity dates” should steer their suitors toward exactly these types of testosterone-damping films.