What is Information? by Christoph Adami

What is Information? [1601.06176] by Christoph Adami (arxiv.org)
Information is a precise concept that can be defined mathematically, but its relationship to what we call "knowledge" is not always made clear. Furthermore, the concepts "entropy" and "information", while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.
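
For readers who want the formal backdrop to the distinction the abstract draws, the standard Shannon definitions are worth keeping at hand (this is a conventional textbook summary in my own notation, not a quotation from Adami’s paper). The entropy of a random variable X,

H(X) = -\sum_x p(x) \log_2 p(x),

measures how uncertain we are about X before any observation is made, while the information a second variable Y carries about X is the reduction of that uncertainty,

I(X;Y) = H(X) - H(X|Y) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}.

This is exactly the “information as prediction” reading: I(X;Y) measures how much better one can predict X after learning Y, and it vanishes only when the two variables are statistically independent.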


Comments: 19 pages, 2 figures. To appear in Philosophical Transactions of the Royal Society A
Subjects: Adaptation and Self-Organizing Systems (nlin.AO); Information Theory (cs.IT); Biological Physics (physics.bio-ph); Quantitative Methods (q-bio.QM)
Cite as: arXiv:1601.06176 [nlin.AO] (or arXiv:1601.06176v1 [nlin.AO] for this version)

From: Christoph Adami
[v1] Fri, 22 Jan 2016 21:35:44 GMT (151kb,D) [.pdf]

Source: Christoph Adami [1601.06176] What is Information? on arXiv


Donald Forsdyke Indicates the Concept of Information in Biology Predates Claude Shannon


When it was first published, I read Kevin Hartnett’s article and interview with Christoph Adami, The Information Theory of Life, in Quanta Magazine. I recently revisited it, read through the commentary, and stumbled upon an interesting quote relating to the history of information in biology:

Polymath Adami has ‘looked at so many fields of science’ and has correctly indicated the underlying importance of information theory, to which he has made important contributions. However, perhaps because the interview was concerned with the origin of life and was edited and condensed, many readers may get the impression that IT is only a few decades old. However, information ideas in biology can be traced back to at least 19th century sources. In the 1870s Ewald Hering in Prague and Samuel Butler in London laid the foundations. Butler’s work was later taken up by Richard Semon in Munich, whose writings inspired the young Erwin Schrodinger in the early decades of the 20th century. The emergence of his text – “What is Life” – from Dublin in the 1940s, inspired those who gave us DNA structure and the associated information concepts in “the classic period” of molecular biology. For more please see: Forsdyke, D. R. (2015) History of Psychiatry 26 (3), 270-287.

Donald Forsdyke, bioinformatician and theoretical biologist
in response to The Information Theory of Life in Quanta Magazine

These two historical references predate Claude Shannon’s mathematical formalization of information in A Mathematical Theory of Communication (The Bell System Technical Journal, 1948) and even Erwin Schrödinger’s lectures (1943) and subsequent book What is Life? (1944).

For those interested in reading more on this historical tidbit, I’ve dug up a copy of the primary Forsdyke reference which first appeared on arXiv (prior to its ultimate publication in History of Psychiatry [.pdf]):

🔖 [1406.1391] ‘A Vehicle of Symbols and Nothing More.’ George Romanes, Theory of Mind, Information, and Samuel Butler by Donald R. Forsdyke  [1]
Submitted on 4 Jun 2014 (v1), last revised 13 Nov 2014 (this version, v2)

Abstract: Today’s ‘theory of mind’ (ToM) concept is rooted in the distinction of nineteenth century philosopher William Clifford between ‘objects’ that can be directly perceived, and ‘ejects,’ such as the mind of another person, which are inferred from one’s subjective knowledge of one’s own mind. A founder, with Charles Darwin, of the discipline of comparative psychology, George Romanes considered the minds of animals as ejects, an idea that could be generalized to ‘society as eject’ and, ultimately, ‘the world as an eject’ – mind in the universe. Yet, Romanes and Clifford only vaguely connected mind with the abstraction we call ‘information,’ which needs ‘a vehicle of symbols’ – a material transporting medium. However, Samuel Butler was able to address, in informational terms depleted of theological trappings, both organic evolution and mind in the universe. This view harmonizes with insights arising from modern DNA research, the relative immortality of ‘selfish’ genes, and some startling recent developments in brain research.

Comments: Accepted for publication in History of Psychiatry. 31 pages including 3 footnotes. Based on a lecture given at Santa Clara University, February 28th 2014, at a Bannan Institute Symposium on ‘Science and Seeking: Rethinking the God Question in the Lab, Cosmos, and Classroom.’

The original arXiv article also referenced two lectures which are appended below:

[Original Draft of this was written on December 14, 2015.]

References

[1]
D. R. Forsdyke, “‘A vehicle of symbols and nothing more’. George Romanes, theory of mind, information, and Samuel Butler,” History of Psychiatry, vol. 26, no. 3, pp. 270–287, Aug. 2015 [Online]. Available: http://journals.sagepub.com/doi/abs/10.1177/0957154X14562755

The Information Theory of Life | Quanta Magazine

The Information Theory of Life by Kevin Hartnett (Quanta Magazine)
The Information Theory of Life: The polymath Christoph Adami is investigating life’s origins by reimagining living things as self-perpetuating information strings.


Information Theory is the New Central Discipline

Information Theory is the new central discipline. by Nassim Nicholas Taleb (facebook.com)

INFORMATION THEORY is the new central discipline. This graph was from 20y ago in the seminal book Cover and Thomas, as the field was starting to be defined. Now Information Theory has been expanded to swallow even more fields.

Born in, of all disciplines, Electrical Engineering, the field has been progressively infiltrating probability theory, computer science, statistical physics, data science, gambling theory, ruin problems, complexity, even how one deals with knowledge, epistemology. It defines noise/signal, order/disorder, etc. It studies cellular automata. You can use it in theology (FREE WILL & algorithmic complexity). As I said, it is the MOTHER discipline.

I am certain much of Medicine will naturally grow to be a subset of it, both operationally, and in studying how the human body works: the latter is an information machine. Same with linguistics. Same with political “science”, same with… everything.

I am saying this because I figured out what the long 5th volume of the INCERTO will be. Cannot say now with any precision but it has to do with a variant of entropy as the core natural generator of Antifragility.

[Revised to explain that it is not *replacing* other disciplines, just infiltrating them as the point was initially misunderstood…]

Nassim Nicholas Taleb via Facebook

[My comments posted to the original Facebook post follow below.]

I’m coming to this post a bit late as I’m playing a bit of catch-up, but I agree with it wholeheartedly.

In particular, applications to molecular biology and medicine have really begun to come to a heavy boil in just the past five years. This particular year appears to mark the start of the biggest renaissance in the application of information theory to biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s “Symposium on Information Theory in Biology” at Gatlinburg, Tennessee, in 1956.

Upcoming/recent conferences/workshops on information theory in biology include:

At the beginning of September, Christoph Adami posted an awesome and very sound paper on arXiv entitled “Information-theoretic considerations concerning the origin of life,” which truly promises to turn the science of the origin of life on its head.

I’ll note in passing, for those interested, that Claude Shannon’s famous master’s thesis at MIT (in which he applied Boolean algebra to electric circuits, enabling the digital revolution) and his subsequent “A Mathematical Theory of Communication” were so revolutionary that nearly everyone forgets his MIT Ph.D. thesis, “An Algebra for Theoretical Genetics,” which presaged the fields of cybernetics and the current applications of information theory to microbiology, and which is probably as seminal as Sir R. A. Fisher’s applications of statistics to science in general and biology in particular.

For those commenting on the post who were interested in a layman’s introduction to information theory, I recommend John Robinson Pierce’s An Introduction to Information Theory: Symbols, Signals and Noise (Dover has a very inexpensive edition). After this, one should take a look at Claude Shannon’s original paper. (The University of Illinois Press printing includes an excellent overview by Warren Weaver along with the paper itself.) The mathematics in the paper really isn’t too technical, and most of it should be comprehensible to advanced high school students.

For those who don’t understand the concept of entropy, I HIGHLY recommend Arieh Ben-Naim’s book Entropy Demystified: The Second Law Reduced to Plain Common Sense with Seven Simulated Games. He really does tear the concept down into its most basic form in a way I haven’t seen others come remotely close to, and which even my mother can comprehend (with no mathematics at all). (I recommend this presentation even to those with Ph.D.’s in physics because it is so truly fundamental.)

For the more advanced mathematicians, physicists, and engineers, Arieh Ben-Naim does a truly spectacular job of extending E. T. Jaynes’ work on information theory and statistical mechanics, arriving at a more coherent mathematical theory that conjoins the entropy of physics/statistical mechanics with that of Shannon’s information theory, in A Farewell to Entropy: Statistical Thermodynamics Based on Information.
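
Since Jaynes’ bridge between the two entropies comes up here, it may help to spell out the purely formal part of it (a standard identity, not anything specific to Ben-Naim’s development). The Gibbs entropy of statistical mechanics,

S = -k_B \sum_i p_i \ln p_i,

is, apart from Boltzmann’s constant k_B and the choice of natural versus base-2 logarithm, the same functional as Shannon’s H = -\sum_i p_i \log_2 p_i; for a uniform distribution over \Omega equally likely microstates it reduces to Boltzmann’s S = k_B \ln \Omega. Jaynes’ maximum-entropy argument then treats the equilibrium distribution as the least-biased inference consistent with the measured macroscopic constraints, which is what lets the thermodynamic and informational entropies be read as one quantity expressed in different units.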

For the advanced readers/researchers interested in more at the intersection of information theory and biology, I’ll also mention that I maintain a list of references, books, and journal articles in a Mendeley group entitled “ITBio: Information Theory, Microbiology, Evolution, and Complexity.”

Venn diagram of how information theory relates to other fields. Figure 1.1 [page 2] from Thomas M. Cover and Joy A. Thomas’s textbook Elements of Information Theory, Second Edition (John Wiley & Sons, Inc., 2006) [First Edition, 1991].

Christoph Adami: Finding Life We Can’t Imagine | TEDx


–via TED.com
Adami’s work is along similar lines to some of my own research. This short video gives an intriguing look into some of the basics of how to define life so that one can recognize it when one sees it.
