Information is a precise concept that can be defined mathematically, but its relationship to what we call "knowledge" is not always made clear. Furthermore, the concepts "entropy" and "information", while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.


Comments: 19 pages, 2 figures. To appear in Philosophical Transactions of the Royal Society A
Subjects: Adaptation and Self-Organizing Systems (nlin.AO); Information Theory (cs.IT); Biological Physics (physics.bio-ph); Quantitative Methods (q-bio.QM)
Cite as: arXiv:1601.06176 [nlin.AO] (or arXiv:1601.06176v1 [nlin.AO] for this version)
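Since the abstract turns on a precise, mathematical notion of information, a minimal sketch may help fix ideas. This is my own illustration of the standard Shannon entropy, not code from the paper:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = sum of -p * log2(p), in bits.

    Outcomes with zero probability contribute nothing (0 * log 0 := 0).
    """
    return sum(-p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty; a certain outcome, none.
print(shannon_entropy([0.5, 0.5]))                 # 1.0
print(shannon_entropy([1.0]))                      # 0.0
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
```

Prediction enters here because entropy measures exactly how uncertain we are before observing an outcome: the better we can predict, the fewer bits of entropy remain.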

Physicist Sean Carroll has a forthcoming book entitled The Big Picture: On the Origins of Life, Meaning, and the Universe Itself (Dutton, May 10, 2016) that will be of interest to many of our readers.

Prior to the holidays, Sean wrote a blog post containing the book's full table of contents, which will give everyone a stronger idea of what it covers. For convenience, I'll excerpt it below.

Springer recently announced the publication of the book Quantum Biological Information Theory by Ivan B. Djordjevic, in which I’m sure many readers here will have interest. I hope to have a review of it shortly after I’ve gotten a copy. Until then…

From the publisher’s website:

This book is a self-contained, tutorial-based introduction to quantum information theory and quantum biology. It serves as a single-source reference to the topic for researchers in bioengineering, communications engineering, electrical engineering, applied mathematics, biology, computer science, and physics. The book provides all the essential principles of the quantum biological information theory required to describe the quantum information transfer from DNA to proteins, the sources of genetic noise and genetic errors as well as their effects.

Integrates quantum information and quantum biology concepts;

Assumes only knowledge of basic concepts of vector algebra at undergraduate level;

Provides a thorough introduction to basic concepts of quantum information processing, quantum information theory, and quantum biology;

Includes in-depth discussion of the quantum biological channel modelling, quantum biological channel capacity calculation, quantum models of aging, quantum models of evolution, quantum models of tumor and cancer development, quantum modeling of bird navigation compass, quantum aspects of photosynthesis, quantum biological error correction.

In a lecture at Caltech, Brian Swingle reviews the idea that entanglement is the glue which holds spacetime together and shows how Einstein's equations plausibly emerge from this perspective. One ubiquitous feature of these dynamical equations is the formation of black holes, so he concludes by discussing some new ideas about the nature of spacetime inside a black hole.

Brian Swingle Colloquium at Caltech

From the Physics Research Conference 2015-2016
on Thursday, November 19, 2015 at 4:00 pm
at the California Institute of Technology, East Bridge 201 – Norman Bridge Laboratory of Physics, East

“What is economic growth? And why, historically, has it occurred in only a few places? Previous efforts to answer these questions have focused on institutions, geography, finances, and psychology. But according to MIT’s antidisciplinarian César Hidalgo, understanding the nature of economic growth demands transcending the social sciences and including the natural sciences of information, networks, and complexity. To understand the growth of economies, Hidalgo argues, we first need to understand the growth of order.

At first glance, the universe seems hostile to order. Thermodynamics dictates that over time, order–or information–will disappear. Whispers vanish in the wind just like the beauty of swirling cigarette smoke collapses into disorderly clouds. But thermodynamics also has loopholes that promote the growth of information in pockets. Our cities are pockets where information grows, but they are not all the same. For every Silicon Valley, Tokyo, and Paris, there are dozens of places with economies that accomplish little more than pulling rocks off the ground. So, why does the US economy outstrip Brazil’s, and Brazil’s that of Chad? Why did the technology corridor along Boston’s Route 128 languish while Silicon Valley blossomed? In each case, the key is how people, firms, and the networks they form make use of information.

Seen from Hidalgo’s vantage, economies become distributed computers, made of networks of people, and the problem of economic development becomes the problem of making these computers more powerful. By uncovering the mechanisms that enable the growth of information in nature and society, Why Information Grows lays bare the origins of physical order and economic growth. Situated at the nexus of information theory, physics, sociology, and economics, this book propounds a new theory of how economies can do, not just more, but more interesting things.”

John Baez, one of the organizers of the workshop, is also going through them and adding some interesting background and links on his Azimuth blog for those who are looking for additional details and depth.

A framework for determining the difference between the hard and soft sciences.

A recent post on one of the blogs at Discover Magazine had me thinking about the shape of science over time.

The article made me wonder about the divide between the ‘soft’ and ‘hard’ sciences, and how we might better define and delineate them. Perhaps, in a particular field, the greater the proliferation of “schools of thought,” the more likely it is to be a soft science? (Or, mathematically speaking, there’s an inverse relationship between how well supported a field is and the number of schools of thought it has.) I consider a school of thought to be a proposed hypothetical or theoretical structure meant to help advance the state of the art; adherents join one of several competing camps while evidence accumulates (or fails to) until one side carries the day.

Theorem: The greater the proliferation of “schools of thought,” the more likely something is to be a soft science.
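The conjectured inverse relationship could be written symbolically (my own notation, purely illustrative, not a rigorous claim):

```latex
\text{support}(F) \;\propto\; \frac{1}{\#\,\text{schools}(F)},
```

where \(F\) is a field of study, \(\text{support}(F)\) is how well evidentially supported the field is, and \(\#\,\text{schools}(F)\) counts its active schools of thought.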

Generally, in most of the hard sciences like physics, biology, or microbiology, there don’t seem to be any opposing or differing schools of thought, while in areas like psychology or philosophy they abound, often with long-running debates between schools and no hard data or evidence to allow one school to truly win out over another. Perhaps as the structure of a particular science becomes more sound, schools of thought become more difficult to establish?

For some of the hard sciences, it would seem that schools of thought only exist at the bleeding edge of the state-of-the-art where there isn’t yet enough evidence to swing the field one way or another to firmer ground.

Example: Evolutionary Biology

We might consider evolutionary biology, where definitive evidence in the fossil record is difficult to come by, so there’s room for the opposing ideas of gradualism and punctuated equilibrium to be individual schools. Outside of this, most of evolutionary theory is so firmly grounded that there are no other schools.

Example: Theoretical Physics

The relatively new field of string theory might be considered a school of thought, though there don’t seem to be many opposing schools at the moment. If it is a school, it surely exists, in part, because there isn’t yet any ability to validate it against predictions and current data. However, because of its strong supporting mathematical structure, I’ve yet to hear anyone use the concept of a school of thought to describe string theory, which sits in a community that seems to believe it a foregone conclusion that it, or something very close to it, represents reality. (For a counterpoint, see Lee Smolin’s The Trouble with Physics.)

Example: Mathematics

I can’t recall the concept of a school of thought ever being applied to mathematics, except in the case of the Pythagorean School, which historically is considered to have been almost as much a religion as a science. Because of its theoretical footing, I suspect there may never be competing schools: even in the case of problems like P vs. NP, individuals may have a gut reaction about which way things are leaning, but everyone ultimately knows it’s going to be one or the other (P = NP or P ≠ NP). Many mathematicians also know that it’s useful to try to prove a theorem during the day and then try to disprove it (or find a counterexample) by night, so even internally and individually they self-segregate against creating schools of thought right from the start.

Example: Religion

Looking at the far end of the spectrum: because there is no verifiable way to prove that God exists, there has been an efflorescence of religions of nearly every size and shape since the beginning of humankind. Might we then presume that this is the softest of the ‘sciences’?

What examples or counter examples can you think of?

Amanda Peet presented a lecture entitled "String Theory Legos for Black Holes" at the Perimeter Institute for Theoretical Physics.

Four decades ago, Stephen Hawking posed the black hole information paradox, a puzzle at the intersection of black holes and quantum theory that still challenges the imaginations of theoretical physicists today. Yesterday, Amanda Peet (University of Toronto) presented a lecture entitled “String Theory Legos for Black Holes” at the Perimeter Institute for Theoretical Physics. A quick overview/teaser trailer for the lecture follows, along with some additional information and the video of the lecture itself.

The “Information Paradox” with Amanda Peet (teaser trailer)

“Black holes are the ‘thought experiment’ par excellence, where the big three of physics – quantum mechanics, general relativity and thermodynamics – meet and fight it out, dragging in brash newcomers such as information theory and strings for support. Though a unification of gravity and quantum field theory still evades string theorists, many of the mathematical tools and ideas they have developed find applications elsewhere.

One of the most promising approaches to resolving the “information paradox” (the notion that nothing, not even information itself, survives beyond a black hole’s point-of-no-return event horizon) is string theory, a part of modern physics that has wiggled its way into the popular consciousness.

On May 6, 2015, Dr. Amanda Peet, a physicist at the University of Toronto, will describe how the string toolbox allows study of the extreme physics of black holes in new and fruitful ways. Dr. Peet will unpack that toolbox to reveal the versatility of strings and (mem)branes, and will explore the intriguing notion that the world may be a hologram.

Amanda Peet is an Associate Professor of Physics at the University of Toronto. She grew up in the South Pacific island nation of Aotearoa/New Zealand, and earned a B.Sc.(Hons) from the University of Canterbury in NZ and a Ph.D. from Stanford University in the USA. Her awards include a Radcliffe Fellowship from Harvard and an Alfred P. Sloan Foundation Research Fellowship. She was one of the string theorists interviewed in the three-part NOVA PBS TV documentary “The Elegant Universe”.

“My vision of life is that everything extends from replicators, which are in practice DNA molecules on this planet. The replicators reach out into the world to influence their own probability of being passed on. Mostly they don’t reach further than the individual body in which they sit, but that’s a matter of practice, not a matter of principle. The individual organism can be defined as that set of phenotypic products which have a single route of exit of the genes into the future. That’s not true of the cuckoo/reed warbler case, but it is true of ordinary animal bodies. So the organism, the individual organism, is a deeply salient unit. It’s a unit of selection in the sense that I call a “vehicle”. There are two kinds of unit of selection. The difference is a semantic one. They’re both units of selection, but one is the replicator, and what it does is get itself copied. So more and more copies of itself go into the world. The other kind of unit is the vehicle. It doesn’t get itself copied. What it does is work to copy the replicators which have come down to it through the generations, and which it’s going to pass on to future generations. So we have this individual/replicator dichotomy. They’re both units of selection, but in different senses. It’s important to understand that they are different senses.”

RICHARD DAWKINS is an evolutionary biologist; Emeritus Charles Simonyi Professor of the Public Understanding of Science, Oxford; Author, The Selfish Gene; The Extended Phenotype; Climbing Mount Improbable; The God Delusion; An Appetite For Wonder; and (forthcoming) A Brief Candle In The Dark.

Watch the entire video interview and read the transcript at Edge.org.

Web resources for participants in the NIMBioS Workshop on Information Theory and Entropy in Biological Systems.

Over the next few days, I’ll be maintaining a Storify story covering information related to and coming out of the Information Theory and Entropy Workshop being sponsored by NIMBioS at the University of Tennessee, Knoxville.

For those in attendance or participating by watching the live streaming video (or even watching the video after-the-fact), please feel free to use the official hashtag #entropyWS, and I’ll do my best to include your tweets, posts, and material into the story stream for future reference.

For journal articles and papers mentioned in/at the workshop, I encourage everyone to join the Mendeley.com group ITBio: Information Theory, Microbiology, Evolution, and Complexity and add them to the group’s list of papers. Think of it as a collaborative online journal club of sorts.

Those participating in the workshop are also encouraged to take a look at a growing collection of researchers and materials I maintain here. If you have materials or resources you’d like to contribute to the list, please send me an email, submit them via the suggestions/submission form, or add them in the comments section below.

Dr. Mike Miller has just opened up registration for the second course in the series. His courses are always clear, entertaining, and invigorating, and I highly recommend them to anyone who is interested in math, science, or engineering.

Dr. Mike Miller, who had previously announced a two quarter sequence of classes on Lie Groups at UCLA, has just opened up registration for the second course in the series. His courses are always clear, entertaining, and invigorating, and I highly recommend them to anyone who is interested in math, science, or engineering.

Prior to the first part of the course, I’d written some thoughts about the timbre and tempo of his lecture style and philosophy and commend those interested to take a peek. I also mentioned some additional resources for the course there as well. For those who missed the first portion, I’m happy to help fill you in and share some of my notes if necessary. The recommended minimum prerequisites for this class are linear algebra and some calculus.

Introduction to Lie Groups and Lie Algebras (Part 2)

Math X 450.7 / 3.00 units / Reg. # 251580W
Professor: Michael Miller, Ph.D.
Start Date: January 13, 2015
Location: UCLA, 5137 Math Sciences Building
Tuesday, 7-10pm
January 13 – March 24
11 meetings total. Class will not meet on one Tuesday, to be announced.

A Lie group is a differentiable manifold that is also a group for which the product and inverse maps are differentiable. A Lie algebra is a vector space endowed with a binary operation that is bilinear, alternating, and satisfies the so-called Jacobi identity. This course is the second in a 2-quarter sequence that offers an introductory survey of Lie groups, their associated Lie algebras, and their representations. Its focus is split between continuing last quarter’s study of matrix Lie groups and their representations and reconciling this theory with that for the more general manifold setting. Topics to be discussed include the Weyl group, complete reducibility, semisimple Lie algebras, root systems, and Cartan subalgebras. This is an advanced course, requiring a solid understanding of linear algebra, basic analysis, and, ideally, the material from the previous quarter. Internet access required to retrieve course materials.
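For reference, the bracket properties named in the course description (alternating, hence antisymmetric, plus the Jacobi identity) read, in standard notation:

```latex
[X, X] = 0 \quad (\text{hence } [X, Y] = -[Y, X]), \qquad
[X, [Y, Z]] + [Y, [Z, X]] + [Z, [X, Y]] = 0.
```

A familiar concrete example is the cross product on \(\mathbb{R}^3\), which satisfies both identities.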

Over the span of the coming week, I'll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.

Editor’s note: On 12/12/17 Storify announced they would be shutting down. As a result, I’m changing the embedded version of the original data served by Storify for an HTML copy which can be found below:

I’m giving a short 30-minute talk at a workshop on Biological and Bio-Inspired Information Theory at the Banff International Research Station. I’ll say more about the workshop later, but here’s my talk: * Biodiversity, entropy and thermodynamics. Most of the people at this workshop study neurobiology and cell signalling, not evolutionary game theory or…

I’m having a great time at a workshop on Biological and Bio-Inspired Information Theory in Banff, Canada. You can see videos of the talks online. There have been lots of good talks so far, but this one really blew my mind: * Naftali Tishby, Sensing and acting under information constraints—a principled approach to biology and…

John Harte is an ecologist who uses maximum entropy methods to predict the distribution, abundance and energy usage of species. Marc Harper uses information theory in bioinformatics and evolutionary game theory. Harper, Harte and I are organizing a workshop on entropy and information in biological systems, and I’m really excited about it!

John Harte, Marc Harper and I are running a workshop! Now you can apply here to attend: * Information and entropy in biological systems, National Institute for Mathematical and Biological Synthesis, Knoxville, Tennessee, Wednesday-Friday, 8-10 April 2015. Click the link, read the stuff and scroll down to “CLICK HERE” to apply.

There will be a 5-day workshop on Biological and Bio-Inspired Information Theory at BIRS from Sunday the 26th to Friday the 31st of October, 2014. It’s being organized by * Toby Berger (University of Virginia) * Andrew Eckford (York University) * Peter Thomas (Case Western Reserve University) BIRS is the Banff International Research Station,…

How does it feel to (co-)write a book and hold the finished product in your hands? About like this: Many, many thanks to my excellent co-authors, Tadashi Nakano and Tokuko Haraguchi, for their hard work; thanks to Cambridge for accepting this project and managing it well; and thanks to Satoshi Hiyama for writing a nice blurb.

You may have seen our PLOS ONE paper about tabletop molecular communication, which received loads of media coverage. One of the goals of this paper was to show that anyone can do experiments in molecular communication, without any wet labs or expensive apparatus.

[My comments posted to the original Facebook post follow below.]

I’m coming to this post a bit late as I’m playing a bit of catch up, but agree with it wholeheartedly.

In particular, applications to molecular biology and medicine have really begun to come to a heavy boil in just the past five years. This particular year appears to mark the biggest renaissance for the application of information theory to biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s “Symposium on Information Theory in Biology” at Gatlinburg, Tennessee in 1956.

Upcoming/recent conferences/workshops on information theory in biology include:

I’ll note in passing, for those interested, that Claude Shannon’s famous master’s thesis at MIT (in which he applied Boolean algebra to electric circuits, enabling the digital revolution) and his subsequent “A Mathematical Theory of Communication” were so revolutionary that nearly everyone forgets his MIT Ph.D. thesis, “An Algebra for Theoretical Genetics,” which presaged cybernetics and the current applications of information theory to microbiology, and which is probably as seminal as Sir R.A. Fisher’s applications of statistics to science in general and biology in particular.
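To give a flavor of what that master’s thesis did, here’s a toy sketch of my own (not Shannon’s notation): switches in series behave as Boolean AND, switches in parallel as Boolean OR, so simplifying a relay circuit becomes an exercise in algebra.

```python
def series(a, b):
    """Two switches in series conduct only if both are closed: AND."""
    return a and b

def parallel(a, b):
    """Two switches in parallel conduct if either is closed: OR."""
    return a or b

# The circuit (a AND b) OR (a AND NOT b) simplifies algebraically to just a,
# so the second switch b can be removed entirely.
def circuit(a, b):
    return parallel(series(a, b), series(a, not b))

# Exhaustively checking all switch settings confirms the simplification.
all_match = all(circuit(a, b) == a for a in (False, True) for b in (False, True))
print(all_match)  # True
```

The exhaustive check at the end is the circuit-design payoff: algebraic identities proved once let an engineer replace hardware with nothing.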

For those commenting on the post who were interested in a layman’s introduction to information theory, I recommend John Robinson Pierce’s An Introduction to Information Theory: Symbols, Signals and Noise (Dover has a very inexpensive edition.) After this, one should take a look at Claude Shannon’s original paper. (The University of Illinois Press printing includes an excellent overview by Warren Weaver along with the paper itself.) The mathematics in the paper really isn’t too technical, and most of it should be comprehensible by advanced high school students.

For those who don’t understand the concept of entropy, I HIGHLY recommend Arieh Ben-Naim’s book Entropy Demystified: The Second Law Reduced to Plain Common Sense with Seven Simulated Games. He tears the concept down into its most basic form in a way I haven’t seen others come remotely close to, and which even my mother can comprehend (with no mathematics at all). (I recommend this presentation even to those with Ph.D.’s in physics because it is so truly fundamental.)

For more advanced mathematicians, physicists, and engineers, Arieh Ben-Naim does a truly spectacular job of extending E.T. Jaynes’ work on information theory and statistical mechanics, arriving at a more coherent mathematical theory that conjoins the entropy of physics/statistical mechanics with that of Shannon’s information theory, in A Farewell to Entropy: Statistical Thermodynamics Based on Information.

For the advanced readers/researchers interested in more at the intersection of information theory and biology, I’ll also mention that I maintain a list of references, books, and journal articles in a Mendeley group entitled “ITBio: Information Theory, Microbiology, Evolution, and Complexity.”