What is Information? by Christoph Adami

Bookmarked What is Information? [1601.06176] (arxiv.org)
Information is a precise concept that can be defined mathematically, but its relationship to what we call "knowledge" is not always made clear. Furthermore, the concepts "entropy" and "information", while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.

A proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.
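
To make that distinction concrete, here is a quick toy sketch in Python (my own illustration, not drawn from Adami's paper): the entropy of a variable measures our uncertainty about it, while the information one variable carries about another is the reduction in that uncertainty, i.e., how much better it lets us predict.

```python
# Toy sketch (my illustration, not from Adami's paper): entropy as uncertainty,
# information as the reduction in uncertainty, i.e., improved prediction.
import math
from collections import Counter

def entropy(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), for a joint distribution
    given as a dict {(x, y): probability}."""
    px, py = Counter(), Counter()
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

# Example: a noisy measurement Y that matches a fair coin X 90% of the time.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
print(round(mutual_information(joint), 3))  # ~0.531 bits: what Y tells us about X
```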

Comments: 19 pages, 2 figures. To appear in Philosophical Transactions of the Royal Society A
Subjects: Adaptation and Self-Organizing Systems (nlin.AO); Information Theory (cs.IT); Biological Physics (physics.bio-ph); Quantitative Methods (q-bio.QM)
Cite as: arXiv:1601.06176 [nlin.AO] (or arXiv:1601.06176v1 [nlin.AO] for this version)

From: Christoph Adami
[v1] Fri, 22 Jan 2016 21:35:44 GMT (151kb,D) [.pdf]

Source: Christoph Adami [1601.06176] What is Information? on arXiv


Forthcoming ITBio-related book from Sean Carroll: “The Big Picture: On the Origins of Life, Meaning, and the Universe Itself”

Physicist Sean Carroll has a forthcoming book entitled The Big Picture: On the Origins of Life, Meaning, and the Universe Itself (Dutton, May 10, 2016) that will be of interest to many of our readers.

In catching up on blogs/reading from the holidays, I’ve noticed that physicist Sean Carroll has a forthcoming book entitled The Big Picture: On the Origins of Life, Meaning, and the Universe Itself (Dutton, May 10, 2016) that will be of interest to many of our readers. One can already pre-order the book via Amazon.

Prior to the holidays, Sean wrote a blog post containing the full table of contents, which will give everyone a better idea of what the book covers. For convenience, I’ll excerpt it below.

I’ll post a review as soon as a copy arrives, but it looks like a strong new entry in the category of popular science books on information theory, biology and complexity as well as potentially the areas of evolution, the origin of life, and physics in general.

As a side bonus, for those reading this today (1/15/16), I’ll note that Carroll’s 12-part lecture series from The Great Courses, The Higgs Boson and Beyond (The Teaching Company, February 2015), is 80% off.

The Big Picture


THE BIG PICTURE: ON THE ORIGINS OF LIFE, MEANING, AND THE UNIVERSE ITSELF

0. Prologue

* Part One: Cosmos

  • 1. The Fundamental Nature of Reality
  • 2. Poetic Naturalism
  • 3. The World Moves By Itself
  • 4. What Determines What Will Happen Next?
  • 5. Reasons Why
  • 6. Our Universe
  • 7. Time’s Arrow
  • 8. Memories and Causes

* Part Two: Understanding

  • 9. Learning About the World
  • 10. Updating Our Knowledge
  • 11. Is It Okay to Doubt Everything?
  • 12. Reality Emerges
  • 13. What Exists, and What Is Illusion?
  • 14. Planets of Belief
  • 15. Accepting Uncertainty
  • 16. What Can We Know About the Universe Without Looking at It?
  • 17. Who Am I?
  • 18. Abducting God

* Part Three: Essence

  • 19. How Much We Know
  • 20. The Quantum Realm
  • 21. Interpreting Quantum Mechanics
  • 22. The Core Theory
  • 23. The Stuff of Which We Are Made
  • 24. The Effective Theory of the Everyday World
  • 25. Why Does the Universe Exist?
  • 26. Body and Soul
  • 27. Death Is the End

* Part Four: Complexity

  • 28. The Universe in a Cup of Coffee
  • 29. Light and Life
  • 30. Funneling Energy
  • 31. Spontaneous Organization
  • 32. The Origin and Purpose of Life
  • 33. Evolution’s Bootstraps
  • 34. Searching Through the Landscape
  • 35. Emergent Purpose
  • 36. Are We the Point?

* Part Five: Thinking

  • 37. Crawling Into Consciousness
  • 38. The Babbling Brain
  • 39. What Thinks?
  • 40. The Hard Problem
  • 41. Zombies and Stories
  • 42. Are Photons Conscious?
  • 43. What Acts on What?
  • 44. Freedom to Choose

* Part Six: Caring

  • 45. Three Billion Heartbeats
  • 46. What Is and What Ought to Be
  • 47. Rules and Consequences
  • 48. Constructing Goodness
  • 49. Listening to the World
  • 50. Existential Therapy
  • Appendix: The Equation Underlying You and Me
  • Acknowledgments
  • Further Reading
  • References
  • Index

Source: Sean Carroll | The Big Picture: Table of Contents


Quantum Biological Information Theory by Ivan B. Djordjevic | Springer

Bookmarked Quantum Biological Information Theory (Springer, 2015)

Springer recently announced the publication of the book Quantum Biological Information Theory by Ivan B. Djordjevic, in which I’m sure many readers here will have interest. I hope to have a review of it shortly after I’ve gotten a copy. Until then…

From the publisher’s website:

This book is a self-contained, tutorial-based introduction to quantum information theory and quantum biology. It serves as a single-source reference to the topic for researchers in bioengineering, communications engineering, electrical engineering, applied mathematics, biology, computer science, and physics. The book provides all the essential principles of the quantum biological information theory required to describe the quantum information transfer from DNA to proteins, the sources of genetic noise and genetic errors as well as their effects.

  • Integrates quantum information and quantum biology concepts;
  • Assumes only knowledge of basic concepts of vector algebra at undergraduate level;
  • Provides a thorough introduction to basic concepts of quantum information processing, quantum information theory, and quantum biology;
  • Includes in-depth discussion of the quantum biological channel modelling, quantum biological channel capacity calculation, quantum models of aging, quantum models of evolution, quantum models on tumor and cancer development, quantum modeling of bird navigation compass, quantum aspects of photosynthesis, quantum biological error correction.
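
For a concrete taste of the machinery such a treatment presupposes, here is a minimal, generic sketch (my own, not drawn from the book): the von Neumann entropy of a density matrix, the quantum analogue of Shannon's entropy.

```python
# Generic sketch (mine, not from the book): von Neumann entropy
# S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho.
import numpy as np

def von_neumann_entropy(rho):
    """Entropy in bits of a density matrix rho (Hermitian, trace 1)."""
    eigenvalues = np.linalg.eigvalsh(rho)
    eigenvalues = eigenvalues[eigenvalues > 1e-12]  # drop numerical zeros
    return float(-np.sum(eigenvalues * np.log2(eigenvalues)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])    # |0><0|: no uncertainty
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit
print(von_neumann_entropy(pure))   # 0.0 bits
print(von_neumann_entropy(mixed))  # 1.0 bit
```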

Source: Quantum Biological Information Theory | Ivan B. Djordjevic | Springer

I’ll note that it looks like it also assumes some reasonable facility with quantum mechanics in addition to the material listed above.

Springer also has a downloadable copy of the preface and a relatively extensive table of contents for those looking for a preview. Dr. Djordjevic has been added to the ever growing list of researchers doing work at the intersection of information theory and biology.


Einstein’s Equations From Entanglement

In a lecture at Caltech, Brian Swingle reviews the idea that entanglement is the glue which holds spacetime together and shows how Einstein's equations plausibly emerge from this perspective. One ubiquitous feature of these dynamical equations is the formation of black holes, so he concludes by discussing some new ideas about the nature of spacetime inside a black hole.

Brian Swingle Colloquium at Caltech

From the Physics Research Conference 2015-2016
on Thursday, November 19, 2015 at 4:00 pm
at the California Institute of Technology, East Bridge 201 – Norman Bridge Laboratory of Physics, East

All talks are intended for a broad audience, and everyone is encouraged to attend. A list of future conferences can be found here.
Sponsored by Division of Physics, Mathematics and Astronomy

In recent years we have learned that the physics of quantum information plays a crucial role in the emergence of spacetime from microscopic degrees of freedom.

I will review the idea that entanglement is the glue which holds spacetime together and show how Einstein’s equations plausibly emerge from this perspective. One ubiquitous feature of these dynamical equations is the formation of black holes, so I will conclude by discussing some new ideas about the nature of spacetime inside a black hole.

Brian Swingle, postdoctoral fellow at the Stanford Institute for Theoretical Physics and physicist focusing on quantum matter, quantum information, and quantum gravity
in Physics Research Conference | Caltech

Click here for full screen presentation.


Why Information Grows: The Evolution of Order, from Atoms to Economies

I just ordered a copy of Why Information Grows: The Evolution of Order, from Atoms to Economies by Cesar Hidalgo. Although it seems more focused on economics, the base theory seems to fit right into some similar thoughts I’ve long held about biology.

Why Information Grows: The Evolution of Order, from Atoms to Economies by Cesar Hidalgo


From the book description:

“What is economic growth? And why, historically, has it occurred in only a few places? Previous efforts to answer these questions have focused on institutions, geography, finances, and psychology. But according to MIT’s antidisciplinarian César Hidalgo, understanding the nature of economic growth demands transcending the social sciences and including the natural sciences of information, networks, and complexity. To understand the growth of economies, Hidalgo argues, we first need to understand the growth of order.

At first glance, the universe seems hostile to order. Thermodynamics dictates that over time, order–or information–will disappear. Whispers vanish in the wind just like the beauty of swirling cigarette smoke collapses into disorderly clouds. But thermodynamics also has loopholes that promote the growth of information in pockets. Our cities are pockets where information grows, but they are not all the same. For every Silicon Valley, Tokyo, and Paris, there are dozens of places with economies that accomplish little more than pulling rocks off the ground. So, why does the US economy outstrip Brazil’s, and Brazil’s that of Chad? Why did the technology corridor along Boston’s Route 128 languish while Silicon Valley blossomed? In each case, the key is how people, firms, and the networks they form make use of information.

Seen from Hidalgo’s vantage, economies become distributed computers, made of networks of people, and the problem of economic development becomes the problem of making these computers more powerful. By uncovering the mechanisms that enable the growth of information in nature and society, Why Information Grows lays bare the origins of physical order and economic growth. Situated at the nexus of information theory, physics, sociology, and economics, this book propounds a new theory of how economies can do, not just more, but more interesting things.”


Videos from the NIMBioS Workshop on Information and Entropy in Biological Systems

Videos from the NIMBioS workshop on Information and Entropy in Biological Systems from April 8-10, 2015 are slowly starting to appear on YouTube.

Videos from the April 8-10, 2015, NIMBioS workshop on Information and Entropy in Biological Systems are slowly starting to appear on YouTube.

John Baez, one of the organizers of the workshop, is also going through them and adding some interesting background and links on his Azimuth blog for those who are looking for additional details and depth.

Additional resources from the Workshop:



Schools of Thought in the Hard and Soft Sciences

A framework for determining the difference between the hard and soft sciences.

A recent post on one of the blogs at Discover Magazine had me thinking about the shape of science over time.

Neuroscientists don’t seem to disagree on the big issues. Why are there no big ideas in neuroscience?

Neuroskeptic, Where Are The Big Ideas in Neuroscience? (Part 1)

The article made me wonder about the divide between the ‘soft’ and ‘hard’ sciences, and how we might better define and delineate them. Perhaps in a particular field, the greater the proliferation of “schools of thought,” the more likely something is to be a soft science? (Or, mathematically speaking, there’s an inverse relationship in a field between how well supported it is and the number of schools of thought it has; see the toy sketch below the figure.) I consider a school of thought to be a hypothetical/theoretical structure proposed to help advance the state of the art; adherents join one of several competing camps while evidence is built up (or not) until one side carries the day.

Firmness of Science vs. # of Schools of Thought
Simple linear approximation of the relationship, though honestly something more like y = 1/x, which is asymptotic to both axes, is far more realistic.
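
Here is a toy sketch of that hypothesized relationship; all numbers are invented purely for illustration.

```python
# Toy sketch of the hypothesized inverse relationship pictured above:
# roughly, schools_of_thought ~ k / firmness.  All numbers are invented
# purely for illustration.
K = 10.0  # arbitrary scaling constant

def schools_of_thought(firmness):
    """Hypothesized number of competing schools given how well-supported
    (firm) a field is, on an arbitrary 1-10 scale."""
    return K / firmness

for field, firmness in [("religion/philosophy", 1), ("psychology", 2),
                        ("evolutionary biology", 8), ("mathematics", 10)]:
    print(f"{field:22s} firmness={firmness:2d}  schools ~ {schools_of_thought(firmness):.1f}")
```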

Theorem: The greater the proliferation of “schools of thought,” the more likely something is to be a soft science.

Generally, in most of the hard sciences like physics, biology, or microbiology, there don’t seem to be any opposing or differing schools of thought, while in areas like psychology or philosophy they abound, often with long-running debates between schools and without any hard data or evidence to allow one school to win out over another. Perhaps as the structure of a particular science becomes more sound, the concept of a school of thought becomes more difficult to establish?

For some of the hard sciences, it would seem that schools of thought only exist at the bleeding edge of the state-of-the-art where there isn’t yet enough evidence to swing the field one way or another to firmer ground.

Example: Evolutionary Biology

We might consider the area of evolutionary biology, in which definitive evidence in the fossil record is difficult to come by, so there’s room for the opposing ideas of gradualism and punctuated equilibrium to exist as individual schools. Outside of this, most of evolutionary theory is so firmly grounded that there aren’t other schools.

Example: Theoretical Physics

The relatively new field of string theory might be considered a school of thought, though there don’t seem to be a lot of opposing schools at the moment. If it does constitute one, such a school surely exists, in part, because there isn’t yet the ability to validate it with predictions and current data. However, because of the strong mathematical supporting structure, I’ve yet to hear anyone use the concept of a school of thought to describe string theory, which sits in a community that seems to believe it’s a foregone conclusion that it, or something very close to it, represents reality. (Though for counterpoint, see Lee Smolin’s The Trouble with Physics.)

Example: Mathematics

I can’t recall the concept of a school of thought ever being applied to mathematics, except in the case of the Pythagorean school, which historically is considered to have been almost as much a religion as a science. Because of its theoretical footings, I suppose there may never be competing schools, for even in the case of problems like P vs. NP, individuals may have a gut reaction about which way things are leaning, but everyone ultimately knows it’s going to be one or the other (P = NP or P ≠ NP). Many mathematicians also know that it’s useful to try to prove a theorem during the day and then try to disprove it (or find a counterexample) by night, so even internally and individually they’re self-segregating against creating schools of thought right from the start.

Example: Religion

Looking at the furthest end of the other side of the spectrum, because there is no verifiable way to prove that God exists, there has been an efflorescence of religions of nearly every size and shape since the beginning of humankind. Might we then presume that this is the softest of the ‘sciences’?

What examples or counter examples can you think of?


String Theory, Black Holes, and Information

Amanda Peet presented a lecture entitled “String Theory Legos for Black Holes” at the Perimeter Institute for Theoretical Physics.

Four decades ago, Stephen Hawking posed the black hole information paradox, a puzzle at the intersection of black holes and quantum theory. It still challenges the imaginations of theoretical physicists today. Yesterday, Amanda Peet (University of Toronto) presented a lecture entitled “String Theory Legos for Black Holes” at the Perimeter Institute for Theoretical Physics. A quick overview/teaser trailer for the lecture follows, along with some additional information and the video of the lecture itself.

The “Information Paradox” with Amanda Peet (teaser trailer)

“Black holes are the ‘thought experiment’ par excellence, where the big three of physics – quantum mechanics, general relativity and thermodynamics – meet and fight it out, dragging in brash newcomers such as information theory and strings for support. Though a unification of gravity and quantum field theory still evades string theorists, many of the mathematical tools and ideas they have developed find applications elsewhere.

One of the most promising approaches to resolving the “information paradox” (the notion that nothing, not even information itself, survives beyond a black hole’s point-of-no-return event horizon) is string theory, a part of modern physics that has wiggled its way into the popular consciousness.

On May 6, 2015, Dr. Amanda Peet, a physicist at the University of Toronto, will describe how the string toolbox allows study of the extreme physics of black holes in new and fruitful ways. Dr. Peet will unpack that toolbox to reveal the versatility of strings and (mem)branes, and will explore the intriguing notion that the world may be a hologram.

Amanda Peet is an Associate Professor of Physics at the University of Toronto. She grew up in the South Pacific island nation of Aotearoa/New Zealand, and earned a B.Sc.(Hons) from the University of Canterbury in NZ and a Ph.D. from Stanford University in the USA. Her awards include a Radcliffe Fellowship from Harvard and an Alfred P. Sloan Foundation Research Fellowship. She was one of the string theorists interviewed in the three-part NOVA PBS TV documentary “Elegant Universe”.

Web site: http://ap.io/home/.

Dr. Amanda Peet’s Lecture “String Theory Legos for Black Holes”


Richard Dawkins Interview: This Is My Vision Of “Life” | Edge.org

Bookmarked This Is My Vision Of "Life" by John Brockman (edge.org)
Edge.org’s interview with Richard Dawkins.

Richard Dawkins [4.30.15]

“My vision of life is that everything extends from replicators, which are in practice DNA molecules on this planet. The replicators reach out into the world to influence their own probability of being passed on. Mostly they don’t reach further than the individual body in which they sit, but that’s a matter of practice, not a matter of principle. The individual organism can be defined as that set of phenotypic products which have a single route of exit of the genes into the future. That’s not true of the cuckoo/reed warbler case, but it is true of ordinary animal bodies. So the organism, the individual organism, is a deeply salient unit. It’s a unit of selection in the sense that I call a “vehicle”.  There are two kinds of unit of selection. The difference is a semantic one. They’re both units of selection, but one is the replicator, and what it does is get itself copied. So more and more copies of itself go into the world. The other kind of unit is the vehicle. It doesn’t get itself copied. What it does is work to copy the replicators which have come down to it through the generations, and which it’s going to pass on to future generations. So we have this individual replicator dichotomy. They’re both units of selection, but in different senses. It’s important to understand that they are different senses.”

Richard Dawkins

RICHARD DAWKINS is an evolutionary biologist; Emeritus Charles Simonyi Professor of the Public Understanding of Science, Oxford; Author, The Selfish Gene; The Extended Phenotype; Climbing Mount Improbable; The God Delusion; An Appetite For Wonder; and (forthcoming) A Brief Candle In The Dark.

Watch the entire video interview and read the transcript at Edge.org.


NIMBioS Workshop: Information Theory and Entropy in Biological Systems

Web resources for participants in the NIMBioS Workshop on Information Theory and Entropy in Biological Systems.

Over the next few days, I’ll be maintaining a Storify story covering information related to and coming out of the Information Theory and Entropy Workshop being sponsored by NIMBioS at the University of Tennessee, Knoxville.

For those in attendance or participating by watching the live streaming video (or even watching the video after-the-fact), please feel free to use the official hashtag #entropyWS, and I’ll do my best to include your tweets, posts, and material into the story stream for future reference.

For journal articles and papers mentioned in/at the workshop, I encourage everyone to join the Mendeley.com group ITBio: Information Theory, Microbiology, Evolution, and Complexity and add them to the group’s list of papers. Think of it as a collaborative online journal club of sorts.

Those participating in the workshop are also encouraged to take a look at a growing collection of researchers and materials I maintain here. If you have materials or resources you’d like to contribute to the list, please send me an email, submit them via the suggestions/submission form, or include them in the comments section below.

Resources for Information Theory and Biology

RSS Feed for BoffoSocko posts tagged with #ITBio



Free E-Book: Neural Networks and Deep Learning by Michael Nielsen

Bookmarked Neural networks and deep learning (neuralnetworksanddeeplearning.com)

Michael A. Nielsen, the author of one of our favorite books on Quantum Computation and Quantum Information, is writing a new book entitled Neural Networks and Deep Learning. He’s been releasing portions of it for free on the internet in draft form every two or three months since 2013. He’s also maintaining an open code repository for the book on GitHub.
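
As a taste of the book’s subject matter, here is a minimal sketch (my own illustration, not code from Nielsen’s book or his repository) of a tiny feedforward network computing a forward pass with sigmoid activations.

```python
# Minimal sketch (mine, not from Nielsen's book or repository): a tiny
# feedforward network computing a forward pass with sigmoid activations.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, biases):
    """Propagate input x through each layer: a <- sigmoid(W a + b)."""
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    return a

# 3 inputs -> 4 hidden units -> 1 output, with random placeholder weights
weights = [rng.normal(size=(4, 3)), rng.normal(size=(1, 4))]
biases = [rng.normal(size=4), rng.normal(size=1)]
print(forward(np.array([0.5, -0.2, 0.1]), weights, biases))
```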

Michael A. Nielsen

Introduction to Lie Groups and Lie Algebras (Part 2) | UCLA Extension

Dr. Mike Miller has just opened up registration for the second course in the series. His courses are always clear, entertaining, and invigorating, and I highly recommend them to anyone who is interested in math, science, or engineering.

Dr. Mike Miller, who had previously announced a two quarter sequence of classes on Lie Groups at UCLA, has just opened up registration for the second course in the series. His courses are always clear, entertaining, and invigorating, and I highly recommend them to anyone who is interested in math, science, or engineering.

Philosophy is written in this grand book, the universe which stands continually open to our gaze. But the book cannot be understood unless one first learns to comprehend the language and read the letters in which it is composed. It is written in the language of mathematics, and its characters are triangles, circles and other geometric figures without which it is humanly impossible to understand a single word of it; without these, one wanders about in a dark labyrinth.

Galileo Galilei (1564–1642) in Il saggiatore (The Assayer)

Prior to the first part of the course, I’d written some thoughts about the timbre and tempo of his lecture style and philosophy and commend those interested to take a peek. I also mentioned some additional resources for the course there as well.  For those who missed the first portion, I’m happy to help fill you in and share some of my notes if necessary. The recommended minimum prerequisites for this class are linear algebra and some calculus.


Introduction to Lie Groups and Lie Algebras (Part 2)

Math X 450.7 / 3.00 units / Reg. # 251580W
Professor: Michael Miller, Ph.D.
Start Date: January 13, 2015
Location: UCLA, 5137 Math Sciences Building
Tuesday, 7-10pm
January 13 – March 24
11 meetings total
Class will not meet on one Tuesday to be announced.

Register here: https://www.uclaextension.edu/Pages/Course.aspx?reg=251580

Course Description

A Lie group is a differentiable manifold that is also a group for which the product and inverse maps are differentiable. A Lie algebra is a vector space endowed with a binary operation that is bilinear, alternating, and satisfies the so-called Jacobi identity. This course is the second in a 2-quarter sequence that offers an introductory survey of Lie groups, their associated Lie algebras, and their representations. Its focus is split between continuing last quarter’s study of matrix Lie groups and their representations and reconciling this theory with that for the more general manifold setting. Topics to be discussed include the Weyl group, complete reducibility, semisimple Lie algebras, root systems, and Cartan subalgebras. This is an advanced course, requiring a solid understanding of linear algebra, basic analysis, and, ideally, the material from the previous quarter. Internet access required to retrieve course materials.
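
For anyone who wants a concrete handle on the bracket before (or after) the lectures, here is a small numerical sketch (my own illustration, not course material): on square matrices the commutator [X, Y] = XY - YX is the canonical Lie bracket, and one can spot-check that it is alternating and satisfies the Jacobi identity.

```python
# Small numerical sketch (my illustration, not course material): the matrix
# commutator [X, Y] = XY - YX is bilinear, alternating, and satisfies the
# Jacobi identity; below we spot-check the latter two for random matrices.
import numpy as np

rng = np.random.default_rng(42)

def bracket(X, Y):
    return X @ Y - Y @ X

X, Y, Z = (rng.normal(size=(3, 3)) for _ in range(3))

print(np.allclose(bracket(X, X), 0))  # alternating: [X, X] = 0

# Jacobi identity: [X,[Y,Z]] + [Y,[Z,X]] + [Z,[X,Y]] = 0
jacobi = (bracket(X, bracket(Y, Z))
          + bracket(Y, bracket(Z, X))
          + bracket(Z, bracket(X, Y)))
print(np.allclose(jacobi, 0))         # True
```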

Recommended Textbook

Hall, Brian. Lie Groups, Lie Algebras, & Representations (Springer, 2004) ISBN: 9781441923134


A photograph of Sophus Lie's very full and bushy beard.
“I wouldn’t lie to you. This is Sophus’s beard.”



BIRS Workshop on Biological and Bio-Inspired Information Theory | Storify Stream

Over the span of the coming week, I'll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.

Over the span of the coming week, I’ll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.

Editor’s note: On 12/12/17 Storify announced they would be shutting down. As a result, I’m replacing the embedded version of the original data served by Storify with an HTML copy, which can be found below:

BIRS: Biological and Bio-Inspired Information Theory

A 5 Day workshop on Biology and Information Theory hosted by the Banff International Research Station

  1. Wishing I was at the Gene Regulation and Information Theory meeting starting tomorrow  http://bit.ly/XnHRZs  #ITBio
  2. Mathematical and Statistical Models for Genetic Coding starts today.  http://www.am.hs-mannheim.de/genetic_code_2013.php?id=1  @andreweckford might borrow attendees for BIRS
  3. Mathematical Foundations for Information Theory in Diffusion-Based Molecular Communications  http://bit.ly/1aTVR2c  #ITBio
  4. Bill Bialek giving plenary talk “Information flow & order in real biological networks” at Feb 2014 workshop  http://mnd.ly/19LQH8f  #ITBio
  5. CECAM Workshop: “Entropy in Biomolecular Systems” starts May 14 in Vienna. http://t.co/F4Kn0ICIaT #ITBio http://t.co/Ty8dEIXQUT
  6. Last RT: wonder what the weather is going to be like at the end of October for my @BIRS_Math workshop
  7. @JoVanEvery I’m organizing a workshop in Banff in October … hopefully this isn’t a sign of weather to come!
  8. Banff takes its name from the town of Banff, Scotland, not to be confused with Bamff, also Scotland.
  9. Good morning from beautiful Banff. How can you not love the mountains? http://t.co/mxYBNz7yzl
  10. “Not an obvious connection between utility and information, just as there is no obvious connection between energy and entropy” @BIRS_Math
  11. Last RT: a lot of discussion of my signal transduction work with Peter Thomas.
  12. Live now: Nicolo Michelusi of @USCViterbi on Stochastic Model for Electron Transfer in Bacterial Cables  http://www.birs.ca/live  #ITBio
  13. Nicolo Michelusi (University of Southern California), A Stochastic Model for Electron Transfer in Bacterial Cables  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271450-Michelusi.mp4 
  14. Listening to the always awesome @cnmirose talk about the ultimate limits of molecular communication.
  15. “Timing is fundamental … subsumes time-varying concentration channel” @cnmirose @BIRS_Math
  16. Standard opening quote of these talks: “I’m not a biologist, but …” @BIRS_Math
  17. Stefan Moser (ETH Zurich), Capacity Bounds of the Memoryless AIGN Channel – a Toy-Model for Molecular Communicat…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271610-Moser.mp4 
  18. Weisi Guo (University of Warwick), Communication Envelopes for Molecular Diffusion and Electromagnetic Wave Propag…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271643-Guo.mp4 
  19. .@ChrisAldrich @andreweckford @Storify @BIRS_Math Sounds like a fascinating workshop on bioinformation theory in Banff.
  20. Toby Berger, winner of the 2002 Shannon award, speaking right now. @BIRS_Math
  21. Naftali Tishby (Hebrew University of Jerusalem), Sensing and acting under information constraints – a principled a…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281032-Tishby.mp4 
  22. “…places such as BIRS and the Banff Centre exist to facilitate the exchange and pursuit of knowledge.” S. Sundaram  http://www.birs.ca/testimonials/#testimonial-1454 
  23. We’re going for a hike tomorrow. Many thanks to Lukas at the @ParksCanada info centre in Banff for helpful advice! @BIRS_Math
  24. Alexander Dimitrov (Washington State University), Invariant signal processing in auditory biological systems  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281416-Dimitrov.mp4 
  25. Joel Zylberberg (University of Washington), Communicating with noisy signals: lessons learned from the mammalian v…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281450-Zylberberg.mp4 
  26. Robert Schober (Universitat Erlangen-Nurnberg), Intersymbol interference mitigation in diffusive molecular communi…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281549-Schober.mp4 
  27. Rudolf Rabenstein (Friedrich-Alexander-Universitat Erlangen-Nurnberg (FAU)), Modelling Molecular Communication Cha…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281627-Rabenstein.mp4 
  28. THis week @BIRS_Math ” Biological and Bio-Inspired Information Theory ” @thebanffcentre #biology #math @NSF
  29. “Your theory might match the data, but the data might be wrong” – Crick @BIRS_Math
  30. So information theory seems to be a big deal in ecology. @BIRS_Math
  31. Tom Schneider (National Institutes of Health), Three Principles of Biological States: Ecology and Cancer  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410290904-Schneider.mp4 
  32. “In biodiversity, the entropy of an ecosystem is the expected … information we gain about an organism by learning its species” @BIRS_Math
  33. Seriously, I’m blown away by this work in information theory in ecology. Huge body of work; I had no idea. @BIRS_Math
  34. I encourage @BIRS_Math attendees at Biological & Bio-Inspired Information Theory to contribute references here:  http://bit.ly/1jQwObk 
  35. Christoph Adami (Michigan State University), Some Information-Theoretic Musings Concerning the Origin and Evolutio…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410291114-Adami.mp4 
  36. .@ChristophAdami talk Some Information-Theoretic Musings Concerning the Origin of Life @BIRS_Math this morning #ITBio http://t.co/VA8komuuSW
  37. ICYMI @ChristophAdami had great paper: Information-theoretic Considerations on Origin of Life on arXiv  http://bit.ly/1yIhK2Q  @BIRS_Math
  38. Baez has a post on Tishby's talk "Sensing &  Acting Under Information Constraints" http://t.co/t1nPVI1pxa @BIRS_Math http://t.co/dFuiVLFSGC
  39. INFORMATION THEORY is the new central ...
  40. I’m listening to a talk on the origin of life at a workshop on Biological and Bio-Inspired Information Theory. …  https://plus.google.com/117562920675666983007/posts/gqFL7XY3quF 
  41. Now accepting applications for the #Research Collaboration Workshop for Women in #MathBio at NIMBioS  http://ow.ly/DzeZ7 
  42. We removed a faulty microphone from our lecture room this morning. We’re now fixing the audio buzz in this week’s videos, and reposting.
  43. Didn’t get enough information theory & biology this week @BIRS_Math? Apply for NIMBioS workshop in April 2015  http://bit.ly/1yIeiWe  #ITBio
  44. Amin Emad (University of Illinois at Urbana-Champaign), Applications of Discrete Mathematics in Bioinformatics  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301329-Emad.mp4 
  45. Paul Bogdan (University of Southern California), Multiscale Analysis Reveals Complex Behavior in Bacteria Populati…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301401-Bogdan.mp4 
  46. Lubomir Kostal (Institute of Physiology, Academy of Sciences of the Czech Republic), Efficient information transmi…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301534-Kostal.mp4 
  47. Banff ☀️❄️🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲❤️
  48. @conservativelez I’m a big fan of your dad’s research & was reminded of much of it via a workshop on Biological Information Theory
  49. @conservativelez Though he may not have been able to attend, he can catch most of the talks online if he’d like  https://www.birs.ca/events/2014/5-day-workshops/14w5170 
  50. Depressed that @BIRS_Math Workshop on Biological & Bio-Inspired Information Theory is over? Relive it here:  http://bit.ly/1rF3G4B  #ITBio
  51. A few thoughts about that workshop while I wait for my flight back to Toronto.
  52. 1/ Everyone I talked to said it was the best workshop they’d ever been to, and they’d like to do a follow-up workshop @BIRS_Math
  53. 2/ There is an amazing diversity of work under the umbrella of “information theory”. @BIRS_Math
  54. 3/ Much of this work is outside the IT mainstream, and an issue is that people use different terms for related concepts. @BIRS_Math
  55. 4/ Some community building is in order. I think this workshop was a good first step. @BIRS_Math
  56. 5/ Many many thanks to @BIRS_Math and huge kudos to @NGhoussoub for excellent service to the Canadian scientific community. BIRS is a gem.
  57. 6/ Also many thanks to the participants for their excellent talks, and to @ChrisAldrich for maintaining a Storify.


Information Theory is the New Central Discipline

Replied to Information Theory is the new central discipline. by Nassim Nicholas Taleb (facebook.com)

INFORMATION THEORY is the new central discipline. This graph was from 20y ago in the seminal book Cover and Thomas, as the field was starting to be defined. Now Information Theory has been expanded to swallow even more fields.

Born in, of all disciplines, Electrical Engineering, the field has progressively infiltrating probability theory, computer science, statistical physics, data science, gambling theory, ruin problems, complexity, even how one deals with knowledge, epistemology. It defines noise/signal, order/disorder, etc. It studies cellular automata. You can use it in theology (FREE WILL & algorithmic complexity). As I said, it is the MOTHER discipline.

I am certain much of Medicine will naturally grow to be a subset of it, both operationally, and in studying how the human body works: the latter is an information machine. Same with linguistics. Same with political “science”, same with… everything.

I am saying this because I figured out what the long 5th volume of the INCERTO will be. Cannot say now with any precision but it has to do with a variant of entropy as the core natural generator of Antifragility.

[Revised to explain that it is not *replacing* other disciplines, just infiltrating them as the point was initially misunderstood…]

Nassim Nicholas Taleb via Facebook

[My comments posted to the original Facebook post follow below.]

I’m coming to this post a bit late as I’m playing a bit of catch up, but agree with it wholeheartedly.

In particular, applications to molecular biology and medicine are really beginning to come to a heavy boil in just the past five years. This particular year appears to be spawning the biggest renaissance in the application of information theory to biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s “Symposium on Information Theory in Biology” at Gatlinburg, Tennessee in 1956.

Upcoming/recent conferences/workshops on information theory in biology include:

At the beginning of September, Christoph Adami posted an awesome and very sound paper on arXiv entitled “Information-theoretic considerations concerning the origin of life,” which truly portends to turn the science of the origin of life on its head.

I’ll note in passing, for those interested, that Claude Shannon’s famous master’s thesis at MIT (in which he applied Boolean algebra to electric circuits, allowing the digital revolution to occur) and his subsequent “A Mathematical Theory of Communication” were so revolutionary that nearly everyone forgets his MIT Ph.D. thesis “An Algebra for Theoretical Genetics,” which presaged the areas of cybernetics and the current applications of information theory to microbiology and is probably as seminal as Sir R.A. Fisher’s applications of statistics to science in general and biology in particular.
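
As a toy illustration of what that master’s thesis did (my own example, not Shannon’s): switches wired in series behave like Boolean AND and switches wired in parallel like Boolean OR, so any relay network computes a Boolean function of its switch states.

```python
# Toy illustration (mine, not Shannon's): series switches act like Boolean
# AND, parallel switches like Boolean OR, so a relay network computes a
# Boolean function of its switch states.
def series(a, b):    # current flows only if both switches are closed
    return a and b

def parallel(a, b):  # current flows if either switch is closed
    return a or b

def circuit(x, y, z):
    """A small example circuit: switch x in series with (y parallel z)."""
    return series(x, parallel(y, z))

for x in (False, True):
    for y in (False, True):
        for z in (False, True):
            print(int(x), int(y), int(z), "->", int(circuit(x, y, z)))
```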

For those commenting on the post who were interested in a layman’s introduction to information theory, I recommend John Robinson Pierce’s An Introduction to Information Theory: Symbols, Signals and Noise (Dover has a very inexpensive edition.) After this, one should take a look at Claude Shannon’s original paper. (The University of Illinois Press printing includes an excellent overview by Warren Weaver along with the paper itself.) The mathematics in the paper really isn’t too technical, and most of it should be comprehensible by most advanced high school students.

For those that don’t understand the concept of entropy, I HIGHLY recommend Arieh Ben-Naim’s book Entropy Demystified: The Second Law Reduced to Plain Common Sense with Seven Simulated Games. He really does tear the concept down into its most basic form in a way I haven’t seen others come remotely close to, and which even my mother can comprehend (with no mathematics at all). (I recommend this presentation even to those with Ph.D.’s in physics because it is so truly fundamental.)

For the more advanced mathematicians, physicists, and engineers, Arieh Ben-Naim does a truly spectacular job of extending E.T. Jaynes’ work on information theory and statistical mechanics and comes up with a more coherent mathematical theory to conjoin the entropy of physics/statistical mechanics with that of Shannon’s information theory in A Farewell to Entropy: Statistical Thermodynamics Based on Information.
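
For those who want the punchline in a few lines, here is a minimal numerical sketch (my own, not Ben-Naim’s or Jaynes’) of the bridge that program makes explicit: the Gibbs entropy of statistical mechanics is Shannon’s entropy rescaled by k_B ln 2.

```python
# Minimal sketch (mine): the Gibbs entropy S = -k_B * sum(p ln p) of
# statistical mechanics is Shannon's H = -sum(p log2 p) rescaled by k_B ln 2.
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(probs):   # in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):     # in J/K
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]         # any discrete distribution over microstates
print(shannon_entropy(p))                      # 1.5 bits
print(gibbs_entropy(p) / (k_B * math.log(2)))  # 1.5 again: same quantity, different units
```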

For the advanced readers/researchers interested in more at the intersection of information theory and biology, I’ll also mention that I maintain a list of references, books, and journal articles in a Mendeley group entitled “ITBio: Information Theory, Microbiology, Evolution, and Complexity.”

Venn Diagram of how information theory relates to other fields.
Figure 1.1 [page 2] from
Thomas M. Cover and Joy Thomas’s textbook Elements of Information Theory, Second Edition
(John Wiley & Sons, Inc., 2006) [First Edition, 1991]