The Information Universe Conference

"The Information Universe" Conference in The Netherlands in October hits several of the sweet spots for areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology.

Yesterday, via a notification from Lanyrd, I came across a notice for the upcoming conference “The Information Universe,” which hits several of the sweet spots for areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology. It is scheduled to take place from October 7-9, 2015 at the Infoversum Theater in Groningen, The Netherlands.

I’ll let their site speak for itself below, but they already have an interesting lineup of speakers including:

Keynote speakers

  • Erik Verlinde, Professor Theoretical Physics, University of Amsterdam, Netherlands
  • Alex Szalay, Alumni Centennial Professor of Astronomy, The Johns Hopkins University, USA
  • Gerard ‘t Hooft, Professor Theoretical Physics, University of Utrecht, Netherlands
  • Gregory Chaitin, Professor Mathematics and Computer Science, Federal University of Rio de Janeiro, Brazil
  • Charley Lineweaver, Professor Astronomy and Astrophysics, Australian National University, Australia
  • Lude Franke, Professor System Genetics, University Medical Center Groningen, Netherlands
Infoversum Theater, The Netherlands

Conference synopsis from their homepage:

The main ambition of this conference is to explore the question “What is the role of information in the physics of our Universe?”. This intellectual pursuit may have a key role in improving our understanding of the Universe at a time when we “build technology to acquire and manage Big Data”, “discover highly organized information systems in nature” and “attempt to solve outstanding issues on the role of information in physics”. The conference intends to address the “in vivo” (role of information in nature) and “in vitro” (theory and models) aspects of the Information Universe.

The discussions about the role of information will include the views and thoughts of several disciplines: astronomy, physics, computer science, mathematics, life sciences, quantum computing, and neuroscience. Different scientific communities hold various and sometimes distinct formulations of the role of information in the Universe indicating we still lack understanding of its intrinsic nature. During this conference we will try to identify the right questions, which may lead us towards an answer.

  • Is the universe one big information processing machine?
  • Is there a deeper layer in quantum mechanics?
  • Is the universe a hologram?
  • Is there a deeper physical description of the world based on information?
  • How close/far are we from solving the black hole information paradox?
  • What is the role of information in highly organized complex life systems?
  • The Big Data Universe and the Universe: are our numerical simulations and Big Data repositories (in vitro) different from real natural systems (in vivo)?
  • Is this the road to understanding dark matter, dark energy?

The conference will be held in the new 260-seat planetarium theater in Groningen, which provides an inspiring, immersive 3D full-dome display for, e.g., numerical simulations of the formation of our Universe, and anything else our presenters wish to bring in. The digital planetarium setting will be used to visualize the theme with modern media.

The Information Universe Website

Additional details about the conference including the participants, program, venue, and registration can also be found at their website.


Videos from the NIMBioS Workshop on Information and Entropy in Biological Systems

Videos from the NIMBioS workshop on Information and Entropy in Biological Systems from April 8-10, 2015 are slowly starting to appear on YouTube.

Videos from the April 8-10, 2015, NIMBioS workshop on Information and Entropy in Biological Systems are slowly starting to appear on YouTube.

John Baez, one of the organizers of the workshop, is also going through them and adding some interesting background and links on his Azimuth blog for those who are looking for additional details and depth.

Additional resources from the Workshop:



Popular Science Books on Information Theory, Biology, and Complexity

The beginning of a four-part series in which I provide a gradation of books and texts that lie at the intersection of the application of information theory, physics, and engineering practice to biology.

Previously, I had made a large and somewhat random list of books which lie at the intersection of the application of information theory, physics, and engineering practice to biology. Below I’ll begin to do a somewhat better job of providing a finer gradation of technical level for both the hobbyist and the aspiring student who wishes to bring themselves to a higher level of understanding of these areas. In future posts, I’ll try to begin classifying other texts into graduated strata as well. The final list will be maintained here: Books at the Intersection of Information Theory and Biology.

Introductory / General Readership / Popular Science Books

These books are written on a generally non-technical level and give a broad overview of their topics with occasional forays into interesting or intriguing subtopics. They include little, if any, mathematics or formal conceptualization. Typically, any high school student should be able to read, follow, and understand the broad concepts behind these books. Though often non-technical, these texts can give some useful insight into the topics at hand, even for the most advanced researchers.

Complexity: A Guided Tour by Melanie Mitchell (review)

Possibly one of the best places to start, this text gives a great overview of most of the major areas of study related to these fields.

Entropy Demystified: The Second Law Reduced to Plain Common Sense by Arieh Ben-Naim

One of the best books on the concept of entropy out there.  It can be read even by middle school students with no exposure to algebra and does a fantastic job of laying out the conceptualization of how entropy underlies large areas of the broader subject. Even those with Ph.D.’s in statistical thermodynamics can gain something useful from this lovely volume.

The Information: A History, a Theory, a Flood by James Gleick (review)

A relatively recent popular science volume covering various conceptualizations of what information is and how it’s been dealt with in science and engineering. Though it has its flaws, it’s certainly a good introduction for the beginner, particularly with regard to history.

The Origin of Species by Charles Darwin

One of the most influential pieces of writing known to man, this classic text is the basis from which major strides in biology have been made. A must-read for everyone on the planet.

Information, Entropy, Life and the Universe: What We Know and What We Do Not Know by Arieh Ben-Naim

Information Theory and Evolution by John Avery

The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life by Werner R. Loewenstein (review)

Information Theory, Evolution, and the Origin of Life by Hubert P. Yockey

The four books above have a significant amount of overlap. Though one could read all of them, I recommend that those pressed for time choose Ben-Naim first. As I write this, I’ll note that Ben-Naim’s book is scheduled for release on May 30, 2015, but he’s been kind enough to allow me to read an advance copy while it was in process; it gets my highest recommendation in its class. Loewenstein covers a bit more than Avery, who also has a more basic presentation. Most who continue with the subject will later come across Yockey’s Information Theory and Molecular Biology, which is similar to his text here but written at a slightly higher level of sophistication. Those who finish this level might want to try Yockey third instead.

The Red Queen: Sex and the Evolution of Human Nature by Matt Ridley

Grammatical Man: Information, Entropy, Language, and Life  by Jeremy Campbell

Life’s Ratchet: How Molecular Machines Extract Order from Chaos by Peter M. Hoffmann

Complexity: The Emerging Science at the Edge of Order and Chaos by M. Mitchell Waldrop

The Big Picture: On the Origins of Life, Meaning, and the Universe Itself by Sean Carroll (Dutton, May 10, 2016)

In the coming weeks/months, I’ll try to continue putting recommended books on the remainder of the spectrum, the balance of which follows in outline form below. As always, I welcome suggestions and recommendations based on others’ experiences as well. If you’d like to suggest additional resources in any of the sections below, please do so via our suggestion box. For those interested in additional resources, please take a look at the ITBio Resources page, which includes information about related research groups; references and journal articles; academic and research institutes, societies, groups, and organizations; and conferences, workshops, and symposia.

Lower Level Undergraduate

These books are written at a level that can be grasped and understood by most students at the freshman or sophomore university level. Coursework in math, science, and engineering will usually presume knowledge of calculus, basic probability theory, introductory physics, chemistry, and basic biology.

Upper Level Undergraduate

These books are written at a level that can be grasped and understood by those at the junior or senior university level. Coursework in math, science, and engineering may presume knowledge of probability theory, differential equations, linear algebra, complex analysis, abstract algebra, signal processing, organic chemistry, molecular biology, evolutionary theory, thermodynamics, advanced physics, and basic information theory.

Graduate Level

These books are written at a level that can be grasped and understood by most students working at the master’s level at most universities. Coursework presumes all the previously mentioned subjects, though it may require a higher level of sub-specialization in one or more areas of mathematics, physics, biology, or engineering practice. Because of the depth and breadth of disciplines covered here, many may feel the need to delve into areas outside of their particular specialization.


Nicolas Perony: Puppies! Now that I’ve got your attention, complexity theory | TED

Animal behavior isn't complicated, but it is complex. Nicolas Perony studies how individual animals — be they Scottish Terriers, bats or meerkats — follow simple rules that, collectively, create larger patterns of behavior. And how this complexity born of simplicity can help them adapt to new circumstances, as they arise.

For those who are looking for a good, simple, and entertaining explanation of the concept of emergent properties and behavior within complexity theory (or Big History), I just came across a nice TED talk that simplifies complexity using a few animal examples, including a cute puppy video as well as a bat and a meerkat example. The latter two also have lovely implications for evolution and survival.
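The core idea in the talk, that simple local rules can collectively produce large-scale patterns, is easy to demonstrate in a few lines of code. Below is a minimal, purely illustrative sketch (not from the talk itself): agents on a ring each copy the majority value of their immediate neighborhood, and initially random states quickly freeze into large contiguous blocks, a global structure that appears nowhere in the local rule.

```python
import random

def step(states):
    """One update: each agent adopts the majority value of itself and
    its two neighbors on a ring. The rule is purely local."""
    n = len(states)
    return [1 if states[(i - 1) % n] + states[i] + states[(i + 1) % n] >= 2 else 0
            for i in range(n)]

random.seed(42)
world = [random.randint(0, 1) for _ in range(40)]
for _ in range(10):
    world = step(world)

# Random noise has frozen into contiguous same-state blocks:
# large-scale structure that no individual rule ever mentions.
print("".join(map(str, world)))
```

Note that an isolated agent is always absorbed by its neighbors, while any block of two or more is stable, which is why global order emerges from purely local decisions.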


NIMBioS Workshop: Information Theory and Entropy in Biological Systems

Web resources for participants in the NIMBioS Workshop on Information Theory and Entropy in Biological Systems.

Over the next few days, I’ll be maintaining a Storify story covering information related to and coming out of the Information Theory and Entropy Workshop being sponsored by NIMBioS at the University of Tennessee, Knoxville.

For those in attendance or participating by watching the live streaming video (or even watching the video after-the-fact), please feel free to use the official hashtag #entropyWS, and I’ll do my best to include your tweets, posts, and material into the story stream for future reference.

For journal articles and papers mentioned in/at the workshop, I encourage everyone to join the group ITBio: Information Theory, Microbiology, Evolution, and Complexity and add them to the group’s list of papers. Think of it as a collaborative online journal club of sorts.

Those participating in the workshop are also encouraged to take a look at a growing collection of researchers and materials I maintain here. If you have materials or resources you’d like to contribute to the list, please send me an email or include them via the suggestions/submission form or include them in the comments section below.

Resources for Information Theory and Biology

RSS Feed for BoffoSocko posts tagged with #ITBio



Probability Models for DNA Sequence Evolution

Bookmarked Probability Models for DNA Sequence Evolution (Springer, 2008, 2nd Edition) by Rick Durrett

While browsing through some textbooks and researchers today, I came across a fantastic-looking title: Probability Models for DNA Sequence Evolution by Rick Durrett (Springer, 2008). While searching his website at Duke, I noticed that he’s made a .pdf copy of a LaTeX version of the 2nd edition available for download. I hope others find it as interesting and useful as I do.

I’ll also give him a shout out for being a mathematician with a fledgling blog: Rick’s Ramblings.

Probability Models for DNA Sequence Evolution by Richard Durrett

BIRS Workshop on Biological and Bio-Inspired Information Theory | Storify Stream

Over the span of the coming week, I'll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.

Over the span of the coming week, I’ll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.

Editor’s note: On 12/12/17 Storify announced they would be shutting down. As a result, I’m replacing the embedded version of the original data served by Storify with an HTML copy, which can be found below:

BIRS: Biological and Bio-Inspired Information Theory

A 5 Day workshop on Biology and Information Theory hosted by the Banff International Research Station

  1. Wishing I was at the Gene Regulation and Information Theory meeting starting tomorrow  #ITBio
  2. Mathematical and Statistical Models for Genetic Coding starts today.  @andreweckford might borrow attendees for BIRS
  3. Mathematical Foundations for Information Theory in Diffusion-Based Molecular Communications  #ITBio
  4. Bill Bialek giving plenary talk “Information flow & order in real biological networks” at Feb 2014 workshop  #ITBio
  5. CECAM Workshop: “Entropy in Biomolecular Systems” starts May 14 in Vienna. #ITBio

  6. Last RT: wonder what the weather is going to be like at the end of October for my @BIRS_Math workshop
  7. @JoVanEvery I’m organizing a workshop in Banff in October … hopefully this isn’t a sign of weather to come!
  8. Banff takes its name from the town of Banff, Scotland, not to be confused with Bamff, also Scotland.
  9. Good morning from beautiful Banff. How can you not love the mountains?

  10. “Not an obvious connection between utility and information, just as there is no obvious connection between energy and entropy” @BIRS_Math
  11. Last RT: a lot of discussion of my signal transduction work with Peter Thomas.
  12. Live now: Nicolo Michelusi of @USCViterbi on Stochastic Model for Electron Transfer in Bacterial Cables  #ITBio
  13. Nicolo Michelusi (University of Southern California), A Stochastic Model for Electron Transfer in Bacterial Cables 
  14. Listening to the always awesome @cnmirose talk about the ultimate limits of molecular communication.
  15. “Timing is fundamental … subsumes time-varying concentration channel” @cnmirose @BIRS_Math
  16. Standard opening quote of these talks: “I’m not a biologist, but …” @BIRS_Math
  17. Stefan Moser (ETH Zurich), Capacity Bounds of the Memoryless AIGN Channel – a Toy-Model for Molecular Communicat… 
  18. Weisi Guo (University of Warwick), Communication Envelopes for Molecular Diffusion and Electromagnetic Wave Propag… 
  19. .@ChrisAldrich @andreweckford @Storify @BIRS_Math Sounds like a fascinating workshop on bioinformation theory in Banff.
  20. Toby Berger, winner of the 2002 Shannon award, speaking right now. @BIRS_Math
  21. Naftali Tishby (Hebrew University of Jerusalem), Sensing and acting under information constraints – a principled a… 
  22. “…places such as BIRS and the Banff Centre exist to facilitate the exchange and pursuit of knowledge.” S. Sundaram 
  23. We’re going for a hike tomorrow. Many thanks to Lukas at the @ParksCanada info centre in Banff for helpful advice! @BIRS_Math
  24. Alexander Dimitrov (Washington State University), Invariant signal processing in auditory biological systems 
  25. Joel Zylberberg (University of Washington), Communicating with noisy signals: lessons learned from the mammalian v… 
  26. Robert Schober (Universitat Erlangen-Nurnberg), Intersymbol interference mitigation in diffusive molecular communi… 
  27. Rudolf Rabenstein (Friedrich-Alexander-Universitat Erlangen-Nurnberg (FAU)), Modelling Molecular Communication Cha… 
  28. THis week @BIRS_Math ” Biological and Bio-Inspired Information Theory ” @thebanffcentre #biology #math @NSF
  29. “Your theory might match the data, but the data might be wrong” – Crick @BIRS_Math
  30. So information theory seems to be a big deal in ecology. @BIRS_Math
  31. Tom Schneider (National Institutes of Health), Three Principles of Biological States: Ecology and Cancer 
  32. “In biodiversity, the entropy of an ecosystem is the expected … information we gain about an organism by learning its species” @BIRS_Math
  33. Seriously, I’m blown away by this work in information theory in ecology. Huge body of work; I had no idea. @BIRS_Math
  34. I encourage @BIRS_Math attendees at Biological & Bio-Inspired Information Theory to contribute references here: 
  35. Christoph Adami (Michigan State University), Some Information-Theoretic Musings Concerning the Origin and Evolutio… 
  36. .@ChristophAdami talk Some Information-Theoretic Musings Concerning the Origin of Life @BIRS_Math this morning #ITBio

  37. ICYMI @ChristophAdami had great paper: Information-theoretic Considerations on Origin of Life on arXiv  @BIRS_Math
  38. Baez has a post on Tishby's talk "Sensing &  Acting Under Information Constraints" @BIRS_Math

  39. INFORMATION THEORY is the new central ...

  40. I’m listening to a talk on the origin of life at a workshop on Biological and Bio-Inspired Information Theory. … 
  41. Now accepting applications for the #Research Collaboration Workshop for Women in #MathBio at NIMBioS 
  42. We removed a faulty microphone from our lecture room this morning. We’re now fixing the audio buzz in this week’s videos, and reposting.
  43. Didn’t get enough information theory & biology this week @BIRS_Math? Apply for NIMBioS workshop in April 2015  #ITBio
  44. Amin Emad (University of Illinois at Urbana-Champaign), Applications of Discrete Mathematics in Bioinformatics 
  45. Paul Bogdan (University of Southern California), Multiscale Analysis Reveals Complex Behavior in Bacteria Populati… 
  46. Lubomir Kostal (Institute of Physiology, Academy of Sciences of the Czech Republic), Efficient information transmi… 
  47. Banff ☀️❄️🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲❤️
  48. @conservativelez I’m a big fan of your dad’s research & was reminded of much of it via a workshop on Biological Information Theory
  49. @conservativelez Though he may not have been able to attend, he can catch most of the talks online if he’d like 
  50. Depressed that @BIRS_Math Workshop on Biological & Bio-Inspired Information Theory is over? Relive it here:  #ITBio
  51. A few thoughts about that workshop while I wait for my flight back to Toronto.
  52. 1/ Everyone I talked to said it was the best workshop they’d ever been to, and they’d like to do a follow-up workshop @BIRS_Math
  53. 2/ There is an amazing diversity of work under the umbrella of “information theory”. @BIRS_Math
  54. 3/ Much of this work is outside the IT mainstream, and an issue is that people use different terms for related concepts. @BIRS_Math
  55. 4/ Some community building is in order. I think this workshop was a good first step. @BIRS_Math
  56. 5/ Many many thanks to @BIRS_Math and huge kudos to @NGhoussoub for excellent service to the Canadian scientific community. BIRS is a gem.
  57. 6/ Also many thanks to the participants for their excellent talks, and to @ChrisAldrich for maintaining a Storify.
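Item 32 in the stream above quotes the standard information-theoretic definition of ecological diversity: the entropy of an ecosystem is the expected information gained about an organism by learning its species. This is just Shannon entropy applied to species proportions, often called the Shannon diversity index (ecologists frequently use natural logs; bits are used here to match the information-theoretic reading). A minimal sketch with hypothetical census counts:

```python
import math

def shannon_diversity(counts):
    """Shannon entropy of species abundances, in bits: the expected
    information gained about an organism by learning its species."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# Hypothetical census: four species with these abundances.
print(round(shannon_diversity([50, 25, 15, 10]), 3))  # -> 1.743
```

A perfectly even two-species ecosystem gives exactly 1 bit, while a monoculture gives 0: more evenness means more uncertainty, hence more information gained per identification.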

Information Theory is the New Central Discipline

Replied to Information Theory is the new central discipline. by Nassim Nicholas Taleb (

INFORMATION THEORY is the new central discipline. This graph was from 20y ago in the seminal book Cover and Thomas, as the field was starting to be defined. Now Information Theory has been expanded to swallow even more fields.

Born in, of all disciplines, Electrical Engineering, the field has progressively been infiltrating probability theory, computer science, statistical physics, data science, gambling theory, ruin problems, complexity, even how one deals with knowledge, epistemology. It defines noise/signal, order/disorder, etc. It studies cellular automata. You can use it in theology (FREE WILL & algorithmic complexity). As I said, it is the MOTHER discipline.

I am certain much of Medicine will naturally grow to be a subset of it, both operationally, and in studying how the human body works: the latter is an information machine. Same with linguistics. Same with political “science”, same with… everything.

I am saying this because I figured out what the long 5th volume of the INCERTO will be. Cannot say now with any precision but it has to do with a variant of entropy as the core natural generator of Antifragility.

[Revised to explain that it is not *replacing* other disciplines, just infiltrating them as the point was initially misunderstood…]

Nassim Nicholas Taleb via Facebook

[My comments posted to the original Facebook post follow below.]

I’m coming to this post a bit late as I’m playing a bit of catch up, but agree with it wholeheartedly.

In particular, applications to molecular biology and medicine have really begun to come to a boil in just the past five years. This year appears to mark the biggest renaissance for the application of information theory to biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s “Symposium on Information Theory in Biology” at Gatlinburg, Tennessee in 1956.

Upcoming/recent conferences/workshops on information theory in biology include:

At the beginning of September, Christoph Adami posted an awesome and very sound paper on arXiv entitled “Information-theoretic considerations concerning the origin of life,” which promises to turn the science of the origin of life on its head.

I’ll note in passing, for those interested, that Claude Shannon’s famous master’s thesis at MIT (in which he applied Boolean algebra to electric circuits, enabling the digital revolution) and his subsequent “A Mathematical Theory of Communication” were so revolutionary that nearly everyone forgets his MIT Ph.D. thesis, “An Algebra for Theoretical Genetics,” which presaged the area of cybernetics and the current applications of information theory to microbiology, and which is probably as seminal as Sir R.A. Fisher’s applications of statistics to science in general and biology in particular.

For those commenting on the post who were interested in a layman’s introduction to information theory, I recommend John Robinson Pierce’s An Introduction to Information Theory: Symbols, Signals and Noise (Dover has a very inexpensive edition.) After this, one should take a look at Claude Shannon’s original paper. (The MIT Press printing includes some excellent overview by Warren Weaver along with the paper itself.) The mathematics in the paper really aren’t too technical, and most of it should be comprehensible by most advanced high school students.

For those who don’t understand the concept of entropy, I HIGHLY recommend Arieh Ben-Naim’s book Entropy Demystified: The Second Law Reduced to Plain Common Sense with Seven Simulated Games. He really does tear the concept down into its most basic form in a way I haven’t seen others come remotely close to, and which even my mother can comprehend (with no mathematics at all). (I recommend this presentation even to those with Ph.D.’s in physics because it is so truly fundamental.)

For the more advanced mathematicians, physicists, and engineers, Arieh Ben-Naim does a truly spectacular job of extending E.T. Jaynes’ work on information theory and statistical mechanics, and comes up with a more coherent mathematical theory to conjoin the entropy of physics/statistical mechanics with that of Shannon’s information theory in A Farewell to Entropy: Statistical Thermodynamics Based on Information.
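For readers who want to see that bridge concretely: the Gibbs entropy of statistical mechanics, S = -k_B Σ p ln p, differs from Shannon’s H = -Σ p log2 p only by the constant factor k_B ln 2. A quick numerical sketch (the distribution is illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Statistical-mechanical entropy S = -k_B * sum(p * ln p), in J/K."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# An illustrative four-state probability distribution.
p = [0.5, 0.25, 0.125, 0.125]
H = shannon_entropy_bits(p)  # exactly 1.75 bits for this distribution
S = gibbs_entropy(p)

# The two notions of entropy agree up to the factor k_B * ln(2).
assert abs(S - K_B * math.log(2) * H) < 1e-35
```

The factor k_B ln 2 simply converts bits into thermodynamic units (J/K); the functional form of the two entropies is identical, which is the mathematical heart of the conjunction Ben-Naim argues for.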

For the advanced readers/researchers interested in more at the intersection of information theory and biology, I’ll also mention that I maintain a list of references, books, and journal articles in a Mendeley group entitled “ITBio: Information Theory, Microbiology, Evolution, and Complexity.”

Venn Diagram of how information theory relates to other fields.
Figure 1.1 [page 2] from
Thomas M. Cover and Joy Thomas’s textbook Elements of Information Theory, Second Edition
(John Wiley & Sons, Inc., 2006) [First Edition, 1991]

Announcement: Special issue of Entropy: “Information Theoretic Incentives for Cognitive Systems”

Dr. Christoph Salge asked me to cross-post this notice from the Entropy site here.

Editor’s Note: Dr. Christoph Salge asked me to cross-post the following notice from the Entropy site here. I’m sure many of the readers of the blog will find the topic to be of interest.


Logo for the journal Entropy


Dear Colleagues,

In recent years, ideas such as “life is information processing” or “information holds the key to understanding life” have become more common. However, how can information, or more formally Information Theory, increase our understanding of life, or life-like systems?

Information Theory not only has a profound mathematical basis, but also typically provides an intuitive understanding of processes such as learning, behavior, and evolution in terms of information processing.

In this special issue, we are interested in both:

  1. the information-theoretic formalization and quantification of different aspects of life, such as driving forces of learning and behavior generation, information flows between neurons, swarm members and social agents, and information theoretic aspects of evolution and adaptation, and
  2. the simulation and creation of life-like systems with previously identified principles and incentives.

Topics with relation to artificial and natural systems:

  • information theoretic intrinsic motivations
  • information theoretic quantification of behavior
  • information theoretic guidance of artificial evolution
  • information theoretic guidance of self-organization
  • information theoretic driving forces behind learning
  • information theoretic driving forces behind behavior
  • information theory in swarms
  • information theory in social behavior
  • information theory in evolution
  • information theory in the brain
  • information theory in system-environment distinction
  • information theory in the perception action loop
  • information theoretic definitions of life


Manuscripts should be submitted online by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1200 CHF (Swiss Francs).

Deadline for manuscript submissions: 28 February 2015

Special Issue Editors

Guest Editor
Dr. Christoph Salge
Adaptive Systems Research Group, University of Hertfordshire, College Lane, AL10 9AB Hatfield, UK
Phone: +44 1707 28 4490
Interests: Intrinsic Motivation (Empowerment); Self-Organization; Guided Self-Organization; Information-Theoretic Incentives for Social Interaction; Information-Theoretic Incentives for Swarms; Information Theory and Computer Game AI

Guest Editor
Dr. Georg Martius
Cognition and Neurosciences, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
Phone: +49 341 9959 545
Interests: Autonomous Robots; Self-Organization; Guided Self-Organization; Information Theory; Dynamical Systems; Machine Learning; Neuroscience of Learning; Optimal Control

Guest Editor
Dr. Keyan Ghazi-Zahedi
Information Theory of Cognitive Systems, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
Phone: +49 341 9959 535
Interests: Embodied Artificial Intelligence; Information Theory of the Sensorimotor Loop; Dynamical Systems; Cybernetics; Self-Organisation; Synaptic Plasticity; Evolutionary Robotics

Guest Editor
Dr. Daniel Polani
Department of Computer Science, University of Hertfordshire, Hatfield AL10 9AB, UK
Interests: Artificial Intelligence; Artificial Life; Information Theory for Intelligent Information Processing; Sensor Evolution; Collective and Multiagent Systems


Uri Alon: Why Truly Innovative Science Demands a Leap into the Unknown

I recently ran across this TED talk and felt compelled to share it. It really highlights some of my own personal thoughts on how science should be taught and done in the modern world.  It also overlaps much of the reading I’ve been doing lately on innovation and creativity. If these don’t get you to watch, then perhaps mentioning that Alon manages to apply comedy and improvisation techniques to science will.

Uri Alon was already one of my scientific heroes, but this adds a lovely garnish.




Brief Review: The Swerve: How the World Became Modern by Stephen Greenblatt

The Swerve: How the World Became Modern by Stephen Greenblatt

My rating: 4 of 5 stars

Stephen Greenblatt provides an interesting synthesis of history and philosophy, and his love of the humanities certainly shines through. The book stands as an almost over-exciting commercial not only for reading Lucretius’s “De Rerum Natura” (“On the Nature of Things”), but for motivating the reader to actually go out and learn Latin to appreciate it properly.

I would have loved more direct analysis and evidence of Lucretius’s immediate impact in the 1400s, as well as a longer, in-depth analysis of his continuing influence through the 1700s.

The first half of the book is excellent at painting a vivid portrait of the life and times of Poggio Bracciolini, one that isn’t commonly encountered. I’m almost reminded of Stacy Schiff’s Cleopatra: A Life, though Greenblatt has far more historical material with which to paint his picture. I may also be biased in that I’m more interested in the mechanics of the scholarship behind the resurgence of the classics in the Renaissance than in that particular political portion of the first century BCE. Though my background on the history of the time periods involved is reasonably advanced, I fear that Greenblatt may be leaving out a tad too much for the broader reading public, who may not be so well versed. The fact that he brings so many clear specifics to the forefront may more than compensate for this, however.

In some interesting respects, this could be considered the humanities counterpart to the more science-centric story of Owen Gingerich’s The Book Nobody Read: Chasing the Revolutions of Nicolaus Copernicus. Though Simon Winchester is still by far my favorite nonfiction writer, Greenblatt does an exceedingly good job of narrating what isn’t necessarily a very linear story.

Greenblatt includes lots of interesting tidbits and some great history. I wish it had continued longer… I’d love to have the spare time to lose myself in the extensive bibliography. Though the footnotes, bibliography, and index account for about 40% of the book, the average reader should take a reasonable look at the quarter or so of the footnotes, which add some interesting additional background and subtleties to the text, as well as to some of the translations discussed therein.

I am definitely very interested in the science behind textual preservation, which is presented as the underlying motivation for the action in this book. I wish that Greenblatt had covered some of these aspects in the same vivid detail he exhibited for other portions of the story. Perhaps summarizing some more of the relevant scholarship involved in transmitting and restoring old texts, as presented in Bart Ehrman and Bruce Metzger’s The Text of the New Testament: Its Transmission, Corruption & Restoration, would have been a welcome addition given the audience of the book. It might also have painted a more nuanced picture of the Church’s character and predicament than the text provides.

Though I caught only one small reference to modern-day politics (a prison statistic for America tucked into a footnote), I find myself wishing that Greenblatt had spent at least a few paragraphs, or even a short chapter, drawing direct parallels to our present-day political landscape. I understand why he didn’t broach the subject: it would tend to date an otherwise timeless-feeling text and would generally dissuade a portion of his readership, in particular the portion which most needs to read such a book. I can certainly see a strong need for another short burst of popularity for “On the Nature of Things” to counter the anti-science and overly pro-religion climate we’re facing in American politics.

For those interested in the topic, I might suggest that this text has some flavor of Big History in its DNA. It covers not only a fairly significant chunk of recorded human history, but has some broader influential philosophical themes that underlie a potential change in the direction of history which we’ve been living for the past 300 years. There’s also an intriguing overlap of multidisciplinary studies going on in terms of the history, science, philosophy, and technology involved in the multiple time periods discussed.

This review was originally posted on 7/8/2014. View all my reviews

Workshop Announcement: Entropy and Information in Biological Systems

I rarely, if ever, reblog anything here, but this particular post from John Baez’s blog Azimuth is so on-topic, that attempting to embellish it seems silly.

Entropy and Information in Biological Systems (Part 2)

John Harte, Marc Harper and I are running a workshop! Now you can apply here to attend:

Information and entropy in biological systems, National Institute for Mathematical and Biological Synthesis, Knoxville, Tennessee, Wednesday-Friday, 8-10 April 2015.

Click the link, read the stuff and scroll down to “CLICK HERE” to apply. The deadline is 12 November 2014.

Financial support for travel, meals, and lodging is available for workshop attendees who need it. We will choose among the applicants and invite 10-15 of them.

The idea
Information theory and entropy methods are becoming powerful tools in biology, from the level of individual cells, to whole ecosystems, to experimental design, model-building, and the measurement of biodiversity. The aim of this investigative workshop is to synthesize different ways of applying these concepts to help systematize and unify work in biological systems. Early attempts at “grand syntheses” often misfired, but applications of information theory and entropy to specific highly focused topics in biology have been increasingly successful. In ecology, entropy maximization methods have proven successful in predicting the distribution and abundance of species. Entropy is also widely used as a measure of biodiversity. Work on the role of information in game theory has shed new light on evolution. As a population evolves, it can be seen as gaining information about its environment. The principle of maximum entropy production has emerged as a fascinating yet controversial approach to predicting the behavior of biological systems, from individual organisms to whole ecosystems. This investigative workshop will bring together top researchers from these diverse fields to share insights and methods and address some long-standing conceptual problems.
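(A quick aside of my own, not part of Baez’s announcement: the use of entropy as a biodiversity measure mentioned above is easy to make concrete. The Shannon diversity index is just the entropy of the species-abundance distribution; a minimal sketch in Python, with made-up abundance counts purely for illustration:)

```python
import math

def shannon_diversity(counts):
    """Shannon entropy H = -sum(p_i * ln p_i) over species proportions.
    Higher H means a more even, more diverse community."""
    total = sum(counts)
    proportions = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in proportions)

# Two hypothetical communities, each with 4 species:
even = [25, 25, 25, 25]   # maximally even: H = ln(4) ≈ 1.386
skewed = [97, 1, 1, 1]    # dominated by one species: much lower H

print(shannon_diversity(even))
print(shannon_diversity(skewed))
```

(Equal abundances maximize the index for a given species richness, which is why the same entropy-maximization machinery shows up in the ecology applications the workshop targets.)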

So, here are the goals of our workshop:

  • To study the validity of the principle of Maximum Entropy Production (MEP), which states that biological systems – and indeed all open, non-equilibrium systems – act to produce entropy at the maximum rate.
  • To familiarize all the participants with applications to ecology of the MaxEnt method: choosing the probabilistic hypothesis with the highest entropy subject to the constraints of our data. We will compare MaxEnt with competing approaches and examine whether MaxEnt provides a sufficient justification for the principle of MEP.
  • To clarify relations between known characterizations of entropy, the use of entropy as a measure of biodiversity, and the use of MaxEnt methods in ecology.
  • To develop the concept of evolutionary games as “learning” processes in which information is gained over time.
  • To study the interplay between information theory and the thermodynamics of individual cells and organelles.

For more details, go here.

If you’ve got colleagues who might be interested in this, please let them know. You can download a PDF suitable for printing and putting on a bulletin board by clicking on this:
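(Another aside of my own, not from the workshop announcement: the MaxEnt recipe in the second goal above can be illustrated with Jaynes’s classic loaded-die example. Among all distributions on faces 1–6 with a given mean, the maximum-entropy choice has the exponential form p_i ∝ exp(λi), and λ can be found by simple bisection since the mean is monotone in λ. A sketch, under those assumptions:)

```python
import math

def maxent_die(target_mean, lo=-10.0, hi=10.0, iters=100):
    """Maximum-entropy distribution on die faces 1..6 with a fixed mean.
    The constrained-entropy maximum has the Gibbs form p_i ∝ exp(lam * i);
    we find lam by bisection, since the mean increases monotonically in lam."""
    faces = range(1, 7)

    def mean_for(lam):
        weights = [math.exp(lam * i) for i in faces]
        z = sum(weights)
        return sum(i * w for i, w in zip(faces, weights)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    weights = [math.exp(lam * i) for i in faces]
    z = sum(weights)
    return [w / z for w in weights]

# A fair mean of 3.5 recovers the uniform distribution;
# a mean of 4.5 tilts probability toward the high faces.
print(maxent_die(3.5))
print(maxent_die(4.5))
```

(The same idea, with richer constraints, underlies the MaxEnt species-abundance predictions the workshop will compare against competing approaches.)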

CECAM Workshop: “Entropy in Biomolecular Systems”

On Friday, I had an excellent and stimulating conversation with Arieh Ben-Naim about his recent writing and work, and he mentioned in passing that he had been invited to a conference relating to entropy and biology in Vienna. A quick web search turned it up, and not having heard about it myself, I thought I’d pass it along to other regular readers interested in the area.

The workshop on “Entropy in Biomolecular Systems” is being hosted by the Centre Européen de Calcul Atomique et Moléculaire (CECAM).

Logo for the Centre Européen de Calcul Atomique et Moléculaire (CECAM)

Location: DACAM, Max F. Perutz Laboratories, University of Vienna, Dr. Bohrgasse 9, A-1030, Vienna, Austria
Dates: May 14, 2014 to May 17, 2014

The workshop is being organized by:

  • Richard Henchman (University of Manchester, United Kingdom)
  • Bojan Zagrovic (University of Vienna, Austria)
  • Michel Cuendet (Swiss Institute of Bioinformatics, Lausanne, Switzerland and Weill Cornell Medical College, New York, USA)
  • Chris Oostenbrink (University of Natural Resources and Life Sciences, Austria)

It’s being supported by CECAM, the European Research Council, and the Royal Society of Chemistry’s Statistical Mechanics and Thermodynamics Group.

I’ll note that the registration deadline is on April 21 with a payment deadline of April 30, so check in quickly if you haven’t already.

The summary from the workshop website states:

This workshop brings together the world’s experts to address the challenges of determining the entropy of biomolecular systems, either by experiment or computer simulation. Entropy is one of the main driving forces for any biological process such as binding, folding, partitioning and reacting. Our deficient understanding of entropy, however, means that such important processes remain controversial and only partially understood. Contributions of water, ions, cofactors, and biomolecular flexibility are actively examined but yet to be resolved. The state-of-the-art of each entropy method will be presented and explained, highlighting its capabilities and deficiencies. This will be followed by intensive discussion on the main areas that need improving, leading to suitable actions and collaborations to address the main biological and industrial questions.

Further details on the workshop can be found on the CECAM website.


As always, details on other upcoming workshops and conferences relating to information theory and biology can be found on our ITBio Conferences/Workshops page.



Workshop on Information Theoretic Incentives for Artificial Life

For those interested in the topics of information theory in biology and artificial life, Christoph Salge, Georg Martius, Keyan Ghazi-Zahedi, and Daniel Polani have announced a Satellite Workshop on Information Theoretic Incentives for Artificial Life at the 14th International Conference on the Synthesis and Simulation of Living Systems (ALife 2014) to be held at the Javits Center, New York, on July 30 or 31.

ALife2014 Banner

Their synopsis states:

Artificial Life aims to understand the basic and generic principles of life, and demonstrate this understanding by producing life-like systems based on those principles. In recent years, with the advent of the information age, and the widespread acceptance of information technology, our view of life has changed. Ideas such as “life is information processing” or “information holds the key to understanding life” have become more common. But what can information, or more formally Information Theory, offer to Artificial Life?

One relevant area is the motivation of behaviour for artificial agents, both virtual and real. Instead of learning to perform a specific task, informational measures can be used to define concepts such as boredom, empowerment or the ability to predict one’s own future. Intrinsic motivations derived from these concepts allow us to generate behaviour, ideally from an embodied and enactive perspective, which is based on basic but generic principles. The key questions here are: “What are the important intrinsic motivations a living agent has, and what behaviour can be produced by them?”

Related to an agent’s behaviour is also the question on how and where the necessary computation to realise this behaviour is performed. Can information be used to quantify the morphological computation of an embodied agent and to what degree are the computational limitations of an agent influencing its behaviour?

Another area of interest is the guidance of artificial evolution or adaptation. Assuming it is true that an agent wants to optimise its information processing, possibly obtaining as much relevant information as possible for the cheapest computational cost, then what behaviour would naturally follow from that? Can the development of social interaction or collective phenomena be motivated by an informational gradient? Furthermore, evolution itself can be seen as a process in which an agent population obtains information from the environment, which begs the question of how this can be quantified, and how systems would adapt to maximise this information.

The common theme in those different scenarios is the identification and quantification of driving forces behind evolution, learning, behaviour and other crucial processes of life, in the hope that the implementation or optimisation of these measurements will allow us to construct life-like systems.

Details for submissions, acceptances, potential talks, and dates can be found via Nihat Ay’s Research Group on Information Theory of Cognitive Systems. For more information on how to register, please visit the ALife 2014 homepage. If there are any questions, or if you just want to indicate interest in submitting or attending, please feel free to mail them at

According to their release, the open access journal Entropy will sponsor the workshop by an open call with a special issue on the topic of the workshop. More details will be announced to emails received via and over the alife and connectionists mailing lists.


Information Theory and Paleoanthropology

A few weeks ago I communicated a bit with paleoanthropologist John Hawks. I want to take a moment to highlight the fact that he maintains an excellent blog primarily concerning his areas of research, which include anthropology, genetics, and evolution. Even more specifically, he is one of the few people in these areas with at least a passing interest in information theory as it relates to his work. I recommend everyone take a look at his information-theory-specific posts.

silhouette of John Hawks from his blog

I’ve previously written a brief review of John Hawks’ “Major Transitions in Evolution” course (taught in collaboration with Anthony Martin) from The Teaching Company as part of their Great Courses series of lectures. Given my interest in the MOOC revolution in higher education, I’ll also mention that Dr. Hawks has recently begun a free Coursera class entitled “Human Evolution: Past and Future”. I’m sure his current course focuses more closely on human evolution than the prior course, which dedicated only a short segment to that period. Given Hawks’ excellent prior teaching work, I’m sure this will be of general interest to readers interested in information theory as it relates to evolution, biology, and big history.

I’d love to hear from others in the area of anthropology who are interested in information theoretical applications.

