Molecular Programming Project

Bookmarked Molecular Programming Project (Molecular Programming Project)

“The Molecular Programming Project aims to develop computer science principles for programming information-bearing molecules like DNA and RNA to create artificial biomolecular programs of similar complexity. Our long-term vision is to establish molecular programming as a subdiscipline of computer science — one that will enable a yet-to-be imagined array of applications from chemical circuitry for interacting with biological molecules to nanoscale computing and molecular robotics.”

Source: MPP: Home

A world of languages – and how many speak them (Infographic)

An infographic from the South China Morning Post has some interesting statistics that many modern people don’t know (or remember). It’s very interesting to see the distribution of languages and where they’re spoken. Of particular note, and something most will miss even in this infographic, is that 839 languages are spoken in Papua New Guinea (11.8% of all known languages on Earth). Given the winnowing effects of history and modernity, imagine how many languages there might have been without those pressures.

Why Information Grows: The Evolution of Order, from Atoms to Economies

I just ordered a copy of Why Information Grows: The Evolution of Order, from Atoms to Economies by Cesar Hidalgo. Although it seems more focused on economics, the base theory seems to fit right into some similar thoughts I’ve long held about biology.

From the book description:

“What is economic growth? And why, historically, has it occurred in only a few places? Previous efforts to answer these questions have focused on institutions, geography, finances, and psychology. But according to MIT’s antidisciplinarian César Hidalgo, understanding the nature of economic growth demands transcending the social sciences and including the natural sciences of information, networks, and complexity. To understand the growth of economies, Hidalgo argues, we first need to understand the growth of order.

At first glance, the universe seems hostile to order. Thermodynamics dictates that over time, order–or information–will disappear. Whispers vanish in the wind just like the beauty of swirling cigarette smoke collapses into disorderly clouds. But thermodynamics also has loopholes that promote the growth of information in pockets. Our cities are pockets where information grows, but they are not all the same. For every Silicon Valley, Tokyo, and Paris, there are dozens of places with economies that accomplish little more than pulling rocks off the ground. So, why does the US economy outstrip Brazil’s, and Brazil’s that of Chad? Why did the technology corridor along Boston’s Route 128 languish while Silicon Valley blossomed? In each case, the key is how people, firms, and the networks they form make use of information.

Seen from Hidalgo’s vantage, economies become distributed computers, made of networks of people, and the problem of economic development becomes the problem of making these computers more powerful. By uncovering the mechanisms that enable the growth of information in nature and society, Why Information Grows lays bare the origins of physical order and economic growth. Situated at the nexus of information theory, physics, sociology, and economics, this book propounds a new theory of how economies can do, not just more, but more interesting things.”

The Information Universe Conference

"The Information Universe" Conference in The Netherlands in October hits several of the sweet spots for areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology.

Yesterday, via a notification from Lanyrd, I came across a notice for the upcoming conference “The Information Universe,” which hits several of the sweet spots for areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology. It is scheduled for October 7-9, 2015, at the Infoversum Theater in Groningen, The Netherlands.

I’ll let their site speak for itself, but they already have an interesting lineup of speakers including:

Keynote speakers

• Erik Verlinde, Professor Theoretical Physics, University of Amsterdam, Netherlands
• Alex Szalay, Alumni Centennial Professor of Astronomy, The Johns Hopkins University, USA
• Gerard ‘t Hooft, Professor Theoretical Physics, University of Utrecht, Netherlands
• Gregory Chaitin, Professor Mathematics and Computer Science, Federal University of Rio de Janeiro, Brazil
• Charley Lineweaver, Professor Astronomy and Astrophysics, Australian National University, Australia
• Lude Franke, Professor System Genetics, University Medical Center Groningen, Netherlands


Additional details about the conference including the participants, program, venue, and registration can also be found at their website.

Videos from the NIMBioS Workshop on Information and Entropy in Biological Systems


Videos from the April 8-10, 2015, NIMBioS workshop on Information and Entropy in Biological Systems are slowly starting to appear on YouTube.

John Baez, one of the organizers of the workshop, is also going through them on his Azimuth blog, adding some interesting background and links for those who are looking for additional details and depth.

Popular Science Books on Information Theory, Biology, and Complexity

The beginning of a four-part series in which I provide a gradation of books and texts that lie at the intersection of the application of information theory, physics, and engineering practice to the area of biology.

Previously, I had made a large and somewhat random list of books which lie at the intersection of the application of information theory, physics, and engineering practice to the area of biology. Below I’ll begin to do a somewhat better job of providing a finer gradation of technical level for both the hobbyist and the aspiring student who wish to bring themselves to a higher level of understanding of these areas. In future posts, I’ll try to begin classifying other texts into graduated strata as well. The final list will be maintained here: Books at the Intersection of Information Theory and Biology.

Introductory / General Readership / Popular Science Books

These books are written at a generally non-technical level and give a broad overview of their topics with occasional forays into interesting or intriguing subtopics. They include few, if any, mathematical equations or formal conceptualizations. Typically, any high school student should be able to read, follow, and understand the broad concepts behind these books. Though often non-technical, these texts can give some useful insight into the topics at hand, even for the most advanced researchers.

The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life by Werner R. Loewenstein

Possibly one of the best places to start, this text gives a great overview of most of the major areas of study related to these fields.

Entropy Demystified: The Second Law Reduced to Plain Common Sense by Arieh Ben-Naim

One of the best books on the concept of entropy out there.  It can be read even by middle school students with no exposure to algebra and does a fantastic job of laying out the conceptualization of how entropy underlies large areas of the broader subject. Even those with Ph.D.’s in statistical thermodynamics can gain something useful from this lovely volume.

The Information: A History, a Theory, a Flood by James Gleick

A relatively recent popular science volume covering various conceptualizations of what information is and how it’s been dealt with in science and engineering. Though it has its flaws, it’s certainly a good introduction for the beginner, particularly with regard to history.

The Origin of Species by Charles Darwin

One of the most influential pieces of writing known to man, this classic text is the basis from which major strides in biology have been made. A must-read for everyone on the planet.

Information, Entropy, Life and the Universe: What We Know and What We Do Not Know by Arieh Ben-Naim

Information Theory and Evolution by John Avery

Information Theory, Evolution, and the Origin of Life by Hubert P. Yockey

The four books above by Loewenstein, Ben-Naim, Avery, and Yockey have a significant amount of overlap. Though one could read all of them, I recommend that those pressed for time choose Ben-Naim first. As I write this, I’ll note that Ben-Naim’s book is scheduled for release on May 30, 2015, but he’s been kind enough to allow me to read an advance copy while it was in process; it gets my highest recommendation in its class. Loewenstein covers a bit more than Avery, who also has a more basic presentation. Most who continue with the subject will later come across Yockey’s Information Theory and Molecular Biology, which is similar to his text here but written at a slightly higher level of sophistication. Those who wish to finish at that level of sophistication might want to try Yockey third instead.

The Red Queen: Sex and the Evolution of Human Nature by Matt Ridley

Grammatical Man: Information, Entropy, Language, and Life  by Jeremy Campbell

Life’s Ratchet: How Molecular Machines Extract Order from Chaos by Peter M. Hoffmann

Complexity: The Emerging Science at the Edge of Order and Chaos by M. Mitchell Waldrop

The Big Picture: On the Origins of Life, Meaning, and the Universe Itself by Sean Carroll (Dutton, May 10, 2016)

In the coming weeks/months, I’ll try to continue putting recommended books on the remainder of the spectrum, the balance of which follows in outline form below. As always, I welcome suggestions and recommendations based on others’ experiences as well. If you’d like to suggest additional resources in any of the sections below, please do so via our suggestion box. For those interested in additional resources, please take a look at the ITBio Resources page, which includes information about related research groups; references and journal articles; academic and research institutes, societies, groups, and organizations; and conferences, workshops, and symposia.

Lower Level Undergraduate

These books are written at a level that can be grasped and understood by most students at the freshman or sophomore university level. Coursework in math, science, and engineering will usually presume knowledge of calculus, basic probability theory, introductory physics, chemistry, and basic biology.

Upper Level Undergraduate

These books are written at a level that can be grasped and understood by those at the junior or senior university level. Coursework in math, science, and engineering may presume knowledge of probability theory, differential equations, linear algebra, complex analysis, abstract algebra, signal processing, organic chemistry, molecular biology, evolutionary theory, thermodynamics, advanced physics, and basic information theory.

Graduate Level

These books are written at a level that can be grasped and understood by those working at the master’s level at most universities. Coursework presumes all the previously mentioned classes, though it may require a higher level of sub-specialization in one or more areas of mathematics, physics, biology, or engineering practice. Because of the depth and breadth of the disciplines covered here, many may feel the need to delve into areas outside of their particular specialization.

Nicolas Perony: Puppies! Now that I’ve got your attention, complexity theory | TED

Animal behavior isn't complicated, but it is complex. Nicolas Perony studies how individual animals — be they Scottish Terriers, bats, or meerkats — follow simple rules that, collectively, create larger patterns of behavior, and how this complexity born of simplicity can help them adapt to new circumstances as they arise.

For those who are looking for a good, simple, and entertaining explanation of the concept of emergent properties and behavior within complexity theory (or Big History), I just came across a nice TED talk that simplifies complexity using a few animal examples, including a cute puppy video as well as bat and meerkat examples. The latter two, which also have implications for evolution and survival, are lovely examples as well.

BIRS Workshop on Biological and Bio-Inspired Information Theory | Storify Stream


Over the span of the coming week, I’ll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.

Editor’s note: On 12/12/17 Storify announced they would be shutting down. As a result, I’m replacing the embedded version of the original data served by Storify with an HTML copy, which can be found below:

BIRS: Biological and Bio-Inspired Information Theory

A 5 Day workshop on Biology and Information Theory hosted by the Banff International Research Station

1. I know where I’ll be in Oct 2014! Let’s hear it for Biology & Information Theory!  https://www.birs.ca/events/2014/5-day-workshops/14w5170  #ITBio #Banff @andreweckford
2. . @andreweckford You might be interested in this grouping of research papers:  http://www.mendeley.com/groups/2545131/itbio/  #ITBio #Banff
3. Wishing I was at the Gene Regulation and Information Theory meeting starting tomorrow  http://bit.ly/XnHRZs  #ITBio
4. #ITBio: @andreweckford has a new book on Molecular Communication available Oct 31.  http://bit.ly/15uEUzF
5. Mathematical and Statistical Models for Genetic Coding starts today.  http://www.am.hs-mannheim.de/genetic_code_2013.php?id=1  @andreweckford might borrow attendees for BIRS
6. Mathematical Foundations for Information Theory in Diffusion-Based Molecular Communications  http://bit.ly/1aTVR2c  #ITBio
7. Bill Bialek giving plenary talk “Information flow & order in real biological networks” at Feb 2014 workshop  http://mnd.ly/19LQH8f  #ITBio
8. Workshop on Information Theoretic Incentives for Artificial Life  http://jhu.md/1lM8tAn  #ITBio #ALife14 @alifeofficial @14thALIFE @cxdig
9. Researchers working in information theory & biology  http://jhu.md/1gieQGR  #ITBio @andreweckford @ChristophAdami @wbialek @johnhawks
10. CECAM Workshop: “Entropy in Biomolecular Systems” starts May 14 in Vienna.  http://jhu.md/1faLR8t  #ITBio pic.twitter.com/Ty8dEIXQUT
11. Currently organizing my Banff workshop on bio-information theory …  https://www.birs.ca/events/2014/5-day-workshops/14w5170
12. Last RT: wonder what the weather is going to be like at the end of October for my @BIRS_Math workshop
13. @JoVanEvery I’m organizing a workshop in Banff in October … hopefully this isn’t a sign of weather to come!
14. How information theory could hold the key to quantifying nature.  http://wrd.cm/1uy1xdX  by @vero_greenwood pic.twitter.com/ek5DUb2Ul9
15. Good morning from Banff. Current temp: -1 C
16. Banff takes its name from the town of Banff, Scotland, not to be confused with Bamff, also Scotland.
17. Good morning from beautiful Banff. How can you not love the mountains? pic.twitter.com/mxYBNz7yzl
18. “Not an obvious connection between utility and information, just as there is no obvious connection between energy and entropy” @BIRS_Math
19. Peter Thomas (Case Western Reserve University), Signal Transduction and Information Theory  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410270940-Thomas.mp4
20. Last RT: a lot of discussion of my signal transduction work with Peter Thomas.
21. Live now: Nicolo Michelusi of @USCViterbi on Stochastic Model for Electron Transfer in Bacterial Cables  http://www.birs.ca/live  #ITBio
22. Nicolo Michelusi (University of Southern California), A Stochastic Model for Electron Transfer in Bacterial Cables  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271450-Michelusi.mp4
23. Listening to the always awesome @cnmirose talk about the ultimate limits of molecular communication.
24. “Timing is fundamental … subsumes time-varying concentration channel” @cnmirose @BIRS_Math
25. Chris Rose (Rutgers University), Molecular Communication Channels: timing vs. payload  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271538-Rose.mp4
26. Standard opening quote of these talks: “I’m not a biologist, but …” @BIRS_Math
27. Stefan Moser (ETH Zurich), Capacity Bounds of the Memoryless AIGN Channel – a Toy-Model for Molecular Communicat…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271610-Moser.mp4
28. Weisi Guo (University of Warwick), Communication Envelopes for Molecular Diffusion and Electromagnetic Wave Propag…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271643-Guo.mp4
29. Terrific introduction of Canada/Banff by Andrew Eckford (York), The Landscape  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410270858-Eckford.mp4
30. Biological and Bio-Inspired Information Theory workshop videos!  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos  @BIRS_Math
31. .@ChrisAldrich @andreweckford @Storify @BIRS_Math Sounds like a fascinating workshop on bioinformation theory in Banff.
32. Toby Berger, winner of the 2002 Shannon award, speaking right now. @BIRS_Math
33. Naftali Tishby (Hebrew University of Jerusalem), Sensing and acting under information constraints – a principled a…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281032-Tishby.mp4
34. “…places such as BIRS and the Banff Centre exist to facilitate the exchange and pursuit of knowledge.” S. Sundaram  http://www.birs.ca/testimonials/#testimonial-1454
35. We’re going for a hike tomorrow. Many thanks to Lukas at the @ParksCanada info centre in Banff for helpful advice! @BIRS_Math
36. Behnaam Aazhang (Rice University), Real-Time Network Modulation for Intractable Epilepsy  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281337-Aazhang.mp4
37. Alexander Dimitrov (Washington State University), Invariant signal processing in auditory biological systems  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281416-Dimitrov.mp4
38. Joel Zylberberg (University of Washington), Communicating with noisy signals: lessons learned from the mammalian v…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281450-Zylberberg.mp4
39. Robert Schober (Universitat Erlangen-Nurnberg), Intersymbol interference mitigation in diffusive molecular communi…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281549-Schober.mp4
40. Rudolf Rabenstein (Friedrich-Alexander-Universitat Erlangen-Nurnberg (FAU)), Modelling Molecular Communication Cha…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281627-Rabenstein.mp4
42. This week @BIRS_Math “Biological and Bio-Inspired Information Theory” @thebanffcentre #biology #math @NSF
43. “Your theory might match the data, but the data might be wrong” – Crick @BIRS_Math
44. So information theory seems to be a big deal in ecology. @BIRS_Math
45. Tom Schneider (National Institutes of Health), Three Principles of Biological States: Ecology and Cancer  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410290904-Schneider.mp4
46. “In biodiversity, the entropy of an ecosystem is the expected … information we gain about an organism by learning its species” @BIRS_Math
47. Seriously, I’m blown away by this work in information theory in ecology. Huge body of work; I had no idea. @BIRS_Math
48. .@andreweckford @BIRS_Math Harte’s book Maximum Entropy & Ecology is excellent in this area  http://amzn.to/1DwIl3V  pic.twitter.com/EIBDpM35uf
49. .@andreweckford @QuantaMagazine had a nice overview of some of John Harte’s work in September  http://bit.ly/1DwIWCD  @BIRS_Math
50. Chan-Byoung Chae (Yonsei University), Molecular MIMO: From Theory to Practice  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281705-Chae.mp4
51. John Baez (University of California, Riverside), Biodiversity, entropy and thermodynamics  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410291038-Baez.mp4
52. I encourage @BIRS_Math attendees at Biological & Bio-Inspired Information Theory to contribute references here:  http://bit.ly/1jQwObk
53. Christoph Adami (Michigan State University), Some Information-Theoretic Musings Concerning the Origin and Evolutio…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410291114-Adami.mp4
54. .@ChristophAdami talk Some Information-Theoretic Musings Concerning the Origin of Life @BIRS_Math this morning #ITBio pic.twitter.com/VA8komuuSW
55. ICYMI @ChristophAdami had great paper: Information-theoretic Considerations on Origin of Life on arXiv  http://bit.ly/1yIhK2Q  @BIRS_Math
56. Johnston Canyon selfie @BIRS_Math pic.twitter.com/MEKeY5To5s
57. Baez has a post on Tishby’s talk “Sensing & Acting Under Information Constraints”  http://bit.ly/1yIDonR  @BIRS_Math pic.twitter.com/dFuiVLFSGC
58. INFORMATION THEORY is the new central …
59. I’m listening to a talk on the origin of life at a workshop on Biological and Bio-Inspired Information Theory. …  https://plus.google.com/117562920675666983007/posts/gqFL7XY3quF
60. Ilya Nemenman @EmoryUniversity on Predictive information  http://bit.ly/1titfOw
61. Now accepting applications for the #Research Collaboration Workshop for Women in #MathBio at NIMBioS  http://ow.ly/DzeZ7
62. Inkpots selfie from yesterday’s hike. @BIRS_Math pic.twitter.com/0A6ZQsQVwE
63. On the way home from Inkpots. @BIRS_Math pic.twitter.com/1XhO8mLOkq
64. Toby Berger (University of Virginia), Neuroscience Applications of GIG Distributions  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410280914-Berger.mp4
65. We removed a faulty microphone from our lecture room this morning. We’re now fixing the audio buzz in this week’s videos, and reposting.
66. Daniel Polani (University of Hertfordshire), Informational Principles in Perception-Action Loops  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301038-Polani.mp4
67. Didn’t get enough information theory & biology this week @BIRS_Math? Apply for NIMBioS workshop in April 2015  http://bit.ly/1yIeiWe  #ITBio
68. Amin Emad (University of Illinois at Urbana-Champaign), Applications of Discrete Mathematics in Bioinformatics  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301329-Emad.mp4
69. Paul Bogdan (University of Southern California), Multiscale Analysis Reveals Complex Behavior in Bacteria Populati…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301401-Bogdan.mp4
70. Robert Shaw (ProtoLife Inc.), Information and Causality in a Reaction-Diffusion System  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301434-Shaw.mp4
71. Lubomir Kostal (Institute of Physiology, Academy of Sciences of the Czech Republic), Efficient information transmi…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301534-Kostal.mp4
72. Nima Soltani (Stanford University), Applications of Directed Information to Neuroscience  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301647-Soltani.mp4
73. Banff ☀️❄️🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲❤️
74. @lrvarshney I shoulda invited you to this BIRS workshop …
75. @conservativelez I’m a big fan of your dad’s research & was reminded of much of it via a workshop on Biological Information Theory
76. @conservativelez Though he may not have been able to attend, he can catch most of the talks online if he’d like  https://www.birs.ca/events/2014/5-day-workshops/14w5170
77. Depressed that @BIRS_Math Workshop on Biological & Bio-Inspired Information Theory is over? Relive it here:  http://bit.ly/1rF3G4B  #ITBio
78. Kudos @andreweckford, Toby Berger, Peter Thomas, @NGhoussoub, @BIRS_Math & friends on a fantastic workshop!  http://bit.ly/1ckttZq
79. This @BIRS_Math Workshop was biggest thing in #informationtheory & #biology since the Gatlinburg Symposium in 1956.  http://bit.ly/1rF4RRr
80. See you later Calgary. pic.twitter.com/mkmU6yrmVz
81. A few thoughts about that workshop while I wait for my flight back to Toronto.
82. 1/ Everyone I talked to said it was the best workshop they’d ever been to, and they’d like to do a follow-up workshop @BIRS_Math
83. 2/ There is an amazing diversity of work under the umbrella of “information theory”. @BIRS_Math
84. 3/ Much of this work is outside the IT mainstream, and an issue is that people use different terms for related concepts. @BIRS_Math
85. 4/ Some community building is in order. I think this workshop was a good first step. @BIRS_Math
86. 5/ Many many thanks to @BIRS_Math and huge kudos to @NGhoussoub for excellent service to the Canadian scientific community. BIRS is a gem.
87. 6/ Also many thanks to the participants for their excellent talks, and to @ChrisAldrich for maintaining a Storify.

Information Theory is the New Central Discipline

Replied to Information Theory is the new central discipline. by Nassim Nicholas Taleb (facebook.com)

I’m coming to this post a bit late as I’m playing a bit of catch-up, but I agree with it wholeheartedly.

In particular, applications to molecular biology and medicine have really begun to come to a heavy boil in just the past five years. This particular year seems to mark the beginning of the biggest renaissance in the application of information theory to biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s “Symposium on Information Theory in Biology” at Gatlinburg, Tennessee in 1956.

Upcoming/recent conferences/workshops on information theory in biology include the BIRS Workshop on Biological and Bio-Inspired Information Theory (October 2014) and the NIMBioS Workshop on Information and Entropy in Biological Systems (April 2015), both covered elsewhere on this site.

At the beginning of September, Christoph Adami posted an awesome and very sound paper on arXiv entitled “Information-theoretic considerations concerning the origin of life,” which promises to turn the science of the origin of life on its head.

I’ll note in passing, for those interested, that Claude Shannon’s famous master’s thesis at MIT (in which he applied Boolean algebra to electric circuits, enabling the digital revolution) and his subsequent paper “A Mathematical Theory of Communication” were so revolutionary that nearly everyone forgets his MIT Ph.D. thesis “An Algebra for Theoretical Genetics,” which presaged the field of cybernetics and the current applications of information theory to microbiology, and which is probably as seminal as Sir R.A. Fisher’s applications of statistics to science in general and biology in particular.

For those commenting on the post who were interested in a layman’s introduction to information theory, I recommend John Robinson Pierce’s An Introduction to Information Theory: Symbols, Signals and Noise (Dover has a very inexpensive edition). After this, one should take a look at Claude Shannon’s original paper. (The MIT Press printing includes an excellent overview by Warren Weaver along with the paper itself.) The mathematics in the paper really isn’t too technical, and most of it should be comprehensible by advanced high school students.

For those who don’t understand the concept of entropy, I HIGHLY recommend Arieh Ben-Naim’s book Entropy Demystified: The Second Law Reduced to Plain Common Sense with Seven Simulated Games. He really does tear the concept down into its most basic form in a way I haven’t seen others come remotely close to, and one which even my mother can comprehend (with no mathematics at all). (I recommend this presentation even to those with Ph.D.’s in physics because it is so truly fundamental.)

For the more advanced mathematicians, physicists, and engineers, Arieh Ben-Naim does a truly spectacular job of extending E.T. Jaynes’ work on information theory and statistical mechanics, arriving at a more coherent mathematical theory that conjoins the entropy of physics/statistical mechanics with that of Shannon’s information theory, in A Farewell to Entropy: Statistical Thermodynamics Based on Information.

For the advanced readers/researchers interested in more at the intersection of information theory and biology, I’ll also mention that I maintain a list of references, books, and journal articles in a Mendeley group entitled “ITBio: Information Theory, Microbiology, Evolution, and Complexity.”

Announcement: Special issue of Entropy: “Information Theoretic Incentives for Cognitive Systems”


Editor’s Note: Dr. Christoph Salge asked me to cross-post the following notice from the Entropy site here. I’m sure many of the readers of the blog will find the topic to be of interest.

Dear Colleagues,

In recent years, ideas such as “life is information processing” or “information holds the key to understanding life” have become more common. However, how can information, or more formally Information Theory, increase our understanding of life, or life-like systems?

Information Theory not only has a profound mathematical basis, but also typically provides an intuitive understanding of processes such as learning, behavior, and evolution in terms of information processing.

In this special issue, we are interested in both:

1. the information-theoretic formalization and quantification of different aspects of life, such as driving forces of learning and behavior generation, information flows between neurons, swarm members and social agents, and information theoretic aspects of evolution and adaptation, and
2. the simulation and creation of life-like systems with previously identified principles and incentives.

Topics relating to artificial and natural systems:

• information theoretic intrinsic motivations
• information theoretic quantification of behavior
• information theoretic guidance of artificial evolution
• information theoretic guidance of self-organization
• information theoretic driving forces behind learning
• information theoretic driving forces behind behavior
• information theory in swarms
• information theory in social behavior
• information theory in evolution
• information theory in the brain
• information theory in system-environment distinction
• information theory in the perception action loop
• information theoretic definitions of life

Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1200 CHF (Swiss Francs).

Deadline for manuscript submissions: 28 February 2015

Special Issue Editors

Guest Editor
Dr. Christoph Salge
Adaptive Systems Research Group, University of Hertfordshire, College Lane, AL10 9AB Hatfield, UK
Website: http://homepages.stca.herts.ac.uk/~cs08abi
E-Mail: c.salge@herts.ac.uk
Phone: +44 1707 28 4490
Interests: Intrinsic Motivation (Empowerment); Self-Organization; Guided Self-Organization; Information-Theoretic Incentives for Social Interaction; Information-Theoretic Incentives for Swarms; Information Theory and Computer Game AI

Guest Editor
Dr. Georg Martius
Cognition and Neurosciences, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
Website: http://www.mis.mpg.de/jjost/members/georg-martius.html
E-Mail: martius@mis.mpg.de
Phone: +49 341 9959 545
Interests: Autonomous Robots; Self-Organization; Guided Self-Organization; Information Theory; Dynamical Systems; Machine Learning; Neuroscience of Learning; Optimal Control

Guest Editor
Dr. Keyan Ghazi-Zahedi
Information Theory of Cognitive Systems, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
Website: http://personal-homepages.mis.mpg.de/zahedi
E-Mail: zahedi@mis.mpg.de
Phone: +49 341 9959 535
Interests: Embodied Artificial Intelligence; Information Theory of the Sensorimotor Loop; Dynamical Systems; Cybernetics; Self-organisation; Synaptic plasticity; Evolutionary Robotics

Guest Editor
Dr. Daniel Polani
Department of Computer Science, University of Hertfordshire, Hatfield AL10 9AB, UK
Website: http://homepages.feis.herts.ac.uk/~comqdp1/
E-Mail: d.polani@herts.ac.uk
Interests: artificial intelligence; artificial life; information theory for intelligent information processing; sensor evolution; collective and multiagent systems

Workshop Announcement: Entropy and Information in Biological Systems

I rarely, if ever, reblog anything here, but this particular post from John Baez’s blog Azimuth is so on-topic, that attempting to embellish it seems silly.

Entropy and Information in Biological Systems (Part 2)

John Harte, Marc Harper and I are running a workshop! Now you can apply here to attend:

Information and entropy in biological systems, National Institute for Mathematical and Biological Synthesis, Knoxville, Tennessee, Wednesday-Friday, 8-10 April 2015.

Financial support for travel, meals, and lodging is available for workshop attendees who need it. We will choose among the applicants and invite 10-15 of them.

The idea
Information theory and entropy methods are becoming powerful tools in biology, from the level of individual cells, to whole ecosystems, to experimental design, model-building, and the measurement of biodiversity. The aim of this investigative workshop is to synthesize different ways of applying these concepts to help systematize and unify work in biological systems. Early attempts at “grand syntheses” often misfired, but applications of information theory and entropy to specific highly focused topics in biology have been increasingly successful. In ecology, entropy maximization methods have proven successful in predicting the distribution and abundance of species. Entropy is also widely used as a measure of biodiversity. Work on the role of information in game theory has shed new light on evolution. As a population evolves, it can be seen as gaining information about its environment. The principle of maximum entropy production has emerged as a fascinating yet controversial approach to predicting the behavior of biological systems, from individual organisms to whole ecosystems. This investigative workshop will bring together top researchers from these diverse fields to share insights and methods and address some long-standing conceptual problems.
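To make one of those applications concrete: entropy as a biodiversity measure is just the Shannon entropy of the species-abundance distribution. Here is a minimal illustrative sketch (my own, with invented counts; not part of the announcement):

```python
# A quick illustration of Shannon entropy as a biodiversity index.
# Given relative abundances p_i of each species, H = -sum(p_i * log(p_i))
# is the expected information gained by learning a sampled organism's species.
import math

def shannon_diversity(counts):
    """Shannon diversity index (in nats) from raw species counts."""
    total = sum(counts)
    proportions = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in proportions)

# Invented plot counts: an even community scores higher (more "surprise"
# per sampled organism) than one dominated by a single species.
print(shannon_diversity([25, 25, 25, 25]))  # 1.386... = ln 4, the maximum
print(shannon_diversity([97, 1, 1, 1]))     # 0.167..., close to zero
```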

So, here are the goals of our workshop:

•  To study the validity of the principle of Maximum Entropy Production (MEP), which states that biological systems – and indeed all open, non-equilibrium systems – act to produce entropy at the maximum rate.
• To familiarize all the participants with applications to ecology of the MaxEnt method: choosing the probabilistic hypothesis with the highest entropy subject to the constraints of our data (see the sketch just after this list). We will compare MaxEnt with competing approaches and examine whether MaxEnt provides a sufficient justification for the principle of MEP.
• To clarify relations between known characterizations of entropy, the use of entropy as a measure of biodiversity, and the use of MaxEnt methods in ecology.
• To develop the concept of evolutionary games as “learning” processes in which information is gained over time.
• To study the interplay between information theory and the thermodynamics of individual cells and organelles.
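As a toy illustration of the MaxEnt method named in the second goal (my sketch, with made-up numbers; not part of the workshop announcement), one can numerically choose the maximum entropy distribution consistent with a single measured constraint:

```python
# Numerically pick the maximum entropy distribution over abundance
# classes 1..10 that is consistent with one piece of data: a mean of 3.
# All numbers here are invented for illustration.
import numpy as np
from scipy.optimize import minimize

x = np.arange(1, 11)   # hypothetical abundance classes
target_mean = 3.0      # hypothetical constraint "measured" from data

def neg_entropy(p):
    # minimizing sum(p log p) maximizes the Shannon entropy H(p)
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},        # probabilities sum to 1
    {"type": "eq", "fun": lambda p: p @ x - target_mean},  # matches observed mean
]
p0 = np.full(len(x), 0.1)  # start from the uniform distribution
result = minimize(neg_entropy, p0, bounds=[(1e-9, 1.0)] * len(x),
                  constraints=constraints)

# The optimizer recovers the known analytic answer: p_i proportional to
# exp(-lam * x_i), a discrete exponential (Boltzmann-like) distribution.
print(np.round(result.x, 4))
```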

For more details, go here.

If you’ve got colleagues who might be interested in this, please let them know. A PDF suitable for printing and putting on a bulletin board is also available for download.

New Routledge Text on Systems Theory

Over the holiday I ran across a press release, which follows with web links added, for a new book on systems theory. It promises to be an excellent read on the development and philosophy of systems theory for those interested in cybernetics, information theory, complexity and related topics.

MIAMI, Fla., Dec. 19, 2013
Dr. Darrell Arnold, Assistant Professor of Philosophy and Director of the Institute for World Languages and Cultures at St. Thomas University, has published an edited volume with Routledge entitled Traditions of Systems Theory: Major Figures and Contemporary Developments. Hans-Georg Moeller, of University College Cork, Ireland, notes that the book “provides a state-of-the-art survey of the increasingly influential and fascinating field of systems theory. It is a highly useful resource for a wide range of disciplines and contributes significantly to bringing together current trends in the sciences and the humanities.” The book includes 17 articles from leading theoreticians in the field, including pieces by Ranulph Glanville, the President of the American Society for Cybernetics, as well as Debora Hammond, the former President of the International Society for Systems Sciences. It is the first comprehensive edited volume in English on the major and countervailing developments within systems theory.

Dr. Arnold writes on 19th century German philosophy, contemporary social theory, as well as technology and globalization, with a focus on how these areas relate to the environmental problematic. He has translated numerous books from German, including C. Mantzavinos’s Naturalistic Hermeneutics (Cambridge UP) and Matthias Vogel’s Media of Reason (Columbia UP). Dr. Arnold is also editor-in-chief of the Humanities and Technology Review.

I’ve ordered my copy and will be providing a review shortly.

Review of The Signal and the Noise: Why So Many Predictions Fail – But Some Don’t

Read The Signal and the Noise: Why So Many Predictions Fail - But Some Don't by Nate Silver (Amazon.com)
The Signal and the Noise: Why So Many Predictions Fail, But Some Don't
Nate Silver
Penguin Press HC
September 27, 2012
Hardcover
534 pages
personal library

The founder of FiveThirtyEight.com challenges myths about predictions in subjects ranging from the financial market and weather to sports and politics, profiling the world of prediction to explain how readers can distinguish true signals from hype, in a report that also reveals the sources and societal costs of wrongful predictions.

Given the technical nature of what Nate Silver does, and some of the early mentions of the book, I had higher hopes for its technical portions. As usual for a popular text, I was left wanting a lot more. Again, the lack of any math left a lot to be desired. I wish technical writers could get away with even a handful of equations, but wishing just won’t make it so.

The first few chapters were a bit more technical sounding, but eventually devolved into a more journalistic viewpoint of statistics, prediction, and forecasting in general within the areas of economics, political elections, weather forecasting, earthquakes, baseball, poker, chess, and terrorism. I have a feeling he lost a large part of his audience in the first few chapters by discussing the economic meltdown of 2008 first instead of baseball or poker and then getting into politics and economics.

While the discussions around each of these bigger topics are all intrinsically interesting, and there were a few interesting tidbits I hadn’t heard or read about previously, on the whole it wasn’t really as novel as I had hoped. I do think it should be required reading for all politicians, however, as I too often get the feeling that none of them think at this level.

There was some reasonably good philosophical discussion of Bayesian statistics versus Fisherian, but it was all too short and could have been fleshed out more significantly. I still prefer David Applebaum’s historical and philosophical discussion of probability in Probability and Information: An Integrated Approach, though, surprisingly, he doesn’t mention R.A. Fisher directly in his coverage.

It was interesting to run across additional mentions of power laws in the realms of earthquakes and terrorism after reading Melanie Mitchell’s Complexity: A Guided Tour (review here), but I’ll have to find some texts which describe the mathematics in full detail. There was a surprisingly large amount of discussion skirting around topics within complexity without delving into them in any substantive form.

For those with a pre-existing background in science and especially probability theory, I’d recommend skipping this and simply reading Daniel Kahneman’s book Thinking, Fast and Slow. Kahneman’s work is referenced several times and his book seems less intuitive than some of the material Silver presents here.

This is the kind of text which should be required reading in high school civics classes. Perhaps it might motivate more students to be interested in statistics and science-related pursuits, as these are almost always at the root of most political and policy questions at the end of the day.

Personally, I’d give this three stars, but the broader public should view it with at least four stars if not five, as there is some truly great stuff here. Unfortunately, a lot of it is old hat or retreaded material for me.

Book Review: “Complexity: A Guided Tour” by Melanie Mitchell

Read Complexity: A Guided Tour by Melanie Mitchell (amzn.to)
Complexity: A Guided Tour
Melanie Mitchell
Popular Science
Oxford University Press
May 28, 2009
Hardcover
366 pages

This book provides an intimate, highly readable tour of the sciences of complexity, which seek to explain how large-scale complex, organized, and adaptive behavior can emerge from simple interactions among myriad individuals. The author, a leading complex systems scientist, describes the history of ideas, current research, and future prospects in this vital scientific effort.

This is handily one of the best, most interesting, and (to me at least) most useful popularly written science books I’ve yet to come across. Most popular science books bore me to tears and end up being merely pedantic about their historical backgrounds, but this one is very succinct, with some interesting viewpoints (some of which I agree with and some of which my intuition says are terribly wrong) on the overall structure presented.

For those interested in a general and easily readable high-level overview of some of the areas of research I’ve been interested in (information theory, thermodynamics, entropy, microbiology, evolution, genetics, along with computation, dynamics, chaos, complexity, genetic algorithms, cellular automata, etc.) for the past two decades, this is really a lovely and thought-provoking book.

At the start I was disappointed that there were almost no equations in the book to speak of – and perhaps this is why I had purchased it when it came out and it has subsequently been sitting on my shelf for so long. The other factor that prevented me from reading it was the depth and breadth of other, more technical material I’ve read which covers the majority of topics in the book. I ultimately found myself not minding so much that there weren’t many supporting equations (aside from a few hidden in the notes at the end of the text), in large part because Dr. Mitchell does a fantastic job of pointing out some great subtleties within the various subjects which comprise the broader concept of complexity – subtleties which one would generally take several years to come to on one’s own, and at far greater expense of time. Here she provides a much stronger picture of the overall subjects covered, and this far outweighed the lack of specificity. I honestly wish I had read the book when it was released; it may have helped me to be more specific in my own research. Fortunately, she does bring up several areas I will need to delve into more deeply and raises several questions which will significantly inform my future work.

In general, I wish there were more references I hadn’t read or been aware of yet, but towards the end there were a handful of topics relating to fractals, chaos, computer science, and cellular automata which I have been either ignorant of or which are further down my reading lists and may need to move closer to the top. I look forward to delving into many of these shortly. As a simple example, I’ve seen Zipf’s law separately from the perspectives of information theory, linguistics, and even evolution, but this is the first time I’ve seen it related to power laws and fractals.
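As an aside for readers who, like me, first met Zipf’s law in other contexts: the connection Mitchell draws is that Zipf’s law is itself a power law. Ranking words by frequency, the frequency of the word at rank $r$ falls off roughly as

$$f(r) \propto \frac{1}{r^s}, \qquad s \approx 1,$$

the same heavy-tailed functional form that appears in fractal scaling and in many other complex systems.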

I definitely appreciated the fact that Dr. Mitchell took the time to point out her own personal feelings on several topics, and more so that she explicitly flagged them as her own gut instincts instead of mentioning them in passing as if they were provable science, which is what far too many other authors would likely have done. There are many viewpoints she takes which I certainly don’t agree with, but I suspect that it’s because I’m coming at things from the viewpoint of an electrical engineer with a stronger background in information theory and microbiology, while hers is closer to that of computer science. She does mention that her undergraduate background was in mathematics, but I’m curious what areas she specifically studied so I could better understand her particular viewpoints.

Her final chapter, looking at some of the pros and cons of the topic(s), was very welcome, particularly in light of previous philosophic attempts like cybernetics and general systems theory, which I (also) think failed because of their lack of specificity. These caveats certainly help to place the scientific philosophy of complexity into a much larger context. I heartily agree with her viewpoint (and that of others) that there needs to be a more rigorous mathematical theory underpinning the overall effort. I’m sure we’re all wondering “Where is our Newton?” or, to use her clever aphorism, we’re “waiting for Carnot.” (Sounds like it should be a Tom Stoppard play title, doesn’t it?)

I might question her brief inclusion of her own Ph.D. thesis work in the text, but it did actually provide a nice specific and self-contained example within the broader context and also helped to tie several of the chapters together.

My one slight criticism of the work would be the lack of better footnoting within the text. Though many feel that footnote numbers within the text, or footnotes at the bottom of the pages, detract from the “flow” of the work, I found myself wishing that she had used them here, particularly as I’m one of the few who actually cares about the footnotes and wants to know the specific references as I read. I hope that Oxford eventually publishes an e-book version that includes cross-linked footnotes for the benefit of others.

I can heartily recommend this book to any fan of science, but I would specifically recommend it to any undergraduate science or engineering major who is unsure of what they’d like to study and needs some interesting areas to consider. I will mention that one of the tough parts of the concept of complexity is that it is so broad and general that it encompasses over a dozen other fields of study, in each of which one could earn a Ph.D. without ever plumbing the full depth of that one field, much less all of them. The book is so well written that I’d even recommend it to senior researchers in any of the above-mentioned fields, as it is certain to provide not only some excellent overview history of each but also to raise questions and thoughts that they’ll want to include in their future research in their own sub-areas of expertise.

How to Sidestep Mathematical Equations in Popular Science Books

In the publishing industry there is a general rule of thumb that every mathematical equation included in a science book written for a popular audience will cut its readership in half – presumably in geometric progression. This typically means that including even a handful of equations will leave you with an effective readership of zero – something no author, and certainly no editor or publisher, wants.
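Stated as a formula (my own playful formalization of the rule, not anything publishers actually compute), a book with $n$ equations and a potential readership of $R_0$ retains

$$R(n) = R_0 \cdot 2^{-n},$$

so ten equations already leave only about a thousandth of the original audience, since $2^{-10} \approx 0.001$.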

I suspect that there is a corollary: every picture included in the text will help to increase your readership, though possibly not by as proportionally large an amount.

In any case, while reading Melanie Mitchell’s text Complexity: A Guided Tour [Oxford University Press, 2009] this weekend, I noticed that, in what appears to be a concerted effort to include an equation without technically writing it into the text – and to simultaneously increase readership by including a picture – she cleverly used a photograph of Boltzmann’s tombstone in Vienna! Most fans of thermodynamics will immediately recognize Boltzmann’s equation for entropy, $S = k \log W$, which appears engraved on the tombstone above his bust.
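For readers who want to see the tombstone formula do some work, here is a minimal sketch (my own illustrative numbers, not from Mitchell’s book) computing the Boltzmann entropy of a simple system of two-state particles:

```python
# Boltzmann's S = k log W for N independent two-state particles,
# which have W = 2**N equally likely microstates.
import math

k_B = 1.380649e-23   # Boltzmann constant in J/K (exact in the 2019 SI)
N = 6.02214076e23    # one mole of particles (Avogadro's number)

# log(W) = log(2**N) = N * log(2); computed this way because 2**N
# would overflow any floating-point type.
S = k_B * N * math.log(2)
print(f"S = {S:.3f} J/K")  # ~5.763 J/K, i.e. R * ln 2 per mole
```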

I hope that future mathematicians, scientists, and engineers will keep this in mind and have their tombstones engraved with key formulae to assist future authors in doing the same – hopefully this will help to increase the amount of mathematics that is deemed “acceptable” by the general public.