The Information Theory of Life | Quanta Magazine

Bookmarked The Information Theory of Life (Quanta Magazine)
The Information Theory of Life: The polymath Christoph Adami is investigating life’s origins by reimagining living things as self-perpetuating information strings.

Why Information Grows: The Evolution of Order, from Atoms to Economies

I just ordered a copy of Why Information Grows: The Evolution of Order, from Atoms to Economies by César Hidalgo. Although it seems more focused on economics, the book’s base theory fits right into some similar thoughts I’ve long held about biology.

Why Information Grows: The Evolution of Order, from Atoms to Economies by César Hidalgo

 

From the book description:

“What is economic growth? And why, historically, has it occurred in only a few places? Previous efforts to answer these questions have focused on institutions, geography, finances, and psychology. But according to MIT’s antidisciplinarian César Hidalgo, understanding the nature of economic growth demands transcending the social sciences and including the natural sciences of information, networks, and complexity. To understand the growth of economies, Hidalgo argues, we first need to understand the growth of order.

At first glance, the universe seems hostile to order. Thermodynamics dictates that over time, order–or information–will disappear. Whispers vanish in the wind just like the beauty of swirling cigarette smoke collapses into disorderly clouds. But thermodynamics also has loopholes that promote the growth of information in pockets. Our cities are pockets where information grows, but they are not all the same. For every Silicon Valley, Tokyo, and Paris, there are dozens of places with economies that accomplish little more than pulling rocks off the ground. So, why does the US economy outstrip Brazil’s, and Brazil’s that of Chad? Why did the technology corridor along Boston’s Route 128 languish while Silicon Valley blossomed? In each case, the key is how people, firms, and the networks they form make use of information.

Seen from Hidalgo’s vantage, economies become distributed computers, made of networks of people, and the problem of economic development becomes the problem of making these computers more powerful. By uncovering the mechanisms that enable the growth of information in nature and society, Why Information Grows lays bare the origins of physical order and economic growth. Situated at the nexus of information theory, physics, sociology, and economics, this book propounds a new theory of how economies can do, not just more, but more interesting things.”

8th Annual North American School of Information Theory (NASIT)

Bookmarked 8th Annual North American School of Information Theory (NASIT) (nasit15.ucsd.edu)

August 10-13, 2015 – UC San Diego, La Jolla, California
Application deadline: June 7, 2015

The School of Information Theory will bring together over 100 graduate students, postdoctoral scholars, and leading researchers for four action-packed days of learning, stimulating discussions, professional networking and fun activities, all on the beautiful campus of the University of California, San Diego (UCSD) and in the nearby beach town of La Jolla.

  • Tutorials by some of the best known researchers in information theory and related fields
  • Poster presentations by student participants with feedback and discussion
  • Panel discussion on “IT: Academia vs. Industry Perspectives”
  • Social events and fun activities

The Information Universe Conference

"The Information Universe" Conference in The Netherlands in October hits several of the sweet spots for areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology.

Yesterday, via a notification from Lanyrd, I came across a notice for the upcoming conference “The Information Universe” which hits several of the sweet spots for areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology. It is scheduled for October 7-9, 2015 at the Infoversum Theater in Groningen, The Netherlands.

I’ll let their site speak for itself below, but they already have an interesting lineup of speakers including:

Keynote speakers

  • Erik Verlinde, Professor Theoretical Physics, University of Amsterdam, Netherlands
  • Alex Szalay, Alumni Centennial Professor of Astronomy, The Johns Hopkins University, USA
  • Gerard ‘t Hooft, Professor Theoretical Physics, University of Utrecht, Netherlands
  • Gregory Chaitin, Professor Mathematics and Computer Science, Federal University of Rio de Janeiro, Brazil
  • Charley Lineweaver, Professor Astronomy and Astrophysics, Australian National University, Australia
  • Lude Franke, Professor System Genetics, University Medical Center Groningen, Netherlands
Infoversum Theater, The Netherlands

Conference synopsis from their homepage:

The main ambition of this conference is to explore the question “What is the role of information in the physics of our Universe?”. This intellectual pursuit may have a key role in improving our understanding of the Universe at a time when we “build technology to acquire and manage Big Data”, “discover highly organized information systems in nature” and “attempt to solve outstanding issues on the role of information in physics”. The conference intends to address the “in vivo” (role of information in nature) and “in vitro” (theory and models) aspects of the Information Universe.

The discussions about the role of information will include the views and thoughts of several disciplines: astronomy, physics, computer science, mathematics, life sciences, quantum computing, and neuroscience. Different scientific communities hold various and sometimes distinct formulations of the role of information in the Universe indicating we still lack understanding of its intrinsic nature. During this conference we will try to identify the right questions, which may lead us towards an answer.

  • Is the universe one big information processing machine?
  • Is there a deeper layer in quantum mechanics?
  • Is the universe a hologram?
  • Is there a deeper physical description of the world based on information?
  • How close/far are we from solving the black hole information paradox?
  • What is the role of information in highly organized complex life systems?
  • The Big Data Universe and the Universe: are our numerical simulations and Big Data repositories (in vitro) different from real natural systems (in vivo)?
  • Is this the road to understanding dark matter, dark energy?

The conference will be held in the new 260-seat planetarium theatre in Groningen, which provides an inspiring, immersive 3D full-dome display for numerical simulations of the formation of our Universe, for example, and anything else our presenters wish to bring in. The digital planetarium setting will be used to visualize the theme with modern media.

The Information Universe Website

Additional details about the conference including the participants, program, venue, and registration can also be found at their website.

Videos from the NIMBioS Workshop on Information and Entropy in Biological Systems

Videos from the NIMBioS workshop on Information and Entropy in Biological Systems from April 8-10, 2015 are slowly starting to appear on YouTube.

Videos from the April 8-10, 2015, NIMBioS workshop on Information and Entropy in Biological Systems are slowly starting to appear on YouTube.

John Baez, one of the organizers of the workshop, is also going through them and adding some interesting background and links on his Azimuth blog for those who are looking for additional details and depth.

Additional resources from the Workshop:

 

Popular Science Books on Information Theory, Biology, and Complexity

The beginning of a four-part series in which I provide a gradation of books and texts that lie at the intersection of the application of information theory, physics, and engineering practice to the area of biology.

Previously, I had made a large and somewhat random list of books which lie at the intersection of the application of information theory, physics, and engineering practice to the area of biology. Below I’ll begin to do a somewhat better job of providing a finer gradation of technical level for both the hobbyist and the aspiring student who wish to bring themselves to a higher level of understanding of these areas. In future posts, I’ll try to begin classifying other texts into graduated strata as well. The final list will be maintained here: Books at the Intersection of Information Theory and Biology.

Introductory / General Readership / Popular Science Books

These books are written at a generally non-technical level and give a broad overview of their topics with occasional forays into interesting or intriguing subtopics. They include few, if any, mathematical equations. Typically, any high school student should be able to read, follow, and understand the broad concepts behind these books. Though often non-technical, these texts can give some useful insight into the topics at hand, even for the most advanced researchers.

Complexity: A Guided Tour by Melanie Mitchell (review)

Possibly one of the best places to start, this text gives a great overview of most of the major areas of study related to these fields.

Entropy Demystified: The Second Law Reduced to Plain Common Sense by Arieh Ben-Naim

One of the best books on the concept of entropy out there.  It can be read even by middle school students with no exposure to algebra and does a fantastic job of laying out the conceptualization of how entropy underlies large areas of the broader subject. Even those with Ph.D.’s in statistical thermodynamics can gain something useful from this lovely volume.

The Information: A History, a Theory, a Flood by James Gleick (review)

A relatively recent popular science volume covering various conceptualizations of what information is and how it’s been dealt with in science and engineering. Though it has its flaws, it’s certainly a good introduction for the beginner, particularly with regard to history.

The Origin of Species by Charles Darwin

One of the most influential pieces of writing known to man, this classic text is the basis from which major strides in biology have been made. A must-read for everyone on the planet.

Information, Entropy, Life and the Universe: What We Know and What We Do Not Know by Arieh Ben-Naim

Information Theory and Evolution by John Avery

The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life by Werner R. Loewenstein (review)

Information Theory, Evolution, and the Origin of Life by Hubert P. Yockey

The four books above have a significant amount of overlap. Though one could read all of them, I recommend that those pressed for time choose Ben-Naim first. As I write this I’ll note that Ben-Naim’s book is scheduled for release on May 30, 2015, but he’s been kind enough to allow me to read an advance copy while it was in process; it gets my highest recommendation in its class. Loewenstein covers a bit more than Avery, who also has a more basic presentation. Most who continue with the subject will later come across Yockey’s Information Theory and Molecular Biology, which is similar to his text here but written at a slightly higher level of sophistication; those planning to continue that far might want to read Yockey third instead.

The Red Queen: Sex and the Evolution of Human Nature by Matt Ridley

Grammatical Man: Information, Entropy, Language, and Life by Jeremy Campbell

Life’s Ratchet: How Molecular Machines Extract Order from Chaos by Peter M. Hoffmann

Complexity: The Emerging Science at the Edge of Order and Chaos by M. Mitchell Waldrop

The Big Picture: On the Origins of Life, Meaning, and the Universe Itself by Sean Carroll (Dutton, May 10, 2016)

In the coming weeks/months, I’ll try to continue putting recommended books on the rest of the spectrum, the balance of which follows in outline form below. As always, I welcome suggestions and recommendations based on others’ experiences as well. If you’d like to suggest additional resources in any of the sections below, please do so via our suggestion box. For those interested in additional resources, please take a look at the ITBio Resources page which includes information about related research groups; references and journal articles; academic, research institutes, societies, groups, and organizations; and conferences, workshops, and symposia.

Lower Level Undergraduate

These books are written at a level that can be grasped and understood by most with a freshman or sophomore university background. Coursework in math, science, and engineering will usually presume knowledge of calculus, basic probability theory, introductory physics, chemistry, and basic biology.

Upper Level Undergraduate

These books are written at a level that can be grasped and understood by those at a junior or senior university level. Coursework in math, science, and engineering may presume knowledge of probability theory, differential equations, linear algebra, complex analysis, abstract algebra, signal processing, organic chemistry, molecular biology, evolutionary theory, thermodynamics, advanced physics, and basic information theory.

Graduate Level

These books are written at a level that can be grasped and understood by most working at the master’s level at most universities. Coursework presumes all the previously mentioned classes, though it may require a higher level of sub-specialization in one or more areas of mathematics, physics, biology, or engineering practice. Because of the depth and breadth of disciplines covered here, many may feel the need to delve into areas outside of their particular specialization.

String Theory, Black Holes, and Information

Amanda Peet presented a lecture entitled "String Theory Legos for Black Holes" at the Perimeter Institute for Theoretical Physics.

Four decades ago, Stephen Hawking posed the black hole information paradox, a puzzle at the intersection of black holes and quantum theory. It still challenges the imaginations of theoretical physicists today. Yesterday, Amanda Peet (University of Toronto) presented a lecture entitled “String Theory Legos for Black Holes” at the Perimeter Institute for Theoretical Physics. A quick overview/teaser trailer for the lecture follows along with some additional information and the video of the lecture itself.

The “Information Paradox” with Amanda Peet (teaser trailer)

“Black holes are the ‘thought experiment’ par excellence, where the big three of physics – quantum mechanics, general relativity and thermodynamics – meet and fight it out, dragging in brash newcomers such as information theory and strings for support. Though a unification of gravity and quantum field theory still evades string theorists, many of the mathematical tools and ideas they have developed find applications elsewhere.

One of the most promising approaches to resolving the “information paradox” (the notion that nothing, not even information itself, survives beyond a black hole’s point-of-no-return event horizon) is string theory, a part of modern physics that has wiggled its way into the popular consciousness.

On May 6, 2015, Dr. Amanda Peet, a physicist at the University of Toronto, will describe how the string toolbox allows study of the extreme physics of black holes in new and fruitful ways. Dr. Peet will unpack that toolbox to reveal the versatility of strings and (mem)branes, and will explore the intriguing notion that the world may be a hologram.

Amanda Peet is an Associate Professor of Physics at the University of Toronto. She grew up in the South Pacific island nation of Aotearoa/New Zealand, and earned a B.Sc.(Hons) from the University of Canterbury in NZ and a Ph.D. from Stanford University in the USA. Her awards include a Radcliffe Fellowship from Harvard and an Alfred P. Sloan Foundation Research Fellowship. She was one of the string theorists interviewed in the three-part NOVA PBS TV documentary “The Elegant Universe”.

Web site: http://ap.io/home/.

Dr. Amanda Peet’s Lecture “String Theory Legos for Black Holes”

NIMBioS Workshop: Information Theory and Entropy in Biological Systems

Web resources for participants in the NIMBioS Workshop on Information Theory and Entropy in Biological Systems.

Over the next few days, I’ll be maintaining a Storify story covering information related to and coming out of the Information Theory and Entropy Workshop being sponsored by NIMBioS at the University of Tennessee, Knoxville.

For those in attendance or participating by watching the live streaming video (or even watching the video after-the-fact), please feel free to use the official hashtag, and I’ll do my best to include your tweets, posts, and material in the story stream for future reference.

For journal articles and papers mentioned in/at the workshop, I encourage everyone to join the Mendeley.com group ITBio: Information Theory, Microbiology, Evolution, and Complexity and add them to the group’s list of papers. Think of it as a collaborative online journal club of sorts.

Those participating in the workshop are also encouraged to take a look at a growing collection of researchers and materials I maintain here. If you have materials or resources you’d like to contribute to the list, please send me an email, submit them via the suggestions/submission form, or include them in the comments section below.

Resources for Information Theory and Biology


 

BIRS Workshop on Biological and Bio-Inspired Information Theory | Storify Stream

Over the span of the coming week, I'll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.

Over the span of the coming week, I’ll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.

Editor’s note: On 12/12/17 Storify announced they would be shutting down. As a result, I’m replacing the embedded version of the original data served by Storify with an HTML copy, which can be found below:

BIRS: Biological and Bio-Inspired Information Theory

A 5 Day workshop on Biology and Information Theory hosted by the Banff International Research Station

  1. Wishing I was at the Gene Regulation and Information Theory meeting starting tomorrow  http://bit.ly/XnHRZs  #ITBio
  2. Mathematical and Statistical Models for Genetic Coding starts today.  http://www.am.hs-mannheim.de/genetic_code_2013.php?id=1  @andreweckford might borrow attendees for BIRS
  3. Mathematical Foundations for Information Theory in Diffusion-Based Molecular Communications  http://bit.ly/1aTVR2c  #ITBio
  4. Bill Bialek giving plenary talk “Information flow & order in real biological networks” at Feb 2014 workshop  http://mnd.ly/19LQH8f  #ITBio
  5. CECAM Workshop: “Entropy in Biomolecular Systems” starts May 14 in Vienna.  http://jhu.md/1faLR8t  #ITBio pic.twitter.com/Ty8dEIXQUT
  6. Last RT: wonder what the weather is going to be like at the end of October for my @BIRS_Math workshop
  7. @JoVanEvery I’m organizing a workshop in Banff in October … hopefully this isn’t a sign of weather to come!
  8. Banff takes its name from the town of Banff, Scotland, not to be confused with Bamff, also Scotland.
  9. Good morning from beautiful Banff. How can you not love the mountains? pic.twitter.com/mxYBNz7yzl
  10. “Not an obvious connection between utility and information, just as there is no obvious connection between energy and entropy” @BIRS_Math
  11. Last RT: a lot of discussion of my signal transduction work with Peter Thomas.
  12. Live now: Nicolo Michelusi of @USCViterbi on Stochastic Model for Electron Transfer in Bacterial Cables  http://www.birs.ca/live  #ITBio
  13. Nicolo Michelusi (University of Southern California), A Stochastic Model for Electron Transfer in Bacterial Cables  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271450-Michelusi.mp4 
  14. Listening to the always awesome @cnmirose talk about the ultimate limits of molecular communication.
  15. “Timing is fundamental … subsumes time-varying concentration channel” @cnmirose @BIRS_Math
  16. Standard opening quote of these talks: “I’m not a biologist, but …” @BIRS_Math
  17. Stefan Moser (ETH Zurich), Capacity Bounds of the Memoryless AIGN Channel – a Toy-Model for Molecular Communicat…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271610-Moser.mp4 
  18. Weisi Guo (University of Warwick), Communication Envelopes for Molecular Diffusion and Electromagnetic Wave Propag…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271643-Guo.mp4 
  19. .@ChrisAldrich @andreweckford @Storify @BIRS_Math Sounds like a fascinating workshop on bioinformation theory in Banff.
  20. Toby Berger, winner of the 2002 Shannon award, speaking right now. @BIRS_Math
  21. Naftali Tishby (Hebrew University of Jerusalem), Sensing and acting under information constraints – a principled a…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281032-Tishby.mp4 
  22. “…places such as BIRS and the Banff Centre exist to facilitate the exchange and pursuit of knowledge.” S. Sundaram  http://www.birs.ca/testimonials/#testimonial-1454 
  23. We’re going for a hike tomorrow. Many thanks to Lukas at the @ParksCanada info centre in Banff for helpful advice! @BIRS_Math
  24. Alexander Dimitrov (Washington State University), Invariant signal processing in auditory biological systems  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281416-Dimitrov.mp4 
  25. Joel Zylberberg (University of Washington), Communicating with noisy signals: lessons learned from the mammalian v…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281450-Zylberberg.mp4 
  26. Robert Schober (Universitat Erlangen-Nurnberg), Intersymbol interference mitigation in diffusive molecular communi…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281549-Schober.mp4 
  27. Rudolf Rabenstein (Friedrich-Alexander-Universitat Erlangen-Nurnberg (FAU)), Modelling Molecular Communication Cha…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281627-Rabenstein.mp4 
  28. This week @BIRS_Math “Biological and Bio-Inspired Information Theory” @thebanffcentre #biology #math @NSF
  29. “Your theory might match the data, but the data might be wrong” – Crick @BIRS_Math
  30. So information theory seems to be a big deal in ecology. @BIRS_Math
  31. Tom Schneider (National Institutes of Health), Three Principles of Biological States: Ecology and Cancer  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410290904-Schneider.mp4 
  32. “In biodiversity, the entropy of an ecosystem is the expected … information we gain about an organism by learning its species” @BIRS_Math
  33. Seriously, I’m blown away by this work in information theory in ecology. Huge body of work; I had no idea. @BIRS_Math
  34. I encourage @BIRS_Math attendees at Biological & Bio-Inspired Information Theory to contribute references here:  http://bit.ly/1jQwObk 
  35. Christoph Adami (Michigan State University), Some Information-Theoretic Musings Concerning the Origin and Evolutio…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410291114-Adami.mp4 
  36. .@ChristophAdami talk Some Information-Theoretic Musings Concerning the Origin of Life @BIRS_Math this morning #ITBio pic.twitter.com/VA8komuuSW
  37. ICYMI @ChristophAdami had great paper: Information-theoretic Considerations on Origin of Life on arXiv  http://bit.ly/1yIhK2Q  @BIRS_Math
  38. Baez has a post on Tishby’s talk “Sensing & Acting Under Information Constraints”  http://bit.ly/1yIDonR  @BIRS_Math pic.twitter.com/dFuiVLFSGC
  39. INFORMATION THEORY is the new central …
  40. I’m listening to a talk on the origin of life at a workshop on Biological and Bio-Inspired Information Theory. …  https://plus.google.com/117562920675666983007/posts/gqFL7XY3quF 
  41. Now accepting applications for the #Research Collaboration Workshop for Women in #MathBio at NIMBioS  http://ow.ly/DzeZ7 
  42. We removed a faulty microphone from our lecture room this morning. We’re now fixing the audio buzz in this week’s videos, and reposting.
  43. Didn’t get enough information theory & biology this week @BIRS_Math? Apply for NIMBioS workshop in April 2015  http://bit.ly/1yIeiWe  #ITBio
  44. Amin Emad (University of Illinois at Urbana-Champaign), Applications of Discrete Mathematics in Bioinformatics  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301329-Emad.mp4 
  45. Paul Bogdan (University of Southern California), Multiscale Analysis Reveals Complex Behavior in Bacteria Populati…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301401-Bogdan.mp4 
  46. Lubomir Kostal (Institute of Physiology, Academy of Sciences of the Czech Republic), Efficient information transmi…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301534-Kostal.mp4 
  47. Banff ☀️❄️🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲❤️
  48. @conservativelez I’m a big fan of your dad’s research & was reminded of much of it via a workshop on Biological Information Theory
  49. @conservativelez Though he may not have been able to attend, he can catch most of the talks online if he’d like  https://www.birs.ca/events/2014/5-day-workshops/14w5170 
  50. Depressed that @BIRS_Math Workshop on Biological & Bio-Inspired Information Theory is over? Relive it here:  http://bit.ly/1rF3G4B  #ITBio
  51. A few thoughts about that workshop while I wait for my flight back to Toronto.
  52. 1/ Everyone I talked to said it was the best workshop they’d ever been to, and they’d like to do a follow-up workshop @BIRS_Math
  53. 2/ There is an amazing diversity of work under the umbrella of “information theory”. @BIRS_Math
  54. 3/ Much of this work is outside the IT mainstream, and an issue is that people use different terms for related concepts. @BIRS_Math
  55. 4/ Some community building is in order. I think this workshop was a good first step. @BIRS_Math
  56. 5/ Many many thanks to @BIRS_Math and huge kudos to @NGhoussoub for excellent service to the Canadian scientific community. BIRS is a gem.
  57. 6/ Also many thanks to the participants for their excellent talks, and to @ChrisAldrich for maintaining a Storify.


Information Theory is the New Central Discipline

Replied to Information Theory is the new central discipline. by Nassim Nicholas Taleb (facebook.com)

INFORMATION THEORY is the new central discipline. This graph was from 20y ago in the seminal book Cover and Thomas, as the field was starting to be defined. Now Information Theory has been expanded to swallow even more fields.

Born in, of all disciplines, Electrical Engineering, the field has been progressively infiltrating probability theory, computer science, statistical physics, data science, gambling theory, ruin problems, complexity, even how one deals with knowledge, epistemology. It defines noise/signal, order/disorder, etc. It studies cellular automata. You can use it in theology (FREE WILL & algorithmic complexity). As I said, it is the MOTHER discipline.

I am certain much of Medicine will naturally grow to be a subset of it, both operationally, and in studying how the human body works: the latter is an information machine. Same with linguistics. Same with political “science”, same with… everything.

I am saying this because I figured out what the long 5th volume of the INCERTO will be. Cannot say now with any precision but it has to do with a variant of entropy as the core natural generator of Antifragility.

[Revised to explain that it is not *replacing* other disciplines, just infiltrating them as the point was initially misunderstood…]

Nassim Nicholas Taleb via Facebook

[My comments posted to the original Facebook post follow below.]

I’m coming to this post a bit late as I’m playing catch up, but I agree with it wholeheartedly.

In particular, applications to molecular biology and medicine have really begun to come to a heavy boil in just the past five years. This particular year seems to mark the biggest renaissance in the application of information theory to biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s “Symposium on Information Theory in Biology” at Gatlinburg, Tennessee in 1956.

Upcoming/recent conferences/workshops on information theory in biology include:

At the beginning of September, Christoph Adami posted an awesome and very sound paper on arXiv entitled “Information-theoretic considerations concerning the origin of life” which promises to turn the science of the origin of life on its head.

I’ll note in passing, for those interested, that Claude Shannon’s famous master’s thesis at MIT (in which he applied Boolean algebra to electric circuits, allowing the digital revolution to occur) and his subsequent “A Mathematical Theory of Communication” were so revolutionary that nearly everyone forgets his MIT Ph.D. thesis “An Algebra for Theoretical Genetics,” which presaged the areas of cybernetics and the current applications of information theory to microbiology, and is probably as seminal as Sir R.A. Fisher’s applications of statistics to science in general and biology in particular.

For those commenting on the post who were interested in a layman’s introduction to information theory, I recommend John Robinson Pierce’s An Introduction to Information Theory: Symbols, Signals and Noise (Dover has a very inexpensive edition). After this, one should take a look at Claude Shannon’s original paper. (The MIT Press printing includes an excellent overview by Warren Weaver along with the paper itself.) The mathematics in the paper really isn’t too technical, and most of it should be comprehensible by most advanced high school students.

For those who don’t understand the concept of entropy, I HIGHLY recommend Arieh Ben-Naim’s book Entropy Demystified: The Second Law Reduced to Plain Common Sense with Seven Simulated Games. He really does tear the concept down into its most basic form in a way I haven’t seen others come remotely close to, and which even my mother can comprehend (with no mathematics at all). (I recommend this presentation even to those with Ph.D.’s in physics because it is so truly fundamental.)

For the more advanced mathematicians, physicists, and engineers, Arieh Ben-Naim does a truly spectacular job of extending E.T. Jaynes’ work on information theory and statistical mechanics, and comes up with a more coherent mathematical theory to conjoin the entropy of physics/statistical mechanics with that of Shannon’s information theory in A Farewell to Entropy: Statistical Thermodynamics Based on Information.

For the advanced readers/researchers interested in more at the intersection of information theory and biology, I’ll also mention that I maintain a list of references, books, and journal articles in a Mendeley group entitled “ITBio: Information Theory, Microbiology, Evolution, and Complexity.”

Venn Diagram of how information theory relates to other fields.
Figure 1.1 [page 2] from
Thomas M. Cover and Joy Thomas’s textbook Elements of Information Theory, Second Edition
(John Wiley & Sons, Inc., 2006) [First Edition, 1991]

Announcement: Special issue of Entropy: “Information Theoretic Incentives for Cognitive Systems”

Dr. Christoph Salge asked me to cross-post this notice from the Entropy site here.

Editor’s Note: Dr. Christoph Salge asked me to cross-post the following notice from the Entropy site here. I’m sure many of the readers of the blog will find the topic to be of interest.


 

Logo for the journal Entropy

 

Dear Colleagues,

In recent years, ideas such as “life is information processing” or “information holds the key to understanding life” have become more common. However, how can information, or more formally Information Theory, increase our understanding of life, or life-like systems?

Information Theory not only has a profound mathematical basis, but also typically provides an intuitive understanding of processes such as learning, behavior, and evolution in terms of information processing.

In this special issue, we are interested in both:

  1. the information-theoretic formalization and quantification of different aspects of life, such as driving forces of learning and behavior generation, information flows between neurons, swarm members and social agents, and information theoretic aspects of evolution and adaptation, and
  2. the simulation and creation of life-like systems with previously identified principles and incentives.

Topics relating to artificial and natural systems:

  • information theoretic intrinsic motivations
  • information theoretic quantification of behavior
  • information theoretic guidance of artificial evolution
  • information theoretic guidance of self-organization
  • information theoretic driving forces behind learning
  • information theoretic driving forces behind behavior
  • information theory in swarms
  • information theory in social behavior
  • information theory in evolution
  • information theory in the brain
  • information theory in system-environment distinction
  • information theory in the perception action loop
  • information theoretic definitions of life

Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1200 CHF (Swiss Francs).

Deadline for manuscript submissions: 28 February 2015

Special Issue Editors

Guest Editor
Dr. Christoph Salge
Adaptive Systems Research Group, University of Hertfordshire, College Lane, AL10 9AB Hatfield, UK
Website: http://homepages.stca.herts.ac.uk/~cs08abi
E-Mail: c.salge@herts.ac.uk
Phone: +44 1707 28 4490
Interests: Intrinsic Motivation (Empowerment); Self-Organization; Guided Self-Organization; Information-Theoretic Incentives for Social Interaction; Information-Theoretic Incentives for Swarms; Information Theory and Computer Game AI

Guest Editor
Dr. Georg Martius
Cognition and Neurosciences, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
Website: http://www.mis.mpg.de/jjost/members/georg-martius.html
E-Mail: martius@mis.mpg.de
Phone: +49 341 9959 545
Interests: Autonomous Robots; Self-Organization; Guided Self-Organization; Information Theory; Dynamical Systems; Machine Learning; Neuroscience of Learning; Optimal Control

Guest Editor
Dr. Keyan Ghazi-Zahedi
Information Theory of Cognitive Systems, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
Website: http://personal-homepages.mis.mpg.de/zahedi
E-Mail: zahedi@mis.mpg.de
Phone: +49 341 9959 535
Interests: Embodied Artificial Intelligence; Information Theory of the Sensorimotor Loop; Dynamical Systems; Cybernetics; Self-organisation; Synaptic plasticity; Evolutionary Robotics

Guest Editor
Dr. Daniel Polani
Department of Computer Science, University of Hertfordshire, Hatfield AL10 9AB, UK
Website: http://homepages.feis.herts.ac.uk/~comqdp1/
E-Mail: d.polani@herts.ac.uk
Interests: artificial intelligence; artificial life; information theory for intelligent information processing; sensor evolution; collective and multiagent systems

Workshop Announcement: Entropy and Information in Biological Systems

I rarely, if ever, reblog anything here, but this particular post from John Baez’s blog Azimuth is so on-topic that attempting to embellish it seems silly.

Entropy and Information in Biological Systems (Part 2)

John Harte, Marc Harper and I are running a workshop! Now you can apply here to attend:

Information and entropy in biological systems, National Institute for Mathematical and Biological Synthesis, Knoxville, Tennessee, Wednesday-Friday, 8-10 April 2015.

Click the link, read the stuff and scroll down to “CLICK HERE” to apply. The deadline is 12 November 2014.

Financial support for travel, meals, and lodging is available for workshop attendees who need it. We will choose among the applicants and invite 10-15 of them.

The idea
Information theory and entropy methods are becoming powerful tools in biology, from the level of individual cells, to whole ecosystems, to experimental design, model-building, and the measurement of biodiversity. The aim of this investigative workshop is to synthesize different ways of applying these concepts to help systematize and unify work in biological systems. Early attempts at “grand syntheses” often misfired, but applications of information theory and entropy to specific highly focused topics in biology have been increasingly successful. In ecology, entropy maximization methods have proven successful in predicting the distribution and abundance of species. Entropy is also widely used as a measure of biodiversity. Work on the role of information in game theory has shed new light on evolution. As a population evolves, it can be seen as gaining information about its environment. The principle of maximum entropy production has emerged as a fascinating yet controversial approach to predicting the behavior of biological systems, from individual organisms to whole ecosystems. This investigative workshop will bring together top researchers from these diverse fields to share insights and methods and address some long-standing conceptual problems.

So, here are the goals of our workshop:

  • To study the validity of the principle of Maximum Entropy Production (MEP), which states that biological systems – and indeed all open, non-equilibrium systems – act to produce entropy at the maximum rate.
  • To familiarize all the participants with applications to ecology of the MaxEnt method: choosing the probabilistic hypothesis with the highest entropy subject to the constraints of our data. We will compare MaxEnt with competing approaches and examine whether MaxEnt provides a sufficient justification for the principle of MEP.
  • To clarify relations between known characterizations of entropy, the use of entropy as a measure of biodiversity, and the use of MaxEnt methods in ecology.
  • To develop the concept of evolutionary games as “learning” processes in which information is gained over time.
  • To study the interplay between information theory and the thermodynamics of individual cells and organelles.

For more details, go here.

If you’ve got colleagues who might be interested in this, please let them know. You can download a PDF suitable for printing and putting on a bulletin board by clicking on this:
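One of the workshop’s themes, entropy as a measure of biodiversity, is easy to make concrete. Here’s a toy sketch of my own (not from the workshop materials) computing the Shannon entropy of a species-abundance distribution, i.e., the expected information gained about a randomly sampled organism upon learning its species:

```python
# A toy illustration (mine, not from the workshop) of entropy as a
# biodiversity measure: the expected information gained about a random
# organism upon learning its species.
import math

def shannon_entropy(abundances: list[int]) -> float:
    """H = -sum(p_i * log2 p_i) over the species' relative abundances."""
    total = sum(abundances)
    return -sum((n / total) * math.log2(n / total) for n in abundances if n > 0)

even = [25, 25, 25, 25]        # four equally common species
skewed = [97, 1, 1, 1]         # one species dominates
print(shannon_entropy(even))   # 2.0 bits, the maximum for four species
print(shannon_entropy(skewed)) # ~0.24 bits, a much less diverse community
```

A perfectly even community maximizes the entropy, while a community dominated by a single species carries almost no surprise, which is exactly why ecologists have adopted entropy as a diversity index.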

The Mnemonic Major System and Gregg Shorthand Have the Same Underlying Structure!

I’ve been a proponent and user of a variety of mnemonic systems since I was about eleven years old. The two biggest and most useful in my mind are commonly known as the “method of loci” and the “major system.” The major system is also variously known as the phonetic number system, the phonetic mnemonic system, or Hérigone’s mnemonic system, after the French mathematician and astronomer Pierre Hérigone (1580-1643) who is thought to have originated its use.

The major system generally works by converting numbers into consonant sounds, and from there into words by adding vowels, under the overarching principle that images (of the words) can be remembered more easily than the numbers themselves. For instance, one could memorize a grocery list of a hundred items by associating each item with the word for its position on the list. As an example, if item 22 on the list is lemons, one could translate the number 22 as “nun” within the major system and then picture a nun with lemons – perhaps a nun in full habit taking a bath in lemons – to make the image stick in one’s memory. Then at the grocery store, upon arriving at number 22 on the list, one automatically translates the number to “nun,” which almost immediately conjures the image of a nun taking a bath in lemons, recalling the item that needed to be remembered. This comes in handy particularly when one needs to remember large lists of items in and out of order.

The following generalized chart, which can be found in a host of books and websites on the topic, is fairly canonical for the overall system:

| Numeral | IPA | Associated Consonants | Mnemonic for remembering the numeral and consonant relationship |
|---|---|---|---|
| 0 | /s/ /z/ | s, z, soft c | “z” is the first letter of zero; the other letters have a similar sound |
| 1 | /t/ /d/ | t, d | t & d have one downstroke and sound similar (some variant systems include “th”) |
| 2 | /n/ | n | n has two downstrokes |
| 3 | /m/ | m | m has three downstrokes; m looks like a “3” on its side |
| 4 | /r/ | r | last letter of four; 4 and R are almost mirror images of each other |
| 5 | /l/ | l | L is the Roman numeral for 50 |
| 6 | /ʃ/ /ʒ/ /tʃ/ /dʒ/ | j, sh, soft g, soft “ch” | a script j has a lower loop; g is almost a 6 rotated |
| 7 | /k/ /ɡ/ | k, hard c, hard g, hard “ch”, q, qu | capital K “contains” two sevens (some variant systems include “ng”) |
| 8 | /f/ /v/ | f, v | script f resembles a figure-8; v sounds similar (v is a voiced f) |
| 9 | /p/ /b/ | p, b | p is a mirror-image 9; b sounds similar and resembles a 9 rolled around |
| Unassigned | – | vowel sounds, w, h, y | w and h are considered half-vowels; these can be used anywhere without changing a word’s number value |
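Since the chart above is essentially a lookup table, it’s straightforward to play with in code. Below is a minimal sketch of my own (not part of the canonical system) that decodes a word back into the number it represents; it works on spelling rather than pronunciation, so ambiguous letters like c and g (soft vs. hard) are simply omitted:

```python
# A minimal sketch (mine, not from the canonical major system) of the
# digit-to-consonant chart above. Real implementations key off pronunciation,
# not orthography, so ambiguous letters like c and g are left out here.

DIGIT_TO_LETTERS = {
    "0": {"s", "z"},
    "1": {"t", "d"},
    "2": {"n"},
    "3": {"m"},
    "4": {"r"},
    "5": {"l"},
    "6": {"j"},          # also sh, soft g, soft ch in the phonetic system
    "7": {"k", "q"},     # also hard c, hard g, hard ch
    "8": {"f", "v"},
    "9": {"p", "b"},
}

# Invert the chart so each consonant looks up its digit.
LETTER_TO_DIGIT = {c: d for d, letters in DIGIT_TO_LETTERS.items() for c in letters}

def word_to_number(word: str) -> str:
    """Return the digit string a word encodes; vowels, w, h, y carry no value."""
    return "".join(LETTER_TO_DIGIT[c] for c in word.lower() if c in LETTER_TO_DIGIT)

print(word_to_number("nun"))    # -> "22"
print(word_to_number("lemon"))  # -> "532"
```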

There are a variety of ways to use the major system as a code in addition to its uses in mnemonic settings. When I was a youth, I used it to write coded messages and to encrypt a variety of things for personal use. After I had originally read Dr. Bruno Furst’s series of booklets entitled You Can Remember: A Home Study Course in Memory and Concentration 1, I had always wanted to spend some time creating an alternate method of writing based on the system. Sadly I never made the time to do the project, but yesterday I made a very interesting discovery that, to my knowledge, doesn’t seem to have been previously noticed!

My discovery began last week when I read an article in The Atlantic by journalist Dennis Hollier entitled How to Write 225 Words Per Minute with a Pen: A Lesson in the Lost Technology of Shorthand. 2 In the article, which starts off with a mention of the Livescribe pen – one of my favorite tools – Mr. Hollier outlines the use of the Gregg System of Shorthand, which was invented by John Robert Gregg in 1888. The description of the method was intriguing enough that I read a dozen additional general articles on shorthand on the internet and purchased a copy of Louis A. Leslie’s two-volume text Gregg Shorthand: Functional Method. 3

I was shocked, on page x of the front matter, just before the first page of the text, to find the following “Alphabet of Gregg Shorthand”:

Alphabet of Gregg Shorthand
Gregg Shorthand uses EXACTLY the same consonant-type breakdown of the alphabet as the major system!

Apparently I wasn’t the first to have the idea to turn the major system into a system of writing. The fact that the consonant breakdowns for the major system coincide almost directly with those of the shorthand method used by Gregg cannot be a coincidence!

The Gregg system works incredibly well precisely because the major system works so well. The biggest difference between the two systems is that Gregg utilizes a series of strokes (circles and semicircles) to indicate particular vowel sounds, which allows for better differentiation of words – something the major system doesn’t generally take into consideration. From an information-theoretic standpoint, this is almost required to make the coding from one alphabet to the other possible, but much like ancient Hebrew, leaving out the vowels doesn’t remove that much information. Gregg, like Hebrew, also uses dots and dashes above or below certain letters to indicate the precise sound of many of its vowels.
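That claim about vowels is easy to test for oneself. A few lines of Python (a toy demonstration of my own, not anything from Leslie’s text) show that disemvoweled English remains largely legible:

```python
# A quick demonstration (my own) that dropping vowels loses little
# information: disemvoweled English usually stays decipherable.
VOWELS = set("aeiouAEIOU")

def disemvowel(text: str) -> str:
    """Remove vowels, as Gregg-style abbreviation and ancient Hebrew both do."""
    return "".join(c for c in text if c not in VOWELS)

print(disemvowel("Gregg Shorthand is just a hop away from the major system"))
# -> "Grgg Shrthnd s jst  hp wy frm th mjr systm"
```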

The upside of all of this is that the major system is incredibly easy to learn and use, and from there, learning Gregg shorthand is just a hop, skip, and a jump – heck, it’s really only a hop, because the underlying structure is so similar. Naturally, as with the major system, one must commit some time to practicing to improve speed and accuracy, but the general learning of the system is incredibly straightforward.

Because the associations between the two systems are so similar, I wasn’t too surprised to find that some of the descriptions of why certain strokes were used for certain letters were very similar to the mnemonics for why certain letters were used for certain numbers in the major system.

From Dr. Bruno Furst’s “You Can Remember!”: the mnemonic for remembering 6, 7, 8, & 9 in the major system.
From Louis Leslie’s “Gregg Shorthand: Functional Method”: the mnemonic for remembering the strokes for k and g.

One thing I have noticed in my studies on these topics is the occasional reference to the letter combinations “NG” and “NK,” and I’m curious why these are singled out in some of these systems. I have a strong suspicion that their inclusion/exclusion in various incarnations of their respective systems may be helpful in dating the evolution of these systems over time.

I’m aware that various versions of shorthand have appeared over the centuries, the first recorded being the “Tironian Notes” of Marcus Tullius Tiro (103-4 BCE), who apparently used his system to write down the speeches of his master Cicero. I’m now much more curious at what point the concepts for shorthand and the major system crossed paths or converged. My assumption would be that it happened in the late Renaissance, but it would be nice to have the underlying references and support for such a timeline. Perhaps it was with Timothy Bright’s publication of Characterie; An Arte of Shorte, Swifte and Secrete Writing by Character (1588) 4, John Willis’s Art of Stenography (1602) 5, Edmond Willis’s An abbreviation of writing by character (1618) 6, or Thomas Shelton’s Short Writing (1626) 7? Shelton’s system was certainly very popular and well known because it was used by both Samuel Pepys and Sir Isaac Newton.

Certainly some in-depth research will tell, though if anyone has ideas, please don’t hesitate to share them in the comments.

UPDATE on 7/6/14:

I’m adding a new chart making the correspondence between the major system and Gregg Shorthand more explicit.

A chart indicating the correspondences between the major system and Gregg Shorthand.

References

1.
Furst B. You Can Remember: A Home Study Course in Memory and Concentration. Markus-Campbell Co.; 1965.
2.
Hollier D. How to Write 225 Words Per Minute With a Pen: A lesson in the lost technology of shorthand. The Atlantic. http://www.theatlantic.com/technology/archive/2014/06/yeah-i-still-use-shorthand-and-a-smartpen/373281/. Published 2014.
3.
Leslie LA. Gregg Shorthand: Functional Method. Gregg Publishing Company; 1947.
4.
Bright T. Characterie; An Arte of Shorte, Swifte and Secrete Writing by Character. 1st ed. I. Windet; 1588. Reprinted by W. Holmes, Ulverstone. https://archive.org/details/characteriearteo00brig.
5.
Willis J. Art of Stenography.; 1602.
6.
Willis E. An Abbreviation of Writing by Character.; 1618.
7.
Shelton T. Short Writing.; 1626.

Latin Pedagogy and the Digital Humanities

I’ve long been a student of the humanities (and particularly the classics) and have recently begun reviewing my very old and decrepit knowledge of Latin. It’s been two decades since I made a significant study of classical languages, and lately (as the result of conversations with friends like Dave Harris, Jim Houser, Larry Richardson, and John Kountouris) I’ve been drawn back to them in order to read a variety of classical texts in their original languages. Fortunately, in the intervening years, quite a lot has changed in the tools relating to pedagogy for language acquisition.

Jenny's Second Year Latin
A copy of Jenny’s Second Year Latin, the text I used 20 years ago; I recently acquired a new copy for the pittance of $3.25.

Internet

The biggest change in the intervening time is the spread of the internet, which supplies a broad variety of related websites offering not only interesting resources for things like basic reading and writing, but even audio sources, apparently including the nightly news in Latin. There are a variety of blogs on Latin as well as online courseware, podcasts, pronunciation recordings, and even free textbooks. I’ve written briefly about the RapGenius platform before, but I feel compelled to mention it as a potentially powerful resource as well. (Julius Caesar, Seneca, Ovid, Cicero, et al.) There is a paucity of these sources in a general sense in comparison with other modern languages, but given the size of the niche, there is quite a lot out there, and certainly a mountain in comparison to what existed only twenty years ago.

Software

There has also been a spread of pedagogic aids like flashcard software, including Anki and Mnemosyne, with desktop, web-based, and even mobile versions making learning available in almost any situation. The psychology and learning research behind these types of technologies has really come a long way toward helping students make the best use of their time in learning and in retaining what they’ve learned in long-term memory. Simple mobile applications like Duolingo exist for a variety of languages – though one doesn’t currently exist for classical Latin (yet).
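For the curious, the scheduling idea underneath these flashcard tools is surprisingly small. Below is a rough sketch of my own of SM-2-style spaced repetition, the family of algorithms from which Anki and Mnemosyne descend; the shipping implementations differ in their details:

```python
# A rough sketch (my own simplification) of SM-2-style spaced-repetition
# scheduling, the family of algorithms behind tools like Anki and Mnemosyne.

def review(repetition: int, interval: float, ease: float, quality: int):
    """Update (repetition, interval_days, ease) after grading a card 0-5."""
    if quality < 3:                 # failed: relearn from scratch
        return 0, 1.0, ease
    if repetition == 0:
        interval = 1.0
    elif repetition == 1:
        interval = 6.0
    else:
        interval *= ease            # each success spreads reviews further apart
    # Ease drifts with answer quality and is floored at 1.3 (the SM-2 rule).
    ease = max(1.3, ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    return repetition + 1, interval, ease

state = (0, 0.0, 2.5)               # a brand-new card
for q in [5, 4, 5]:
    state = review(*state, q)
    print(f"next review in {state[1]:.0f} days")   # 1, 6, then ~16 days
```

The exponentially lengthening intervals are the whole trick: each successful recall pushes the next review out to just before the card would otherwise be forgotten.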

Digital Humanities

The other great change is the advancement of the digital humanities, which allows for a lot of interesting applications of knowledge acquisition. One particular project I ran across this week was the Dickinson College Commentaries (DCC). Specifically, a handful of scholars have compiled and documented a list of the most common core vocabulary words in Latin (and in Greek) based on their frequency of appearance in extant works. This very specific data is of interest to me in relation to my work in information theory, but it also becomes a tremendously handy tool when attempting to learn and master a language. It is truly impressive that memorizing and mastering only about 250 Latin words allows one to read and understand 50% of most written Latin. Further, knowledge of 1,500 Latin words will put one at the 80% level of vocabulary mastery for most texts. Mastering even a very small list of vocabulary allows one to read a large variety of texts very comfortably. I can only think of the old concept of a concordance (which was generally limited to heavily studied texts like the Bible or possibly Shakespeare), which has now been put on some serious steroids for entire cultures. Another half step and one arrives at the Google Ngram Viewer.
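For those wondering how such coverage figures arise, here’s a hedged sketch of my own (the DCC’s actual methodology is more sophisticated and works over lemmatized forms): count word frequencies in a corpus and ask what fraction of all running words the N most frequent cover:

```python
# A hedged sketch (mine, not the DCC's methodology) of how core-vocabulary
# coverage figures like "250 words -> 50%" are computed from word frequencies.
from collections import Counter

def coverage(tokens: list[str], core_size: int) -> float:
    """Fraction of all running words covered by the core_size most frequent words."""
    counts = Counter(tokens)
    covered = sum(n for _, n in counts.most_common(core_size))
    return covered / len(tokens)

# Toy corpus; the real calculation runs over a large lemmatized Latin corpus.
corpus = "in vino veritas in aqua sanitas in medio stat virtus".split()
print(f"{coverage(corpus, 2):.0%} of tokens covered by the top 2 words")  # 40%
```

Because word frequencies are so heavily skewed (roughly Zipfian), a tiny core vocabulary covers a disproportionate share of any real text, which is exactly what makes the DCC list such an efficient study target.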

The best part is that one can, with very little technical knowledge, easily download the DCC Core Latin Vocabulary (itself a huge research undertaking) and upload and share it through the Anki platform, for example, to benefit a fairly large community of other scholars, learners, and teachers. With a variety of easy-to-use tools, it may soon be that much easier to learn a language like Latin – potentially to the point that it is no longer a dead language. For those interested, you can find my version of the shared DCC Core Latin Vocabulary for Anki online; the DCC’s Chris Francese has posted details and a version for Mnemosyne already.

[Editor’s note: Anki’s web service occasionally clears decks of cards from their servers, so if you find that the Anki link to the DCC Core Latin is not working, please leave a comment below, and we’ll re-upload the deck for shared use.]

What tools and tricks do you use for language study and pedagogy?

The Single Biggest Problem in Communication

“The single biggest problem in communication is the illusion that it has taken place.”

apocryphally attributed to George Bernard Shaw,
but more likely originating with William H. Whyte in “Is Anybody Listening?”, Fortune, September 1950 (start page 77, quote page 174; published by Time, Inc., New York)

 

George Bernard Shaw shading his eyes with his hands