CECAM Workshop: “Entropy in Biomolecular Systems”

On Friday, I had an excellent and stimulating conversation with Arieh Ben-Naim about his recent writing and work, and he mentioned in passing that he had been invited to a conference relating to entropy and biology in Vienna. A quick web search turned it up, and since I hadn’t heard about it myself yet, I thought I’d pass it along to regular readers interested in the area.

The workshop on “Entropy in Biomolecular Systems” is being hosted by the Centre Européen de Calcul Atomique et Moléculaire (CECAM).


Location: DACAM, Max F. Perutz Laboratories, University of Vienna, Dr. Bohr-Gasse 9, A-1030 Vienna, Austria
Dates: May 14–17, 2014

The workshop is being organized by:

  • Richard Henchman (University of Manchester, United Kingdom)
  • Bojan Zagrovic (University of Vienna, Austria)
  • Michel Cuendet (Swiss Institute of Bioinformatics, Lausanne, Switzerland and Weill Cornell Medical College, New York, USA)
  • Chris Oostenbrink (University of Natural Resources and Life Sciences, Austria)

It’s being supported by CECAM, the European Research Council, and the Royal Society of Chemistry’s Statistical Mechanics and Thermodynamics Group.

I’ll note that the registration deadline is April 21, with a payment deadline of April 30, so register soon if you haven’t already.

The summary from the workshop website states:

This workshop brings together the world’s experts to address the challenges of determining the entropy of biomolecular systems, either by experiment or computer simulation. Entropy is one of the main driving forces for any biological process such as binding, folding, partitioning and reacting. Our deficient understanding of entropy, however, means that such important processes remain controversial and only partially understood. Contributions of water, ions, cofactors, and biomolecular flexibility are actively examined but yet to be resolved. The state of the art of each entropy method will be presented and explained, highlighting its capabilities and deficiencies. This will be followed by intensive discussion on the main areas that need improving, leading to suitable actions and collaborations to address the main biological and industrial questions.

Further details on the workshop can be found on the CECAM website.

As always, details on other upcoming workshops and conferences relating to information theory and biology can be found on our ITBio Conferences/Workshops page.

2014 Andrew J. Viterbi Distinguished Lecture in Communication: Abbas El Gamal

The USC Viterbi School has recently announced that Professor Abbas El Gamal of Stanford University will present the 2014 Andrew J. Viterbi Distinguished Lecture in Communication. The 12th annual lecture, entitled “Common Information,” will be given on Thursday, April 17, 2014 at 4:00 PM at the University of Southern California in the Seeley Wintersmith Mudd Memorial Hall of Philosophy (MHP), room 101. A reception will precede the lecture at 3:00 PM.

USC’s Viterbi School of Engineering has provided the following abstract for the talk:

Entropy, introduced by Shannon in 1948, arises naturally as a universal measure of information in single-source compression, randomness extraction, and random number generation. In distributed systems, such as communication networks, multiprocessors, distributed storage, and sensor networks, there are multiple correlated sources to be processed jointly. The information that is common between these sources can be utilized, for example, to reduce the amount of communication needed for compression, computing, simulation, and secret key generation. My talk will focus on the question of how such common information should be measured. While our understanding of common information is far from complete, I will aim to demonstrate the richness of this question through the lens of network information theory. I will show that, depending on the distributed information processing task considered, there can be several well-motivated measures of common information. Along the way, I will present some of the key models, ideas, and tools of information theory, which invite further investigation into this intriguing subject. Some parts of this talk are based on recent joint work with Gowtham Kumar and Cheuk Ting Li and on discussions with Young-Han Kim.
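
Mutual information is the most elementary way to quantify the information shared between two correlated sources; part of the lecture’s point is that other, task-dependent measures (such as Wyner’s or Gács–Körner’s common information) have their own operational motivations. As a concrete anchor for the abstract’s terminology, here is a minimal illustrative sketch in Python (my own toy example, not material from the lecture):

```python
import numpy as np

# Toy joint distribution of two correlated binary sources X and Y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)  # marginal distribution of X
p_y = p_xy.sum(axis=0)  # marginal distribution of Y

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_x, H_y, H_xy = entropy(p_x), entropy(p_y), entropy(p_xy.ravel())
mutual_info = H_x + H_y - H_xy  # I(X;Y): bits of information shared by X and Y

print(f"H(X) = {H_x:.3f} bits, H(Y) = {H_y:.3f} bits, H(X,Y) = {H_xy:.3f} bits")
print(f"I(X;Y) = {mutual_info:.3f} bits")  # ~0.278 bits for this example
```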

Biography: Abbas El Gamal is the Hitachi America Professor in the School of Engineering and Chair of the Department of Electrical Engineering at Stanford University. He received his Ph.D. degree in electrical engineering from Stanford University in 1978. He was an Assistant Professor in the Department of Electrical Engineering at the University of Southern California (USC) from 1978 to 1980. His research interests and contributions have spanned the areas of information theory, wireless networks, CMOS image sensors and systems, and integrated circuit design and design automation. He has authored or coauthored over 200 papers and 30 patents in these areas. He is coauthor of the book Network Information Theory (Cambridge University Press, 2011). He has won several honors and awards, including the 2012 Claude E. Shannon Award and the 2004 INFOCOM Best Paper Award. He is a member of the National Academy of Engineering and a Fellow of the IEEE. He has been active in several IEEE societies, including serving on the Board of Governors of the Information Theory Society, of which he is currently President. He cofounded and/or served in various leadership roles at several semiconductor, EDA, and biotechnology companies.

Audiences: Everyone Is Invited

Workshop on Information Theoretic Incentives for Artificial Life

For those interested in the topics of information theory in biology and artificial life, Christoph Salge, Georg Martius, Keyan Ghazi-Zahedi, and Daniel Polani have announced a satellite Workshop on Information Theoretic Incentives for Artificial Life at the 14th International Conference on the Synthesis and Simulation of Living Systems (ALife 2014), to be held at the Javits Center, New York, on July 30 or 31.


Their synopsis states:

Artificial Life aims to understand the basic and generic principles of life, and demonstrate this understanding by producing life-like systems based on those principles. In recent years, with the advent of the information age, and the widespread acceptance of information technology, our view of life has changed. Ideas such as “life is information processing” or “information holds the key to understanding life” have become more common. But what can information, or more formally Information Theory, offer to Artificial Life?

One relevant area is the motivation of behaviour for artificial agents, both virtual and real. Instead of learning to perform a specific task, informational measures can be used to define concepts such as boredom, empowerment or the ability to predict one’s own future. Intrinsic motivations derived from these concepts allow us to generate behaviour, ideally from an embodied and enactive perspective, based on basic but generic principles. The key questions here are: “What are the important intrinsic motivations a living agent has, and what behaviour can be produced by them?”

Related to an agent’s behaviour is also the question of how and where the necessary computation to realise this behaviour is performed. Can information be used to quantify the morphological computation of an embodied agent, and to what degree do the computational limitations of an agent influence its behaviour?

Another area of interest is the guidance of artificial evolution or adaptation. Assuming it is true that an agent wants to optimise its information processing, possibly obtaining as much relevant information as possible at the cheapest computational cost, then what behaviour would naturally follow from that? Can the development of social interaction or collective phenomena be motivated by an informational gradient? Furthermore, evolution itself can be seen as a process in which an agent population obtains information from the environment, which raises the questions of how this can be quantified and how systems would adapt to maximise this information.

The common theme in those different scenarios is the identification and quantification of driving forces behind evolution, learning, behaviour and other crucial processes of life, in the hope that the implementation or optimisation of these measurements will allow us to construct life-like systems.
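
To make one of these measures concrete, the “ability to predict one’s own future” can be cast as the predictive information I(X_t; X_{t+1}) of an agent’s sensor or state stream. The sketch below (my own toy illustration, not any of the organizers’ actual methods) estimates it from empirical counts for a simulated binary Markov process:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(42)

def simulate_markov(n, p_stay=0.9):
    """Binary chain that repeats its previous symbol with probability p_stay."""
    x = [0]
    for _ in range(n - 1):
        x.append(x[-1] if rng.random() < p_stay else 1 - x[-1])
    return x

def predictive_information(x):
    """Estimate one-step predictive information I(X_t; X_{t+1}) in bits."""
    n = len(x) - 1
    p_pair = {k: v / n for k, v in Counter(zip(x[:-1], x[1:])).items()}
    p_now = {k: v / n for k, v in Counter(x[:-1]).items()}
    p_next = {k: v / n for k, v in Counter(x[1:]).items()}
    return sum(p * np.log2(p / (p_now[a] * p_next[b]))
               for (a, b), p in p_pair.items())

x = simulate_markov(100_000)
# For p_stay = 0.9 the true value is 1 - H(0.9), roughly 0.531 bits.
print(f"estimated predictive information: {predictive_information(x):.3f} bits")
```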

Details for submissions, acceptances, potential talks, and dates can be found via Nihat Ay’s Research Group on Information Theory of Cognitive Systems. For more information on how to register, please visit the ALife 2014 homepage. If there are any questions, or if you just want to indicate interest in submitting or attending, please feel free to email the organizers at itialife@gmail.com.

According to their release, the open access journal Entropy will sponsor the workshop with an open call for a special issue on the topic of the workshop. More details will be announced to those who email itialife@gmail.com and over the alife and connectionists mailing lists.

Information Theory is Something Like the Logarithm of Probability Theory

Dr. Daniel Polani, Reader in Artificial Life, University of Hertfordshire
in “Research Questions”

Not only a great quote, but an interesting way to view the subjects.
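
One way to unpack the aphorism (my gloss, not Polani’s): Shannon entropy is an expected log-probability, so the multiplicative structure of probability becomes the additive structure of information:

```latex
H(X) = -\,\mathbb{E}\!\left[\log p(X)\right], \qquad
p(x,y) = p(x)\,p(y) \;\Longrightarrow\; H(X,Y) = H(X) + H(Y)
```

Independent probabilities multiply while their informations add, just as a logarithm turns products into sums.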

Information Theory and Paleoanthropology

A few weeks ago I communicated a bit with paleoanthropologist John Hawks. I want to take a moment to highlight the fact that he maintains an excellent blog primarily concerning his areas of research, which include anthropology, genetics, and evolution. Even more specifically, he is one of the few people in these areas with at least a passing interest in information theory as it relates to his work. I recommend everyone take a look at his posts specific to information theory.


I’ve previously written a brief review of “Major Transitions in Evolution”, the course John Hawks taught (in collaboration with Anthony Martin) for The Teaching Company as part of their Great Courses series of lectures. Given my interest in the MOOC revolution in higher education, I’ll also mention that Dr. Hawks has recently begun a free Coursera class entitled “Human Evolution: Past and Future”. I’m sure his current course focuses more on the area of human evolution compared with the prior course, which dedicated only a short segment to this time period. Given Hawks’ excellent prior teaching work, I’m sure this will be of general interest to readers interested in information theory as it relates to evolution, biology, and big history.

I’d love to hear from others in the area of anthropology who are interested in information-theoretic applications.

Forthcoming title on “Information Theory for Bioinformatics”

While reading today I ran across a notice on Wiley’s German-based website that Viswanathan Arunachalam has a text, Information Theory for Bioinformatics, scheduled to be released in June 2014.

The publisher’s website provides the following synopsis:

This book discusses information theory as a means of extracting data from large amounts of biological sequences. Utilizing Shannon theory, the book explains how information theory principles can be used to interpret sequences and extract vital information. It provides a detailed overview of the practical applications in bioinformatics and includes coverage of diversity in nucleotide and amino acid sequences, single-nucleotide polymorphism (SNP) and indel sites, binding sites in promoter regions, splicing sites, and more.
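
To give a flavor of the sort of analysis such a text presumably covers (my own toy sketch, not an excerpt from the book), the Shannon entropy of a sequence’s nucleotide composition measures how many bits per symbol the sequence carries:

```python
from collections import Counter
from math import log2

def sequence_entropy(seq):
    """Shannon entropy (bits per symbol) of a sequence's residue composition."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A maximally mixed stretch of DNA approaches 2 bits per nucleotide;
# a compositionally biased one carries less.
print(sequence_entropy("ACGTACGTACGTACGT"))  # 2.0
print(sequence_entropy("AAAAAAAACCCCGGTT"))  # 1.75
```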

If I can manage to get an early copy, I’ll provide a review shortly.

New Routledge Text on Systems Theory

Over the holiday I ran across a press release, which follows with web links added, for a new book on systems theory. It promises to be an excellent read on the development and philosophy of systems theory for those interested in cybernetics, information theory, complexity and related topics.


MIAMI, Fla., Dec. 19, 2013
Dr. Darrell Arnold, Assistant Professor of Philosophy and Director of the Institute for World Languages and Cultures at St. Thomas University, has published an edited volume with Routledge entitled Traditions of Systems Theory: Major Figures and Contemporary Developments. Hans-Georg Moeller, of University College Cork, Ireland, notes that the book “provides a state-of-the-art survey of the increasingly influential and fascinating field of systems theory. It is a highly useful resource for a wide range of disciplines and contributes significantly to bringing together current trends in the sciences and the humanities.” The book includes 17 articles from leading theoreticians in the field, including pieces by Ranulph Glanville, the President of the American Society for Cybernetics, as well as Debora Hammond, the former President of the International Society for the Systems Sciences. It is the first comprehensive edited volume in English on the major and countervailing developments within systems theory.

Dr. Arnold writes on 19th century German philosophy, contemporary social theory, as well as technology and globalization, with a focus on how these areas relate to the environmental problematic. He has translated numerous books from German, including C. Mantzavinos’s Naturalistic Hermeneutics (Cambridge UP) and Matthias Vogel’s Media of Reason (Columbia UP). Dr. Arnold is also editor-in-chief of the Humanities and Technology Review.

For additional information on St. Thomas University academic programs and faculty publications, please contact Marivi Prado, Chief Marketing Officer, 305.474.6880; mprado@stu.edu.

Dr. Darrell P. Arnold

I’ve ordered my copy and will be providing a review shortly.

Renaissance for Information Theory in Biology

This year appears to be spawning the biggest renaissance in the application of information theory to biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s “Symposium on Information Theory in Biology” at Gatlinburg, Tennessee in 1956. (I might argue it’s possibly even bigger than Claude Shannon’s Ph.D. thesis.) It certainly portends a movement that will rapidly build upon and far surpass Norbert Wiener’s concept of Cybernetics and Ludwig von Bertalanffy’s concept of General Systems Theory.

This week John Baez announced an upcoming three-day workshop on “Entropy and Information in Biological Systems” to be hosted by the National Institute for Mathematical and Biological Synthesis (NIMBioS) in Knoxville, TN, tentatively scheduled for October 22–24, 2014.

Apparently unbeknownst to Baez, earlier this year Andrew Eckford, Toby Berger, and Peter Thomas announced a six-day workshop on “Biological and Bio-Inspired Information Theory” to be hosted by the Banff International Research Station for Mathematical Innovation and Discovery (BIRS), scheduled for October 26–31, 2014 – just two days later!

What a bonanza!!

The BIRS workshop will be a bit more general in its approach while the NIMBioS workshop has a slightly tighter view specifically on maximum entropy as applied to biology.

Even more telling (and perhaps most promising) about the two workshops is the very heavy mathematical bent both intend to make their focus. I have a theory that the bounds of science are held below the high-water mark of mathematics (aka are “bounded by” mathematics, in mathematics-speak), so there is nothing more exciting than to see groups attempting to push the mathematics and its application further. It was both the lack of mathematical rigor and the general youth of biology (and specifically genetics and microbiology) in the 1950s which heavily hampered the early growth of cybernetics as a movement. Fortunately this is no longer the case on either count. Now we just need more researchers who are readily conversant in both realms simultaneously.

Book Review: “Complexity: A Guided Tour” by Melanie Mitchell

Complexity: A Guided Tour
Melanie Mitchell
Popular Science
Oxford University Press
May 28, 2009
Hardcover, 366 pages

This book provides an intimate, highly readable tour of the sciences of complexity, which seek to explain how large-scale complex, organized, and adaptive behavior can emerge from simple interactions among myriad individuals. The author, a leading complex systems scientist, describes the history of ideas, current research, and future prospects in this vital scientific effort.

This is handily one of the best, most interesting, and (to me at least) most useful popularly written science books I’ve ever come across. Most popular science books bore me to tears and end up being only pedantic for their historical backgrounds, but this one is very succinct, with some interesting viewpoints (some of which I agree with and some of which my intuition says are terribly wrong) on the overall structure presented.

For those interested in a general and easily readable high-level overview of some of the areas of research I’ve been interested in (information theory, thermodynamics, entropy, microbiology, evolution, genetics, along with computation, dynamics, chaos, complexity, genetic algorithms, cellular automata, etc.) for the past two decades, this is really a lovely and thought-provoking book.

At the start I was disappointed that there were almost no equations in the book to speak of – perhaps this is why I had purchased it when it came out and why it subsequently sat on my shelf for so long. The other factor that prevented me from reading it was the depth and breadth of other, more technical material I’ve read which covers the majority of topics in the book. I ultimately found myself not minding so much that there weren’t many supporting equations (aside from a few hidden in the notes at the end of the text), in large part because Dr. Mitchell does a fantastic job of pointing out some great subtleties within the various subjects which comprise the broader concept of complexity – subtleties one would generally take several years to come to on one’s own, and at far greater expense of time. She provides a much stronger picture of the overall subjects covered, and this far outweighed the lack of specificity. I honestly wish I had read the book when it was released; it might have helped me be more specific in my own research. Fortunately she does bring up several areas I will need to delve into more deeply and raises several questions which will significantly inform my future work.

In general I wish there had been more references that were new to me, but towards the end there were a handful of topics relating to fractals, chaos, computer science, and cellular automata which I have been either ignorant of or which are further down my reading lists and may need to move closer to the top. I look forward to delving into many of these shortly. As a simple example, I’ve seen Zipf’s law separately from the perspectives of information theory, linguistics, and even evolution, but this is the first time I’ve seen it related to power laws and fractals.
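
Since the connection to power laws was the novel bit for me, here it is in miniature (my sketch, not Mitchell’s): Zipf’s law says the frequency of the r-th most frequent word falls off roughly as 1/r, a special case of a power law, and any power law plots as a straight line on log–log axes:

```python
import numpy as np

# Idealized Zipf distribution: frequency of the r-th ranked word ~ 1 / r.
ranks = np.arange(1, 1001)
freqs = 1.0 / ranks

# A power law f(r) = C * r**(-alpha) is linear in log-log coordinates,
# so a straight-line fit of log(freq) vs. log(rank) recovers -alpha.
slope, intercept = np.polyfit(np.log(ranks), np.log(freqs), 1)
print(f"fitted exponent: {slope:.2f}")  # -1.00 for the idealized case
```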

I definitely appreciated the fact that Dr. Mitchell took the time to point out her own personal feelings on several topics, and more so that she explicitly flagged them as her own gut instincts instead of mentioning them in passing as if they were provable science, which is what far too many other authors would likely have done. There are many viewpoints she takes which I certainly don’t agree with, but I suspect that’s because I’m coming at things from the viewpoint of an electrical engineer with a stronger background in information theory and microbiology, while hers is closer to that of computer science. She does mention that her undergraduate background was in mathematics, but I’m curious what areas she specifically studied, to better understand her particular viewpoints.

Her final chapter, looking at some of the pros and cons of the topic(s), was very welcome, particularly in light of previous philosophic attempts like cybernetics and general systems theory, which I (also) think failed because of their lack of specificity. These caveats certainly help to place the scientific philosophy of complexity into a much larger context. I heartily agree with her viewpoint (and that of others) that there needs to be a more rigorous mathematical theory underpinning the overall effort. I’m sure we’re all wondering “Where is our Newton?” or, to use her clever aphorism, we’re “waiting for Carnot.” (Sounds like it should be a Tom Stoppard play title, doesn’t it?)

I might question her brief inclusion of her own Ph.D. thesis work in the text, but it did actually provide a nice specific and self-contained example within the broader context and also helped to tie several of the chapters together.

My one slight criticism of the work would be the lack of better footnoting within the text. Though many feel that footnote numbers within the text, or notes included at the bottom of the page, detract from the “flow” of the work, I found myself wishing that she had included them here, particularly as I’m one of the few who actually cares about footnotes and wants to know the specific references as I read. I hope that Oxford eventually publishes an e-book version that includes cross-linked footnotes for the benefit of others.

I can heartily recommend this book to any fan of science, but I would specifically recommend it to any undergraduate science or engineering major who is unsure of what they’d like to study and might need some interesting areas to explore. I will mention that one of the tough parts of the concept of complexity is that it is so broad and general that it encompasses over a dozen other fields of study, in any one of which one could earn a Ph.D. without knowing the full depth of that field, much less the full depth of all of them. The book is so well written that I’d even recommend it to senior researchers in any of the above-mentioned fields, as it is sure to provide not only some excellent overview history of each but also to raise questions and thoughts that they’ll want to include in their future research in their own sub-areas of expertise.

Beauty, Melody, and Entropy are an Equivalence Class

Sir Arthur Stanley Eddington, OM, FRS (1882-1944), a British astronomer, physicist, and mathematician
in The Nature of the Physical World, 1927

Book Review: Gregory Chaitin’s “Proving Darwin: Making Biology Mathematical”

Gregory Chaitin’s book Proving Darwin: Making Biology Mathematical, combining biology, microbiology, mathematics, evolution, and even information theory, is directly in my wheelhouse. I had delayed reading it following a few initial poor reviews, and sadly I must confirm that I’m ultimately disappointed in the effort shown here, though there is some very significant value buried within. Unfortunately that value is buried so deeply that very few, if any, will actually make the concerted effort to find it.


This effort does seem a more high-minded and noble attempt than what I would call the “Brian Greene method,” in which an academic seemingly gives up on serious science to publish multiple texts on a popular topic and cash in on public interest through book sales. In this respect Chaitin is closer to Neil deGrasse Tyson in his effort to expound an interesting theory to the broader public and improve the public discourse, though I would admit he’s probably a bit more (self-)interested in pushing his own theory and selling books (or, giving him the benefit of the doubt, perhaps the publisher has pushed him to this).

Though there is a reasonable stab at providing some philosophical background to fit the topic into the broader fabric of science and theory in the later chapters, most of it is rather poorly motivated and is covered far better in other non-technical works. While it is nice to have some semblance of Chaitin’s philosophy and feelings, the inclusion of this type of material only tends to soften the blow of his theoretical work and makes the text read more like pseudo-science or simple base philosophy without any actual rigorous underpinning.

I’m assuming that his purpose in writing the book is to make the theories he’s come up with in his primary paper on the topic more accessible to the broader community of science as well as the public itself. It’s easy for a groundbreaking piece of work to be hidden in the broader scientific literature, but Chaitin seems to be using his pedestal as a reasonably popular science writer to increase the visibility of his work here. He admittedly mentions that this effort stems from a hobby, as his primary area is algorithmic information theory and computer science, not biology or evolution, though his meager references in the text do at least indicate some facility with the “right” sources in these latter areas.

Speaking from a broad public perspective, there is certainly enough interest in this general topic to warrant such a book, though based on the reviews of others via Amazon, Goodreads, etc., the book has sadly missed its mark. He unfortunately sticks too closely to the rule that the inclusion of mathematical equations is detrimental to the sale of one’s books. Sadly, his broader point seems lost on the general public, as his ability to analogize his work isn’t as strong as Brian Greene’s with respect to theoretical physics (string theory).

From the higher perspective of a researcher who works in all of the relevant areas relating to the topic, I was even more underwhelmed with the present text, aside from the single URL pointing to the original, much more technical paper Chaitin wrote in 2010. To me this was the most valuable part of the entire text, though he does provide some small amount of reasonable detail in his appendix.

I can certainly appreciate Chaitin’s enthusiastic following of John von Neumann, but I’m disappointed in his lack of acknowledgement of Norbert Wiener or Claude Shannon, both of whom collaborated with von Neumann in the middle part of the 20th century. I’m sure Chaitin is more than well aware of the father of information theory, but I’ll be willing to bet that although he’s probably read Shannon’s famous master’s thesis and his highly influential Bell Labs article “A/The Mathematical Theory of Communication”, he is, like most, shamefully and wholly unaware of Shannon’s MIT doctoral thesis.

Given Chaitin’s own personal aim to further the acceptance of his own theories and work and the goal of the publisher to sell more copies, I would mention a few recommendations for future potential editions:

  • The greater majority of his broader audience will have at least a passably reasonable understanding of biology and evolution, but very little, if any, understanding of algorithmic information theory. He would be better off expounding upon this latter subject to bring people up to speed so they can better understand his viewpoint and his subsequent proof.
  • Though I understand the need to be relatively light in regard to the number of equations and technicalities included, Chaitin could follow some of his heroes of mathematical exposition and do a slightly better job of explaining what is going on here. He could also go a long way toward adding significant material to the appendices to help higher-end general readers, and specifically biologists, understand more of the technicalities of algorithmic information theory and better follow his proof, which should appear in intricate glory in the appendix as well.
  • I might also recommend excising some of the more philosophical material, which tends to undermine his scientific “weight.”
  • Though I found it interesting that he gives a mathematical definition of “intelligent design”, I have a feeling its intricacies were lost on most of his readership – this point alone could go a long way towards solidifying the position of evolution amongst non-scientists, particularly in America, and win the support of heavyweights like Dawkins himself.

I’ll agree wholeheartedly with one reviewer who said that Chaitin tends to “state small ideas repeatedly, and every time at the same shallow level with astonishing amount of redundancy (mostly consisting of chit-chat and self congratulations)”. This certainly detracted from my enjoyment of the work. Chaitin also includes an awful lot of name-dropping of significant scientific figures tangential to the subject at hand. This might have been more impressive had he included the results of his discussions with them about the subject, but I’m left with the impression that he simply said hello, shook their hands, and at best was inspired by having met them. It’s nice that he’s had these experiences, but it doesn’t help me to believe or follow his own work.

For the technically interested reader, save yourself some time and simply skim through chapter five and a portion of the appendix relating to his proof and then move on to his actual paper. For the non-technical reader, I expect you’ll get more out of reading Richard Dawkins’ early work (The Selfish Gene) or possibly Werner R. Loewenstein’s The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life.

Though I would certainly agree that we could use a mathematical proof of evolution, and that Chaitin has made a reasonable theoretical stab at one, this book sadly wasn’t the best vehicle to motivate broader interest in such an effort. I’ll give him five stars for effort and three for general content, but in the end, for most readers it will have to be at most a two-star work overall.

This review was originally published on June 17, 2013.

Regard the World as Made of Information

John Archibald Wheeler (1911-2008), American theoretical physicist
[attributed by Jacob Bekenstein in “Information in the Holographic Universe” (Scientific American, 2003)]

Brief Book Review: James Gleick’s “The Information: A History, a Theory, a Flood”

Overall, James Gleick’s book The Information: A History, a Theory, a Flood is an excellent read. Given that it’s an area in which I’m intimately interested, I’m not too surprised that most of it was “review” for me, but I’d highly recommend it to the general public as a way to learn some of the excellent history, philosophy, and theory which Gleick so nicely summarizes throughout the book.


There are one or two references in the back which I’ll have to chase down and read, and one or two which, after many years, seem like they may be worth revisiting now that I’ve completed this.

Even for the specialist, Gleick manages to tie together some disparate thoughts to create an excellent whole, which makes it a very worthwhile read. I found that towards the last several chapters Gleick’s style becomes much more flowery and less concrete, but most of this is a result of covering the “humanities” perspective of information, as opposed to the earlier parts of the text, which were more specific to the history and scientific theories he covered.

Review originally posted at GoodReads.com.

The next major thrust in biology

Werner R. Loewenstein, biologist and physiologist
in The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life, Oxford University Press, 1999

You and I Are Not Much Different from Cans of Soup

Philip Nelson, American physicist
in Biological Physics: Energy, Information, Life