What is Information? by Christoph Adami

Bookmarked What is Information? [1601.06176] by Christoph Adami (arxiv.org)

Information is a precise concept that can be defined mathematically, but its relationship to what we call "knowledge" is not always made clear. Furthermore, the concepts "entropy" and "information", while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.
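Adami's prediction-based framing can be made concrete: entropy measures how uncertain we are about a variable, while information is the reduction in that uncertainty that knowledge of one variable provides about another. A minimal Python sketch of this distinction (the joint distribution below is made up purely for illustration):

```python
from math import log2

def entropy(p):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(x * log2(x) for x in p if x > 0)

# Hypothetical joint distribution p(x, y) over two binary variables
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y)
px = [sum(v for (x, _), v in joint.items() if x == a) for a in (0, 1)]
py = [sum(v for (_, y), v in joint.items() if y == b) for b in (0, 1)]

h_x = entropy(px)                      # uncertainty about X alone
h_y = entropy(py)                      # uncertainty about Y alone
h_xy = entropy(list(joint.values()))   # joint uncertainty

# Information = reduction in uncertainty: I(X;Y) = H(X) + H(Y) - H(X,Y),
# i.e. how much better we can predict X after observing Y
mutual_info = h_x + h_y - h_xy
print(f"H(X) = {h_x:.3f} bits, I(X;Y) = {mutual_info:.3f} bits")
```

With this toy distribution the two variables are correlated, so observing one yields a fraction of a bit of information about the other; with an independent joint distribution the mutual information would drop to zero.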

Comments: 19 pages, 2 figures. To appear in Philosophical Transactions of the Royal Society A
Subjects: Adaptation and Self-Organizing Systems (nlin.AO); Information Theory (cs.IT); Biological Physics (physics.bio-ph); Quantitative Methods (q-bio.QM)
Cite as: arXiv:1601.06176 [nlin.AO] (or arXiv:1601.06176v1 [nlin.AO] for this version)

[v1] Fri, 22 Jan 2016 21:35:44 GMT (151kb,D) [.pdf]

A proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.

The Information Universe Conference

"The Information Universe" Conference in The Netherlands in October hits several of the sweet spots for areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology.

Yesterday, via a notification from Lanyrd, I came across a notice for the upcoming conference “The Information Universe”, which hits several of the sweet spots for areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology. It is scheduled for October 7-9, 2015 at the Infoversum Theater in Groningen, The Netherlands.

I’ll let their site speak for itself below, but they already have an interesting line up of speakers including:

Keynote speakers

• Erik Verlinde, Professor Theoretical Physics, University of Amsterdam, Netherlands
• Alex Szalay, Alumni Centennial Professor of Astronomy, The Johns Hopkins University, USA
• Gerard ‘t Hooft, Professor Theoretical Physics, University of Utrecht, Netherlands
• Gregory Chaitin, Professor Mathematics and Computer Science, Federal University of Rio de Janeiro, Brazil
• Charley Lineweaver, Professor Astronomy and Astrophysics, Australian National University, Australia
• Lude Franke, Professor System Genetics, University Medical Center Groningen, Netherlands

Conference synopsis from their homepage:

Additional details about the conference including the participants, program, venue, and registration can also be found at their website.

NIMBioS Workshop: Information Theory and Entropy in Biological Systems

Web resources for participants in the NIMBioS Workshop on Information Theory and Entropy in Biological Systems.

Over the next few days, I’ll be maintaining a Storify story covering information related to and coming out of the Information Theory and Entropy Workshop being sponsored by NIMBioS at the University of Tennessee, Knoxville.

For those in attendance or participating by watching the live streaming video (or even watching the video after-the-fact), please feel free to use the official hashtag, and I’ll do my best to include your tweets, posts, and material in the story stream for future reference.

For journal articles and papers mentioned in/at the workshop, I encourage everyone to join the Mendeley.com group ITBio: Information Theory, Microbiology, Evolution, and Complexity and add them to the group’s list of papers. Think of it as a collaborative online journal club of sorts.

Those participating in the workshop are also encouraged to take a look at a growing collection of researchers and materials I maintain here. If you have materials or resources you’d like to contribute to the list, please send me an email or include them via the suggestions/submission form or include them in the comments section below.

BIRS Workshop on Biological and Bio-Inspired Information Theory | Storify Stream

Over the span of the coming week, I'll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.

Over the span of the coming week, I’ll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.

Editor’s note: On 12/12/17 Storify announced they would be shutting down. As a result, I’m replacing the embedded version of the original data served by Storify with an HTML copy, which can be found below:

BIRS: Biological and Bio-Inspired Information Theory

A 5 Day workshop on Biology and Information Theory hosted by the Banff International Research Station

1. I know where I’ll be in Oct 2014! Let’s hear it for Biology & Information Theory!  https://www.birs.ca/events/2014/5-day-workshops/14w5170  #ITBio #Banff @andreweckford
2. . @andreweckford You might be interested in this grouping of research papers:  http://www.mendeley.com/groups/2545131/itbio/  #ITBio #Banff
3. Wishing I was at the Gene Regulation and Information Theory meeting starting tomorrow  http://bit.ly/XnHRZs  #ITBio
4. #ITBio: @andreweckford has a new book on Molecular Communication available Oct 31.  http://bit.ly/15uEUzF
5. Mathematical and Statistical Models for Genetic Coding starts today.  http://www.am.hs-mannheim.de/genetic_code_2013.php?id=1  @andreweckford might borrow attendees for BIRS
6. Mathematical Foundations for Information Theory in Diffusion-Based Molecular Communications  http://bit.ly/1aTVR2c  #ITBio
7. Bill Bialek giving plenary talk “Information flow & order in real biological networks” at Feb 2014 workshop  http://mnd.ly/19LQH8f  #ITBio
8. Workshop on Information Theoretic Incentives for Artificial Life  http://jhu.md/1lM8tAn  #ITBio #ALife14 @alifeofficial @14thALIFE @cxdig
9. Researchers working in information theory & biology  http://jhu.md/1gieQGR  #ITBio @andreweckford @ChristophAdami @wbialek @johnhawks
10. CECAM Workshop: “Entropy in Biomolecular Systems” starts May 14 in Vienna.  http://jhu.md/1faLR8t  #ITBio pic.twitter.com/Ty8dEIXQUT
11. Currently organizing my Banff workshop on bio-information theory …  https://www.birs.ca/events/2014/5-day-workshops/14w5170
12. Last RT: wonder what the weather is going to be like at the end of October for my @BIRS_Math workshop
13. @JoVanEvery I’m organizing a workshop in Banff in October … hopefully this isn’t a sign of weather to come!
14. How information theory could hold the key to quantifying nature.  http://wrd.cm/1uy1xdX  by @vero_greenwood pic.twitter.com/ek5DUb2Ul9
15. Good morning from Banff. Current temp: -1 C
16. Banff takes its name from the town of Banff, Scotland, not to be confused with Bamff, also Scotland.
17. Good morning from beautiful Banff. How can you not love the mountains? pic.twitter.com/mxYBNz7yzl
18. “Not an obvious connection between utility and information, just as there is no obvious connection between energy and entropy” @BIRS_Math
19. Peter Thomas (Case Western Reserve University), Signal Transduction and Information Theory  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410270940-Thomas.mp4
20. Last RT: a lot of discussion of my signal transduction work with Peter Thomas.
21. Live now: Nicolo Michelusi of @USCViterbi on Stochastic Model for Electron Transfer in Bacterial Cables  http://www.birs.ca/live  #ITBio
22. Nicolo Michelusi (University of Southern California), A Stochastic Model for Electron Transfer in Bacterial Cables  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271450-Michelusi.mp4
23. Listening to the always awesome @cnmirose talk about the ultimate limits of molecular communication.
24. “Timing is fundamental … subsumes time-varying concentration channel” @cnmirose @BIRS_Math
25. Chris Rose (Rutgers University), Molecular Communication Channels: timing vs. payload  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271538-Rose.mp4
26. Standard opening quote of these talks: “I’m not a biologist, but …” @BIRS_Math
27. Stefan Moser (ETH Zurich), Capacity Bounds of the Memoryless AIGN Channel – a Toy-Model for Molecular Communicat…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271610-Moser.mp4
28. Weisi Guo (University of Warwick), Communication Envelopes for Molecular Diffusion and Electromagnetic Wave Propag…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271643-Guo.mp4
29. Terrific introduction of Canada/Banff by Andrew Eckford (York): The Landscape  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410270858-Eckford.mp4
30. Biological and Bio-Inspired Information Theory workshop videos!  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos  @BIRS_Math
31. .@ChrisAldrich @andreweckford @Storify @BIRS_Math Sounds like a fascinating workshop on bioinformation theory in Banff.
32. Toby Berger, winner of the 2002 Shannon award, speaking right now. @BIRS_Math
33. Naftali Tishby (Hebrew University of Jerusalem), Sensing and acting under information constraints – a principled a…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281032-Tishby.mp4
34. “…places such as BIRS and the Banff Centre exist to facilitate the exchange and pursuit of knowledge.” S. Sundaram  http://www.birs.ca/testimonials/#testimonial-1454
35. We’re going for a hike tomorrow. Many thanks to Lukas at the @ParksCanada info centre in Banff for helpful advice! @BIRS_Math
36. Behnaam Aazhang (Rice University), Real-Time Network Modulation for Intractable Epilepsy  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281337-Aazhang.mp4
37. Alexander Dimitrov (Washington State University), Invariant signal processing in auditory biological systems  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281416-Dimitrov.mp4
38. Joel Zylberberg (University of Washington), Communicating with noisy signals: lessons learned from the mammalian v…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281450-Zylberberg.mp4
39. Robert Schober (Universitat Erlangen-Nurnberg), Intersymbol interference mitigation in diffusive molecular communi…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281549-Schober.mp4
40. Rudolf Rabenstein (Friedrich-Alexander-Universitat Erlangen-Nurnberg (FAU)), Modelling Molecular Communication Cha…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281627-Rabenstein.mp4
42. This week @BIRS_Math “Biological and Bio-Inspired Information Theory” @thebanffcentre #biology #math @NSF
43. “Your theory might match the data, but the data might be wrong” – Crick @BIRS_Math
44. So information theory seems to be a big deal in ecology. @BIRS_Math
45. Tom Schneider (National Institutes of Health), Three Principles of Biological States: Ecology and Cancer  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410290904-Schneider.mp4
46. “In biodiversity, the entropy of an ecosystem is the expected … information we gain about an organism by learning its species” @BIRS_Math
47. Seriously, I’m blown away by this work in information theory in ecology. Huge body of work; I had no idea. @BIRS_Math
48. .@andreweckford @BIRS_Math Harte’s book Maximum Entropy & Ecology is excellent in this area  http://amzn.to/1DwIl3V  pic.twitter.com/EIBDpM35uf
49. .@andreweckford @QuantaMagazine had a nice overview of some of John Harte’s work in September  http://bit.ly/1DwIWCD  @BIRS_Math
50. Chan-Byoung Chae (Yonsei University), Molecular MIMO: From Theory to Practice  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281705-Chae.mp4
51. John Baez (University of California, Riverside), Biodiversity, entropy and thermodynamics  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410291038-Baez.mp4
52. I encourage @BIRS_Math attendees at Biological & Bio-Inspired Information Theory to contribute references here:  http://bit.ly/1jQwObk
53. Christoph Adami (Michigan State University), Some Information-Theoretic Musings Concerning the Origin and Evolutio…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410291114-Adami.mp4
54. .@ChristophAdami talk Some Information-Theoretic Musings Concerning the Origin of Life @BIRS_Math this morning #ITBio pic.twitter.com/VA8komuuSW
55. ICYMI @ChristophAdami had great paper: Information-theoretic Considerations on Origin of Life on arXiv  http://bit.ly/1yIhK2Q  @BIRS_Math
56. Johnston Canyon selfie @BIRS_Math pic.twitter.com/MEKeY5To5s
57. Baez has a post on Tishby’s talk “Sensing & Acting Under Information Constraints”  http://bit.ly/1yIDonR  @BIRS_Math pic.twitter.com/dFuiVLFSGC
58. INFORMATION THEORY is the new central …
59. I’m listening to a talk on the origin of life at a workshop on Biological and Bio-Inspired Information Theory. …  https://plus.google.com/117562920675666983007/posts/gqFL7XY3quF
60. Ilya Nemenman @EmoryUniversity on Predictive information  http://bit.ly/1titfOw
61. Now accepting applications for the #Research Collaboration Workshop for Women in #MathBio at NIMBioS  http://ow.ly/DzeZ7
62. Inkpots selfie from yesterday’s hike. @BIRS_Math pic.twitter.com/0A6ZQsQVwE
63. On the way home from Inkpots. @BIRS_Math pic.twitter.com/1XhO8mLOkq
64. Toby Berger (University of Virginia), Neuroscience Applications of GIG Distributions  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410280914-Berger.mp4
65. We removed a faulty microphone from our lecture room this morning. We’re now fixing the audio buzz in this week’s videos, and reposting.
66. Daniel Polani (University of Hertfordshire), Informational Principles in Perception-Action Loops  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301038-Polani.mp4
67. Didn’t get enough information theory & biology this week @BIRS_Math? Apply for NIMBioS workshop in April 2015  http://bit.ly/1yIeiWe  #ITBio
68. Amin Emad (University of Illinois at Urbana-Champaign), Applications of Discrete Mathematics in Bioinformatics  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301329-Emad.mp4
69. Paul Bogdan (University of Southern California), Multiscale Analysis Reveals Complex Behavior in Bacteria Populati…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301401-Bogdan.mp4
70. Robert Shaw (ProtoLife Inc.), Information and Causality in a Reaction-Diffusion System  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301434-Shaw.mp4
71. Lubomir Kostal (Institute of Physiology, Academy of Sciences of the Czech Republic), Efficient information transmi…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301534-Kostal.mp4
72. Nima Soltani (Stanford University), Applications of Directed Information to Neuroscience  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301647-Soltani.mp4
73. Banff ☀️❄️🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲❤️
74. @lrvarshney I shoulda invited you to this BIRS workshop …
75. @conservativelez I’m a big fan of your dad’s research & was reminded of much of it via a workshop on Biological Information Theory
76. @conservativelez Though he may not have been able to attend, he can catch most of the talks online if he’d like  https://www.birs.ca/events/2014/5-day-workshops/14w5170
77. Depressed that @BIRS_Math Workshop on Biological & Bio-Inspired Information Theory is over? Relive it here:  http://bit.ly/1rF3G4B  #ITBio
78. Kudos @andreweckford, Toby Berger, Peter Thomas, @NGhoussoub, @BIRS_Math & friends on a fantastic workshop!  http://bit.ly/1ckttZq
79. This @BIRS_Math Workshop was biggest thing in #informationtheory & #biology since the Gatlinburg Symposium in 1956.  http://bit.ly/1rF4RRr
80. See you later Calgary. pic.twitter.com/mkmU6yrmVz
81. A few thoughts about that workshop while I wait for my flight back to Toronto.
82. 1/ Everyone I talked to said it was the best workshop they’d ever been to, and they’d like to do a follow-up workshop @BIRS_Math
83. 2/ There is an amazing diversity of work under the umbrella of “information theory”. @BIRS_Math
84. 3/ Much of this work is outside the IT mainstream, and an issue is that people use different terms for related concepts. @BIRS_Math
85. 4/ Some community building is in order. I think this workshop was a good first step. @BIRS_Math
86. 5/ Many many thanks to @BIRS_Math and huge kudos to @NGhoussoub for excellent service to the Canadian scientific community. BIRS is a gem.
87. 6/ Also many thanks to the participants for their excellent talks, and to @ChrisAldrich for maintaining a Storify.
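One of the tweets above notes that, in biodiversity, the entropy of an ecosystem is the expected information we gain about an organism by learning its species. A minimal Python sketch of that calculation, using made-up species counts purely for illustration:

```python
from math import log2

# Hypothetical species counts for a small ecosystem
counts = {"pine": 60, "spruce": 25, "aspen": 10, "fir": 5}
total = sum(counts.values())

# Shannon entropy: the expected number of bits of information gained
# by learning which species a randomly sampled organism belongs to
shannon_index = -sum((n / total) * log2(n / total) for n in counts.values())
print(f"Ecosystem entropy: {shannon_index:.3f} bits per organism")
```

A perfectly even community of four species would give the maximum of 2 bits; the skew toward one dominant species here lowers the expected information, which is exactly why ecologists use this quantity (the Shannon diversity index) to measure biodiversity.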

Information Theory is the New Central Discipline

Replied to Information Theory is the new central discipline. by Nassim Nicholas Taleb (facebook.com)

I’m coming to this post a bit late as I’m playing a bit of catch up, but agree with it wholeheartedly.

In particular, applications to molecular biology and medicine have really begun to come to a boil in just the past five years. This year appears to mark the biggest renaissance in the application of information theory to biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s “Symposium on Information Theory in Biology” at Gatlinburg, Tennessee in 1956.

Upcoming/recent conferences/workshops on information theory in biology include:

At the beginning of September, Christoph Adami posted an awesome and very sound paper on arXiv entitled “Information-theoretic considerations concerning the origin of life”, which truly promises to turn the science of the origin of life on its head.

I’ll note in passing, for those interested, that Claude Shannon’s famous master’s thesis at MIT (in which he applied Boolean algebra to electric circuits, enabling the digital revolution) and his subsequent “A Mathematical Theory of Communication” were so revolutionary that nearly everyone forgets his MIT Ph.D. thesis, “An Algebra for Theoretical Genetics”. That thesis presaged the field of cybernetics and the current applications of information theory to microbiology, and is probably as seminal as R.A. Fisher’s applications of statistics to science in general and biology in particular.

For those commenting on the post who were interested in a layman’s introduction to information theory, I recommend John Robinson Pierce’s An Introduction to Information Theory: Symbols, Signals and Noise (Dover has a very inexpensive edition). After this, one should take a look at Claude Shannon’s original paper. (The University of Illinois Press printing includes an excellent overview by Warren Weaver along with the paper itself.) The mathematics in the paper really isn’t too technical, and most of it should be comprehensible to advanced high school students.

For those that don’t understand the concept of entropy, I HIGHLY recommend Arieh Ben-Naim’s book Entropy Demystified: The Second Law Reduced to Plain Common Sense with Seven Simulated Games. He really does strip the concept down to its most basic form in a way I haven’t seen anyone else come remotely close to, and which even my mother can comprehend (with no mathematics at all). (I recommend this presentation even to those with Ph.D.s in physics because it is so truly fundamental.)

For the more advanced mathematicians, physicists, and engineers, Arieh Ben-Naim does a truly spectacular job of extending E.T. Jaynes’ work on information theory and statistical mechanics, arriving at a more coherent mathematical theory that conjoins the entropy of physics/statistical mechanics with that of Shannon’s information theory, in A Farewell to Entropy: Statistical Thermodynamics Based on Information.

For the advanced readers/researchers interested in more at the intersection of information theory and biology, I’ll also mention that I maintain a list of references, books, and journal articles in a Mendeley group entitled “ITBio: Information Theory, Microbiology, Evolution, and Complexity.”

Announcement: Special issue of Entropy: “Information Theoretic Incentives for Cognitive Systems”

Dr. Christoph Salge asked me to cross-post this notice from the Entropy site here.

Editor’s Note: Dr. Christoph Salge asked me to cross-post the following notice from the Entropy site here. I’m sure many of the readers of the blog will find the topic to be of interest.

Dear Colleagues,

In recent years, ideas such as “life is information processing” or “information holds the key to understanding life” have become more common. However, how can information, or more formally Information Theory, increase our understanding of life, or life-like systems?

Information Theory not only has a profound mathematical basis, but also typically provides an intuitive understanding of processes such as learning, behavior, and evolution in terms of information processing.

In this special issue, we are interested in both:

1. the information-theoretic formalization and quantification of different aspects of life, such as driving forces of learning and behavior generation, information flows between neurons, swarm members and social agents, and information theoretic aspects of evolution and adaptation, and
2. the simulation and creation of life-like systems with previously identified principles and incentives.

Topics with relation to artificial and natural systems:

• information theoretic intrinsic motivations
• information theoretic quantification of behavior
• information theoretic guidance of artificial evolution
• information theoretic guidance of self-organization
• information theoretic driving forces behind learning
• information theoretic driving forces behind behavior
• information theory in swarms
• information theory in social behavior
• information theory in evolution
• information theory in the brain
• information theory in system-environment distinction
• information theory in the perception action loop
• information theoretic definitions of life

Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1200 CHF (Swiss Francs).

Deadline for manuscript submissions: 28 February 2015

Special Issue Editors

Guest Editor
Dr. Christoph Salge
Adaptive Systems Research Group, University of Hertfordshire, College Lane, AL10 9AB Hatfield, UK
Website: http://homepages.stca.herts.ac.uk/~cs08abi
E-Mail: c.salge@herts.ac.uk
Phone: +44 1707 28 4490
Interests: Intrinsic Motivation (Empowerment); Self-Organization; Guided Self-Organization; Information-Theoretic Incentives for Social Interaction; Information-Theoretic Incentives for Swarms; Information Theory and Computer Game AI

Guest Editor
Dr. Georg Martius
Cognition and Neurosciences, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
Website: http://www.mis.mpg.de/jjost/members/georg-martius.html
E-Mail: martius@mis.mpg.de
Phone: +49 341 9959 545
Interests: Autonomous Robots; Self-Organization; Guided Self-Organization; Information Theory; Dynamical Systems; Machine Learning; Neuroscience of Learning; Optimal Control

Guest Editor
Dr. Keyan Ghazi-Zahedi
Information Theory of Cognitive Systems, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
Website: http://personal-homepages.mis.mpg.de/zahedi
E-Mail: zahedi@mis.mpg.de
Phone: +49 341 9959 535
Interests: Embodied Artificial Intelligence; Information Theory of the Sensorimotor Loop; Dynamical Systems; Cybernetics; Self-organisation; Synaptic plasticity; Evolutionary Robotics

Guest Editor
Dr. Daniel Polani
Department of Computer Science, University of Hertfordshire, Hatfield AL10 9AB, UK
Website: http://homepages.feis.herts.ac.uk/~comqdp1/
E-Mail: d.polani@herts.ac.uk
Interests: artificial intelligence; artificial life; information theory for intelligent information processing; sensor Evolution; collective and multiagent systems

CECAM Workshop: “Entropy in Biomolecular Systems”

On Friday, I had an excellent and stimulating conversation with Arieh Ben-Naim about his recent writing and work, and he mentioned in passing that he had been invited to a conference relating to entropy and biology in Vienna. A quick web search turned it up, and not having heard about it myself yet, I thought I’d pass it along to other regular readers interested in the area.

The workshop on “Entropy in Biomolecular Systems” is being hosted by the Centre Européen de Calcul Atomique et Moléculaire (CECAM).

Location: DACAM, Max F. Perutz Laboratories, University of Vienna, Dr. Bohrgasse 9, A-1030, Vienna, Austria
Dates: May 14, 2014 to May 17, 2014

The workshop is being organized by:

• Richard Henchman (University of Manchester, United Kingdom)
• Bojan Zagrovic (University of Vienna, Austria)
• Michel Cuendet (Swiss Institute of Bioinformatics, Lausanne, Switzerland and Weill Cornell Medical College, New York, USA)
• Chris Oostenbrink (University of Natural Resources and Life Sciences, Austria)

It’s being supported by CECAM, the European Research Council, and the Royal Society of Chemistry’s Statistical Mechanics and Thermodynamics Group.

I’ll note that the registration deadline is on April 21 with a payment deadline of April 30, so check in quickly if you haven’t already.

The summary from the workshop website states:

This workshop brings together the world’s experts to address the challenges of determining the entropy of biomolecular systems, either by experiment or computer simulation. Entropy is one of the main driving forces for any biological process such as binding, folding, partitioning and reacting. Our deficient understanding of entropy, however, means that such important processes remain controversial and only partially understood. Contributions of water, ions, cofactors, and biomolecular flexibility are actively examined but yet to be resolved. The state-of-the-art of each entropy method will be presented and explained, highlighting its capabilities and deficiencies. This will be followed by intensive discussion on the main areas that need improving, leading to suitable actions and collaborations to address the main biological and industrial questions.

Further details on the workshop can be found on the CECAM website.

As always, details on other upcoming workshops and conferences relating to information theory and biology can be found on our ITBio Conferences/Workshops page.

Tool Review: Zojirushi Stainless Steel Mug

Designer/Artist William Morris once said, “Have nothing in your house that you do not know to be useful, or believe to be beautiful.” My Zojirushi stainless steel mug is one of the few things I’ve ever owned that I feel truly meets both of these criteria.

The design, materials, manufacturing and workmanship of the mug are nothing short of outstanding; the aesthetics and heft in the hand are truly fantastic. I really could not want for more out of such a product. I love looking at it, I love holding it, and I love using it.

I hope one day to come back and write a review worthy of how truly great this travel mug is, but for now, suffice it to say that I’m in love. I spent a LOT of time reading reviews on Amazon and elsewhere, and searching stores and vendors to find the best thermos/mug on the planet and settled on this one. Not only is it easy and intuitive to take completely apart and wash thoroughly (too many I’ve come across are impossible to take apart and clean properly, if at all), but it seals completely and doesn’t spill.

Even better it keeps my beverages piping hot or cold for far longer than I wish it would. There have been days that I’ve filled it with hot coffee or tea and come back several times to drink it hoping that it had cooled a bit only to find it still too hot to consume. After several rounds with this over an eight hour span, I finally opened it up and put in some ice so I could finally drink my coffee. Now I often just leave the cap open (or off) to let it cool a bit more quickly, although even this is a fairly slow process. Now I try to put my beverages in at the temperature I want to drink them knowing that that’s generally the temperature they’ll be when I get around to drinking them.

I love the fact that the cap is designed with a two stage opening mechanism (which probably won’t be noticed by most users because it’s so subtle). One pushes the button and the top opens just a few millimeters. Then letting go of the button allows the top to spring back and click neatly into place so that it doesn’t fall forward and bonk one on the nose when attempting to take a drink.

When I first came across it, I will admit I was a bit hesitant at its relatively high price (particularly in comparison with cheaper mugs on the market, many of which I’ve tried and been highly disappointed with), but the Zojirushi is certainly worth every penny; I would not hesitate for a moment to buy more of these.

As a small aside, I will mention that, due to physics and the design of the mug, it can occasionally leak a bit when filled with carbonated beverages and then shaken. Doing this creates additional interior pressure that pushes up the internal seal mechanism on the cap, allowing a small amount of liquid to escape. Beyond this small category of fluids, which I infrequently use with the mug (and I’m sure others probably won’t either), it has been absolutely airtight and worry-free.

Rating 5 out of 5 stars.

Review by Chris Aldrich

New Routledge Text on Systems Theory

Over the holiday I ran across a press release, which follows with web links added, for a new book on systems theory. It promises to be an excellent read on the development and philosophy of systems theory for those interested in cybernetics, information theory, complexity and related topics.

MIAMI, Fla., Dec. 19, 2013
Dr. Darrell Arnold, Assistant Professor of Philosophy and Director of the Institute for World Languages and Cultures at St. Thomas University, has published an edited volume with Routledge entitled Traditions of Systems Theory: Major Figures and Contemporary Developments. Hans-Georg Moeller, of University College Cork, Ireland, notes that the book “provides a state-of-the-art survey of the increasingly influential and fascinating field of systems theory. It is a highly useful resource for a wide range of disciplines and contributes significantly to bringing together current trends in the sciences and the humanities.” The book includes 17 articles from leading theoreticians in the field, including pieces by Ranulph Glanville, the President of the American Society for Cybernetics, as well as Debora Hammond, the former President of the International Society for Systems Sciences. It is the first comprehensive edited volume in English on the major and countervailing developments within systems theory.

Dr. Arnold writes on 19th century German philosophy, contemporary social theory, as well as technology and globalization, with a focus on how these areas relate to the environmental problematic. He has translated numerous books from German, including C. Mantzavinos’s Naturalistic Hermeneutics (Cambridge UP) and Matthias Vogel’s Media of Reason (Columbia UP). Dr. Arnold is also editor-in-chief of the Humanities and Technology Review.

I’ve ordered my copy and will be providing a review shortly.

Book Review: “Complexity: A Guided Tour” by Melanie Mitchell

Read Complexity: A Guided Tour by Melanie Mitchell (amzn.to)
Complexity: A Guided Tour
Melanie Mitchell
Popular Science
Oxford University Press
May 28, 2009
Hardcover
366

This book provides an intimate, highly readable tour of the sciences of complexity, which seek to explain how large-scale complex, organized, and adaptive behavior can emerge from simple interactions among myriad individuals. The author, a leading complex systems scientist, describes the history of ideas, current research, and future prospects in this vital scientific effort.

This is handily one of the best, most interesting, and (to me at least) most useful popularly written science books I’ve yet come across. Most popular science books bore me to tears and end up being merely pedantic recountings of their historical backgrounds, but this one is very succinct, with some interesting viewpoints (some of which I agree with and some of which my intuition says are terribly wrong) on the overall structure presented.

For those interested in a general and easily readable high-level overview of some of the areas of research I’ve been interested in (information theory, thermodynamics, entropy, microbiology, evolution, genetics, along with computation, dynamics, chaos, complexity, genetic algorithms, cellular automata, etc.) for the past two decades, this is really a lovely and thought-provoking book.

At the start I was disappointed that there were almost no equations in the book to speak of – and perhaps this is why I had purchased it when it came out and it has subsequently been sitting on my shelf for so long. The other factor that prevented me from reading it was the depth and breadth of other more technical material I’ve read which covers the majority of topics in the book. I ultimately found myself not minding so much that there weren’t many supporting equations, aside from a few hidden in the notes at the end of the text, in large part because Dr. Mitchell does a fantastic job of pointing out some great subtleties within the various subjects that comprise the broader concept of complexity – subtleties one would generally take several years to come to on one’s own, and at far greater expense of time. Here she provides a much stronger picture of the overall subjects covered, and this far outweighed the lack of specificity. I honestly wish I had read the book when it was released; it may have helped me to be more specific in my own research. Fortunately she does bring up several areas I will need to delve more deeply into, and she raised several questions which will significantly inform my future work.

In general, I wish there were more references I hadn’t read or been aware of yet, but towards the end there were a handful of topics relating to fractals, chaos, computer science, and cellular automata which I have been either ignorant of or which are further down my reading lists and may need to move closer to the top. I look forward to delving into many of these shortly. As a simple example, I’ve seen Zipf’s law separately from the perspectives of information theory, linguistics, and even evolution, but this is the first time I’ve seen it related to power laws and fractals.

I definitely appreciated the fact that Dr. Mitchell took the time to point out her own personal feelings on several topics, and more so that she explicitly pointed them out as her own gut instincts instead of mentioning them in passing as if they were provable science, which is what far too many other authors would likely have done. There are many viewpoints she takes which I certainly don’t agree with, but I suspect that it’s because I’m coming at things from the viewpoint of an electrical engineer with a stronger background in information theory and microbiology, while hers is closer to that of computer science. She does mention that her undergraduate background was in mathematics, but I’m curious what areas she specifically studied, as it would give me a better understanding of her particular viewpoints.

Her final chapter looking at some of the pros and cons of the topic(s) was very welcome, particularly in light of previous philosophic attempts like cybernetics and general systems theory which I (also) think failed because of their lack of specificity. These caveats certainly help to place the scientific philosophy of complexity into a much larger context. I will generally heartily agree with her viewpoint (and that of others) that there needs to be a more rigorous mathematical theory underpinning the overall effort. I’m sure we’re all wondering “Where is our Newton?” or to use her clever aphorism that we’re “waiting for Carnot.” (Sounds like it should be a Tom Stoppard play title, doesn’t it?)

I might question her brief inclusion of her own Ph.D. thesis work in the text, but it did actually provide a nice specific and self-contained example within the broader context and also helped to tie several of the chapters together.

My one slight criticism of the work would be the lack of better footnoting within the text. Though many feel that footnote numbers within the text or inclusion at the bottom of the pages detracts from the “flow” of the work, I found myself wishing that she had done so here, particularly as I’m one of the few who actually cares about the footnotes and wants to know the specific references as I read. I hope that Oxford eventually publishes an e-book version that includes cross-linked footnotes in the future for the benefit of others.

I can heartily recommend this book to any fan of science, but I would specifically recommend it to any undergraduate science or engineering major who is unsure of what they’d like to study and might need some interesting areas to take a look at. I will mention that one of the tough parts of the concept of complexity is that it is so broad and general that it encompasses over a dozen other fields of study, each of which one could earn a Ph.D. in without knowing the full depth of even that one field, much less of all of them. The book is so well written that I’d even recommend it to senior researchers in any of the above-mentioned fields, as it is sure to provide not only some excellent overview history of each, but also to bring up questions and thoughts that they’ll want to include in future research in their own sub-areas of expertise.

How to Sidestep Mathematical Equations in Popular Science Books

In the publishing industry there is a general rule-of-thumb that every mathematical equation included in a popular science book will cut its audience in half – presumably in a geometric progression. This typically means that including even a handful of equations will give you an effective readership of zero – something no author, and certainly no editor or publisher, wants.
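Taken literally – and purely as a tongue-in-cheek sketch, since the exact halving rate and the symbols below are my own assumptions rather than actual publishing data – the rule would make readership decay geometrically with the number of equations:

```latex
% A tongue-in-cheek sketch of the folk rule, not real data.
% R_0: readership of an equation-free edition; n: number of equations.
R(n) = R_0 \cdot 2^{-n}
% A "handful" of, say, five equations then leaves
% R(5) = R_0 / 32,
% which for most titles rounds down to an effective readership of zero.
```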

I suspect there is a corollary to this: every picture included in the text will help to increase your readership, though possibly not by as proportionally large an amount.

In any case, while reading Melanie Mitchell’s text Complexity: A Guided Tour [Oxford University Press, 2009] this weekend, I noticed that, in what appears to be a concerted effort to include an equation without technically writing it into the text – and to simultaneously increase readership by including a picture – she cleverly used a photograph of Boltzmann’s tombstone in Vienna! Most fans of thermodynamics will immediately recognize Boltzmann’s equation for entropy, $S = k \log W$, which appears engraved on the tombstone above his bust.

I hope that future mathematicians, scientists, and engineers will keep this in mind and have their tombstones engraved with key formulae to assist future authors in doing the same – hopefully this will help increase the amount of mathematics deemed “acceptable” by the general public.