In the early ’90s, Hank Rowan gave $100 million to a university in New Jersey, an act of extraordinary generosity that helped launch the greatest explosion in educational philanthropy since the days of Andrew Carnegie and the Rockefellers. But Rowan gave his money to Glassboro State University, a tiny, almost bankrupt school in South Jersey, while almost all of the philanthropists who followed his lead made their donations to elite schools such as Harvard and Yale. Why did no one follow Rowan’s example?
“My Little Hundred Million” is the third part of Revisionist History’s educational miniseries. It looks at the hidden ideologies behind giving and how a strange set of ideas has hijacked educational philanthropy.
The key idea laid out stunningly here is strong links versus weak links.
I’m flabbergasted by the idea proposed here and will have to do some more research in the near future to play around further with it. Fortunately, in addition to the education-specific application, Gladwell also offers a few examples from sports, using the differences between soccer and basketball to illustrate the distinction.
I would recommend this particular podcast and the concept of strong and weak links to César Hidalgo (t), if he and his lab aren’t already aware of it; he may well have troves of economics data to use for some general modeling to expand upon these ideas. I’ve been enamored of Hidalgo’s thesis about the value of links as expressed in Why Information Grows: The Evolution of Order, from Atoms to Economies.[1] I often think of it in relation to political economies and how the current administration seems to be (often quietly) destroying large amounts of value by breaking down a variety of economic, social, and political links within the United States as well as between our country and others.
I wonder if these additional ideas about the differences between strong and weak links might further improve those broader ones. The general ideas behind statistical mechanics and statistics make me think that Gladwell, like Hidalgo, is onto a strong idea which can continue to be refined to improve billions of lives. I’ll have to start some literature searches now…
References
1. Hidalgo C. Why Information Grows: The Evolution of Order, from Atoms to Economies. New York: Basic Books; 2015.
Carlos is a brilliant student from South Los Angeles. He attends an exclusive private school on an academic scholarship. He is the kind of person the American meritocracy is supposed to reward. But in the hidden details of his life lies a cautionary tale about how hard it is to rise from the bottom to the top—and why the American school system, despite its best efforts, continues to leave an extraordinary amount of talent on the table.
Eric Eisner and students from his YES Program featured above. Photo credit: David Lauridsen and Los Angeles Magazine
“Carlos Doesn’t Remember” is the first in a three-part Revisionist History miniseries taking a critical look at the idea of capitalization—the measure of how well America is making use of its human potential.
Certainly a stunning episode! Some of this is just painful to hear though.
We should easily be able to make things simpler, fairer, and more resilient for the many poor people we’re overlooking in society. As a nation competing against other countries, we’re heavily undervaluing a major portion of our populace, and we’re going to need them just to keep pace. America can’t be the “greatest” country without them.
This article provides answers to the two questions posed in the title. It is argued that, contrary to many statements made in the literature, neither entropy nor the Second Law may be used for the entire universe. The origin of this misuse of entropy and the Second Law may be traced back to Clausius himself. More recent (erroneous) justification is also discussed.
The Santa Fe Institute, in New Mexico, is a place for studying complex systems. I’ve never been there! Next week I’ll go there to give a colloquium on network theory, and also to participate in this workshop.
I just found out about this from John Carlos Baez and wish I could go! How had I not heard about it before?
November 16, 2016 – November 18, 2016
9:00 AM
Noyce Conference Room
Abstract.
This workshop will address a fundamental question in theoretical biology: Does the relationship between statistical physics and the need of biological systems to process information underpin some of their deepest features? It recognizes that a core feature of biological systems is that they acquire, store and process information (i.e., perform computation). However, to manipulate information in this way they require a steady flux of free energy from their environments. These two inter-related attributes of biological systems are often taken for granted; they are not part of standard analyses of either the homeostasis or the evolution of biological systems. In this workshop we aim to fill in this major gap in our understanding of biological systems by gaining deeper insight into the relation between the need for biological systems to process information and the free energy they need to pay for that processing.
The goal of this workshop is to address these issues by focusing on a set of three specific questions:
How has the fraction of free energy flux on earth that is used by biological computation changed with time?
What is the free energy cost of biological computation / function?
What is the free energy cost of the evolution of biological computation / function?
In all of these cases we are interested in the fundamental limits that the laws of physics impose on various aspects of living systems as expressed by these three questions.
Purpose: Research Collaboration
SFI Host: David Krakauer, Michael Lachmann, Manfred Laubichler, Peter Stadler, and David Wolpert
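The workshop’s second question, about the free energy cost of biological computation, has at least one well-known physical floor: Landauer’s bound, which says that erasing a single bit costs at least kT ln 2 of free energy. As a quick back-of-the-envelope sketch of that fundamental limit (my own illustration, not part of the SFI announcement), evaluated at roughly body temperature:

```python
import math

# Landauer's bound: the minimum free energy required to erase one bit of
# information at absolute temperature T is k_B * T * ln(2).
# This is my own illustrative calculation, not material from the workshop.
k_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def landauer_limit_joules(temperature_kelvin):
    """Minimum free energy (joules) to erase a single bit at the given temperature."""
    return k_B * temperature_kelvin * math.log(2)

# At roughly human body temperature (~310 K) this works out to about 3e-21 J per bit,
# a useful yardstick when asking how much computation a given free energy flux can buy.
print(landauer_limit_joules(310.0))
```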
I ran across a link to this textbook by way of a standing Google alert and was excited to check it out. I was immediately disappointed to think that I would have to wait another month and change for the physical textbook to be released, but I placed my pre-order right away. Then, with a bit of digging around, I realized that individual chapters are available immediately to quench my thirst until the physical text is printed next month.
Information is a precise concept that can be defined mathematically, but its relationship to what we call "knowledge" is not always made clear. Furthermore, the concepts "entropy" and "information", while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.
Comments: 19 pages, 2 figures. To appear in Philosophical Transactions of the Royal Society A
Subjects: Adaptation and Self-Organizing Systems (nlin.AO); Information Theory (cs.IT); Biological Physics (physics.bio-ph); Quantitative Methods (q-bio.QM)
Cite as: arXiv:1601.06176 [nlin.AO] (or arXiv:1601.06176v1 [nlin.AO] for this version)
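As a concrete toy illustration of the distinction the abstract draws (my own sketch, not anything from the paper): entropy measures uncertainty, while information is the reduction in uncertainty about one variable gained by observing another. With a small made-up joint distribution, the numbers look like this:

```python
import numpy as np

# A toy example (mine, not from the paper): entropy as uncertainty, information
# as the reduction in uncertainty about X that comes from knowing Y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])  # hypothetical joint distribution p(x, y)

def shannon_entropy(p):
    """Shannon entropy in bits of a probability array (zero entries are skipped)."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

p_x = p_xy.sum(axis=1)  # marginal distribution of X
p_y = p_xy.sum(axis=0)  # marginal distribution of Y

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y):
# how much better we can predict X once Y is known.
mutual_info = shannon_entropy(p_x) + shannon_entropy(p_y) - shannon_entropy(p_xy)

print(shannon_entropy(p_x))  # ~1.0 bit of uncertainty about X on its own
print(mutual_info)           # ~0.28 bits of information about X carried by Y
```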
Yesterday, via a notification from Lanyrd, I came across a notice for the upcoming conference “The Information Universe,” which hits several sweet spots in areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology. It is scheduled for October 7-9, 2015 at the Infoversum Theater in Groningen, The Netherlands.
I’ll let their site speak for itself below, but they already have an interesting lineup of speakers, including:
Keynote speakers
Erik Verlinde, Professor Theoretical Physics, University of Amsterdam, Netherlands
Alex Szalay, Alumni Centennial Professor of Astronomy, The Johns Hopkins University, USA
Gerard ‘t Hooft, Professor Theoretical Physics, University of Utrecht, Netherlands
Gregory Chaitin, Professor Mathematics and Computer Science, Federal University of Rio de Janeiro, Brasil
Charley Lineweaver, Professor Astronomy and Astrophysics, Australian National University, Australia
Lude Franke, Professor System Genetics, University Medical Center Groningen, Netherlands
Conference synopsis from their homepage:
Additional details about the conference including the participants, program, venue, and registration can also be found at their website.
Over the next few days, I’ll be maintaining a Storify story covering information related to and coming out of the Information Theory and Entropy Workshop being sponsored by NIMBioS at the University of Tennessee, Knoxville.
For those in attendance or participating by watching the live streaming video (or even watching the video after-the-fact), please feel free to use the official hashtag #entropyWS, and I’ll do my best to include your tweets, posts, and material into the story stream for future reference.
For journal articles and papers mentioned in/at the workshop, I encourage everyone to join the Mendeley.com group ITBio: Information Theory, Microbiology, Evolution, and Complexity and add them to the group’s list of papers. Think of it as a collaborative online journal club of sorts.
Those participating in the workshop are also encouraged to take a look at a growing collection of researchers and materials I maintain here. If you have materials or resources you’d like to contribute to the list, please send me an email, add them via the suggestions/submission form, or leave them in the comments section below.
Editor’s note: On 12/12/17 Storify announced they would be shutting down. As a result, I’m replacing the embedded version of the original data served by Storify with an HTML copy, which can be found below:
I’m giving a short 30-minute talk at a workshop on Biological and Bio-Inspired Information Theory at the Banff International Research Institute. I’ll say more about the workshop later, but here’s my talk: * Biodiversity, entropy and thermodynamics. Most of the people at this workshop study neurobiology and cell signalling, not evolutionary game theory or…
I’m having a great time at a workshop on Biological and Bio-Inspired Information Theory in Banff, Canada. You can see videos of the talks online. There have been lots of good talks so far, but this one really blew my mind: * Naftali Tishby, Sensing and acting under information constraints—a principled approach to biology and…
John Harte is an ecologist who uses maximum entropy methods to predict the distribution, abundance and energy usage of species. Marc Harper uses information theory in bioinformatics and evolutionary game theory. Harper, Harte and I are organizing a workshop on entropy and information in biological systems, and I’m really excited about it!
John Harte, Marc Harper and I are running a workshop! Now you can apply here to attend: * Information and entropy in biological systems, National Institute for Mathematical and Biological Synthesis, Knoxville, Tennessee, Wednesday-Friday, 8-10 April 2015. Click the link, read the stuff and scroll down to “CLICK HERE” to apply.
There will be a 5-day workshop on Biological and Bio-Inspired Information Theory at BIRS from Sunday the 26th to Friday the 31st of October, 2014. It’s being organized by * Toby Berger (University of Virginia) * Andrew Eckford (York University) * Peter Thomas (Case Western Reserve University) BIRS is the Banff International Research Station,…
How does it feel to (co-)write a book and hold the finished product in your hands? About like this: Many, many thanks to my excellent co-authors, Tadashi Nakano and Tokuko Haraguchi, for their hard work; thanks to Cambridge for accepting this project and managing it well; and thanks to Satoshi Hiyama for writing a nice blurb.
You may have seen our PLOS ONE paper about tabletop molecular communication, which received loads of media coverage. One of the goals of this paper was to show that anyone can do experiments in molecular communication, without any wet labs or expensive apparatus.
[My comments posted to the original Facebook post follow below.]
I’m coming to this post a bit late as I’m playing catch-up, but I agree with it wholeheartedly.
In particular, applications to molecular biology and medicine have really begun to come to a boil in just the past five years. This year looks like the start of the biggest renaissance in applying information theory to biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s “Symposium on Information Theory in Biology” at Gatlinburg, Tennessee in 1956.
Upcoming/recent conferences/workshops on information theory in biology include:
I’ll note in passing, for those interested, that Claude Shannon’s famous master’s thesis at MIT (in which he applied Boolean algebra to electric circuits, enabling the digital revolution) and his subsequent “A Mathematical Theory of Communication” were so revolutionary that nearly everyone forgets his MIT Ph.D. thesis, “An Algebra for Theoretical Genetics,” which presaged cybernetics and the current applications of information theory to microbiology, and which is probably as seminal as Sir R.A. Fisher’s applications of statistics to science in general and biology in particular.
For those commenting on the post who were interested in a layman’s introduction to information theory, I recommend John Robinson Pierce’s An Introduction to Information Theory: Symbols, Signals and Noise (Dover has a very inexpensive edition). After this, one should take a look at Claude Shannon’s original paper. (The MIT Press printing includes an excellent overview by Warren Weaver along with the paper itself.) The mathematics in the paper really isn’t too technical, and most of it should be comprehensible to advanced high school students.
For those who don’t understand the concept of entropy, I HIGHLY recommend Arieh Ben-Naim’s book Entropy Demystified: The Second Law Reduced to Plain Common Sense with Seven Simulated Games. He really does break the concept down into its most basic form in a way I haven’t seen others come remotely close to, and which even my mother can comprehend (with no mathematics at all). (I recommend this presentation even to those with Ph.D.s in physics because it is so truly fundamental.)
For more advanced mathematicians, physicists, and engineers, Arieh Ben-Naim does a truly spectacular job of extending E.T. Jaynes’ work on information theory and statistical mechanics, arriving at a more coherent mathematical theory that conjoins the entropy of physics/statistical mechanics with that of Shannon’s information theory, in A Farewell to Entropy: Statistical Thermodynamics Based on Information.
For the advanced readers/researchers interested in more at the intersection of information theory and biology, I’ll also mention that I maintain a list of references, books, and journal articles in a Mendeley group entitled “ITBio: Information Theory, Microbiology, Evolution, and Complexity.”
In the publishing industry there is a general rule of thumb that every mathematical equation included in a popular science book will cut its audience in half – presumably in a geometric progression. This typically means that including even a handful of equations will give you an effective readership of zero – something no author, and certainly no editor or publisher, wants.
I suspect there is a corollary: every picture included in the text will help to increase your readership, though probably not by as large a proportion.
In any case, while reading Melanie Mitchell’s text Complexity: A Guided Tour [Cambridge University Press, 2009] this weekend, I noticed that, in what appears to be a concerted effort to include an equation without technically writing it into the text and to simultaneously increase readership by including a picture, she cleverly used a picture of Boltzmann’s tombstone in Vienna! Most fans of thermodynamics will immediately recognize Boltzmann’s equation for entropy, S = k log W, which appears engraved on the tombstone over his bust.
I hope that future mathematicians, scientists, and engineers will keep this in mind and have their tombstones engraved with key formulae to assist future authors in doing the same – hopefully this will help to increase the amount of mathematics that is deemed “acceptable” by the general public.
Information Theory and Evolution
by John Avery
Non-fiction, Popular Science
World Scientific, January 1, 2003
Paperback, 217 pages
This highly interdisciplinary book discusses the phenomenon of life, including its origin and evolution (and also human cultural evolution), against the background of thermodynamics, statistical mechanics, and information theory. Among the central themes is the seeming contradiction between the second law of thermodynamics and the high degree of order and complexity produced by living systems. This paradox has its resolution in the information content of the Gibbs free energy that enters the biosphere from outside sources, as the author shows. The role of information in human cultural evolution is another focus of the book. One of the final chapters discusses the merging of information technology and biotechnology into a new discipline — bio-information technology.
This is a fantastic book which, for the majority of people, I’d give a five-star review. For my own purposes, however, I was expecting far more on the theoretical side of information theory and statistical mechanics as applied to microbiology, and the book didn’t live up to that, so I’m giving it three stars from a purely personal perspective.
I do wish that someone had placed it in my hands and forced me to read it when I was a freshman in college entering the study of biomedical and electrical engineering. It is a far more impressive book at that level, and for those in the general public who are interested in the general history of science and the philosophy of these topics. The general reader may be somewhat scared off by a small amount of mathematics in chapter 4, but there is really no loss of continuity in skimming through most of it. For those looking for a bit more rigor, Avery provides some additional details in appendix A, but for the specialist the presentation is heavily lacking.
The book opens with a facile but acceptable overview of the history of the development of the theory of evolution, whereas most other texts would simply begin with Darwin’s work and completely skip the important philosophical and scientific contributions of Aristotle, Averroes, Condorcet, Linnaeus, Erasmus Darwin, Lamarck, or the debates between Cuvier and St. Hilaire.
For me, the meat of the book was chapters 3-5 and appendix A which collectively covered molecular biology, evolution, statistical mechanics, and a bit of information theory, albeit from a very big picture point of view. Unfortunately the rigor of the presentation and the underlying mathematics were skimmed over all too quickly to accomplish what I had hoped to gain from the text. On the other hand, the individual sections of “suggestions for further reading” throughout the book seem well researched and offer an acceptable launching pad for delving into topics in places where they may be covered more thoroughly.
The final several chapters become more of an overview of the philosophy surrounding cultural evolution and information technology, topics which are much better covered and discussed in James Gleick’s recent book The Information.
Overall, Avery lays out a well-organized outline of a broad array of subjects and covers them all fairly well in an easy-to-read and engaging style.
Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available.
It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.
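To make Jaynes’ recipe concrete, here is a minimal numerical sketch (my own, with made-up energy levels and a made-up measured average energy, not anything from the paper): maximize the entropy subject to a known average energy and you get the Gibbs/Boltzmann form p_i = exp(-βE_i)/Z, with the partition function Z as the normalizer and β fixed by the constraint.

```python
import numpy as np
from scipy.optimize import brentq

# A minimal sketch of the maximum-entropy recipe Jaynes describes (my own toy
# numbers): given energy levels E_i and a measured average energy, the least
# biased distribution is the Gibbs form p_i = exp(-beta * E_i) / Z.
E = np.array([0.0, 1.0, 2.0, 3.0])  # hypothetical energy levels (arbitrary units)
E_measured = 1.2                    # hypothetical measured average energy

def mean_energy(beta):
    """Average energy under the Gibbs distribution with inverse temperature beta."""
    weights = np.exp(-beta * E)
    Z = weights.sum()               # the partition function
    return float((weights * E).sum() / Z)

# Choose beta so the maximum-entropy distribution reproduces the measurement.
beta = brentq(lambda b: mean_energy(b) - E_measured, -50.0, 50.0)

p = np.exp(-beta * E)
p /= p.sum()                        # the maximum-entropy (least biased) estimate
entropy_bits = float(-(p * np.log2(p)).sum())
print(beta, p, entropy_bits)
```

Everything after “determine the partition function” in the usual statistical-mechanics workflow falls out of exactly this kind of constrained maximization, which is Jaynes’ point.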
Finally, after 140 years, Robert Strain and Philip Gressman at the University of Pennsylvania have found a mathematical proof of the existence of solutions to Boltzmann’s equation, which predicts the motion of gas molecules.