❤️ darenw tweet A time lapse for every hit of Ichiro’s MLB career

Liked a tweet by Daren Willman (Twitter)

❤️ VioricaMarian1 tweet about afternoon classes

Liked a tweet by Viorica Marian (Twitter)

I wonder what a statistical analysis would reveal about improving people’s lives if registrars attempted to put the mass of classes in the middle of the day. Would educational outcomes improve along with people’s psyches? Most schedulers are trying to maximize based on the scarcity of classroom resources. What if they maximized on mental health and classroom performance instead? Is classroom scheduling potentially a valuable public health tool?


🎧 Episode 07 Hallelujah | Revisionist History

Listened to Episode 07 Hallelujah by Malcolm Gladwell from Revisionist History

In 1984, Elvis Costello released what he would say later was his worst record: Goodbye Cruel World. Among the most discordant songs on the album was the forgettable “The Deportees Club.” But then, years later, Costello went back and re-recorded it as “Deportee,” and today it stands as one of his most sublime achievements.

“Hallelujah” is about the role that time and iteration play in the production of genius, and how some of the most memorable works of art had modest and undistinguished births.



And here I thought I knew a lot about the story of “Hallelujah.” I haven’t read any of the books on its history (nor written any myself), but this short episode includes a good bit I hadn’t heard before. I did read quite a bit when Cohen passed away, and even spent some time making a Spotify playlist with over five hours of covers.

The bigger idea here, immediate genius versus “slow-cooked” genius, is the fun one to contemplate. I’ve previously heard stories that Mozart composed by working things out entirely in his head and only later putting them to paper, much the way a “cow pees”: all in one quick flood.

Another interesting thing here is the insanely small probability that the chain of events that made the song popular would actually happen. It seems worthwhile to look at the statistical mechanics of the production of genius. Perhaps applying Ridley’s concept of “ideas having sex” and Dawkins’s meme theory (from The Selfish Gene) could be useful. What does the state space of genius look like?


🎧 Episode 06 My Little Hundred Million | Revisionist History

Listened to Episode 06 My Little Hundred Million by Malcolm Gladwell from Revisionist History


In the early ’90s, Hank Rowan gave $100 million to a university in New Jersey, an act of extraordinary generosity that helped launch the greatest explosion in educational philanthropy since the days of Andrew Carnegie and the Rockefellers. But Rowan gave his money to Glassboro State University, a tiny, almost bankrupt school in South Jersey, while almost all of the philanthropists who followed his lead made their donations to elite schools such as Harvard and Yale. Why did no one follow Rowan’s example?

“My Little Hundred Million” is the third part of Revisionist History’s educational miniseries. It looks at the hidden ideologies behind giving and how a strange set of ideas has hijacked educational philanthropy.

The key idea, laid out stunningly here, is that of strong links versus weak links.

I’m flabbergasted by the idea proposed here and will have to do some more research in the near future to play with it further. Fortunately, in addition to the education-specific argument, Gladwell offers a few examples from sports, using the differences between soccer and basketball to illustrate the distinction.

If he and his lab aren’t already aware of the concept, I would recommend this particular podcast, and the idea of strong versus weak links, to César Hidalgo (t), who might actually have troves of economics data to use for some modeling to expand upon these ideas. I’ve been enamored of Hidalgo’s thesis about the overall value of links as expressed in Why Information Grows: The Evolution of Order, from Atoms to Economies1. I often think of it in relation to political economies and how the current administration seems to be (often quietly) destroying large amounts of value by breaking down a variety of economic, social, and political links within the United States, as well as between our country and others.

I wonder if the distinction between strong and weak links might further improve these broader ideas. The general machinery of statistics and statistical mechanics makes me think that Gladwell, like Hidalgo, is onto a strong idea that can continue to be refined to improve billions of lives. I’ll have to start some literature searches now…

References

1. Hidalgo C. Why Information Grows: The Evolution of Order, from Atoms to Economies. New York: Basic Books; 2015.

🎧 Episode 04 Carlos Doesn’t Remember | Revisionist History

Listened to Episode 04 Carlos Doesn't Remember by Malcolm Gladwell from Revisionist History


Carlos is a brilliant student from South Los Angeles. He attends an exclusive private school on an academic scholarship. He is the kind of person the American meritocracy is supposed to reward. But in the hidden details of his life lies a cautionary tale about how hard it is to rise from the bottom to the top—and why the American school system, despite its best efforts, continues to leave an extraordinary amount of talent on the table.

“Carlos Doesn’t Remember” is the first in a three-part Revisionist History miniseries taking a critical look at the idea of capitalization—the measure of how well America is making use of its human potential.

Eric Eisner and students from his YES Program featured above. Photo credit: David Lauridsen and Los Angeles Magazine

Certainly a stunning episode! Some of this is just painful to hear though.

We should easily be able to make things simpler, fairer, and more resilient for a lot of the poor we’re overlooking in society. As a larger group competing against other countries, we’re heavily undervaluing a major portion of our populace, and we’re going to need them just to keep pace. America can’t be the “greatest” country without them.


🔖 Can entropy be defined for and the Second Law applied to the entire universe? by Arieh Ben-Naim | Arxiv

Bookmarked Can entropy be defined for and the Second Law applied to the entire universe? (arXiv)
This article provides answers to the two questions posed in the title. It is argued that, contrary to many statements made in the literature, neither entropy, nor the Second Law may be used for the entire universe. The origin of this misuse of entropy and the second law may be traced back to Clausius himself. More recent (erroneous) justification is also discussed.

Statistical Physics, Information Processing, and Biology Workshop at Santa Fe Institute

Bookmarked Information Processing and Biology by John Carlos Baez (Azimuth)
The Santa Fe Institute, in New Mexico, is a place for studying complex systems. I’ve never been there! Next week I’ll go there to give a colloquium on network theory, and also to participate in this workshop.

I just found out about this from John Carlos Baez and wish I could go! How had I not managed to hear about it before?

Statistical Physics, Information Processing, and Biology

Workshop

November 16, 2016 – November 18, 2016
9:00 AM
Noyce Conference Room

Abstract.
This workshop will address a fundamental question in theoretical biology: Does the relationship between statistical physics and the need of biological systems to process information underpin some of their deepest features? It recognizes that a core feature of biological systems is that they acquire, store and process information (i.e., perform computation). However to manipulate information in this way they require a steady flux of free energy from their environments. These two, inter-related attributes of biological systems are often taken for granted; they are not part of standard analyses of either the homeostasis or the evolution of biological systems. In this workshop we aim to fill in this major gap in our understanding of biological systems, by gaining deeper insight in the relation between the need for biological systems to process information and the free energy they need to pay for that processing.

The goal of this workshop is to address these issues by focusing on a set of three specific questions:

  1. How has the fraction of free energy flux on earth that is used by biological computation changed with time?
  2. What is the free energy cost of biological computation/function?
  3. What is the free energy cost of the evolution of biological computation/function?

In all of these cases we are interested in the fundamental limits that the laws of physics impose on various aspects of living systems as expressed by these three questions.
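As a concrete anchor for those limits: the canonical physical floor on any computation, biological or otherwise, is Landauer's bound, which says that erasing one bit of information dissipates at least k_B·T·ln 2 of free energy. A quick back-of-the-envelope sketch in Python (the choice of room temperature is my own assumption, not the workshop's):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0           # assumed room temperature, K

# Landauer's bound: minimum free energy dissipated to erase one bit.
landauer_joules = k_B * T * math.log(2)

print(f"Erasing one bit at {T:.0f} K costs at least {landauer_joules:.3e} J")
# ≈ 2.871e-21 J per bit
```

Any real biological computation, of course, operates orders of magnitude above this floor; the workshop's questions are about how far above.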

Purpose: Research Collaboration
SFI Host: David Krakauer, Michael Lachmann, Manfred Laubichler, Peter Stadler, and David Wolpert


Network Science by Albert-László Barabási

Bookmarked Network Science by Albert-László Barabási (Cambridge University Press)

I ran across a link to this textbook by way of a standing Google alert, and was excited to check it out. I was immediately disappointed to think that I would have to wait another month and change for the physical textbook to be released, but made my pre-order directly. Then with a bit of digging around, I realized that individual chapters are available immediately to quench my thirst until the physical text is printed next month.

The power of network science, the beauty of network visualization.

Network Science, a textbook for network science, is freely available under the Creative Commons licence. Follow its development on Facebook, Twitter or by signing up to our mailing list, so that we can notify you of new chapters and developments.

The book is the result of a collaboration between a number of individuals, shaping everything, from content (Albert-László Barabási), to visualizations and interactive tools (Gabriele Musella, Mauro Martino, Nicole Samay, Kim Albrecht), simulations and data analysis (Márton Pósfai). The printed version of the book will be published by Cambridge University Press in 2016. In the coming months the website will be expanded with an interactive version of the text, datasets, and slides to teach the material.

Book Contents

Personal Introduction
1. Introduction
2. Graph Theory
3. Random Networks
4. The Scale-Free Property
5. The Barabási-Albert Model
6. Evolving Networks
7. Degree Correlations
8. Network Robustness
9. Communities
10. Spreading Phenomena
Usage & Acknowledgements
About

Albert-László Barabási
on Network Science (book website)

Networks are everywhere, from the Internet, to social networks, and the genetic networks that determine our biological existence. Illustrated throughout in full colour, this pioneering textbook, spanning a wide range of topics from physics to computer science, engineering, economics and the social sciences, introduces network science to an interdisciplinary audience. From the origins of the six degrees of separation to explaining why networks are robust to random failures, the author explores how viruses like Ebola and H1N1 spread, and why it is that our friends have more friends than we do. Using numerous real-world examples, this innovatively designed text includes clear delineation between undergraduate and graduate level material. The mathematical formulas and derivations are included within Advanced Topics sections, enabling use at a range of levels. Extensive online resources, including films and software for network analysis, make this a multifaceted companion for anyone with an interest in network science.

Source: Cambridge University Press

The textbook is available for purchase in September 2016 from Cambridge University Press. Pre-order now on Amazon.com.

If you’re not already doing so, you should follow Barabási on Twitter.
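One claim in the publisher's blurb above, that our friends have more friends than we do, is the so-called friendship paradox, and it falls out of the degree distributions the book covers. A toy illustration in Python (the five-person star network is my own contrived example, not from the text):

```python
# Friendship paradox on a toy "star" network: one hub connected to four others.
graph = {
    "hub": ["a", "b", "c", "d"],
    "a": ["hub"], "b": ["hub"], "c": ["hub"], "d": ["hub"],
}

# Degree of each person = number of friends they have.
deg = {person: len(friends) for person, friends in graph.items()}

# Average number of friends a person has.
mean_degree = sum(deg.values()) / len(deg)

# Average, over everyone, of the mean degree of their friends.
mean_friend_degree = sum(
    sum(deg[f] for f in friends) / len(friends)
    for person, friends in graph.items()
) / len(graph)

print(mean_degree)         # 1.6
print(mean_friend_degree)  # 3.4 -- your friends have more friends than you
```

The asymmetry arises because high-degree people (the hub) are counted as a friend by many others, so they dominate the "friends of friends" average.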


Weekly Recap: Interesting Articles 7/24-7/31 2016

Some of the interesting things I saw and read this week

Went on vacation or fell asleep at the internet wheel this week? Here’s some of the interesting stuff you missed.

Science & Math

Publishing

Indieweb, Internet, Identity, Blogging, Social Media

General


What is Information? by Christoph Adami

Bookmarked What is Information? [1601.06176] (arxiv.org)
Information is a precise concept that can be defined mathematically, but its relationship to what we call "knowledge" is not always made clear. Furthermore, the concepts "entropy" and "information", while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.

A proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.

Comments: 19 pages, 2 figures. To appear in Philosophical Transaction of the Royal Society A
Subjects: Adaptation and Self-Organizing Systems (nlin.AO); Information Theory (cs.IT); Biological Physics (physics.bio-ph); Quantitative Methods (q-bio.QM)
Cite as:arXiv:1601.06176 [nlin.AO] (or arXiv:1601.06176v1 [nlin.AO] for this version)

From: Christoph Adami
[v1] Fri, 22 Jan 2016 21:35:44 GMT (151kb,D) [.pdf]

Source: Christoph Adami [1601.06176] What is Information? on arXiv


The Information Universe Conference

"The Information Universe" Conference in The Netherlands in October hits several of the sweet spots for areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology.

Yesterday, via a notification from Lanyrd, I came across a notice for the upcoming conference “The Information Universe”, which hits several of the sweet spots for areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology. It is scheduled for October 7–9, 2015 at the Infoversum Theater in Groningen, The Netherlands.

I’ll let their site speak for itself below, but they already have an interesting line up of speakers including:

Keynote speakers

  • Erik Verlinde, Professor Theoretical Physics, University of Amsterdam, Netherlands
  • Alex Szalay, Alumni Centennial Professor of Astronomy, The Johns Hopkins University, USA
  • Gerard ‘t Hooft, Professor Theoretical Physics, University of Utrecht, Netherlands
  • Gregory Chaitin, Professor Mathematics and Computer Science, Federal University of Rio de Janeiro, Brasil
  • Charley Lineweaver, Professor Astronomy and Astrophysics, Australian National University, Australia
  • Lude Franke, Professor System Genetics, University Medical Center Groningen, Netherlands
Infoversum Theater, The Netherlands
Infoversum Theater, The Netherlands

Conference synopsis from their homepage:

The main ambition of this conference is to explore the question “What is the role of information in the physics of our Universe?”. This intellectual pursuit may have a key role in improving our understanding of the Universe at a time when we “build technology to acquire and manage Big Data”, “discover highly organized information systems in nature” and “attempt to solve outstanding issues on the role of information in physics”. The conference intends to address the “in vivo” (role of information in nature) and “in vitro” (theory and models) aspects of the Information Universe.

The discussions about the role of information will include the views and thoughts of several disciplines: astronomy, physics, computer science, mathematics, life sciences, quantum computing, and neuroscience. Different scientific communities hold various and sometimes distinct formulations of the role of information in the Universe indicating we still lack understanding of its intrinsic nature. During this conference we will try to identify the right questions, which may lead us towards an answer.

  • Is the universe one big information processing machine?
  • Is there a deeper layer in quantum mechanics?
  • Is the universe a hologram?
  • Is there a deeper physical description of the world based on information?
  • How close/far are we from solving the black hole information paradox?
  • What is the role of information in highly organized complex life systems?
  • The Big Data Universe and the Universe : are our numerical simulations and Big Data repositories (in vitro) different from real natural system (in vivo)?
  • Is this the road to understanding dark matter, dark energy?

The conference will be held in the new 260 seats planetarium theatre in Groningen, which provides an inspiring immersive 3D full dome display, e.g. numerical simulations of the formation of our Universe, and anything else our presenters wish to bring in. The digital planetarium setting will be used to visualize the theme with modern media.

The Information Universe Website

Additional details about the conference including the participants, program, venue, and registration can also be found at their website.


NIMBioS Workshop: Information Theory and Entropy in Biological Systems

Web resources for participants in the NIMBioS Workshop on Information Theory and Entropy in Biological Systems.

Over the next few days, I’ll be maintaining a Storify story covering information related to and coming out of the Information Theory and Entropy Workshop being sponsored by NIMBioS at the University of Tennessee, Knoxville.

For those in attendance or participating by watching the live streaming video (or even watching the video after-the-fact), please feel free to use the official hashtag #entropyWS, and I’ll do my best to include your tweets, posts, and material into the story stream for future reference.

For journal articles and papers mentioned in/at the workshop, I encourage everyone to join the Mendeley.com group ITBio: Information Theory, Microbiology, Evolution, and Complexity and add them to the group’s list of papers. Think of it as a collaborative online journal club of sorts.

Those participating in the workshop are also encouraged to take a look at a growing collection of researchers and materials I maintain here. If you have materials or resources you’d like to contribute to the list, please send me an email or include them via the suggestions/submission form or include them in the comments section below.

Resources for Information Theory and Biology

RSS Feed for BoffoSocko posts tagged with #ITBio


BIRS Workshop on Biological and Bio-Inspired Information Theory | Storify Stream

Over the span of the coming week, I'll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.

Over the span of the coming week, I’ll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.

Editor’s note: On 12/12/17 Storify announced they would be shutting down. As a result, I’m changing the embedded version of the original data served by Storify for an HTML copy which can be found below:

BIRS: Biological and Bio-Inspired Information Theory

A 5 Day workshop on Biology and Information Theory hosted by the Banff International Research Station

  1. Wishing I was at the Gene Regulation and Information Theory meeting starting tomorrow  http://bit.ly/XnHRZs  #ITBio
  2. Mathematical and Statistical Models for Genetic Coding starts today.  http://www.am.hs-mannheim.de/genetic_code_2013.php?id=1  @andreweckford might borrow attendees for BIRS
  3. Mathematical Foundations for Information Theory in Diffusion-Based Molecular Communications  http://bit.ly/1aTVR2c  #ITBio
  4. Bill Bialek giving plenary talk “Information flow & order in real biological networks” at Feb 2014 workshop  http://mnd.ly/19LQH8f  #ITBio
  5. CECAM Workshop: “Entropy in Biomolecular Systems” starts May 14 in Vienna. http://t.co/F4Kn0ICIaT #ITBio http://t.co/Ty8dEIXQUT

    CECAM Workshop: “Entropy in Biomolecular Systems” starts May 14 in Vienna.  http://jhu.md/1faLR8t  #ITBio pic.twitter.com/Ty8dEIXQUT
  6. Last RT: wonder what the weather is going to be like at the end of October for my @BIRS_Math workshop
  7. @JoVanEvery I’m organizing a workshop in Banff in October … hopefully this isn’t a sign of weather to come!
  8. Banff takes its name from the town of Banff, Scotland, not to be confused with Bamff, also Scotland.
  9. Good morning from beautiful Banff. How can you not love the mountains? http://t.co/mxYBNz7yzl

    Good morning from beautiful Banff. How can you not love the mountains? pic.twitter.com/mxYBNz7yzl
  10. “Not an obvious connection between utility and information, just as there is no obvious connection between energy and entropy” @BIRS_Math
  11. Last RT: a lot of discussion of my signal transduction work with Peter Thomas.
  12. Live now: Nicolo Michelusi of @USCViterbi on Stochastic Model for Electron Transfer in Bacterial Cables  http://www.birs.ca/live  #ITBio
  13. Nicolo Michelusi (University of Southern California), A Stochastic Model for Electron Transfer in Bacterial Cables  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271450-Michelusi.mp4 
  14. Listening to the always awesome @cnmirose talk about the ultimate limits of molecular communication.
  15. “Timing is fundamental … subsumes time-varying concentration channel” @cnmirose @BIRS_Math
  16. Standard opening quote of these talks: “I’m not a biologist, but …” @BIRS_Math
  17. Stefan Moser (ETH Zurich), Capacity Bounds of the Memoryless AIGN Channel – a Toy-Model for Molecular Communicat…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271610-Moser.mp4 
  18. Weisi Guo (University of Warwick), Communication Envelopes for Molecular Diffusion and Electromagnetic Wave Propag…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410271643-Guo.mp4 
  19. .@ChrisAldrich @andreweckford @Storify @BIRS_Math Sounds like a fascinating workshop on bioinformation theory in Banff.
  20. Toby Berger, winner of the 2002 Shannon award, speaking right now. @BIRS_Math
  21. Naftali Tishby (Hebrew University of Jerusalem), Sensing and acting under information constraints – a principled a…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281032-Tishby.mp4 
  22. “…places such as BIRS and the Banff Centre exist to facilitate the exchange and pursuit of knowledge.” S. Sundaram  http://www.birs.ca/testimonials/#testimonial-1454 
  23. We’re going for a hike tomorrow. Many thanks to Lukas at the @ParksCanada info centre in Banff for helpful advice! @BIRS_Math
  24. Alexander Dimitrov (Washington State University), Invariant signal processing in auditory biological systems  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281416-Dimitrov.mp4 
  25. Joel Zylberberg (University of Washington), Communicating with noisy signals: lessons learned from the mammalian v…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281450-Zylberberg.mp4 
  26. Robert Schober (Universitat Erlangen-Nurnberg), Intersymbol interference mitigation in diffusive molecular communi…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281549-Schober.mp4 
  27. Rudolf Rabenstein (Friedrich-Alexander-Universitat Erlangen-Nurnberg (FAU)), Modelling Molecular Communication Cha…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410281627-Rabenstein.mp4 
  28. THis week @BIRS_Math ” Biological and Bio-Inspired Information Theory ” @thebanffcentre #biology #math @NSF
  29. “Your theory might match the data, but the data might be wrong” – Crick @BIRS_Math
  30. So information theory seems to be a big deal in ecology. @BIRS_Math
  31. Tom Schneider (National Institutes of Health), Three Principles of Biological States: Ecology and Cancer  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410290904-Schneider.mp4 
  32. “In biodiversity, the entropy of an ecosystem is the expected … information we gain about an organism by learning its species” @BIRS_Math
  33. Seriously, I’m blown away by this work in information theory in ecology. Huge body of work; I had no idea. @BIRS_Math
  34. I encourage @BIRS_Math attendees at Biological & Bio-Inspired Information Theory to contribute references here:  http://bit.ly/1jQwObk 
  35. Christoph Adami (Michigan State University), Some Information-Theoretic Musings Concerning the Origin and Evolutio…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410291114-Adami.mp4 
  36. .@ChristophAdami talk Some Information-Theoretic Musings Concerning the Origin of Life @BIRS_Math this morning #ITBio http://t.co/VA8komuuSW

    .@ChristophAdami talk Some Information-Theoretic Musings Concerning the Origin of Life @BIRS_Math this morning #ITBio pic.twitter.com/VA8komuuSW
  37. ICYMI @ChristophAdami had great paper: Information-theoretic Considerations on Origin of Life on arXiv  http://bit.ly/1yIhK2Q  @BIRS_Math
  38. Baez has a post on Tishby's talk "Sensing &  Acting Under Information Constraints" http://t.co/t1nPVI1pxa @BIRS_Math http://t.co/dFuiVLFSGC

    Baez has a post on Tishby’s talk “Sensing & Acting Under Information Constraints”  http://bit.ly/1yIDonR  @BIRS_Math pic.twitter.com/dFuiVLFSGC
  39. INFORMATION THEORY is the new central ...

    INFORMATION THEORY is the new central …
  40. I’m listening to a talk on the origin of life at a workshop on Biological and Bio-Inspired Information Theory. …  https://plus.google.com/117562920675666983007/posts/gqFL7XY3quF 
  41. Now accepting applications for the #Research Collaboration Workshop for Women in #MathBio at NIMBioS  http://ow.ly/DzeZ7 
  42. We removed a faulty microphone from our lecture room this morning. We’re now fixing the audio buzz in this week’s videos, and reposting.
  43. Didn’t get enough information theory & biology this week @BIRS_Math? Apply for NIMBioS workshop in April 2015  http://bit.ly/1yIeiWe  #ITBio
  44. Amin Emad (University of Illinois at Urbana-Champaign), Applications of Discrete Mathematics in Bioinformatics  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301329-Emad.mp4 
  45. Paul Bogdan (University of Southern California), Multiscale Analysis Reveals Complex Behavior in Bacteria Populati…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301401-Bogdan.mp4 
  46. Lubomir Kostal (Institute of Physiology, Academy of Sciences of the Czech Republic), Efficient information transmi…  http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410301534-Kostal.mp4 
  47. Banff ☀️❄️🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲🌲❤️
  48. @conservativelez I’m a big fan of your dad’s research & was reminded of much of it via a workshop on Biological Information Theory
  49. @conservativelez Though he may not have been able to attend, he can catch most of the talks online if he’d like  https://www.birs.ca/events/2014/5-day-workshops/14w5170 
  50. Depressed that @BIRS_Math Workshop on Biological & Bio-Inspired Information Theory is over? Relive it here:  http://bit.ly/1rF3G4B  #ITBio
  51. A few thoughts about that workshop while I wait for my flight back to Toronto.
  52. 1/ Everyone I talked to said it was the best workshop they’d ever been to, and they’d like to do a follow-up workshop @BIRS_Math
  53. 2/ There is an amazing diversity of work under the umbrella of “information theory”. @BIRS_Math
  54. 3/ Much of this work is outside the IT mainstream, and an issue is that people use different terms for related concepts. @BIRS_Math
  55. 4/ Some community building is in order. I think this workshop was a good first step. @BIRS_Math
  56. 5/ Many many thanks to @BIRS_Math and huge kudos to @NGhoussoub for excellent service to the Canadian scientific community. BIRS is a gem.
  57. 6/ Also many thanks to the participants for their excellent talks, and to @ChrisAldrich for maintaining a Storify.
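The biodiversity observation quoted in item 32 above, that an ecosystem's entropy is the expected information gained about an organism by learning its species, is exactly Shannon entropy applied to species abundances. A minimal sketch in Python, with made-up species counts:

```python
import math

def shannon_diversity(counts):
    """Shannon entropy (in bits) of a species-abundance distribution:
    the expected information gained about a randomly chosen organism
    by learning its species."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

# A perfectly even 4-species ecosystem carries 2 bits per organism;
# a lopsided one carries less.
print(shannon_diversity([25, 25, 25, 25]))  # → 2.0
print(shannon_diversity([97, 1, 1, 1]))     # lower
```

Ecologists usually report this with the natural log as the Shannon diversity index; the base only changes the units.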


Information Theory is the New Central Discipline

Replied to Information Theory is the new central discipline. by Nassim Nicholas Taleb (facebook.com)

INFORMATION THEORY is the new central discipline. This graph was from 20y ago in the seminal book Cover and Thomas, as the field was starting to be defined. Now Information Theory has been expanded to swallow even more fields.

Born in, of all disciplines, Electrical Engineering, the field has progressively infiltrating probability theory, computer science, statistical physics, data science, gambling theory, ruin problems, complexity, even how one deals with knowledge, epistemology. It defines noise/signal, order/disorder, etc. It studies cellular automata. You can use it in theology (FREE WILL & algorithmic complexity). As I said, it is the MOTHER discipline.

I am certain much of Medicine will naturally grow to be a subset of it, both operationally, and in studying how the human body works: the latter is an information machine. Same with linguistics. Same with political “science”, same with… everything.

I am saying this because I figured out what the long 5th volume of the INCERTO will be. Cannot say now with any precision but it has to do with a variant of entropy as the core natural generator of Antifragility.

[Revised to explain that it is not *replacing* other disciplines, just infiltrating them as the point was initially misunderstood…]

Nassim Nicholas Taleb via Facebook

[My comments posted to the original Facebook post follow below.]

I’m coming to this post a bit late as I’m playing a bit of catch up, but agree with it wholeheartedly.

In particular, applications to molecular biology and medicine have really begun to come to a heavy boil in just the past five years. This year in particular appears to be the start of the biggest renaissance in applying information theory to biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s “Symposium on Information Theory in Biology” at Gatlinburg, Tennessee in 1956.

Upcoming/recent conferences/workshops on information theory in biology include:

At the beginning of September, Christoph Adami posted an awesome and very sound paper on arXiv entitled “Information-theoretic considerations concerning the origin of life”  which truly portends to turn the science of the origin of life on its head.

I’ll note in passing, for those interested, that Claude Shannon’s famous master’s thesis at MIT (in which he applied Boolean algebra to electrical circuits, enabling the digital revolution) and his subsequent “A Mathematical Theory of Communication” were so revolutionary that nearly everyone forgets his MIT Ph.D. thesis, “An Algebra for Theoretical Genetics,” which presaged cybernetics and the current applications of information theory to microbiology, and which is probably as seminal as Sir R.A. Fisher’s applications of statistics to science in general and biology in particular.

For those commenting on the post who were interested in a layman’s introduction to information theory, I recommend John Robinson Pierce’s An Introduction to Information Theory: Symbols, Signals and Noise (Dover has a very inexpensive edition.) After this, one should take a look at Claude Shannon’s original paper. (The MIT Press printing includes some excellent overview by Warren Weaver along with the paper itself.) The mathematics in the paper really aren’t too technical, and most of it should be comprehensible by most advanced high school students.

For those who don’t understand the concept of entropy, I HIGHLY recommend Arieh Ben-Naim’s book Entropy Demystified: The Second Law Reduced to Plain Common Sense with Seven Simulated Games. He really does tear the concept down into its most basic form in a way I haven’t seen others come remotely close to, and which even my mother can comprehend (with no mathematics at all). (I recommend this presentation even to those with Ph.D.’s in physics because it is so truly fundamental.)

For the more advanced mathematicians, physicists, and engineers, Arieh Ben-Naim does a truly spectacular job of extending E.T. Jaynes’s work on information theory and statistical mechanics, and comes up with a more coherent mathematical theory conjoining the entropy of physics/statistical mechanics with that of Shannon’s information theory in A Farewell to Entropy: Statistical Thermodynamics Based on Information.
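The conjunction rests, at bottom, on the fact that the Gibbs entropy of statistical mechanics and the Shannon entropy of information theory are the same functional of a probability distribution, differing only by a constant factor:

```latex
S = -k_B \sum_i p_i \ln p_i, \qquad
H = -\sum_i p_i \log_2 p_i, \qquad
\Rightarrow \quad S = (k_B \ln 2)\, H .
```

The Boltzmann constant $k_B$ and the change of logarithm base are all that separate joules per kelvin from bits.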

For the advanced readers/researchers interested in more at the intersection of information theory and biology, I’ll also mention that I maintain a list of references, books, and journal articles in a Mendeley group entitled “ITBio: Information Theory, Microbiology, Evolution, and Complexity.”

Venn Diagram of how information theory relates to other fields.
Figure 1.1 [page 2] from
Thomas M. Cover and Joy Thomas’s textbook Elements of Information Theory, Second Edition
(John Wiley & Sons, Inc., 2006) [First Edition, 1991]

How to Sidestep Mathematical Equations in Popular Science Books

In the publishing industry there is a general rule of thumb that every mathematical equation included in a popular science book will cut its audience in half – presumably in a geometric progression. This typically means that including even a handful of equations will give you an effective readership of zero – something no author, and certainly no editor or publisher, wants.

I suspect there is a corollary: every picture included in the text will help to increase readership, though possibly not by as proportionally large an amount.

In any case, while reading Melanie Mitchell’s text Complexity: A Guided Tour [Oxford University Press, 2009] this weekend, I noticed that, in what appears to be a concerted effort to include an equation without technically writing it into the text, and to simultaneously increase readership by including a picture, she cleverly used a photo of Boltzmann’s tombstone in Vienna! Most fans of thermodynamics will immediately recognize Boltzmann’s equation for entropy, S = k log W, which appears engraved on the tombstone above his bust.
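The engraved formula is the equal-probability special case of entropy: for $W$ equally likely microstates, each $p_i = 1/W$, and the general Gibbs sum collapses to the tombstone's expression (Boltzmann's $\log$ here being the natural logarithm):

```latex
S = -k_B \sum_{i=1}^{W} \frac{1}{W} \ln \frac{1}{W} = k_B \ln W .
```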

Page 51 of Melanie Mitchell's book "Complexity: A Guided Tour"
Page 51 of Melanie Mitchell’s book “Complexity: A Guided Tour” featuring Boltzmann’s tombstone in Vienna.

I hope that future mathematicians, scientists, and engineers will keep this in mind and have their tombstones engraved with key formulae to assist future authors in doing the same – hopefully this will help to increase the amount of mathematics that is deemed “acceptable” by the general public.