🔖 Upcoming Special Issue “Information Theory in Neuroscience” | Entropy (MDPI)

Special Issue "Information Theory in Neuroscience" (Entropy | MDPI)
As the ultimate information processing device, the brain naturally lends itself to study with information theory. Applying information theory to neuroscience has spurred the development of principled theories of brain function, led to advances in the study of consciousness, and driven the development of analytical techniques to crack the neural code, that is, to unveil the language neurons use to encode and process information. In particular, advances in experimental techniques enabling precise recording and manipulation of neural activity on a large scale now make it possible, for the first time, to formulate precisely and test quantitatively hypotheses about how the brain encodes and transmits across areas the information used for specific functions. This Special Issue emphasizes contributions on novel approaches in neuroscience using information theory, and on the development of new information-theoretic results inspired by problems in neuroscience. Research at the interface of neuroscience, information theory, and other disciplines is also welcome.

A special issue of Entropy (ISSN 1099-4300), in the section "Information Theory". Deadline for manuscript submissions: 1 December 2017.

👓 How the NSA identified Satoshi Nakamoto | Alexander Muse

How the NSA identified Satoshi Nakamoto by Alexander Muse (Alexander Muse | Medium)
The ‘creator’ of Bitcoin, Satoshi Nakamoto, is the world’s most elusive billionaire. Very few people outside of the Department of Homeland Security know Satoshi’s real name. In fact, DHS will not publicly confirm that even THEY know the billionaire’s identity. Satoshi has taken great care to keep his identity secret, employing the latest encryption and obfuscation methods in his communications. Despite these efforts (according to my source at the DHS) Satoshi Nakamoto gave investigators the only tool they needed to find him — his own words. Using stylometry one is able to compare texts to determine authorship of a particular work. Throughout the years Satoshi wrote thousands of posts and emails, most of which are publicly available. According to my source, the NSA was able to use the ‘writer invariant’ method of stylometry to compare Satoshi’s ‘known’ writings with trillions of writing samples from people across the globe. By taking Satoshi’s texts and finding the 50 most common words, the NSA was able to break down his text into 5,000-word chunks and analyse each to find the frequency of those 50 words. This would result in a unique 50-number identifier for each chunk. The NSA then placed each of these identifiers into a 50-dimensional space and flattened them onto a plane using principal components analysis. The result is a ‘fingerprint’ for anything written by Satoshi that could easily be compared to any other writing.

The article itself is dubious, unsourced, and borders on conspiracy theory, but the underlying concept of stylometry and its implications for privacy will be interesting to many. Naturally, none of it is particularly new.
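For the curious, here's a minimal sketch of the writer-invariant method as the quoted passage describes it: frequency profiles of the author's 50 most common words over 5,000-word chunks, flattened onto a plane with principal components analysis. The function names and tokenization here are my own illustration rather than anything from the article, and a real analysis would also need the comparison corpus:

```python
# A rough sketch of the "writer invariant" stylometry described above.
# Assumes NumPy and scikit-learn are installed; tokenization is naive.
import re
from collections import Counter

import numpy as np
from sklearn.decomposition import PCA


def tokenize(text):
    """Lowercase the text and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())


def writer_fingerprint(text, n_words=50, chunk_size=5000):
    """Return one 2-D point per 5,000-word chunk of the input text."""
    tokens = tokenize(text)

    # The author's 50 most common words become the feature set.
    vocab = [w for w, _ in Counter(tokens).most_common(n_words)]
    index = {w: i for i, w in enumerate(vocab)}

    # One 50-number frequency vector per chunk.
    chunks = [tokens[i:i + chunk_size]
              for i in range(0, len(tokens), chunk_size)]
    profiles = np.zeros((len(chunks), n_words))
    for row, chunk in enumerate(chunks):
        for word, count in Counter(chunk).items():
            if word in index:
                profiles[row, index[word]] = count / len(chunk)

    # Flatten the 50-dimensional profiles onto a plane.
    return PCA(n_components=2).fit_transform(profiles)
```

Comparing candidate authors then reduces to comparing these point clouds: the closer an unknown text's points sit to a known author's, the stronger the stylometric match.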


🔖 Spontaneous fine-tuning to environment in many-species chemical reaction networks | PNAS

Spontaneous fine-tuning to environment in many-species chemical reaction networks by Jordan M. Horowitz and Jeremy L. England (Proceedings of the National Academy of Sciences)
Significance
A qualitatively more diverse range of possible behaviors emerge in many-particle systems once external drives are allowed to push the system far from equilibrium; nonetheless, general thermodynamic principles governing nonequilibrium pattern formation and self-assembly have remained elusive, despite intense interest from researchers across disciplines. Here, we use the example of a randomly wired driven chemical reaction network to identify a key thermodynamic feature of a complex, driven system that characterizes the “specialness” of its dynamical attractor behavior. We show that the network’s fixed points are biased toward the extremization of external forcing, causing them to become kinetically stabilized in rare corners of chemical space that are either atypically weakly or strongly coupled to external environmental drives.

Abstract
A chemical mixture that continually absorbs work from its environment may exhibit steady-state chemical concentrations that deviate from their equilibrium values. Such behavior is particularly interesting in a scenario where the environmental work sources are relatively difficult to access, so that only the proper orchestration of many distinct catalytic actors can power the dissipative flux required to maintain a stable, far-from-equilibrium steady state. In this article, we study the dynamics of an in silico chemical network with random connectivity in an environment that makes strong thermodynamic forcing available only to rare combinations of chemical concentrations. We find that the long-time dynamics of such systems are biased toward states that exhibit a fine-tuned extremization of environmental forcing.

Suggested by First Support for a Physics Theory of Life in Quanta Magazine.


👓 First Support for a Physics Theory of Life | Quanta Magazine

First Support for a Physics Theory of Life by Natalie Wolchover (Quanta Magazine)
Take chemistry, add energy, get life. The first tests of Jeremy England’s provocative origin-of-life hypothesis are in, and they appear to show how order can arise from nothing.

Interesting article with some great references I’ll need to delve into and read.


The situation changed in the late 1990s, when the physicists Gavin Crooks and Chris Jarzynski derived “fluctuation theorems” that can be used to quantify how much more often certain physical processes happen than reverse processes. These theorems allow researchers to study how systems evolve — even far from equilibrium.

I want to take a look at these papers, as well as the several that the article is directly about.
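As a note to self (my own gloss, not from the article), the two results usually meant here are Jarzynski's equality (1997) and Crooks' fluctuation theorem (1999):

\[
\bigl\langle e^{-\beta W} \bigr\rangle = e^{-\beta \Delta F},
\qquad
\frac{P_F(+W)}{P_R(-W)} = e^{\beta (W - \Delta F)},
\]

where \(W\) is the work done on the system, \(\Delta F\) the equilibrium free-energy difference, \(\beta = 1/k_B T\), and \(P_F\), \(P_R\) the work distributions under the forward and time-reversed driving protocols. The second identity is exactly the quantification the quote mentions: the ratio of how often a process happens forward versus in reverse.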


Any claims that it has to do with biology or the origins of life, he added, are “pure and shameless speculations.”

Some truly harsh words from his former supervisor? Wow!


maybe there’s more that you can get for free

Most of what’s in this article (and likely in the underlying papers) sounds to me as if it’s been heavily influenced by the writings of W. Loewenstein and S. Kauffman. They’ve laid out some models and ideas that need more rigorous testing and work, and this seems like a reasonable start to that process. The “get for free” phrase itself is very S. Kauffman to my mind; I’m curious how many times it appears in his work.


📅 Entropy 2018: From Physics to Information Sciences and Geometry

Might be attending Entropy 2018: From Physics to Information Sciences and Geometry
14-16 May 2018; Auditorium Enric Casassas, Faculty of Chemistry, University of Barcelona, Barcelona, Spain

One of the most frequently used scientific words is “Entropy”. The reason is that it is related to two main scientific domains: physics and information theory. Its origin goes back to the start of physics (thermodynamics), but since Shannon, it has become related to information theory. This conference is an opportunity to bring researchers of these two communities together and create a synergy. The main topics and sessions of the conference cover:

  • Physics: classical thermodynamics and quantum
  • Statistical physics and Bayesian computation
  • Geometrical science of information, topology and metrics
  • Maximum entropy principle and inference
  • Kullback and Bayes or information theory and Bayesian inference
  • Entropy in action (applications)

Interdisciplinary contributions from both theoretical and applied perspectives are very welcome, including papers addressing conceptual and methodological developments, as well as new applications of entropy and information theory.

All accepted papers will be published in the proceedings of the conference. Authors of a selection of invited and contributed talks presented during the conference will be invited to submit an extended version of their paper to a special issue of the open-access journal Entropy.
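A quick refresher of my own (not part of the announcement) on why the one word serves both communities: Gibbs' statistical-mechanical entropy and Shannon's information entropy are the same functional, differing only by Boltzmann's constant and the base of the logarithm:

\[
S = -k_B \sum_i p_i \ln p_i,
\qquad
H = -\sum_i p_i \log_2 p_i,
\qquad
S = (k_B \ln 2)\, H .
\]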



An Information Theory Playlist on Spotify

In honor of tomorrow’s release of Jimmy Soni and Rob Goodman’s book A Mind at Play: How Claude Shannon Invented the Information Age, I’ve created an Information Theory playlist on Spotify.

Songs about communication, telephones, conversation, satellites, love, auto-tune and even one about a typewriter! They all relate at least tangentially to the topic at hand. To up the ante, everyone should realize that digital music would be impossible without Shannon’s seminal work.

Let me know in the comments or by replying to one of the syndicated copies listed below if there are any great tunes that the list is missing.

Enjoy the list and the book!


👓 Meet the Authors of a Mind at Play: How Claude Shannon Invented the Information Age | IEEE Spectrum

Meet the Authors of a Mind at Play: How Claude Shannon Invented the Information Age by Stephen Cass (IEEE Spectrum)
Jimmy Soni and Rob Goodman wrote the first biography of the digital pioneer

📖 Read pages 16-55 of A Mind at Play by Jimmy Soni & Rob Goodman

📖 Read pages 16-55 of A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni & Rob Goodman

Having read a lot about Shannon and even Vannevar Bush over the years, I’m pleasantly surprised to find interesting tidbits about them that I’ve not previously come across. I was a bit worried that this text wouldn’t provide me with much of anything new on the subjects at hand.

I’m really appreciating some of the prose and writing structure, particularly given that it’s a collaborative work between two authors. At times there are some really nonstandard sentence structures, but they’re wonderful in their rule breaking.

They’re doing an excellent job so far of explaining the more difficult pieces of science relating to information theory. In fact, parts of the introduction are as good a set of simple explanations of the topic as I’ve ever seen. I’m also pleased that they’ve made some interesting forays into topics like eugenics and the background role it played in Shannon’s story.

They had a chance to take a broader view of the history of computing but opted against it, or at least made a conscious choice to leave Babbage and Lovelace out of the greater pantheon. I can see narratively why they may have done this, knowing what is to come later in the text, but a few sentences as a nod would have been welcome.

The book does, however, get on my nerves with one of my personal pet peeves in popular science and biographical works like this: while there are reasonable notes at the end, no proper footnotes appear at the bottoms of pages, nor are there any indicators within the text beyond quotation marks. I’m glad the notes even exist in the back, but it drives me crazy that publishers blatantly hide them this way. The text could at least have had markers indicating where to find the notes. What are we? Animals?

Nota bene: I’m currently reading an advance reader copy; the book won’t be out until mid-July 2017.


📗 Started reading A Mind at Play by Jimmy Soni & Rob Goodman

📖 Read pages i-16 of A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni & Rob Goodman

A great little introduction and start to what promises to be the science biography of the year. The book opens with a story I’d heard Sol Golomb tell several times. It’s actually a bittersweet memory, as the last recounting I heard appeared on the occasion of Shannon’s 100th birthday celebration in The New Yorker:

In 1985, at the International Symposium in Brighton, England, the Shannon Award went to the University of Southern California’s Solomon Golomb. As the story goes, Golomb began his lecture by recounting a terrifying nightmare from the night before: he’d dreamed that he was about to deliver his presentation, and who should turn up in the front row but Claude Shannon. And then, there before Golomb in the flesh, and in the front row, was Shannon. His reappearance (including a bit of juggling at the banquet) was the talk of the symposium, but he never attended again.

I had emailed Sol about the story and became concerned when I didn’t hear back. I discovered shortly afterward that he had passed away the following day.

Nota bene: I’m currently reading an advance reader copy; the book won’t be out until mid-July 2017.


🔖 A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni, Rob Goodman

A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni and Rob Goodman (Simon & Schuster)
The life and times of one of the foremost intellects of the twentieth century: Claude Shannon—the neglected architect of the Information Age, whose insights stand behind every computer built, email sent, video streamed, and webpage loaded.

Claude Shannon was a groundbreaking polymath, a brilliant tinkerer, and a digital pioneer. He constructed a fleet of customized unicycles and a flamethrowing trumpet, outfoxed Vegas casinos, and built juggling robots. He also wrote the seminal text of the digital revolution, which has been called “the Magna Carta of the Information Age.” His discoveries would lead contemporaries to compare him to Albert Einstein and Isaac Newton. His work anticipated by decades the world we’d be living in today—and gave mathematicians and engineers the tools to bring that world to pass.

In this elegantly written, exhaustively researched biography, Jimmy Soni and Rob Goodman reveal Claude Shannon’s full story for the first time. It’s the story of a small-town Michigan boy whose career stretched from the era of room-sized computers powered by gears and string to the age of Apple. It’s the story of the origins of our digital world in the tunnels of MIT and the “idea factory” of Bell Labs, in the “scientists’ war” with Nazi Germany, and in the work of Shannon’s collaborators and rivals, thinkers like Alan Turing, John von Neumann, Vannevar Bush, and Norbert Wiener. And it’s the story of Shannon’s life as an often reclusive, always playful genius. With access to Shannon’s family and friends, A Mind at Play brings this singular innovator and creative genius to life.

I can’t wait to read this new biography about Claude Shannon! The bio/summer read I’ve been waiting for.

With any luck an advance reader copy is speeding its way to me! (Sorry, you can’t surprise me with a belated copy for my birthday.) A review is forthcoming.

You have to love the cover art by Lauren Peters-Collaer.


📺 A Universal Theory of Life: Math, Art & Information by Sara Walker

A Universal Theory of Life: Math, Art & Information by Sara Walker from TEDxASU
Dr. Walker introduces the concept of information, then proposes that information may be a necessity for biological complexity in this thought-provoking talk on the origins of life. Sara is a theoretical physicist and astrobiologist, researching the origins and nature of life. She is particularly interested in addressing the question of whether or not “other laws of physics” might govern life, as first posed by Erwin Schrödinger in his famous book What is life?. She is currently an Assistant Professor in the School of Earth and Space Exploration and Beyond Center for Fundamental Concepts in Science at Arizona State University. She is also a Fellow of the ASU–Santa Fe Institute Center for Biosocial Complex Systems, Founder of the astrobiology-themed social website SAGANet.org, and a member of the Board of Directors of Blue Marble Space. She is active in public engagement in science, with recent appearances on “Through the Wormhole” and NPR’s Science Friday.

Admittedly, she only had a few short minutes, but it would have been nice if she’d started out with a precise definition of information. I suspect the majority of her audience didn’t know the definition with which she’s working, and it would have helped focus the talk.
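For readers in the same position, and assuming (my assumption; she doesn't say) that she means Shannon's measure, the standard definitions are

\[
I(x) = -\log_2 p(x),
\qquad
H(X) = -\sum_x p(x)\,\log_2 p(x),
\]

the information in an outcome \(x\) of probability \(p(x)\), measured in bits, and the entropy \(H(X)\), the average information per outcome.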

Her description of Spiegelman’s Monster was relatively interesting and isn’t often seen in the literature that covers these areas.

I wouldn’t rate this very highly as a TED talk, as it wasn’t as condensed and simple as most, nor was it as hyper-focused; but then again, condensing this area into 11 minutes is far from a simple task. I do love that she’s excited enough about the topic that she almost sounds a little out of breath towards the end.

There’s an excellent Eddington quote I’ve mentioned before that would have been an apropos opening for her presentation and might have brought things into higher relief, given her talk title:

Suppose that we were asked to arrange the following in two categories—distance, mass, electric force, entropy, beauty, melody.

I think there are the strongest grounds for placing entropy alongside beauty and melody and not with the first three.

Sir Arthur Stanley Eddington, OM, FRS (1882-1944), a British astronomer, physicist, and mathematician
in The Nature of the Physical World, 1927



🔖 Can entropy be defined for and the Second Law applied to the entire universe? by Arieh Ben-Naim | Arxiv

Can entropy be defined for and the Second Law applied to the entire universe? by Arieh Ben-Naim (arXiv)
This article provides answers to the two questions posed in the title. It is argued that, contrary to many statements made in the literature, neither entropy nor the Second Law may be applied to the entire universe. The origin of this misuse of entropy and the Second Law may be traced back to Clausius himself. More recent (erroneous) justifications are also discussed.

🔖 The hidden simplicity of biology by Paul C W Davies and Sara Imari Walker | Reports on Progress in Physics

The hidden simplicity of biology by Paul C W Davies and Sara Imari Walker (Reports on Progress in Physics)
Life is so remarkable, and so unlike any other physical system, that it is tempting to attribute special factors to it. Physics is founded on the assumption that universal laws and principles underlie all natural phenomena, but it is far from clear that there are 'laws of life' with serious descriptive or predictive power analogous to the laws of physics. Nor is there (yet) a 'theoretical biology' in the same sense as theoretical physics. Part of the obstacle in developing a universal theory of biological organization concerns the daunting complexity of living organisms. However, many attempts have been made to glimpse simplicity lurking within this complexity, and to capture this simplicity mathematically. In this paper we review a promising new line of inquiry to bring coherence and order to the realm of biology by focusing on 'information' as a unifying concept.

Downloadable free copy available on ResearchGate.
