An Information Theory Playlist on Spotify

In honor of tomorrow’s release of Jimmy Soni and Rob Goodman’s book A Mind at Play: How Claude Shannon Invented the Information Age, I’ve created an Information Theory playlist on Spotify.

Songs about communication, telephones, conversation, satellites, love, auto-tune, and even one about a typewriter! They all relate at least tangentially to the topic at hand. To up the ante, keep in mind that digital music itself would be impossible without Shannon’s seminal work.

Let me know in the comments or by replying to one of the syndicated copies listed below if there are any great tunes that the list is missing.

Enjoy the list and the book!

Syndicated copies to:

👓 Meet the Authors of A Mind at Play: How Claude Shannon Invented the Information Age | IEEE Spectrum

Meet the Authors of A Mind at Play: How Claude Shannon Invented the Information Age by Stephen Cass (IEEE Spectrum)
Jimmy Soni and Rob Goodman wrote the first biography of the digital pioneer

📖 Read pages 16-55 of A Mind at Play by Jimmy Soni & Rob Goodman

📖 Read pages 16-55 of A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni & Rob Goodman

Having read a great deal about Shannon and even Vannevar Bush over the years, I’m pleasantly surprised to find some interesting tidbits about them that I hadn’t previously come across. I was a bit worried that this text wouldn’t provide me with much of anything new on the subjects at hand.

I’m really appreciating some of the prose and writing structure, particularly given that it’s a collaborative work between two authors. At times there are some really nonstandard sentence structures, but they’re wonderful in their rule breaking.

They’re doing an excellent job so far of explaining the more difficult pieces of science relating to information theory. In fact, some of the intro was as good as I think I’ve ever seen simple explanations of what is going on within the topic. I’m also pleased that they’ve made some interesting forays into topics like eugenics and the background role it played in the story for Shannon.

They had a chance to take a broader view of the history of computing but opted against it, or at least made a conscious choice to leave Babbage and Lovelace out of the greater pantheon. I can see narratively why they may have done this, knowing what is to come later in the text, but a few sentences as a nod would have been welcome.

The book does, however, get on my nerves with one of my personal pet peeves in popular science and biographical works like this: while there are reasonable notes at the end, absolutely no proper footnotes appear at the bottoms of pages, nor are there any indicators within the text beyond quotation marks around borrowed passages. I’m glad the notes even exist in the back, but it drives me crazy that publishers blatantly hide them this way. The text could at least have had markers indicating where to find the notes. What are we? Animals?

Nota bene: I’m currently reading an advance reader copy of this; the book won’t be out until mid-July 2017.


📗 Started reading A Mind at Play by Jimmy Soni & Rob Goodman

📖 Read pages i-16 of A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni & Rob Goodman

A great little introduction and start to what promises to be the science biography of the year. The book opens with a story I’d heard Sol Golomb tell several times. It was actually a bittersweet memory, as the last time I heard a recounting was on the occasion of Shannon’s 100th birthday celebration, in the New Yorker:

In 1985, at the International Symposium in Brighton, England, the Shannon Award went to the University of Southern California’s Solomon Golomb. As the story goes, Golomb began his lecture by recounting a terrifying nightmare from the night before: he’d dreamed that he was about to deliver his presentation, and who should turn up in the front row but Claude Shannon. And then, there before Golomb in the flesh, and in the front row, was Shannon. His reappearance (including a bit of juggling at the banquet) was the talk of the symposium, but he never attended again.

I had emailed Sol about the story and became concerned when I didn’t hear back. I discovered shortly afterward that he had passed away the following day.

Nota bene: I’m currently reading an advance reader copy of this; the book won’t be out until mid-July 2017.


🔖 A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni, Rob Goodman

A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni and Rob Goodman (Simon & Schuster)
The life and times of one of the foremost intellects of the twentieth century: Claude Shannon—the neglected architect of the Information Age, whose insights stand behind every computer built, email sent, video streamed, and webpage loaded. Claude Shannon was a groundbreaking polymath, a brilliant tinkerer, and a digital pioneer. He constructed a fleet of customized unicycles and a flamethrowing trumpet, outfoxed Vegas casinos, and built juggling robots. He also wrote the seminal text of the digital revolution, which has been called “the Magna Carta of the Information Age.” His discoveries would lead contemporaries to compare him to Albert Einstein and Isaac Newton. His work anticipated by decades the world we’d be living in today—and gave mathematicians and engineers the tools to bring that world to pass. In this elegantly written, exhaustively researched biography, Jimmy Soni and Rob Goodman reveal Claude Shannon’s full story for the first time. It’s the story of a small-town Michigan boy whose career stretched from the era of room-sized computers powered by gears and string to the age of Apple. It’s the story of the origins of our digital world in the tunnels of MIT and the “idea factory” of Bell Labs, in the “scientists’ war” with Nazi Germany, and in the work of Shannon’s collaborators and rivals, thinkers like Alan Turing, John von Neumann, Vannevar Bush, and Norbert Wiener. And it’s the story of Shannon’s life as an often reclusive, always playful genius. With access to Shannon’s family and friends, A Mind at Play brings this singular innovator and creative genius to life.

I can’t wait to read this new biography about Claude Shannon! The bio/summer read I’ve been waiting for.

With any luck an advance reader copy is speeding its way to me! (Sorry, you can’t surprise me with a belated copy for my birthday.) A review is forthcoming.

You have to love the cover art by Lauren Peters-Collaer.


📺 A Universal Theory of Life: Math, Art & Information by Sara Walker

A Universal Theory of Life: Math, Art & Information by Sara Walker from TEDxASU
Dr. Walker introduces the concept of information, then proposes that information may be a necessity for biological complexity in this thought-provoking talk on the origins of life. Sara is a theoretical physicist and astrobiologist, researching the origins and nature of life. She is particularly interested in addressing the question of whether or not “other laws of physics” might govern life, as first posed by Erwin Schrödinger in his famous book What Is Life?. She is currently an Assistant Professor in the School of Earth and Space Exploration and Beyond Center for Fundamental Concepts in Science at Arizona State University. She is also a Fellow of the ASU-Santa Fe Institute Center for Biosocial Complex Systems, Founder of the astrobiology-themed social website SAGANet.org, and a member of the Board of Directors of Blue Marble Space. She is active in public engagement in science, with recent appearances on “Through the Wormhole” and NPR’s Science Friday.

Admittedly, she only had a few short minutes, but it would have been nice if she’d started with a precise definition of information. I suspect the majority of her audience didn’t know the definition she’s working with, and it would have helped focus the talk.
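For readers in the same boat as that audience: the definition in question is presumably Shannon’s, in which the information content of a source is its entropy, H = −Σ p·log₂(p) over the symbol probabilities. A minimal sketch in Python (my own illustration, not from the talk):

```python
from collections import Counter
from math import log2

def shannon_entropy(message):
    """Shannon entropy H = -sum(p * log2(p)) over symbol frequencies, in bits per symbol."""
    counts = Counter(message)
    n = len(message)
    # log2(n/c) == -log2(c/n), which keeps each term non-negative
    return sum((c / n) * log2(n / c) for c in counts.values())

print(shannon_entropy("aaaa"))  # 0.0 -- a constant source carries no information
print(shannon_entropy("abab"))  # 1.0 -- a fair binary source: one bit per symbol
print(shannon_entropy("abcd"))  # 2.0 -- four equiprobable symbols: two bits each
```

The point the definition makes precise is that information measures surprise: a perfectly predictable source tells you nothing, while a uniform one is maximally informative per symbol.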

Her description of Spiegelman’s Monster was relatively interesting and not often seen in the literature that covers these areas.

I wouldn’t rate this very highly as a TED talk, as it wasn’t as condensed and simple as most, nor was it as hyper-focused, but then again condensing this area into 11 minutes is far from a simple task. I do love that she’s excited enough about the topic that she almost sounds a little out of breath towards the end.

There’s an excellent Eddington quote I’ve mentioned before that would have been an apropos opening for her presentation and might have brought things into higher relief, given her talk’s title:

Suppose that we were asked to arrange the following in two categories–

distance, mass, electric force, entropy, beauty, melody.

I think there are the strongest grounds for placing entropy alongside beauty and melody and not with the first three.

Sir Arthur Stanley Eddington, OM, FRS (1882–1944), British astronomer, physicist, and mathematician, in The Nature of the Physical World (1927)



🔖 Can entropy be defined for and the Second Law applied to the entire universe? by Arieh Ben-Naim | Arxiv

Can entropy be defined for and the Second Law applied to the entire universe? by Arieh Ben-Naim (arXiv)
This article provides answers to the two questions posed in the title. It is argued that, contrary to many statements made in the literature, neither entropy nor the Second Law may be applied to the entire universe. The origin of this misuse of entropy and the Second Law may be traced back to Clausius himself. More recent (erroneous) justifications are also discussed.

🔖 The hidden simplicity of biology by Paul C W Davies and Sara Imari Walker | Reports on Progress in Physics

The hidden simplicity of biology by Paul C W Davies and Sara Imari Walker (Reports on Progress in Physics)
Life is so remarkable, and so unlike any other physical system, that it is tempting to attribute special factors to it. Physics is founded on the assumption that universal laws and principles underlie all natural phenomena, but it is far from clear that there are 'laws of life' with serious descriptive or predictive power analogous to the laws of physics. Nor is there (yet) a 'theoretical biology' in the same sense as theoretical physics. Part of the obstacle in developing a universal theory of biological organization concerns the daunting complexity of living organisms. However, many attempts have been made to glimpse simplicity lurking within this complexity, and to capture this simplicity mathematically. In this paper we review a promising new line of inquiry to bring coherence and order to the realm of biology by focusing on 'information' as a unifying concept.

Downloadable free copy available on ResearchGate.


🔖 Statistical Mechanics, Spring 2016 (Caltech, Physics 12c with videos) by John Preskill

Statistical Mechanics, Spring 2016 (Physics 12c) by John Preskill (Caltech)
An introductory course in statistical mechanics.

Recommended textbook: Thermal Physics by Charles Kittel and Herbert Kroemer.

There’s also a corresponding video lecture series available on YouTube.


👓 The Quantum Thermodynamics Revolution | Quanta Magazine

The Quantum Thermodynamics Revolution by Natalie Wolchover (Quanta Magazine)
As physicists extend the 19th-century laws of thermodynamics to the quantum realm, they’re rewriting the relationships among energy, entropy and information.

🔖 Proceedings of the Artificial Life Conference 2016

Proceedings of the Artificial Life Conference 2016 by Carlos Gershenson, Tom Froese, Jesus M. Siqueiros, Wendy Aguilar, Eduardo J. Izquierdo and Hiroki Sayama (The MIT Press)
The ALife conferences have been the major meeting of the artificial life research community since 1987. For its 15th edition in 2016, the conference was held in Latin America for the first time, in the Mayan Riviera, Mexico, from July 4-8. The special theme of the conference: How can the synthetic study of living systems contribute to societies: scientifically, technically, and culturally? The goal of the conference theme is to better understand societies with the purpose of using this understanding for more efficient management and development of social systems.

Free download available.



🔖 From Matter to Life: Information and Causality by Sara Imari Walker, Paul C. W. Davies, George F. R. Ellis

From Matter to Life: Information and Causality by Sara Imari Walker, Paul C. W. Davies, and George F. R. Ellis (Cambridge University Press)
Recent advances suggest that the concept of information might hold the key to unravelling the mystery of life's nature and origin. Fresh insights from a broad and authoritative range of articulate and respected experts focus on the transition from matter to life, and hence reconcile the deep conceptual schism between the way we describe physical and biological systems. A unique cross-disciplinary perspective, drawing on expertise from philosophy, biology, chemistry, physics, and cognitive and social sciences, provides a new way to look at the deepest questions of our existence. This book addresses the role of information in life, and how it can make a difference to what we know about the world. Students, researchers, and all those interested in what life is and how it began will gain insights into the nature of life and its origins that touch on nearly every domain of science. Hardcover: 514 pages; ISBN-10: 1107150531; ISBN-13: 978-1107150539.

🔖 An Introduction to Transfer Entropy: Information Flow in Complex Systems

An Introduction to Transfer Entropy: Information Flow in Complex Systems by Terry Bossomaier, Lionel Barnett, Michael Harré, Joseph T. Lizier (Springer; 1st ed. 2016 edition)
This book considers a relatively new metric in complex systems, transfer entropy, derived from a series of measurements, usually a time series. After a qualitative introduction and a chapter that explains the key ideas from statistics required to understand the text, the authors then present information theory and transfer entropy in depth. A key feature of the approach is the authors' work to show the relationship between information flow and complexity. The later chapters demonstrate information transfer in canonical systems, and applications, for example in neuroscience and in finance. The book will be of value to advanced undergraduate and graduate students and researchers in the areas of computer science, neuroscience, physics, and engineering. ISBN: 978-3-319-43221-2 (Print), 978-3-319-43222-9 (Online)

Want to read; h/t to Joseph Lizier.
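For the curious, transfer entropy measures how much knowing a source series’ past reduces uncertainty about a target series’ next value, beyond what the target’s own past already tells you, which makes it directional in a way mutual information is not. A rough plug-in estimator for discrete series with history length 1 (my own sketch, not code from the book):

```python
from collections import Counter
from math import log2

def transfer_entropy(source, target):
    """Plug-in estimate of transfer entropy T(source -> target), in bits,
    with history length 1, for discrete (e.g. binary) time series."""
    triples = Counter()   # (x_next, x, y) = (target[t+1], target[t], source[t])
    pairs_xy = Counter()  # (x, y)
    pairs_xx = Counter()  # (x_next, x)
    singles = Counter()   # x
    n = len(target) - 1
    for t in range(n):
        x_next, x, y = target[t + 1], target[t], source[t]
        triples[(x_next, x, y)] += 1
        pairs_xy[(x, y)] += 1
        pairs_xx[(x_next, x)] += 1
        singles[x] += 1
    # T = sum over triples of p(x_next, x, y) * log2[ p(x_next | x, y) / p(x_next | x) ]
    return sum(
        (c / n) * log2((c / pairs_xy[(x, y)]) / (pairs_xx[(x_next, x)] / singles[x]))
        for (x_next, x, y), c in triples.items()
    )

# Toy demo: target copies source with a one-step lag, so information flows source -> target.
src = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0]
tgt = [0] + src[:-1]
print(transfer_entropy(src, tgt) > transfer_entropy(tgt, src))  # True: the flow is asymmetric
```

Real applications, like the neuroscience and finance examples the book covers, need longer histories and careful bias correction, which is precisely what the text goes into.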
