📺 The Bit Player (Trailer) | IEEE Information Theory Society

Watched The Bit Player (Trailer) from IEEE Information Theory Society


In a blockbuster paper in 1948, Claude Shannon introduced the notion of a "bit" and laid the foundation for the information age. His ideas ripple through nearly every aspect of modern life, influencing such diverse fields as communication, computing, cryptography, neuroscience, artificial intelligence, cosmology, linguistics, and genetics. But when interviewed in the 1980s, Shannon was more interested in showing off the gadgets he’d constructed — juggling robots, a Rubik’s Cube solving machine, a wearable computer to win at roulette, a unicycle without pedals, a flame-throwing trumpet — than rehashing the past. Mixing contemporary interviews, archival film, animation and dialogue drawn from interviews conducted with Shannon himself, The Bit Player tells the story of an overlooked genius who revolutionized the world, but never lost his childlike curiosity.

👓 Bob Gallager on Shannon’s tips for research | An Ergodic Walk

Annotated Bob Gallager on Shannon’s tips for research (An Ergodic Walk)

Gallager gave a nice concise summary of what he learned from Shannon about how to do good theory work:

  1. Simplify the problem
  2. Relate it to other problems
  3. Restate the problem in as many ways as possible
  4. Break the problem into pieces
  5. Avoid getting locked into thinking ruts
  6. Generalize

As he said, “it’s a process of doing research… each one [step] gives you a little insight.” It’s tempting, as a theorist, to claim that at the end of this process you’ve solved the “fundamental” problem, but Gallager admonished us to remember that the first step is to simplify, often dramatically. As Alfred North Whitehead said, we should “seek simplicity and distrust it.”

I know I’ve read this before, but it deserves a re-read/review every now and then.

👓 Celebrating the Work and Life of Claude Elwood Shannon | IEEE Foundation

Read Celebrating the Work and Life of Claude Elwood Shannon (ieeefoundation.org)

Claude Shannon

In 2014, IEEE Information Theory Society President Michelle Effros knew that something had to be done. The man who coined the very phrase "information theory" had largely been forgotten. Given his importance, and the growing impact that his work was having on society at large, she led the IEEE Information Theory Society on a quest to use the Centennial of Claude Shannon's birth to right this injustice.

A series of activities were planned, including a dual IEEE Milestone dedicated at both Nokia Bell Labs and MIT. Such was his stature that both institutions were intent on honoring the work he accomplished on their respective sites. His work, after all, foresaw and paved the way for the Information Revolution that we are experiencing, making possible everything from cell phones to GPS to Bitcoin.

By the time of the Nokia Bell Labs event, the keystone project – a documentary on Shannon's life – was in the formative stages. IEEE Information Theory Society leadership had secured the services of Mark Levinson, of Particle Fever acclaim. The script was being written and preliminary plans were underway.

To make the film a reality, a coalition of individuals, foundations, and corporations came together with the common objective of bringing the story of Shannon to as wide an audience as possible. An effective partnership was forged with the IEEE Foundation, which was undertaking its own unique project: its first-ever major fundraising campaign. The combination proved to be a winning one, and the Shannon Centennial quickly became exemplary of the impact that can occur when the power of volunteers is bolstered by effective staff support.

19 June marked the world premiere of the finished product: The Bit Player was screened to a full house on the big screen at the IEEE Information Theory Society's meeting in Vail, CO, US. The film was met with enthusiastic acclaim. Following the screening, attendees were treated to a Q&A with the film's director and star.

Among the techniques used to tell Shannon’s story was the testimony of current luminaries in the fields he inspired. All spoke of his importance and the need for his impact to be recognized. As one contributor, Andrea Goldsmith, Stephen Harris Professor in the School of Engineering, Stanford University, put it, “Today everyone carries Shannon around in their pocket”.

Based on this article, the Claude Shannon movie The Bit Player has already had its premiere. I updated the IMDb entry, but I still have to wonder whether it will ever get distribution so that the rest of us might see it.

Reply to The Man Who Tried to Redeem the World with Logic | Nautilus

Replied to The Man Who Tried to Redeem the World with Logic by Amanda Gefter (Nautilus)
McCulloch and Pitts were destined to live, work, and die together. Along the way, they would create the first mechanistic theory of the mind, the first computational approach to neuroscience, the logical design of modern computers, and the pillars of artificial intelligence.

Quick note of a factual and temporal error: the article indicates:

After all, it had been Wiener who discovered a precise mathematical definition of information: The higher the probability, the higher the entropy and the lower the information content.

In fact, it was Claude E. Shannon, one of Wiener's colleagues, who wrote the influential A Mathematical Theory of Communication, published in the Bell System Technical Journal in 1948, almost five years after the 1943 events the article describes. Not only did Wiener not write the paper, it didn't yet exist to be a factor in Pitts's choice of a school or adviser at the time. While Wiener may have been a tremendous polymath, I suspect that his mathematical area of expertise during those years was closer to analysis than to probability theory.
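
As a reference point (my own aside and illustration here, not something drawn from either article): Shannon's measure assigns an outcome with probability p an information content of -log₂ p bits, so the more probable an event, the less information it carries, and entropy is the average of that quantity over a source's whole distribution. A quick sketch:

```python
import math

def self_information(p: float) -> float:
    """Information content (in bits) of an outcome with probability p."""
    return -math.log2(p)

def entropy(probabilities: list[float]) -> float:
    """Shannon entropy: the average self-information over a distribution."""
    return sum(p * self_information(p) for p in probabilities if p > 0)

# A near-certain outcome carries almost no information; a rare one carries a lot.
print(self_information(0.99))  # ~0.0145 bits
print(self_information(0.01))  # ~6.64 bits

# A fair coin maximizes entropy for two outcomes: exactly 1 bit per flip.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.47 bits
```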

To put Pitts & McCulloch's work into additional context, Claude Shannon's stunning MIT master's thesis, A Symbolic Analysis of Relay and Switching Circuits (1937), applied Boolean algebra to electrical circuits for the first time and as a result largely allowed the digital age to blossom. It would be nice to know if Pitts & McCulloch were aware of it when they published their own work in 1943.

👓 The Man Who Tried to Redeem the World with Logic | Issue 21: Information – Nautilus

Read The Man Who Tried to Redeem the World with Logic (Nautilus)
Walter Pitts was used to being bullied. He’d been born into a tough family in Prohibition-era Detroit, where his father, a boiler-maker,…

Highlights, Quotes, Annotations, & Marginalia

McCulloch was a confident, gray-eyed, wild-bearded, chain-smoking philosopher-poet who lived on whiskey and ice cream and never went to bed before 4 a.m.  

Now that is a business card title!

March 03, 2019 at 06:01PM

McCulloch and Pitts were destined to live, work, and die together. Along the way, they would create the first mechanistic theory of the mind, the first computational approach to neuroscience, the logical design of modern computers, and the pillars of artificial intelligence.  

tl;dr

March 03, 2019 at 06:06PM

Gottfried Leibniz. The 17th-century philosopher had attempted to create an alphabet of human thought, each letter of which represented a concept and could be combined and manipulated according to a set of logical rules to compute all knowledge—a vision that promised to transform the imperfect outside world into the rational sanctuary of a library.  

I don’t think I’ve ever heard this quirky story…

March 03, 2019 at 06:08PM

Which got McCulloch thinking about neurons. He knew that each of the brain’s nerve cells only fires after a minimum threshold has been reached: Enough of its neighboring nerve cells must send signals across the neuron’s synapses before it will fire off its own electrical spike. It occurred to McCulloch that this set-up was binary—either the neuron fires or it doesn’t. A neuron’s signal, he realized, is a proposition, and neurons seemed to work like logic gates, taking in multiple inputs and producing a single output. By varying a neuron’s firing threshold, it could be made to perform “and,” “or,” and “not” functions.  

I’m curious what year this was, particularly in relation to Claude Shannon’s master’s thesis in which he applied Boolean algebra to electronics.
Based on their meeting date, it would have to be after 1940. And they published in 1943: https://link.springer.com/article/10.1007%2FBF02478259

March 03, 2019 at 06:14PM
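
As an aside, the threshold logic described in that highlight is simple enough to sketch in a few lines of code. This is only a toy illustration of the thresholded-logic idea, not a faithful reproduction of McCulloch and Pitts's 1943 formalism:

```python
def mcp_neuron(inputs: list[int], weights: list[int], threshold: int) -> int:
    """A McCulloch-Pitts-style unit: fire (1) when the weighted sum of
    binary inputs reaches the threshold, otherwise stay silent (0)."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# AND: both inputs must fire before the unit does.
AND = lambda a, b: mcp_neuron([a, b], [1, 1], threshold=2)
# OR: a single firing input is enough.
OR = lambda a, b: mcp_neuron([a, b], [1, 1], threshold=1)
# NOT: an inhibitory (negative-weight) input silences the unit.
NOT = lambda a: mcp_neuron([a], [-1], threshold=0)

assert AND(1, 1) == 1 and AND(1, 0) == 0
assert OR(0, 1) == 1 and OR(0, 0) == 0
assert NOT(0) == 1 and NOT(1) == 0
```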

McCulloch and Pitts alone would pour the whiskey, hunker down, and attempt to build a computational brain from the neuron up.  

A nice way to pass the time to be sure. Naturally mathematicians would have been turning “coffee into theorems” instead of whiskey.

March 03, 2019 at 06:15PM

“an idea wrenched out of time.” In other words, a memory.  

March 03, 2019 at 06:17PM

McCulloch and Pitts wrote up their findings in a now-seminal paper, “A Logical Calculus of Ideas Immanent in Nervous Activity,” published in the Bulletin of Mathematical Biophysics.  

March 03, 2019 at 06:21PM

I really like this picture here. Perhaps for a business card?
Colorful painting of a man sitting with an abstract structure around him
March 03, 2019 at 06:23PM

it had been Wiener who discovered a precise mathematical definition of information: The higher the probability, the higher the entropy and the lower the information content.  

Oops, I think this article is confusing Wiener with Claude Shannon?

March 03, 2019 at 06:34PM

By the fall of 1943, Pitts had moved into a Cambridge apartment, was enrolled as a special student at MIT, and was studying under one of the most influential scientists in the world.  

March 03, 2019 at 06:32PM

Thus formed the beginnings of the group who would become known as the cyberneticians, with Wiener, Pitts, McCulloch, Lettvin, and von Neumann its core.  

Wiener always did like cyberneticians for its parallelism with mathematicians…

March 03, 2019 at 06:38PM

In the entire report, he cited only a single paper: “A Logical Calculus” by McCulloch and Pitts.  

First Draft of a Report on the EDVAC by John von Neumann

March 03, 2019 at 06:43PM

Oliver Selfridge, an MIT student who would become “the father of machine perception”; Hyman Minsky, the future economist; and Lettvin.  

March 03, 2019 at 06:44PM

at the Second Cybernetic Conference, Pitts announced that he was writing his doctoral dissertation on probabilistic three-dimensional neural networks.  

March 03, 2019 at 06:44PM

In June 1954, Fortune magazine ran an article featuring the 20 most talented scientists under 40; Pitts was featured, next to Claude Shannon and James Watson.  

March 03, 2019 at 06:46PM

Lettvin, along with the young neuroscientist Patrick Wall, joined McCulloch and Pitts at their new headquarters in Building 20 on Vassar Street. They posted a sign on the door: Experimental Epistemology.  

March 03, 2019 at 06:47PM

“The eye speaks to the brain in a language already highly organized and interpreted,” they reported in the now-seminal paper “What the Frog’s Eye Tells the Frog’s Brain,” published in 1959.  

March 03, 2019 at 06:50PM

There was a catch, though: This symbolic abstraction made the world transparent but the brain opaque. Once everything had been reduced to information governed by logic, the actual mechanics ceased to matter—the tradeoff for universal computation was ontology. Von Neumann was the first to see the problem. He expressed his concern to Wiener in a letter that anticipated the coming split between artificial intelligence on one side and neuroscience on the other. “After the great positive contribution of Turing-cum-Pitts-and-McCulloch is assimilated,” he wrote, “the situation is rather worse than better than before. Indeed these authors have demonstrated in absolute and hopeless generality that anything and everything … can be done by an appropriate mechanism, and specifically by a neural mechanism—and that even one, definite mechanism can be ‘universal.’ Inverting the argument: Nothing that we may know or learn about the functioning of the organism can give, without ‘microscopic,’ cytological work any clues regarding the further details of the neural mechanism.”  

March 03, 2019 at 06:54PM

Nature had chosen the messiness of life over the austerity of logic, a choice Pitts likely could not comprehend. He had no way of knowing that while his ideas about the biological brain were not panning out, they were setting in motion the age of digital computing, the neural network approach to machine learning, and the so-called connectionist philosophy of mind.  

March 03, 2019 at 06:55PM

by stringing them together exactly as Pitts and McCulloch had discovered, you could carry out any computation.  

I feel like this is more akin to something already known from Boolean algebra and Whitehead/Russell by this time. Certainly Shannon would have known of it?

March 03, 2019 at 06:58PM

🎧 Episode 025 System Theories, Racism & Human Relationships: Interview with TK Coleman | Human Current

Listened to Episode 025 System Theories, Racism & Human Relationships by Haley Campbell-Gross from Human Current

In this episode, Haley interviews TK Coleman to discuss how humans allow their conflicting mental models to influence the way they handle controversial topics like racism. TK also shares how understanding context and patterns within human systems ultimately empowers us to actively contribute to human progress.

I generally prefer the harder sciences among Human Current’s episodes, but even episodes on the applications in other areas are really solid. I’m glad to hear about TK Coleman’s overarching philosophy and the idea of “human beings” versus “human doings.”

Also glad to have the recommendation of General Systems Theory: Beginning With Wholes by Barbara G. Hanson as a more accessible text than Ludwig von Bertalanffy's. The gang at Human Current should set up an Amazon Affiliate link so that when I buy books they recommend (which happens frequently), it helps to support and underwrite their work.

Highlights, Quotes, Annotations, & Marginalia

Reality is objective, but meaning is contextual.

—Barbara Hanson, General Systems Theory: Beginning with Wholes quoted within the episode

This quote is an interesting recap of a sentence in the first two paragraphs of Claude Shannon’s The Mathematical Theory of Communication.

An Information Theory Playlist on Spotify

In honor of tomorrow’s release of Jimmy Soni and Rob Goodman’s book A Mind at Play: How Claude Shannon Invented the Information Age, I’ve created an Information Theory playlist on Spotify.

Songs about communication, telephones, conversation, satellites, love, auto-tune and even one about a typewriter! They all relate at least tangentially to the topic at hand. To up the ante, everyone should realize that digital music would be impossible without Shannon’s seminal work.

Let me know in the comments or by replying to one of the syndicated copies listed below if there are any great tunes that the list is missing.

Enjoy the list and the book!

👓 Meet the Authors of a Mind at Play: How Claude Shannon Invented the Information Age | IEEE Spectrum

Read Meet the Authors of a Mind at Play: How Claude Shannon Invented the Information Age (IEEE Spectrum)
Jimmy Soni and Rob Goodman wrote the first biography of the digital pioneer

📖 Read pages 16-55 of A Mind at Play by Jimmy Soni & Rob Goodman

📖 Read pages 16-55 of A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni & Rob Goodman

Knowing that I've read a lot about Shannon and even Vannevar Bush over the years, I'm pleasantly surprised to find interesting tidbits about them that I've not previously come across. I was a bit worried that this text wouldn't provide me with much of anything new on the subjects at hand.

I’m really appreciating some of the prose and writing structure, particularly given that it’s a collaborative work between two authors. At times there are some really nonstandard sentence structures, but they’re wonderful in their rule breaking.

They're doing an excellent job so far of explaining the more difficult pieces of science relating to information theory. In fact, parts of the introduction are among the best simple explanations of the topic I think I've ever seen. I'm also pleased that they've made some interesting forays into topics like eugenics and the background role it played in Shannon's story.

They had a chance to take a broader view of the history of computing but opted against it, or at least made a conscious choice to leave Babbage and Lovelace out of the greater pantheon. I can see narratively why they may have done this, knowing what is to come later in the text, but a few sentences as a nod would have been welcome.

The book does, however, hit one of my personal pet peeves in popular science and biographical works like this: while there are reasonable notes at the end, no proper footnotes appear at the bottoms of pages, nor even indicators within the text beyond quotation marks. I'm glad the notes exist in the back at all, but it drives me crazy that publishers hide them this way. The text could at least have had markers indicating where to find the notes. What are we? Animals?

Nota bene: I'm currently reading an advance reader copy of this; the book won't be out until mid-July 2017.

The quintessential poolside summer reading: A Mind at Play

Instagram filter used: Clarendon

Photo taken at: Gerrish Swim & Tennis Club

📗 Started reading A Mind at Play by Jimmy Soni & Rob Goodman

📖 Read pages i-16 of A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni & Rob Goodman

A great little introduction and start to what promises to be the science biography of the year. The book opens with a story I'd heard Sol Golomb tell several times. It's actually a bittersweet memory, as the last recounting I heard appeared in the New Yorker on the occasion of Shannon's 100th birthday celebration:

In 1985, at the International Symposium in Brighton, England, the Shannon Award went to the University of Southern California's Solomon Golomb. As the story goes, Golomb began his lecture by recounting a terrifying nightmare from the night before: he'd dreamed that he was about to deliver his presentation, and who should turn up in the front row but Claude Shannon. And then, there before Golomb in the flesh, and in the front row, was Shannon. His reappearance (including a bit of juggling at the banquet) was the talk of the symposium, but he never attended again.

I had emailed Sol about the story, and became concerned when I didn’t hear back. I discovered shortly after that he had passed away the following day.

Nota bene: I'm currently reading an advance reader copy of this; the book won't be out until mid-July 2017.

🔖 A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni, Rob Goodman

Bookmarked A Mind at Play: How Claude Shannon Invented the Information Age (Simon & Schuster)
The life and times of one of the foremost intellects of the twentieth century: Claude Shannon—the neglected architect of the Information Age, whose insights stand behind every computer built, email sent, video streamed, and webpage loaded. Claude Shannon was a groundbreaking polymath, a brilliant tinkerer, and a digital pioneer. He constructed a fleet of customized unicycles and a flamethrowing trumpet, outfoxed Vegas casinos, and built juggling robots. He also wrote the seminal text of the digital revolution, which has been called “the Magna Carta of the Information Age.” His discoveries would lead contemporaries to compare him to Albert Einstein and Isaac Newton. His work anticipated by decades the world we’d be living in today—and gave mathematicians and engineers the tools to bring that world to pass. In this elegantly written, exhaustively researched biography, Jimmy Soni and Rob Goodman reveal Claude Shannon’s full story for the first time. It’s the story of a small-town Michigan boy whose career stretched from the era of room-sized computers powered by gears and string to the age of Apple. It’s the story of the origins of our digital world in the tunnels of MIT and the “idea factory” of Bell Labs, in the “scientists’ war” with Nazi Germany, and in the work of Shannon’s collaborators and rivals, thinkers like Alan Turing, John von Neumann, Vannevar Bush, and Norbert Wiener. And it’s the story of Shannon’s life as an often reclusive, always playful genius. With access to Shannon’s family and friends, A Mind at Play brings this singular innovator and creative genius to life.

I can’t wait to read this new biography about Claude Shannon! The bio/summer read I’ve been waiting for.

With any luck an advance reader copy is speeding its way to me! (Sorry, you can't surprise me with a belated copy for my birthday.) A review is forthcoming.

You have to love the cover art by Lauren Peters-Collaer.

Warren Weaver Bot!

Liked Someone has built a Warren Weaver Bot! by Weaverbot (Twitter)
This is the signal for the second.

How can you not follow this twitter account?!

Now I'm waiting for a Shannon bot and a Wiener bot. Maybe a John McCarthy bot would be apropos too?!

Why Norbert Wiener?

Replied to a tweet by Anand Sarwate (Twitter)
@ChrisAldrich Why is Norbert Wiener the illustration for this?

Maybe I should have used Claude Shannon instead?

Photo of Claude Shannon sitting in front of blackboard
Claude Shannon, the Father of Information Theory

Happy 100th Birthday Claude Shannon

Today would have been Claude Shannon's 100th Birthday! Modern society owes most of its existence to his work.

Many regular readers here are sure to know who Claude Shannon is, but sadly most of the rest of the world is in the dark. To give you an idea of his importance to society, and even a bit to pop culture, today's Google Doodle celebrates Shannon's life and work.

Overview of Shannon’s Work

Most importantly, in his 1937 master's thesis at the Massachusetts Institute of Technology, Shannon applied George Boole's algebra (better known now as Boolean algebra) to electric circuits, thereby making the modern digital revolution possible. To give you an idea of how far we've come, the typical high school student can now read and understand all of its content. If you'd like to give it a try, you can download it from MIT's website.
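
The core of that insight is easy to illustrate: switches wired in series conduct only when every switch is closed (Boolean AND), while switches wired in parallel conduct when any one of them is closed (Boolean OR). Here's a toy sketch of the correspondence (my own illustration, not drawn from the thesis itself):

```python
def series(*switches: bool) -> bool:
    """Switches in series conduct only if every switch is closed: Boolean AND."""
    return all(switches)

def parallel(*switches: bool) -> bool:
    """Switches in parallel conduct if any switch is closed: Boolean OR."""
    return any(switches)

# A small network: A in series with B, and that pair in parallel with C.
# It conducts when A and B are both closed, or when C is closed: (A AND B) OR C.
def network(a: bool, b: bool, c: bool) -> bool:
    return parallel(series(a, b), c)

assert network(True, True, False)
assert not network(True, False, False)
assert network(False, False, True)
```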

His other huge accomplishment was a journal article he wrote in 1948, "A Mathematical Theory of Communication," published in the Bell System Technical Journal. When it was republished in book form a year later, one of the most notable changes was the new title: The Mathematical Theory of Communication. While copies of the original article are freely available on the internet, the more casual reader will appreciate the later edition from the University of Illinois Press, which also includes a fabulous, extensive introduction by Warren Weaver. The paper contains the theoretical underpinnings that allowed all of modern digital communication to flourish, and it ranks as one of the most influential and far-reaching documents in human history, rivaling even the Bible.

Further, my own excitement about Shannon stems in part from his Ph.D. thesis, "An Algebra for Theoretical Genetics" (1940), which has inspired most of the theoretical material I'm always contemplating.

Google Doodle Art animated by artist Nate Swinehart celebrates Claude Shannon's 100th Birthday

Additional Sources:

For those looking for more information, try some of the following (non-technical) sources:

Claude Elwood Shannon smoking