🔖 A First Step Toward Quantifying the Climate’s Information Production over the Last 68,000 Years

A First Step Toward Quantifying the Climate’s Information Production over the Last 68,000 Years by Joshua Garland, Tyler R. Jones, Elizabeth Bradley, Ryan G. James, James W. C. White (link.springer.com)
Paleoclimate records are extremely rich sources of information about the past history of the Earth system. We take an information-theoretic approach to analyzing data from the WAIS Divide ice core, the longest continuous and highest-resolution water isotope record yet recovered from Antarctica. We use weighted permutation entropy to calculate the Shannon entropy rate from these isotope measurements, which are proxies for a number of different climate variables, including the temperature at the time of deposition of the corresponding layer of the core. We find that the rate of information production in these measurements reveals issues with analysis instruments, even when those issues leave no visible traces in the raw data. These entropy calculations also allow us to identify a number of intervals in the data that may be of direct relevance to paleoclimate interpretation, and to form new conjectures about what is happening in those intervals—including periods of abrupt climate change.
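To make the excerpt's method concrete: permutation entropy maps each short window of a time series to its ordinal (rank) pattern and takes the Shannon entropy of the pattern distribution; the weighted variant counts each pattern in proportion to its window's variance. The sketch below is my own minimal illustration of that general technique (using the common variance-weighting scheme), not the authors' actual implementation or parameters:

```python
from collections import defaultdict
import math

def weighted_permutation_entropy(series, order=3):
    """Sketch of weighted permutation entropy: each length-`order` window
    is mapped to its ordinal (rank) pattern, and the pattern's count is
    weighted by the window's variance, so low-amplitude noise contributes
    less than large-amplitude structure."""
    weights = defaultdict(float)
    total = 0.0
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # ordinal pattern: the permutation that sorts the window
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        mean = sum(window) / order
        var = sum((x - mean) ** 2 for x in window) / order
        weights[pattern] += var
        total += var
    if total == 0.0:
        return 0.0  # constant signal: no information produced
    probs = [w / total for w in weights.values()]
    h = -sum(p * math.log2(p) for p in probs if p > 0)
    return h / math.log2(math.factorial(order))  # normalized to [0, 1]
```

A perfectly ordered series (e.g. a monotone ramp) scores 0, while white noise scores near 1; the paper's insight is that changes in this rate across the core can flag both instrument problems and climatically interesting intervals.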

Saw a reference in Predicting unpredictability: Information theory offers new way to read ice cores [1]

References

[1]
“Predicting unpredictability: Information theory offers new way to read ice cores,” Phys.org. [Online]. Available: http://phys.org/news/2016-12-unpredictability-theory-ice-cores.html. [Accessed: 12-Dec-2016]

Chris Aldrich is reading “Predicting unpredictability: Information theory offers new way to read ice cores”

Predicting unpredictability: Information theory offers new way to read ice cores (phys.org)
At two miles long and five inches in diameter, the West Antarctic Ice Sheet Divide (WAIS) ice core is a tangible record of the last 68,000 years of our planet's climate.

Tangled Up in Spacetime

Tangled Up in Spacetime by Clara Moskowitz (Scientific American)
Hundreds of researchers in a collaborative project called "It from Qubit" say space and time may spring up from the quantum entanglement of tiny bits of information.

Complexity isn’t a Vice: 10 Word Answers and Doubletalk in Election 2016

How Donald Trump is leveraging an old Vaudeville trick to heavily contest the presidential election

A Problem with Transcripts

In the past few weeks, I’ve seen dozens of news outlets publish multi-paragraph excerpts of speeches from Donald Trump and have been appalled that I was unable to read them in any coherent way. I could not honestly follow or discern any coherent thought or argument in the majority of them. I was a bit shocked because in listening to him, he often sounds like he has some kind of point, though he seems to be spouting variations on one of ten one-liners he’s been using for over a year now. There’s apparently a flaw in our primal reptilian brains that tricks us into thinking there’s some sort of substance in his speech when there honestly is none. I’m going to have to spend some time reading more on linguistics and cognitive neuroscience. Maybe Steven Pinker knows of an answer?

The situation got worse this week as I turned to news sources for fact-checking of the recent presidential debate. While it’s nice to have web-based annotation tools like Genius[1] and Hypothes.is[2] to mark up these debates, it becomes another thing altogether to understand the meaning of what’s being said in order to actually attempt to annotate it. I’ve included some links so that readers can attempt the exercise for themselves.

Recent transcripts (some with highlights/annotations):

Doubletalk and Doublespeech

It’s been a while since Americans were broadly exposed to actual doubletalk. For the most part our national experience with it has been a passing curiosity highlighted by comedians.

dou·ble-talk
ˈdəblˌtôk/
n. (NORTH AMERICAN)
a deliberately unintelligible form of speech in which inappropriate, invented or nonsense syllables are combined with actual words. This type of speech is commonly used to give the appearance of knowledge and thereby confuse, amuse, or entertain the speaker’s audience.
another term for doublespeak
see also n. doubletalk [3]

Since the days of vaudeville (and likely before), comedians have used doubletalk to great effect on stage, in film, and on television. Some comedians who have historically used the technique as part of their acts include Al Kelly, Cliff Nazarro, Danny Kaye, Gary Owens, Irwin Corey, Jackie Gleason, Sid Caesar, Stanley Unwin, and Reggie Watts. I’m including some short video clips below as examples.

A well-known, but foreshortened, form of it was used by Dana Carvey in his Saturday Night Live performances caricaturing George H.W. Bush by using a few standard catchphrases with pablum in between: “Not gonna do it…”, “Wouldn’t be prudent at this juncture”, and “Thousand Points of Light…”. These snippets in combination with some creative hand gestures (pointing, lacing fingers together), along with a voice melding of Mr. Rogers and John Wayne, were the simple constructs that largely transformed a diminutive comedian convincingly into a president.

Doubletalk also has a more “educated” sibling known as technobabble. Engineers are sure to recall a famous (and still very humorous) example of both doubletalk and technobabble in the famed description of the Turboencabulator.[4] (See also, the short videos below.)

Doubletalk comedy examples

Al Kelly on Ernie Kovacs

Sid Caesar

Technobabble examples

Turboencabulator

Rockwell Turbo Encabulator Version 2

Politicobabble

And of course doubletalk and technobabble have closely related cousins named doublespeak and politicobabble. These are far more dangerous than the others because they cross the line from comedy into seriousness and are used by people who make decisions affecting hundreds of thousands to millions, if not billions, of people on the planet. I’m sure an archeo-linguist might be able to discern where exactly politicobabble emerged and managed to evolve into a non-comedic form of speech which people manage to take far more seriously than its close ancestors. One surely suspects some heavy influence from George Orwell’s corpus of work:

The term “doublespeak” probably has its roots in George Orwell’s book Nineteen Eighty-Four.[5] Although the term is not used in the book, it is a close relative of one of the book’s central concepts, “doublethink”. Another variant, “doubletalk”, also referring to deliberately ambiguous speech, did exist at the time Orwell wrote his book, but the usage of “doublespeak” as well as of “doubletalk” in the sense emphasizing ambiguity clearly postdates the publication of Nineteen Eighty-Four. Parallels have also been drawn between doublespeak and Orwell’s classic essay Politics and the English Language [6] , which discusses the distortion of language for political purposes.

in Wikipedia [7]

 

While politicobabble is nothing new, I did find a very elucidating passage from the 1992 U.S. Presidential Election cycle which seems to be a major part of the Trump campaign playbook:

Repetition of a meaningless mantra is supposed to empty the mind, clearing the way for meditation on more profound matters. This campaign has achieved the first part. I’m not sure about the second.

Candidates are now told to pick a theme, and keep repeating it-until polls show it’s not working, at which point the theme vanishes and another takes its place.

The mantra-style repetition of the theme of the week, however, leaves the impression that Teen Talk Barbie has acquired some life-size Campaign Talk Ken dolls. Pull the string and you get: ‘Congress is tough,’ ‘worst economic performance since the Depression,’ or ‘a giant sucking sound south of the border.’

A number of words and phrases, once used to express meaningful concepts, are becoming as useful as ‘ommm’ in the political discourse. Still, these words and phrases have meanings, just not the ones the dictionary originally intended.

Joanne Jacobs
in “A Handy Guide To Politico-babble,” Chicago Tribune, 31 October 1992 [8]

 

In the continuation of the article, Jacobs goes on to give a variety of examples of the term as well as a “translation” guide for some of the common politicobabble words from that particular election. I’ll leave it to the capable hands of others (perhaps in the comments, below?) to come up with the translation guide for our current political climate.

The interesting evolutionary change I’ll note for the current election cycle is that Trump hasn’t delved deeply enough into any of his themes to offend anyone significantly. This has allowed him to stay with the dozen or so themes he started out using, so he hasn’t needed to swap them out as candidates in campaigns of old did.

Filling in the Blanks

These forms of pseudo-speech are all meant to fool us into thinking that something of substance is being discussed and that a conversation is happening, when in fact nothing is really being communicated at all. Most of the intended meaning and reaction to such speech seems to stem from the demeanor of the speaker and, in some part, from the reaction of the surrounding interlocutors and audience. In reading Donald Trump transcripts, an entirely different meaning (or lack thereof) is more quickly realized because the surrounding elements which prop up the narrative have been completely stripped away. In a transcript version, gone is the hypnotizing element of the crowd which is vehemently sure that the emperor is truly wearing clothes.

In many of these transcripts, in fact, I find so little is being said that the listener is actually being forced to piece together the larger story in their head. Being forced to fill in the blanks in this way leaves too much of the communication up to the listener who isn’t necessarily engaged at a high level. Without more detail or context to understand what is being communicated, the listener is far more likely to fill in the blanks to fit a story that doesn’t create any cognitive dissonance for themselves — in part because Trump is usually smiling and welcoming towards his adoring audiences.

One will surely recall that Trump even wanted Secretary Clinton to be happy during the debate when he said, “Now, in all fairness to Secretary Clinton — yes, is that OK? Good. I want you to be very happy. It’s very important to me.” (This question also doubles as an example of a standard psychological sales tactic of attempting to get the purchaser to start by saying ‘yes’ as a means to keep them saying yes while moving them towards making a purchase.)

His method of communicating by leaving large holes in his meaning reminds me of the way our brain smooths out information as indicated in this old internet meme [9]:

I cdn’uolt blveiee taht I cluod aulaclty uesdnatnrd waht I was rdanieg: the phaonmneel pweor of the hmuan mnid. Aoccdrnig to a rseearch taem at Cmabrigde Uinervtisy, it deosn’t mttaer in waht oredr the ltteers in a wrod are, the olny iprmoatnt tihng is taht the frist and lsat ltteer be in the rghit pclae. The rset can be a taotl mses and you can sitll raed it wouthit a porbelm. Tihs is bcuseae the huamn mnid deos not raed ervey lteter by istlef, but the wrod as a wlohe. Scuh a cdonition is arpppoiatrely cllaed typoglycemia.

 

I’m also reminded of the biases and heuristics research carried out in part (and the remainder cited) by Daniel Kahneman in his book Thinking, Fast and Slow [10] in which he discusses the mechanics of how system 1 and system 2 work in our brains. Is Trump taking advantage of the deficits of language processing in our brains, via something akin to system 1 biases, to win large blocks of votes? Is he creating a virtual real-time Choose-Your-Own-Adventure to subvert the laziness of the electorate? Kahneman would suggest that the combination of what Trump does say and what he doesn’t leaves it up to every individual listener to create their own story. Their system 1 is going to default to the easiest and most palatable one available to them: a happy story that fits their own worldview and is likely to encourage them to support Trump.

Ten Word Answers

As an information theorist, I know all too well that there must be a ‘linguistic Shannon limit’ to the amount of semantic meaning one can compress into a single word. [11] One is ultimately forced to attempt to form sentences to convey more meaning. But usually the less politicians say, the less trouble they can get into — a lesson hard won through generations of political fighting.
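That compression limit can be made concrete via Shannon’s source coding theorem: the empirical entropy of a source lower-bounds the average number of bits any lossless code needs per symbol. A toy sketch (the example strings are my own invention, not drawn from any transcript):

```python
import math
from collections import Counter

def entropy_bits_per_symbol(text):
    """Empirical Shannon entropy of the character distribution: a lower
    bound on the average bits per symbol of any lossless encoding."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A speech built from a handful of repeated stock phrases carries far
# less information per symbol than varied language of the same length.
slogan = "make it great make it great make it great"
varied = "the quick brown fox jumps over the lazy dog"
```

Repetition drives the bound down: `entropy_bits_per_symbol(slogan)` comes out well below `entropy_bits_per_symbol(varied)`, which is the information-theoretic version of the complaint above — the mantra genuinely contains less to decode.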

I’m reminded of a scene from The West Wing television series. Season 4, episode 6, “Game On,” which aired on October 30, 2002 on NBC, had a poignant moment (video clip below) which is germane to our subject: [12]

Moderator: Governor Ritchie, many economists have stated that the tax cut, which is the centrepiece of your economic agenda, could actually harm the economy. Is now really the time to cut taxes?
Governor Ritchie, R-FL: You bet it is. We need to cut taxes for one reason – the American people know how to spend their money better than the federal government does.
Moderator: Mr. President, your rebuttal.
President Bartlet: There it is…
That’s the 10 word answer my staff’s been looking for for 2 weeks. There it is.
10 word answers can kill you in political campaigns — they’re the tip of the sword.
Here’s my question: What are the next 10 words of your answer?
“Your taxes are too high?” So are mine…
Give me the next 10 words: How are we going to do it?
Give me 10 after that — I’ll drop out of the race right now.
Every once in a while — every once in a while, there’s a day with an absolute right and an absolute wrong, but those days almost always include body counts. Other than that there aren’t very many un-nuanced moments in leading a country that’s way too big for 10 words.
I’m the President of the United States, not the president of the people who agree with me. And by the way, if the left has a problem with that, they should vote for somebody else.

As someone who studies information theory and complexity theory, and who even delves into sub-topics like complexity and economics, I can agree wholeheartedly with the sentiment. Here again, though, I can also see the massive gaps between system 1 and system 2 that make us want to simplify things down to such a base level that we don’t have to do the work to puzzle them out.

(And yes, that is Jennifer Aniston’s father playing the moderator.)

One can’t help but wonder why Mr. Trump doesn’t seem to have ever gone past the first ten words. Is it because he isn’t capable? Isn’t interested? Or does he instinctively know better? It would seem that he’s been doing business by using the uncertainty inherent in his speech for decades, always operating on what he meant (or thought he wanted to mean) rather than on what the other party heard and thought they understood. If it ain’t broke, don’t fix it.

Idiocracy or Something Worse?

In our increasingly specialized world, people eventually have to give in and quit doing some tasks that everyone used to do for themselves. Yesterday I saw a lifeworn woman in her 70s pushing a wheeled wire basket with a 5 gallon container of water from the store to her home. As she shuffled along, I contemplated Thracian people from the fourth century BCE doing the same thing, except they likely carried amphorae, possibly with a yoke, and without the benefit of the $10 manufactured shopping cart. Twenty thousand years before that, people were still carrying their own water, but possibly without even the benefit of earthenware containers. Things in human history have changed very slowly for the most part, but as we continually sub-specialize further and further, we need to remember that we can’t give up one of the primary functions that makes us human: the ability to think deeply and analytically for ourselves.

I suspect that far too many people are too wrapped up in their own lives and problems to listen to more than the ten word answers our politicians are advertising to us. We need to remember to ask for the next ten words and the ten after that.

Otherwise there are two extreme possible outcomes:

We’re either at the beginning of what Mike Judge would term Idiocracy[13]

Or we’re headed to what Michiko Kakutani is “subtweeting” about in her recent review In ‘Hitler’ an Ascent from ‘Dunderhead’ to Demagogue [14] of Volker Ullrich’s new book Hitler: Ascent 1889-1939 [15]

Here, one is tempted to quote George Santayana’s famous line (from The Life of Reason, 1905), “Those who cannot remember the past are condemned to repeat it.” However, I far prefer the following as more apropos to our present national situation:

Sir Winston Leonard Spencer-Churchill, a British statesman, historian, writer and artist,
in House of Commons, 2 May 1935, after the Stresa Conference, in which Britain, France and Italy agreed—futilely—to maintain the independence of Austria.

 

tl;dr

If Cliff Nazarro comes back to run for president, I hope no one falls for his joke just because he wasn’t laughing as he acted it out. If his instructions for fixing the wagon (America) are any indication, the voters who are listening and making the repairs will be in severe pain.

Cliff Nazarro

Footnotes

[1]
“Genius | Song Lyrics & Knowledge,” Genius, 2016. [Online]. Available: http://genius.com. [Accessed: 29-Sep-2016]
[2]
“Hypothesis | The Internet, peer reviewed. | Hypothesis,” hypothes.is, 2016. [Online]. Available: https://hypothes.is/. [Accessed: 29-Sep-2016]
[3]
“Double-talk – Wikipedia, the free encyclopedia,” en.wikipedia.org, 2016. [Online]. Available: https://en.wikipedia.org/wiki/Double-talk. [Accessed: 29-Sep-2016]
[4]
“Turboencabulator – Wikipedia, the free encyclopedia,” en.wikipedia.org, 2016. [Online]. Available: https://en.wikipedia.org/wiki/Turboencabulator. [Accessed: 29-Sep-2016]
[5]
G. Orwell, Nineteen Eighty-four, 1st ed. London: Harvill Secker & Warburg, 1949.
[6]
G. Orwell, “Politics and the English Language,” Horizon, vol. 13, no. 76, pp. 252–265, Apr. 1946 [Online]. Available: http://www.orwell.ru/library/essays/politics/english/e_polit/
[7]
“Doublespeak – Wikipedia, the free encyclopedia,” en.wikipedia.org, 29-Sep-2016. [Online]. Available: https://en.wikipedia.org/wiki/Doublespeak. [Accessed: 29-Sep-2016]
[8]
J. Jacobs, “A Handy Guide To Politico-babble,” Chicago Tribune, 31-Oct-1992. [Online]. Available: http://articles.chicagotribune.com/1992-10-31/news/9204080638_1_family-values-trickle-bill-clinton. [Accessed: 29-Sep-2016]
[9]
M. Davis, “cmabridge | Cognition and Brain Sciences Unit,” mrc-cbu.cam.ac.uk, 2012. [Online]. Available: https://www.mrc-cbu.cam.ac.uk/people/matt.davis/cmabridge/. [Accessed: 29-Sep-2016]
[10]
D. Kahneman, Thinking, Fast and Slow. Macmillan, 2011.
[11]
C. E. Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal, vol. 27, no. 3, pp. 379–423, Jul. 1948.
[12]
A. Sorkin, J. Wells, and T. Schlamme , “Game On,” The West Wing, NBC, 30-Oct-2002.
[13]
M. Judge, Idiocracy. Twentieth Century Fox, 2006.
[14]
M. Kakutani, “In ‘Hitler’ an Ascent from ‘Dunderhead’ to Demagogue,” New York Times, p. 1, 27-Sep-2016 [Online]. Available: http://www.nytimes.com/2016/09/28/books/hitler-ascent-volker-ullrich.html?_r=0. [Accessed: 28-Sep-2016]
[15]
V. Ullrich, Adolf Hitler: Ascent 1889-1939, 1st ed. Knopf Publishing Group, 2016.

Weekly Recap: Interesting Articles 7/24-7/31 2016

Some of the interesting things I saw and read this week

Went on vacation or fell asleep at the internet wheel this week? Here’s some of the interesting stuff you missed.

Science & Math

Publishing

Indieweb, Internet, Identity, Blogging, Social Media

General


Workshop on Methods of Information Theory in Computational Neuroscience | CNS 2016

Workshop on Methods of Information Theory in Computational Neuroscience (CNS 2016) by Joseph T. Lizier (lizier.me)
Methods originally developed in Information Theory have found wide applicability in computational neuroscience. Beyond these original methods there is a need to develop novel tools and approaches that are driven by problems arising in neuroscience. A number of researchers in computational/systems neuroscience and in information/communication theory are investigating problems of information representation and processing. While the goals are often the same, these researchers bring different perspectives and points of view to a common set of neuroscience problems. Often they participate in different fora and their interaction is limited. The goal of the workshop is to bring some of these researchers together to discuss challenges posed by neuroscience and to exchange ideas and present their latest work. The workshop is targeted towards computational and systems neuroscientists with interest in methods of information theory as well as information/communication theorists with interest in neuroscience.

Devastating News: Sol Golomb has apparently passed away on Sunday

The world has certainly lost one of its greatest thinkers, and many of us have lost a dear friend, colleague, and mentor.

I was getting concerned that I hadn’t heard back from Sol for a while, particularly after emailing him late last week, and then I ran across this notice through ITSOC & the IEEE:

Solomon W. Golomb (May 30, 1932 – May 1, 2016)

Shannon Award winner and long-time ITSOC member Solomon W. Golomb passed away on May 1, 2016.
Solomon W. Golomb was the Andrew Viterbi Chair in Electrical Engineering at the University of Southern California (USC) and was at USC since 1963, rising to the rank of University and Distinguished Professor. He was a member of the National Academies of Engineering and Science, and was awarded the National Medal of Science, the Shannon Award, the Hamming Medal, and numerous other accolades. As USC Dean Yiannis C. Yortsos wrote, “With unparalleled scholarly contributions and distinction to the field of engineering and mathematics, Sol’s impact has been extraordinary, transformative and impossible to measure. His academic and scholarly work on the theory of communications built the pillars upon which our modern technological life rests.”

In addition to his many contributions to coding and information theory, Professor Golomb was one of the great innovators in recreational mathematics, contributing many articles to Scientific American and other publications. More recent Information Theory Society members may be most familiar with his mathematics puzzles that appeared in the Society Newsletter, which will publish a full remembrance later.

A quick search a moment later revealed this sad confirmation along with some great photos from an award Sol received just a week ago:

As is common in academia, I’m sure it will take a few days for the news to drip out, but the world has certainly lost one of its greatest thinkers, and many of us have lost a dear friend, colleague, and mentor.

I’ll try to touch base with his family and pass along what information I can. I’ll post forthcoming obituaries as I see them, and will surely post some additional thoughts and reminiscences of my own in the coming days.

President Barack Obama presents Solomon Golomb with the National Medal of Science at an awards ceremony held at the White House in 2013.

Physicists Hunt For The Big Bang’s Triangles | Quanta Magazine

Physicists Hunt for the Big Bang’s Triangles by Natalie Wolchover (Quanta Magazine)

“The notion that counting more shapes in the sky will reveal more details of the Big Bang is implied in a central principle of quantum physics known as “unitarity.” Unitarity dictates that the probabilities of all possible quantum states of the universe must add up to one, now and forever; thus, information, which is stored in quantum states, can never be lost — only scrambled. This means that all information about the birth of the cosmos remains encoded in its present state, and the more precisely cosmologists know the latter, the more they can learn about the former.”


2016 North-American School of Information Theory, June 21-23

2016 North-American School of Information Theory, June 21-23, 2016 (itsoc.org)

The 2016 North-American School of Information Theory will be hosted at Duke University, June 21-23. It is sponsored by the IEEE Information Theory Society, Duke University, the Center for Science of Information, and the National Science Foundation. The school provides a venue where doctoral and postdoctoral students can learn from distinguished professors in information theory, meet with fellow researchers, and form collaborations.

Program and Lectures

The daily schedule will consist of morning and afternoon lectures separated by a lunch break with poster sessions. Students from all research areas are welcome to attend and present their own research via a poster during the school.  The school will host lectures on core areas of information theory and interdisciplinary topics. The following lecturers are confirmed:

  • Helmut Bölcskei (ETH Zurich): The Mathematics of Deep Learning
  • Natasha Devroye (University of Illinois, Chicago): The Interference Channel
  • René Vidal (Johns Hopkins University): Global Optimality in Deep Learning and Beyond
  • Tsachy Weissman (Stanford University): Information Processing under Logarithmic Loss
  • Aylin Yener (Pennsylvania State University): Information-Theoretic Security

Logistics

Applications will be available on March 15 and will be evaluated starting April 1.  Accepted students must register by May 15, 2016.  The registration fee of $200 will include food and 3 nights accommodation in a single-occupancy room.  We suggest that attendees fly into the Raleigh-Durham (RDU) airport located about 30 minutes from the Duke campus. Housing will be available for check-in on the afternoon of June 20th.  The main part of the program will conclude after lunch on June 23rd so that attendees can fly home that evening.

To Apply: click “register” here (the fee will be collected after acceptance)

Administrative Contact: Kathy Peterson, itschool2016@gmail.com

Organizing Committee

Henry Pfister (chair) (Duke University), Dror Baron (North Carolina State University), Matthieu Bloch (Georgia Tech), Rob Calderbank (Duke University), Galen Reeves (Duke University). Advisors: Gerhard Kramer (Technical University of Munich) and Andrea Goldsmith (Stanford)

Sponsors


Introduction to Information Theory | SFI’s Complexity Explorer

The Santa Fe Institute's free online course "Introduction to Information Theory" taught by Seth Lloyd via Complexity Explorer.

Many readers often ask me for resources for delving into the basics of information theory. I hadn’t posted it before, but the Santa Fe Institute recently had an online course Introduction to Information Theory through their Complexity Explorer, which has some other excellent offerings. It included videos, fora, and other resources and was taught by the esteemed physicist and professor Seth Lloyd. There are a number of currently active students still learning and posting there.

Introduction to Information Theory

About the Tutorial:

This tutorial introduces fundamental concepts in information theory. Information theory has made considerable impact in complex systems, and has in part co-evolved with complexity science. Research areas ranging from ecology and biology to aerospace and information technology have all seen benefits from the growth of information theory.

In this tutorial, students will follow the development of information theory from bits to modern application in computing and communication. Along the way Seth Lloyd introduces valuable topics in information theory such as mutual information, boolean logic, channel capacity, and the natural relationship between information and entropy.

Lloyd coherently covers a substantial amount of material while limiting discussion of the mathematics involved. When formulas or derivations are considered, Lloyd describes the mathematics such that less advanced math students will find the tutorial accessible. Prerequisites for this tutorial are an understanding of logarithms, and at least a year of high-school algebra.

About the Instructor(s):

Professor Seth Lloyd is a principal investigator in the Research Laboratory of Electronics (RLE) at the Massachusetts Institute of Technology (MIT). He received his A.B. from Harvard College in 1982, the Certificate of Advanced Study in Mathematics (Part III) and an M. Phil. in Philosophy of Science from Cambridge University in 1983 and 1984 under a Marshall Fellowship, and a Ph.D. in Physics in 1988 from Rockefeller University under the supervision of Professor Heinz Pagels.

From 1988 to 1991, Professor Lloyd was a postdoctoral fellow in the High Energy Physics Department at the California Institute of Technology, where he worked with Professor Murray Gell-Mann on applications of information to quantum-mechanical systems. From 1991 to 1994, he was a postdoctoral fellow at Los Alamos National Laboratory, where he worked at the Center for Nonlinear Systems on quantum computation. In 1994, he joined the faculty of the Department of Mechanical Engineering at MIT. Since 1988, Professor Lloyd has also been an adjunct faculty member at the Santa Fe Institute.

Professor Lloyd has performed seminal work in the fields of quantum computation and quantum communications, including proposing the first technologically feasible design for a quantum computer, demonstrating the viability of quantum analog computation, proving quantum analogs of Shannon’s noisy channel theorem, and designing novel methods for quantum error correction and noise reduction.

Professor Lloyd is a member of the American Physical Society and the American Society of Mechanical Engineers.

Tutorial Team:

Yoav Kallus is an Omidyar Fellow at the Santa Fe Institute. His research at the boundary of statistical physics and geometry looks at how and when simple interactions lead to the formation of complex order in materials and when preferred local order leads to system-wide disorder. Yoav holds a B.Sc. in physics from Rice University and a Ph.D. in physics from Cornell University. Before joining the Santa Fe Institute, Yoav was a postdoctoral fellow at the Princeton Center for Theoretical Science in Princeton University.

How to use Complexity Explorer: How to use Complexity Explorer
Prerequisites: At least one year of high-school algebra


Syllabus

  1. Introduction
  2. Forms of Information
  3. Information and Probability
  4. Fundamental Formula of Information
  5. Computation and Logic: Information Processing
  6. Mutual Information
  7. Communication Capacity
  8. Shannon’s Coding Theorem
  9. The Manifold Things Information Measures
  10. Homework

Global Language Networks

Recent research on global language networks has interesting relations to big history, complexity economics, and current politics.

Yesterday I ran across this nice little video explaining some recent research on global language networks. It’s not only interesting in its own right, but is a fantastic example of science communication as well.

I’m interested in some of the information theoretic aspects of this as well as the relation of this to the area of corpus linguistics. I’m also curious if one could build worthwhile datasets like this for the ancient world (cross reference some of the sources I touch on in relation to the Dickinson College Commentaries within Latin Pedagogy and the Digital Humanities) to see what influences different language cultures have had on each other. Perhaps the historical record could help to validate some of the predictions made in relation to the future?

The paper “Global distribution and drivers of language extinction risk” indicates that of all the variables tested, economic growth was most strongly linked to language loss.

This research also relates in interesting ways to the concept of “Collective Learning” within the Big History framework of David Christian, Fred Spier, et al. I’m curious to revisit my hypothesis, informed in part by the work of Jared Diamond, that collective learning has been growing at the expense of a shrinking body of diverse languages.

Some of the discussion in the video reminds me of the work Stuart Kauffman lays out in At Home in the Universe: The Search for the Laws of Self-Organization and Complexity (Oxford, 1995), particularly chapter 3, in which Kauffman discusses the networks of life. The analogy between those networks and the language networks here suggests to me that some of Cesar Hidalgo’s recent work in Why Information Grows: The Evolution of Order, From Atoms to Economies (MIT Press, 2015) is even more interesting for showing the true value of links between people and firms (information sources he measures as personbytes and firmbytes) within economies.

Finally, I can’t help but wonder whether this research might help to temper some of the xenophobic discussion in American political life surrounding Mexican immigration and China’s position in the world economy.

Those intrigued by the video may find the website set up by the researchers very interesting. It contains links to the full paper as well as visualizations and links to the data used.

Abstract

Languages vary enormously in global importance because of historical, demographic, political, and technological forces. However, beyond simple measures of population and economic power, there has been no rigorous quantitative way to define the global influence of languages. Here we use the structure of the networks connecting multilingual speakers and translated texts, as expressed in book translations, multiple language editions of Wikipedia, and Twitter, to provide a concept of language importance that goes beyond simple economic or demographic measures. We find that the structure of these three global language networks (GLNs) is centered on English as a global hub and around a handful of intermediate hub languages, which include Spanish, German, French, Russian, Portuguese, and Chinese. We validate the measure of a language’s centrality in the three GLNs by showing that it exhibits a strong correlation with two independent measures of the number of famous people born in the countries associated with that language. These results suggest that the position of a language in the GLN contributes to the visibility of its speakers and the global popularity of the cultural content they produce.

Citation: Ronen S, Goncalves B, Hu KZ, Vespignani A, Pinker S, Hidalgo CA, “Links that speak: The global language network and its association with global fame,” Proceedings of the National Academy of Sciences (PNAS) (2014), doi:10.1073/pnas.1410931111
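The abstract’s notion of a language’s “centrality” in a global language network is, roughly, an eigenvector-style centrality on the graph of multilingual connections. Here is a minimal sketch by power iteration on a hypothetical five-language network; the edges and weights are purely illustrative, not the paper’s data:

```python
# Toy weighted language network. These edges and weights are
# invented for illustration; they are NOT drawn from the paper.
graph = {
    "English": {"Spanish": 1.0, "French": 1.0, "German": 1.0, "Chinese": 0.5},
    "Spanish": {"English": 1.0, "French": 0.5},
    "French":  {"English": 1.0, "Spanish": 0.5, "German": 0.5},
    "German":  {"English": 1.0, "French": 0.5},
    "Chinese": {"English": 0.5},
}

def eigenvector_centrality(g, iterations=100):
    """Approximate eigenvector centrality by repeated neighbor-score
    summation, renormalizing so the scores sum to 1 each pass."""
    scores = {node: 1.0 for node in g}
    for _ in range(iterations):
        new = {node: sum(w * scores[nbr] for nbr, w in g[node].items())
               for node in g}
        norm = sum(new.values())
        scores = {node: v / norm for node, v in new.items()}
    return scores

centrality = eigenvector_centrality(graph)
print(max(centrality, key=centrality.get))  # the hub of this toy network
```

In this toy graph the most-connected node comes out on top, mirroring the paper’s finding of English as the global hub.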

Related posts:

“A language like Dutch — spoken by 27 million people — can be a disproportionately large conduit, compared with a language like Arabic, which has a whopping 530 million native and second-language speakers,” Science reports. “This is because the Dutch are very multilingual and very online.”


What is Information? by Christoph Adami

What is Information? [1601.06176] by Christoph Adami (arxiv.org)
Information is a precise concept that can be defined mathematically, but its relationship to what we call "knowledge" is not always made clear. Furthermore, the concepts "entropy" and "information", while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.

A proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.
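Adami’s framing of information as that which enables prediction is captured by mutual information, I(X;Y) = H(X) − H(X|Y): how much knowing Y reduces our uncertainty about X. A minimal sketch of my own, not taken from the paper:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Perfectly correlated bits: observing Y predicts X exactly -> 1 bit.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0
# Independent bits: Y tells us nothing about X -> 0 bits.
print(mutual_information({(0, 0): 0.25, (0, 1): 0.25,
                          (1, 0): 0.25, (1, 1): 0.25}))  # 0.0
```

The second case illustrates Adami’s distinction: the independent system still has entropy (uncertainty), but carries no information about the other variable.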

Comments: 19 pages, 2 figures. To appear in Philosophical Transactions of the Royal Society A
Subjects: Adaptation and Self-Organizing Systems (nlin.AO); Information Theory (cs.IT); Biological Physics (physics.bio-ph); Quantitative Methods (q-bio.QM)
Cite as: arXiv:1601.06176 [nlin.AO] (or arXiv:1601.06176v1 [nlin.AO] for this version)

From: Christoph Adami
[v1] Fri, 22 Jan 2016 21:35:44 GMT (151kb,D) [.pdf]

Source: Christoph Adami [1601.06176] What is Information? on arXiv


Donald Forsdyke Indicates the Concept of Information in Biology Predates Claude Shannon

In the 1870s Ewald Hering in Prague and Samuel Butler in London laid the foundations. Butler's work was later taken up by Richard Semon in Munich, whose writings inspired the young Erwin Schrodinger in the early decades of the 20th century.

As it was published, I had read Kevin Hartnett’s article and interview with Christoph Adami The Information Theory of Life in Quanta Magazine. I recently revisited it and read through the commentary and stumbled upon an interesting quote relating to the history of information in biology:

Polymath Adami has ‘looked at so many fields of science’ and has correctly indicated the underlying importance of information theory, to which he has made important contributions. However, perhaps because the interview was concerned with the origin of life and was edited and condensed, many readers may get the impression that IT is only a few decades old. However, information ideas in biology can be traced back to at least 19th century sources. In the 1870s Ewald Hering in Prague and Samuel Butler in London laid the foundations. Butler’s work was later taken up by Richard Semon in Munich, whose writings inspired the young Erwin Schrodinger in the early decades of the 20th century. The emergence of his text – “What is Life” – from Dublin in the 1940s, inspired those who gave us DNA structure and the associated information concepts in “the classic period” of molecular biology. For more please see: Forsdyke, D. R. (2015) History of Psychiatry 26 (3), 270-287.

Donald Forsdyke, bioinformatician and theoretical biologist
in response to The Information Theory of Life in Quanta Magazine on

These two historical references predate Claude Shannon’s mathematical formalization of information in A Mathematical Theory of Communication (The Bell System Technical Journal, 1948) and even Erwin Schrödinger‘s lecture (1943) and subsequent book What is Life (1944).

For those interested in reading more on this historical tidbit, I’ve dug up a copy of the primary Forsdyke reference which first appeared on arXiv (prior to its ultimate publication in History of Psychiatry [.pdf]):

🔖 [1406.1391] ‘A Vehicle of Symbols and Nothing More.’ George Romanes, Theory of Mind, Information, and Samuel Butler by Donald R. Forsdyke  [1]
Submitted on 4 Jun 2014 (v1), last revised 13 Nov 2014 (this version, v2)

Abstract: Today’s ‘theory of mind’ (ToM) concept is rooted in the distinction of nineteenth century philosopher William Clifford between ‘objects’ that can be directly perceived, and ‘ejects,’ such as the mind of another person, which are inferred from one’s subjective knowledge of one’s own mind. A founder, with Charles Darwin, of the discipline of comparative psychology, George Romanes considered the minds of animals as ejects, an idea that could be generalized to ‘society as eject’ and, ultimately, ‘the world as an eject’ – mind in the universe. Yet, Romanes and Clifford only vaguely connected mind with the abstraction we call ‘information,’ which needs ‘a vehicle of symbols’ – a material transporting medium. However, Samuel Butler was able to address, in informational terms depleted of theological trappings, both organic evolution and mind in the universe. This view harmonizes with insights arising from modern DNA research, the relative immortality of ‘selfish’ genes, and some startling recent developments in brain research.

Comments: Accepted for publication in History of Psychiatry. 31 pages including 3 footnotes. Based on a lecture given at Santa Clara University, February 28th 2014, at a Bannan Institute Symposium on ‘Science and Seeking: Rethinking the God Question in the Lab, Cosmos, and Classroom.’

The original arXiv article also referenced two lectures which are appended below:

[Original Draft of this was written on December 14, 2015.]

References

[1]
D. R. Forsdyke, “‘A vehicle of symbols and nothing more’. George Romanes, theory of mind, information, and Samuel Butler,” History of Psychiatry, vol. 26, no. 3, pp. 270–287, Aug. 2015 [Online]. Available: http://journals.sagepub.com/doi/abs/10.1177/0957154X14562755

Quantum Biological Information Theory by Ivan B. Djordjevic | Springer

Quantum Biological Information Theory by Ivan B. Djordjevic (Springer, 2015)

Springer recently announced the publication of the book Quantum Biological Information Theory by Ivan B. Djordjevic, in which I’m sure many readers here will have interest. I hope to have a review of it shortly after I’ve gotten a copy. Until then…

From the publisher’s website:

This book is a self-contained, tutorial-based introduction to quantum information theory and quantum biology. It serves as a single-source reference to the topic for researchers in bioengineering, communications engineering, electrical engineering, applied mathematics, biology, computer science, and physics. The book provides all the essential principles of the quantum biological information theory required to describe the quantum information transfer from DNA to proteins, the sources of genetic noise and genetic errors as well as their effects.

  • Integrates quantum information and quantum biology concepts;
  • Assumes only knowledge of basic concepts of vector algebra at undergraduate level;
  • Provides a thorough introduction to basic concepts of quantum information processing, quantum information theory, and quantum biology;
  • Includes in-depth discussion of the quantum biological channel modelling, quantum biological channel capacity calculation, quantum models of aging, quantum models of evolution, quantum models on tumor and cancer development, quantum modeling of bird navigation compass, quantum aspects of photosynthesis, quantum biological error correction.

Source: Quantum Biological Information Theory | Ivan B. Djordjevic | Springer

I’ll note that it looks like the book also assumes some reasonable facility with quantum mechanics in addition to the material listed above.

Springer also has a downloadable copy of the preface and a relatively extensive table of contents for those looking for a preview. Dr. Djordjevic has been added to the ever-growing list of researchers doing work at the intersection of information theory and biology.


Einstein’s Equations From Entanglement

In a lecture at Caltech, Brian Swingle reviews the idea that entanglement is the glue which holds spacetime together and shows how Einstein's equations plausibly emerge from this perspective. One ubiquitous feature of these dynamical equations is the formation of black holes, so he concludes by discussing some new ideas about the nature of spacetime inside a black hole.

Brian Swingle Colloquium at Caltech

From the Physics Research Conference 2015-2016
on Thursday, November 19, 2015 at 4:00 pm
at the California Institute of Technology, East Bridge 201 – Norman Bridge Laboratory of Physics, East

All talks are intended for a broad audience, and everyone is encouraged to attend.
Sponsored by Division of Physics, Mathematics and Astronomy

In recent years we have learned that the physics of quantum information plays a crucial role in the emergence of spacetime from microscopic degrees of freedom.

I will review the idea that entanglement is the glue which holds spacetime together and show how Einstein’s equations plausibly emerge from this perspective. One ubiquitous feature of these dynamical equations is the formation of black holes, so I will conclude by discussing some new ideas about the nature of spacetime inside a black hole.

Brian Swingle, postdoctoral fellow at the Stanford Institute for Theoretical Physics and physicist focusing on quantum matter, quantum information, and quantum gravity
in Physics Research Conference | Caltech

