In this episode, Haley interviews Stephen Wolfram at the Ninth International Conference on Complex Systems. Wolfram is the creator of Mathematica, Wolfram|Alpha, and the Wolfram Language; the author of A New Kind of Science; and the founder and CEO of Wolfram Research. Wolfram talks with Haley about his professional journey and reflects on almost four decades of history, from his first introduction to the field of complexity science to the 30th anniversary of Mathematica. He shares his hopes for the evolution of complexity science as a foundational field of study. He also gives advice for complexity researchers, recommending they focus on asking simple, foundational questions.
🔖 An Introduction to Transfer Entropy: Information Flow in Complex Systems
This book considers a relatively new metric in complex systems, transfer entropy, derived from a series of measurements, usually a time series. After a qualitative introduction and a chapter that explains the key ideas from statistics required to understand the text, the authors present information theory and transfer entropy in depth. A key feature of the approach is the authors' work showing the relationship between information flow and complexity. The later chapters demonstrate information transfer in canonical systems and in applications such as neuroscience and finance. The book will be of value to advanced undergraduate and graduate students and researchers in computer science, neuroscience, physics, and engineering. ISBN: 978-3-319-43221-2 (Print), 978-3-319-43222-9 (Online)
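To make the quantity concrete: for two processes X and Y, transfer entropy from Y to X measures how much better the next value of X is predicted when Y's past is added to X's own past. Below is a minimal plug-in sketch for binary series with one step of history; it only illustrates the definition and is not one of the estimators the book develops.

```python
# Minimal plug-in estimate of transfer entropy T_{Y->X} for two binary time
# series with a history length of one. A rough sketch of the definition, not
# the estimators developed in the book.
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    """T_{Y->X} = sum over (x_{t+1}, x_t, y_t) of
    p(x_{t+1}, x_t, y_t) * log2[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))     # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))           # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))            # (x_{t+1}, x_t)
    singles = Counter(x[:-1])                         # x_t
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]            # p(x_{t+1} | x_t, y_t)
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]   # p(x_{t+1} | x_t)
        te += p_joint * log2(p_cond_xy / p_cond_x)
    return te

# y drives x with a one-step delay, so information flows from Y to X but not back.
random.seed(0)
y = [random.randint(0, 1) for _ in range(10000)]
x = [0] + y[:-1]                       # x copies y with a one-step lag
print(transfer_entropy(x, y))          # T_{Y->X}: close to 1 bit
print(transfer_entropy(y, x))          # T_{X->Y}: close to 0 bits
```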
👓 Chris Aldrich is reading “How to Succeed in the Networked World”
The world’s connections have become more important than its divisions. To reap the rewards and avoid the pitfalls of this new order, the United States needs to adopt a grand strategy based on three pillars: open societies, open governments, and an open international system.
This article also takes a broader historical approach to these topics and is close enough in philosophy that I might even consider it a policy case made from a Big History point of view.
Highly recommend.
Highlights, Quotes, & Marginalia
Think of a standard map of the world, showing the borders and capitals of the world’s 190-odd countries. That is the chessboard view. Now think of a map of the world at night, with the lit-up bursts of cities and the dark swaths of wilderness. Those corridors of light mark roads, cars, houses, and offices; they mark the networks of human relationships, where families and workers and travelers come together. That is the web view. It is a map not of separation, marking off boundaries of sovereign power, but of connection.
…the Westphalian world order mandated the sovereign equality of states not as an end in itself but as a means to protect the subjects of those states—the people.
The people must come first. Where they do not, sooner or later, they will overthrow their governments.
Open societies, open governments, and an open international system are risky propositions. But they are humankind’s best hope for harnessing the power not only of states but also of businesses, universities, civic organizations, and citizens to address the planetary problems that now touch us all.
…when a state abrogated its responsibility to protect the basic rights of its people, other states had a responsibility to protect those citizens, if necessary through military intervention.
But human rights themselves became politically polarized during the Cold War, with the West championing civil and political rights; the East championing economic, social, and cultural rights; and both sides tending to ignore violations in their client states.
The institutions built after World War II remain important repositories of legitimacy and authority. But they need to become the hubs of a flatter, faster, more flexible system, one that operates at the level of citizens as well as states.
U.S. policymakers should think in terms of translating chessboard alliances into hubs of connectedness and capability.
According to systems theory, the level of organization in a closed system can only stay the same or decrease. In open systems, by contrast, the level of organization can increase in response to new inputs and disruptions. That means that such a system should be able to ride out the volatility caused by changing power relationships and incorporate new kinds of global networks.
Writing about “connexity” 20 years ago, the British author and political adviser Geoff Mulgan argued that in adapting to permanent interdependence, governments and societies would have to rethink their policies, organizational structures, and conceptions of morality. Constant connectedness, he wrote, would place a premium on “reciprocity, the idea of give and take,” and a spirit of openness, trust, and transparency would underpin a “different way of governing.” Governments would “provide a framework of predictability, but leave space for people to organise themselves in flatter, more reciprocal structures.”
Instead of governing themselves through those who represent them, citizens can partner directly with the government to solve public problems.
…an open international order of the twenty-first century should be anchored in secure and self-reliant societies, in which citizens can participate actively in their own protection and prosperity. The first building block is open societies; the second is open governments.
The self-reliance necessary for open security depends on the ability to self-organize and take action.
The government’s role is to “invest in creating a more resilient nation,” which includes briefing and empowering the public, but more as a partner than a protector.
…much of the civil rights work of this century will entail championing digital rights.
Hard gatekeeping is a strategy of connection, but it calls for division, replacing the physical barriers of the twentieth century with digital ones of the twenty-first.
In this order, states must be waves and particles at the same time.
The legal order of the twenty-first century must be a double order, acknowledging the existence of domestic and international spheres of action and law but seeing the boundary between them as permeable.
In many countries, legislatures and government agencies have begun publishing draft legislation on open-source platforms such as GitHub, enabling their publics to contribute to the revision process.
The declaration’s three major principles are transparency, civic participation, and accountability.
Hector Zenil
If you’re not following him everywhere (?) yet, start with some of the sites below (or let me know if I’ve missed anything).
His most recent paper on arXiv:
Low Algorithmic Complexity Entropy-deceiving Graphs | .pdf
A common practice in the estimation of the complexity of objects, in particular of graphs, is to rely on graph- and information-theoretic measures. Here, using integer sequences with properties such as Borel normality, we explain how these measures are not independent of the way in which a single object, such as a graph, can be described. From descriptions that can reconstruct the same graph and are therefore essentially translations of the same description, we will see that not only is it necessary to pre-select a feature of interest where there is one when applying a computable measure such as Shannon Entropy, and to make an arbitrary selection where there is not, but that more general properties, such as the causal likeliness of a graph as a measure (as opposed to randomness), can be largely misrepresented by computable measures such as Entropy and Entropy rate. We introduce recursive and non-recursive (uncomputable) graphs and graph constructions based on integer sequences, whose different lossless descriptions have disparate Entropy values, thereby enabling the study and exploration of a measure’s range of applications and demonstrating the weaknesses of computable measures of complexity.
Subjects: Information Theory (cs.IT); Computational Complexity (cs.CC); Combinatorics (math.CO)
Cite as: arXiv:1608.05972 [cs.IT] (or arXiv:1608.05972v4 [cs.IT] for this version)
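The core point, that Shannon entropy is a property of a chosen description rather than of the graph itself, is easy to see with a toy example. The sketch below is my own illustration, not the paper's construction: it encodes the same star graph in two lossless ways and gets different plug-in entropy values.

```python
# Toy illustration: the plug-in Shannon entropy of a graph depends on which
# lossless description you compute it over. Not the paper's construction,
# just a small demonstration of description-dependence.
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Plug-in Shannon entropy (bits per symbol) of a sequence of symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * log2(c / n) for c in counts.values())

n = 16
# A star graph: node 0 is connected to every other node.
adj = [[1 if (i == 0) != (j == 0) else 0 for j in range(n)] for i in range(n)]

# Description 1: the flattened adjacency matrix (a stream of 0/1 symbols).
bits = [b for row in adj for b in row]
# Description 2: the degree sequence, which also reconstructs a star graph
# up to isomorphism, so it is lossless for this family.
degrees = [sum(row) for row in adj]

print(shannon_entropy(bits))     # ~0.52 bits/symbol for the adjacency stream
print(shannon_entropy(degrees))  # ~0.34 bits/symbol for the degree sequence
```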
YouTube
Yesterday he also posted two new introductory videos to his YouTube channel. There’s nothing overly technical here, but they’re nice short productions that introduce some of his work. (I wish more scientists communicated their work like this.) I’m hoping he’ll post them to his blog and write a bit more there in the future as well.
Universal Measures of Complexity
Relevant literature:
- A Decomposition Method for Global Evaluation of Shannon Entropy and Local Estimations of Algorithmic Complexity by Hector Zenil, Fernando Soler-Toscano, Narsis A. Kiani, Santiago Hernández-Orozco, Antonio Rueda-Toicen
- Calculating Kolmogorov Complexity from the Output Frequency Distributions of Small Turing Machines by F. Soler-Toscano, H. Zenil, J.-P. Delahaye and N. Gauvrit; PLoS ONE 9(5): e96223, 2014.
- Numerical Evaluation of Algorithmic Complexity for Short Strings: A Glance into the Innermost Structure of Randomness by Jean-Paul Delahaye, Hector Zenil; Applied Mathematics and Computation 219, pp. 63-77, 2012.
Reprogrammable World
Relevant literature:
- Cross-boundary Behavioural Reprogrammability Reveals Evidence of Pervasive Turing Universality by Jürgen Riedel, Hector Zenil; preprint available at http://arxiv.org/abs/1510.01671
Ed.: 9/7/16: Updated videos with links to relevant literature
The Hidden Algorithms Underlying Life | Quanta Magazine
The biological world is computational at its core, argues computer scientist Leslie Valiant.
I did expect something more entertaining from Google when I searched for “what will happen if I squeeze a paper cup full of hot coffee?”
Information Theory is Something Like the Logarithm of Probability Theory
Not only a great quote, but an interesting way to view the subjects.
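One way to unpack the quote (my gloss, not the original post's): taking the negative logarithm turns the multiplicative structure of independent probabilities into the additive structure of information, and Shannon entropy is just the expectation of that logarithm.

```latex
% Independence multiplies probabilities but adds information:
p(x, y) = p(x)\,p(y)
  \;\Longrightarrow\;
  -\log p(x, y) = -\log p(x) - \log p(y).
% Shannon entropy is the expected value of that log:
H(X) = \mathbb{E}\!\left[-\log p(X)\right] = -\sum_{x} p(x) \log p(x).
```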