Hector Zenil

A new paper (arXiv) and some videos on entropy and algorithmic complexity

I’ve run across some of his work before, but I recently came upon some new material by Hector Zenil that will likely interest those following information theory, complexity, and computer science here. I hadn’t previously noticed that he refers to himself on his website as an “information theoretic biologist” — everyone should have that as a title, shouldn’t they? As a result, I’ve also added him to the growing list of ITBio Researchers.

If you’re not following him everywhere (?) yet, start with some of the sites below (or let me know if I’ve missed anything).

Hector Zenil:

His most recent paper on arXiv:
Low Algorithmic Complexity Entropy-deceiving Graphs | .pdf

A common practice in the estimation of the complexity of objects, in particular of graphs, is to rely on graph- and information-theoretic measures. Here, using integer sequences with properties such as Borel normality, we explain how these measures are not independent of the way in which a single object, such as a graph, can be described. From descriptions that can reconstruct the same graph and are therefore essentially translations of the same description, we will see that not only is it necessary to pre-select a feature of interest where there is one when applying a computable measure such as Shannon Entropy, and to make an arbitrary selection where there is not, but that more general properties, such as the causal likeliness of a graph as a measure (as opposed to randomness), can be largely misrepresented by computable measures such as Entropy and Entropy rate. We introduce recursive and non-recursive (uncomputable) graphs and graph constructions based on integer sequences, whose different lossless descriptions have disparate Entropy values, thereby enabling the study and exploration of a measure’s range of applications and demonstrating the weaknesses of computable measures of complexity.

Subjects: Information Theory (cs.IT); Computational Complexity (cs.CC); Combinatorics (math.CO)
Cite as: arXiv:1608.05972 [cs.IT] (or arXiv:1608.05972v4 [cs.IT])
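The gist, for those who don’t want to wade into the paper itself: Shannon Entropy is not an invariant of an object, only of a chosen description of it. As a quick toy illustration (my own sketch in Python, not one of the paper’s constructions, which are built from integer sequences with properties like Borel normality), two lossless descriptions of the same small graph can yield quite different entropy values:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Empirical Shannon entropy (bits per symbol) of a sequence."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A star graph on 8 nodes: node 0 connected to each of nodes 1..7.
n = 8
edges = [(0, i) for i in range(1, n)]

# Description 1: the flattened adjacency matrix (a string of 0s and 1s).
adjacency = [[1 if (i, j) in edges or (j, i) in edges else 0
              for j in range(n)] for i in range(n)]
adj_bits = [bit for row in adjacency for bit in row]

# Description 2: the degree sequence of the very same graph.
degrees = [sum(row) for row in adjacency]

print("Entropy of adjacency-matrix description:", shannon_entropy(adj_bits))  # ~0.76 bits/symbol
print("Entropy of degree-sequence description: ", shannon_entropy(degrees))   # ~0.54 bits/symbol
```

Both descriptions reconstruct the same graph (for a star graph the degree sequence suffices), yet a computable measure like Entropy scores them differently, which is the kind of description-dependence the paper formalizes and exploits.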

YouTube

Yesterday he also posted two new introductory videos to his YouTube channel. There’s nothing overly technical here, but they’re nice short productions that introduce some of his work. (I wish more scientists did this kind of science communication.) I’m hoping he’ll post them to his blog and write a bit more there in the future as well.

Universal Measures of Complexity

Relevant literature:

Reprogrammable World

Relevant literature:

Cross-boundary Behavioural Reprogrammability Reveals Evidence of Pervasive Turing Universality by Jürgen Riedel, Hector Zenil
Preprint available at http://arxiv.org/abs/1510.01671

Ed.: 9/7/16: Updated videos with links to relevant literature

Randomness And Complexity, from Leibniz To Chaitin | World Scientific Publishing

Bookmarked Randomness And Complexity, from Leibniz To Chaitin (amzn.to)
The book is a collection of papers written by a selection of eminent authors from around the world in honour of Gregory Chaitin’s 60th birthday. This is a unique volume including technical contributions, philosophical papers and essays. Hardcover: 468 pages; Publisher: World Scientific Publishing Company (October 18, 2007); ISBN: 9789812770820

Network Science by Albert-László Barabási

Bookmarked Network Science by Albert-László Barabási (Cambridge University Press)

I ran across a link to this textbook by way of a standing Google alert, and was excited to check it out. I was immediately disappointed to learn that I would have to wait another month and change for the physical textbook to be released, but I placed my pre-order anyway. Then, with a bit of digging around, I realized that individual chapters are available immediately to quench my thirst until the physical text is printed next month.

The power of network science, the beauty of network visualization.

Network Science, a textbook for network science, is freely available under the Creative Commons licence. Follow its development on Facebook, Twitter or by signing up to our mailing list, so that we can notify you of new chapters and developments.

The book is the result of a collaboration between a number of individuals, shaping everything, from content (Albert-László Barabási), to visualizations and interactive tools (Gabriele Musella, Mauro Martino, Nicole Samay, Kim Albrecht), simulations and data analysis (Márton Pósfai). The printed version of the book will be published by Cambridge University Press in 2016. In the coming months the website will be expanded with an interactive version of the text, datasets, and slides to teach the material.

Book Contents

Personal Introduction
1. Introduction
2. Graph Theory
3. Random Networks
4. The Scale-Free Property
5. The Barabási-Albert Model
6. Evolving Networks
7. Degree Correlations
8. Network Robustness
9. Communities
10. Spreading Phenomena
Usage & Acknowledgements
About

Albert-László Barabási
on Network Science (book website)

Networks are everywhere, from the Internet, to social networks, and the genetic networks that determine our biological existence. Illustrated throughout in full colour, this pioneering textbook, spanning a wide range of topics from physics to computer science, engineering, economics and the social sciences, introduces network science to an interdisciplinary audience. From the origins of the six degrees of separation to explaining why networks are robust to random failures, the author explores how viruses like Ebola and H1N1 spread, and why it is that our friends have more friends than we do. Using numerous real-world examples, this innovatively designed text includes clear delineation between undergraduate and graduate level material. The mathematical formulas and derivations are included within Advanced Topics sections, enabling use at a range of levels. Extensive online resources, including films and software for network analysis, make this a multifaceted companion for anyone with an interest in network science.

Source: Cambridge University Press

The textbook is available for purchase in September 2016 from Cambridge University Press. Pre-order now on Amazon.com.

If you’re not already doing so, you should follow Barabási on Twitter.

Disconnected, Fragmented, or United? A Trans-disciplinary Review of Network Science

Bookmarked Disconnected, Fragmented, or United? A Trans-disciplinary Review of Network Science by César A. Hidalgo (Applied Network Science | SpringerLink)


Abstract

During decades the study of networks has been divided between the efforts of social scientists and natural scientists, two groups of scholars who often do not see eye to eye. In this review I present an effort to mutually translate the work conducted by scholars from both of these academic fronts hoping to continue to unify what has become a diverging body of literature. I argue that social and natural scientists fail to see eye to eye because they have diverging academic goals. Social scientists focus on explaining how context specific social and economic mechanisms drive the structure of networks and on how networks shape social and economic outcomes. By contrast, natural scientists focus primarily on modeling network characteristics that are independent of context, since their focus is to identify universal characteristics of systems instead of context specific mechanisms. In the following pages I discuss the differences between both of these literatures by summarizing the parallel theories advanced to explain link formation and the applications used by scholars in each field to justify their approach to network science. I conclude by providing an outlook on how these literatures can be further unified.

Design and Control of Self-organizing Systems

Bookmarked Design and Control of Self-organizing Systems by Carlos Gershenson (scifunam.fisica.unam.mx)

UNAM Mexico City offers a free download of Carlos Gershenson’s 2007 text.

Complex systems are usually difficult to design and control. There are several particular methods for coping with complexity, but there is no general approach to build complex systems. In this book I propose a methodology to aid engineers in the design and control of complex systems. This is based on the description of systems as self-organizing. Starting from the agent metaphor, the methodology proposes a conceptual framework and a series of steps to follow to find proper mechanisms that will promote elements to find solutions by actively interacting among themselves.

Design and Control of Self-organizing Systems by Carlos Gershenson (2007)

Introduction to Dynamical Systems and Chaos

Complexity Explorer is offering a free online course for Summer 2016

Introduction to Dynamical Systems and Chaos (Summer, 2016)

About the Course:

In this course you’ll gain an introduction to the modern study of dynamical systems, the interdisciplinary field of applied mathematics that studies systems that change over time.

Topics to be covered include: phase space, bifurcations, chaos, the butterfly effect, strange attractors, and pattern formation. The course will focus on some of the realizations from the study of dynamical systems that are of particular relevance to complex systems:

  1. Dynamical systems undergo bifurcations, where a small change in a system parameter such as the temperature or the harvest rate in a fishery leads to a large and qualitative change in the system’s behavior.
  2. Deterministic dynamical systems can behave randomly. This property, known as sensitive dependence or the butterfly effect, places strong limits on our ability to predict some phenomena.
  3. Disordered behavior can be stable. Non-periodic systems with the butterfly effect can have stable average properties. So the average or statistical properties of a system can be predictable, even if its details are not.
  4. Complex behavior can arise from simple rules. Simple dynamical systems do not necessarily lead to simple results. In particular, we will see that simple rules can produce patterns and structures of surprising complexity.
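For those who’d like a quick taste of items 2 and 4 above before the course starts, here is a tiny sketch (my own, in Python; it is not course material) of the logistic map, a one-line deterministic rule that nonetheless exhibits sensitive dependence on initial conditions:

```python
def logistic_map(x0, r=4.0, steps=30):
    """Iterate the logistic map x -> r * x * (1 - x) starting from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two initial conditions differing by one part in a billion.
a = logistic_map(0.200000000)
b = logistic_map(0.200000001)

for step in (0, 10, 20, 30):
    print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f} "
          f"(difference {abs(a[step] - b[step]):.2e})")
```

With r = 4 the gap between the two trajectories roughly doubles each step, so a difference of one part in a billion grows to order one within about thirty iterations, which is exactly the practical limit on prediction that the butterfly effect describes.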

About the Instructor:

David Feldman is Professor of Physics and Mathematics at College of the Atlantic. From 2004-2009 he was a faculty member in the Santa Fe Institute’s Complex Systems Summer School in Beijing, China. He served as the school’s co-director from 2006-2009. Dave is the author of Chaos and Fractals: An Elementary Introduction (Oxford University Press, 2012), a textbook on chaos and fractals for students with a background in high school algebra. Dave was a U.S. Fulbright Lecturer in Rwanda in 2011-12.

Course dates:

5 Jul 2016 9am PDT to
20 Sep 2016 3pm PDT

Prerequisites:

A familiarity with basic high school algebra. There will be optional lessons for those with stronger math backgrounds.

Syllabus

  • Introduction I: Iterated Functions
  • Introduction II: Differential Equations
  • Chaos and the Butterfly Effect
  • Bifurcations: Part I (Differential Equations)
  • Bifurcations: Part II (Logistic Map)
  • Universality
  • Phase Space
  • Strange Attractors
  • Pattern Formation
  • Summary and Conclusions

Source: Complexity Explorer

Calculating the Middle Ages?

Bookmarked Calculating the Middle Ages? The Project "Complexities and Networks in the Medieval Mediterranean and Near East" (COMMED) [1606.03433] (arxiv.org)
The project "Complexities and networks in the Medieval Mediterranean and Near East" (COMMED) at the Division for Byzantine Research of the Institute for Medieval Research (IMAFO) of the Austrian Academy of Sciences focuses on the adaptation and development of concepts and tools of network theory and complexity sciences for the analysis of societies, polities and regions in the medieval world in a comparative perspective. Key elements of its methodological and technological toolkit are applied, for instance, in the new project "Mapping medieval conflicts: a digital approach towards political dynamics in the pre-modern period" (MEDCON), which analyses political networks and conflict among power elites across medieval Europe with five case studies from the 12th to 15th century. For one of these case studies on 14th century Byzantium, the explanatory value of this approach is presented in greater detail. The presented results are integrated in a wider comparison of five late medieval polities across Afro-Eurasia (Byzantium, China, England, Hungary and Mamluk Egypt) against the background of the »Late Medieval Crisis« and its political and environmental turmoil. Finally, further perspectives of COMMED are outlined.

Network and Complexity Theory Applied to History

This interesting paper (abstract above) applies network and complexity science to history and is sure to be of interest to those working at the intersection of these interdisciplinary fields. In particular, I’d be curious to see more work from this area supporting the theses of scholars like Francis Fukuyama on the development of societal structures. Those interested in the emerging area of Big History are sure to enjoy this type of treatment. I also wonder how researchers in economics (like César Hidalgo) might make use of available(?) historical data in related analyses, and whether Dave Harris might consider such an analysis in his ancient Near East work.

Those interested in a synopsis of the paper might find some benefit from an overview from MIT Technology Review: How the New Science of Computational History Is Changing the Study of the Past.

A New Thermodynamics Theory of the Origin of Life | Quanta Magazine

Bookmarked A New Physics Theory of Life by Natalie Wolchover (quantamagazine.org)
Jeremy England, a 31-year-old physicist at MIT, thinks he has found the underlying physics driving the origin and evolution of life.

References:

Hypothesis annotations

[ hypothesis user = 'chrisaldrich' tags = 'EnglandQM']

Introduction to Information Theory | SFI’s Complexity Explorer

The Santa Fe Institute's free online course "Introduction to Information Theory" taught by Seth Lloyd via Complexity Explorer.

Readers often ask me for resources for delving into the basics of information theory. I hadn’t posted it here before, but the Santa Fe Institute recently offered an online course, Introduction to Information Theory, through its Complexity Explorer platform, which has some other excellent offerings as well. The course included videos, fora, and other resources and was taught by the esteemed physicist and professor Seth Lloyd. A number of students are still actively learning and posting there.

Introduction to Information Theory

About the Tutorial:

This tutorial introduces fundamental concepts in information theory. Information theory has made considerable impact in complex systems, and has in part co-evolved with complexity science. Research areas ranging from ecology and biology to aerospace and information technology have all seen benefits from the growth of information theory.

In this tutorial, students will follow the development of information theory from bits to modern application in computing and communication. Along the way Seth Lloyd introduces valuable topics in information theory such as mutual information, boolean logic, channel capacity, and the natural relationship between information and entropy.

Lloyd coherently covers a substantial amount of material while limiting discussion of the mathematics involved. When formulas or derivations are considered, Lloyd describes the mathematics such that less advanced math students will find the tutorial accessible. Prerequisites for this tutorial are an understanding of logarithms, and at least a year of high-school algebra.
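To give a feel for a couple of the quantities Lloyd covers, here is a minimal sketch (my own, in Python; not taken from the tutorial) that computes Shannon entropy and the mutual information between the input and output of a simple noisy channel:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability distribution given as a list."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Joint distribution P(X, Y) for two binary variables:
# a noisy channel in which Y copies X 90% of the time.
joint = {(0, 0): 0.45, (0, 1): 0.05,
         (1, 0): 0.05, (1, 1): 0.45}

p_x = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
p_y = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

h_x, h_y = entropy(p_x), entropy(p_y)
h_xy = entropy(list(joint.values()))

# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y).
print(f"H(X) = {h_x:.3f} bits, H(Y) = {h_y:.3f} bits, H(X,Y) = {h_xy:.3f} bits")
print(f"I(X;Y) = {h_x + h_y - h_xy:.3f} bits")
```

For this 90%-reliable channel the mutual information comes out to about 0.53 bits per use, exactly the kind of number that Shannon’s coding theorem (lecture 8 in the syllabus below) turns into a statement about achievable communication rates.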

About the Instructor(s):

Professor Seth Lloyd is a principal investigator in the Research Laboratory of Electronics (RLE) at the Massachusetts Institute of Technology (MIT). He received his A.B. from Harvard College in 1982, the Certificate of Advanced Study in Mathematics (Part III) and an M. Phil. in Philosophy of Science from Cambridge University in 1983 and 1984 under a Marshall Fellowship, and a Ph.D. in Physics in 1988 from Rockefeller University under the supervision of Professor Heinz Pagels.

From 1988 to 1991, Professor Lloyd was a postdoctoral fellow in the High Energy Physics Department at the California Institute of Technology, where he worked with Professor Murray Gell-Mann on applications of information to quantum-mechanical systems. From 1991 to 1994, he was a postdoctoral fellow at Los Alamos National Laboratory, where he worked at the Center for Nonlinear Systems on quantum computation. In 1994, he joined the faculty of the Department of Mechanical Engineering at MIT. Since 1988, Professor Lloyd has also been an adjunct faculty member at the Santa Fe Institute.

Professor Lloyd has performed seminal work in the fields of quantum computation and quantum communications, including proposing the first technologically feasible design for a quantum computer, demonstrating the viability of quantum analog computation, proving quantum analogs of Shannon’s noisy channel theorem, and designing novel methods for quantum error correction and noise reduction.

Professor Lloyd is a member of the American Physical Society and the American Society of Mechanical Engineers.

Tutorial Team:

Yoav Kallus is an Omidyar Fellow at the Santa Fe Institute. His research at the boundary of statistical physics and geometry looks at how and when simple interactions lead to the formation of complex order in materials and when preferred local order leads to system-wide disorder. Yoav holds a B.Sc. in physics from Rice University and a Ph.D. in physics from Cornell University. Before joining the Santa Fe Institute, Yoav was a postdoctoral fellow at the Princeton Center for Theoretical Science in Princeton University.

How to use Complexity Explorer
Prerequisites: At least one year of high-school algebra

Syllabus

  1. Introduction
  2. Forms of Information
  3. Information and Probability
  4. Fundamental Formula of Information
  5. Computation and Logic: Information Processing
  6. Mutual Information
  7. Communication Capacity
  8. Shannon’s Coding Theorem
  9. The Manifold Things Information Measures
  10. Homework

Devourer of Encyclopedias: Stanislaw Lem’s “Summa Technologiae”

Read Devourer of Encyclopedias: Stanislaw Lem's "Summa Technologiae" (The Los Angeles Review of Books)
David Auerbach’s review of Stanislaw Lem’s Summa Technologiae in the Los Angeles Review of Books.

Summa Technologiae

AT LAST WE have it in English. Summa Technologiae, originally published in Polish in 1964, is the cornerstone of Stanislaw Lem’s oeuvre, his consummate work of speculative nonfiction. Trained in medicine and biology, Lem synthesizes the current science of the day in ways far ahead of most science fiction of the time.

His subjects, among others, include:

  • Virtual reality
  • Artificial intelligence
  • Nanotechnology and biotechnology
  • Evolutionary biology and evolutionary psychology
  • Artificial life
  • Information theory
  • Entropy and thermodynamics
  • Complexity theory, probability, and chaos
  • Population and ecological catastrophe
  • The “singularity” and “transhumanism”

Source: Devourer of Encyclopedias: Stanislaw Lem’s “Summa Technologiae” – The Los Angeles Review of Books

I came across this book review quite serendipitously today via an Auerbach article in Slate, which I’ve bookmarked. I found a copy of the book and have added it to the top of my reading pile. As I’m currently reading an advance reader edition of Sean Carroll’s The Big Picture, I can only imagine how well the two may go together despite being written more than 50 years apart.

Can a Field in Which Physicists Think Like Economists Help Us Achieve Universal Knowledge?

Bookmarked Can a Field in Which Physicists Think Like Economists Help Us Achieve Universal Knowledge? by David Auerbach (Slate Magazine)
The Theory of Everything and Then Some: In complexity theory, physicists try to understand economics while sociologists think like biologists. Can they bring us any closer to universal knowledge?

A discussion of complexity and complexity theorist John H. Miller’s new book: A Crude Look at the Whole: The Science of Complex Systems in Business, Life, and Society.

Global Language Networks

Recent research on global language networks has interesting relations to big history, complexity economics, and current politics.

Yesterday I ran across this nice little video explaining some recent research on global language networks. It’s not only interesting in its own right, but is a fantastic example of science communication as well.

I’m interested in some of the information theoretic aspects of this as well as the relation of this to the area of corpus linguistics. I’m also curious if one could build worthwhile datasets like this for the ancient world (cross reference some of the sources I touch on in relation to the Dickinson College Commentaries within Latin Pedagogy and the Digital Humanities) to see what influences different language cultures have had on each other. Perhaps the historical record could help to validate some of the predictions made in relation to the future?

The paper “Global distribution and drivers of language extinction risk” indicates that of all the variables tested, economic growth was most strongly linked to language loss.

This research also has some interesting relation to the concept of “Collective Learning” within the realm of a Big History framework via David Christian, Fred Spier, et al. I’m curious to revisit my hypothesis that collective learning has potentially been growing at the expense of a shrinking body of diverse languages, an idea informed in part by the work of Jared Diamond.

Some of the discussion in the video reminds me of the work Stuart Kauffman lays out in At Home in the Universe: The Search for the Laws of Self-Organization and Complexity (Oxford, 1995), particularly chapter 3, in which Kauffman discusses the networks of life. The analogy of those networks to the networks of language here suggests to me that some of César Hidalgo’s recent work in Why Information Grows: The Evolution of Order, From Atoms to Economies (MIT Press, 2015) is even more interesting in helping to show the true value of links between people and firms (information sources which he measures as personbytes and firmbytes) within economies.

Finally, I can also only think about how this research may help to temper some of the xenophobic discussion that occurs in American political life with respect to fears relating to Mexican immigration issues as well as the position of China in the world economy.

Those intrigued by the video may find the website set up by the researchers very interesting. It contains links to the full paper as well as visualizations and links to the data used.

Abstract

Languages vary enormously in global importance because of historical, demographic, political, and technological forces. However, beyond simple measures of population and economic power, there has been no rigorous quantitative way to define the global influence of languages. Here we use the structure of the networks connecting multilingual speakers and translated texts, as expressed in book translations, multiple language editions of Wikipedia, and Twitter, to provide a concept of language importance that goes beyond simple economic or demographic measures. We find that the structure of these three global language networks (GLNs) is centered on English as a global hub and around a handful of intermediate hub languages, which include Spanish, German, French, Russian, Portuguese, and Chinese. We validate the measure of a language’s centrality in the three GLNs by showing that it exhibits a strong correlation with two independent measures of the number of famous people born in the countries associated with that language. These results suggest that the position of a language in the GLN contributes to the visibility of its speakers and the global popularity of the cultural content they produce.

Citation: Ronen S, Goncalves B, Hu KZ, Vespignani A, Pinker S, Hidalgo CA. “Links that speak: the global language network and its association with global fame.” Proceedings of the National Academy of Sciences (PNAS) (2014). doi:10.1073/pnas.1410931111
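For anyone who wants to play with the general idea rather than the researchers’ actual data (which is linked from their site), here is a toy sketch (my own, in Python, assuming the networkx library) that builds a small, entirely invented weighted language network and ranks languages by eigenvector centrality, a measure in the same spirit as the centrality the paper uses:

```python
import networkx as nx

# A toy language network: an edge (A, B, w) stands for w connections between
# languages A and B (e.g., books translated between them). The languages and
# weights here are invented for illustration and are not the paper's data.
edges = [
    ("English", "Spanish", 10), ("English", "French", 9),
    ("English", "German", 8),   ("English", "Russian", 6),
    ("English", "Dutch", 5),    ("English", "Chinese", 5),
    ("Spanish", "Portuguese", 4), ("Dutch", "German", 3),
]

G = nx.Graph()
G.add_weighted_edges_from(edges)

# Eigenvector centrality: a language scores highly when it is connected to
# other highly connected languages, not merely to many languages.
centrality = nx.eigenvector_centrality(G, weight="weight", max_iter=1000)

for lang, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{lang:>10}: {score:.3f}")
```

In a toy network like this English unsurprisingly comes out on top because everything routes through it, which is the qualitative pattern the paper reports for the real book-translation, Wikipedia, and Twitter networks.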


“A language like Dutch — spoken by 27 million people — can be a disproportionately large conduit, compared with a language like Arabic, which has a whopping 530 million native and second-language speakers,” Science reports. “This is because the Dutch are very multilingual and very online.”

Forthcoming ITBio-related book from Sean Carroll: “The Big Picture: On the Origins of Life, Meaning, and the Universe Itself”

Physicist Sean Carroll has a forthcoming book entitled The Big Picture: On the Origins of Life, Meaning, and the Universe Itself (Dutton, May 10, 2016) that will be of interest to many of our readers.

In catching up on blogs/reading from the holidays, I’ve noticed that physicist Sean Carroll has a forthcoming book entitled The Big Picture: On the Origins of Life, Meaning, and the Universe Itself (Dutton, May 10, 2016) that will be of interest to many of our readers. One can already pre-order the book via Amazon.

Prior to the holidays Sean wrote a blogpost that contains a full overview table of contents, which will give everyone a stronger idea of its contents. For convenience I’ll excerpt it below.

I’ll post a review as soon as a copy arrives, but it looks like a strong new entry in the category of popular science books on information theory, biology and complexity as well as potentially the areas of evolution, the origin of life, and physics in general.

As a side bonus, for those reading this today (1/15/16), I’ll note that Carroll’s 12-part lecture series from The Great Courses, The Higgs Boson and Beyond (The Learning Company, February 2015), is 80% off.

The Big Picture

 

THE BIG PICTURE: ON THE ORIGINS OF LIFE, MEANING, AND THE UNIVERSE ITSELF

0. Prologue

* Part One: Cosmos

  • 1. The Fundamental Nature of Reality
  • 2. Poetic Naturalism
  • 3. The World Moves By Itself
  • 4. What Determines What Will Happen Next?
  • 5. Reasons Why
  • 6. Our Universe
  • 7. Time’s Arrow
  • 8. Memories and Causes

* Part Two: Understanding

  • 9. Learning About the World
  • 10. Updating Our Knowledge
  • 11. Is It Okay to Doubt Everything?
  • 12. Reality Emerges
  • 13. What Exists, and What Is Illusion?
  • 14. Planets of Belief
  • 15. Accepting Uncertainty
  • 16. What Can We Know About the Universe Without Looking at It?
  • 17. Who Am I?
  • 18. Abducting God

* Part Three: Essence

  • 19. How Much We Know
  • 20. The Quantum Realm
  • 21. Interpreting Quantum Mechanics
  • 22. The Core Theory
  • 23. The Stuff of Which We Are Made
  • 24. The Effective Theory of the Everyday World
  • 25. Why Does the Universe Exist?
  • 26. Body and Soul
  • 27. Death Is the End

* Part Four: Complexity

  • 28. The Universe in a Cup of Coffee
  • 29. Light and Life
  • 30. Funneling Energy
  • 31. Spontaneous Organization
  • 32. The Origin and Purpose of Life
  • 33. Evolution’s Bootstraps
  • 34. Searching Through the Landscape
  • 35. Emergent Purpose
  • 36. Are We the Point?

* Part Five: Thinking

  • 37. Crawling Into Consciousness
  • 38. The Babbling Brain
  • 39. What Thinks?
  • 40. The Hard Problem
  • 41. Zombies and Stories
  • 42. Are Photons Conscious?
  • 43. What Acts on What?
  • 44. Freedom to Choose

* Part Six: Caring

  • 45. Three Billion Heartbeats
  • 46. What Is and What Ought to Be
  • 47. Rules and Consequences
  • 48. Constructing Goodness
  • 49. Listening to the World
  • 50. Existential Therapy
  • Appendix: The Equation Underlying You and Me
  • Acknowledgments
  • Further Reading
  • References
  • Index

Source: Sean Carroll | The Big Picture: Table of Contents

The HumanCurrent Podcast on iTunes

Subscribing to The HumanCurrent podcast (@LetsWorkHappy) by Angie Cross and @Gabbleduck. https://itunes.apple.com/us/podcast/the-humancurrent/id1003870102 #complexity

Can computers help us read the mind of nature? by Paul Davies | The Guardian

Paul Davies waxes poetic about the application of physics, chemistry, and information theory to biology, genetics, and the origin of life.

For too long, scientists focused on what we can see. Now they are at last starting to decode life’s software.

“A soup of chemicals may spontaneously form a reaction network, but what does it take for such a molecular muddle to begin coherently organising information flow and storage? Rather than looking to biology or chemistry, we can perhaps dream that advances in the mathematics of information theory hold the key.”

Paul Davies, physicist, writer, and broadcaster
in Can computers help us read the mind of nature? in The Guardian

 

‘When we look at a plant or an animal we see the physical forms, not the swirling patterns of instructions inside them.’ Photograph: Abir Sultan/EPA