Sir Roger Penrose has had a remarkable life. He has contributed an enormous amount to our understanding of general relativity, perhaps more than anyone since Einstein himself — Penrose diagrams, singularity theorems, the Penrose process, cosmic censorship, and the list goes on. He has made important contributions to mathematics, including such fun ideas as the Penrose triangle and aperiodic tilings. He has also made bold conjectures in the notoriously contentious areas of quantum mechanics and the study of consciousness. In his spare time he’s managed to become an extremely successful author, writing such books as The Emperor’s New Mind and The Road to Reality. With far too much that we could have talked about, we decided to concentrate in this discussion on spacetime, black holes, and cosmology, but we made sure to reserve some time to dig into quantum mechanics and the brain by the end.
Maxwell’s Demon is a famous thought experiment in which a mischievous imp uses knowledge of the velocities of gas molecules in a box to decrease the entropy of the gas, which could then be used to do useful work such as pushing a piston. This is a classic example of converting information (what the gas molecules are doing) into work. But of course that kind of phenomenon is much more widespread — it happens any time a company or organization hires someone in order to take advantage of their know-how. César Hidalgo has become an expert in this relationship between information and work, both at the level of physics and how it bubbles up into economies and societies. Looking at the world through the lens of information brings new insights into how we learn things, how economies are structured, and how novel uses of data will transform how we live.
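The thermodynamic catch for Maxwell's Demon is Landauer's principle: the demon must eventually erase its record of molecular velocities, and erasing one bit at temperature T dissipates at least k_B·T·ln(2) of heat, which pays back the entropy it extracted. A minimal sketch of that accounting (the constant and formula are standard; the 300 K room-temperature figure is just an illustrative choice):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_limit(bits, temperature_kelvin):
    """Minimum heat (in joules) dissipated when erasing `bits` of
    memory at the given temperature: bits * k_B * T * ln(2)."""
    return bits * K_B * temperature_kelvin * math.log(2)

# Erasing the demon's one-bit record at room temperature (300 K)
# costs at least ~2.87e-21 J of dissipated heat.
q_min = landauer_limit(1, 300.0)
```

Tiny as that number is, it is what keeps the demon from being a perpetual-motion machine: over many cycles, the cost of resetting memory matches or exceeds the work extracted from the sorted gas.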
César Hidalgo received his Ph.D. in physics from the University of Notre Dame. He currently holds an ANITI Chair at the University of Toulouse, an Honorary Professorship at the University of Manchester, and a Visiting Professorship at Harvard’s School of Engineering and Applied Sciences. From 2010 to 2019, he led MIT’s Collective Learning group. He is the author of Why Information Grows and co-author of The Atlas of Economic Complexity. He is a co-founder of Datawheel, a data visualization company whose products include the Observatory of Economic Complexity.
I was also piqued by the mention of Lynne Kelly’s work, which I’m now knee-deep in. I suspect it could dramatically expand what we think of as the capacity of a personbyte, though the limit on knowledge there still exists. The idea of mnemotechniques within indigenous cultures certainly broadens our picture of how knowledge worked in prehistory and of what we classically think of and frame as collective knowledge or collective learning.
I also think there are some interesting connections between Dr. Kelly’s mentions of social equity in prehistoric cultures and the work that Hidalgo mentions in the middle of the episode.
There’s a small handful of references I’ll want to delve into after hearing this, though it may take time to track them down unless they’re linked in the show notes.
hat-tip: Complexity Digest for the reminder that this is in my podcatcher. 🔖 November 22, 2019 at 03:28PM
Hard as he tried, Murray Gell-Mann could never make himself into a legend like his rakish colleague and collaborator Richard Feynman, even if he was probably the greater physicist.
Self-replication is a capacity common to every species of living thing, and simple physical intuition dictates that such a process must invariably be fueled by the production of entropy. Here, we undertake to make this intuition rigorous and quantitative by deriving a lower bound for the amount of heat that is produced during a process of self-replication in a system coupled to a thermal bath. We find that the minimum value for the physically allowed rate of heat production is determined by the growth rate, internal entropy, and durability of the replicator, and we discuss the implications of this finding for bacterial cell division, as well as for the pre-biotic emergence of self-replicating nucleic acids.
From Reese Witherspoon to Jeff Bezos, those who are super successful in their chosen domain accept this truth.
One of the 20th century’s leading mathematical theorists, he revealed a connection between math and physics not seen since the 17th century.
Quantum theory provides an extremely accurate description of fundamental processes in physics. It thus seems likely that the theory is applicable beyond the mostly microscopic domain in which it has been tested experimentally. Here, we propose a Gedankenexperiment to investigate the question whether quantum theory can, in principle, have universal validity. The idea is that, if the answer were yes, it must be possible to employ quantum theory to model complex systems that include agents who are themselves using quantum theory. Analysing the experiment under this presumption, we find that one agent, upon observing a particular measurement outcome, must conclude that another agent has predicted the opposite outcome with certainty. The agents’ conclusions, although all derived within quantum theory, are thus inconsistent. This indicates that quantum theory cannot be extrapolated to complex systems, at least not in a straightforward manner.
According to quantum theory, a measurement may have multiple possible outcomes. Single-world interpretations assert that, nevertheless, only one of them "really" occurs. Here we propose a gedankenexperiment where quantum theory is applied to model an experimenter who herself uses quantum theory. We find that, in such a scenario, no single-world interpretation can be logically consistent. This conclusion extends to deterministic hidden-variable theories, such as Bohmian mechanics, for they impose a single-world interpretation.
A thought experiment has shaken up the world of quantum foundations, forcing physicists to clarify how various quantum interpretations (such as many-worlds and the Copenhagen interpretation) abandon seemingly sensible assumptions about reality.
Mark Newman is a British physicist and Anatol Rapoport Distinguished University Professor of Physics at the University of Michigan, as well as an external faculty member of the Santa Fe Institute. He is known for his fundamental contributions to the fields of complex networks and complex systems, for which he was awarded the 2014 Lagrange Prize.
The women who have accused the famed science educator of sexual impropriety have made claims not just about traumatized minds, but also about traumatized careers.
In this episode, Haley interviews Stephen Wolfram at the Ninth International Conference on Complex Systems. Wolfram is the creator of Mathematica, Wolfram|Alpha and the Wolfram Language; the author of A New Kind of Science; and the founder and CEO of Wolfram Research. Wolfram talks with Haley about his professional journey and reflects on almost four decades of history, from his first introduction to the field of complexity science to the 30th anniversary of Mathematica. He shares his hopes for the evolution of complexity science as a foundational field of study. He also gives advice for complexity researchers, recommending they focus on asking simple, foundational questions.
Statistical physics is the natural framework to model complex networks. In the last twenty years, it has brought novel physical insights on a variety of emergent phenomena, such as self-organisation, scale invariance, mixed distributions and ensemble non-equivalence, which cannot be deduced from the behaviour of the individual constituents. At the same time, thanks to its deep connection with information theory, statistical physics and the principle of maximum entropy have led to the definition of null models reproducing some features of empirical networks, but otherwise as random as possible. We review here the statistical physics approach for complex networks and the null models for the various physical problems, focusing in particular on the analytic frameworks reproducing the local features of the network. We show how these models have been used to detect statistically significant and predictive structural patterns in real-world networks, as well as to reconstruct the network structure in case of incomplete information. We further survey the statistical physics frameworks that reproduce more complex, semi-local network features using Markov chain Monte Carlo sampling, and the models of generalised network structures such as multiplex networks, interacting networks and simplicial complexes.
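The simplest maximum-entropy null model the abstract alludes to is the Erdős–Rényi ensemble G(n, p): among all graph ensembles with a given expected number of edges, it is the maximally random one. A toy sketch of the null-model workflow (the observed network statistics below are invented for illustration, not taken from the review):

```python
import random
from itertools import combinations

def fit_er_null(n, m):
    """Maximum-entropy ensemble constrained only on the expected edge
    count: G(n, p) with p chosen to match the observed m edges."""
    return 2 * m / (n * (n - 1))

def sample_er(n, p, rng):
    """Draw one graph from G(n, p), returned as a set of edges."""
    return {e for e in combinations(range(n), 2) if rng.random() < p}

def triangles(edges):
    """Count triangles in an edge set."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return sum(1 for u, v, w in combinations(sorted(adj), 3)
               if v in adj[u] and w in adj[u] and w in adj[v])

# Hypothetical observed network: 20 nodes, 60 edges, 25 triangles.
n, m_obs, tri_obs = 20, 60, 25
p = fit_er_null(n, m_obs)

rng = random.Random(42)
null_tris = [triangles(sample_er(n, p, rng)) for _ in range(200)]

# If the observed triangle count sits far in the tail of the null
# distribution, clustering is significant relative to the null model.
p_value = sum(t >= tri_obs for t in null_tris) / len(null_tris)
```

The review's more refined null models (e.g. the configuration model) additionally constrain each node's expected degree, but the logic is the same: fix some local features, maximize entropy otherwise, and flag observed patterns the ensemble cannot reproduce.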
Comments: To appear in Nature Reviews Physics. The revised accepted version will be posted 6 months after publication.
Sir Michael Atiyah, one of the world’s greatest living mathematicians, has proposed a derivation of α, the fine-structure constant of quantum electrodynamics. A preprint is here. The math her…
Say you made a Nobel-worthy scientific discovery and the prize went to your thesis supervisor instead. How would you take it?