One of the 20th century’s leading mathematical theorists, he revealed a connection between math and physics not seen since the 17th century.
Quantum theory provides an extremely accurate description of fundamental processes in physics. It thus seems likely that the theory is applicable beyond the mostly microscopic domain in which it has been tested experimentally. Here, we propose a Gedankenexperiment to investigate the question of whether quantum theory can, in principle, have universal validity. The idea is that, if the answer were yes, it must be possible to employ quantum theory to model complex systems that include agents who are themselves using quantum theory. Analysing the experiment under this presumption, we find that one agent, upon observing a particular measurement outcome, must conclude that another agent has predicted the opposite outcome with certainty. The agents’ conclusions, although all derived within quantum theory, are thus inconsistent. This indicates that quantum theory cannot be extrapolated to complex systems, at least not in a straightforward manner.
Published version of the arXiv paper at https://arxiv.org/abs/1604.07422v1
According to quantum theory, a measurement may have multiple possible outcomes. Single-world interpretations assert that, nevertheless, only one of them "really" occurs. Here we propose a gedankenexperiment where quantum theory is applied to model an experimenter who herself uses quantum theory. We find that, in such a scenario, no single-world interpretation can be logically consistent. This conclusion extends to deterministic hidden-variable theories, such as Bohmian mechanics, for they impose a single-world interpretation.
A thought experiment has shaken up the world of quantum foundations, forcing physicists to clarify how various quantum interpretations (such as many-worlds and the Copenhagen interpretation) abandon seemingly sensible assumptions about reality.
Mark Newman is a British physicist and Anatol Rapoport Distinguished University Professor of Physics at the University of Michigan, as well as an external faculty member of the Santa Fe Institute. He is known for his fundamental contributions to the fields of complex networks and complex systems, for which he was awarded the 2014 Lagrange Prize.
The women who have accused the famed science educator of sexual impropriety have made claims not just about traumatized minds, but also about traumatized careers.
In this episode, Haley interviews Stephen Wolfram at the Ninth International Conference on Complex Systems. Wolfram is the creator of Mathematica, Wolfram|Alpha and the Wolfram Language; the author of A New Kind of Science; and the founder and CEO of Wolfram Research. Wolfram talks with Haley about his professional journey and reflects on almost four decades of history, from his first introduction to the field of complexity science to the 30th anniversary of Mathematica. He shares his hopes for the evolution of complexity science as a foundational field of study. He also gives advice for complexity researchers, recommending they focus on asking simple, foundational questions.
Statistical physics is the natural framework to model complex networks. In the last twenty years, it has brought novel physical insights into a variety of emergent phenomena, such as self-organisation, scale invariance, mixed distributions and ensemble non-equivalence, which cannot be deduced from the behaviour of the individual constituents. At the same time, thanks to its deep connection with information theory, statistical physics and the principle of maximum entropy have led to the definition of null models reproducing some features of empirical networks, but otherwise as random as possible. We review here the statistical physics approach for complex networks and the null models for the various physical problems, focusing in particular on the analytic frameworks reproducing the local features of the network. We show how these models have been used to detect statistically significant and predictive structural patterns in real-world networks, as well as to reconstruct the network structure in the case of incomplete information. We further survey the statistical physics frameworks that reproduce more complex, semi-local network features using Markov chain Monte Carlo sampling, and the models of generalised network structures such as multiplex networks, interacting networks and simplicial complexes.
Comments: To appear in Nature Reviews Physics. The revised accepted version will be posted 6 months after publication
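The "analytic frameworks reproducing the local features of the network" mentioned in the abstract are maximum-entropy ensembles constrained on node degrees. As a rough illustration (not code from the review), here is a pure-Python sketch of the undirected binary configuration model, where link probabilities take the form p_ij = x_i x_j / (1 + x_i x_j) and the fitness values x_i are found by a simple fixed-point iteration; the solver details and the example degree sequence are my own illustrative choices:

```python
def ubcm_fit(degrees, n_iter=2000):
    """Fit the undirected binary configuration model: the maximum-entropy
    ensemble whose *expected* degrees match the observed ones.
    Link probabilities are p_ij = x_i x_j / (1 + x_i x_j), so each x_i must
    satisfy k_i = sum_{j != i} x_i x_j / (1 + x_i x_j).  We rearrange this as
    x_i = k_i / sum_{j != i} x_j / (1 + x_i x_j) and iterate to a fixed point.
    """
    n = len(degrees)
    x = [k / n ** 0.5 for k in degrees]  # rough initial guess
    for _ in range(n_iter):
        x = [degrees[i] / sum(x[j] / (1 + x[i] * x[j])
                              for j in range(n) if j != i)
             for i in range(n)]
    return x


# Hypothetical degree sequence (a 5-node path graph: degrees 1,2,2,2,1).
degrees = [1, 2, 2, 2, 1]
x = ubcm_fit(degrees)

# Expected degrees under the fitted null model should reproduce the inputs.
expected = [sum(x[i] * x[j] / (1 + x[i] * x[j])
                for j in range(len(x)) if j != i)
            for i in range(len(x))]
```

The fixed point exists only when the degree sequence lies strictly inside the realisable region (e.g. no node with degree 0 or n-1); production implementations use a proper maximum-likelihood solver rather than this naive iteration.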
Sir Michael Atiyah, one of the world’s greatest living mathematicians, has proposed a derivation of α, the fine-structure constant of quantum electrodynamics. A preprint is here. The math her…
Say you made a Nobel-worthy scientific discovery and the prize went to your thesis supervisor instead. How would you take it?
Complex networks describe a wide range of systems in nature and society. Frequently cited examples include the cell, a network of chemicals linked by chemical reactions, and the Internet, a network of routers and computers connected by physical links. While traditionally these systems have been modeled as random graphs, it is increasingly recognized that the topology and evolution of real networks are governed by robust organizing principles. This article reviews the recent advances in the field of complex networks, focusing on the statistical mechanics of network topology and dynamics. After reviewing the empirical data that motivated the recent interest in networks, the authors discuss the main models and analytical tools, covering random graphs, small-world and scale-free networks, the emerging theory of evolving networks, and the interplay between topology and the network's robustness against failures and attacks.
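The "emerging theory of evolving networks" reviewed here is built around preferential attachment, which produces the scale-free degree distributions the article discusses. A minimal pure-Python sketch of that growth rule follows; the parameters (1000 nodes, m = 2 links per new node, a seeded complete core) are illustrative choices, not values from the article:

```python
import random

def preferential_attachment(n, m, seed=0):
    """Grow a network of n nodes: each new node attaches m links to existing
    nodes chosen with probability proportional to their current degree."""
    rng = random.Random(seed)
    # Start from a small complete core of m+1 nodes.
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # List with each node repeated once per incident edge: sampling uniformly
    # from it is equivalent to degree-proportional sampling.
    targets = [v for e in edges for v in e]
    for new in range(m + 1, n):
        chosen = set()
        while len(chosen) < m:          # m distinct targets, no multi-edges
            chosen.add(rng.choice(targets))
        for t in chosen:
            edges.append((new, t))
            targets.extend([new, t])
    return edges


edges = preferential_attachment(1000, 2)
deg = {}
for i, j in edges:
    deg[i] = deg.get(i, 0) + 1
    deg[j] = deg.get(j, 0) + 1
```

With these parameters the mean degree is about 4, yet a few early "hub" nodes accumulate far more links, the heavy-tailed signature that distinguishes scale-free networks from the classical random graphs also reviewed in the article.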
h/t Disconnected, fragmented, or united? a trans-disciplinary review of network science by César A. Hidalgo (Applied Network Science | SpringerLink)
Access our free college textbooks and low-cost learning materials.
After spending billions trying (and failing) to support beautiful ideas in physics, is it time to let evidence lead the way?
Andrew Jordan reviews Peter Woit's Quantum Theory, Groups and Representations and finds much to admire.
For the tourists, I’ve noted before that Peter maintains a free copy of his new textbook on his website.
I also don’t think I’ve ever come across the journal Inference before, but it looks quite nice in terms of content and editorial.
Sabine Hossenfelder’s new book Lost in Math should be starting to appear in bookstores around now. It’s very good and you should get a copy. I hope that the book will receive a lot of attention, but suspect that much of this will focus on an oversimplified version of the book’s argument, ignoring some of the more interesting material that she has put together. Hossenfelder’s main concern is the difficult current state of theoretical fundamental physics, sometimes referred to as a “crisis” or “nightmare scenario”. She is writing at what is likely to be a decisive moment for the subject: the negative LHC results for popular speculative models are now in. What effect will these have on those who have devoted decades to studying such models?
I love that he calls out the review in Science.