🔖 Statistical Mechanics, Spring 2016 (Caltech, Physics 12c with videos) by John Preskill

Statistical Mechanics, Spring 2016 (Physics 12c) by John Preskill (Caltech)
An introductory course in statistical mechanics.

Recommended textbook: Thermal Physics by Charles Kittel and Herbert Kroemer.

There’s also a corresponding video lecture series available on YouTube.

Syndicated copies to:

👓 The Quantum Thermodynamics Revolution | Quanta Magazine

The Quantum Thermodynamics Revolution by Natalie Wolchover (Quanta Magazine)
As physicists extend the 19th-century laws of thermodynamics to the quantum realm, they’re rewriting the relationships among energy, entropy and information.

🔖 Proceedings of the Artificial Life Conference 2016

Proceedings of the Artificial Life Conference 2016 by Carlos Gershenson, Tom Froese, Jesus M. Siqueiros, Wendy Aguilar, Eduardo J. Izquierdo and Hiroki Sayama (The MIT Press)
The ALife conferences have been the major meeting of the artificial life research community since 1987. For its 15th edition in 2016, the conference was held in Latin America for the first time, in the Mayan Riviera, Mexico, from July 4–8. The special theme of the conference: How can the synthetic study of living systems contribute to societies: scientifically, technically, and culturally? The goal of the conference theme is to better understand societies with the purpose of using this understanding for a more efficient management and development of social systems.

Free download available.



🔖 From Matter to Life: Information and Causality by Sara Imari Walker, Paul C. W. Davies, George F. R. Ellis

From Matter to Life: Information and Causality by Sara Imari Walker, Paul C. W. Davies, George F. R. Ellis (Cambridge University Press)
Recent advances suggest that the concept of information might hold the key to unravelling the mystery of life's nature and origin. Fresh insights from a broad and authoritative range of articulate and respected experts focus on the transition from matter to life, and hence reconcile the deep conceptual schism between the way we describe physical and biological systems. A unique cross-disciplinary perspective, drawing on expertise from philosophy, biology, chemistry, physics, and cognitive and social sciences, provides a new way to look at the deepest questions of our existence. This book addresses the role of information in life, and how it can make a difference to what we know about the world. Students, researchers, and all those interested in what life is and how it began will gain insights into the nature of life and its origins that touch on nearly every domain of science. Hardcover: 514 pages. ISBN-10: 1107150531; ISBN-13: 978-1107150539.

🔖 An Introduction to Transfer Entropy: Information Flow in Complex Systems

An Introduction to Transfer Entropy: Information Flow in Complex Systems by Terry Bossomaier, Lionel Barnett, Michael Harré, Joseph T. Lizier (Springer; 1st ed. 2016 edition)
This book considers a relatively new metric in complex systems, transfer entropy, derived from a series of measurements, usually a time series. After a qualitative introduction and a chapter that explains the key ideas from statistics required to understand the text, the authors then present information theory and transfer entropy in depth. A key feature of the approach is the authors' work to show the relationship between information flow and complexity. The later chapters demonstrate information transfer in canonical systems, and applications, for example in neuroscience and in finance. The book will be of value to advanced undergraduate and graduate students and researchers in the areas of computer science, neuroscience, physics, and engineering. ISBN: 978-3-319-43221-2 (Print), 978-3-319-43222-9 (Online)

Want to read; h/t to Joseph Lizier.
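For orientation (my sketch, not the book’s code): transfer entropy measures how much the past of one series X reduces uncertainty about the next value of another series Y, beyond what Y’s own past already provides. A minimal plug-in estimator for binary series with history length 1:

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    """Plug-in estimate of T(X -> Y) in bits, history length 1:
    sum over p(y1, y0, x0) * log2[ p(y1 | y0, x0) / p(y1 | y0) ]."""
    triples = Counter()   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter()  # (y_t, x_t)
    pairs_yy = Counter()  # (y_{t+1}, y_t)
    singles = Counter()   # (y_t,)
    n = len(x) - 1
    for t in range(n):
        triples[(y[t + 1], y[t], x[t])] += 1
        pairs_yx[(y[t], x[t])] += 1
        pairs_yy[(y[t + 1], y[t])] += 1
        singles[y[t]] += 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]       # p(y1 | y0, x0)
        p_cond_y = pairs_yy[(y1, y0)] / singles[y0]  # p(y1 | y0)
        te += p_joint * log2(p_cond_full / p_cond_y)
    return te

random.seed(0)
x = [random.randint(0, 1) for _ in range(10000)]
y = [0] + x[:-1]   # y simply copies x with a one-step lag
print(round(transfer_entropy(x, y), 2))  # ≈ 1.0: x fully drives y
```

For real data one would use a bias-corrected estimator (e.g. from Lizier’s JIDT toolkit), but the plug-in form above shows the definition directly.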


Repost of John Carlos Baez’ Biology as Information Dynamics

Biology as Information Dynamics by John Carlos Baez (Google+)
I'm giving a talk at the Stanford Complexity Group this Thursday afternoon, April 20th. If you're around - like in Silicon Valley - please drop by! It will be in Clark S361 at 4 pm.

Here's the idea. Everyone likes to say that biology is all about information. There's something true about this - just think about DNA. But what does this insight actually do for us? To figure it out, we need to do some work.

Biology is also about things that can make copies of themselves. So it makes sense to figure out how information theory is connected to the 'replicator equation' — a simple model of population dynamics for self-replicating entities. To see the connection, we need to use relative information: the information of one probability distribution relative to another, also known as the Kullback–Leibler divergence. Then everything pops into sharp focus.

It turns out that free energy — energy in forms that can actually be used, not just waste heat — is a special case of relative information. Since the decrease of free energy is what drives chemical reactions, biochemistry is founded on relative information. But there's a lot more to it than this! Using relative information we can also see evolution as a learning process, fix the problems with Fisher's fundamental theorem of natural selection, and more.

So this is what I'll talk about! You can see slides of an old version here: http://math.ucr.edu/home/baez/bio_asu/ but my Stanford talk will be videotaped and it'll eventually be here: https://www.youtube.com/user/StanfordComplexity You can already see lots of cool talks at this location! #biology

Wondering if there’s a way I can manufacture a reason to head to Northern California this week…
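The “relative information” Baez refers to is the Kullback–Leibler divergence. A minimal sketch (the example distributions are mine, not from the talk): the relative information of a post-selection population distribution with respect to the initial one quantifies, in bits, how far selection has moved the population.

```python
from math import log2

def kl_divergence(p, q):
    """Relative information D(p || q) = sum_i p_i * log2(p_i / q_i), in bits."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Replicator-style toy example: the population shifts toward the fitter type.
p = [0.9, 0.1]   # distribution after selection
q = [0.5, 0.5]   # initial distribution
print(round(kl_divergence(p, q), 3))  # → 0.531
```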


👓 A Conversation with @LPachter (BS ’94) | Caltech

A Conversation with Lior Pachter (BS '94) (The California Institute of Technology)
Pachter, a computational biologist, returns to Caltech to study the role and function of RNA.

Pachter, a computational biologist and Caltech alumnus, returns to the Institute to study the role and function of RNA.

Lior Pachter (BS ’94) is Caltech’s new Bren Professor of Computational Biology. Recently, he was elected a fellow of the International Society for Computational Biology, one of the highest honors in the field. We sat down with him to discuss the emerging field of applying computational methods to biology problems, the transition from mathematics to biology, and his return to Pasadena.


🔖 The Epidemic Spreading Model and the Direction of Information Flow in Brain Networks

The Epidemic Spreading Model and the Direction of Information Flow in Brain Networks by J. Meier, X. Zhou, A. Hillebrand, P. Tewarie, C.J. Stam, P. Van Mieghem (NeuroImage, February 5, 2017)
The interplay between structural connections and emerging information flow in the human brain remains an open research problem. A recent study observed global patterns of directional information flow in empirical data using the measure of transfer entropy. For higher frequency bands, the overall direction of information flow was from posterior to anterior regions whereas an anterior-to-posterior pattern was observed in lower frequency bands. In this study, we applied a simple Susceptible-Infected-Susceptible (SIS) epidemic spreading model on the human connectome with the aim to reveal the topological properties of the structural network that give rise to these global patterns. We found that direct structural connections induced higher transfer entropy between two brain regions and that transfer entropy decreased with increasing distance between nodes (in terms of hops in the structural network). Applying the SIS model, we were able to confirm the empirically observed opposite information flow patterns and posterior hubs in the structural network seem to play a dominant role in the network dynamics. For small time scales, when these hubs acted as strong receivers of information, the global pattern of information flow was in the posterior-to-anterior direction and in the opposite direction when they were strong senders. Our analysis suggests that these global patterns of directional information flow are the result of an unequal spatial distribution of the structural degree between posterior and anterior regions and their directions seem to be linked to different time scales of the spreading process.
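As a rough illustration of the kind of dynamics the abstract describes (not the authors’ model or parameters; the toy graph is mine), a synchronous Susceptible-Infected-Susceptible update on an adjacency list, with a hub acting as a strong sender:

```python
import random

def sis_step(adj, infected, beta, delta, rng):
    """One synchronous SIS update on a graph given as an adjacency list.
    beta: per-neighbour infection probability; delta: recovery probability."""
    nxt = set()
    for node in adj:
        if node in infected:
            if rng.random() > delta:      # fails to recover, stays infected
                nxt.add(node)
        else:
            # independent infection attempts from each infected neighbour
            for nb in adj[node]:
                if nb in infected and rng.random() < beta:
                    nxt.add(node)
                    break
    return nxt

# Toy "connectome": a hub (node 0) wired to four peripheral nodes.
adj = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
rng = random.Random(42)
infected = {0}                # seed the epidemic at the hub
for _ in range(50):
    infected = sis_step(adj, infected, beta=0.8, delta=0.1, rng=rng)
print(infected.issubset(set(adj)))  # True: the state stays on the graph
```

Running such dynamics on the empirical structural network, and measuring transfer entropy between node time series, is the spirit of the paper’s analysis.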

IPAM Workshop on Regulatory and Epigenetic Stochasticity in Development and Disease, March 1-3

IPAM Workshop on Regulatory and Epigenetic Stochasticity in Development and Disease (Institute for Pure and Applied Mathematics, UCLA | March 1-3, 2017)
Epigenetics refers to information transmitted during cell division other than the DNA sequence per se, and it is the language that distinguishes stem cells from somatic cells, one organ from another, and even identical twins from each other. In contrast to the DNA sequence, the epigenome is relatively susceptible to modification by the environment as well as stochastic perturbations over time, adding to phenotypic diversity in the population. Despite its strong ties to the environment, epigenetics has never been well reconciled to evolutionary thinking, and in fact there is now strong evidence against the transmission of so-called “epi-alleles,” i.e. epigenetic modifications that pass through the germline.

However, genetic variants that regulate stochastic fluctuation of gene expression and phenotypes in the offspring appear to be transmitted as an epigenetic or even Lamarckian trait. Furthermore, even the normal process of cellular differentiation from a single cell to a complex organism is not understood well from a mathematical point of view. There is increasingly strong evidence that stem cells are highly heterogeneous and in fact stochasticity is necessary for pluripotency. This process appears to be tightly regulated through the epigenome in development. Moreover, in these biological contexts, “stochasticity” is hardly synonymous with “noise”, which often refers to variation which obscures a “true signal” (e.g., measurement error) or which is structural, as in physics (e.g., quantum noise). In contrast, “stochastic regulation” refers to purposeful, programmed variation; the fluctuations are random but there is no true signal to mask.

This workshop will serve as a forum for scientists and engineers with an interest in computational biology to explore the role of stochasticity in regulation, development and evolution, and its epigenetic basis. Just as thinking about stochasticity was transformative in physics and in some areas of biology, it promises to fundamentally transform modern genetics and help to explain phase transitions such as differentiation and cancer.

This workshop will include a poster session; a request for poster titles will be sent to registered participants in advance of the workshop.

Speaker List:
Adam Arkin (Lawrence Berkeley Laboratory)
Gábor Balázsi (SUNY Stony Brook)
Domitilla Del Vecchio (Massachusetts Institute of Technology)
Michael Elowitz (California Institute of Technology)
Andrew Feinberg (Johns Hopkins University)
Don Geman (Johns Hopkins University)
Anita Göndör (Karolinska Institutet)
John Goutsias (Johns Hopkins University)
Garrett Jenkinson (Johns Hopkins University)
Andre Levchenko (Yale University)
Olgica Milenkovic (University of Illinois)
Johan Paulsson (Harvard University)
Leor Weinberger (University of California, San Francisco (UCSF))


Entropy | Special Issue: Maximum Entropy and Bayesian Methods

Entropy | Special Issue: Maximum Entropy and Bayesian Methods (mdpi.com)
Open for submission now
Deadline for manuscript submissions: 31 August 2017
A special issue of Entropy (ISSN 1099-4300).


Special Issue Editor

Guest Editor: Dr. Brendon J. Brewer
Department of Statistics, The University of Auckland, Private Bag 92019, Auckland 1142, New Zealand
Phone: +64 27 500 1336
Interests: Bayesian inference, Markov chain Monte Carlo, nested sampling, MaxEnt

Special Issue Information

Dear Colleagues,

Whereas Bayesian inference has now achieved mainstream acceptance and is widely used throughout the sciences, associated ideas such as the principle of maximum entropy (implicit in the work of Gibbs, and developed further by Ed Jaynes and others) have not. There are strong arguments that the principle (and variations, such as maximum relative entropy) is of fundamental importance, but the literature also contains many misguided attempts at applying it, leading to much confusion.

This Special Issue will focus on Bayesian inference and MaxEnt. Some open questions that spring to mind are: Which proposed ways of using entropy (and its maximisation) in inference are legitimate, which are not, and why? Where can we obtain constraints on probability assignments, the input needed by the MaxEnt procedure?

More generally, papers exploring any interesting connections between probabilistic inference and information theory will be considered. Papers presenting high quality applications, or discussing computational methods in these areas, are also welcome.

Dr. Brendon J. Brewer
Guest Editor
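As a concrete instance of the MaxEnt procedure the call refers to (my sketch, using Jaynes’s classic Brandeis dice problem rather than anything from the call itself): maximizing entropy on die faces 1..6 subject to a fixed mean gives p_i proportional to exp(-lam * i), and the multiplier lam can be found by simple bisection on the mean constraint.

```python
from math import exp

def maxent_dice(target_mean, lo=-5.0, hi=5.0, iters=100):
    """Maximum-entropy distribution on die faces 1..6 with a fixed mean
    (the Brandeis dice problem): p_i ∝ exp(-lam * i), with the Lagrange
    multiplier lam found by bisection on the mean constraint."""
    faces = range(1, 7)

    def mean_for(lam):
        w = [exp(-lam * i) for i in faces]
        z = sum(w)
        return sum(i * wi for i, wi in zip(faces, w)) / z

    # mean_for is strictly decreasing in lam, so bisect on it directly
    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [exp(-lam * i) for i in faces]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_dice(4.5)
print(round(sum(i * pi for i, pi in zip(range(1, 7), p)), 3))  # → 4.5
```

The uniform distribution (mean 3.5) comes back when the constraint is uninformative; any other mean tilts the distribution exponentially, which is exactly the “input needed by the MaxEnt procedure” question the issue raises.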


Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs).


Source: Entropy | Special Issue: Maximum Entropy and Bayesian Methods

🔖 The Hypercycle: A Principle of Natural Self-Organization | Springer

The Hypercycle: A Principle of Natural Self-Organization by Manfred Eigen and Peter Schuster (Springer, 1979)
This book originated from a series of papers which were published in "Die Naturwissenschaften" in 1977/78. Its division into three parts reflects a logical structure, which may be abstracted in the form of three theses:

A. Hypercycles are a principle of natural self-organization allowing an integration and coherent evolution of a set of functionally coupled self-replicative entities.

B. Hypercycles are a novel class of nonlinear reaction networks with unique properties, amenable to a unified mathematical treatment.

C. Hypercycles are able to originate in the mutant distribution of a single Darwinian quasi-species through stabilization of its diverging mutant genes. Once nucleated, hypercycles evolve to higher complexity by a process analogous to gene duplication and specialization.

In order to outline the meaning of the first statement we may refer to another principle of material self-organization, namely Darwin's principle of natural selection. This principle, as we see it today, represents the only understood means for creating information, be it the blueprint for a complex living organism which evolved from less complex ancestral forms, or be it a meaningful sequence of letters the selection of which can be simulated by evolutionary model games.

Part A in .pdf format.
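Thesis B calls hypercycles nonlinear reaction networks amenable to a unified mathematical treatment. As an illustrative sketch (my own minimal rendering of the elementary hypercycle equations, not the book’s notation), each replicator is catalysed by its predecessor in the cycle, and a common dilution flux keeps the concentrations on the probability simplex:

```python
def hypercycle_step(x, k, dt):
    """One forward-Euler step of elementary hypercycle dynamics:
    dx_i/dt = x_i * k_i * x_{i-1} - phi * x_i, where the dilution
    flux phi keeps sum(x) constant at 1."""
    n = len(x)
    growth = [k[i] * x[i] * x[(i - 1) % n] for i in range(n)]
    phi = sum(growth)  # total catalysed growth, removed uniformly
    return [x[i] + dt * (growth[i] - phi * x[i]) for i in range(n)]

# Four self-replicators, each catalysed by its predecessor in the cycle.
x = [0.4, 0.3, 0.2, 0.1]
k = [1.0, 1.0, 1.0, 1.0]
for _ in range(2000):
    x = hypercycle_step(x, k, dt=0.01)
print(round(sum(x), 6))  # → 1.0 (the dynamics stay on the simplex)
```

The cooperative coupling is what distinguishes a hypercycle from plain Darwinian competition: no member can outgrow the cycle, since each depends on its predecessor.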


🔖 Cognition and biology: perspectives from information theory

Cognition and biology: perspectives from information theory by Roderick Wallace (ncbi.nlm.nih.gov)
The intimate relation between biology and cognition can be formally examined through statistical models constrained by the asymptotic limit theorems of communication theory, augmented by methods from statistical mechanics and nonequilibrium thermodynamics. Cognition, often involving submodules that act as information sources, is ubiquitous across the living state. Less metabolic free energy is consumed by permitting crosstalk between biological information sources than by isolating them, leading to evolutionary exaptations that assemble shifting, tunable cognitive arrays at multiple scales, and levels of organization to meet dynamic patterns of threat and opportunity. Cognition is thus necessary for life, but it is not sufficient: An organism represents a highly patterned outcome of path-dependent, blind, variation, selection, interaction, and chance extinction in the context of an adequate flow of free energy and an environment fit for development. Complex, interacting cognitive processes within an organism both record and instantiate those evolutionary and developmental trajectories.

🔖 Thermodynamics of Prediction

Thermodynamics of Prediction by Susanne Still, David A. Sivak, Anthony J. Bell, and Gavin E. Crooks (journals.aps.org Phys. Rev. Lett. 109, 120604 (2012))
A system responding to a stochastic driving signal can be interpreted as computing, by means of its dynamics, an implicit model of the environmental variables. The system’s state retains information about past environmental fluctuations, and a fraction of this information is predictive of future ones. The remaining nonpredictive information reflects model complexity that does not improve predictive power, and thus represents the ineffectiveness of the model. We expose the fundamental equivalence between this model inefficiency and thermodynamic inefficiency, measured by dissipation. Our results hold arbitrarily far from thermodynamic equilibrium and are applicable to a wide range of systems, including biomolecular machines. They highlight a profound connection between the effective use of information and efficient thermodynamic operation: any system constructed to keep memory about its environment and to operate with maximal energetic efficiency has to be predictive.

🔖 Statistical Physics of Adaptation

Statistical Physics of Adaptation by Nikolay Perunov, Robert A. Marsland, and Jeremy L. England (journals.aps.org Phys. Rev. X 6, 021036 (2016))
Whether by virtue of being prepared in a slowly relaxing, high-free energy initial condition, or because they are constantly dissipating energy absorbed from a strong external drive, many systems subject to thermal fluctuations are not expected to behave in the way they would at thermal equilibrium. Rather, the probability of finding such a system in a given microscopic arrangement may deviate strongly from the Boltzmann distribution, raising the question of whether thermodynamics still has anything to tell us about which arrangements are the most likely to be observed. In this work, we build on past results governing nonequilibrium thermodynamics and define a generalized Helmholtz free energy that exactly delineates the various factors that quantitatively contribute to the relative probabilities of different outcomes in far-from-equilibrium stochastic dynamics. By applying this expression to the analysis of two examples—namely, a particle hopping in an oscillating energy landscape and a population composed of two types of exponentially growing self-replicators—we illustrate a simple relationship between outcome-likelihood and dissipative history. In closing, we discuss the possible relevance of such a thermodynamic principle for our understanding of self-organization in complex systems, paying particular attention to a possible analogy to the way evolutionary adaptations emerge in living things.

🔖 Meaning = Information + Evolution by Carlo Rovelli

Meaning = Information + Evolution by Carlo Rovelli (arxiv.org)
Notions like meaning, signal, intentionality, are difficult to relate to the physical world. I study a purely physical definition of "meaningful information", from which these notions can be derived. It is inspired by a model recently illustrated by Kolchinsky and Wolpert, and improves on Dretske's classic work on the relation between knowledge and information. I discuss what makes a physical process into a "signal".