🔖 Thermodynamics of Prediction

Thermodynamics of Prediction by Susanne Still, David A. Sivak, Anthony J. Bell, and Gavin E. Crooks (Phys. Rev. Lett. 109, 120604 (2012), via journals.aps.org)
A system responding to a stochastic driving signal can be interpreted as computing, by means of its dynamics, an implicit model of the environmental variables. The system’s state retains information about past environmental fluctuations, and a fraction of this information is predictive of future ones. The remaining nonpredictive information reflects model complexity that does not improve predictive power, and thus represents the ineffectiveness of the model. We expose the fundamental equivalence between this model inefficiency and thermodynamic inefficiency, measured by dissipation. Our results hold arbitrarily far from thermodynamic equilibrium and are applicable to a wide range of systems, including biomolecular machines. They highlight a profound connection between the effective use of information and efficient thermodynamic operation: any system constructed to keep memory about its environment and to operate with maximal energetic efficiency has to be predictive.
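If I'm remembering the Letter correctly, its central equivalence can be stated in one line: the work dissipated as the signal jumps from x_t to x_{t+1} is set by the system's "nostalgia", the information it retains about the past signal beyond what is predictive of the future. A sketch of the relation (my paraphrase, worth checking against the paper):

```latex
% Dissipation--nostalgia relation (my paraphrase of the Letter's result):
% s_t = system state, x_t = driving signal, beta = 1/(k_B T).
\beta \, \langle W_{\mathrm{diss}}[x_t \to x_{t+1}] \rangle
  = I(s_t; x_t) - I(s_t; x_{t+1})
  = I_{\mathrm{mem}}(t) - I_{\mathrm{pred}}(t)
```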

🔖 Meaning = Information + Evolution by Carlo Rovelli

Meaning = Information + Evolution by Carlo Rovelli (arxiv.org)
Notions like meaning, signal, and intentionality are difficult to relate to a physical world. I study a purely physical definition of "meaningful information", from which these notions can be derived. It is inspired by a model recently illustrated by Kolchinsky and Wolpert, and improves on Dretske's classic work on the relation between knowledge and information. I discuss what makes a physical process into a "signal".

🔖 Energy flow and the organization of life | Complexity

Energy flow and the organization of life by Harold Morowitz and Eric Smith (Complexity, September 2007)
Understanding the emergence and robustness of life requires accounting for both chemical specificity and statistical generality. We argue that the reverse of a common observation—that life requires a source of free energy to persist—provides an appropriate principle to understand the emergence, organization, and persistence of life on earth. Life, and in particular core biochemistry, has many properties of a relaxation channel that was driven into existence by free energy stresses from the earth's geochemistry. Like lightning or convective storms, the carbon, nitrogen, and phosphorus fluxes through core anabolic pathways make sense as the order parameters in a phase transition from an abiotic to a living state of the geosphere. Interpreting core pathways as order parameters would both explain their stability over billions of years, and perhaps predict the uniqueness of specific optimal chemical pathways.

Download .pdf copy

[1]
H. Morowitz and E. Smith, “Energy flow and the organization of life,” Complexity, vol. 13, no. 1. Wiley-Blackwell, pp. 51–59, 2007 [Online]. Available: http://dx.doi.org/10.1002/cplx.20191

🔖 How Life (and Death) Spring From Disorder | Quanta Magazine

How Life (and Death) Spring From Disorder by Philip Ball (Quanta Magazine)
Life was long thought to obey its own set of rules. But as simple systems show signs of lifelike behavior, scientists are arguing about whether this apparent complexity is all a consequence of thermodynamics.

This is a nice little general interest article by Philip Ball that does a relatively good job of covering several of my favorite topics (information theory, biology, complexity) for the layperson. While it stays relatively basic, it links to a handful of really great references, many of which I’ve already read, though several appear to be new to me. [1][2][3][4][5][6][7][8][9][10]

While Ball covers a broad range of topics in his work, he’s certainly one of the best journalists working in this particular subarea today. I highly recommend his work to those who find this area interesting.

References

[1]
E. Mayr, What Makes Biology Unique? Cambridge University Press, 2004.
[2]
A. Wissner-Gross and C. Freer, “Causal entropic forces,” Phys Rev Lett, vol. 110, no. 16, p. 168702, Apr. 2013. [PubMed]
[3]
A. Barato and U. Seifert, “Thermodynamic uncertainty relation for biomolecular processes,” Phys Rev Lett, vol. 114, no. 15, p. 158101, Apr. 2015. [PubMed]
[4]
J. Shay and W. Wright, “Hayflick, his limit, and cellular ageing,” Nat Rev Mol Cell Biol, vol. 1, no. 1, pp. 72–6, Oct. 2000. [PubMed]
[5]
X. Dong, B. Milholland, and J. Vijg, “Evidence for a limit to human lifespan,” Nature, vol. 538, no. 7624. Springer Nature, pp. 257–259, 05-Oct-2016 [Online]. Available: http://dx.doi.org/10.1038/nature19793
[6]
H. Morowitz and E. Smith, “Energy Flow and the Organization of Life,” Santa Fe Institute, 07-Aug-2006. [Online]. Available: http://samoa.santafe.edu/media/workingpapers/06-08-029.pdf. [Accessed: 03-Feb-2017]
[7]
R. Landauer, “Irreversibility and Heat Generation in the Computing Process,” IBM Journal of Research and Development, vol. 5, no. 3. IBM, pp. 183–191, Jul-1961 [Online]. Available: http://dx.doi.org/10.1147/rd.53.0183
[8]
C. Rovelli, “Meaning = Information + Evolution,” arXiv, Nov. 2016 [Online]. Available: https://arxiv.org/abs/1611.02420
[9]
N. Perunov, R. A. Marsland, and J. L. England, “Statistical Physics of Adaptation,” Physical Review X, vol. 6, no. 2. American Physical Society (APS), 16-Jun-2016 [Online]. Available: http://dx.doi.org/10.1103/PhysRevX.6.021036 [Source]
[10]
S. Still, D. A. Sivak, A. J. Bell, and G. E. Crooks, “Thermodynamics of Prediction,” Physical Review Letters, vol. 109, no. 12. American Physical Society (APS), 19-Sep-2012 [Online]. Available: http://dx.doi.org/10.1103/PhysRevLett.109.120604 [Source]

How Life Turns Asymmetric | Quanta Magazine

How Life Turns Asymmetric by Tim Vernimmen (quantamagazine.org)
Scientists are uncovering how our bodies — and everything within them — tell right from left.



NIMBioS Tutorial: Uncertainty Quantification for Biological Models

NIMBioS Tutorial: Uncertainty Quantification for Biological Models (nimbios.org)
NIMBioS will host a Tutorial on Uncertainty Quantification for Biological Models

Uncertainty Quantification for Biological Models

Meeting dates: June 26-28, 2017
Location: NIMBioS at the University of Tennessee, Knoxville

Organizers:
Marisa Eisenberg, School of Public Health, Univ. of Michigan
Ben Fitzpatrick, Mathematics, Loyola Marymount Univ.
James Hyman, Mathematics, Tulane Univ.
Ralph Smith, Mathematics, North Carolina State Univ.
Clayton Webster, Computational and Applied Mathematics (CAM), Oak Ridge National Laboratory; Mathematics, Univ. of Tennessee

Objectives:
Mathematical modeling and computer simulations are widely used to predict the behavior of complex biological phenomena. However, increased computational resources have allowed scientists to ask a deeper question, namely, “how do the uncertainties ubiquitous in all modeling efforts affect the output of such predictive simulations?” These uncertainties may be epistemic (lack of knowledge) or aleatoric (intrinsic variability), and they encompass inaccurate physical measurements, bias in mathematical descriptions, and errors introduced by the numerical approximations of computational simulations. Research in uncertainty quantification (UQ) aims to address these challenges, which is essential for dealing with realistic experimental data and for assessing the reliability of predictions based on numerical simulations.

Uncertainty quantification uses quantitative methods to characterize and reduce uncertainties in mathematical models; techniques from sampling, numerical approximation, and sensitivity analysis can help apportion the uncertainty in model outputs among the different input variables. Both forward and inverse UQ analyses have become critical modeling components for a wide range of scientific applications, as they are essential to achieving validated predictive computations. Techniques from these fields are rapidly evolving to keep pace with the increasing emphasis on models that require quantified uncertainties for large-scale applications. This tutorial will focus on the application of these methods and techniques to mathematical models in the life sciences and will provide researchers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties and perform sensitivity analysis for simulation models. Concepts to be covered may include: probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, adaptive surrogate model construction, high-dimensional approximation, random sampling and sparse grids, as well as local and global sensitivity analysis.
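As a toy illustration of the forward-UQ workflow described above (my own sketch with invented parameter priors, not NIMBioS material), one can push sampled parameters through a simple biological model and crudely apportion the output variance:

```python
import numpy as np

# Sketch: Monte Carlo forward propagation of parameter uncertainty
# through a logistic-growth model; the priors below are invented.
rng = np.random.default_rng(0)

def logistic(r, K, x0=0.1, t=10.0):
    """Closed-form logistic growth x(t) with rate r and capacity K."""
    return K / (1.0 + (K / x0 - 1.0) * np.exp(-r * t))

r = rng.normal(0.5, 0.05, 10_000)    # assumed prior on growth rate
K = rng.normal(100.0, 10.0, 10_000)  # assumed prior on carrying capacity

x = logistic(r, K)
print(f"output mean {x.mean():.2f}, std {x.std():.2f}")

# Crude first-order sensitivity: vary one input, freeze the other at its mean.
var_total = x.var()
print("variance share from r:", logistic(r, K.mean()).var() / var_total)
print("variance share from K:", logistic(r.mean(), K).var() / var_total)
```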

This tutorial is intended for graduate students, postdocs and researchers in mathematics, statistics, computer science and biology. A basic knowledge of probability, linear algebra, and differential equations is assumed.

Descriptive Flyer

Application deadline: March 1, 2017
To apply, you must complete an application on our online registration system:

  1. Click here to access the system
  2. Login or register
  3. Complete your user profile (if you haven’t already)
  4. Find this tutorial event under Current Events Open for Application and click on Apply

Participation in NIMBioS tutorials is by application only. Individuals with a strong interest in the topic are encouraged to apply, and successful applicants will be notified within two weeks after the application deadline. If needed, financial support for travel, meals, and lodging is available for tutorial attendees.

Summary Report. TBA

Live Stream. The Tutorial will be streamed live. Note that NIMBioS Tutorials involve open discussion and not necessarily a succession of talks. In addition, the schedule as posted may change during the Tutorial. To view the live stream, visit http://www.nimbios.org/videos/livestream. A live chat of the event will take place via Twitter using the hashtag #uncertaintyTT. The Twitter feed will be displayed to the right of the live stream. We encourage you to post questions/comments and engage in discussion with respect to our Social Media Guidelines.


Source: NIMBioS Tutorial: Uncertainty Quantification for Biological Models


🔖 Information theory, predictability, and the emergence of complex life

Information theory, predictability, and the emergence of complex life by Luís F. Seoane and Ricard Solé (arxiv.org)
Abstract: Despite the obvious advantage of simple life forms capable of fast replication, different levels of cognitive complexity have been achieved by living systems in terms of their potential to cope with environmental uncertainty. Against the inevitable cost associated with detecting environmental cues and responding to them in adaptive ways, we conjecture that the potential for predicting the environment can outweigh the expense of maintaining costly, complex structures. We present a minimal formal model grounded in information theory and selection, in which successive generations of agents are mapped into transmitters and receivers of a coded message. Our agents are guessing machines, and their capacity to deal with environments of different complexity defines the conditions to sustain more complex agents.
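The flavor of the argument is easy to capture in a toy simulation of my own devising (this is not the authors' model): in a persistent environment, an agent that pays to remember one bit out-guesses a memoryless one.

```python
import numpy as np

# Toy guessing machines in a two-state Markov environment.
rng = np.random.default_rng(1)
p_stay = 0.9                 # environmental persistence (assumed)
env = [0]
for _ in range(100_000):
    env.append(env[-1] if rng.random() < p_stay else 1 - env[-1])
env = np.array(env)
future = env[1:]

# Memoryless agent: always guesses the globally most common state.
simple = np.full(len(future), np.bincount(env).argmax())
# Memoryful agent: tracks the current state and bets on persistence.
complex_ = env[:-1]

print("memoryless accuracy:", (simple == future).mean())    # ~0.5
print("memoryful accuracy :", (complex_ == future).mean())  # ~0.9
# The costlier, memoryful agent wins only because the environment is
# predictable enough (p_stay > 0.5) to repay its complexity.
```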

🔖 Emerging Frontiers of Neuroengineering: A Network Science of Brain Connectivity

Emerging Frontiers of Neuroengineering: A Network Science of Brain Connectivity by Danielle S. Bassett, Ankit N. Khambhati, Scott T. Grafton (arxiv.org)
Neuroengineering is faced with unique challenges in repairing or replacing complex neural systems that are composed of many interacting parts. These interactions form intricate patterns over large spatiotemporal scales, and produce emergent behaviors that are difficult to predict from individual elements. Network science provides a particularly appropriate framework in which to study and intervene in such systems, by treating neural elements (cells, volumes) as nodes in a graph and neural interactions (synapses, white matter tracts) as edges in that graph. Here, we review the emerging discipline of network neuroscience, which uses and develops tools from graph theory to better understand and manipulate neural systems, from micro- to macroscales. We present examples of how human brain imaging data is being modeled with network analysis and underscore potential pitfalls. We then highlight current computational and theoretical frontiers, and emphasize their utility in informing diagnosis and monitoring, brain-machine interfaces, and brain stimulation. A flexible and rapidly evolving enterprise, network neuroscience provides a set of powerful approaches and fundamental insights critical to the neuroengineer's toolkit.

17 pages, 6 figures. Manuscript accepted to the journal Annual Review of Biomedical Engineering [1]
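The core framework is simple enough to sketch in a few lines of networkx; the regions and weights below are invented for illustration and aren't from the paper:

```python
import networkx as nx

# Brain regions as nodes, tract strengths as weighted edges (invented).
G = nx.Graph()
G.add_weighted_edges_from([
    ("visual", "parietal", 0.8),
    ("parietal", "motor", 0.6),
    ("motor", "prefrontal", 0.5),
    ("prefrontal", "parietal", 0.7),
    ("visual", "temporal", 0.4),
])

# Typical graph-theoretic read-outs used in network neuroscience:
print("degrees:", dict(G.degree()))
print("visual -> prefrontal path:", nx.shortest_path(G, "visual", "prefrontal"))
print("average clustering:", nx.average_clustering(G))
```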

References

[1]
D. S. Bassett, A. N. Khambhati, and S. T. Grafton, “Emerging Frontiers of Neuroengineering: A Network Science of Brain Connectivity,” arXiv, 23-Dec-2016. [Online]. Available: https://arxiv.org/abs/1612.08059. [Accessed: 03-Jan-2017]

🔖 100 years after Smoluchowski: stochastic processes in cell biology

100 years after Smoluchowski: stochastic processes in cell biology by David Holcman and Zeev Schuss (arxiv.org)
100 years after Smoluchowski introduced his approach to stochastic processes, they are now at the basis of mathematical and physical modeling in cellular biology: they are used, for example, to analyse and extract features from large numbers (tens of thousands) of single molecular trajectories, or to study the diffusive motion of molecules, proteins, or receptors. Stochastic modeling is a new step in large data analysis that serves to extract cell biology concepts. We review here Smoluchowski's approach to stochastic processes and provide several applications: coarse-graining diffusion, polymer models for understanding nuclear organization, and, finally, the stochastic jump dynamics of telomeres across cell division and stochastic gene regulation.

65 pages, J. Phys A 2016 [1]
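A minimal sketch of the kind of model being reviewed (my own, with invented parameters): overdamped Smoluchowski/Brownian dynamics, with the diffusion coefficient recovered from the mean-squared displacement, much as one would do with single-molecule trajectories.

```python
import numpy as np

# Overdamped Brownian motion in 2D: x_{t+dt} = x_t + sqrt(2 D dt) * noise.
rng = np.random.default_rng(2)
D, dt, n_steps, n_traj = 1.0, 1e-3, 1000, 500

increments = rng.normal(0.0, np.sqrt(2 * D * dt), size=(n_traj, n_steps, 2))
traj = increments.cumsum(axis=1)                # simulated trajectories

T = n_steps * dt
msd = (traj[:, -1, :] ** 2).sum(axis=1).mean()  # <r^2> at time T
print("estimated D:", msd / (4 * T))            # in 2D, <r^2> = 4 D T
```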

References

[1]
D. Holcman and Z. Schuss, “100 years after Smoluchowski: stochastic processes in cell biology,” arXiv, 26-Dec-2016. [Online]. Available: https://arxiv.org/abs/1612.08381. [Accessed: 03-Jan-2017]

Statistical Physics, Information Processing, and Biology Workshop at Santa Fe Institute

Information Processing and Biology by John Carlos Baez (Azimuth)
The Santa Fe Institute, in New Mexico, is a place for studying complex systems. I’ve never been there! Next week I’ll go there to give a colloquium on network theory, and also to participate in this workshop.

I just found out about this from John Carlos Baez and wish I could go! How had I not heard about it sooner?

Statistical Physics, Information Processing, and Biology

Workshop

November 16, 2016 – November 18, 2016
9:00 AM
Noyce Conference Room

Abstract.
This workshop will address a fundamental question in theoretical biology: Does the relationship between statistical physics and the need of biological systems to process information underpin some of their deepest features? It recognizes that a core feature of biological systems is that they acquire, store, and process information (i.e., perform computation). However, to manipulate information in this way they require a steady flux of free energy from their environments. These two interrelated attributes of biological systems are often taken for granted; they are not part of standard analyses of either the homeostasis or the evolution of biological systems. In this workshop we aim to fill in this major gap in our understanding of biological systems by gaining deeper insight into the relation between the need for biological systems to process information and the free energy they need to pay for that processing.

The goal of this workshop is to address these issues by focusing on a set of three specific questions:

  1. How has the fraction of free energy flux on earth that is used by biological computation changed with time?
  2. What is the free energy cost of biological computation / function?
  3. What is the free energy cost of the evolution of biological computation / function?

In all of these cases we are interested in the fundamental limits that the laws of physics impose on various aspects of living systems as expressed by these three questions.
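For a sense of scale on the second question, the Landauer limit sets the minimum free energy needed to erase one bit; a quick back-of-envelope calculation (my numbers, not the workshop's):

```python
import math

k_B = 1.380649e-23               # Boltzmann constant, J/K
T = 310.0                        # roughly physiological temperature, K
landauer = k_B * T * math.log(2)
print(f"Landauer bound at 310 K: {landauer:.2e} J per bit")  # ~3.0e-21 J
# ATP hydrolysis yields roughly 5e-20 J, so one ATP could in principle
# pay for erasing on the order of fifteen bits at this limit.
```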

Purpose: Research Collaboration
SFI Hosts: David Krakauer, Michael Lachmann, Manfred Laubichler, Peter Stadler, and David Wolpert


🔖 Quantum Information Science II

Quantum Information Science II (edX)
Learn about quantum computation and quantum information in this advanced graduate level course from MIT.

About this course

Already know something about quantum mechanics, quantum bits and quantum logic gates, but want to design new quantum algorithms, and explore multi-party quantum protocols? This is the course for you!

In this advanced graduate physics course on quantum computation and quantum information, we will cover:

  • The formalism of quantum errors (density matrices, operator sum representations)
  • Quantum error correction codes (stabilizers, graph states)
  • Fault-tolerant quantum computation (normalizers, Clifford group operations, the Gottesman-Knill Theorem)
  • Models of quantum computation (teleportation, cluster, measurement-based)
  • Quantum Fourier transform-based algorithms (factoring, simulation)
  • Quantum communication (noiseless and noisy coding)
  • Quantum protocols (games, communication complexity)

Research problem ideas are presented along the journey.
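As a small taste of the first topic in the list above, here is a bit-flip error channel written in operator-sum (Kraus) form with numpy; this is my own sketch, not course material:

```python
import numpy as np

p = 0.1                                         # flip probability (assumed)
X = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli X
I2 = np.eye(2, dtype=complex)
kraus = [np.sqrt(1 - p) * I2, np.sqrt(p) * X]   # operator-sum elements

rho = np.array([[1, 0], [0, 0]], dtype=complex)  # density matrix |0><0|
rho_out = sum(K @ rho @ K.conj().T for K in kraus)
print(rho_out.real)                # diag(0.9, 0.1): flipped w.p. p

# Trace preservation: sum_k K_k^dagger K_k should equal the identity.
print(sum(K.conj().T @ K for K in kraus).real)
```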

What you’ll learn

  • Formalisms for describing errors in quantum states and systems
  • Quantum error correction theory
  • Fault-tolerant quantum procedure constructions
  • Models of quantum computation beyond gates
  • Structures of exponentially-fast quantum algorithms
  • Multi-party quantum communication protocols

Meet the instructor

Isaac Chuang, Professor of Electrical Engineering and Computer Science and Professor of Physics, MIT


Hector Zenil

A new paper (arXiv) and some videos on entropy and algorithmic complexity

I’ve run across some of his work before, but I recently found some new material by Hector Zenil that will likely interest those following information theory, complexity, and computer science here. I hadn’t previously noticed that he refers to himself on his website as an “information theoretic biologist” — everyone should have that as a title, shouldn’t they? As a result, I’ve also added him to the growing list of ITBio Researchers.

If you’re not following him everywhere (?) yet, start with some of the sites below (or let me know if I’ve missed anything).

Hector Zenil:

His most recent paper on arXiv:
Low Algorithmic Complexity Entropy-deceiving Graphs | .pdf

A common practice in the estimation of the complexity of objects, in particular of graphs, is to rely on graph- and information-theoretic measures. Here, using integer sequences with properties such as Borel normality, we explain how these measures are not independent of the way in which a single object, such as a graph, can be described. From descriptions that can reconstruct the same graph and are therefore essentially translations of the same description, we will see that not only is it necessary to pre-select a feature of interest when applying a computable measure such as Shannon entropy (and to make an arbitrary selection where there is none), but that more general properties, such as the causal likeliness of a graph as a measure (as opposed to randomness), can be largely misrepresented by computable measures such as entropy and entropy rate. We introduce recursive and non-recursive (uncomputable) graphs and graph constructions based on integer sequences, whose different lossless descriptions have disparate entropy values, thereby enabling the study and exploration of a measure’s range of applications and demonstrating the weaknesses of computable measures of complexity.

Subjects: Information Theory (cs.IT); Computational Complexity (cs.CC); Combinatorics (math.CO)
Cite as: arXiv:1608.05972 [cs.IT] (or arXiv:1608.05972v4 [cs.IT])
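The core point is easy to demonstrate with a toy example of my own (not one from the paper): two lossless descriptions of the same 4-cycle graph whose symbol distributions have different Shannon entropies.

```python
from collections import Counter
from math import log2

def shannon_entropy(s):
    """Shannon entropy (bits/symbol) of a string's symbol distribution."""
    n = len(s)
    return -sum(c / n * log2(c / n) for c in Counter(s).values())

# The 4-cycle on vertices 0..3, described two equivalent ways:
adjacency = "0101101001011010"  # flattened 4x4 adjacency matrix
edge_list = "01122330"          # concatenated edges (0,1),(1,2),(2,3),(3,0)

print("adjacency bits:", shannon_entropy(adjacency), "bits/symbol")  # 1.0
print("edge list     :", shannon_entropy(edge_list), "bits/symbol")  # 2.0
# Same graph, two encodings, two entropies: the measure tracks the
# description rather than the object.
```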

YouTube

Yesterday he also posted two new introductory videos to his YouTube channel. There’s nothing overly technical here, but they’re nice short productions that introduce some of his work. (I wish more scientists did communication like this.) I’m hoping he’ll post them to his blog and write a bit more there in the future as well.

Universal Measures of Complexity

Relevant literature:

Reprogrammable World

Relevant literature:

Cross-boundary Behavioural Reprogrammability Reveals Evidence of Pervasive Turing Universality by Jürgen Riedel, Hector Zenil
Preprint available at http://arxiv.org/abs/1510.01671

Ed.: 9/7/16: Updated videos with links to relevant literature


NIMBioS Tutorial: Evolutionary Quantitative Genetics 2016

NIMBioS Tutorial: Evolutionary Quantitative Genetics 2016 by NIMBioS (nimbios.org)
This tutorial will review the basics of theory in the field of evolutionary quantitative genetics and its connections to evolution observed at various time scales. Quantitative genetics deals with the inheritance of measurements of traits that are affected by many genes. Quantitative genetic theory for natural populations was developed considerably in the period from 1970 to 1990 and up to the present, and it has been applied to a wide range of phenomena including the evolution of differences between the sexes, sexual preferences, life history traits, plasticity of traits, as well as the evolution of body size and other morphological measurements. Textbooks have not kept pace with these developments, and currently few universities offer courses in this subject aimed at evolutionary biologists. There is a need for evolutionary biologists to understand this field because of the ability to collect large amounts of data by computer, the development of statistical methods for changes of traits on evolutionary trees and for changes in a single species through time, and the realization that quantitative characters will not soon be fully explained by genomics. This tutorial aims to fill this need by reviewing basic aspects of theory and illustrating how that theory can be tested with data, both from single species and with multiple-species phylogenies. Participants will learn to use R, an open-source statistical programming language, to build and test evolutionary models. The intended participants for this tutorial are graduate students, postdocs, and junior faculty members in evolutionary biology.
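For those wanting a taste of the formalism, the centerpiece of the 1970–1990 theory alluded to above is, as I understand it, the breeder's equation and Lande's multivariate generalization of it:

```latex
% Univariate response to selection (breeder's equation):
%   h^2 = heritability, S = selection differential.
R = h^2 S
% Lande's multivariate generalization: \bar{z} is the vector of trait
% means, G the additive genetic covariance matrix, \beta the selection
% gradient.
\Delta \bar{z} = G\,\beta
```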


Network Science by Albert-László Barabási

Network Science by Albert-László Barabási (Cambridge University Press)

I ran across a link to this textbook by way of a standing Google alert, and was excited to check it out. I was immediately disappointed to think that I would have to wait another month and change for the physical textbook to be released, but made my pre-order directly. Then with a bit of digging around, I realized that individual chapters are available immediately to quench my thirst until the physical text is printed next month.

The power of network science, the beauty of network visualization.

Network Science, a textbook for network science, is freely available under the Creative Commons licence. Follow its development on Facebook, Twitter or by signing up to our mailing list, so that we can notify you of new chapters and developments.

The book is the result of a collaboration between a number of individuals, shaping everything from content (Albert-László Barabási), to visualizations and interactive tools (Gabriele Musella, Mauro Martino, Nicole Samay, Kim Albrecht), to simulations and data analysis (Márton Pósfai). The printed version of the book will be published by Cambridge University Press in 2016. In the coming months the website will be expanded with an interactive version of the text, datasets, and slides to teach the material.

Book Contents

Personal Introduction
1. Introduction
2. Graph Theory
3. Random Networks
4. The Scale-Free Property
5. The Barabási-Albert Model
6. Evolving Networks
7. Degree Correlations
8. Network Robustness
9. Communities
10. Spreading Phenomena
Usage & Acknowledgements
About

Albert-László Barabási
on Network Science (book website)

Networks are everywhere, from the Internet, to social networks, and the genetic networks that determine our biological existence. Illustrated throughout in full colour, this pioneering textbook, spanning a wide range of topics from physics to computer science, engineering, economics and the social sciences, introduces network science to an interdisciplinary audience. From the origins of the six degrees of separation to explaining why networks are robust to random failures, the author explores how viruses like Ebola and H1N1 spread, and why it is that our friends have more friends than we do. Using numerous real-world examples, this innovatively designed text includes clear delineation between undergraduate and graduate level material. The mathematical formulas and derivations are included within Advanced Topics sections, enabling use at a range of levels. Extensive online resources, including films and software for network analysis, make this a multifaceted companion for anyone with an interest in network science.

Source: Cambridge University Press
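The claim that “our friends have more friends than we do” is the friendship paradox, and it's easy to check on the book's own Barabási-Albert model (Chapter 5). A quick sketch with networkx, with arbitrary sizes:

```python
import networkx as nx

# Scale-free network from the Barabasi-Albert preferential-attachment model.
G = nx.barabasi_albert_graph(n=10_000, m=3, seed=42)

mean_degree = sum(d for _, d in G.degree()) / G.number_of_nodes()
# Average over nodes of the mean degree of that node's neighbors:
mean_friend_degree = sum(
    sum(G.degree(u) for u in G[v]) / G.degree(v) for v in G
) / G.number_of_nodes()

print(f"own degree {mean_degree:.1f} vs friends' {mean_friend_degree:.1f}")
# Hubs make the friends' average markedly larger: the paradox in action.
```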

The textbook is available for purchase in September 2016 from Cambridge University Press. Pre-order now on Amazon.com.

If you’re not already doing so, you should follow Barabási on Twitter.


Disconnected, Fragmented, or United? A Trans-disciplinary Review of Network Science

Disconnected, Fragmented, or United? A Trans-disciplinary Review of Network Science by César A. Hidalgo (Applied Network Science | SpringerLink)

Abstract


For decades the study of networks has been divided between the efforts of social scientists and natural scientists, two groups of scholars who often do not see eye to eye. In this review I present an effort to mutually translate the work conducted by scholars from both of these academic fronts, hoping to continue to unify what has become a diverging body of literature. I argue that social and natural scientists fail to see eye to eye because they have diverging academic goals. Social scientists focus on explaining how context-specific social and economic mechanisms drive the structure of networks and on how networks shape social and economic outcomes. By contrast, natural scientists focus primarily on modeling network characteristics that are independent of context, since their focus is to identify universal characteristics of systems instead of context-specific mechanisms. In the following pages I discuss the differences between both of these literatures by summarizing the parallel theories advanced to explain link formation and the applications used by scholars in each field to justify their approach to network science. I conclude by providing an outlook on how these literatures can be further unified.
