🔖 100 years after Smoluchowski: stochastic processes in cell biology

Bookmarked 100 years after Smoluchowski: stochastic processes in cell biology (arxiv.org)
100 years after Smoluchowski introduced his approach to stochastic processes, they are now at the basis of mathematical and physical modeling in cellular biology: they are used, for example, to analyse and extract features from large numbers (tens of thousands) of single molecular trajectories, or to study the diffusive motion of molecules, proteins, or receptors. Stochastic modeling is a new step in large data analysis that serves to extract cell biology concepts. We review here Smoluchowski's approach to stochastic processes and provide several applications for coarse-graining diffusion and for studying polymer models of nuclear organization, and finally we discuss the stochastic jump dynamics of telomeres across cell division and stochastic gene regulation.
65 pages, J. Phys A 2016 [1]
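For anyone curious what the underlying model looks like in practice, here's a quick Python sketch (my own toy, not from the paper) that simulates a trajectory in the overdamped (Smoluchowski) limit and recovers the diffusion coefficient from the mean-squared displacement; all parameter values are arbitrary.

```python
import numpy as np

# Toy sketch (not from the paper): simulate a 2D Brownian trajectory in the
# overdamped (Smoluchowski) limit, x_{n+1} = x_n + sqrt(2 D dt) * xi, then
# estimate D back from the mean-squared displacement (MSD).
rng = np.random.default_rng(0)
D_true = 0.1       # um^2/s, arbitrary illustrative value
dt = 0.01          # s, time step
n_steps = 10_000

steps = rng.normal(scale=np.sqrt(2 * D_true * dt), size=(n_steps, 2))
traj = np.cumsum(steps, axis=0)

# In 2D, MSD(tau) = <|x(t+tau) - x(t)|^2> ~ 4 D tau
lags = np.arange(1, 100)
msd = np.array([np.mean(np.sum((traj[lag:] - traj[:-lag]) ** 2, axis=1))
                for lag in lags])
D_est = np.polyfit(lags * dt, msd, 1)[0] / 4   # slope / (2 * dimension)

print(f"true D = {D_true}, estimated D = {D_est:.3f}")
```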

References

[1]
D. Holcman and Z. Schuss, “100 years after Smoluchowski: stochastic processes in cell biology,” arXiv, 26-Dec-2016. [Online]. Available: https://arxiv.org/abs/1612.08381. [Accessed: 03-Jan-2017]

🔖 A First Step Toward Quantifying the Climate’s Information Production over the Last 68,000 Years

Bookmarked A First Step Toward Quantifying the Climate’s Information Production over the Last 68,000 Years (link.springer.com)
Paleoclimate records are extremely rich sources of information about the past history of the Earth system. We take an information-theoretic approach to analyzing data from the WAIS Divide ice core, the longest continuous and highest-resolution water isotope record yet recovered from Antarctica. We use weighted permutation entropy to calculate the Shannon entropy rate from these isotope measurements, which are proxies for a number of different climate variables, including the temperature at the time of deposition of the corresponding layer of the core. We find that the rate of information production in these measurements reveals issues with analysis instruments, even when those issues leave no visible traces in the raw data. These entropy calculations also allow us to identify a number of intervals in the data that may be of direct relevance to paleoclimate interpretation, and to form new conjectures about what is happening in those intervals—including periods of abrupt climate change.
Saw reference in Predicting unpredictability: Information theory offers new way to read ice cores [1]
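The entropy rate in the paper comes from weighted permutation entropy; as a rough sketch of that idea (my implementation and parameter choices, not the authors'), the following computes it for a 1D series:

```python
import math
import numpy as np

def weighted_permutation_entropy(x, m=4, tau=1):
    """Weighted permutation entropy of a 1D series, normalized to [0, 1].

    A sketch of the Fadlallah et al. (2013) variant: each ordinal pattern of
    length m (lag tau) is weighted by the variance of its window before the
    Shannon entropy of the pattern distribution is taken.
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    weights = {}
    for i in range(n):
        window = x[i:i + (m - 1) * tau + 1:tau]
        pattern = tuple(np.argsort(window))   # ordinal pattern of the window
        w = np.var(window)                    # weight = variance of the window
        weights[pattern] = weights.get(pattern, 0.0) + w
    total = sum(weights.values())
    if total == 0:
        return 0.0
    p = np.array(list(weights.values())) / total
    H = -np.sum(p * np.log2(p))
    return float(H / math.log2(math.factorial(m)))

# Noise should come out near 1, a smooth monotone ramp at 0.
rng = np.random.default_rng(1)
print(weighted_permutation_entropy(rng.normal(size=5000)))    # ~1
print(weighted_permutation_entropy(np.linspace(0, 1, 5000)))  # 0.0
```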

References

[1]
“Predicting unpredictability: Information theory offers new way to read ice cores,” Phys.org. [Online]. Available: http://phys.org/news/2016-12-unpredictability-theory-ice-cores.html. [Accessed: 12-Dec-2016]

🔖 H-theorem in quantum physics by G. B. Lesovik, et al.

Bookmarked H-theorem in quantum physics (Nature.com)

Abstract

Remarkable progress in quantum information theory (QIT) has allowed the formulation of mathematical theorems giving conditions under which data transmission or data processing occurs with a non-negative entropy gain. However, the relation of these results, formulated in terms of entropy gain in quantum channels, to the temporal evolution of real physical systems is not thoroughly understood. Here we build on the mathematical formalism provided by QIT to formulate the quantum H-theorem in terms of physical observables. We discuss the manifestation of the second law of thermodynamics in quantum physics and uncover special situations where the second law can be violated. We further demonstrate that the typical evolution of energy-isolated quantum systems occurs with non-diminishing entropy. [1]
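As a toy illustration of the channel picture (my own example, not from the paper): the von Neumann entropy of a qubit can only go up under a unital channel such as the depolarizing channel, which is the simplest setting where a non-negative entropy gain shows up.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def depolarizing(rho, p):
    """Qubit depolarizing channel: rho -> (1-p) rho + p I/2 (unital)."""
    return (1 - p) * rho + p * np.eye(2) / 2

# The pure state |+><+| has zero entropy; the unital depolarizing channel
# can only raise it, i.e. the entropy gain is non-negative.
plus = np.array([[1], [1]]) / np.sqrt(2)
rho = plus @ plus.conj().T
print(von_neumann_entropy(rho))                     # ~0.0
print(von_neumann_entropy(depolarizing(rho, 0.3)))  # > 0
```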

Footnotes

[1]
G. B. Lesovik, A. V. Lebedev, I. A. Sadovskyy, M. V. Suslov, and V. M. Vinokur, “H-theorem in quantum physics,” Scientific Reports, vol. 6. Springer Nature, p. 32815, 12-Sep-2016 [Online]. Available: http://dx.doi.org/10.1038/srep32815

Chris Aldrich is reading “Department of Energy May Have Broken the Second Law of Thermodynamics”

Read Department of Energy May Have Broken the Second Law of Thermodynamics (Inverse)
“Quantum-based demons” sound like they'd be at home in 'Stranger Things.'

Statistical Physics, Information Processing, and Biology Workshop at Santa Fe Institute

Bookmarked Information Processing and Biology by John Carlos Baez (Azimuth)
The Santa Fe Institute, in New Mexico, is a place for studying complex systems. I’ve never been there! Next week I’ll go there to give a colloquium on network theory, and also to participate in this workshop.
I just found out about this from John Carlos Baez and wish I could go! How had I not heard about it before?

Statistical Physics, Information Processing, and Biology

Workshop

November 16, 2016 – November 18, 2016
9:00 AM
Noyce Conference Room

Abstract.
This workshop will address a fundamental question in theoretical biology: Does the relationship between statistical physics and the need of biological systems to process information underpin some of their deepest features? It recognizes that a core feature of biological systems is that they acquire, store, and process information (i.e., perform computation). However, to manipulate information in this way they require a steady flux of free energy from their environments. These two inter-related attributes of biological systems are often taken for granted; they are not part of standard analyses of either the homeostasis or the evolution of biological systems. In this workshop we aim to fill in this major gap in our understanding of biological systems by gaining deeper insight into the relation between the need for biological systems to process information and the free energy they need to pay for that processing.

The goal of this workshop is to address these issues by focusing on a set of three specific questions:

  1. How has the fraction of free energy flux on earth that is used by biological computation changed with time?
  2. What is the free energy cost of biological computation / function?
  3. What is the free energy cost of the evolution of biological computation / function?

In all of these cases we are interested in the fundamental limits that the laws of physics impose on various aspects of living systems as expressed by these three questions.

Purpose: Research Collaboration
SFI Host: David Krakauer, Michael Lachmann, Manfred Laubichler, Peter Stadler, and David Wolpert
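For a sense of scale on the second question, the usual physics reference point (my addition, not part of the workshop description) is Landauer's bound of k_B T ln 2 of free energy per erased bit:

```python
import math

# Back-of-the-envelope for question 2 above: Landauer's bound puts a floor
# of k_B * T * ln(2) of free energy on erasing a single bit of information.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 310.0            # K, roughly physiological temperature
per_bit = k_B * T * math.log(2)
print(f"Landauer limit at 310 K: {per_bit:.2e} J per bit")   # ~3.0e-21 J
```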

Warren Weaver Bot!

Liked Someone has built a Warren Weaver Bot! by Weaverbot (Twitter)
This is the signal for the second.
How can you not follow this twitter account?!

Now I’m waiting for a Shannon bot and a Wiener bot. Maybe a John McCarthy bot would be apropos too?!

Tangled Up in Spacetime

Bookmarked Tangled Up in Spacetime by Clara Moskowitz (Scientific American)
Hundreds of researchers in a collaborative project called "It from Qubit" say space and time may spring up from the quantum entanglement of tiny bits of information.

🔖 Quantum Information Science II

Bookmarked Quantum Information Science II (edX)
Learn about quantum computation and quantum information in this advanced graduate level course from MIT.

About this course

Already know something about quantum mechanics, quantum bits and quantum logic gates, but want to design new quantum algorithms, and explore multi-party quantum protocols? This is the course for you!

In this advanced graduate physics course on quantum computation and quantum information, we will cover:

  • The formalism of quantum errors (density matrices, operator sum representations)
  • Quantum error correction codes (stabilizers, graph states)
  • Fault-tolerant quantum computation (normalizers, Clifford group operations, the Gottesman-Knill Theorem)
  • Models of quantum computation (teleportation, cluster, measurement-based)
  • Quantum Fourier transform-based algorithms (factoring, simulation)
  • Quantum communication (noiseless and noisy coding)
  • Quantum protocols (games, communication complexity)

Research problem ideas are presented along the journey.
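As a tiny taste of the stabilizer material in the list above (my own toy example, not course material), here's the three-qubit bit-flip code in plain numpy, with its two Z-type stabilizers pinpointing a single bit-flip error:

```python
import numpy as np

# The 3-qubit bit-flip code, the simplest stabilizer code. Stabilizers
# Z1Z2 and Z2Z3 flag which qubit (if any) suffered an X error.
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Logical |0_L> = |000>, |1_L> = |111>; encode (|0> + |1>)/sqrt(2).
zero_L = np.zeros(8); zero_L[0b000] = 1
one_L = np.zeros(8);  one_L[0b111] = 1
psi = (zero_L + one_L) / np.sqrt(2)

stabilizers = [kron(Z, Z, I), kron(I, Z, Z)]
error = kron(I, X, I)          # bit flip on the middle qubit
corrupted = error @ psi

# Syndrome = eigenvalue (+1 or -1) of each stabilizer on the corrupted state.
syndrome = [int(np.round(corrupted @ S @ corrupted)) for S in stabilizers]
print(syndrome)  # [-1, -1] -> uniquely identifies the middle qubit
```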

What you’ll learn

  • Formalisms for describing errors in quantum states and systems
  • Quantum error correction theory
  • Fault-tolerant quantum procedure constructions
  • Models of quantum computation beyond gates
  • Structures of exponentially-fast quantum algorithms
  • Multi-party quantum communication protocols

Meet the instructor

Isaac Chuang, Professor of Electrical Engineering and Computer Science and Professor of Physics, MIT

[1609.02422] What can logic contribute to information theory?

Bookmarked [1609.02422] What can logic contribute to information theory? by David Ellerman (128.84.21.199)
Logical probability theory was developed as a quantitative measure based on Boole's logic of subsets. But information theory was developed into a mature theory by Claude Shannon with no such connection to logic. But a recent development in logic changes this situation. In category theory, the notion of a subset is dual to the notion of a quotient set or partition, and recently the logic of partitions has been developed in a parallel relationship to the Boolean logic of subsets (subset logic is usually mis-specified as the special case of propositional logic). What then is the quantitative measure based on partition logic in the same sense that logical probability theory is based on subset logic? It is a measure of information that is named "logical entropy" in view of that logical basis. This paper develops the notion of logical entropy and the basic notions of the resulting logical information theory. Then an extensive comparison is made with the corresponding notions based on Shannon entropy.
Ellerman is visiting at UC Riverside at the moment. Given the overlap between information theory and category theory, I’m curious whether he’s working with John Carlos Baez, or whether Baez is aware of this work.

Based on a cursory look at his website(s), I’m going to have to start following more of this work.
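For anyone who wants the two measures side by side, here's a minimal sketch (the definitions follow the abstract's notion of logical entropy; the example distributions are mine):

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum p_i log2 p_i, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def logical_entropy(p):
    """Logical entropy h(p) = 1 - sum p_i^2: the probability that two
    independent draws land in different blocks of the partition."""
    p = np.asarray(p, dtype=float)
    return float(1.0 - np.sum(p ** 2))

# Both vanish on a certain outcome and peak on a uniform distribution,
# but they scale quite differently in between.
for p in ([1.0], [0.5, 0.5], [0.25] * 4, [0.7, 0.2, 0.1]):
    print(p, shannon_entropy(p), logical_entropy(p))
```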

Hector Zenil

I’ve run across some of his work before, but I ran into some new material by Hector Zenil that will likely interest those following information theory, complexity, and computer science here. I hadn’t previously noticed that he refers to himself on his website as an “information theoretic biologist” — everyone should have that as a title, shouldn’t they? As a result, I’ve also added him to the growing list of ITBio Researchers.

If you’re not following him everywhere (?) yet, start with some of the sites below (or let me know if I’ve missed anything).

Hector Zenil:

His most recent paper on arXiv:
Low Algorithmic Complexity Entropy-deceiving Graphs | .pdf

A common practice in the estimation of the complexity of objects, in particular of graphs, is to rely on graph- and information-theoretic measures. Here, using integer sequences with properties such as Borel normality, we explain how these measures are not independent of the way in which a single object, such as a graph, can be described. From descriptions that can reconstruct the same graph and are therefore essentially translations of the same description, we will see that not only is it necessary to pre-select a feature of interest where there is one when applying a computable measure such as Shannon Entropy, and to make an arbitrary selection where there is not, but also that more general properties, such as the causal likeliness of a graph as a measure (as opposed to randomness), can be largely misrepresented by computable measures such as Entropy and Entropy rate. We introduce recursive and non-recursive (uncomputable) graphs and graph constructions based on integer sequences, whose different lossless descriptions have disparate Entropy values, thereby enabling the study and exploration of a measure’s range of applications and demonstrating the weaknesses of computable measures of complexity.

Subjects: Information Theory (cs.IT); Computational Complexity (cs.CC); Combinatorics (math.CO)
Cite as: arXiv:1608.05972 [cs.IT] (or arXiv:1608.05972v4 [cs.IT])
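A crude way to see the description-dependence the abstract is getting at (my own toy example, far simpler than the Borel-normality constructions in the paper): different ways of writing down the same small graph already give different empirical Shannon entropies.

```python
import math
from collections import Counter
import numpy as np

def shannon_entropy(symbols):
    """Empirical Shannon entropy (bits/symbol) of a sequence of symbols."""
    counts = Counter(symbols)
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# One and the same object -- a 5-cycle graph -- written down three different
# ways, each yielding a different empirical entropy: a much cruder version of
# the description-dependence the abstract warns about. (The degree sequence
# only pins the graph down up to isomorphism.)
n = 5
edges = [(i, (i + 1) % n) for i in range(n)]            # edge list
adj = np.zeros((n, n), dtype=int)
for i, j in edges:
    adj[i, j] = adj[j, i] = 1
adjacency_bits = adj.flatten().tolist()                  # 0/1 string
degree_sequence = adj.sum(axis=0).tolist()               # all 2s

print(shannon_entropy(adjacency_bits))                   # ~0.97 bits/symbol
print(shannon_entropy(degree_sequence))                  # 0.0
print(shannon_entropy([v for e in edges for v in e]))    # log2(5) ~ 2.32
```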

YouTube

Yesterday he also posted two new introductory videos to his YouTube channel. There’s nothing overly technical here, but they’re nice short productions that introduce some of his work. (I wish more scientists did communication like this.) I’m hoping he’ll post them to his blog and write a bit more there in the future as well.

Universal Measures of Complexity

Relevant literature:

Reprogrammable World

Relevant literature:

Cross-boundary Behavioural Reprogrammability Reveals Evidence of Pervasive Turing Universality by Jürgen Riedel, Hector Zenil
Preprint available at http://arxiv.org/abs/1510.01671

Ed.: 9/7/16: Updated videos with links to relevant literature

Randomness And Complexity, from Leibniz To Chaitin | World Scientific Publishing

Bookmarked Randomness And Complexity, from Leibniz To Chaitin (amzn.to)
The book is a collection of papers written by a selection of eminent authors from around the world in honour of Gregory Chaitin's 60th birthday. This is a unique volume including technical contributions, philosophical papers and essays. Hardcover: 468 pages; Publisher: World Scientific Publishing Company (October 18, 2007); ISBN: 9789812770820

Peter Webb’s A Course in Finite Group Representation Theory

Bookmarked A Course in Finite Group Representation Theory by Peter Webb (math.umn.edu)
Download a pre-publication version of the book which will be published by Cambridge University Press. The book arises from notes of courses taught at the second year graduate level at the University of Minnesota and is suitable to accompany study at that level.

“Why should we want to know about representations over rings that are not fields of characteristic zero? It is because they arise in many parts of mathematics. Group representations appear any time we have a group of symmetries where there is some linear structure present, over some commutative ring. That ring need not be a field of characteristic zero.

Here are some examples.

  • […]
  • In the theory of error-correcting codes many important codes have a non-trivial symmetry group and are vector spaces over a finite field, thereby providing a representation of the group over that field.”
Peter Webb, February 23, 2016, Professor of Mathematics, University of Minnesota
in A Course in Finite Group Representation Theory to be published soon by Cambridge University Press
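To make the last bullet concrete (my own toy example, not Webb's): the even-weight code of length 3 is a vector space over GF(2) on which the symmetric group S_3 acts by permuting coordinates, i.e. a representation over a field of characteristic 2.

```python
import itertools
import numpy as np

# The even-weight (single parity-check) code of length 3 over GF(2) is
# preserved by every permutation of the coordinates, so S_3 acts on it by
# permutation matrices over a field of characteristic 2.
codewords = [c for c in itertools.product([0, 1], repeat=3) if sum(c) % 2 == 0]

def perm_matrix(perm):
    """Permutation matrix over GF(2) sending coordinate i to perm[i]."""
    P = np.zeros((3, 3), dtype=int)
    for i, j in enumerate(perm):
        P[j, i] = 1
    return P

for perm in itertools.permutations(range(3)):
    P = perm_matrix(perm)
    images = {tuple(P @ np.array(c) % 2) for c in codewords}
    assert images == set(codewords)   # the code is stable under S_3

print(f"{len(codewords)} codewords, preserved by all 6 permutations of S_3")
```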


Human Collective Memory from Biographical Data

Bookmarked Estimating technological breaks in the size and composition of human collective memory from biographical data (arxiv.org)

The ability of humans to accumulate knowledge and information across generations is a defining feature of our species. This ability depends on factors that range from the psychological biases that predispose us to learn from skillful, accomplished, and prestigious people, to the development of technologies for recording and communicating information: from clay tablets to the Internet. In this paper we present empirical evidence documenting how communication technologies have shaped human collective memory. We show that changes in communication technologies, including the introduction of printing and the maturity of shorter forms of printed media, such as newspapers, journals, and pamphlets, were accompanied by sharp changes (or breaks) in the per-capita number of memorable biographies from a time period that are present in current online and offline sources. Moreover, we find that changes in technology, such as the introduction of printing, film, radio, and television, coincide with sharp shifts in the occupations of the individuals present in these biographical records. These two empirical facts provide evidence in support of theories arguing that human collective memory is shaped by the technologies we use to record and communicate information.

C. Jara-Figueroa, Amy Z. Yu, and Cesar A. Hidalgo
in Estimating technological breaks in the size and composition of human collective memory from biographical data via arXiv
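The "breaks" language suggests structural-break detection; as a hedged sketch (my own toy, not the authors' method), here is the simplest possible version: scan a synthetic series for the split point that minimizes the two-segment squared error.

```python
import numpy as np

# Toy structural-break search on a synthetic "per-capita memorable
# biographies" series: pick the split point that minimizes the total squared
# error when each segment is fit by its own mean.
rng = np.random.default_rng(2)
series = np.concatenate([rng.normal(1.0, 0.2, 120),    # pre-break level
                         rng.normal(2.5, 0.2, 80)])    # post-break level

def best_break(y):
    n = len(y)
    costs = []
    for k in range(5, n - 5):                          # keep segments non-trivial
        sse = (np.sum((y[:k] - y[:k].mean()) ** 2)
               + np.sum((y[k:] - y[k:].mean()) ** 2))
        costs.append((sse, k))
    return min(costs)[1]

print(best_break(series))   # should land near index 120
```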


A New Thermodynamics Theory of the Origin of Life | Quanta Magazine

Bookmarked A New Physics Theory of Life by Natalie Wolchover (quantamagazine.org)
Jeremy England, a 31-year-old physicist at MIT, thinks he has found the underlying physics driving the origin and evolution of life.