🔖 Want to read: From Bacteria to Bach and Back: The Evolution of Minds by Daniel C. Dennett

Bookmarked From Bacteria to Bach and Back: The Evolution of Minds by Daniel C. Dennett (W. W. Norton & Company; 1st edition, 496 pages; February 7, 2017)

One of America’s foremost philosophers offers a major new account of the origins of the conscious mind.

How did we come to have minds?

For centuries, this question has intrigued psychologists, physicists, poets, and philosophers, who have wondered how the human mind developed its unrivaled ability to create, imagine, and explain. Disciples of Darwin have long aspired to explain how consciousness, language, and culture could have appeared through natural selection, blazing promising trails that tend, however, to end in confusion and controversy. Even though our understanding of the inner workings of proteins, neurons, and DNA is deeper than ever before, the matter of how our minds came to be has largely remained a mystery.

That is now changing, says Daniel C. Dennett. In From Bacteria to Bach and Back, his most comprehensive exploration of evolutionary thinking yet, he builds on ideas from computer science and biology to show how a comprehending mind could in fact have arisen from a mindless process of natural selection. Part philosophical whodunit, part bold scientific conjecture, this landmark work enlarges themes that have sustained Dennett’s legendary career at the forefront of philosophical thought.

In his inimitable style―laced with wit and arresting thought experiments―Dennett explains that a crucial shift occurred when humans developed the ability to share memes, or ways of doing things not based in genetic instinct. Language, itself composed of memes, turbocharged this interplay. Competition among memes―a form of natural selection―produced thinking tools so well-designed that they gave us the power to design our own memes. The result, a mind that not only perceives and controls but can create and comprehend, was thus largely shaped by the process of cultural evolution.

An agenda-setting book for a new generation of philosophers, scientists, and thinkers, From Bacteria to Bach and Back will delight and entertain anyone eager to make sense of how the mind works and how it came about.

4 color, 18 black-and-white illustrations

🔖 IPAM Workshop on Regulatory and Epigenetic Stochasticity in Development and Disease, March 1-3

Bookmarked IPAM Workshop on Regulatory and Epigenetic Stochasticity in Development and Disease (Institute for Pure and Applied Mathematics, UCLA | March 1-3, 2017)
Epigenetics refers to information transmitted during cell division other than the DNA sequence per se, and it is the language that distinguishes stem cells from somatic cells, one organ from another, and even identical twins from each other. In contrast to the DNA sequence, the epigenome is relatively susceptible to modification by the environment as well as stochastic perturbations over time, adding to phenotypic diversity in the population. Despite its strong ties to the environment, epigenetics has never been well reconciled to evolutionary thinking, and in fact there is now strong evidence against the transmission of so-called “epi-alleles,” i.e. epigenetic modifications that pass through the germline.

However, genetic variants that regulate stochastic fluctuation of gene expression and phenotypes in the offspring appear to be transmitted as an epigenetic or even Lamarckian trait. Furthermore, even the normal process of cellular differentiation from a single cell to a complex organism is not understood well from a mathematical point of view. There is increasingly strong evidence that stem cells are highly heterogeneous and in fact stochasticity is necessary for pluripotency. This process appears to be tightly regulated through the epigenome in development. Moreover, in these biological contexts, “stochasticity” is hardly synonymous with “noise”, which often refers to variation which obscures a “true signal” (e.g., measurement error) or which is structural, as in physics (e.g., quantum noise). In contrast, “stochastic regulation” refers to purposeful, programmed variation; the fluctuations are random but there is no true signal to mask.
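The distinction the workshop draws between "noise" and programmed stochastic fluctuation in gene expression can be illustrated with a toy model (my own sketch, not material from the workshop): a constitutively expressed gene with production rate k and first-order degradation rate γ, simulated exactly with the Gillespie algorithm. The copy number fluctuates randomly, yet its statistics are fully determined by the rates; there is no "true signal" being masked.

```python
import random

def simulate_birth_death(k=10.0, gamma=1.0, t_max=500.0, seed=1):
    """Gillespie simulation of constitutive gene expression:
    production at constant rate k, degradation at rate gamma * n.
    Returns the time-averaged molecule copy number."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    acc = 0.0  # integral of n dt, for the time average
    while t < t_max:
        total = k + gamma * n          # total event rate
        dt = rng.expovariate(total)    # waiting time to next event
        acc += n * dt
        t += dt
        if rng.random() < k / total:   # production event
            n += 1
        else:                          # degradation event
            n -= 1
    return acc / t

mean_n = simulate_birth_death()  # time-averaged copy number, close to k/gamma = 10
```

At steady state the copy number is Poisson-distributed with mean k/γ, so the fluctuations are "stochastic regulation" in the workshop's sense: random, reproducible in distribution, and tunable through the rates.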

This workshop will serve as a forum for scientists and engineers with an interest in computational biology to explore the role of stochasticity in regulation, development and evolution, and its epigenetic basis. Just as thinking about stochasticity was transformative in physics and in some areas of biology, it promises to fundamentally transform modern genetics and help to explain phase transitions such as differentiation and cancer.

This workshop will include a poster session; a request for poster titles will be sent to registered participants in advance of the workshop.
Speaker List:
Adam Arkin (Lawrence Berkeley Laboratory)
Gábor Balázsi (SUNY Stony Brook)
Domitilla Del Vecchio (Massachusetts Institute of Technology)
Michael Elowitz (California Institute of Technology)
Andrew Feinberg (Johns Hopkins University)
Don Geman (Johns Hopkins University)
Anita Göndör (Karolinska Institutet)
John Goutsias (Johns Hopkins University)
Garrett Jenkinson (Johns Hopkins University)
Andre Levchenko (Yale University)
Olgica Milenkovic (University of Illinois)
Johan Paulsson (Harvard University)
Leor Weinberger (University of California, San Francisco (UCSF))

🔖 IPAM Workshop on Gauge Theory and Categorification, March 6-10

Bookmarked IPAM Workshop on Gauge Theory and Categorification (Institute for Pure and Applied Mathematics, UCLA | March 6-10, 2017)
The equations of gauge theory lie at the heart of our understanding of particle physics. The Standard Model, which describes the electromagnetic, weak, and strong forces, is based on the Yang-Mills equations. Starting with the work of Donaldson in the 1980s, gauge theory has also been successfully applied in other areas of pure mathematics, such as low dimensional topology, symplectic geometry, and algebraic geometry.

More recently, Witten proposed a gauge-theoretic interpretation of Khovanov homology, a knot invariant whose origins lie in representation theory. Khovanov homology is a “categorification” of the celebrated Jones polynomial, in the sense that its Euler characteristic recovers this polynomial. At the moment, Khovanov homology is only defined for knots in the three-sphere, but Witten’s proposal holds the promise of generalizations to other three-manifolds, and perhaps of producing new invariants of four-manifolds.
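The categorification statement has a compact form. In the usual conventions (which vary by author in normalization; this is a sketch from memory, not text from the workshop announcement), the graded Euler characteristic of the bigraded Khovanov homology groups recovers the unnormalized Jones polynomial:

```latex
\hat{J}(K)(q) \;=\; \sum_{i,j} (-1)^{i} \, q^{j} \, \dim Kh^{i,j}(K)
```

Setting the homological grading aside by taking alternating sums is exactly what "decategorification" means here; the homology groups themselves carry strictly more information than the polynomial.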

This workshop will bring together researchers from several different fields (theoretical physics, mathematical gauge theory, topology, analysis / PDE, representation theory, symplectic geometry, and algebraic geometry), and thus help facilitate connections between these areas. The common focus will be to understand Khovanov homology and related invariants through the lens of gauge theory.

This workshop will include a poster session; a request for posters will be sent to registered participants in advance of the workshop.
Edward Witten will be giving two public lectures as part of the Green Family Lecture series:

March 6, 2017
From Gauge Theory to Khovanov Homology Via Floer Theory
The goal of the lecture is to describe a gauge theory approach to Khovanov homology of knots, in particular, to motivate the relevant gauge theory equations in a way that does not require too much physics background. I will give a gauge theory perspective on the construction of singly-graded Khovanov homology by Abouzaid and Smith.

March 8, 2017
An Introduction to the SYK Model
The Sachdev-Ye model was originally introduced in the mid-1990s as a model of quantum spin liquids. In recent years, it has been reinterpreted by Kitaev as a model of quantum chaos and black holes. This lecture will be primarily a gentle introduction to the SYK model, though I will also describe a few more recent results.

🔖 Entropy | Special Issue: Maximum Entropy and Bayesian Methods

Bookmarked Entropy | Special Issue: Maximum Entropy and Bayesian Methods (mdpi.com)
Open for submission now
Deadline for manuscript submissions: 31 August 2017

A special issue of Entropy (ISSN 1099-4300).

Special Issue Editor

Guest Editor
Dr. Brendon J. Brewer
Department of Statistics, The University of Auckland, Private Bag 92019, Auckland 1142, New Zealand
Phone: +64275001336
Interests: Bayesian inference, Markov chain Monte Carlo, nested sampling, MaxEnt

Special Issue Information

Dear Colleagues,

Whereas Bayesian inference has now achieved mainstream acceptance and is widely used throughout the sciences, associated ideas such as the principle of maximum entropy (implicit in the work of Gibbs, and developed further by Ed Jaynes and others) have not. There are strong arguments that the principle (and variations, such as maximum relative entropy) is of fundamental importance, but the literature also contains many misguided attempts at applying it, leading to much confusion.

This Special Issue will focus on Bayesian inference and MaxEnt. Some open questions that spring to mind are: Which proposed ways of using entropy (and its maximisation) in inference are legitimate, which are not, and why? Where can we obtain constraints on probability assignments, the input needed by the MaxEnt procedure?

More generally, papers exploring any interesting connections between probabilistic inference and information theory will be considered. Papers presenting high quality applications, or discussing computational methods in these areas, are also welcome.

Dr. Brendon J. Brewer
Guest Editor

Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs).

No papers have been published in this special issue yet.

🔖 The Hypercycle: A Principle of Natural Self-Organization | Springer

Bookmarked The Hypercycle - A Principle of Natural Self-Organization | M. Eigen | Springer (Springer, 1979)
This book originated from a series of papers which were published in "Die Naturwissenschaften" in 1977/78. Its division into three parts reflects a logical structure, which may be abstracted in the form of three theses:

A. Hypercycles are a principle of natural self-organization allowing an integration and coherent evolution of a set of functionally coupled self-replicative entities.

B. Hypercycles are a novel class of nonlinear reaction networks with unique properties, amenable to a unified mathematical treatment.

C. Hypercycles are able to originate in the mutant distribution of a single Darwinian quasi-species through stabilization of its diverging mutant genes. Once nucleated, hypercycles evolve to higher complexity by a process analogous to gene duplication and specialization.

In order to outline the meaning of the first statement, we may refer to another principle of material self-organization, namely Darwin's principle of natural selection. This principle, as we see it today, represents the only understood means for creating information, be it the blueprint for a complex living organism which evolved from less complex ancestral forms, or be it a meaningful sequence of letters, the selection of which can be simulated by evolutionary model games.
Part A in .pdf format.

🔖 Cognition and biology: perspectives from information theory

Bookmarked Cognition and biology: perspectives from information theory (ncbi.nlm.nih.gov)
The intimate relation between biology and cognition can be formally examined through statistical models constrained by the asymptotic limit theorems of communication theory, augmented by methods from statistical mechanics and nonequilibrium thermodynamics. Cognition, often involving submodules that act as information sources, is ubiquitous across the living state. Less metabolic free energy is consumed by permitting crosstalk between biological information sources than by isolating them, leading to evolutionary exaptations that assemble shifting, tunable cognitive arrays at multiple scales and levels of organization to meet dynamic patterns of threat and opportunity. Cognition is thus necessary for life, but it is not sufficient: an organism represents a highly patterned outcome of path-dependent, blind variation, selection, interaction, and chance extinction in the context of an adequate flow of free energy and an environment fit for development. Complex, interacting cognitive processes within an organism both record and instantiate those evolutionary and developmental trajectories.

🔖 Thermodynamics of Prediction

Bookmarked Thermodynamics of Prediction (journals.aps.org Phys. Rev. Lett. 109, 120604 (2012))
A system responding to a stochastic driving signal can be interpreted as computing, by means of its dynamics, an implicit model of the environmental variables. The system’s state retains information about past environmental fluctuations, and a fraction of this information is predictive of future ones. The remaining nonpredictive information reflects model complexity that does not improve predictive power, and thus represents the ineffectiveness of the model. We expose the fundamental equivalence between this model inefficiency and thermodynamic inefficiency, measured by dissipation. Our results hold arbitrarily far from thermodynamic equilibrium and are applicable to a wide range of systems, including biomolecular machines. They highlight a profound connection between the effective use of information and efficient thermodynamic operation: any system constructed to keep memory about its environment and to operate with maximal energetic efficiency has to be predictive.
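As I recall, the paper's central identity (notation adapted; I_mem is the instantaneous memory the system holds about the signal and I_pred the part of it that is predictive of the next signal value) equates average dissipated work with the nonpredictive information:

```latex
\beta \, \langle W_{\mathrm{diss}}(t_k \to t_{k+1}) \rangle
\;=\; I_{\mathrm{mem}}(t_k) - I_{\mathrm{pred}}(t_k)
\;\equiv\; I_{\mathrm{nonpred}}(t_k)
```

So every bit of retained information that fails to predict the future is paid for in dissipation, which is the sense in which an energetically efficient system "has to be predictive".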

🔖 Statistical Physics of Adaptation

Bookmarked Statistical Physics of Adaptation (journals.aps.org Phys. Rev. X 6, 021036 (2016))
Whether by virtue of being prepared in a slowly relaxing, high-free energy initial condition, or because they are constantly dissipating energy absorbed from a strong external drive, many systems subject to thermal fluctuations are not expected to behave in the way they would at thermal equilibrium. Rather, the probability of finding such a system in a given microscopic arrangement may deviate strongly from the Boltzmann distribution, raising the question of whether thermodynamics still has anything to tell us about which arrangements are the most likely to be observed. In this work, we build on past results governing nonequilibrium thermodynamics and define a generalized Helmholtz free energy that exactly delineates the various factors that quantitatively contribute to the relative probabilities of different outcomes in far-from-equilibrium stochastic dynamics. By applying this expression to the analysis of two examples—namely, a particle hopping in an oscillating energy landscape and a population composed of two types of exponentially growing self-replicators—we illustrate a simple relationship between outcome-likelihood and dissipative history. In closing, we discuss the possible relevance of such a thermodynamic principle for our understanding of self-organization in complex systems, paying particular attention to a possible analogy to the way evolutionary adaptations emerge in living things.

🔖 Meaning = Information + Evolution by Carlo Rovelli

Bookmarked Meaning = Information + Evolution (arxiv.org)
Notions like meaning, signal, and intentionality are difficult to relate to a physical world. I study a purely physical definition of "meaningful information", from which these notions can be derived. It is inspired by a model recently illustrated by Kolchinsky and Wolpert, and improves on Dretske's classic work on the relation between knowledge and information. I discuss what makes a physical process into a "signal".

🔖 Irreversibility and Heat Generation in the Computing Process by R. Landauer

Bookmarked Irreversibility and Heat Generation in the Computing Process (ieeexplore.ieee.org)
It is argued that computing machines inevitably involve devices which perform logical functions that do not have a single-valued inverse. This logical irreversibility is associated with physical irreversibility and requires a minimal heat generation, per machine cycle, typically of the order of kT for each irreversible function. This dissipation serves the purpose of standardizing signals and making them independent of their exact logical history. Two simple, but representative, models of bistable devices are subjected to a more detailed analysis of switching kinetics to yield the relationship between speed and energy dissipation, and to estimate the effects of errors induced by thermal fluctuations.
A classical paper in the history of entropy.
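The "order of kT" figure in the abstract can be made concrete with the bound now named after Landauer, k_B T ln 2 of heat per erased bit. A back-of-envelope evaluation at room temperature (my sketch, not code from the paper):

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K (2019 SI exact value)

def landauer_bound(T):
    """Minimum heat dissipated per erased bit at temperature T (kelvin)."""
    return k_B * T * log(2)

E = landauer_bound(300.0)  # room temperature: ≈ 2.87e-21 J per bit
```

That is about 0.018 eV per bit, many orders of magnitude below the dissipation of any present-day logic gate, which is why the bound matters in principle long before it matters in engineering.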

🔖 Why Boltzmann Brains Are Bad by Sean M. Carroll

Bookmarked Why Boltzmann Brains Are Bad (arxiv.org)
Some modern cosmological models predict the appearance of Boltzmann Brains: observers who randomly fluctuate out of a thermal bath rather than naturally evolving from a low-entropy Big Bang. A theory in which most observers are of the Boltzmann Brain type is generally thought to be unacceptable, although opinions differ. I argue that such theories are indeed unacceptable: the real problem is with fluctuations into observers who are locally identical to ordinary observers, and their existence cannot be swept under the rug by a choice of probability distributions over observers. The issue is not that the existence of such observers is ruled out by data, but that the theories that predict them are cognitively unstable: they cannot simultaneously be true and justifiably believed.

🔖 Energy flow and the organization of life | Complexity

Bookmarked Energy flow and the organization of life (Complexity, September 2007)
Understanding the emergence and robustness of life requires accounting for both chemical specificity and statistical generality. We argue that the reverse of a common observation—that life requires a source of free energy to persist—provides an appropriate principle to understand the emergence, organization, and persistence of life on earth. Life, and in particular core biochemistry, has many properties of a relaxation channel that was driven into existence by free energy stresses from the earth's geochemistry. Like lightning or convective storms, the carbon, nitrogen, and phosphorus fluxes through core anabolic pathways make sense as the order parameters in a phase transition from an abiotic to a living state of the geosphere. Interpreting core pathways as order parameters would both explain their stability over billions of years, and perhaps predict the uniqueness of specific optimal chemical pathways.
Download .pdf copy

[1]
H. Morowitz and E. Smith, “Energy flow and the organization of life,” Complexity, vol. 13, no. 1, pp. 51–59, 2007 [Online]. Available: http://dx.doi.org/10.1002/cplx.20191

🔖 Evidence for a limit to human lifespan | Nature Research

Bookmarked Evidence for a limit to human lifespan (nature.com)
Driven by technological progress, human life expectancy has increased greatly since the nineteenth century. Demographic evidence has revealed an ongoing reduction in old-age mortality and a rise of the maximum age at death, which may gradually extend human longevity. Together with observations that lifespan in various animal species is flexible and can be increased by genetic or pharmaceutical intervention, these results have led to suggestions that longevity may not be subject to strict, species-specific genetic constraints. Here, by analysing global demographic data, we show that improvements in survival with age tend to decline after age 100, and that the age at death of the world’s oldest person has not increased since the 1990s. Our results strongly suggest that the maximum lifespan of humans is fixed and subject to natural constraints.
[1]
X. Dong, B. Milholland, and J. Vijg, “Evidence for a limit to human lifespan,” Nature, vol. 538, no. 7624, pp. 257–259, Oct. 2016. [PubMed]

🔖 Hayflick, his limit, and cellular ageing | Nature Reviews Molecular Cell Biology

Bookmarked Hayflick, his limit, and cellular ageing (Nature Reviews Molecular Cell Biology)
Almost 40 years ago, Leonard Hayflick discovered that cultured normal human cells have limited capacity to divide, after which they become senescent — a phenomenon now known as the ‘Hayflick limit’. Hayflick's findings were strongly challenged at the time, and continue to be questioned in a few circles, but his achievements have enabled others to make considerable progress towards understanding and manipulating the molecular mechanisms of ageing.
[1]
J. Shay and W. Wright, “Hayflick, his limit, and cellular ageing,” Nat Rev Mol Cell Biol, vol. 1, no. 1, pp. 72–76, Oct. 2000. [PubMed]

🔖 Thermodynamic Uncertainty Relation for Biomolecular Processes, Phys. Rev. Lett. 114, 158101 (2015)

Bookmarked Thermodynamic Uncertainty Relation for Biomolecular Processes (Phys. Rev. Lett. 114, 158101 (2015) - journals.aps.org)
Biomolecular systems like molecular motors or pumps, transcription and translation machinery, and other enzymatic reactions, can be described as Markov processes on a suitable network. We show quite generally that, in a steady state, the dispersion of observables, like the number of consumed or produced molecules or the number of steps of a motor, is constrained by the thermodynamic cost of generating it. An uncertainty ε requires at least a cost of 2k_B T/ε^2 independent of the time required to generate the output.
[1]
A. C. Barato and U. Seifert, “Thermodynamic Uncertainty Relation for Biomolecular Processes,” Physical Review Letters, vol. 114, no. 15, p. 158101, Apr. 2015 [Online]. Available: http://dx.doi.org/10.1103/PhysRevLett.114.158101
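Written out (my paraphrase of the abstract's statement; X is the steady-state output, e.g. steps of a motor, and ΔS_tot the total entropy production over the observation window), the uncertainty relation reads:

```latex
\epsilon^2 \;\equiv\; \frac{\operatorname{Var}(X)}{\langle X \rangle^2},
\qquad
\langle \Delta S_{\mathrm{tot}} \rangle \, \epsilon^2 \;\ge\; 2 k_B
\;\;\Longleftrightarrow\;\;
T \langle \Delta S_{\mathrm{tot}} \rangle \;\ge\; \frac{2 k_B T}{\epsilon^2} .
```

Halving the relative uncertainty thus costs at least four times the dissipated heat, independent of how long the process takes.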