🔖 Upcoming Special Issue “Information Theory in Neuroscience” | Entropy (MDPI)

Bookmarked Special Issue "Information Theory in Neuroscience" (Entropy | MDPI)
As the ultimate information processing device, the brain naturally lends itself to be studied with information theory. Application of information theory to neuroscience has spurred the development of principled theories of brain function, has led to advances in the study of consciousness, and to the development of analytical techniques to crack the neural code, that is, to unveil the language used by neurons to encode and process information. In particular, advances in experimental techniques enabling precise recording and manipulation of neural activity on a large scale now enable for the first time the precise formulation and the quantitative test of hypotheses about how the brain encodes and transmits across areas the information used for specific functions. This Special Issue emphasizes contributions on novel approaches in neuroscience using information theory, and on the development of new information theoretic results inspired by problems in neuroscience. Research work at the interface of neuroscience, Information Theory and other disciplines is also welcome. A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory". Deadline for manuscript submissions: 1 December 2017

👓 EXCLUSIVE: First human embryos edited in U.S., using CRISPR | MIT Technology Review

Read EXCLUSIVE: First human embryos edited in U.S., using CRISPR by Steve Connor (MIT Technology Review)
Researchers have demonstrated they can efficiently improve the DNA of human embryos.

👓 First Support for a Physics Theory of Life | Quanta Magazine

Read First Support for a Physics Theory of Life by Natalie Wolchover (Quanta Magazine)
Take chemistry, add energy, get life. The first tests of Jeremy England’s provocative origin-of-life hypothesis are in, and they appear to show how order can arise from nothing.
Interesting article with some great references I’ll need to delve into and read.


The situation changed in the late 1990s, when the physicists Gavin Crooks and Chris Jarzynski derived “fluctuation theorems” that can be used to quantify how much more often certain physical processes happen than reverse processes. These theorems allow researchers to study how systems evolve — even far from equilibrium.

I want to take a look at these papers as well as the several that the article is directly about.
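For when I circle back to them, the key result (as I understand it) is the Crooks fluctuation theorem, which gives exactly that ratio of how often a process happens forward versus in reverse, and which averages out to the Jarzynski equality:

```latex
% Crooks fluctuation theorem: a forward trajectory performing work W
% is exponentially more likely than its time-reverse.
\frac{P_F(+W)}{P_R(-W)} = e^{\beta\,(W - \Delta F)}, \qquad \beta = \frac{1}{k_B T}

% Averaging e^{-\beta W} over forward trajectories yields the Jarzynski
% equality, which holds arbitrarily far from equilibrium:
\left\langle e^{-\beta W} \right\rangle = e^{-\beta\,\Delta F}
```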


Any claims that it has to do with biology or the origins of life, he added, are “pure and shameless speculations.”

Some truly harsh words from his former supervisor? Wow!


maybe there’s more that you can get for free

Most of what’s here in this article (and likely in the underlying papers) sounds to me like it has been heavily influenced by the writings of W. Loewenstein and S. Kauffman. They’ve laid out some models/ideas that need more rigorous testing and work, and this seems like a reasonable start to the process. The “get for free” phrase itself is very S. Kauffman in my mind. I’m curious how many times it appears in his work.

The quintessential poolside summer reading: A Mind at Play

The quintessential poolside summer reading: A Mind at Play

Instagram filter used: Clarendon

Photo taken at: Gerrish Swim & Tennis Club

📺 A Universal Theory of Life: Math, Art & Information by Sara Walker

Watched A Universal Theory of Life: Math, Art & Information from TEDxASU
Dr. Walker introduces the concept of information, then proposes that information may be a necessity for biological complexity in this thought-provoking talk on the origins of life. Sara is a theoretical physicist and astrobiologist, researching the origins and nature of life. She is particularly interested in addressing the question of whether or not “other laws of physics” might govern life, as first posed by Erwin Schrödinger in his famous book What Is Life?. She is currently an Assistant Professor in the School of Earth and Space Exploration and Beyond Center for Fundamental Concepts in Science at Arizona State University. She is also a Fellow of the ASU-Santa Fe Institute Center for Biosocial Complex Systems, Founder of the astrobiology-themed social website SAGANet.org, and a member of the Board of Directors of Blue Marble Space. She is active in public engagement in science, with recent appearances on “Through the Wormhole” and NPR’s Science Friday.
https://www.youtube.com/watch?v=kXnt79JhrbY

Admittedly, she only had a few short minutes, but it would have been nice if she’d started out with a precise definition of information. I suspect the majority of her audience didn’t know the definition with which she’s working and it would have helped focus the talk.

Her description of Spiegelman’s Monster was relatively interesting and not very often seen in much of the literature that covers these areas.

I wouldn’t rate this very highly as a TED Talk as it wasn’t as condensed and simple as most, nor was it as hyper-focused, but then again condensing this area into 11 minutes is far from a simple task. I do love that she’s excited enough about the topic that she almost sounds a little out of breath towards the end.

There’s an excellent Eddington quote I’ve mentioned before that would have made an apropos opening for her presentation and might have brought things into higher relief, given her talk title:

Suppose that we were asked to arrange the following in two categories–

distance, mass, electric force, entropy, beauty, melody.

I think there are the strongest grounds for placing entropy alongside beauty and melody and not with the first three.

Sir Arthur Stanley Eddington, OM, FRS (1882-1944), a British astronomer, physicist, and mathematician
in The Nature of the Physical World, 1927


🔖 The hidden simplicity of biology by Paul C W Davies and Sara Imari Walker | Reports on Progress in Physics

Bookmarked The hidden simplicity of biology (Reports on Progress in Physics)
Life is so remarkable, and so unlike any other physical system, that it is tempting to attribute special factors to it. Physics is founded on the assumption that universal laws and principles underlie all natural phenomena, but it is far from clear that there are 'laws of life' with serious descriptive or predictive power analogous to the laws of physics. Nor is there (yet) a 'theoretical biology' in the same sense as theoretical physics. Part of the obstacle in developing a universal theory of biological organization concerns the daunting complexity of living organisms. However, many attempts have been made to glimpse simplicity lurking within this complexity, and to capture this simplicity mathematically. In this paper we review a promising new line of inquiry to bring coherence and order to the realm of biology by focusing on 'information' as a unifying concept.
Downloadable free copy available on ResearchGate.

🔖 The “Hard Problem” of Life by Sara Imari Walker & Paul C.W. Davies

Bookmarked The "Hard Problem" of Life (arXiv)
Chalmers famously identified pinpointing an explanation for our subjective experience as the "hard problem of consciousness". He argued that subjective experience constitutes a "hard problem" in the sense that its explanation will ultimately require new physical laws or principles. Here, we propose a corresponding "hard problem of life" as the problem of how 'information' can affect the world. In this essay we motivate both why the problem of information as a causal agent is central to explaining life, and why it is hard - that is, why we suspect that a full resolution of the hard problem of life will, similar to what has been proposed for the hard problem of consciousness, ultimately not be reducible to known physical principles. Comments: To appear in "From Matter to Life: Information and Causality". S.I. Walker, P.C.W. Davies and G.F.R. Ellis (eds). Cambridge University Press

🔖 The Algorithmic Origins of Life – Sara Walker (SETI Talks)

Bookmarked The Algorithmic Origins of Life by Sara I. Walker (SETI Institute Talks)
The origin of life is arguably one of the greatest unanswered questions in science. A primary challenge is that without a proper definition for life -- a notoriously challenging problem in its own right -- the problem of how life began is not well posed. Here we propose that the transition from non-life to life may correspond to a fundamental shift in causal structure, where information gains direct, and context-dependent, causal efficacy over matter, a transition that may be mapped to a nontrivial distinction in how living systems process information. Dr. Walker will discuss potential measures of such a transition, which may be amenable to laboratory study, and how the proposed mechanism corresponds to the onset of the unique mode of (algorithmic) information processing characteristic of living systems.
https://youtu.be/dPiI4nYD0Vg

🔖 Origins of Life: A Problem for Physics

Bookmarked Origins of Life: A Problem for Physics by Sara I. Walker (arXiv)
The origins of life stands among the great open scientific questions of our time. While a number of proposals exist for possible starting points in the pathway from non-living to living matter, these have so far not achieved states of complexity that are anywhere near that of even the simplest living systems. A key challenge is identifying the properties of living matter that might distinguish living and non-living physical systems such that we might build new life in the lab. This review is geared towards covering major viewpoints on the origin of life for those new to the origin of life field, with a forward look towards considering what it might take for a physical theory that universally explains the phenomenon of life to arise from the seemingly disconnected array of ideas proposed thus far. The hope is that a theory akin to our other theories in fundamental physics might one day emerge to explain the phenomenon of life, and in turn finally permit solving its origins.

🔖 Proceedings of the Artificial Life Conference 2016

Bookmarked Proceedings of the Artificial Life Conference 2016 (The MIT Press)
The ALife conferences have been the major meeting of the artificial life research community since 1987. For its 15th edition in 2016, it was held in Latin America for the first time, in the Mayan Riviera, Mexico, from July 4-8. The special theme of the conference: How can the synthetic study of living systems contribute to societies: scientifically, technically, and culturally? The goal of the conference theme is to better understand societies with the purpose of using this understanding for a more efficient management and development of social systems.
Free download available.

🔖 From Matter to Life: Information and Causality by Sara Imari Walker, Paul C. W. Davies, George F. R. Ellis

Bookmarked From Matter to Life: Information and Causality by Sara Imari Walker, Paul C. W. Davies, and George F. R. Ellis (Cambridge University Press)
Recent advances suggest that the concept of information might hold the key to unravelling the mystery of life's nature and origin. Fresh insights from a broad and authoritative range of articulate and respected experts focus on the transition from matter to life, and hence reconcile the deep conceptual schism between the way we describe physical and biological systems. A unique cross-disciplinary perspective, drawing on expertise from philosophy, biology, chemistry, physics, and cognitive and social sciences, provides a new way to look at the deepest questions of our existence. This book addresses the role of information in life, and how it can make a difference to what we know about the world. Students, researchers, and all those interested in what life is and how it began will gain insights into the nature of life and its origins that touch on nearly every domain of science. Hardcover: 514 pages; ISBN-10: 1107150531; ISBN-13: 978-1107150539;

Repost of John Carlos Baez’s Biology as Information Dynamics

Bookmarked Biology as Information Dynamics by John Carlos Baez (Google+)
I'm giving a talk at the Stanford Complexity Group this Thursday afternoon, April 20th. If you're around - like in Silicon Valley - please drop by! It will be in Clark S361 at 4 pm. Here's the idea.

Everyone likes to say that biology is all about information. There's something true about this - just think about DNA. But what does this insight actually do for us? To figure it out, we need to do some work. Biology is also about things that can make copies of themselves. So it makes sense to figure out how information theory is connected to the 'replicator equation' — a simple model of population dynamics for self-replicating entities.

To see the connection, we need to use relative information: the information of one probability distribution relative to another, also known as the Kullback–Leibler divergence. Then everything pops into sharp focus. It turns out that free energy — energy in forms that can actually be used, not just waste heat — is a special case of relative information. Since the decrease of free energy is what drives chemical reactions, biochemistry is founded on relative information.

But there's a lot more to it than this! Using relative information we can also see evolution as a learning process, fix the problems with Fisher's fundamental theorem of natural selection, and more. So this is what I'll talk about! You can see slides of an old version here: http://math.ucr.edu/home/baez/bio_asu/ but my Stanford talk will be videotaped and it'll eventually be here: https://www.youtube.com/user/StanfordComplexity You can already see lots of cool talks at this location! #biology
Wondering if there’s a way I can manufacture a reason to head to Northern California this week…
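Baez’s “relative information” is just the Kullback–Leibler divergence, which is simple enough to play with directly. A quick sketch in Python (the population frequencies below are hypothetical numbers of my own, purely for illustration):

```python
import math

def relative_information(p, q):
    """Kullback-Leibler divergence D(p||q) in bits: the information of
    distribution p relative to distribution q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy replicator example: frequencies of three replicator types before
# and after selection (hypothetical numbers, just to illustrate).
q = [0.5, 0.3, 0.2]   # earlier distribution of types
p = [0.7, 0.2, 0.1]   # later distribution, after fitter types spread

print(round(relative_information(p, q), 4))  # → 0.1228
```

Note that the divergence is zero only when the two distributions coincide, which is what lets it act as a Lyapunov-style quantity for the replicator equation.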

👓 Study debunks old concept of how anesthesia works | Phys.Org

Read Study debunks old concept of how anesthesia works (phys.org)
Anesthesia induces unconsciousness by changing the function of proteins that reside on the surface of a thin membrane that forms a barrier around all cells, according to new research from Weill Cornell Medicine scientists. The findings challenge a century-old concept of how anesthetics work and may help guide the development of new agents associated with fewer side effects.

🔖 "Opposite-of"-information improves similarity calculations in phenotype ontologies

Bookmarked "Opposite-of"-information improves similarity calculations in phenotype ontologies (bioRxiv)
One of the most important use cases of ontologies is the calculation of similarity scores between a query and items annotated with classes of an ontology. The hierarchical structure of an ontology does not necessarily reflect all relevant aspects of the domain it is modelling, and this can reduce the performance of ontology-based search algorithms. For instance, the classes of phenotype ontologies may be arranged according to anatomical criteria, but individual phenotypic features may affect anatomic entities in opposite ways. Thus, "opposite" classes may be located in close proximity in an ontology; for example enlarged liver and small liver are grouped under abnormal liver size. Using standard similarity measures, these would be scored as being similar, despite in fact being opposites. In this paper, we use information about opposite ontology classes to extend two large phenotype ontologies, the human and the mammalian phenotype ontology. We also show that this information can be used to improve rankings based on similarity measures that incorporate this information. In particular, cosine similarity based measures show large improvements. We hypothesize this is due to the natural embedding of opposite phenotypes in vector space. We support the idea that the expressivity of semantic web technologies should be explored more extensively in biomedical ontologies and that similarity measures should be extended to incorporate more than the pure graph structure defined by the subclass or part-of relationships of the underlying ontologies.
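To convince myself why a purely hierarchical encoding scores opposite phenotypes as similar, here’s a toy sketch of my own; the binary ancestor vectors and the signed “size” axis are hypothetical encodings for illustration, not the paper’s actual construction:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical binary "ancestor" vectors over the classes
# [abnormal liver size, enlarged liver, small liver]:
enlarged_liver = [1, 1, 0]
small_liver    = [1, 0, 1]
print(round(cosine(enlarged_liver, small_liver), 2))  # → 0.5, shared parent makes opposites look similar

# One way to fold in opposite-of information: a signed "size" axis,
# [abnormal liver size, size direction]:
enlarged_signed = [1,  1]
small_signed    = [1, -1]
print(round(cosine(enlarged_signed, small_signed), 2))  # → 0.0, the opposite directions cancel
```

The second encoding is why a vector-space embedding of opposites can help: the shared ancestor still contributes, but the opposed directions subtract rather than add.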