All living things are made of cells, and all cells are powered by electrochemical charges across thin lipid membranes — the ‘proton motive force.’ We know how these electrical charges are generated by protein machines at virtually atomic resolution, but we know very little about how membrane bioenergetics first arose. By tracking back cellular evolution to the last universal common ancestor and beyond, scientist Nick Lane argues that geologically sustained electrochemical charges across semiconducting barriers were central to both energy flow and the formation of new organic matter — growth — at the very origin of life. Dr. Lane is a professor of evolutionary biochemistry in the Department of Genetics, Evolution and Environment at University College London. His research focuses on how energy flow constrains evolution from the origin of life to the traits of complex multicellular organisms. He is a co-director of the new Centre for Life’s Origins and Evolution (CLOE) at UCL, and author of four celebrated books on life’s origins and evolution. His work has been recognized by the Biochemical Society Award in 2015 and the Royal Society Michael Faraday Prize in 2016.
According to Google Scholar, Turing's paper inventing modern computing is only his _second_ most cited paper pic.twitter.com/T1M4k4dMYK
— michael_nielsen (@michael_nielsen) October 7, 2015
Looks like Alan Turing, like Claude Shannon, was interested in microbiology too! I’ll have to dig into this. [pdf]
As the ultimate information processing device, the brain naturally lends itself to being studied with information theory. Application of information theory to neuroscience has spurred the development of principled theories of brain function, has led to advances in the study of consciousness, and to the development of analytical techniques to crack the neural code, that is, to unveil the language used by neurons to encode and process information. In particular, advances in experimental techniques for precise recording and manipulation of neural activity on a large scale now allow, for the first time, the precise formulation and quantitative testing of hypotheses about how the brain encodes and transmits across areas the information used for specific functions. This Special Issue emphasizes contributions on novel approaches in neuroscience using information theory, and on the development of new information theoretic results inspired by problems in neuroscience. Research work at the interface of neuroscience, information theory and other disciplines is also welcome. A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory". Deadline for manuscript submissions: 1 December 2017
Take chemistry, add energy, get life. The first tests of Jeremy England’s provocative origin-of-life hypothesis are in, and they appear to show how order can arise from nothing.
Interesting article with some great references I’ll need to delve into and read.
The situation changed in the late 1990s, when the physicists Gavin Crooks and Chris Jarzynski derived “fluctuation theorems” that can be used to quantify how much more often certain physical processes happen than reverse processes. These theorems allow researchers to study how systems evolve — even far from equilibrium.
I want to take a look at these papers as well as the several the article is directly about.
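As a toy illustration of what these theorems buy you (my own sketch, not from the article or the papers): the Jarzynski equality, a close cousin of the Crooks fluctuation theorem, states that ⟨exp(−βW)⟩ = exp(−βΔF) over repeated nonequilibrium trials, so the equilibrium free-energy change ΔF can be recovered from work measurements far from equilibrium. Assuming a Gaussian work distribution (an idealization), the equality predicts ΔF = μ − βσ²/2, which a quick Monte Carlo check confirms:

```python
import math
import random

random.seed(42)            # fixed seed so the estimate is reproducible
beta = 1.0                 # inverse temperature (units with k_B * T = 1)
mu, sigma = 2.0, 1.0       # assumed mean and std-dev of the work distribution

# For Gaussian work W ~ N(mu, sigma^2), the Jarzynski equality
#   <exp(-beta * W)> = exp(-beta * dF)
# gives dF = mu - beta * sigma**2 / 2 exactly; the average dissipated
# work is therefore beta * sigma**2 / 2.
dF_exact = mu - beta * sigma**2 / 2

# Monte Carlo estimate of the exponential work average.
samples = [random.gauss(mu, sigma) for _ in range(200_000)]
estimate = sum(math.exp(-beta * w) for w in samples) / len(samples)
dF_mc = -math.log(estimate) / beta

assert abs(dF_mc - dF_exact) < 0.05
```

The striking part is that the average work μ = 2.0 overestimates ΔF = 1.5; the rare low-work trajectories, exponentially weighted, are what recover the equilibrium answer, which is exactly the kind of forward-versus-reverse bookkeeping the fluctuation theorems formalize.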
Any claims that it has to do with biology or the origins of life, he added, are “pure and shameless speculations.”
Some truly harsh words from his former supervisor? Wow!
maybe there’s more that you can get for free
Most of what’s here in this article (and likely in the underlying papers) sounds to me as if it was heavily influenced by the writings of W. Loewenstein and S. Kauffman. They’ve laid out some models/ideas that need more rigorous testing and work, and this seems like a reasonable start to the process. The “get for free” phrase itself is very S. Kauffman in my mind. I’m curious how many times it appears in his work?
Apparently there’s a beta Bioinformatics group on Stack Exchange now. It’s just come out of alpha in the last few days and it appears anyone can join now.
Dr. Walker introduces the concept of information, then proposes that information may be a necessity for biological complexity in this thought-provoking talk on the origins of life. Sara is a theoretical physicist and astrobiologist, researching the origins and nature of life. She is particularly interested in addressing the question of whether or not “other laws of physics” might govern life, as first posed by Erwin Schrödinger in his famous book What is Life?. She is currently an Assistant Professor in the School of Earth and Space Exploration and Beyond Center for Fundamental Concepts in Science at Arizona State University. She is also a Fellow of the ASU-Santa Fe Institute Center for Biosocial Complex Systems, Founder of the astrobiology-themed social website SAGANet.org, and a member of the Board of Directors of Blue Marble Space. She is active in public engagement in science, with recent appearances on “Through the Wormhole” and NPR’s Science Friday.
Admittedly, she only had a few short minutes, but it would have been nice if she’d started out with a precise definition of information. I suspect the majority of her audience didn’t know the definition with which she’s working and it would have helped focus the talk.
Her description of Spiegelman’s Monster was relatively interesting and not very often seen in much of the literature that covers these areas.
I wouldn’t rate this very highly as a TED Talk as it wasn’t as condensed and simplistic as most, nor was it as hyper-focused, but then again condensing this area into 11 minutes is far from a simple task. I do love that she’s excited enough about the topic that she almost sounds a little out of breath towards the end.
There’s an excellent Eddington quote I’ve mentioned before that would have been apropos to open her presentation with, and that might have brought things into higher relief given her talk title.
Life is so remarkable, and so unlike any other physical system, that it is tempting to attribute special factors to it. Physics is founded on the assumption that universal laws and principles underlie all natural phenomena, but it is far from clear that there are 'laws of life' with serious descriptive or predictive power analogous to the laws of physics. Nor is there (yet) a 'theoretical biology' in the same sense as theoretical physics. Part of the obstacle in developing a universal theory of biological organization concerns the daunting complexity of living organisms. However, many attempts have been made to glimpse simplicity lurking within this complexity, and to capture this simplicity mathematically. In this paper we review a promising new line of inquiry to bring coherence and order to the realm of biology by focusing on 'information' as a unifying concept.
Downloadable free copy available on ResearchGate.
Chalmers famously identified pinpointing an explanation for our subjective experience as the "hard problem of consciousness". He argued that subjective experience constitutes a "hard problem" in the sense that its explanation will ultimately require new physical laws or principles. Here, we propose a corresponding "hard problem of life" as the problem of how 'information' can affect the world. In this essay we motivate both why the problem of information as a causal agent is central to explaining life, and why it is hard - that is, why we suspect that a full resolution of the hard problem of life will, as has been proposed for the hard problem of consciousness, ultimately not be reducible to known physical principles. Comments: To appear in "From Matter to Life: Information and Causality". S.I. Walker, P.C.W. Davies and G.F.R. Ellis (eds). Cambridge University Press
The origin of life is arguably one of the greatest unanswered questions in science. A primary challenge is that without a proper definition for life -- a notoriously challenging problem in its own right -- the problem of how life began is not well posed. Here we propose that the transition from non-life to life may correspond to a fundamental shift in causal structure, where information gains direct, and context-dependent, causal efficacy over matter, a transition that may be mapped to a nontrivial distinction in how living systems process information. Dr. Walker will discuss potential measures of such a transition, which may be amenable to laboratory study, and how the proposed mechanism corresponds to the onset of the unique mode of (algorithmic) information processing characteristic of living systems.
The origin of life stands among the great open scientific questions of our time. While a number of proposals exist for possible starting points in the pathway from non-living to living matter, these have so far not achieved states of complexity that are anywhere near that of even the simplest living systems. A key challenge is identifying the properties of living matter that might distinguish living and non-living physical systems such that we might build new life in the lab. This review is geared towards covering major viewpoints on the origin of life for those new to the origin of life field, with a forward look towards considering what it might take for a physical theory that universally explains the phenomenon of life to arise from the seemingly disconnected array of ideas proposed thus far. The hope is that a theory akin to our other theories in fundamental physics might one day emerge to explain the phenomenon of life, and in turn finally permit solving its origins.
The ALife conferences have been the major meeting of the artificial life research community since 1987. For its 15th edition in 2016, it was held in Latin America for the first time, in the Mayan Riviera, Mexico, from July 4–8. The special theme of the conference: How can the synthetic study of living systems contribute to societies: scientifically, technically, and culturally? The goal of the conference theme is to better understand societies with the purpose of using this understanding for a more efficient management and development of social systems.
Recent advances suggest that the concept of information might hold the key to unravelling the mystery of life's nature and origin. Fresh insights from a broad and authoritative range of articulate and respected experts focus on the transition from matter to life, and hence reconcile the deep conceptual schism between the way we describe physical and biological systems. A unique cross-disciplinary perspective, drawing on expertise from philosophy, biology, chemistry, physics, and cognitive and social sciences, provides a new way to look at the deepest questions of our existence. This book addresses the role of information in life, and how it can make a difference to what we know about the world. Students, researchers, and all those interested in what life is and how it began will gain insights into the nature of life and its origins that touch on nearly every domain of science. Hardcover: 514 pages; ISBN-10: 1107150531; ISBN-13: 978-1107150539;
I'm giving a talk at the Stanford Complexity Group this Thursday afternoon, April 20th. If you're around - like in Silicon Valley - please drop by! It will be in Clark S361 at 4 pm.

Here's the idea. Everyone likes to say that biology is all about information. There's something true about this - just think about DNA. But what does this insight actually do for us? To figure it out, we need to do some work.

Biology is also about things that can make copies of themselves. So it makes sense to figure out how information theory is connected to the 'replicator equation' — a simple model of population dynamics for self-replicating entities. To see the connection, we need to use relative information: the information of one probability distribution relative to another, also known as the Kullback–Leibler divergence. Then everything pops into sharp focus.

It turns out that free energy — energy in forms that can actually be used, not just waste heat — is a special case of relative information. Since the decrease of free energy is what drives chemical reactions, biochemistry is founded on relative information. But there's a lot more to it than this! Using relative information we can also see evolution as a learning process, fix the problems with Fisher's fundamental theorem of natural selection, and more.

So that's what I'll talk about! You can see slides of an old version here: http://math.ucr.edu/home/baez/bio_asu/ but my Stanford talk will be videotaped and it'll eventually be here: https://www.youtube.com/user/StanfordComplexity You can already see lots of cool talks at this location! #biology
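To make the replicator/relative-information connection concrete, here is a minimal sketch of my own (a toy example, not taken from Baez's slides): under discrete replicator dynamics with a fixed fitness landscape, the relative information D(q‖p) of the population distribution p from the evolutionary optimum q decreases step by step, acting like a Lyapunov function for selection. The fitness values below are arbitrary illustrative choices.

```python
import math

def replicator_step(p, f):
    """One discrete replicator update: p_i' = p_i * f_i / (mean fitness)."""
    mean_f = sum(pi * fi for pi, fi in zip(p, f))
    return [pi * fi / mean_f for pi, fi in zip(p, f)]

def relative_information(q, p):
    """Kullback-Leibler divergence D(q || p), skipping zero-probability terms."""
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

f = [1.0, 1.2, 1.5]            # toy fitness landscape: type 2 is fittest
p = [1/3, 1/3, 1/3]            # initial population distribution
q = [0.0, 0.0, 1.0]            # the optimum: everyone is the fittest type

divergences = []
for _ in range(50):
    divergences.append(relative_information(q, p))
    p = replicator_step(p, f)

# Selection concentrates the population on the fittest type, so the
# relative information from the optimum shrinks monotonically.
assert all(a > b for a, b in zip(divergences, divergences[1:]))
```

Here D(q‖p) reduces to −log p₂, so its monotone decay is just the statement that the fittest type's share grows every generation; the appeal of the relative-information framing is that the same quantity generalizes to mixed optima and connects to free energy.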
Wondering if there’s a way I can manufacture a reason to head to Northern California this week…