Apparently there’s a beta Bioinformatics group on Stack Exchange now. It’s just come out of alpha in the last few days and it appears anyone can join now.
Dr. Walker introduces the concept of information, then proposes that information may be a necessity for biological complexity in this thought-provoking talk on the origins of life. Sara is a theoretical physicist and astrobiologist researching the origins and nature of life. She is particularly interested in addressing the question of whether “other laws of physics” might govern life, as first posed by Erwin Schrödinger in his famous book What Is Life?. She is currently an Assistant Professor in the School of Earth and Space Exploration and the Beyond Center for Fundamental Concepts in Science at Arizona State University. She is also a Fellow of the ASU-Santa Fe Institute Center for Biosocial Complex Systems, founder of the astrobiology-themed social website SAGANet.org, and a member of the Board of Directors of Blue Marble Space. She is active in public engagement in science, with recent appearances on “Through the Wormhole” and NPR’s Science Friday.
Admittedly, she only had a few short minutes, but it would have been nice if she’d started out with a precise definition of information. I suspect the majority of her audience didn’t know the definition with which she’s working and it would have helped focus the talk.
Her description of Spiegelman’s Monster was relatively interesting, and it’s a topic not often seen in the literature covering these areas.
I wouldn’t rate this very highly as a TED Talk as it wasn’t as condensed and simplistic as most, nor was it as hyper-focused, but then again condensing this area into 11 minutes is far from a simple task. I do love that she’s excited enough about the topic that she almost sounds a little out of breath towards the end.
There’s an excellent Eddington quote I’ve mentioned before that would have been apropos to have opened up her presentation that might have brought things into higher relief given her talk title:
Life is so remarkable, and so unlike any other physical system, that it is tempting to attribute special factors to it. Physics is founded on the assumption that universal laws and principles underlie all natural phenomena, but it is far from clear that there are 'laws of life' with serious descriptive or predictive power analogous to the laws of physics. Nor is there (yet) a 'theoretical biology' in the same sense as theoretical physics. Part of the obstacle in developing a universal theory of biological organization concerns the daunting complexity of living organisms. However, many attempts have been made to glimpse simplicity lurking within this complexity, and to capture this simplicity mathematically. In this paper we review a promising new line of inquiry to bring coherence and order to the realm of biology by focusing on 'information' as a unifying concept.
Downloadable free copy available on ResearchGate.
Chalmers famously identified pinpointing an explanation for our subjective experience as the "hard problem of consciousness". He argued that subjective experience constitutes a "hard problem" in the sense that its explanation will ultimately require new physical laws or principles. Here, we propose a corresponding "hard problem of life" as the problem of how 'information' can affect the world. In this essay we motivate both why the problem of information as a causal agent is central to explaining life, and why it is hard - that is, why we suspect that a full resolution of the hard problem of life will, as has been proposed for the hard problem of consciousness, ultimately not be reducible to known physical principles.
Comments: To appear in "From Matter to Life: Information and Causality". S.I. Walker, P.C.W. Davies and G.F.R. Ellis (eds). Cambridge University Press
The origin of life is arguably one of the greatest unanswered questions in science. A primary challenge is that without a proper definition for life -- a notoriously challenging problem in its own right -- the problem of how life began is not well posed. Here we propose that the transition from non-life to life may correspond to a fundamental shift in causal structure, where information gains direct, and context-dependent, causal efficacy over matter, a transition that may be mapped to a nontrivial distinction in how living systems process information. Dr. Walker will discuss potential measures of such a transition, which may be amenable to laboratory study, and how the proposed mechanism corresponds to the onset of the unique mode of (algorithmic) information processing characteristic of living systems.
The origin of life stands among the great open scientific questions of our time. While a number of proposals exist for possible starting points in the pathway from non-living to living matter, these have so far not achieved states of complexity that are anywhere near that of even the simplest living systems. A key challenge is identifying the properties of living matter that might distinguish living and non-living physical systems such that we might build new life in the lab. This review is geared towards covering major viewpoints on the origin of life for those new to the origin of life field, with a forward look towards considering what it might take for a physical theory that universally explains the phenomenon of life to arise from the seemingly disconnected array of ideas proposed thus far. The hope is that a theory akin to our other theories in fundamental physics might one day emerge to explain the phenomenon of life, and in turn finally permit solving its origins.
The ALife conferences have been the major meeting of the artificial life research community since 1987. For its 15th edition in 2016, the conference was held in Latin America for the first time, in the Mayan Riviera, Mexico, from July 4-8. The special theme of the conference: How can the synthetic study of living systems contribute to societies: scientifically, technically, and culturally? The goal of the conference theme is to better understand societies with the purpose of using this understanding for a more efficient management and development of social systems.
Recent advances suggest that the concept of information might hold the key to unravelling the mystery of life's nature and origin. Fresh insights from a broad and authoritative range of articulate and respected experts focus on the transition from matter to life, and hence reconcile the deep conceptual schism between the way we describe physical and biological systems. A unique cross-disciplinary perspective, drawing on expertise from philosophy, biology, chemistry, physics, and cognitive and social sciences, provides a new way to look at the deepest questions of our existence. This book addresses the role of information in life, and how it can make a difference to what we know about the world. Students, researchers, and all those interested in what life is and how it began will gain insights into the nature of life and its origins that touch on nearly every domain of science. Hardcover: 514 pages; ISBN-10: 1107150531; ISBN-13: 978-1107150539;
I'm giving a talk at the Stanford Complexity Group this Thursday afternoon, April 20th. If you're around - like in Silicon Valley - please drop by! It will be in Clark S361 at 4 pm.

Here's the idea. Everyone likes to say that biology is all about information. There's something true about this - just think about DNA. But what does this insight actually do for us? To figure it out, we need to do some work.

Biology is also about things that can make copies of themselves. So it makes sense to figure out how information theory is connected to the 'replicator equation' — a simple model of population dynamics for self-replicating entities. To see the connection, we need to use relative information: the information of one probability distribution relative to another, also known as the Kullback–Leibler divergence. Then everything pops into sharp focus.

It turns out that free energy — energy in forms that can actually be used, not just waste heat — is a special case of relative information. Since the decrease of free energy is what drives chemical reactions, biochemistry is founded on relative information. But there's a lot more to it than this! Using relative information we can also see evolution as a learning process, fix the problems with Fisher's fundamental theorem of natural selection, and more.

So this is what I'll talk about! You can see slides of an old version here: http://math.ucr.edu/home/baez/bio_asu/ but my Stanford talk will be videotaped and it'll eventually be here: https://www.youtube.com/user/StanfordComplexity You can already see lots of cool talks at this location! #biology
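The two central objects in Baez's announcement, relative information and the replicator equation, are easy to play with numerically. The sketch below is my own illustration, not taken from his slides: a toy three-type replicator whose relative information to its eventual endpoint shrinks as the population evolves, the "evolution as learning" picture in miniature.

```python
import numpy as np

def kl_divergence(p, q):
    """Relative information of distribution p with respect to q (in nats)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

def replicator_step(p, fitness, dt=0.01):
    """One Euler step of the replicator equation dp_i/dt = p_i (f_i - <f>)."""
    mean_fitness = np.dot(p, fitness)
    return p + dt * p * (fitness - mean_fitness)

# Toy example: three self-replicating types with fixed fitnesses.
fitness = np.array([1.0, 2.0, 3.0])
p = np.array([1/3, 1/3, 1/3])
# In the long run the fittest type takes over; use a near-delta endpoint.
target = np.array([1e-9, 1e-9, 1.0 - 2e-9])

d_prev = kl_divergence(target, p)
for _ in range(1000):            # integrate to t = 10
    p = replicator_step(p, fitness)
    p /= p.sum()                 # guard against numerical drift
d_now = kl_divergence(target, p)
# d_now is far smaller than d_prev: the population has "learned" the
# fittest type, and relative information to the endpoint has decreased.
```

Here KL(target ‖ p) plays the role of relative information; how free energy arises as a special case is developed in the talk itself.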
Wondering if there’s a way I can manufacture a reason to head to Northern California this week…
Anesthesia induces unconsciousness by changing the function of proteins that reside on the surface of a thin membrane that forms a barrier around all cells, according to new research from Weill Cornell Medicine scientists. The findings challenge a century-old concept of how anesthetics work and may help guide the development of new agents associated with fewer side effects.
One of the most important use cases of ontologies is the calculation of similarity scores between a query and items annotated with classes of an ontology. The hierarchical structure of an ontology does not necessarily reflect all relevant aspects of the domain it is modelling, and this can reduce the performance of ontology-based search algorithms. For instance, the classes of phenotype ontologies may be arranged according to anatomical criteria, but individual phenotypic features may affect anatomic entities in opposite ways. Thus, "opposite" classes may be located in close proximity in an ontology; for example, "enlarged liver" and "small liver" are grouped under "abnormal liver size". Using standard similarity measures, these would be scored as being similar, despite in fact being opposites. In this paper, we use information about opposite ontology classes to extend two large phenotype ontologies, the human and the mammalian phenotype ontology, and we show that this information can be used to improve rankings based on similarity measures. In particular, cosine similarity based measures show large improvements. We hypothesize this is due to the natural embedding of opposite phenotypes in vector space. We support the idea that the expressivity of semantic web technologies should be explored more extensively in biomedical ontologies and that similarity measures should be extended to incorporate more than the pure graph structure defined by the subclass or part-of relationships of the underlying ontologies.
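To see why a vector-space encoding handles opposite classes well, consider a deliberately tiny, hypothetical embedding (the coordinates below are illustrative, not taken from the actual phenotype ontologies): one signed axis carries the direction of the size abnormality, so the two opposites cancel under cosine similarity while each remains close to their shared parent class.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical 2-D embedding: axis 0 carries the signed "size" qualifier,
# axis 1 marks involvement of the liver at all.
enlarged_liver      = np.array([+1.0, 1.0])
small_liver         = np.array([-1.0, 1.0])
abnormal_liver_size = np.array([ 0.0, 1.0])  # the shared parent class

print(cosine(enlarged_liver, abnormal_liver_size))  # ≈ 0.71: close to the parent
print(cosine(small_liver, abnormal_liver_size))     # ≈ 0.71: close to the parent
print(cosine(enlarged_liver, small_liver))          # 0.0: the opposites cancel
```

A purely graph-based measure would treat the two leaves as equally similar siblings; the signed axis is what lets the embedding tell them apart.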
@lpachter Your cup of tea over at UCLA next week? Regulatory & Epigenetic Stochasticity in Development & Disease http://www.ipam.ucla.edu/programs/workshops/regulatory-and-epigenetic-stochasticity-in-development-and-disease
Pachter, a computational biologist and Caltech alumnus, returns to the Institute to study the role and function of RNA.
Lior Pachter (BS ’94) is Caltech’s new Bren Professor of Computational Biology. Recently, he was elected a fellow of the International Society for Computational Biology, one of the highest honors in the field. We sat down with him to discuss the emerging field of applying computational methods to biology problems, the transition from mathematics to biology, and his return to Pasadena.
The interplay between structural connections and emerging information flow in the human brain remains an open research problem. A recent study observed global patterns of directional information flow in empirical data using the measure of transfer entropy. For higher frequency bands, the overall direction of information flow was from posterior to anterior regions, whereas an anterior-to-posterior pattern was observed in lower frequency bands. In this study, we applied a simple Susceptible-Infected-Susceptible (SIS) epidemic spreading model on the human connectome with the aim of revealing the topological properties of the structural network that give rise to these global patterns. We found that direct structural connections induced higher transfer entropy between two brain regions and that transfer entropy decreased with increasing distance between nodes (in terms of hops in the structural network). Applying the SIS model, we were able to confirm the empirically observed opposite information flow patterns; posterior hubs in the structural network seem to play a dominant role in the network dynamics. For small time scales, when these hubs acted as strong receivers of information, the global pattern of information flow was in the posterior-to-anterior direction, and in the opposite direction when they were strong senders. Our analysis suggests that these global patterns of directional information flow are the result of an unequal spatial distribution of the structural degree between posterior and anterior regions, and their directions seem to be linked to different time scales of the spreading process.
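The SIS dynamics the abstract relies on can be sketched in a few lines. The graph below is a toy five-node hub-plus-chain network standing in for the connectome, and the parameter values are arbitrary; the point is only to show the update rule (independent per-edge infection with probability beta, recovery with probability mu).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected network: node 0 is a high-degree hub (a stand-in for a
# posterior hub region); nodes 1-4 form a sparse chain.
n = 5
A = np.zeros((n, n), dtype=int)
for j in range(1, n):
    A[0, j] = A[j, 0] = 1          # hub edges
for j in range(1, n - 1):
    A[j, j + 1] = A[j + 1, j] = 1  # chain edges

beta, mu = 0.3, 0.1                # per-step infection / recovery probabilities
state = np.zeros(n, dtype=int)
state[0] = 1                       # seed activity at the hub

prevalence = []
for _ in range(200):
    # A susceptible node escapes infection from each infected neighbour
    # independently with probability (1 - beta).
    p_infect = 1 - (1 - beta) ** (A @ state)
    new_inf = (rng.random(n) < p_infect) & (state == 0)
    recover = (rng.random(n) < mu) & (state == 1)
    state = np.where(new_inf, 1, state)
    state = np.where(recover, 0, state)
    prevalence.append(state.mean())
```

With beta/mu well above the epidemic threshold, activity persists and the hub's degree dominates how quickly it spreads; the study pairs this kind of simulation with transfer entropy estimates between node time series.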