Brief book overview of Matthew Cobb's "Life’s Greatest Secret" from The Economist.
Life’s Greatest Secret: The Story of the Race to Crack the Genetic Code. By Matthew Cobb. Basic Books; 434 pages; $29.99. Profile Books; £25.
How a mathematical breakthrough from the 1960s now powers everything from spacecraft to cell phones.
“The Molecular Programming Project aims to develop computer science principles for programming information-bearing molecules like DNA and RNA to create artificial biomolecular programs of similar complexity. Our long-term vision is to establish molecular programming as a subdiscipline of computer science — one that will enable a yet-to-be imagined array of applications from chemical circuitry for interacting with biological molecules to nanoscale computing and molecular robotics.”
Source: MPP: Home
Top chefs and Harvard researchers explore how everyday cooking and haute cuisine can illuminate basic principles in physics and engineering, and vice versa.
The School of Information Theory will bring together over 100 graduate students, postdoctoral scholars, and leading researchers for four action-packed days of learning, stimulating discussions, professional networking and fun activities, all on the beautiful campus of the University of California, San Diego (UCSD) and in the nearby beach town of La Jolla.
- Tutorials by some of the best known researchers in information theory and related fields
- Poster presentations by student participants with feedback and discussion
- Panel discussion on “IT: Academia vs. Industry Perspectives”
- Social events and fun activities
BIRS 5-day workshop, arriving in Banff, Alberta on Sunday, June 7 and departing Friday, June 12, 2015.
In the years since the first assembly of the human genome, our picture of the complex and vital role of RNA and RNA-binding proteins in regulating genome expression has expanded considerably, through the discovery of new RNA-binding proteins and of large classes of non-coding RNA that control many developmental decisions as part of protein-RNA complexes. Our molecular-level understanding of RNA regulation has improved dramatically as many new structures of RNA-protein complexes have been determined, and as sophisticated experimental technologies and dedicated computational modeling have emerged to investigate these interactions at the whole-genome level. Deeper insight into the molecular mechanisms that underlie the regulation of genome expression is critical for understanding fundamental biology and disease progression, and for discovering new approaches to interfere with that progression.
The proposed workshop will bring together experts in RNA biology with structural biologists that focus on RNA-protein complexes, as well as computational biologists who seek to model and develop predictive tools based on the confluence of these experimental advances. The workshop intends to foster new collaborations between experimental and computational biologists and catalyze the development of new and improved technologies (such as single cell binding methods) that merge experimental analysis with novel mathematical and computational techniques to better understand the rules of protein-RNA recognition and RNA-based biological regulation.
The organizers of the workshop are both leaders in the field of protein-RNA recognition and interactions. Yael Mandel-Gutfreund has been working in the field of protein-nucleic acid interactions since 1994. Her main research interest is protein-RNA recognition and regulation. She has developed several tools and web servers for predicting RNA-binding proteins and RNA-binding motifs. Yael is the head of the computational molecular laboratory at the Technion and the president of the Israeli Society of Bioinformatics and Computational Biology. Gabriele Varani has been working in the field of RNA structure and protein-RNA interactions since 1987. His main research interest is the structural basis of protein-RNA recognition and the prediction and design of RNA-binding proteins. He determined some of the first structures of protein-RNA complexes and developed computational tools to analyze and predict the specificity of RNA-binding proteins. His group applies these tools to design RNA-binding proteins with new specificities to control gene expression. Our invitation to participate in the workshop has been met with great enthusiasm by the researchers: more than 20 principal investigators have already confirmed their interest in attending. Six of the confirmed participants are female scientists, including the organizer Yael Mandel-Gutfreund as well as Traci Hall, Lynne Maquat, Elena Conti, Susan Jones, and Drena Dobbs. We have also invited and confirmed the participation of young and promising researchers, including Markus Landthaler, Gene Yeo, Jernej Ule, Uwe Ohler, and others. Our confirmed participants come from many different countries: the US, Canada, the UK, Scotland, Germany, Spain, Switzerland, Poland, and Israel. Two confirmed participants as well as the organizer have attended BIRS workshops in the past.
A key objective of the workshop is to bring together researchers with experimental, mathematical, and computational backgrounds to share results and discuss the main advances and challenges in the prediction, analysis, and control of RNA-protein recognition and RNA-based regulation of gene expression. Towards this aim, we plan to adopt the format of previous BIRS meetings, in which invited participants (including selected students) give relatively short presentations of 20 minutes followed by 10 minutes of active discussion. This format leaves ample time for informal discussions to foster exchanges between participants. To stress the collaborative, multidisciplinary nature of the workshop, we plan to dedicate each workshop session to a specific topic comprising structural, experimental, and computational presentations, rather than creating sessions focused on a particular approach. Each session will include at least one lecture from a young scientist, postdoctoral fellow, or student, to be chosen from among the attendees by the organizers.
Suggested preliminary schedule:
- Day 1: Modeling and high throughput approaches to genome-wide analysis of protein-RNA interactions
- Day 2: Predicting and designing new RNA binding proteins
- Day 3: Generating and modeling RNA-based regulatory networks
- Day 4: Principles of RNA regulation by RNA binding proteins
- Day 5: Conclusion round table discussion on the present and future challenges of the field
The Edge.org's interview with Richard Dawkins.
“My vision of life is that everything extends from replicators, which are in practice DNA molecules on this planet. The replicators reach out into the world to influence their own probability of being passed on. Mostly they don’t reach further than the individual body in which they sit, but that’s a matter of practice, not a matter of principle. The individual organism can be defined as that set of phenotypic products which have a single route of exit of the genes into the future. That’s not true of the cuckoo/reed warbler case, but it is true of ordinary animal bodies. So the organism, the individual organism, is a deeply salient unit. It’s a unit of selection in the sense that I call a “vehicle”. There are two kinds of unit of selection. The difference is a semantic one. They’re both units of selection, but one is the replicator, and what it does is get itself copied. So more and more copies of itself go into the world. The other kind of unit is the vehicle. It doesn’t get itself copied. What it does is work to copy the replicators which have come down to it through the generations, and which it’s going to pass on to future generations. So we have this individual replicator dichotomy. They’re both units of selection, but in different senses. It’s important to understand that they are different senses.”
RICHARD DAWKINS is an evolutionary biologist; Emeritus Charles Simonyi Professor of the Public Understanding of Science, Oxford; Author, The Selfish Gene; The Extended Phenotype; Climbing Mount Improbable; The God Delusion; An Appetite For Wonder; and (forthcoming) A Brief Candle In The Dark.
Watch the entire video interview and read the transcript at Edge.org.
Beauty, even in maths, can exist in the eye of the beholder. That might sound a little surprising: after all, what could be more objective than mathematics when thinking about truth, and what, therefore, could be more natural than for beauty and goodness, the twin accomplices of truth, to be conjoined?
In the 70-odd years since Samuel Eilenberg and Saunders Mac Lane published their now famous paper “A General Theory of Natural Equivalences“, the pursuit of maths by professionals (I use here the reference-point definition of Michael Harris – see his recent publication “Mathematics without Apologies“) has become ever more specialised. I, for one, don’t doubt that cross-disciplinary excellence is alive, and sometimes robustly so, but the industrially specialised silos that now create, produce and then sustain academic tenure are formidable within the community of mathematicians.
Beauty, in the purest sense, does not need to be captured in a definition but recognised through intuition. Whether we take our inspiration from Hardy or Dirac, or whether we experience a gorgeous thrill when encountering an austere proof that may have been confronted thousands of times before, the confluence of simplicity and beauty in maths may well be one of the few remaining places where the commonality of the “eye” across a spectrum of different beholders remains at its strongest.
Neither Eilenberg nor Mac Lane could have thought that Category theory, their attempt to link topology and algebra, would become so pervasive or so foundational in its influence when they completed and submitted their paper in those dark days of WWII. But then neither could Cantor have dreamt of his work on Set theory being adopted as the central pillar of “modern” mathematics so soon after his death. Under attack from establishment figures such as Kronecker during his lifetime, Cantor would not have believed that set theory would become the central edifice around which so much would be constructed.
Of course that is exactly what has happened. Set theory and the ascending magnitude of infinities that were unleashed through the crack in the door that was represented by Cantor’s diagonal conquered all before them.
Until now, that is.
In an article in Science News, Julie Rehmeyer describes Category Theory as “perhaps the most abstract area of all mathematics” and “where math is the abstraction of the real world, category theory is an abstraction of mathematics”.
Slowly, without fanfare, and with an alliance built with the emergent post-transistor discipline of computer science, Category theory looks set to become the dominant foundational basis for all mathematics. It could, in fact, already have achieved that status through stealth. After all, if sets are merely an example of a category, they become suborned without question or query. One might even use the description ‘subsumed’.
There is, in parallel, a wide-ranging discussion in mathematics about the so-called Univalent Foundations, most widely associated with Voevodsky, which is not the same thing. The textbook produced for the year-long univalence programme initiated at the IAS and completed in 2013, Homotopy Type Theory: Univalent Foundations of Mathematics, states:
“The univalence axiom implies, in particular, that isomorphic structures can be identified, a principle that mathematicians have been happily using on workdays, despite its incompatibility with the “official” doctrines of conventional foundations.”
before going on to present the revelatory exposition that Univalent Foundations are the real unifying binding agent around mathematics.
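As a sketch, in the notation of the HoTT book, the univalence axiom says that for types $A$ and $B$ in a universe $\mathcal{U}$, the canonical map taking an identification of types to an equivalence between them is itself an equivalence:

```latex
\[
  (A =_{\mathcal{U}} B) \;\simeq\; (A \simeq B)
\]
```

so equivalent (in particular, isomorphic) structures really can be identified, which is the principle the quoted passage alludes to.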
I prefer to think of Voevodsky’s agenda as being narrower in many crucial respects than Category Theory, although both owe a huge amount to the over-arching reach of computational advances made through the mechanical aid proffered through the development of computers, particularly if one shares Voevodsky’s view that proofs will eventually have to be subject to mechanical confirmation.
In contrast, the post-Russell journey of type-theory-based clarificatory approaches to formal logic continues in various ways. But the unifying effort that Category theory brings to the whole of mathematics had to wait almost two decades after Eilenberg and Mac Lane’s paper, until a then virtually unknown mathematician, William Lawvere, published his now much-vaunted “An Elementary Theory of the Category of Sets” in 1964. This paper, and the revolutionary work of Grothendieck (see below), brought about a depth and breadth of work which created the environment from which Category Theory emerged through the subsequent decades until the early 2000s.
Lawvere’s work has at times been seen as an attempt simply to re-work set theory in Category-theoretic terms. That limitation is no longer prevalent; indeed, the most recent biographical reviews of Grothendieck, following his death, take the unificatory expedient that is the essential feature of Category theory (and I should say here not just ETCS) for granted, axiomatic even. Grothendieck eventually went much further than defining Category theory in set-theoretic terms, and both Algebraic Topology and Mathematical Physics are now fields that could not be approached without the foundational setting that Category theory provides. The early language and notation of Category Theory, in which categories ‘C’ are described essentially as sets whose members satisfy the conditions of composition, morphism and identity, eventually gave way, after Lawvere and then Lambek, to the systematic approach we now see, in which any and all deductive systems can be turned into categories. Most standard histories give due credit to Eilenberg and Mac Lane as well as Lawvere (and sometimes Cartan), but it is Grothendieck’s ‘Sur quelques points d’algèbre homologique’ of 1957 that is now seen as the real ground-breaker.
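To make the early definition concrete, here is a minimal sketch in Python (my own toy construction, not drawn from any category-theory library) of a fragment of the category of finite sets: a morphism carries a domain, a codomain and a mapping; composition is defined only when codomain and domain match; and identities act as units.

```python
# A toy "category of finite sets": objects are frozensets,
# morphisms are (domain, codomain, mapping) triples.

def identity(obj):
    """Identity morphism on a finite set."""
    return (obj, obj, {x: x for x in obj})

def compose(g, f):
    """Composition g . f, defined only when cod(f) == dom(g)."""
    (fd, fc, fm), (gd, gc, gm) = f, g
    assert fc == gd, "composition undefined: codomain/domain mismatch"
    return (fd, gc, {x: gm[fm[x]] for x in fd})

A = frozenset({1, 2})
B = frozenset({'a', 'b'})
f = (A, B, {1: 'a', 2: 'b'})

# Category laws: identities are left and right units for composition.
assert compose(f, identity(A)) == f
assert compose(identity(B), f) == f
```

The point of the later, post-Lawvere view is that nothing here depends on the objects being sets: any system with identities and associative composition qualifies.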
My own pathway to Category theory has been via my interest in Lie groups and, more broadly, in Quantum Computing. It was only by accident (the best things really are those that come about by accident!) that I decided I had better learn the language of Category theory, when I found Lawvere’s paper misleadingly familiar but annoyingly distant: in common with most people, I had assumed that my working knowledge of notation in logic and in set theory would map smoothly across to Category theory. That, of course, is not the case, and it was only after I gained some grounding in this new language that I realised just how and why Category theory has an impact far beyond computer science. It is this journey that has also brought me face to face with a growing appreciation of the natural intersection between Category theory and a Wittgensteinian approach to the Philosophy of Mathematics. Wittgenstein’s disdain for Cantor is well documented (this short note is not an attempt to justify, using Category theory, a Wittgensteinian criticism of set theory). More specifically, however, it was Abramsky and Coecke’s “Categorical Quantum Mechanics” that helped me to discern more carefully the links between Category Theory and Quantum Computing. They describe Category Theory as the ‘language of modern structural mathematics’ and use it as the tool for building a mathematical representation of quantum processes; their paper is a thought-provoking nudge in the ribs for anyone trying to make sense of the current noise that surrounds Quantum Mechanics.
Awodey and Spivak are, in my view, the two most impressive contemporary mathematicians working on Category Theory, and whilst it is asking for trouble to choose one or two selected works as exemplars of their approach, I would have to say that Spivak’s book Category Theory for the Sciences is the standout work of recent times (incidentally, the section in this book on ‘aspects’ bears close comparison with Wittgenstein’s well-known work on ‘family resemblances’).
Awodey’s 2003 paper is as good a recent balance between a mathematical and a philosophical exposition of the importance of category theory as exists, whilst his textbook is often referred to as the standard entry point for working mathematicians.
Going back to beauty, which is how I started this short note: Barry Mazur wrote an article in memory of Saunders Mac Lane titled ‘When is one thing equal to another‘, which is a gem of rare beauty, and the actual catalyst for this note. If you read only one document in the links from this article, then I hope it is Mazur’s paper.
I’ll also give him a shout out for being a mathematician with a fledgling blog: Rick’s Ramblings.
The Postdoctoral Experience Revisited builds on the 2000 report Enhancing the Postdoctoral Experience for Scientists and Engineers. That ground-breaking report assessed the postdoctoral experience and provided principles, action points, and recommendations to enhance that experience. Since the publication of the 2000 report, the postdoctoral landscape has changed considerably. The percentage of PhDs who pursue postdoctoral training is growing steadily and spreading from the biomedical and physical sciences to engineering and the social sciences. The average length of time spent in postdoctoral positions seems to be increasing. The Postdoctoral Experience Revisited reexamines postdoctoral programs in the United States, focusing on how postdocs are being guided and managed, how institutional practices have changed, and what happens to postdocs after they complete their programs. This book explores important changes that have occurred in postdoctoral practices and the research ecosystem and assesses how well current practices meet the needs of these fledgling scientists and engineers and of the research enterprise. The Postdoctoral Experience Revisited takes a fresh look at current postdoctoral fellows - how many there are, where they are working, in what fields, and for how many years. This book makes recommendations to improve aspects of programs - postdoctoral period of service, title and role, career development, compensation and benefits, and mentoring. Current data on demographics, career aspirations, and career outcomes for postdocs are limited. This report makes the case for better data collection by research institutions and better data sharing. A larger goal of this study is not only to propose ways to make the postdoctoral system better for the postdoctoral researchers themselves but also to better understand the role that postdoctoral training plays in the research enterprise.
It is also to ask whether there are alternative ways to satisfy some of the research and career development needs of postdoctoral researchers that are now being met with several years of advanced training. Postdoctoral researchers are the future of the research enterprise. The discussion and recommendations of The Postdoctoral Experience Revisited will stimulate action toward clarifying the role of postdoctoral researchers and improving their status and experience.
Most might agree that our educational system is far less than ideal, but few pay attention to significant problems at the highest levels of academia which are holding back a great deal of our national “innovation machinery”. The National Academy of Sciences has published a (free) book: The Postdoctoral Experience (Revisited) discussing where we’re at and some ideas for a way forward. There are some interesting ideas here, but we’ve still got a long way to go.
Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available.
It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.
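Jaynes's prescription can be made concrete with his dice example. The sketch below (my own illustration, not drawn from the paper) finds the maximum-entropy distribution over the faces of a die subject to a fixed mean: the constrained maximization gives p_i proportional to exp(-λ·i), and the Lagrange multiplier λ is found by bisection on the resulting mean.

```python
import math

def maxent_die(target_mean, faces=(1, 2, 3, 4, 5, 6)):
    """Maximum-entropy distribution over die faces with a fixed mean.

    The max-entropy solution has p_i = exp(-lam * i) / Z, where Z is
    the partition function; we solve for lam by bisection, since the
    mean is a decreasing function of lam.
    """
    def mean(lam):
        Z = sum(math.exp(-lam * f) for f in faces)
        return sum(f * math.exp(-lam * f) for f in faces) / Z

    lo, hi = -50.0, 50.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean(mid) > target_mean:
            lo = mid        # mean too high -> need larger lam
        else:
            hi = mid
    lam = (lo + hi) / 2
    Z = sum(math.exp(-lam * f) for f in faces)
    return [math.exp(-lam * f) / Z for f in faces]

# A die constrained to average 4.5 instead of the unconstrained 3.5:
p = maxent_die(4.5)   # probabilities tilt toward the high faces
```

With no constraint beyond normalization the same machinery returns the uniform distribution, which is the "maximally noncommittal" estimate the abstract describes.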
We describe the evolution of macromolecules as an information transmission process and apply tools from Shannon information theory to it. This allows us to isolate three independent, competing selective pressures that we term compression, transmission, and neutrality selection. The first two affect genome length: the pressure to conserve resources by compressing the code, and the pressure to acquire additional information that improves the channel, increasing the rate of information transmission into each offspring. Noisy transmission channels (replication with mutations) give rise to a third pressure that acts on the actual encoding of information; it maximizes the fraction of mutations that are neutral with respect to the phenotype. This neutrality selection has important implications for the evolution of evolvability. We demonstrate each selective pressure in experiments with digital organisms.
To be published in J. theor. Biology 222 (2003) 477-483
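The neutral fraction the abstract refers to is easy to illustrate with a toy genotype-phenotype map (my own construction, not the paper's digital-organism system): if only the first k bits of a length-L genome are expressed, then exactly (L - k)/L of single-point mutations are neutral.

```python
def neutral_fraction(genome, phenotype):
    """Fraction of single-bit mutations leaving the phenotype unchanged."""
    L = len(genome)
    neutral = 0
    for i in range(L):
        flipped = '1' if genome[i] == '0' else '0'
        mutant = genome[:i] + flipped + genome[i+1:]
        if phenotype(mutant) == phenotype(genome):
            neutral += 1
    return neutral / L

k = 4
pheno = lambda g: g[:k]      # toy map: only the first k bits are expressed
g = '1010' + '0' * 12        # L = 16, so 12 of 16 sites are neutral
print(neutral_fraction(g, pheno))  # → 0.75
```

Neutrality selection favors encodings that push this fraction up for a fixed phenotype, which is what makes it a pressure on the encoding rather than on genome length.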
This is the third in a series of three papers devoted to energy flow and entropy changes in chemical and biological processes, and their relations to the thermodynamics of computation. The previous two papers have developed reversible chemical transformations as idealizations for studying physiology and natural selection, and derived bounds from the second law of thermodynamics, between information gain in an ensemble and the chemical work required to produce it. This paper concerns the explicit mapping of chemistry to computation, and particularly the Landauer decomposition of irreversible computations, in which reversible logical operations generating no heat are separated from heat-generating erasure steps which are logically irreversible but thermodynamically reversible. The Landauer arrangement of computation is shown to produce the same entropy-flow diagram as that of the chemical Carnot cycles used in the second paper of the series to idealize physiological cycles. The specific application of computation to data compression and error-correcting encoding also makes possible a Landauer analysis of the somewhat different problem of optimal molecular recognition, which has been considered as an information theory problem. It is shown here that bounds on maximum sequence discrimination from the enthalpy of complex formation, although derived from the same logical model as the Shannon theorem for channel capacity, arise from exactly the opposite model for erasure.
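The Landauer decomposition rests on a simple bound: erasing one bit at temperature T must dissipate at least k_B·T·ln 2 of heat, while the logically reversible operations can in principle generate none. A quick sketch of the arithmetic (standard physics, not specific to this paper):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_heat(bits, temperature_kelvin):
    """Minimum heat (in joules) dissipated to erase `bits` bits at
    the given temperature: bits * k_B * T * ln(2)."""
    return bits * K_B * temperature_kelvin * math.log(2)

# Erasing one bit at room temperature (300 K) costs about 2.9e-21 J;
# the bound scales linearly in both bit count and temperature.
q = landauer_heat(1, 300.0)
```

In the paper's framing, an irreversible computation is rearranged so that all of this unavoidable heat is carried by the erasure steps, which are logically irreversible but can be performed thermodynamically reversibly.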
This is the second in a series of three papers devoted to energy flow and entropy changes in chemical and biological processes, and to their relations to the thermodynamics of computation. In the first paper of the series, it was shown that a general-form dimensional argument from the second law of thermodynamics captures a number of scaling relations governing growth and development across many domains of life. It was also argued that models of physiology based on reversible transformations provide sensible approximations within which the second-law scaling is realized. This paper provides a formal basis for decomposing general cyclic, fixed-temperature chemical reactions, in terms of the chemical equivalent of Carnot's cycle for heat engines. It is shown that the second law relates the minimal chemical work required to perform a cycle to the Kullback–Leibler divergence produced in its chemical output ensemble from that of a Gibbs equilibrium. Reversible models of physiology are used to create reversible models of natural selection, which relate metabolic energy requirements to information gain under optimal conditions. When dissipation is added to models of selection, the second-law constraint is generalized to a relation between metabolic work and the combined energies of growth and maintenance.
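The Kullback–Leibler divergence invoked in the second-law bound is straightforward to compute. A small sketch (the stated proportionality of minimal work to k_B·T times the divergence is my paraphrase of the abstract, not the paper's exact statement):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats, for discrete
    distributions given as aligned probability lists."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Output ensemble p driven away from a uniform Gibbs-equilibrium
# stand-in q: the minimal chemical work to produce p scales with
# k_B * T * D(p || q) in the bound the abstract describes.
p = [0.7, 0.2, 0.1]
q = [1/3, 1/3, 1/3]
d = kl_divergence(p, q)   # > 0 whenever p != q; 0 at equilibrium
```

The divergence is zero exactly when the output ensemble already matches equilibrium, consistent with no work being needed in that case.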