Owning a book isn’t the same as reading it; we need only look at our own bloated bookshelves for confirmation.
Links
A new view of the tree of life
An update to the tree of life has revealed a dominance of bacterial diversity in many ecosystems and extensive evolution in some branches of the tree. It also highlights how few organisms we have been able to cultivate for further investigation.
Abstract
The tree of life is one of the most important organizing principles in biology. Gene surveys suggest the existence of an enormous number of branches, but even an approximation of the full scale of the tree has remained elusive. Recent depictions of the tree of life have focused either on the nature of deep evolutionary relationships or on the known, well-classified diversity of life with an emphasis on eukaryotes. These approaches overlook the dramatic change in our understanding of life’s diversity resulting from genomic sampling of previously unexamined environments. New methods to generate genome sequences illuminate the identity of organisms and their metabolic capacities, placing them in community and ecosystem contexts. Here, we use new genomic data from over 1,000 uncultivated and little known organisms, together with published sequences, to infer a dramatically expanded version of the tree of life, with Bacteria, Archaea and Eukarya included. The depiction is both a global overview and a snapshot of the diversity within each major lineage. The results reveal the dominance of bacterial diversification and underline the importance of organisms lacking isolated representatives, with substantial evolution concentrated in a major radiation of such organisms. This tree highlights major lineages currently underrepresented in biogeochemical models and identifies radiations that are probably important for future evolutionary analyses.
Laura A. Hug, Brett J. Baker, Karthik Anantharaman, Christopher T. Brown, Alexander J. Probst, Cindy J. Castelle, Cristina N. Butterfield, Alex W. Hernsdorf, Yuki Amano, Kotaro Ise, Yohey Suzuki, Natasha Dudek, David A. Relman, Kari M. Finstad, Ronald Amundson, Brian C. Thomas & Jillian F. Banfield in Nature Microbiology, Article number: 16048 (2016) doi:10.1038/nmicrobiol.2016.48

Carl Zimmer also has a nice little write up of the paper in today’s New York Times:
Most of the diversity outlined on the new tree has been hiding in plain sight.
in Scientists Unveil New ‘Tree of Life’ from The New York Times 4/11/16
2016 North-American School of Information Theory, June 21-23
The 2016 School of Information Theory will be hosted at Duke University, June 21-23. It is sponsored by the IEEE Information Theory Society, Duke University, the Center for Science of Information, and the National Science Foundation. The school provides a venue where doctoral and postdoctoral students can learn from distinguished professors in information theory, meet with fellow researchers, and form collaborations.
Program and Lectures
The daily schedule will consist of morning and afternoon lectures separated by a lunch break with poster sessions. Students from all research areas are welcome to attend and present their own research via a poster during the school. The school will host lectures on core areas of information theory and interdisciplinary topics. The following lecturers are confirmed:
- Helmut Bölcskei (ETH Zurich): The Mathematics of Deep Learning
- Natasha Devroye (University of Illinois, Chicago): The Interference Channel
- René Vidal (Johns Hopkins University): Global Optimality in Deep Learning and Beyond
- Tsachy Weissman (Stanford University): Information Processing under Logarithmic Loss
- Aylin Yener (Pennsylvania State University): Information-Theoretic Security
Logistics
Applications will be available on March 15 and will be evaluated starting April 1. Accepted students must register by May 15, 2016. The $200 registration fee covers food and three nights of accommodation in a single-occupancy room. We suggest that attendees fly into Raleigh-Durham (RDU) airport, located about 30 minutes from the Duke campus. Housing will be available for check-in on the afternoon of June 20th. The main part of the program will conclude after lunch on June 23rd so that attendees can fly home that evening.
To Apply: click “register” here (the registration fee will be collected later, after acceptance)
Administrative Contact: Kathy Peterson, itschool2016@gmail.com
Organizing Committee
Henry Pfister (chair) (Duke University), Dror Baron (North Carolina State University), Matthieu Bloch (Georgia Tech), Rob Calderbank (Duke University), Galen Reeves (Duke University). Advisors: Gerhard Kramer (Technical University of Munich) and Andrea Goldsmith (Stanford)
Sponsors
- IEEE Information Theory Society
- Duke University
- Center for Science of Information
- National Science Foundation
Hopkins Humanities Center celebrates 50 years as home to a diverse intellectual community
One of the most famous stories about the development of literary and critical theory in the United States has its origin at Johns Hopkins University’s Homewood campus about half a century ago.
It was at “The Languages of Criticism and the Sciences of Man” symposium held at the Milton S. Eisenhower Library in October 1966 that a then relatively unknown French thinker named Jacques Derrida threw a wrench into a few of the central ideas supporting structuralism, a methodology rooted in linguistics for understanding and conceptualizing human culture that was dominant at the time and epitomized by luminaries such as Claude Lévi-Strauss, Louis Althusser, Jacques Lacan, and Roland Barthes. What’s often forgotten about that event is that it was in fact the inaugural conference organized by Johns Hopkins University’s Humanities Center, an academic department that celebrates its 50th anniversary this year.
IndieWeb “Press This” Bookmarklet for WordPress
One big IndieWeb raison d’être is using your own web site to reply, like, repost, and RSVP to posts and events. You do this by annotating links on your site with simple microformats2 HTML.
Having said that, most people don’t want to write HTML just to like or reply to something. WordPress’s Press This bookmarklets can already start a new post with a link to the page you’re currently viewing. This code adds IndieWeb microformats2 markup to that link. Combined with the wordpress-webmention plugin, you can use this to respond to the current page with just two clicks.
What’s more, if you’re currently on a Facebook post or Twitter tweet, this adds the Bridgy Publish link that will reply, like, favorite, retweet, or even RSVP inside those social networks.
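For readers unfamiliar with microformats2, the annotation amounts to adding a standard class name to an ordinary link. Here is a minimal sketch in Python of the kind of markup involved; the class names (`u-in-reply-to`, `u-like-of`) are real microformats2 conventions, but the helper functions and URLs are illustrative, not taken from the bookmarklet code itself:

```python
# Sketch of the microformats2 link markup that turns an ordinary post
# into an IndieWeb reply or like. The class names are standard
# microformats2 properties; the URLs below are placeholders.

def reply_link(url, title):
    """Return an HTML anchor marking the post as a reply to `url`."""
    return '<a class="u-in-reply-to" href="{}">{}</a>'.format(url, title)

def like_link(url, title):
    """Return an HTML anchor marking the post as a like of `url`."""
    return '<a class="u-like-of" href="{}">{}</a>'.format(url, title)

print(reply_link("https://example.com/some-post", "Some Post"))
# <a class="u-in-reply-to" href="https://example.com/some-post">Some Post</a>
```

A webmention-aware receiver parses these class names to decide whether your post is a reply, a like, or a repost of the linked page.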
Introduction to Information Theory | SFI’s Complexity Explorer
Introduction to Information Theory
About the Tutorial:
This tutorial introduces fundamental concepts in information theory. Information theory has had a considerable impact on the study of complex systems, and has in part co-evolved with complexity science. Research areas ranging from ecology and biology to aerospace and information technology have all benefited from the growth of information theory.
In this tutorial, students will follow the development of information theory from bits to modern applications in computing and communication. Along the way, Seth Lloyd introduces valuable topics in information theory such as mutual information, Boolean logic, channel capacity, and the natural relationship between information and entropy.
Lloyd coherently covers a substantial amount of material while limiting discussion of the mathematics involved. When formulas or derivations are considered, Lloyd describes the mathematics such that less advanced math students will find the tutorial accessible. Prerequisites for this tutorial are an understanding of logarithms, and at least a year of high-school algebra.
About the Instructor(s):
Professor Seth Lloyd is a principal investigator in the Research Laboratory of Electronics (RLE) at the Massachusetts Institute of Technology (MIT). He received his A.B. from Harvard College in 1982, the Certificate of Advanced Study in Mathematics (Part III) and an M. Phil. in Philosophy of Science from Cambridge University in 1983 and 1984 under a Marshall Fellowship, and a Ph.D. in Physics in 1988 from Rockefeller University under the supervision of Professor Heinz Pagels.
From 1988 to 1991, Professor Lloyd was a postdoctoral fellow in the High Energy Physics Department at the California Institute of Technology, where he worked with Professor Murray Gell-Mann on applications of information to quantum-mechanical systems. From 1991 to 1994, he was a postdoctoral fellow at Los Alamos National Laboratory, where he worked at the Center for Nonlinear Systems on quantum computation. In 1994, he joined the faculty of the Department of Mechanical Engineering at MIT. Since 1988, Professor Lloyd has also been an adjunct faculty member at the Santa Fe Institute.
Professor Lloyd has performed seminal work in the fields of quantum computation and quantum communications, including proposing the first technologically feasible design for a quantum computer, demonstrating the viability of quantum analog computation, proving quantum analogs of Shannon’s noisy channel theorem, and designing novel methods for quantum error correction and noise reduction.
Professor Lloyd is a member of the American Physical Society and the American Society of Mechanical Engineers.
Tutorial Team:
Yoav Kallus is an Omidyar Fellow at the Santa Fe Institute. His research at the boundary of statistical physics and geometry looks at how and when simple interactions lead to the formation of complex order in materials and when preferred local order leads to system-wide disorder. Yoav holds a B.Sc. in physics from Rice University and a Ph.D. in physics from Cornell University. Before joining the Santa Fe Institute, Yoav was a postdoctoral fellow at the Princeton Center for Theoretical Science in Princeton University.
Prerequisites: At least one year of high-school algebra
Like this tutorial? Donate to help fund more like it
Syllabus
- Introduction
- Forms of Information
- Information and Probability
- Fundamental Formula of Information
- Computation and Logic: Information Processing
- Mutual Information
- Communication Capacity
- Shannon’s Coding Theorem
- The Manifold Things Information Measures
- Homework
Devourer of Encyclopedias: Stanislaw Lem’s “Summa Technologiae”
A review of Summa Technologiae by Stanislaw Lem by David Auerbach from the Los Angeles Review of Books.
Summa Technologiae
At last we have it in English. Summa Technologiae, originally published in Polish in 1964, is the cornerstone of Stanislaw Lem’s oeuvre, his consummate work of speculative nonfiction. Trained in medicine and biology, Lem synthesizes the science of his day in ways far ahead of most science fiction of the time.
His subjects, among others, include:
- Virtual reality
- Artificial intelligence
- Nanotechnology and biotechnology
- Evolutionary biology and evolutionary psychology
- Artificial life
- Information theory
- Entropy and thermodynamics
- Complexity theory, probability, and chaos
- Population and ecological catastrophe
- The “singularity” and “transhumanism”
Source: Devourer of Encyclopedias: Stanislaw Lem’s “Summa Technologiae” – The Los Angeles Review of Books
I came across this book review quite serendipitously today via an Auerbach article in Slate, which I’ve bookmarked. I found a copy of the book and have added it to the top of my reading pile. As I’m currently reading an advance reader edition of Sean Carroll’s The Big Picture, I can only imagine how well the two may go together despite being written more than 50 years apart.
Can a Field in Which Physicists Think Like Economists Help Us Achieve Universal Knowledge?
The Theory of Everything and Then Some: In complexity theory, physicists try to understand economics while sociologists think like biologists. Can they bring us any closer to universal knowledge?
A discussion of complexity and complexity theorist John H. Miller’s new book: A Crude Look at the Whole: The Science of Complex Systems in Business, Life, and Society.
The Hidden Algorithms Underlying Life | Quanta Magazine
The biological world is computational at its core, argues computer scientist Leslie Valiant.
I did expect something more entertaining from Google when I searched for “what will happen if I squeeze a paper cup full of hot coffee?”
What is Information? by Christoph Adami
Information is a precise concept that can be defined mathematically, but its relationship to what we call "knowledge" is not always made clear. Furthermore, the concepts "entropy" and "information", while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.
Comments: 19 pages, 2 figures. To appear in Philosophical Transactions of the Royal Society A
Subjects: Adaptation and Self-Organizing Systems (nlin.AO); Information Theory (cs.IT); Biological Physics (physics.bio-ph); Quantitative Methods (q-bio.QM)
Cite as: arXiv:1601.06176 [nlin.AO] (or arXiv:1601.06176v1 [nlin.AO] for this version)
From: Christoph Adami
[v1] Fri, 22 Jan 2016 21:35:44 GMT (151kb,D) [.pdf]
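Adami’s point that information is best understood in terms of prediction has a standard quantitative form: mutual information, the reduction in uncertainty about one variable from observing another. A toy illustration (the joint distribution here is invented for the example, not taken from the paper):

```python
import math

def H(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy joint distribution of two binary variables X and Y, chosen so that
# observing Y partly predicts X. Keys are (x, y) pairs.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

px = [0.5, 0.5]  # marginal distribution of X
py = [0.5, 0.5]  # marginal distribution of Y

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y): the bits of
# uncertainty about X removed by observing Y -- "information" in
# Adami's predictive sense. Here it comes to about 0.278 bits.
I = H(px) + H(py) - H(list(joint.values()))
print(round(I, 3))
```

When X and Y are independent, I(X;Y) is zero: knowing Y predicts nothing about X, so no information is conveyed.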
The Information Theory of Life | Quanta Magazine
The Information Theory of Life: The polymath Christoph Adami is investigating life’s origins by reimagining living things as self-perpetuating information strings.
Why math? JHU mathematician on teaching, theory, and the value of math in a modern world | Hub
Click through for the full interview: Q+A with Richard Brown, director of undergraduate studies in Johns Hopkins University’s Department of Mathematics
https://youtu.be/kg2mOl042ng
3 Rules of Academic Blogging
Not only is the form alive and well, but one of its most vibrant subsections is in academe.
Winter Q-BIO Quantitative Biology Meeting February 15-18, 2016
The Winter Q-BIO Quantitative Biology Meeting is coming up at the Sheraton Waikiki in Oahu, HI, USA
A predictive understanding of living systems is a prerequisite for designed manipulation in bioengineering and informed intervention in medicine. Such an understanding requires quantitative measurements, mathematical analysis, and theoretical abstraction. The advent of powerful measurement technologies and computing capacity has positioned biology to drive the next scientific revolution. A defining goal of Quantitative Biology (qBIO) is the development of general principles that arise from networks of interacting elements that initially defy conceptual reasoning. The use of model organisms for the discovery of general principles has a rich tradition in biology, and at a fundamental level the philosophy of qBIO resonates with most molecular and cell biologists. New challenges arise from the complexity inherent in networks, which require mathematical modeling and computational simulation to develop conceptual “guideposts” that can be used to generate testable hypotheses, guide analyses, and organize “big data.”
The Winter q-bio meeting welcomes scientists and engineers who are interested in all areas of q-bio. For 2016, the meeting will be hosted at the Sheraton Waikiki, which is located in Honolulu, on the island of Oahu. The resort is known for its breathtaking oceanfront views, a recently opened first-of-its-kind “Superpool”, and many award-winning dining venues. Registration and accommodation information can be found via the links at the top of the page.
Shinichi Mochizuki and the impenetrable proof of the abc conjecture
A Japanese mathematician claims to have solved one of the most important problems in his field. The trouble is, hardly anyone can work out whether he's right.
The biggest mystery in mathematics
This article in Nature is just wonderful. Everyone will find it interesting, but those in the Algebraic Number Theory class this fall will be particularly interested in the topic – by the way, it’s not too late to join the class. After spending some time over the summer looking at Category Theory, I’m tempted to tackle Mochizuki’s proof, as I’m intrigued by new methods in mathematical thinking (and explaining).
The abc conjecture refers to numerical expressions of the type a + b = c. The statement, which comes in several slightly different versions, concerns the prime numbers that divide each of the quantities a, b and c. Every whole number, or integer, can be expressed in an essentially unique way as a product of prime numbers — those that cannot be further factored out into smaller whole numbers: for example, 15 = 3 × 5 or 84 = 2 × 2 × 3 × 7. In principle, the prime factors of a and b have no connection to those of their sum, c. But the abc conjecture links them together. It presumes, roughly, that if a lot of small primes divide a and b then only a few, large ones divide c.
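The quantity that makes “a lot of small primes divide a and b” precise is the radical, the product of the distinct primes dividing a number. A short sketch (my own illustration, using the classic triple 1 + 8 = 9):

```python
def radical(n):
    """Product of the distinct prime factors of n, e.g. rad(84) = 2*3*7 = 42."""
    rad, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            rad *= p
            while n % p == 0:
                n //= p
        p += 1
    if n > 1:  # whatever remains is a prime factor itself
        rad *= n
    return rad

# The triple (1, 8, 9): 1 + 8 = 9, and rad(1*8*9) = rad(72) = 2*3 = 6 < 9.
# Triples where c exceeds the radical of a*b*c are, per the conjecture,
# rare in a precise sense.
a, b, c = 1, 8, 9
print(radical(a * b * c))  # 6
```

Roughly, the conjecture says that for any exponent slightly above 1, only finitely many coprime triples satisfy c > rad(abc) raised to that exponent, which is what “only a few, large primes divide c” cashes out to.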
Thanks to Rama for bringing this to my attention!





