Even in 2016, publishers and authors are still struggling when it comes to re-releasing decades-old books, but Penguin had a unique problem when it set out to publish a 30th anniversary edition of Richard Dawkins's The Blind Watchmaker.<br /><br />The Bookseller reports that Penguin decided to revive four programs Dawkins wrote in 1986. Written in Pascal for the Mac, The Watchmaker Suite was an experiment in algorithmic evolution. Users could run the programs and create a biomorph, and then watch it evolve across the generations.<br /><br />And now you can do the same in your web browser.<br /><br />A website, MountImprobable.com, was built by the publisher’s in-house Creative Technology team—comprising community manager Claudia Toia, creative developer Mathieu Triay and cover designer Matthew Young—who resuscitated and redeployed code Dawkins wrote in the 1980s and ’90s to enable users to create unique, “evolutionary” imprints. The images will be used as cover imagery on Dawkins’ trio to grant users an entirely individual, personalised print copy.
Links
Exhibition at BC Space | Amerikan Krazy: Life Out of Balance
“Amerikan Krazy: Life Out of Balance” takes part of its name from the new book <a href="http://boffosockobooks.com/books/authors/henry-james-korn/amerikan-krazy/">“Amerikan Krazy”</a> by <a href="http://www.henryjameskorn.com">Henry James Korn</a>. From 2008 to 2013, Korn worked at the Orange County Great Park. He was responsible for the creation of the Palm Court arts complex and culture, music, art and history programs.<br /><br /> “The book is very much about total corporate control of public and private space,” Korn said. The story follows a wounded Marine veteran haunted after having missed the chance to assassinate a presidential candidate who later causes massive human suffering and wreaks havoc on America’s wealth and democracy.<br /><br /> It’s a way of understanding what’s happening in politics now, Korn said.<br /><br /> “Because if ever there was a recognition that our public life and politics have gone crazy, it’s this moment.”
A New Thermodynamics Theory of the Origin of Life | Quanta Magazine
Jeremy England, a 31-year-old physicist at MIT, thinks he has found the underlying physics driving the origin and evolution of life.
- Jeremy L. England Lab
- Talks
- Statistical physics of self-replication, Jeremy L. England; J. Chem. Phys. 139, 121923 (2013); doi:10.1063/1.4818538
- Statistical Physics of Adaptation, Nikolai Perunov, Robert Marsland, and Jeremy England; arXiv, December 8, 2014
- Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences, Gavin E. Crooks; arXiv, February 1, 2008
- Life as a manifestation of the second law of thermodynamics, E. D. Schneider and J. J. Kay; Mathematical and Computer Modelling, Volume 19, Issues 6–8, March–April 1994, Pages 25–48; doi:10.1016/0895-7177(94)90188-0
Hypothesis annotations
Bits from Brains for Biologically Inspired Computing | Computational Intelligence
Inspiration for artificial biologically inspired computing is often drawn from neural systems. This article shows how to analyze neural systems using information theory with the aim of obtaining constraints that help to identify the algorithms run by neural systems and the information they represent. Algorithms and representations identified this way may then guide the design of biologically inspired computing systems. The material covered includes the necessary introduction to information theory and to the estimation of information-theoretic quantities from neural recordings. We then show how to analyze the information encoded in a system about its environment, and also discuss recent methodological developments on the question of how much information each agent carries about the environment either uniquely or redundantly or synergistically together with others. Last, we introduce the framework of local information dynamics, where information processing is partitioned into component processes of information storage, transfer, and modification – locally in space and time. We close by discussing example applications of these measures to neural data and other complex systems.
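The core quantities the article builds on, entropy and mutual information, can be estimated from discrete data with naive plug-in frequency counts. The sketch below is a hypothetical illustration, not the authors' code, and it ignores the sampling-bias corrections that real neural recordings demand:

```python
import math
from collections import Counter

def entropy(xs):
    """Shannon entropy H(X) in bits, estimated from sample frequencies."""
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y): how much X tells us about Y."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Toy "spike trains": y copies x exactly, so I(X;Y) = H(X) = 1 bit
x = [0, 1, 0, 1, 0, 1, 0, 1]
y = x[:]
print(mutual_information(x, y))  # 1.0
```

The `zip` trick treats each (x, y) pair as a single symbol, which makes the joint entropy fall out of the same estimator.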
The fun was out there | First Person | Johns Hopkins Magazine | Hub
For the first couple of months of freshman year, I spent my evenings breaking into buildings on campus.
Having just passed our 20th college reunion, an old friend starts spilling the beans…
Apparently the statute of limitations on college shenanigans has run out and one of my best friends has written a nice little essay about some of “our” adventures. Fortunately he has kindly left out the names of his co-conspirators, so I’ll also remain silent about who was responsible for which particular crimes. Like him, I will leave the numerous other crimes he redacted unsung.
For the first couple of months of freshman year, I spent my evenings breaking into buildings on campus. This began, naturally, because a few of us who lived in and around the Vincent-Willard dorm had mail ordered lock-picking kits, and, well, we needed something to practice on besides our own dorm rooms.
So down into the midnight bowels of Krieger we crept, sneaking deep underground into disused classrooms, mute hallways, and one strange lab whose floor was tight-knit mesh wiring with a Silence of the Lambs–esque chamber below. We touched little, took nothing (except, once, a jar of desiccant—sorry!), and were never caught.
Such was the state of fun at Johns Hopkins in the fall of 1992, an era when the administration seemed to have adopted a policy of benign neglect toward the extracurricular happiness of its undergraduate body. We had Spring Fair and the occasional bus trip to New York for the day. What more could we want?
For many—really, most—of my cutthroat classmates, this was reason to grumble. Why, they moaned from the depths of D-level, couldn’t school be more exciting? A student union, they pleaded. A bar. A café. Anything to make campus life more bearable.
But for my friends and me, the school’s DGAF attitude meant freedom: We could do whatever we wanted, on campus or off. When lock-picking grew old (quickly, I’m pleased to say), we began to roam, wandering among the half-abandoned industrial sites that lined the unreconstructed harbor, or driving (when someone happened to have a car) under the interstates that cut through and around the city. We were set loose upon Baltimore, and all we ever wanted was to go and see what there was.
Here’s what we found: A large yellow smiley face painted on the end of an oil-storage tank. The 16mm film collection at the Pratt Library. A man who claimed to have been hanging out with Mama Cass Elliot of the Mamas & the Papas the night she lost her virginity. The Baltimore Streetcar Museum. How to clear the dance floor at Club Midnite by playing the 1978 song “Fish Heads” (eat them up, yum!). The big slice at Angelo’s and the $4.95 crabcake subs at Sip & Bite. Smart drugs, Neal Stephenson, and 2600 magazine at Atomic Books. The indie movie screenings at Skizz Cyzyk’s funeral home “mansion.”
None of these alone was world-changing (okay, except maybe “Fish Heads”). Put together, though, they amounted to a constant stream of stimulation, novelty, and excitement, the discoveries that make new adulthood feel fresh and occasionally profound.
All the while, I heard the no-fun grumbling from around campus and failed to understand it. We had freedom—what more could we need? The world was all around us, begging to be explored. We didn’t even have to leave campus: One spring, my girlfriend and I simply stepped off the sidewalk next to Mudd Hall into a little dell—and discovered a stand of wild scallions. We picked a ton, brought them home, and feasted on our foraged bounty. All we’d had to do was to leave the asphalt path—no red brick in those days—behind.
Matt Gross, Johns Hopkins A&S ’96, ’98 (MA), is a food and travel writer/editor who’s worked for everyone from The New York Times and Bon Appétit to The Guardian, The Village Voice, and Saveur. He lives in Brooklyn with his wife, Jean Liu, A&S ’96, and their two daughters.
Incidentally, he also had two other meaty pieces come out yesterday:
- Are we Living in a Post-Bacon World? | ExtraCrispy.com
- A New Book About Nathan’s Famous Feeds Our Need for Cheap Eats — and the Prosperity Myth | Village Voice
Quantum Information Meets Quantum Matter
This is the draft version of a textbook, which aims to introduce the quantum information science viewpoint on condensed matter physics to graduate students in physics (or interested researchers). The writing is self-contained, requiring minimal background in quantum information science. Basic knowledge of undergraduate quantum physics and condensed matter physics is assumed. We start slowly from the basic ideas in quantum information theory, but aim to eventually bring readers to the frontiers of research in condensed matter physics, including topological phases of matter, tensor networks, and symmetry-protected topological phases.
Matter, energy… knowledge: How to harness physics’ demonic power | New Scientist
Running a brain-twisting thought experiment for real shows that information is a physical thing – so can we now harness the most elusive entity in the cosmos?
References
- Second Law of Thermodynamics with Discrete Quantum Feedback Control by Takahiro Sagawa and Masahito Ueda; Phys. Rev. Lett. 100, 080403 – Published 26 February 2008
- Work and information processing in a solvable model of Maxwell’s demon by Dibyendu Mandal and Christopher Jarzynski; PNAS vol. 109 no. 29, July 17, 2012
- Thermodynamic Costs of Information Processing in Sensory Adaptation by Pablo Sartori, Léo Granger, Chiu Fan Lee, and Jordan M. Horowitz; PLOS Computational Biology, December 11, 2014; doi:10.1371/journal.pcbi.1003974
- Intermittent transcription dynamics for the rapid production of long transcripts of high fidelity by M. Depken, J. M. Parrondo, and S. W. Grill; Cell Rep. 2013 Oct 31;5(2):521–30; doi:10.1016/j.celrep.2013.09.007
- The stepping motor protein as a feedback control ratchet by Martin Bier; BioSystems 88 (2007) 301–307
How Can We Apply Physics to Biology?
We don’t yet know quite what a physics of biology will consist of. But we won’t understand life without it.
How can we be sure old books were ever read? – University of Glasgow Library
Owning a book isn’t the same as reading it; we need only look at our own bloated bookshelves for confirmation.
A new view of the tree of life
An update to the tree of life has revealed a dominance of bacterial diversity in many ecosystems and extensive evolution in some branches of the tree. It also highlights how few organisms we have been able to cultivate for further investigation.
Abstract
The tree of life is one of the most important organizing principles in biology. Gene surveys suggest the existence of an enormous number of branches, but even an approximation of the full scale of the tree has remained elusive. Recent depictions of the tree of life have focused either on the nature of deep evolutionary relationships or on the known, well-classified diversity of life with an emphasis on eukaryotes. These approaches overlook the dramatic change in our understanding of life’s diversity resulting from genomic sampling of previously unexamined environments. New methods to generate genome sequences illuminate the identity of organisms and their metabolic capacities, placing them in community and ecosystem contexts. Here, we use new genomic data from over 1,000 uncultivated and little known organisms, together with published sequences, to infer a dramatically expanded version of the tree of life, with Bacteria, Archaea and Eukarya included. The depiction is both a global overview and a snapshot of the diversity within each major lineage. The results reveal the dominance of bacterial diversification and underline the importance of organisms lacking isolated representatives, with substantial evolution concentrated in a major radiation of such organisms. This tree highlights major lineages currently underrepresented in biogeochemical models and identifies radiations that are probably important for future evolutionary analyses.
Laura A. Hug, Brett J. Baker, Karthik Anantharaman, Christopher T. Brown, Alexander J. Probst, Cindy J. Castelle, Cristina N. Butterfield, Alex W. Hernsdorf, Yuki Amano, Kotaro Ise, Yohey Suzuki, Natasha Dudek, David A. Relman, Kari M. Finstad, Ronald Amundson, Brian C. Thomas & Jillian F. Banfield in Nature Microbiology, Article number: 16048 (2016) doi:10.1038/nmicrobiol.2016.48
Carl Zimmer also has a nice little write-up of the paper in today’s New York Times:
2016 North-American School of Information Theory, June 21-23
The 2016 North-American School of Information Theory will be hosted at Duke University, June 21–23. It is sponsored by the IEEE Information Theory Society, Duke University, the Center for Science of Information, and the National Science Foundation. The school provides a venue where doctoral and postdoctoral students can learn from distinguished professors in information theory, meet fellow researchers, and form collaborations.
Program and Lectures
The daily schedule will consist of morning and afternoon lectures separated by a lunch break with poster sessions. Students from all research areas are welcome to attend and present their own research via a poster during the school. The school will host lectures on core areas of information theory and interdisciplinary topics. The following lecturers are confirmed:
- Helmut Bölcskei (ETH Zurich): The Mathematics of Deep Learning
- Natasha Devroye (University of Illinois, Chicago): The Interference Channel
- René Vidal (Johns Hopkins University): Global Optimality in Deep Learning and Beyond
- Tsachy Weissman (Stanford University): Information Processing under Logarithmic Loss
- Aylin Yener (Pennsylvania State University): Information-Theoretic Security
Logistics
Applications will be available on March 15 and will be evaluated starting April 1. Accepted students must register by May 15, 2016. The registration fee of $200 will include food and three nights' accommodation in a single-occupancy room. We suggest that attendees fly into the Raleigh-Durham (RDU) airport, located about 30 minutes from the Duke campus. Housing will be available for check-in on the afternoon of June 20th. The main part of the program will conclude after lunch on June 23rd so that attendees can fly home that evening.
To apply: click “register” here (the fee will be collected later, after acceptance).
Administrative Contact: Kathy Peterson, itschool2016@gmail.com
Organizing Committee
Henry Pfister (chair) (Duke University), Dror Baron (North Carolina State University), Matthieu Bloch (Georgia Tech), Rob Calderbank (Duke University), Galen Reeves (Duke University). Advisors: Gerhard Kramer (Technical University of Munich) and Andrea Goldsmith (Stanford)
Sponsors
- IEEE Information Theory Society
- Duke University
- Center for Science of Information
- National Science Foundation
Hopkins Humanities Center celebrates 50 years as home to a diverse intellectual community
One of the most famous stories about the development of literary and critical theory in the United States has its origin at Johns Hopkins University’s Homewood campus about half a century ago.
It was at “The Languages of Criticism and the Sciences of Man” symposium held at the Milton S. Eisenhower Library in October 1966 that a then relatively unknown French thinker named Jacques Derrida threw a wrench into a few of the central ideas supporting structuralism, a linguistic methodology for understanding and conceptualizing human culture dominant at the time and epitomized by luminaries such as Claude Lévi-Strauss, Louis Althusser, Jacques Lacan, and Roland Barthes. What’s often forgotten about that event is that it was in fact the inaugural conference organized by Johns Hopkins University’s Humanities Center, an academic department that celebrates its 50th anniversary this year.
IndieWeb “Press This” Bookmarklet for WordPress
One big IndieWeb raison d’être is using your own web site to reply, like, repost, and RSVP to posts and events. You do this by annotating links on your site with simple microformats2 HTML.
Having said that, most people don’t want to write HTML just to like or reply to something. WordPress’s Press This bookmarklets can already start a new post with a link to the page you’re currently viewing. This code adds IndieWeb microformats2 markup to that link. Combined with the wordpress-webmention plugin, you can use this to respond to the current page with just two clicks.
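For reference, a reply post marked up with microformats2 looks roughly like this (a hypothetical example; class names such as `h-entry` and `u-in-reply-to` come from the standard microformats2 vocabulary, though the exact markup the bookmarklet generates may differ):

```html
<!-- Hypothetical reply post; the class names carry all the semantics -->
<div class="h-entry">
  In reply to:
  <a class="u-in-reply-to" href="https://example.com/original-post">the original post</a>
  <div class="e-content">Great point. Webmentions make this work across sites.</div>
</div>
```

A webmention receiver parses the `u-in-reply-to` property to classify the incoming response as a reply rather than a plain mention.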
What’s more, if you’re currently on a Facebook post or Twitter tweet, this adds the Bridgy Publish link that will reply, like, favorite, retweet, or even RSVP inside those social networks.
Introduction to Information Theory | SFI’s Complexity Explorer
Introduction to Information Theory
About the Tutorial:
This tutorial introduces fundamental concepts in information theory. Information theory has made considerable impact in complex systems, and has in part co-evolved with complexity science. Research areas ranging from ecology and biology to aerospace and information technology have all seen benefits from the growth of information theory.
In this tutorial, students will follow the development of information theory from bits to modern application in computing and communication. Along the way Seth Lloyd introduces valuable topics in information theory such as mutual information, boolean logic, channel capacity, and the natural relationship between information and entropy.
Lloyd coherently covers a substantial amount of material while limiting discussion of the mathematics involved. When formulas or derivations are considered, Lloyd describes the mathematics such that less advanced math students will find the tutorial accessible. Prerequisites for this tutorial are an understanding of logarithms, and at least a year of high-school algebra.
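As a taste of two of the topics Lloyd covers, entropy and channel capacity, here is a minimal sketch requiring nothing beyond the stated prerequisites; this is illustrative code, not material from the tutorial itself:

```python
import math

def h2(p):
    """Binary entropy in bits: the information in a coin flip with bias p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Shannon capacity of a binary symmetric channel that flips each
    transmitted bit with probability p: C = 1 - H(p) bits per use."""
    return 1.0 - h2(p)

print(h2(0.5))            # 1.0  (a fair coin carries one full bit)
print(bsc_capacity(0.0))  # 1.0  (a noiseless channel)
print(bsc_capacity(0.5))  # 0.0  (pure noise carries nothing)
```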
About the Instructor(s):
Professor Seth Lloyd is a principal investigator in the Research Laboratory of Electronics (RLE) at the Massachusetts Institute of Technology (MIT). He received his A.B. from Harvard College in 1982, the Certificate of Advanced Study in Mathematics (Part III) and an M. Phil. in Philosophy of Science from Cambridge University in 1983 and 1984 under a Marshall Fellowship, and a Ph.D. in Physics in 1988 from Rockefeller University under the supervision of Professor Heinz Pagels.
From 1988 to 1991, Professor Lloyd was a postdoctoral fellow in the High Energy Physics Department at the California Institute of Technology, where he worked with Professor Murray Gell-Mann on applications of information to quantum-mechanical systems. From 1991 to 1994, he was a postdoctoral fellow at Los Alamos National Laboratory, where he worked at the Center for Nonlinear Systems on quantum computation. In 1994, he joined the faculty of the Department of Mechanical Engineering at MIT. Since 1988, Professor Lloyd has also been an adjunct faculty member at the Santa Fe Institute.
Professor Lloyd has performed seminal work in the fields of quantum computation and quantum communications, including proposing the first technologically feasible design for a quantum computer, demonstrating the viability of quantum analog computation, proving quantum analogs of Shannon’s noisy channel theorem, and designing novel methods for quantum error correction and noise reduction.
Professor Lloyd is a member of the American Physical Society and the American Society of Mechanical Engineers.
Tutorial Team:
Yoav Kallus is an Omidyar Fellow at the Santa Fe Institute. His research at the boundary of statistical physics and geometry looks at how and when simple interactions lead to the formation of complex order in materials and when preferred local order leads to system-wide disorder. Yoav holds a B.Sc. in physics from Rice University and a Ph.D. in physics from Cornell University. Before joining the Santa Fe Institute, Yoav was a postdoctoral fellow at the Princeton Center for Theoretical Science in Princeton University.
How to use Complexity Explorer
Prerequisites: At least one year of high-school algebra
Syllabus
- Introduction
- Forms of Information
- Information and Probability
- Fundamental Formula of Information
- Computation and Logic: Information Processing
- Mutual Information
- Communication Capacity
- Shannon’s Coding Theorem
- The Manifold Things Information Measures
- Homework
Devourer of Encyclopedias: Stanislaw Lem’s “Summa Technologiae”
David Auerbach reviews Stanislaw Lem’s Summa Technologiae for the Los Angeles Review of Books.
Summa Technologiae
AT LAST WE have it in English. Summa Technologiae, originally published in Polish in 1964, is the cornerstone of Stanislaw Lem’s oeuvre, his consummate work of speculative nonfiction. Trained in medicine and biology, Lem synthesizes the current science of the day in ways far ahead of most science fiction of the time.
His subjects, among others, include:
- Virtual reality
- Artificial intelligence
- Nanotechnology and biotechnology
- Evolutionary biology and evolutionary psychology
- Artificial life
- Information theory
- Entropy and thermodynamics
- Complexity theory, probability, and chaos
- Population and ecological catastrophe
- The “singularity” and “transhumanism”
Source: Devourer of Encyclopedias: Stanislaw Lem’s “Summa Technologiae” – The Los Angeles Review of Books
I came across this book review quite serendipitously today via an Auerbach article in Slate, which I’ve bookmarked. I found a copy of the book and have added it to the top of my reading pile. As I’m currently reading an advance reader edition of Sean Carroll’s The Big Picture, I can only imagine how well the two may go together despite being written more than 50 years apart.