I have become increasingly frustrated by the fact that many of the publications I used to like are turning into churnicle factories, creating platforms for anybody and everybody to post whatever dr…
Bioinformatics is a broad discipline in which one common denominator is the need to produce and/or use software that can be applied to biological data in different contexts. To enable and ensure the replicability and traceability of scientific claims, it is essential that the scientific publication, the corresponding datasets, and the data analysis are made publicly available [1,2]. All software used for the analysis should be either carefully documented (e.g., for commercial software) or, better yet, openly shared and directly accessible to others [3,4]. The rise of openly available software and source code alongside concomitant collaborative development is facilitated by the existence of several code repository services such as SourceForge, Bitbucket, GitLab, and GitHub, among others. These resources are also essential for collaborative software projects because they enable the organization and sharing of programming tasks between different remote contributors. Here, we introduce the main features of GitHub, a popular web-based platform that offers a free and integrated environment for hosting the source code, documentation, and project-related web content for open-source projects. GitHub also offers paid plans for private repositories (see Box 1) for individuals and businesses as well as free plans including private repositories for research and educational use.
For a little over two years, I have been involved in Indiewebcamp. This past weekend, for the first time in five years, I was able to attend WordCamp. WordCamp NYC was a massive undertaking, for which I must credit the organizers. WordCamp was moved to coincide with OpenCamps week at the United Nations, …
Advances in computing power, natural language processing, and digitization of text now make it possible to study a culture's evolution through its texts using a "big data" lens. Our ability to communicate relies in part upon a shared emotional experience, with stories often following distinct emotional trajectories, forming patterns that are meaningful to us. Here, by classifying the emotional arcs for a filtered subset of 1,737 stories from Project Gutenberg's fiction collection, we find a set of six core trajectories which form the building blocks of complex narratives. We strengthen our findings by separately applying optimization, linear decomposition, supervised learning, and unsupervised learning. For each of these six core emotional arcs, we examine the closest characteristic stories in publication today and find that particular emotional arcs enjoy greater success, as measured by downloads.
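The first step the abstract describes can be approximated in miniature: score each word's emotional valence with a lexicon and slide a window over the text to get a smooth arc. Here's a minimal sketch in Python, assuming a hypothetical toy valence lexicon (the actual paper uses crowd-sourced happiness scores over entire novels, and then finds the six core arcs by decomposing many such curves):

```python
# Toy valence lexicon -- purely illustrative, not the paper's word list.
VALENCE = {"joy": 3.0, "love": 2.5, "happy": 2.0,
           "loss": -2.0, "grief": -3.0, "death": -2.5}

def emotional_arc(words, window=3):
    """Return the windowed mean valence of a word sequence.

    Words absent from the lexicon score 0 (neutral). The resulting
    series is a crude 'emotional arc' of the text.
    """
    scores = [VALENCE.get(w, 0.0) for w in words]
    return [sum(scores[i:i + window]) / window
            for i in range(len(scores) - window + 1)]

text = "love and joy then grief and loss".split()
print(emotional_arc(text))  # rises, then falls: a tiny "riches to rags" arc
```

With real lexicons and book-length texts, arcs like this become smooth curves, and the paper's contribution is showing they cluster into six recurring shapes.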
Even in 2016, publishers and authors are still struggling when it comes to re-releasing decades-old books, but Penguin had a unique problem when it set out to publish a 30th anniversary edition of Richard Dawkins's The Blind Watchmaker.<br /><br />The Bookseller reports that Penguin decided to revive four programs Dawkins wrote in 1986. Written in Pascal for the Mac, The Watchmaker Suite was an experiment in algorithmic evolution. Users could run the programs and create a biomorph, and then watch it evolve across the generations.<br /><br />And now you can do the same in your web browser.<br /><br />A website, MountImprobable.com, was built by the publisher’s in-house Creative Technology team—comprising community manager Claudia Toia, creative developer Mathieu Triay and cover designer Matthew Young—who resuscitated and redeployed code Dawkins wrote in the 1980s and ’90s to enable users to create unique, “evolutionary” imprints. The images will be used as cover imagery on Dawkins’ trio to grant users an entirely individual, personalised print copy.
“Amerikan Krazy: Life Out of Balance” takes part of its name from the new book <a href="http://boffosockobooks.com/books/authors/henry-james-korn/amerikan-krazy/">"Amerikan Krazy”</a> by <a href="http://www.henryjameskorn.com">Henry James Korn</a>. From 2008 to 2013, Korn worked at the Orange County Great Park. He was responsible for the creation of the Palm Court arts complex and culture, music, art and history programs.<br /><br /> “The book is very much about total corporate control of public and private space,” Korn said. The story follows a wounded Marine veteran haunted after having missed the chance to assassinate a presidential candidate who later causes massive human suffering and wreaks havoc on America’s wealth and democracy.<br /><br /> It’s a way of understanding what’s happening in politics now, Korn said.<br /><br /> “Because if ever there was a recognition that our public life and politics have gone crazy, it’s this moment.”
If you haven’t managed to make it down, this exhibition is running for another week at BC Space!
Jeremy England, a 31-year-old physicist at MIT, thinks he has found the underlying physics driving the origin and evolution of life.
- Jeremy L. England Lab
- Statistical physics of self-replication, Jeremy L. England; J. Chem. Phys. 139, 121923 (2013); doi:10.1063/1.4818538
- Statistical Physics of Adaptation, Nikolai Perunov, Robert Marsland, and Jeremy England; arXiv, December 8, 2014
- Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences, Gavin E. Crooks; arXiv, February 1, 2008
- Life as a manifestation of the second law of thermodynamics, E. D. Schneider and J. J. Kay; Mathematical and Computer Modelling 19(6–8), March–April 1994, pp. 25–48; doi:10.1016/0895-7177(94)90188-0
Inspiration for artificial biologically inspired computing is often drawn from neural systems. This article shows how to analyze neural systems using information theory with the aim of obtaining constraints that help to identify the algorithms run by neural systems and the information they represent. Algorithms and representations identified this way may then guide the design of biologically inspired computing systems. The material covered includes the necessary introduction to information theory and to the estimation of information-theoretic quantities from neural recordings. We then show how to analyze the information encoded in a system about its environment, and also discuss recent methodological developments on the question of how much information each agent carries about the environment either uniquely or redundantly or synergistically together with others. Last, we introduce the framework of local information dynamics, where information processing is partitioned into component processes of information storage, transfer, and modification – locally in space and time. We close by discussing example applications of these measures to neural data and other complex systems.
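As a concrete illustration of the kind of quantity being estimated from neural recordings, here is a plug-in estimator of the mutual information between a discrete stimulus and a discrete response, sketched in Python. The function name and toy data are my own; real analyses of the sort the article covers must also correct for the sampling bias of this naive estimator:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Estimate I(S;R) in bits from (stimulus, response) samples.

    Plug-in estimator: builds empirical joint and marginal
    distributions and sums p(s,r) * log2(p(s,r) / (p(s) p(r))).
    """
    n = len(pairs)
    joint = Counter(pairs)                 # counts of (s, r) pairs
    ps = Counter(s for s, _ in pairs)      # stimulus marginal counts
    pr = Counter(r for _, r in pairs)      # response marginal counts
    mi = 0.0
    for (s, r), c in joint.items():
        p_sr = c / n
        # p_sr / (p_s * p_r) written with counts to avoid extra divisions
        mi += p_sr * math.log2(p_sr * n * n / (ps[s] * pr[r]))
    return mi

# A perfectly informative code: each stimulus maps to a unique response,
# so the response carries exactly one bit about the stimulus.
samples = [(0, "a"), (0, "a"), (1, "b"), (1, "b")]
print(mutual_information(samples))  # → 1.0
```

The same machinery, applied to binned spike counts conditioned on stimuli, yields the encoding analyses the article builds on before moving to redundancy, synergy, and local information dynamics.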
For the first couple of months of freshman year, I spent my evenings breaking into buildings on campus.
Having just passed our 20th college reunion, an old friend starts spilling the beans…
Apparently the statute of limitations on college shenanigans has run out and one of my best friends has written a nice little essay about some of “our” adventures. Fortunately he has kindly left out the names of his co-conspirators, so I’ll also remain silent about who was responsible for which particular crimes. Like him, I will leave the numerous other crimes he redacted unsung.
For the first couple of months of freshman year, I spent my evenings breaking into buildings on campus. This began, naturally, because a few of us who lived in and around the Vincent-Willard dorm had mail ordered lock-picking kits, and, well, we needed something to practice on besides our own dorm rooms.
So down into the midnight bowels of Krieger we crept, sneaking deep underground into disused classrooms, mute hallways, and one strange lab whose floor was tight-knit mesh wiring with a Silence of the Lambs–esque chamber below. We touched little, took nothing (except, once, a jar of desiccant—sorry!), and were never caught.
Such was the state of fun at Johns Hopkins in the fall of 1992, an era when the administration seemed to have adopted a policy of benign neglect toward the extracurricular happiness of its undergraduate body. We had Spring Fair and the occasional bus trip to New York for the day. What more could we want?
For many—really, most—of my cutthroat classmates, this was reason to grumble. Why, they moaned from the depths of D-level, couldn’t school be more exciting? A student union, they pleaded. A bar. A café. Anything to make campus life more bearable.
But for my friends and me, the school’s DGAF attitude meant freedom: We could do whatever we wanted, on campus or off. When lock-picking grew old (quickly, I’m pleased to say), we began to roam, wandering among the half-abandoned industrial sites that lined the unreconstructed harbor, or driving (when someone happened to have a car) under the interstates that cut through and around the city. We were set loose upon Baltimore, and all we ever wanted was to go and see what there was.
Here’s what we found: A large yellow smiley face painted on the end of an oil-storage tank. The 16mm film collection at the Pratt Library. A man who claimed to have been hanging out with Mama Cass Elliot of the Mamas & the Papas the night she lost her virginity. The Baltimore Streetcar Museum. How to clear the dance floor at Club Midnite by playing the 1978 song “Fish Heads” (eat them up, yum!). The big slice at Angelo’s and the $4.95 crabcake subs at Sip & Bite. Smart drugs, Neal Stephenson, and 2600 magazine at Atomic Books. The indie movie screenings at Skizz Cyzyk’s funeral home “mansion.”
None of these alone was world-changing (okay, except maybe “Fish Heads”). Put together, though, they amounted to a constant stream of stimulation, novelty, and excitement, the discoveries that make new adulthood feel fresh and occasionally profound.
All the while, I heard the no-fun grumbling from around campus and failed to understand it. We had freedom—what more could we need? The world was all around us, begging to be explored. We didn’t even have to leave campus: One spring, my girlfriend and I simply stepped off the sidewalk next to Mudd Hall into a little dell—and discovered a stand of wild scallions. We picked a ton, brought them home, and feasted on our foraged bounty. All we’d had to do was to leave the asphalt path—no red brick in those days—behind.
Matt Gross, Johns Hopkins A&S ’96, ’98 (MA), is a food and travel writer/editor who’s worked for everyone from The New York Times and Bon Appétit to The Guardian, The Village Voice, and Saveur. He lives in Brooklyn with his wife, Jean Liu, A&S ’96, and their two daughters.
Incidentally he also had two other meaty pieces that came out yesterday as well:
- Are we Living in a Post-Bacon World? | ExtraCrispy.com
- A New Book About Nathan’s Famous Feeds Our Need for Cheap Eats — and the Prosperity Myth | Village Voice
This is the draft version of a textbook, which aims to introduce the quantum information science viewpoints on condensed matter physics to graduate students in physics (or interested researchers). We keep the writing self-contained, requiring minimal background in quantum information science. Basic knowledge of undergraduate quantum physics and condensed matter physics is assumed. We start slowly from the basic ideas in quantum information theory, but wish to eventually bring the readers to the frontiers of research in condensed matter physics, including topological phases of matter, tensor networks, and symmetry-protected topological phases.
Running a brain-twisting thought experiment for real shows that information is a physical thing – so can we now harness the most elusive entity in the cosmos?
This is a nice little overview article of some of the history of thermodynamics relating to information in physics and includes some recent physics advances as well. There are a few references to applications in biology at the micro level as well.
- Second Law of Thermodynamics with Discrete Quantum Feedback Control, Takahiro Sagawa and Masahito Ueda; Phys. Rev. Lett. 100, 080403, published 26 February 2008
- Work and information processing in a solvable model of Maxwell’s demon, Dibyendu Mandal and Christopher Jarzynski; PNAS 109(29), July 17, 2012
- Thermodynamic Costs of Information Processing in Sensory Adaptation, Pablo Sartori, Léo Granger, Chiu Fan Lee, and Jordan M. Horowitz; PLOS Computational Biology, December 11, 2014; doi:10.1371/journal.pcbi.1003974
- Intermittent transcription dynamics for the rapid production of long transcripts of high fidelity, M. Depken, J. M. Parrondo, and S. W. Grill; Cell Rep. 5(2):521–530, October 31, 2013; doi:10.1016/j.celrep.2013.09.007
- The stepping motor protein as a feedback control ratchet, Martin Bier; BioSystems 88 (2007) 301–307
We don’t yet know quite what a physics of biology will consist of. But we won’t understand life without it.
This is an awesome little article with some interesting thoughts and philosophy on the current state of physics within biology and other related areas of study. It’s also got some snippets of history that aren’t frequently discussed in longer-form texts.
An update to the tree of life has revealed a dominance of bacterial diversity in many ecosystems and extensive evolution in some branches of the tree. It also highlights how few organisms we have been able to cultivate for further investigation.
The tree of life is one of the most important organizing principles in biology. Gene surveys suggest the existence of an enormous number of branches, but even an approximation of the full scale of the tree has remained elusive. Recent depictions of the tree of life have focused either on the nature of deep evolutionary relationships or on the known, well-classified diversity of life with an emphasis on eukaryotes. These approaches overlook the dramatic change in our understanding of life’s diversity resulting from genomic sampling of previously unexamined environments. New methods to generate genome sequences illuminate the identity of organisms and their metabolic capacities, placing them in community and ecosystem contexts. Here, we use new genomic data from over 1,000 uncultivated and little known organisms, together with published sequences, to infer a dramatically expanded version of the tree of life, with Bacteria, Archaea and Eukarya included. The depiction is both a global overview and a snapshot of the diversity within each major lineage. The results reveal the dominance of bacterial diversification and underline the importance of organisms lacking isolated representatives, with substantial evolution concentrated in a major radiation of such organisms. This tree highlights major lineages currently underrepresented in biogeochemical models and identifies radiations that are probably important for future evolutionary analyses.
Laura A. Hug, Brett J. Baker, Karthik Anantharaman, Christopher T. Brown, Alexander J. Probst, Cindy J. Castelle, Cristina N. Butterfield, Alex W. Hernsdorf, Yuki Amano, Kotaro Ise, Yohey Suzuki, Natasha Dudek, David A. Relman, Kari M. Finstad, Ronald Amundson, Brian C. Thomas & Jillian F. Banfield in Nature Microbiology, Article number: 16048 (2016) doi:10.1038/nmicrobiol.2016.48
Carl Zimmer also has a nice little write up of the paper in today’s New York Times:
The 2016 School of Information Theory will be hosted at Duke University, June 21–23. It is sponsored by the IEEE Information Theory Society, Duke University, the Center for Science of Information, and the National Science Foundation. The school provides a venue where doctoral and postdoctoral students can learn from distinguished professors in information theory, meet with fellow researchers, and form collaborations.
Program and Lectures
The daily schedule will consist of morning and afternoon lectures separated by a lunch break with poster sessions. Students from all research areas are welcome to attend and present their own research via a poster during the school. The school will host lectures on core areas of information theory and interdisciplinary topics. The following lecturers are confirmed:
- Helmut Bölcskei (ETH Zurich): The Mathematics of Deep Learning
- Natasha Devroye (University of Illinois, Chicago): The Interference Channel
- René Vidal (Johns Hopkins University): Global Optimality in Deep Learning and Beyond
- Tsachy Weissman (Stanford University): Information Processing under Logarithmic Loss
- Aylin Yener (Pennsylvania State University): Information-Theoretic Security
Applications will be available on March 15 and will be evaluated starting April 1. Accepted students must register by May 15, 2016. The registration fee of $200 will include food and 3 nights accommodation in a single-occupancy room. We suggest that attendees fly into the Raleigh-Durham (RDU) airport located about 30 minutes from the Duke campus. Housing will be available for check-in on the afternoon of June 20th. The main part of the program will conclude after lunch on June 23rd so that attendees can fly home that evening.
Administrative Contact: Kathy Peterson, firstname.lastname@example.org
Henry Pfister (chair) (Duke University), Dror Baron (North Carolina State University), Matthieu Bloch (Georgia Tech), Rob Calderbank (Duke University), Galen Reeves (Duke University). Advisors: Gerhard Kramer (Technical University of Munich) and Andrea Goldsmith (Stanford)