Dodging The Memory Hole is an action-oriented conference and event series that brings together journalists, technologists, and information specialists to strategize solutions for organizing and preserving born-digital news.
A recap of the Dodging the Memory Hole 2016 conference, held over two days in October at UCLA's Charles Young Research Library in Los Angeles, California, to discuss and highlight potential solutions to the problem of preserving born-digital news.
Today, we’re excited to announce that Instapaper is joining Pinterest. In the three years since betaworks acquired Instapaper from Marco Arment, we’ve completely rewritten our backend, overhauled our mobile and web clients, improved parsing and search, and introduced tons of great features like highlights, text-to-speech, and speed reading to the product.
This is the signal for the second.
How can you not follow this Twitter account?!
Now I’m waiting for a Shannon bot and a Weiner bot. Maybe a John McCarthy bot would be apropos too?!
Hundreds of researchers in a collaborative project called "It from Qubit" say space and time may spring up from the quantum entanglement of tiny bits of information.
Peter Woit has just made the final draft (dated 10/25/16) of his new textbook Quantum Theory, Groups and Representations: An Introduction freely available for download from his website. It covers quantum theory with a heavy emphasis on groups and representation theory and
“contains significant amounts of material not well-explained elsewhere.” He expects to finish the diagrams and publish it sometime next year, potentially through Springer.
For decades, the study of networks has been divided between the efforts of social scientists and natural scientists, two groups of scholars who often do not see eye to eye. In this review I present an effort to mutually translate the work conducted by scholars from both of these academic fronts, hoping to continue to unify what has become a diverging body of literature. I argue that social and natural scientists fail to see eye to eye because they have diverging academic goals. Social scientists focus on explaining how context-specific social and economic mechanisms drive the structure of networks and on how networks shape social and economic outcomes. By contrast, natural scientists focus primarily on modeling network characteristics that are independent of context, since their aim is to identify universal characteristics of systems rather than context-specific mechanisms. In the following pages I discuss the differences between these two literatures by summarizing the parallel theories advanced to explain link formation and the applications used by scholars in each field to justify their approach to network science. I conclude by providing an outlook on how these literatures can be further unified.
I have become increasingly frustrated by the fact that many of the publications I used to like are turning into churnicle factories, creating platforms for anybody and everybody to post whatever dr…
Bioinformatics is a broad discipline in which one common denominator is the need to produce and/or use software that can be applied to biological data in different contexts. To enable and ensure the replicability and traceability of scientific claims, it is essential that the scientific publication, the corresponding datasets, and the data analysis are made publicly available [1,2]. All software used for the analysis should be either carefully documented (e.g., for commercial software) or, better yet, openly shared and directly accessible to others [3,4]. The rise of openly available software and source code alongside concomitant collaborative development is facilitated by the existence of several code repository services such as SourceForge, Bitbucket, GitLab, and GitHub, among others. These resources are also essential for collaborative software projects because they enable the organization and sharing of programming tasks between different remote contributors. Here, we introduce the main features of GitHub, a popular web-based platform that offers a free and integrated environment for hosting the source code, documentation, and project-related web content for open-source projects. GitHub also offers paid plans for private repositories (see Box 1) for individuals and businesses as well as free plans including private repositories for research and educational use.
For a little over two years, I have been involved in Indiewebcamp. This past weekend, for the first time in five years, I was able to attend WordCamp. WordCamp NYC was a massive undertaking, for which I must credit the organizers. WordCamp was moved to coincide with OpenCamps week at the United Nations, …
Advances in computing power, natural language processing, and digitization of text now make it possible to study a culture's evolution through its texts using a "big data" lens. Our ability to communicate relies in part upon a shared emotional experience, with stories often following distinct emotional trajectories, forming patterns that are meaningful to us. Here, by classifying the emotional arcs for a filtered subset of 1,737 stories from Project Gutenberg's fiction collection, we find a set of six core trajectories which form the building blocks of complex narratives. We strengthen our findings by separately applying optimization, linear decomposition, supervised learning, and unsupervised learning. For each of these six core emotional arcs, we examine the closest characteristic stories in publication today and find that particular emotional arcs enjoy greater success, as measured by downloads.
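The core idea of an "emotional arc" can be sketched as a sliding-window average of per-word sentiment scores moved across a text. This is a minimal illustrative sketch, not the authors' pipeline: the tiny valence lexicon and the neutral default below are made-up stand-ins for the large rated word lists such studies actually use.

```python
# Sketch of an "emotional arc": mean word valence in a sliding window.
# VALENCE is a toy lexicon (scores on a 1-9 scale); real analyses use
# thousands of human-rated words.
VALENCE = {"joy": 8.2, "love": 8.0, "hope": 7.0, "loss": 2.1, "death": 1.5}

def emotional_arc(words, window):
    """Return one mean-valence value per window position across the text."""
    scores = [VALENCE.get(w, 5.0) for w in words]  # 5.0 = neutral default
    return [sum(scores[i:i + window]) / window
            for i in range(len(scores) - window + 1)]

text = "hope and love then loss and death then joy".split()
arc = emotional_arc(text, window=3)  # rises, falls, then recovers
```

Clustering many such arcs (e.g., after smoothing and normalizing them to a common length) is what lets the study group stories into a handful of core trajectories.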
Even in 2016, publishers and authors are still struggling when it comes to re-releasing decades-old books, but Penguin had a unique problem when it set out to publish a 30th-anniversary edition of Richard Dawkins's The Blind Watchmaker.

The Bookseller reports that Penguin decided to revive four programs Dawkins wrote in 1986. Written in Pascal for the Mac, The Watchmaker Suite was an experiment in algorithmic evolution. Users could run the programs to create a biomorph, and then watch it evolve across the generations.

And now you can do the same in your web browser.

A website, MountImprobable.com, was built by the publisher’s in-house Creative Technology team—comprising community manager Claudia Toia, creative developer Mathieu Triay and cover designer Matthew Young—who resuscitated and redeployed code Dawkins wrote in the 1980s and ’90s to enable users to create unique, “evolutionary” imprints. The images will be used as cover imagery on Dawkins's trio to grant users an entirely individual, personalised print copy.
“Amerikan Krazy: Life Out of Balance” takes part of its name from the new book <a href="http://boffosockobooks.com/books/authors/henry-james-korn/amerikan-krazy/">"Amerikan Krazy”</a> by <a href="http://www.henryjameskorn.com">Henry James Korn</a>. From 2008 to 2013, Korn worked at the Orange County Great Park. He was responsible for the creation of the Palm Court arts complex and culture, music, art and history programs.

“The book is very much about total corporate control of public and private space,” Korn said. The story follows a wounded Marine veteran haunted after having missed the chance to assassinate a presidential candidate who later causes massive human suffering and wreaks havoc on America’s wealth and democracy.

It’s a way of understanding what’s happening in politics now, Korn said.

“Because if ever there was a recognition that our public life and politics have gone crazy, it’s this moment.”
If you haven’t managed to make it down yet, this exhibition is running for another week at BC Space!
Jeremy England, a 31-year-old physicist at MIT, thinks he has found the underlying physics driving the origin and evolution of life.
- Jeremy L. England Lab
- Statistical physics of self-replication, Jeremy L. England; J. Chem. Phys. 139, 121923 (2013); doi: 10.1063/1.4818538
- Statistical Physics of Adaptation, Nikolai Perunov, Robert Marsland, and Jeremy England, arXiv, December 8, 2014
- Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences, Gavin E. Crooks, arXiv, February 1, 2008
- Life as a manifestation of the second law of thermodynamics, E.D. Schneider, J.J. Kay, doi:10.1016/0895-7177(94)90188-0, Mathematical and Computer Modelling, Volume 19, Issues 6–8, March–April 1994, Pages 25-48
Inspiration for artificial biologically inspired computing is often drawn from neural systems. This article shows how to analyze neural systems using information theory with the aim of obtaining constraints that help to identify the algorithms run by neural systems and the information they represent. Algorithms and representations identified this way may then guide the design of biologically inspired computing systems. The material covered includes the necessary introduction to information theory and to the estimation of information-theoretic quantities from neural recordings. We then show how to analyze the information encoded in a system about its environment, and also discuss recent methodological developments on the question of how much information each agent carries about the environment either uniquely or redundantly or synergistically together with others. Last, we introduce the framework of local information dynamics, where information processing is partitioned into component processes of information storage, transfer, and modification – locally in space and time. We close by discussing example applications of these measures to neural data and other complex systems.
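The basic quantities the article builds on can be sketched with plug-in estimators over discrete symbol sequences: Shannon entropy H(X) and mutual information I(X;Y) = H(X) + H(Y) − H(X,Y). This is a minimal sketch with toy data; real analyses of neural recordings must bin continuous signals and correct for finite-sample bias, which this ignores.

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Plug-in Shannon entropy H(X) in bits from a sample of discrete symbols."""
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Toy "stimulus" and "response" sequences (hypothetical data):
stim = [0, 1, 0, 1, 0, 1, 0, 1]
resp = stim[:]                    # response copies the stimulus: I = H(stim)
noise = [0, 0, 1, 1, 0, 0, 1, 1]  # independent of the stimulus: I = 0
```

The same plug-in approach underlies the decompositions the article discusses: partial information decomposition asks how I(X;Y) splits into unique, redundant, and synergistic parts across several sources, and local information dynamics evaluates these measures pointwise in space and time rather than as averages.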