It’s an endearing, giant, flightless New Zealand parrot, and it’s a poster child for the quantified-self movement.
The ‘creator’ of Bitcoin, Satoshi Nakamoto, is the world’s most elusive billionaire. Very few people outside of the Department of Homeland Security know Satoshi’s real name. In fact, DHS will not publicly confirm that even THEY know the billionaire’s identity. Satoshi has taken great care to keep his identity secret, employing the latest encryption and obfuscation methods in his communications. Despite these efforts (according to my source at the DHS), Satoshi Nakamoto gave investigators the only tool they needed to find him — his own words. Using stylometry, one is able to compare texts to determine authorship of a particular work. Throughout the years Satoshi wrote thousands of posts and emails, most of which are publicly available. According to my source, the NSA was able to use the ‘writer invariant’ method of stylometry to compare Satoshi’s ‘known’ writings with trillions of writing samples from people across the globe. By taking Satoshi’s texts and finding the 50 most common words, the NSA was able to break down his text into 5,000-word chunks and analyse each to find the frequency of those 50 words. This would result in a unique 50-number identifier for each chunk. The NSA then placed each of these numbers into a 50-dimensional space and flattened them into a plane using principal component analysis. The result is a ‘fingerprint’ for anything written by Satoshi that could easily be compared to any other writing.
The article itself is dubious and unsourced, and it borders a bit on conspiracy theory, but the underlying concept of stylometry and its implications for privacy will be interesting to many. Naturally, little of it is new.
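Though nothing here should be taken as the NSA’s actual pipeline, the mechanics described in the quote are easy to sketch. Here’s a minimal, purely illustrative Python version of my own (the function names and the synthetic toy corpus are mine): chunk the text, count the corpus’s most common words, and flatten the per-chunk frequency vectors with PCA.

```python
# A rough sketch of the "writer invariant" idea described above (illustrative
# only; names like `fingerprint` are mine, and the toy corpus is synthetic).
from collections import Counter

import numpy as np
from sklearn.decomposition import PCA


def chunk_words(text, size=5000):
    """Split a text into consecutive chunks of `size` words."""
    words = text.lower().split()
    chunks = [words[i:i + size] for i in range(0, len(words), size)]
    return [c for c in chunks if len(c) == size]


def fingerprint(texts, top_n=50, chunk_size=5000):
    """Per-chunk frequency vectors over the corpus's top_n most common words."""
    all_words = [w for t in texts for w in t.lower().split()]
    vocab = [w for w, _ in Counter(all_words).most_common(top_n)]
    rows = []
    for text in texts:
        for chunk in chunk_words(text, chunk_size):
            counts = Counter(chunk)
            rows.append([counts[w] / chunk_size for w in vocab])
    return np.array(rows), vocab


# Toy demo with synthetic "writing samples"; swap in real texts to use it.
rng = np.random.default_rng(0)
toy_words = ["the", "of", "and", "to", "a", "in", "that", "is", "was", "he"]
texts = [" ".join(rng.choice(toy_words, size=20_000)) for _ in range(3)]

vectors, _ = fingerprint(texts, top_n=10, chunk_size=5000)
plane = PCA(n_components=2).fit_transform(vectors)  # the "flattened" fingerprint
print(plane.shape)  # (number of chunks, 2)
```

In principle, chunks by the same author land near one another in the resulting plane, so attributing a disputed text amounts to seeing whose cluster its chunks fall closest to.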
Rolling out to Medium users over the coming week will be a new, more satisfying way for readers to give feedback to writers. We call it “Claps.” It’s no longer simply whether you like, or don’t like, something. Now you can give variable levels of applause to a story. Maybe clap once, or maybe 10 or 20 times. You’re in control and can clap to your heart’s desire.
Yet another way to “like” a post…
This reminds me a lot of Path’s pivot to stickers. We all know how relevant it has made them since.
And all this comes just after Netflix, the company that has probably done more research on ranking than any other, has gone from a multi-star rating to a simple thumbs up/thumbs down in the past month.
Most of what social media and other companies are really trying to measure is a signal-to-noise ratio, along with creating some semblance of dynamic range. A simple thumbs up creates almost no dynamic range compared to thumbs up/nothing/thumbs down. Major platforms drive enough traffic that the SNR all comes out in the wash. Without the negative intent (dislike, thumbs down, etc.) we’re missing out on some important data. It’s almost reminiscent of the scientific community publishing only its positive results and not its negative ones. As a result, scientific research is losing a tremendous amount of value.
We need to be more careful about what we’re doing and why…
SFI and Arizona State University soon will offer the world’s first comprehensive online master’s degree in complexity science. It will be the Institute’s first graduate degree program, a vision that dates to SFI’s founding. “With technology, a growing recognition of the value of online education, widespread acceptance of complexity science, and in partnership with ASU, we are now able to offer the world a degree in the field we helped invent,” says SFI President David Krakauer, “and it will be taught by the very people who built it into a legitimate domain of scholarship.”
Rich Borschelt is the communication director for science at the Department of Energy, and he recently attended a science communication workshop. In “Communication, Literacy, Policy: Thoughts on SciComm in a Democracy,” he describes at some length his frustration with the failed model of science communication, in which every meeting hashes over the same futile set of assumptions. After several other issues, he turns to the conference’s attitude toward scientists...
John’s note reminds me that I’ve been watching a growing and nasty trend against science, to say nothing of science communication, over the past several years. We’re going to need a lot more help than we’re getting lately to turn the tide for the better. Perhaps more scientists having their own websites and expanding on the practice of samizdat would help things out a bit?
I recently came across Science Sites, a non-profit web company, courtesy of mathematician Steven Strogatz, whose site they built. In some sense, I see part of what they’re doing as enabling scientists to become part of the IndieWeb. It would be great to see them support standards like Webmention or functionality like Micropub as well. (It looks like they’re doing a lot of their building on SquareSpace, so by proxy it would be great if SquareSpace supported these open standards.) I love that it seems to have been created by a group of science journalists to help out the cause.
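For anyone wondering what “supporting Webmention” actually entails on the sending side, the W3C spec boils down to two steps: discover the target page’s advertised endpoint, then POST the source and target URLs to it. A rough Python sketch of my own (the example URLs are placeholders, and the HTML fallback is a deliberate simplification):

```python
# A minimal sketch of sending a Webmention (per the W3C spec): discover the
# target page's endpoint, then POST source and target to it.
import re

import requests


def discover_endpoint(target):
    """Find the target's Webmention endpoint via the Link header or the HTML."""
    resp = requests.get(target, timeout=10)
    link = resp.links.get("webmention")  # HTTP Link header, rel="webmention"
    if link:
        return requests.compat.urljoin(target, link["url"])
    # Crude HTML fallback; a real implementation would parse the DOM properly.
    match = re.search(
        r'<(?:link|a)\s[^>]*rel="[^"]*webmention[^"]*"[^>]*href="([^"]*)"',
        resp.text,
    )
    return requests.compat.urljoin(target, match.group(1)) if match else None


def send_webmention(source, target):
    """Notify the target page that the source page links to it."""
    endpoint = discover_endpoint(target)
    if endpoint is None:
        return None
    return requests.post(endpoint, data={"source": source, "target": target}, timeout=10)


# e.g. send_webmention("https://my-site.example/a-post", "https://their-site.example/an-article")
```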
As I watch some of the Domain of One’s Own community in higher education, it feels to me that it’s primarily full of humanities-related professors and researchers and doesn’t seem to be doing enough outreach to their science, engineering, math, or other colleagues, who desperately need these tools as well as help with basic communication.
Abstract: Recent studies of active matter have stimulated interest in the driven self-assembly of complex structures. Phenomenological modeling of particular examples has yielded insight, but general thermodynamic principles unifying the rich diversity of behaviors observed have been elusive. Here, we study the stochastic search of a toy chemical space by a collection of reacting Brownian particles subject to periodic forcing. We observe the emergence of an adaptive resonance in the system matched to the drive frequency, and show that the increased work absorption by these resonant structures is key to their stabilization. Our findings are consistent with a recently proposed thermodynamic mechanism for far-from-equilibrium self-organization.
Suggested by First Support for a Physics Theory of Life in Quanta Magazine.
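The paper’s model is far richer than anything that fits in a blog post, but the basic ingredients in the abstract, Brownian motion plus a periodic drive plus a tally of absorbed work, are easy to toy with. Here’s a small sketch of my own (not the authors’ code; all parameter values are arbitrary) of an overdamped particle in a double-well potential under sinusoidal forcing:

```python
# A toy illustration (mine, not the paper's model) of an overdamped Brownian
# particle in a double-well potential driven by a periodic force, tallying the
# work it absorbs from the drive.
import numpy as np

rng = np.random.default_rng(1)


def grad_U(x):
    """Gradient of the double-well potential U(x) = x**4/4 - x**2/2."""
    return x**3 - x


dt, gamma, kT = 1e-3, 1.0, 0.1
amp, omega = 0.3, 2.0 * np.pi * 0.5   # drive F(t) = amp * cos(omega * t)

x, absorbed_work = 0.0, 0.0
for step in range(100_000):
    t = step * dt
    force = amp * np.cos(omega * t)
    noise = np.sqrt(2.0 * kT * dt / gamma) * rng.standard_normal()
    dx = (-grad_U(x) + force) * dt / gamma + noise
    absorbed_work += force * dx        # increment of work done by the drive
    x += dx

print(f"final position {x:.3f}, work absorbed from drive {absorbed_work:.3f}")
```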
Significance: A qualitatively more diverse range of possible behaviors emerge in many-particle systems once external drives are allowed to push the system far from equilibrium; nonetheless, general thermodynamic principles governing nonequilibrium pattern formation and self-assembly have remained elusive, despite intense interest from researchers across disciplines. Here, we use the example of a randomly wired driven chemical reaction network to identify a key thermodynamic feature of a complex, driven system that characterizes the “specialness” of its dynamical attractor behavior. We show that the network’s fixed points are biased toward the extremization of external forcing, causing them to become kinetically stabilized in rare corners of chemical space that are either atypically weakly or strongly coupled to external environmental drives.

Abstract: A chemical mixture that continually absorbs work from its environment may exhibit steady-state chemical concentrations that deviate from their equilibrium values. Such behavior is particularly interesting in a scenario where the environmental work sources are relatively difficult to access, so that only the proper orchestration of many distinct catalytic actors can power the dissipative flux required to maintain a stable, far-from-equilibrium steady state. In this article, we study the dynamics of an in silico chemical network with random connectivity in an environment that makes strong thermodynamic forcing available only to rare combinations of chemical concentrations. We find that the long-time dynamics of such systems are biased toward states that exhibit a fine-tuned extremization of environmental forcing.
Suggested by First Support for a Physics Theory of Life in Quanta Magazine.
Take chemistry, add energy, get life. The first tests of Jeremy England’s provocative origin-of-life hypothesis are in, and they appear to show how order can arise from nothing.
Interesting article with some great references I’ll need to delve into and read.
The situation changed in the late 1990s, when the physicists Gavin Crooks and Chris Jarzynski derived “fluctuation theorems” that can be used to quantify how much more often certain physical processes happen than reverse processes. These theorems allow researchers to study how systems evolve — even far from equilibrium.
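For reference, the two results usually meant here are Crooks’ fluctuation theorem and the Jarzynski equality that follows from it, with $\beta = 1/k_B T$, $W$ the work done on the system, and $\Delta F$ the equilibrium free-energy difference:

```latex
% Crooks fluctuation theorem: the odds of observing work W along a forward
% driving protocol versus -W along the time-reversed protocol
\frac{P_F(W)}{P_R(-W)} = e^{\beta\,(W - \Delta F)}

% Jarzynski equality (obtained from the above by integrating over W):
% an equilibrium free-energy difference from an average of nonequilibrium work
\left\langle e^{-\beta W} \right\rangle = e^{-\beta\,\Delta F}
```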
I want to take a look at these papers, as well as the several that the article directly discusses.
Any claims that it has to do with biology or the origins of life, he added, are “pure and shameless speculations.”
Some truly harsh words from his former supervisor? Wow!
maybe there’s more that you can get for free
Most of what’s here in this article (and likely in the underlying papers) sounds to me like it has been heavily influenced by the writings of W. Loewenstein and S. Kauffman. They’ve laid out some models and ideas that need more rigorous testing and work, and this seems like a reasonable start to the process. The “get for free” phrase itself is very S. Kauffman to my mind. I’m curious how many times it appears in his work.
14-16 May 2018; Auditorium Enric Casassas, Faculty of Chemistry, University of Barcelona, Barcelona, Spain
One of the most frequently used scientific words is “Entropy”. The reason is that it is related to two main scientific domains: physics and information theory. Its origin goes back to the start of physics (thermodynamics), but since Shannon, it has become related to information theory as well. This conference is an opportunity to bring researchers from these two communities together and create a synergy. The main topics and sessions of the conference cover:
- Physics: classical Thermodynamics and Quantum
- Statistical physics and Bayesian computation
- Geometrical science of information, topology and metrics
- Maximum entropy principle and inference
- Kullback and Bayes or information theory and Bayesian inference
- Entropy in action (applications)
Inter-disciplinary contributions from both theoretical and applied perspectives are very welcome, including papers addressing conceptual and methodological developments, as well as new applications of entropy and information theory.
All accepted papers will be published in the proceedings of the conference. Authors of a selection of invited and contributed talks presented during the conference will be invited to submit an extended version of their paper for a special issue of the open-access journal Entropy.
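As a small, concrete touchpoint for the information-theoretic side of that program (my own illustration, not conference material), the Shannon entropy and Kullback–Leibler divergence named in the topics take only a few lines to compute:

```python
# Shannon entropy H(p) and Kullback-Leibler divergence D(p || q) for discrete
# distributions, in bits.
import numpy as np


def entropy(p):
    """H(p) = -sum_i p_i log2 p_i, with 0 * log 0 treated as 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))


def kl_divergence(p, q):
    """D(p || q) = sum_i p_i log2(p_i / q_i); assumes q_i > 0 wherever p_i > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))


print(entropy([0.5, 0.5]))                    # 1.0 bit for a fair coin
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))  # cost (in bits) of assuming a fair coin
```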
In honor of tomorrow’s release of Jimmy Soni and Rob Goodman’s book A Mind at Play: How Claude Shannon Invented the Information Age, I’ve created an Information Theory playlist on Spotify.
Songs about communication, telephones, conversation, satellites, love, auto-tune and even one about a typewriter! They all relate at least tangentially to the topic at hand. To up the ante, everyone should realize that digital music would be impossible without Shannon’s seminal work.
Let me know in the comments or by replying to one of the syndicated copies listed below if there are any great tunes that the list is missing.
Enjoy the list and the book!
📖 Read pages 16-55 of A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni & Rob Goodman
Having read a lot about Shannon and even Vannevar Bush over the years, I’m pleasantly surprised to find some interesting tidbits about them that I’ve not previously come across. I was a bit worried that this text wouldn’t provide me with much, if anything, new on the subjects at hand.
I’m really appreciating some of the prose and writing structure, particularly given that it’s a collaborative work between two authors. At times there are some really nonstandard sentence structures, but they’re wonderful in their rule breaking.
They’re doing an excellent job so far of explaining the more difficult pieces of science relating to information theory. In fact, some of the introduction contains as good a set of simple explanations of what is going on within the topic as I think I’ve ever seen. I’m also pleased that they’ve made some interesting forays into topics like eugenics and the background role it played in Shannon’s story.
They had a chance to give a broader view of the history of computing but opted against it, or at least made a conscious choice to leave Babbage and Lovelace out of the greater pantheon. I can see narratively why they may have done this, knowing what is to come later in the text, but a few sentences as a nod would have been welcome.
The book does, however, get on my nerves with one of my personal pet peeves in popular science and biographical works like this: while there are reasonable notes at the end, absolutely no proper footnotes appear at the bottoms of pages, nor are there even indicators within the text beyond the occasional quotation marks. I’m glad the notes even exist in the back, but it drives me crazy that publishers blatantly hide them this way. The text could at least have had markers indicating where to find the notes. What are we? Animals?
Nota bene: I’m currently reading an advance reader copy of this; the book won’t be out until mid-July 2017.