Energy is the only universal currency; it is necessary for getting anything done. The conversion of energy on Earth ranges from the terraforming forces of plate tectonics to the cumulative erosive effects of raindrops. Life on Earth depends on the photosynthetic conversion of solar energy into plant biomass. Humans have come to rely on many more energy flows -- ranging from fossil fuels to photovoltaic generation of electricity -- for their civilized existence. In this monumental history, Vaclav Smil provides a comprehensive account of how energy has shaped society, from pre-agricultural foraging societies through today's fossil fuel--driven civilization.
Humans are the only species that can systematically harness energies outside their bodies, using the power of their intellect and an enormous variety of artifacts -- from the simplest tools to internal combustion engines and nuclear reactors. The epochal transition to fossil fuels affected everything: agriculture, industry, transportation, weapons, communication, economics, urbanization, quality of life, politics, and the environment. Smil describes humanity's energy eras in panoramic and interdisciplinary fashion, offering readers a magisterial overview. This book is an extensively updated and expanded version of Smil's Energy in World History (1994). Smil has incorporated an enormous amount of new material, reflecting the dramatic developments in energy studies over the last two decades and his own research over that time.
Observational data about human behavior is often heterogeneous, i.e., generated by subgroups within the population under study that vary in size and behavior. Heterogeneity predisposes analysis to Simpson's paradox, whereby the trends observed in data that has been aggregated over the entire population may differ substantially from those of the underlying subgroups. I illustrate Simpson's paradox with several examples from studies of online behavior and show that aggregate responses can lead to wrong conclusions about the underlying individual behavior. I then present a simple method to test whether Simpson's paradox is affecting the results of an analysis. The presence of Simpson's paradox in social data suggests that important behavioral differences exist within the population, and failure to take these differences into account can distort a study's findings.
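The reversal the abstract describes is easy to reproduce. Here is a minimal sketch on synthetic data (the slopes, offsets, and group sizes are my own illustrative choices, not from the paper): each subgroup shows a positive trend, yet the pooled data trends downward.

```python
# Simpson's paradox on synthetic data: each subgroup trends upward,
# but the pooled (aggregated) data trends downward.
import numpy as np

rng = np.random.default_rng(0)

def slope(x, y):
    """Least-squares slope of y regressed on x."""
    return np.polyfit(x, y, 1)[0]

# Two subgroups with the same positive within-group slope, offset so that
# the group with larger x values has smaller y values overall.
x1 = rng.uniform(0, 1, 200)
y1 = 2.0 + x1 + rng.normal(0, 0.05, 200)   # group 1: low x, high y
x2 = rng.uniform(2, 3, 200)
y2 = -2.0 + x2 + rng.normal(0, 0.05, 200)  # group 2: high x, low y

x, y = np.concatenate([x1, x2]), np.concatenate([y1, y2])

print(slope(x1, y1))  # positive (close to +1)
print(slope(x2, y2))  # positive (close to +1)
print(slope(x, y))    # negative: the aggregate trend reverses
```

The paradox here is purely a composition effect: the between-group offset dominates the within-group trend once the groups are pooled, which is exactly why testing subgroups separately matters.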
This paper answers Bell’s question: What does quantum information refer to? It is about quantum properties represented by subspaces of the quantum Hilbert space, or their projectors, to which standard (Kolmogorov) probabilities can be assigned by using a projective decomposition of the identity (PDI or framework) as a quantum sample space. The single framework rule of consistent histories prevents paradoxes or contradictions. When only one framework is employed, classical (Shannon) information theory can be imported unchanged into the quantum domain. A particular case is the macroscopic world of classical physics whose quantum description needs only a single quasiclassical framework. Nontrivial issues unique to quantum information, those with no classical analog, arise when aspects of two or more incompatible frameworks are compared.
All living things are made of cells, and all cells are powered by electrochemical charges across thin lipid membranes — the ‘proton motive force.’ We know how these electrical charges are generated by protein machines at virtually atomic resolution, but we know very little about how membrane bioenergetics first arose. By tracking back cellular evolution to the last universal common ancestor and beyond, scientist Nick Lane argues that geologically sustained electrochemical charges across semiconducting barriers were central to both energy flow and the formation of new organic matter — growth — at the very origin of life.
Dr. Lane is a professor of evolutionary biochemistry in the Department of Genetics, Evolution and Environment at University College London. His research focuses on how energy flow constrains evolution from the origin of life to the traits of complex multicellular organisms. He is a co-director of the new Centre for Life’s Origins and Evolution (CLOE) at UCL, and author of four celebrated books on life’s origins and evolution. His work has been recognized with the Biochemical Society Award in 2015 and the Royal Society Michael Faraday Prize in 2016.
About half the normal matter in our universe had never been observed – until now. Two teams have finally seen it by combining millions of faint images into one
Discoveries seem to back up many of our ideas about how the universe got its large-scale structure
Andrey Kravtsov (The University of Chicago) and Anatoly Klypin (New Mexico State University). Visualisation by Andrey Kravtsov
The missing links between galaxies have finally been found. This is the first detection of the roughly half of the normal matter in our universe – protons, neutrons and electrons – unaccounted for by previous observations of stars, galaxies and other bright objects in space.
You have probably heard about the hunt for dark matter, a mysterious substance thought to permeate the universe, the effects of which we can see through its gravitational pull. But our models of the universe also say there should be about twice as much ordinary matter out there, compared with what we have observed so far.
Jeffrey Hall, a retired professor at Brandeis University, shared the 2017 Nobel Prize in medicine for discoveries elucidating how our internal body clock works. He was honored along with Michael Young and his close collaborator Michael Rosbash. Hall said in an interview from his home in rural Maine that he collaborated with Rosbash because they shared...
This is an all-too-often-heard story. The difference is that now a Nobel Prize winner is telling it about himself!
News reporters and anchors have repeatedly referred to the recent tragedy in Las Vegas as the “worst mass shooting in U.S. history.” Like all things that are constantly repeated, the proclamation has become fact.
There’s some great history here. It reminds me about the podcast Seeing White which I’ve been listening to recently.
Art and science have in some ways always overlapped, with early scientists using illustrations to depict what they saw under the microscope. Janet Iwasa of the University of Utah is trying to re-establish this link to make thorny scientific data and models approachable to the common eye. Iwasa offers her brief but spectacular take on how 3D animation can make molecular science more accessible.
Visualizations can be tremendously valuable. This story reminds me of an Intersession course that Mary Spiro did at Johns Hopkins to help researchers communicate what their research is about as well as some of the work she did with the Johns Hopkins Institute for NanoBioTechnology.
As the ultimate information-processing device, the brain lends itself naturally to study with information theory. The application of information theory to neuroscience has spurred the development of principled theories of brain function, led to advances in the study of consciousness, and produced analytical techniques to crack the neural code, that is, to unveil the language used by neurons to encode and process information. In particular, advances in experimental techniques enabling precise recording and manipulation of neural activity on a large scale now allow, for the first time, the precise formulation and quantitative testing of hypotheses about how the brain encodes, and transmits across areas, the information used for specific functions.
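As a concrete, toy illustration of the kind of quantity involved in "cracking the neural code": the mutual information between a stimulus and a discrete neural response, estimated from their joint counts, measures (in bits) how much the response tells us about the stimulus. The function and data below are my own sketch of the standard plug-in estimator, not drawn from any contribution to the issue.

```python
# Plug-in estimate of the mutual information I(S;R), in bits, between a
# stimulus S and a discrete neural response R, from a joint-count matrix.
import numpy as np

def mutual_information(joint_counts):
    """I(S;R) in bits; rows index stimuli, columns index responses."""
    p = joint_counts / joint_counts.sum()
    ps = p.sum(axis=1, keepdims=True)   # stimulus marginal
    pr = p.sum(axis=0, keepdims=True)   # response marginal
    nz = p > 0                          # skip zero-probability cells
    return float((p[nz] * np.log2(p[nz] / (ps @ pr)[nz])).sum())

# A perfectly informative code: each stimulus evokes a distinct response.
print(mutual_information(np.array([[10, 0], [0, 10]])))  # 1.0 bit

# An uninformative code: the response is independent of the stimulus.
print(mutual_information(np.array([[5, 5], [5, 5]])))    # 0.0 bits
```

Real analyses must also correct this estimator for the bias introduced by limited sampling, which is one of the methodological problems such a special issue addresses.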
This Special Issue emphasizes contributions on novel approaches in neuroscience using information theory, and on the development of new information-theoretic results inspired by problems in neuroscience. Research at the interface of neuroscience, information theory, and other disciplines is also welcome.
A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory".
Deadline for manuscript submissions: 1 December 2017
The ‘creator’ of Bitcoin, Satoshi Nakamoto, is the world’s most elusive billionaire. Very few people outside of the Department of Homeland Security know Satoshi’s real name. In fact, DHS will not publicly confirm that even THEY know the billionaire’s identity. Satoshi has taken great care to keep his identity secret, employing the latest encryption and obfuscation methods in his communications. Despite these efforts (according to my source at the DHS), Satoshi Nakamoto gave investigators the only tool they needed to find him — his own words.
Using stylometry, one can compare texts to determine the authorship of a particular work. Over the years, Satoshi wrote thousands of posts and emails, most of which are publicly available. According to my source, the NSA was able to use the ‘writer invariant’ method of stylometry to compare Satoshi’s ‘known’ writings with trillions of writing samples from people across the globe. By taking Satoshi’s texts and finding the 50 most common words, the NSA was able to break his text into 5,000-word chunks and analyse each to find the frequency of those 50 words. This results in a unique 50-number identifier for each chunk. The NSA then placed each of these identifiers into a 50-dimensional space and flattened them onto a plane using principal components analysis. The result is a ‘fingerprint’ for anything written by Satoshi that can easily be compared to any other writing.
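For readers curious about the mechanics, the pipeline described above (top-50 word frequencies over 5,000-word chunks, projected onto a plane with principal components analysis) can be sketched as follows. This is my own illustrative reconstruction with hypothetical helper names; it is not anyone's actual forensic code, and real stylometry systems are considerably more careful about tokenization and normalization.

```python
# Illustrative sketch of the 'writer invariant' stylometry pipeline:
# frequencies of an author's 50 most common words, computed over
# 5,000-word chunks, projected to a plane with PCA.
from collections import Counter
import numpy as np

def chunk_words(text, chunk_size=5000):
    """Split a text into consecutive chunks of chunk_size words."""
    words = text.lower().split()
    return [words[i:i + chunk_size] for i in range(0, len(words), chunk_size)]

def top_words(text, k=50):
    """The k most common words across the author's known writings."""
    return [w for w, _ in Counter(text.lower().split()).most_common(k)]

def fingerprint(chunks, vocab):
    """One row per chunk: relative frequency of each vocab word."""
    rows = []
    for chunk in chunks:
        counts = Counter(chunk)
        rows.append([counts[w] / len(chunk) for w in vocab])
    return np.array(rows)

def pca_plane(X):
    """Project the frequency vectors onto their top two principal components."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:2].T

# Usage: fingerprints from two corpora can then be compared in the plane.
corpus = "known writings of the author " * 4000   # placeholder text
F = fingerprint(chunk_words(corpus), top_words(corpus))
coords = pca_plane(F)                              # one 2-D point per chunk
```

The privacy implication is that these frequency vectors are cheap to compute and hard to consciously suppress, which is why the technique generalizes beyond this one story.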
The article itself is dubious and unsourced, and it borders on conspiracy theory, but the underlying concept of stylometry and its implications for privacy will be interesting to many. Naturally, none of this is particularly new.
Rolling out to Medium users over the coming week will be a new, more satisfying way for readers to give feedback to writers. We call it “Claps.” It’s no longer simply whether you like, or don’t like, something. Now you can give variable levels of applause to a story. Maybe clap once, or maybe 10 or 20 times. You’re in control and can clap to your heart’s desire.
Yet another way to “like” a post….
This reminds me a lot of Path’s pivot to stickers. We all know how relevant it has made them since.
And all this comes just after Netflix, the company that has probably done more research on ranking than any other, moved from a multi-star rating to a simple thumbs up/thumbs down in the past month.
Most of the measurements social media and other companies are really trying to make come down to signal-to-noise ratios, along with creating some semblance of dynamic range. A simple thumbs up creates almost no dynamic range compared to thumbs up/nothing/thumbs down. Major platforms drive enough traffic that the SNR all comes out in the wash. Without negative intent (dislike, thumbs down, etc.), we're missing out on some important data. It's almost reminiscent of the scientific community publishing only its positive results and not its negative ones; as a result, scientific research loses a tremendous amount of value.
We need to be more careful what we’re doing and why…