🔖 Linking Economic Complexity, Institutions and Income Inequality

Linking Economic Complexity, Institutions and Income Inequality by Dominik Hartmann, Miguel R. Guevara, Cristian Jara-Figueroa, Manuel Aristarán, César A. Hidalgo (arxiv.org)
A country's mix of products predicts its subsequent pattern of diversification and economic growth. But does this product mix also predict income inequality? Here we combine methods from econometrics, network science, and economic complexity to show that countries exporting complex products (as measured by the Economic Complexity Index) have lower levels of income inequality than countries exporting simpler products. Using multivariate regression analysis, we show that economic complexity is a significant and negative predictor of income inequality and that this relationship is robust to controlling for aggregate measures of income, institutions, export concentration, and human capital. Moreover, we introduce a measure that associates a product with a level of income inequality equal to the average GINI of the countries exporting that product (weighted by the share the product represents in that country's export basket). We use this measure together with the network of related products (or product space) to illustrate how the development of new products is associated with changes in income inequality. These findings show that economic complexity captures information about an economy's level of development that is relevant to the ways an economy generates and distributes its income. Moreover, these findings suggest that a country's productive structure may limit its range of income inequality. Finally, we make our results available through an online resource that allows its users to visualize the structural transformation of over 150 countries and their associated changes in income inequality between 1963 and 2008.

MIT News has a pretty good layperson's overview of this article. The final published version is separately available.
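
The product-level inequality measure is simple enough to sketch in a few lines of Python. Here's my own toy version (made-up data and variable names, nothing from the authors' code): each product is scored with the average Gini of the countries exporting it, weighted by the share the product represents in each country's export basket.

```python
import numpy as np

# exports[c, p] = value of product p exported by country c (made-up numbers)
exports = np.array([
    [100.0, 20.0,  0.0],   # country A
    [ 10.0, 80.0, 30.0],   # country B
    [  0.0,  5.0, 60.0],   # country C
])
gini = np.array([0.30, 0.45, 0.55])   # Gini coefficient of each country

# share[c, p] = share that product p represents in country c's export basket
share = exports / exports.sum(axis=1, keepdims=True)

# A product's score: the average Gini of its exporters, weighted by shares
product_gini = (share * gini[:, None]).sum(axis=0) / share.sum(axis=0)
print(np.round(product_gini, 3))
```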


Income inequality linked to export “complexity” | MIT News

Income inequality linked to export “complexity” by Larry Hardesty (MIT News)
The mix of products that countries export is a good predictor of income distribution, study finds.


🔖 The Epidemic Spreading Model and the Direction of Information Flow in Brain Networks

The Epidemic Spreading Model and the Direction of Information Flow in Brain Networks by J. Meier, X. Zhou, A. Hillebrand, P. Tewarie, C.J. Stam, P. Van Mieghem (NeuroImage, February 5, 2017)
The interplay between structural connections and emerging information flow in the human brain remains an open research problem. A recent study observed global patterns of directional information flow in empirical data using the measure of transfer entropy. For higher frequency bands, the overall direction of information flow was from posterior to anterior regions whereas an anterior-to-posterior pattern was observed in lower frequency bands. In this study, we applied a simple Susceptible-Infected-Susceptible (SIS) epidemic spreading model on the human connectome with the aim of revealing the topological properties of the structural network that give rise to these global patterns. We found that direct structural connections induced higher transfer entropy between two brain regions and that transfer entropy decreased with increasing distance between nodes (in terms of hops in the structural network). Applying the SIS model, we were able to confirm the empirically observed opposite information flow patterns, and posterior hubs in the structural network seem to play a dominant role in the network dynamics. For small time scales, when these hubs acted as strong receivers of information, the global pattern of information flow was in the posterior-to-anterior direction and in the opposite direction when they were strong senders. Our analysis suggests that these global patterns of directional information flow are the result of an unequal spatial distribution of the structural degree between posterior and anterior regions and their directions seem to be linked to different time scales of the spreading process.
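
The spreading model itself is delightfully simple. Here's my own minimal sketch of SIS dynamics on a toy graph in Python (synchronous updates and made-up parameters; not the authors' code, and no transfer-entropy analysis):

```python
import numpy as np

rng = np.random.default_rng(0)
# adjacency matrix of a toy five-node network
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 1, 1],
              [0, 1, 1, 0, 1],
              [0, 0, 1, 1, 0]])
beta, delta = 0.3, 0.2           # per-step infection and recovery probabilities
state = np.zeros(5, dtype=bool)  # False = susceptible, True = infected
state[2] = True                  # seed the highest-degree node

for t in range(50):
    pressure = A @ state                    # infected neighbours per node
    p_infect = 1 - (1 - beta) ** pressure   # at least one transmission succeeds
    infect = rng.random(5) < p_infect
    recover = rng.random(5) < delta
    state = (state & ~recover) | (~state & infect)

print("infected nodes after 50 steps:", np.flatnonzero(state))
```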

🔖 Want to read: From Bacteria to Bach and Back: The Evolution of Minds by Daniel C. Dennett

From Bacteria to Bach and Back: The Evolution of Minds by Daniel C. Dennett (W. W. Norton & Company, 1st edition, 496 pages, February 7, 2017)
One of America’s foremost philosophers offers a major new account of the origins of the conscious mind.

How did we come to have minds?

For centuries, this question has intrigued psychologists, physicists, poets, and philosophers, who have wondered how the human mind developed its unrivaled ability to create, imagine, and explain. Disciples of Darwin have long aspired to explain how consciousness, language, and culture could have appeared through natural selection, blazing promising trails that tend, however, to end in confusion and controversy. Even though our understanding of the inner workings of proteins, neurons, and DNA is deeper than ever before, the matter of how our minds came to be has largely remained a mystery.

That is now changing, says Daniel C. Dennett. In From Bacteria to Bach and Back, his most comprehensive exploration of evolutionary thinking yet, he builds on ideas from computer science and biology to show how a comprehending mind could in fact have arisen from a mindless process of natural selection. Part philosophical whodunit, part bold scientific conjecture, this landmark work enlarges themes that have sustained Dennett’s legendary career at the forefront of philosophical thought.

In his inimitable style―laced with wit and arresting thought experiments―Dennett explains that a crucial shift occurred when humans developed the ability to share memes, or ways of doing things not based in genetic instinct. Language, itself composed of memes, turbocharged this interplay. Competition among memes―a form of natural selection―produced thinking tools so well-designed that they gave us the power to design our own memes. The result, a mind that not only perceives and controls but can create and comprehend, was thus largely shaped by the process of cultural evolution.

An agenda-setting book for a new generation of philosophers, scientists, and thinkers, From Bacteria to Bach and Back will delight and entertain anyone eager to make sense of how the mind works and how it came about.

4 color, 18 black-and-white illustrations


IPAM Workshop on Regulatory and Epigenetic Stochasticity in Development and Disease, March 1-3

IPAM Workshop on Regulatory and Epigenetic Stochasticity in Development and Disease (Institute for Pure and Applied Mathematics, UCLA | March 1-3, 2017)
Epigenetics refers to information transmitted during cell division other than the DNA sequence per se, and it is the language that distinguishes stem cells from somatic cells, one organ from another, and even identical twins from each other. In contrast to the DNA sequence, the epigenome is relatively susceptible to modification by the environment as well as stochastic perturbations over time, adding to phenotypic diversity in the population. Despite its strong ties to the environment, epigenetics has never been well reconciled to evolutionary thinking, and in fact there is now strong evidence against the transmission of so-called “epi-alleles,” i.e. epigenetic modifications that pass through the germline.

However, genetic variants that regulate stochastic fluctuation of gene expression and phenotypes in the offspring appear to be transmitted as an epigenetic or even Lamarckian trait. Furthermore, even the normal process of cellular differentiation from a single cell to a complex organism is not understood well from a mathematical point of view. There is increasingly strong evidence that stem cells are highly heterogeneous and in fact stochasticity is necessary for pluripotency. This process appears to be tightly regulated through the epigenome in development. Moreover, in these biological contexts, “stochasticity” is hardly synonymous with “noise”, which often refers to variation which obscures a “true signal” (e.g., measurement error) or which is structural, as in physics (e.g., quantum noise). In contrast, “stochastic regulation” refers to purposeful, programmed variation; the fluctuations are random but there is no true signal to mask.
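To get a feel for what "stochastic regulation" looks like at the single-cell level, here's a minimal Gillespie simulation of the simplest gene-expression model: mRNA produced at a constant rate and degraded molecule by molecule. It's my own illustrative sketch, not material from the workshop.

```python
import numpy as np

rng = np.random.default_rng(2)
k, g = 10.0, 1.0          # production rate and per-molecule degradation rate
m, t, weighted = 0, 0.0, 0.0

while t < 1000.0:
    total = k + g * m                  # total rate of all possible events
    dt = rng.exponential(1.0 / total)  # waiting time to the next event
    weighted += m * dt                 # accumulate time-weighted copy number
    t += dt
    if rng.random() < k / total:
        m += 1                         # production event
    else:
        m -= 1                         # degradation event

# The stationary distribution is Poisson with mean k/g = 10
print("time-averaged mRNA copy number:", round(weighted / t, 2))
```
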

This workshop will serve as a forum for scientists and engineers with an interest in computational biology to explore the role of stochasticity in regulation, development and evolution, and its epigenetic basis. Just as thinking about stochasticity was transformative in physics and in some areas of biology, it promises to fundamentally transform modern genetics and help to explain phase transitions such as differentiation and cancer.

This workshop will include a poster session; a request for poster titles will be sent to registered participants in advance of the workshop.

Speaker List:
Adam Arkin (Lawrence Berkeley Laboratory)
Gábor Balázsi (SUNY Stony Brook)
Domitilla Del Vecchio (Massachusetts Institute of Technology)
Michael Elowitz (California Institute of Technology)
Andrew Feinberg (Johns Hopkins University)
Don Geman (Johns Hopkins University)
Anita Göndör (Karolinska Institutet)
John Goutsias (Johns Hopkins University)
Garrett Jenkinson (Johns Hopkins University)
Andre Levchenko (Yale University)
Olgica Milenkovic (University of Illinois)
Johan Paulsson (Harvard University)
Leor Weinberger (University of California, San Francisco (UCSF))


Entropy | Special Issue: Maximum Entropy and Bayesian Methods

Entropy | Special Issue: Maximum Entropy and Bayesian Methods (mdpi.com)
Open for submission now
Deadline for manuscript submissions: 31 August 2017
A special issue of Entropy (ISSN 1099-4300).

Special Issue Editor

Guest Editor: Dr. Brendon J. Brewer
Department of Statistics, The University of Auckland, Private Bag 92019, Auckland 1142, New Zealand
Website | E-Mail | Phone: +64275001336
Interests: Bayesian inference, Markov chain Monte Carlo, nested sampling, MaxEnt

Special Issue Information

Dear Colleagues,

Whereas Bayesian inference has now achieved mainstream acceptance and is widely used throughout the sciences, associated ideas such as the principle of maximum entropy (implicit in the work of Gibbs, and developed further by Ed Jaynes and others) have not. There are strong arguments that the principle (and variations, such as maximum relative entropy) is of fundamental importance, but the literature also contains many misguided attempts at applying it, leading to much confusion.

This Special Issue will focus on Bayesian inference and MaxEnt. Some open questions that spring to mind are: Which proposed ways of using entropy (and its maximisation) in inference are legitimate, which are not, and why? Where can we obtain constraints on probability assignments, the input needed by the MaxEnt procedure?

More generally, papers exploring any interesting connections between probabilistic inference and information theory will be considered. Papers presenting high quality applications, or discussing computational methods in these areas, are also welcome.

Dr. Brendon J. Brewer
Guest Editor

Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs).


Source: Entropy | Special Issue: Maximum Entropy and Bayesian Methods
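
The canonical warm-up for the MaxEnt procedure the call refers to is Jaynes's Brandeis dice problem: choose the distribution over die faces with maximum entropy subject to a mean constraint. Here's a quick sketch of my own in Python; the solution takes the exponential form p_i ∝ exp(λ x_i), with λ fixed by the constraint.

```python
import numpy as np
from scipy.optimize import brentq

x = np.arange(1, 7)      # die faces
target_mean = 4.5        # the constraint: E[x] = 4.5

def mean_of(lam):
    w = np.exp(lam * x)  # unnormalised MaxEnt weights
    return (w * x).sum() / w.sum()

# Solve for the Lagrange multiplier that matches the constraint
lam = brentq(lambda l: mean_of(l) - target_mean, -10, 10)
p = np.exp(lam * x)
p /= p.sum()
print("p =", np.round(p, 4))
print("entropy =", round(-(p * np.log(p)).sum(), 4), "nats")
```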

🎧 Air-cured sausages | Eat This Podcast

Air-cured sausages by Jeremy Cherfas (Eat This Podcast)
Among the more miraculous edible transformations is the one that turns raw meat, salt and a few basic spices into some of the most delicious foods around.

Time was when curing meat, especially stuffed into a casing to make a sausage, was the only way both to use every part of an animal and to help make it last longer than raw meat. Done right, a sausage would stay good to the next slaughtering season and beyond.

The process relied on the skill of the sausage-maker, the help of beneficial bacteria and moulds, the right conditions, a great deal of patience, and sometimes luck. Luck is less of a factor now, because to keep up with demand the vast majority of cured meats are produced in artificial conditions of controlled precision. Here and there, though, the old ways survive. Jan Davison spent months touring the sausage high-spots of Europe looking for the genuine article, and shared some of her favourites at the Oxford Symposium on Food and Cookery last year.

This tempts me greatly to consider decommissioning an incubator from science-related use to food-related use…


🎧 Bog Butter | Eat This Podcast

Bog Butter by Jeremy Cherfas (Eat This Podcast, March 4, 2013)
Peat diggers in Ireland and elsewhere have occasionally unearthed objects, usually made of wood, that contained some kind of greasy, fatty material with a “distinctive, pungent and slightly offensive smell”. Butter. Centuries-old butter.

Who buried it, and why, remain mysteries that motivated Ben Reade, an experimental chef at the Nordic Food Lab in Copenhagen, to make some himself. He brought some of his modern-day bog butter, still nestled in moss and wrapped in its birch-bark barrel, to share with the Oxford Symposium on Food and Cookery last year.

Notes:
Ben mentioned two plants that have been found around bog butter, hypnum moss (Hypnum cupressiforme) and bog cotton (Eriophorum angustifolium).
The Nordic Food Lab research blog details all of their astonishing edible experiments.
I found Seamus Heaney reading his poem Bogland at The Internet Poetry Archive.
Caroline Earwood (1997) Bog Butter: A Two Thousand Year History, The Journal of Irish Archaeology, 8: 25-42 is available at JSTOR, which has a new scheme allowing you to read up to three items at a time online for free.
Music by Dan-O at DanoSongs.com.

An awesome little podcast I found recently, so I’m going back to the beginning to catch up on all the past episodes. Science, food, heaps of technical expertise, great interviews, and spectacular production quality. Highly recommend it to everyone.


🔖 The Hypercycle: A Principle of Natural Self-Organization | Springer

The Hypercycle: A Principle of Natural Self-Organization by Manfred Eigen and Peter Schuster (Springer, 1979)
This book originated from a series of papers which were published in "Die Naturwissenschaften" in 1977/78. Its division into three parts is the reflection of a logical structure, which may be abstracted in the form of three theses:

A. Hypercycles are a principle of natural self-organization allowing an integration and coherent evolution of a set of functionally coupled self-replicative entities.

B. Hypercycles are a novel class of nonlinear reaction networks with unique properties, amenable to a unified mathematical treatment.

C. Hypercycles are able to originate in the mutant distribution of a single Darwinian quasi-species through stabilization of its diverging mutant genes. Once nucleated, hypercycles evolve to higher complexity by a process analogous to gene duplication and specialization.

In order to outline the meaning of the first statement we may refer to another principle of material self-organization, namely to Darwin's principle of natural selection. This principle as we see it today represents the only understood means for creating information, be it the blueprint for a complex living organism which evolved from less complex ancestral forms, or be it a meaningful sequence of letters the selection of which can be simulated by evolutionary model games.

Part A in .pdf format.
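
Thesis B is easy to poke at numerically. Below is my own minimal sketch of the standard elementary-hypercycle replicator equations, in which species i is catalyzed by species i-1 and a dilution flux keeps the total concentration fixed. The rate constants are toy values, not anything from the book.

```python
import numpy as np

n = 4
k = np.array([1.0, 1.2, 0.9, 1.1])  # toy catalytic rate constants
x = np.full(n, 1.0 / n)             # relative concentrations, summing to 1
dt = 0.01

for step in range(20000):
    growth = k * x * np.roll(x, 1)  # x_i grows at rate k_i * x_i * x_{i-1}
    phi = growth.sum()              # dilution flux keeping sum(x) constant
    x += dt * (growth - phi * x)    # forward-Euler step

print("steady state:", np.round(x, 3), "sum:", round(x.sum(), 3))
```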


🔖 Cognition and biology: perspectives from information theory

Cognition and biology: perspectives from information theory by Roderick Wallace (ncbi.nlm.nih.gov)
The intimate relation between biology and cognition can be formally examined through statistical models constrained by the asymptotic limit theorems of communication theory, augmented by methods from statistical mechanics and nonequilibrium thermodynamics. Cognition, often involving submodules that act as information sources, is ubiquitous across the living state. Less metabolic free energy is consumed by permitting crosstalk between biological information sources than by isolating them, leading to evolutionary exaptations that assemble shifting, tunable cognitive arrays at multiple scales and levels of organization to meet dynamic patterns of threat and opportunity. Cognition is thus necessary for life, but it is not sufficient: An organism represents a highly patterned outcome of path-dependent, blind variation, selection, interaction, and chance extinction in the context of an adequate flow of free energy and an environment fit for development. Complex, interacting cognitive processes within an organism both record and instantiate those evolutionary and developmental trajectories.

🔖 Thermodynamics of Prediction

Thermodynamics of Prediction by Susanne Still, David A. Sivak, Anthony J. Bell, and Gavin E. Crooks (journals.aps.org Phys. Rev. Lett. 109, 120604 (2012))
A system responding to a stochastic driving signal can be interpreted as computing, by means of its dynamics, an implicit model of the environmental variables. The system’s state retains information about past environmental fluctuations, and a fraction of this information is predictive of future ones. The remaining nonpredictive information reflects model complexity that does not improve predictive power, and thus represents the ineffectiveness of the model. We expose the fundamental equivalence between this model inefficiency and thermodynamic inefficiency, measured by dissipation. Our results hold arbitrarily far from thermodynamic equilibrium and are applicable to a wide range of systems, including biomolecular machines. They highlight a profound connection between the effective use of information and efficient thermodynamic operation: any system constructed to keep memory about its environment and to operate with maximal energetic efficiency has to be predictive.
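
The split between memory and prediction is easy to see in a toy system of my own devising: a binary Markov environment and a "system" that merely copies it with errors. The information the system keeps about the present environment exceeds the fraction of it that predicts the next step; the gap is the nonpredictive part.

```python
import numpy as np

rng = np.random.default_rng(1)
T, p_flip, p_err = 100_000, 0.1, 0.2

x = np.zeros(T, dtype=int)
for t in range(1, T):                    # two-state Markov environment
    x[t] = x[t - 1] ^ (rng.random() < p_flip)
s = x ^ (rng.random(T) < p_err)          # system state: a noisy copy of x_t

def mutual_info(a, b):
    """Plug-in estimate of mutual information (in bits) for binary series."""
    joint = np.histogram2d(a, b, bins=2)[0] / len(a)
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return (joint[nz] * np.log2(joint[nz] / (pa @ pb)[nz])).sum()

print("memory     I(s_t; x_t)     =", round(mutual_info(s, x), 3))
print("predictive I(s_t; x_{t+1}) =", round(mutual_info(s[:-1], x[1:]), 3))
```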

🔖 Statistical Physics of Adaptation

Statistical Physics of Adaptation by Nikolay Perunov, Robert A. Marsland, and Jeremy L. England (journals.aps.org Phys. Rev. X 6, 021036 (2016))
Whether by virtue of being prepared in a slowly relaxing, high-free energy initial condition, or because they are constantly dissipating energy absorbed from a strong external drive, many systems subject to thermal fluctuations are not expected to behave in the way they would at thermal equilibrium. Rather, the probability of finding such a system in a given microscopic arrangement may deviate strongly from the Boltzmann distribution, raising the question of whether thermodynamics still has anything to tell us about which arrangements are the most likely to be observed. In this work, we build on past results governing nonequilibrium thermodynamics and define a generalized Helmholtz free energy that exactly delineates the various factors that quantitatively contribute to the relative probabilities of different outcomes in far-from-equilibrium stochastic dynamics. By applying this expression to the analysis of two examples—namely, a particle hopping in an oscillating energy landscape and a population composed of two types of exponentially growing self-replicators—we illustrate a simple relationship between outcome-likelihood and dissipative history. In closing, we discuss the possible relevance of such a thermodynamic principle for our understanding of self-organization in complex systems, paying particular attention to a possible analogy to the way evolutionary adaptations emerge in living things.
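
Their first example is easy to caricature numerically. Here's a toy version of my own: a particle making Metropolis hops on a ring of sites whose energies are driven by a travelling oscillation, so the occupation statistics drift away from the Boltzmann distribution of the static landscape. None of this is the authors' code.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sites, kT, steps = 8, 1.0, 200_000
base = rng.uniform(0, 2, n_sites)   # static part of the energy landscape

def energy(i, t):
    # static landscape plus a travelling oscillation (the external drive)
    return base[i] + 1.5 * np.sin(0.05 * t + 2 * np.pi * i / n_sites)

pos, counts = 0, np.zeros(n_sites)
for t in range(steps):
    trial = (pos + rng.choice([-1, 1])) % n_sites
    dE = energy(trial, t) - energy(pos, t)
    if dE <= 0 or rng.random() < np.exp(-dE / kT):  # Metropolis acceptance
        pos = trial
    counts[pos] += 1

boltz = np.exp(-base / kT)
boltz /= boltz.sum()
print("empirical occupation:   ", np.round(counts / steps, 3))
print("Boltzmann (static part):", np.round(boltz, 3))
```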

🔖 Meaning = Information + Evolution by Carlo Rovelli

Meaning = Information + Evolution by Carlo Rovelli (arxiv.org)
Notions like meaning, signal, and intentionality are difficult to relate to a physical world. I study a purely physical definition of "meaningful information", from which these notions can be derived. It is inspired by a model recently illustrated by Kolchinsky and Wolpert, and improves on Dretske's classic work on the relation between knowledge and information. I discuss what makes a physical process into a "signal".

🔖 Irreversibility and Heat Generation in the Computing Process by R. Landauer

Irreversibility and Heat Generation in the Computing Process by R. Landauer (ieeexplore.ieee.org)
It is argued that computing machines inevitably involve devices which perform logical functions that do not have a single-valued inverse. This logical irreversibility is associated with physical irreversibility and requires a minimal heat generation, per machine cycle, typically of the order of kT for each irreversible function. This dissipation serves the purpose of standardizing signals and making them independent of their exact logical history. Two simple, but representative, models of bistable devices are subjected to a more detailed analysis of switching kinetics to yield the relationship between speed and energy dissipation, and to estimate the effects of errors induced by thermal fluctuations.

A classic paper in the history of entropy.
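
The kT scale Landauer mentions is astonishingly small. As a quick back-of-envelope check, the bound for erasing a single bit works out to kT ln 2:

```python
from math import log

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K
# Landauer's bound: minimum dissipation to erase one bit is kT ln 2
print(f"kT ln 2 at {T:.0f} K = {k_B * T * log(2):.3e} J")   # about 2.9e-21 J
```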


🔖 Why Boltzmann Brains Are Bad by Sean M. Carroll

Why Boltzmann Brains Are Bad by Sean M. Carroll (arxiv.org)
Some modern cosmological models predict the appearance of Boltzmann Brains: observers who randomly fluctuate out of a thermal bath rather than naturally evolving from a low-entropy Big Bang. A theory in which most observers are of the Boltzmann Brain type is generally thought to be unacceptable, although opinions differ. I argue that such theories are indeed unacceptable: the real problem is with fluctuations into observers who are locally identical to ordinary observers, and their existence cannot be swept under the rug by a choice of probability distributions over observers. The issue is not that the existence of such observers is ruled out by data, but that the theories that predict them are cognitively unstable: they cannot simultaneously be true and justifiably believed.