Linking Economic Complexity, Institutions and Income Inequality
A country's mix of products predicts its subsequent pattern of diversification and economic growth. But does this product mix also predict income inequality? Here we combine methods from econometrics, network science, and economic complexity to show that countries exporting complex products (as measured by the Economic Complexity Index) have lower levels of income inequality than countries exporting simpler products. Using multivariate regression analysis, we show that economic complexity is a significant and negative predictor of income inequality and that this relationship is robust to controlling for aggregate measures of income, institutions, export concentration, and human capital. Moreover, we introduce a measure that associates a product with a level of income inequality equal to the average GINI of the countries exporting that product (weighted by the share the product represents in that country's export basket). We use this measure together with the network of related products (or product space) to illustrate how the development of new products is associated with changes in income inequality. These findings show that economic complexity captures information about an economy's level of development that is relevant to the ways an economy generates and distributes its income. Moreover, these findings suggest that a country's productive structure may limit its range of income inequality. Finally, we make our results available through an online resource that allows its users to visualize the structural transformation of over 150 countries and their associated changes in income inequality between 1963 and 2008.
MIT has a pretty good lay-person’s overview of this article. The final published version is separately available.
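The product-level measure described in the abstract is simple enough to sketch in a few lines. This is a minimal illustration, not the authors' code: the function name and the toy export/Gini numbers below are my own assumptions.

```python
# Sketch of the abstract's measure: each product is assigned the average GINI
# of the countries exporting it, weighted by the share the product represents
# in each country's export basket. All numbers here are illustrative.

def product_gini(exports, gini):
    """exports: {country: {product: export_value}}; gini: {country: GINI}."""
    weights = {}  # product -> {country: share of that country's export basket}
    for country, basket in exports.items():
        total = sum(basket.values())
        for product, value in basket.items():
            weights.setdefault(product, {})[country] = value / total
    # Weighted average of country GINIs for each product.
    return {product: sum(w[c] * gini[c] for c in w) / sum(w.values())
            for product, w in weights.items()}

exports = {"A": {"copper": 80, "machinery": 20},
           "B": {"copper": 10, "machinery": 90}}
gini = {"A": 0.52, "B": 0.30}
pgi = product_gini(exports, gini)
print(pgi)
```

In this toy example the simpler product (copper) inherits a higher inequality level than machinery, because it is weighted toward the high-GINI exporter, which is the qualitative pattern the paper reports.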
Post filtering fixes at Homebrew Website Club
Last night Tucker Hottes, Den Temple and I held the first Homebrew Website Club at The Keys in Scranton, PA. I really appreciate that HWC will force me to set aside some time to work on my personal site since it is often neglected for more pressing projects.
Nota bene: Colin is dogfooding his IndieWeb friendly WordPress theme on Github! It’s a beautiful, simple, and very clean theme for a personal website/blog.
Colin, do you mind if we provide a link to your theme on https://indieweb.org/WordPress/Themes for others to potentially use and/or improve upon? (See also discussion at https://indieweb.org/WordPress/Development#Themes.)
From Bacteria to Bach and Back: The Evolution of Minds
by Daniel C. Dennett (W. W. Norton & Company; 1st edition, 496 pages; February 7, 2017)
One of America’s foremost philosophers offers a major new account of the origins of the conscious mind.
How did we come to have minds?
For centuries, this question has intrigued psychologists, physicists, poets, and philosophers, who have wondered how the human mind developed its unrivaled ability to create, imagine, and explain. Disciples of Darwin have long aspired to explain how consciousness, language, and culture could have appeared through natural selection, blazing promising trails that tend, however, to end in confusion and controversy. Even though our understanding of the inner workings of proteins, neurons, and DNA is deeper than ever before, the matter of how our minds came to be has largely remained a mystery.
That is now changing, says Daniel C. Dennett. In From Bacteria to Bach and Back, his most comprehensive exploration of evolutionary thinking yet, he builds on ideas from computer science and biology to show how a comprehending mind could in fact have arisen from a mindless process of natural selection. Part philosophical whodunit, part bold scientific conjecture, this landmark work enlarges themes that have sustained Dennett’s legendary career at the forefront of philosophical thought.
In his inimitable style―laced with wit and arresting thought experiments―Dennett explains that a crucial shift occurred when humans developed the ability to share memes, or ways of doing things not based in genetic instinct. Language, itself composed of memes, turbocharged this interplay. Competition among memes―a form of natural selection―produced thinking tools so well-designed that they gave us the power to design our own memes. The result, a mind that not only perceives and controls but can create and comprehend, was thus largely shaped by the process of cultural evolution.
An agenda-setting book for a new generation of philosophers, scientists, and thinkers, From Bacteria to Bach and Back will delight and entertain anyone eager to make sense of how the mind works and how it came about.
4 color, 18 black-and-white illustrations
🔖 Want to read: From Bacteria to Bach and Back: The Evolution of Minds by Daniel C. Dennett
IPAM Workshop on Regulatory and Epigenetic Stochasticity in Development and Disease (Institute for Pure and Applied Mathematics, UCLA | March 1-3, 2017)
Epigenetics refers to information transmitted during cell division other than the DNA sequence per se, and it is the language that distinguishes stem cells from somatic cells, one organ from another, and even identical twins from each other. In contrast to the DNA sequence, the epigenome is relatively susceptible to modification by the environment as well as stochastic perturbations over time, adding to phenotypic diversity in the population. Despite its strong ties to the environment, epigenetics has never been well reconciled with evolutionary thinking, and in fact there is now strong evidence against the transmission of so-called “epi-alleles,” i.e. epigenetic modifications that pass through the germline.
However, genetic variants that regulate stochastic fluctuation of gene expression and phenotypes in the offspring appear to be transmitted as an epigenetic or even Lamarckian trait. Furthermore, even the normal process of cellular differentiation from a single cell to a complex organism is not understood well from a mathematical point of view. There is increasingly strong evidence that stem cells are highly heterogeneous and in fact stochasticity is necessary for pluripotency. This process appears to be tightly regulated through the epigenome in development. Moreover, in these biological contexts, “stochasticity” is hardly synonymous with “noise”, which often refers to variation which obscures a “true signal” (e.g., measurement error) or which is structural, as in physics (e.g., quantum noise). In contrast, “stochastic regulation” refers to purposeful, programmed variation; the fluctuations are random but there is no true signal to mask.
This workshop will serve as a forum for scientists and engineers with an interest in computational biology to explore the role of stochasticity in regulation, development and evolution, and its epigenetic basis. Just as thinking about stochasticity was transformative in physics and in some areas of biology, it promises to fundamentally transform modern genetics and help to explain phase transitions such as differentiation and cancer.
This workshop will include a poster session; a request for poster titles will be sent to registered participants in advance of the workshop.
Adam Arkin (Lawrence Berkeley Laboratory)
Gábor Balázsi (SUNY Stony Brook)
Domitilla Del Vecchio (Massachusetts Institute of Technology)
Michael Elowitz (California Institute of Technology)
Andrew Feinberg (Johns Hopkins University)
Don Geman (Johns Hopkins University)
Anita Göndör (Karolinska Institutet)
John Goutsias (Johns Hopkins University)
Garrett Jenkinson (Johns Hopkins University)
Andre Levchenko (Yale University)
Olgica Milenkovic (University of Illinois)
Johan Paulsson (Harvard University)
Leor Weinberger (University of California, San Francisco (UCSF))
IPAM Workshop on Gauge Theory and Categorification (Institute for Pure and Applied Mathematics, UCLA | March 6-10, 2017)
The equations of gauge theory lie at the heart of our understanding of particle physics. The Standard Model, which describes the electromagnetic, weak, and strong forces, is based on the Yang-Mills equations. Starting with the work of Donaldson in the 1980s, gauge theory has also been successfully applied in other areas of pure mathematics, such as low dimensional topology, symplectic geometry, and algebraic geometry.
More recently, Witten proposed a gauge-theoretic interpretation of Khovanov homology, a knot invariant whose origins lie in representation theory. Khovanov homology is a “categorification” of the celebrated Jones polynomial, in the sense that its Euler characteristic recovers this polynomial. At the moment, Khovanov homology is only defined for knots in the three-sphere, but Witten’s proposal holds the promise of generalizations to other three-manifolds, and perhaps of producing new invariants of four-manifolds.
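The categorification statement can be made precise: up to normalization conventions, the graded Euler characteristic of the bigraded Khovanov homology groups of a knot \(K\) recovers the (unnormalized) Jones polynomial,

```latex
\hat{J}(K)(q) \;=\; \sum_{i,j} (-1)^{i}\, q^{j}\, \dim Kh^{i,j}(K),
```

so the polynomial invariant is the "shadow" of a richer homological invariant, in the same way an Euler characteristic is the shadow of ordinary homology.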
This workshop will bring together researchers from several different fields (theoretical physics, mathematical gauge theory, topology, analysis / PDE, representation theory, symplectic geometry, and algebraic geometry), and thus help facilitate connections between these areas. The common focus will be to understand Khovanov homology and related invariants through the lens of gauge theory.
This workshop will include a poster session; a request for posters will be sent to registered participants in advance of the workshop.
Edward Witten will be giving two public lectures as part of the Green Family Lecture series:
March 6, 2017
From Gauge Theory to Khovanov Homology Via Floer Theory
The goal of the lecture is to describe a gauge theory approach to Khovanov homology of knots, in particular, to motivate the relevant gauge theory equations in a way that does not require too much physics background. I will give a gauge theory perspective on the construction of singly-graded Khovanov homology by Abouzaid and Smith.
March 8, 2017
An Introduction to the SYK Model
The Sachdev-Ye model was originally a model of quantum spin liquids that was introduced in the mid-1990s. In recent years, it has been reinterpreted by Kitaev as a model of quantum chaos and black holes. This lecture will be primarily a gentle introduction to the SYK model, though I will also describe a few more recent results.
A special issue of Entropy
Deadline for manuscript submissions: 31 August 2017
Special Issue Editor
Dr. Brendon J. Brewer
Department of Statistics, The University of Auckland, Private Bag 92019, Auckland 1142, New Zealand
Bayesian inference, Markov chain Monte Carlo, nested sampling, MaxEnt
Special Issue Information
Whereas Bayesian inference has now achieved mainstream acceptance and is widely used throughout the sciences, associated ideas such as the principle of maximum entropy (implicit in the work of Gibbs, and developed further by Ed Jaynes and others) have not. There are strong arguments that the principle (and variations, such as maximum relative entropy) is of fundamental importance, but the literature also contains many misguided attempts at applying it, leading to much confusion.
This Special Issue will focus on Bayesian inference and MaxEnt. Some open questions that spring to mind are: Which proposed ways of using entropy (and its maximisation) in inference are legitimate, which are not, and why? Where can we obtain constraints on probability assignments, the input needed by the MaxEnt procedure?
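For readers unfamiliar with the MaxEnt procedure the call mentions, the classic toy case is Jaynes' "Brandeis dice" problem: find the maximum-entropy distribution over the faces of a die subject to a mean constraint. The solution has the exponential form p_i ∝ exp(λi), with the Lagrange multiplier λ fixed by the constraint. A minimal sketch (the bracketing interval and tolerance are my own choices):

```python
# Maximum entropy over die faces 1..6 subject to a mean constraint.
# The MaxEnt solution is p_i ∝ exp(lam * i); we find lam by bisection,
# using the fact that the mean is monotonically increasing in lam.
import math

def maxent_die(target_mean, faces=range(1, 7), tol=1e-12):
    def mean(lam):
        w = [math.exp(lam * f) for f in faces]
        z = sum(w)
        return sum(f * wi for f, wi in zip(faces, w)) / z

    lo, hi = -50.0, 50.0  # bracket for the Lagrange multiplier
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * f) for f in faces]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_die(4.5)
print([round(pi, 4) for pi in p])
```

With a constrained mean of 4.5 the probabilities increase monotonically toward the high faces, reproducing Jaynes' well-known result; with a mean of 3.5 the procedure returns the uniform distribution, as it should.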
More generally, papers exploring any interesting connections between probabilistic inference and information theory will be considered. Papers presenting high quality applications, or discussing computational methods in these areas, are also welcome.
Dr. Brendon J. Brewer
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs).
Source: Entropy | Special Issue : Maximum Entropy and Bayesian Methods
Irreversibility and Heat Generation in the Computing Process
It is argued that computing machines inevitably involve devices which perform logical functions that do not have a single-valued inverse. This logical irreversibility is associated with physical irreversibility and requires a minimal heat generation, per machine cycle, typically of the order of kT for each irreversible function. This dissipation serves the purpose of standardizing signals and making them independent of their exact logical history. Two simple, but representative, models of bistable devices are subjected to a more detailed analysis of switching kinetics to yield the relationship between speed and energy dissipation, and to estimate the effects of errors induced by thermal fluctuations.
A classical paper in the history of entropy.
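The "order of kT" scale in Landauer's abstract is easy to put in numbers: the minimum dissipation for erasing one bit is kT ln 2, which at room temperature is a few zeptojoules.

```python
# Landauer's bound: erasing one bit dissipates at least k_B * T * ln 2.
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0           # room temperature, K

e_min = k_B * T * math.log(2)
print(f"{e_min:.3e} J per bit")  # ~2.87e-21 J
```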