From the groundbreaking author team behind the bestselling Winner-Take-All Politics, a timely and topical work that examines what’s good for American business and what’s good for Americans—and why those interests are misaligned.<br><br>
In Winner-Take-All Politics, Jacob S. Hacker and Paul Pierson explained how political elites have enabled and propelled plutocracy. Now in American Amnesia, they trace the economic and political history of the United States over the last century and show how a viable mixed economy has long been the dominant engine of America’s prosperity.<br><br>
Like every other prospering democracy, the United States developed a mixed economy that channeled the spirit of capitalism into strong growth and healthy social development. In this bargain, government and business were as much partners as rivals. Public investments in education, science, transportation, and technology laid the foundation for broadly based prosperity. Programs of economic security and progressive taxation provided a floor of protection. Business focused on the pursuit of profit, and government addressed needs business could not.<br><br>
The mixed economy was the most important social innovation of the twentieth century. It spread a previously unimaginable level of broad prosperity. It enabled steep increases in education, health, longevity, and economic security. And yet, extraordinarily, it is anathema to many current economic and political elites. And as the advocates of anti-government free-market fundamentalism have gained power, they are hell-bent on scrapping the instrument of nearly a century of unprecedented economic and social progress. In American Amnesia, Hacker and Pierson explain how—and why they must be stopped.
Earlier tonight I watched a segment on The PBS NewsHour about infrastructure in America that featured this book, which came out earlier this year.
Remarkable progress in quantum information theory (QIT) has made it possible to formulate mathematical theorems giving conditions under which data transmission or data processing occurs with a non-negative entropy gain. However, the relation of these results, formulated in terms of entropy gain in quantum channels, to the temporal evolution of real physical systems is not thoroughly understood. Here we build on the mathematical formalism provided by QIT to formulate the quantum H-theorem in terms of physical observables. We discuss the manifestation of the second law of thermodynamics in quantum physics and uncover special situations where the second law can be violated. We further demonstrate that the typical evolution of energy-isolated quantum systems occurs with non-diminishing entropy.
G. B. Lesovik, A. V. Lebedev, I. A. Sadovskyy, M. V. Suslov, and V. M. Vinokur, “H-theorem in quantum physics,” Scientific Reports, vol. 6, p. 32815, 12 Sep 2016. [Online]. Available: http://dx.doi.org/10.1038/srep32815
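As a toy illustration of non-negative entropy gain (my own sketch, not from the paper): send a qubit pure state through a depolarizing channel of strength λ. The output state is diagonal with eigenvalues {1 − λ/2, λ/2}, so its von Neumann entropy reduces to a binary entropy and can be computed in a few lines of plain Python:

```python
import math

def binary_entropy(p):
    """Shannon entropy (in bits) of a two-point eigenvalue distribution {p, 1-p}."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def depolarized_entropy(lam):
    """Von Neumann entropy of the pure state |0><0| after a depolarizing
    channel rho -> (1 - lam) * rho + lam * I/2. The output is diagonal
    with eigenvalues {1 - lam/2, lam/2}."""
    return binary_entropy(lam / 2)

# Entropy gain is non-negative and grows with the channel strength:
for lam in (0.0, 0.25, 0.5, 1.0):
    print(f"lambda = {lam:4.2f}  ->  S = {depolarized_entropy(lam):.4f} bits")
```

The depolarizing channel is unital (it maps the identity to itself), which is exactly the class of channels for which QIT guarantees a non-negative entropy gain; the monotone increase from 0 to 1 bit here is a special case of that theorem.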
The Santa Fe Institute, in New Mexico, is a place for studying complex systems. I’ve never been there! Next week I’ll go there to give a colloquium on network theory, and also to participate in this workshop.
I just found out about this from John Carlos Baez and wish I could go! How have I not heard about it before now?
November 16, 2016 – November 18, 2016
Noyce Conference Room
This workshop will address a fundamental question in theoretical biology: Does the relationship between statistical physics and the need of biological systems to process information underpin some of their deepest features? It recognizes that a core feature of biological systems is that they acquire, store, and process information (i.e., perform computation). However, to manipulate information in this way they require a steady flux of free energy from their environments. These two interrelated attributes of biological systems are often taken for granted; they are not part of standard analyses of either the homeostasis or the evolution of biological systems. In this workshop we aim to fill in this major gap in our understanding of biological systems by gaining deeper insight into the relation between the need for biological systems to process information and the free energy they need to pay for that processing.
The goal of this workshop is to address these issues by focusing on three specific questions:
How has the fraction of the free energy flux on Earth that is used by biological computation changed with time?
What is the free energy cost of biological computation/function?
What is the free energy cost of the evolution of biological computation/function?
In all of these cases we are interested in the fundamental limits that the laws of physics impose on various aspects of living systems as expressed by these three questions.
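One such fundamental limit is Landauer's principle: erasing a single bit of information dissipates at least kT ln 2 of free energy. A quick back-of-the-envelope sketch (my own illustration; the ATP figure is a textbook ballpark, not from the workshop page) puts that bound alongside the free energy of a single ATP hydrolysis, biology's standard energy currency:

```python
import math

# Physical constants (exact values in the 2019 SI)
K_B = 1.380649e-23      # Boltzmann constant, J/K
N_A = 6.02214076e23     # Avogadro constant, 1/mol

T = 300.0               # roughly physiological temperature, K

# Landauer's principle: erasing one bit costs at least kT * ln 2.
landauer_bound = K_B * T * math.log(2)   # ~2.9e-21 J per bit

# For scale: the standard free energy of ATP hydrolysis is roughly
# 30.5 kJ/mol (an assumed textbook figure, not from the workshop page).
atp_per_molecule = 30.5e3 / N_A          # ~5.1e-20 J per molecule

print(f"Landauer bound at {T} K: {landauer_bound:.3g} J/bit")
print(f"One ATP hydrolysis:      {atp_per_molecule:.3g} J")
print(f"Ratio: one ATP ~ {atp_per_molecule / landauer_bound:.1f} Landauer bits")
```

On these numbers a single ATP could, in principle, pay for erasing a few tens of bits, which is one way to frame the workshop's question of how close biological computation comes to the physical limit.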
Purpose: Research Collaboration
SFI Host: David Krakauer, Michael Lachmann, Manfred Laubichler, Peter Stadler, and David Wolpert
Peter Woit has just made the final draft (dated 10/25/16) of his new textbook Quantum Theory, Groups and Representations: An Introduction freely available for download from his website. It covers quantum theory with a heavy emphasis on groups and representation theory and “contains significant amounts of material not well-explained elsewhere.” He expects to finish up the diagrams and publish it sometime next year, potentially through Springer.
Advanced Data Analysis from an Elementary Point of View
by Cosma Rohilla Shalizi
This is a draft textbook on data analysis methods, intended for a one-semester course for advanced undergraduate students who have already taken classes in probability, mathematical statistics, and linear regression. It began as the lecture notes for 36-402 at Carnegie Mellon University.
By making this draft generally available, I am not promising to provide any assistance or even clarification whatsoever. Comments are, however, welcome.
The book is under contract to Cambridge University Press; it should be turned over to the press before the end of 2015. A copy of the next-to-final version will remain freely accessible here permanently.
Generalized Linear Models and Generalized Additive Models
Classification and Regression Trees
II. Distributions and Latent Structure
Relative Distributions and Smooth Tests of Goodness-of-Fit
Principal Components Analysis
Nonlinear Dimensionality Reduction
III. Dependent Data
Spatial and Network Data
IV. Causal Inference
Graphical Causal Models
Identifying Causal Effects
Causal Inference from Experiments
Estimating Causal Effects
Discovering Causal Structure
Appendices
Data-Analysis Problem Sets
Reminders from Linear Algebra
Big O and Little o Notation
Algebra with Expectations and Variances
Propagation of Error, and Standard Errors for Derived Quantities
chi-squared and the Likelihood Ratio Test
Proof of the Gauss-Markov Theorem
Rudimentary Graph Theory
Writing R Functions
Random Variable Generation
Unified treatment of information-theoretic topics (relative entropy / Kullback-Leibler divergence, entropy, mutual information and independence, hypothesis-testing interpretations) in an appendix, with references from chapters on density estimation, on EM, and on independence testing
More detailed treatment of calibration and calibration-checking (part II)
Missing data and imputation (part II)
Move d-separation material from “causal models” chapter to graphical models chapter, as it has no specifically causal content (parts II and IV)?
Expand treatment of partial identification for causal inference, including partial identification of effects by looking at all data-compatible DAGs (part IV)
Figure out how to cut at least 50 pages
Make sure notation is consistent throughout: insist that vectors are always matrices, or use more geometric notation?
Move simulation to an appendix
Move variance/weights chapter to right before logistic regression
Move some appendices online (i.e., after references)?
(Text last updated 30 March 2016; this page last updated 6 November 2015)
Directed by Sam Peckinpah. With Warren Oates, Isela Vega, Robert Webber, Gig Young. An American bartender and his prostitute girlfriend go on a road trip through the Mexican underworld to collect a $1 million bounty on the head of a dead gigolo.
In Washington: A Life, celebrated biographer Ron Chernow provides a richly nuanced portrait of the father of our nation. With a breadth and depth matched by no other one-volume life of Washington, this crisply paced narrative carries the reader through his troubled boyhood, his precocious feats in the French and Indian War, his creation of Mount Vernon, his heroic exploits with the Continental Army, his presiding over the Constitutional Convention, and his magnificent performance as America's first president.
Despite the reverence his name inspires, Washington remains a lifeless waxwork for many Americans, worthy but dull. A laconic man of granite self-control, he often arouses more respect than affection. In this groundbreaking work, based on massive research, Chernow dashes forever the stereotype of a stolid, unemotional man. A strapping six feet, Washington was a celebrated horseman, elegant dancer, and tireless hunter, with a fiercely guarded emotional life. Chernow brings to vivid life a dashing, passionate man of fiery opinions and many moods. Probing his private life, he explores his fraught relationship with his crusty mother, his youthful infatuation with the married Sally Fairfax, and his often conflicted feelings toward his adopted children and grandchildren. He also provides a lavishly detailed portrait of his marriage to Martha and his complex behavior as a slave master.
At the same time, Washington is an astute and surprising portrait of a canny political genius who knew how to inspire people. Not only did Washington gather around himself the foremost figures of the age, including James Madison, Alexander Hamilton, John Adams, and Thomas Jefferson, but he also brilliantly orchestrated their actions to shape the new federal government, define the separation of powers, and establish the office of the presidency.
In this unique biography, Ron Chernow takes us on a page-turning journey through all the formative events of America's founding. With a dramatic sweep worthy of its giant subject, Washington is a magisterial work from one of our most elegant storytellers.
Traditional network television programming has always followed the same script: executives approve a pilot, order a trial number of episodes, and broadcast them, expecting viewers to watch a given show on their television sets at the same time every week. But then came Netflix's House of Cards. Netflix gauged the show's potential from data it had gathered about subscribers' preferences, ordered two seasons without seeing a pilot, and uploaded the first thirteen episodes all at once for viewers to watch whenever they wanted on the devices of their choice. In this book, Michael Smith and Rahul Telang, experts on entertainment analytics, show how the success of House of Cards upended the film and TV industries -- and how companies like Amazon and Apple are changing the rules in other entertainment industries, notably publishing and music.
We're living through a period of unprecedented technological disruption in the entertainment industries. Just about everything is affected: pricing, production, distribution, piracy. Smith and Telang discuss niche products and the long tail, product differentiation, price discrimination, and incentives for users not to steal content. To survive and succeed, businesses have to adapt rapidly and creatively. Smith and Telang explain how. How can companies discover who their customers are, what they want, and how much they are willing to pay for it? Data. The entertainment industries must learn to play a little "moneyball." The bottom line: follow the data.
Galois theory is a very beautiful classical theory of field extensions of a certain type (Galois extensions), initiated by Galois in the 19th century. It explains, in particular, why it is not possible to solve an equation of degree 5 or more in the same way as we solve quadratic or cubic equations. You will learn to compute Galois groups and (before that) study the properties of various field extensions.
We first shall survey the basic notions and properties of field extensions: algebraic, transcendental, and finite field extensions, the degree of an extension, algebraic closure, and the splitting field of a polynomial.
Then we shall do a bit of commutative algebra (finite algebras over a field, base change via tensor product) and apply this to study the notion of separability in some detail.
After that we shall discuss Galois extensions and Galois correspondence and give many examples (cyclotomic extensions, finite fields, Kummer extensions, Artin-Schreier extensions, etc.).
We shall address the question of solvability of equations by radicals (Abel theorem). We shall also try to explain the relation to representations and to topological coverings.
Finally, we shall briefly discuss extensions of rings (integral elements, norms, traces, etc.) and explain how to use reduction modulo primes to compute Galois groups.
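The reduction-mod-primes technique can be illustrated with a classic example (my own sketch, not from the course materials). For x^5 − x − 1, a standard argument gives Galois group S5: mod 2 the polynomial factors into irreducibles of degrees 2 and 3 (so Frobenius acts as a (2,3)-cycle, whose cube is a transposition), while mod 5 it is irreducible by Artin-Schreier (a 5-cycle), and a transposition together with a 5-cycle generate S5. The factor degrees can be detected with nothing but polynomial gcds over GF(p), since gcd(f, x^(p^d) − x) collects the irreducible factors of f of degree dividing d:

```python
# Polynomials over GF(p) as coefficient lists, lowest degree first,
# with trailing zeros stripped ([] is the zero polynomial).

def trim(a):
    while a and a[-1] == 0:
        a.pop()
    return a

def pmul(a, b, p):
    if not a or not b:
        return []
    res = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            res[i + j] = (res[i + j] + ai * bj) % p
    return trim(res)

def psub(a, b, p):
    n = max(len(a), len(b))
    a, b = a + [0] * (n - len(a)), b + [0] * (n - len(b))
    return trim([(x - y) % p for x, y in zip(a, b)])

def pmod(a, f, p):
    """Remainder of a modulo a monic polynomial f."""
    a = trim([c % p for c in a])
    while len(a) >= len(f):
        c, off = a[-1], len(a) - len(f)
        for i, fi in enumerate(f):
            a[off + i] = (a[off + i] - c * fi) % p
        trim(a)
    return a

def pgcd(a, b, p):
    while b:
        inv = pow(b[-1], p - 2, p)          # make b monic (p prime)
        b = [(c * inv) % p for c in b]
        a, b = b, pmod(a, b, p)
    return a

def ppow(base, e, f, p):
    """base^e modulo f, by square-and-multiply."""
    result = [1]
    while e:
        if e & 1:
            result = pmod(pmul(result, base, p), f, p)
        base = pmod(pmul(base, base, p), f, p)
        e >>= 1
    return result

def factor_degree_profile(f, p):
    """For d = 1..deg f, the degree of gcd(f, x^(p^d) - x), i.e. of the
    product of the irreducible factors of f whose degree divides d."""
    x = [0, 1]
    g, out = x, []
    for _ in range(len(f) - 1):
        g = ppow(g, p, f, p)                # g = x^(p^d) mod f
        gcd = pgcd(f, psub(g, x, p), p)
        out.append(len(gcd) - 1 if gcd else 0)
    return out

f = lambda p: [(-1) % p, (-1) % p, 0, 0, 0, 1]   # x^5 - x - 1 over GF(p)
print(factor_degree_profile(f(2), 2))  # [0, 2, 3, 2, 0]: degrees 2 and 3
print(factor_degree_profile(f(5), 5))  # [0, 0, 0, 0, 5]: irreducible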
I’ve been watching MOOCs for several years and this is one of the few I’ve come across that covers some more advanced mathematical topics. I’m curious to see how it turns out and what type of interest/results it returns.
It’s being offered by National Research University – Higher School of Economics (HSE) in Russia.
Logical probability theory was developed as a quantitative measure based on Boole's logic of subsets. Information theory, by contrast, was developed into a mature theory by Claude Shannon with no such connection to logic. A recent development in logic changes this situation. In category theory, the notion of a subset is dual to the notion of a quotient set or partition, and recently the logic of partitions has been developed in a parallel relationship to the Boolean logic of subsets (subset logic is usually mis-specified as the special case of propositional logic). What, then, is the quantitative measure based on partition logic in the same sense that logical probability theory is based on subset logic? It is a measure of information that is named "logical entropy" in view of that logical basis. This paper develops the notion of logical entropy and the basic notions of the resulting logical information theory. Then an extensive comparison is made with the corresponding notions based on Shannon entropy.
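Ellerman's measure has a simple closed form: for block probabilities p_1, …, p_n, the logical entropy is h = 1 − Σ p_i², the probability that two independent draws land in different blocks of the partition. A quick sketch contrasting it with Shannon entropy (my own illustration of the published definition, not code from the paper):

```python
import math

def logical_entropy(probs):
    """Ellerman's logical entropy h = 1 - sum(p_i^2): the probability that
    two independent draws fall in different blocks of the partition."""
    return 1.0 - sum(p * p for p in probs)

def shannon_entropy(probs):
    """Shannon entropy in bits, for comparison."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# On the uniform distribution over n outcomes:
#   logical entropy -> 1 - 1/n  (bounded above by 1)
#   Shannon entropy -> log2(n)  (unbounded)
for n in (2, 4, 16):
    u = [1.0 / n] * n
    print(n, logical_entropy(u), shannon_entropy(u))
```

The bounded-versus-unbounded behavior on uniform distributions is one of the contrasts the paper's comparison with Shannon entropy turns on.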
Ellerman is visiting at UC Riverside at the moment. Given the information theory and category theory overlap, I’m curious whether he’s working with John Carlos Baez, or whether Baez is aware of this work.
Based on a cursory look at his website(s), I’m going to have to start following more of this work.
The book is a collection of papers written by a selection of eminent authors from around the world in honour of Gregory Chaitin’s 60th birthday. This is a unique volume including technical contributions, philosophical papers and essays.
Hardcover: 468 pages;
Publisher: World Scientific Publishing Company (October 18, 2007);
The Science of the Oven
Columbia University Press
Mayonnaise "takes" when a series of liquids form a semisolid consistency. Eggs, a liquid, become solid as they are heated, whereas, under the same conditions, solids melt. When meat is roasted, its surface browns and it acquires taste and texture. What accounts for these extraordinary transformations? The answer: chemistry and physics. With his trademark eloquence and wit, Hervé This launches a wry investigation into the chemical art of cooking. Unraveling the science behind common culinary technique and practice, Hervé This breaks food down to its molecular components and matches them to cooking's chemical reactions. He translates the complex processes of the oven into everyday knowledge for professional chefs and casual cooks, and he demystifies the meaning of taste and the making of flavor. He describes the properties of liquids, salts, sugars, oils, and fats and defines the principles of culinary practice, which endow food with sensual as well as nutritional value.
For fans of Hervé This's popular volumes and for those new to his celebrated approach, The Science of the Oven expertly expands the possibilities of the kitchen, fusing the physiology of taste with the molecular structure of bodies and food.