How to Sidestep Mathematical Equations in Popular Science Books

In the publishing industry there is a general rule of thumb that every mathematical equation included in a popular science book cuts its audience in half – presumably in geometric progression. This means that including even a handful of equations leaves you with an effective readership of near zero – something no author, and certainly no editor or publisher, wants.
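Taken at face value the rule implies exponential decay. As a tongue-in-cheek sketch (my notation, not an industry formula), if R_0 is the audience of an equation-free book, then after n equations

\[ R(n) \approx R_0 \cdot 2^{-n} \]

so a mere ten equations would shrink the audience by a factor of roughly a thousand.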

I suspect there is a corollary to this: every picture included in the text helps to increase your readership, though probably not by as large a proportion.

In any case, while reading Melanie Mitchell’s book Complexity: A Guided Tour [Oxford University Press, 2009] this weekend, I noticed that, in what appears to be a concerted effort to include an equation without technically writing it into the text – and to simultaneously increase readership by including a picture – she cleverly used a photograph of Boltzmann’s tombstone in Vienna! Most fans of thermodynamics will immediately recognize Boltzmann’s equation for entropy, S = k log W, which is engraved on the tombstone above his bust.

Page 51 of Melanie Mitchell’s book “Complexity: A Guided Tour” featuring Boltzmann’s tombstone in Vienna.
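For readers who would like that one smuggled equation unpacked: in modern notation it reads

\[ S = k_B \ln W \]

where S is the entropy of a macrostate, W is the number of microscopic arrangements (microstates) compatible with it, and k_B ≈ 1.38 × 10⁻²³ J/K is Boltzmann’s constant; the “log” on the stone is the natural logarithm.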

I hope that future mathematicians, scientists, and engineers will keep this in mind and have their tombstones engraved with key formulae to assist authors in doing the same – perhaps it will help to increase the amount of mathematics the general public deems “acceptable”.

Regard the World as Made of Information

John Archibald Wheeler (1911-2008), American theoretical physicist
[attributed by Jacob Bekenstein in “Information in the Holographic Universe” (Scientific American, 2007)]

 

Photograph of John Archibald Wheeler.

Rod, Can You Tell Our Contestant What She’s Won?

Possibly one of the oddest closing sentences I’ve ever read in a technical book – and a very good one at that:

This pressure can be calculated by minimizing the Helmholtz function of the system. Details can be found in Fermi’s textbook on thermodynamics (Fermi 1956). But why does osmosis explain the behavior of a salted cucumber? This question is left to the reader as a parting gift.

André Thess in The Entropy Principle: Thermodynamics for the Unsatisfied (Springer, 2011)
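As a rough pointer for anyone who doesn’t want to wait for Fermi: in the dilute-solution limit, that free-energy minimization yields the van ’t Hoff relation

\[ \Pi = c R T \]

where \Pi is the osmotic pressure, c the molar concentration of dissolved solute, R the gas constant, and T the absolute temperature. The salted cucumber remains, of course, an exercise for the reader.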

 

Featured image by KTRYNA on Unsplash

A Cosmologically Centered Definition of Hydrogen

An anonymous wit defining hydrogen in light of the Big Bang Theory
As relayed by David Christian in his book Maps of Time: An Introduction to Big History

 

Book cover of “Maps of Time: An Introduction to Big History”

Bookmarked Thermodynamics of natural selection III: Landauer's principle in computation and chemistry by Eric Smith (Journal of Theoretical Biology, Volume 252, Issue 2, 21 May 2008, Pages 213-220)
This is the third in a series of three papers devoted to energy flow and entropy changes in chemical and biological processes, and their relations to the thermodynamics of computation. The previous two papers have developed reversible chemical transformations as idealizations for studying physiology and natural selection, and derived bounds from the second law of thermodynamics, between information gain in an ensemble and the chemical work required to produce it. This paper concerns the explicit mapping of chemistry to computation, and particularly the Landauer decomposition of irreversible computations, in which reversible logical operations generating no heat are separated from heat-generating erasure steps which are logically irreversible but thermodynamically reversible. The Landauer arrangement of computation is shown to produce the same entropy-flow diagram as that of the chemical Carnot cycles used in the second paper of the series to idealize physiological cycles. The specific application of computation to data compression and error-correcting encoding also makes possible a Landauer analysis of the somewhat different problem of optimal molecular recognition, which has been considered as an information theory problem. It is shown here that bounds on maximum sequence discrimination from the enthalpy of complex formation, although derived from the same logical model as the Shannon theorem for channel capacity, arise from exactly the opposite model for erasure.
https://doi.org/10.1016/j.jtbi.2008.02.013
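The Landauer bound that the paper builds on is easy to put a number to. A minimal sketch (my own illustration, not code or data from the paper) of the minimum heat dissipated per erased bit at room temperature:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K
T = 300.0           # assumed room temperature in kelvin

# Landauer's principle: logically irreversible erasure of one bit
# dissipates at least k_B * T * ln(2) of heat into the environment.
landauer_bound_joules = K_B * T * math.log(2)

print(f"Minimum heat per erased bit at {T:.0f} K: {landauer_bound_joules:.3e} J")
# prints roughly 2.87e-21 J
```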
Bookmarked Thermodynamics of natural selection II: Chemical Carnot cycles by Eric Smith (Journal of Theoretical Biology, Volume 252, Issue 2, 21 May 2008, Pages 198-212)
This is the second in a series of three papers devoted to energy flow and entropy changes in chemical and biological processes, and to their relations to the thermodynamics of computation. In the first paper of the series, it was shown that a general-form dimensional argument from the second law of thermodynamics captures a number of scaling relations governing growth and development across many domains of life. It was also argued that models of physiology based on reversible transformations provide sensible approximations within which the second-law scaling is realized. This paper provides a formal basis for decomposing general cyclic, fixed-temperature chemical reactions, in terms of the chemical equivalent of Carnot's cycle for heat engines. It is shown that the second law relates the minimal chemical work required to perform a cycle to the Kullback–Leibler divergence produced in its chemical output ensemble from that of a Gibbs equilibrium. Reversible models of physiology are used to create reversible models of natural selection, which relate metabolic energy requirements to information gain under optimal conditions. When dissipation is added to models of selection, the second-law constraint is generalized to a relation between metabolic work and the combined energies of growth and maintenance.
https://doi.org/10.1016/j.jtbi.2008.02.008
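Schematically (my notation, not the paper’s), the second-law bound described in the abstract takes the form

\[ W \;\geq\; k_B T \, D_{KL}\!\left(p \,\|\, p_{\mathrm{eq}}\right), \qquad D_{KL}\!\left(p \,\|\, p_{\mathrm{eq}}\right) = \sum_i p_i \ln \frac{p_i}{p_{\mathrm{eq},i}} \]

where W is the chemical work spent producing the output ensemble p, p_eq is the corresponding Gibbs equilibrium distribution, and equality holds in the reversible limit.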
Bookmarked Thermodynamics of natural selection I: Energy flow and the limits on organization by Eric Smith (Journal of Theoretical Biology, Volume 252, Issue 2, 21 May 2008, Pages 185-197)
This is the first of three papers analyzing the representation of information in the biosphere, and the energetic constraints limiting the imposition or maintenance of that information. Biological information is inherently a chemical property, but is equally an aspect of control flow and a result of processes equivalent to computation. The current paper develops the constraints on a theory of biological information capable of incorporating these three characterizations and their quantitative consequences. The paper illustrates the need for a theory linking energy and information by considering the problem of existence and resilience of the biosphere, and presents empirical evidence from growth and development at the organismal level suggesting that the theory developed will capture relevant constraints on real systems. The main result of the paper is that the limits on the minimal energetic cost of information flow will be tractable and universal whereas the assembly of more literal process models into a system-level description often is not. The second paper in the series then goes on to construct reversible models of energy and information flow in chemistry which achieve the idealized limits, and the third paper relates these to fundamental operations of computation.
https://doi.org/10.1016/j.jtbi.2008.02.010