Owning a book isn’t the same as reading it; we need only look at our own bloated bookshelves for confirmation.
Information is a precise concept that can be defined mathematically, but its relationship to what we call "knowledge" is not always made clear. Furthermore, the concepts "entropy" and "information", while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.
Comments: 19 pages, 2 figures. To appear in Philosophical Transactions of the Royal Society A
Subjects: Adaptation and Self-Organizing Systems (nlin.AO); Information Theory (cs.IT); Biological Physics (physics.bio-ph); Quantitative Methods (q-bio.QM)
Cite as: arXiv:1601.06176 [nlin.AO] (or arXiv:1601.06176v1 [nlin.AO] for this version)
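The abstract's point that information "can be defined mathematically" refers to Shannon's entropy measure. As a minimal sketch (the function name and examples are my own, not from the paper), the entropy of a discrete distribution quantifies how unpredictable its outcomes are:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one full bit of uncertainty per toss:
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so it carries less:
print(shannon_entropy([0.9, 0.1]))   # ~0.469
# A certain outcome carries none at all:
print(shannon_entropy([1.0]))        # 0.0
```

This predictability framing is exactly the connection to prediction the abstract emphasizes: lower entropy means better ability to predict the next outcome.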
These two historical references predate Claude Shannon’s mathematical formalization of information in A Mathematical Theory of Communication (The Bell System Technical Journal, 1948) and even Erwin Schrödinger’s lecture (1943) and subsequent book What is Life? (1944).
For those interested in reading more on this historical tidbit, I’ve dug up a copy of the primary Forsdyke reference which first appeared on arXiv (prior to its ultimate publication in History of Psychiatry [.pdf]):
🔖 [1406.1391] ‘A Vehicle of Symbols and Nothing More.’ George Romanes, Theory of Mind, Information, and Samuel Butler by Donald R. Forsdyke 
Submitted on 4 Jun 2014 (v1), last revised 13 Nov 2014 (this version, v2)
Abstract: Today’s ‘theory of mind’ (ToM) concept is rooted in the distinction of nineteenth century philosopher William Clifford between ‘objects’ that can be directly perceived, and ‘ejects,’ such as the mind of another person, which are inferred from one’s subjective knowledge of one’s own mind. A founder, with Charles Darwin, of the discipline of comparative psychology, George Romanes considered the minds of animals as ejects, an idea that could be generalized to ‘society as eject’ and, ultimately, ‘the world as an eject’ – mind in the universe. Yet, Romanes and Clifford only vaguely connected mind with the abstraction we call ‘information,’ which needs ‘a vehicle of symbols’ – a material transporting medium. However, Samuel Butler was able to address, in informational terms depleted of theological trappings, both organic evolution and mind in the universe. This view harmonizes with insights arising from modern DNA research, the relative immortality of ‘selfish’ genes, and some startling recent developments in brain research.
Comments: Accepted for publication in History of Psychiatry. 31 pages including 3 footnotes. Based on a lecture given at Santa Clara University, February 28th 2014, at a Bannan Institute Symposium on ‘Science and Seeking: Rethinking the God Question in the Lab, Cosmos, and Classroom.’
The original arXiv article also referenced two lectures which are appended below:
[Original Draft of this was written on December 14, 2015.]
The Winter Q-BIO Quantitative Biology Meeting is coming up at the Sheraton Waikiki in Oahu, HI, USA
A predictive understanding of living systems is a prerequisite for designed manipulation in bioengineering and informed intervention in medicine. Such an understanding requires quantitative measurements, mathematical analysis, and theoretical abstraction. The advent of powerful measurement technologies and computing capacity has positioned biology to drive the next scientific revolution. A defining goal of Quantitative Biology (qBIO) is the development of general principles that arise from networks of interacting elements that initially defy conceptual reasoning. The use of model organisms for the discovery of general principles has a rich tradition in biology, and at a fundamental level the philosophy of qBIO resonates with most molecular and cell biologists. New challenges arise from the complexity inherent in networks, which require mathematical modeling and computational simulation to develop conceptual “guideposts” that can be used to generate testable hypotheses, guide analyses, and organize “big data.”
The Winter q-bio meeting welcomes scientists and engineers who are interested in all areas of q-bio. For 2016, the meeting will be hosted at the Sheraton Waikiki, which is located in Honolulu, on the island of Oahu. The resort is known for its breathtaking oceanfront views, a first-of-its-kind recently opened “Superpool” and many award-winning dining venues. Registration and accommodation information can be found via the links at the top of the page.
Wes Craven, the famed maestro of horror known for the Nightmare on Elm Street and Scream franchises, died Sunday after a battle with brain cancer. He was 76.
The Postdoctoral Experience Revisited builds on the 2000 report Enhancing the Postdoctoral Experience for Scientists and Engineers. That ground-breaking report assessed the postdoctoral experience and provided principles, action points, and recommendations to enhance that experience. Since the publication of the 2000 report, the postdoctoral landscape has changed considerably. The percentage of PhDs who pursue postdoctoral training is growing steadily and spreading from the biomedical and physical sciences to engineering and the social sciences. The average length of time spent in postdoctoral positions seems to be increasing. The Postdoctoral Experience Revisited reexamines postdoctoral programs in the United States, focusing on how postdocs are being guided and managed, how institutional practices have changed, and what happens to postdocs after they complete their programs. This book explores important changes that have occurred in postdoctoral practices and the research ecosystem and assesses how well current practices meet the needs of these fledgling scientists and engineers and of the research enterprise. The Postdoctoral Experience Revisited takes a fresh look at current postdoctoral fellows - how many there are, where they are working, in what fields, and for how many years. This book makes recommendations to improve aspects of programs - postdoctoral period of service, title and role, career development, compensation and benefits, and mentoring. Current data on demographics, career aspirations, and career outcomes for postdocs are limited. This report makes the case for better data collection by research institution and data sharing. A larger goal of this study is not only to propose ways to make the postdoctoral system better for the postdoctoral researchers themselves but also to better understand the role that postdoctoral training plays in the research enterprise. 
It is also to ask whether there are alternative ways to satisfy some of the research and career development needs of postdoctoral researchers that are now being met with several years of advanced training. Postdoctoral researchers are the future of the research enterprise. The discussion and recommendations of The Postdoctoral Experience Revisited will stimulate action toward clarifying the role of postdoctoral researchers and improving their status and experience.
Most might agree that our educational system is far less than ideal, but few pay attention to significant problems at the highest levels of academia which are holding back a great deal of our national “innovation machinery”. The National Academy of Sciences has published a (free) book: The Postdoctoral Experience (Revisited) discussing where we’re at and some ideas for a way forward. There are some interesting ideas here, but we’ve still got a long way to go.
Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available.
It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.
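Jaynes' maximum-entropy principle can be made concrete with his classic dice example (my worked illustration, not part of the abstract): given only that a die's long-run average is 4.5 rather than the fair value 3.5, the least biased distribution is the one of maximal entropy consistent with that constraint, an exponential family whose Lagrange multiplier we can solve for numerically:

```python
import math

def maxent_dice(target_mean, faces=range(1, 7), tol=1e-12):
    """Maximum-entropy distribution over die faces given only the mean.

    The constrained maximization gives p_i proportional to exp(lam * i);
    we find the Lagrange multiplier lam by bisection (mean is monotonic in lam).
    """
    def mean(lam):
        w = [math.exp(lam * f) for f in faces]
        z = sum(w)
        return sum(f * wi for f, wi in zip(faces, w)) / z

    lo, hi = -10.0, 10.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * f) for f in faces]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes' dice problem: probabilities rise monotonically,
# from roughly 0.054 for face 1 up to roughly 0.347 for face 6.
p = maxent_dice(4.5)
print(p)
```

Any other distribution matching the mean would be "committal" about information we do not have; this is the sense in which the max-ent estimate is maximally noncommittal.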
We describe the evolution of macromolecules as an information transmission process and apply tools from Shannon information theory to it. This allows us to isolate three independent, competing selective pressures that we term compression, transmission, and neutrality selection. The first two affect genome length: the pressure to conserve resources by compressing the code, and the pressure to acquire additional information that improves the channel, increasing the rate of information transmission into each offspring. Noisy transmission channels (replication with mutations) give rise to a third pressure that acts on the actual encoding of information; it maximizes the fraction of mutations that are neutral with respect to the phenotype. This neutrality selection has important implications for the evolution of evolvability. We demonstrate each selective pressure in experiments with digital organisms.
To be published in J. theor. Biology 222 (2003) 477-483
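The "noisy transmission channel" picture can be made concrete with a deliberately simplified model of my own (the paper's genetic alphabet and fitness landscape are richer): if each site mutates independently with probability mu during replication, the channel per site behaves like a binary symmetric channel, whose capacity drops as the mutation rate rises:

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(mu):
    """Capacity of a binary symmetric channel with crossover (mutation) rate mu."""
    return 1.0 - h2(mu)

for mu in (0.0, 0.01, 0.1, 0.5):
    print(f"mu = {mu}: capacity = {bsc_capacity(mu):.4f} bits/site")
```

At mu = 0.5 the capacity is zero: replication so noisy that nothing heritable survives, which is the intuition behind the pressure to encode information so that mutations land on neutral sites.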
This is the third in a series of three papers devoted to energy flow and entropy changes in chemical and biological processes, and their relations to the thermodynamics of computation. The previous two papers have developed reversible chemical transformations as idealizations for studying physiology and natural selection, and derived bounds from the second law of thermodynamics, between information gain in an ensemble and the chemical work required to produce it. This paper concerns the explicit mapping of chemistry to computation, and particularly the Landauer decomposition of irreversible computations, in which reversible logical operations generating no heat are separated from heat-generating erasure steps which are logically irreversible but thermodynamically reversible. The Landauer arrangement of computation is shown to produce the same entropy-flow diagram as that of the chemical Carnot cycles used in the second paper of the series to idealize physiological cycles. The specific application of computation to data compression and error-correcting encoding also makes possible a Landauer analysis of the somewhat different problem of optimal molecular recognition, which has been considered as an information theory problem. It is shown here that bounds on maximum sequence discrimination from the enthalpy of complex formation, although derived from the same logical model as the Shannon theorem for channel capacity, arise from exactly the opposite model for erasure.
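The Landauer decomposition the abstract invokes rests on a simple quantitative floor: erasing one bit of information, a logically irreversible step, must dissipate at least kT ln 2 of heat. A quick back-of-the-envelope calculation (mine, for orientation only) shows how small that floor is at biological temperatures:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact under the 2019 SI)

def landauer_bound(bits, temperature_k):
    """Minimum heat (joules) dissipated by irreversibly erasing `bits` bits."""
    return bits * K_B * temperature_k * math.log(2)

# Erasing a single bit at roughly physiological temperature (300 K):
print(landauer_bound(1, 300.0))  # ≈ 2.87e-21 J
```

Real chemical steps dissipate orders of magnitude more than this bound, which is why the reversible idealizations in the series are limits rather than descriptions.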
This is the second in a series of three papers devoted to energy flow and entropy changes in chemical and biological processes, and to their relations to the thermodynamics of computation. In the first paper of the series, it was shown that a general-form dimensional argument from the second law of thermodynamics captures a number of scaling relations governing growth and development across many domains of life. It was also argued that models of physiology based on reversible transformations provide sensible approximations within which the second-law scaling is realized. This paper provides a formal basis for decomposing general cyclic, fixed-temperature chemical reactions, in terms of the chemical equivalent of Carnot's cycle for heat engines. It is shown that the second law relates the minimal chemical work required to perform a cycle to the Kullback–Leibler divergence produced in its chemical output ensemble from that of a Gibbs equilibrium. Reversible models of physiology are used to create reversible models of natural selection, which relate metabolic energy requirements to information gain under optimal conditions. When dissipation is added to models of selection, the second-law constraint is generalized to a relation between metabolic work and the combined energies of growth and maintenance.
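The second-law bound described here, relating minimal chemical work to the Kullback–Leibler divergence between the output ensemble and a Gibbs equilibrium, can be sketched numerically (the two-state distributions below are illustrative assumptions of mine, not taken from the paper):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def min_chemical_work(p, q_equilibrium, temperature_k):
    """Second-law lower bound on the work to drive Q_eq to P: W >= kT * D(P || Q_eq)."""
    return K_B * temperature_k * kl_divergence(p, q_equilibrium)

# Illustrative: biasing a two-state ensemble away from equilibrium (0.5, 0.5)
# toward (0.9, 0.1) at 300 K costs at least this much work, in joules:
print(min_chemical_work([0.9, 0.1], [0.5, 0.5], 300.0))
```

The divergence vanishes when the output already sits at equilibrium, so maintaining any departure from equilibrium, which is what "information gain" means here, carries a nonzero energy price.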
This is the first of three papers analyzing the representation of information in the biosphere, and the energetic constraints limiting the imposition or maintenance of that information. Biological information is inherently a chemical property, but is equally an aspect of control flow and a result of processes equivalent to computation. The current paper develops the constraints on a theory of biological information capable of incorporating these three characterizations and their quantitative consequences. The paper illustrates the need for a theory linking energy and information by considering the problem of existence and resilience of the biosphere, and presents empirical evidence from growth and development at the organismal level suggesting that the theory developed will capture relevant constraints on real systems. The main result of the paper is that the limits on the minimal energetic cost of information flow will be tractable and universal whereas the assembly of more literal process models into a system-level description often is not. The second paper in the series then goes on to construct reversible models of energy and information flow in chemistry which achieve the idealized limits, and the third paper relates these to fundamental operations of computation.
The most significant legacy of philosophical skepticism is the realization that our concepts, beliefs and theories are social constructs. This belief has led to epistemological relativism, or the thesis that since there is no ultimate truth about the world, theory preferences are only a matter of opinion. In this book, William Harms seeks to develop the conceptual foundations and tools for a science of knowledge through the application of evolutionary theory, thus allowing us to acknowledge the legacy of skepticism while denying its relativistic offspring.
No one can escape a sense of wonder when looking at an organism from within. From the humblest amoeba to man, from the smallest cell organelle to the amazing human brain, life presents us with example after example of highly ordered cellular matter, precisely organized and shaped to perform coordinated functions. But where does this order spring from? How does a living organism manage to do what nonliving things cannot do--bring forth and maintain all that order against the unrelenting, disordering pressures of the universe? In The Touchstone of Life, world-renowned biophysicist Werner Loewenstein seeks answers to these ancient riddles by applying information theory to recent discoveries in molecular biology. Taking us into a fascinating microscopic world, he lays bare an all-pervading communication network inside and between our cells--a web of extraordinary beauty, where molecular information flows in gracefully interlaced circles. Loewenstein then takes us on an exhilarating journey along that web and we meet its leading actors, the macromolecules, and see how they extract order out of the erratic quantum world; and through the powerful lens of information theory, we are let in on their trick, the most dazzling of magician's acts, whereby they steal form out of formlessness. The Touchstone of Life flashes with fresh insights into the mystery of life. Boldly straddling the line between biology and physics, the book offers a breathtaking view of that hidden world where molecular information turns the wheels of life. Loewenstein makes these complex scientific subjects lucid and fascinating, as he sheds light on the most fundamental aspects of our existence.
This highly interdisciplinary book discusses the phenomenon of life, including its origin and evolution (and also human cultural evolution), against the background of thermodynamics, statistical mechanics, and information theory. Among the central themes is the seeming contradiction between the second law of thermodynamics and the high degree of order and complexity produced by living systems. This paradox has its resolution in the information content of the Gibbs free energy that enters the biosphere from outside sources, as the author shows. The role of information in human cultural evolution is another focus of the book. One of the final chapters discusses the merging of information technology and biotechnology into a new discipline — bio-information technology.