
# Tag: physics

## Information Theory is the New Central Discipline

[My comments posted to the original Facebook post follow below.]

I’m coming to this post late as I’m playing a bit of catch-up, but I agree with it wholeheartedly.

In particular, applications to molecular biology and medicine have really begun to come to a boil in just the past five years. This year appears to herald the biggest renaissance in the application of information theory to biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s “Symposium on Information Theory in Biology” at Gatlinburg, Tennessee in 1956.

Upcoming/recent conferences/workshops on information theory in biology include:

- BIRS Workshop: Biological and Bio-Inspired Information Theory
- Entropy and Information in Biological Systems at NIMBios
- CECAM Workshop: Entropy in Biomolecular Systems
- ALife breakout session on Information Theoretic Incentives for Artificial Life (which will also spawn a special issue of the journal Entropy)

At the beginning of September, Christoph Adami posted an awesome and very sound paper on arXiv entitled “Information-theoretic considerations concerning the origin of life” which truly promises to turn the science of the origin of life on its head.

I’ll note in passing, for those interested, that Claude Shannon’s famous master’s thesis at MIT (in which he applied Boolean algebra to electric circuits, enabling the digital revolution) and his subsequent “A Mathematical Theory of Communication” were so revolutionary that nearly everyone forgets his MIT Ph.D. thesis, “An Algebra for Theoretical Genetics,” which presaged cybernetics and the current applications of information theory to microbiology, and which is probably as seminal as Sir R.A. Fisher’s applications of statistics to science in general and biology in particular.

For those commenting on the post who were interested in a layman’s introduction to information theory, I recommend John Robinson Pierce’s *An Introduction to Information Theory: Symbols, Signals and Noise* (Dover has a very inexpensive edition). After this, one should take a look at Claude Shannon’s original paper. (The MIT Press printing includes an excellent overview by Warren Weaver along with the paper itself.) The mathematics in the paper really isn’t too technical, and most of it should be comprehensible to advanced high school students.
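For readers who want to see Shannon’s measure in action before opening the paper, here is a minimal sketch (the function name is my own, chosen only for illustration) of the entropy formula H = −Σ p·log₂(p):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly one bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.47
```

This is exactly the quantity Shannon’s paper develops, and the drop in entropy for the biased coin is the intuition behind “information as reduction of uncertainty.”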

For those who don’t understand the concept of entropy, I HIGHLY recommend Arieh Ben-Naim’s book *Entropy Demystified: The Second Law Reduced to Plain Common Sense with Seven Simulated Games*. He really does tear the concept down into its most basic form in a way I haven’t seen others come remotely close to, and which even my mother can comprehend (with no mathematics at all). (I recommend this presentation even to those with Ph.D.’s in physics because it is so truly fundamental.)

For the more advanced mathematicians, physicists, and engineers, Arieh Ben-Naim does a truly spectacular job of extending E.T. Jaynes’s work on information theory and statistical mechanics, arriving at a more coherent mathematical theory to conjoin the entropy of physics/statistical mechanics with that of Shannon’s information theory, in *A Farewell to Entropy: Statistical Thermodynamics Based on Information*.

For the advanced readers/researchers interested in more at the intersection of information theory and biology, I’ll also mention that I maintain a list of references, books, and journal articles in a Mendeley group entitled “ITBio: Information Theory, Microbiology, Evolution, and Complexity.”

## Announcement: Special issue of Entropy: “Information Theoretic Incentives for Cognitive Systems”

**Editor’s Note:** Dr. Christoph Salge asked me to cross-post the following notice from the Entropy site here. I’m sure many of the readers of the blog will find the topic to be of interest.

Dear Colleagues,

In recent years, ideas such as “life is information processing” or “information holds the key to understanding life” have become more common. However, how can information, or more formally Information Theory, increase our understanding of life, or life-like systems?

Information Theory not only has a profound mathematical basis, but also typically provides an intuitive understanding of processes such as learning, behavior, and evolution in terms of information processing.

In this special issue, we are interested in both:

- the information-theoretic formalization and quantification of different aspects of life, such as driving forces of learning and behavior generation, information flows between neurons, swarm members and social agents, and information theoretic aspects of evolution and adaptation, and
- the simulation and creation of life-like systems with previously identified principles and incentives.

Topics relating to artificial and natural systems:

- information theoretic intrinsic motivations
- information theoretic quantification of behavior
- information theoretic guidance of artificial evolution
- information theoretic guidance of self-organization
- information theoretic driving forces behind learning
- information theoretic driving forces behind behavior
- information theory in swarms
- information theory in social behavior
- information theory in evolution
- information theory in the brain
- information theory in system-environment distinction
- information theory in the perception action loop
- information theoretic definitions of life

## Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. *Entropy* is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1200 CHF (Swiss Francs).

**Deadline for manuscript submissions: 28 February 2015**

## Special Issue Editors

*Guest Editor*

**Dr. Christoph Salge**

Adaptive Systems Research Group, University of Hertfordshire, College Lane, AL10 9AB Hatfield, UK

Website: http://homepages.stca.herts.ac.uk/~cs08abi

E-Mail: c.salge@herts.ac.uk

Phone: +44 1707 28 4490

**Interests:** Intrinsic Motivation (Empowerment); Self-Organization; Guided Self-Organization; Information-Theoretic Incentives for Social Interaction; Information-Theoretic Incentives for Swarms; Information Theory and Computer Game AI

*Guest Editor*

**Dr. Georg Martius**

Cognition and Neurosciences, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany

Website: http://www.mis.mpg.de/jjost/members/georg-martius.html

E-Mail: martius@mis.mpg.de

Phone: +49 341 9959 545

**Interests:** Autonomous Robots; Self-Organization; Guided Self-Organization; Information Theory; Dynamical Systems; Machine Learning; Neuroscience of Learning; Optimal Control

*Guest Editor*

**Dr. Keyan Ghazi-Zahedi**

Information Theory of Cognitive Systems, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany

Website: http://personal-homepages.mis.mpg.de/zahedi

E-Mail: zahedi@mis.mpg.de

Phone: +49 341 9959 535

**Interests:** Embodied Artificial Intelligence; Information Theory of the Sensorimotor Loop; Dynamical Systems; Cybernetics; Self-Organisation; Synaptic Plasticity; Evolutionary Robotics

*Guest Editor*

**Dr. Daniel Polani**

Department of Computer Science, University of Hertfordshire, Hatfield AL10 9AB, UK

Website: http://homepages.feis.herts.ac.uk/~comqdp1/

E-Mail: d.polani@herts.ac.uk

**Interests:** artificial intelligence; artificial life; information theory for intelligent information processing; sensor evolution; collective and multiagent systems

## Introduction to Lie Groups and Lie Algebras | UCLA Extension

## Exercise Your Brain

As many may know or have already heard, Dr. Mike Miller, a retired mathematician from RAND and long-time math professor at UCLA, is offering a course on Introduction to Lie Groups and Lie Algebras this fall through UCLA Extension. Whether you’re a professional mathematician, engineer, physicist, physician, or even a hobbyist interested in mathematics you’ll be sure to get something interesting out of this course, not to mention the camaraderie of 20-30 other “regulars” with widely varying backgrounds (actors to surgeons and evolutionary theorists to engineers) who’ve been taking almost everything Mike has offered over the years (and yes, he’s THAT good — we’re sure you’ll be addicted too.)

## “Beginners” Welcome!

Even if it’s been years since you last took Calculus or Linear Algebra, Mike (and the rest of the class) will help you get quickly back up to speed to delve into what is often otherwise a very deep subject. If you’re interested in advanced physics, quantum mechanics, quantum information or string theory, this is one of the topics that is de rigueur for delving in deeply and being able to understand them better. The topic is also one near and dear to the hearts of those in robotics, graphics, 3-D modelling, gaming, and areas utilizing multi-dimensional rotations. And naturally, it’s simply a beautiful and elegant subject for those who have no need to apply it to anything, but who just want to meander their way through higher mathematics for the fun of it (this group will comprise the largest portion of the class, by the way).

Whether you’ve been away from serious math for decades or use it every day, or even if you’ve never gone past Calculus or Linear Algebra, this is bound to be the most entertaining thing you can do with your Tuesday nights in the fall. If you’re not sure what you’re getting into (or are scared a bit by the course description), I highly encourage you to come and join us for at least the first class before you pass up the opportunity. I’ll mention that the greater majority of new students to Mike’s classes join the ever-growing group of regulars who take almost everything he teaches subsequently. (For the reticent, I’ll mention that one of the first courses I took from Mike was Algebraic Topology, which generally requires a few semesters of Abstract Algebra and a semester of Topology as prerequisites. I’d taken neither of these prerequisites, but due to Mike’s excellent lecture style and desire to make everything comprehensible, I was able to do exceedingly well in the course.) I’m happy to chat with those who may be reticent. Also keep in mind that you can register to take the class for a grade, pass/fail, or even no grade at all to suit your needs/lifestyle.

As a group, some of us have a collection of a few dozen texts in the area which we’re happy to loan out as well. In addition to the one recommended text (Mike always gives such comprehensive notes that any text for his classes is purely supplemental at best), several of us have also found some good similar texts:

- Stillwell, John. Naïve Lie Theory. (Springer, 2008) ISBN: 9780387782157
- Baker, Andrew. Matrix Groups: An Introduction to Lie Group Theory (Springer, 2002) ISBN: 9781447101833
- Tapp, Kristopher. Matrix Groups for Undergraduates (AMS, 2005) ISBN: 0821837850

Given the breadth and diversity of the backgrounds of students in the class, I’m sure Mike will spend some reasonable time at the beginning [or later in the class, as necessary] doing a quick overview of some linear algebra and calculus related topics that will be needed later in the quarter(s).

Further information on the class and a link to register can be found below. If you know of others who might be interested in this, please feel free to forward it along – the more the merrier.

I hope to see you all soon.

## Introduction to Lie Groups and Lie Algebras

MATH X 450.6 / 3.00 units / Reg. # 249254W

Professor: Michael Miller, Ph.D.

Start Date: 9/30/2014

Location: UCLA, 5137 Math Sciences Building

Tuesday, 7-10pm

September 30 – December 16, 2014

11 meetings total *(no mtg 11/11)*

Register here: https://www.uclaextension.edu/Pages/Course.aspx?reg=249254

### Course Description

A *Lie group* is a differentiable manifold that is also a group for which the product and inverse maps are differentiable. A *Lie algebra* is a vector space endowed with a binary operation that is bilinear, alternating, and satisfies the so-called Jacobi identity. This course, the first in a 2-quarter sequence, is an introductory survey of Lie groups, their associated Lie algebras, and their representations. This first quarter will focus on the special case of matrix Lie groups, including the general linear, special linear, orthogonal, unitary, and symplectic groups. The second quarter will generalize the theory developed to the case of arbitrary Lie groups. Topics to be discussed include compactness and connectedness, homomorphisms and isomorphisms, exponential mappings, the Baker-Campbell-Hausdorff formula, covering groups, and the Weyl group. This is an advanced course, requiring a solid understanding of linear algebra and basic analysis.
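For readers skimming the description, the objects named above can be summarized compactly; the notation below is standard textbook notation, not taken from the course materials:

```latex
% A Lie algebra is a vector space \mathfrak{g} with a bracket
% [\cdot,\cdot]\colon \mathfrak{g} \times \mathfrak{g} \to \mathfrak{g}
% that is bilinear, alternating ([X,X] = 0), and satisfies the Jacobi identity:
[X,[Y,Z]] + [Y,[Z,X]] + [Z,[X,Y]] = 0.
% For the matrix Lie groups covered in the first quarter, the bracket is the
% commutator, and the exponential map carries the algebra into the group:
[X,Y] = XY - YX, \qquad \exp(X) = \sum_{k=0}^{\infty} \frac{X^k}{k!}.
```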

### Recommended Textbook

Hall, Brian. *Lie Groups, Lie Algebras, & Representations* (Springer, 2004) ISBN: 9781441923134


## Workshop Announcement: Entropy and Information in Biological Systems

I rarely, if ever, reblog anything here, but this particular post from John Baez’s blog Azimuth is so on-topic that attempting to embellish it seems silly.

## Entropy and Information in Biological Systems (Part 2)

John Harte, Marc Harper and I are running a workshop! Now you can apply here to attend:

• Information and entropy in biological systems, National Institute for Mathematical and Biological Synthesis, Knoxville, Tennessee, Wednesday-Friday, 8-10 April 2015.

Click the link, read the stuff and scroll down to “CLICK HERE” to apply. The deadline is 12 November 2014.

Financial support for travel, meals, and lodging is available for workshop attendees who need it. We will choose among the applicants and invite 10-15 of them.

**The idea**

Information theory and entropy methods are becoming powerful tools in biology, from the level of individual cells, to whole ecosystems, to experimental design, model-building, and the measurement of biodiversity. The aim of this investigative workshop is to synthesize different ways of applying these concepts to help systematize and unify work in biological systems. Early attempts at “grand syntheses” often misfired, but applications of information theory and entropy to specific highly focused topics in biology have been increasingly successful. In ecology, entropy maximization methods have proven successful in predicting the distribution and abundance of species. Entropy is also widely used as a measure of biodiversity. Work on the role of information in game theory has shed new light on evolution. As a population evolves, it can be seen as gaining information about its environment. The principle of maximum entropy production has emerged as a fascinating yet controversial approach to predicting the behavior of biological systems, from individual organisms to whole ecosystems. This investigative workshop will bring together top researchers from these diverse fields to share insights and methods and address some long-standing conceptual problems.

So, here are the goals of our workshop:

- To study the validity of the principle of Maximum Entropy Production (MEP), which states that biological systems – and indeed all open, non-equilibrium systems – act to produce entropy at the maximum rate.
- To familiarize all the participants with applications to ecology of the MaxEnt method: choosing the probabilistic hypothesis with the highest entropy subject to the constraints of our data. We will compare MaxEnt with competing approaches and examine whether MaxEnt provides a sufficient justification for the principle of MEP.
- To clarify relations between known characterizations of entropy, the use of entropy as a measure of biodiversity, and the use of MaxEnt methods in ecology.
- To develop the concept of evolutionary games as “learning” processes in which information is gained over time.
- To study the interplay between information theory and the thermodynamics of individual cells and organelles.

For more details, go here.
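As a concrete illustration of the MaxEnt recipe mentioned in the goals (choosing the highest-entropy distribution subject to data constraints), consider finding the maximum-entropy distribution over a die’s faces with a fixed mean. The Gibbs-form solution and bisection solver below are my own illustrative sketch, not material from the workshop:

```python
import math

def maxent_dist(values, mean_target):
    """Maximum-entropy distribution over `values` with a fixed mean.
    The solution has the Gibbs form p_i ∝ exp(-lam * x_i); we solve for
    the multiplier lam by bisection, since the implied mean decreases
    monotonically as lam increases."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in values]
        return sum(x * wi for x, wi in zip(values, w)) / sum(w)

    lo, hi = -50.0, 50.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean_for(mid) > mean_target:
            lo = mid  # mean too high: need a larger multiplier
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

faces = [1, 2, 3, 4, 5, 6]
# The unconstrained mean 3.5 recovers the uniform distribution;
# a constrained mean of 4.5 tilts probability toward the high faces.
print(maxent_dist(faces, 3.5))
print(maxent_dist(faces, 4.5))
```

This is the same exponential-family structure that links Jaynes’ MaxEnt to the Boltzmann distribution of statistical mechanics, which is what makes the method attractive for ecology.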

If you’ve got colleagues who might be interested in this, please let them know. You can download a PDF suitable for printing and putting on a bulletin board by clicking on this:

## CECAM Workshop: “Entropy in Biomolecular Systems”

On Friday, I had an excellent and stimulating conversation with Arieh Ben-Naim about his recent writing and work, and he mentioned in passing that he had been invited to a conference relating to entropy and biology in Vienna. A quick web search turned it up, and not having heard about it myself yet, I thought I’d pass it along to others who are regular readers and interested in the area.

The workshop on “Entropy in Biomolecular Systems” is being hosted by the Centre Européen de Calcul Atomique et Moléculaire (CECAM).

Location: **DACAM, Max F. Perutz Laboratories**, University of Vienna, Dr. Bohrgasse 9, A-1030, Vienna, Austria

Dates: **May 14, 2014 to May 17, 2014**

The workshop is being organized by:

- Richard Henchman *(University of Manchester, United Kingdom)*
- Bojan Zagrovic *(University of Vienna, Austria)*
- Michel Cuendet *(Swiss Institute of Bioinformatics, Lausanne, Switzerland and Weill Cornell Medical College, New York, USA)*
- Chris Oostenbrink *(University of Natural Resources and Life Sciences, Austria)*

It’s being supported by CECAM, the European Research Council, and the Royal Society of Chemistry’s Statistical Mechanics and Thermodynamics Group.

I’ll note that the registration deadline is on April 21 with a payment deadline of April 30, so check in quickly if you haven’t already.

The summary from the workshop website states:

This workshop brings together the world’s experts to address the challenges of determining the entropy of biomolecular systems, either by experiment or computer simulation. Entropy is one of the main driving forces for any biological process such as binding, folding, partitioning, and reacting. Our deficient understanding of entropy, however, means that such important processes remain controversial and only partially understood. Contributions of water, ions, cofactors, and biomolecular flexibility are actively examined but yet to be resolved. The state-of-the-art of each entropy method will be presented and explained, highlighting its capabilities and deficiencies. This will be followed by intensive discussion on the main areas that need improving, leading to suitable actions and collaborations to address the main biological and industrial questions.

Further details on the workshop can be found on the CECAM website.

As always, details on other upcoming workshops and conferences relating to information theory and biology can be found on our ITBio Conferences/Workshops page.

## Information Theory is Something Like the Logarithm of Probability Theory

## Why a Ph.D. in Physics is Worse Than Drugs

In the essay, Dr. Katz provides a bevy of solid reasons why one shouldn’t become a researcher. I highly recommend everyone read it and then carefully consider how we can turn these problems around.

*Editor’s Note*: The original article has since been moved to another server.

### How might we end the war against science in America?

## The Two Cultures

## Book Review: “Complexity: A Guided Tour” by Melanie Mitchell

Popular science. Oxford University Press, May 28, 2009. Hardcover, 366 pages.

This book provides an intimate, highly readable tour of the sciences of complexity, which seek to explain how large-scale complex, organized, and adaptive behavior can emerge from simple interactions among myriad individuals. The author, a leading complex systems scientist, describes the history of ideas, current research, and future prospects in this vital scientific effort.

This is handily one of the best, most interesting, and (to me at least) most useful popularly written science books I’ve ever come across. Most popular science books bore me to tears and end up being merely pedantic in their historical backgrounds, but this one is very succinct, with some interesting viewpoints (some of which I agree with and some of which my intuition says are terribly wrong) on the overall structure presented.

For those interested in a general and easily readable high-level overview of some of the areas of research I’ve been interested in (information theory, thermodynamics, entropy, microbiology, evolution, genetics, along with computation, dynamics, chaos, complexity, genetic algorithms, cellular automata, etc.) for the past two decades, this is really a lovely and thought-provoking book.

At the start I was disappointed that there were almost no equations in the book to speak of – perhaps this is why, though I purchased it when it came out, it has subsequently sat on my shelf for so long. The other factor that kept me from reading it was the depth and breadth of more technical material I’d already read covering the majority of its topics. Ultimately I found myself not minding the lack of supporting equations (aside from a few hidden in the notes at the end of the text), in large part because Dr. Mitchell does a fantastic job of pointing out subtleties within the various subjects comprising the broader concept of complexity – subtleties one would generally take several years to come to on one’s own, and at far greater expense of time. She provides a much stronger picture of the overall subjects covered, and this far outweighed the lack of specificity. I honestly wish I had read the book when it was released; it might have helped me to be more specific in my own research. Fortunately she brings up several areas I will need to delve into more deeply, and raises several questions which will significantly inform my future work.

In general, I wish there were more references I hadn’t read or been aware of yet, but towards the end there were a handful of topics relating to fractals, chaos, computer science, and cellular automata which I have been either ignorant of or which are further down my reading lists and may need to move closer to the top. I look forward to delving into many of these shortly. As a simple example, I’ve seen Zipf’s law separately from the perspectives of information theory, linguistics, and even evolution, but this is the first time I’ve seen it related to power laws and fractals.
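The connection between Zipf’s law and power laws is easy to see numerically: frequency plotted against rank on log-log axes falls on a straight line with slope near −1. A small sketch with synthetic counts (the function name is my own):

```python
import math

def loglog_slope(freqs):
    """Least-squares slope of log(frequency) against log(rank), for
    frequencies already sorted in descending order. Zipf's law
    (frequency ∝ 1/rank) predicts a slope near -1, the signature
    of a power law on a log-log plot."""
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic word counts drawn from Zipf's law: count ∝ 1/rank.
counts = [1000 // r for r in range(1, 51)]
print(round(loglog_slope(counts), 2))  # close to -1
```

Running the same slope estimate on real word counts from a corpus is a nice way to see why the same mathematics shows up in linguistics, information theory, and fractal geometry alike.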

I definitely appreciated that Dr. Mitchell took the time to point out her own personal feelings on several topics, and more so that she explicitly flagged them as her own gut instincts instead of mentioning them in passing as if they were provable science, which is what far too many other authors would likely have done. There are many viewpoints she takes which I certainly don’t agree with, but I suspect that it’s because I’m coming at things from the viewpoint of an electrical engineer with a stronger background in information theory and microbiology, while hers is closer to that of computer science. She does mention that her undergraduate background was in mathematics, but I’m curious what areas she specifically studied, to have a better understanding of her specific viewpoints.

Her final chapter looking at some of the pros and cons of the topic(s) was very welcome, particularly in light of previous philosophic attempts like cybernetics and general systems theory, which I (also) think failed because of their lack of specificity. These caveats certainly help to place the scientific philosophy of complexity into a much larger context. I heartily agree with her viewpoint (and that of others) that there needs to be a more rigorous mathematical theory underpinning the overall effort. I’m sure we’re all wondering “Where is our Newton?” or, to use her clever aphorism, that we’re “waiting for Carnot.” (Sounds like it should be a Tom Stoppard play title, doesn’t it?)

I might question her brief inclusion of her own Ph.D. thesis work in the text, but it did actually provide a nice specific and self-contained example within the broader context and also helped to tie several of the chapters together.

My one slight criticism of the work would be the lack of better footnoting within the text. Though many feel that footnote numbers within the text, or notes at the bottom of the pages, detract from the “flow” of the work, I found myself wishing that she had used them here, particularly as I’m one of the few who actually cares about the footnotes and wants to know the specific references as I read. I hope that Oxford eventually publishes an e-book version that includes cross-linked footnotes for the benefit of others.

I can heartily recommend this book to any fan of science, but I would specifically recommend it to any undergraduate science or engineering major who is unsure of what they’d like to study and might need some interesting areas to look at. I will mention that one of the tough parts of the concept of complexity is that it is so broad and general that it encompasses over a dozen other fields of study, in any one of which one could earn a Ph.D. without knowing the full depth of the others. The book is so well written that I’d even recommend it to senior researchers in any of the above-mentioned fields, as it is certainly sure to provide not only some excellent overview history of each, but also to raise questions and thoughts that they’ll want to include in their future research in their own specific sub-areas of expertise.

## Some Brief Thoughts on Cliodynamics and Big History

As an electrical engineer (in the subfields of information theory and molecular biology), I have to say that I’m very intrigued by the articles (1, 2) that Marc Parry has written for the Chronicle in the past few weeks on the subjects of quantitative history, cliometrics/cliodynamics, or what I might term Big History (following the tradition of David Christian; I was initially turned onto it by a Chronicle article). I have lately, coincidentally, been reading Steven Pinker’s book “The Better Angels of Our Nature” as well as Daniel Kahneman’s “Thinking, Fast and Slow”. (I’ll also mention that I’m a general fan of the work of Jared Diamond and Matt Ridley, who impinge on these topics as well.)

I’m sure that all of these researchers are onto something in their attempts to better quantify our historical perspectives by applying science to history. The process might be likened to the way methods of computed tomography, PET, SPECT, et al. were applied to psychology from the late ’70s onward to create the field of cognitive neuropsychology, which has since grown much closer to the more concrete areas of neurophysiology within biology, chemistry, and medicine.

I can see both sides of the “controversy” mentioned in the articles and in their comments, but I have a very visceral gut feeling that the differences can be ironed out over time. I say this as areas like behavioral economics, which grew out of the psychology work described in Kahneman’s book, become more concrete. The data available for application to history will become much more useful as people’s psychological interactions with their surroundings are better understood. People in general are exceptionally poor at extrapolating statistical knowledge of the world around them and putting it to its best use. For example, although one can make an accurate calculation of the time-value of money, most people who know how won’t use it to determine the better way of taking a large lottery payout (either as a lump sum or paid out over time), and this doesn’t even take into consideration the phenomenal odds against playing the lottery in the first place. Kahneman’s system 1 and system 2 structures, combined with more historical data and joint analysis of the two, may prove a far better method than either historians’ previous attempts or the quantitative method on its own. Put into mathematical terms, it’s much more likely the case that human interactions follow smaller local min-max curves/equations on a limited horizon, but do not necessarily follow the global maxima and minima currently being viewed at the larger scales of big history. We’ll need to do a better job of sifting through the data and coming up with a better interpretation of it on the correct historical scales for the problem at hand.
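To make the lottery example concrete, here is a minimal sketch of the time-value-of-money calculation; the jackpot figures, discount rate, and function name are hypothetical, chosen only for illustration:

```python
def present_value(payment, rate, periods):
    """Present value of an annuity: `payment` received at the end of
    each period for `periods` periods, discounted at `rate` per period."""
    return payment * (1 - (1 + rate) ** -periods) / rate

# Hypothetical $10M jackpot: $500k/year for 20 years vs. a lump sum today.
# At a 5% discount rate the annuity's face value of $10M is worth far less:
annuity = present_value(500_000, 0.05, 20)
print(round(annuity))  # ≈ 6,231,105
```

The point of the example is that the “$10 million” annuity is really worth only about $6.2M in today’s dollars at that rate, yet few winners frame the choice this way.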

Perhaps, by analogy, we might look at this disconnect between the two camps as the same type of disconnect seen in the areas of Newtonian and quantum physics. They’re both interlinked somehow and do a generally good job of providing accurate viewpoints and predictions of their own sub-areas, but haven’t been put together coherently into one larger catch-all theory encompassing both. Without the encouragement of work in the quantitative areas of history, we’ll certainly be at a great disadvantage.

## Regard the World as Made of Information

## David Christian’s “Maps of Time” and “Big History” – a Profound Thesis

David Christian, a trained historian, is one of the leading proponents of the relatively new concept of **Big History**, which I view as a sea-change in the way humans will begin to view not only the world but our place in it and what we might expect to come in the future. His work presents a truly monumental and profound thesis and a drastically new framework for where humankind fits into the universe. Of the broad variety of works I’ve read in the past several decades, it is simply one of the most interesting and cohesive theses I’ve come across, and I highly and unreservedly recommend it to everyone I know. I’d put it on par with or above works like Jared Diamond’s *Guns, Germs, and Steel* and Matt Ridley’s *The Rational Optimist*, among others, for its broad impact on how I now view the world. For scientists and researchers it has the potential to be the philosophical equivalent of *The Bible*, and in fact, like many religious texts, it is in effect a modern-day “creation myth,” albeit one with a scientific underpinning.

Christian’s work was initially brought to my attention by an article in the Chronicle of Higher Education by Jeffrey R. Young, in which he mentioned that Bill Gates was a big fan of Christian’s work and had recommended it himself at a TED conference. (Gates is now also a financial supporter of Christian’s Big History Project.) I myself was aware of the Teaching Company’s generally excellent coursework offerings, and within a few weeks got an audio copy of the course of forty-eight lectures to listen to on my daily commute.

I’ve now devoured both his rather large text on the subject as well as a lecture series he created for a course on the subject. Below are brief reviews of the two works.

His magnum opus, *Maps of Time: An Introduction to Big History*, is an interesting change of reference from a historical perspective, combining the disciplines of physics, cosmology, astronomy, geology, chemistry, microbiology, evolutionary theory, archaeology, politics, religion, economics, sociology, and history into one big area of contiguous study based upon much larger timescales than those traditionally taken in the study of historical time periods. Though it takes pieces from many disciplines, it provides an interesting, fresh, and much-needed perspective on who humans are and their place in not only the world, but the entire universe.

By looking at history from a much broader viewpoint (billions of years versus the more common decades or even just a few centuries) one comes away with a drastically different perspective on the universe and life.

I’d highly recommend this to any general reader as soon as they can find time to read through it, particularly because it provides such an excellent base for a variety of disciplines, thereby better framing their future studies. I wish I had been able to read this book in the ninth or tenth grade, or at the latest by my freshman year of college; alas, the general conception of the topic itself didn’t exist until after I had graduated from university.

Although I have significant backgrounds in most, if not all, of the disciplines which comprise the tapestry of big history, the background included in the book is more than adequate to give the general reader the requisite introductions to these subjects to make big history a coherent subject on its own.

This could be an extremely fundamental and life-changing book for common summer reading programs for incoming college freshmen. If I could, I would make it required reading for all students at the high school level. Fortunately, Bill Gates and others are helping to fund David Christian’s work to introduce it more broadly at the high school level and elsewhere in education.

Complementing David Christian’s written opus is a collection of audio lectures produced by The Teaching Company as part of their Great Courses series, which I listened to as well. The collection of forty-eight lectures is entitled *Big History: The Big Bang, Life on Earth, and the Rise of Humanity* (Great Courses, Course No. 8050). It provides a much quicker philosophical overview of the subject and doesn’t delve as deeply into the individual disciplines as the text does, but it still presents a very cohesive account of the overall thesis. In fact, for me, the introduction to the topic was much better in these audio lectures than in the written book. Christian’s lecture style is fantastic and even better than his already excellent writing style.

In the audio lectures Christian highlights eight major thresholds, which he uses as a framework for viewing the roughly 13.7 billion years of history the Universe has traversed so far. Within those thresholds he treats disparities in power and energy as the major driving forces of history in a unique and enlightening way, providing a wealth of perspective on almost every topic, scientific or historical, one can consider. This allows one to see parallels and connections between seemingly disparate topics, such as the creation of stars and the first building of cities, or how the big bang is similar to the invention of agriculture.

I can easily say that David Christian’s works on big history are some of the most influential works I’ve ever come across – and having experienced them, I can never see our universe in the same naive way again.

For those interested in taking a short and immediate look at Christian’s work, I can recommend his TED Talk “The History of Our World in 18 Minutes,” which only begins to scratch the surface of his much deeper and more profound thesis:

Given how profound the topic of big history is, I’m sure I’ll be writing about and referring to it often. Related posts can be found here under the tag “big history.”

Added material below on 21-October-2016

##### Reading Progress

- 03/11/2011 marked as: want to read, after reading *Bill Gates Promotes Professor’s Online Course at TED* by Jeffrey R. Young in *The Chronicle of Higher Education*
- 08/26/11 started reading
- 08/27/11 5.0% done on page 34 of 668
- 03/01/12 5.0% done, on page 35 of 668; “Reread some of the beginning this morning as I get back into David Christian’s audio lectures.”
- 04/28/12 9.0% done on page 60 of 668
- 04/30/12 11.0% done
- 05/03/12 14.0% done
- 05/05/12 18.0% done
- 05/23/12 35.0% done
- 06/02/12 40.0% done
- 06/14/12 52.0% done
- 06/17/12 60.0% done; “Technically finished reading, but need to still go through the endnotes.”
- 10/21/12 Finished book

Added material below on 23-October-2016

### Highlights, Quotes, & Marginalia

###### Guide to highlight colors

- Yellow: general highlights and highlights which don’t fit under another category below
- Orange: vocabulary word; interesting and/or rare word
- Green: reference to read
- Blue: interesting quote
- Gray: typography problem
- Red: example to work through

*Editor’s Note*: Data relating to reading progress was added to this post on 10/21/16. Data relating to highlights, quotes, and marginalia added on 10/23/16.

## Mathematics in Popular Science Books | The Economist

*(The Economist)*

Popular physics has enjoyed a new-found regard. Now comes a brave attempt to inject mathematics into an otherwise fashionable subject

This review of Brian Cox and Jeff Forshaw’s forthcoming book *The Quantum Universe: Everything That Can Happen Does Happen* sounds intriguing. I’m highly impressed that so much of the review focuses on the authors’ decision to include a more mathematical treatment of their subject in what is supposed to be a popular science book. I always wish books like these at least had the temerity to include much more of the mathematical underpinnings of their subjects; I’m glad that the popular press (or at least The Economist in this case) is willing to ask for the mathematics as well. Hopefully this will mark a broader trend in popular books on scientific topics!


Nov 5th 2011 | from the print edition

*The Quantum Universe: Everything That Can Happen Does Happen.* By Brian Cox and Jeff Forshaw. Allen Lane; 255 pages; £20. To be published in America in January by Da Capo Press; $25.

PREVIOUSLY the preserve of dusty, tweed-jacketed academics, physics has enjoyed a surprising popular renaissance over the past few years. In America Michio Kaku, a string theorist, has penned several successful books and wowed television and radio audiences with his presentations on esoteric subjects such as the existence of wormholes and the possibility of alien life. In Britain Brian Cox, a former pop star whose music helped propel Tony Blair to power, has become the front man for physics, which recently regained its status as a popular subject in British classrooms, an effect many attribute to Mr Cox’s astonishing appeal.

Mr Cox, a particle physicist, is well-known as the presenter of two BBC television series that have attracted millions of viewers (a third series will be aired next year) and as a bestselling author and public speaker. His latest book, “The Quantum Universe”, which he co-wrote with Jeff Forshaw of the University of Manchester, breaks the rules of popular science-writing that were established over two decades ago by Stephen Hawking, who launched the modern genre with his famous book, “A Brief History of Time”.

Mr Hawking’s literary success was ascribed to his eschewing equations. One of his editors warned him that sales of the book would be halved by every equation he included; Mr Hawking inserted just one, E=mc^{2}, and, even then, the volume acquired a sorry reputation for being bought but not read. By contrast, Mr Cox, whose previous book with Mr Forshaw investigated “Why does E=mc^{2}?” (2009), has bravely sloshed a generous slug of mathematics throughout his texts.

The difficulties in explaining physics without using maths are longstanding. Einstein mused, “The eternal mystery of the world is its comprehensibility,” and “the fact that it is comprehensible is a miracle.” Yet the language in which the world is described is that of maths, a relatively sound grasp of which is needed to comprehend the difficulties that physicists are trying to resolve as well as the possible solutions. Mr Cox has secured a large fan base with his boyish good looks, his happy turns of phrase and his knack for presenting complex ideas using simple analogies. He also admirably shies away from dumbing down. “The Quantum Universe” is not a dry undergraduate text book, but nor is it a particularly easy read.

The subject matter is hard. Quantum mechanics, which describes in subatomic detail a shadowy world in which cats can be simultaneously alive and dead, is notoriously difficult to grasp. Its experiments yield bizarre results that can be explained only by embracing the maths that describe them, and its theories make outrageous predictions (such as the existence of antimatter) that have nevertheless later been verified. Messrs Cox and Forshaw say they have included the maths “mainly because it allows us to really explain why things are the way they are. Without it, we should have to resort to the physicist-guru mentality whereby we pluck profundities out of thin air, and neither author would be comfortable with guru status.”

That stance might comfort the authors, but to many readers they will nonetheless seem to pluck equations out of thin air. Yet their decision to include some of the hard stuff leaves open the possibility that some readers might actually engage in the slog that leads to higher pleasures. For non-sloggers alternative routes are offered: Messrs Cox and Forshaw use clockfaces to illustrate how particles interact with one another, a drawing of how guitar strings twang and a photograph of a vibrating drum. A diagram, rather than an equation, is used to explain one promising theory of how matter acquires mass, a question that experiments on the Large Hadron Collider at CERN, the European particle-physics laboratory near Geneva, will hopefully soon answer.

The authors have wisely chosen to leaven their tome with amusing tales of dysfunctional characters among scholars who developed quantum mechanics in the 1920s and beyond, as well as with accounts of the philosophical struggles with which they grappled and the occasional earthy aside. Where the subject matter is a trifle dull, Messrs Cox and Forshaw acknowledge it: of Heinrich Kayser, who a century ago completed a six-volume reference book documenting the spectral lines generated by every known element, they observe, “He must have been great fun at dinner parties.” And they make a sweeping generalisation about their colleagues who pore over equations: “Physicists are very lazy, and they would not go to all this trouble unless it saved time in the long run.”

Whether or not readers of “The Quantum Universe” will follow all the maths, the authors’ love for their subject shines through the book. “There is no better demonstration of the power of the scientific method than quantum theory,” they write. That may be so, but physicists all over the world, Messrs Cox and Forshaw included, are longing for the next breakthrough that will supersede the claim. Hopes are pinned on experiments currently under way at CERN that may force physicists to rethink their understanding of the universe, and inspire Messrs Cox and Forshaw to write their next book—equations and all.

from the print edition | Books and arts