🔖 The Negentropy Principle of Information by Leon Brillouin | Journal of Applied Physics: Vol 24, No 9

Bookmarked The Negentropy Principle of Information by Leon Brillouin (Journal of Applied Physics 24, 1152 (1953))

The statistical definition of information is compared with Boltzmann's formula for entropy. The immediate result is that information I corresponds to a negative term in the total entropy S of a system: S = S₀ − I. A generalized second principle states that S must always increase. If an experiment yields an increase ΔI of the information concerning a physical system, it must be paid for by a larger increase ΔS₀ in the entropy of the system and its surrounding laboratory. The efficiency ε of the experiment is defined as ε = ΔI/ΔS₀ ≤ 1. Moreover, there is a lower limit k ln 2 (k, Boltzmann's constant) for the ΔS₀ required in an observation. Some specific examples are discussed: length or distance measurements, time measurements, observations under a microscope. In all cases it is found that higher accuracy always means lower efficiency. The information ΔI increases as the logarithm of the accuracy, while ΔS₀ goes up faster than the accuracy itself. Exceptional circumstances arise when extremely small distances (of the order of nuclear dimensions) have to be measured, in which case the efficiency drops to exceedingly low values. This stupendous increase in the cost of observation is a new factor that should probably be included in the quantum theory.
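The quantities in the abstract are easy to put numbers on. Here is a minimal sketch of the k ln 2 lower bound and of how efficiency falls as accuracy rises; the linear cost model ΔS₀ ∝ a is my own illustrative assumption (the abstract only says ΔS₀ grows faster than the accuracy), not Brillouin's:

```python
import math

# Boltzmann's constant (CODATA value), in J/K.
k = 1.380649e-23

# Brillouin's lower bound on the entropy cost of one observation:
# gaining one bit of information requires delta_S0 >= k ln 2.
min_cost = k * math.log(2)
print(f"k ln 2 = {min_cost:.3e} J/K")  # about 9.57e-24 J/K

# Toy illustration of "higher accuracy means lower efficiency":
# delta_I grows as the log of the accuracy a, while delta_S0 is
# assumed (for illustration only) to grow linearly with a.
def efficiency(a):
    delta_I = k * math.log(a)   # information gained, entropy units
    delta_S0 = k * a            # assumed entropy cost of the observation
    return delta_I / delta_S0

for a in (10, 100, 1000):
    print(f"accuracy {a:5d}: efficiency = {efficiency(a):.4f}")
```

Under this assumed cost model the efficiency ε = ln(a)/a shrinks toward zero as accuracy a grows, matching the abstract's qualitative claim.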

https://doi.org/10.1063/1.1721463

First appearance of the word “negentropy” that I’ve seen in the literature.
