🔖 The Negentropy Principle of Information by Leon Brillouin | Journal of Applied Physics: Vol 24, No 9

Bookmarked The Negentropy Principle of Information by Leon Brillouin (Journal of Applied Physics 24, 1152 (1953))

The statistical definition of information is compared with Boltzmann's formula for entropy. The immediate result is that information I corresponds to a negative term in the total entropy S of a system: S = S0 − I. A generalized second principle states that S must always increase. If an experiment yields an increase ΔI of the information concerning a physical system, it must be paid for by a larger increase ΔS0 in the entropy of the system and its surrounding laboratory. The efficiency ε of the experiment is defined as ε = ΔI/ΔS0 ≤ 1. Moreover, there is a lower limit k ln 2 (k, Boltzmann's constant) for the ΔS0 required in an observation. Some specific examples are discussed: length or distance measurements, time measurements, observations under a microscope. In all cases it is found that higher accuracy always means lower efficiency. The information ΔI increases as the logarithm of the accuracy, while ΔS0 goes up faster than the accuracy itself. Exceptional circumstances arise when extremely small distances (of the order of nuclear dimensions) have to be measured, in which case the efficiency drops to exceedingly low values. This stupendous increase in the cost of observation is a new factor that should probably be included in the quantum theory.

https://doi.org/10.1063/1.1721463

First appearance of the word "negentropy" that I've seen in the literature.
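
To make the abstract's numbers tangible (my own back-of-the-envelope sketch, not anything from the paper): the lower bound k ln 2 on ΔS0 per observed bit works out to roughly 10⁻²³ J/K, and the efficiency ε = ΔI/ΔS0 is just the ratio of information gained to entropy paid. The temperature and the five-fold cost below are made-up illustrative values.

```python
import math

# Boltzmann's constant (exact in the 2019 SI redefinition)
k = 1.380649e-23  # J/K

# Brillouin's lower bound on the entropy cost of observing one bit
dS_min = k * math.log(2)                 # ~9.57e-24 J/K
print(f"minimum entropy cost per bit: {dS_min:.3e} J/K")

# At an (arbitrary, illustrative) room temperature, T*dS gives the
# corresponding minimum heat dissipated per bit of information gained.
T = 300.0  # K
print(f"T*dS at {T:.0f} K: {T * dS_min:.3e} J")

# Efficiency eps = dI/dS0 <= 1.  A hypothetical measurement that gains one
# bit but dissipates five times the minimum entropy has efficiency 0.2.
dI = k * math.log(2)
dS0 = 5 * k * math.log(2)
print(f"efficiency eps = {dI / dS0:.2f}")
```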

🔖 The Entropy Decrement Method and the Erdos Discrepancy Problem | Simons Institute for the Theory of Computing

Bookmarked The Entropy Decrement Method and the Erdos Discrepancy Problem (Simons Institute for the Theory of Computing)

Tuesday, April 11th, 2017 9:30 am – 10:30 am
Structure vs. Randomness
Speaker: Terry Tao, UCLA

We discuss a variant of the density and energy increment arguments that we call an "entropy decrement method", which can be used to locate a scale in which two relevant random variables share very little mutual information, and thus behave somewhat like independent random variables. We were able to use this method to obtain a new correlation estimate for multiplicative functions, which in turn was used to establish the Erdos discrepancy conjecture that any sequence taking values in {-1,+1} has unbounded sums on homogeneous arithmetic progressions.
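
For concreteness (my sketch, not part of the talk abstract): the discrepancy in question is the largest absolute partial sum of the sequence along a homogeneous arithmetic progression d, 2d, 3d, …, and for a finite ±1 sequence it can be computed by brute force.

```python
import random

def discrepancy(f):
    """Largest |f(d) + f(2d) + ... + f(nd)| over all homogeneous arithmetic
    progressions that stay inside the finite sequence f (f[0] is f(1))."""
    N = len(f)
    best = 0
    for d in range(1, N + 1):
        partial = 0
        for multiple in range(d, N + 1, d):   # d, 2d, 3d, ...
            partial += f[multiple - 1]
            best = max(best, abs(partial))
    return best

# Erdos's conjecture (now Tao's theorem) says this grows without bound as the
# sequence gets longer, no matter how cleverly the +/-1 values are chosen.
random.seed(0)
f = [random.choice((-1, 1)) for _ in range(1000)]
print(discrepancy(f))
```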

🔖 Special Issue: Information Dynamics in Brain and Physiological Networks | Entropy

Bookmarked Special Issue "Information Dynamics in Brain and Physiological Networks" (mdpi.com)

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory".

Deadline for manuscript submissions: 30 December 2018

It is, nowadays, widely acknowledged that the brain and several other organ systems, including the cardiovascular, respiratory, and muscular systems, among others, exhibit complex dynamic behaviors that result from the combined effects of multiple regulatory mechanisms, coupling effects and feedback interactions, acting in both space and time.

The field of information theory is becoming more and more relevant for the theoretical description and quantitative assessment of the dynamics of the brain and physiological networks, defining concepts, such as those of information generation, storage, transfer, and modification. These concepts are quantified by several information measures (e.g., approximate entropy, conditional entropy, multiscale entropy, transfer entropy, redundancy and synergy, and many others), which are being increasingly used to investigate how physiological dynamics arise from the activity and connectivity of different structural units, and evolve across a variety of physiological states and pathological conditions.

This Special Issue focuses on blending theoretical developments in the new emerging field of information dynamics with innovative applications targeted to the analysis of complex brain and physiological networks in health and disease. To favor this multidisciplinary view, contributions are welcome from different fields, ranging from mathematics and physics to biomedical engineering, neuroscience, and physiology.

Prof. Dr. Luca Faes
Prof. Dr. Alberto Porta
Prof. Dr. Sebastiano Stramaglia
Guest Editors
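
As one concrete anchor for the measures named in the call (a minimal sketch of my own, not from the special issue text): approximate entropy scores the irregularity of a single series by asking how often length-m templates that match within a tolerance r still match when extended to length m+1.

```python
import numpy as np

def approximate_entropy(u, m=2, r=None):
    """Plug-in approximate entropy ApEn(m, r) of a 1-D series u
    (Pincus-style definition; self-matches are included, as is standard)."""
    u = np.asarray(u, dtype=float)
    if r is None:
        r = 0.2 * u.std()             # common rule-of-thumb tolerance

    def phi(m):
        n = len(u) - m + 1
        # embedding matrix: row i is the length-m template u[i:i+m]
        x = np.array([u[i:i + m] for i in range(n)])
        # Chebyshev (max-coordinate) distances between all template pairs
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        c = np.sum(d <= r, axis=1) / n
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# A regular signal should look far more predictable than white noise.
t = np.arange(1000)
rng = np.random.default_rng(0)
print(approximate_entropy(np.sin(0.1 * t)))            # low ApEn
print(approximate_entropy(rng.standard_normal(1000)))  # higher ApEn
```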

🔖 Quantum Information: What Is It All About? by Robert B. Griffiths | Entropy

Bookmarked Quantum Information: What Is It All About? by Robert B. Griffiths (MDPI (Entropy))
This paper answers Bell’s question: What does quantum information refer to? It is about quantum properties represented by subspaces of the quantum Hilbert space, or their projectors, to which standard (Kolmogorov) probabilities can be assigned by using a projective decomposition of the identity (PDI or framework) as a quantum sample space. The single framework rule of consistent histories prevents paradoxes or contradictions. When only one framework is employed, classical (Shannon) information theory can be imported unchanged into the quantum domain. A particular case is the macroscopic world of classical physics whose quantum description needs only a single quasiclassical framework. Nontrivial issues unique to quantum information, those with no classical analog, arise when aspects of two or more incompatible frameworks are compared.
Entropy 2017, 19(12), 645; doi:10.3390/e19120645. This article belongs to the Special Issue Quantum Information and Foundations.
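
A tiny numerical illustration of the central idea as I read it (my sketch, not code from the paper): fix a single-qubit state and pick a projective decomposition of the identity; the Born weights of its projectors form an ordinary Kolmogorov probability distribution to which Shannon's machinery applies. Pick a second, incompatible PDI and you get another such distribution, but the single-framework rule forbids combining the two descriptions.

```python
import numpy as np

def projector(v):
    """Rank-one projector onto the (normalized) vector v."""
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

# Two PDIs (frameworks) on one qubit: the Z basis and the incompatible X basis.
Z_pdi = [projector(np.array([1.0, 0.0])), projector(np.array([0.0, 1.0]))]
X_pdi = [projector(np.array([1.0, 1.0])), projector(np.array([1.0, -1.0]))]

psi = np.array([np.cos(0.3), np.sin(0.3)])          # some pure state

def born_probs(pdi, psi):
    """Kolmogorov probabilities assigned to the projectors of one framework."""
    return np.array([np.real(psi.conj() @ P @ psi) for P in pdi])

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

for name, pdi in [("Z framework", Z_pdi), ("X framework", X_pdi)]:
    p = born_probs(pdi, psi)
    print(name, np.round(p, 4), "entropy (bits):", round(shannon(p), 4))
# The two frameworks' projectors do not commute, so their sample spaces
# cannot be merged into a single classical description.
```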

🔖 Upcoming Special Issue “Information Theory in Neuroscience” | Entropy (MDPI)

Bookmarked Special Issue "Information Theory in Neuroscience" (Entropy | MDPI)
As the ultimate information processing device, the brain naturally lends itself to being studied with information theory. Applying information theory to neuroscience has spurred the development of principled theories of brain function, led to advances in the study of consciousness, and produced analytical techniques to crack the neural code, that is, to unveil the language used by neurons to encode and process information. In particular, advances in experimental techniques enabling precise recording and manipulation of neural activity on a large scale now make possible, for the first time, the precise formulation and quantitative testing of hypotheses about how the brain encodes information and transmits it across areas for specific functions. This Special Issue emphasizes contributions on novel approaches in neuroscience using information theory, and on the development of new information-theoretic results inspired by problems in neuroscience. Research at the interface of neuroscience, information theory, and other disciplines is also welcome.

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory".

Deadline for manuscript submissions: 1 December 2017

📅 Entropy 2018: From Physics to Information Sciences and Geometry

RSVPed Might be attending Entropy 2018: From Physics to Information Sciences and Geometry
14-16 May 2018; Auditorium Enric Casassas, Faculty of Chemistry, University of Barcelona, Barcelona, Spain
One of the most frequently used scientific words is "Entropy". The reason is that it is related to two main scientific domains: physics and information theory. Its origin goes back to the start of physics (thermodynamics), but since Shannon it has also become central to information theory. This conference is an opportunity to bring researchers of these…

🔖 Can entropy be defined for and the Second Law applied to the entire universe? by Arieh Ben-Naim | Arxiv

Bookmarked Can entropy be defined for and the Second Law applied to the entire universe? (arXiv)
This article provides answers to the two questions posed in the title. It is argued that, contrary to many statements made in the literature, neither entropy nor the Second Law may be used for the entire universe. The origin of this misuse of entropy and the Second Law may be traced back to Clausius himself. More recent (erroneous) justification is also discussed.

📗 Read pages i – xxix of An Introduction to Transfer Entropy: Information Flow in Complex Systems

📗 Read pages i - xxix of An Introduction to Transfer Entropy: Information Flow in Complex Systems by Terry Bossomaier, Lionel Barnett, Michael Harré, and Joseph T. Lizier. From page vi: The structure of the book is a bit like stone fruit, with a soft wrapping of a hard core, ... Transfer entropy is hard…

🔖 An Introduction to Transfer Entropy: Information Flow in Complex Systems

Bookmarked An Introduction to Transfer Entropy: Information Flow in Complex Systems (Springer; 1st ed. 2016 edition)
This book considers a relatively new metric in complex systems, transfer entropy, derived from a series of measurements, usually a time series. After a qualitative introduction and a chapter that explains the key ideas from statistics required to understand the text, the authors then present information theory and transfer entropy in depth. A key feature of the approach is the authors' work to show the relationship between information flow and complexity. The later chapters demonstrate information transfer in canonical systems, and applications, for example in neuroscience and in finance. The book will be of value to advanced undergraduate and graduate students and researchers in the areas of computer science, neuroscience, physics, and engineering. ISBN: 978-3-319-43221-2 (Print), 978-3-319-43222-9 (Online)
Want to read; h/t to Joseph Lizier.
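
Before getting into the book, a plug-in estimator with history length 1 (my own sketch of the standard definition, not code from the book) makes the quantity concrete: transfer entropy from Y to X is how much knowing y_t reduces uncertainty about x_{t+1} beyond what x_t already provides.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, base=2):
    """Plug-in transfer entropy T_{Y->X} for two discrete series, history 1:
    sum over (x1, x0, y0) of p(x1, x0, y0) * log[ p(x1|x0, y0) / p(x1|x0) ]."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))         # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))          # (x_{t+1}, x_t)
    singles = Counter(x[:-1])                       # x_t
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]              # p(x1 | x0, y0)
        p_cond_self = pairs_xx[(x1, x0)] / singles[x0]    # p(x1 | x0)
        te += p_joint * np.log(p_cond_full / p_cond_self)
    return te / np.log(base)

# Toy coupled processes: X copies Y with a one-step lag (plus 10% flip noise),
# so information should flow Y -> X but not X -> Y.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 10000)
noise = rng.random(10000) < 0.1
x = np.empty_like(y)
x[0] = 0
x[1:] = np.where(noise[1:], 1 - y[:-1], y[:-1])
print("T(Y->X):", round(transfer_entropy(x, y), 3))   # close to 1 - H(0.1) ~ 0.53 bits
print("T(X->Y):", round(transfer_entropy(y, x), 3))   # close to 0
```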

🔖 Entropy | Special Issue: Maximum Entropy and Bayesian Methods

Bookmarked Entropy | Special Issue: Maximum Entropy and Bayesian Methods (mdpi.com)
Open for submission now
Deadline for manuscript submissions: 31 August 2017

A special issue of Entropy (ISSN 1099-4300).

Special Issue Editor

Guest Editor: Dr. Brendon J. Brewer
Department of Statistics, The University of Auckland, Private Bag 92019, Auckland 1142, New Zealand
Interests: Bayesian inference, Markov chain Monte Carlo, nested sampling, MaxEnt

Special Issue Information

Dear Colleagues,

Whereas Bayesian inference has now achieved mainstream acceptance and is widely used throughout the sciences, associated ideas such as the principle of maximum entropy (implicit in the work of Gibbs, and developed further by Ed Jaynes and others) have not. There are strong arguments that the principle (and variations, such as maximum relative entropy) is of fundamental importance, but the literature also contains many misguided attempts at applying it, leading to much confusion.

This Special Issue will focus on Bayesian inference and MaxEnt. Some open questions that spring to mind are: Which proposed ways of using entropy (and its maximisation) in inference are legitimate, which are not, and why? Where can we obtain constraints on probability assignments, the input needed by the MaxEnt procedure?

More generally, papers exploring any interesting connections between probabilistic inference and information theory will be considered. Papers presenting high quality applications, or discussing computational methods in these areas, are also welcome.

Dr. Brendon J. Brewer
Guest Editor

Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs).

No papers have been published in this special issue yet.
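
The "which uses are legitimate" question above has a canonical toy answer that I like to keep in mind: Jaynes' Brandeis dice problem. Given only the constraint that a die's average roll is 4.5, maximizing Shannon entropy yields an exponential-family distribution whose single Lagrange multiplier is fixed by the constraint. A minimal sketch (mine, not from the call for papers):

```python
import numpy as np

def maxent_die(mean, faces=np.arange(1, 7), tol=1e-12):
    """Maximum-entropy distribution over die faces subject to a fixed mean
    (Jaynes' Brandeis dice problem).  The solution is p_i proportional to
    exp(-lam * i); lam is found by bisection so the constraint is met."""
    def mean_for(lam):
        w = np.exp(-lam * faces)
        p = w / w.sum()
        return p @ faces

    lo, hi = -50.0, 50.0                 # brackets: mean_for decreases in lam
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = np.exp(-lam * faces)
    return w / w.sum()

p = maxent_die(4.5)                      # Jaynes' classic example
print(np.round(p, 4))                    # skewed toward the high faces
print("mean:", round(p @ np.arange(1, 7), 4))
print("entropy (bits):", round(-(p @ np.log2(p)), 4))
```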

Videos from the NIMBioS Workshop on Information and Entropy in Biological Systems

Videos from the NIMBioS workshop on Information and Entropy in Biological Systems from April 8-10, 2015 are slowly starting to appear on YouTube.

NIMBioS Workshop: Information Theory and Entropy in Biological Systems

Web resources for participants in the NIMBioS Workshop on Information Theory and Entropy in Biological Systems.

Announcement: Special issue of Entropy: “Information Theoretic Incentives for Cognitive Systems”

Dr. Christoph Salge asked me to cross-post this notice from the Entropy site here.

Workshop Announcement: Entropy and Information in Biological Systems

I rarely, if ever, reblog anything here, but this particular post from John Baez's blog Azimuth is so on-topic that attempting to embellish it seems silly. Entropy and Information in Biological Systems (Part 2): John Harte, Marc Harper and I are running a workshop! Now you can apply here to attend: • Information and entropy…

CECAM Workshop: “Entropy in Biomolecular Systems”

On Friday, I had an excellent and stimulating conversation with Arieh Ben-Naim about his recent writing and work, and he mentioned in passing that he had been invited to a conference relating to entropy and biology in Vienna. A quick web search found it, and not having heard about it myself yet, I thought I'd…