Logical probability theory was developed as a quantitative measure based on Boole's logic of subsets, but information theory was developed into a mature theory by Claude Shannon with no such connection to logic. A recent development in logic changes this situation. In category theory, the notion of a subset is dual to the notion of a quotient set or partition, and the logic of partitions has recently been developed in a parallel relationship to the Boolean logic of subsets (subset logic is usually mis-specified as the special case of propositional logic). What, then, is the quantitative measure based on partition logic in the same sense that logical probability theory is based on subset logic? It is a measure of information named "logical entropy" in view of that logical basis. This paper develops the notion of logical entropy and the basic notions of the resulting logical information theory, and then makes an extensive comparison with the corresponding notions based on Shannon entropy.
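The abstract doesn't spell out the formula, but the standard definition of logical entropy for a partition with block probabilities p_i is h(p) = 1 − Σ p_i², i.e. the probability that two independent draws are distinguished by the partition, versus Shannon's H(p) = −Σ p_i log₂ p_i. A minimal sketch of the comparison (the function names here are mine, not the paper's):

```python
from math import log2

def logical_entropy(p):
    """h(p) = 1 - sum(p_i^2): the probability that two independent
    draws from p land in different blocks of the partition."""
    return 1 - sum(pi ** 2 for pi in p)

def shannon_entropy(p):
    """H(p) = -sum(p_i * log2(p_i)): average bits per draw."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

# A fair four-block partition: h = 1 - 4*(1/4)^2 = 0.75, H = 2 bits.
p = [0.25, 0.25, 0.25, 0.25]
print(logical_entropy(p))   # 0.75
print(shannon_entropy(p))   # 2.0
```

Note the different interpretations: logical entropy is a probability (of distinction), while Shannon entropy is a count (of binary distinctions), which is roughly the axis along which the paper's comparison runs.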
Ellerman is visiting at UC Riverside at the moment. Given the overlap between information theory and category theory, I'm curious whether he's working with John Carlos Baez, or whether Baez is even aware of this work.
Based on a cursory look at his website(s), I'm going to have to start following more of this work.