In the simplest view of transcriptional regulation, the expression of a gene is turned on or off by changes in the concentration of a transcription factor (TF). We use recent data on noise levels in gene expression to show that it should be possible to transmit much more than just one regulatory bit. Realizing this optimal information capacity would require that the dynamic range of TF concentrations used by the cell, the input/output relation of the regulatory module, and the noise in gene expression satisfy certain matching relations, which we derive. These results provide parameter-free, quantitative predictions connecting independently measurable quantities. Although we have considered only the simplified problem of a single gene responding to a single TF, we find that these predictions are in surprisingly good agreement with recent experiments on the Bicoid/Hunchback system in the early Drosophila embryo and that this system achieves ∼90% of its theoretical maximum information transmission.
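The information-transmission claim above can be made concrete with a toy calculation. The sketch below is my own illustration, not the authors' analysis: it assumes a Hill-type input/output relation and Gaussian expression noise with made-up parameter values (K, n, sigma), then evaluates the mutual information I(c; g) between TF concentration and expression level on a discretized grid.

```python
import numpy as np

# Toy model (not the paper's calculation): Hill-type regulation with
# Gaussian output noise; K, n, and sigma are assumed values.
def hill(c, K=1.0, n=5):
    return c**n / (K**n + c**n)

c = np.linspace(0.01, 3.0, 60)    # input TF concentrations (arbitrary units)
g = np.linspace(0.0, 1.2, 80)     # output expression levels
sigma = 0.1                       # assumed noise amplitude

# p(g|c): Gaussian noise around the mean input/output relation
pg_c = np.exp(-(g[None, :] - hill(c)[:, None]) ** 2 / (2 * sigma**2))
pg_c /= pg_c.sum(axis=1, keepdims=True)

pc = np.full(len(c), 1.0 / len(c))   # uniform input distribution, for illustration
pg = pc @ pg_c                       # marginal p(g)

# I(c; g) = sum_c p(c) sum_g p(g|c) log2[p(g|c) / p(g)]
I = float(np.sum(pc[:, None] * pg_c * np.log2(pg_c / pg[None, :])))
print(f"I(c;g) ≈ {I:.2f} bits")
```

The capacity discussed in the abstract would further maximize this quantity over the input distribution p(c) (e.g., with the Blahut-Arimoto algorithm); the uniform input used here only gives a lower bound, but already shows that more than one bit can pass through such a channel.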
To understand the structure of a large-scale biological, social, or technological network, it can be helpful to decompose the network into smaller subunits or modules. In this article, we develop an information-theoretic foundation for the concept of modularity in networks. We identify the modules of which the network is composed by finding an optimal compression of its topology, capitalizing on regularities in its structure. We explain the advantages of this approach and illustrate them by partitioning a number of real-world and model networks.
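The compression idea can be illustrated with one well-known description-length objective in this family, the "map equation" for an undirected, unweighted graph: the expected number of bits per random-walk step under a two-level code. This is a sketch of the general approach, not the paper's exact encoding, and the graph and partitions below are toy choices of mine.

```python
import math
from collections import defaultdict

def plogp(p):
    return p * math.log2(p) if p > 0 else 0.0

def codelength(edges, module):          # module: node -> module id
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    two_m = sum(deg.values())
    visit = {u: d / two_m for u, d in deg.items()}   # walker visit rates
    exit_q = defaultdict(float)                      # module exit rates
    for u, v in edges:
        if module[u] != module[v]:
            exit_q[module[u]] += 1 / two_m
            exit_q[module[v]] += 1 / two_m
    mods = set(module.values())
    q = sum(exit_q.values())
    # index codebook: choosing which module the walker enters next
    L = -q * sum(plogp(exit_q[i] / q) for i in mods) if q > 0 else 0.0
    # one codebook per module: its nodes plus its exit symbol
    for i in mods:
        members = [u for u in module if module[u] == i]
        p_circ = exit_q[i] + sum(visit[u] for u in members)
        L -= p_circ * (plogp(exit_q[i] / p_circ)
                       + sum(plogp(visit[u] / p_circ) for u in members))
    return L

edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 5), (3, 5)]  # two triangles
two_mod = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1}
one_mod = {u: 0 for u in range(6)}
print(codelength(edges, two_mod), codelength(edges, one_mod))
```

Minimizing such a codelength over partitions (here only compared for two hand-picked partitions) is what identifies the modules: the split into two triangles yields a shorter description than leaving the graph undivided.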
Information Theory in Biology. Edited by Henry Quastler. 1953. 273 pp. Urbana: University of Illinois Press.
There are two kinds of scientific books worth reading. One is the monograph or treatise type, in which a more or less large field of science is presented in a systematic way and in a form as finished as possible at the given time. This kind of book may be considered a source of the knowledge then available. The other type presents a collection of chapters or individual articles which do not claim to be a complete and systematic treatment of the subject; yet the reader not only finds interesting ideas in them, but the reading as such suggests new ideas. Such books are useful. For although a rough and unfinished idea per se does not have even remotely the value of a well-elaborated scientific study, no elaborate study, no important theory, can be developed without first having a few rough ideas.
The book under consideration definitely belongs to the second category: it is a collection of essays. As the editor states in the Introduction (p. 2): "The papers in this volume are of a very different degree of maturity. They range from authoritative reviews of well-known facts to hesitant and tentative formulations of embryonic ideas." He further states (p. 3): "We are aware of the fact that this volume is largely exploratory."
If the above is to be considered as a shortcoming, then the reviewer does not need to dwell on it, because the editor, and undoubtedly the authors, are fully aware of it, and duly warn the reader. If we evaluate the book from the point of view of how many ideas it suggests to the reader, then, at least so far as this reviewer is concerned, it must be considered a great success.
MOTIVATION: Traditional sequence distances require an alignment and therefore are not directly applicable to the problem of whole-genome phylogeny, where events such as rearrangements make full-length alignments impossible. We present a sequence distance that works on unaligned sequences using the information-theoretic concept of Kolmogorov complexity, and a program to estimate this distance.
RESULTS: We establish the mathematical foundations of our distance and illustrate its use by constructing a phylogeny of the Eutherian orders using complete unaligned mitochondrial genomes. This phylogeny is consistent with the commonly accepted one for the Eutherians. A second, larger mammalian dataset is also analyzed, yielding a phylogeny generally consistent with the commonly accepted one for the mammals.
AVAILABILITY: The program to estimate our sequence distance is available at http://www.cs.cityu.edu.hk/~cssamk/gencomp/GenCompress1.htm. The distance matrices used to generate our phylogenies are available at http://www.math.uwaterloo.ca/~mli/distance.html.
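As a rough illustration of the idea (a stand-in for, not a reproduction of, the paper's GenCompress-based estimator), a general-purpose compressor such as zlib can approximate the uncomputable Kolmogorov complexity, giving a normalized compression distance of the same family. The sequences below are made up, not real genomes.

```python
import random
import zlib

# NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
# where C(s) is the compressed length of s: related sequences share
# structure, so their concatenation compresses disproportionately well.
def C(s: bytes) -> int:
    return len(zlib.compress(s, 9))

def ncd(x: bytes, y: bytes) -> float:
    cx, cy = C(x), C(y)
    return (C(x + y) - min(cx, cy)) / max(cx, cy)

random.seed(0)
a = b"ACGT" * 200                                # toy "genome"
b_rel = b"ACGT" * 190 + b"TTTT" * 10             # slightly perturbed copy
b_rand = bytes(random.choice(b"ACGT") for _ in range(800))  # unrelated

print(ncd(a, b_rel), ncd(a, b_rand))  # the related pair scores lower
```

With real mitochondrial genomes the raw sequences would be fed in the same way, and the resulting pairwise distance matrix handed to a standard tree-building method to obtain the phylogeny.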
MOTIVATION: As an increasing number of protein structures become available, the need for algorithms that can quantify the similarity between protein structures increases as well. Thus, the comparison of protein structures, and their clustering according to a given similarity measure, is at the core of today's biomedical research. In this paper, we show how the Universal Similarity Metric (USM), inspired by algorithmic information theory, can be used to calculate similarities between protein pairs. The method, besides being theoretically supported, is surprisingly simple to implement and computationally efficient.
RESULTS: Structural similarity between proteins in four different datasets was measured using the USM. The sample employed represented alpha, beta, alpha-beta, TIM-barrel, globin, and serpin protein types. The use of the proposed metric allows for a correct measurement of similarity and classification of the proteins in the four datasets.
AVAILABILITY: All the scripts and programs used for the preparation of this paper are available at http://www.cs.nott.ac.uk/~nxk/USM/protocol.html. In that web-page the reader will find a brief description on how to use the various scripts and programs.
PMID: 14751983 DOI: 10.1093/bioinformatics/bth031
Living systems are distinguished in nature by their ability to maintain stable, ordered states far from equilibrium, despite constant buffeting by thermodynamic forces that, if unopposed, would inevitably increase disorder. Cells maintain a steep transmembrane entropy gradient by continuous application of information, which permits cellular components to carry out highly specific tasks that import energy and export entropy. Thus, the study of information storage, flow, and utilization is critical for understanding the first principles that govern the dynamics of life. Initial biological applications of information theory (IT) used Shannon's methods to measure the information content in strings of monomers such as genes, RNA, and proteins. Recent work has used bioinformatic and dynamical-systems methods to provide remarkable insights into the topology and dynamics of intracellular information networks. Novel applications of Fisher, Shannon, and Kullback-Leibler information measures are promoting increased understanding of the mechanisms by which genetic information is converted to work and order. Insights into evolution may be gained by analyzing the fitness contributions from specific segments of genetic information, as well as the optimization process in which fitness gains are constrained by the substrate cost of storing and utilizing that information. Recent IT applications have recognized the possible role of nontraditional information-storage structures, including lipids and ion gradients, as well as information transmission by molecular flux across cell membranes. Many fascinating challenges remain, including defining the intercellular information dynamics of multicellular organisms and the role of disordered information storage and flow in disease.
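The classic Shannon application mentioned above, measuring the information content of a string of monomers, can be sketched in a few lines. The sequences below are made up for illustration, not real genes.

```python
import math
from collections import Counter

# Shannon entropy of a sequence, in bits per monomer:
# H = -sum_i p_i log2 p_i over the observed symbol frequencies.
def entropy_bits(seq: str) -> float:
    n = len(seq)
    return -sum((k / n) * math.log2(k / n) for k in Counter(seq).values())

print(entropy_bits("ACGTACGTACGTACGT"))  # uniform base usage: 2.0 bits/monomer
print(entropy_bits("AAAAAAAAAAAAAAAT"))  # biased usage: well below 2 bits
```

The same machinery extends to the other measures named in the abstract; for instance, the Kullback-Leibler divergence compares an observed symbol distribution against a reference background rather than scoring it in isolation.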
PMID: 17083004 DOI: 10.1007/s11538-006-9141-5
Degradable quantum channels are among the only channels whose quantum and private classical capacities are known. As such, determining the structure of these channels is a pressing open question in quantum information theory. We give a comprehensive review of what is currently known about the structure of degradable quantum channels, including a number of new results as well as alternate proofs of some known results. In the case of qubits, we provide a complete characterization of all degradable channels with two-dimensional output, give a new proof that a qubit channel with two Kraus operators is either degradable or anti-degradable, and present a complete description of anti-degradable unital qubit channels with a new proof. For higher output dimensions we explore the relationship between the output and environment dimensions (dB and dE, respectively) of degradable channels. For several broad classes of channels we show that they can be modeled with an environment that is "small" in the sense that dE ≤ dB. Such channels include all those with qubit or qutrit output, those that map some pure state to an output with full rank, and all those which can be represented using simultaneously diagonal Kraus operators, even in a non-orthogonal basis. Perhaps surprisingly, we also present examples of degradable channels with "large" environments, in the sense that the minimal dimension dE > dB; indeed, one can have dE > dB²/4. These examples can also be used to give a negative answer to the question of whether additivity of the coherent information is helpful for establishing additivity for the Holevo capacity of a pair of channels. In the case of channels with diagonal Kraus operators, we describe the subclasses that are complements of entanglement-breaking channels. We also obtain a number of results for channels in the convex hull of conjugations with generalized Pauli matrices.
However, a number of open questions remain about these channels and the more general case of random unitary channels.
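For reference, the structure under discussion can be stated compactly; these are the standard definitions underlying the abstract, not its new results.

```latex
% A channel \Phi with Stinespring isometry V : \mathcal{H}_A \to
% \mathcal{H}_B \otimes \mathcal{H}_E and its complementary channel:
\Phi(\rho) = \operatorname{Tr}_E\!\left( V \rho V^\dagger \right),
\qquad
\Phi^{C}(\rho) = \operatorname{Tr}_B\!\left( V \rho V^\dagger \right).
% \Phi is degradable if some channel D maps its output to the
% environment's output, and anti-degradable if \Phi^{C} is degradable:
D \circ \Phi = \Phi^{C}.
% For degradable channels the quantum capacity is given by the
% single-letter coherent information:
Q(\Phi) = \max_{\rho} \left[ S(\Phi(\rho)) - S(\Phi^{C}(\rho)) \right].
```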
Scientific knowledge grows at a phenomenal pace, but few books have had as lasting an impact or played as important a role in our modern world as The Mathematical Theory of Communication, published originally as a paper on communication theory in the Bell System Technical Journal more than fifty years ago. Republished in book form shortly thereafter, it has since gone through four hardcover and sixteen paperback printings. It is a revolutionary work, astounding in its foresight and contemporaneity. The University of Illinois Press is pleased and honored to issue this commemorative reprinting of a classic.