If Leonard Riggio, Barnes & Noble's chairman, joins Liberty Media's proposed buyout of his company, the board needs to decide how to handle his 30 percent stake before shareholders vote on the deal.
The amount of information carried in the arrangement of words is the same across all languages, even languages that aren't related to each other. This consistency could hint at a single common ancestral language, or universal features of how human brains process speech. "It doesn't matter what language or style you take," said systems biologist…
The Organizing Committee invites you to attend the 18th International C. elegans Meeting, sponsored by the Genetics Society of America. The meeting will be held June 22–26, 2011, on the University of California, Los Angeles campus. The meeting will begin on Wednesday evening, June 22, at 7:00 pm and will end on Sunday, June 26, at 12:00 noon. On Friday, June 24, at 5:00 pm there will be a Keynote Address by Joseph Culotti of the Samuel Lunenfeld Research Institute, Toronto, Canada.
A portion of Charles Darwin’s vast scientific library—including handwritten notes that the 19th-century English naturalist scribbled in the margins of his books—has been digitized and is available online. Readers can now get a firsthand look into the mind of the man behind the theory of evolution.
The project to digitize Darwin’s extensive library, which includes 1,480 scientific books, was a joint effort of the University of Cambridge, the Darwin Manuscripts Project at the American Museum of Natural History, the Natural History Museum in Britain, and the Biodiversity Heritage Library.
The digital library, which includes 330 of the most heavily annotated books in the collection, is fully indexed—allowing readers to search through transcriptions of the naturalist’s handwritten notes that were compiled by the Darwin scholars Mario A. Di Gregorio and Nick Gill in 1990.
Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available.
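As a sketch of the variational step the abstract refers to (the discrete energy levels $E_i$, the constraint $\langle E \rangle$, and the multiplier $\lambda$ are introduced here purely for illustration), maximizing the entropy

\[
S = -\sum_i p_i \ln p_i
\]

subject to $\sum_i p_i = 1$ and $\sum_i p_i E_i = \langle E \rangle$ gives, via Lagrange multipliers,

\[
p_i = \frac{e^{-\lambda E_i}}{Z(\lambda)}, \qquad
Z(\lambda) = \sum_i e^{-\lambda E_i}, \qquad
\langle E \rangle = -\frac{\partial}{\partial \lambda}\ln Z(\lambda),
\]

so the partition function enters immediately as the normalization of the least-biased distribution, which is the sense in which the usual computational rules follow from the maximum-entropy principle.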
It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.
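To make the "straightforward example of statistical inference" concrete, a minimal numerical sketch follows; the toy energy spectrum, the target mean energy, and the use of NumPy/SciPy are assumptions for illustration, not anything specified in the paper. Enumerating the states and their energies is the "physical" part; solving for the multiplier that satisfies the constraint is the "statistical" part.

```python
import numpy as np
from scipy.optimize import brentq

# Physical part (assumed toy example): enumerate the states and their energies.
energies = np.array([0.0, 1.0, 1.0, 2.0, 3.0])  # hypothetical discrete spectrum
target_mean_energy = 1.2                         # hypothetical constraint on <E>

def mean_energy(beta):
    # Maximum-entropy (Boltzmann) weights p_i = exp(-beta * E_i) / Z.
    weights = np.exp(-beta * energies)
    Z = weights.sum()                            # partition function
    return (weights * energies).sum() / Z

# Statistical part: infer the multiplier beta that reproduces the constraint.
beta = brentq(lambda b: mean_energy(b) - target_mean_energy, -10.0, 10.0)

weights = np.exp(-beta * energies)
Z = weights.sum()
p = weights / Z
entropy = -(p * np.log(p)).sum()

print(f"beta = {beta:.4f}, Z = {Z:.4f}, S = {entropy:.4f}")
print("maximum-entropy probabilities:", np.round(p, 4))
```

Whatever spectrum and constraints are supplied, the same two steps apply: enumerate the states, then choose the distribution of maximum entropy consistent with the given information.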