Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available.
It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.
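The maximum-entropy recipe the abstract describes can be made concrete with Jaynes' own dice example. The sketch below is my own illustration, not code from the paper (the function name, the bisection solver, and the target mean of 4.5 are all assumptions): given only a constraint on the mean roll of a die, the least biased distribution is p_i = exp(-λi)/Z(λ), where Z(λ) is the partition function and λ is the Lagrange multiplier fixed by the constraint.

```python
import math

def maxent_die(target_mean, faces=tuple(range(1, 7)), tol=1e-10):
    """Least-biased (maximum-entropy) distribution over die faces,
    given only the mean.  p_i = exp(-lam * i) / Z(lam), where
    Z(lam) = sum_i exp(-lam * i) is the partition function and lam
    is the Lagrange multiplier fixed by the mean constraint."""
    def constrained_mean(lam):
        z = sum(math.exp(-lam * i) for i in faces)
        return sum(i * math.exp(-lam * i) for i in faces) / z

    # constrained_mean is monotone decreasing in lam, so bisect for it.
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if constrained_mean(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    z = sum(math.exp(-lam * i) for i in faces)
    return {i: math.exp(-lam * i) / z for i in faces}

# A die whose average roll is 4.5 instead of the fair 3.5:
# the maximum-entropy answer smoothly up-weights the higher faces.
p = maxent_die(4.5)
```

With face values replaced by energy levels, the same machinery yields the Boltzmann distribution, which is exactly the sense in which the usual partition-function rules fall out of the maximum-entropy principle.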
The Organizing Committee invites you to attend the 18th International C. elegans Meeting, sponsored by the Genetics Society of America. The meeting will be held June 22 – 26, 2011 on the University of California, Los Angeles campus. The meeting will begin on Wednesday evening, June 22 at 7:00 pm and end on Sunday, June 26 at 12:00 noon. On Friday, June 24 at 5:00 pm there will be a Keynote Address by Joseph Culotti, Samuel Lunenfeld Research Institute, Toronto, Canada.
The amount of information carried in the arrangement of words is the same across all languages, even languages that aren't related to each other. This consistency could hint at a single common ancestral language, or universal features of how human brains process speech. "It doesn't matter what language or style you take," said systems biologist…
The research behind this article should be of particular interest to those who study language.
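As a rough illustration of how one can measure information carried by word arrangement (a toy sketch of my own, not the method used in the study): the mutual information between consecutive words estimates how many bits per word the ordering contributes, and shuffling a text drives that number toward zero.

```python
from collections import Counter
from math import log2

def entropy(counts):
    """Shannon entropy (bits) of an empirical count distribution."""
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def word_order_information(words):
    """Bits per word carried by adjacent word order, estimated as the
    mutual information between consecutive words:
    I(X;Y) ~= 2*H(unigrams) - H(adjacent bigrams).
    A crude small-sample estimator for illustration only."""
    unigrams = Counter(words)
    bigrams = Counter(zip(words, words[1:]))
    return 2 * entropy(unigrams) - entropy(bigrams)

sample = "the cat sat on the mat the cat sat on the mat".split()
bits = word_order_information(sample)  # positive: order carries information
```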
If Leonard Riggio, Barnes & Noble's chairman, joins Liberty Media's proposed buyout of his company, the board needs to decide how to handle his 30 percent stake before shareholders vote on the deal.
This story from the New York Times' DealBook is a good quick read on some of the details and machinations of the Barnes & Noble buyout. Perhaps additional analysis from a game-theoretic viewpoint would yield new insight?
How our brains fool us on climate, creationism, and the vaccine-autism link.
Bob Frankston, the computer pioneer who helped create the first spreadsheet, is this week's guest.
On a recent episode of Leo Laporte and Tom Merritt's show Triangulation, the hosts interviewed Bob Frankston of VisiCalc fame. The conversation covered the current state of broadband in the U.S. and how it might be made much better. It gets just a bit technical in places, but it's a fantastic and very accessible discussion of the topic of communications that every American should be aware of.
Read between January 2 and May 9, 2011
Quotes and Highlights:
You may remember the old Persian saying, ‘There is danger for him who taketh the tiger cub, and danger also for whoso snatches a delusion from a woman.’ There is as much sense in Hafiz as in Horace, and as much knowledge of the world.
Singularity is almost invariably a clue. The more featureless and commonplace a crime is, the more difficult it is to bring it home.
Well, moonshine is a brighter thing than fog, …
…as I said then, that a man should keep his little brain-attic stocked with all the furniture that he is likely to use, and the rest he can put away in the lumber-room of his library, where he can get it if he wants it.
“My God! It’s Watson,” said he. He was in a pitiable state of reaction, with every nerve in a twitter.
41% Note: An interesting early use of @Twitter…
I should be very much obliged if you would slip your revolver into your pocket. An Eley’s No. 2 is an excellent argument with gentlemen who can twist steel pokers into knots. That and a tooth-brush are, I think, all that we need.
87% First reference to Holmes with a magnifying lens in print that I've seen.
We describe the evolution of macromolecules as an information transmission process and apply tools from Shannon information theory to it. This allows us to isolate three independent, competing selective pressures that we term compression, transmission, and neutrality selection. The first two affect genome length: the pressure to conserve resources by compressing the code, and the pressure to acquire additional information that improves the channel, increasing the rate of information transmission into each offspring. Noisy transmission channels (replication with mutations) give rise to a third pressure that acts on the actual encoding of information; it maximizes the fraction of mutations that are neutral with respect to the phenotype. This neutrality selection has important implications for the evolution of evolvability. We demonstrate each selective pressure in experiments with digital organisms.
To be published in J. theor. Biology 222 (2003) 477-483
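The "neutrality" in this abstract is easy to see with a toy genotype-phenotype map (my own sketch, not the paper's digital-organism setup; the motif phenotype and helper names are invented for illustration): two genomes expressing the same phenotype can differ sharply in the fraction of point mutations that leave that phenotype unchanged, and that fraction is precisely what neutrality selection acts on.

```python
def neutral_fraction(genome, phenotype, alphabet="01"):
    """Fraction of all single-point mutations that leave the phenotype
    unchanged -- the quantity 'neutrality selection' maximizes."""
    base = phenotype(genome)
    neutral = total = 0
    for i, g in enumerate(genome):
        for a in alphabet:
            if a == g:
                continue
            mutant = genome[:i] + a + genome[i + 1:]
            total += 1
            neutral += (phenotype(mutant) == base)
    return neutral / total

def motif(genome):
    """Toy phenotype: the genome 'expresses' iff it contains '111'."""
    return "111" in genome

# Two encodings of the same phenotype, very different robustness:
robust = neutral_fraction("1111111", motif)   # every flip stays neutral
fragile = neutral_fraction("1110000", motif)  # only 4 of 7 flips neutral
```

Under mutational pressure, the redundant encoding wins even though both genomes "do" the same thing, which is the intuition behind the paper's link from neutrality to evolvability.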