## 👓 How our 1,000-year-old math curriculum cheats America’s kids | LA Times

Read How our 1,000-year-old math curriculum cheats America's kids – LA Times by Edward Frenkel (Los Angeles Times)
Imagine you had to take an art class in which you were taught how to paint a fence or a wall, but you were never shown the paintings of the great masters, and you weren't even told that such paintings existed. Pretty soon you'd be asking, why study art? That's absurd, of course, but it's surprisingly close to the way we teach children mathematics. In elementary and middle school and even into high school, we hide math's great masterpieces from students' view. The arithmetic, algebraic equations and geometric proofs we do teach are important, but they are to mathematics what whitewashing a fence is to Picasso — so reductive it's almost a lie. Most of us never get to see the real mathematics because our current math curriculum is more than 1,000 years old. For example, the formula for solutions of quadratic equations was in al-Khwarizmi's book published in 830, and Euclid laid the foundations of Euclidean geometry around 300 BC. If the same time warp were true in physics or biology, we wouldn't know about the solar system, the atom and DNA. This creates an extraordinary educational gap for our kids, schools and society.

An interesting train of thought to be sure. I should post in response to this, or at least think about how it could be structured. I definitely want to come back to write more about this topic.

## Probability Models for DNA Sequence Evolution

Bookmarked Probability Models for DNA Sequence Evolution (Springer, 2008, 2nd Edition) by Rick Durrett (math.duke.edu)

While browsing through some textbooks and researchers’ websites today, I came across a fantastic-looking title: Probability Models for DNA Sequence Evolution by Rick Durrett (Springer, 2008). While searching his website at Duke, I noticed that he’s made a .pdf copy of a LaTeX version of the 2nd edition available for download. I hope others find it as interesting and useful as I do.

I’ll also give him a shout-out for being a mathematician with a fledgling blog: Rick’s Ramblings.

## Free E-Book: Neural Networks and Deep Learning by Michael Nielsen

Bookmarked Neural networks and deep learning (neuralnetworksanddeeplearning.com)

Michael A. Nielsen, co-author of one of our favorite books, Quantum Computation and Quantum Information, is writing a new book entitled Neural Networks and Deep Learning. He’s been releasing portions of it for free on the internet in draft form every two or three months since 2013. He’s also maintaining an open code repository for the book on GitHub.

## The Postdoctoral Experience (Revisited)

Bookmarked The Postdoctoral Experience Revisited (2014) (The National Academies Press)
The Postdoctoral Experience Revisited builds on the 2000 report Enhancing the Postdoctoral Experience for Scientists and Engineers. That ground-breaking report assessed the postdoctoral experience and provided principles, action points, and recommendations to enhance that experience. Since the publication of the 2000 report, the postdoctoral landscape has changed considerably. The percentage of PhDs who pursue postdoctoral training is growing steadily and spreading from the biomedical and physical sciences to engineering and the social sciences. The average length of time spent in postdoctoral positions seems to be increasing.

The Postdoctoral Experience Revisited reexamines postdoctoral programs in the United States, focusing on how postdocs are being guided and managed, how institutional practices have changed, and what happens to postdocs after they complete their programs. This book explores important changes that have occurred in postdoctoral practices and the research ecosystem and assesses how well current practices meet the needs of these fledgling scientists and engineers and of the research enterprise.

The Postdoctoral Experience Revisited takes a fresh look at current postdoctoral fellows - how many there are, where they are working, in what fields, and for how many years. This book makes recommendations to improve aspects of programs - postdoctoral period of service, title and role, career development, compensation and benefits, and mentoring. Current data on demographics, career aspirations, and career outcomes for postdocs are limited. This report makes the case for better data collection by research institutions and data sharing.

A larger goal of this study is not only to propose ways to make the postdoctoral system better for the postdoctoral researchers themselves but also to better understand the role that postdoctoral training plays in the research enterprise. It is also to ask whether there are alternative ways to satisfy some of the research and career development needs of postdoctoral researchers that are now being met with several years of advanced training. Postdoctoral researchers are the future of the research enterprise. The discussion and recommendations of The Postdoctoral Experience Revisited will stimulate action toward clarifying the role of postdoctoral researchers and improving their status and experience.

Most might agree that our educational system is far less than ideal, but few pay attention to significant problems at the highest levels of academia which are holding back a great deal of our national “innovation machinery”. The National Academy of Sciences has published a (free) book: The Postdoctoral Experience (Revisited) discussing where we’re at and some ideas for a way forward. There are some interesting ideas here, but we’ve still got a long way to go.

## 2014 Fields Medal and Nevanlinna Prize Winners Announced

The 2014 Fields Medal and Nevanlinna Prize winners were announced yesterday. Here's some of the best coverage I've seen of the awards.

## General announcement

Nature: “Iranian is first woman to nab highest prize in maths”
(Also includes coverage of the Gauss Prize for research that has had an impact outside mathematics, which was awarded to Stanley Osher of the University of California at Los Angeles.)

## Great personal profiles with short videos via Quanta Magazine

Artur Avila: A Brazilian Wunderkind Who Calms Chaos

Manjul Bhargava: The Musical, Magical Number Theorist

Martin Hairer: In Noisy Equations, One Who Heard Music

Maryam Mirzakhani: A Tenacious Explorer of Abstract Surfaces

Subhash Khot: A Grand Vision for the Impossible

## Technical explanation of their work

Terry Tao (a previous Fields Medal winner): Avila, Bhargava, Hairer, Mirzakhani

## Academy of Motion Picture Arts & Sciences study on The Digital Dilemma

With a slight nod toward the Academy’s announcement of the Oscar nominees this morning, there’s something more interesting they’ve recently released that hasn’t gotten nearly as much press but promises to be far more vital in the long run.

As books enter the digital age and rich media like video and audio converge into e-book formats (witness last week’s announcement of Apple’s foray into digital publishing), the ability to catalog, maintain, and store many types of digital media is becoming an increasing problem. Last week the Academy released part two of its study on strategic issues in archiving and accessing digital motion picture materials, a report entitled The Digital Dilemma 2. Many of you will find it interesting and useful, particularly in light of the Academy’s description.

The report’s page provides some additional information as well as the ability (with a simple login) to download a .pdf copy of the entire report.

There is also a recent Variety article which gives a more fully fleshed out overview of many of the issues at hand.

In the meanwhile, if you’re going to make a bet in this year’s Oscar pool, perhaps putting your money on the “Digital Dilemma” might be more useful than on Brad Pitt for Best Actor in “Moneyball”?

## Mathematics in Popular Science Books | The Economist

Reposted Big bang (The Economist)
Popular physics has enjoyed a new-found regard. Now comes a brave attempt to inject mathematics into an otherwise fashionable subject

This review of Brian Cox and Jeff Forshaw’s forthcoming book The Quantum Universe: Everything That Can Happen Does Happen sounds intriguing. I’m highly impressed that so much of the review focuses on the authors’ decision to include a more mathematical treatment of their subject in what is supposed to be a popular science book. I always wish books like these at least had the temerity to include much more of the mathematical underpinnings of their subjects; I’m glad that the popular press (or at least The Economist in this case) is willing to ask for the mathematics as well. Hopefully it will mark a broader trend in popular books on scientific topics!

### Big bang

Nov 5th 2011 | from the print edition

The Quantum Universe: Everything That Can Happen Does Happen. By Brian Cox and Jeff Forshaw. Allen Lane; 255 pages; £20. To be published in America in January by Da Capo Press; \$25.

Previously the preserve of dusty, tweed-jacketed academics, physics has enjoyed a surprising popular renaissance over the past few years. In America Michio Kaku, a string theorist, has penned several successful books and wowed television and radio audiences with his presentations on esoteric subjects such as the existence of wormholes and the possibility of alien life. In Britain Brian Cox, a former pop star whose music helped propel Tony Blair to power, has become the front man for physics, which recently regained its status as a popular subject in British classrooms, an effect many attribute to Mr Cox’s astonishing appeal.

Mr Cox, a particle physicist, is well-known as the presenter of two BBC television series that have attracted millions of viewers (a third series will be aired next year) and as a bestselling author and public speaker. His latest book, “The Quantum Universe”, which he co-wrote with Jeff Forshaw of the University of Manchester, breaks the rules of popular science-writing that were established over two decades ago by Stephen Hawking, who launched the modern genre with his famous book, “A Brief History of Time”.

Mr Hawking’s literary success was ascribed to his eschewing equations. One of his editors warned him that sales of the book would be halved by every equation he included; Mr Hawking inserted just one, E=mc², and, even then, the volume acquired a sorry reputation for being bought but not read. By contrast, Mr Cox, whose previous book with Mr Forshaw investigated “Why Does E=mc²?” (2009), has bravely sloshed a generous slug of mathematics throughout his texts.

The difficulties in explaining physics without using maths are longstanding. Einstein mused, “The eternal mystery of the world is its comprehensibility,” and “the fact that it is comprehensible is a miracle.” Yet the language in which the world is described is that of maths, a relatively sound grasp of which is needed to comprehend the difficulties that physicists are trying to resolve as well as the possible solutions. Mr Cox has secured a large fan base with his boyish good looks, his happy turns of phrase and his knack for presenting complex ideas using simple analogies. He also admirably shies away from dumbing down. “The Quantum Universe” is not a dry undergraduate text book, but nor is it a particularly easy read.

The subject matter is hard. Quantum mechanics, which describes in subatomic detail a shadowy world in which cats can be simultaneously alive and dead, is notoriously difficult to grasp. Its experiments yield bizarre results that can be explained only by embracing the maths that describe them, and its theories make outrageous predictions (such as the existence of antimatter) that have nevertheless later been verified. Messrs Cox and Forshaw say they have included the maths “mainly because it allows us to really explain why things are the way they are. Without it, we should have to resort to the physicist-guru mentality whereby we pluck profundities out of thin air, and neither author would be comfortable with guru status.”

That stance might comfort the authors, but to many readers they will nonetheless seem to pluck equations out of thin air. Yet their decision to include some of the hard stuff leaves open the possibility that some readers might actually engage in the slog that leads to higher pleasures. For non-sloggers alternative routes are offered: Messrs Cox and Forshaw use clockfaces to illustrate how particles interact with one another, a drawing of how guitar strings twang and a photograph of a vibrating drum. A diagram, rather than an equation, is used to explain one promising theory of how matter acquires mass, a question that experiments on the Large Hadron Collider at CERN, the European particle-physics laboratory near Geneva, will hopefully soon answer.

The authors have wisely chosen to leaven their tome with amusing tales of dysfunctional characters among scholars who developed quantum mechanics in the 1920s and beyond, as well as with accounts of the philosophical struggles with which they grappled and the occasional earthy aside. Where the subject matter is a trifle dull, Messrs Cox and Forshaw acknowledge it: of Heinrich Kayser, who a century ago completed a six-volume reference book documenting the spectral lines generated by every known element, they observe, “He must have been great fun at dinner parties.” And they make some sweeping generalisations about their colleagues who pore over equations, “Physicists are very lazy, and they would not go to all this trouble unless it saved time in the long run.”

Whether or not readers of “The Quantum Universe” will follow all the maths, the authors’ love for their subject shines through the book. “There is no better demonstration of the power of the scientific method than quantum theory,” they write. That may be so, but physicists all over the world, Messrs Cox and Forshaw included, are longing for the next breakthrough that will supersede the claim. Hopes are pinned on experiments currently under way at CERN that may force physicists to rethink their understanding of the universe, and inspire Messrs Cox and Forshaw to write their next book—equations and all.

## Entropy Is Universal Rule of Language | Wired Science

Reposted Entropy Is Universal Rule of Language (Wired)
The amount of information carried in the arrangement of words is the same across all languages, even languages that aren't related to each other. This consistency could hint at a single common ancestral language, or universal features of how human brains process speech. "It doesn't matter what language or style you take," said systems biologist…

## Barnes & Noble Board Would Face Tough Choices in a Buyout Vote | Dealbook

Read Barnes & Noble Faces Tough Choices in a Buyout Vote by Steven Davidoff Solomon (DealBook)
If Leonard Riggio, Barnes & Noble's chairman, joins Liberty Media's proposed buyout of his company, the board needs to decide how to handle his 30 percent stake before shareholders vote on the deal.

This story from the New York Times’ DealBook is a good quick read on some of the details and machinations of the Barnes & Noble buyout. Perhaps additional analysis of it from a game-theoretic viewpoint would yield new insight?

## The Science of Why We Don’t Believe Science | Mother Jones

Read The Science of Why We Don't Believe Science by Chris Mooney (Mother Jones)
How our brains fool us on climate, creationism, and the vaccine-autism link.

This is a fantastic article that everyone should read and take some serious time to absorb!

## Synthetic Biology’s Hunt for the Genetic Transistor | IEEE Spectrum

Replied to Synthetic Biology's Hunt for the Genetic Transistor (spectrum.ieee.org)
How genetic circuits will unlock the true potential of bioengineering

This is a great short article on bioengineering and synthetic biology written for the layperson. It’s also one of the best crash courses I’ve read on genetics in a while.

## IPTV primer: an overview of the fusion of TV and the Internet | Ars Technica

Reposted IPTV primer: an overview of the fusion of TV and the Internet by Iljitsch van Beijnum (Ars Technica)

This brief overview of IPTV is about as concise as they get. It’s recommended for entertainment executives who need to get caught up on the space as well as for people who are contemplating “cutting the cable cord.” There’s still a lot of room for improvement in the area…

Profound as it may be, the Internet revolution still pales in comparison to that earlier revolution that first brought screens into millions of homes: the TV revolution. Americans still spend more of their non-sleep, non-work time watching TV than on any other activity. And now the immovable object (the couch potato) and the irresistible force (the business-model-destroying Internet) are colliding.

For decades, the limitations of technology only allowed viewers to watch TV programs as they were broadcast. Although limiting, this way of watching TV has the benefit of simplicity: the viewer only has to turn on the set and select a channel. They then get to see what was deemed broadcast-worthy at that particular time. This is the exact opposite of the Web, where users type a search query or click a link and get their content whenever they want. Unsurprisingly, TV over the Internet, a combination that adds Web-like instant gratification to the TV experience, has seen an enormous growth in popularity since broadband became fast enough to deliver decent quality video. So is the Internet going to wreck TV, or is TV going to wreck the Internet? Arguments can certainly be made either way.

The process of distributing TV over a data network such as the Internet, a process often called IPTV, is a little more complex than just sending files back and forth. Unless, that is, a TV broadcast is recorded and turned into a file. The latter, file-based model is one that Apple has embraced with its iTunes Store, where shows are simply downloaded like any other file. This has the advantage that shows can be watched later, even when there is no longer a network connection available, but the download model doesn’t exactly lend itself to live broadcasts—or instant gratification, for that matter.

### Streaming

Most of the new IPTV services, like Netflix and Hulu, and all types of live broadcasts use a streaming model. Here, the program is sent out in real time. The computer—or, usually by way of a set-top box, the TV—decodes the incoming stream of audio and video and then displays it pretty much immediately. This has the advantage that the video starts within seconds. However, it also means that the network must be fast enough to carry the audio/video at the bitrate that it was encoded with. The bitrate can vary a lot depending on the type of program—talking heads compress a lot better than car crashes—but for standard definition (SD) video, think two megabits per second (Mbps).

To get a sense of just how significant this 2Mbps number is, it’s worth placing it in the context of the history of the Internet as it has moved from transmitting text to images to audio and video. A page of text that takes a minute to read is a few kilobytes in size. Images are tens to a few hundred kilobytes. High-quality audio starts at about 128 kilobits per second (kbps), or about a megabyte per minute. SD TV can be shoehorned into some two megabits per second (Mbps), or about 15 megabytes per minute. HDTV starts around 5Mbps, or 40 megabytes per minute. So someone watching HDTV over the Internet uses about the same bandwidth as half a million early-1990s text-only Web surfers. Even today, watching video uses at least ten times as much bandwidth as non-video use of the network.
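
Those figures are easy to verify. Here’s a quick back-of-the-envelope script (a sketch only; the bitrates are the article’s round numbers, not measurements):

```python
# Back-of-the-envelope check of the bitrate figures quoted above.

def mb_per_minute(mbps: float) -> float:
    """Convert a bitrate in megabits/second to megabytes/minute."""
    return mbps / 8 * 60  # 8 bits per byte, 60 seconds per minute

for label, mbps in [("audio (128 kbps)", 0.128),
                    ("SD video (2 Mbps)", 2.0),
                    ("HD video (5 Mbps)", 5.0)]:
    print(f"{label}: ~{mb_per_minute(mbps):.0f} MB/minute")

# audio: ~1 MB/minute, SD: 15 MB/minute, HD: ~38 MB/minute (roughly 40).
# The "half a million early-1990s surfers" comparison implies an average
# of 5 Mbps / 500,000 = 10 bits/s per text-only surfer.
```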

In addition to raw capacity, streaming video also places other demands on the network. Most applications communicate through TCP, a layer in the network stack that takes care of retransmitting lost data and delivering data to the receiving application in the right order. This is despite the fact that the IP packets that do TCP’s bidding may arrive out of order. And when the network gets congested, TCP’s congestion control algorithms slow down the transmission rate at the sender, so the network remains usable.

However, for real-time audio and video, TCP isn’t such a good match. If a fraction of a second of audio or part of a video frame gets lost, it’s much better to just skip over the lost data and continue with what follows, rather than wait for a retransmission to arrive. So streaming audio and video has tended to run on top of UDP rather than TCP. UDP is the thinnest possible layer on top of IP and doesn’t care about lost packets and such. But UDP also means that TCP’s congestion control is out the door, so a video stream may continue at full speed even though the network is overloaded and many packets—including those from other users—get lost. However, more advanced streaming solutions are able to switch to lower-quality video when network conditions worsen. And Apple has developed a way to stream video using standard HTTP on top of TCP, by splitting the stream into small files that are downloaded individually. Should a file fail to download because of network problems, it can be skipped, continuing playback with the next file.
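
To make that segment-skipping idea concrete, here’s a minimal sketch of a segmented-HTTP client in Python. It only illustrates the concept behind the Apple approach described above; the server URL and segment names are made up, and a real player would fetch and parse a playlist first and hand the bytes to an actual media decoder:

```python
# Sketch of segmented HTTP streaming with skip-on-failure. The base URL
# and segment naming scheme are hypothetical placeholders.
import urllib.error
import urllib.request

BASE = "http://example.com/stream"  # placeholder server

def fetch_segment(n: int, timeout: float = 2.0) -> bytes | None:
    """Download one few-second media segment, or return None on failure."""
    try:
        with urllib.request.urlopen(f"{BASE}/seg{n:05d}.ts", timeout=timeout) as r:
            return r.read()
    except (urllib.error.URLError, TimeoutError):
        return None

def play(start: int, count: int) -> None:
    for n in range(start, start + count):
        data = fetch_segment(n)
        if data is None:
            print(f"segment {n} lost; skipping ahead")  # don't stall playback
            continue
        decode_and_display(data)  # stand-in for a real decoder

def decode_and_display(data: bytes) -> None:
    print(f"playing {len(data)} bytes")
```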

### Where are the servers? Follow the money

Like any Internet application, streaming of TV content can happen from across town or across the world. However, as the number of users increases, the costs of sending such large amounts of data over large distances become significant. For this reason, content delivery networks (CDNs), of which Akamai is probably the most well-known, try to place servers as close to the end-users as possible, either close to important interconnect locations where lots of Internet traffic comes together, or actually inside the networks of large ISPs.

Interestingly, it appears that CDNs are actually paying large ISPs for this privilege. This makes the IPTV business a lot like the cable TV business. On the Internet, the assumption is that both ends (the consumer and the provider of over-the-Internet services) pay their own ISPs for the traffic costs, and the ISPs just transport the bits and aren’t involved otherwise. In the cable TV world, this is very different. An ISP provides access to the entire Internet; a cable TV provider doesn’t provide access to all possible TV channels. Often, the cable companies pay for access to content.

A recent dispute between Level3 and Comcast can be interpreted as evidence of a power struggle between the CDNs and the ISPs in the IPTV arena.

### Walled gardens

So far, we’ve only been looking at IPTV over the public Internet. However, many ISPs around the world already provide cable-like service on top of ADSL or Fiber-To-The-Home (FTTH). With such complete solutions, the ISPs can control the whole service, from streaming servers to the set-top box that decodes the IPTV data and delivers it to a TV. This “walled garden” type of IPTV typically provides a better and more TV-like experience—changing channels is faster, image quality is better, and the service is more reliable.

Such an IPTV Internet access service is a lot like what cable networks provide, but there is a crucial difference: with cable, the bandwidth of the analog cable signal is split into channels, which can be used for analog or digital TV broadcasts or for data. TV and data don’t get in each other’s way. With IPTV, on the other hand, TV and Internet data are communicating vessels: what is used by one is unavailable to the other. And to ensure a good experience, IPTV packets are given higher priority than other packets. When bandwidth is plentiful, this isn’t an issue, but when a network fills up to the point that Internet packets regularly have to take a backseat to IPTV packets, this could easily become a network neutrality headache.

### Multicast to the rescue

Speaking of networks that fill up: for services like Netflix or Hulu, where everyone is watching their own movie or their own show, streaming makes a lot of sense. Not so much with live broadcasts. If 30 million people were to tune into Dancing with the Stars using streaming, that means 30 million copies of each IPTV packet must flow down the tubes. That’s not very efficient, especially given that routers and switches have the capability to take one packet and deliver a copy to anyone who’s interested. This ability to make multiple copies of a packet is called multicast, and it occupies territory between broadcasts, which go to everyone, and regular communications (called unicast), which go to only one recipient. Multicast packets are addressed to a special group address. Only systems listening for the right group address get a copy of the packet.
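
Joining a multicast group is part of the standard sockets API. Here’s a minimal listener sketch; the group address and port are placeholders (239.0.0.0/8 is the administratively scoped range), and a real IPTV client would feed the datagrams to a video decoder:

```python
# Minimal multicast listener. GROUP and PORT are placeholder values.
import socket
import struct

GROUP, PORT = "239.1.2.3", 5004

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Tell the kernel (and, via IGMP, the routers upstream) that this host
# wants copies of packets addressed to the group.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    data, sender = sock.recvfrom(2048)
    print(f"{len(data)}-byte chunk of the broadcast from {sender[0]}")
```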

Multicast is already used in some private IPTV networks, but it has never gained traction on the public Internet. Partially, this is a chicken/egg situation, where there is no demand because there is no supply and vice versa. But multicast is also hard to make work as the network gets larger and the number of multicast groups increases. However, multicast is very well suited to broadcast type network infrastructures, such as cable networks and satellite transmission. Launching multiple satellites that just send thousands of copies of the same packets to thousands of individual users would be a waste of perfectly good rockets.

Multicast works well for a relatively limited number of streams that are each watched by a reasonably sized group of people—but having very many multicast groups takes up too much memory in routers and switches. For less popular content, there’s another delivery method that requires few or no streaming servers: peer-to-peer streaming. This was the technology used by the Joost service in 2007 and 2008. With peer-to-peer streaming, all the systems interested in a given stream get blocks of audio/video data from upstream peers, and then send those on to downstream peers. This approach has two downsides: the bandwidth of the stream has to be limited to fit within the upload capacity of most peers, and changing channels is a very slow process because a whole new set of peers must be contacted.
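
The relay mechanic is simple enough to sketch. In the toy version below, each peer receives blocks from upstream and forwards them downstream; peer discovery, buffering, and decoding are all omitted, and the peer addresses are placeholders:

```python
# Toy peer-to-peer relay: receive a block, play it, forward it. Real
# systems negotiate peers dynamically and buffer blocks before playback.
import socket

DOWNSTREAM_PEERS = [("198.51.100.7", 6000), ("198.51.100.8", 6000)]  # placeholders
LISTEN_PORT = 6000

def play_locally(block: bytes) -> None:
    print(f"buffering {len(block)} bytes for playback")  # decoder stand-in

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", LISTEN_PORT))

while True:
    block, upstream = sock.recvfrom(4096)  # block arrives from an upstream peer
    play_locally(block)
    for peer in DOWNSTREAM_PEERS:  # our upload capacity caps the stream bitrate
        sock.sendto(block, peer)
```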

### IP addresses and home networks

A possible issue with IPTV could be the extra IP addresses required. There are basically two approaches to handling this issue: the one where the user is in full control, and the one where an IPTV service provider (usually the ISP) has some control. In the former case, streaming and downloading happen through the user’s home network and no extra addresses are required. However, wireless home networks may not be able to provide bandwidth with enough consistency to make streaming work well, so pulling Ethernet cabling may be required.

When the IPTV provider provides a set-top box, it’s often necessary to address packets toward that set-top box, so the box must be addressable in some way. This can eat up a lot of addresses, which is a problem in these IPv4-starved times. For really large ISPs, the private address ranges in IPv4 may not even be sufficient to provide a unique address to every customer. Issues in this area are why Comcast has been working on adopting IPv6 in the non-public part of its network for many years. When an IPTV provider provides a home gateway, this gateway is often outfitted with special quality-of-service mechanisms that make (wireless) streaming work better than run-of-the-mill home gateways that treat all packets the same.
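
The arithmetic behind “may not even be sufficient” is straightforward: the three RFC 1918 private ranges add up to fewer than 18 million addresses, which an ISP with an addressable set-top box per customer can exhaust. A quick check (the subscriber count is an assumed figure for illustration):

```python
# Count the RFC 1918 private IPv4 addresses and compare with an assumed
# subscriber count for a very large ISP.
import ipaddress

private_ranges = ["10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16"]
total = sum(ipaddress.ip_network(r).num_addresses for r in private_ranges)
print(f"RFC 1918 addresses: {total:,}")  # 17,891,328

subscribers = 25_000_000  # assumed figure, for illustration only
print("fits in private space?", subscribers <= total)  # False: hence IPv6
```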

### Predicting the future

Converging to a single IP network that can carry the Web, other data services, telephony, and TV seems like a no-brainer. The phone companies have been working on this for years because that will allow them to buy cheap off-the-shelf routers and switches, rather than the specialty equipment they use now. So it seems highly likely that in the future, we’ll be watching our TV shows over the Internet—or at least over an IP network of some sort. The extra bandwidth required is going to be significant, but so far, the Internet has been able to meet all challenges thrown at it in this area. Looking at the technologies, it would make sense to combine nightly pushed downloads for popular non-live content, multicast for popular live content, and regular streaming or peer-to-peer streaming for back catalog shows and obscure live content.

However, the channel flipping model of TV consumption has proven to be quite popular over the past half century, and many consumers may want to stick with it—for at least part of their TV viewing time. If nothing else, this provides an easy way to discover new shows. The networks are also unlikely to move away from this model voluntarily, because there is no way they’ll be able to sell 16 minutes of commercials per hour using most of the other delivery methods. However, we may see some innovations. For instance, if you stumble upon a show in progress, wouldn’t it be nice to be able to go back to the beginning? In the end, TV isn’t going anywhere, and neither is the Internet, so they’ll have to find a way to live together.

Correction: The original article incorrectly stated that cable providers get paid by TV networks. For broadcast networks, cable operators are required by the law’s “must carry” provisions to carry all of the TV stations broadcast in a market. Ars regrets the error.

## ‘The Information’ by James Gleick – Book Review by Janet Maslin | New York Times

Reposted ‘The Information’ by James Gleick - Review (nytimes.com)
“The Information,” by James Gleick, is to the nature, history and significance of data what the beach is to sand.

This book is assuredly going to have to skip up to the top of my current reading list.

“The Information” is so ambitious, illuminating and sexily theoretical that it will amount to aspirational reading for many of those who have the mettle to tackle it. Don’t make the mistake of reading it quickly. Imagine luxuriating on a Wi-Fi-equipped desert island with Mr. Gleick’s book, a search engine and no distractions. “The Information” is to the nature, history and significance of data what the beach is to sand.

In this relaxed setting, take the time to differentiate among the Brownian (motion), Bodleian (library) and Boolean (logic) while following Mr. Gleick’s version of what Einstein called “spukhafte Fernwirkung,” or “spooky action at a distance.” Einstein wasn’t precise about what this meant, and Mr. Gleick isn’t always precise either. His ambitions for this book are diffuse and far flung, to the point where providing a thumbnail description of “The Information” is impossible.

So this book’s prologue is its most slippery section. It does not exactly outline a unifying thesis. Instead it hints at the amalgam of logic, philosophy, linguistics, research, appraisal and anecdotal wisdom that will follow. If Mr. Gleick has one overriding goal it is to provide an animated history of scientific progress, specifically the progress of the technology that allows information to be recorded, transmitted and analyzed. This study’s range extends from communication by drumbeat to cognitive assault by e-mail.

As an illustration of Mr. Gleick’s versatility, consider what he has to say about the telegraph. He describes the mechanical key that made telegraphic transmission possible; the compression of language that this new medium encouraged; that it literally was a medium, a midway point between fully verbal messages and coded ones; the damaging effect its forced brevity had on civility; the confusion it created as to what a message actually was (could a mother send her son a dish of sauerkraut?) and the new conceptual thinking that it helped implement. The weather, which had been understood on a place-by-place basis, was suddenly much more than a collection of local events.

Beyond all this Mr. Gleick’s telegraph chapter, titled “A Nervous System for the Earth,” finds time to consider the kind of binary code that began to make sense in the telegraph era. It examines the way letters came to be treated like numbers, the way systems of ciphers emerged. It cites the various uses to which ciphers might be put by businessmen, governments or fiction writers (Lewis Carroll, Jules Verne and Edgar Allan Poe). Most of all it shows how this phase of communication anticipated the immense complexities of our own information age.

Although “The Information” unfolds in a roughly chronological way, Mr. Gleick is no slave to linearity. He freely embarks on colorful digressions. Some are included just for the sake of introducing the great eccentrics whose seemingly marginal inventions would prove to be prophetic. Like Richard Holmes’s “Age of Wonder” this book invests scientists with big, eccentric personalities. Augusta Ada Lovelace, the daughter of Lord Byron, may have been spectacularly arrogant about what she called “my immense reasoning faculties,” claiming that her brain was “something more than merely mortal.” But her contribution to the writing of algorithms can, in the right geeky circles, be mentioned in the same breath as her father’s contribution to poetry.

The segments of “The Information” vary in levels of difficulty. Grappling with entropy, randomness and quantum teleportation is the price of enjoying Mr. Gleick’s simple, entertaining riffs on the Oxford English Dictionary’s methodology, which has yielded 30-odd spellings of “mackerel” and an enchantingly tongue-tied definition of “bada-bing,” and on the cyber-battles waged via Wikipedia. (As he notes, there are people who have bothered to fight over Wikipedia’s use of the word “cute” to accompany a picture of a young polar bear.) That Amazon boasts of being able to download a book called “Data Smog” in less than a minute does not escape his keen sense of the absurd.

As it traces our route to information overload, “The Information” pays tribute to the places that made it possible. He cites and honors the great cogitation hives of yore. In addition to the Institute for Advanced Study in Princeton, N.J., the Mount Rushmore of theoretical science, he acknowledges the achievements of corporate facilities like Bell Labs and I.B.M.’s Watson Research Center in the halcyon days when many innovations had not found practical applications and progress was its own reward.

“The Information” also lauds the heroics of mathematicians, physicists and computer pioneers like Claude Shannon, who is revered in the computer-science realm for his information theory but not yet treated as a subject for full-length, mainstream biography. Mr. Shannon’s interest in circuitry using “if … then” choices conducting arithmetic in a binary system had novelty when he began formulating his thoughts in 1937. “Here in a master’s thesis by a research assistant,” Mr. Gleick writes, “was the essence of the computer revolution yet to come.”

Among its many other virtues “The Information” has the rare capacity to work as a time machine. It goes back much further than Shannon’s breakthroughs. And with each step backward Mr. Gleick must erase what his readers already know. He casts new light on the verbal flourishes of the Greek poetry that preceded the written word: these turns of phrase could be as useful for their mnemonic power as for their art. He explains why the Greeks arranged things in terms of events, not categories; how one Babylonian text that ends with “this is the procedure” is essentially an algorithm; and why the telephone and the skyscraper go hand in hand. Once the telephone eliminated the need for hand-delivered messages, the sky was the limit.

In the opinion of “The Information” the world of information still has room for expansion. We may be drowning in spam, but the sky’s still the limit today.

## Confessions of David Seidler, a 73-year-old Oscars virgin

Read Confessions of David Seidler, a 73-year-old Oscars virgin (LA Times)
My first realization I was hooked on Oscar was when I seriously began pondering one of mankind's most profound dilemmas: whether to rent or buy a tux. That first step, as with any descent down a...

This is a great (and hilarious) story by and about the writer of THE KING’S SPEECH.

## The Decline Effect and the Scientific Method | The New Yorker

Replied to The Truth Wears Off: Is there something wrong with the scientific method? (The New Yorker)

Jonah Lehrer’s New Yorker article “The Truth Wears Off: Is there something wrong with the scientific method?” is an interesting must-read article. In it he discusses the “Decline Effect” and outlier statistical effects within scientific research.

Among other interesting observations in it, he calls attention to the fact that, “according to the journal Nature, a third of all studies never even get cited, let alone repeated.”

For scholars of Fisher, Popper, and Kuhn, some of this discussion won’t be quite so novel, but for anyone designing scientific experiments, the effects discussed here are certainly worthy of notice and further scrutiny.