A simple preface followed by an anecdote about the beginning of a telecom deal. The style is quick-moving, with history, details, and philosophy liberally injected into the story as it moves along. It seems both interesting and instructive.
Highlights, Quotes, & Marginalia
“There are some who believe that an immutable law of history holds that conflict is inevitable when a rising power begins to bump up against an established one. But no law is immutable. Choices matter. Lessons can be learned.”
“Prescriptions, after all, are easier to make than predictions.”
“Note taking allows Party and government officials to get quick reads on what went on at meetings they didn’t attend. […] Private meetings with senior government officials without recording devices or note takers are rare and highly sought after.”
“…the so-called iron rice bowl, the cradle-to-grave care and support guaranteed by the government through the big companies people worked for.”
“The Party had made a simple bargain with the people: economic growth in return for political stability. That in turn meant Party control. Prosperity was the source of Party legitimacy.”
“Messages in China are sent in ways that aren’t always direct; you have to read the signs.”
“It was the nature of dealing with China: nothing was done until it was done.”
For decades the study of networks has been divided between the efforts of social scientists and natural scientists, two groups of scholars who often do not see eye to eye. In this review I present an effort to mutually translate the work conducted by scholars from both of these academic fronts, hoping to continue to unify what has become a diverging body of literature. I argue that social and natural scientists fail to see eye to eye because they have diverging academic goals. Social scientists focus on explaining how context-specific social and economic mechanisms drive the structure of networks and on how networks shape social and economic outcomes. By contrast, natural scientists focus primarily on modeling network characteristics that are independent of context, since their focus is to identify universal characteristics of systems instead of context-specific mechanisms. In the following pages I discuss the differences between both of these literatures by summarizing the parallel theories advanced to explain link formation and the applications used by scholars in each field to justify their approach to network science. I conclude by providing an outlook on how these literatures can be further unified.
An exclusive look at data from the controversial web site Sci-Hub reveals that the whole world, both poor and rich, is reading pirated research papers.
Sci-Hub has been in the news quite a bit over the past six months, and the bookmarked article here gives some interesting statistics. I’ll preface some of the following editorial critique with the fact that I love John Bohannon’s work; I’m glad he’s spent the time to do the research he has. Most of the rest of the critique is aimed at the publishing industry itself.
From a journalistic standpoint, I find it disingenuous that the article didn’t actually hyperlink to Sci-Hub. Nor did it link out to (or fully quote) Alicia Wise’s Twitter posts, or link to her rebuttal list of 20 ways to access Elsevier’s content freely or inexpensively. Of course both of these are editorial decisions, and perhaps the rebuttal was so flimsy as to be unworthy of a link from such an esteemed publication anyway.
Sadly, Elsevier’s list of 20 ways to get free or inexpensive access doesn’t really cover graduate students or researchers in poorer countries, who are the likeliest groups using Sci-Hub, unless they fraudulently claim membership in a class to which they don’t belong, and is that morally any better than the original theft? The list is almost assuredly never used by patients, who appear to be covered under one of the options, because the option to do so is painfully undiscoverable behind the typical $30/paper paywalls. The patchwork hodgepodge of free access is not only difficult to discern, but one must keep in mind that this is just one of dozens of publishers a researcher must navigate to find the one thing they’re looking for right now (not to mention the thousands of times they need to do this throughout a year, much less a career).
Consider this experiment, which could be a good follow-up to the article: is it easier to find and download a paper by title/author/DOI via Sci-Hub (a minute) or through any of the other publishers’ platforms, whether with a university subscription (several minutes) or without one (hours to days)? Just consider the time it would take to dig up every one of the 30 references in an average journal article: maybe half an hour via Sci-Hub, versus the days or weeks it would take to first discover, read about, gain access to, and then download them from over 14 providers (and this presumes the others provide some type of “access” like Elsevier).
Those who lived through the Napster revolution in music will recognize that the dead simplicity of its system is primarily what killed the old music business, in contrast to the ecosystem that exists now: easy access through multiple streaming services (Spotify, Pandora, etc.) or inexpensive paid options like iTunes. If the publishing business doesn’t want to be completely killed, it will need to create the iTunes of academia. I suspect publishers have internal bean-counters watching Sci-Hub’s percentage of total downloads (now apparently 5%) and will probably only act once it passes a much larger threshold; I imagine they’re really hoping the number stays stable, which would signal that they needn’t be concerned. They’re far more likely to continue their status quo practices.
Some of this ease-of-access argument is truly borne out by the statistics on open-access papers downloaded via Sci-Hub: it’s simply easier to both find and download them that way than through traditional methods, because there’s one simple pathway for both discovery and download. Surely the publishers, without colluding, could come up with a standardized method or protocol for finding and accessing their material cheaply and easily?
“Hart-Davidson obtained more than 100 years of biology papers the hard way—legally with the help of the publishers. ‘It took an entire year just to get permission,’ says Thomas Padilla, the MSU librarian who did the negotiating.” John Bohannon in Who’s downloading pirated papers? Everyone
Personally, I use relatively advanced tools like LibX, which happens to be offered by my institution and which I feel isn’t very well known, and it still takes me longer to find and download a paper than it would via Sci-Hub. God forbid some enterprising hacker create a LibX community edition for Sci-Hub. Come to think of it, why haven’t any of the dozens of publishers built and supported simple tools like LibX that make their content easy to access? If we consider the analogy of the introduction of machine guns in World War I, why should modern researchers still be using single-load rifles against an enemy with access to nuclear weaponry?
My last thought here comes on the heels of the two tweets from Alicia Wise mentioned, but not shown in the article:
She mentions that the New York Times charges more than Elsevier does for a full subscription. This is tremendously disingenuous, as Elsevier is but one of dozens of publishers one would have to subscribe to for access to the full panoply of material researchers typically need. Further, neither Elsevier nor its competitors make their material as easy to find and access as the New York Times does, nor do they discount access in an attempt to find the price point their users find financially acceptable. Case in point: while I often read the New York Times, I rarely exceed their monthly limit of free articles, so I’ve never needed a paid subscription. Solely because they made me an interesting offer to subscribe for 8 weeks for 99 cents, I took them up on it and renewed that deal for another 8 weeks. Not finding it worth the full $35/month price, I attempted to cancel. I had to cancel the subscription by phone, but why? The NYT customer rep made me no fewer than 5 different offers at ever-decreasing price points–including the 99 cents for 8 weeks I had been getting!!–to try to keep my subscription. Neither Elsevier nor any of its competitors has ever tried (much less tried so hard) to earn my business. (I’ll further posit that this is because it’s easier to fleece at the institutional level through bulk negotiation, a model not too dissimilar to textbook publishers pressuring professors on textbook adoption rather than selling directly to the end consumer–the student–which I’ve written about before.)
(Trigger alert: apophasis to come.) And none of this even touches the quality control that is (or isn’t) put into the journals or papers themselves. Fortunately one needn’t go further than Bohannon’s other writing, like Who’s Afraid of Peer Review? Then there are the hordes of articles on poor research design, misuse of statistical analysis, and inability to replicate experiments. Not to give them any ideas, but lately it seems like Elsevier buying the Enquirer and charging $30 per article might not be a bad business decision. Maybe they just don’t want to play second banana to TMZ?
Interestingly, there’s a survey at the end of the article which indicates some additional sources of academic copyright infringement. I do have to wonder how the data for the survey will be used. There’s always the possibility that logged-in users indicating they’re circumventing copyright are opening themselves up to litigation.
I also found the concept of using the massive data store as a means of applied corpus linguistics for science an entertaining proposition. This type of research could mean great things for science communication in general. I have heard of people attempting to do such meta-analysis to guide the purchase of potential intellectual property for patent trolling as well.
Finally, for those who haven’t done it (ever or recently), it’s certainly well worth the time and energy to attend one or more of the many 30-60 minute sessions most academic libraries offer at the beginning of their academic terms to train library users on research tools and methods. You’ll save yourself a huge amount of time.
Yesterday I ran across this nice little video explaining some recent research on global language networks. It’s not only interesting in its own right, but is a fantastic example of science communication as well.
I’m interested in some of the information theoretic aspects of this as well as the relation of this to the area of corpus linguistics. I’m also curious if one could build worthwhile datasets like this for the ancient world (cross reference some of the sources I touch on in relation to the Dickinson College Commentaries within Latin Pedagogy and the Digital Humanities) to see what influences different language cultures have had on each other. Perhaps the historical record could help to validate some of the predictions made in relation to the future?
The paper “Global distribution and drivers of language extinction risk” indicates that of all the variables tested, economic growth was most strongly linked to language loss.
Finally, I can also only think about how this research may help to temper some of the xenophobic discussion that occurs in American political life with respect to fears relating to Mexican immigration issues as well as the position of China in the world economy.
Those intrigued by the video may find the website set up by the researchers very interesting. It contains links to the full paper as well as visualizations and links to the data used.
Languages vary enormously in global importance because of historical, demographic, political, and technological forces. However, beyond simple measures of population and economic power, there has been no rigorous quantitative way to define the global influence of languages. Here we use the structure of the networks connecting multilingual speakers and translated texts, as expressed in book translations, multiple language editions of Wikipedia, and Twitter, to provide a concept of language importance that goes beyond simple economic or demographic measures. We find that the structure of these three global language networks (GLNs) is centered on English as a global hub and around a handful of intermediate hub languages, which include Spanish, German, French, Russian, Portuguese, and Chinese. We validate the measure of a language’s centrality in the three GLNs by showing that it exhibits a strong correlation with two independent measures of the number of famous people born in the countries associated with that language. These results suggest that the position of a language in the GLN contributes to the visibility of its speakers and the global popularity of the cultural content they produce.
“A language like Dutch — spoken by 27 million people — can be a disproportionately large conduit, compared with a language like Arabic, which has a whopping 530 million native and second-language speakers,” Science reports. “This is because the Dutch are very multilingual and very online.”
It’s the beginning of yet another quarter/semester (or ovester, if you prefer) and a new crop of inquiries has come up around selling back used textbooks and purchasing new textbooks for upcoming classes. I’m not talking about the philosophical discussion of choosing your own textbooks that I’ve mentioned before. I’m considering, in the digital era:
What are the best options for purchasing, renting, or utilizing textbook products in what is a relatively quickly shifting market?
The popular press has a variety of evergreen stories that hit the wire at the beginning of each semester, scratching just the surface of the broader textbook issue or focusing on one tiny upstart company that promises to drastically disrupt the market (yet somehow never does). These articles never delve deeper into the market to give a broader array of ideas and, more importantly, solutions for the students and parents who are spending the bulk of the money that supports the inequalities the market has built.
I aim to facilitate some of this digging and revealing based on years of personal book buying experience as well as having specified textbooks as an instructor in the past.
Before taking a look at the pure economics of the market for purchase, resale, or even rental, one should first figure out one’s preferred reading format. There are obviously many different modes of learning (visual, auditory, experiential, etc.) which some will prefer over others, so try to tailor your “texts” to your preferred learning style as much as possible. For those who prefer auditory learning, be sure to check out alternatives like Audible or the wealth of online video/audio materials that have proliferated in the MOOC revolution. For those who are visual learners or who learn best by reading, do you prefer ebook formats over physical books? There are many studies showing the benefit of one over the other, but some of this comes down to personal preference and how comfortable one is with a particular format.

Most current students weren’t born late enough for electronic texts to have been common enough in their youth to prefer them over physical ones, but with practice and time many will come to prefer electronic texts, particularly since one can highlight, mark up, and more easily search, store, and even carry them. It’s taken me (an avowed paper native) several years, but I now vastly prefer to have books in electronic format for some of the reasons above, in addition to the fact that I can carry a library of 2,500+ books with me almost anywhere I go. I also love being able to almost instantly download anything I don’t currently own but may need or want.
The one caveat I’ll mention, particularly for visual learners (or those with pseudo-photographic or eidetic memory), is to try to keep a two-page reading format on e-reading devices, as long-term memory for reading improves with the ability to place knowledge on the part of the page where one originally encountered it (that is, I remember seeing that particular item on the top left, or middle right, portion of a particular page). Sometimes this isn’t possible due to an e-reader’s formatting capabilities or the readability of the text size (for example, a .pdf file on a Kindle DX would be preferable to the same file on a much smaller smartphone), but for many it can be quite helpful. Personally, I can remember where particular words and grammatical constructs appeared in my 10th grade Latin text many years later, while I would be very unlikely to be able to do this with the presentation of some modern-day e-readers or alternate technologies like rapid serial visual presentation (RSVP).
Purchasing to Keep
Personally, as a student and a bibliophile (read: bibliomaniac), I would typically purchase all of the physical texts for all of my classes. I know this isn’t realistic for everyone, so for the rest I’d recommend purchasing all of the texts (physical or electronic, depending on one’s preference) in one’s main area of study, which one can then keep for the long term and not sell back. This allows one to build a library that will serve as a long-term reference for one’s primary area(s) of study.
Renting vs Short-term Ownership
In general, I’m opposed to renting books or purchasing them for a semester or year and then returning them for a partial refund. It’s rarely a great deal for the end consumer, who ends up losing the greater value of the textbook. Books returned and then resold as used often go for many multiples of their turn-in price the following term, so if it’s a newer or recent edition, it’s probably better to hold on to it for a few months and then sell it at a used price slightly lower than the college bookstore’s going rate.
For tangential texts in classes I know I don’t want to keep for the long term, I’d usually find online versions or borrow (for free) from the local college or public library (many books are available electronically through the library or are borrowable through the library reserve room).
Often college students forget that they’re not stuck with their local institutional library, so I’ll remind everyone to check out their local public libraries as well as other nearby institutional libraries and inter-library loan options, which may offer longer loan periods.
General Economics in the Textbook Market
One of the most important changes in the textbook market that every buyer should be aware of: last year, in Kirtsaeng v. John Wiley & Sons, Inc., the US Supreme Court upheld the right of US-based students to buy copies of textbooks printed in foreign countries (often at hugely cut-rate prices) [see also Ars Technica]. This means that searching online bookstores in India, Indonesia, Pakistan, etc. will often turn up the EXACT same textbooks (usually with slightly different ISBNs and slightly cheaper paper) at HUGE discounts in the 60-95% range.
Example: I recently bought an international edition of Walter Rudin’s Principles of Mathematical Analysis (Amazon $121) for $5 (and it even happened to ship from within the US for $3). Not only was this 96% off of the cover price, but it was 78% off of Amazon’s rental price! How amazing is it to spend almost as much to purchase a book as to ship it to yourself!? I’ll also note that the first edition of this book appeared in 1964 and this very popular third edition dates from 1976, so it isn’t an example of “edition creep”, yet it still carries a tremendous markup relative to other common analysis texts, which list on Amazon for $35-50.
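As a quick sanity check on those percentages (a toy calculation using only the prices quoted above; the rental price is back-computed from the 78% figure, not a number taken from Amazon):

```python
# Sanity check on the quoted discounts for the Rudin example above.
cover = 121.00  # Amazon's quoted price for the third edition
paid = 5.00     # price paid for the international edition

discount_vs_cover = 1 - paid / cover
print(f"{discount_vs_cover:.0%}")  # 96%

# Implied rental price if $5 is 78% off of it: paid = rental * (1 - 0.78).
# This is an estimate back-computed from the article, not a quoted figure.
implied_rental = paid / (1 - 0.78)
print(f"${implied_rental:.2f}")    # roughly $22.73
```

So the 96% figure checks out, and the rental price implied by the 78% figure is in the low twenties, still several times the $5 purchase price.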
For some of the most expensive math/science/engineering texts, one can buy the edition one or two prior to the current one. In these cases, the main text changes very little, if at all, and the primary difference is usually additional problems in the homework sections (which causes small discrepancies in page counts). If necessary, the problem sets can be easily obtained via the reserve room in the library or by briefly borrowing/photocopying problems from classmates who have the current edition. The constant “edition-churning” by publishers is meant to help prop up high textbook prices.
Definition: “Edition Churning” or “Edition Creep”: a common practice of textbook publishers of adding scant new material, if any, to textbooks on a yearly or every-other-year basis, thereby making older editions seem prematurely obsolete and propping up the prices of their textbooks. Professors who blithely adopt the newest edition of a textbook are often unknowingly complicit in propping up prices in these situations.
One may find some usefulness or convenience in traditional bookstores, particularly Barnes & Noble, the last of the freestanding big box retailers. If you’re a member of their affinity program and get an additional discount for ordering books directly through them, then it may not be a horrible idea to do so. Still, they’re paying for a relatively large overhead and it’s likely that you’ll find cheaper prices elsewhere.
Campus bookstores are becoming increasingly lean, and many may begin disappearing over the next decade or so, much the way many traditional bookstores have disappeared in the last decade under increasing online competition. Because many students aren’t the best at price comparison, however, and because of their position in the economic chain, many are managing to hang on quite well. Keep in mind that many campus bookstores have fine-print deals in which they’ll match or beat pricing you find online, so be sure to take advantage of this, particularly when shipping charges would make an equivalent online purchase a few dollars more expensive.
There are fewer and fewer general used bookstores around these days, and even fewer of the textbook-specific stores that traditionally sprouted up next to major campuses. The latter may not be a horrible place to shop, but they’re likely to stock used copies of only the officially adopted texts. General used bookstores, meanwhile, are more likely to specialize in paperbacks and popular used fiction, with a very lean textbook selection, if any.
Naturally, when shopping for textbooks there is a veritable wealth of websites to shop around online, including Amazon, Alibris, Barnes & Noble, AbeBooks, Google Play, Half/eBay, Chegg, Valore, CampusBookRentals, TextBooks.com, and eCampus. But in the Web 2.0 world, we can now use websites with even larger volumes of data and metadata as a clearing-house for our shopping. So instead of shopping and doing price comparisons at dozens of competing sites, why not use a meta-site to do the comparison for us algorithmically and much more quickly?
There are a variety of meta-retailer shopping methods, including several browser plugins and comparison sites (for Chrome, Firefox, etc.: InvisibleHand, PriceBlink, PriceGong) that one can install to provide pricing comparisons, so that, for example, while shopping on Amazon, one will see lower-priced offerings from competitors. However, possibly the best website I’ve come across for cross-site book comparisons is GetTextbooks.com. One can easily search for textbooks (by author, title, ISBN, etc.) and get back a list of retailers with copies, sortable by price (including shipping) as well as by new/used condition and even by rental availability. They even highlight one entry algorithmically to indicate their recommended “best value”.
Similar to GetTextbooks is the webservice SlugBooks, though it doesn’t appear to search as many sites or present as much data.
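Under the hood, what these meta-sites do is essentially aggregate offers and rank them by total landed cost (price plus shipping). A minimal sketch of that comparison logic, with entirely made-up retailer names and prices (none of these figures come from any real site):

```python
# Toy sketch of meta-site textbook price comparison: collect offers for
# one ISBN from several (hypothetical) retailers and rank by total cost.
from dataclasses import dataclass

@dataclass
class Offer:
    retailer: str
    condition: str  # "new", "used", or "rental"
    price: float
    shipping: float

    @property
    def total(self) -> float:
        # The figure that actually matters to the buyer: price + shipping.
        return self.price + self.shipping

# Made-up example data; a real site would scrape or query retailer APIs.
offers = [
    Offer("RetailerA", "new", 89.99, 0.00),
    Offer("RetailerB", "used", 42.50, 3.99),
    Offer("RetailerC", "rental", 24.00, 7.50),
    Offer("RetailerD", "used", 39.00, 4.49),
]

# Sort by total landed cost, the same ordering a comparison site exposes.
for offer in sorted(offers, key=lambda o: o.total):
    print(f"{offer.retailer:10} {offer.condition:7} ${offer.total:.2f}")
```

Note how the ranking by total cost can differ from the ranking by sticker price alone: a low-priced offer with high shipping can lose to a slightly pricier one that ships free, which is exactly why comparing price-plus-shipping across retailers is worth automating.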
When searching for potential textbooks, don’t forget that one can “showroom” the book at one’s local bookstore or even local libraries. This is particularly useful if one is debating whether to take a particular class, or kicking the tires to see if it’s really the best book for them or if they should be looking at other textbooks.
From an economic standpoint, keep in mind there is usually more availability and selection a month or so before the start of classes, as commonly used texts change hands among thousands of students around the world, creating a spot market for used texts at semester and quarter starts. Professors often list their textbooks when class listings for future terms are released, so students surfing for the best deals can very often find used copies mid-semester (or mid-quarter), well before the purchasing rush begins for most titles.
And finally, there is also the black market (also known as outright theft), which is usually spoken of in back-channels either online or in person. Most mainstream articles which reference this portion of the market usually refer tangentially to a grey market in which one student passes along a .pdf or other pirated file to fellow students rather than individual students being enterprising enough to go out hunting for their own files.
Most will have heard of websites like PirateBay, but there are a variety of lesser-known torrent sites, typically hosted in foreign countries beyond the reach of United States copyright enforcement. Increasingly, mega-pirate websites in the vein of the now-defunct Library.nu (previously Gigapedia) or the slowly dying empire of Library Genesis are hiding all over the web and have become quick and easy clearing houses for pirated copies of ebooks, typically in .pdf or .djvu formats, though many are in .epub, .mobi, .azw, or other e-book formats. The typical setup for these sites is one or more illegal file repositories for downloads, with one (or more) primary hubs that don’t necessarily store the pirated materials but instead serve as searchable indexes pointing to the files.
Creative advanced searches for book authors, titles, or ISBNs along with the words .pdf, .djvu, torrent, etc. can often reveal portions of this dark web. Naturally, caveat emptor applies heavily to these types of sites, as files can be corrupted or contain viruses waiting for unwary or unwitting thieves. Many of these sites attempt to extract a small monthly subscription fee or rely heavily on banner advertising to offset the large hosting and traffic fees associated with their maintenance. It is posited that many of them make millions of dollars in profit annually from advertising arrangements, though this is incredibly hard to validate given the nature of these markets and how they operate.
Rather than stoop as low as finding textbooks on the black market this way, students should place pressure on their professors, their departments’ faculty, and their colleges or universities to help smooth out some of the pricing inequities in the system (see below). In the long run, this will help not only them, but many future generations of students who will otherwise be left adrift in the market.
Long Term Solution(s) to Improving the Textbook Market
The primary issue facing the overpriced textbook market is that the end consumers of the textbooks aren’t really in charge of the decision of which textbook to purchase; instead, individual professors or the departments for which they work dictate the textbooks that will be bought. This is why I advocate that students research and decide for themselves which textbook they’re going to use and whether they really need to make the purchase at all. The game-theory dynamics behind this small decision are the massive fulcrum that allows the publishing industry to dictate its own terms. Students (and parents) should, in a sense, unionize and make their voices heard not only to professors, but to departments and even the colleges and universities they attend. If universities took a strong stance on how these markets work, either for or against them and their students, they could create strong market-moving forces to drastically decrease the cost of textbooks.
The other large issue is that market forces aren’t allowed to play out naturally in the college textbook market. Publishers lean on professors and departments to “adopt” overpriced textbooks; departments in turn “require” these texts, and students aren’t questioning enough to use other texts for fear of not succeeding in their courses. If students questioned the system, they’d realize that instead of a $200-300 textbook, they could easily purchase alternate, equivalent, and often even better textbooks for $20-50. To put things into perspective, the time, effort, energy, and production cost for a typical book aren’t drastically different from those of the average textbook, yet we’re not paying $250 for a copy of the average new hardcover on the best-seller list. I wouldn’t go so far as to say that universities, departments, and professors are colluding with publishers, but they’re certainly not helping to make the system better.
I’ve always taken the view that the ‘required’ textbook was really just a ‘suggestion’. (Have you ever known a professor to fail a student for not purchasing the ‘required’ textbook?!)
In past generations, one of the first jobs of a student was to select their own textbook. Reverting back to this paradigm may help to drastically change the economics of the situation. For the interested students, I’ve written a bit about the philosophy and mechanics here: On Choosing Your Own Textbooks.
Basic Econ 101 supply-and-demand theory would suggest that basic textbooks for subjects like calculus, intro physics, or chemistry, used by very large numbers of students, should be not only numerous but also very cheap, while more specialized books like Lie Groups and Lie Algebras or Electromagnetic Theory should be less numerous and more expensive. Unfortunately and remarkably, the most popular calculus textbooks are 2-5 times more expensive than their advanced abstract mathematical brethren, and similarly for introductory physics texts versus EM theory books.
To drastically cut down on these market inequities, when possible, Colleges and Universities should:
Heavily discourage “edition creep” or “edition churning” when there really aren’t major changes to textbooks. In an online and connected society, it’s easy enough to add supplemental errata or small amounts of supplemental material by means of the web.
Quit making institution-specific readers and sub-editions of books for a specific department
If they’re going to make department-level textbook choices, they should shoulder the burden of purchasing all the textbooks in quantity (and take quantity discounts). I’ll note here that students shouldn’t encourage institutions to bundle the price of textbooks into tuition, as that creates a “dark curtain” which allows institutions to take the drastic mark-ups for themselves instead of leaving them to the publishers or passing savings along to students. Cross-reference Benjamin Ginsberg’s article Administrators Ate My Tuition or his much longer text The Fall of the Faculty (Oxford University Press, 2013).
Discourage the use of little-adopted textbooks written by their own faculty. Perhaps a market share of 5-10% or more should be required before a textbook can be adopted department-wide, and, until that point, the professor should compete aggressively to build market share. This might encourage professors to write genuinely original texts instead of producing yet another introductory calculus textbook that no one needs.
Discourage packaged electronic supplemental materials, which
are rarely used by students,
could be supplied online for free as a supplement,
and often double or triple the price of a textbook package.
Strongly encourage professors to supply larger lists of relatively equivalent books and encourage their students to make their purchase choices individually.
Consider barring textbook sales on campus and relying on the larger competitive market to supply textbooks to students.
Calibre: E-book and Document Management Made Simple
As an added bonus, for those with rather large (or rapidly growing) e-book collections, I highly recommend downloading and using the free Calibre Library software. For my 2000+ e-books and documents, this is an indispensable program that is to books as iTunes is to music. I also use it to download dozens of magazines and newspapers on a daily basis for reading on my Kindle. I love that it’s under constant development, with weekly updates for improved functionality. It works on all major OSes and is compatible with almost every e-reader on the planet. Additionally, plug-ins and a myriad of settings allow for extensibility and integration with other e-book software and web services (for example, integration with GoodReads, or the ability to add extra data and metadata to one’s books).
Be sure to read through the commentary on some of these posts for some additional great information.
What other textbook purchasing services and advice can you offer the market?
I invite everyone to include their comments and advice below, as I’m sure I haven’t covered the topic completely, and there are bound to be new players entering the space and increasing competition as time goes by.
I’ve just recently finished the excellent book Why Information Grows by César Hidalgo. I hope to post a reasonable review soon, but the ideas in it are truly excellent and fit into a thesis I’ve been working on for a while. For those interested, he gives a reasonable synopsis of some of his thinking in the talk he gave at the RSA recently; the video can be found below.
The underlying mathematics of what he’s discussing is fantastic (though he doesn’t go into it in his book), but the overarching implications of his ideas for the future of humankind, as a function of our economic system and society, could have some significant impact.
“César visits the RSA to present a new view of the relationship between individual and collective knowledge, linking information theory, economics and biology to explain the deep evolution of social and economic systems.
In a radical rethink of what an economy is, one of WIRED magazine’s 50 People Who Could Change the World, César Hidalgo argues that it is the measure of a nation’s cultural complexity – the nexus of people, ideas and invention – rather than its GDP or per-capita income, that explains the success or failure of its economic performance. To understand the growth of economies, Hidalgo argues, we first need to understand the growth of order itself.”
“What is economic growth? And why, historically, has it occurred in only a few places? Previous efforts to answer these questions have focused on institutions, geography, finances, and psychology. But according to MIT’s antidisciplinarian César Hidalgo, understanding the nature of economic growth demands transcending the social sciences and including the natural sciences of information, networks, and complexity. To understand the growth of economies, Hidalgo argues, we first need to understand the growth of order.
At first glance, the universe seems hostile to order. Thermodynamics dictates that over time, order–or information–will disappear. Whispers vanish in the wind just like the beauty of swirling cigarette smoke collapses into disorderly clouds. But thermodynamics also has loopholes that promote the growth of information in pockets. Our cities are pockets where information grows, but they are not all the same. For every Silicon Valley, Tokyo, and Paris, there are dozens of places with economies that accomplish little more than pulling rocks off the ground. So, why does the US economy outstrip Brazil’s, and Brazil’s that of Chad? Why did the technology corridor along Boston’s Route 128 languish while Silicon Valley blossomed? In each case, the key is how people, firms, and the networks they form make use of information.
Seen from Hidalgo’s vantage, economies become distributed computers, made of networks of people, and the problem of economic development becomes the problem of making these computers more powerful. By uncovering the mechanisms that enable the growth of information in nature and society, Why Information Grows lays bare the origins of physical order and economic growth. Situated at the nexus of information theory, physics, sociology, and economics, this book propounds a new theory of how economies can do, not just more, but more interesting things.”
The Signal and the Noise: Why So Many Predictions Fail, But Some Don't
Business & Economics
Penguin Press HC
September 27, 2012
The founder of FiveThirtyEight.com challenges myths about predictions in subjects ranging from the financial market and weather to sports and politics, profiling the world of prediction to explain how readers can distinguish true signals from hype, in a report that also reveals the sources and societal costs of wrongful predictions.
Started Reading: May 25, 2013 Finished Reading: October 13, 2013
Given the technical nature of what Nate Silver does, and some of the early mentions of the book, I had higher hopes for the technical portions of the book. As usual for a popular text, I was left wanting a lot more. Again, the lack of any math left a lot to be desired. I wish technical writers could get away with even a handful of equations, but wishing just won’t make it so.
The first few chapters were a bit more technical-sounding, but the book eventually devolved into a more journalistic viewpoint on statistics, prediction, and forecasting in general within the areas of economics, political elections, weather forecasting, earthquakes, baseball, poker, chess, and terrorism. I have a feeling he lost a large part of his audience in the first few chapters by discussing the economic meltdown of 2008 first, instead of baseball or poker, before getting into politics and economics.
While the discussion around each of these bigger topics is intrinsically interesting, and there were a few tidbits I hadn’t heard or read about previously, on the whole it wasn’t as novel as I had hoped. I do think it should be required reading for all politicians, however, as I too often get the feeling that none of them think at this level.
There was some reasonably good philosophical discussion of Bayesian statistics versus Fisherian, but it was all too short and could have been fleshed out much more significantly. I still prefer David Applebaum’s historical and philosophical discussion of probability in Probability and Information: An Integrated Approach, though, surprisingly, he doesn’t mention R.A. Fisher directly in his coverage.
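The Bayesian approach Silver favors is simple enough to sketch in a few lines. The numbers below are entirely made up for illustration: a 50% prior that some hypothesis is true, and an assumption that a piece of favorable evidence appears with probability 0.7 if the hypothesis is true and 0.4 if it is false.

```python
# Repeated Bayesian updating: each new piece of evidence revises the prior.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

belief = 0.5                      # the prior, before any evidence
for _ in range(3):                # three favorable observations in a row
    belief = bayes_update(belief, 0.7, 0.4)
print(round(belief, 3))           # belief after three updates
```

A Fisherian analysis would instead ask how surprising the evidence is under a null hypothesis; the Bayesian version makes the prior, and its step-by-step revision, explicit.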
It was interesting to run across additional mentions of power laws in the realms of earthquakes and terrorism after reading Melanie Mitchell’s Complexity: A Guided Tour (review here), but I’ll have to find some texts which describe the mathematics in full detail. There was a surprisingly large amount of discussion skirting the topics within complexity without delving into them in any substantive form.
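For the earthquake case, the power law in question is the Gutenberg–Richter relation, log10(N) = a − b·M, where N is the count of quakes at or above magnitude M. The constants below (a = 8, b = 1) are illustrative placeholders, not values fitted to any real catalog:

```python
# A toy Gutenberg-Richter relation: counts fall off as a power of magnitude.

def expected_count(magnitude, a=8.0, b=1.0):
    """Expected count of earthquakes of at least `magnitude` under the relation."""
    return 10 ** (a - b * magnitude)

for m in range(5, 9):
    print(m, expected_count(m))
```

Each unit step in magnitude divides the expected count by a factor of 10^b; that scale-free fall-off, with no “typical” earthquake size, is exactly the behavior Mitchell describes.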
For those with a pre-existing background in science and especially probability theory, I’d recommend skipping this and simply reading Daniel Kahneman’s book Thinking, Fast and Slow. Kahneman’s work is referenced several times and his book seems less intuitive than some of the material Silver presents here.
This is the kind of text which should be required reading in high school civics classes. Perhaps it might motivate more students to be interested in statistics and science related pursuits as these are almost always at the root of most political and policy related questions at the end of the day.
I’d personally give this three stars, but the broader public should view it as at least four stars if not five, as there is some truly great stuff here. Unfortunately a lot of it is old hat or retreaded material for me.
As an electrical engineer (in the subfields of information theory and molecular biology), I have to say that I’m very intrigued by the articles (1, 2) that Marc Parry has written for the Chronicle in the past few weeks on the subjects of quantitative history, cliometrics/cliodynamics, or what I might term Big History (following the tradition of David Christian; I was initially turned onto it by a Chronicle article). Coincidentally, I have lately been reading Steven Pinker’s book “The Better Angels of Our Nature” as well as Daniel Kahneman’s “Thinking, Fast and Slow”. (I’ll also mention that I’m a general fan of the work of Jared Diamond and Matt Ridley, who touch on these topics as well.)
I’m sure that all of these researchers are onto something in trying to better quantify our historical perspectives by applying science to history. The process might be likened to the way methods of computed tomography, PET, SPECT, et al. have been applied to psychology since the late 1970s to create the field of cognitive neuropsychology, which has now grown much closer to the more concrete areas of neurophysiology within biology, chemistry, and medicine.
I can see both sides of the “controversy” mentioned in the articles (and in their comments), but I have a very visceral gut feeling that the differences can be ironed out over time. I say this as areas like behavioral economics, which grew out of the psychology work covered in Kahneman’s book, become more concrete. The data available for application to history will become much more useful as people’s psychological interactions with their surroundings are better understood. People in general are exceptionally poor at extrapolating statistical knowledge of the world around them and putting it to the best use. For example, although one can make an accurate calculation of the time-value of money, most people who know how won’t use it to determine the best way of taking a large lottery payout (either a lump sum or payments over time), and this doesn’t even take into consideration the phenomenal odds against playing the lottery in the first place. Kahneman’s System 1 and System 2 structures, combined with more historical data and analysis, may be a far better method than either historians’ previous attempts or the quantitative method on its own. Put into mathematical terms, it’s much more likely that human interactions follow smaller local min-max curves on a limited horizon, but do not necessarily follow the global maxima and minima currently being viewed at the larger scales of big history. We’ll need to do a better job of sifting through the data and interpreting it at the correct historical scales for the problem at hand.
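The lottery example can be made concrete with a few lines of arithmetic. Every figure here is invented purely for illustration: a jackpot paid as 20 annual installments of $5M (first payment immediately) versus a $55M lump-sum offer, discounted at an assumed 5% per year:

```python
# Present-value comparison of a lottery annuity against a lump-sum offer.

def present_value(payment, rate, years):
    """Present value of `years` equal payments, the first received now."""
    return sum(payment / (1 + rate) ** t for t in range(years))

pv_annuity = present_value(5_000_000, 0.05, 20)
lump_sum = 55_000_000
print(round(pv_annuity))       # what the payment stream is worth today
print(lump_sum > pv_annuity)   # True would mean the cash-now offer wins
```

System 1 anchors on the big advertised jackpot; the System 2 move is to discount each payment back to today and compare present values before choosing.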
Perhaps, by analogy, we might look at the disconnect between the two camps as the same type of disconnect seen between Newtonian and quantum physics. The two are interlinked somehow, and each does a generally good job of providing accurate viewpoints and predictions within its own sub-area, but they haven’t been put together coherently into one larger theory encompassing both. Without encouraging work in the quantitative areas of history, we’ll certainly be at a great disadvantage.
Proofiness: The Dark Arts of Mathematical Deception
Mathematics, Popular Science
September 23, 2010
The bestselling author of Zero shows how mathematical misinformation pervades, and shapes, our daily lives. According to MSNBC, having a child makes you stupid. You actually lose IQ points. Good Morning America has announced that natural blondes will be extinct within two hundred years. Pundits estimated that there were more than a million demonstrators at a tea party rally in Washington, D.C., even though roughly sixty thousand were there. Numbers have peculiar powers: they can disarm skeptics, befuddle journalists, and hoodwink the public into believing almost anything. “Proofiness,” as Charles Seife explains in this eye-opening book, is the art of using pure mathematics for impure ends, and he reminds readers that bad mathematics has a dark side. It is used to bring down beloved government officials and to appoint undeserving ones (both Democratic and Republican), to convict the innocent and acquit the guilty, to ruin our economy, and to fix the outcomes of future elections. This penetrating look at the intersection of math and society will appeal to readers of Freakonomics and the books of Malcolm Gladwell.
Charles Seife doesn’t prove that mathematics is essential for a democracy, but he certainly shows how the improper use of mathematics can fray a democracy heavily at the edges!
Proofiness was a great book to have read over a long Fourth of July holiday. Though many people may realize some of the broad general concepts in the book, it’s great to have a better structure for talking about concepts like Potemkin numbers, disestimation, fruit packing, cherry picking, apple polishing, comparing apples to oranges, causuistry, randnumbness, regression to the moon, tragedy of the commons, and moral hazard among others. If you didn’t think mathematics was important to daily life or our democratic society, this book will certainly change your mind.
Seife covers everything from polls and voting to politics, economics, marketing, law, and even health to show how numbers are misused in a modern world that can ill afford to ignore what is really going on around us.
This is a fantastic book for nearly everyone in the general public, but I’d especially recommend it for high school students taking civics.