This blog might seem an odd place to find a piece related to the recent U.S. presidential election, were it not for the fact that one of the highlights of the electoral drama was the polling scene and all those wonderful data. Many of you, numbers geeks like me no doubt, probably followed one or more of the excellent poll analyses online as the whole thing unfolded. Prominent among those were the analytical poll aggregators, my favourites being Nate Silver’s FiveThirtyEight, Votamatic, and the Princeton Election Consortium. The sheer audacious accuracy of those folks was a stinging indictment of political punditry, basically handing the talking heads a lesson in Algorithms, Statistics, and Science. For those of you not familiar with what I’m talking about here, the numbers guys basically took the data being gathered by the armies of pollsters out there and algorithmically decided what they meant.

And the pundits weren’t the only ones left lying about on the battlefield. If any of you were following the polls, the swings, disagreements, and discrepancies often left many of us scratching our heads. What on earth was wrong with them?! I certainly do not have an answer, but apparently some of the pollsters do. Frank Newport, Editor-in-Chief of the seriously (but apparently not fatally) wounded Gallup Poll, offered up an explanation, which you can read in its totality here. He makes three main points (as far as I can discern). First, Gallup’s poll does not try to determine the winner of the election. Its actual objective is to assess the popular vote. Okay, but then it is rather pointless in a republic based on an electoral college. No harm done, but I’ll scratch them off my RSS feed in 2016. Second, the political campaign game really has changed, what with the invention of these things called cell phones and social networks. They promise to get caught up before the next election. Third, and this is the one that I really want to address here, Newport backhandedly slams the aggregators as parasites, though he couches it in a somewhat clever reference to the Tragedy of the Commons. Okay, my turn to get analytical.

The Tragedy of the Commons, based on Garrett Hardin’s groundbreaking 1968 essay published in Science, makes a simple argument: in a situation where a resource has multiple rational users, actions that benefit an individual can bring ruin to all. For example, consider a group of herdsmen. It seems reasonable that, when he is able to, a herdsman will add another goat to his herd. The benefits of the additional goat accrue to that herdsman alone, but because all herdsmen share a common pasture, the costs are distributed among all of them. Each herdsman therefore pockets the full benefit of every goat he adds while paying only a fraction of its cost, so adding another goat is always the individually rational move. This is quite a favourable arrangement for each individual herdsman, except for the fact that as herd sizes increase the quality of the pasture declines until it can no longer support any goats. Ruin to all. Newport, who refers to the concept as the Law of the Commons, claims that the statisticians are operating under a reverse tragedy. His argument is that you have these polling companies out there, such as Gallup, doing their business as best they can, but by releasing their results the parasitic statisticians can then aggregate multiple datasets, crunch the numbers, and come up with a better answer than any single pollster could. In other words, and here’s the reverse, each pollster incurs all of its own costs while distributing the benefits to all. Good point, right? No, not so fast.
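Before the rebuttal, a quick detour for the numbers geeks. Hardin’s arithmetic is easy to sketch; the figures below are invented purely for illustration, but they show why adding a goat is always individually rational even as the commons collapses.

```python
# Toy sketch of Hardin's arithmetic (all numbers invented for illustration).
# The herdsman who adds a goat keeps the goat's full benefit, while the
# grazing cost it imposes is split evenly across all N herdsmen who share
# the pasture.

N = 10           # herdsmen sharing the commons
benefit = 1.0    # value of one extra goat to its owner
cost = 1.2       # total grazing cost that goat imposes on the pasture

net_to_owner = benefit - cost / N    # the owner bears only 1/N of the cost
net_to_commons = benefit - cost      # the pasture as a whole loses

print(f"Net gain to the herdsman who adds the goat: {net_to_owner:+.2f}")
print(f"Net change for the commons as a whole:      {net_to_commons:+.2f}")
# Prints +0.88 for the individual and -0.20 collectively: every herdsman
# keeps adding goats until the pasture is ruined.
```

With that arithmetic in mind, here is why Newport’s reverse version doesn’t hold up.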

  1. Gallup is a business, not a non-profit organization. If it were the only polling company, it would still distribute its data and results in order to earn income. Distributing those benefits therefore feeds income straight back into the company; the worry now is that the quality of the product is questionable.
  2. The pollsters’ collective data do form a commons, but for the aggregators only, because they are the only ones partaking of the aggregated data. It is therefore difficult to argue that any harm is being done to Gallup if they are simply dumping grass out there for someone else’s goats.
  3. In fact, if Newport and friends were smart, they would take his analogy seriously and run with it. Treat the data as a commons indeed and partake of the benefits. It is clear that the analytic methods being applied to aggregated data are vastly superior to the individual reports of any particular pollster, as the sketch after this list illustrates. Rather than damaging polling, I think that Silver and others have demonstrated that there actually is a good data signal in there.
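The superiority of aggregation isn’t magic; it is mostly sample-size arithmetic. Here is a minimal sketch, with everything hypothetical (the true support level, the number of polls, and their sample sizes are all invented), using a simple unweighted pooled average, whereas the real aggregators weight polls by house effects, recency, and sample size:

```python
import random
import statistics

# Minimal sketch of why aggregating polls beats any single poll.
# Each simulated poll samples n voters from a population whose true
# support for a candidate is p; the aggregate is the pooled average.

random.seed(42)
p = 0.52        # true support (hypothetical)
n = 800         # respondents per poll (hypothetical)
n_polls = 15    # number of independent polls (hypothetical)

polls = [sum(random.random() < p for _ in range(n)) / n
         for _ in range(n_polls)]
aggregate = statistics.mean(polls)

print(f"Individual polls range: {min(polls):.3f} to {max(polls):.3f}")
print(f"Aggregate estimate:     {aggregate:.3f} (true value {p})")
```

Pooling fifteen polls of 800 respondents behaves, statistically, like one poll of 12,000: the standard error of the aggregate shrinks by roughly the square root of the number of polls, which is why the aggregate lands far closer to the truth than a typical individual poll.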

Therefore, if Gallup and others are serious about providing insight, they would each take advantage of all the data, conduct proper analyses, and produce useful results. Then the reversal of the tragedy would indeed be complete: the actions of the individual bring success to all! Oh, but I forgot something: They weren’t interested in the outcome of the election. Never mind.

p.s. You can read one of our food web-related arguments on the Tragedy here:

Roopnarine, P. D. and K. D. Angielczyk. 2012. The evolutionary palaeoecology of species and the tragedy of the commons. Biology Letters 8:147-150. DOI:10.1098/rsbl.2011.0662
