# Simulating a Tragedy of the Commons II – Indirect cost

I continue the series on the Tragedy of the Commons, based on my recent paper in the journal Sustainability.

Two users, in a meadow…

A tragedy of the commons (TOC) is initiated when one or more users of a common, and unmanaged, resource increase their use of that resource. Because the resource is a commons, all the benefits of increased use accrue to the increasing user alone, but the cost to the resource is shared by all users. According to Hardin’s argument, other users are then compelled to increase their own use, presumably to maintain their levels of benefit. This argument is generally accepted by TOC studies, whether they are for or against Hardin’s claims about the frequency of TOC or his suggested solutions. If one takes a historical view of any particular TOC, however, then there must have been a point when total utilization by all the users was below both the standing level of the resource and the amount being produced (remember, the resource is renewable!). The question then arises: if one user’s increased utilization does not affect your own benefit, because of the plenitude of the resource, why would you feel compelled to increase your own utilization? There are assuredly multiple, non-exclusive answers, including the ability of humans to forecast situations. In that case, you could perceive a future limitation of your own benefits, or of your potential for growth, and therefore engage in a somewhat competitive escalation of resource use. No matter: the benefits still accrue to yourself only, while the costs are distributed among all the users. This perceived, or indirect, cost can be measured in terms of the developing model (see previous post) as
$c_{i}(t) = \frac{1}{N} \left [ R(t) - R(t+1)\right ]$
where $c_{i}$ is the average cost to each user at time t. Any reduction of the standing resource is a positive cost, while an increase is a negative cost. A stable resource level means that no cost is incurred by any user. I illustrate the situation with the following cartoons:

One user increases his use of the meadow. The grey around the goats represents resource destruction or degradation.

The other user increases his own herd in response, further degrading the resource. Note, however, that there is still plenty of meadow available for herd expansion.
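The indirect cost defined above is easy to compute from any simulated resource series. Here is a minimal Python sketch of the formula (my own translation, not code from the paper; NumPy assumed, names are mine):

```python
import numpy as np

def indirect_cost(R, N):
    """Average per-user cost c_i(t) = (R(t) - R(t+1)) / N.

    R : sequence of resource levels over time
    N : number of users
    Returns an array of length len(R) - 1.
    """
    R = np.asarray(R, dtype=float)
    return (R[:-1] - R[1:]) / N

# A declining resource imposes a positive cost on every user;
# a stable resource imposes none.
costs = indirect_cost([50.0, 48.0, 48.0, 45.0], N=2)
```

Note that the sign convention matches the text: resource loss yields a positive cost, resource growth a negative one.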

The situation changes significantly when total resource utilization reaches a point where individual user benefits cease to grow and actually begin to decline. Then, users are indeed compelled to increase use simply in order to maintain their current benefit. That is the classic TOC, but it leaves wanting the explanation for escalation of use prior to that point. I’ll take this up in the next post.
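The compulsion at that point can be made concrete with the model’s benefit term: since a user’s benefit is $b_{i} = f_{i}u_{i}R$, holding benefit constant while R declines forces $u_{i}$ to grow in inverse proportion to R. A toy Python sketch of my own (the helper name is hypothetical, not from the paper):

```python
def utilization_needed(b_target, f, R):
    """Utilization u required so that f * u * R equals b_target."""
    return b_target / (f * R)

# When the resource halves, the utilization needed to maintain
# the same benefit doubles:
u_now = utilization_needed(b_target=0.05, f=1.0, R=50.0)
u_later = utilization_needed(b_target=0.05, f=1.0, R=25.0)
```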

# Simulating a Tragedy of the Commons I

I introduced a recent paper, in the previous post, on the Tragedy of the Commons. The paper is published in Sustainability, an open access journal, and is therefore available to everyone. One bit that I did not include in the paper, however, is the code that I used for the simulations therein. On the suggestion of a friend, I’ll publish it here in a series of posts. (Thanks, Mauro!)

The simulations are very simple, being based on difference equations, and were all coded in Octave. The code is not pretty, but I have cleaned it up and added some comments. If you have questions, just drop me a comment on the blog. This first simulation is of a basic tragedy involving two users, and corresponds to Figures 1B-C in the paper. A resource is simulated to a steady state with a Ricker model for 100 steps, and then users begin to utilize it. They increase their utilization by a constant proportion per time step, and the simulation is run until the resource is exhausted. The basic outputs (resource level, one user’s benefit, and total utilization) are plotted in the figure here. The equations simulated are outlined in this excerpt from pg. 754 of the paper:
We now introduce a term to represent consumption or utilization by our human TOC agents.
\begin{aligned} R(t+1) &= R(t)e^{r\left ( 1-\frac{R(t)}{K} \right )} - U(t)R(t) \\ &= R(t)\left [ e^{r\left ( 1-\frac{R(t)}{K} \right )} - U(t)\right ] \end{aligned}
where U(t) is the total fraction of R utilized by human users at time t, and ranges from 0 to 1. The term in square brackets on the right hand side of the equation is the modified growth rate of R. The resource is stable when this term is positive or zero, that is, the rate of resource renewal exceeds or is equal to utilization. Since U is the total standardized utilization rate of all users, it may be expanded to
$0 \leq U(t) = \sum_{i=1}^{N}u_{i}(t) \leq 1$
where there are N users and $u_{i}(t)$ is the standardized utilization rate of the $i^{th}$ user at time t. The benefit gained from utilization is modelled as
\begin{aligned} b_{i}(t) &= f_{i}u_{i}(t)R(t) \\ B(t) &= R(t)\sum_{i=1}^{N}f_{i}u_{i}(t) \end{aligned}
where $b_{i}$ is the benefit to user i, f is a factor that converts resources utilized into another commodity, for example converting harvested food to energy or currency, and B is the total benefit to all users.
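For readers who prefer to experiment outside Octave, here is a minimal Python sketch of a single time step of these equations (my own port; function and variable names are assumptions, not from the paper):

```python
import numpy as np

def step(R, u, f, r=1.5, K=50.0):
    """Advance the resource one time step under utilization.

    R : current resource level R(t)
    u : array of standardized utilization rates u_i(t), with sum <= 1
    f : array of conversion factors f_i
    Returns (R(t+1), b) where b[i] = f_i * u_i(t) * R(t).
    """
    u = np.asarray(u, dtype=float)
    f = np.asarray(f, dtype=float)
    U = u.sum()                                   # total utilization U(t)
    b = f * u * R                                 # individual benefits b_i(t)
    R_next = R * (np.exp(r * (1.0 - R / K)) - U)  # modified Ricker model
    return R_next, b

# Two identical users, starting from the steady state R = K = 50:
R_next, b = step(50.0, u=[0.001, 0.001], f=[1.0, 1.0])
```

Iterating this function with growing u reproduces the qualitative behaviour of the Octave simulation below.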

The code follows, and can be executed by running it in an interactive Octave session. Please be aware that WordPress has somewhat limited options for formatting code. I’ve used Matlab formatting here, but wrapped lines might not be obvious. Single lines are all terminated with a semicolon, “;”, so just look out for those.

#RESOURCE DATA
#initial resource level
R0 = 1;
R = zeros(1,2);
R(1, 1) = 1;
R(1, 2) = 1;
#carrying capacity
K = 50;
#intrinsic growth rate r
r = 1.5;

#USER DATA
#There are 2 users
#initial acceleration of utilization
du0 = 1.05;
#individual user parameters, users 1 and 2
#initialize arrays
u1 = zeros(1,2);
u2 = zeros(1,2);
b1 = zeros(1,2);
b2 = zeros(1,2);
u1(1,1) = 1;
#initial utilization rate
u1(1,2) = .001;
#conversion factor of resource to benefit
f1 = 1;
b1(1,1) = 1;
b1(1,2) = 0;
u2(1,1) = 1;
#initial utilization rate
u2(1,2) = .001;
#conversion factor of resource to benefit
f2 = 1;
b2(1,1) = 1;
b2(1,2) = 0;
U = zeros(1,2);
U(1,1) = 1;
#total utilization
U(1,2) = u1(1,2) + u2(1,2);
du = zeros(1,2);
du(1,1) = 0;
du(1,2) = du0;

#BEGIN SIMULATION

#RUN RESOURCE TO STEADY STATE, NO UTILIZATION
for count2 = 2:100
R(count2, 1) = count2;
R(count2, 2) = R(count2-1, 2) * (exp(r*(1-(R(count2-1, 2)/K))));
u1(count2, 1) = count2;
u1(count2, 2) = u1(1,2);
u2(count2, 1) = count2;
u2(count2, 2) = u2(1,2);
du(count2,1) = count2;
du(count2,2) = du0;
endfor

#INITIATE UTILIZATION
for count1 = 101:265
R(count1, 1) = count1;
#modified Ricker model with utilization
R(count1, 2) = R(count1-1, 2) * (exp(r*(1-(R(count1-1, 2)/K))) - (u1(count1-1,2)+u2(count1-1,2)));
u1(count1, 1) = count1;
#increase utilization
u1(count1, 2) = u1(count1-1, 2) * du(count1-1,2);
#benefits (equal for both users in this case)
b1(count1, 1) = count1;
b1(count1, 2) = f1 * u1(count1-1, 2) * R(count1-1, 2);
b2(count1, 1) = count1;
b2(count1, 2) = f2 * u2(count1-1, 2) * R(count1-1, 2);
u2(count1, 1) = count1;
u2(count1, 2) = u2(count1-1, 2) * du(count1-1,2);
#total utilization
U(count1, 1) = count1;
U(count1, 2) = u1(count1, 2) + u2(count1, 2);
du(count1,1) = count1;
du(count1,2) = du(count1-1,2);
endfor


# Ecology and the Tragedy of the Commons

Well, it’s been quite some time since the last post, but I’ve been busy! This post is just a short notice of a new paper, just published today. The paper is part of a special issue on the Tragedy of the Commons in the journal Sustainability. My paper takes a comparative look at the Tragedy in ecological communities and human societies, and the potential of human mutualisms for avoiding tragedies. The situation is not a very hopeful one, however, given our ever-growing human population. Hardin did note this in his original essay. Finally, this paper was inspired by an earlier paper by myself and Ken Angielczyk.

Here’s a link to the paper, as well as the abstract.

Roopnarine, P. Ecology and the Tragedy of the Commons. Sustainability 2013, 5, 749-773.

Abstract

This paper develops mathematical models of the tragedy of the commons analogous to ecological models of resource consumption. Tragedies differ fundamentally from predator–prey relationships in nature because human consumers of a resource are rarely controlled solely by that resource. Tragedies do occur, however, at the level of the ecosystem, where multiple species interactions are involved. Human resource systems are converging rapidly toward ecosystem-type systems as the number of exploited resources increases, raising the probability of system-wide tragedies in the human world. Nevertheless, common interests exclusive of exploited commons provide feasible options for avoiding tragedy in a converged world.

# Of elections, polls and tragedies

This blog might seem an odd place to find a piece related to the recent U.S. presidential election, were it not for the fact that one of the highlights of the electoral drama was the polling scene and all those wonderful data. Many of you, numbers geeks like me no doubt, probably followed one or more of the excellent poll analyses online as the whole thing unfolded. Prominent among those were the analytical poll aggregators, my favourites being Nate Silver’s FiveThirtyEight, Votamatic, and the Princeton Election Consortium. The sheer audacious accuracy of those folks was a stinging indictment of political punditry, with the talking heads roundly beaten by Algorithms, Statistics, and Science. For those of you not familiar with what I’m talking about here, the numbers guys basically took the data being gathered by the armies of pollsters out there and algorithmically decided what they meant.

And the pundits weren’t the only ones left lying about on the battlefield. If any of you were following the polls, the swings, disagreements, and discrepancies left many of us scratching our heads at times. What on earth was wrong with them?! I certainly do not have an answer, but apparently some of the pollsters do. Frank Newport, Editor-in-Chief of the seriously (but apparently not fatally) wounded Gallup Poll, offered up an explanation, which you can read in its totality here. He makes three main points (as far as I can discern). First, Gallup’s poll does not try to determine the winner of the election; its actual objective is to assess the popular vote. Okay, but then it is rather pointless in a republic based on an electoral college. No harm done, but I’ll scratch them off my RSS feed in 2016. Second, the political campaign game really has changed, what with the invention of these things called cell phones and social networks. They promise to get caught up before the next election. Third, and this is the one that I really want to address here, Newport backhandedly slams the aggregators as parasites, but he couches it in a somewhat clever reference to the Tragedy of the Commons. Okay, my turn to get analytical.

The Tragedy of the Commons, based on Garrett Hardin’s groundbreaking 1968 essay published in Science, makes a simple argument: in a situation where a resource has multiple rational users, actions that benefit an individual can bring ruin to all. For example, consider a group of herdsmen. It seems reasonable that when he is able to, a herdsman will add another goat to his flock. The benefits of the additional goat accrue to that herdsman only, but because all herdsmen share a common pasture, the costs are distributed to all. This is quite a favourable arrangement for all the herdsmen, except for the fact that as herd sizes increase, the quality of the pasture declines until it can no longer support any goats. Ruin to all. Newport, who refers to the concept as the Law of the Commons, claims that the statisticians are operating under a reverse tragedy. His argument is that you have these polling companies out there, such as Gallup, who are doing their business as best as they can, but by releasing their results the parasitic statisticians can then aggregate multiple datasets, crunch the numbers, and come up with a better answer than any single pollster could. In other words, and here’s the reverse, each pollster incurs all its own costs while distributing the benefits to all. Good point, right? No, not so fast, and here’s why.

1. Gallup is a business, not a non-profit organization. If they were the only polling company, they would still distribute their data/results in order to earn income. Their distribution of benefits therefore feeds back into the company, but the worry now is that the quality of the product is questionable.
2. The collective data of the pollsters is a commons, but for the aggregators only, because they are the only ones partaking of the aggregated data. It is therefore difficult to argue that any harm is being done to Gallup if they are simply dumping grass out there for someone else’s goats.
3. In fact, if Newport and friends were smart, they would take his analogy seriously and run with it. Treat the data as a commons indeed and partake of the benefits. It is clear that the analytic methods being applied to aggregated data are vastly superior to the singular reports of any particular pollster. Rather than damaging polling, I think that Silver and others have demonstrated that there actually is a good data signal in there.

Therefore, if Gallup and others are serious about providing insight, they would each take advantage of all the data, conduct proper analyses, and produce useful results. Then the reversal of the tragedy would indeed be complete: the actions of the individual bring success to all! Oh, but I forgot something: They weren’t interested in the outcome of the election. Never mind.

p.s. You can read one of our food web-related arguments on the Tragedy here:

Roopnarine, P. D. and K. D. Angielczyk. 2012. The evolutionary palaeoecology of species and the tragedy of the commons. Biology Letters 8:147-150. DOI:10.1098/rsbl.2011.0662

# One more time: Why model?

A wonderful statement by Jay Melosh in a recent interview with Physics Today. It really underscores the approach of much of the research presented in this blog; just replace “physics” with “biology” or “ecology”. And while Melosh is a master of planetary geology and hence the history of the Solar System, we’re dealing here with evolution and the histories of species and ecosystems. Okay, here’s the quote:

Computer modeling, of course, plays a big role in evaluating the consequences of different hypotheses, which we then compare to observations. While the physics of an individual process may be simple, Nature is messy and computers are one of our principal tools for combining simple processes into the complex fabrics necessary to mimic observations and thus either validate or refute different hypotheses about what we see.

# PNAS: Late Cretaceous restructuring of terrestrial communities facilitated the End-Cretaceous mass extinction in North America

That’s the title of our new paper, hot off the PNAS press. This study was a lot of fun, because it combines my food web work with one of the best known events in the fossil record. The lead author is Jonathan Mitchell, a graduate student at the University of Chicago. Jon became familiar with the food web work via Ken Angielczyk at the Field Museum, also in Chicago, who is a former post-doctoral researcher in my lab and a close collaborator. Jon wondered what Late Cretaceous, dinosaur-bearing communities would look like when subjected to CEG perturbations (just search this blog for info on CEG!), and presented his results two years ago at the Annual Meeting of the Geological Society of America. The results were so intriguing that we decided then to explore the question in much greater detail, and ask what sorts of community and ecosystem changes unfolded in the years before the Chicxulub impact, and what role they might have played in the subsequent extinctions. And here are the results! I will list the full reference below, and you can obtain a complete copy of the paper from PNAS (sorry, not open access). Also, here are links to some news websites that have covered the paper, as well as the paper’s abstract. Enjoy!

Jonathan S. Mitchell, Peter D. Roopnarine, and Kenneth D. Angielczyk. Late Cretaceous restructuring of terrestrial communities facilitated the End-Cretaceous mass extinction in North America. PNAS, October 29, 2012

ABSTRACT

The sudden environmental catastrophe in the wake of the end-Cretaceous asteroid impact had drastic effects that rippled through animal communities. To explore how these effects may have been exacerbated by prior ecological changes, we used a food-web model to simulate the effects of primary productivity disruptions, such as those predicted to result from an asteroid impact, on ten Campanian and seven Maastrichtian terrestrial localities in North America. Our analysis documents that a shift in trophic structure between Campanian and Maastrichtian communities in North America led Maastrichtian communities to experience more secondary extinction at lower levels of primary production shutdown and possess a lower collapse threshold than Campanian communities. Of particular note is the fact that changes in dinosaur richness had a negative impact on the robustness of Maastrichtian ecosystems against environmental perturbations. Therefore, earlier ecological restructuring may have exacerbated the impact and severity of the end-Cretaceous extinction, at least in North America.


# Coral reef data now available on Dryad

In addition to the publication (see previous post), data for the Greater Antillean coral reef food webs are now available on Dryad.

From Dryad: “Dryad is an international repository of data underlying peer-reviewed articles in the basic and applied biosciences. Dryad enables scientists to validate published findings, explore new analysis methodologies, repurpose data for research questions unanticipated by the original authors, and perform synthetic studies. Dryad is governed by a consortium of journals that collaboratively promote data archiving and ensure the sustainability of the repository.”

# Coral reef food webs are out!

The first paper dealing with our Caribbean coral reef work is finally out. This paper is really just a detailed account of the data and webs compilation, but the data are now available to all. Enjoy!

Roopnarine, Peter D. and Rachel Hertog. 2013. Detailed Food Web Networks of Three Greater Antillean Coral Reef Systems: The Cayman Islands, Cuba, and Jamaica. Dataset Papers in Ecology, Vol. 2013, Article ID 857470, 9 pages.

Abstract: Food webs represent one of the most complex aspects of community biotic interactions. Complex food webs are represented as networks of interspecific interactions, where nodes represent species or groups of species, and links are predator-prey interactions. This paper presents reconstructions of coral reef food webs in three Greater Antillean regions of the Caribbean: the Cayman Islands, Cuba, and Jamaica. Though not taxonomically comprehensive, each food web nevertheless comprises producers and consumers, single-celled and multicellular organisms, and species foraging on reefs and adjacent seagrass beds. Species are grouped into trophic guilds if their prey and predator links are indistinguishable. The data list guilds, taxonomic composition, prey guilds/species, and predators. Primary producer and invertebrate richness are regionally uniform, but vertebrate richness varies on the basis of more detailed occurrence data. Each region comprises 169 primary producers, 513 protistan and invertebrate consumer species, and 159, 178, and 170 vertebrate species in the Cayman Islands, Cuba, and Jamaica, respectively. Caribbean coral reefs are among the world’s most endangered by anthropogenic activities. The datasets presented here will facilitate comparisons of historical and regional variation, the assessment of impacts of species loss and invasion, and the application of food webs to ecosystem analyses.

# Of cusps and folds

I am currently working on an essay (overdue!) for which I created this figure. It’s a cusp manifold and a very useful heuristic device for demonstrating some of the concepts and applications of Catastrophe Theory. Following is an excerpt from the current draft of the essay where the figure is introduced. And for those of you who are interested, following the excerpt is a very brief explanation and code for plotting the manifold.

Excerpt: Much of our theoretical understanding of tipping points is captured by Catastrophe Theory, a deep and somewhat ominously named mathematical theory. “Catastrophe” as used in the theory is generally understood to imply a dramatic change of state, with no necessary judgement as to whether the change is for the better or worse. Though mathematically complicated, the theory provides us with a very useful heuristic device, the catastrophe manifold, which can be used to visualize the manner in which a system will respond to external forces or controls. The manifold for a system controlled by two parameters is shown in Figure 1. The surface in the figure, known as a cusp catastrophe, illustrates the behaviour of a system, controlled by two factors, that is capable of a catastrophic state shift. For our case, the system is the global biosphere and the controlling factors are population size and resource consumption. The height of the surface is the condition of the biosphere’s state, with greater height corresponding to a healthier biosphere. It is easy to see that height, and hence biosphere condition, decreases as either population size or resource consumption increase.

The manifold was plotted using Mathematica, with code adapted from the notebook available here. The catastrophe is an unfolding of the singularity for the function
$f(x) = x^{4}$
The controlling equation is
$f(x,a,b) = x^{4} + ax^{2} + bx$
So the planar axes of the figure are the parameters a and b, and the vertical axis is x. The surface of the manifold consists of the equilibrium points of the function, i.e. the points at which the first derivative vanishes:
$4x^{3} + 2ax + b = 0$
Those points are the real roots of the above equation. The Mathematica code is
F[x_, u_, v_] := x^4 + u*x^2 + v*x
y = ContourPlot3D[
Evaluate[D[F[x, u, v], x]], {u, -2.5, 3}, {v, -2.5, 3}, {x, -1.4,
1.4}, ViewPoint -> {-1.25, 1.6, 1.2},
Axes -> False, Boxed -> False,
ContourStyle -> Directive[Red, Yellow, Opacity[0.5]], Mesh -> 0,
Contours -> 1, MaxRecursion -> 15, PlotPoints -> 50]
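For those without Mathematica, the same sheets can be traced numerically: at each point of the (a, b) plane the manifold consists of the real roots of the cubic obtained by differentiating $f(x,a,b)$, namely $4x^{3}+2ax+b=0$. A rough NumPy sketch of my own (not from the notebook linked above):

```python
import numpy as np

def cusp_sheet(a, b):
    """Real equilibria x of f(x) = x**4 + a*x**2 + b*x,
    i.e. the real roots of f'(x) = 4*x**3 + 2*a*x + b = 0."""
    roots = np.roots([4.0, 0.0, 2.0 * a, b])
    return sorted(r.real for r in roots if abs(r.imag) < 1e-7)

# Ahead of the cusp point there is a single sheet; behind it
# (a sufficiently negative) the surface folds into three sheets,
# producing the bistability and sudden jumps of the catastrophe.
single = cusp_sheet(1.0, 0.5)
folded = cusp_sheet(-2.0, 0.0)
```

Sampling `cusp_sheet` over a grid of (a, b) values and plotting the collected (a, b, x) triples reproduces the folded surface of Figure 1.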

# Extinction is not an Evolutionary imperative

Therefore, please stop using phrases such as “Extinction is the fate of all species”. True, maybe 99.99% of all species that have ever existed are now extinct, but that is a frequency, not a law. Extinction results from a failure to acclimate on a short timescale, or to adapt on a longer, evolutionary timescale. The prevalence of extinction over the long run is the result of those failures, and it is neither a requirement nor a certainty.