Jabberwocky Ecology

Exploring MaxEnt-based species-area relationship predictions [Research Summary]

This is a guest post by Dan McGlinn, a weecology postdoc (@DanMcGlinn on Twitter). It is a Research Summary of: McGlinn, D.J., X. Xiao, and E.P. White. 2013. An empirical evaluation of four variants of a universal species–area relationship. PeerJ 1:e212 http://dx.doi.org/10.7717/peerj.212. These posts are intended to help communicate our research to folks who might not have the time, energy, expertise, or inclination to read the full paper, but who are interested in a <1000-word, general-language summary.

It is well established in ecology that if the area of a sample is increased you will, in general, see an increase in the number of species observed. There are a lot of different reasons why larger areas harbor more species: larger areas contain more individuals, habitats, and environmental variation, and they are likely to cross more barriers to dispersal – all things that allow more species to coexist in an area. We typically observe relatively smooth and simple looking increases in species number with area. This observation has mystified ecologists: how can a pattern that should be influenced by many different and biologically idiosyncratic processes appear so similar across scales, taxonomic groups, and ecological systems?

Recently, a theory was proposed (Harte et al. 2008, 2009) which suggests that detailed knowledge of the complex processes that influence the increase in species number may not be necessary to accurately predict the pattern. The theory proposes that ecological systems tend to simply be in their most likely configuration. Specifically, it suggests that if we know the total number of species and individuals in an area, then we can predict the number of species in smaller portions of that area.
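For readers who want to see the mechanics, here is a minimal sketch in Python of the flavor of this kind of prediction. It pairs the theory’s predicted species-abundance distribution (a log-series) with simple random placement of individuals in space; random placement is a simplified stand-in for the theory’s actual spatial model, so this illustrates the downscaling idea rather than reproducing the exact calculations in our paper, and the function names and example numbers are purely illustrative.

# A minimal sketch: downscale species richness from just two inputs,
# the total number of species (S0) and individuals (N0) in a plot.
# The log-series SAD is the MaxEnt prediction for abundances; random
# placement here is a simplified stand-in for the theory's spatial model.
import numpy as np
from scipy.optimize import brentq
from scipy.stats import logser

def logseries_p(S0, N0):
    """Solve for the log-series parameter p giving mean abundance N0 / S0."""
    mean = lambda p: -p / ((1 - p) * np.log(1 - p))
    return brentq(lambda p: mean(p) - N0 / S0, 1e-12, 1 - 1e-12)

def predicted_richness(S0, N0, area_fraction):
    """Expected number of species found in a sub-plot covering
    `area_fraction` of the full plot, with individuals placed at random."""
    p = logseries_p(S0, N0)
    n = np.arange(1, N0 + 1)                # possible species abundances
    phi = logser.pmf(n, p)                  # log-series probabilities
    present = 1 - (1 - area_fraction) ** n  # P(>= 1 individual in sub-plot)
    return S0 * np.sum(phi * present)

# Example: a plot with 50 species and 10,000 individuals; predict the
# expected richness of a quadrat covering one quarter of the plot.
print(predicted_richness(50, 10000, 0.25))

The appeal is exactly what the theory promises: two summary numbers go in, and an entire richness-versus-area curve comes out.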

Published work on this new theory suggests that it has potential for accurately predicting how species number changes with area; however, it has not been appreciated that there are actually four different ways that the theory can be operationalized to make a prediction. We were interested to learn:

  1. Can the theory accurately predict how species number changes with area across many different ecological systems?
  2. Do some versions of the theory consistently perform better than others?

To answer these questions we needed data. We searched online and made requests to our colleagues for datasets that documented the spatial configuration of ecological communities. We were able to pull together a collection of 16 plant community datasets. The communities spanned a wide range of systems including hyper-diverse, old-growth tropical forests, a disturbance-prone tropical forest, temperate oak-hickory and pine forests, a Mediterranean mixed-evergreen forest, a low-diversity oak woodland, and a serpentine grassland.

Fig 1. A) Results from one of the datasets: the open circles display the observed data and the lines are the four different versions of the theory we examined. B) A comparison of the observed and predicted number of species across all areas and communities we examined for one of the versions of the theory.

Across the different communities we found that the theory was generally quite accurate at predicting the number of species (Fig 1 above), and that one of the versions of the theory was typically better than the others in terms of both the accuracy of its predictions and the quantity of information it required to make them. There were a couple of noteworthy exceptions in our results. The low-diversity oak woodland and the serpentine grassland both displayed unusual patterns of change in richness. The species in the serpentine grassland were more spatially clustered than was typical of the other communities, and thus were better described by the versions of the theory that predicted stronger clustering. Abundance in the oak woodland was primarily distributed across two species, whereas the other five species were only observed once or twice. This unusual pattern of abundance resulted in a distinctive S-shaped relationship between the number of species and area, and accurately modeling the pattern required inputting the observed species abundances.

The two key findings from our study were:

  1. The theory provides a practical tool for accurately predicting the number of species in sub-samples of a given site using only information on the total number of species and individuals in that entire area.
  2. The different versions of the theory do make different predictions, and one version appears to be superior.

Of course there are still a lot of interesting questions to address. One question we are interested in is whether we can predict the inputs of the theory (the total number of species and individuals for a community) using a statistical model and then plug those predictions into the theory to generate accurate fine-scale predictions. This kind of approach would be important for conservation because it would allow scientists to estimate the spatial pattern of rarity and diversity in a community without having to sample it directly. We are also interested in future development of the theory to provide predictions for areas that are larger (rather than smaller) than the reference point, which may have greater applicability to conservation work.
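One way to picture the first of those ideas (predicting the theory’s inputs and plugging them in) is sketched below; it reuses the predicted_richness function from the sketch above, and the covariates, toy numbers, and choice of a simple linear regression are purely illustrative assumptions, not an analysis from the paper.

# Hypothetical pipeline: estimate a site's total species and individuals
# from environmental covariates, then downscale with the theory.
# All covariates, numbers, and the model choice are illustrative only.
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy training data: two covariates (say, rainfall and temperature)
# for surveyed sites whose totals are known.
X_train = np.array([[1200.0, 22.0], [800.0, 25.0], [1500.0, 20.0]])
S_train = np.array([45, 30, 60])         # total species at each site
N_train = np.array([9000, 5000, 12000])  # total individuals at each site

S_model = LinearRegression().fit(X_train, S_train)
N_model = LinearRegression().fit(X_train, N_train)

# Predict the totals for an unsurveyed site, then feed them to
# predicted_richness() from the earlier sketch to downscale to 1/4 area.
X_new = np.array([[1000.0, 23.0]])
S0_hat = float(S_model.predict(X_new)[0])
N0_hat = int(round(float(N_model.predict(X_new)[0])))
print(predicted_richness(S0_hat, N0_hat, 0.25))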

The accuracy of the theory also has the potential to help us understand the role of specific biological processes in shaping the relationship between species number and area. Because the theory doesn’t include any explicit biological processes, our findings suggest that specific processes may only influence the observed relationship indirectly, through the total number of species and individuals. Our results do not suggest that biological processes are not shaping the relationship, only that their influence may be rather indirect. This may be welcome news to practitioners who rely on the relationship between species number and area to devise reserve designs and predict the effects of habitat loss on diversity.

If you want to learn more you can read the full paper (it’s open access!) or check out the code underlying the analysis (it’s open source and includes instructions for replicating the analysis!).

References:

Harte, J., A. B. Smith, and D. Storch. 2009. Biodiversity scales from plots to biomes with a universal species-area curve. Ecology Letters 12:789–797.

Harte, J., T. Zillio, E. Conlisk, and A. B. Smith. 2008. Maximum entropy and the state-variable approach to macroecology. Ecology 89:2700–2711.

On the value of fundamental scientific research

Jeremy Fox over at the Oikos Blog has written an excellent piece explaining why fundamental, basic science research is worth investing in, even when time and resources are limited. His central points include:

  • Fundamental research is where a lot of our methodological advances come from.
  • Fundamental research provides generally applicable insights.
  • Current applied research often relies on past fundamental research.
  • Fundamental research often is relevant to the solution of many different problems, but in diffuse and indirect ways.
  • Fundamental research lets us address newly relevant issues.
  • Fundamental research alerts us to relevant questions and possibilities we didn’t recognize as relevant.
  • Fundamental research suggests novel solutions to practical problems.
  • The only way to train fundamental researchers is to fund fundamental research.

I don’t have a lot to add to what Jeremy has already said, except that I strongly agree with the points he has made and think that, in an era where much of ecology has direct applications to things like global change, we need to guard against the temptation to justify all of our research based on its applications.

When I think about the value of fundamental research I always recall a scene from an early season of The West Wing where a politician (SAM) and a scientist (MILLGATE) are discussing how to explain the importance of something akin to the Large Hadron Collider. It loses a little something as a script (courtesy of the Unofficial West Wing Transcript Archive), but nonetheless:

SAM
What is it?

MILLGATE
It’s a machine that reveals the origin of matter… By smashing protons together at very high speeds and at very high temperatures, we can recreate the Big Bang in a laboratory setting, creating the kinds of particles that only existed in the first trillionth of a second after the universe was created.

SAM
Okay, terrific. I understand that. What kind of practical applications does it have?

MILLGATE
None at all.

SAM
You’re not in any way a helpful person.

MILLGATE
Don’t have to be. I have tenure.

SAM
Doctor.

MILLGATE
There are no practical applications, Sam. Anybody who says different is lying.

ENLOW
If only we could only say what benefit this thing has, but no one’s been able to do that.

MILLGATE
That’s because great achievement has no road map. The X-ray’s pretty good. So is penicillin. Neither were discovered with a practical objective in mind. I mean, when the electron was discovered in 1897, it was useless. And now, we have an entire world run by electronics. Haydn and Mozart never studied the classics. They couldn’t. They invented them.

SAM
Discovery.

MILLGATE
What?

SAM
That’s the thing that you were… Discovery is what. That’s what this is used for. It’s for discovery.

The episode is “Dead Irish Writers” and I’d highly recommend watching the whole thing if you want to feel inspired about doing fundamental research.

Sometimes it’s important to ignore the details [Things you should read]

Joan Strassmann has a very nice post about why it is sometimes useful to step back from the intricate details of biological systems in order to understand the general processes that are operating. Here’s a little taste of the general message:

In this talk, Jay said that MacArthur claimed the best ecologists had blurry vision so they could see the big patterns without being overly distracted by the contradictory details. This immediately made a huge amount of sense to me. Biology is so full of special cases, of details that don’t fit theories, that it is easy to despair of advancing with broad, general theories. But we need those theories, for they tell us where to look next, what data to collect, and even what theory to challenge. I am a details person, but love the big theories.

The whole post is definitely worth a read.

Distributed Ecology [Blogrolling]

I’ve been waiting for a while now for Ted Hart’s blog to get up enough steam to send folks over there, and since in the last two weeks he’s had three posts, revamped the mission of the blog, and engaged in the ongoing conversation about Lindenmayer & Likens, it seems like that time has arrived.

The blog is called Distributed Ecology because, as Ted describes,

I chose distributed ecology as a title because I like the idea of ecological thought like distributed computing. Lots of us scientists like little nodes around the web thinking and processing ideas into something great.

Sounds like what I’m hoping to see (and am increasingly witnessing) from the ecology blogs. So, head on over, check it out, click on the RSS button, and welcome Ted to the ecology blogging community.

The war over academic publishing has officially begun

https://twitter.com/#!/ethanwhite/status/94412695587143680

The last week has been an interesting one for academic publishing. First, a 24-year-old programmer named Aaron Swartz was arrested for allegedly breaking into MIT’s network and downloading 5 million articles from JSTOR. Given his background, it has been surmised that he planned on making the documents publicly available. He faces up to 35 years in federal prison.

In response to the arrest, Gregory Maxwell, a “technologist” and hobbyist scientist, uploaded nearly 20,000 JSTOR [1] articles from the Philosophical Transactions of the Royal Society to The Pirate Bay, a BitTorrent file-sharing site infamous for facilitating the illegal sharing of music and movies. As explanation for the upload, Maxwell posted a scathing, and generally trenchant, critique of the current academic publishing system, which I am going to reproduce here in its entirety so that those uncomfortable with [2], or blocked from, visiting The Pirate Bay can read it [3]. In it he notes that since all of the articles he posted were published prior to 1923, they are all in the public domain.

This archive contains 18,592 scientific publications totaling
33GiB, all from Philosophical Transactions of the Royal Society
and which should be  available to everyone at no cost, but most
have previously only been made available at high prices through
paywall gatekeepers like JSTOR.

Limited access to the  documents here is typically sold for $19
USD per article, though some of the older ones are available as
cheaply as $8. Purchasing access to this collection one article
at a time would cost hundreds of thousands of dollars.

Also included is the basic factual metadata allowing you to
locate works by title, author, or publication date, and a
checksum file to allow you to check for corruption.

I've had these files for a long time, but I've been afraid that if I
published them I would be subject to unjust legal harassment by those who
profit from controlling access to these works.

I now feel that I've been making the wrong decision.

On July 19th 2011, Aaron Swartz was criminally charged by the US Attorney
General's office for, effectively, downloading too many academic papers
from JSTOR.

Academic publishing is an odd system - the authors are not paid for their
writing, nor are the peer reviewers (they're just more unpaid academics),
and in some fields even the journal editors are unpaid. Sometimes the
authors must even pay the publishers.

And yet scientific publications are some of the most outrageously
expensive pieces of literature you can buy. In the past, the high access
fees supported the costly mechanical reproduction of niche paper journals,
but online distribution has mostly made this function obsolete.

As far as I can tell, the money paid for access today serves little
significant purpose except to perpetuate dead business models. The
"publish or perish" pressure in academia gives the authors an impossibly
weak negotiating position, and the existing system has enormous inertia.

Those with the most power to change the system--the long-tenured luminary
scholars whose works give legitimacy and prestige to the journals, rather
than the other way around--are the least impacted by its failures. They
are supported by institutions who invisibly provide access to all of the
resources they need. And as the journals depend on them, they may ask
for alterations to the standard contract without risking their career on
the loss of a publication offer. Many don't even realize the extent to
which academic work is inaccessible to the general public, nor do they
realize what sort of work is being done outside universities that would
benefit by it.

Large publishers are now able to purchase the political clout needed
to abuse the narrow commercial scope of copyright protection, extending
it to completely inapplicable areas: slavish reproductions of historic
documents and art, for example, and exploiting the labors of unpaid
scientists. They're even able to make the taxpayers pay for their
attacks on free society by pursuing criminal prosecution (copyright has
classically been a civil matter) and by burdening public institutions
with outrageous subscription fees.

Copyright is a legal fiction representing a narrow compromise: we give
up some of our natural right to exchange information in exchange for
creating an economic incentive to author, so that we may all enjoy more
works. When publishers abuse the system to prop up their existence,
when they misrepresent the extent of copyright coverage, when they use
threats of frivolous litigation to suppress the dissemination of publicly
owned works, they are stealing from everyone else.

Several years ago I came into possession, through rather boring and
lawful means, of a large collection of JSTOR documents.

These particular documents are the historic back archives of the
Philosophical Transactions of the Royal Society - a prestigious scientific
journal with a history extending back to the 1600s.

The portion of the collection included in this archive, ones published
prior to 1923 and therefore obviously in the public domain, total some
18,592 papers and 33 gigabytes of data.

The documents are part of the shared heritage of all mankind,
and are rightfully in the public domain, but they are not available
freely. Instead the articles are available at $19 each--for one month's
viewing, by one person, on one computer. It's a steal. From you.

When I received these documents I had grand plans of uploading them to
Wikipedia's sister site for reference works, Wikisource - where they
could be tightly interlinked with Wikipedia, providing interesting
historical context to the encyclopedia articles. For example, Uranus
was discovered in 1781 by William Herschel; why not take a look at
the paper where he originally disclosed his discovery? (Or one of the
several follow on publications about its satellites, or the dozens of
other papers he authored?)

But I soon found the reality of the situation to be less than appealing:
publishing the documents freely was likely to bring frivolous litigation
from the publishers.

As in many other cases, I could expect them to claim that their slavish
reproduction - scanning the documents - created a new copyright
interest. Or that distributing the documents complete with the trivial
watermarks they added constituted unlawful copying of that mark. They
might even pursue strawman criminal charges claiming that whoever obtained
the files must have violated some kind of anti-hacking laws.

In my discreet inquiry, I was unable to find anyone willing to cover
the potentially unbounded legal costs I risked, even though the only
unlawful action here is the fraudulent misuse of copyright by JSTOR and
the Royal Society to withhold access from the public to that which is
legally and morally everyone's property.

In the meantime, and to great fanfare as part of their 350th anniversary,
the RSOL opened up "free" access to their historic archives - but "free"
only meant "with many odious terms", and access was limited to about
100 articles.

All too often journals, galleries, and museums are becoming not
disseminators of knowledge - as their lofty mission statements
suggest - but censors of knowledge, because censoring is the one thing
they do better than the Internet does. Stewardship and curation are
valuable functions, but their value is negative when there is only one
steward and one curator, whose judgment reigns supreme as the final word
on what everyone else sees and knows. If their recommendations have value
they can be heeded without the coercive abuse of copyright to silence
competition.

The liberal dissemination of knowledge is essential to scientific
inquiry. More than in any other area, the application of restrictive
copyright is inappropriate for academic works: there is no sticky question
of how to pay authors or reviewers, as the publishers are already not
paying them. And unlike 'mere' works of entertainment, liberal access
to scientific work impacts the well-being of all mankind. Our continued
survival may even depend on it.

If I can remove even one dollar of ill-gained income from a poisonous
industry which acts to suppress scientific and historic understanding,
then whatever personal cost I suffer will be justified -- it will be one
less dollar spent in the war against knowledge. One less dollar spent
lobbying for laws that make downloading too many scientific papers
a crime.

I had considered releasing this collection anonymously, but others pointed
out that the obviously overzealous prosecutors of Aaron Swartz would
probably accuse him of it and add it to their growing list of ridiculous
charges. This didn't sit well with my conscience, and I generally believe
that anything worth doing is worth attaching your name to.

I'm interested in hearing about any enjoyable discoveries or even useful
applications which come of this archive.

- ----
Greg Maxwell - July 20th 2011
gmaxwell@gmail.com  Bitcoin: 14csFEJHk3SYbkBmajyJ3ktpsd2TmwDEBb

These stories have been covered widely and the discussion has been heavy on Twitter and in the blogosphere. The important part of this discussion for academic publishing is that it has brought many of the absurdities of the current academic publishing system into the public eye, and a lot of people are shocked and unhappy [4]. This is all happening at the same time that Britain is finally standing up to the big publishing companies as their profits [5] and business models increasingly hamper rather than benefit the scientific process, and serious questions are being raised about whether we should be publishing in peer-reviewed journals at all. I suspect that we will look back on 2011 as the tipping point year when academic publishing changed forever.

————————————————————————————————————————————————————————————

[1] In an interview with Wired Campus, JSTOR claimed that these aren’t technically their articles because even though JSTOR did digitize these files, and each file includes an indication of JSTOR’s involvement, the files lack JSTOR’s cover page, so they’re not really JSTOR’s files, they’re the Royal Society’s files. Which first made me think “Wow, that’s about the lamest duck-and-cover excuse I’ve ever heard” and then “Hey, so if I just delete the cover page off a JSTOR file then apparently they surrender all claim to it. Nice!”

[2] In addition to the questionable legality of the site, some of the advertising there isn’t exactly workplace appropriate.

[3] I think that given the context he would be fine with us reprinting the entire statement. I’ve done some very minor cleaning up of some junk codes for readability. The original is available here.

[4] But also conflicted about the behavior of the individuals in question.

[5] ~$120 million/year for Wiley and ~$1 billion/year for Reed Elsevier (source LibraryJournal.com).

Bridging, not building, divides in ecology [Things you should read]

There is an excellent post over at EEB & Flow on the empirical divide, inspired by an editorial by David Lindenmayer and Gene Likens in the most recent ESA Bulletin, titled “Losing the Culture of Ecology”. It was great to see some thoughtful and data-driven consideration of the idea that we should choose to emphasize one broad area of ecology over another. I really like their conclusion that these “divides” are really driven by other things:

The tensions between “indoor ecology” and field ecology have been conflated with changes in the philosophy of modern ecology, in the difficulties of obtaining funding and publishing as a modern ecologist, and some degree of thinking the “grass is always greener” in the other field. In fact, the empirical divide may not be as wide as is often suggested.

This post motivated some discussion in the comments, and on Twitter:

https://twitter.com/#!/ethanwhite/status/95696496741203969

https://twitter.com/#!/ethanwhite/status/95697531081732097

https://twitter.com/#!/recology_/status/95699437812326400

And a nice follow-up post by Jeremy Fox at the Oikos blog.

It’s all pretty short and well worth the read.

Oikos has a blog? [Blogrolling]

Thanks to an email from Jeremy Fox I just found out that Oikos has started a blog. It clearly isn’t on most folks’ radars (I represent 50% of its Google Reader subscribers), and Jeremy has been putting up some really interesting posts over there, so I thought it was worth a mention. According to Jeremy:

I view the Oikos blog as a place where the Oikos editors can try to do the sort of wonderful armchair ecology that John [Lawton] used to do in his ‘View From the Park’ column. I say ‘try’ because I doubt any of us could live up to John’s high standard (I’m sure I don’t!). I’m going to try to do posts that will be thought-provoking for students in particular. Oikos used to be the place to go with interesting, provocative ideas that were well worth publishing even if they were a bit off the wall or not totally correct. It’s our hope (well, my hope anyway) that this blog will become one way for Oikos to reclaim that niche.

I think they’re doing a pretty good job of accomplishing their goal, so go check out recent posts on the importance of hand waving and synthesizing ecology, and then think about subscribing to keep up on the new provocative things they’re up to.

A GitHub of Science? [Things you should read]

There is an excellent post on open science, prestige economies, and the social web over at Marciovm’s posterous*. For those of you who aren’t insanely nerdy**, GitHub is… well… let’s just call it a very impressive collaborative tool for developing and sharing software***. But don’t worry, you don’t need to spend your days tied to a computer or have any interest in writing your own software to enjoy gems like:

Evangelists for Open Science should focus on promoting new, post-publication prestige metrics that will properly incentivize scientists to focus on the utility of their work, which will allow them to start worrying less about publishing in the right journals.

Thanks to Carl Boettiger for pointing me to the post. It’s definitely worth reading in its entirety.

_______________________________________________________

*A blog I’d never heard of before, but I subscribed to its RSS feed before I’d even finished the entire post.

**As far as biologists go. And, yes, when I say “insanely nerdy” I do mean it as a compliment.

***For those interested in slightly more detail, it’s a social application wrapped around the popular distributed version control system named Git. Kind of like SourceForge on steroids.

Science 2.0 [Things you should read]

I’ve read two great posts in the last couple of days that highlight what the recent debate over the possibility of ‘arsenic-based life’ has shown about how scientists are leveraging the modern web to quickly evaluate, discuss, and improve science.

Marc Cadotte, Nicholas Mirotchnick, and Caroline Tucker have a great post over at EEB & Flow that will fill in the background for you. They use this example as a rallying cry to encourage the use of this new technology to improve the scientific process:

Academics should be the first, not the last, to adopt new communication tools. We are no longer limited by the postal service, email or PDFs; the web has gone 2.0 and we should follow suite. So go forth, young researchers, and blog, edit and share. And then go tweet about it all so your eight year-old kid knows how hip you are.

RealClimate’s piece is more focused on what this recent debate proves about the self-correcting nature of science and thus the inherent lack of vast scientific conspiracies related to things like climate change:

The arseno-DNA episode has displayed this process in full public view. If anything, this incident has demonstrated the credibility of scientists, and should promote public confidence in the scientific establishment.

It ends with a pleasantly sophisticated take* on the complexities of doing science in a world where immediate responses can come from all corners of the web.

They are both definitely worth the read and may well inspire you to run out and join the online dialog.

*In stark contrast to this piece in Nature.

No peer review crisis after all?

We’ve had a bit of discussion here at JE about potential solutions to the tragedy of the reviewer commons, so I was interested to see a recent letter in Nature (warning – it’s behind a paywall) suggesting that there may not actually be a problem after all. The take-home message is:

At the journal Molecular Ecology, we find little evidence for the common belief that the peer-review system is overburdened by the rising tide of submissions.

and the authors base this conclusion on some basic statistics about the number of review requests required to obtain a reviewer and the average number of authors and reviewers for each paper. It’s not exactly the kind of hard, convincing data that will formally answer the question of whether there is a problem, but it’s interesting to hear that at least one journal’s editorial group isn’t particularly concerned about this supposedly impending disaster.

Fighting the snake [Things you should read]

As I’ve mentioned before, I’m not a big fan of the configuration of most comprehensive exams, but my post on the matter keeps languishing on my out-of-control To Do list. So, I was really pleased when a friend of mine passed along something that a student had sent him*. The piece is actually about a portion of thesis defenses, but I think it applies most appropriately to comprehensive exams (just substitute writtens for thesis, and add the fact that the guy who picks the snakes is hard of hearing). Regardless, it is short, hilarious, and just the sort of thing stressed-out students, postdocs, and faculty need to get a little chuckle as they finish up the semester. Go read it.

*Thanks to Joanna Hsu and Peter Adler for passing this along.

Courting controversy & academic ponzi schemes [Things you should read]

Anyone who has been around the halls of academia for a while has heard some well-meaning soul talk about how we produce too many PhD students for the number of faculty positions, that this is unfair, and that therefore we should take fewer students. The most recent version of this idea on the web goes so far as to call the academic enterprise a Ponzi scheme. I’ve never personally found this argument very convincing. No other area of employment has a degree that guarantees its recipients their preferred job, and I think that thinning the pool of potential talent in the sciences before it’s really possible to tell who the important thinkers of the next generation might be is bad for science (and all of the things that benefit from it). I’ve never taken the time to really expand on these thoughts, but thankfully James Keirstead over at Academic Productivity has an interesting post up responding to the ideas in the first link. Go check it out.