Jabberwocky Ecology

Sometimes it’s important to ignore the details [Things you should read]

Joan Strassmann has a very nice post about why it is sometimes useful to step back from the intricate details of biological systems in order to understand the general processes that are operating. Here’s a little taste of the general message:

In this talk, Jay said that MacArthur claimed the best ecologists had blurry vision so they could see the big patterns without being overly distracted by the contradictory details. This immediately made a huge amount of sense to me. Biology is so full of special cases, of details that don’t fit theories, that it is easy to despair of advancing with broad, general theories. But we need those theories, for they tell us where to look next, what data to collect, and even what theory to challenge. I am a details person, but love the big theories.

The whole post is definitely worth a read.

Why I will no longer review for your journal

I have, for a while, been frustrated and annoyed by the behavior of several of the large for-profit publishers. I understand that their motivations are different from my own, but I’ve always felt that an industry that relies entirely on both large amounts of federal funding (to pay scientists to do the research and write up the results) and a massive volunteer effort to conduct peer review (the scientists again) needed to strike a balance between the needs of the folks doing all of the work and the corporations’ need to maximize profits.

Despite my concerns about the impacts of increasingly closed journals, with increasingly high costs, on the dissemination of research and the ability of universities to support their core missions of teaching and research, I have continued to volunteer my time and effort as a reviewer to Elsevier and Wiley-Blackwell. I did this because these journals have continued to make valuable contributions, and I felt that this, combined with the contribution I was making to science by helping improve the work published in high-profile places, made supporting them worthwhile. I no longer believe this to be the case, and from now on I will not review for any journal that is published by Elsevier, Springer, or Wiley-Blackwell (including society journals that publish through them).

Why have I changed my mind? Because these companies are pursuing and supporting the Research Works Act. This act seeks to prevent funding agencies from requiring that the results of research they funded be made publicly available. In other words, it seeks to prevent the government (and the taxpayers that fund it), which pays for a very large fraction of the cost of any given paper through both funding the research and paying the salaries of reviewers and editors, from having any say in how that research is disseminated. I think that Mike Taylor in the Guardian said most clearly how I feel about this attempt to exert legislative control requiring us to support corporate profits over the dissemination of scientific research:

Academic publishers have become the enemies of science

This is the moment academic publishers gave up all pretence of being on the side of scientists. Their rhetoric has traditionally been of partnering with scientists, but the truth is that for some time now scientific publishers have been anti-science and anti-publication. The Research Works Act, introduced in the US Congress on 16 December, amounts to a declaration of war by the publishers.

You should read the entire article. It’s powerful. There are lots of other great articles about the RWA, including Michael Eisen in the New York Times, a nice post by INNGE, and an interesting piece by Paul Krugman (via oikosjeremy). I’m also late to the party in declaring my peer review strike and less eloquent than many of my peers in explaining why (see great posts by Michael Taylor, Gavin Simpson, and Timothy Gowers). But I’m here now and I’m letting you know so that you can consider whether or not you also want to stop volunteering for companies that don’t have science’s best interests in mind.

If you’d like to read up on the publishers’ side of this argument (they have costs, they have a right to recoup them) you can see Springer’s official position or an Elsevier exec’s exchange with Michael Eisen. My problem with all of these arguments is that there is nothing in any funding agency’s policy that requires publishers to publish work funded by that agency. This is not (as Springer has argued) an “unfunded mandate”; this is a stakeholder that has certain requirements related to the publication of research in which it has an interest. This is just like an author (in any non-academic publishing situation) negotiating with a publisher. If the publisher doesn’t like the terms that the author demands, then they don’t have to publish the book. Likewise, if a publisher doesn’t like the NIH policy then they should simply not agree to publish NIH-funded research.

To be clear, I am not as extreme in my position as some. I still support and will review for independent society journals like Ecology and American Naturalist even though they aren’t Open Access and even though ESA has made some absurd comments in support of the same ideas that are in RWA. The important thing for me is that these journals have the best interests of science in mind, even if they are often frustratingly behind the times in how they think and operate.

And don’t worry, I’ve still got plenty of journal related work to keep me busy, thanks to my new position on the editorial board at PLoS ONE.

UPDATE: The links to the INNGE and Timothy Gowers post have now been fixed, and here are links to a couple of great posts by Casey Bergman that I somehow left out: one on how to turn down reviews while making a point and one on the not so positive response he received to one of these emails.

UPDATE 2: A great collection of posts on RWA. There are a lot of really unhappy scientists out there.

UPDATE 3: A formal Boycott of Elsevier. Almost 1000 scientists have signed on so far.

UPDATE 4: Wiley-Blackwell has now distanced itself from RWA and said that “We do not believe that legislative initiatives are the best way forward at this time and so have no plans to endorse RWA. Instead we believe that research funder-publisher partnerships will be more productive.” In addition, it was announced that a bill that would do the opposite of RWA has now been introduced. Hooray for collective action!

Am I teaching well given the available research on teaching?

Figuring out how to teach well as a professor at a research university is largely a self-study affair. For me the keys to productive self-study are good information and self-reflection. Without good information you’re not learning the right things, and without self-reflection you don’t know if you are actually succeeding at implementing what you’ve learned. There have been some nice posts recently on information and self-reflection about how we teach over at Oikos (based, indirectly, on a great piece on NPR) and Sociobiology (and a second piece) that are definitely worth a read. As part of a course I’m taking on how to teach programming, I’m doing some reading about research on the best approaches to teaching and self-reflection on my own approaches in the classroom.

One of the things we’ve been reading is a great report by the US Department of Education’s Institute of Education Sciences on Organizing Instruction and Study to Improve Student Learning. The report synthesizes existing research on what to do in the classroom to facilitate meaningful long-term learning, and distills this information into seven recommendations and information on how strongly each recommendation is supported by available research.

Recommendations

  1. Space learning over time. Arrange to review key elements of course content after a delay of several weeks to several months after initial presentation. (moderate)
  2. Interleave worked example solutions with problem-solving exercises. Have students alternate between reading already worked solutions and trying to solve problems on their own. (moderate)
  3. Combine graphics with verbal descriptions. Combine graphical presentations (e.g., graphs, figures) that illustrate key processes and procedures with verbal descriptions. (moderate)
  4. Connect and integrate abstract and concrete representations of concepts. Connect and integrate abstract representations of a concept with concrete representations of the same concept. (moderate)
  5. Use quizzing to promote learning.
    1. Use pre-questions to introduce a new topic. (minimal)
    2. Use quizzes to re-expose students to key content (strong)
  6. Help students allocate study time efficiently.
    1. Teach students how to use delayed judgments of learning to identify content that needs further study. (minimal)
    2. Use tests and quizzes to identify content that needs to be learned (minimal)
  7. Ask deep explanatory questions. Use instructional prompts that encourage students to pose and answer “deep-level” questions on course material. These questions enable students to respond with explanations and supports deep understanding of taught material. (strong)

(Quoted directly from the original report via a Software Carpentry blog post)

This is a nice summary, but it’s definitely worth reading the whole report to explore the depth of the thought process and learn more about specific ideas for how to implement these recommendations.

How am I doing?

Recently I’ve been teaching two courses on programming and database management for biologists. Because I’m not a big believer in classroom lecture for this type of material, a typical day in one of these courses involves: 1) either reading up on the material in a textbook or viewing a Software Carpentry lecture before coming to class; 2) a brief 5-10 minute period of either re-presenting complex material or answering questions about the reading/viewing; and 3) 45 minutes of working on exercises (during which time I’m typically bouncing from student to student helping them figure out things that they don’t understand). So, how am I doing with respect to the above recommendations?

1. Space learning over time. I’m doing OK here, but not as well as I’d like. The nice thing about teaching introductory programming concepts is that they naturally build on one another. If we learned about if-then statements two weeks ago then I’m going to use them in the exercises about loops that we’re learning about this week. I also have my advanced class use version control throughout the semester for retrieving data and turning in exercises to force them to become very comfortable with the work-flow. However, I haven’t done a very good job of bringing concepts back, on their own, later in the semester. The exercise based approach to the course is perfect for this, I just need to write more problems and insert them into the problem-sets a few weeks after we cover the original material.
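To make that concrete, here is the kind of exercise I have in mind (the data and threshold are invented for illustration): a loop problem that quietly forces students to reuse the if-then material from a few weeks earlier.

```python
# A loop exercise that deliberately reuses the if-then concept
# from earlier in the semester: classify trap-capture counts.
captures = [0, 3, 1, 0, 7, 2]  # hypothetical captures per night

busy_nights = 0
for count in captures:
    if count > 2:  # the if-then material from two weeks ago
        busy_nights += 1

print(busy_nights)  # prints 2
```

Students get practice with the new concept (loops) while the old one (conditionals) comes back for a second exposure without being the announced topic of the day.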

2. Interleave worked example solutions with problem-solving exercises. I think I’m doing a pretty good job here. Students see worked examples for each concept in either a textbook or video lecture (viewed outside of class), and if I think they need more for a particular concept we’ll walk through a problem at the beginning of class. I often use the Online Python Tutor for this purpose, which provides a really nice presentation of what is going on in the program. We then spend most of the class period working on problem-solving exercises. Since my classes meet three days a week, I think this leads to a pretty decent interleaving.

3. Combine graphics with verbal descriptions. I do some graphical presentation and the Online Python Tutor gives some nice graphical representations of running programs, but I need to learn more about how to communicate programming concepts graphically. I suspect that some of the students who struggle the most in my Intro class would benefit from a clearly graphical presentation of what is happening in the program.

4. Connect and integrate abstract and concrete representations of concepts. I think I do this fairly well. The overall motivation for the course is to ground the programming material in the specific discipline that the students are interested in. So, we learn about the general concept and then apply it to concrete biological problems in the exercises.

5. Use quizzing to promote learning. I’m not convinced that pre-questions make a lot of sense for material like this. In more fact-based classes they help focus students’ attention on what is important, but I think the immediate engagement in problem-sets that focus on the important aspects works at least as well in my classroom. I do have one test in the course that occurs about halfway through the Intro course, after we’ve covered the core material. It is intended to provide the “delayed re-exposure” that has been shown to improve learning, but after reading this recommendation I’m starting to think that this would be better accomplished with a series of smaller quizzes.

6. Help students allocate study time efficiently. I spend a fair bit of time doing this when I help students who ask questions during the assignments. By looking at their code and talking to them it typically becomes clear where the “illusion of knowing” is creeping in and causing them problems and I think I do a fairly good job of breaking that cycle and helping them focus on what they still need to learn. I haven’t used quizzes for this yet, but I think they could be a valuable addition.

7. Ask deep explanatory questions. One of the main focuses in both of my courses is an individual project where the students work on a larger program to do something that is of interest to them. I do this with the hope that it can provide the kind of deep exposure that this recommendation envisions.

So, I guess I’m doing OK, but I need to work more on re-exposure to material, both through bringing back old material in the exercises and potentially through the use of short quizzes throughout the semester. I also need to work on alternative ways to present material to help reach folks whose brains work differently.

If you are a current or future teacher I really recommend reading the full report. It’s a quick read and provides lots of good information and food for thought when figuring out how to help your students learn.

Thanks for listening in on my self-reflection. If you have thoughts about this stuff I’d love to hear about it in the comments.

A new database for mammalian community ecology and macroecology

There are a number of great datasets available for doing macroecology and community ecology at broad spatial scales. These include data on birds (Breeding Bird Survey, Christmas Bird Count), plants (Forest Inventory & Analysis, Gentry’s transects), and insects (North American Butterfly Association Counts). However, if you wanted to do work that relied on knowing the presence or abundance of individuals at particular sites (i.e., you’re looking for something other than range maps) there has never been a decent dataset to work with for mammals.

Announcing the Mammal Community Database (MCDB)

Over the past couple of years we’ve been working to fill that gap as best we could. Since coordinated continental scale surveys of mammals don’t yet exist [1] we dug into the extensive mammalogy literature and compiled a database of 1000 globally distributed communities. Thanks to Kate Thibault‘s leadership and the hard work of Sarah Supp and Mikaelle Giffen, we are happy to announce that this data is now freely available as a data paper on Ecological Archives.

In addition to containing species lists for 1000 locales, there is abundance data for 940 of the locations, some site-level body size data (~50 sites), and a handful of reasonably long (> 10 yr) time series as well. Most of the data reflects the particular mode of sampling that an individual mammalogist uses, and as a result much of it is for small mammals captured in Sherman traps.

Working with data compilations like this is always difficult because the differences in sampling intensity and approaches between studies can make it very difficult to compare data across sites. We’ve put together a detailed table of information on how sampling was conducted to help folks break the data into comparable subsets and/or attempt to control for the influence of sampling differences in their statistical models.

The joys of Open Science

We’ve been gradually working on making the science that we do at Weecology more and more open, and the MCDB is an example of that. We submitted the database to Ecological Archives before we had actually done much of anything with it ourselves [2], because the main point of collecting the data was to provide a broadly useful resource to the ecological community, not to answer a specific question. We were really excited to see that as soon as we announced it on Twitter, folks started picking it up and doing cool things with it [3]. We hope that folks will find all sorts of uses for it going forward.

Going forward

We know that there is tons more data out there on mammal communities. Some of it is unpublished, or not published in enough detail for us to include. Some of it has licenses that mean that we can’t add it to the MCDB without special permission (e.g., there is a lot of great LTER mammal data out there). Lots of it we just didn’t find while searching through the literature.

If folks know of more data we’d love to hear about it. If you can give us permission to add data that has more restrictive licensing then we’d love to do so [4]. If you’re interested in collaborating on growing the database let us know. If there’s enough interest we can invest some time in developing a public portal.

The footnotes [5]

[1] We are anxiously awaiting NEON’s upcoming surveys, headed up by former Weecology postdoc Kate Thibault.

[2] We have a single paper that is currently in review that uses the data.

[3] Thanks to Scott Chamberlain and Markus Gesmann. You guys are awesome!

[4] To be clear, we haven’t been asking for permission yet, so no one has turned us down. We wanted to get the first round of data collection done first to show that this was a serious effort.

[5] Because anything that David Foster Wallace loved has to be a good thing.

NSF Pre-proposal guidelines/instructions

UPDATE: If you’re looking for the information for 2014, checkout the DEBrief post for links.

UPDATE: If you’re looking for the information for 2013, here’s an updated post.

Since I have now spent far too much time on multiple occasions trying to track down the instructions for the new pre-proposals for NSF DEB and IOS grants, I’m going to post the link here under the assumption that other folks will be looking for this information as well (and also finding it difficult to track down).

http://www.nsf.gov/pubs/2011/nsf11573/nsf11573.htm#prep

Happy post-holiday grant writing to all.

UPDATE 1: Also note that the Biosketches are different for the pre-proposals (changes noted in bold-italics)

Biographical Sketches (2-page limit for each) should be included for each person listed on the Personnel page. It should include the individual’s expertise as related to the proposed research, professional preparation, professional appointments, five relevant publications, five additional publications, and up to five synergistic activities. Advisors, advisees, and collaborators should not be listed on this document, but in a separate table (see below).

UPDATE 2: Though it is not explicitly clear from the link above, Current & Pending Support should NOT be included in pre-proposals (thanks to Alan Tessier for clearing this up).

Stay Classy Wiley

I logged into one of my reviewer accounts at a Wiley journal this morning and was greeted by a redirect that took me to a page with the following message:

CONSENT
We appreciate your involvement with this publication, which is published by a John Wiley & Sons company. The publisher would like to contact you by email/post with details of publications and services that may be of interest to you, specific to your subject area, from companies in the John Wiley & Sons group (only) worldwide. Your information will never be passed to any third party companies and as part of any communications you will be given the opportunity to unsubscribe from receiving further contact. Please indicate whether you wish to receive this information by answering the CONSENT question below.

Asking someone who is already working for you for free if it’s OK to also try to sell them stuff while they’re doing it seems like a pretty good definition of classless to me.

NSF Proposal Changes – Follow-up

Recently, NSF has changed the process for proposal submission for the core panels in the Directorate for Biological Sciences. Wondering if this might be important to you? Please answer the following questions:

  • Do you study some aspect of biology (defined as anything from the molecular to ecosystem levels)?
  • Do you intend to submit a proposal to NSF someday?

If you answered yes to these questions, then the probability is high that this pertains to you (though the details of what I say below may differ depending upon the Division you tend to apply to).

Anyway, we’ve covered the basics of this shift here before, but this week the DEB (Division of Environmental Biology) at NSF conducted a webinar on the changes and a few additional pieces of info were added.

Some important additional pieces of info

1) The following solicitations are NOT impacted by the preproposal rules:

Assembling the Tree of Life, CAREER, Dynamics of Coupled Natural and Human Systems, Dimensions of Biodiversity, Ecology of Infectious Disease, OPUS, RCN, and the DDIGs.

What does that mean? 1) Those solicitations are operating under their own rules, so read their solicitations for details of how to submit and 2) submitting to them won’t count against your 2 preproposals per year limit.

2) Timeline (for DEB, supposedly IOS – Integrative Organismal Systems – will be similar):

Preproposals: due Jan 9, Preproposal review panel meetings March-April, Invite/Not invite decision by May 1(ish).

Full proposals (for those invited): Due Aug 2, Panel Review Oct/Nov, Award/Decline decision by December. In theory this will give you close to a month to revise your preproposal.

3) The webinar provides info on what should be in a preproposal, what the panel will be asked to assess, and what the basis for invite/not invite will be. I recommend perusing the last few slides (see links below). There is a lot of emphasis (in my opinion) on how bold, compelling, and general the research will be. If this is how the panels will be instructed, I think this is a good thing – but again that’s just my opinion.

4) There is also info on trends in funding rates, proposal submissions, and the number of reviewers that were required for all the proposals. If you’ve been submitting to NSF you know things have been grim, but there’s something about seeing the numbers that makes you realize that, regardless of whether you think this is the best change, things really had to change.

If you’d like to see the webinar, here are some links (I tried the ‘streaming’ one and it worked fine; an executable will be downloaded to your computer to run it):

Streaming recording link: https://mmancusa.webex.com/mmancusa/ldr.php?AT=pb&SP=MC&rID=43949312&rKey=2ad726bf2f77bd13

Download recording link: https://mmancusa.webex.com/mmancusa/lsr.php?AT=dw&SP=MC&rID=43949312&rKey=1fe4937906efe109

Post-docs at the University of Wyoming

The Department of Zoology and Physiology at the University of Wyoming is advertising some postdoctoral fellowships (details below). There are a number of stellar people out there (including friend of Weecology, Jake Goheen, who sent us the ad), so we strongly recommend checking out the opportunity if you’re looking for a postdoc or know someone who is:

Berry Postdoctoral Fellowships
Berry Postdoctoral Fellowships are intended for outstanding ecologists or evolutionary biologists whose research is motivated by issues in conservation biology. Applicants must have a faculty sponsor from the Department of Zoology and Physiology, and secondary sponsor from the same department or from another department at the University of Wyoming. The initial fellowship period is one year, renewable for a second year contingent on performance.  Berry Fellows will offer a one-credit graduate seminar during their first year to provide the opportunity to gain teaching experience and to promote interactions with graduate students. Starting date is negotiable, but requires having a doctoral degree in hand and needs to begin by August 2012.
Berry Fellows will receive an annual stipend of $35,000 and a research fund of $7,000 per year. They will be eligible for UW benefits.
Application procedure
The applicant must first contact a faculty member in the Department of Zoology and Physiology to arrange sponsorship. In consultation with the sponsor, applicants need to arrange for a second faculty sponsor. Once sponsors are arranged, the applicant needs to submit a two-page research proposal (including the names of their primary and secondary faculty sponsors), a CV, three outside letters of recommendation, and a letter of support from their primary faculty sponsor to the Berry Fellowship committee <cbenkman [at] uwyo.edu> by 1 November 2011.

Postdoc in Evolutionary Bioinformatics [Jobs]

There is an exciting postdoc opportunity for folks interested in quantitative approaches to studying evolution in Michael Gilchrist’s lab at the University of Tennessee. I knew Mike when we were both in New Mexico. He’s really sharp, a nice guy, and a very patient teacher. He taught me all about likelihood and numerical maximization and opened my mind to a whole new way of modeling biological systems. This will definitely be a great postdoc for the right person, especially since NIMBioS is at UTK as well. Here’s the ad:

Outstanding, motivated candidates are being sought for a post-doctoral position in the Gilchrist lab in the Department of Ecology & Evolutionary Biology at the University of Tennessee, Knoxville. The successful candidate will be supported by a three year NSF grant whose goal is to develop, integrate and test mathematical models of protein translation and sequence evolution using available genomic sequence and expression level datasets. Publications directly related to this work include Gilchrist. M.A. 2007, Molec. Bio. & Evol. (http://www.tinyurl/shahgilchrist11) and Shah, P. and M.A. Gilchrist 2011, PNAS (http://www.tinyurl/gilchrist07a).

The emphasis of the laboratory is focused on using biologically motivated models to analyze complex, heterogeneous datasets to answer biologically motivated questions. The research associated with this position draws upon a wide range of scientific disciplines including: cellular biology, evolutionary theory, statistical physics, protein folding, differential equations, and probability. Consequently, the ideal candidate would have a Ph.D. in either biology, mathematics, physics, computer science, engineering, or statistics with a background and interest in at least one of the other areas.

The researcher will collaborate closely with the PIs (Drs. Michael Gilchrist and Russell Zaretzki) on this project but will potentially have time to collaborate on other research projects with the PIs. In addition, the researcher will have opportunities to interact with other faculty members in the Division of Biology as well as researchers at the National Institute for Mathematical and Biological Synthesis (http://www.nimbios.org).

Review of applications begins immediately and will continue until the position is filled. To apply, please submit curriculum vitae including three references, a brief statement of research background and interests, and 1-3 relevant manuscripts to mikeg[at]utk[dot]edu.

An excoriation of for-profit academic publishers

George Monbiot has just published a piece in The Guardian berating for-profit academic publishers that will surely be castigated by some as over-the-top hyperbole and praised by others as a trenchant criticism of the state of academic publishing*. Starting off with the, perhaps, ever so slightly, contentious title of Academic publishers make Murdoch look like a socialist, Monbiot proceeds to fire zingers like

Murdoch pays his journalists and editors, and his companies generate much of the content they use. But the academic publishers get their articles, their peer reviewing (vetting by other researchers) and even much of their editing for free. The material they publish was commissioned and funded not by them but by us, through government research grants and academic stipends. But to see it, we must pay again, and through the nose.

and backs up his position with a recent analysis by Deutsche Bank

The publishers claim that they have to charge these fees as a result of the costs of production and distribution, and that they add value (in Springer’s words) because they “develop journal brands and maintain and improve the digital infrastructure which has revolutionised scientific communication in the past 15 years”. But an analysis by Deutsche Bank reaches different conclusions. “We believe the publisher adds relatively little value to the publishing process … if the process really were as complex, costly and value-added as the publishers protest that it is, 40% margins wouldn’t be available.”

finally ending with a call to arms that even your, never shying away from a good fight, narrator would have toned down a bit**

The knowledge monopoly is as unwarranted and anachronistic as the corn laws. Let’s throw off these parasitic overlords and liberate the research that belongs to us.

Go read the whole thing. This is something that folks are going to be talking about, and I think it’s another good opportunity to ask ourselves whether the group that contributes the least to the overall scientific process should be the one that benefits the most financially. I take it as yet another sign that 2011 is the year that the war over academic publishing officially began.

————————————————————————————————————————————————————————–

*Yes, once I used the word “excoriation” in the title I got a little carried away with the big words.

**Though not for lack of agreement with the general sentiment, clearly.

Changes in NSF process for submissions to DEB and IOS*

As some of you may have heard, the BIO directorate at NSF has implemented some sweeping changes to the proposal process. Some of you youngsters may be unaware what the ‘old’ system was, but it involved two submission deadlines per year. At these deadlines, scientists would submit to a panel** one or more full proposals*** which were then reviewed by external reviewers and deliberated on by the panel the proposal was submitted to. Starting in January, this is changing. Some of the important highlights include:

1) Preproposals: we will now need to submit a preproposal before we can submit a full proposal. Preproposals will be 5 pages, 4 of which are for the science/broader impacts. These preproposals will be considered by a panel and full proposals will be solicited for a subset of those preproposals. No full proposals can be submitted unless solicited by NSF****.

2) Fewer full proposal deadlines: The early-year proposal deadline has been converted to a preproposal deadline, so there is now a deadline for preproposals (January) and one later in the year for the solicited full proposals (August).

3) Limits on submissions: The NSF info says, “In a given year, an individual may participate as a PI, co-PI, or lead senior investigator of a subaward on no more than two preliminary proposals submitted per Division solicitation (DEB or IOS).” I suspect from the wording that it is two total, regardless of how many different panels you want to submit to. Does anyone have more info on this?

NSF has highlighted the following as part of their motivation:

1) Focus on transformative science: In their FAQ, NSF specifically highlights that they expect the preproposal panels to focus on how interesting and important the research is, since the methods in a preproposal will necessarily be limited.

2) Higher funding rate for full proposals: Also in the FAQ, NSF says it is aiming for a 25-35% funding rate for full proposals. For the youngsters, funding rates at least for the panel I tend to submit to have been in the range of 8-10%, so this will be a substantial improvement. However, this means that the rejection rates for the preproposals will be high.

The next year or two as we transition to the new process are going to be rocky. Some of my colleagues think these changes are akin to the 4 horsemen of the apocalypse running across the scientific landscape.  I think there are some critical issues with respect to currently untenured faculty, and funding gaps for projects that were planning on having 2 submissions before funding runs out. But, assuming that we are able to get feedback on rejected full proposals in time to write revised preproposals for the next round, I’m actually cautiously optimistic that there will eventually be some benefits to this system. I’ve read some of the concerns on other blogs and I would recommend that our readers go check them out. One of the reasons I am cautiously optimistic is that the old system was broken and some of those concerns were unofficially being built into the old system anyway*****. The truth is that our funding world has changed and we all (NSF, scientists, and university tenure committees) need to figure out how to make our current reality work. NSF is trying to adjust, scientists are being forced to adjust, and I think those of us with tenure need to make sure our universities are also adjusting so that the young people coming up through the new reality don’t get screwed****** because the system has changed on them and we haven’t adjusted our expectations*******.

______________________________________________

* DEB (Division of Environmental Biology) and IOS (Division of Integrated Organismal Systems) are divisions at NSF and cover pretty much everything from organismal physiology to ecosystem ecology. Incidentally, MCB (Division of Molecular and Cellular Biosciences) has also adopted the new system.

**Panel, n. 1) a program (in the generic sense, not the computer sense) to which unsolicited proposals on a certain research area (e.g., Population and Community Ecology, Ecosystems, etc) could be submitted twice a year. 2) a group of scientists invited to NSF to provide funding recommendations on proposals submitted to that panel.

***Full proposal: This involves a 1 page Project Summary, 15 page Project Description, Budget and Budget Justification, 2-page CV for each PI and co-PI, Lists of current and pending funding for each PI and co-PI, Data Management Plan, Cited Literature, and Postdoc mentoring plan (if you are going to fund a postdoc).

**** There are exceptions to this: CAREER, OPUS, RCN, LTER (i.e., pretty much anything with its own acronym except LTREB).

***** For example, proposals were being held by NSF until after the next deadline unless you specifically bugged your program director for it, and then sometimes it was released with very little time before the deadline. I suspect it was only a matter of time before they refused to release them at all before the next deadline. The fact that it’s a deadline (i.e., late submissions rejected) and not a target date (i.e., acceptance or rejection of late submissions at the discretion of the program director) was also a relatively recent change and indicative of where things were heading.

****** New system or old system, the fact is that 8% (or whatever the new number will be – though it’s hard to imagine it’ll be a LOT better) is horrid and frankly a lack of successful NSF funding is NOT indicative of quality or long-term prospects when percentages are that low. We need to make sure that productive people who have not landed the ‘gold standard’ grant are not being flushed out of the system.

******* This one’s just because I find the long string of astrixes (astrices?) funny.

Weecology at ESA

If folks are interested in seeing what Weecology has been up to lately we have a bunch of posters and talks at ESA this year. In order of appearance:

  • Tuesday at 2:30 pm in Room 9AB our new postdoctoral researcher Dan McGlinn will be giving a talk on looking at community assembly using patterns of within- and between-species spatial variation.
  • Tuesday afternoon at poster #28 Morgan will be presenting research on how the long-term community dynamics of the plant and rodent communities near Portal, AZ are related to decadal scale climate cycles. She’ll be there from 4:30 to 6:30 to chat, or stop by any time and take a look.
  • Wednesday at 1:50 pm in Room 19A one of our new members, Elita Baldridge, will be giving a talk on her masters research on nested subsets.
  • Wednesday at poster #139 Ethan will be presenting on our two attempts to make it easier to find and use ecological data. He’ll be there from 4:30 to 6:30 to chat, or stop by any time and take a look (or grab a computer and check out EcologicalData and the EcoData Retriever).
  • Thursday at 1:50 pm in Room 10A another of our new members, Zack Brym, will be giving a talk on his masters research on controls on the invasion of an exotic shrub.
  • Thursday at 4 pm in Room 8 Sarah Supp will give a talk on her work looking at the impacts of experimental manipulations on macroecological patterns (highlighted as a talk to see by the Oikos blog).
  • And last, but certainly not least, bright and early Friday morning at 8 am in Room 8 Kate Thibault (who has now moved on to fame and fortune at NEON) will be presenting on our work using maximum entropy models to predict the species abundance distributions of 16,000 communities.

Enjoy!