Understanding and managing forests is crucial to understanding and potentially mitigating the effects of climate change, invasive species, and shifting land use on natural systems and human society. However, collecting data on individual trees in the field is expensive and time consuming, which limits the scales at which this crucial data is collected. Remotely sensed imagery from satellites, airplanes, and drones provides the potential to observe ecosystems at much larger scales than is possible using field data collection methods alone.
We are running the second in a series of data science competitions where multiple groups attempt to use the same remote sensing data from low-flying airplanes to infer the locations, sizes, and species identities of millions of trees. This kind of collaborative data analysis challenge has proven highly effective in other fields for quickly improving methods for converting image data into useful information. This round of the competition focuses on exploring how methods generalize beyond a single forest.
There are two tasks in the current competition: 1) identifying individual trees in remote sensing images; and 2) classifying trees into species.
Teams (or individuals) can participate in either or both tasks. Task 1 requires working with remote sensing data (RGB, LiDAR, and hyperspectral). Task 2 can either leverage this raw remote sensing data or use simplified tabular data provided by the organizers. Details of the different tasks will be available starting March 1st. To read more and sign up, check out the competition website:
You can learn about the results of the first round of this competition in the summary paper and the full PeerJ paper collection. We plan to write up the results of this round of the competition in a similar way, with a synthesis paper covering the competition, the data, and a comparison of the different methods, and with each team given the opportunity to write up and publish associated short papers on the methods they used and the results they produced.
This challenge is supported by the Gordon and Betty Moore Foundation’s Data-Driven Discovery Initiative through grant GBMF4563 and the National Science Foundation through grant DEB-1926542. It uses data from the National Ecological Observatory Network in addition to data collected by the organizers. It is being organized by the Weecology lab, the Machine Learning and Sensing lab, the Data Science Research lab, and Stephanie Bohlman’s and Aditya Singh’s labs, all at the University of Florida, along with the Environmental Observation & Informatics program at the Nelson Institute for Environmental Studies at the University of Wisconsin.