- Crowdsourcing for conservation generally involves collecting information, opinions, or labor from groups of people rather than from individual employees or experts.
- The technique is being applied in dozens of studies on fish, amphibians, invertebrates, birds, mammals, and other life forms, as well as landscapes, and is increasingly being accepted into the scientific literature and conservation databases.
- Nevertheless, doubts about the quality of crowdsourced data persist.
Through years of effort, scientists have assessed the conservation status of some 77,000 species of plants and animals via the IUCN’s Red List of Threatened Species. Scientific records on the distribution and abundance of species over time allow them to evaluate how close each one is to extinction. But scientists still lack the data needed to assess the rest of the 1.5 million or so species known to exist, let alone a potential 7 million that haven’t even been described yet. Without its risk of extinction being classified, a species can miss out on conservation funding and attention. But without funding, the information needed to assess extinction risk can’t be collected. To overcome this Catch-22, some conservationists have recently been tapping into a resource that costs nothing and has been there all along: volunteers.
Within the bustling crowds at Lorengau market on Manus Island, Papua New Guinea, there is knowledge — data — that would take months and thousands of dollars to accumulate if it were to be acquired by a research team on an expedition into the jungle. Nathan Whitmore, a population biologist with the New York-based conservation organization the Wildlife Conservation Society, spent time with these local people in order to learn from what is sometimes called “the wisdom of crowds” and build a detailed picture of the abundance and distribution of the Manus green tree snail (Papustyla pulcherrima). His results were recently published in the journal Oryx.
This snail, remarkable though it is with its vivid green shell embellished with a swirling yellow stripe, would have difficulty garnering international empathy and the accompanying conservation funding. Species receiving the most funds and attention are the lucky few graced with an intangible appeal, more often than not bundled in fur or feathers.
“The situation is particularly severe for invertebrates where only a few percent of species have been assessed,” Whitmore told Mongabay. “In a competition for funds tigers beat snails, even pretty ones, every time.”
Historically, the Manus people used the snails’ emerald shells in ceremonial dress and ornamentation during festivals. Eventually, export of the shells for jewelry led to the decline of the species, landing it, in 1975, on Appendix II of the Convention on International Trade in Endangered Species, a multilateral treaty restricting trade of threatened plants and animals. Since then, scientists have gathered little further information about its status. The snail could have been in trouble, but the conservation community had no way of knowing.
Organizing a team to survey for the arboreal snail would involve scouring the forest canopy as well as obtaining appropriate permissions from landowners. Such a task would be labor-intensive: it would cost $30,000 to $40,000 at a minimum, and to do it well the figure would be closer to $60,000, according to Whitmore. But the Manus people had long shared their environment with the snail and encountered it in the forest, often while felling trees. Whitmore and his team realized that the knowledge the scientific community lacked about this species might already reside with the local people. This crowd was wise.
So, rather than organize extensive field surveys, Whitmore decided to crowdsource his data. Especially useful when time and money are in short supply, crowdsourcing for conservation generally involves collecting information, opinions, or labor from groups of people rather than from individual employees or experts. Whitmore was to crowdsource information from the knowledgeable people of Manus.
The idea of tapping into collective wisdom is gaining traction in modern society, though it remains a new technique in conservation science. But its origins are old, with Aristotle credited as the first to take the idea seriously. “The many, who are not as individuals excellent men, nevertheless can, when they have come together, be better than the few best people, not individually but collectively, just as feasts to which many contribute are better than feasts provided at one person’s expense,” he wrote in Politics III.
While Aristotle was likely referring to the exchange of arguments in a public forum, the phenomenon of many minds being greater than the few has since been explored in statistical experiments. Over 100 years ago, the famed English statistician Sir Francis Galton asked participants at a country fair to write down their best guess at the weight of a slaughtered ox. Eight hundred people took part, and the median (middle) guess fell within one pound of the correct answer.
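To see why this kind of aggregation works, consider a minimal simulation — a purely illustrative Python sketch using made-up guesses, not Galton’s actual data. Individual guesses scatter widely around the true weight, yet the median of many independent guesses lands very close to it.

```python
import random
import statistics

random.seed(42)

TRUE_WEIGHT = 1198   # pounds; the figure usually quoted for Galton's ox (illustrative)
NUM_GUESSERS = 800   # roughly the number of fair-goers who entered

# Assume each guess is the true weight plus substantial individual error,
# modeled here as normally distributed with a 15% standard deviation.
guesses = [random.gauss(TRUE_WEIGHT, 0.15 * TRUE_WEIGHT) for _ in range(NUM_GUESSERS)]

crowd_estimate = statistics.median(guesses)
worst_individual = max(guesses, key=lambda g: abs(g - TRUE_WEIGHT))

print(f"True weight:        {TRUE_WEIGHT} lb")
print(f"Median of crowd:    {crowd_estimate:.0f} lb")
print(f"Worst single guess: {worst_individual:.0f} lb")
```

The individual errors largely cancel out in the aggregate, which is the statistical heart of the “wisdom of crowds.”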
Whitmore’s approach to collecting his conservation data did not involve best guesses but rather the accumulation of existing knowledge. With the aid of maps of the island, he and his team asked 400 randomly selected local market-goers and stallholders to recall the abundance and distribution of the snail in the present day, as well as around the time of a well-known event 15 years earlier. From those recollections Whitmore was able to produce maps of the snail’s current and past distribution, as well as an estimate of its rate of decline.
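As a rough illustration of how two recalled abundance snapshots can be turned into a decline estimate, the sketch below uses hypothetical numbers and a simple constant-rate-of-change assumption; the actual analysis in the Oryx paper may well have been done differently.

```python
# Hypothetical relative-abundance scores recalled by interviewees
# (arbitrary units; not data from the actual Manus interviews).
abundance_15_years_ago = 100.0   # around the time of the well-known reference event
abundance_today = 60.0           # present-day recollection
interval_years = 15

# Total proportional decline over the interval
total_decline = 1 - abundance_today / abundance_15_years_ago

# Equivalent constant annual rate of decline over the same interval
annual_decline = 1 - (abundance_today / abundance_15_years_ago) ** (1 / interval_years)

print(f"Total decline over {interval_years} years: {total_decline:.0%}")
print(f"Equivalent annual rate of decline: {annual_decline:.1%}")
```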
The wisdom of his crowd was credible enough for the IUCN to list the snail as Near Threatened on its Red List of Threatened Species, having previously listed it as Data Deficient. The new listing means that the snail population is declining, but is at no immediate risk of extinction. “That should, in theory, free up money for other species which are in need of treatment,” Whitmore said.
Whitmore crowdsourced his data; modern advances, however, have made it possible for other scientists to crowdsource their workforce. Thanks to the expansion of the Internet, now used by over 3 billion people, volunteers around the world can easily enter data and upload photos into scientific databases. This has allowed citizen scientists — members of the public who wish to participate in a scientific effort — to become a worldwide army of data contributors, pitching in on projects that scientists would otherwise be unable to complete.
The Global Freshwater Fish BioBlitz is one such project, run by a consortium of conservation NGOs. It encourages people around the world to go outside and photograph freshwater fish in their natural habitats on World Wetlands Day, February 2. Since the project started in 2014, 169 people have taken part, uploading their photos along with details of where and when they saw the fish, to be identified by volunteers knowledgeable about fish taxonomy.
“By creating a way for scientists and amateur naturalists around the globe to collaborate, we can build a richer knowledge base of the world’s freshwater fish and their distribution. And maybe even discover something new,” Michele Thieme, a freshwater conservation biologist with WWF, one of the groups running the bioblitz, said in a press release about the project. “If we don’t know where or what the species are, it’s difficult to plan for their conservation.”
The project was inspired by the success of a similar project, the Global Amphibian Bioblitz. Since launching in 2011, it has garnered over 40,000 observations of amphibians and even led to the description of an entirely new species.
Unlike invertebrates, fish, or frogs, the California condor (Gymnogyps californianus) does not lack the distribution and abundance data needed for conservation classification. Thanks to a prolonged conservation intervention it is now listed as Critically Endangered, an improvement on its earlier listing of Extinct in the Wild following heavy persecution. However, crowdsourcing for conservation can still come into play.
Condor Watch is a project that does not even require its participants to step out of their homes. Set up alongside similar projects on the online platform Zooniverse, Condor Watch asks participants to look at photographs of condors at feeding stations where researchers have set out carcasses, and enter information about the number of birds, their individual identification, and distance from carcasses, as well as other animals present. To date, volunteers have submitted 340,000 photo classifications. Project leaders hope that tracking the location and behavior of the condors as recorded in photographs taken during the past 10 years will contribute to understanding how condors’ personalities and social status might predispose the birds to lead poisoning. The consumption of lead bullet fragments lodged in animal carcasses is one of the main threats facing this icon of conservation intervention.
Just how many crowdsourced conservation projects exist across the world is unknown, though there are easily dozens, possibly hundreds. Does all of this data sourced from non-experts have a place in the scientific literature? It would appear so. Earlier this year, an analysis of over 500 papers from the past 74 years of monarch butterfly research revealed that 17 percent of them used citizen-science data. And the practice is becoming more common: since 2000, two-thirds of field-based monarch butterfly studies have utilized records from citizen scientists, many of them participants in crowdsourcing projects.
Nevertheless, there have been few evaluations of the accuracy of crowdsourced conservation data and doubts about its accuracy and scientific suitability persist, at least in certain applications.
For instance, a paper published in February in the journal Land Use Policy examined crowdsourced mapping data gathered for a study of public lands in Victoria, Australia. It found that by certain measures the crowdsourced data was about 70 percent accurate and about 80 percent complete. “The spatial accuracy and completeness of [crowdsourced] data in this study suggests spatial data quality may be ‘good enough’ to complement biological data in conservation planning but perhaps not good enough to overcome the mistrust associated with crowd-sourced knowledge,” the authors write, adding “The absence of trust in the quality of [crowdsourced] data derived from non-authoritative sources…means the spatial data is unlikely to be used for natural resource planning and decision support, including conservation planning.”
Karen Oberhauser, a conservation biologist at the University of Minnesota and a co-author of the monarch paper, believes the apprehension is unwarranted. “Who collects field data in scientific studies? It’s often undergraduate researchers who are paid 10 dollars an hour — they aren’t necessarily going to do a better job than somebody who is volunteering their time and has a vested interest in the phenomenon they are studying,” she said in a press release about her paper. “Citizen scientists are equally as diligent about the accuracy of their data. They want to do a good job because they care about the work.”
Whitmore’s work faced its own pushback. He indicated that a number of critics, including his initially skeptical boss, suggested that his data should be corroborated by field surveys. But this lands us back in another Catch-22: there is no money to conduct such a survey in order to prove the efficacy of a study that was conducted because there was no money to conduct an alternative.
While sympathizing with his critics’ concerns, Whitmore maintains that his $1,500 survey method was a pragmatic solution. “I qualify it by saying that given the information we have collected why on Earth would you try and spend upwards of $30,000 — a massive sum for any invertebrate study — doing a corroborative survey on…a species that for all intents and purposes doesn’t appear to need urgent conservation intervention,” he said. “[C]ouldn’t that money be better spent elsewhere?”
Nevertheless, Whitmore concedes that to enter the conservation mainstream, his technique will eventually need to be compared with field methods in order to determine its accuracy. “The great irony here is that…for financial reasons we will likely end up pursuing a high cost project on a charismatic species to verify the technique.”
For now, when there aren’t always alternatives for species lacking softness and splendor, the sourcing of data from volunteers seems to be gaining traction and proving sufficient to support certain basic conservation decisions. Whether the crowd is a few hundred people in a remote island village or thousands dispersed across countries and connected by the Internet, scientists are finding there is wisdom in the crowd that could bolster conservation for even the most unloved, unnoticed species on the planet.
Citations
Whitmore, N. (2015). Harnessing local ecological knowledge for conservation decision making via Wisdom of Crowds: the case of the Manus green tree snail Papustyla pulcherrima. Oryx. doi:10.1017/S0030605315000526.
Ries, L., & Oberhauser, K. (2015). A Citizen Army for Science: Quantifying the Contributions of Citizen Scientists to our Understanding of Monarch Butterfly Biology. BioScience, 65(4), 419–430.
Brown, G., Weber, D., & de Bie, K. (2015). Is PPGIS good enough? An empirical evaluation of the quality of PPGIS crowd-sourced spatial data for conservation planning. Land Use Policy, 43, 228–238.