- Astrophysicists and conservation ecologists have teamed up to apply the heat-detection software and machine-learning algorithms used to find stars to the task of automatically identifying people and different animal species.
- The system detects warm, living objects from drone-derived thermal video footage and uses a reference database to identify the various objects efficiently and reliably.
- The research team is refining the system to overcome challenges of variable environmental conditions, as well as hot rocks and other “thermally bright” but uninteresting objects, while building a reference database of multiple target species.
An unlikely group of experts has teamed up to apply software developed to find distant stars to problems in conservation ecology.
The “astro-ecology” project at Liverpool John Moores University (LJMU) uses machine-learning algorithms to train the software, normally used to detect distant galaxies, to recognize wild animals in thermal-infrared imagery taken by a camera attached to an unmanned aerial system (UAS, a.k.a. drone).
Project leader Claire Burke, an astrophysicist at LJMU, and her colleague, Maisie Rashman, presented the project at the European Week of Astronomy and Space Science last week in Liverpool, U.K.
A need for automation
How did a group of astrophysicists take on a down-to-earth wildlife and human detection challenge?
Team member Serge Wich, professor of ecology at LJMU, has studied wildlife through drone imagery for several years. He and other researchers have begun putting thermal cameras on small drones to find animals and people in the dark. Studying nocturnal animals and finding wildlife poachers, who typically hunt at night, requires special thermal-infrared cameras. These sensors detect emitted heat rather than visible light, so they “see” warm-bodied organisms, such as birds, rhinos, or people, against a background of cooler vegetation, soil, or water.
Drones offer an aerial view of a landscape and, despite their short flight duration, provide access to dangerous or remote areas. They are also relatively inexpensive to run, so researchers and others are amassing ever-larger databases of images of their study areas.
Manually reviewing gigabytes of images looking for signs of wildlife, and then assessing the heat signatures to determine which species created them, is time-consuming, painstaking work. It also takes too long to respond effectively to poachers or to crop- or livestock-raiding animals, so resource managers are seeking ways to automate the image analysis.
Inspired by successful stargazing
All objects emit some heat, and astrophysicists have for years been using thermal imaging devices and software that recognizes the unique heat signatures of stars and galaxies.
The project team used freely available astronomical source-detection software to detect humans and animals in the drone-based thermal-infrared footage.
“When we look at animals with the thermal camera, we are seeing their body heat, and in the thermal footage they ‘glow’. This glow is the same as the glow we see from stars and galaxies in space,” Burke told Mongabay-Wildtech. “This means that we can use software from astronomy to automatically detect animals and humans in thermal footage.”
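Freely available astronomical source-detection tools, such as the open-source photutils package, give a sense of how this works: estimate the cool background of vegetation and soil, then flag anything glowing well above it. The sketch below is illustrative only, with placeholder parameters, rather than the project’s actual pipeline.

```python
import numpy as np
from astropy.stats import sigma_clipped_stats
from photutils.detection import DAOStarFinder

def detect_warm_sources(frame, fwhm_px=8.0, nsigma=5.0):
    """Flag bright (warm) sources in a single thermal frame.

    `frame` is a 2-D array of thermal-camera pixel values; `fwhm_px` and
    `nsigma` are illustrative placeholder settings, not project values.
    """
    # Estimate the cool background (vegetation, soil, water) with sigma
    # clipping, much as astronomers estimate the sky behind the stars.
    mean, median, std = sigma_clipped_stats(frame, sigma=3.0)

    # Flag anything standing out hot against that background. DAOStarFinder
    # was built for point-like stars, so a real pipeline would likely swap in
    # a detector better suited to extended, animal-shaped blobs.
    finder = DAOStarFinder(fwhm=fwhm_px, threshold=nsigma * std)
    return finder(frame - median)  # table of candidate positions, or None
```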
Scientists combine the detection software with machine-learning algorithms to recognize different objects in space by their unique heat footprints. The astro-ecology researchers first needed to train the software to automatically recognize a whole new set of targets: people and animals several hundred meters below, instead of giant galaxies thousands of light years from Earth.
“We’ve found that every different species of animal has a unique ‘thermal fingerprint,’” Burke said, “so once we have detected the animals, we can then examine this thermal profile and by using machine learning we can train a computer to tell the difference between different species automatically. This is much more efficient than humans looking at the footage by eye, as it can be difficult to tell different animals apart by eye.”
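The exact measurements that make up a thermal fingerprint, and the learning method behind the classification, are not detailed here. As a purely hypothetical sketch, a handful of simple blob statistics fed to an off-the-shelf classifier might look like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def thermal_fingerprint(blob):
    """Reduce one detected thermal blob (a 2-D array of pixel temperatures)
    to a simple feature vector. These features are invented for illustration;
    real fingerprints would be far richer."""
    return np.array([
        blob.mean(),   # average apparent temperature
        blob.max(),    # hottest point on the body
        blob.std(),    # spread of temperatures across the body
        blob.size,     # apparent size in pixels
    ])

# Hypothetical training step on blobs cropped from labelled footage:
# X = np.stack([thermal_fingerprint(b) for b in labelled_blobs])
# y = ["cow", "rhino", "human", ...]          # one label per blob
# clf = RandomForestClassifier(n_estimators=200).fit(X, y)
# species = clf.predict([thermal_fingerprint(new_blob)])
```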
Rhinos are endangered, with slaughter by poachers, who sell the horns for their now-disproven medicinal benefits, an even greater threat than loss of habitat. These rhinos in South Africa are shown in an optical image (left) and a thermal-infrared image (right). Both photos courtesy of Liverpool John Moores University in partnership with the Endangered Wildlife Trust.
Pattern recognition only as good as its reference database
Training machine-learning algorithms to automatically recognize patterns requires a reference database of target objects, in this case heat signatures of different species, for the algorithms to consult, so the astro-ecology team began collecting thermal-infrared images of various animals.
They started with cows, flying the drone, thermal camera, and software combination over herds in the English countryside. The system detected animals in the open but struggled when the drone flew too high, when the cows stood close together, or when nearby roads and rocks retained heat absorbed from the sun.
To distinguish non-moving living things from objects that are “thermally bright” but not interesting to their study, such as heated-up roads or rocks, the project team has begun to refine its algorithms in several ways.
“One [method] is to look at how the shape of the object changes as the drone flies over,” Burke said. “Upright objects such as animals change shape, whereas flat objects like rocks on the ground tend to stay the same shape in the footage. We will also be training the software to identify rocks once we have a better understanding of how to separate them from animals.”
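A rough sketch of that idea, using an invented elongation measure rather than the team’s own method, tracks the same object across successive frames and checks how much its outline changes:

```python
import numpy as np

def shape_variation(masks):
    """Given boolean masks of the same detected object in successive frames,
    return how much its apparent height-to-width ratio varies.

    Upright objects such as animals change apparent shape as the drone passes
    over them; flat, sun-warmed rocks and roads stay roughly constant.
    The measure and any threshold applied to it are illustrative only.
    """
    ratios = []
    for mask in masks:
        rows, cols = np.nonzero(mask)
        height = rows.max() - rows.min() + 1
        width = cols.max() - cols.min() + 1
        ratios.append(height / width)
    return float(np.std(ratios))
```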
During the project’s first true field test, in South Africa, the team learned that the humidity, rain, and various environmental conditions on the ground can adversely affect thermal imaging. Nevertheless, the system found five riverine rabbits, very elusive mammals that are highly endangered. The testing also helped the team determine an optimal height to fly the drones. The software can now track living objects that are moving in the video footage.
The scientists are currently working with a local safari park and zoo to take the still and video images of the animals needed to build the image database. Multiple images of the target species, from different angles and distances, can better “teach” the algorithms to identify them under a range of conditions. It will likely be two years, Burke told The New York Times, before the team has a fully automatic prototype ready for testing.
Burke said the software, which is still in development, can tell the difference between different species of animals, including humans. “At the moment we can’t tell the difference between different individuals within the same species,” she said, “but maybe we will in future once the software is a bit more advanced. However, we can tell if animals are injured and spot some types of illness.”
The scientists want the system to be flexible enough to be used anywhere in the world for any type of warm-blooded animal, and they are focusing on monitoring endangered species. They are exploring how to include temperature sensors on the drone, so the software will have information about weather conditions and how that might affect the data.
They want to estimate how reliably the system detects target objects with the drone at certain heights and under certain conditions, in order to translate video footage to scientifically useful information, such as the spatial distribution and density of different animal species.
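That calibration work is ongoing, but the basic arithmetic of turning detections into a density estimate is simple to sketch. The function below, with placeholder numbers, corrects a raw count for imperfect detection and divides by the area surveyed; it is an illustration, not the team’s method.

```python
def estimate_density(n_detected, detection_prob, strip_width_m, track_length_m):
    """Rough animal density from a single drone survey strip.

    Corrects the raw count for the chance that an animal in the strip was
    missed, then divides by the area actually covered. All inputs here are
    illustrative placeholders.
    """
    corrected_count = n_detected / detection_prob
    area_km2 = (strip_width_m * track_length_m) / 1e6
    return corrected_count / area_km2  # animals per square kilometre

# e.g. 12 animals seen, 80% detection probability, a 200-m-wide strip, 5 km flown:
# estimate_density(12, 0.8, 200, 5000)  -> 15 animals per square kilometre
```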
A comprehensive database and flexible algorithms would help field managers locate intruders in protected areas, detect target animals or people hiding in vegetation, or study cryptic nocturnal species.
“In order to identify different species, we will need to have a large library of thermal fingerprints for different animals,” Burke said. “Some of this we can develop ourselves by filming animals at the local zoo and safari park with the thermal camera. The rest we can gather in partnership with conservationists around the world, who will be able to upload their own thermal data to help train our machine learning algorithm.”