Facial recognition for bears – and other ways to use the tech for good
Researchers have paired a neural network and a facial detection system used in a “dog hipsterisation” app to help manage and protect grizzly bears. Nicole Kobie reveals how BearID was created
Facial recognition is problematic for humans. When it works, it invades privacy and eases us into a surveillance state. When it doesn't, it leads to people being falsely arrested by police. But that's people. For bears, it's all good – and facial recognition is now being used to help research, monitor and protect the animals via a neural network-based system called BearID.
Melanie Clapham tracks grizzlies. Normally, that means methodically examining photographs or physically tagging the animals, because the University of Victoria researcher's work on grizzly behaviour depends on being able to pinpoint specific individuals.
But that’s not easy because bears have few distinctive markings – they’re all brown and fluffy – and can dramatically change appearance from one season to the next. “They moult their coats in the summertime,” said Clapham. “And in the autumn, before they go into hibernation, they can put on a third of their body weight.”
If individual bears are watched closely, it’s possible to track them through such changes. However, that becomes more difficult when you’re monitoring many bears over a wide area or not seeing the same bear frequently enough. “If you’re not observing them constantly, it can be difficult to pick out the same bears even between spring and fall,” she said.
Wild solution
Clapham knew there had to be a solution, and realised that automation and machine learning could be part of it. To find out, she joined a group called Wildlabs.
Based in Cambridge, Wildlabs is a network of 4,305 members including field conservationists, researchers and technology experts. “Academics often have really specific research questions they’re answering, and a high level of expertise and time to build their own stuff,” explained Stephanie O’Donnell, community manager at Wildlabs.
Wildlabs was founded by NGOs, including the WWF, and tech groups such as Google.org and Arm in the hopes of bridging the gap between the two worlds. “Conservationists who work in the field find it really hard to find other conservationists to talk about technology,” O’Donnell said. “Field conservationists are trying to manage big protected areas or monitor a whole ecosystem while dealing with challenges around human-wildlife conflict or climate change. They need technology to do different things.”
The BearID project was a bit of both, combining practical management with academic research. O'Donnell remembers Clapham getting in touch because, just two hours earlier, she had heard from Ed Miller and Mary Nguyen, a pair of Silicon Valley software developers who were working on a similar idea as part of a bear-watching project called Explore.org, which has webcams watching grizzlies at Brooks Falls in Katmai National Park, Alaska. "I was like, 'you guys must be working together already'," said O'Donnell. But they weren't, so she helped link them up. "That interaction is exactly what Wildlabs was created to do."
Brought together by Wildlabs, the two projects combined their resources and their datasets to set up BearID, and, over several years, developed software that would analyse images to learn how to recognise one grizzly from another.
Building BearID
The system itself is designed in two sections. First, there's a facial-detection tool, which looks at the image, recognises a bear's face, and makes measurements between key features, such as the eyes and the tip of the nose. Rather than design such a program from scratch, the team used a ready-made detector from Dlib, a library of machine-learning algorithms and tools, originally built to recognise dog faces so an app could give them hipster glasses and moustaches — yes, you read that correctly. Silly apps like the "dog hipsteriser" aren't necessarily a waste of time in the right hands.
"We used elements of that network to help us detect bear faces initially, which gave us a bit of a kickstart, though we did go back and retrain it on bear faces," Clapham said. That made the job of labelling bears' faces easier, plus it let the team add hipster glasses and moustaches to bears, which is as delightful as it sounds.
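To give a feel for what "measurements between key features" means, here is a toy sketch in plain Python. It is not the BearID code: the landmark coordinates are made up, and a real pipeline would get them from a trained detector such as Dlib's, but the idea of turning landmark positions into scale-invariant ratios is the same.

```python
import math

# Hypothetical landmark positions (pixel coordinates) that a detector
# might return for one bear face: two eyes and the nose tip.
landmarks = {
    "left_eye": (120.0, 95.0),
    "right_eye": (180.0, 97.0),
    "nose_tip": (150.0, 160.0),
}

def distance(a, b):
    """Euclidean distance between two landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Measurements between key facial points. Dividing by the inter-eye
# distance makes the ratio roughly scale-invariant, so the same bear
# photographed nearer or farther away yields similar numbers.
eye_span = distance(landmarks["left_eye"], landmarks["right_eye"])
eye_to_nose = distance(landmarks["left_eye"], landmarks["nose_tip"])
print(round(eye_to_nose / eye_span, 3))
```

Ratios like this stay stable as a bear moults or gains weight for hibernation, which is exactly why geometry between fixed points is more useful than coat colour.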
The other half of the system is facial recognition. That began with labelling the photos, marking each bear, in order to use that data to train a machine-learning system. Human facial-recognition systems are taught on millions of images, but there simply aren’t enough bears for such detailed training.
The network is shown a selection of images that have been correctly labelled to learn how to tell bears apart. “We don’t tell the network what to look for in a bear’s face, we just present the labelled data,” said Clapham. “Over time, the network learns what is stable about that bear’s face and uses that to distinguish between different individuals.”
The first version of the system was trained on 5,000 images of 132 bears, half from Alaska and half from further south in the US and Canada. That was split into two sections: 80% of the images were used for training and 20% to test the system's accuracy. When a bear's face is detected, a deep convolutional neural network condenses the image into 128 measurements, or dimensions, which are compared against known bears.
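The train/test split and the matching step can be sketched as a toy experiment. This is not the team's code: the "embeddings" below are random numbers clustered around three imaginary bears rather than the output of a real network, but the 80/20 split and nearest-neighbour matching mirror the process described.

```python
import random

random.seed(0)
DIM = 128  # length of the face embedding, as in the BearID description

def fake_embedding(centre, spread=0.05):
    """Stand-in for the CNN: a point scattered around a bear's 'true' vector."""
    return [c + random.gauss(0, spread) for c in centre]

# Three imaginary bears, each a cluster of ten embeddings around a random centre.
bears = {name: [random.random() for _ in range(DIM)] for name in ("b1", "b2", "b3")}
images = [(name, fake_embedding(centre))
          for name, centre in bears.items() for _ in range(10)]

random.shuffle(images)
split = int(len(images) * 0.8)           # 80% for training, 20% for testing
train, test = images[:split], images[split:]

def nearest(embedding):
    """Match an embedding to its closest training example (1-nearest-neighbour)."""
    def sq_dist(other):
        return sum((a - b) ** 2 for a, b in zip(embedding, other))
    return min(train, key=lambda item: sq_dist(item[1]))[0]

correct = sum(nearest(emb) == name for name, emb in test)
print(f"{correct}/{len(test)} test images matched to the right bear")
```

Because the clusters here are far apart, the toy matcher is trivially accurate; the hard part in the real project is training a network whose embeddings keep different bears that well separated.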