PC Pro

Keeping watch on oceans using off-the-shelf technology

Oceans are massive, but drones and AI are helping researchers better watch whales, beaches and seabirds.

- Nicole Kobie reveals the high-flying future of marine science


To count how many seabirds lived in a colony, David Johnston and his team at Duke University used to walk among the black-browed albatrosses and southern rockhopper penguins with a clicker, like bouncers at a bar. When one small section was finished, that number would be extrapolated across the total area.

If the colony on the Falkland Islands was in a spot too difficult for humans to access easily, planes would be sent up for aerial photography that would be painstakingly examined on a computer display – counting the tens of thousands of birds, one by one. “Trying to sit there counting those damn birds on the screen, you just go crazy,” said Johnston, an associate professor of marine conservation at Duke. Nor was it accurate. “If we were both asked to count all the albatross, we’d probably come up with different numbers.”

If wasting the talents of marine biologists by methodically, boringly and inaccurately counting birds on a screen makes you think that there has to be a better way, you’re right. Thanks to the arrival of drones for data collection and artificial intelligence for analysis, much of this work can be automated, making it easier for scientists to keep a careful watch on our oceans.

Alongside Johnston’s seabirds, researchers at the University of Exeter have used overhead drones to capture videos of orcas, giving more understanding about how they interact socially – in short, to see how they choose their friends. In Australia, drones are used to spot sharks at beaches, helping to keep swimmers and surfers safe while avoiding destructive practices such as culling. South of South America, in the South Sandwich Islands, researchers from Oxford and the British Antarctic Survey are using drones for aerial photography of seabirds and seals, allowing more to be counted without disturbing the animals. And underwater footage is being stitched together and analysed by AI to help monitor at-risk coral reefs, thanks to work by Arizona State University.

It’s not just images. Ben Milner, lecturer in computing science at the University of East Anglia, uses deep learning to remove interference from audio to better track and identify endangered right whales. That idea has also been echoed in the Bay of Fundy in Nova Scotia, Canada, where researchers repurposed a facial recognition model to automate the filtering out of background noise from acoustic monitoring data.

Indeed, alongside seabirds, Johnston and his team use that combination of drone photography and AI to watch over blue whales, track changes to beaches after hurricanes and monitor temperature changes to the sea. Here’s how off-the-shelf drones, open-source AI and other clever ideas are being reworked and combined to help better monitor and protect the watery bits of our world.

From whales to seabirds

Johnston has always been interested in the clever ways people repurpose technology. After seeing how precision drones were being used at farms and vineyards, he wondered how the same ideas could be applied to his own monitoring work, be that counting animals, surveying a beach or looking for marine debris. “The potential was dramatic,” he said. “It was very obvious after we did our first few missions that the sky was the limit on what we could do.”

Traditional animal counting methods are expensive, inaccurate and difficult – and can potentially harm the animals. For example, to measure blue whales, scientists turned into whalers, hunting and killing them to size them up. It was possible to fly planes overhead for photography, but that comes with a high cost and low success rate, as whales don’t surface on demand.

Drones change that. Now, Johnston says, it’s possible to go out on a small boat, seek out the whales, and within minutes send the drone up for measurements – without any hunting and killing. On shore, drones can also measure marsh grass rather than human researchers trampling it.

“It reduces the effect of humans on sensitive habitats,” he said.

Drones can also cover a wide area quickly. Measuring ocean temperatures through colour is a core part of modern marine science, but satellites can only manage areas larger than 5km wide. Drones can cheaply and easily get more detail, and be sent out to investigate sudden changes. Surveying a beach previously required a laser scanner; the same work can now be done in a tenth of the time with a suitably equipped drone. That means it’s possible to not only cover more area, but do so in a timely manner. If a hurricane is incoming, a team can collect data on the shape of the beach before the waves and wind hit, and do the same immediately after the storm passes. “There’s an amazing quality of immediacy,” he said.

Drones only gather the data; interpreting it falls to AI, which is key to the success of the seabirds project. Such colonies are massive, making them difficult to count, but these particular birds cause extra confusion because of how they build their nests. “These albatross, they create these really cool nests on little pillars, with the egg on top where they sit,” Johnston explained. “The penguins stagger around underneath that.”

In other words, researcher­s have to count a double layer of birds.

Doing that quickly and easily is key to tracking population fluctuations, which can be caused by climate change or overfishing. “Our biggest challenge is trying to detect those changes before it’s too late,” he said. “And if we can do a better job of counting them, that means uncertainty levels are lower.”

“It was very obvious after we did our first few missions that the sky was the limit on what we could do”

Training bird AI

Humans aren’t left out of this loop. Such algorithms still need to be trained on relevant data, which at Duke was largely done by team technician Maddie Hayes. To achieve that, photos of bird colonies were labelled by humans to show the AI what to look for. The model was then trained on that dataset, teaching it the difference between an albatross and a penguin.
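The article doesn’t publish Duke’s code, but the label-train-evaluate loop it describes can be sketched in miniature. Everything below is illustrative, not the team’s actual model: tiny hand-made feature vectors stand in for labelled image patches, and a nearest-centroid rule stands in for a neural network.

```python
import random

# Toy stand-ins for labelled image patches: each "patch" is a small
# feature vector (imagine colour/shape statistics), hand-labelled as
# "albatross" or "penguin". Real systems use CNNs on full drone images.
random.seed(0)

def make_patch(centre, spread=0.5):
    return [c + random.uniform(-spread, spread) for c in centre]

# Hypothetical feature centres for the two species.
labelled = [(make_patch([1.0, 3.0]), "albatross") for _ in range(50)]
labelled += [(make_patch([3.0, 1.0]), "penguin") for _ in range(50)]

# Hold some labelled examples back to check accuracy later.
random.shuffle(labelled)
train, test = labelled[:80], labelled[80:]

def train_centroids(data):
    """Average the feature vectors seen for each label."""
    sums, counts = {}, {}
    for features, label in data:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def classify(centroids, features):
    """Pick the label whose centroid is closest to the new patch."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lab: dist2(centroids[lab], features))

centroids = train_centroids(train)
accuracy = sum(classify(centroids, f) == lab for f, lab in test) / len(test)
print(f"held-out accuracy: {accuracy:.2f}")
```

The same shape applies whatever the model: humans label the data once, the model learns from it, and held-out examples verify the learning before the model is trusted on new colonies.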

To check accuracy, the results are compared to traditiona­l methods.

“In almost all cases, we find that the neural network does a better job than people,” said Johnston. Plus, it’s consistent: if the system makes mistakes, it makes the same ones – and that consistency means it’s easier to track changes and replicate results. “We know what its bias is, so we can apply it very efficiently every time… reducing the uncertainty associated with the values we produce.”

Not only is the AI more accurate at counting birds, but it’s unsurprisingly faster than a human staring at a computer screen, trying to decide if that’s an albatross, penguin or oddly shaped rock. Previously, counting a seal colony on the ground would have taken a few days, while seabird colonies would take at least a week. “With this model, you could run it in an hour or an hour and a half,” said Johnston, depending on what hardware you’re using.

Listen up

While Johnston’s work shows how automation can collect better quality data and analyse it using AI, Ben Milner’s project solved a different problem: cleaning up the data we have. Rather than drones photographing the ocean, beaches and animals, Milner works with sound. It’s the main way whales are tracked underwater.

There are only 350 North Atlantic right whales left in the world, and they’re under threat from human activity, such as entanglement in fishing equipment or being hit by ships. When a pod of these whales passes through an area, researchers can try to steer them a different way while warning industrial users of that area to be careful of the animals, or even halt operations to let them pass safely.

To do so, researchers need to know the whales’ location. Traditionally this has been done using observers on ships, but that’s difficult, expensive and misses plenty of whales. “They’d be looking to see the whales break the surface, or for spray patterns,” Milner said. “Obviously you can’t do that at night or in fog or low visibility.”

This type of right whale makes two distinct noises. “An ‘upcall’ is quite low frequency but we can just about hear it with our own ears,” he said. “And there’s the ‘gunshot’, which is the sound of a gun going off.”

Underwater microphones, called hydrophones, have helped to automate the process, either placed permanently on buoys or carried temporarily through the water by underwater drones called gliders. While underwater equipment meant listening for those key sounds could be automated, those distinctive vocalisations can be masked by other sounds – including the very ships and offshore work, such as oil and gas drilling, that researchers are hoping to keep away from the animals. “They suffered from a lot of false alarms,” said Milner.

False alarms aren’t helpful when you’re asking industry to halt operations to help animals. “If you end up shutting something down, that may be costing thousands of pounds a day. A false alarm is bad, as is missing a right whale.”

Cleanup operation

Milner’s solution was to create a deep-learning algorithm that can detect and remove background noise, pairing that with a secondary system that can identify whale noises. “If you can remove the background noise, then it leaves a cleaner signal for you to detect the right whale without missing it and avoiding a false trigger,” he explained.

To train the AI, thousands of sound clips of whales, as well as background noise such as shipping and drilling, were converted into images called spectrograms rather than fed in as raw audio. Those were all classified and labelled, letting the deep-learning system understand what each sound represented. A selection of clips was held back to test the system, ensuring it had learned well and was accurate.
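That audio-to-spectrogram step can also be sketched in miniature. This is an illustrative assumption rather than Milner’s pipeline: real systems use windowed, overlapping FFTs at much higher sample rates, while this dependency-free version takes a naive DFT of fixed frames of a synthetic rising tone standing in for an “upcall”.

```python
import math

def spectrogram(samples, frame_size=64):
    """Naive magnitude spectrogram: split the audio into fixed frames
    and take the DFT of each. Rows are time frames, columns are
    frequency bins; brightness at (t, f) is energy at that moment."""
    frames = [samples[i:i + frame_size]
              for i in range(0, len(samples) - frame_size + 1, frame_size)]
    spec = []
    for frame in frames:
        row = []
        for k in range(frame_size // 2):  # keep positive frequencies only
            re = sum(s * math.cos(2 * math.pi * k * n / frame_size)
                     for n, s in enumerate(frame))
            im = -sum(s * math.sin(2 * math.pi * k * n / frame_size)
                      for n, s in enumerate(frame))
            row.append(math.hypot(re, im))
        spec.append(row)
    return spec

# Synthetic stand-in for an upcall: a low tone whose pitch rises.
rate = 1000  # samples per second (illustrative, far below real rates)
tone = [math.sin(2 * math.pi * 50 * t / rate) for t in range(500)]
tone += [math.sin(2 * math.pi * 120 * t / rate) for t in range(500)]
spec = spectrogram(tone)

# The brightest frequency bin should move upwards between the first
# and last frames - the rising sweep an image classifier can spot.
first, last = spec[0], spec[-1]
print(first.index(max(first)), last.index(max(last)))
```

Turning sound into pictures like this is what lets image-recognition techniques, including the repurposed facial recognition model in the Bay of Fundy, be applied to acoustic data at all.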

The system doesn’t spit out an answer of whale or not a whale. Instead, it gives a probability that the sound clip is a right whale: a figure of 0.9 means the system is very confident it is, while 0.1 means it almost certainly isn’t. Researchers can then set a threshold for which clips the system should flag, erring on the side of either higher or lower certainty. “You might say that anything above 0.3 is a right whale, so you’ll get anything that’s similar to a whale,” Milner said. “Or if you want to reduce false alarms, you might set a probability above 0.8 – you’re going to miss some, but you’ll be more certain.”
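The thresholding step is simple enough to show directly. The clip names and scores below are made up for illustration; the point is the trade-off Milner describes between a permissive threshold and a strict one.

```python
# Hypothetical detector output: the probability that each of ten
# sound clips contains a right whale.
scores = {"clip01": 0.95, "clip02": 0.85, "clip03": 0.72, "clip04": 0.55,
          "clip05": 0.41, "clip06": 0.35, "clip07": 0.28, "clip08": 0.15,
          "clip09": 0.08, "clip10": 0.02}

def flag(scores, threshold):
    """Return, sorted, the clips whose whale probability meets the threshold."""
    return sorted(c for c, p in scores.items() if p >= threshold)

# A permissive threshold catches near-misses but risks false alarms...
print(len(flag(scores, 0.3)))
# ...while a strict one trades missed whales for fewer costly shutdowns.
print(len(flag(scores, 0.8)))
```

Every clip flagged by the strict threshold is also flagged by the permissive one; choosing between them is an operational decision about which error is more expensive, not a property of the model.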

The whole system helps increase the amount of data that can be examined, while also boosting accuracy. “It will reduce the number of missed whales and reduce the number of false alarms,” said Milner.

Reuse and recycle

As with Johnston’s work, this right whale detection system can be reused across other animals simply by training the model on a new dataset. “We could train our classifier so it knows what a blue whale sounds like – we can actually train it so it knows the difference between a blue whale and a right whale,” Milner said. “It can then detect either of them.”

Accomplishing that requires a set of labelled training data, and that’s the hard bit: if the researchers make mistakes classifying the dataset of sounds used to teach the AI, those errors will be baked into the system. “If the human expert was incorrect, then the model will not learn correctly,” explained Milner. “It’s just like with a human – if you’re told the wrong thing then you’ll learn wrong.”

But with the right datasets and accurate labelling, this mix of background noise scrubbing and classification could have uses well beyond whales. One of Milner’s students is adapting the system to track small mammals near Chernobyl using passive acoustic monitoring devices, following their activity and recovery.

That reuse of scientific techniques mirrors how researchers are reusing existing technologies to help monitor the animal world. While Milner’s team had to develop these AI models themselves, the ideas at their core come from speech processing and recognition, which is Milner’s research background, rather than whales. “Those technologies apply just as well to right whales or small mammals or bats because they’re just acoustic signals that we represent as images in spectrograms,” he said. In short, the same sort of systems that let you ask Alexa or Siri a question are being repurposed to save endangered whales in the North Atlantic.

That reuse of existing innovations is exactly how Johnston’s projects came about, too. The idea was sparked by the use of drones to capture agricultural data, and the equipment used is often off-the-shelf consumer hardware.

Some drones are specialist and customised, but some of the work done by Johnston is possible with a drone picked up at a Best Buy, he says. Even the AI is pre-made. The algorithm used by Johnston and his team was released as an open-source project, meaning it could be downloaded for easy use. “If someone before us hadn’t been committed to open science and had their code up… we would not have been able to do this as efficientl­y,” Johnston explained.

The future of innovation in marine research isn’t necessarily invention, but creative remixing. “So many of the tools for scientists to excel are out there and available,” explained Johnston. “It’s just assembling them in unique ways.”

“We can actually train it so it knows the difference between a blue whale and a right whale. It can then detect either of them”

ABOVE Labels teach the AI the difference between an albatross and a penguin

BELOW The AI cuts down the bird-counting time from weeks to hours
