BETTER CORAL THROUGH QUANTIFICATION
“If you can’t measure it, you can’t improve it.” Peter Drucker may have been writing about business, but his thesis also applies to conserving the world’s coral reefs. Scientists estimate that the earth has lost over half of its coral reefs since 1980, and they fear the remaining habitat could be gone by 2100.
Enter a new partnership between the Khaled bin Sultan Living Oceans Foundation (KSLOF) and NASA’s Ames Research Center that’s aimed at quantifying and qualifying the world’s coral reefs.
Some backstory: KSLOF’s Global Reef Expedition spent a decade aboard the research vessel Golden Shadow, mapping some 65,000 square kilometers of coral reefs in the Atlantic, Pacific and Indian oceans. The result is a massive dataset of high-resolution imagery that’s being shared with NASA, with the goal of significantly expanding the agency’s ability to map reefs and—using historical satellite imagery—determine their health.
NASA will sift through KSLOF’s data using its Pleiades supercomputer, automation, machine learning and its Neural Multi-modal Observation and Training Network (NEMO-NET). NEMO-NET is an online game, powered by Pleiades, in which citizen scientists identify corals in images; player input is then used to train Pleiades to identify coral autonomously. Additionally, NASA will use its cutting-edge FluidCam to precisely survey coral reefs in 3D, without distortion, from aircraft and drones.
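To make the citizen-science step concrete, here is a minimal sketch of how labels from multiple NEMO-NET players might be combined into a single training label per image tile. The function name and the simple majority-vote scheme are assumptions for illustration; the article does not describe NEMO-NET’s actual aggregation method.

```python
from collections import Counter

def consensus_label(votes):
    """Combine citizen-scientist votes for one image tile into a
    single training label via majority vote (hypothetical scheme;
    NEMO-NET's real aggregation logic is not described here)."""
    if not votes:
        return None
    # Counter.most_common(1) returns the (label, count) pair
    # with the highest count.
    return Counter(votes).most_common(1)[0][0]

# Several players classify the same coral-reef image tile:
votes = ["branching coral", "branching coral", "sand", "branching coral"]
print(consensus_label(votes))  # branching coral
```

Aggregated labels like these could then serve as ground truth when training an image classifier to recognize coral cover on its own.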
“This is a game-changer,” says Sam Purkis, KSLOF’s chief scientist and a professor and chair of marine geosciences at the University of Miami’s Rosenstiel School of Marine and Atmospheric Science. “NASA’s new imaging technologies and supercomputers dramatically change the landscape of what is possible in terms of mapping coral reefs.”
With luck, NASA’s enhanced capabilities will empower nations to improve conditions for the world’s remaining reefs.