Santa Fe New Mexican

Satellite images sharpen vision of Earth

By Amanda Ziemann

Amanda Ziemann is a remote sensing scientist at Los Alamos National Laboratory. A version of this article first appeared on Space.com.

Being able to accurately detect changes to the Earth’s surface using satellite imagery can aid in everything from climate change research and farming to human migration patterns and nuclear nonproliferation. But until recently, it was not possible to flexibly integrate images from multiple types of sensors — for example, ones that show surface changes (such as new building construction) versus ones that show material changes (such as water to sand). Now, with a new capability, we can — and in doing so, we get a more frequent and complete picture of what’s happening on the ground.

At Los Alamos National Laboratory, we’ve developed a flexible mathematical approach to identify changes in satellite image pairs collected from different satellite sensor types that use different sensing technologies, allowing for faster, more complete analysis. It’s easy to assume all satellite images are the same and, thus, that comparing them is simple. But the reality is quite different. Hundreds of different imaging sensors are orbiting the Earth right now, and nearly all take pictures of the ground in a different way from the others.

Take, for example, imaging sensors that capture information from multiple spectral channels, or types of light. These are among the most common types of sensors and give us the images most of us think of when we hear “satellite imagery.” These imaging sensors are alike in that they can capture color information beyond what the human eye can see, making them extremely sensitive to material changes. For example, they can clearly capture a grassy field that, a few weeks later, is replaced by synthetic turf.

But how they capture those changes varies widely from one sensor to the next. One might measure four different colors of light, for instance, while another measures six. Each sensor might measure the color red differently.
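To make that mismatch concrete, here is a small illustrative sketch in Python. The wavelengths and reflectance values are invented for the example (they do not describe any real sensor); it simply shows one simple way to put a four-band and a six-band measurement onto a shared wavelength grid so they can be compared at all.

```python
# Illustrative only: two hypothetical sensors report reflectance at
# different wavelengths (one with four bands, one with six), so even
# "red" is measured differently. Interpolating both onto a shared
# wavelength grid is one simple way to make the measurements comparable.
import numpy as np

sensor_a_wavelengths = np.array([480, 560, 660, 830])            # nm, 4 bands
sensor_b_wavelengths = np.array([440, 510, 590, 640, 700, 860])  # nm, 6 bands

sensor_a_pixel = np.array([0.10, 0.22, 0.35, 0.41])              # one pixel's reflectance
sensor_b_pixel = np.array([0.08, 0.15, 0.27, 0.33, 0.38, 0.44])

common_grid = np.arange(480, 840, 20)  # shared wavelengths both sensors cover

a_resampled = np.interp(common_grid, sensor_a_wavelengths, sensor_a_pixel)
b_resampled = np.interp(common_grid, sensor_b_wavelengths, sensor_b_pixel)

print(np.abs(a_resampled - b_resampled).max())  # now directly comparable
```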

Add to this the fact that these sensors aren’t the only type of satellite imager. For example, there is also synthetic aperture radar, or SAR, which captures radar images of the Earth’s surface structure in fine detail. These SAR images are sensitive to surface changes or deformation and are commonly used for applications such as volcano monitoring and geothermal energy. So, once again, we have an imaging sensor that is capturing information in a completely different way from another.

This is a real challenge when comparing these images. When signals come from two different remote sensing techniques, traditional approaches for detecting changes will fail because the underlying math and physics no longer make sense. But there’s information to be had there because these sensors are all imaging the same scenes, just in different ways. So how can you look at all of these images captured by different methods in a way that automatically identifies changes over time?

Our mathematical approach makes this possible by creating a framework that not only compares images from different types of sensors, but also effectively “normalizes” the different types of imagery — all while maintaining the original signal information.
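To give a flavor of what such a comparison can look like in practice (this is a rough sketch of the general idea, not the Los Alamos algorithm), the Python example below normalizes each image’s bands and then scores each pixel’s stacked before-and-after measurement against statistics computed over the whole scene. Pixel pairs that stand out from that scene-wide relationship are flagged as candidate changes. The array shapes, band counts, and helper functions here are assumptions made for the illustration.

```python
# A minimal, illustrative sketch of cross-sensor change detection
# (not the Los Alamos algorithm). It assumes two co-registered images
# of the same scene, possibly with different numbers of bands, stored
# as NumPy arrays of shape (rows, cols, bands).
import numpy as np

def normalize_bands(image):
    """Z-score each band so images from different sensors are comparable."""
    flat = image.reshape(-1, image.shape[-1]).astype(float)
    return (flat - flat.mean(axis=0)) / (flat.std(axis=0) + 1e-12)

def change_map(image_t0, image_t1):
    """Score how unusual each pixel's stacked (before, after) measurement is
    relative to the scene-wide joint statistics of the two images."""
    x = normalize_bands(image_t0)
    y = normalize_bands(image_t1)
    joint = np.hstack([x, y])                      # stack both sensors' bands per pixel
    cov = np.cov(joint, rowvar=False)              # scene-wide joint statistics
    inv_cov = np.linalg.pinv(cov)
    centered = joint - joint.mean(axis=0)
    scores = np.einsum("ij,jk,ik->i", centered, inv_cov, centered)
    return scores.reshape(image_t0.shape[:2])      # one score per pixel

# Example with synthetic data: a 4-band image and a 6-band image of the
# same 100 x 100 scene, as if from two different spectral sensors.
before = np.random.rand(100, 100, 4)
after = np.random.rand(100, 100, 6)
anomaly = change_map(before, after)
print(anomaly.shape)  # (100, 100): higher values flag unusual pixel pairs
```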

But the most important benefit of this image integration is that we’re able to see changes as frequently as minutes apart. Previously, the time elapsed between images captured by the same sensor could be days or weeks. But being able to integrate various types of images means we’re able to use data from more sensors faster, and thus see changes more quickly, which allows for more rigorous analysis.

To test our method, we looked at images of the construction of the new SoFi Stadium in Los Angeles, starting in 2016. We began by comparing the different types of images over the same date range to see which ones picked up which changes. For example, in one case, the roof of a building beside the stadium was replaced — changing it from beige to white over the course of several months. The spectral imaging sensors detected this change because it was related to color and material. SAR, as we expected, did not. SAR was, however, highly sensitive to the surface changes caused by moving dirt piles, whereas the spectral imagery was not.

When we integrated the images using our new analysis capability, we were able to see both changes — the surface and the material — at a much faster rate than if we had focused on any one individual satellite. This had never been done before at scale, and it signals a potential fundamental shift in how satellite imagery is analyzed.

We were also able to demonstrate how changes could be detected much faster than before. In one instance, we compared different spectral images collected just 12 minutes apart. In fact, the interval was so short that we were able to detect a plane flying through the scene.

As space-based remote sensing continues to become more accessible — particularly with the explosive use of cubesats and smallsats in both government and commercial sectors — more satellite imagery will become available. That’s good news in theory because it means more data to feed comprehensive analysis. In practice, however, this analysis is challenged by the overwhelming volume of data, the diversity in sensor designs and the stove-piped nature of image repositories for different satellite providers. Furthermore, as image analysts become deluged with this tidal wave of imagery, the development of automated detection algorithms that “know where to look” is paramount.

This new approach to change detection won’t solve all of those challenges, but it will help by playing to the strengths of the various satellite imagers — and give us more clarity about the changing landscape of our world in the process.
