Combining different types of satellite images sharpens our picture of Earth

Being able to accurately detect changes to the Earth’s surface using satellite imagery can help with everything from climate change research and agriculture to tracking human migration patterns and nuclear nonproliferation. But until recently, it was not possible to flexibly integrate imagery from multiple types of sensors, for example, combining images that reveal surface changes (such as the construction of new buildings) with those that reveal material changes (such as water giving way to sand). Now we can, and in doing so we get a more frequent and complete picture of what is happening on the ground.

At Los Alamos National Laboratory, we have developed a flexible mathematical approach for identifying changes in pairs of satellite images collected by different types of satellite sensors that use different detection technologies, allowing for faster and more complete analysis. It is easy to assume that all satellite images are the same and that comparing them is therefore straightforward. But the reality is very different. Hundreds of different imaging sensors are orbiting the Earth right now, and almost all of them capture images of the ground in a different way than the others.

Take, for example, imaging sensors that capture information across multiple spectral channels, or types of light. These are among the most common types of sensors and give us the images most of us think of when we hear “satellite imagery.” They can capture color information beyond what the human eye can see, making them extremely sensitive to material changes. For example, they can clearly capture a grass field that, a few weeks later, is replaced by synthetic turf.

But how they capture these changes varies greatly from sensor to sensor. One might measure four different colors of light, for example, while another measures six, and even two sensors that both measure red may measure it differently.

Add to that the fact that these spectral sensors are not the only source of satellite imagery. There is also Synthetic Aperture Radar, or SAR, which captures radar images of the structure of the Earth’s surface in great detail. These SAR images are sensitive to surface changes and deformations and are commonly used for applications such as monitoring volcanoes and geothermal energy. So, again, we have an imaging sensor that collects information in a completely different way from the others.

Comparing these images is a real challenge. When the signals come from two different remote sensing techniques, traditional approaches to change detection fail because the underlying math and physics no longer make sense. But there is information to be gained, since these sensors all image the same scenes, just in different ways. So how can you look at all of these images captured by different methods in a way that automatically identifies changes over time?

Our mathematical approach makes this possible by creating a framework that not only compares images from different types of sensors, but also effectively “normalizes” different types of imaging, while retaining the original signal information.
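To give a rough sense of the idea, one simple way to put two very different sensors on common footing is to statistically whiten each image and then score each pixel by how unusual its joint signature is across the pair. The sketch below is only an illustrative stand-in written in Python, not our actual algorithm; the array shapes, the whitening step, and the anomaly-style scoring rule are all assumptions made for illustration.

```python
# Illustrative sketch only: cross-sensor change scoring via per-sensor
# whitening followed by a joint anomaly score. Not the Los Alamos method.
import numpy as np

def whiten(pixels):
    """Zero-mean, unit-covariance transform of an (N, bands) pixel matrix."""
    centered = pixels - pixels.mean(axis=0)
    cov = np.atleast_2d(np.cov(centered, rowvar=False))
    vals, vecs = np.linalg.eigh(cov)
    inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(np.maximum(vals, 1e-12))) @ vecs.T
    return centered @ inv_sqrt

def change_scores(img_a, img_b):
    """Change score per pixel for two co-registered images that may have
    different numbers of bands (e.g., 4-band spectral vs. 1-band SAR)."""
    h, w, _ = img_a.shape
    a = whiten(img_a.reshape(h * w, -1))
    b = whiten(img_b.reshape(h * w, -1))
    joint = np.hstack([a, b])                 # stack both sensors per pixel
    cov = np.cov(joint, rowvar=False)
    inv_cov = np.linalg.inv(cov + 1e-9 * np.eye(cov.shape[0]))
    joint_md = np.einsum('ij,jk,ik->i', joint, inv_cov, joint)
    # Discount pixels that are merely unusual in one image on its own, so the
    # score emphasizes pixels whose pairing across the two sensors is unusual.
    score = joint_md - (a ** 2).sum(axis=1) - (b ** 2).sum(axis=1)
    return score.reshape(h, w)

# Synthetic example: a 4-band "spectral" image and a 1-band "SAR-like" image.
rng = np.random.default_rng(0)
spectral = rng.normal(size=(64, 64, 4))
sar = rng.normal(size=(64, 64, 1))
print(change_scores(spectral, sar).shape)  # (64, 64) map of change scores
```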

But the most important advantage of this image integration is that we can see changes occurring as little as a few minutes apart. Previously, the time that elapsed between images captured by the same sensor could be days or weeks. Being able to integrate different types of images means that we can use data from more sensors, and get it sooner, and thus see changes sooner, which allows for more rigorous analysis.
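As a toy illustration of why pooling sensors shortens the wait between looks at a scene, consider merging the acquisition times of two hypothetical sensors with different revisit schedules; the dates and revisit periods below are made up for the example.

```python
# Illustrative only: pooling acquisition times from two hypothetical sensors
# shrinks the longest gap between consecutive observations of a scene.
from datetime import datetime, timedelta

sensor_a = [datetime(2016, 6, 1) + timedelta(days=16 * i) for i in range(4)]  # ~16-day revisit
sensor_b = [datetime(2016, 6, 5) + timedelta(days=12 * i) for i in range(5)]  # ~12-day revisit

def max_gap_days(times):
    """Longest wait, in days, between consecutive observations."""
    times = sorted(times)
    return max((later - earlier).days for earlier, later in zip(times, times[1:]))

print(max_gap_days(sensor_a))             # 16 days using one sensor alone
print(max_gap_days(sensor_a + sensor_b))  # 12 days once both sensors are pooled
```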

To test our method, we looked at imagery of the construction of the new SoFi Stadium in Los Angeles, beginning in 2016. We started by comparing the different types of images over the same date range to see which ones picked up which changes. In one case, for example, the roof of a building next to the stadium was replaced, changing from beige to white over the course of several months. Spectral imaging sensors detected this change because it involved color and material; SAR, as we expected, did not. SAR, however, was very sensitive to the surface changes caused by moving piles of soil, whereas spectral imagery was not.

When we integrated the images using our new analytical capability, we were able to see both changes – surface and material – at a much faster rate than if we were focusing on a single satellite. This has never been done on a large scale before and signals a potential fundamental change in the way satellite imagery is analyzed.

We were also able to demonstrate how changes can be detected much faster than before. In one case, we were able to compare different spectral images collected just 12 minutes apart. In fact, it was so fast that we were able to detect a plane flying over the scene.

As remote sensing from space continues to become more accessible, especially with the explosive growth of CubeSats and small satellites in the government and commercial sectors, more satellite imagery will become available. In theory, this is good news, because it means more data to fuel thorough analysis. In practice, however, that analysis is challenged by the overwhelming volume of data, the diversity of sensor designs, and the siloed nature of image repositories across different satellite vendors. And as image analysts are inundated with this tidal wave of images, the development of automated detection algorithms that “know where to look” becomes paramount.

This new approach to change detection will not solve all of these challenges, but it will help leverage the strengths of various satellite imagers and, in the process, give us more clarity on the changing landscape of our world.

Amanda Ziemann is a remote sensing scientist at the Los Alamos National Laboratory. A version of this article first appeared on Space.com.
