by Amanda Ziemann
Being able to accurately detect changes to Earth's surface using satellite imagery can aid in everything from climate change research and farming to tracking human migration patterns and monitoring nuclear nonproliferation. But until recently, it was impossible to flexibly integrate images from multiple types of sensors: for example, ones that show surface changes (such as new building construction) versus ones that show material changes (such as water to sand). Now, with a new algorithmic capability, we can, and in doing so we get a more frequent and complete picture of what's happening on the ground.
At Los Alamos National Laboratory, we've developed a flexible mathematical approach to identify changes in satellite image pairs collected from different satellite modalities, or sensor types that use different sensing technologies, allowing for faster, more complete analysis. It's easy to assume that all satellite images are the same and, thus, comparing them is simple. But the reality is quite different. Hundreds of different imaging sensors are orbiting the Earth right now, and nearly all take pictures of the ground in a different way from the others.
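The article does not describe the Laboratory's actual algorithm, but the core idea of comparing images from two different sensor types can be illustrated with a much simpler stand-in. The sketch below is a hypothetical, minimal example (not the Los Alamos method): it assumes two co-registered single-band images, fits a scene-wide linear mapping between the sensors' pixel values, and flags pixels that deviate strongly from that mapping as candidate changes.

```python
import numpy as np

def change_map(img_a, img_b, threshold=3.0):
    """Flag pixels whose cross-sensor relationship deviates from the scene-wide trend.

    img_a, img_b: co-registered 2-D float arrays from two (possibly
    different) sensors. Returns a boolean mask of candidate changes.
    """
    a = img_a.ravel().astype(float)
    b = img_b.ravel().astype(float)

    # Fit one global linear mapping from sensor A's values to sensor B's.
    # Unchanged pixels should follow this trend even though the two
    # sensors measure the scene differently.
    slope, intercept = np.polyfit(a, b, 1)
    residual = b - (slope * a + intercept)

    # Pixels whose residual is many standard deviations from typical
    # are anomalies with respect to the fitted cross-sensor relationship.
    z = (residual - residual.mean()) / residual.std()
    return (np.abs(z) > threshold).reshape(img_a.shape)

# Synthetic demonstration: sensor B responds roughly linearly to the same
# scene as sensor A, except in one patch where the ground has changed.
rng = np.random.default_rng(0)
scene_a = rng.normal(size=(50, 50))
scene_b = 2.0 * scene_a + 0.1 * rng.normal(size=(50, 50))
scene_b[10:15, 10:15] += 5.0  # simulated surface change
mask = change_map(scene_a, scene_b)
```

Real multimodal change detection must also handle differing resolutions, viewing geometries, and spectral bands, which is what makes flexibly integrating arbitrary sensor pairs hard; this toy version sidesteps all of that by assuming perfectly aligned inputs.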
Read the rest of the story as it appeared on Space.com.