How AI and ML Are Supercharging Earth Observation
Armed with AI, satellites can speed up image processing and deliver meaningful, timely information to stakeholders. But is the buzz overshadowing the reality of what’s possible?

July 24th, 2023

Earth Observation (EO) satellites have captured meaningful snapshots of our planet for decades, from images of ice caps over a 20-year interval to images of military assets in a war zone. But the job of deriving actionable insights can take hours.
Enter AI and machine learning. Because AI/ML algorithms can be programmed, or trained, to analyze multiple data sets and derive critical information, tasks that previously took hours or days — such as analyzing environmental patterns or military surveillance — now take minutes or seconds.
It’s like EO on steroids.
“Because of these enormous explosions of data volumes, it's rather difficult to both manage all that data to find information that's mission critical, but also to decide how [to] downlink just the right information, so that the latencies from sensor to decision get driven down as fast as possible,” says Manny Gonzalez-Rivero, Maxar’s director of Applied Machine Learning.
Maxar has made an all-hands-on-deck investment in machine learning models and in how they can improve accuracy when applied to satellite imagery. Recently, Maxar engineers published a paper in the journal Nature on combining high-resolution imagery (15 cm HD) with machine learning models to improve the accuracy of detecting small objects such as solar panels.
Gonzalez-Rivero says these EO activities would be nearly impossible without the assistance of machine learning models. The reduced latency “actually makes the information actionable instead of more academic in nature,” he says. “You’re not just writing reports about what happened. You can actually get information fast enough to do something about it.”
To illustrate the importance of speed to insight, Gonzalez-Rivero offers the hypothetical example of the U.S. military engaging in broad area surveillance of an area where missile launchers are stationed.
“You may only have about five minutes from the moment that you detect that activity to the moment you may want to take direct action,” he says, emphasizing the importance of utilizing onboard analytics and machine learning technologies at the edge of the network to collect and analyze information. These analytics determine which 100-by-100-pixel chip, out of 3.8 million square kilometers of collected imagery, should be downlinked immediately so that the customer can make the most relevant decision as quickly as possible.
In turn, machine learning models can be trained to filter out non-critical data that slows things down.
“If you have a pixel every 30 centimeters, you start talking about really, really heavy data loads and that really starts clogging up your downlink bandwidth,” says Gonzalez-Rivero. “It becomes even more critical that we use AI and ML to drive those kinds of missions.”
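To make that concrete, here is a minimal Python sketch of the kind of onboard triage Gonzalez-Rivero describes: a scene is split into 100-by-100-pixel chips, each chip is scored for likely relevance, and only the highest-scoring chips are queued for downlink. The `score_chip` heuristic and all names here are illustrative stand-ins, not Maxar’s actual onboard models.

```python
import numpy as np

CHIP = 100  # chip edge length in pixels


def score_chip(chip: np.ndarray) -> float:
    """Placeholder relevance score; a real system would run an onboard
    ML detector here rather than a simple contrast heuristic."""
    return float(chip.std())


def select_chips(scene: np.ndarray, budget: int) -> list[tuple[int, int, float]]:
    """Split a scene into 100x100 chips, score each one, and return the
    top-`budget` chips as (row, col, score) for immediate downlink."""
    rows, cols = scene.shape[0] // CHIP, scene.shape[1] // CHIP
    scored = []
    for r in range(rows):
        for c in range(cols):
            chip = scene[r * CHIP:(r + 1) * CHIP, c * CHIP:(c + 1) * CHIP]
            scored.append((r, c, score_chip(chip)))
    scored.sort(key=lambda t: t[2], reverse=True)
    return scored[:budget]


# Example: a synthetic 1,000 x 1,000 pixel scene; downlink only the 5 "busiest" chips.
scene = np.random.default_rng(0).normal(size=(1000, 1000))
for r, c, s in select_chips(scene, budget=5):
    print(f"downlink chip at row {r}, col {c} (score {s:.3f})")
```

The point of the design is the bandwidth budget: rather than pushing the full scene over a constrained downlink, only the handful of chips most likely to matter leaves the satellite.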
The use cases for AI/ML-powered satellite EO, from maritime domain awareness to monitoring elephants, are endless, and government agencies and private industry are working to evolve the models to make the best use of the data. Here is a look at how they are making progress.
Impactful Insights
One of the more telling signs of AI/ML’s role in the satellite industry’s future transpired in August, when NASA announced the release of an open-source geospatial artificial intelligence foundation model for Earth observation data.
The foundation model, fueled by a public/private partnership involving NASA’s Interagency Implementation and Advanced Concepts Team (IMPACT) and IBM Research, was built using NASA’s Harmonized Landsat and Sentinel-2 (HLS) dataset. It supports a wide range of potential environmental applications, including tracking changes in land use, monitoring natural disasters, and predicting crop yields. NASA and IBM released the foundation model on Hugging Face, a public repository for open-source machine learning models.
The goal for NASA’s Earth Science Data Systems program is to make sure that the valuable Earth observations that NASA assets collect from satellites, ground instruments, and airborne instruments are effectively utilized by the science community and the broader community, says Rahul Ramachandran, IMPACT manager and a senior research scientist at NASA Marshall.
“We realize that technology changes fast, and we have to make sure our data and information systems are more proactive about adopting new technologies to enable faster access, discovery, and use of these complex science data sets,” Ramachandran says.
One area that researchers are heavily focused on in 2023 is how to infuse AI and machine learning in the entire data lifecycle, as well as the research lifecycle, to improve the way data is managed.
“We are looking at petabytes of data – right now we have 70 petabytes. It's supposed to grow to 600 petabytes in a few years,” says Ramachandran. “Our existing processes on the way we manage and analyze data will not scale. We definitely view AI as a tool that can help in that.”
The foundation model also lowers barriers to entry for public and private organizations, because the pre-training of the model is done in advance through self-supervised learning.
“The foundation model just runs on this data, and it understands what is inside the data and models itself,” says Ramachandran. “And then you can give it to user communities who can take the foundation model and tune it for their application, whether it’s flood mapping, burn scar detection, cloud classification, or crop classification, using smaller amounts of training data than normally required to train a deep learning model.”
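As a rough illustration of that fine-tuning workflow, the PyTorch sketch below freezes a pre-trained encoder and trains only a small task head on a handful of labeled chips. The `PretrainedEncoder` class is a stand-in: in practice the encoder would be the released foundation model loaded from Hugging Face, and the labels would come from a real flood-mapping or crop-classification dataset.

```python
import torch
import torch.nn as nn

# Stand-in for a pre-trained geospatial encoder; in practice this would be
# loaded with the released foundation model weights from Hugging Face.
class PretrainedEncoder(nn.Module):
    def __init__(self, in_bands: int = 6, dim: int = 128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(in_bands, dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )

    def forward(self, x):
        return self.backbone(x)


encoder = PretrainedEncoder()
for p in encoder.parameters():          # freeze the pre-trained weights
    p.requires_grad = False

head = nn.Linear(128, 2)                # small task head, e.g. flood / no flood
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# A tiny labeled batch stands in for the "smaller amount of training data."
x = torch.randn(8, 6, 64, 64)           # 8 image chips, 6 spectral bands
y = torch.randint(0, 2, (8,))           # labels from a small annotated set

for _ in range(10):                      # brief fine-tuning loop
    optimizer.zero_grad()
    loss = loss_fn(head(encoder(x)), y)
    loss.backward()
    optimizer.step()
```

Because only the lightweight head is trained, a user community can adapt the shared model to its own application without the compute or labeled data that full pre-training would require.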
Like NASA, a growing number of legacy non-government organizations are putting their faith in the future of AI and machine learning. Many of the most compelling use cases, which use ML and AI to synthesize EO satellite imagery with other raw data sources, are showing promise in combating climate change.
For example, Scepter and ExxonMobil recently announced a new project powered by Amazon Web Services (AWS) to address methane emissions through space-based monitoring. Similarly, Orbital Sidekick (OSK) uses advanced satellite technology and data analytics powered by AWS to monitor energy pipelines and reduce risks and emissions.
“AWS supports a variety of organizations to leverage Earth observation insights that have meaningful, real-world impact,” says Clint Crosier, director of AWS Aerospace & Satellite. Crosier notes that OSK’s monitoring and analytic technologies have monitored more than 12,000 miles of pipelines to date and flagged nearly 100 suspected methane leaks, 200 suspected liquid hydrocarbon leaks, and more than 300 intrusive events related to construction.
Customers want to use EO data to monitor change over time, and AI is key to delivering on that, says Patrick O'Neil, BlackSky chief technology officer. He says BlackSky has taken a software-first approach to EO with a significant commitment to reliable, automated systems including its tasking service and multi-intelligence analytics platform.
“Without AI, it’s challenging to extract insights from space data in the timelines required by our customers,” O’Neil says. “Customers are becoming more interested in observing change over time and the shift from mapping the Earth to dynamic time-diverse monitoring informs why AI needs to be an integral part of the EO business.”
Stronger Defenses
AI and ML are also aiding international oversight and defense operations.
HawkEye 360, which specializes in space-based radio frequency (RF) data and analytics, is utilizing AI/ML-based technologies to monitor activities that may or may not be aligned with international guidelines.
Most recently, the company announced a pilot program in July with Australia’s Department of Foreign Affairs and Trade (DFAT) to leverage its satellite RF maritime analytics and training module to provide greater maritime domain awareness to help detect and prevent illegal, unreported, and unregulated (IUU) fishing.
HawkEye 360’s satellites pick up sources of RF activity such as radars and GPS jammers, which are applied to geolocation-focused AI/ML models to derive actionable intelligence, says Kate Zimmerman, the company’s chief data scientist.
For example, if a ship turns off its AIS transponder in an attempt to engage in illegal fishing, HawkEye 360’s AI/ML technology embedded within its satellites can process RF energy emissions data and synthesize it with other insights to estimate the likelihood that the dark vessel is engaging in illegal activities.
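A toy, rule-based version of that kind of data fusion might look like the Python below: an AIS gap, continued radar emissions, and a position inside a protected zone each raise a suspicion score. The fields, thresholds, and weights are invented for illustration; HawkEye 360’s actual models are learned from its RF data, not hand-coded rules.

```python
from dataclasses import dataclass


@dataclass
class VesselTrack:
    ais_gap_hours: float        # time since the ship last broadcast AIS
    radar_detections: int       # shipboard radar emissions geolocated nearby
    in_protected_zone: bool     # position falls inside a no-fishing area


def dark_vessel_score(track: VesselTrack) -> float:
    """Toy likelihood that a vessel is 'dark' and worth investigating.
    Weights are arbitrary; a production system would learn them from
    labeled RF and AIS data."""
    score = 0.0
    if track.ais_gap_hours > 6:
        score += 0.4                     # gone quiet on AIS
    if track.radar_detections > 0:
        score += 0.3                     # still emitting RF while AIS is off
    if track.in_protected_zone:
        score += 0.3
    return min(score, 1.0)


suspect = VesselTrack(ais_gap_hours=12, radar_detections=3, in_protected_zone=True)
print(f"dark-vessel score: {dark_vessel_score(suspect):.2f}")
```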
“The dream is, of course, to be able to apply this level of modeling to nearly any RF activity,” says Zimmerman. HawkEye 360 is working to use AI to get more detailed information about the emitters its satellites detect, to give more details to the end user, along with geolocation information.
“We’re trying to get to a place where we can actually say, it’s not just a ship in the ocean, it’s this ship in the ocean, which gets really important for the dark-vessel scenario, where they’re engaging in illegal activity,” says Zimmerman. “We’ve been applying machine learning to get to that kind of increased fidelity of saying that it’s not just a radar in the ocean.”
Quentin Donnellan, president of Space and Defense for Hypergiant, a five-year-old startup that was recently acquired by Trive Capital, says interest in its AI-enabled, cloud-based command-and-control technologies has increased. Hypergiant’s customers include divisions within the United States military.
“[When] operators have to make a decision [and] they're looking at a bunch of data, they're potentially looking at a geospatial context. They’re maybe leveraging machine learning or artificial intelligence, or other decision aids and they typically have a compressed timeframe within which to make decisions. The cost of making the wrong decision or potentially delaying any sorts of decision are sometimes high,” Donnellan says.
Space is a key vantage point to collect data on Earth in real-time.
“If you want to collect data about the Earth in real time, there's only one place you can do that from,” he says. “The alternative is just start deploying sensors everywhere on the ground [and] that just doesn't scale. There's only one domain where I can go out and collect data anywhere on the Earth quickly. It will continue to become, though it already is, very, very important for the data collection piece of all of this.”
The Outlook for Generative AI
Perhaps a sign of the times, Donnellan says he can't go to a defense conference these days without seeing people talk about hot-button AI issues such as responsible AI, or the emergence of large language models like ChatGPT and how they will affect the workforce. Most likely, these hot-button issues will become bigger topics within satellite industry conferences, too.
One of the criticisms of ChatGPT is where it pulls information from, and there are accounts of it regurgitating incorrect information. Knowing what’s inside the “black box,” or internal data, is key, Donnellan says.
“If you're going to leverage AI into the decision-making process — especially if that interface is very easy for humans to use — there has to be somewhere where you open up the box and can examine where everything came from,” he says.
Few AI applications have received more attention over the last 12 months than ChatGPT, which, AWS exec Crosier notes, has captured widespread attention and the imagination of the greater public. Generative AI, like ChatGPT, is AI that can create new content such as text and images.
Of the buzz, he says: “We’re just at the beginning of imagining how tools like generative AI can enhance today’s space missions, but we’re certainly at an exciting inflection point. AWS believes generative AI is poised to transform virtually every customer experience — including the space industry.”
Crosier says AWS is letting its space customers lead the way in experimenting and defining where this technology will provide them the most benefit.
“While it’s still early days,” Crosier says, “we believe that generative AI holds promise to enhance a variety of space applications such as enhancing satellite imagery analysis and improving object identification, optimizing flight paths and mission planning, and identifying optimal testing scenarios for digital twins.” VS