Big Data from Small Sats
Dozens of speakers from government and industry discussed solutions for analyzing the exponentially growing mountain of big data derived from small satellite imagery at USGIF’s “Powering GEOINT Analytics” workshop.
Dr. Anthony Vinci, director of plans and programs at the National Geospatial-Intelligence Agency (NGA), gave the keynote address at the workshop, which was held April 13 at NGA Campus East in Springfield, Va.
Vinci said the convergence of small sats and big data will be a significant part of the agency’s future, and that the ability to detect change at a global scale will allow the agency to take on new missions that were never before possible.
“[Humans] can’t look at every part of the world every day,” Vinci said. “Lots of people come at big data as a problem … to me it’s an opportunity. We do great missions every day, and now we can do more of them.”
But accomplishing this, he continued, will require changing how imagery analysis and GEOINT are defined.
“There may be a convergence where imagery analysts become big data analysts,” Vinci said. “There may be a time where they don’t look at imagery, or only do so as a last step after an anomaly is detected. That’s a huge shift.”
He said “cross training” between imagery science and data analysis is another option worth exploring.
“We need to rethink at a core level who we are, what we are, and how we do it,” Vinci concluded.
Following Vinci, a group of industry leaders provided updates from their companies. Rob Rainhart, senior vice president of engineering for HawkEye 360, said the company would launch its first cluster of three commercial RF small sats early next year, following the $11 million in Series A funding it recently secured.
“We’ll be delivering a new breed of analytics capabilities as well to support the growing need for better pattern-of-life detection,” Rainhart said.
Alex Bakir, vice president of product marketing with Planet, described the organization as “more like a software company with disaggregated risk over many satellites,” noting how humans have been taken out of the loop at nearly every step of the technology chain.
Bakir said Planet is interested in merging its data sets with others and in providing analysts and decision-makers with the tools to best structure their day and derive information from data.
“The Planet data set is interesting, but it’s not the only data set,” he said. “We want to bring it together with other sets. … The goal is not to solve every end user’s problem, but to enable the data set to do that.”
Todd M. Bacastow, director of strategic alliances at DigitalGlobe, said more data does not necessarily equal better insight, pointing to big data analytics as the way to gain decision advantage.
When asked about recent mergers and acquisitions within the sector—DigitalGlobe acquiring The Radiant Group then itself being acquired by MDA, or Planet acquiring Terra Bella—the speakers described the trend as responding to user needs and being “a sign of a healthy sector.”
The Analysts’ Perspective
A group of analysts addressed the audience to share their insights as end users. Jim McCool, director of the Tradecraft and Technology Group within NGA’s Analysis Directorate, said discovery missions are still required when a new security event occurs.
“You have to find information about the event, understand it over time, and model it in the future environment,” McCool said. “You have to start to develop a schema then get into that more routine automation. The discovery phase is not automated today, and we will be looking for commercial support there.”
McCool added that the considerable amount of time spent conditioning and preparing data for analysis is a significant challenge for analysts.
“When [Director Cardillo] asks how long that took, working on conditioning the data is usually the longest part of the answer,” he said.
The analysts all agreed that trust in a data set is paramount, explaining how word spreads fast among analysts when one discovers a data set that helped them in their mission.
U.S. Southern Command’s (USSOUTHCOM) Commercial + Open GeoIntelligence (COGINT) Team gave a presentation in which they chronicled their early operational exploration of fusing deep data stacks with social media.
SOUTHCOM, which panelists said is the test bed for U.S. combatant commands, has been using the COGINT model in the field for about a year. The program’s themes include partner choice, rapid response, and a focus on transnational threat networks—an area of particular interest to the command.
Team member A. Terrell Dyess said the group is taking the human-machine aspect of leveraging the explosion in commercial imagery and applying it to open-source imagery. He posited that “neo GEOINT” could become geo-enabled data rather than primary imagery.
The COGINT team is using data from Planet and a variety of open sources, and has collaborated with Descartes Labs on a geospatial search tool. In what it called an “automation sprint,” the team had success detecting illicit airfields in Honduras.
Team members have had the opportunity to brief NGA Deputy Director Sue Gordon on their progress, and believe their method will have many applications beyond the SOUTHCOM mission set.
Modern Technology Infrastructure
The day concluded with a panel of NGA representatives who discussed modern technology infrastructure for analysis.
“[Technology] is changing every six months and we can’t wait,” said Jason Hess, NGA’s chief of cloud security. “We need to change how we use technology when going forward in the agency as well as change the mindset of the developers. We need to give them the tools they need to be power users.”
Dr. Ben Tuttle, NGA’s GEOINT Services development lead as well as the lead for NGA’s Outpost Valley, said it’s important to break down stovepipes and work horizontally across the organization.
“We [panel members] all work in different parts of the agency, but what we do is so relevant to each other that we have to organize together to achieve our ultimate goals,” Tuttle said.
Justin Poole, director of NGA’s Source Directorate, discussed the agency’s newest operating model, called Broker, which he described as a “stable of suppliers for GEOINT content and services.” He added there are four pieces to the model: issue management, broker services, supplier interaction, and content conveyance.
“It’s a way to modernize [the Source Directorate],” Poole said. “We’re beginning a path and thinking about how to interact with the customer to get what they need.”
Following the formal programming, attendees were invited to a networking reception. The event also included industry demonstrations and exhibits in the NGA atrium as well as industry lightning talks throughout the day.
Lindsay Tilton Mitchell also contributed reporting to this article.