GEOINT practitioners discussed how the evolution of GEOINT content velocity, security, and intelligence is driving improved decision-making capabilities and experiences for the user
According to many GEOINT practitioners, we are entering the third wave of enterprise transformation. While the previous two disruption waves were about computing, networking, and the adoption of enterprise systems that transformed workforce management and the mission, this third wave is about modernizing the business experience organizations provide around content. On Sept. 23, on USGIF’s Virtual GEOConnect Series Main Stage, a panel of GEOINT practitioners discussed how the evolution of GEOINT content velocity, security, and intelligence is driving improved decision-making capabilities and experiences for the user.
According to Derek Hoffman, deputy director, GEOINT Information Office, Source, NGA, the agency has four main buckets of data. The first is pixel data. The second is foundation GEOINT data—anything produced in support of safety, navigation, geodetics, geography, etc. The third is analytic data, for example, an intelligence analysis assessment or supporting data from activity-based intelligence. And the fourth is data from emerging sensors. Each type comes with its own baggage: distinct needs, supply chains, and dissemination and release requirements. Tying them together is very challenging.
“NGA’s value proposition is that the data we do have is vetted. The data coming through NGA meets a certain specification that supports a specific customer and using that data for multiple end users gets challenging,” Hoffman said.
One of the biggest challenges for NGA is building a data indexing service for all those different types of data to provide a fusion product at the end.
“Because what we find is, for example, if we’ve got a third-party agreement with someone, they’ll say ‘we will give you this data, but you can’t share it here and you can’t share it there.’ We have to [honor those conditions] throughout the entire lifecycle of the data,” Hoffman said. “When you talk about context, the framing is really our biggest challenge. Transforming those CRMs or the infrastructure in the architecture to enable the fusion of that data is really our biggest challenge.”
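The lifecycle constraint Hoffman describes can be illustrated with a small sketch. The record structure, caveat fields, and `can_release_to` check below are hypothetical, intended only to show how source-imposed sharing restrictions might travel with each record from ingest through fusion and dissemination; this is a minimal sketch, not NGA’s actual architecture:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class DataRecord:
    """A record tagged at ingest with the sharing limits its source imposed."""
    record_id: str
    source: str
    # Hypothetical caveat set, e.g. third-party terms barring certain recipients.
    blocked_recipients: frozenset = field(default_factory=frozenset)

def can_release_to(record: DataRecord, recipient: str) -> bool:
    """Honor source-imposed restrictions at every release point in the lifecycle."""
    return recipient not in record.blocked_recipients

def fuse(records, recipient):
    """Build a fusion product from only the records the recipient may see."""
    return [r for r in records if can_release_to(r, recipient)]

records = [
    DataRecord("px-001", "commercial-imagery", frozenset({"partner-b"})),
    DataRecord("fd-002", "foundation-geoint"),
]
print([r.record_id for r in fuse(records, "partner-b")])  # only "fd-002" survives
```

Because the caveats are carried on the record itself, every downstream consumer applies the same check rather than relying on the original agreement being remembered.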
Similarly, Riyadh Feghali, innovation and customer experience lead, U.S. Department of Defense, said the integration of data into something that’s contextualized drives the value proposition, particularly from a customer experience perspective.
“Understanding data and having the governance tied to each bit of data so it can be easily disseminated and shared in a way that’s value-add to an end consumer is critical to really knowing yourself in a larger customer experience landscape,” Feghali said. “But also understanding the marketplace and where desire and demand are driving you becomes really crucial.”
From an industry perspective, Ben Conklin, director, Defense and Intelligence Solutions, Esri, emphasized the trends he has seen: sharing data, collaboration, and the need for information. One example he gave was the Johns Hopkins University COVID-19 dashboard. The team tasked with creating this dashboard, according to Conklin, understood what it was going to take to bring the data together and what the need was.
“They understood that people need information. They need more than just a simple map, they actually need information products they can drill into. They need to be able to click on and understand it for their community to understand the trends as a whole across the country and across the world,” said Conklin. “What I saw was this emergence of content curators who understand the data, but also understand the needs of the customer, how the data is going to be used, and then puts those things together and adds that context.”
According to CeCe Smith, director, Federal Programs, Orbital Insight, “Velocity is critically important, establishing reliability and consistency that partners can really understand and appreciate.”
Wayfinding, according to Smith, becomes a critical piece of content velocity because it helps users quickly synthesize what they can get and determine where it fits within their data needs.
From a security and analytic industry perspective, customers and the industry are asking for a move away from unsupervised anomaly detection and a shift toward more supervised learning, said Anthony Tellez, data scientist, Machine Learning & Advanced Analytics, Splunk.
“Customers and industry want models that are delivered from vendors or via open-source channels that will allow a user to use that model and inference and learn something about their own data,” he added.
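The shift Tellez describes, from unsupervised anomaly detection toward vendor-delivered supervised models that users run against their own data, can be sketched in miniature. The centroid "model," labels, and feature values below are hypothetical stand-ins for a real packaged model; this is a minimal illustration, not Splunk’s tooling:

```python
import math

# A "delivered" supervised model: class centroids fit by the vendor on labeled data.
VENDOR_MODEL = {
    "benign": (0.2, 0.1),
    "suspicious": (0.9, 0.8),
}

def predict(model, point):
    """Supervised inference on the user's own data: nearest labeled centroid wins."""
    return min(model, key=lambda label: math.dist(model[label], point))

def anomaly_score(points, point):
    """Contrast: unsupervised detection scores distance from the data's own mean,
    with no labels and no vendor-supplied model."""
    mean = tuple(sum(c) / len(points) for c in zip(*points))
    return math.dist(mean, point)

user_data = [(0.15, 0.05), (0.88, 0.75), (0.25, 0.2)]
print(predict(VENDOR_MODEL, (0.85, 0.9)))  # "suspicious"
```

The supervised path returns a named label the analyst can act on, while the unsupervised path only says how unusual a point is, which is the distinction driving the demand Tellez describes.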