Transforming GEOINT organizations from pipes to platforms
Today’s networked platforms are able to achieve massive success by simply connecting producers and consumers. Uber doesn’t own cars, but runs the world’s largest transportation business. Facebook is the largest content company, but doesn’t create content. Airbnb has more rooms available to its users than any hotel company, but doesn’t even own any property.
In his book, “Platform Scale: How an Emerging Business Model Helps Startups Build Large Empires with Minimum Investment,” Sangeet Paul Choudary describes how these companies have built two-sided markets that give them an outsized impact on the world. He contrasts the traditional “pipe” model of production, in which internal labor and resources are organized around controlled processes, with the “platform” model, in which action is coordinated among a vast ecosystem of players. Pipe organizations focus on delivery to the consumer, optimizing every step in the process to create a single “product” and using hierarchy and gatekeepers to ensure quality control. A platform instead aligns the incentives of producers and consumers, vastly increasing the number of products created, and then handles quality control through curation and reputation management. In this model, people still play the major role in creating content and completing tasks, but the traditional roles of producer and consumer blur and become self-reinforcing.
A Platform Approach for Geospatial Intelligence
So, where does the geospatial world fit into this “platform” framework? Geospatial intelligence, also known as GEOINT, means the exploitation and analysis of imagery and geospatial information to describe, assess, and visually depict physical features and geographically referenced activities on Earth. In most countries, there is either a full government agency or at least large, dedicated groups who are the primary owners of the GEOINT process and results. Most of the results they create are still produced in a “pipe” model. The final product of most GEOINT work is a report that encapsulates all the insight into an easy-to-digest image with annotation. The whole production process is oriented toward the creation of these reports, with an impressive array of technology behind it, optimized to continually transform raw data into true insight. There is the sourcing, production, and operation of assets used to gather raw geospatial signal, and the prioritization and timely delivery of those assets. Then, there are the systems to store raw data and make it available to users, and the teams of analysts and the myriad tools they use to process raw data and extract intelligence. This whole pipe of intelligence production has evolved to provide reliable GEOINT, with a growing array of incredible inputs.
These new inputs, however, start to show the limits of the pipe model, as new sources of raw geospatial information are no longer coming just from inside the GEOINT Community, but from all over the world. The rate at which new sources appear strains the traditional model of incorporating new data. Establishing authoritative trust in an open input such as OpenStreetMap is difficult, since anyone in the world can edit the map. The sheer volume of information from new systems such as constellations of small satellites also strains the pipe production method. Combine these prolific data volumes with new potential sources of intelligence, such as geo-tagged photos on social media and raw telemetry from cell phones, and the work of continually finding the best raw geospatial information and turning it into valuable GEOINT becomes overwhelming for analysts working in traditional ways.
The key to breaking away from a traditional pipe model in favor of adopting platform thinking is to stop trying to organize resources and labor around controlled processes and instead organize ecosystem resources and labor through a centralized platform that facilitates interactions among all users. This means letting go of the binary between those who create GEOINT products and those who consume them. Every operator in the field, policy analyst, and decision-maker has the potential to add value to the GEOINT production process as they interact with GEOINT data and products—sharing, providing feedback, combining with other sources, or augmenting with their individual context and insight.
Transforming GEOINT Organizations from Pipes to Platforms
The GEOINT organizations of the world are well positioned to shift their orientation from the pipe production of polished reports to providing much larger value to the greater community of users and collaborators by becoming the platform for all GEOINT interaction. Reimagining our primary GEOINT organizations as platforms means framing them as connectors rather than producers. Geospatial information naturally has many different uses to many people, so producing finished end products has a potential side effect of narrowing that use. In a traditional pipe model, the process and results become shaped toward the audiences consuming them and the questions they care about, limiting the realized value of costly assets.
Becoming the central node providing a platform that embraces and enhances the avalanche of information will be critical to ensure a competitive and tactical advantage in a world where myriad GEOINT sources and reports are available openly. The platform will facilitate analysts being able to access and exploit data ahead of our competitors, and enable operators and end users to contribute unique insights instead of being passive consumers. The rest of this article explores in-depth what an organization’s shift from pipe production toward a platform would actually look like.
Rethinking GEOINT Repositories
A GEOINT platform must allow all users in the community to discover, use, contribute, synthesize, amend, and share GEOINT data, products, and services. This platform should connect consumers of GEOINT data products and services to other consumers, consumers to producers, producers to other producers, and everyone to the larger ecosystem of raw data, services, and computational processes (e.g., artificial intelligence, machine learning, etc.). The platform envisioned provides the filtering and curation functionality by leveraging the interactions of all users instead of trying to first coordinate and then certify everything that goes through the pipe.
Trust is created through reputation and curation. Airbnb creates enough trust for people to let strangers into their homes because reputation is well established: profiles are linked to social media accounts, identities are confirmed through driver’s license verification, and both sides rate each interaction. Trust is also created through continuous, automated validation, verification, and overall “scrubbing” of the data, searching for inconsistencies that may have been inserted by humans or machines. Credit card companies do this on a continuous, real-time basis to combat the massive onslaught of fraudsters and transnational organized crime groups seeking to siphon funds. Trust is also generated by automated deep learning processes that have been broadly trained by expert users who create data and suggest answers in a transparent, auditable, and retrainable fashion. This is perhaps the least mature, though most promising, future opportunity for generating trust. In such a future GEOINT platform, all three of these trust mechanisms (i.e., reputation/curation; automated validation, verification, and scrubbing; and expert-trained deep learning) should be harnessed together in a self-reinforcing manner.
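To make the self-reinforcing combination concrete, here is a minimal sketch of blending the three trust signals into a single score. All class names, weights, and thresholds are illustrative assumptions, not part of any existing system.

```python
from dataclasses import dataclass

# Hypothetical sketch: the three trust mechanisms described above
# (reputation/curation, automated validation, expert-trained models)
# blended into one score. Weights and threshold are assumptions.

@dataclass
class TrustSignals:
    reputation: float   # 0..1, from curator ratings of the contributor
    validation: float   # 0..1, share of automated consistency checks passed
    model_score: float  # 0..1, confidence from an expert-trained classifier

def trust_score(s: TrustSignals, weights=(0.4, 0.35, 0.25)) -> float:
    """Weighted blend of the three trust mechanisms."""
    w_rep, w_val, w_mod = weights
    return w_rep * s.reputation + w_val * s.validation + w_mod * s.model_score

def is_certifiable(s: TrustSignals, threshold: float = 0.7) -> bool:
    # A trusted actor might only be offered the "certify" action
    # once the blended score clears a configurable bar.
    return trust_score(s) >= threshold
```

The point of a single blended score is that the mechanisms reinforce one another: a strong model score can surface content for human curation, and curation in turn produces new training labels.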
Most repositories of the raw data that contributes to GEOINT products attempt to establish trust and authority before data enters the repository, with individuals deeply researching each source. The platform approach embraces as much input data as possible and shifts trust and authority to a fluid process established by users and producers on the platform, creating governance through metrics of usage and reputation. These repositories are the places on which we should focus platform thinking. Instead of treating each repository as merely the “source” of data, repositories should become the key coordination mechanism. People searching for data that is not in the repository should trigger a signal to gather the missing information, and the usage metrics of information stored in the repository should similarly drive action. Users of the platform, such as operators in the field, should be able to pull raw information, easily produce their own GEOINT data and insights, and then contribute those back to the same repository used by analysts. A rethinking of repositories should include how they can coordinate action to create both the raw information and the refined GEOINT products that users and other producers desire.
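The coordination idea above can be sketched in a few lines: failed searches become collection signals, and pulls become governance metrics. The class and method names here are hypothetical, invented purely for illustration.

```python
from collections import Counter

# Illustrative sketch of a repository acting as a coordination
# mechanism: a search that finds nothing is recorded as unmet
# demand, and every pull feeds the usage metrics that drive
# governance. Not an existing API.

class CoordinatingRepository:
    def __init__(self):
        self.items = {}                # item_id -> set of searchable tags
        self.usage = Counter()         # item_id -> times pulled
        self.unmet_demand = Counter()  # query -> times it found nothing

    def add(self, item_id, tags):
        self.items[item_id] = set(tags)

    def search(self, query):
        hits = [i for i, tags in self.items.items() if query in tags]
        if not hits:
            # A miss is a signal to go gather the missing information.
            self.unmet_demand[query] += 1
        return hits

    def pull(self, item_id):
        self.usage[item_id] += 1
        return self.items[item_id]

    def collection_priorities(self, n=3):
        """Queries users wanted most but the repository could not serve."""
        return [q for q, _ in self.unmet_demand.most_common(n)]
```

In this sketch the repository never decides what to collect itself; it simply surfaces the demand signal so collection planners (or automated tasking) can act on it.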
- This article is part of USGIF’s 2018 State & Future of GEOINT Report. Download the PDF to view the report in its entirety and to read this article with citations.
Core Value Units
How would we design a platform built to create better GEOINT products? In “Platform Scale,” Choudary points out that one of the best ways to design a platform is to start with the “Core Value Unit” and then figure out the key interactions that increase the production of that unit. For YouTube, videos are the core value unit; for Uber, it’s ride services; for Facebook, it’s posts and shares; and so on.
For GEOINT, we posit the core value unit is not simply a polished intelligence report, but any piece of raw imagery, processed imagery, geospatial data, information, or insight—including that finished report. For the purposes of this article, we’ll refer to this as the “Core Value Unit of GEOINT (CVU-GEOINT).” It includes any annotation that a user makes, any comment on an image or an object in an image, any object or trend identified by a human or algorithm, and any source data from inside the community or the larger outside world. It is important to represent every piece of information in the platform, even those that come from outside with questionable provenance. Trusted actors with reputations on the platform will be able to “certify” the CVU-GEOINT within the platform. Or they may decide it is not trustworthy but still use it in its appropriate context along with other trusted sources. Many CVU-GEOINTs may be remixes or reprocessing of other CVUs, but the key is to track all actions and data on the platform so a user may follow a new CVU-GEOINT back to its primary sources.
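A CVU-GEOINT that can always be traced back to its primary sources is essentially a node in a provenance graph. The record shape below is a hypothetical sketch of that idea; the field names are assumptions for illustration.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a CVU-GEOINT record whose provenance links
# let any derived unit be walked back to its primary sources.

@dataclass
class CVU:
    cvu_id: str
    kind: str                                    # e.g., "raw_image", "annotation", "report"
    sources: list = field(default_factory=list)  # parent CVU objects this was derived from
    certified: bool = False                      # set by trusted actors on the platform

def primary_sources(cvu):
    """Walk the provenance graph down to CVUs with no parents."""
    if not cvu.sources:
        return {cvu.cvu_id}
    found = set()
    for parent in cvu.sources:
        found |= primary_sources(parent)
    return found
```

A remix of several CVUs simply lists them all in `sources`, so the walk naturally fans out to every primary input, however many layers of reprocessing sit in between.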
Maximizing Core Value Units of GEOINT
It is essential that as much raw data as possible be available within the platform, both trusted and untrusted. The platform must be designed to handle the tsunami of information, enabling immediate filtering after content is posted to the platform, not before. Sources should be marked as trusted or untrusted, but it should be up to users to decide whether to pull some “untrusted” information, and then, for example, certify the resulting CVU-GEOINT as trusted because they cross-referenced four other untrusted sources and two trusted sources that didn’t have the full picture. Open data sources such as OpenStreetMap, imagery from consumer drones, cell phone photos, and more should be available on the platform. The platform would not necessarily replicate all the data, but it would reference it and enable exploitation. These open data sources should be available to the full community of users, as the more people use the platform, the more signal the platform gets on the utility of its information, and, subsequently, more experts can easily analyze the data and certify it as trusted or untrusted.
It should be simple to create additional information and insight on the platform, where the new annotation, comment, or traced vector on top of some raw data becomes itself a CVU-GEOINT that another user can similarly leverage. An essential ingredient to enable this is to increase the “channels” of the platform, enabling users and developers in diverse environments to easily consume information and also contribute back. This includes standards-based application programming interfaces (APIs) that applications can be built upon and simple web graphical user interface (GUI) tools that are accessible to anyone, not just experts. It would also be important to prioritize integration with the workflows and tool sets that are currently the most popular among analysts. The “contribution back” would include users actively making new processed data, quick annotations, and insights. But passive contribution is equally important—every user contributes as they use the data, since the use of data is a signal of it being useful, and including it as a source in a trusted report is also an indication of trust. The platform must work within all the security protocols in place, so signals of use in secure systems don’t leak out to everyone, but those constraints should not change how the core interactions are designed.
Filtering Data for Meaning
Putting all the raw information on the platform does risk overwhelming users, which is why there must be complementary investment in filters. Platforms such as YouTube, Facebook, and Instagram work because users get information filtered and prioritized in a sensible way. Users don’t have to conduct extensive searches to find relevant content—they access the platform and get a filtered view of a reasonable set of information. And then they can perform their own searches to find more information. A similar GEOINT platform needs to provide each user with the information relevant to them and be able to determine that relevance with minimal user input. It can start with the most used data in the user’s organization or team, or the most recent in areas of interest, but then should learn based on what a user interacts with and uses. Recommendation engines that perform deep mining of usage and profile data will help enhance the experience so that all the different users of the platform—operators in the field, mission planning specialists, expert analysts, decision-makers, and more—will have different views that are relevant to them. Users should not have to know what to search for; they should simply receive recommendations based on their identity, team, and usage patterns as they find value in the platform.
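The cold-start strategy described above—begin with what the user’s team uses most, then learn from the user’s own interactions—can be sketched very simply. The scoring rule here is an illustrative assumption, not a real recommendation engine.

```python
from collections import Counter

# Minimal recommendation sketch: seed scores with team usage counts,
# then boost items the individual user has touched. The boost value
# is an arbitrary assumption for illustration.

def recommend(team_usage, user_interactions, n=3):
    """team_usage: {item: pulls by teammates};
    user_interactions: iterable of items this user interacted with."""
    scores = Counter(team_usage)
    for item in user_interactions:
        scores[item] += 5  # personal signal outweighs background popularity
    return [item for item, _ in scores.most_common(n)]
```

A production system would of course use far richer profile and usage mining, but even this toy version requires no explicit query from the user, which is the key property the platform needs.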
The other key to great filtering is tracking the provenance of every piece of CVU-GEOINT in the platform so any derived information or insight also contains links to the information it was derived from. Any end product should link back to every bit of source information that went into it, and any user should be able to quickly survey all data pedigrees. Provenance tracking could employ new blockchain technologies, but decentralized tracking is likely not needed initially when all information is at least represented on a centralized platform.
Building readily available source information into the platform will enable more granular degrees of trust; the most trusted GEOINT should come from the certified data sources, with multiple trusted individuals blessing it in their usage. And having the lineage visible will also make usage metrics much more meaningful—only a handful of analysts may access raw data, but if their work is widely used, then the source asset should rise to the top of most filters because the information extracted from it is of great value. If this mechanism is designed properly, the exquisite data would naturally rise to the surface, above the vast sea of data that still remains accessible to anyone on the platform.
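The lineage-weighted usage idea—a rarely touched raw asset rising because its derivatives are widely used—can be expressed as a simple recursive roll-up over the provenance edges. The data shapes below are assumptions for illustration only.

```python
# Sketch of lineage-weighted usage: an item's effective value is its
# own direct usage plus the effective value of everything derived
# from it, so exquisite source data surfaces even when only a
# handful of analysts touch it directly. Assumes the provenance
# graph is acyclic, as derivation naturally is.

def effective_value(item, usage, derived_from):
    """usage: {item: direct use count};
    derived_from: {child: [parents]} provenance edges."""
    children = [c for c, parents in derived_from.items() if item in parents]
    return usage.get(item, 0) + sum(effective_value(c, usage, derived_from)
                                    for c in children)
```

For example, a raw image pulled directly only twice but feeding a report used forty times would rank above a moderately popular item with no derivatives, which is exactly the behavior the filters need.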
It is important to note that such a platform strategy would also pay dividends when it is the divergent minority opinion or insight that holds the truth, or happens to anticipate important events. The same trust mechanisms that rigorously account for lineage will help the heretical analyst make his or her case when competing for the attention of analytical, operational, and policy-making leadership.
The Role of Analysts in a Platform World
To bootstrap the filtration system, the most important thing is to leverage the expert analysts who are already part of the system. This platform would not be a replacement for analysts; on the contrary, the platform only works if the analysts are expert users and the key producers of CVU-GEOINT. Any attempt to transform from the pipe model of production to a platform must start with analysts as the first focus, enabling their workflows to exist fully within a platform. Once their output seamlessly becomes part of the platform, then any user could easily “subscribe” to an analyst or a team of analysts focused on an area. The existing consumers of polished GEOINT products would no longer need to receive a finished report in their inbox that is geared exactly to their intelligence problem. Instead, they will be able to subscribe to filtered, trusted, polished CVU-GEOINT as it is, configuring notifications to alert them of new content and interacting with the system to prioritize the gathering and refinement of additional geospatial intelligence.
The consumption of GEOINT data, products, and services should be self-service, because all produced intelligence, along with the source information that went into it, can be found on the platform. Operators would not need to wait for the finished report; they could just pull the raw information from the platform and filter for available analyst GEOINT reports. Thus, analysts shift to the role of “curators” of information instead of having exclusive access to key information. But this would not diminish their role—analysts would still be the ones to endow data with trust. Trust would be a fluid property of the system, but could only be given by those with expert analyst backgrounds. This shift should help analysts and operators be better equipped to handle the growing tsunami of data by letting each focus on the area they are expert in and allowing them to leverage a network of trusted analysts.
The other substantial benefit of a platform approach is to integrate new data products and services using machine learning and artificial intelligence-based models. These new models and algorithms have the promise to better handle the vast amounts of data being generated today, but also risk overwhelming the community with too much information. In the platform model, the algorithms would both consume and output CVU-GEOINT, tracking provenance and trust in the same environment as the analysts. Tracking all algorithmic output as CVU-GEOINT would enable analysts to properly filter the algorithms for high-quality inputs. And the analyst-produced CVU-GEOINT would in turn be input for other automated deep learning models. But deep learning results are only as good as their input, so the trusted production and curation of expert analysts becomes even more important in the platform-enabled, artificial intelligence-enhanced world that is fast approaching. The resulting analytics would never replace an analyst, as they would lack full context and decision-making ability, but their output could help analysts prioritize and point their attention in the right direction.
Recommendations for GEOINT Organizations
Reimagining GEOINT organizations as platforms means thinking of their roles as “trusted matchmakers” rather than producers. This does not mean such agencies should abdicate their responsibilities as a procurer of source data. But, as a platform, they should connect those with data and intelligence needs with those who produce data. And this matchmaking should be data-driven, with automated filters created from usage and needs. Indeed the matchmaking should extend all the way to prioritizing collections, but in a fully automated way driven by the data needs extracted from the system.
A GEOINT organization looking to embrace platform thinking should bring as much raw data as possible into the system, and then measure usage to prioritize future acquisitions. It should enable the connection of its users with the sources of information, facilitating that connection even when the utility to the users inside the agency is not clear.
- Be the platform for GEOINT, not the largest producer of GEOINT, and enable the interaction of diverse producers and consumers inside the agency with the larger intelligence and defense communities and with the world.
- Supply raw data to everyone. Finished products should let anyone get to the source.
- Govern by automated metrics and reputation management, bring all data into the platform, and enable governance as a property of the system rather than acting as the gatekeeper.
- Create curation and reputation systems that put analysts and operators at the center, generating the most valuable GEOINT delivered on a platform where all can create content. Enable filters to get the best information from top analysts and data sources by remaking expert analysts as curators for the ecosystem rather than producers for an information factory.
The vast amounts of openly available geospatial data sources and the acceleration of the wider availability of advanced analytics threaten to overwhelm traditional GEOINT organizations that have fully optimized their “pipe” model of production. Indeed, there is real risk of top agencies losing their traditional competitive advantage when so much new data can be mined with deep learning by anybody in the world. Only by embracing platform thinking will organizations be able to remain relevant and stay ahead of adversaries, and not end up like the taxi industry in the age of Uber. There is a huge opportunity to better serve the wider national security community by connecting the world of producers and consumers instead of focusing on polished reports for a small audience. The GEOINT organization as a platform would flexibly serve far more users at a wider variety of organizations, making geospatial insight a part of everyday life for everyone.