By Ben Conklin, Esri; Brigham Bechtel, MarkLogic; and Mathieu Goebel, Earthcube
Intelligence Preparation of the Operational Environment (IPOE) is a mandatory step toward the understanding of an area of interest and the planning of a relevant course of action (COA) to support the warfighter. The production of IPOE encompasses political, military, economic, social, information, and infrastructure (PMESII) identification, and through this, the ability to detect critical key nodes and vulnerabilities.
PMESII products describe the overlapping of systems and sub-systems independent from geographical scale—from city to province, and even conflict level. Thus, a holistic methodology can be used to derive a systemic analysis of those PMESII components. This systemic approach could be used to organize large-scale research, collection, and structured analysis to:
- Identify critical objects constituting the PMESII of any given operational environment.
- Define how those critical objects form interrelated entities.
- Produce a shared reference situation.
- Estimate the potential impact of any course of action.
Recent developments in technology enable automated monitoring of the PMESII elements and make it possible to process the exponential quantity of data available at scale. An AI-based approach to automation allows commanders to take advantage of the growth in data sources—sensors, collection platforms, imagery, signals, and open source—and enables a dynamic intelligence product to present a more accurate view of the target environment at any moment in time.
Automated monitoring can trigger the spatial and temporal exploitation of data and the ability to swiftly assess and explore emerging issues in the operational environment, which impact the mission of the warfighter. An automated, dynamic IPOE helps modern militaries respond to today’s fast and agile adversaries. IPOE must become a living product, evolving at the same pace as data collected to support the warfighter.
This article describes how such a novel and holistic approach would support the warfighter and offer adaptive and always up-to-date COA planning.
Importance of Big Data in the Foundation of IPOE/PMESII
The amount of data available to a modern commander could enable an unprecedented, comprehensive view of the operations environment on a massive scale with an up-to-the-minute picture of the battlespace. Information assembled and integrated at speed enables a new systemic and dynamic IPOE that incorporates all available PMESII data. Societal infrastructure, even in third world countries, now generates terabytes of publicly available information from digital records, sensors, and social media in near real time. Meanwhile, command intelligence professionals are challenged to integrate open-source intelligence streams, tactical reporting, and traditional reporting from HUMINT, SIGINT, and MASINT as quickly as possible. The growing volume and variety of data are stressing current intelligence analysis during the planning phases. This increases the fog of war during operations solely due to problems handling the massive volumes of various data types and structures.
It is no longer necessary for IPOE to end after the analysis of data collected during the planning phase, because with current AI and data processes we can push a dynamic IPOE into the execution phase. In the past, the IPOE was drawn from analytic conclusions based on data collected before and during planning. Operational decisions were then made, and the plan executed with little chance to update the data prior to launch or during an operation—but this no longer needs to be the case. The ability to have up-to-the-instant data for IPOE would reduce uncertainty and the fog of war, even during the execution of joint operations.
Up-to-the-minute collection of PMESII data volumes in a world of interconnected systems—even for small areas—is, however, quickly overwhelming current processes. Take, for example, a hostage-rescue scenario in a non-permissive environment against a terrorist force operating outside government control. Figure 1 highlights some of the data points that must be assembled on a single building to run the operation.
For these complex sets of information requirements, static and dynamic data that might be learned through traditional means are often managed as structured foundation intelligence datasets. What is new and necessary in IPOE is the capture and modeling of dynamic human activities gleaned from various sources of information, both during planning and in real time. This operational data includes massive amounts of structured and unstructured data.
Commanders and policymakers are demanding the level of detail represented in the aforementioned graphics to make informed decisions, while our data systems attempt to integrate ever-increasing volumes and types of data for analysis. As the operation against a point target with limited objectives described above illustrates, the volume of data available from just hours of video surveillance by a drone could be measured in terabytes—depending on the resolution of the camera, whether sound was included, and the type of camera file system used.
According to a Digital Rebellion video calculator, one hour of NTSC uncompressed, 10-bit video at a 59.94 frame rate and 720×480 resolution would result in 263.71 GB of data.
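The arithmetic behind such an estimate is straightforward. A minimal sketch follows; it assumes 4:2:2 chroma subsampling at 10-bit depth (20 bits per pixel), which likely differs from the packing the Digital Rebellion calculator uses, so the total does not match its 263.71 GB figure exactly:

```python
# Back-of-envelope size of one hour of uncompressed NTSC video.
# Assumption: 4:2:2 chroma subsampling at 10-bit depth = 20 bits/pixel.
# Different byte-packing schemes (e.g., v210) yield different totals.

def uncompressed_video_gb(width, height, bits_per_pixel, fps, seconds):
    bytes_per_frame = width * height * bits_per_pixel / 8
    total_bytes = bytes_per_frame * fps * seconds
    return total_bytes / 1e9  # decimal gigabytes

one_hour = uncompressed_video_gb(720, 480, 20, 59.94, 3600)
print(f"{one_hour:.1f} GB")  # roughly 186 GB under these assumptions
```

The point stands regardless of the packing assumed: a single sensor, for a single hour, produces hundreds of gigabytes before any analysis begins.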
A critical element across all the data sources and streams will also be the collection of metadata from these files to ensure the integrity and quality of the reporting that gets included and analyzed in our IPOE. Without automation, keeping the PMESII picture current would also require constant human monitoring.
Integrating other data streams into the analysis (e.g., media, call data, imagery) adds volume—both structured and unstructured—to the demands on our human-machine team, in a manner that could mean the loss of key indicators simply because of the volume and time required to process the information.
This increasing variety of data also increases exponentially in volume for actions across larger geographic areas when more PMESII information is needed to describe the theater of operations to a larger audience of friendly and/or allied forces, commanders, and policymakers. The expanded volume includes more data streams, sensors, and reporting from other sources. A data management environment capable of dealing with this complexity would help joint force commanders manage big data for IPOE.
Leveraging AI Inside IPOE
Most of the growing volume of data needs to be processed, conditioned, and analyzed to be leveraged in IPOE. This is where artificial intelligence (AI) comes into play. Given the amount of data usable to populate the PMESII, no analyst can manually keep track of how all sources and relevant data evolve and change. A potential consequence is that a PMESII product quickly becomes outdated, endangering the IPOE and the ability to plan the most efficient course of action.
AI can support this process. Analyzing the elements of PMESII means identifying several entities for each dimension, each of which can be identified and monitored through different sources of information. For instance, a power plant can be monitored using imagery, open-source, and classified information sources. Likewise, the power grid into which this plant is integrated, and the human geography it exists within, help the analyst understand how the population depends on the plant. Each time an analyst identifies a source of information that provides updates on a regular basis, the analysis of those updates could potentially be automated with AI.
That being said, the first step toward this level of automation is to specify the key components of the operational environment you seek to monitor, how they break down into systems and sub-systems, and which nodes interconnect them. This is why AI and tradecraft always need to be intimately coordinated: the analyst directs the AI, and the AI reduces the burden of monitoring and updating datasets.
Once this is done, you can define your data model and the sources you need to monitor. Looking at international standards for such a data model—for instance, the features listed in STANAG 2592—it is clear that the AI technology available today cannot process and monitor them all. Nonetheless, a significant number—arguably the critical ones—can be. Figure 2 shows a few examples of data sources that could be monitored with AI.
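One way to express such a data model—entities, the systems they belong to, and the nodes interconnecting those systems—is sketched below. The class and field names are illustrative only and are not drawn from STANAG 2592:

```python
from dataclasses import dataclass, field

# Illustrative PMESII data model; names are hypothetical, not STANAG 2592 features.

@dataclass
class Entity:
    name: str
    dimension: str               # e.g., "Infrastructure"
    location: tuple              # (lat, lon)
    sources: list = field(default_factory=list)  # feeds that can update this entity

@dataclass
class System:
    name: str                    # e.g., "national power grid"
    entities: list = field(default_factory=list)

    def interconnecting_nodes(self, other):
        """Entities shared between two systems are candidate critical nodes."""
        mine = {e.name for e in self.entities}
        return [e for e in other.entities if e.name in mine]

# A power plant monitored through several sources, sitting in two systems:
plant = Entity("Hydro Plant A", "Infrastructure", (34.5, 43.2),
               sources=["imagery", "open-source", "classified reporting"])
grid = System("power grid", [plant])
services = System("municipal services", [plant])
print([e.name for e in grid.interconnecting_nodes(services)])  # ['Hydro Plant A']
```

The entities shared across systems surface exactly the interconnecting nodes the analyst must specify before any automation can be pointed at them.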
From a technological perspective, different AI technologies could be leveraged depending on the data source to be monitored: natural language processing for text-based data, automated speech recognition for audio data, computer vision for imagery data, and machine learning/deep learning techniques more broadly where relevant. Once an AI agent is operational for each of these data sources, processing can be automated. Each time a source provides new data, it is processed by the AI agent and its outputs aggregated into a georeferenced database. In other words, AI will be able to permanently interrogate databases in order to feed intelligence plans and consolidate information.
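The dispatch layer implied here can be sketched briefly: each incoming item is routed to the agent matching its modality, and the agent's output is written to a georeferenced store. The agent internals are stubs; real NLP, ASR, and computer vision models would replace them:

```python
# Hypothetical dispatch layer; the three agents are stubs standing in for
# real NLP, speech-recognition, and computer-vision models.

def nlp_agent(item):    return {"entities": ["power plant"], "geo": item["geo"]}
def asr_agent(item):    return {"transcript": "(transcript)", "geo": item["geo"]}
def vision_agent(item): return {"objects": ["vehicle"], "geo": item["geo"]}

AGENTS = {"text": nlp_agent, "audio": asr_agent, "imagery": vision_agent}

georeferenced_db = []  # stand-in for a real geospatial database

def ingest(item):
    """Route a new data item to the matching AI agent and store its output."""
    agent = AGENTS.get(item["modality"])
    if agent is None:
        return  # unsupported modality: leave for manual review
    record = agent(item)
    record["source"] = item["source"]
    georeferenced_db.append(record)

ingest({"modality": "imagery", "source": "drone-7", "geo": (34.5, 43.2)})
print(len(georeferenced_db))  # 1
```

The design point is that the pipeline, not the analyst, reacts every time a source publishes new data; the analyst's effort moves to curating the agents and the model behind them.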
Over time, this recurring analysis derives a multilayered activity dashboard from each source's outputs. This allows the analyst to quickly and efficiently visualize patterns of life for each of the PMESII dimensions, how they vary from usual behaviors, and their level of influence on the overall analysis.
The IPOE, formerly confined to the initial phases of the intelligence cycle, then becomes a living analysis with both geographical and hierarchical classifications. The analysis, in return, becomes easier and faster. Figure 3 illustrates a possible dashboard made possible by such a concept.
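At its core, the pattern-of-life comparison behind such a dashboard reduces to a baseline-versus-current check. A minimal sketch using a z-score against historical activity counts follows; the counts and the two-sigma threshold are invented for illustration:

```python
import statistics

# Flag observations whose activity count deviates sharply from the
# historical baseline. Counts and the 2-sigma threshold are illustrative.

def flag_anomalies(history, recent, threshold=2.0):
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return [(day, count) for day, count in recent
            if stdev and abs(count - mean) / stdev > threshold]

baseline = [12, 15, 11, 14, 13, 12, 16, 14]     # e.g., daily vehicle sightings
latest = [("Mon", 13), ("Tue", 41), ("Wed", 12)]
print(flag_anomalies(baseline, latest))  # [('Tue', 41)]
```

A real dashboard would run one such comparison per entity per PMESII dimension, but the principle—automatically surfacing departures from the usual pattern of life—is the same.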
Thus, AI and databases combined will give decision-makers a more complete, multilayered, and consolidated view. Supported by AI, the intelligence cycle will be denser and faster. Therefore, the strategic level may develop more comprehensive studies describing systems and anticipating their changes. Having identified the centers of gravity of the system, the strategic level can then offer the political decision-maker a more complete analysis of the system and a set of actions to be taken, each affecting one or more centers of gravity.
IPOE as a Dynamic Process
With the integration of big data and AI, the information feeds into IPOE are all dynamic. To take advantage of these advances, the IPOE process itself needs to evolve. In current doctrine, IPOE is described as a dynamic process, but in practice it is still executed as a linear process with four major steps:
- Define the battlespace.
- Describe the operational environment.
- Evaluate the threat/adversary.
- Determine course of action.
Each of these steps is integrated into the military decision-making process to inform operational planning. This sequential flow helps to create focused information products like the modified combined obstacles overlay, situation template, and decision support templates. Key information like Named Areas of Interest (NAIs) and High-Value Target (HVT) lists are used during the execution of the operation to confirm courses of action and contribute to running estimates.
Rethinking IPOE as a Continuous Cycle
The reality of intelligence support to decision-making is that the intelligence picture is often incomplete and uncertain, and it takes time and resources to reduce the uncertainty. To support an agile decision-making process, it is more useful to think of the IPOE process as a cycle (Figure 4). Each phase in the cycle has information products that inform the next phase in the cycle and can also be used as decision-making inputs.
This larger cycle is a series of four smaller cycles, which should each be running continuously. The work required to improve knowledge in each phase can be divided into teams, and each team can be focused on their step in the cycle. A short example helps to illustrate this concept. If a joint task force is assigned a mission in a new area of operations, the division of labor could be defined as follows:
A BATTLESPACE DEFINITION TEAM
Initial Task: This team starts the process by obtaining the authoritative foundational intelligence for the area of interest. This would include information like standard maps, current imagery, order of battle information, human terrain data, historical climate and weather data, and known threats. This information would be shared with the other teams. Significant characteristics or activities in the area that might impact operations are identified during this phase.
Refinement: The foundation team would leverage in-theater and national assets and open-source intelligence to improve the foundational sources based on the needs of the mission. This refinement is man-hour intensive and usually requires specialists in data management.
AN OPERATIONAL ENVIRONMENT TEAM
Initial Task: This team would start with the foundational intelligence and create analytic models based on mission parameters to create derived products depicting the potential impact the environment will have on operations. These products include items like cross-country mobility, obstacles, mobility corridors, and avenues of approach.
Refinement: As foundational intelligence, knowledge of mission parameters and knowledge of adversary capabilities improve, analytic models would be refined and re-executed to produce up-to-date information products. Many of these models can run in an automated fashion using AI or statistical modeling but would need analysts to refine and shape them based on evolving parameters. This refinement is not as man-hour intensive, but does require trained analysts and data scientists to execute.
A THREAT EVALUATION TEAM
Initial Task: Identify threat force capabilities and the tactics, techniques, and procedures they employ. This task is completed using historical reporting and knowledge of the enemy and must be updated based on the mission parameters and conditions in the environment. Initial identification of high-value targets can occur based on the current picture and known mission objectives. This information is translated into dynamic graphic overlays and shared in real time.
Refinement: In today’s threat environment, the threat force’s capabilities and tactics, techniques, and procedures (TTPs) need to be constantly reassessed based on new information and changing conditions. This information is refined based on new reporting and assessments, and should be updated continuously to support the planning process. These threats might be regular, irregular, or hybrid. The evolving nature of the threat requires a constantly evolving threat picture.
A THREAT COURSE OF ACTION TEAM (COA)
Initial Task: The development of threat courses of action requires understanding from all of the other key phases. The initial task will be to develop the threat COAs based on known information gathered in the initial phases. The output of this process would be the situation template and threat COA statement. Additionally, the event template can be built and named areas of interest identified. All of these graphics would be shared as live overlays.
Refinement: As the underlying knowledge of the operational variables is improved, the likely COAs need to be reassessed with updates to all of the key information products. By maintaining live connections to all of the information and relevant analytic models, this process can be streamlined for the analysts who are making the assessments.
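The division of labor above can be sketched as a continuously repeating cycle in which each phase consumes the latest products of the prior phase and republishes its own. The phase functions below are stubs; in practice each is a team supported by automated analytic models:

```python
# Sketch of the four IPOE phases as a continuous cycle. Phase logic is
# stubbed; each stub stands in for a team plus its analytic models.

def define_battlespace(_):   return "foundation intelligence"
def describe_environment(f): return f"derived products from {f}"
def evaluate_threat(e):      return f"threat picture given {e}"
def develop_coas(t):         return f"threat COAs given {t}"

PHASES = [("battlespace", define_battlespace),
          ("environment", describe_environment),
          ("threat", evaluate_threat),
          ("coa", develop_coas)]

def run_cycle(iterations=2):
    """Run the whole cycle repeatedly; each pass refines every product."""
    latest, log = None, []
    for i in range(iterations):
        for name, phase in PHASES:
            latest = phase(latest)   # each phase builds on the prior output
            log.append((i, name))
    return log

print(run_cycle()[-1])  # (1, 'coa') after the second refinement pass
```

The sequential dependency survives—COAs still rest on the threat picture, which rests on the environment—but because the loop never stops, every product is continuously refreshed rather than frozen at the end of planning.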
A Digital Platform for IPOE
To support this workflow, the process for IPOE needs to be digitally transformed as well. Current IPOE workflows, while digital, still result in static products without any connection back to the data used to create the product. A digital approach to IPOE would shift the workflow by providing two key outputs:
- Dynamic and interactive information products.
- Application-ready web services.
These two outputs would support decision-makers and analysts through dynamic, always up-to-date products. The sharing of dynamic and interactive information products would allow decision-makers to drill down on the products. They would be able to have up-to-date information at their fingertips when planning. The application-ready web services would provide access to the raw data for machines and humans to use in their own workflows and processes. This is critical because the ability to automate and streamline this process requires machine-readable data. The technology to realize this exists today, but the concept of employment needs to change to achieve the decision advantage possible.
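In practice, "application-ready web services" means machine-readable output such as GeoJSON feature collections exposed over HTTP. A minimal sketch of serving a live named-area-of-interest overlay follows; the endpoint path and feature properties are invented for illustration:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Minimal sketch of an application-ready web service: a live overlay
# served as GeoJSON. The /nai path and properties are hypothetical.

def nai_feature_collection():
    return {
        "type": "FeatureCollection",
        "features": [{
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [43.2, 34.5]},
            "properties": {"name": "NAI 1", "status": "active"},
        }],
    }

class OverlayHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/nai":
            body = json.dumps(nai_feature_collection()).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/geo+json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

# To serve the overlay (blocks until interrupted):
# HTTPServer(("localhost", 8000), OverlayHandler).serve_forever()
```

Because the payload is standard GeoJSON rather than a static graphic, both a planner's map client and a downstream analytic model can consume the same overlay and always see the current state.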
The technology and data to realize a live IPOE exists today. As more data is created, there will be an imperative to leverage this technology to maintain decision advantage against adversaries. The Intelligence Community will need a data environment capable of dealing with the volume of big data, both structured and unstructured. AI will need to be employed to support analysts actively monitoring and integrating data. Finally, this technology needs to be employed to support a dynamic process within the intelligence sections conducting IPOE. This will help commanders deal with the increasingly complex threats we face today.