Maximizing analytical performance under pressure
By Dr. Colleen McCue and Lt. Col. Brandon J. Daigle
Critical thinking is an absolute “must-have” attribute for the successful implementation of advanced analytics, which include artificial intelligence (AI), machine learning (ML), and a host of other capabilities. The Intelligence Community is intimately familiar with the importance of critical thinking due to the inherent uncertainty of our operating environment. Moreover, the Intelligence Community often faces an added challenge: ensuring critical thinking and cognitive performance under stress, particularly as related to real-time/near-real-time operational support and “no fail” mission requirements. In response to this second, mission-specific requirement, the Joint Geospatial Intelligence Squadron has augmented traditional instruction in critical thinking and problem solving to address the challenges of maintaining cognitive performance under stress.
As we consider the increased availability and accessibility of advanced analytics, including AI and ML, critical thinking skills have become even more important to today’s GEOINT professional. Underscoring this point, empirical research in the commercial sector demonstrates that organizations realizing the promise of AI, rather than simply chasing opportunity, provide their technologists with ethics training. While ethical use of any technological capability is a worthy goal in and of itself, decades of experience with advanced analytics demonstrate repeatedly that biased algorithms simply do not work. Merely deciding to remove bias from analysis, however, is easier said than done given its subtle nature and the pervasive, hardwired errors in human intuition. Moreover, depending on the organization and role, many GEOINT professionals face the added challenge of maintaining cognitive performance under pressure in situations where the consequence of bias, faulty logic, or error can represent the difference between life and death. Therefore, the Joint Geospatial Intelligence Squadron has expanded our critical thinking training program and related efforts to address, or at least mitigate, these unique requirements.
Critical Thinking Basics
The human brain is a marvelous organ with functionality developed and tuned over the course of evolution. Unfortunately, many of the cognitive behavior patterns favored by natural selection run counter to the skills required for precise insight and critical thinking. Therefore, while quick decisions and an “80% solution” may have benefited our ancestors, they can result in cognitive bias and errors in judgment. As a result, cognitive biases—whether in the form of anchoring, confirmation bias, groupthink, the gambler’s fallacy, base rate neglect, or sampling bias—represent legitimate hurdles the intelligence analyst must overcome on a daily basis, and as such must be identified and mitigated. Therefore, some understanding of how the brain works, as well as of the cognitive “hacks” that have developed over time and the situations where they might appear, represents the foundation of current training in critical thinking for the intelligence analyst.
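Base rate neglect in particular lends itself to a quick worked example. The sketch below simply applies Bayes’ rule; the scenario and every number in it are illustrative assumptions, not drawn from any real system or case:

```python
# Base rate neglect: a detector that is "99% accurate" can still be
# wrong most of the time when the thing it detects is rare.
# All numbers below are illustrative assumptions.

p_target = 0.001          # base rate: 1 in 1,000 objects is a true target
p_hit = 0.99              # P(alert | target): detector sensitivity
p_false_alarm = 0.05      # P(alert | no target): false-alarm rate

# Bayes' rule: P(target | alert) = P(alert | target) * P(target) / P(alert)
p_alert = p_target * p_hit + (1 - p_target) * p_false_alarm
p_target_given_alert = (p_target * p_hit) / p_alert

print(f"P(target | alert) = {p_target_given_alert:.1%}")  # roughly 1.9%
```

Intuition (System 1) anchors on the 99% figure; working the base rate through shows that fewer than two in a hundred alerts would correspond to a real target.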
The work of Daniel Kahneman and Richards Heuer forms the foundation of critical thinking training for most courses taught throughout the community. As a means to better understand mental activity, Kahneman divides thinking into “System 1” and “System 2.” System 1 (Fast) is defined by automatic, quick thinking involving little or no effort or intention. System 2 (Slow), on the other hand, generally is associated with the subjective experience of concentration and thought, requiring intention and effortful cognitive activity. While it seems obvious that the intelligence analyst would want to ensure System 2 is actively and consistently engaged for consequential tasks, Kahneman also demonstrates through a series of clever experiments that the brain is inherently lazy, frequently reaching for intuitive shortcuts that are likely to be wrong. Unfortunately, the brain also tends to be confident in its faulty logic, often preferring intuition, tacit knowledge, and gut instinct over data and statistics, and believing these shortcuts to be accurate, reliable, and correct.
Heuer builds on and extends the work of Kahneman, successfully translating relevant cognitive psychology concepts and literature into intelligence analysis processes and activities in his text, “Psychology of Intelligence Analysis.” Both Kahneman and Heuer address the difficulty of effectively identifying and mitigating the brain’s predisposition to cognitive bias and heuristics, counseling that ongoing attention and awareness are key. As Heuer observes, cognitive bias can be “alleviated by conscious application of tools and techniques that should be in the analytical tradecraft toolkit of all intelligence analysts.” Similar to physical training, however, cognitive “fitness” is an ongoing requirement. One would not expect to conduct a single robust leg workout and then be done for a year, let alone a lifetime; yet analysts frequently receive training in critical thinking during college or early in their careers and believe they are set. Research indicates, though, that like the body, the “muscle” between our ears requires ongoing training and exercise, particularly if we expect it to perform well under pressure.
Will AI and ML fix this? Unfortunately, no. As Principal Deputy Director of National Intelligence (PDDNI) Sue Gordon noted in her keynote address at GEOINT 2018, algorithms are “someone’s opinion written in code.” Advanced analytics do not prevent bias; rather, they operationalize it and enable it to be performed at scale. This is particularly true with analytic methods and approaches that are opaque or “black box” in nature, rendering the ability to identify and mitigate bias even more challenging. As a result, those organizations realizing the promise of advanced analytics—the so-called “analytic competitors”—train their analysts in critical thinking and ethics.
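To make “operationalizing bias at scale” concrete, consider a minimal sketch. Everything in it is hypothetical — the regions, the counts, and the deliberately naive frequency “model” — but it illustrates how a skewed sample becomes automated, repeatable error:

```python
from collections import Counter

# Hypothetical historical reports: collection was concentrated in
# Region A, so the sample over-represents it regardless of ground truth.
biased_sample = ["A"] * 90 + ["B"] * 10

# A naive "model" learned from that sample: always predict the most
# frequently reported region.
model = Counter(biased_sample).most_common(1)[0][0]

# Applied at scale, the model routes every one of 1,000 new taskings to
# Region A -- the sampling bias is not corrected, it is automated, and
# the resulting reports would feed the next training cycle.
predictions = [model] * 1000
print(predictions.count("A"))  # 1000
```

The analyst who wrote the tasking rule exercised judgment once; the algorithm then repeats that judgment, flaws included, a thousand times without further scrutiny — which is why opaque or “black box” methods make bias harder, not easier, to catch.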
Maintaining Critical Thinking Under Pressure
Francis Bacon wrote, “Critical thinking is a desire to seek, patience to doubt, fondness to meditate, slowness to assert, readiness to consider, carefulness to dispose and set in order; and hatred for every kind of imposture.” While a worthy goal, the modern GEOINT professional rarely enjoys the deliberate, intentional, and truly mindful approach to analysis described by Bacon. Moreover, many of the challenges known to interfere with critical thinking—including stress, pressure, distraction, fatigue, and the requirement for quick judgment with limited information—are antithetical to the model proposed by Bacon but represent daily reality for the GEOINT professional. Therefore, beyond standard training on critical thinking concepts and skills, we include additional instruction on maintaining cognitive performance under pressure, particularly as it relates to the importance of repeated training, environmental tipping and cueing, the role that analytic workflow or process can play, and the influence of timing on performance.
Training: Again, similar to other organizations across the community, our critical thinking curriculum builds on the work of Heuer and Kahneman; however, we also include material specifically designed to understand and improve cognitive performance under stress. The analyst cadre is therefore exposed to the basic neuroscience underpinning cognitive performance, including foreshadowing of the limitations they may experience in real-world situations when pressure, fatigue, uncertainty, and stress are applied. After Action Reports (AARs) and case studies are reviewed to underscore the fact that knowledge and insight alone may not be sufficient to offset biology during fast-breaking or otherwise stressful real-world events. Bad decisions frequently do not stem from a lack of training; rather, they often are driven by an inability to perform well under pressure. As outlined by James Kerr in “Legacy,” understanding how the brain reacts to pressure and switches from a resourceful state to one that reverts to instinct and heuristics is an important first step. The training also covers the need for repeated exposure to critical thinking instruction and ongoing development and testing of those skills, as well as intentional tipping and cueing in support of a layered approach to skill development and maintenance. Again, parallels to physical training and operational performance provide intuitive context. With that in mind, techniques such as “skill ladders,” which build critical thinking competence and capacity under increasing levels of pressure, and other methods that introduce pressure during training scenarios are emphasized as a means of developing the ability to think clearly under stress.
In “When,” Daniel Pink reviews the literature on the role timing plays in cognitive performance. Unfortunately, the results suggest that while there are periods during the day associated with peak cognitive performance and staggering creativity, other times are marked by significantly reduced cognitive functioning. While the professional GEOINT analyst may not have the luxury of scheduling critical thinking tasks, particularly during real-time operations or breaking events, research indicates there are effective techniques that can be used to mitigate these known dips in cognitive performance, including timeouts and restorative breaks.
Checklists and “Maps”
As Atul Gawande outlines in “The Checklist Manifesto,” checklists or process maps can be particularly helpful during stressful situations in which the brain is simply looking for the nearest exit. Intentionally cueing critical thinking at specific steps can increase the likelihood that cognitive bias and faulty heuristics are at least considered. Therefore, these “maps” not only capture workflow to ensure completeness, but also provide tipping and cueing for critical thinking, including callouts for operational pauses or “timeouts” that intentionally cue the analyst to consider bias.
The “Gift” of Peer Review
Though it requires intellectual humility and trust to expose oneself to critique, even constructive criticism, unflinching peer review can begin to mitigate the error blindness that often limits the ability to effectively review one’s own work. Peer review, therefore, is a gift we can give to our teammates, one that improves their work by ensuring errors in logic and bias are identified and mitigated prior to distribution. Moreover, peer review provides the added benefit of cognitive training because critically reviewing content, if done well, requires activation of System 2 thinking and active engagement with the material.
“Lather, Rinse, Repeat”
With Kahneman’s caution that lazy thinking is natural, persistent, and difficult to overcome, classroom instruction represents only the beginning. Taking a cue from layered approaches to marketing, our local efforts have been designed to exploit multiple, complementary approaches. These include formal classroom instruction and training, ad hoc notes, direct desk-side consultation and support, a culture of peer review, visual cues, desk-side reference cards, reading lists, and other targeted professional development activities. This is all in an effort to create a model that approximates a “brand ecosystem” wherein content promulgated across multiple channels becomes intellectually “sticky” and begins to compel its own influence.
Again, AARs that document simple, preventable errors are invaluable in underscoring that even foreknowledge does not guarantee known patterns of cognitive bias will not continue to drive bad decisions, confirming that failures generally are not associated with a lack of skill. Rather, they are driven by an inability to perform well under pressure. As a result, critical thinking training cannot be “one and done.” Instead, repeated reminders in the form of analytic checklists or maps and operational pauses can be leveraged in support of real-time tipping and cueing. Additional tools include posters that visually cue, ad hoc notes with current research or case studies highlighting key themes, and “pocket cards” with simple checklists and maps for quick reference. Current posters and pocket cards include “Heilmeier’s Questions,” referenced by Dr. Lisa Porter at GEOINT 2019 as a means to effectively manage creativity, and “More than Math: Analysis as a Process,” which outlines a source-, method-, and technology-agnostic approach to analysis focused on the “why” (mission), operationalizing PDDNI Gordon’s GEOINT 2019 comment that “the craft is not the point of your endeavor”; it is merely the vehicle you use.
The proliferation and democratization of advanced analytic capabilities have made powerful tools, including AI and ML, accessible to the professional GEOINT analyst. Rather than mitigating cognitive bias, however, these capabilities may have increased the likelihood of errors in judgment. This is especially true for complex or “opaque” algorithms. Research in data science processes and methods demonstrates the importance of training in critical thinking and the ethical use of these capabilities to realizing the competitive advantage of advanced analytics, rather than simply chasing the next new thing. This requirement for intentional use is even more critical for the GEOINT professional working in an operational environment where stress, fatigue, pressure, uncertainty, and the urgency of a “no fail” mission create a perfect storm of conditions favoring cognitive bias and errors—errors that can have significant consequence. Research and experience provide many solutions; however, these must be applied regularly to offset “lazy” thinking and the brain’s inherent tendency to revert to faulty logic and heuristics. With the consistent application of training, maps, checklists, “timeouts,” and unflinching peer review, the professional GEOINT analyst can take steps to mitigate this challenge in support of accurate, reliable, and meaningful responses to some of our nation’s hardest problems.
About the Authors
Dr. Colleen McCue is a Principal Data Scientist with CACI International, where she supports special missions. She also is a member of the USGIF Academic Advisory Board. Dr. McCue earned her doctorate in psychology from Dartmouth College, and completed a five-year postdoctoral fellowship at the Medical College of Virginia, Virginia Commonwealth University.
Lieutenant Colonel Brandon J. Daigle is commander of the Joint Geospatial Intelligence Squadron in the JSOC Intelligence Brigade. He is a career Air Force Intelligence Officer with multiple deployments conducting intelligence, surveillance, and reconnaissance missions in support of Special Operations Forces. He earned his Bachelor of Science in Religion from Southern Christian University, his Master of Science in Organizational Leadership and Design from Amridge University, and his Master of Science in Defense Analysis/Irregular Warfare from the Naval Postgraduate School.
Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.
Heuer, R. (2003). Psychology of Intelligence Analysis, 3rd Edition. Center for the Study of Intelligence. Government Printing Office.
Kerr, J. (2013). Legacy: What the All Blacks Can Teach Us about the Business of Life. London: Constable.
Pink, D. (2018). When: The Scientific Secrets of Perfect Timing. New York: Riverhead Books.
Gawande, A. (2009). The Checklist Manifesto: How to Get Things Right. New York: Picador.
Schulz, K. (2010). Being Wrong: Adventures in the Margin of Error. New York: HarperCollins.
Headline Image: U.S. Army Spc. John Young-Ahuna, a friendly force tracking mission support specialist with the California National Guard’s Joint Operations Center, describes the tracking devices and interface system being used by the Guard and partner agencies during the Montecito, Calif., mudslide response, Jan. 11, 2018.