Pandemics and adversarial threats have made shoring up the national supply chain an urgent priority. How we get there, according to these GEOINT professionals, involves beefed-up benchmarking, broad data collection, and a merging of that data into one accessible set.
Violent extremists, political unrest, and natural disasters are no walk in the park. But mess with people’s groceries, and watch real panic set in.
Unable to find peanut butter and toilet paper during 18 months of the pandemic, Jeanette McMillian felt on a personal level the importance of a healthy national supply chain. That perspective carries particular weight: as deputy assistant director of the supply chain and cyber directorate at the National Counterintelligence and Security Center, she examines threats to the nation's goods and services every day.
McMillian's conclusion following the peanut butter and toilet paper shortages was that the U.S. needs a more robust, diverse set of suppliers. Her colleagues on Wednesday's panel agreed and, speaking more directly to the panel's topic ("GEOINT as Critical Supply Chain: Digital Transformation, Agility, Resiliency, and Sustainability"), identified benchmarking, data quantity, and data sharing as top priorities.
Recognizing abnormalities in the supply chain requires first understanding what “normal” looks like, said McMillian. “If you can quantify that, that’s where you’ll be able to pick out those anomalies and be ready to quickly make pivotal turns.”
“Anytime you’re monitoring change and [conducting] time-series analysis, if you don’t have a strong, robust baseline, there’s no way to do that,” said Ryan McKinney, head of strategic initiatives at Descartes Labs. “That’s the importance of having all the data all the time. You can’t do it with just a snapshot.”
About “all” that data: How to procure it when, as one audience member put it, “Much of the supply chain data is owned by companies that don’t make it readily available”?
“We need to consider what truly is ‘proprietary’ and then determine whether there is a common good everyone could gain by sharing this,” said McKinney. “I do believe there are organizations that are thinking about, ‘How can I use my data, perhaps anonymized, in a way that supports sustainability and supply chain initiatives.’”
“This is where universities shine,” said Cindy Mebruer, director of the Center for Supply Chain Excellence at Saint Louis University. “Researchers can go to a number of companies and get access to data that these companies [typically] would not share. And they can take it and build case studies and use studies that are beneficial to all companies.”
In addition to sharing their data, companies must also be willing to merge it with the data of their competitors. “You get the most value out of data sets when they’re all connected and aggregated,” said Ben Ruddell, director of the School of Informatics, Computing, and Cyber Systems at Northern Arizona University. “But none of the actors has an incentive to connect. No one wants to be the first mover, because you get all of the pain and none of the gain.”
Ruddell believes federal intervention, whether through incentives or legislation, would be an appropriate way to shift these attitudes. “The government has the authority to solve this problem and might need to, because who is going to do it otherwise?”
McMillian, on the other hand, thinks the solution could lie in a more collaborative approach. “We want to make sure we’re not dictatorial or regulatory but instead partnering with the industry,” she said. “When you’re contracting and working with the government, that’s where we’re going to have the most leverage.”
Whether it’s achieved through legislation or public-private partnerships, everyone agreed on the end goal, as stated by Ruddell: “We need a national supply chain intelligence capability which provides a near real-time picture of what’s going on in those commodity supply chains.”