The Case for Cloud

Intelligence Community remains bullish on cloud despite inherent challenges

In 2015, then-Deputy Director of the National Geospatial-Intelligence Agency (NGA) Sue Gordon made a proclamation that rocked the government IT world: NGA was moving to the cloud.

With demand for GEOINT exploding across the federal government—encompassing policymakers, the armed forces, the Intelligence Community (IC), and myriad others—NGA needed a flexible, scalable IT infrastructure to make its data and applications available quickly and easily to a wide swath of users. Cloud, Gordon hypothesized, was the answer.

“Getting our data into the cloud environment, where it is easily accessible and useful, is incredibly important to us,” Gordon told GIS provider Esri in a 2017 interview. “We needed to get our data in a place that would protect it, make it application agnostic, and allow it to be quickly retrieved—and the cloud is perfectly designed for that. From this point forward, we will develop everything in the cloud.”

But the reality of the cloud proved to be quite different from the idea of it.

“We very quickly found out…it wasn’t really the smart thing to do. And Sue Gordon would be the first person to say, ‘Yep. I think we probably need to rethink what we’re doing,’” NGA Chief Information Officer Mark Chatelain said Tuesday afternoon in the Government Hub at GEOINT 2023 in St. Louis, where he was part of a panel discussion about cloud computing within the IC.

But NGA didn’t give up on the cloud. Instead, it took inventory of the challenges and opportunities it presented, and embraced the federal government’s “Cloud Smart” strategy, which empowers agencies to adopt cloud—or not—based on their unique service and mission needs, technical requirements, and existing policy limitations.

Today, the entire IC is approaching the cloud with the same mix of enthusiasm and caution, moving forward wholeheartedly with cloud architectures when it makes sense to do so and holding back when it doesn’t, panelists conveyed during Tuesday’s session. Titled “IC Cloud Strategies,” the session featured moderator Nick Buck, CEO of The Buck Group and co-chair of the USGIF NRO Industry Advisory Working Group; Chatelain; and three other panelists: Fred Ingham, deputy chief information officer at the National Reconnaissance Office (NRO); Jennifer Kron, deputy chief information officer at the National Security Agency (NSA); and E.P. Mathew, deputy chief information officer at the Defense Intelligence Agency (DIA).

“For those who remember when we first started moving away from stovepipes to ISP/ASP, I think you’d agree we’ve come a long way in how we consume IT infrastructure. But I’d also contend that the industry base for IT infrastructure looks very different today than it did then. It’s more diverse, frankly more robust, and it’s less about components and more about fully integrated IT infrastructure systems,” Buck said during his opening remarks. “So, it’s kind of a good news/bad news story. The good news is we have lots of choice; as the mission evolves, we can choose different infrastructure, we can have different clouds, etc. The bad news is we have lots of choice; as a program, you have to make decisions, and the more choice you have, the harder and more complex that can be.”

During a wide-ranging conversation that lasted for more than an hour, panelists discussed both the good news and the bad news. One of the most compelling moments of dialogue, however, occurred halfway through the session, when Ingham described a hypothetical multi-cloud scenario that might occur inside any of the nation’s intelligence agencies. In it, the agency’s goal is to create a system that ingests data from a spaceborne asset, writing it to memory at very high speed, processing it, analyzing it, and ultimately turning it into intelligence that can be stored in a repository.

Once upon a time, an agency would have needed to invest in on-premises infrastructure to create that capability, including a very high-speed network, purpose-built servers with static RAM for writing data and GPUs for processing it, and a storage area network for data storage. Now, thanks to the cloud, the same agency can create the same capability by outsourcing each node in the process to a different cloud service provider (CSP).
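To make the idea concrete, here is a minimal Python sketch of such a stage-per-provider pipeline. Everything in it is hypothetical: the stage names, the "csp-a" through "csp-d" provider labels, and the toy transforms are illustrations, not any agency's actual architecture or any provider's API.

```python
# A minimal sketch of a stage-per-provider pipeline. Stage names,
# provider labels ("csp-a" .. "csp-d"), and the toy transforms are
# hypothetical; no real CSP API is being called here.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Stage:
    name: str
    provider: str                      # which CSP would host this node
    run: Callable[[bytes], bytes]      # stand-in for the node's actual work

def build_pipeline() -> List[Stage]:
    # Each node of the old on-premises chain maps to a different provider.
    return [
        Stage("ingest",  "csp-a", lambda d: d),                   # high-speed write to memory
        Stage("process", "csp-b", lambda d: d.upper()),           # GPU-style transform (stand-in)
        Stage("analyze", "csp-c", lambda d: d + b" [analyzed]"),  # analytic enrichment
        Stage("store",   "csp-d", lambda d: d),                   # repository write
    ]

def run_pipeline(data: bytes) -> bytes:
    for stage in build_pipeline():
        # Every hop in this loop crosses a provider boundary -- exactly
        # where identity handoffs, monitoring gaps, and egress charges live.
        data = stage.run(data)
    return data

if __name__ == "__main__":
    print(run_pipeline(b"sensor frame"))
```

The point of the sketch is the loop: each handoff that used to be an internal bus or LAN hop inside one data center becomes an inter-cloud boundary, which is precisely where the open questions below accumulate.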

At least, it can in theory. In practice, however, it can’t move that process to a multi-cloud environment without answering a litany of open questions, such as: What will be the architecture and governance model uniting disparate clouds and CSPs? Who in the multi-cloud ecosystem will provide operational support when it’s needed? How will users obtain comprehensive situational awareness across clouds? How will identity management function in and across the ecosystem? And what about ingress and egress charges?
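The last question, at least, can be reasoned about with simple arithmetic. The sketch below tallies hypothetical per-GB egress rates for a payload that exits several clouds on its way to the repository; the rates and provider labels are placeholders, not any provider's published pricing.

```python
# Back-of-envelope egress arithmetic. The per-GB rates are placeholders,
# not any provider's published pricing.
EGRESS_RATE_PER_GB = {
    "csp-a": 0.09,   # hypothetical $/GB leaving cloud A
    "csp-b": 0.08,
    "csp-c": 0.12,
}

def pipeline_egress_cost(gb_moved: float, hops: list) -> float:
    """Cost of moving the same payload out of each cloud it traverses."""
    return sum(EGRESS_RATE_PER_GB[cloud] * gb_moved for cloud in hops)

# A 500 GB collection that exits three clouds before landing in storage:
print(f"${pipeline_egress_cost(500, ['csp-a', 'csp-b', 'csp-c']):,.2f}")  # $145.00
```

Even at these toy rates, the same data pays a toll at every boundary it crosses, which is one reason a multi-cloud design that looks better on the whiteboard can look worse on the invoice.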

“What I like to think about is frictionless computing,” Ingham said. “That’s not frictionless. And until we solve those issues, I don’t see us being able to use the multi-cloud in the manner that I just described.”

For that reason, CSPs that normally relate to one another as competitors must begin to treat one another as collaborators, panelists proposed.

Concluded Kron, “It’s absolutely not a zero-sum game. To the extent that we can have a consolidated, coherent strategy across [the IC and across cloud service providers]…the faster we’re going to be able to figure out how we can move more and more of our workload into more optimized platforms, and the more opportunities we’re going to discover. There is virtually no limit on how much more we can do.”

 
