“I can see my house from here!”
A boy’s first flight aboard a commercial airliner is magical. At an altitude of 40,000 feet, he realizes for the first time the sheer size of the world in which he lives. The view from above — more vast than he’d ever imagined on the ground — offers him new perspectives and possibilities. The best way forward, he realizes, is up.
A soldier’s first flight aboard an Enhanced Medium Altitude Reconnaissance and Surveillance System (EMARSS) aircraft promises the same awakening all over again. Thanks to its seven-hour flight time, high-bandwidth data links, modular open system architecture, and generous payload capacity, the Boeing-built multi-INT platform offers a view that’s not only farther and wider than that on the ground, but also deeper. When it’s accessorized with cutting-edge sensors that capture imagery and other information, one can not only see the enemy’s house, but also who’s entering it, where they’re coming from, when they leave, and where they’re headed when they do.
“This one airplane can do what it used to take more than three airplanes to do,” said Mark Stephenson, Boeing’s EMARSS program manager, whose team in 2010 won a U.S. Army contract for four EMARSS aircraft, the first of which completed its first test flight in May 2013.
EMARSS is but one cog in an enormous machine that’s shaping the future of airborne intelligence, surveillance, and reconnaissance (ISR). For the boy looking down on his house—who one day becomes a soldier looking down on a threat—that future is one in which adversaries cannot hide. Along with sophisticated platforms and advanced sensors, this future requires new ways of working that replace “seeking” with “finding.”
The Case for Airborne
Airborne ISR dates back to the Civil War, when Union soldiers used hydrogen-filled observation balloons to spy on Confederate troops. Manned observation balloons carried over into World War I, which also introduced Zeppelins and the first fixed-wing reconnaissance aircraft. By World War II, aerial photography was prolific, producing superior situational awareness, but at a significant cost: Gathering intelligence required flying great distances to and from the battlefield, flying at low altitude over enemy territory, and developing miles of film that had to be manually analyzed and stored.
“Airborne ISR was very tedious in its infancy,” said Eric Zitz, a lead associate at Booz Allen Hamilton, where he serves as an intelligence integration specialist for the National Geospatial-Intelligence Agency (NGA). “It produced an incredible amount of raw film that had to be taken off the airplane and processed just like you would an old Kodak roll, then given to someone who knew what they were looking for.”
Coming out of World War II, when the dominant reconnaissance aircraft was the F-4 photo-reconnaissance variant of Lockheed's P-38 Lightning, the United States made several notable advancements in aerial platforms in the decades that followed, including the Lockheed U-2 jet-powered reconnaissance platform, introduced in 1955; the Ryan Model 147, an unmanned reconnaissance aircraft created in 1962; the Lockheed A-12 reconnaissance aircraft, first flown for the CIA as a U-2 successor in 1962; and the A-12's doppelganger, the Lockheed SR-71 Blackbird, a high-speed, high-altitude reconnaissance platform in use from 1966 until 1998.
“There was a fairly major breakthrough [in airborne ISR] in 1983, when we learned to put data links on our aircraft with the U-2,” said retired Air Force Maj. Gen. James Poss, former assistant deputy chief of staff for ISR at U.S. Air Force headquarters. “I’ll make the argument, though, that aerial reconnaissance really hadn’t changed much since WWII.”
It finally did change—“fundamentally,” according to Poss—in 1995, when the Abraham Karem-designed MQ-1 Predator unmanned aerial vehicle (UAV) entered service. “He took a miniaturized version of what CNN uses to broadcast live imagery and essentially designed an aircraft around it,” Poss said. “After Desert Storm, we invented the Global Positioning System (GPS) and deployed a massive amount of laser-guided weapons. Our adversaries learned they had to be constantly moving, and we now had a surveillance vehicle that would allow us to follow them persistently—whatever it took to find them and wait for the perfect moment to strike. It revolutionized the way we do airpower.”
The revolution wasn’t the Predator’s platform so much as it was its satellite data link, which allowed the military to collect continuous imagery from behind enemy lines. “We realized that WWII-style reconnaissance wasn’t going to work anymore,” Poss continued. “We needed constant surveillance and we needed advanced data links to allow us to go deep in [hostile] territory.”
Since then, demand for persistent surveillance and advanced data links has grown exponentially, driven by two trends: technology and terrorism. The War on Terror isn’t a war against stationary military targets such as airfields, ports, and complexes, but rather against individual adversaries who are constantly shifting locations.
Against this backdrop, airborne platforms are increasingly advantageous, as they can navigate around weather, get closer to targets, fly time- and place-specific missions, and allow for agility with respect to sensor selection.
“Satellite platforms offer data at a global scale according to a systematic time schedule, and with a highly centralized data processing and distribution system,” said Nancy McGee, federal business development manager for Fugro EarthData, a remote sensing, mapping, and GIS services company. “Airborne platforms offer the alternative of a more user-specific service at a regional scale, so that data acquisition can be organized flexibly both in time and space. They’re flexible, temporal, and targeted.”
Put another way, spaceborne assets have breadth, but airborne assets have depth.
“Satellites have huge collection footprints, but airborne platforms get a lot closer,” Zitz explained. “You don’t get as much area, but you get a much higher resolution.”
Given current threats, high resolution is a major benefit. “Imagine if the resolution is so good that I can not only see a guy, but I can see that guy’s face, or maybe even read the paper he’s holding,” Zitz said.
High-resolution imagery is only the beginning. Ultimately, airborne ISR consists of three major aspects: platforms; sensors; and processing, exploitation, and dissemination (PED) technologies, all of which are maturing rapidly.
When it comes to platforms, intelligence experts cite two major opportunities. The first is persistence: Platforms that can stay in the air longer can collect more and better intelligence.
“We’re very interested to find the knee of the curve for cost per flight hour in endurance,” Poss said. “We spend about 30 to 40 percent of our time transiting Predators and Reapers to target, so coming up with a long-endurance platform that won’t have to make that transit time every 24 hours, but only once a week, is very attractive.”
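The arithmetic behind Poss’ point can be sketched quickly. The numbers below are illustrative assumptions (an 8-hour round-trip transit, roughly a third of a 24-hour sortie, consistent with the 30 to 40 percent figure he cites), not program data:

```python
# Illustrative only: compare on-station time for a daily sortie vs. a
# week-long sortie when the round-trip transit cost is paid per sortie.

def on_station_hours(endurance_h: float, transit_h: float) -> float:
    """Hours actually spent over the target during one sortie."""
    return max(endurance_h - transit_h, 0.0)

transit = 8.0  # assumed round-trip transit, ~33% of a 24-hour sortie

daily = on_station_hours(24, transit)        # 16 hours on station
weekly = on_station_hours(24 * 7, transit)   # 160 hours on station

print(daily / 24)         # ~0.67 of flight time on station
print(weekly / (24 * 7))  # ~0.95 of flight time on station
```

Paying the transit cost once a week instead of once a day pushes the useful fraction of each flight hour from roughly two-thirds toward nearly all of it, which is why long-endurance platforms are attractive per flight hour.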
Because no pilot can fly for days, longer flight times are the province of UAVs—including the aforementioned Predator and Reaper, current versions of which have flight durations of 24 and 14 hours, respectively. Alternatively fueled UAVs now in development, such as Aurora Flight Sciences’ hydrogen-powered Orion and Titan Aerospace’s solar-powered SOLARA, promise endurance of five days and five years, respectively.
The second major opportunity modern platforms offer is payload capacity: Platforms that carry more weight can accommodate more sensors, the result of which is multi-INT functionality that produces a more complete intelligence picture.
“Multi-INT is extremely important,” said Mike Manzo, director of geospatial solutions in the Imagery Systems division at General Dynamics Advanced Information Systems. “You get a much richer picture when you’re looking at multiple [sources] of data.”
Miniaturization of sensors—fitting more capabilities into smaller, lighter, and therefore cheaper packages—is also key, according to Dr. Armando Guevara, president and CEO of aerial imaging technology company Visual Intelligence. “Miniaturization will bring to bear the fusion of sensors in ways that were not possible before, thereby creating a brand-new opening for multi-INT,” he said.
New platforms like EMARSS were designed with multi-INT fusion in mind and are therefore perfectly positioned to exploit and integrate the next generation of aerial sensors, including:
- Wide-area motion sensors capable of scanning entire cities;
- High-definition, full-motion video sensors that can help analysts distinguish, for instance, a civilian raising a cellphone from a militant raising a weapon;
- Canopy-penetrating LiDAR sensors that generate weather- and light-independent maps;
- Biometric sensors that can remotely establish targets’ identities;
- SIGINT sensors that detect telecommunications activity;
- Infrared sensors that register heat; and
- Hyperspectral sensors that can read hundreds of bands of color in order to identify materials and differentiate objects.
Combined, these capabilities create considerable context for the end user. For example, an analyst can not only see a group of people outside a house, but also tell the group is setting up a decoy because the image’s multi-spectral signature reveals what the house is made of. Or, seeing a car, an analyst can tell it recently arrived at the location because the infrared sensor shows the vehicle is still warm. Such insights are powerful aids to decision-making.
The Power of PED
The technological trajectory of platforms and sensors suggests infinite possibilities for airborne ISR. There’s just one problem: Data collection is evolving faster than data processing.
“The volume of image and sensor data we can generate means that management of image and sensor data is the primary computational challenge of the 21st century,” said Dr. David Brady, an optical engineer at Duke University, where researchers are developing an ultra-high-resolution camera—called Aware-2—for use in airborne ISR.
The volume of data already is more than analysts can digest.
“Presently, you’ve got a 4,000-person [data analysis] wing at Langley Air Force Base that’s barely keeping up with what our present generation of sensors is giving us,” Poss said. “We’ve got to put a lot more effort into figuring out ways to automate PED, which is really the toughest part of airborne ISR.”
In fact, industry is hard at work on evolved PED solutions, including advanced data links. Boeing, for example, is working on the Family of Advanced Beyond Line-of-Sight Terminals (FAB-T), which could support airborne ISR in remote areas with High Data Rate satellite communications. Meanwhile, San Diego-based Cubic Defense Applications is using cutting-edge micro-electronics to develop a smaller, lighter multiband digital data link system for the U.S. Navy’s MQ-8C Fire Scout unmanned helicopter.
“A data link system that used to be the size of a toaster oven is now the size of three Pop-Tarts,” said Robert Kalebaugh, senior director of business development for Cubic Defense Applications. “The smaller size benefits manned and unmanned air platforms because lighter-weight systems will save on fuel, which could allow the aircraft to fly longer missions. It is also much easier for ground troops to transport.”
And yet, new PED policies and infrastructure ultimately are needed in order for software and data links to deliver their promised benefits—especially as the U.S. shifts attention and resources out of Iraq and Afghanistan and into new, less familiar areas of interest, such as South America, Africa, and Asia, where a paucity of permissive airspace, funding, ground assets, and coalition partners could pose significant challenges.
“In the past, we have essentially owned the skies in the areas in which we wanted to operate, and because of that we’ve had the luxury of flying any type of airborne mission we wanted,” Manzo said. “Because we don’t necessarily own the airspace in the emerging areas we want to look at, the concepts, techniques, and procedures we use will be different.”
In asymmetric regions where targets are complex and resources scarce, collecting data is futile without the means to also interpret it.
“The discussion about PED should precede the discussion about platforms because it’s how information is examined and analyzed that allows us to get more squeeze out of the fruit,” said Lt. Col. Faye Cuevas, an intelligence officer in the U.S. Air Force Reserves. “Instead of focusing entirely on advanced sensors and integrating new phenomenology, we need to look at how we treat information once we have it.”
The DoD understands the importance of PED, which is why it created the Distributed Common Ground System (DCGS). Part of the Defense Intelligence Information Enterprise (DI2E)—the information network that connects DoD with the rest of the Intelligence Community (IC)—DCGS aims to improve data processing and exploitation by enabling shared intelligence across all four U.S. military branches.
“The DCGS family of systems was established to create a unified intelligence picture,” said Army Col. Charles Wells, program manager for the Army’s DCGS system, DCGS-A, which launched in 2005 and now contains more than 131 million pieces of data, not to mention every intelligence report made since 2004. “That’s powerful for two reasons. First, when you bring all your intelligence into one system you have what we call an all-source analyst who looks at all the pieces of the puzzle and puts together very powerful answers because they’re literally seeing all the dots they need to connect. Second, when you have a common architecture and a common framework, you start to get collaboration between services; we’re all building to a common blueprint.”
Although more progress is needed, current PED hardware—for example, Northrop Grumman’s Operational Ground Station (OGS), a truck-mounted military shelter—already is advancing the DCGS vision by connecting disparate Army networks, operators, and multi-INT sensors.
“We’re now getting more and more data at the tactical edge,” Wells said. “The question that remains is: How do we get more meaning out of that data?”
Reaching New Altitudes
The first step is to make sure PED technology keeps pace with advances in platforms and sensors.
“Where you achieve maximum capability is when all three are in sync,” said Dave Bottom, director of NGA’s information technology services directorate. “We have to make sure we have the PED that is able to handle what the sensor is able to collect and deliver it in such a way that the analyst or decision maker can understand it and act on it.”
The analyst is just as important as the technology, according to Bottom, who stressed the need to develop and deploy more multi-INT analysts, as has been done to support DCGS-A. “Both PED and collectors need to be optimized for the whole more than they are a particular type of phenomenology,” he said.
Instead of traditional PED centers of excellence, Cuevas advocates the deployment of analysts inside non-traditional organizations, like the U.S. Agency for International Development (USAID).
“In somewhere like Africa, you don’t always see bad guys, but because you’re in a place where bad guys go and bad things happen, there are other things within a frame of video or an image still that have relevance,” she said, emphasizing the value of seating analysts next to subject-matter experts who know problem sets best. “As a DoD intelligence analyst, I can analyze data for its intelligence value, but an environmental engineer, an agronomist, or a hydrologist can bring a unique texture that generates better understanding of the operational environment.”
Optimizing PED in this manner requires eliminating traditional stovepipes to facilitate more data sharing and collaboration across DoD and the IC, which despite DCGS and DI2E has been hindered by gaps in policy, culture, and governance.
One solution is democratizing data with cloud computing, resulting in PED that’s based on access instead of dissemination.
“Moving to a cloud or distributed model allows you to connect things that were once not connected,” Manzo said. “By pushing a lot of data into the cloud you’re broadening its reach and utility, and also breaking down those typical stovepipe barriers.”
Take PIXIA’s HiPER STARE and HiPER WATCH software, for example, which catalog, organize, and share large volumes of multi-INT data within a cloud-based architecture. With such solutions, intelligence is more “pull” than “push.” Analysts previously tasked with answering questions about disparate pieces of information can now query the cloud like they would a search engine to discover all relevant data, regardless of type or heritage.
Because it keeps data stationary, cloud computing also solves storage and bandwidth challenges associated with advanced sensors.
“What we have now is intelligence that requires a lot of storage and processing,” explained Wells, who said the next iteration of DCGS-A would live in the cloud. “Cloud computing allows us to do local processing and storage to get meaningful answers out of massive amounts of data on the tactical edge.”
Moore’s law—the principle that computing power doubles roughly every 18 months—likewise will help streamline data processing through automation. For instance, many sensor companies are developing onboard processing capabilities: as computing power increases, their ultimate goal is to equip sensors with processors that sort data as it is collected and forward only relevant information to PED specialists on the ground.
“That’s a smart thing to do for two reasons,” Wells continued. “First, you’re not getting as much raw data. When I receive data at DCGS-A, I already have half the answer I’m looking for. Second, it helps with bandwidth. Next-generation sensors are collecting terabytes of raw data, so doing some of the processing on board [reduces stress on our network].”
This could allow analysts on the ground to spend minutes looking at video instead of hours, thereby catalyzing better and faster decision-making. “It’s all about collecting the right data at the right time and having the right system in place to exploit it,” Manzo said.
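The filter-at-the-sensor idea Wells describes can be sketched in a few lines. Everything here—the frame structure, the relevance scores, the threshold—is a hypothetical illustration of the concept, not a real sensor API:

```python
# Hypothetical sketch of "process on board, downlink only what matters":
# an onboard detector tags each frame with a relevance score, and only
# frames above a threshold are transmitted to the ground station.

from dataclasses import dataclass

@dataclass
class Frame:
    frame_id: int
    size_mb: float
    relevance: float  # 0.0-1.0, assumed output of onboard detection

def downlink(frames, threshold=0.5):
    """Filter frames on board; return only those worth transmitting."""
    return [f for f in frames if f.relevance >= threshold]

# Ten 25 MB frames with evenly spread relevance scores (illustrative).
frames = [Frame(i, 25.0, i / 10) for i in range(10)]
sent = downlink(frames)

raw_mb = sum(f.size_mb for f in frames)
sent_mb = sum(f.size_mb for f in sent)
print(f"downlinked {sent_mb:.0f} of {raw_mb:.0f} MB")
```

In this toy run, half the frames clear the threshold, so the downlink carries half the raw volume—the same logic, at terabyte scale, is what eases the bandwidth pressure Wells describes.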
The Role of GEOINT
Because so much of airborne ISR is grounded in imagery, the GEOINT Community is ideally positioned to lead the transition from a focus on platforms to a focus on PED, a transition crucial to realizing a future in which intelligence isn’t just informational but also contextual.
“GEOINT has a unique ability to integrate with other [intelligence disciplines],” Bottom said. “Everything happens somewhere and at some time, so location is usually the first point of integration.”
Simply put: The GEOINT Community has the opportunity to champion consolidation and collaboration by modeling them.
“Geospatial intelligence is a critical component supporting our common operational picture. It is a central focus and foundation area for consolidated multi-intelligence, and will only increase in importance as we consolidate previously stove-piped intelligence and mission command systems into a common operational environment,” Wells concluded. “GEOINT subject matter experts must partner with their Intelligence Community counterparts to collaborate on technology advances, data collection, research and development to ensure the [U.S.] gets the maximum benefit from high-payoff [airborne ISR] capabilities.”