Connected Crisis

In the U.S., 2017 was a landmark year for natural disasters—and a data-driven turning point for domestic crisis mapping.

When Houstonians heard the name Harvey, they didn’t think much of it. Days ahead of the storm, the National Weather Service predicted it would unleash torrents of rain on southeastern Texas, causing unprecedented flooding in and around Harris County. But Houston had heard the same song before. In 2005, just weeks after Hurricane Katrina devastated New Orleans, local officials ordered a mass evacuation under the specter of Hurricane Rita. Millions fled, but few got far. A lethal combination of severe gridlock and extreme heat caused more than 100 evacuation-related deaths. And when evacuees finally escaped the misery of traffic to return home, most found their streets dry and their homes unscathed. Next time, officials and residents resolved, there would be no evacuation.

But Harvey wasn’t a false alarm. After making landfall Aug. 25, 2017, the Category 4 hurricane lost strength and became a tropical storm. Then, it stalled. For four days it meandered along the Texas coast, dropping more than 60 inches of rainfall that caused $125 billion in damage—the same amount caused by Katrina—and at least 68 deaths.

Unfortunately, Harvey was only the first in a devastating trilogy of storms. Days later, Hurricane Irma ripped through the Florida Keys. Two weeks after that, Hurricane Maria ransacked Puerto Rico. Both mangled the U.S. Virgin Islands on their journeys north.

In 2017, the United States experienced 16 weather and climate disasters with losses exceeding $1 billion each, according to the National Oceanic and Atmospheric Administration (NOAA). Collectively, the crises claimed 362 lives and caused approximately $306 billion in damage—a new U.S. record.

“It was a very difficult year,” said Federal Emergency Management Agency (FEMA) Chief Technology Officer Ted Okada. “We had numerous catastrophic disasters, any one of which would have been historic in and of itself. It was a nightmare.”

But 2017 also presented an opportunity. Twelve years after Katrina, domestic responders had cause to reflect on how far they’d come. What they saw when they did was a crisis mapping community that has been fundamentally transformed by geospatial intelligence (GEOINT), the pervasiveness of which has made the U.S. and its territories more responsive than ever to natural disasters.

Panel: Experts at GEOINT 2018 discuss spatial analytics for disaster planning. (Video courtesy of Trajectory On Location on Vimeo)

From Content to Context

Like the separate fibers that constitute a single rope, GEOINT and emergency management are tightly intertwined. They have been since Katrina, when the National Geospatial-Intelligence Agency (NGA) provided stakeholders a common operating picture that saved lives and hastened recovery.

Since then, crisis mapping—the real-time collection, analysis, and distribution of location-based disaster data—has evolved in the wake of numerous catastrophes, each of which has taught the global community valuable lessons. One of the most significant was the 2010 earthquake in Haiti.

“There was no mapping data of Haiti when the earthquake hit, so [volunteers] very quickly rallied to build a map of Haiti to help rescuers find homes and navigate roads,” recalled Kevin Bullock, director of business development at DigitalGlobe.

Volunteers once again rallied in 2013, when Typhoon Haiyan tore through the Philippines; in 2015, when an earthquake rattled Nepal; and in 2016, when another quake struck Ecuador. Each time, the crisis mapping community learned and advanced.

But Houston is not Haiti, and Key West is not Kathmandu.

Lt. Christopher Capule, an Air Station Corpus Christi pilot, monitors the weather as Hurricane Harvey approaches the Texas coast Aug. 24, 2017. (Photo by U.S. Coast Guard Petty Officer 3rd Class Johanna Strickland)

“There’s an interesting contrast between how we respond to a natural disaster internationally versus how we respond when it happens in the United States,” Bullock said. “When the storms hit last year, we were better prepared because local governments, state governments, the federal government, big companies like Google and Apple, and open-source projects like OpenStreetMap had already mapped Texas and Florida really well. Even Puerto Rico was fairly well mapped.”

It’s the difference between crawling and running. In developing nations, a shortage of remotely sensed imagery and maps means crisis mappers must create content; in developed nations, an abundance allows them to create context.

“Typically, the information that needs to be built in developing countries already exists domestically,” explained Ryan Lanclos, public safety industry team lead at Esri, whose Disaster Response Program (DRP) provides 24/7 support to GIS users during humanitarian crises. “That means the community here can focus on … the next step—providing additional information to help responders better assess when and where they need to be moving.”

Optimizing the “next step” has become the theme of domestic crisis mapping, according to Okada, who said U.S. responders entered a new phase of emergency management in 2017, fueled as much by data science as by natural science.

Satellites Save the Day

“2017 was, in effect, the year of the ‘big data’ disaster,” declared Okada, describing an “avalanche” of data. “The amount of data generated was so enormous that [disaster response] really became about data management.”

One explanation for the data torrent is the proliferation of commercial satellite imagery, according to NGA Chief of Disaster Analysis and Domestic Support Todd Noel.

“The commercial industry has boomed so much during the last two decades that information nobody had access to during Katrina everybody has access to now,” Noel said.

And he means everybody.

“During Hurricanes Harvey, Irma, and Maria, our representatives were inundated with phone calls from regular people like stay-at-home mothers who had evacuated and wanted to know if their house had survived the storm,” said DigitalGlobe’s Bullock. “We saw a huge influx of people who now understand the power of remote sensing.”

To satiate the growing appetite for commercial imagery during and after disasters, DigitalGlobe in 2016 launched its Open Data Program, through which it releases high-resolution satellite imagery in support of the humanitarian community. The program includes an activation protocol and a centralized data portal so responders know ahead of time what imagery DigitalGlobe will release and under what circumstances.

DigitalGlobe’s open imagery was especially helpful during Harvey because it included not only optical imagery but also synthetic aperture radar (SAR) imagery from RADARSAT-2, which became available to DigitalGlobe in 2017 after the company was acquired by RADARSAT-2’s Canadian operator, MacDonald, Dettwiler and Associates (MDA).

These satellite images, captured before and after Hurricane Irma, reveal damage at Long Beach, Key West, Florida. (Image courtesy of DigitalGlobe, a Maxar Company)

“With radar, we can operate at night or when there are clouds in the way,” Bullock said. “Because we had access to RADARSAT-2, we were able to create and release flood maps that showed the extent of flooding while the storm was still over Houston.”
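The flood maps Bullock describes rest on a simple physical fact: smooth open water reflects radar energy away from the sensor, so flooded areas appear dark in SAR imagery. Below is a minimal sketch of that thresholding approach in Python, assuming a calibrated, terrain-corrected backscatter image in decibels; the file names and the -15 dB cutoff are illustrative assumptions, not DigitalGlobe’s actual pipeline.

```python
# Minimal flood-extent sketch: open water scatters radar energy away from the
# sensor, so flooded pixels show low backscatter in SAR imagery.
# Assumes a calibrated, terrain-corrected backscatter GeoTIFF in dB;
# file names and the -15 dB threshold are illustrative only.
import numpy as np
import rasterio

THRESHOLD_DB = -15.0  # pixels darker than this are treated as open water

with rasterio.open("radarsat2_backscatter_db.tif") as src:
    backscatter = src.read(1)  # single-band backscatter, in dB
    profile = src.profile      # carries CRS and geotransform to the output

# Binary mask: 1 = likely open water/flooding, 0 = dry land.
# A production workflow would also difference this against a pre-event
# permanent-water mask so rivers and lakes aren't flagged as new flooding.
water_mask = (backscatter < THRESHOLD_DB).astype(np.uint8)

profile.update(dtype=rasterio.uint8, count=1, nodata=0)
with rasterio.open("flood_extent_mask.tif", "w", **profile) as dst:
    dst.write(water_mask, 1)
```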

GEOINT products such as this help create clarity in times of chaos.

“I spoke to a colleague who lived in Houston during Harvey, and he described it as a ‘fog of war,’” Bullock said. “Communications were spotty and nobody knew where to go for relief. With remote sensing, we can be a source of truth to let people know exactly where to go and what’s the best way to get there.”

Calling on the Crowd

Of course, commercial satellite imagery is not the only force reshaping emergency management. Another is the rise of social media. Data from both converge in crowdsourced initiatives, which in 2017 took on new significance within the domestic emergency management community.

The crowd is already established as a vital resource in developing nations, where volunteer mappers mobilize after a disaster to create much-needed base maps. In developed nations that already possess content, the crowd can instead help provide context.

In 2017, the crowd proved it could deliver this context thanks to volunteers like Jessica Decker and Joe Larson. Decker is a front-end developer and cartographer who moonlights as a volunteer crisis mapper. When Harvey hit Houston, she took to social media from her home in San Francisco in search of crisis mapping initiatives to join. When she couldn’t find any, she decided to start one using Fulcrum Community, the free version of Fulcrum, a mobile app developed by Spatial Networks to facilitate field data collection during humanitarian crises.

“We’d heard a lot of reports about people asking for assistance, but we didn’t see any real response yet by an official organization,” said Larson, an integration engineer at Spatial Networks. “So when Jessica submitted an appeal to provision a Fulcrum Community instance for Harvey, we said, ‘Absolutely.’”

Organized by Jessica Decker, 700 volunteers came together online to map locations of shelter, food distribution, medical aid, and more in the days that followed Hurricane Harvey’s devastation of Houston. (Image courtesy of Jessica Decker/Spatial Networks)

Together, Decker and Larson mobilized a remote community of digital volunteers who collaborated via the messaging app Slack, then used a mélange of source material—including social media posts, commercial satellite imagery, news reports, and even firsthand accounts from Houstonians—to map ground truth for the benefit of citizens and responders.

“The storm hit on Friday and we scaled up to 700 volunteers worldwide just over the weekend,” said Decker, who with Larson created videos and training materials on the fly to onboard volunteers. “The entire operation took place online; I didn’t leave the room I was in for the first five days.”

Volunteers mapped more than 1,400 resources, including medical facilities, food drop-off and distribution centers, shelters, animal hospitals, and more.

Thousands of Harvey survivors, rescuers, and volunteers likely benefited from Decker’s Fulcrum map and other crowdsourced resources, such as DigitalGlobe’s Tomnod.

“Tomnod is an extremely effective tool in crisis response,” said Bullock, adding that Tomnod has a community of approximately one million registered users. “By dispersing millions of chips of images to thousands of people, we’re able to very quickly crawl through imagery and do a damage assessment.”
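Tomnod’s internals aren’t detailed here, but the pattern Bullock describes—dispersing image chips and aggregating many independent looks—can be sketched generically. The helpers below tile a scene into chips and accept a label only when enough volunteers agree; the chip size and thresholds are illustrative assumptions, not Tomnod’s actual implementation.

```python
# Generic sketch of crowdsourced damage assessment: tile a scene into small
# "chips," show each chip to several volunteers, and accept a label only when
# enough of them agree. Illustrative only—not Tomnod's actual implementation.
from collections import Counter

CHIP_SIZE = 256  # pixels per chip side; illustrative

def chip_offsets(width: int, height: int):
    """Yield (col, row) pixel offsets tiling a scene into chips."""
    for row in range(0, height, CHIP_SIZE):
        for col in range(0, width, CHIP_SIZE):
            yield col, row

def consensus_label(votes, min_votes=3, min_agreement=0.7):
    """Return the majority label if enough volunteers agree, else None."""
    if len(votes) < min_votes:
        return None  # too few independent looks to trust any label
    label, count = Counter(votes).most_common(1)[0]
    return label if count / len(votes) >= min_agreement else None

print(consensus_label(["damaged", "damaged", "intact"]))   # None: only 67% agree
print(consensus_label(["damaged", "damaged", "damaged"]))  # "damaged"
```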

Bringing Grassroots to Government

Crowdsourcing can be just as vital during domestic disasters as international ones. But the Americans who most need the crowd’s help during a domestic crisis—federal responders—have been slow to accept it.

“Especially when it comes to sudden onset disasters, the need for collaboration between formal responders and volunteers is big but far underdeveloped,” explained Norwegian crisis mapper Per Aarvik, president of Standby Task Force, a global network of volunteer crisis mappers. “Formal responders have certain parameters for things like security and training, and that can make collaboration risky.”

In 2017, federal responders conceded: The benefits might justify the risks.

Federal agencies have experimented with crowdsourcing since at least 2012, when Hurricane Sandy pummeled the East Coast. At that time, FEMA recruited crisis mappers from the Humanitarian OpenStreetMap Team (HOT) to help it conduct damage assessments and solicited data from commercial entities like Waze, whose popular traffic app helped it identify gas shortages. But efforts since have been ad hoc and haphazard.

That changed last year, starting with the U.S. Coast Guard (USCG).

Coast Guard Academy Cadet Evan Twarog shows a heat map in the academy’s GIS lab on Sept. 1, 2017. (Photo courtesy of Humanity Road)

“During Hurricane Harvey, two [USCG Academy] cadets, under their own initiative, began combing social media looking for reports of people in distress. The cadets then created heat maps to show areas of high impact and sent updates three times per day to search and rescue coordinators in New Orleans. These products … supported the more traditional methods used by search and rescue coordinators to locate people in distress,” said USCG spokesperson Lt. Amy Midgett.

The grassroots effort—which marked the first time the USCG used crowdsourced social media for disaster response—was supported by Standby Task Force and another crisis mapping organization called Humanity Road. The cadets ultimately mapped 1,000 search-and-rescue cases involving 5,200 people.
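The cadets’ workflow isn’t documented beyond the account above, but its core step—turning geotagged distress reports into a heat map—can be sketched with a simple 2D histogram. The coordinates below are made up for illustration.

```python
# Sketch of building a heat map from geotagged distress reports: bin the
# points into a lat/lon grid so dense clusters stand out as "hot" cells.
# Generic illustration with made-up coordinates, not the cadets' workflow.
import numpy as np

# (lat, lon) of distress reports gleaned from social media (illustrative)
reports = [(29.76, -95.37), (29.75, -95.36), (29.77, -95.38), (29.60, -95.20)]

lats, lons = zip(*reports)
heat, lat_edges, lon_edges = np.histogram2d(lats, lons, bins=50)

# The densest cell is where search-and-rescue coordinators should look first
i, j = np.unravel_index(np.argmax(heat), heat.shape)
print(f"Hottest cell near lat {lat_edges[i]:.2f}, lon {lon_edges[j]:.2f} "
      f"with {int(heat[i, j])} reports")
```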

Like other federal efforts before it, the campaign was informal, finite, and fleeting. Nevertheless, it was a catalyst for significant change, according to FEMA Geospatial Information Officer Christopher Vaughan, who used it as inspiration for his own, more official operation.

“They were doing a really innovative thing, and we wanted to sponsor or encourage it. But it was their show,” said Vaughan, who saw an opportunity to carry the USCG’s baton when Harvey gave way to Irma. “I talked to our senior leadership about using crowdsourcing and … their response back to me was, ‘If not now, when?’”

Vaughan recruited U.S. Geological Survey (USGS) Innovation Specialist Dr. Sophia Liu, who under a FEMA mission assignment stood up the agency’s first-ever crowdsourcing desk.

“It’s the most official we’ve ever taken crowdsourcing,” continued Vaughan, who said the desk reached peak performance when Maria made landfall in Puerto Rico. “Power and communications were out, our stream gauge sensors weren’t working as well as they could have been, and there was persistent cloud cover. So we didn’t have our normal sources of information coming off the island, such as impacts to roads, bridges, and hospitals. That’s when we turned to crowdsourcing and ramped up our efforts.”

Under Liu’s direction, more than 5,000 digital volunteers from HOT, Standby Task Force, and other organizations converged. Collectively, they converted tens of thousands of images from the Civil Air Patrol into actionable insights.

“That gave us some of the earliest pictures we had of impacts, and faster than we would have had them if we’d waited for the power to come back on and for traditional information to start flowing again,” Vaughan said. “It was amazing.”

The volunteers thought it was amazing, too.

“This was a very significant event for FEMA and crowdsourcing,” said Russell Deffner, a project manager and crisis mapping lead for HOT. “It was a great collaboration and an eye-opener for the U.S. government to see crowdsourcing as a potential resource.”

Now that they appreciate the crowd, Deffner and Vaughan agree, the feds must determine how to institutionalize it.

“There has been a perspective change,” Vaughan said. “There is now acceptance and openness toward using these tools in what has traditionally been a very rigid system.”

Advancing that goal is a fledgling effort to develop a “Crowdsourcing Playbook” that will standardize when, how, and toward what end FEMA will use crowdsourcing.

“If there’s an event we’re needed for, we’re going to activate no matter what,” Deffner said. “What FEMA has realized is that if we’re going to activate anyway, they might as well direct us so that we can contribute what they need the most in order to save people.”

The Digital Dilemma

The value of crowdsourcing is obvious: More people yield more data, and more data yields better emergency response. But there’s a potential downside, too: Without the right governance in place, data can obscure instead of illuminate.

“More and more information is coming at us, and we’ve got to do a better job of organizing it and structuring it so people can actually consume it,” Vaughan said.

To make sure it can leverage data effectively, FEMA has spent the last three years standardizing its geospatial products around six themes: hazard identification, resource needs, population impacts, building impacts, transportation impacts, and infrastructure threats. In the wake of a disaster, it now delivers 21 standard products derived from those six themes. The result, according to Vaughan, is a template approach to GEOINT production that increases speed and efficiency, which gives FEMA more capacity to ingest information.
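The article names the six themes but not the 21 products themselves, so the registry below uses hypothetical placeholders; it simply illustrates what a template-driven product catalog of the kind Vaughan describes might look like.

```python
# Sketch of a template-driven GEOINT product catalog. The six themes come
# from the article; the products and input layers are hypothetical
# placeholders, not FEMA's actual 21 standard products.
from dataclasses import dataclass

THEMES = [
    "hazard identification", "resource needs", "population impacts",
    "building impacts", "transportation impacts", "infrastructure threats",
]

@dataclass
class ProductTemplate:
    theme: str
    name: str
    inputs: list  # data layers the template expects

CATALOG = [
    ProductTemplate("hazard identification", "flood extent map (example)",
                    ["sar_water_mask"]),
    ProductTemplate("population impacts", "exposed population estimate (example)",
                    ["census_blocks", "flood_extent"]),
    # ...one entry per standard product, grouped under the six themes
]

def products_for_theme(theme: str):
    """Return every templated product registered under a theme."""
    assert theme in THEMES, f"unknown theme: {theme}"
    return [p for p in CATALOG if p.theme == theme]

print([p.name for p in products_for_theme("hazard identification")])
```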

“As I reflect on 2017, I think our dedicated focus on standardization was the seminal thing that carried the day,” Vaughan said.

One outcome of standardization is a set of more mature data models for conducting damage assessments, which have reduced FEMA’s reliance on external partners.

“We’ve built our own internal capabilities such that we’re able to stand on our own two feet,” Vaughan said. “Where we traditionally relied on NGA for their analysis, we’re now able to do that ourselves.”

NGA’s Noel said the organization has welcomed the change.

“Last year was a banner year for disasters, but it also represented a big change for NGA in terms of how we support our domestic partners,” Noel said. “In fact, Harvey was one of the first times that there was no request from FEMA for NGA analytical support.”

Most NGA analysts are devoted to intelligence problems. Fewer FEMA requests mean those who work on disaster analysis can devote their time to more sophisticated analytic pursuits.

“That doesn’t mean NGA is off the hook,” Vaughan said. “In fact, we have a very positive relationship with NGA, and they’re still very much supporting our mission” by supplying all the foundation imagery on which FEMA’s models run.

“That’s where the future for NGA is really going to be,” echoed Noel. “It’s about providing the data.”

Although the role of “data steward” sounds peripheral, it’s actually critical. Consider, for example, NGA’s work in Puerto Rico in the aftermath of Maria. While FEMA did not request analytic assistance from NGA during Harvey or Irma, the stress of a third storm necessitated reinforcements. At FEMA’s request, NGA sent two analysts—Michelle Nichols and Grant Eaton—to Puerto Rico to provide onsite GEOINT support. Upon arrival, they found as much digital disarray as physical destruction.

Preliminary damage assessments of St. Martin from Sept. 9, 2017. The yellow, orange, red, and purple areas mark damage levels as affected, minor, major, and destroyed, respectively. (Image courtesy of NGA)

“When we got there, FEMA was stretched pretty thin; it was a huge operation with very little geospatial support,” explained Nichols, who said there was a “disconnect in data”—an abundance of data, but a deficit of governance to exploit it. Responders lacked a common operating picture, which caused inefficiencies like duplicate deliveries; while some survivors received extra provisions, others received none.

To bring focus to a blurry landscape, Nichols and Eaton served as data conduits that standardized, organized, and disseminated geospatial information.

“Having very good data organizers is really important from the get-go,” Noel said. “If you walk into a disaster, and you have a lot of people there who are providing data, you need people to catalog it and organize it and make it usable to folks when they finally need it. That’s what Michelle and Grant were able to do, and moving forward we’re working with FEMA to ensure there are a lot more people like that available to actually manage the data.”

FEMA is also pursuing efficiencies with the help of technology, starting with dissemination. The sheer number of available data sources makes them difficult to discover, so FEMA has launched a data repository—disasters.geoplatform.gov—where it will curate authoritative apps, services, datasets, and APIs from the public and private sectors, giving responders a one-stop shop for data discovery.

One of the apps currently available is Hurricane Incident Journal, which speaks to another FEMA strategy: automation. Whenever the National Hurricane Center announces a hurricane advisory, Hurricane Incident Journal automatically captures data and flows it into pre-made models that generate standard mapping products inside the application.

“Eventually, my dream is that we’ve got everything set up in such a way that as an incident starts to unfold, the analytics will start to roll out without any human intervention at all. It will be like a very complex domino set where one workflow kicks off another workflow,” said Vaughan. “I don’t think we’re very far away from that.”
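As a rough illustration of the “domino” automation Vaughan envisions—an advisory appears, data capture fires, and models run without human intervention—here is a polling sketch in Python. The feed URL is an assumption based on NHC’s public RSS feeds, and the two workflow functions are hypothetical placeholders, not Hurricane Incident Journal’s actual internals.

```python
# Sketch of advisory-triggered automation: poll an advisory feed, and when a
# new item appears, run a chain of pre-built workflows. The feed URL is an
# assumed NHC Atlantic RSS feed; the workflow functions are hypothetical
# placeholders, not Hurricane Incident Journal's internals.
import time
import feedparser  # pip install feedparser

FEED_URL = "https://www.nhc.noaa.gov/index-at.xml"  # assumed advisory feed

def capture_advisory_data(entry):
    """Placeholder workflow 1: pull forecast track and precipitation layers."""
    print(f"Capturing data for: {entry.title}")

def run_standard_models(entry):
    """Placeholder workflow 2: push captured data through pre-made models."""
    print(f"Generating standard products for: {entry.title}")

seen = set()
while True:
    for entry in feedparser.parse(FEED_URL).entries:
        key = entry.get("id", entry.get("link"))  # RSS guid, else the link
        if key not in seen:        # a new advisory starts the domino chain
            seen.add(key)
            capture_advisory_data(entry)  # workflow 1 kicks off...
            run_standard_models(entry)    # ...workflow 2
    time.sleep(600)  # poll every 10 minutes
```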

Featured Image: Public safety organizations used Esri mapping technology to track Hurricane Harvey’s forecasted path while analyzing anticipated precipitation levels. (Image courtesy of Esri)
