Public Safety & Emergency Management – Trajectory Magazine

Trajectory is the official publication of the United States Geospatial Intelligence Foundation (USGIF), the nonprofit, educational organization supporting the geospatial intelligence tradecraft.

Modeling Outcome-Based Geospatial Intelligence
The applications of GEOINT models to enable decision-making in myriad sectors

Society has tremendous capabilities to prepare for, respond to, and recover from natural disasters, emergency events, and security incidents—but these capabilities each require time, space, and energy to mobilize and focus. Geospatial intelligence (GEOINT) is a key element in engaging response capabilities in the right way, at the right time, and in the right place. In particular, the connection between GEOINT and modeling has emerged as a capability decision-makers and response teams can rely upon to increase the correctness, reliability, and timeliness of their decisions.

Today, the most notable example of models in action occurs in the realm of hurricanes and severe weather. The impact of hurricanes on the United States and across the globe continues to be a challenging part of our natural hazard landscape. We experience not only direct threats to life and property from such storms but also increased second- and third-order effects. Important tools to focus action (while simultaneously garnering public attention) include the ubiquitous European and United States hurricane models we are accustomed to seeing. They enable us to track and predict where a hurricane may travel and its potential impacts. Similarly, we rely on flood and inundation models to predict storm surge and intensity.

As GEOINT systems collect additional data and decision-makers are exposed to more complex and useful models, the demand for modeled outputs in a variety of applications will likely grow. They present data from spatial and dynamic sources as a unique combination of time and space in a format that decision-makers, the public, first responders, non-governmental organizations, and recovery experts all use to enhance and support their actions. This reflects a shift from simple data collection and display to a world where we ask and expect an answer to the question: “What does it all mean?”

The Integration of GEOINT and Mathematical Modeling

From a GEOINT perspective, the output of modeling can be used to analyze the projected path of a hurricane or wildland fire, depict and assess the event's impact on structures within that path, and prioritize evacuations. The objective of modeling is to create a simulation of real-world events or conditions. By using modeling to focus on real-world outcomes such as population shifts or emergency services response times, the GEOINT Community is beginning to transform geospatial data from discrete data elements into the direct language of decision-makers and operations personnel.

While many think of modeling in terms of weather and hurricane models or Monte Carlo simulations, it is the application of those models to predictive situations that improves accuracy and extends the time available for decision-making. Delivering outcome-based modeling as a GEOINT product allows policy-makers to more rapidly assess both risks and opportunities. Models allow us to investigate complex things by applying our knowledge of simpler things. Once a model is proven consistent with supporting evidence and therefore accepted, it can be confidently used to make reliable predictions.

“What if?” analysis is one of the most recognizable modeling outputs. Often called “predictive analytics,” these models encompass a variety of modeling and statistical techniques ranging from machine learning (ML) to linear regression to multivariate analysis. These models allow policy-makers to assess various decisions, predict uncertainty, and assess the impacts of one attribute versus another. They present insights into what may happen based on trends, combinations of data and patterns, or rule-based behaviors. Predictive models generally focus on outcomes rather than numbers. Decision-makers are far more likely to be asked when a river will reach flood stage, and what to do about it, than what the reading is on a particular stream gauge. The goal is to answer not only what may happen in the environment, but also what would be the impact of various decisions on the outcome.
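As a toy illustration of the “What if?” pattern (synthetic data and a stand-in flood-stage question, not any production system described here), the sketch below fits a simple classifier to history and then queries it with candidate scenarios:

```python
# Toy "What if?" model: probability a river reaches flood stage given
# upstream rainfall and the current gauge reading. All data synthetic;
# a real system would train on historical observations.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 2000
rain_mm = rng.gamma(shape=2.0, scale=15.0, size=n)   # 48-hour rainfall
gauge_m = rng.normal(loc=2.0, scale=0.5, size=n)     # current river stage
# Hypothetical ground truth: flooding becomes likely past a combined threshold.
flooded = (0.05 * rain_mm + 1.2 * gauge_m + rng.normal(0, 0.4, n)) > 4.0

model = LogisticRegression().fit(np.column_stack([rain_mm, gauge_m]), flooded)

# "What if?" queries: compare decision-relevant scenarios, not raw readings.
scenarios = np.array([
    [30.0, 2.0],   # moderate rain, normal stage
    [90.0, 2.5],   # heavy rain, elevated stage
])
for (rain, gauge), p in zip(scenarios, model.predict_proba(scenarios)[:, 1]):
    print(f"rain={rain:.0f}mm gauge={gauge:.1f}m -> P(flood stage)={p:.2f}")
```

The point is the shape of the interaction: the decision-maker supplies scenarios and receives outcome probabilities, not instrument readings.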

Applying Models to Decision-Making 

The majority of models are developed in academia or other technically sophisticated environments. Transitioning them from expert users to emergency service practitioners poses a variety of challenges. The development and transition of models such as Hazus or the European and U.S. hurricane models into common tools used by responders requires consistent mentorship and stewardship by both the academic community and the targeted user community, typically including a centralized authority such as the Federal Emergency Management Agency (FEMA).

As demand for actionable GEOINT information increases, academia and GEOINT producers need to focus on ways to speed up the long adoption process (five to eight years or more in some cases) that models typically take to transition from science into practice. One example of a recent effort to streamline this adoption cycle is the State of Colorado's work to integrate a fire behavior model directly into fire operations. The Colorado Center of Excellence for Advanced Technology Aerial Firefighting (CoE) recently entered into a partnership with the National Center for Atmospheric Research (NCAR) to transition a weather-based wildland fire predictive model to operational use. As part of the project, the CoE not only provided training material to firefighters, but also engineered a separate training module that allows multivariant simulations based on specific geographic and meteorological case studies. These enhancements improved the usability of the modeling system for non-experts and hastened potential adoption by the wildland fire community.

This article is part of USGIF's 2018 State & Future of GEOINT Report.

Current Uses of Modeling 

Emergency Services Delivery: Increasingly, fire and emergency medical services (EMS) leaders are applying quantitative standards, confronting complex problems, and seeking innovative deployment solutions. There is growing demand within the fire and EMS communities for decision support systems that can answer “What if?” questions and allow for deployment planning based on future incident predictions. This confluence of factors is spurring the development of the kinds of hybrid GEOINT solutions described below.

A number of cities around the country are using hybrid modeling, geospatial data, and ML techniques to evaluate performance and determine efficiencies. For example, the city of Palo Alto recently faced a daunting problem: placing, scheduling, and staffing paramedic units to optimize five competing performance metrics within budget and physical constraints. A unique software solution that paired ML algorithms with discrete simulation and geospatial-temporal data identified eight near-optimal solutions out of millions of alternatives for command staff to evaluate. Nearby Redwood City addressed the problem of unprecedented growth with a hybrid system that combined machine learning with geospatial-temporal analysis tools to generate an accurate, validated model of future incident profiles. This model generated future scenarios using discrete event simulation, enabling command staff to assess key performance metrics such as response time, utilization, budget impact, and system resiliency against both reasonably anticipated and extreme versions of future scenarios.

Fire Behavior: Fire behavior modeling projects the behavior and effects of fire activity to inform and guide prevention measures, response tactics, resource management, and safety decisions across all levels of the fire service—from wildfire to structure fire, from initial attack to sustained attack, and throughout the cycle of land and forest management. As a fire response increases in complexity, wildfire modeling is used for tactical planning, the assessment of future resource requirements, and evacuation planning.

Wildfire behavior models range from empirical fire spread estimation to physical models. They combine geospatial data (slope, fuel/land cover) and dynamic data (weather measurements) to provide specific decision products including fire spread, intensity, and behavior. The State of Colorado is currently in the second year of a multiyear effort to deploy a wildland fire prediction system based on improved weather data provided by the High-Resolution Rapid Refresh (HRRR) model developed by the National Oceanic and Atmospheric Administration (NOAA). Rather than relying on a single point forecast, as was the case in previous generations of fire models, the HRRR model provides a gridded forecast at three-kilometer spatial resolution across the state. The HRRR model creates a new 18-hour forecast every hour and uses radar data to model the propagation of existing storms. The fire prediction system creates a hyper-local forecast over 36 square kilometers at the location of the fire, a forecast suitable for the operational and tactical decisions of ground personnel because it captures micro-terrain and local winds.
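A 36-square-kilometer window on a three-kilometer grid is a 2-by-2 block of cells, which makes the subsetting step easy to picture. A minimal sketch (synthetic wind field and a made-up grid origin; real HRRR output is distributed as GRIB2 and would be read with a dedicated library):

```python
# Extract the 2x2 block of 3-km grid cells (36 km^2) nearest a fire.
# Synthetic wind field standing in for one forecast hour of a gridded model.
import numpy as np

ny, nx, cell_km = 400, 500, 3.0                  # grid shape and spacing
wind_ms = np.random.default_rng(0).uniform(0, 20, size=(ny, nx))

origin_east_km = origin_north_km = 0.0           # hypothetical grid origin

def local_window(east_km, north_km, field, half=1):
    """Return the (2*half x 2*half) cell window nearest a point."""
    j = int((east_km - origin_east_km) // cell_km)
    i = int((north_km - origin_north_km) // cell_km)
    return field[max(i - half, 0):i + half, max(j - half, 0):j + half]

fire_window = local_window(750.0, 600.0, wind_ms)
print(fire_window.shape, "max wind (m/s):", round(float(fire_window.max()), 1))
```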

In Southern California, a system called Wildfire Analyst simulates 10 million fires daily to evaluate the potential impact of wildfires on values at risk and utility lines, using a two-mile-resolution local model and a high-resolution, five-meter fuels analysis. This massive analysis allows local utility managers to prepare for the potential impact of wildfires up to three days in advance. Similar systems are used in Chile by Corporación Nacional Forestal (CONAF), which uses real-time observations to calibrate modeling of large-scale incidents such as the Las Maquinas Fire in January 2017. CONAF uses the modeling not only to support incidents but also to communicate to the public what remains intact after every big fire, comparing the actual burned area against simulated fire progressions that would have occurred without suppression efforts.

Land Use: Land use change modeling projects historic patterns into the future and visualizes alternative futures as a tool for decision-making by local or regional government officials. This type of modeling helps reveal the causes, mechanisms, and consequences of land use dynamics by modeling the interaction in space and time between humans and the environment. GEOINT data in the form of satellite images and maps play a key role.

One example of a land use change model is SLEUTH, which stands for Slope, Land Cover, Exclusion, Urban, Transportation, and Hillshade—the model's input data. For more than 20 years, SLEUTH has been used in 18 countries to study land use change. The impacts of urban growth can also be examined: one such study assessed declining water quality in the Chesapeake Bay estuary due in part to disruptions in the hydrological system caused by urban and suburban development. Land use change models will continue to be used to study the complex interactions of urban dynamics and can be used by local and regional governments to inform policy decisions.

Watershed: Providing water for human and ecological needs remains a challenge for local and regional government officials worldwide. As populations grow and demand for water increases, land and water resource management is evolving from simple, local-scale problems toward complex, regional ones. Such problems can be addressed with models that can compute runoff and erosion at different spatial and temporal scales. In 2002, the U.S. Environmental Protection Agency, the U.S. Department of Agriculture, the University of Arizona, and the University of Wyoming first developed an automated, GIS-based watershed modeling tool. Now under continual development, the Automated Geospatial Watershed Assessment (AGWA) helps decision-makers manage and analyze water quantity and quality. AGWA utilizes the Kinematic Runoff and Erosion (KINEROS2) hydrologic model and the Soil and Water Assessment Tool (SWAT) to evaluate watersheds with varying soils, land uses, and management conditions, and their related environmental and economic impacts. AGWA has also been used to analyze the land impacts of coalbed methane extraction, manage impacts from military training activities, and evaluate stream flow on military bases in the southwestern U.S.
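For a feel of the simplest end of runoff estimation, the classic SCS curve-number formula relates storm rainfall to runoff through a single land-cover parameter. This is a textbook method shown only for flavor, not AGWA's KINEROS2 or SWAT engines:

```python
# SCS curve-number runoff: Q = (P - 0.2S)^2 / (P + 0.8S), with S = 1000/CN - 10.
# P and Q in inches; CN encodes land cover and soil condition.
def scs_runoff_in(p_in, cn):
    s = 1000.0 / cn - 10.0        # potential retention
    ia = 0.2 * s                  # initial abstraction
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in - ia + s)

# Same 2.5-inch storm, two land covers: runoff rises sharply with urbanization.
for label, cn in [("rangeland, good condition", 61), ("urban, paved", 95)]:
    print(f"{label:26s} CN={cn}: Q = {scs_runoff_in(2.5, cn):.2f} in")
```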

Sea-Level Rise: The effects of rising sea levels range from large-scale population displacement to critical infrastructure degradation due to saltwater intrusion. Coastal erosion is evident in many areas of the world, notably along the Louisiana coast in the U.S. In 2014, the U.S. Geological Survey and the University of San Francisco published a new marsh accretion model, WARMER, to assess the risk sea-level rise poses to salt marsh parcels around San Francisco Bay. The aim of this model is to provide site-specific sea-level rise predictions to land managers through the intensive collection of field data and innovative predictive modeling. WARMER indicates that most salt marsh around San Francisco Bay will transition from high to mid marsh by 2040, to low marsh by 2060, and to mudflat by 2080; however, there is a great deal of variation around the bay.

What Makes a Good GEOINT Model?

A scientific model must not only generate predictions, but also generate results that are used and accepted by decision-makers. As observers of the natural world, we will only accept a model if its predictions stand up against outcomes we can observe. Although specific fields and disciplines may accept and use models with varying attributes, models that can be integrated as a GEOINT product must share some specific attributes. Here are some characteristics of good models:

Output that is linked to decision or analytical objectives: Models that support specific identifiable decisions or outputs support the entire GEOINT cycle. In this sense, they are “products” and must be aligned to an information need or decision point. They are the automated counterpart of manual analysis and reports.

Consistent, identifiable, and available data: The data that feed a model can turn a good model into an inconsistent or irrelevant one. GEOINT models are used to support decisions at all levels, from analysts to the public, and at frequencies that range from occasional one-off runs to routine, automated ones. As a result, they should be aligned to consistent, accurate, and standardized geospatial data for which an analyst or automated system has a reasonable expectation of availability.

Ability to assess and compare the impact of inputs: The sensitivity of a model to changes and variation in input data is directly linked to decision-maker understanding, trust, and adoption. A model whose output swings widely in response to small changes in its inputs will struggle to earn decision-makers' trust (a minimal sensitivity check is sketched after this list).

Consistent outputs: A model must produce output that is consistent with inputs. Although one of the benefits of modern modeling is that it goes beyond pattern assessment, decision processes and trust in systems begin to wane when similar inputs result in different outcomes.

Ability to assimilate real-time observations: Operational models need to provide answers in near real time and respond to the influx of massive amounts of data that can be captured from operators, citizens, and the wealth of sensors available through the Internet of Things (IoT) and remote platforms.

Produce results for advanced visualization platforms: Traditional GEOINT output is the map, a specialist's tool used in planning and operations. But our audience (GEOINT professionals, decision-makers, the public) demands easy-to-understand information in a variety of formats (e.g., 360-degree videos, 4D immersive environments, augmented and virtual reality).
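One way to quantify the input-sensitivity attribute described above is a one-at-a-time sweep: perturb each input slightly and record how far the output moves. A minimal sketch, with a stand-in function in place of any real GEOINT model:

```python
# One-at-a-time sensitivity check: bump each input by 5% and report the
# relative change in output. The model here is a stand-in, not a real one.
import numpy as np

def model(x):
    rainfall_mm, slope_deg, fuel_load = x
    return 0.02 * rainfall_mm + 0.5 * np.log1p(slope_deg) * fuel_load

baseline = np.array([60.0, 12.0, 1.5])
y0 = model(baseline)

for i, name in enumerate(["rainfall_mm", "slope_deg", "fuel_load"]):
    bumped = baseline.copy()
    bumped[i] *= 1.05
    print(f"{name:12s} +5% -> output {100 * (model(bumped) - y0) / y0:+.1f}%")
```

Outputs that swing far more than their inputs are exactly the behavior decision-makers learn to distrust.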

Conclusion

Better and expanded application of modeling as a GEOINT product has the potential to enhance and focus the work of traditional analysis. To expedite adoption and improve relevance, the GEOINT Community should begin to refocus modeling advances on customer needs. The low adoption of modeling as a GEOINT product in some of the examples cited in this article can be addressed by attending to the core attributes listed above. Modeling should produce immediate, consumable results. Simultaneously, models need modernization to take advantage of new practices and approaches as well as new and improved data sources. Modeling needs to decrease its reliance on hard-to-find or outdated information, such as inconsistent, manually collected land cover data, and transition to methods such as automatic data extraction from satellite remote sensing systems.

Policy-makers must be able to depend upon a reliable, integrated, and continuously improving GEOINT framework to address these increasing challenges. Our future GEOINT framework should expand to include modeling tools as well as more data, processes, and visualization as we strive to support today's decision-makers.

Headline Image: The Automated Geospatial Watershed Assessment (AGWA) is an automated, GIS-based watershed modeling tool that uses nationally available GIS data layers to parameterize, execute, and visualize results from the RHEM, KINEROS2, KINEROS-OPUS, SWAT2000, and SWAT2005 watershed runoff and erosion models.

What are Your 3 Words?
What3words assigns three-word identifiers to every location on Earth

The global address system is imperfect. Road names are often repeated or similar within municipalities, leading to botched deliveries, confusing navigation, and wasted time. Street addresses only cover developed areas with established infrastructure. Geographic coordinates are precise but too complicated for everyday use.

To fix these problems, London-based what3words is simplifying global addresses. The company has divided the entire surface of the world into a geocoding grid of 57 trillion 3-meter-by-3-meter squares, assigning each a unique three-word identifier. This allows more accurate location sharing and product delivery and provides addresses for the billions of people living in developing neighborhoods without defined street names.
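Stripped of what3words' proprietary wordlist and cell-numbering algorithm, the core idea is a reversible mapping between grid cells and word triples. A deliberately toy sketch (tiny made-up wordlist and a naive equirectangular grid; nothing here matches the real scheme):

```python
# Toy three-word geocoder: snap lat/lon to a ~3 m grid cell, then write the
# cell index in base-len(WORDS) as a word triple. With 8 words there are only
# 8**3 = 512 triples; a real list needs roughly 40,000 words so that
# 40,000**3 exceeds the 57 trillion cells. Illustrative only.
WORDS = ["apple", "brick", "cloud", "delta", "ember", "flint", "grove", "harbor"]

CELL_DEG = 3.0 / 111_320          # ~3 m expressed in degrees of latitude
COLS = int(360 / CELL_DEG)        # cells per row of the global grid

def encode(lat, lon):
    row = int((lat + 90) / CELL_DEG)
    col = int((lon + 180) / CELL_DEG)
    idx = row * COLS + col
    words = []
    for _ in range(3):
        idx, r = divmod(idx, len(WORDS))
        words.append(WORDS[r])
    return ".".join(words)

print(encode(38.8895, -77.0353))  # three toy words for a spot in Washington, D.C.
```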

To encourage the use of their system around the world, what3words has translated the map grid into 14 languages such as French, Arabic, and Swahili, with more to come including Hindi and Zulu.

The system’s benefits are numerous. To date, the national post services of Nigeria, Djibouti, Côte d’Ivoire, and Mongolia have adopted the what3words system and begun delivering goods and mail to many residential locations for the first time. South African cities like Durban are using it to properly direct emergency responders. The United Nations is using it to geotag imagery as a common operating picture for disaster recovery efforts in remote locations. The system could even break into personal navigation: Mercedes announced it will incorporate what3words addresses into the voice-activated navigation systems of its next-generation vehicles.

For areas without thorough building numbering or street addresses, embracing what3words could improve city planning, enable efficient business, and help people define their homes.

Photo Credit: what3words

GEOINT for Policing
Software, sensors, and other location-based technologies offer opportunities and challenges for law enforcement

The traditional who-what-when-where crime report is starting to acquire many more details—from the proximity of the nearest ATM or street light to the occupational, educational, or religious significance of the date.

These are the kinds of data points and insights any cop on the scene would notice, but which could then easily get lost in the system.

By combining increasingly detailed databases with powerful software that can detect patterns almost as fast as reports are filed, police departments and other first responders can deploy their resources more efficiently, be more accountable to citizens, and perhaps even develop a sense of where crime is likely to occur next.

But there’s also a risk of confusion and unnecessary expense as busy police departments try to assess pitches from geospatial intelligence (GEOINT) firms.

“We’re almost getting flooded by them,” said Police Lt. Joseph Flynn, assistant commander of the Fairfax County Police Department’s Criminal Intelligence Division and deputy director of the Northern Virginia Regional Intelligence Center. “It’s still so new, and what do we want?”

Prescient Analytics

Applying GEOINT to policing begins with the basics of incident reports and 911 calls, explained Robert Cheetham, CEO of the Philadelphia firm Azavea, whose HunchLab product represents some of the leading work in next-generation policing software.

HunchLab models incorporate “a whole range of other things,” Cheetham said. He listed nearby amenities and businesses—transit stops, ATMs, liquor stores, and even lighting—as well as temporal factors such as the time of day, the day of the week, whether school was in session, and whether it was a holiday.

In each municipality, HunchLab builds a model that incorporates these inputs and calculates the potential harm of types of crime using the RAND Corporation’s “Cost of Crime” calculations. The results—at an annual subscription cost of $20,000 to $80,000 depending on municipality size, with custom pricing for the largest cities—not only illuminate crime trends but offer a hint of where they’re likely to head.
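The harm-weighting step can be pictured as an expected-value calculation per map cell: forecast incident counts per crime type, multiplied by a per-incident harm weight, summed, and ranked. In this sketch both the dollar weights and the forecasts are placeholders, not RAND's published figures or HunchLab's model:

```python
# Rank patrol cells by expected harm. Weights and forecasts are placeholders.
harm_per_incident = {"robbery": 67_000, "burglary": 13_000, "assault": 95_000}

forecast = {  # expected incident counts per grid cell for the coming shift
    "cell_A": {"robbery": 0.20, "burglary": 0.90, "assault": 0.05},
    "cell_B": {"robbery": 0.05, "burglary": 0.10, "assault": 0.30},
}

expected_harm = {
    cell: sum(n * harm_per_incident[crime] for crime, n in counts.items())
    for cell, counts in forecast.items()
}
for cell, harm in sorted(expected_harm.items(), key=lambda kv: -kv[1]):
    print(f"{cell}: expected harm ${harm:,.0f}")
```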

“What we’re doing is not prediction,” Cheetham said. “It’s more of a forecast of a difference in risk.”

The Chicago Police Department (CPD) ranks as HunchLab’s highest-profile client on account of the high rate of shootings across the city. CPD began deploying HunchLab’s system in January 2017; by mid-year, the department had brought it to the six of its 25 districts that account for 25 percent of the city’s shootings.

“We’ve seen what I’ll say are promising results,” said Jonathan Lewin, chief of CPD’s Bureau of Technical Services. In the first two districts to get this upgrade, shootings have so far dropped by 33 percent, well above the 14 percent drop citywide.

Lewin added the department is using the data it collects not just to dispatch officers faster but to speed actions by other parts of city government.

“One of the things we looked at was 311 calls for streetlights out,” he said. “Does that tend to correlate with nighttime shootings?”

As a result, Lewin said, the city is now prioritizing its deployment of connected LED streetlights “in some of the areas where we think it might have the greatest impact on reducing crime.” 

However, if law enforcement agencies don’t clean up their data before implementing forecasting technologies, they risk being led astray.

“Not having the proper protocols and data governance policies to prevent incomplete and inaccurate data entry leads to the issue of ‘junk in, junk out,’” Jody Weis, public safety lead at Accenture, warned via e-mail. “The finest analytic system, with the absolute best algorithms, will be useless if the data it is analyzing isn’t accurate.”

Jeff Gallagher, a GIS specialist with the Fairfax County Police Department, advised cultivating relationships with local government information technology and GIS professionals.

“Get out of the little pigeonhole and see the amount of data your county has,” Gallagher said.

Unblinking Eyes

In addition to information derived from officers, citizens, and databases, many police departments also have unblinking eyes on their communities in the form of automated sensors that collect real-time data for quick analysis.

“If it’s collecting a location, we can bring it in,” said John Beck, Esri’s industry manager for police. Esri’s GIS software can incorporate data from license-plate reading sensors, ShotSpotter gunfire-detecting microphones, officers’ body-worn cameras, and GPS anklets worn by offenders.

Data from sensors such as the ShotSpotter gunshot detection and location service can be integrated with other data into GIS systems for analysis by police departments. (Image credit: ShotSpotter)

Such data integration can add to a department’s budget and can encounter resistance from citizens. For instance, Lewin said CPD cameras got a better reception in communities after the department switched to a less obvious model that didn’t have continuously flashing blue lights.

But they do work.

“People are now actually catching criminals in the act based on the predictive analysis of all this historic and real-time data,” Beck said.

However, Beck continued, with the deluge of new information also comes the risk of overloading officers with data that should first pass an analyst’s eyes.

“We’re seeing a lot more real-time crime centers in the U.S. and beyond,” Beck said, complimenting CPD for setting up these centers in individual districts. That, however, should not come at the cost of taking officers off the street.

Lewin said CPD hired eight civilian analysts to embed in these centers. It also had representatives from HunchLab and security systems firm Genetec go on ride-alongs with officers to learn how to refine their user interfaces.

An existing set of analog sensors—as in, the eyes and ears of citizens—remains essential.

“Don’t become so over-reliant on [technology] that you become disconnected from the community,” said Sean Whitcomb, a sergeant and spokesman with the Seattle Police Department (SPD). He pointed to SPD’s regular incorporation of citizen input into its SeaStat crime-statistics program. “The value is increased exponentially because we supplement our own data with real-time feedback from the community.”

A Balancing Act

Collecting new data and building predictive models can also help police agencies increase their accountability to citizens.

“When I was a cop, we didn’t share any information with the public,” Beck said. “Now, police are sharing information about all of their activity, including use of force and police-involved shootings, and making that data open to the public.”

He pointed to the Philadelphia Police Department, whose website documents officer-involved shootings and allows visitors to compare the locations of those incidents with the locations of gun crimes across the city.

Public desire for accountability is another factor driving law enforcement agencies to deploy GEOINT.

In Chicago, the city’s Independent Police Review Authority now maintains a searchable use-of-force database, including audio and video from officers’ body cameras. And in Seattle, a 2011 Department of Justice investigation that found fault with SPD’s collection of data led the department to partner with Accenture to build a data analytics platform.

But data collection in policing can also generate public dissatisfaction with police departments. In 2016, citizens were angered to learn SPD had purchased Geofeedia’s social media analysis software two years earlier.

Weis and Beck each pointed to social media monitoring as the next frontier in the use of GEOINT by police. But after SPD’s attempts to glean intelligence from status updates went awry, the resulting blowback led Facebook and Twitter to yank Geofeedia’s access to their networks.

“There’s a very fine line between government surveillance and spying,” SPD’s Whitcomb said, adding the department now focuses on the social postings of individual suspects. “Something causes more harm than good if it erodes public trust and confidence.”

Said CPD’s Lewin, “Community partnership requires that we engage our stakeholders, and part of that is being as transparent as possible.”

Jay Stanley, senior policy analyst for the American Civil Liberties Union, emphasized police departments and the GEOINT industry should maintain transparency to help “reduce bias and improve trust with communities.”

Cheetham echoed Stanley’s point.

“I want to be on the right side of history on this,” he said.

More Research Needed

Cheetham and Stanley separately noted the need for more published research on the effectiveness of GEOINT and predictive policing.

For example, while the Police Executive Research Forum has spent years investigating law enforcement best practices, it has yet to study this technology, Director of Communications Craig Fischer wrote via e-mail.

A former police officer and current academic concurred via e-mail. “The independent empirical research is limited and equivocal,” wrote Dr. Kim Rossmo, director of the Center for Geospatial Intelligence and Investigation at Texas State University.

Lewin said CPD is now working with the University of Chicago’s Crime Lab to research how its initial deployment of predictive policing technology has fared.

But, he added, the real-world consequences of police work make it difficult to run a classic experiment in which a control group is left out of a technological advance: “If you have something that could be effective, you want to use it.”

Situational Analysis
Satellite imagery, drones, advanced analysis, and other emerging technologies are quickly changing the face of firefighting

The use of geospatial intelligence (GEOINT) tools such as remote sensing and data visualization is on the rise in the firefighting community, and the future of the profession will be greatly influenced by ongoing technological advances.

Kate Dargan, former California State Fire Marshal, co-founder of Intterra, and a USGIF board member, reflected on her early career as an air attack officer fighting wildfires in her home state.

“I was the ‘eye in the sky’ translating what I was looking at from several thousand feet to the firefighters on the ground,” she said, recalling later trying to capture video from the air using a handheld camera.

Today, commercial satellite imagery as well as LiDAR, hyperspectral, and infrared imagery collected from manned and unmanned planes could all be part of a firefighter’s toolkit. When paired with powerful data analysis platforms and mobile apps, GEOINT offers first responders greater situational awareness and a better understanding of the communities they serve.

A Rapid Evolution

“Many firefighters may only see the world through the windshield of the fire truck,” Dargan said, but noted available technologies and firefighter expectations are rapidly evolving.

For example, she said, fire chiefs may understand the basics of infrared technology but not yet be conversant in the various types of infrared and their corresponding capabilities. Regardless, Dargan said she is seeing the increased presence of unmanned aerial systems (UAS) at industry trade shows and is aware of more and more departments purchasing small drones.

For the last 100 years, firefighters have used paper and pencil to create diagrams of buildings and map areas of wildfire risk. Modern fire departments employ geospatial technology to develop a standard of cover, more efficiently deploy resources, perform risk assessment, and pinpoint potential problem areas, according to Talbot Brooks, a firefighter and director of the Center for Interdisciplinary Geospatial Information at Delta State University in Mississippi. Investment in geospatial tools supports risk reduction by enabling departments to plan, in advance of an emergency, what equipment to use and where to position it. But the ability to improve response and mitigate risk relies on the ability to properly integrate and manipulate geospatial data.

Dargan said the future of firefighting technology includes the networking of disparate imagery derived from different sensors and organizations. That is what her team strives for with its subscription-based Situation Analyst platform, which pulls all of that imagery together in one place and serves it up to each person in an organization, tailored to his or her needs.

David Holmerud, a fire service management consultant and former deputy fire chief in Solana Beach, Calif., emphasized the importance of asking the correct questions of the data at the right times: “Is there something more we can do to change the outcome of the responses? Of these structural fires, how many were contained to the original building? What difference did what we do make?”

Knowing how to draw the right conclusions from the data is the key to advancing the capabilities of the modern-day firefighter.

Startup Descartes Labs, founded by a group of scientists from Los Alamos National Laboratory, is pairing satellite imagery with machine learning to help draw better conclusions for firefighting. In a company blog post titled “Fighting Wildfires Using a Cloud-based Supercomputer,” research scientist Daniela Moody writes: “The Descartes Labs Platform provides us with a view of the planet that no one has ever seen before—not only is it multi-sensor, multi-resolution, and multispectral—it’s also a multi-decadal historical lens.”

This information helps ascertain damage from fires over time and can be used to make better decisions about how to fight fires in the future. The platform enables users to extract information not visible to the naked eye, pull in far more data than could be handled manually, and leverage machine learning processes whose algorithms draw on numerous data points.

“During the course of a fire, especially one with limited allocated resources, satellite imagery analysis could better direct ground crews to hotspot and containment areas,” Moody wrote.

Building Partnerships

Communication among the public safety community is also important when adopting new technologies. Holmerud recommends initiating and maintaining an open dialogue with city planners who may have already gathered and even visualized valuable data fire services could potentially tap into.

“For example, when a new subdivision is planned, many different data elements are available as a result of the approval process,” Holmerud said. “These data sets, ranging from street layouts to location of underground utilities, can be used to provide the basis of fire department pre-plans and updates to response maps and dispatching procedures. It can be a time saver as well as provide accurate data.”

Dargan encourages fire chiefs to participate in wide-ranging discussions that include police departments, community health workers, public works, transportation officials, and other civic departments. These conversations will introduce fire service leaders to technologies not designed specifically for firefighting, but that could be adapted for their missions.

The Bolivar County Volunteer Fire Department concludes a live fire training at a donated structure in Benoit, Miss., in February 2009. (Photo credit: Delta State University)

Holmerud, who is also an instructor at the National Fire Academy in Emmitsburg, Md., touts the value of collaborating with local colleges and universities on projects that could be of benefit to both parties. For example, the city of Wilson, N.C., has done significant work in mapping layers of data such as water flow, utility shut offs, and the number of people potentially living in a given structure. The city of Wilson makes these maps available to Holmerud’s students, who manipulate the layers behind the scenes by changing various conditions and factors. This activity enables students to go back to their communities with a better understanding of where information comes from and who they need to work with to ensure adequate resiliency and response planning.

Public-private partnerships could also pave the way toward better technological support for fire services. In the Phoenix, Ariz., area, 27 fire departments broke through jurisdictional boundaries to integrate their response to 911 calls. With a GPS unit now in every fire truck, the team in the best position to respond is dispatched to an incident, regardless of geographic boundaries. This new approach has resulted in shorter response times throughout the area.
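At its simplest, closest-unit dispatch is a minimum-distance query over a live automatic vehicle location (AVL) feed. A minimal sketch, using great-circle (haversine) distance as a stand-in for real road-network travel time:

```python
# Pick the closest available unit to an incident by great-circle distance.
# Real CAD systems route over the road network; haversine is a stand-in.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

units = {  # hypothetical AVL feed: unit id -> (lat, lon, available?)
    "E201": (33.4484, -112.0740, True),
    "E305": (33.5091, -112.0550, True),
    "L112": (33.4255, -112.1120, False),   # already on a call
}
incident = (33.4942, -112.0689)

best = min(
    (u for u, (_, _, ok) in units.items() if ok),
    key=lambda u: haversine_km(*units[u][:2], *incident),
)
print("dispatch:", best)
```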

Eric Prosser, information technology officer for the Santa Clara County Fire Department in California, points to the multi-agency coordination that was necessary for Santa Clara to host Super Bowl 50 in 2016. According to the NFL, 1.1 million people attended the game and related events.

Prosser’s iMAP Team won a USGIF Award in 2016 for providing the Santa Clara County Multi-Agency Coordination Center with a GEOINT-based decision and situational awareness platform. The iMAP team developed an enterprise GEOINT system used to manage all fire and medical service operations throughout Super Bowl 50. In collaboration with Dargan’s Intterra, the developers generated the ability to integrate 911 computer-aided dispatch information, map special events throughout the region, monitor resource availability, view GIS layers to include near real-time satellite imagery, and analyze data trends.

“The results of iMap enabled us to be better prepared for future special events and large-scale incidents, and to have situational awareness at both the department and operational area levels,” Prosser said. “This additional data provides us with useful information on a daily basis within the Silicon Valley.”

The Geospatial ROI

Holmerud said although many fire departments are slow to officially adopt GEOINT, he is beginning to see volunteer departments systematically use smartphone apps to gain a sense of who’s responding as well as their locations and estimated arrival times. He believes these kinds of tools will make departments hungry for more geospatial information.

“We’re starting to see the value of [geospatial] intelligence coupled with response software—starting to see what they can do and look at the possibilities,” Holmerud said.

The realization that geospatial technology can be a force multiplier when it comes to getting the most out of existing resources will also help drive adoption, according to Brooks.

This map by Descartes Labs shows the burn severity index for the 2016 Soberanes fire on California’s Monterey peninsula. (Photo credit: Descartes Labs)

“If I want budget to go after something, now I can show it,” Brooks said of his ability to use data to test and prove a theory. “It’s not just a supposition. [Geospatial tools are] a good way of separating fact from fiction.”

Developing a standard of cover using GEOINT provides a data-driven solution for understanding where departmental strengths and weaknesses are located geographically.

“If additional staffing, stations, or [equipment] are needed, a fire chief has the [geospatial] evidence needed to justify a budget request,” Brooks said. “Supposition and anecdote are removed from the process and political leadership can have more confidence in decisions that often cost (or save) millions of dollars.”

According to Dargan, there are three main areas in which fire departments can invest: equipment, people, and information.

“One of the key messages we’re trying to communicate is that information is a resource and a hard commodity that should be planned for and used just like equipment and [people],” she said. “The return on investment for data is or will pan out to be higher than it is for the other two types of resources.”

For example, the amount of data a fire department can acquire and put to use through remote sensing is not available through any other method except boots on the ground evaluating each building and area of risk.

“We’ll never have enough staff to send feet up every driveway in California to talk to every home or business owner,” Dargan said.

Those data-enabled decisions could lead to less costly emergency response with less loss of life and property, she added.

Imagine a firefighter being able to do a voice search while combating a wildland or structural fire, Dargan said. They could say, for example: “Show me houses with wooden roofs and give me their addresses.”

This type of timely access to geospatial data will enable firefighters to more effectively respond to emergencies and will significantly improve their ability to predict events and therefore protect more property and save more lives.

Featured image: Fire glows on a hillside in Napa, Calif., October 9, as multiple wind-driven fires whipped through the region. (Credit: Josh Edelson / AFP / Getty Images)

Enabling Rapid Response
Geospatial intelligence proves a powerful tool for paramedics

In the Emergency Operations Center, a dispatcher takes a bystander’s cellphone call about a car crash on a poorly marked rural road. The report prompts the dispatcher to send regional air medics as well as the nearest local ground EMS crew. Next door, EMS managers analyze response statistics for a rapidly growing residential area.

Across town, an EMS crew teaches citizen CPR in a neighborhood with a high cardiac arrest rate. After training, a smartphone app will be integrated with EMS dispatch, so bystander CPR can be started in public spaces before EMS arrives. All of these activities, some long established and others cutting-edge, rely on geospatial intelligence (GEOINT) data and technology to save lives, yield better patient outcomes, and improve agency efficiency.

Early EMS operations used “static deployment,” with a set number of vehicles assigned to permanent stations. In the 1980s, increased call volumes without equal investment in EMS systems led to system status management, which was intended to optimize coverage based on temporal patterns of use.

The advent of computer-aided dispatch and automatic vehicle locator technology allowed dispatchers to determine the closest available ambulance for a call, but it took near real-time analysis and predictive analytics to make the deployment and use of resources truly effective. As economic stresses mandate that services accomplish more with fewer resources, dynamic deployment has become a mainstay in providing efficient and cost-effective coverage.

Dynamic Deployment

“In dynamic deployment, ambulances are directed toward the highest uncovered demand at that moment in time. Some call it ‘chasing the blob,’” said Dale Loberger, an active EMS member and a developer at Bradshaw Consulting Services, which developed the Mobile Area Routing & Vehicle Location Information System (MARVLIS). “Demand is constantly being re-evaluated in near real-time and resources are being matched to that demand as their level of availability changes.”

The MARVLIS system models the probability of future call locations based on historic data, near-real-time inputs such as dispatch and response times, and factors such as traffic conditions. The automated forecast is modeled through Esri’s ArcGIS platform and displayed as a mapping interface. Combined, MARVLIS GPS data, GIS modeling, and wireless communications allow EMS to “have the right units at the right places at the right times,” Loberger said.
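A rough way to picture this kind of demand forecasting (a sketch under simplifying assumptions, not MARVLIS's actual algorithm): bin historical calls for the current hour-of-week into a grid, smooth the counts into an intensity surface, and post units toward the peaks.

```python
# Toy demand surface: histogram past call locations into a grid, smooth,
# then rank cells for ambulance posting. Illustrative only.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(7)
calls = rng.normal(loc=(12.0, 8.0), scale=2.5, size=(500, 2))  # past calls (km)

grid, _, _ = np.histogram2d(calls[:, 0], calls[:, 1],
                            bins=32, range=[[0, 25], [0, 25]])
intensity = gaussian_filter(grid, sigma=1.5)        # smooth raw counts

for rank, idx in enumerate(np.argsort(intensity, axis=None)[::-1][:3], 1):
    i, j = np.unravel_index(idx, intensity.shape)
    x, y = (i + 0.5) * 25 / 32, (j + 0.5) * 25 / 32
    print(f"post {rank}: cell center ~({x:.1f} km, {y:.1f} km)")
```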

The lower response times and decreased distances enabled by systems such as MARVLIS and Optima Predict from Intermedix help save lives among the subset of patients who must be reached in four minutes or less to survive. Jersey City Medical Center EMS doubled its rate of return of spontaneous circulation in cardiac arrest victims after integrating MARVLIS into its operations in 2012.

A University of Pittsburgh team modeled fatal vehicle crash rates in Pennsylvania from 2013 to 2014 and distances from trauma resources, using Fatality Analysis Reporting System data. The team discovered a theoretical 12.3 percent decrease in mortality if two medevac units were reassigned to higher-incidence areas.

“There was a big disparity for these patients, depending on where they live,” said Joshua Brown, a general surgical resident at the university medical center and lead investigator on the study. “It’s only recently that trauma systems analysts have begun to incorporate GIS tools into their work to achieve improved outcomes. That we could potentially reduce mortality by relocating only two helicopter units was a very powerful finding.”

Community Engagement

Focusing resources strategically to improve patient outcomes involves more than ambulance placement. According to the American Heart Association, more than 350,000 out-of-hospital cardiac arrests occur in the United States each year. Only 5.5 percent of these victims survive to hospital discharge. Improving survival rates from sudden cardiac arrest is a holy grail of the EMS profession, and providers are combining geolocation data, GIS modeling, and smartphone apps in this quest.

In Mississippi, American Medical Response (AMR) analyzed new data for geospatial patterns, looking for hotspots associated with neighborhood type, rural versus urban patterns, and similar factors. In the Jackson metropolitan area, the company discovered an association between citizen CPR/Automated External Defibrillator (AED) training and bystander CPR rates in certain neighborhoods. Since bystander CPR and AED use can double or triple the chances of surviving cardiac arrest, AMR increased outreach training in areas with high arrest rates and low training rates. Improved bystander CPR and increased survival rates followed.

PulsePoint AED is a crowdsourcing app that allows users to report the location of AEDs in their community. (Image credit: PulsePoint)

“So much can happen during the critical minutes of an emergency,” explained Michael Arinder, M.D., director of clinical services for the south region with American Medical Response. “We recognized that we had the ability to see what happens in the moments before the arrival of trained personnel and we decided to use that to better serve the community. We knew that if it saved only one additional life, it was worth it.”

This focus on bystander CPR/AED inspired PulsePoint to create a smartphone app suite to bring citizen rescuers to the cardiac arrest victim. The PulsePoint Respond app sounds an alert when a cardiac arrest occurs in a public place. Users in the agency-defined notification area will see the victim’s location on a map. PulsePoint Respond incorporates data from PulsePoint AED, a crowdsourcing app that allows users to report the location of AEDs in their community. The AED location data is made available in PulsePoint Respond after being verified by local authorities. 

“PulsePoint is the marriage between technology and citizen engagement,” said PulsePoint spokesperson Shannon Smith.

To date, PulsePoint Respond has been activated more than 20,000 times and has more than 59,000 users.

911 for the Next Generation

Crowdsourced traffic information is another valuable geospatial tool that can benefit the EMS community. Genesis PULSE, a vehicular tracking system used for dynamic deployment, exchanges data on road closures and traffic conditions with navigation app Waze.

Data after the first year of information exchange revealed that in 62 percent of cases Waze obtained accident notification up to 4.5 minutes faster than 911 centers. Although the implications are unsettling, Waze data provides PULSE users an advantage in rapid deployment—if, as in all GEOINT use cases, the data is accurate.

All geospatial data requires accuracy to be useful, but in public safety, accuracy can make the difference between life and death. Leaders in the field consider this a primary public safety challenge.

“Geographic Information Systems, when coupled with first-responder missions, private industry, and public policy can improve operational understanding and help PSAPs (public safety answering points) create and maintain reliable, dispatchable address databases,” said Mike King, emergency call-taking and dispatch industry manager for Esri as well as a member of the National Emergency Number Association. “All three disciplines are necessary for true success.”

The Next Generation 911 (NG911) initiative, spearheaded by the U.S. Department of Transportation, seeks to design an emergency communications architecture that will transcend current limitations. Wireless mobile devices, Voice over Internet Protocol telephony, and other modern technologies have rendered the 911 call center system outmoded.

According to King, core GIS capabilities, wireless and broadband use, and 3D routing technology, particularly for indoors, will be incorporated into NG911, but the parameters and solutions are evolving with the initiative.

Startup RapidSOS hopes to end geolocation fuzziness with a database that seamlessly integrates with 911 call centers. A cellphone call to 911 will ping the RapidSOS database, and geolocation information will be supplied to the 911 center. In trials, RapidSOS provided more accurate geolocation information than the wireless carriers tested.

EMS relies increasingly on GEOINT to provide effective healthcare.

In the coming years, the technology will continue to evolve with the proliferation of predictive artificial intelligence and machine learning algorithms, according to Nikiah Nudell, chief data officer for The Paramedic Foundation and a board member of the National EMS Management Association.

“Geospatial intelligence has become a powerful worldwide tool for paramedic chiefs and the public health and safety officials they often work with,” Nudell said. “In an environment where limited resources are being used to respond to dynamic critical incidents, having full situational awareness from an historic and real-time perspective is powerful.”

Featured image: The MARVLIS system models the probability of future emergency call locations based on historic data, near-real-time inputs such as dispatch and response times, and factors such as traffic conditions. (Credit: Esri)

Providing Community ROI with Geospatial Tools
The logistical demands of providing emergency services to large crowds

The month of June brings the Wichita Riverfest to Sedgwick County, Kan. For more than a week, concerts, art shows, athletic events, and more draw crowds of up to several hundred thousand to enjoy themselves and support the community along the Arkansas River.

Handling the logistical demands of providing emergency services to large crowds, concentrated within a several-block radius, is the responsibility of Scott Hadley, director of Sedgwick County EMS. His agency handles all services for the 1,008-square-mile area.

“Riverfest requires extra coordination, along with the approximately 170 calls per day that are our normal operations,” Hadley explained, adding that the tools his agency invests in allow daily operations and special events to run more smoothly.

For daily operations, Sedgwick County EMS uses a proprietary computer-aided dispatch system along with the MARVLIS system to staff 15 posts throughout the county. The agency tracks and analyzes operational performance, call volume and type, cardiac arrest and survival rates, and financial performance metrics. GEOINT analysis is integrated into these metrics.

Sedgwick County also employs FirstWatch during Riverfest. FirstWatch provides real-time surveillance and analysis to warn agencies of trends and patterns in a selected area. It does this using “triggers,” a set of user-defined filter criteria tailored to the specific event. Various data sources can be integrated with FirstWatch, making it very useful for events such as the Super Bowl, large conferences, festivals, and more.

Using FirstWatch at Riverfest, Sedgwick County EMS sets a geo-fenced area within which the incident command is deployed. Bike, ATV, and other responder teams staff the event, and patients who need to be taken to the hospital are transferred to an assigned point at the periphery of the geo-fenced area.
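A trigger of this kind reduces to a predicate over incoming incident records: is the record inside the event geofence, and does it match the watched criteria? A minimal sketch (field names are hypothetical, not FirstWatch's schema), using a ray-casting point-in-polygon test:

```python
# Minimal geofenced trigger: flag watched incident types inside an event
# polygon. Field names are made up for illustration.
def point_in_polygon(x, y, poly):
    """Ray-casting test; poly is a list of (x, y) vertices."""
    inside = False
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

festival = [(0, 0), (6, 0), (6, 4), (0, 4)]          # geofence vertices (km)
WATCHED = {"heat illness", "cardiac arrest"}

incidents = [
    {"type": "heat illness", "x": 2.5, "y": 1.0},
    {"type": "fall", "x": 3.0, "y": 3.5},
    {"type": "cardiac arrest", "x": 9.0, "y": 1.0},  # outside the fence
]
hits = [i for i in incidents
        if i["type"] in WATCHED and point_in_polygon(i["x"], i["y"], festival)]
print(f"trigger fired on {len(hits)} watched incident(s) inside the geofence")
```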

Geospatial tools are critical to efficient EMS operations, even more so when everyday operations are complicated by a special event or disaster.

Hadley views acquiring and using these tools “not as a cost, but as an investment.” The return on investment for geospatial technology, he said, provides Sedgwick County’s residents with cost-effective, patient-centered emergency care.

Featured image: Sedgwick County Riverfest, 2016. (Photo credit: Sedgwick County)

Roadmap for Nationwide Geospatial Data Sharing
GeoCONOPS is a guide to support homeland security, public safety, and emergency management

Luke Meyers, a planning coordinator with Seattle’s Office of Emergency Management, described himself as “a pig in mud” when he first learned about the Geospatial Concept of Operations (GeoCONOPS) at a conference in January.

He has since taken three of the four available online GeoCONOPS courses.

GeoCONOPS, overseen by the Department of Homeland Security’s (DHS) Geospatial Management Office (GMO), is a strategic roadmap for national, state, local, private sector, and academic stakeholders to coordinate geospatial information, share data and tradecraft, and communicate in support of homeland security, public safety, and emergency management.

The roadmap is a guide for linking the geospatial data efforts of the 17 U.S. intelligence agencies, 22 DHS components, and the 50 states, 3,114 counties, and 78 data fusion centers throughout the country, in addition to other data producers in major cities. GMO does not seek to own or hold the data, but rather to validate data and sources, then direct users to them.

David Carabin, Bryan Costigan, Aaron Kustermann, and Jay Moseley, who lead data fusion centers in Massachusetts, Montana, Illinois, and Alabama, respectively, hope GeoCONOPS will soon mature to support an idea they call “SitRoom.”

SitRoom, according to Kustermann, would enable analysts at any of the nation’s 78 data fusion centers to learn, for example, that an individual stopped for a broken taillight in California is driving a car stolen from Minnesota, wanted for drug trafficking in Chicago, and suspected to be part of a terrorist cell in New York.

“GeoCONOPS is how we’re going to be able to share geospatial information,” Kustermann said. “It sets the standards for our being able to share [data]. Without it, the puzzle can’t be built.”

A Maturing Concept

Although the first version of GeoCONOPS was published eight years ago, public safety leaders like Kustermann and Meyers may have only recently learned of it, or may not be aware of it at all.

“It really hasn’t been publicized a lot, at least on the state and local level,” Meyers said.

Other leaders expressed some uncertainty as to which interoperability efforts fall under the umbrella of GeoCONOPS, which perhaps has too broad a definition for the far-reaching complexities of its mission.

“I’m not sure GeoCONOPS should be looked at as a specific program or policies to try to get to interoperability,” said James McConnell, assistant commissioner of strategic data for the New York City Office of Emergency Management. “Sharing—we’re doing a lot of that—but I’m not sure it falls under the title GeoCONOPS.”


Yet when Hurricane Sandy struck New York and New Jersey in October 2012, the Federal Emergency Management Agency (FEMA) dispatched a GIS unit from Baltimore to assist in relief efforts. “They basically took a copy of our entire database, which we were happy to give them, as their base for working in New York,” McConnell said.

GeoCONOPS has its roots in 9/11, when first responders lacked the maps and data needed to navigate the labyrinth of the Pentagon. Four years later, first responders viewed the aftermath of Hurricane Katrina via commercial satellite imagery, but lacked the tools to communicate about what they were seeing.

“I think that’s really when people started to wake up to this concept of location as a critical element of their operations,” said Chris Vaughan, who was then deployed in support of FEMA’s Urban Search and Rescue Team providing on-the-ground geospatial support in New Orleans, and is now the agency’s geospatial information officer.

The Hurricane Katrina disaster and others before it prompted a three-day meeting in Washington, D.C., of first responders, government, industry, and academia that generated a 2007 National Academies report titled “Successful Response Starts with a Map: Improving Geospatial Support for Disaster Management.”

The report acknowledged growing geospatial capability, but warned, “The effectiveness of a technology is as much about the human system in which it is embedded as about the technology itself. Issues of training, coordination, planning and preparedness, and resources invested in technology need to be addressed if future responses are to be effective.”

This statement embodies the intent behind GeoCONOPS.

“There was a feeling that we didn’t know what we didn’t know, and we had gaps we couldn’t identify,” said Nathan Smith, a contract program manager for GeoCONOPS. “A lot of that was a perception that geospatial wasn’t reaching its potential, and that it was constrained by a lack of coordination within the geospatial community.”

Published for the first time June 30, 2009, GeoCONOPS underwent six updates by Jan. 18, 2015, and was met with varying degrees of success. While federal agencies worked toward data sharing, many potential state and local stakeholders looked askance at the 228-page document from Washington. Today, GeoCONOPS is hosted online via geoplatform.gov. A second, more secure site is planned to facilitate shared access for more sensitive data.

“The moment something is printed, it’s obsolete,” said David Lilley, acting director of the GMO. “So we moved to the web, a dynamic mode of delivery, and it puts the content media in an environment that’s of more use to our readers. We are more able to keep the content current and add searches so users can drive directly to what they are looking for in a matter of clicks, instead of searching through 100 pages.”

Realizing What Could Be

Lilley is working to foster a more complete understanding of GeoCONOPS. According to him, GeoCONOPS not only shows how geospatial data is currently supporting the mission at hand, but also what geospatial data is available to the community and how it could support other missions.

Realizing what “could be” is perhaps the most important message, especially for those with data that could help FEMA, or state and local governments who could benefit from sharing data with one another. Lilley’s outreach is bringing more data and registered systems into the GeoCONOPS community. In doing so, he seeks to foster a cultural change across all echelons.

“I think through GeoCONOPS, people are identifying the concept that ‘the more people are using my data, the better I can justify sustaining the program (that gleans the data),’” Lilley said. “That’s a fundamental shift, because it used to be that ‘my data is mine, my power is my information.’ They still control it, but letting more people into the data makes it more powerful.”

Tightening budgets are also leading more partners to GeoCONOPS.

“People are more apt to re-leverage an existing capability for their mission need through the CONOPS than always building their own,” Lilley said.

Monetary constraints, technological evolution, and more persistent threats are creating a public safety landscape ripe for more widespread adoption of GeoCONOPS.

“Technology became easier at about the same time data became more prevalent,” said Vaughan, adding that GeoCONOPS has been prominent in FEMA exercises such as Gotham Shield, which in April simulated a nuclear explosion in the New York/New Jersey area.

At many levels, public safety experts said GeoCONOPS should also be used as a roadmap for preparedness and resiliency in addition to natural disaster response.

“If effective, [GeoCONOPS] is really being used to support preparedness activities—planning, exercises,” said Rebecca Harned, director of National & Federal for the National Alliance for Public Safety GIS (NAPSG) Foundation. “It’s not something you want to try to access for the first time when the ‘big one’ hits.”

GeoQ Meets GitHub http://trajectorymagazine.com/geoq-meets-github/ Wed, 01 Nov 2017 13:02:06 +0000 The power of the crowd builds upon NGA’s open-source platform to better equip first responders with geospatial information

Accurate, up-to-date information is a first responder’s biggest asset. Data about infrastructure, passable roads, regional populations, and supplies is essential in a crisis, and can be more difficult to obtain in underdeveloped countries. Without immediate access to the right data, first responders scramble to assess damage and lose valuable time that would otherwise be spent helping people.

To assist with relief efforts in both domestic and international disasters, the National Geospatial-Intelligence Agency (NGA) developed an open-source web application that collects unclassified imagery from nontraditional sources. Called GeoQ, the tool is accessible on any internet browser and pulls together geo-tagged data from social media, maps, news, Earth imaging satellites, and more to provide response teams with a holistic picture of disaster areas in real time.

“The problem we realized was a lot of people didn’t have this GIS or remote sensing background. They wanted something easy and intuitive to use, and that’s where GeoQ comes into play.”

—John Mills, Penn State Applied Research Laboratory

Since its launch on code-sharing site GitHub in April 2014, GeoQ has been deployed for relief management efforts in more than 35 natural disasters, including tornadoes in Oklahoma, earthquakes in Nepal and Japan, typhoons in the Philippines, and the Ebola outbreak in West Africa.

Traditional damage evaluations can take up to 72 hours—during which relief agencies operate mostly “blind” on the ground. But GeoQ can provide a thorough damage assessment within 24 hours of an event, according to Ray Bauer, NGA’s innovation lead and GeoQ project manager.

In the first ever applied use of GeoQ—a 2013 tornado in Moore, Okla.—“We were able to have 90 percent of the damage assessment done before we could get imagery from traditional sources,” Bauer said, referring to the period just after a disaster when relief agencies rush to compile data before deploying response teams.

Local Power

As data pops up online—such as geo-tagged photos on Instagram or helicopter footage from live news broadcasts—GeoQ’s crowdsourced workflow allows users to quickly receive and filter information to annotate at-risk areas. Emergency volunteers working online from relief agencies around the world are assigned manageable cells of land in the affected region and pore over the data, placing markers for things such as roadblocks and flood perimeters. 
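
GeoQ's real implementation is the open-source code on GitHub; the sketch below only illustrates the divide-and-assign workflow described above, tiling a bounding box into manageable cells and rotating them across volunteers. The coordinates and analyst names are invented.

```python
# Illustration of the divide-and-assign workflow (not GeoQ's actual code):
# tile an area of interest into grid cells and rotate them across volunteers.
from itertools import cycle

def make_cells(min_lon, min_lat, max_lon, max_lat, step=0.01):
    """Tile a bounding box into roughly 1 km cells (0.01 degrees of latitude)."""
    cells = []
    lat = min_lat
    while lat < max_lat:
        lon = min_lon
        while lon < max_lon:
            cells.append((lon, lat, lon + step, lat + step))
            lon += step
        lat += step
    return cells

def assign(cells, analysts):
    """Round-robin each work cell to an analyst; returns a cell -> analyst map."""
    rotation = cycle(analysts)
    return {cell: next(rotation) for cell in cells}

cells = make_cells(-97.52, 35.30, -97.46, 35.36)          # roughly Moore, Okla.
workload = assign(cells, ["volunteer_a", "volunteer_b", "volunteer_c"])
```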

Responding agencies can pull up the crowdsourced analysis on their computers or mobile devices, and can share information directly with other agencies. That shared accessibility is one of GeoQ’s primary benefits.

“In working with [federal, state, and local partners], we realized the inefficiencies of everyone doing their work a little bit differently,” Bauer said. “If you looked at the houses after Hurricane Sandy, they got marked with three or four Xs. Different organizations would come through and put a red X on the door … to show that they’ve already accounted for this property.”

With GeoQ, NGA hopes to standardize responder workflows and reduce that kind of overlap and resource waste to establish a more collaborative model of disaster relief.

Because of their access to tools and bandwidth for damage analysis, the federal government typically leads major disaster response efforts at the request of state and local authorities. GeoQ’s open-source approach helps give similar bandwidth to local responders so time isn’t lost communicating up the chain of command. Another benefit is that geospatial intelligence (GEOINT) data held locally is often far more detailed and up-to-date than federal data.

“All disasters are local,” Bauer said, meaning that because disasters are primarily community-based in their impact, relief efforts should begin at the local level rather than under the current model, which puts most of the responsibility on federal agencies.

Bauer wants to flip the script with GeoQ to give more power to local entities such as fire departments and volunteer organizations, which are in a better position to provide immediate help but often lack sophisticated analytic technology.

“We’re giving them the fishing pole and teaching them how to fish,” Bauer said.

Members of Penn State’s Applied Research Lab (ARL) pose with National Geospatial-Intelligence Agency Director Robert Cardillo in the ARL booth at USGIF’s GEOINT 2017 Symposium. (Photo credit: PSU ARL)

NGA’s desire to share this local-first concept with the rest of the Intelligence Community and beyond is what led it to release GeoQ code on GitHub for free download and unrestricted use.

This means a user not affiliated with NGA could identify inefficiencies with the platform, alter GeoQ’s code, and upload the new, updated version on GitHub. If NGA approved the solution, it could be added to the source code. NGA hopes this low barrier to entry will encourage non-government organizations and private companies to participate.

“We’ve had several companies who have pulled the software down and have taken some of the ideas from GeoQ and started to implement it in their own software,” Bauer said. “That’s awesome. It’s about being open, transparent, and sharing ideas.”

Such a high level of transparency has led to significant leaps for GeoQ in the past three years.

Building Partnerships

GeoHuntsville, a nonprofit initiative in Alabama that unites organizations to improve disaster management, led an effort beginning in 2014 to integrate GeoQ with the operations of nearly every response agency within the municipality. This includes law enforcement, fire and rescue, medical, dispatch, civil air patrol, and more.

According to GeoHuntsville CTO Chris Johnson, “[GeoHuntsville] working groups were seeking a technology platform that would both visualize spatial data and capture tactical activities going on during an event.”

The organization wanted every Huntsville responder sent into a damage-prone area to be able to answer four questions: Who am I? Where am I? How am I? And how can I report my activity back to the rest of the responding community?

“We started using GeoQ to address the four questions, and also to help us break down workload, which it turns out GeoQ does very well,” Johnson said.

Now, GeoHuntsville utilizes its “Responders Working Group”—a collective of public safety specialists—to address prospective real-world challenges using GeoQ. GeoHuntsville’s technical unit, the “Geospatial Intelligence Working Group,” develops pilot programs and functional experiments based on those challenges to stress-test emerging tools and capabilities within GeoQ. NGA analysts as well as Federal Emergency Management Agency teams have participated directly in a number of these GeoHuntsville pilots.

“Through this working collaboration, we’ve been able to add a lot of features to GeoQ. And the wonderful thing about that is it doesn’t just benefit us in Huntsville,” Johnson said. “We are sharing [these capabilities] with everyone through GitHub.”

In August 2016, GeoHuntsville teamed with the National Oceanic and Atmospheric Administration and the National Weather Service to explore the use of unmanned aircraft systems as a platform to deliver live imagery to first responders on the ground. That intake of real-time surveillance paired with the ability to track the unmanned vehicle was new to GeoQ.

In the same exercise, GeoHuntsville developed a YouTube filter within GeoQ. Now, an operator can pull up an affected area on his or her screen and query YouTube for a specific keyword, timestamp, or location to pull real-time video data as soon as civilians post it online. Such data could be instrumental in determining where to direct resources and avoiding repeat coverage.
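
GeoHuntsville's filter implementation isn't published; the sketch below shows the kind of query such a filter might issue, assuming the public YouTube Data API v3 search endpoint, which accepts a keyword, a location with radius, and a publish-time floor. The API key is a placeholder.

```python
# Sketch of a keyword/location/time video query, assuming the YouTube Data
# API v3 search endpoint (not GeoHuntsville's actual implementation).
import requests

def search_videos(keyword, lat, lon, radius_km, published_after, api_key):
    """Return titles of recent videos matching a keyword near a point."""
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/search",
        params={
            "part": "snippet",
            "type": "video",                      # location filters require type=video
            "q": keyword,                         # e.g., "flooding"
            "location": f"{lat},{lon}",
            "locationRadius": f"{radius_km}km",
            "publishedAfter": published_after,    # RFC 3339, e.g., "2016-08-01T00:00:00Z"
            "key": api_key,                       # placeholder credential
        },
        timeout=10,
    )
    resp.raise_for_status()
    return [item["snippet"]["title"] for item in resp.json().get("items", [])]
```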

Pennsylvania State University has also contributed to GeoQ’s field testing and open-source development.

John Mills, a technologist with Penn State’s Applied Research Laboratory (PSU ARL), worked alongside Bauer on NGA’s “Map of the World,” and took the lead in enhancing GeoQ’s automation and data analytics when it first launched.

Students in Penn State’s Red Cell Analytics Lab work with high-tech equipment to simulate threats and analyze information. (Photo credit: PSU ARL)

“The problem we realized was a lot of people didn’t have this GIS or remote sensing background,” Mills said. “They wanted something that’s easy and intuitive to use, and that’s where GeoQ comes into play.”

PSU ARL joined forces with the PSU College of Information Sciences and Technology’s Red Cell Analytics Lab to focus on predictive analytics and implementation of open-source software into local, state, and federal GIS workflows. PSU students test GeoQ in the field, with student-run analytics teams evaluating and managing security threats at events such as Penn State football games at Beaver Stadium and THON, the world’s largest student-run philanthropic event.

According to Mills, the Red Cell teams have focused primarily on two initiatives: exploiting social media to access data, and supplementing GeoQ with other open-source projects such as NGA’s Mobile Awareness GEOINT Environment (MAGE) app. MAGE allows users to create geo-tagged data reports containing observable photo, video, or audio records, and to share those reports instantly with other team members.

“I call it the Red Cell Army,” Mills said. “They were able to go out and use MAGE to do event observable collects, and then in real time, GeoQ was in the emergency operations center in Beaver Stadium and you could see all these [MAGE] data sets popping up. That allowed emergency response folks to better do force deployment.”
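
MAGE's actual report schema isn't reproduced here; as an illustration of the geo-tagged "observable collects" Mills describes, the snippet below sketches what such a report could look like in GeoJSON form. Field names and values are invented.

```python
# Illustrative geo-tagged field observation in GeoJSON form.
# Field names are hypothetical, not MAGE's actual schema.
import json
from datetime import datetime, timezone

report = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-77.856, 40.812]},  # lon, lat
    "properties": {
        "observer": "redcell-04",
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "category": "crowd surge",
        "attachments": ["photo_0412.jpg"],        # photo/video/audio records
    },
}
print(json.dumps(report, indent=2))
```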

Additionally, Mills continued, PSU ARL supervisors and Red Cell Analytics Lab members meet with government stakeholders—including NGA—to observe workflows and brainstorm ways the process could be automated to improve GeoQ’s efficiency and efficacy.

Though the application’s development has been primarily focused on disaster relief, GeoQ’s collaborative model has broader possibilities. The tool is designed to be applied internationally and in other industries.

People on six continents have downloaded or shown interest in GeoQ on GitHub. For example, an insurance company reached out to NGA about using GeoQ for after-damage reports to show where agents made adjustments.

Archaeologists have shown interest as well, according to Bauer. GeoQ currently divides land into one-kilometer cells, but perhaps, he said, the program could be used to divide land into centimeter-scale cells to support the examination and analysis of historic excavation sites.

The Next Level

For the next generation of GeoQ, NGA is exploring gamification to incentivize more people in the GEOINT Community to use the program. For now, GeoQ still requires an entry-level background in damage analysis and data management to be used productively.

To encourage engagement, NGA released in late 2014 a gamification code within the program that rewards volunteer analysts with badges and points based on feature creation within GeoQ. For example, a contributor might gain five points for marking five damaged houses within their assigned cell—once they acquire 10 points, they’d earn a badge. Accumulating badges grants higher clearance to assist in additional, more demanding disaster relief efforts.
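
The article gives the mechanics, roughly one point per annotated feature and a badge per 10 points, so the sketch below simply encodes that rule. The class and method names are invented; NGA's released gamification code is the authoritative version.

```python
# Minimal sketch of the scoring rule described above: points per feature
# created, one badge per 10 points. Names are invented, not NGA's code.
class VolunteerScore:
    POINTS_PER_BADGE = 10

    def __init__(self, name: str):
        self.name = name
        self.points = 0
        self.badges = 0

    def record_features(self, count: int, points_each: int = 1) -> None:
        """Award points for annotated features and mint badges at each threshold."""
        self.points += count * points_each
        while self.points >= (self.badges + 1) * self.POINTS_PER_BADGE:
            self.badges += 1

analyst = VolunteerScore("volunteer_a")
analyst.record_features(5)              # e.g., five damaged houses marked
analyst.record_features(7)
print(analyst.points, analyst.badges)   # 12 1
```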

Badges and other user awards can be exported into a folder called the “Open Badges Backpack,” where contributors can show off their expertise.

Bauer joked about his children’s enthusiasm for virtual games. “We can see how powerful this gamification is—now imagine if we can start to use it for good,” he said. 

According to Bauer, tests of this gamification technique during real-world events have engaged analysts working side-by-side in friendly competition to earn more points and badges.

Bauer said perhaps by incorporating GeoQ into emergency response training programs for the public “[NGA] could start to develop a community in the future where we have civilians participate in first response.”

Through its open source code, GeoQ and similar applications provide first responders and volunteers with unprecedented speed and ease of use in data sharing. The advent of open-source tools will help keep first responders informed and unified in their assessments of danger and damage, enabling superior aid and ultimately saving more lives.

Featured image: GeoQ allows anyone with a web browser and an understanding of geospatial tools like Google Earth and Esri ArcGIS products to support a project. Contributors focus on information within the image as well as outside the frame, rapidly assessing the impacts and changes of disasters over large geographic areas and quickly producing detailed features from traditional and non-traditional data sources. (Credit: NGA)

Revolutionizing Mission Effectiveness http://trajectorymagazine.com/revolutionizing-mission-effectiveness/ Wed, 01 Nov 2017 13:01:50 +0000 Integrating the precepts of geospatial intelligence into the practice and lexicon of public safety professionals

On behalf of the entire trajectory team, I’m thrilled to share our first special edition. It’s our hope that this is an important step in further integrating the precepts of geospatial intelligence (GEOINT) into the practice and lexicon of public safety professionals. As I like to say, while GEOINT was created in the “laboratory” of the defense and intelligence communities in the wake of 9/11, it has in the intervening years escaped the confines of that lab and gone viral.

The national security sector loosely defines GEOINT as the combination of remote sensing from phones, drones, and space; geospatial/location information of all layers and types; and data management, analytics, and visualization for an actionable purpose. However, in the public safety community, GEOINT may be called something else entirely and be leveraged through the combination of visualization tools such as crime mapping, network analysis, CompStat, route analysis, crisis mapping, critical infrastructure assessment, and more.

USGIF identified the virulent nature of GEOINT in trajectory’s 2015 cover story, “The GEOINT Revolution,” and further explored it when we subsequently themed our 2016 annual Symposium with the same moniker. The thesis of the article is that multiple technologies are undergoing rapid change and, when viewed collectively, create a powerful synergy for revolutionary advances in the GEOINT field.

In the years since 9/11, the defense, intelligence, and, more recently, the homeland security communities have leveraged the power of GEOINT to enhance their respective and collective mission effectiveness. Over time, doctrine has been generated and training, education, and professional development opportunities have developed. 

As GEOINT is increasingly adopted in other sectors, I see a tremendous opportunity to share this body of knowledge and to leverage lessons learned. Our first responders ought not make the same mistakes or blindly face some of the same challenges the traditional GEOINT Community has already overcome. And it is my hope as you find new ways to deploy these approaches that we in turn can learn from your community.

We at USGIF have endeavored to engage with law enforcement, fire and rescue services, emergency medical services, and others to foster an ongoing dialogue regarding public safety mission applications for GEOINT. Our respective communities share meaningful core values exemplified by selfless service to others, fierce dedication to mission, and genuine camaraderie. 

We’ve all seen recent exemplars of the close cooperation among first responders and national security organizations during wildfires in the American west and southwest, and in the wake of hurricanes Harvey, Irma, and Maria. At every level, rescue and relief organizations rely upon GEOINT to accomplish their missions.

It’s my fervent hope that this edition of trajectory ends up in the hands of police officers, fire services professionals, EMS workers, emergency planners, and others who will recognize the opportunity at hand. As a result, I hope you will participate in the discussion that will fully unleash the power of the GEOINT Revolution in support of your vital role in serving, protecting, and responding to keep our nation safe.

USGIF is eager to extend its educational mandate to this important constituency and use our unique power as a convening authority to create and sustain knowledge transfer to further develop the GEOINT tradecraft in support of public safety missions.

Featured image: USGIF CEO Keith Masback with members of the Orange County Sheriff’s Department at USGIF’s GEOINT 2016 Symposium in Orlando, Fla.

The Future of Firefighting http://trajectorymagazine.com/the-future-of-firefighting/ Wed, 01 Nov 2017 13:00:45 +0000 Q&A with Kate Dargan, co-founder and chief strategy officer, Intterra Group; former California State Fire Marshal

Kate Dargan is co-founder and chief strategy officer for Intterra Group, helping to bring innovative geospatial and remote sensing solutions to first responders. Prior to founding Intterra, Dargan was the first woman to become State Fire Marshal for California. She has 30 years of firefighting experience. Dargan is also a member of USGIF’s Board of Directors.

What were some of your early experiences with geospatial intelligence (GEOINT) and firefighting?

My early days were right out of high school in 1977. I was among the first women hired by CAL FIRE—then called the California Department of Forestry and Fire Protection (CDF). I started as a seasonal firefighter responding to large wildfires, and 1977 was a record-setting year.

In 1980, I became full-time and started progressing through the ranks—from firefighter to fire engineer to fire captain. It was as captain that I started engaging with remote sensing. One of my assignments was air attack officer, which means essentially forward air traffic control over wildfires. I flew on an OV-10 Bronco for seven years as an eye-in-the-sky translating a map of what I was looking at from several thousand feet aloft to the firefighters on the ground. They had no other way to get that information.

That really gave me an appreciation for how difficult it was to translate visual information into a radio and paint a picture in someone’s head of what you’re looking at. Secondly, some of the technologies that came online in the late ’90s and early 2000s with infrared and mapping were making our world much easier. We experimented with holding a video camera in our hands while we flew so we could have video imagery. I gained a real sense of why we needed to be pursuing these technologies because so many things were advantaged with that perspective.

How did you gain an interest in community resilience?

When I went on to become a fire marshal in Napa County—wine country—I became very engaged with the prevention and mitigation parts of the equation and how to educate and prepare communities to survive wildfires.

In 2003, I was assigned to the Cedar Fire, which was one of California’s historic loss fires. It burned a fair chunk of San Diego County and killed many people, including some firefighters. It was a traumatic experience for everyone involved, and it generated in me a strong commitment to firefighter and community safety in firefighting and mitigation. I started looking for ways to use remote sensing for that problem as well.

What led you to become a co-founder of Intterra?

It was also around 2003 that I met my Intterra co-founders, David Blankinship and Brian Collins, through a mutual colleague. They were doing some amazing, innovative work with hyperspectral imaging analysis of watershed areas surrounding Colorado Springs. That translated into wildland fire fuels and rooftop analysis—flammable roofs. It was simply the best idea I’d seen in a long time relative to helping firefighters on the ground with detailed information. I became both a fan and an advocate. These bright people had taken a technology used for other purposes and reapplied it to a new use. It was one of those light-bulb moments where you see how the future might look 20 to 30 years down the road. We kept up a professional relationship for several years.

When Gov. Schwarzenegger appointed me to State Fire Marshal in 2006 I took the opportunity to expand and advance some of the remote sensing work we could do. We looked at things like LiDAR for forestry assessment, hyperspectral imagery (HSI) for community wildfire mitigation, and real-time infrared for firefighter safety. But there was no way to unite the various remote sensing products into a common platform for firefighters or planners. It was very technical and cumbersome and required a lot of analysis.

When I retired from the State Fire Marshal position in 2010, I went back to those colleagues and we talked about how we were still bumping our heads up against this problem. We said it’s been seven years and no one else is fixing it, so let’s start a company and build the right kind of software that can do what we’ve been struggling with. Since founding Intterra we’ve been working to build the capability to bring large amounts of data—including a lot of remotely sensed data from satellite and drone imagery, ISR products, and interpretive LiDAR and HSI products—into a situational awareness platform designed for fire departments.

It’s an eye-opener when folks see they can have all sorts of data streams plugged into a single device at their fingertips. That’s a game changer. Our clients range from the U.S. Forest Service to small, rural fire departments. That’s one of the things unique to the public safety world. When you’re focused on the military or federal world as a vendor you have a pretty homogenous client. In public safety, the tools have to be flexible enough to handle that disparity between large and small organizations.

How does GEOINT contribute to real-time firefighting and mitigation?

All phases of the emergency management cycle are advantaged with remote sensing. An example currently in use in the planning and mitigation phases is rooftop analysis. In the wildfire community, this allows you to assess a community’s vulnerability without having boots on the ground. You can find wood as opposed to concrete or shingle/tile roofs and assess fuel in terms of structure. ‘Is it eucalyptus, pine, juniper? Are palm trees adjacent? How far are they from the structure?’

You can predict fire behavior close enough to the structure to impact it or to create embers that are going to flow downstream. You can mix vegetative information with building information from rooftops, then marry it with GIS information of roadways, water sources, and historic fire information for patterns of behavior. When you put all of this together you have a tactically oriented map firefighters can use to decide which houses are safer to protect, which are riskier to protect, and where to stage equipment at the incident.

That model is directly out of the military paradigm of ‘shaping the battlefield,’ making sure conditions for success have been enhanced so the tactical fight is to your advantage. It’s not just about getting data into the hands of firefighters, but also about the ability to deliver large volumes of information served up to each person in an organization the way they need to use it. That’s where industry can really help the public safety world.
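
Dargan doesn't give a formula, so the sketch below is only one hedged way the layers she lists might be combined into a per-structure score: roof material weighted against nearby fuel type, with the fuel's influence decaying over distance. The categories, weights, and addresses are invented for illustration.

```python
# Hedged illustration of fusing roof material, adjacent fuel type, and fuel
# distance into a defensibility score. Weights and categories are invented;
# this is not an operational fire-behavior model.
ROOF_RISK = {"wood": 1.0, "shingle": 0.5, "tile": 0.2, "concrete": 0.1}
FUEL_RISK = {"eucalyptus": 1.0, "juniper": 0.9, "pine": 0.8, "palm": 0.7, "grass": 0.4}

def structure_risk(roof, fuel, fuel_distance_m):
    """Higher scores mean riskier to defend; fuel influence fades by ~30 m."""
    proximity = max(0.0, 1.0 - fuel_distance_m / 30.0)
    return round(ROOF_RISK[roof] * 0.6 + FUEL_RISK[fuel] * proximity * 0.4, 2)

# Rank houses so crews can see which are safer or riskier to protect:
houses = [("12 Oak Ln", "wood", "pine", 5.0), ("14 Oak Ln", "tile", "grass", 25.0)]
ranked = sorted(houses, key=lambda h: structure_risk(*h[1:]), reverse=True)
print(ranked[0][0], "is the riskier structure")   # 12 Oak Ln
```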

How does GEOINT yield a return on investment for fire departments?

The amount of detail a fire department can acquire and put to use through remote sensing is unavailable anywhere else except boots on the ground. As we’ve shifted to an electronic medium, we have the ability to take an aerial base map and electronically create that same building footprint or wildland area. It’s not just a static, electronic picture. We can now add data and refresh during an emergency. That is very different for the fire service. They’re not used to having that much information available to them during the decision-making phase of response. Remote sensing is arriving so quickly we’re struggling to figure out ways to use that data inside the mission. The information has to fit into the pattern and flow of how we fight fires, not disrupt it.

What do you anticipate for the future of firefighting technology?

Remote sensing products are arriving with exponential growth. So much is headed our way that the general public isn’t even aware of yet. Those capabilities, whether small drones or a large, near-real-time satellite network, are going to affect almost every moment of our decision-making process. A firefighter of tomorrow will be used to seeing their response area from an aerial viewpoint rather than through the windshield or via Google Earth. They will have constant access to what their district looks like at that moment, where they are within that area, where their adjacent units are, and the homes, building material, and emergency status in that area. They will be visually connected to their communities and will be much safer, especially in the wildland world. We’ve just begun to scratch the surface of how to incorporate these technologies into everyday decision-making.

What advice do you have for young public safety professionals?

I have two sons in the public safety field and both in the fire service. My advice to them has been to pay attention to GEOINT technology and to become conversant. To become familiar with the language of it, what the capabilities are, and read up on remote sensing applications. Become the person in your department who knows how to manage the information side. It’s less about being able to control the drone itself and more about being able to interpret the imagery it’s generating.

Kate Dargan, former California State Fire Marshal and co-founder of Intterra
