Features – Trajectory Magazine

The DigitalGlobe Foundation Celebrates 10 Years

A look at some of the globally meaningful work the foundation has made possible

The DigitalGlobe Foundation (DGF), an educational nonprofit established by commercial satellite imagery provider DigitalGlobe, celebrates its tenth anniversary this year. To promote globally significant research and prepare the next generation of geospatial professionals, DGF awards grants to students and scientists in the form of free access to the company’s imagery, training, and other space-based technology.

DGF founder Mark Brender saw the need in 2007 to ramp up workforce development in preparation for the industry’s imminent growth.

“We needed a way to open our aperture, to bring new ideas and people into geospatial sciences and the commercial remote sensing imagery ecosystem,” Brender said. “The best way to do that was to establish a foundation that can put high-resolution imagery into the hands of students so they can experiment with it, understand it, and eventually become geospatial users.”

To date, DGF has awarded more than 3,000 imagery grants valued at more than $14 million to students and researchers around the world. Such fieldwork has explored changes in topography over time, human and wildlife population sustainability, and historic site identification.

Students at USGIF-accredited GEOINT programs are often the recipients of such grants. 

“Our partnership with DGF provides unique opportunities for USGIF’s 14 accredited college and university programs,” said USGIF CEO Keith Masback, who is also a member of DGF’s board of directors. “With this access they are able to expand their ability to conduct research and advance the GEOINT tradecraft.”

In addition to research support, DGF also offers scholarships to select partner schools, including $5,000 annual awards to students at George Mason University and the University of Colorado.

To encourage more global-scale problem-solving from promising geospatial scientists, DGF is gradually expanding its scope beyond awarding imagery grants for specific research projects. Since March, DGF President Kumar Navulur has led the foundation toward investments in three main areas:

  • Leveraging machine learning and spectral analysis to extract insights from data.
  • Promoting the study of foundational sciences where the current global capacity is sub-par, specifically photogrammetry and physics.
  • Creating a cooperative network of research-focused universities.

According to Navulur, DGF has also expanded its reach from just a few universities outside the U.S. to a wider distribution of 50 universities in 20 countries. Additionally, DGF has established a relationship with the African Association of Remote Sensing of the Environment, which consists of about 50 more universities.

The foundation hopes increased support will push young geospatial professionals to seek tangible solutions to major environmental problems.

“I would love for universities to look at how to use imagery to document the quantifiable progress of the United Nations’ Sustainable Development Goals,” Navulur said.

In years to come, DGF partners and grant recipients will benefit from new access to cloud-penetrating radar data from Maxar Technologies, DigitalGlobe’s new parent organization. Additionally, case-specific imagery grants will be supplemented with access to the company’s global base map, DigitalGlobe Cloud Services.

“We are ensuring students have the skills to develop location-based technologies like the Internet of Things and remote sensing,” Navulur said. “Not only will they get jobs, they’ll make a difference in the world.”

Following are case studies featuring seven DGF grant recipients who are already making a difference:

Egyptian Looting

DGF granted three high-resolution images to University of Alabama at Birmingham’s Dr. Sarah Parcak to help measure archaeological looting in Egypt. Illegal digging reports were growing in the Saqqara and Dashur regions south of Cairo. Up-to-date data was not immediately available, so official theft measurements for the area were highly inaccurate until Parcak received access to GeoEye imagery via DGF.

Surveying Nomadic Health

In one of its first grants, DGF released imagery to Stanford researcher Hannah Binzen Wild for her analysis of health in nomadic pastoral populations in Ethiopia. Wild used the data to locate mobile settlements quickly enough to develop and deliver hundreds of surveys to people living in the remote Nyangatom region of Ethiopia’s Lower Omo Valley. She’s now back at Stanford, working in collaboration with the Stanford Geospatial Center to refine the use of imagery for analysis by developing algorithms to determine average settlement size and other population characteristics. The team hopes these methods and pilot data can serve as a foundation to improve health care access for nomadic populations in other contexts.

Tracking Gold

Michael Armand Canilao, an archaeologist and University of Illinois at Chicago graduate student, received an imagery grant from DGF supporting his research on ancient gold trading routes in the Philippines. DGF released four pan-sharpened WorldView-2 multispectral images, each displaying 1,000 square-foot tiles in northwest Luzon. The imagery enabled a closer look at the trails and, according to Canilao, made clear “how small-scale gold miners were able to negotiate, and, in some cases dictate, the terms of their participation in Early Historical Period maritime gold trade.”

Mapping the Magan Peninsula

New York University doctoral candidate Eli Dollarhide sought to uncover the true historic landscape of Magan, an ancient peninsula in Oman with an uncertain political past. DGF granted Dollarhide access to WorldView-2 and -3 imagery of the land between the Bronze Age settlements Bat and Amlah. This imagery helped Dollarhide’s team determine where to spend their limited time in the field and enabled the discovery of prehistoric tombs, petroglyphs, and roughly 450 other previously undocumented archaeological sites.

Satellites Over Seals

University of Minnesota researcher Michelle LaRue and her team used imagery provided by DGF to determine factors affecting the population variation and distribution of Weddell seals along the Antarctic coast. Both commercial fishing and the melting of ice caused by climate change have affected the ice-dependent species. The project aims to determine what environmental conditions the seals require to survive. “We literally couldn’t do this research without [this imagery],” LaRue said. She manually scoured the imagery to count seals, and compared her findings to modern, ground-validated counts as well as counts from the 1960s.

Erosion in the Yukon

It is theorized that slight increases in temperature caused the recent disappearance of the glacial Slims River in the Yukon. Dan Shugar, a researcher and professor at the University of Washington, Tacoma, was awarded WorldView-1, WorldView-2, and GeoEye-1 imagery by DGF to create 3D maps of the region. This enabled him to observe erosion processes in the Slims and Kaskawulsh rivers. Some imagery is being converted into a series of multi-temporal digital elevation models (DEMs) to visualize the hydrological system underground in search of changes that would affect glacial drainage. Shugar called these DEMs “a game changer.” DGF is continuing to work with Shugar on new tasking for stereo and multi-spectral images to detect changes in Kluane National Park.

Valley of the Khans

DGF helped researchers from the University of California San Diego, the Mongolian Academy of Science, and the National Geographic Society in their quest to locate the final resting place of Genghis Khan. In one of its first grants, DGF provided Albert Yu-Min Lin and his team with imagery of multiple areas over Mongolia. The researchers are leveraging the power of the crowd and enlisting the general public to help study the satellite imagery and identify features of interest. The aim is to find Khan’s tomb using non-invasive tools and enable protective conservation methods at the historic site.

Images courtesy of DigitalGlobe and the individual DGF grant recipients.

The Genesis of Google Earth

The history and future of the software that made GEOINT mainstream and changed the way we view the world

In August 2005, Hurricane Katrina ravaged the Gulf Coast of the United States, bursting levees throughout Louisiana and Mississippi and submerging the streets of south Florida. According to the National Hurricane Center, it was the deadliest hurricane since 1928, claiming at least 1,800 lives and causing more than $108 billion in damages.

The U.S. Navy, Coast Guard, and other federal relief groups deployed helicopter teams to rescue people stranded in New Orleans without the resources to escape or survive in their homes. Hurricane victims dialed 911 for urgent help at specific street addresses, but it was impossible for first responders to find them without precise GPS coordinates—street signs and house numbers were invisible beneath the deluge. In the absence of traditional situational awareness, responders were operating blind.

In California, a team from the recently minted Google Earth program launched into action, creating real-time imagery overlays of heavily affected areas on top of its existing 3D globe platform. Fly-by aerial photos from the National Oceanic and Atmospheric Administration (NOAA) and satellite imagery from DigitalGlobe—one of Google Earth’s primary providers—revealed the scope of the hurricane’s destruction. Google Earth made this data publicly available and responders had eyes again.

Now, they could input a caller’s location into Google Earth paired with case-specific details—for example, a target trapped in a two-story house with a clay roof next to an oak tree. Equipped with up-to-date imagery from Google Earth, relief teams saved thousands of people from Katrina’s aftermath.

Years later, the Louisiana Governor’s Office of Homeland Security and Emergency Preparedness would pair internal data with Google Earth Enterprise (GEE)—the desktop software suite for private or offline use of Google Earth—to create 3D globes for emergency response and infrastructural planning.

Today, Google Earth is among the most popular geospatial software applications in the world, boasting upward of one billion downloads. With it, students take virtual tours of the world’s wonders from their classrooms, house hunters evaluate prospective properties without leaving home, and much more. The U.S. military employs GEE for secure mission planning, and intelligence professionals use it to visualize points of interest and detect change. Google’s spinning globe truly represents the democratization of geospatial intelligence.

In the case of GEE, government and military organizations became so dependent on the software’s private storage and visualization capabilities that not even a deprecation announcement from Google two years ago stopped them from using the platform.

As a result of the community’s reliance on GEE, earlier this year Google decided to make the software’s code open source and available for public download on GitHub.

With its future in the hands of its users, GEE is poised to remain at the center of mission planning and situational awareness efforts for the defense and intelligence communities—at least until a supported platform of equal utility arises.

A Giant’s Infancy

At the time Hurricane Katrina made landfall, Google Earth software had been available to the public for only three months. But the story of Google Earth began to take shape 10 years earlier at a computer hardware company called Silicon Graphics (SGI).

Michael T. Jones, then a member of SGI’s computer engineering team, had developed an invention that would revolutionize the firm’s 3D graphics offering, which at the time was used primarily for flight simulation.

“It was called clip mapping. That’s the fundamental hardware feature SGI had that let it do this amazing, smooth flight around the world,” said Jones, now a managing partner at Seraphim Capital.

Jones’ technique displayed a small region of graphics—the region under examination—in high resolution while the peripheral regions were displayed in low resolution. Jones, along with SGI engineers Chris Tanner, Chris Migdal, and James Foran, patented the method in 1998. Clip mapping required powerful supercomputers to run, but enabled a high-fidelity texture map that became the centerpiece of SGI’s final graphics system, Infinite Reality, which at the time boasted the fastest 3D graphics in the world.
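
A rough way to picture clip mapping is a texture pyramid in which only a small, movable window of full-resolution data is kept around the point under examination, with everything farther out drawn from coarser levels. The short Python sketch below illustrates that level-selection idea; the function name, thresholds, and doubling rule are hypothetical illustrations, not SGI's patented implementation.

# Illustrative sketch of the clip-mapping idea: full resolution only inside a
# small "clip window" around the focus point, coarser pyramid levels beyond it.
# Names and thresholds are hypothetical, not SGI's actual algorithm.
def choose_detail_level(distance_from_focus_km, num_levels=8, clip_radius_km=5.0):
    """Return a pyramid level: 0 = full resolution, num_levels - 1 = coarsest."""
    if distance_from_focus_km <= clip_radius_km:
        return 0  # inside the clip window: highest resolution
    level, d = 1, distance_from_focus_km / clip_radius_km
    while d > 2 and level < num_levels - 1:
        d /= 2  # each doubling of distance drops one level of detail
        level += 1
    return level

if __name__ == "__main__":
    for dist in (1, 10, 50, 400, 3000):
        print(f"{dist:>5} km from focus -> pyramid level {choose_detail_level(dist)}")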

Federal agencies such as the National Geospatial-Intelligence Agency (NGA) and the National Reconnaissance Office (NRO) would later follow suit, Jones said, using clip mapping to build data visualization platforms of their own.

To demonstrate the breadth of Infinite Reality’s capabilities, SGI created a demo called “Space to Your Face.” It began with a wide view of Earth from space, slowly zooming into Europe. When Lake Geneva became visible, the program would focus on the Matterhorn in the Swiss Alps. It would continue to zoom until reaching a 3D model of a Nintendo 64 console on the mountainside. Then it would zoom in even more, settling on the Nintendo’s MIPS R4000 graphics chip—a microprocessor created by SGI—before snapping smoothly back to space.

The demo was well received. Educators were excited to see an interactive, classroom-friendly global map tool, and video game developers had never seen such fluid graphics.

Seeking a new home for their brainchild, Jones, Tanner, and former SGI engineers Remi Arnaud and Brian McClendon founded a company of their own. Called Intrinsic Graphics, it focused on developing high-quality 3D graphics for personal computers and video games.

In October 1999, Tanner took the concept further when he designed a software version of the clip mapping feature that allowed a user to “fly” within a 3D visualization of Earth.

“People were blown away,” Jones said. “They were looking at Google Earth.”

Though the software platform wasn’t Intrinsic’s primary product—the graphics themselves were—Jones was intrigued and continued refining the spinning globe.

Yet running the software required expensive and highly specialized computing hardware not available to most of the private tech industry, let alone the commercial user.

“That machine cost $250,000. We wanted to be able to offer this without the specialized hardware,” said McClendon, now a research professor at the University of Kansas. “To be able to get that performance out of a PC meant we could share it with the world. The moment you realize you can transmit this data over the internet, you begin to realize the impact. A group of us at Intrinsic thought, ‘We need to build a company around this.’”

And before long, yet another company was founded. In 2000, Jones, McClendon, and a few others spun out the software from Intrinsic Graphics to launch Keyhole. In early 2001, Keyhole raised first round funding from NVIDIA and Sony Digital Media Ventures, making official its existence as a standalone company. Keyhole’s first product, EarthViewer 1.0, was the true precursor to Google Earth.

Using public data gathered from NASA’s Landsat constellation, IKONOS imagery, and aerial photos of major U.S. cities, Keyhole built a complete digital Earth. Though pixels were beginning to proliferate, high-resolution imagery was mostly limited to U.S. metropolitan areas.

Under the direction of newly appointed Keyhole CEO John Hanke, the company marketed EarthViewer to the commercial real estate and travel industries. Civil engineers also purchased it for the ability to sketch out location information when planning construction projects. 

“Intelligence agencies wanted this capability as well, but they wanted to use their own data,” McClendon said.

The Intelligence Community (IC) was intrigued, but wanted to use classified geospatial data gathered through National Technical Means rather than the data on Keyhole’s public server. To accommodate such buyers, Keyhole began offering an enterprise version of its software, allowing large-scale users to stand up private network servers and host their own data on a replica of EarthViewer’s 3D globe.

NIMA Backing

The National Imagery and Mapping Agency (NIMA) was the first agency to take note of this unprecedented capability. Under the leadership of then director James Clapper and deputy director Joanne Isham in 2001, NIMA launched a research and development directorate known as InnoVision. The new directorate sought to leverage state-of-the-art technologies from industry to help the IC adapt to the changing face of conflict in the aftermath of 9/11.

Isham, a former CIA employee, was well versed in In-Q-Tel, the CIA’s nonprofit venture capital initiative. She approached Robert Zitz, InnoVision’s first director, about collaborating with In-Q-Tel to find partner companies.

“We sat down together with In-Q-Tel and went over what our most urgent requirements were,” said Zitz, now senior vice president and chief strategy officer of SSL MDA Government Systems. “In-Q-Tel started trying to locate companies and [in 2002] discovered Keyhole.”

In-Q-Tel was impressed by the low barrier to entry and EarthViewer’s ease of use.


“With [EarthViewer], you just click on the icon and all of a sudden you’re flying around the globe,” said Chris Tucker, In-Q-Tel’s founding chief strategic officer and now the principal of Yale House Ventures. “There had been some way earlier-era, very expensive defense contract iterations [of a 3D digital Earth], but none at a consumer level that a regular analyst could make sense of without being a missile defense expert or some other technical user.”

In 2003, In-Q-Tel invested in Keyhole using NIMA funding. It was the first time an intelligence agency other than the CIA had employed In-Q-Tel. NIMA experienced an immediate return on its investment. Within two weeks, the U.S. military launched Operation Iraqi Freedom, which Keyhole supported in its first mission as a government contractor.

“We wanted a capability that would help military planners visualize and seamlessly move through datasets pertaining to particular target areas,” Zitz said. “We also wanted the ability to rapidly conduct battle damage assessments. NIMA was supporting joint staff in the Pentagon, and to sense how effective a strike was after-the-fact was very labor and imagery intensive. With Keyhole, we were able to streamline that process.”

EarthViewer quickly gained public exposure through TV news coverage using its battlefield imagery.

One of McClendon’s junior high school classmates, Gordon Castle, was CNN’s vice president of technologies. McClendon approached Castle with his EarthViewer demos. Castle was wowed, and CNN became one of Keyhole’s first media customers. The network routinely used EarthViewer to preview story locations during broadcasts. When the U.S. invaded Iraq, CNN used the software heavily—sometimes several times an hour—to show troop movement or combat locations.

The Big Break

Realizing its technology could improve people’s understanding of the planet, widespread commercialization became Keyhole’s mission. But Keyhole was a small company, and scaling up its computing infrastructure to handle more traffic was expensive. An annual EarthViewer Pro subscription still cost $599—a price justified by the company’s high operating costs. Keyhole’s bottom line stood in the way of its goal.

“[We wanted] everybody that opened the app to be able to find their house,” McClendon said. “It’s the first thing everybody searches for. If that experience isn’t good, the user thinks the product isn’t good.”

That first step required high-quality coverage of the entire land surface of Earth—a seemingly unattainable achievement for Keyhole’s 29 employees, even with In-Q-Tel backing. And the startup’s network bandwidth wasn’t strong enough to offer a high-resolution 3D globe to millions of consumers worldwide. McClendon recalled making regular trips to Fry’s Electronics to purchase hard drives, struggling to keep up with demand.

“To provide high-resolution data for the whole world was an epic undertaking … that would’ve taken us probably a decade to build up on our own,” he said.

For its vision to materialize, Keyhole needed more capital to scale up imagery procurement and to build powerful data infrastructure to store high volumes of imagery. In 2004, as if on cue, along came Google—one of the few companies powerful enough to manifest Keyhole’s mission. And they wanted to buy.

“It seemed like a tough road. Everybody was impressed with what we had done, but there was going to be competition and we needed to move quickly,” Jones said. “So we sold to Google because our dream would happen.”

As part of the acquisition, the Keyhole team maintained control of the program as it evolved. Most personnel, including McClendon and Jones (Tanner had since departed Keyhole), became executives at Google, developing their software unrestricted by the need to keep a startup afloat.

Once at Google, the program began to operate on an entirely different scale. Instead of acquiring licensing deals for small portions of a vendor’s imagery at a time, Google bought out all the imagery a vendor had available at once. Google also provided access to a rapidly growing user base already hooked on its web search platform.

Before debuting a Google-branded product, the former Keyhole team had to rewrite EarthViewer’s service code to run within Google’s infrastructure. Additionally, pre-release engineering refinements focused on adding data around the globe, making the program accessible to non-English speaking users, and simplifying features. Finally, Google Earth launched in June 2005.

The software exploded in the commercial marketplace. Where Keyhole’s consumer version of EarthViewer was too expensive for most casual civilian users, Google Earth was downloadable for free.

“We had millions of users in the first few days and tens of millions in the first year,” McClendon said.

Keyhole brought to Google a new form of interactive information that mimicked the real world and helped people understand their place in it. A GEOINT tool had finally made it to the mainstream.

In 2006, Google released Google Earth Enterprise for organizations seeking the capabilities of Google Earth but with private data in a secure, offline environment. The GEE suite included three software components: Fusion, the processing engine that merged imagery and user data into one 3D globe; the Earth server that hosted the private globes built by Fusion; and Client, the JavaScript API used to view these globes.

Whether to disseminate that data after creating proprietary globes in GEE was, and still is, up to the user. This was the final evolution of the EarthViewer enterprise suite used by the Pentagon at the outset of the Iraq war.

GEE in Action

In the years following its launch, government agencies, businesses, and state municipalities began to deploy GEE at internal data centers to produce 3D globes using sensitive or classified data.

The city of Washington, D.C., for example, has used GEE to model and visualize public safety data including crime, vehicle and fire hydrant locations, and evacuation routes.

Arguably the largest user of GEE is the U.S. Department of Defense (DoD). When Google Earth was first released, military customers had an explicit need for this capability to function in a highly secure private network.

For example, the Army Test and Evaluation Command (ATEC) uses private data on enterprise servers such as Google’s to evaluate a wide range of weapon systems as well as ground and air operations.

At ATEC’s Yuma Proving Ground (YPG) in Arizona, proprietary terrain data, imagery, and operations maps are overlaid on Google Earth and used to plan and schedule launches.

“Knowing where everyone is and moving in a secure range and air space is important to our operations,” said Ruben Hernandez, an Army civilian in the YPG’s engineering support branch. “Much of this data is also armed for range awareness display.”

For example, prior to an indirect fire artillery test, personnel use YPG data within GEE to assess the safest positions on base to conduct the test—when to fire, where to fire from, and what to fire at. That information is disseminated throughout YPG for awareness.

“Many of these munitions have extensive footprints. We want to find out how much air and land space [the blast] is going to consume. Safety is a big component of how these overlays are planned,” Hernandez said.

NGA is another major GEE stakeholder. In 2008, the agency’s new GEOINT Visualization Services (GVS) program invested in the enterprise server. GVS has since produced a proprietary version of Google Earth for warfighters featuring classified NGA data.

According to GVS program manager Air Force Lt. Col. Mike Russell, “GVS was built around providing a version of Google Earth in the secret and top secret domains so users could visualize classified information geospatially and temporally in a common operating picture.”

Now, NGA’s private Google Earth globes are mission critical for more than 30,000 customers daily, including DoD Combatant Commands, the FBI, CIA, NRO, National Security Agency, and Federal Emergency Management Agency. NGA’s current release is the second largest Google Earth globe in the world and is used across the DoD and IC for common situational awareness, tracking vehicles and personnel, delivering intelligence briefings, and more.

Russell praised Google’s efficient rendering of data files in the Keyhole Markup Language (KML) format. KML was created for file building in Keyhole’s EarthViewer platform and has since become an industry standard for visualizing geospatial data.

“[Users] will create data files like the location of an IED or a live dynamic track of an aircraft. They can build these files rapidly and not to spec, put them in Google Earth, and they’ll run somehow. [Competitors] can only render smaller KMLs or those built to spec. That’s really the reason why no other applications have been able to enter this space as dominantly as Google Earth,” Russell said.
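
KML itself is a lightweight XML dialect, which is part of why hastily built files still render: a usable placemark needs little more than a name and a coordinate. The snippet below is a minimal sketch that writes such a file in Python; the placemark name, description, and coordinates are arbitrary examples, not operational data.

# Minimal sketch of a KML placemark file of the kind analysts build "rapidly
# and not to spec." The name and coordinates below are arbitrary examples.
kml = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Point of interest</name>
      <description>Coordinates are longitude,latitude[,altitude]</description>
      <Point>
        <coordinates>-77.0365,38.8977,0</coordinates>
      </Point>
    </Placemark>
  </Document>
</kml>
"""

with open("example.kml", "w", encoding="utf-8") as f:
    f.write(kml)  # the resulting file opens directly in Google Earth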

The Unbundling

GEE served a far more specific client and purpose than the commercial Google Earth services, but its rate of adoption was noticeably low compared to most Google products.

According to McClendon, “Continuing to innovate on a hosted service exclusively for the enterprise community was not financially viable.”

In March 2015, Google announced the deprecation of GEE. After a two-year transitional maintenance period, the company stopped supporting GEE software in March 2017. Though it was being phased out of Google’s product line, GEE remained in use by invested customers relying on it to meet mission demands and house their data.

Hernandez recalled pushback from teams at Yuma who were not keen to change their data storage and visualization system. According to Russell, GVS feared losing its primary product and stranding customers without an application to replace it.

To accommodate the ongoing need, Google announced in January it would publish all 470,000 lines of GEE’s code on GitHub, allowing customers to continue using the software they’d grown loyal to and to improve the product independently.

For customers who prefer transitioning to supported enterprise software, Google has coordinated with Esri to offer free software and training to GEE customers who migrate to Esri’s ArcGIS platform.

The open-source GEE (GEE-OS) suite includes the Earth server, Fusion, and a portable server allowing users to run GEE on a mobile device or desktop computer not connected to a centralized server. The GEE Client software, which is required to connect to the Earth server and view 3D globes, was not carried forward into the open-source environment. Instead, it will continue to be maintained and provided by commercial Google Earth.

Thermopylae Sciences and Technology (TST), NT Concepts, and Navigis—three longtime Google partners—supported GEE’s transition to open source. In the spring, each of the three companies sent a developer to Google in Mountain View, Calif., to spend several weeks learning the code from Google developers who had been maintaining the software baseline. 

TST began a partnership with Google in 2007 through a series of federal government customer engagements supporting Thermopylae’s own Google Earth-based tracking console. When the open-source announcement was made, TST’s Earth Engineering team was reassigned to the company’s Open Source Development Office to create the GEE GitHub site and migrate the source code.

On Sept. 14, TST’s open source team released GEE-OS version 5.2.0, which matches the last proprietary release and fixes bugs that emerged during the two-year deprecation period.

“When we pulled the code out from [Google’s] proprietary side, there were a lot of things that needed to be built back up or replaced with open-source components,” said Thermopylae CEO AJ Clark. “Really these first few months are just about providing feature parity with where the code was at its last state inside Google.”

TST’s team aims to release GEE-OS 5.2.1 by the end of 2017.

Now that parity is achieved and the program’s performance is stabilized, developers will begin submitting expanded code contributions. According to Clark, the first value-add propositions will most likely begin to flow in early 2018. Meanwhile, DoD and IC users are eager to discover how they can further adapt the software for their specific missions.

Chris Powell, CTO of NT Concepts, said the company is working with its defense and intelligence community customers to support GEE and their transition to the GEE-OS baseline. 

“We’re also actively looking for opportunities to contribute back to the open source baseline for feature improvements and capabilities,” Powell said, adding some possibilities are scaling the GEE processing power to a larger compute platform and examining how the software can be optimized for the cloud.

Hernandez said the planning crew at Yuma is looking forward to new software capabilities that could be built out at the request of the test community. Among these features, he said, is the ability to “grab geospatial objects and collaborate on them between multiple users; to grab, extend, and change the shape of a [weapon] footprint in 2D or 3D; and to provide a simulation of an object’s line trajectory.”

According to Jon Estridge, director of NGA’s Expeditionary GEOINT Office, the agency has committed to provide enhancements and ongoing sustainment to open-source GEE on GitHub through at least 2022.

“A few specific examples would be multi-threading the fusion process to support massive terrain and imagery updates, enhanced 3D mesh management, and inclusion of ground-based GEOINT content like Street View,” Estridge said. 

Open source means more customizability for users with niche wants and needs. No two proprietary Google Earth globes look the same, and teams will have more command over the unique data they store, visualize, and analyze within the program.

“It’s very positive,” Russell said. “[Open source is] an opportunity for NGA to partner with Thermopylae to tie the proprietary and non-proprietary pieces together, and it allows us to sustain Google Earth for our user community for a longer period of time.” 

The decision to make GEE code open source only improves the program’s accessibility and potential use cases, and will bolster the software’s longevity. Code sharing is a growing trend in the IC, and Google has provided government, military, and industry unlimited access and control of one of the most useful enterprise GEOINT tools on the market. 

The Next Generation of GEOINTers

Meet the 2017 USGIF Scholarship winners

USGIF awarded $117,000 in scholarship funds this year to students studying the geospatial sciences and related disciplines. The annual USGIF Scholarship Program recognizes the achievements of graduating high school seniors as well as undergraduate, graduate, and doctoral students. The program’s goal is to further the advancement of the geospatial tradecraft.

USGIF awarded 26 scholarships this year to six high school seniors, six undergraduate students, 10 graduate students, and four doctoral candidates. The Foundation also awarded the $10,000 Ken Miller Scholarship for Advanced Remote Sensing Applications for the second year. The Ken Miller Scholarship is presented to a master’s degree candidate studying remote sensing who plans to enter the defense, intelligence, or homeland security workforce.

Since the USGIF Scholarship Program began in 2004, more than $1.1 million in funds have been awarded to students with GEOINT aspirations.


KEN MILLER SCHOLARSHIP FOR ADVANCED REMOTE SENSING APPLICATIONS


Andrew Knight
University of Georgia
Geography

Knight holds a bachelor’s degree from James Madison University in geographic science with a concentration in applied geographic information sciences. His research goals include applying remote sensing methods to humanitarian issues. For his thesis, Knight is studying the intersection of machine learning and unmanned aerial systems. He currently works as a research assistant at the University of Georgia’s Center for Geospatial Research.
 


DOCTORATE


Michael Brady
Rutgers University
Geography

Brady, a former enlisted member of the U.S. Coast Guard, earned both his bachelor’s and master’s degrees in geography at Hunter College of the City University of New York under the GI Bill. In his doctoral research, Brady maps shoreline risks with whaling communities that live along Alaska’s northern coastline. He is particularly interested in collaborative research using participatory geospatial methods.
 


Carolyn S. Fish
Pennsylvania State University
Geography

Before returning to graduate school, Fish was a cartographic product engineer at Esri. She also completed her bachelor’s degree in geography at Penn State University and her master’s degree in geography at Michigan State University. Fish’s research aims to understand how maps are designed and used to convey climate change in many contexts, including national security, with the goal of improving such visualizations for better decision-making.
 


Cara Murphy
Rochester Institute of Technology
Imaging Science

Murphy is a full-time employee at Systems & Technology Research and a part-time Ph.D. student with the Center of Imaging Science at Rochester Institute of Technology (RIT). Previously, Murphy received her bachelor’s degree in physics and mathematics at Merrimack College and her master’s degree in imaging science with a concentration in remote sensing from RIT. Her work focuses on solving problems in forensics, defense, border protection and monitoring, and other law enforcement and intelligence applications.


James Walker
University of California, Los Angeles
Geography

After a decade of working in the nonprofit sector, Walker returned to school to obtain a bachelor’s degree in global studies and a master’s degree in geography from UCLA. His doctoral research is focused on the adoption of remote sensing and GEOINT analysis by human rights groups. Using methods drawn from critical geopolitics and science and technology studies, his research explores how GEOINT technology empowers non-state actors in their efforts to shape international crisis response.


GRADUATE


Roxanne Ahmadi
Pennsylvania State University
Homeland Security-GEOINT

Ahmadi is currently an intern with the National Geospatial-Intelligence Agency and graduates with her master’s degree this spring. Ahmadi’s research interests are object-based image analysis and automation as well as the increasingly relevant integration of human geography and remote sensing.

 

 


Colin Bunker
Ohio State University
Electrical and Computer Engineering

Bunker earned his bachelor’s degree in mechanical engineering from Purdue University. He currently works at the Air Force Research Laboratory in Dayton, Ohio, as a Pathways intern. His research interests include object detection and tracking, high-level scene recognition, and image geo-location.

 

 


Courtney Connor
Middlebury Institute of International Studies at Monterey
Nonproliferation and Terrorism Studies

Connor holds a bachelor’s degree in modern languages and literatures with a minor in psychology from California Polytechnic State University, San Luis Obispo. Her interests lie in GEOINT fusion and how algorithms and software can be streamlined to aid in the successful identification and rescue of human trafficking victims, while also bringing their traffickers to justice.

 


Jace Ebben
Pennsylvania State University
Homeland Security – GEOINT

Ebben works in St. Louis for Booz Allen Hamilton as a geospatial analyst assigned to the National Geospatial-Intelligence Agency. Ebben served for six years as an intelligence analyst in the Wisconsin Air National Guard and attended the University of Wisconsin, graduating with a bachelor’s degree in geography and political science. Ebben also holds a graduate certificate in geospatial intelligence analytics from Penn State. His current interest is in machine learning and its application as a force-multiplier for analyzing remotely sensed information.


Linnea Johnson
George Washington University
Data Science

Johnson graduated in 2013 from Mount Holyoke College, where she studied geography and Chinese. During her time at Mount Holyoke, she had several internships and research experiences that allowed her to work with geospatial technologies, and received several scholarships to improve her Chinese language abilities. After earning her bachelor’s degree, Johnson spent a year in Taiwan as a Fulbright Scholar, and upon her return began working as a research specialist with the Department of Defense.


Phil McTigue
Northeastern University
Emergency Management & Geospatial Information Technology

McTigue’s collegiate education began with an undergraduate degree in homeland security from American Military University. His concentration is focused on geospatial information and intelligence as it relates to homeland security. One of his specific areas of interest is the use of unmanned aerial systems for imagery capture.

 


Travis Meyer
Pennsylvania State University
Geographic Information Systems

Meyer’s undergraduate degree is in marine environmental science from the State University of New York Maritime College. He spent nine years as a U.S. Marine and Naval officer. Meyer’s research is focused on using bathymetric LiDAR and photogrammetry to analyze the vulnerability of American coastlines to sea level rise, coastal erosion, and storm surge. Meyer is currently a curriculum developer and instructor at the Naval Meteorology & Oceanography Professional Development Center.


Andrew Ryan
George Mason University
Geoinformatics and Geospatial Intelligence

Ryan graduated with a bachelor’s degree in geography from Virginia Tech in 2015, after which he completed an internship with the State Department’s Office of the Geographer. He currently works full-time as an all-source geospatial analyst with DigitalGlobe. Ryan’s research interests include social media analysis, activity-based intelligence, data mining, machine learning, and deep learning.
 


Jesse Sprague
University of New Mexico
Computer Science and Geography

Sprague earned a bachelor’s degree in Earth and planetary science from the University of New Mexico, and has worked for the U.S. Geological Survey and private firms using geospatial information sciences for environmental management. Sprague now runs a spatial data company and is interested in deep belief networks and virtual augmentation of human experiences with low-latency spatial data.
 


UNDERGRADUATE


Luke M. DeJong
American Military University
Homeland Security

DeJong brings his experience in the Marine Corps geospatial intelligence field to his pursuit of a degree in homeland security. He believes the future of our nation and the safety of American citizens can be best secured through using intelligence gathering to prevent terrorist attacks.

 

 


Norman Dela Fuente
University of California, Los Angeles
Geography

Dela Fuente’s geospatial interests include disaster response, urban planning, and national security. He is also in the UCLA Army ROTC program and upon graduating will be commissioned as a second lieutenant in the California National Guard. Dela Fuente plans to utilize the leadership and critical thinking skills he’s learned as an Army officer to complement his civilian career.

 


Daniel Gurley
James Madison University
Geographic Science

Gurley is interested in the use of GIS to better implement international development programs and humanitarian responses to crises. He is currently a returning intern with the State Department’s Virtual Student Federal Service using remote sensing and research to help the Bureau of Overseas Buildings Operations select sites for new embassies and consulates. He is also involved in a research lab focusing on the infrastructure, history, and biodiversity of Gonâve Island in Haiti and its surrounding coral reefs.


Erin Manth
Mercyhurst University
Intelligence Studies

Manth has spent two summers as a GEOINT analyst intern for a federal agency and has previous work experience at the National Student Leadership Conference on Intelligence and National Security. Manth is interested in GEOINT applications to national security and humanitarian response, specifically in the Middle East and North African regions.

 


Emma McFee
University of Utah
Geography

McFee is continuing her pursuit of a geography degree with an emphasis in hazards, resources, and human security. Her passion for GEOINT stems from experiencing two major floods while growing up in Upstate New York. After being displaced from her home twice, she knew she wanted to help people in similar situations. She is also interested in how GEOINT can influence business decisions.

 


Elijah Staple
University of Colorado Boulder
Computer Science

Staple’s computer science interest is in deep machine learning networks. He has interned at two major Silicon Valley companies, the National Geospatial-Intelligence Agency, and the National Air and Space Intelligence Center. Staple’s goal is to employ advanced computational techniques to enhance the GEOINT tradecraft by enabling analysts to provide actionable intelligence to policy and decision-makers.

 


GRADUATING HIGH SCHOOL SENIORS


Robert Cordts
South Lakes High School in Reston, Va.
Now attending James Madison University

Cordts became interested in GIS after taking a dual-enrollment geospatial analysis class offered through James Madison University in his senior year of high school. For his final project, Cordts used GIS tools to analyze where to place a new swim team in his hometown to increase participation in swimming among minorities and low-income families. He is majoring in geographic information science and looks forward to solving real-world problems.


Caitlin Gormley
Sayville High School in West Sayville, N.Y.
Now attending the University of Toronto

In high school, Gormley was fortunate to gain experience in GIS as part of her school’s scientific research program. Her research used geospatial methods to investigate hydraulic fracturing and its potential impacts to the health of local communities. She was subsequently named one of the 2017 Regeneron Science Talent Search Scholars. Gormley hopes to major in urban studies and human geography with a minor in geospatial information systems.


Lily Nalulani Jenkins
Molokai High School in Kaunakakai, Hawaii
Now attending the University of North Carolina at Chapel Hill

In her free time, Jenkins would participate in wetland and fishpond restoration projects and conduct research projects on the effects of invasive marine species on coastal ecosystems. While conducting these research projects, Jenkins found her passion for using geospatial technology as a tool to tackle environmental issues. She plans to pursue a bachelor’s degree in environmental science and a master’s degree in information systems.

 


Haley King
Tuscarora High School in Leesburg, Va.
Now attending George Mason University

During high school, King studied geospatial sciences through a dual enrollment program with James Madison University. Her final projects focused on first responders and precision agriculture. King also completed pre-college software development courses and cybersecurity courses at George Mason University and was a GIS analyst and summer intern for Dewberry. King plans to study geography and GIS.

 


Joshua Queja Orteza
Westside High School in Jacksonville, Fla.
Now attending the University of Florida

Orteza is majoring in mechanical engineering and participating in the Army ROTC program, and is interested in both aerospace and national security. If he earns a commission as an Army officer, Orteza would like to work with satellites, either helping maintain them or using the information they provide. Later, as a civilian, Orteza intends to work on satellites with a large aerospace company.

 


Timothy Vrakas
Brookfield East High School in Brookfield, Wis.
Now attending Stanford University

Vrakas is pursuing a degree in electrical engineering. For the past two years, he has explored interests in this field through his work for the Arizona State University Mastcam-Z Team, developing imaging hardware and software to support the cameras on NASA’s 2020 Mars Rover. Vrakas hopes to continue this work while in college.

 


Tempests + Terrain

Weather forecasting and GEOINT are naturally intertwined. As forecasting becomes more sophisticated, humanity stands better poised to predict and harness the power of the weather.

Every mundane conversation you’ve ever had has probably included empty banter about the weather. And you’re not alone. In 1897, American writer Charles Dudley Warner quipped, “Everybody talks about the weather, but nobody does anything about it.”

Talk is the only thing about weather that’s small, however. Everything else about it is big, including its effects, which have economic, social, and political implications of growing consequence for individuals, communities, businesses, governments, and militaries. This is particularly true in an era of increasing meteorological tumult, when extreme weather events like Hurricanes Harvey, Irma, and Maria are broadcasting in no uncertain terms, “Severe weather ahead!” As such events become more routine than rare, Warner’s jocular observation begs a serious call to action: Instead of talking about the weather, the time has come to better anticipate, harness, and respond to it.

“Hurricanes, cyclones, thunderstorms, and other extreme weather events are becoming more common due to climate change and global warming,” said Peter Platzer, CEO of Spire, a cubesat startup with plans to collect and distribute high-frequency weather data to commercial customers. “So the contribution you can make to humanity by improving weather forecasting is really substantial.”

Weather has always been important. Not only because of the innumerable crises it has created, but also because of the many opportunities.

“In U.S. history alone, there have been all kinds of events where weather played an important role, going all the way back to George Washington crossing the Delaware to win the Battle of Trenton,” said meteorologist Paul Dorian, a senior systems engineer at Vencore, whose weather division provides weather forecasting for government clients like the U.S. Air Force Weather Agency and NASA. “One of the most famous, of course, is D-Day. Weather was critical for the Normandy invasion because Gen. Eisenhower made the decision to invade based on the weather forecast. It turns out we had better forecasters than the Germans did, and that’s [a primary reason the] invasion worked out so well for us.”

Bad forecasting can be just as impactful as good forecasting.

“In 1980, there was a hostage crisis in Iran and President Carter ordered a rescue mission,” Dorian continued. “Helicopters flew into the Iranian desert to try to rescue the hostages, but the winds kicked up and dust was blowing everywhere. It brought down one of the helicopters, which [contributed significantly] to the mission being aborted.”

Neither sandstorms nor hurricanes can be prevented. They can, however, be predicted. And if you can predict weather, you can manage it, according to Dr. Peter Neilley, an IBM Distinguished Engineer and director of weather and forecasting technologies for The Weather Company, which was bought by IBM last year and includes The Weather Channel and Weather Underground.


“Weather forecasts aren’t perfect, and they never will be. But they have gotten a lot better,” Neilley said. “As a result, decisions are being made every day based fundamentally on the weather forecast.”

But meteorology alone can’t ensure more D-Days and fewer failed missions. Because all weather has a location and all locations have weather, weather forecasting must work in concert with GEOINT, according to Neilley. “Weather is fundamentally a geospatial science,” he continued, noting, for instance, the temperature differences between low and high elevations, and between inland and coastal communities. “The terrain can have a significant impact on what the local weather is.”
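
To make the elevation point concrete: in the lower atmosphere, temperature typically falls by roughly 6.5 degrees Celsius per 1,000 meters of altitude gained (the standard environmental lapse rate), so a valley observation does not transfer directly to a nearby ridge. The Python sketch below applies that rule of thumb; it is a back-of-the-envelope illustration, not how operational forecast models treat terrain.

# Back-of-the-envelope elevation adjustment using the standard environmental
# lapse rate (about 6.5 C per 1,000 m). Operational forecast models are far
# more sophisticated; this only illustrates why terrain matters to a forecast.
STANDARD_LAPSE_RATE_C_PER_M = 6.5 / 1000.0

def adjust_temperature(temp_c, station_elev_m, target_elev_m):
    """Estimate the temperature at target_elev_m from a reading at station_elev_m."""
    return temp_c - (target_elev_m - station_elev_m) * STANDARD_LAPSE_RATE_C_PER_M

if __name__ == "__main__":
    # 20 C observed at a 200 m valley station, estimated for a 1,800 m ridge
    print(f"{adjust_temperature(20.0, 200, 1800):.1f} C")  # about 9.6 C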

It’s not just terrain. Other GEOINT variables such as land type, latitude, water proximity, and even human geography also influence weather.

“GEOINT is the exploitation and analysis of imaging and geospatial information that’s describing, assessing, and visually depicting physical features and geo-referenced activities. Weather exploitation is the same thing; it’s exploiting and analyzing images and atmospheric information to describe, assess, and visually depict physical features that are geo-referenced,” explained Eric Webster, vice president and general manager of environmental solutions at Harris Corp.

Understanding and exploiting these parallels could help humankind recast weather as an opportunity instead of a threat.

Why Weather Matters

Few people understand the significance of weather better than Rep. Jim Bridenstine (R-Okla.), who serves on the U.S. House of Representatives’ Armed Services and Science, Space, and Technology committees, and at press time was nominated by President Trump to be the new NASA administrator. “As a member of Congress from … Oklahoma, until this year I have had constituents die every year in tornadoes,” said Bridenstine, a Navy combat veteran. “I will also tell you as a naval aviator—and now as a pilot in the Air National Guard—that I have been very affected by weather many times in my military career, from the ability to do strikes on a target to the ability to land on an aircraft carrier in high seas. So it’s very important to me and very important for our country to make sure we’re doing everything we can to get the right [weather] intelligence to the right people at the right time.”

Weather forecasts are equally consequential for civilians and warfighters. For the former, it boils down to lives and livelihoods.

“The physical and economic losses the world suffers because of inaccurate weather forecasts are staggering,” said Platzer, who added weather impacts a third of the global economy and 100 percent of the global population. “When the weather forecast calls for a blizzard in New York, but it actually takes place in Boston, there’s loss of life, loss of property, and loss of money.”

For the military, what’s ultimately lost is the mission.

[Photo caption: Airmen from the 3rd Weather Squadron set up a Tactical Meteorological Observing System (TMOS) during Spartan Warrior May 13, 2015, at Avon Park Air Force Range, Fla. TMOS is used in the field to measure wind speeds, cloud levels, and temperature. Photo credit: U.S. Air Force]

“Let’s say you’re going to take out some ISIS guys in Libya, and you’re going to fly an airplane across the pond from the United States to do it,” said Air Force Director of Weather Ralph Stoffler. “Obviously, you want to know from a weather perspective when is the best time to take off; when is the best time to conduct aerial refueling operations, and where; when is the target going to be clear, and if it’s not going to be clear, should you use a different weapon that potentially works better when you can’t see the target? Those are all questions that we help answer.”

The Army leverages weather forecasting to answer similar questions, according to Bill Spendley Jr., weather team chief in the Army’s Office of the Deputy Chief of Staff, G-2. “The Army has six warfighting functions, and every one of those warfighting functions has capabilities therein that are affected by weather,” said Spendley, who described weather’s effects on brigade combat teams as a “mud-to-sun situation.”

In space, for instance, extreme weather can affect satellite communications and GPS signals. In the air, it can hamper the ability to drop weapons or paratroopers. And on the ground it can affect trafficability, interrupting the delivery of fuel, ammunition, supplies, and medical care.

“If weather impedes one warfighting function, that has repercussions for the entire brigade combat team in terms of being able to conduct its mission and defeat the adversary,” Spendley said.

Because cloud cover can obstruct images taken by Earth observation satellites, which supply more than 90 percent of data used in weather forecasts, weather likewise is mission-critical for the Intelligence Community (IC), according to Air Force Col. Herb Keyser, a senior meteorology and oceanography (METOC) officer at the National Geospatial-Intelligence Agency (NGA). If you’re looking at weather through an ISR lens, he said: “It’s all about clouds. Not many people care about cloud forecasting to the extent that we do.”

In truth, it’s not all about clouds. It’s also about context. “NGA is looking at population-forcing functions like potential landslides, vegetation health, and water security,” Keyser continued.

Weather’s impact on human geography can be simple—people stay home because they don’t want to go out in the rain—or complex: Climatological problems catalyze large population shifts.

“Weather influences crops and drought, which influence political instability, which influences refugees,” said Patrick Biltgen, director of data analytics for Vencore’s intelligence group. “If you’re able to forecast changes in weather and climate, you can predict massive geopolitical changes.”

Targeting Terrain

Citizens, soldiers, and decision-makers are no longer content with talking about the weather; faced with so many impacts, they’re acting on it, too.

GEOINT deserves a lot of the credit, according to Bridenstine. “When you talk about national security, weather, and climate, all of it requires geospatial intelligence,” he said, noting that Earth observation satellites launched for GEOINT missions are benefitting weather forecasters every day by collecting data about the atmosphere, lithosphere, hydrosphere, cryosphere, and biosphere—Earth’s air, land, water, ice, and organisms, respectively. “[Using satellites], we’re now discovering that we can see massive sandstorms in the Sahara Desert that are moving over the Atlantic Ocean, where they absorb large quantities of radiant energy from the sun. That affects the temperature of the ocean and in some cases actually mitigates the hurricane seasons that affect the United States … That’s just one example of many where GEOINT has benefited the weather community.”

GEOINT and weather are especially symbiotic in the military, according to Spendley and Stoffler, who agree that terrain is ground zero for GEOINT-weather synthesis.

“The intersection to a great extent happens at the tactical level,” explained Stoffler, who said Army topography teams collaborate with Air Force weather officers to determine trafficability based on terrain and weather inputs.

“An example would be years ago when we deployed to Rwanda. We had to support a 1,000-truck convoy of humanitarian-relief mission sets,” Stoffler said. “The Army was very concerned that the roads would wash out because of the monsoons that happen at that time of year, but they couldn’t do a proper trafficability forecast because they didn’t have the weather information. So we provided that to them and they in return provided things back to us, which allowed us to produce an integrated forecast on when the best time was to move those trucks and the best route to take.”

Echoed Spendley, “The intersection of terrain and the atmospheric conditions touching that terrain is obviously critical in terms of being able to conduct operations. METT-TC—mission, enemy, terrain and weather, troops and support available, time available, and civil considerations—is the lens through which the Army plans, conducts, and executes operations. Notice how it’s ‘terrain and weather.’”

METOC personnel from across the services collaborated this year to author a new edition of Joint Publication 3-59: Meteorological and Oceanographic Operations for the chairman of the Joint Chiefs of Staff.

“The armed forces use joint doctrine as principles of how to fight and win wars,” Spendley explained. “This entire joint publication was completely rewritten with a focus on the integration of the most accurate, timely, and relevant weather information into the joint force commanders’ decision-making process.”

Civil stakeholders also are invested in the terrain-weather nexus. An area of particular interest is flooding. In 2015, the National Oceanic and Atmospheric Administration’s (NOAA) National Weather Service (NWS) launched the National Water Center at the University of Alabama in Tuscaloosa. Geospatial scientists and weather forecasters at the 65,000-square-foot facility collaborate to analyze, model, and forecast water conditions—including stream flow, water level, runoff, flood inundation, snowpack, soil moisture, and evapotranspiration—for 2.7 million rivers and streams.

“They have implemented a new National Water Model that uses geospatial information like terrain and slope to forecast basins so we know better how water is going to flow through them,” explained NWS Observations Portfolio Manager Kevin Schrab.

Forecast models that fuse geospatial and weather information likewise can help mitigate wildfires.

“There are three primary drivers of wildfire behavior. One is terrain, or how the landscape is arranged. The other two are wind and fuels, both of which are dependent on what’s going on with the weather,” said former California State Fire Marshal Kate Dargan, co-founder and chief strategy officer at Intterra, a software company that provides situational awareness to public safety customers. Dargan is also a member of USGIF’s Board of Directors. “So, everything about wildland firefighting and wildfire risk is geospatial and weather-based in nature.”

Fortifying Forecasts

The most coveted weather data includes forecasts that are more detailed, more accurate, more localized, and longer-range.

Improvements are inevitable yet incremental, according to Neilley, who said weather forecasting accuracy historically has improved at the pace of one day per decade, such that a three-day forecast today is as accurate as a two-day forecast was 10 years ago. “Weather forecasting is an evolutionary science, and there’s a perpetual pipeline of things that are coming along and contributing to those evolutions,” he explained.

The most significant items in the weather pipeline today are the next generation of weather satellites, which are fundamentally better than their predecessors, according to Neilley. Specifically, NOAA operates two types of satellites: Polar Operational Environmental Satellites (POES), which provide global coverage twice daily, and Geostationary Operational Environmental Satellites (GOES), which have a fixed position from which they provide near-continuous observation of a certain region. NOAA and NASA are collaborating on upgrades to both.

The Geostationary Lightning Mapper is a single-channel, near-infrared optical transient detector that can detect the momentary changes in an optical scene, indicating the presence of lightning. (Photo credit: NASA)

At press time, the next iteration of POES, the Joint Polar Satellite System (JPSS), is scheduled to launch its second of five satellites, JPSS-1/NOAA-20, in November. Carrying a payload of five weather-monitoring instruments, the system will gather global measurements of atmospheric, terrestrial, and oceanic conditions. Its measurements will support accurate seven-day weather forecasts that will help meteorologists predict the intensity and location of severe weather events days before they occur.

The next iteration of GOES, the GOES-R Series, launched its first of four satellites, GOES-R (renamed GOES-16 on orbit and destined to serve as GOES-East), in November 2016 with a payload of six instruments. The satellite, slated to become operational in November, has already demonstrated a number of new capabilities that promise to improve the detail and accuracy of weather forecasts.

GOES-R satellites feature an Advanced Baseline Imager (ABI) that views Earth across 16 spectral bands. It can scan the entire Western Hemisphere every five minutes or take multiple images concurrently, in which case it’s capable of imaging the Western Hemisphere every 15 minutes, the continental U.S. every five minutes, and two specific storms every 60 seconds. The previous generation of GOES features five spectral bands and can image the Western Hemisphere just once every 30 minutes.

“The picture is much clearer; there’s three times the spectral bands, which allows you to see variations in temperatures and other things within clouds; and you’re able to get information to forecasters much more quickly,” said Webster of Harris, which developed the GOES-R ABI for NOAA.

That increased capability will assist not only with forecasting weather on Earth, but also in space.

“Space weather is becoming more important as electronics and satellites become more and more embedded in our society,” said Dorian of Vencore, which is working with NOAA on GOES-R in a systems engineering capacity.

Destructive solar storms represent a growing threat to satellite operations and communications.

“GOES-R gives us better capability to monitor solar activity, which is critical to the Intelligence Community because satellites can be impacted as a result of solar wind,” Dorian added.

Another notable instrument aboard GOES-R satellites is Lockheed Martin’s Geostationary Lightning Mapper (GLM), a sensor that can detect and measure lightning activity continuously.

“This is the very first lightning sensor from space that’s in geostationary orbit,” said Dr. Allan Weiner, senior scientist in charge of the GOES-R ground processing system at Harris. “This particular sensor in combination with the ABI is going to be very exciting because we’re going to learn all-new information from it.”

Historically, meteorologists have forecast storms based on cloud formation and rainfall. Measuring lightning activity alongside those traditional inputs adds another dimension to weather forecasting that will make it easier to identify whether storms are escalating or de-escalating. GLM images the Earth at a rate of 500 frames per second, then performs onboard image processing in the form of automated change detection. The resulting data is especially promising for forecasting tornadoes.
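GLM’s onboard detection logic is far more sophisticated than anything that fits in a few lines, but the basic idea of flagging brief brightness changes against a running background can be sketched simply. The following Python/NumPy toy is an illustration only; the update rule, thresholds, and array shapes are assumptions, not Lockheed Martin’s flight algorithm:

import numpy as np

def detect_transients(frames, alpha=0.05, k=5.0):
    """Flag pixels that briefly brighten well above a running background estimate.

    frames: sequence of equally shaped 2-D arrays, one per exposure.
    Returns a list of (frame_index, [(row, col), ...]) for frames with candidate events.
    """
    events = []
    background = frames[0].astype(float)      # running estimate of the static scene
    noise = np.ones_like(background)          # running estimate of typical frame-to-frame jitter
    for t, frame in enumerate(frames[1:], start=1):
        residual = frame.astype(float) - background
        hits = residual > k * noise           # much brighter than the recent background
        if hits.any():
            events.append((t, list(zip(*np.nonzero(hits)))))
        keep = ~hits                          # exclude flashes so they don't pollute the background
        background[keep] += alpha * residual[keep]
        noise[keep] += alpha * (np.abs(residual[keep]) - noise[keep])
    return events

# Tiny demo: a steady 3x3 scene with a single bright flash in frame 2.
scene = np.full((3, 3), 100.0)
flash = scene.copy()
flash[1, 1] = 160.0
print(detect_transients([scene, scene, flash, scene]))  # one candidate event at pixel (1, 1) in frame 2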

“Right now the accuracy of predicting tornadoes is quite terrible. Even with all the information [meteorologists] have, it’s on the order of 60 percent of the time that they’re wrong,” said Dr. Samantha Edgington, Lockheed Martin’s chief GLM scientist.

Weather forecasters typically rely on radar to identify tornadoes—which often leads to missed tornadoes when radar coverage is poor.

“As you can imagine, if you live in a place where there are tornado warnings often, and more than half of the time they’re wrong, eventually you stop paying attention to them,” Edgington continued. “The goal of lightning data is to not only detect those tornadoes that are missed because of poor radar coverage, but also to make tornado predictions more accurate so that when the National Weather Service says a tornado is coming, people will actually listen and do something about it.”

Bridging the Weather Gap

Despite the advent of new satellite systems like JPSS and GOES-R, the U.S. Government Accountability Office (GAO) says the country is facing an “imminent satellite data gap.”

“Federal agencies are currently planning or executing major satellite acquisition programs to replace existing polar and geostationary satellite systems that are nearing the end of, or are beyond, their expected life spans,” the GAO reported to Congress in early 2017. “However, these programs have troubled legacies of cost increases, missed milestones, technical problems, and management challenges that have reduced functionality and delayed launch dates. As a result, the continuity of weather satellite data is at risk.”

Of special concern, according to the GAO, are aging polar satellite systems the Department of Defense (DoD) operates. Not only has DoD been slow to plan and launch replacements, it said, but the department has also been plagued with misfortune. For example, its newest weather satellite, Defense Meteorological Satellite Program (DMSP)-19, launched in 2014 but experienced a power failure in 2016 and was subsequently lost.

“There’s a gap that’s coming,” Webster said. “The military has acknowledged that, and now they’re trying to figure out how to fill it.”

One potential solution represents yet another shared interest between weather and GEOINT: commercial data sources.

“We have made it clear to the commercial world that we are very interested [in commercial weather data],” Stoffler said. “Within DoD, to maintain our own capability that covers the entire globe is a challenge. It costs a lot of money to do that. And frankly, we’ve been relying a lot upon international players … [that] are now being replaced by Russian and Chinese capabilities that we legally can’t use—and wouldn’t use even if we could.”

Shaina Johl, one of Spire’s engineers, inspects an early Lemur-2 satellite model while Joel Spark, co-CTO at Spire, looks on from outside the clean room. Lemur-2 satellites are among the 40 cubesats Spire currently has in orbit. The company was awarded NASA’s first commercial weather contract in 2016. (Photo credit: Spire)

Bridenstine is among commercial weather data’s biggest advocates. “I’ve been working on … encouraging the Department of Defense and other government agencies—NOAA, specifically—to purchase commercial space-based weather data,” he said. “A lot of commercial entities are launching satellites to furnish this data because private industry has signaled demand for it. Transportation companies, agriculture companies, and insurance companies all are interested in gaining a competitive advantage by being able to better predict the weather. The question is: Will the government purchase the same data to improve our weather prediction capabilities?”

Bridenstine co-sponsored the bipartisan Weather Research and Forecasting Innovation Act of 2017 that President Donald Trump signed in April, giving NOAA permission to explore, test, and purchase commercial weather data.

“NOAA already is conducting a pilot project to test and validate that data,” Bridenstine continued. “The next step is to have the Department of Defense do the same thing, and we’re going to accomplish that through the National Defense Authorization Act and defense appropriations.”

Of greatest interest is commercial GPS radio occultation (GPS-RO) data, which is being furnished by companies like Spire and PlanetiQ. Spire, which was awarded NOAA’s first-ever commercial weather contract in September 2016, already has 40 cubesats in orbit, with plans to eventually have more than 100. PlanetiQ plans to have a constellation of 12 to 18 microsatellites in orbit by the end of 2019, the first two of which are expected to launch in summer 2018.

“GPS radio occultation is pure physics,” said Chris McCormick, PlanetiQ’s chairman, founder, and former CEO. “It’s refraction. When you see the sunrise and sunset, the reds, oranges, and yellows are light being refracted, or bent, by the atmosphere … GPS signals also get bent by the atmosphere.”

When the atmosphere bends GPS signals, it delays them. Measuring the delay allows scientists to deduce the makeup of the atmosphere, including its temperature, pressure, and—most importantly—moisture.
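A common textbook way to relate that retrieved atmospheric state to moisture is the Smith–Weintraub approximation for refractivity, which splits the signal into a “dry” term (pressure and temperature) and a “wet” term dominated by water vapor. The short Python sketch below illustrates that relationship only; it is not PlanetiQ’s or Spire’s retrieval chain, and the sample numbers are invented:

def refractivity(p_hpa, t_kelvin, e_hpa):
    """Smith-Weintraub approximation for atmospheric refractivity N.

    p_hpa:    total pressure in hectopascals
    t_kelvin: temperature in kelvin
    e_hpa:    water vapor partial pressure in hectopascals
    The wet term scales with e / T^2, which is why occultation profiles are
    especially sensitive to how much moisture the atmosphere holds.
    """
    dry = 77.6 * p_hpa / t_kelvin
    wet = 3.73e5 * e_hpa / (t_kelvin ** 2)
    return dry + wet

# A warm, humid boundary layer versus the same layer with half the moisture.
print(round(refractivity(1013.0, 300.0, 25.0), 1))  # ~365.6
print(round(refractivity(1013.0, 300.0, 12.5), 1))  # ~313.8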

“The next day’s weather, or the next week’s weather, is created in the ocean by the sun-ocean interface,” McCormick explained. “If you know how much water vapor is in the atmosphere, and what the variability is of the temperature of that water vapor, it’s much easier to predict where clouds will form, when, and for how long.”

The more measurements one has, the more accurate the forecast. “It’s not the size of the sensor, but the number of sensors, that drives value,” Platzer said. “That’s why companies like Spire can make a difference.”

Computing and Communication

What ultimately will unlock the next generation of weather forecasting are computing and communication, both of which will enable a new order of GEOINT-weather integration.

The observations of next-generation weather satellites will be rendered useless without sufficient processing power to interpret them. Quantum computing is one likely solution; artificial intelligence and machine learning offer another.

“DoD, specifically, has been working on service-enabling weather data to be able to get it to organizations like NGA … so [analysts] can search for patterns that they can then extract intelligence from,” NGA’s Keyser said. “Because we don’t have time for somebody to sit and look at a wind gauge, for example, we need to be able to do machine-to-machine processing that frees up the analyst to actually think about problems instead of just looking at them.”

Which leads to forecasting’s other major opportunity: communication.

“The weather community needs to do a better job of being less esoteric,” Platzer said.

Echoed Biltgen, “Generally, people don’t really understand the weather. The forecaster comes on TV with a map that has triangles and half-moons and ‘high pressure’ and ‘low pressure,’ but all anyone really wants to know is: ‘Do I need a jacket and an umbrella?’”

Civilians, warfighters, first responders, and intelligence analysts alike care less about weather science than about weather impacts, which can be understood and communicated better with the assistance of GEOINT. For example, the National Center for Atmospheric Research is testing technology that marries ground data with atmospheric predictions to give wildland firefighters real-time, location-based insights.

NWS is doing similar research via its Weather Ready Nation (WRN) program, whose charge is exploring new ways to present and disseminate weather information so decision-makers and citizens will take it seriously.

“One of our sterling successes has been a storm-surge inundation map,” said WRN spokesperson Douglas Hilderbrand. “A few years ago, the way we quantified and communicated storm surge was very complicated; you had to go to the National Hurricane Center and the National Ocean Service, consult a tide chart, and incorporate on your own wave and elevation information to answer the basic question: Is my house vulnerable to storm surge? Now we have a very intuitive map that incorporates all the storm surge science and allows people to determine their home’s risk in a much more visual way … We’re trying to make NOAA information more personal and more applicable … because that’s when people listen and take action.”

And in a world where extreme weather events are increasingly common, action will be weather forecasting’s most important output—not only for citizens in the path of destructive storms, but also for military commanders and intelligence officers seeking strategic advantage over powerful enemies.

“If you’ve got a brigade combat team commander with a rotary wing assault force that needs a 500-foot cloud ceiling and two-mile visibility, they need to know if there’s only a 300-foot cloud ceiling and a quarter-mile visibility so they can make decisions in the most informed way possible,” Spendley concluded. “They’re not interested in knowing that there’s a high chance of rain today; they’re only interested in what the effects of weather will be on their mission, either where they’re operating or where they’re projected to operate. That’s where the rubber meets the road.”

Featured image: Beginning of a tornado on a deserted highway in the Oklahoma panhandle.

GEOINT for Policing http://trajectorymagazine.com/geoint-for-policing/ Wed, 01 Nov 2017 13:07:34 +0000 http://trajectorymagazine.com/?p=35082 Software, sensors, and other location-based technologies offer opportunities and challenges for law enforcement

The traditional who-what-when-where crime report is starting to acquire many more details—from the proximity of the nearest ATM or street light to the occupational, educational, or religious significance of the date.

These are the kinds of data points and insights any cop on the scene would notice but that could easily get lost in the system.

By combining increasingly detailed databases with powerful software that can detect patterns almost as fast as reports are filed, police departments and other first responders can deploy their resources more efficiently, be more accountable to citizens, and perhaps even develop a sense of where crime is likely to occur next.

But there’s also a risk of confusion and unnecessary expense as busy police departments try to assess pitches from geospatial intelligence (GEOINT) firms.

“We’re almost getting flooded by them,” said Police Lt. Joseph Flynn, assistant commander of the Fairfax County Police Department’s Criminal Intelligence Division and deputy director of the Northern Virginia Regional Intelligence Center. “It’s still so new, and what do we want?”

Prescient Analytics

Applying GEOINT to policing begins with the basics of incident reports and 911 calls, explained Robert Cheetham, CEO of the Philadelphia firm Azavea. Its subsidiary HunchLab performs some of the leading work in next-generation policing software.

HunchLab models incorporate “a whole range of other things,” Cheetham said. He listed nearby amenities and businesses (transit stops, ATMs, liquor stores, even lighting) as well as temporal factors such as the time of day, the day of the week, whether school is in session, and whether it is a holiday.

In each municipality, HunchLab builds a model that incorporates these inputs and calculates the potential harm of types of crime using the RAND Corporation’s “Cost of Crime” calculations. The results—at an annual subscription cost of $20,000 to $80,000 depending on municipality size, with custom pricing for the largest cities—not only illuminate crime trends but offer a hint of where they’re likely to head.

“What we’re doing is not prediction,” Cheetham said. “It’s more of a forecast of a difference in risk.”
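HunchLab’s models are proprietary, but the general recipe described above (forecast expected counts per crime type for each grid cell, weight them by a harm or cost figure, then rank cells) can be illustrated with a short Python sketch. The class, the weights, and the numbers below are invented placeholders, not RAND’s published cost figures or Azavea’s code:

from dataclasses import dataclass

# Illustrative harm weights per incident type (placeholders, not RAND's "Cost of Crime" values).
HARM_WEIGHTS = {"homicide": 1000.0, "robbery": 40.0, "burglary": 10.0, "theft_from_vehicle": 2.0}

@dataclass
class CellForecast:
    cell_id: str            # grid cell identifier
    expected_counts: dict   # crime type -> expected count for the coming shift

def harm_weighted_risk(cell: CellForecast) -> float:
    """Collapse per-type forecasts into a single comparable expected-harm score."""
    return sum(HARM_WEIGHTS.get(kind, 1.0) * n for kind, n in cell.expected_counts.items())

def rank_cells(cells, top_k=5):
    """Return the top_k cells by expected harm: candidate areas for directed patrol."""
    return sorted(cells, key=harm_weighted_risk, reverse=True)[:top_k]

cells = [
    CellForecast("A1", {"burglary": 0.4, "theft_from_vehicle": 1.2}),
    CellForecast("B3", {"robbery": 0.3, "homicide": 0.01}),
]
print([c.cell_id for c in rank_cells(cells, top_k=2)])  # ['B3', 'A1']

Ranking cells on relative expected harm rather than a hard prediction is consistent with Cheetham’s framing of the output as a forecast of a difference in risk.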

The Chicago Police Department (CPD) ranks as HunchLab’s highest-profile client on account of the high rate of shootings across the city. CPD began deploying HunchLab’s system in January 2017; by mid-year, the department had brought it to the six of its 25 districts that account for 25 percent of the city’s shootings.

“We’ve seen what I’ll say are promising results,” said Jonathan Lewin, chief of CPD’s Bureau of Technical Services. In the first two districts to get this upgrade, shootings have so far dropped by 33 percent, well above the 14 percent drop citywide.

Lewin added the department is using the data it collects not just to dispatch officers faster but to speed actions by other parts of city government.

“One of the things we looked at was 311 calls for streetlights out,” he said. “Does that tend to correlate with nighttime shootings?”

As a result, Lewin said, the city is now prioritizing its deployment of connected LED streetlights “in some of the areas where we think it might have the greatest impact on reducing crime.” 

However, if law enforcement agencies don’t clean up their data before implementing forecasting technologies, they risk being led astray.

“Not having the proper protocols and data governance policies to prevent incomplete and inaccurate data entry leads to the issue of ‘junk in, junk out,’” Jody Weis, public safety lead at Accenture, warned via e-mail. “The finest analytic system, with the absolute best algorithms, will be useless if the data it is analyzing isn’t accurate.”

Jeff Gallagher, a GIS specialist with the Fairfax County Police Department, advised cultivating relationships with local government information technology and GIS professionals.

“Get out of the little pigeonhole and see the amount of data your county has,” Gallagher said.

Unblinking Eyes

In addition to information derived from officers, citizens, and databases, many police departments also have unblinking eyes on their communities in the form of automated sensors that collect real-time data for quick analysis.

“If it’s collecting a location, we can bring it in,” said John Beck, Esri’s industry manager for police. Esri’s GIS software can incorporate data from license-plate reading sensors, ShotSpotter gunfire-detecting microphones, officers’ body-worn cameras, and GPS anklets worn by offenders.

Data from sensors such as the ShotSpotter gunshot detection and location service can be integrated with other data into GIS systems for analysis by police departments. (Image credit: ShotSpotter)

Such data integration can add to a department’s budget and can encounter resistance from citizens. For instance, Lewin said CPD cameras got a better reception in communities after the department switched to a less obvious model that didn’t have continuously flashing blue lights.

But they do work.

“People are now actually catching criminals in the act based on the predictive analysis of all this historic and real-time data,” Beck said.

However, Beck continued, with the deluge of new information also comes the risk of overloading officers with data that should first pass an analyst’s eyes.

“We’re seeing a lot more real-time crime centers in the U.S. and beyond,” Beck said, complimenting CPD for setting up these centers in individual districts. That, however, should not come at the cost of taking officers off the street.

Lewin said CPD hired eight civilian analysts to embed in these centers. It also had representatives from HunchLab and security systems firm Genetec go on ride-alongs with officers to learn how to refine their user interfaces.

An existing set of analog sensors—as in, the eyes and ears of citizens—remains essential.

“Don’t become so over-reliant on [technology] that you become disconnected from the community,” said Sean Whitcomb, a sergeant and spokesman with the Seattle Police Department (SPD). He pointed to SPD’s regular incorporation of citizen input into its SeaStat crime-statistics program. “The value is increased exponentially because we supplement our own data with real-time feedback from the community.”

A Balancing Act

Collecting new data and building predictive models can also help police agencies increase their accountability to citizens.

“When I was a cop, we didn’t share any information with the public,” Beck said. “Now, police are sharing information about all of their activity, including use of force and police-involved shootings, and making that data open to the public.”

He pointed to the Philadelphia Police Department, whose website documents officer-involved shootings and allows visitors to compare the locations of those incidents with the locations of gun crimes across the city.

Public desire for accountability is another factor driving law enforcement agencies to deploy GEOINT.

In Chicago, the city’s Independent Police Review Authority now maintains a searchable use-of-force database, including audio and video from officers’ body cameras. And in Seattle, a 2011 Department of Justice investigation that found fault with SPD’s collection of data led the department to partner with Accenture to build a data analytics platform.

But data collection in policing can also generate public dissatisfaction with police departments. In 2016, citizens were angered to learn SPD had purchased Geofeedia’s social media analysis software two years earlier.

Weis and Beck each pointed to social media monitoring as the next frontier in the use of GEOINT by police. But after SPD’s attempts to glean intelligence from status updates went awry, the resulting blowback led Facebook and Twitter to yank Geofeedia’s access to their networks.

“There’s a very fine line between government surveillance and spying,” SPD’s Whitcomb said, adding the department now focuses on the social postings of individual suspects. “Something causes more harm than good if it erodes public trust and confidence.”

Said CPD’s Lewin, “Community partnership requires that we engage our stakeholders, and part of that is being as transparent as possible.”

Jay Stanley, senior policy analyst for the American Civil Liberties Union, emphasized police departments and the GEOINT industry should maintain transparency to help “reduce bias and improve trust with communities.”

Cheetham echoed Stanley’s point.

“I want to be on the right side of history on this,” he said.

More Research Needed

Cheetham and Stanley separately noted the need for more published research on the effectiveness of GEOINT and predictive policing.

For example, while the Police Executive Research Forum has spent years investigating law enforcement best practices, it has yet to study this technology, Director of Communications Craig Fischer wrote via e-mail.

A former police officer and current academic concurred via email. “The independent empirical research is limited and equivocal,” wrote Dr. Kim Rossmo, director of the Center for Geospatial Intelligence and Investigation at Texas State University.

Lewin said CPD is now working with the University of Chicago’s Crime Lab to research how its initial deployment of predictive policing technology has fared.

But, he added, the real-world consequences of police work make it difficult to run a classic experiment in which a control group is left out of a technological advance: “If you have something that could be effective, you want to use it.”

Situational Analysis http://trajectorymagazine.com/situational-analysis/ Wed, 01 Nov 2017 13:06:15 +0000 http://trajectorymagazine.com/?p=35087 Satellite imagery, drones, advanced analysis, and other emerging technologies are quickly changing the face of firefighting

The use of geospatial intelligence (GEOINT) tools such as remote sensing and data visualization is on the rise in the firefighting community, and the future of the profession will be greatly influenced by ongoing technological advances.

Kate Dargan, former California State Fire Marshal, co-founder of Intterra, and a USGIF board member, reflected on her early career as an air attack officer fighting wildfires in her home state.

“I was the ‘eye in the sky’ translating what I was looking at from several thousand feet to the firefighters on the ground,” she said, recalling later trying to capture video from the air using a handheld camera.

Today, commercial satellite imagery as well as LiDAR, hyperspectral, and infrared imagery collected from manned and unmanned planes could all be part of a firefighter’s toolkit. When paired with powerful data analysis platforms and mobile apps, GEOINT offers first responders greater situational awareness and a better understanding of the communities they serve.

A Rapid Evolution

“Many firefighters may only see the world through the windshield of the fire truck,” Dargan said, but noted available technologies and firefighter expectations are rapidly evolving.

For example, she said, fire chiefs may understand the basics of infrared technology but not yet be conversant in the various types of infrared and their corresponding capabilities. Regardless, Dargan said she is seeing the increased presence of unmanned aerial systems (UAS) at industry trade shows and is aware of more and more departments purchasing small drones.

For the last 100 years, firefighters have used paper and pencil to diagram buildings and map areas of wildfire risk. Modern fire departments employ geospatial technology to develop a standard of cover, deploy resources more efficiently, perform risk assessment, and pinpoint potential problem areas, according to Talbot Brooks, firefighter and director of the Center for Interdisciplinary Geospatial Information at Delta State University in Mississippi. Investment in geospatial tools supports risk reduction by allowing departments to plan, before an emergency, what equipment to use and where to position it. But improving response and mitigating risk also depends on properly integrating and manipulating geospatial data.

If additional staffing, stations, or [equipment] are needed, a fire chief has the [geospatial] evidence needed to justify a budget request.
 
— Talbot Brooks, Delta State University

Dargan said the future of firefighting technology includes the networking of disparate imagery derived from different sensors and organizations. That is what her team strives for with its subscription-based Situation Analyst platform, which pulls all of that imagery together in one place and serves it to each person in an organization, tailored to his or her needs.

David Holmerud, a fire service management consultant and former deputy fire chief in Solana Beach, Calif., emphasized the importance of asking the correct questions of the data at the right times: “Is there something more we can do to change the outcome of the responses? Of these structural fires, how many were contained to the original building? What difference did what we do make?”

Knowing how to draw the right conclusions from the data is the key to advancing the capabilities of the modern-day firefighter.

Startup Descartes Labs, founded by a group of scientists from Los Alamos National Laboratory, is pairing satellite imagery with machine learning to help draw better conclusions for firefighting. In a company blog post titled “Fighting Wildfires Using a Cloud-based Supercomputer,” research scientist Daniela Moody writes: “The Descartes Labs Platform provides us with a view of the planet that no one has ever seen before—not only is it multi-sensor, multi-resolution, and multispectral—it’s also a multi-decadal historical lens.”

This information helps to ascertain damage from fires over time and can inform better decisions about how to fight fires in the future. The platform enables users to extract information not visible to the naked eye, ingest far more data than could be processed manually, and apply machine learning algorithms that draw on numerous data points.

“During the course of a fire, especially one with limited allocated resources, satellite imagery analysis could better direct ground crews to hotspot and containment areas,” Moody wrote.
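One openly documented example of this kind of multi-temporal analysis is burn-severity mapping with the differenced Normalized Burn Ratio (dNBR), computed from near-infrared and shortwave-infrared reflectance before and after a fire; the Soberanes fire map later in this article is a burn-severity product. The NumPy sketch below illustrates the index itself, with invented sample values, and should not be read as Descartes Labs’ pipeline:

import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio from near-infrared and shortwave-infrared reflectance."""
    nir = nir.astype(float)
    swir = swir.astype(float)
    return (nir - swir) / np.clip(nir + swir, 1e-6, None)

def burn_severity(nir_pre, swir_pre, nir_post, swir_post):
    """dNBR: pre-fire NBR minus post-fire NBR; larger values indicate more severe burns."""
    return nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)

# Toy 2x2 scene: the top row burns (vegetation's high NIR collapses), the bottom row does not.
nir_pre = np.array([[0.45, 0.50], [0.48, 0.47]]); swir_pre = np.array([[0.20, 0.18], [0.22, 0.21]])
nir_post = np.array([[0.15, 0.14], [0.46, 0.45]]); swir_post = np.array([[0.35, 0.36], [0.23, 0.22]])
print(burn_severity(nir_pre, swir_pre, nir_post, swir_post))  # high dNBR in the burned top row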

Building Partnerships

Communication among the public safety community is also important when adopting new technologies. Holmerud recommends initiating and maintaining an open dialogue with city planners who may have already gathered and even visualized valuable data fire services could potentially tap into.

“For example, when a new subdivision is planned, many different data elements are available as a result of the approval process,” Holmerud said. “These data sets, ranging from street layouts to location of underground utilities, can be used to provide the basis of fire department pre-plans and updates to response maps and dispatching procedures. It can be a time saver as well as provide accurate data.”

Dargan encourages fire chiefs to participate in wide-ranging discussions that include police departments, community health workers, public works, transportation officials, and other civic departments. These conversations will introduce fire service leaders to technologies not designed specifically for firefighting, but that could be adapted for their missions.

The Bolivar County Volunteer Fire Department concludes a live fire training at a donated structure in Benoit, Miss., in February 2009. (Photo credit: Delta State University)

Holmerud, who is also an instructor at the National Fire Academy in Emmitsburg, Md., touts the value of collaborating with local colleges and universities on projects that could be of benefit to both parties. For example, the city of Wilson, N.C., has done significant work in mapping layers of data such as water flow, utility shut offs, and the number of people potentially living in a given structure. The city of Wilson makes these maps available to Holmerud’s students, who manipulate the layers behind the scenes by changing various conditions and factors. This activity enables students to go back to their communities with a better understanding of where information comes from and who they need to work with to ensure adequate resiliency and response planning.

Public-private partnerships could also pave the way toward better technological support for fire services. In the Phoenix, Ariz., area, 27 fire departments broke through jurisdictional boundaries to integrate their response to 911 calls. With a GPS unit now in every fire truck, the team in the best position to respond is dispatched to an incident, regardless of geographic boundaries. This new approach has resulted in shorter response times throughout the area.

Eric Prosser, information technology officer for the Santa Clara County Fire Department in California, points to the multi-agency coordination that was necessary for Santa Clara to host Super Bowl 50 in 2016. According to the NFL, 1.1 million people attended the game and related events.

Prosser’s iMAP Team won a USGIF Award in 2016 for providing the Santa Clara County Multi-Agency Coordination Center with a GEOINT-based decision and situational awareness platform. The iMAP team developed an enterprise GEOINT system used to manage all fire and medical service operations throughout Super Bowl 50. In collaboration with Dargan’s Intterra, the developers generated the ability to integrate 911 computer-aided dispatch information, map special events throughout the region, monitor resource availability, view GIS layers to include near real-time satellite imagery, and analyze data trends.

“The results of iMap enabled us to be better prepared for future special events and large-scale incidents, and to have situational awareness at both the department and operational area levels,” Prosser said. “This additional data provides us with useful information on a daily basis within the Silicon Valley.”

The Geospatial ROI

Holmerud said although many fire departments are slow to officially adopt GEOINT, he is beginning to see volunteer departments systematically use smartphone apps to gain a sense of who’s responding as well as their locations and estimated arrival times. He believes these kinds of tools will make departments hungry for more geospatial information.

“We’re starting to see the value of [geospatial] intelligence coupled with response software—starting to see what they can do and look at the possibilities,” Holmerud said.

The realization that geospatial technology can be a force multiplier when it comes to getting the most out of existing resources will also help drive adoption, according to Brooks.

This map by Descartes Labs shows the burn severity index for the 2016 Soberanes fire on California’s Monterey peninsula. (Photo credit: Descartes Labs)

“If I want budget to go after something, now I can show it,” Brooks said of his ability to use data to test and prove a theory. “It’s not just a supposition. [Geospatial tools are] a good way of separating fact from fiction.”

Developing a standard of cover using GEOINT provides a data-driven solution for understanding where departmental strengths and weaknesses are located geographically.

“If additional staffing, stations, or [equipment] are needed, a fire chief has the [geospatial] evidence needed to justify a budget request,” Brooks said. “Supposition and anecdote are removed from the process and political leadership can have more confidence in decisions that often cost (or save) millions of dollars.”

According to Dargan, there are three main areas in which fire departments can invest: equipment, people, and information.

“One of the key messages we’re trying to communicate is that information is a resource and a hard commodity that should be planned for and used just like equipment and [people],” she said. “The return on investment for data is or will pan out to be higher than it is for the other two types of resources.”

For example, the amount of data a fire department can acquire and put to use through remote sensing is not available through any other method except boots on the ground evaluating each building and area of risk.

“We’ll never have enough staff to send feet up every driveway in California to talk to every home or business owner,” Dargan said.

Those data-enabled decisions could lead to less costly emergency response with less loss of life and property, she added.

Imagine a firefighter being able to do a voice search while combating a wildland or structural fire, Dargan said. They could say, for example: “Show me houses with wooden roofs and give me their addresses.”

This type of timely access to geospatial data will enable firefighters to more effectively respond to emergencies and will significantly improve their ability to predict events and therefore protect more property and save more lives.

Featured image: Fire glows on a hillside in Napa, Calif., October 9, as multiple wind-driven fires whipped through the region. (Credit: Josh Edelson / AFP / Getty Images)

Enabling Rapid Response http://trajectorymagazine.com/enabling-rapid-response/ Wed, 01 Nov 2017 13:05:43 +0000 http://trajectorymagazine.com/?p=35092 Geospatial intelligence proves a powerful tool for paramedics

In the Emergency Operations Center, a dispatcher takes a bystander’s cellphone call about a car crash on a poorly marked rural road. The report prompts the dispatcher to send regional air medics as well as the nearest local ground EMS crew. Next door, EMS managers analyze response statistics for a rapidly growing residential area.

Across town, an EMS crew teaches citizen CPR in a neighborhood with a high cardiac arrest rate. After training, a smartphone app will be integrated with EMS dispatch, so bystander CPR can be started in public spaces before EMS arrives. All of these activities, some long established and others cutting-edge, rely on geospatial intelligence (GEOINT) data and technology to save lives, yield better patient outcomes, and improve agency efficiency.

Early EMS operations used “static deployment,” with a set number of vehicles assigned to permanent stations. In the 1980s, increased call volumes without equal investment in EMS systems led to system status management, which was intended to optimize coverage based on temporal patterns of use.

The advent of computer-aided dispatch and automatic vehicle locator technology allowed dispatchers to determine the closest available ambulance for a call, but it took near real-time analysis and predictive analytics to make the deployment and use of resources truly effective. As economic stresses mandate that services accomplish more with fewer resources, dynamic deployment has become a mainstay in providing efficient and cost-effective coverage.

Dynamic Deployment

“In dynamic deployment, ambulances are directed toward the highest uncovered demand at that moment in time. Some call it ‘chasing the blob,’” said Dale Loberger, an active EMS member and a developer at Bradshaw Consulting Services, which developed the Mobile Area Routing & Vehicle Location Information System (MARVLIS). “Demand is constantly being re-evaluated in near real-time and resources are being matched to that demand as their level of availability changes.”

The MARVLIS system models the probability of future call locations based on historic data, near-real-time inputs such as dispatch and response times, and factors such as traffic conditions. The automated forecast is modeled through Esri’s ArcGIS platform and displayed as a mapping interface. Combined, MARVLIS GPS data, GIS modeling, and wireless communications allow EMS to “have the right units at the right places at the right times,” Loberger said.
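MARVLIS itself is proprietary, so the following Python sketch is only a toy version of “chasing the blob”: estimate expected calls per grid cell for the current hour of the week from historical counts, then post available units to the highest-demand cells. The cell IDs, counts, and ranking rule are invented, and a real system would also fold in live inputs such as traffic and unit status:

from collections import Counter

def post_ambulances(history, hour_of_week, n_weeks, n_units):
    """Pick posting locations for the current hour by ranking grid cells on historical call rate.

    history: iterable of (cell_id, hour_of_week) tuples from past incidents
    n_weeks: number of weeks the history covers, to turn counts into per-hour rates
    n_units: number of ambulances available to post
    """
    counts = Counter(cell for cell, hr in history if hr == hour_of_week)
    rate = {cell: c / n_weeks for cell, c in counts.items()}  # expected calls this hour, per cell
    ranked = sorted(rate.items(), key=lambda kv: kv[1], reverse=True)
    return [cell for cell, _ in ranked[:n_units]]

# Two weeks of toy history; hour 18 of the week is busiest in cell C7.
history = [("C7", 18), ("C7", 18), ("C2", 18), ("C9", 18), ("C7", 3)]
print(post_ambulances(history, hour_of_week=18, n_weeks=2, n_units=2))  # e.g., ['C7', 'C2']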

The lower response times and decreased distances enabled by systems such as MARVLIS and Optima Predict from Intermedix help save lives in the subset of patients that must be reached in four minutes or less to survive. Jersey City Medical Center EMS doubled its return of spontaneous circulation rate in cardiac arrest victims after integrating MARVLIS into its operations in 2012.

A University of Pittsburgh team modeled fatal vehicle crash rates in Pennsylvania from 2013-2014 and distances from trauma resources using Fatality Analysis Reporting System data. They discovered a theoretical 12.3 percent decrease in mortality if two medevac units were to be reassigned to the higher-incidence areas.

“There was a big disparity for these patients, depending on where they live,” said Joshua Brown, a general surgical resident at the university medical center and lead investigator on the study. “It’s only recently that trauma systems analysts have begun to incorporate GIS tools into their work to achieve improved outcomes. That we could potentially reduce mortality by relocating only two helicopter units was a very powerful finding.”
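The Pittsburgh team’s methodology is more involved, but the underlying question (which candidate base locations cover the most incidents within a fixed reach) can be illustrated with a small brute-force Python sketch. The coordinates, candidate sites, and 100-kilometer reach below are invented for illustration and are not the study’s data:

from math import radians, sin, cos, asin, sqrt
from itertools import combinations

def haversine_km(a, b):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (a[0], a[1], b[0], b[1]))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))

def best_bases(crashes, candidates, k=2, reach_km=100.0):
    """Brute-force the k candidate bases that cover the most crash sites within reach_km."""
    best_combo, best_covered = None, -1
    for combo in combinations(candidates, k):
        covered = sum(1 for c in crashes if any(haversine_km(c, b) <= reach_km for b in combo))
        if covered > best_covered:
            best_combo, best_covered = combo, covered
    return best_combo, best_covered

crashes = [(40.44, -79.99), (41.41, -75.66), (40.27, -76.88), (41.24, -77.00)]
candidates = [(40.50, -80.00), (41.00, -76.00), (40.30, -76.90), (41.30, -77.10)]
print(best_bases(crashes, candidates, k=2, reach_km=100.0))  # the pair of bases covering the most crashes

Real relocation studies weigh flight times, terrain, weather minimums, and hospital capabilities rather than straight-line distance alone.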

Community Engagement

Focusing resources strategically to improve patient outcomes involves more than ambulance placement. According to the American Heart Association, more than 350,000 out-of-hospital cardiac arrests occur in the United States each year. Only 5.5 percent of these victims survive to hospital discharge. Improving survival rates from sudden cardiac arrest is a holy grail among the EMS profession, and providers are combining geo-location data, GIS modeling, and smartphone apps in this quest.

In Mississippi, American Medical Response analyzed new data for geospatial patterns, looking for hotspots associated with neighborhood type, rural versus urban patterns, and similar factors. In the Jackson metropolitan area, they discovered an association between citizen CPR/Automated External Defibrillator (AED) training and bystander CPR rates in certain neighborhoods. Since bystander CPR/AED use can double or triple the chances of surviving cardiac arrest, AMR increased outreach training to the areas with high arrest and low training rates. Improved bystander CPR and increased survival rates followed.

PulsePoint AED is a crowdsourcing app that allows users to report the location of AEDs in their community. (Image credit: PulsePoint)

“So much can happen during the critical minutes of an emergency,” explained Michael Arinder, M.D., director of clinical services for the south region with American Medical Response. “We recognized that we had the ability to see what happens in the moments before the arrival of trained personnel and we decided to use that to better serve the community. We knew that if it saved only one additional life, it was worth it.”

This focus on bystander CPR/AED inspired PulsePoint to create a smartphone app suite to bring citizen rescuers to the cardiac arrest victim. The PulsePoint Respond app sounds an alert when a cardiac arrest occurs in a public place. Users in the agency-defined notification area will see the victim’s location on a map. PulsePoint Respond incorporates data from PulsePoint AED, a crowdsourcing app that allows users to report the location of AEDs in their community. The AED location data is made available in PulsePoint Respond after being verified by local authorities. 

“PulsePoint is the marriage between technology and citizen engagement,” said PulsePoint spokesperson Shannon Smith.

To date, PulsePoint Respond has been activated more than 20,000 times and has more than 59,000 users.

911 for the Next Generation

Crowdsourced traffic information is another valuable geospatial tool that can benefit the EMS community. Genesis PULSE, a vehicular tracking system used for dynamic deployment, exchanges data on road closures and traffic conditions with navigation app Waze.

Data after the first year of information exchange revealed that in 62 percent of cases Waze obtained accident notification up to 4.5 minutes faster than 911 centers. Although the implications are unsettling, Waze data provides PULSE users an advantage in rapid deployment—if, as in all GEOINT use cases, the data is accurate.

All geospatial data requires accuracy to be useful, but in public safety, accuracy can make the difference between life and death. Leaders in the field consider this a primary public safety challenge.

“Geographic Information Systems, when coupled with first-responder missions, private industry, and public policy can improve operational understanding and help PSAPs (public safety answering points) create and maintain reliable, dispatchable address databases,” said Mike King, emergency call-taking and dispatch industry manager for Esri as well as a member of the National Emergency Number Association. “All three disciplines are necessary for true success.”

The Next Generation 911 (NG911) initiative, spearheaded by U.S. Department of Transportation, seeks to design an emergency communications architecture that will transcend current limitations. Wireless mobile devices, Voice over Internet Protocol telephoning, and other modern technologies have rendered the 911 call center system outmoded.

According to King, core GIS capabilities, wireless and broadband use, and 3D routing technology, particularly for indoors, will be incorporated into NG911, but the parameters and solutions are evolving with the initiative.

Startup RapidSOS hopes to end geo-location fuzziness with a database that seamlessly integrates with 911 call centers. A cellphone call to 911 will ping the RapidSOS database, and geolocation information will be supplied to the 911 center. In trials, RapidSOS provided more accurate geo-location information than the wireless carriers tested.

EMS relies increasingly on GEOINT to provide effective healthcare.

In the coming years, the technology will continue to evolve with the proliferation of predictive artificial intelligence and machine learning algorithms, according to Nikiah Nudell, chief data officer for The Paramedic Foundation and a board member of the National EMS Management Association.

“Geospatial intelligence has become a powerful worldwide tool for paramedic chiefs and the public health and safety officials they often work with,” Nudell said. “In an environment where limited resources are being used to respond to dynamic critical incidents, having full situational awareness from an historic and real-time perspective is powerful.”

Featured image: The MARVLIS system models the probability of future emergency call locations based on historic data, near-real-time inputs such as dispatch and response times, and factors such as traffic conditions. (Credit: Esri)

Providing Community ROI with Geospatial Tools http://trajectorymagazine.com/providing-community-roi-geospatial-tools/ Wed, 01 Nov 2017 13:04:23 +0000 http://trajectorymagazine.com/?p=35104 The logistical demands of providing emergency services to large crowds

The month of June brings the Wichita Riverfest to Sedgwick County, Kan. For more than a week, concerts, art shows, athletic events, and more draw crowds of up to several hundred thousand to enjoy themselves and support the community along the Arkansas River.

Handling the logistical demands of providing emergency services to large crowds, concentrated within a several-block radius, is the responsibility of Scott Hadley, director of Sedgwick County EMS. His agency provides all services for the 1,008-square-mile area.

“Riverfest requires extra coordination, along with the approximately 170 calls per day that are our normal operations,” Hadley explained, adding that the tools his agency invests in allow daily operations and special events to run more smoothly.

For daily operations, Sedgwick County EMS uses a proprietary computer-aided dispatch system along with the MARVLIS system to staff 15 posts throughout the county. The agency tracks and analyzes operational performance, call volume and type, cardiac arrest and survival rates, and financial performance metrics. GEOINT analysis is integrated into these metrics.

Sedgwick County also employs FirstWatch during Riverfest. FirstWatch provides real-time surveillance and analysis to warn agencies of trends and patterns in a selected area. It does this using “triggers,” a set of user-defined filter criteria tailored to the specific event. Various data sources can be integrated with FirstWatch, making it very useful for events such as the Super Bowl, large conferences, festivals, and more.

Using FirstWatch at Riverfest, Sedgwick County EMS sets a geo-fenced area within which the incident command is deployed. Bike, ATV, and other responder teams staff the event, and patients who need to be taken to the hospital are transferred to an assigned point at the periphery of the geo-fenced area.
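FirstWatch triggers are user-defined and their internals are not public, so the Python sketch below only illustrates one plausible shape such a trigger could take: count watched incident types that fall inside a geo-fenced polygon within a rolling time window and alert when a threshold is reached. The polygon test is standard ray casting, and every name and number here is an invented example:

from datetime import datetime, timedelta

def point_in_polygon(pt, polygon):
    """Ray-casting test: is the (x, y) point inside the polygon given as (x, y) vertices?"""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def trigger(incidents, fence, watched_types, window, threshold, now):
    """Fire when at least `threshold` incidents of the watched types occurred inside
    `fence` during the last `window`. Each incident is (timestamp, (x, y), type)."""
    recent = [i for i in incidents
              if now - i[0] <= window and i[2] in watched_types and point_in_polygon(i[1], fence)]
    return len(recent) >= threshold, recent

fence = [(0, 0), (10, 0), (10, 10), (0, 10)]  # simple square festival footprint
now = datetime(2017, 6, 10, 21, 0)
incidents = [(now - timedelta(minutes=m), (5, 5), "heat illness") for m in (3, 9, 14)]
print(trigger(incidents, fence, {"heat illness"}, timedelta(minutes=15), 3, now)[0])  # True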

Geospatial tools are critical to efficient EMS operations, even more so when everyday operations are complicated by a special event or disaster.

Hadley views acquiring and using these tools “not as a cost, but as an investment.” The return on investment for geospatial technology, he said, provides Sedgwick County’s residents with cost-effective, patient-centered emergency care.

Featured image: Sedgwick County Riverfest, 2016. (Photo credit: Sedgwick County)


Roadmap for Nationwide Geospatial Data Sharing http://trajectorymagazine.com/roadmap-nationwide-geospatial-data-sharing/ Wed, 01 Nov 2017 13:03:19 +0000 http://trajectorymagazine.com/?p=35111 GeoCONOPS is a guide to support homeland security, public safety, and emergency management

Luke Meyers, a planning coordinator with Seattle’s Office of Emergency Management, described himself as “a pig in mud” when he first learned about the Geospatial Concept of Operations (GeoCONOPS) at a conference in January.

He has since taken three of four available online GeoCONOPS courses.

GeoCONOPS, overseen by the Department of Homeland Security’s (DHS) Geospatial Management Office (GMO), is a strategic roadmap for national, state, local, private sector, and academic stakeholders to coordinate geospatial information, share data and tradecraft, and communicate in support of homeland security, public safety, and emergency management.

The roadmap is a guide for linking the geospatial data efforts of the 17 U.S. intelligence agencies, 22 DHS components, and the 50 states, 3,114 counties, and 78 data fusion centers throughout the country, in addition to other data producers in major cities. GMO does not seek to own or hold the data, but rather to validate data and sources, then direct users to them.

David Carabin, Bryan Costigan, Aaron Kustermann, and Jay Moseley, who lead data fusion centers in Massachusetts, Montana, Illinois, and Alabama, respectively, hope GeoCONOPS will soon mature to support an idea they call “SitRoom.”

SitRoom, according to Kustermann, would enable analysts at any of the nation’s 78 data fusion centers to learn, for example, that an individual stopped for a broken taillight in California is driving a car stolen from Minnesota, wanted for drug trafficking in Chicago, and suspected to be part of a terrorist cell in New York.

“GeoCONOPS is how we’re going to be able to share geospatial information,” Kustermann said. “It sets the standards for our being able to share [data]. Without it, the puzzle can’t be built.”

A Maturing Concept

Although the first version of GeoCONOPS was published eight years ago, public safety leaders like Kustermann and Meyers may have only learned of it recently or not be aware of it yet at all.

“It really hasn’t been publicized a lot, at least on the state and local level,” Meyers said.

Other leaders expressed some uncertainty as to which interoperability efforts fall under the umbrella of GeoCONOPS, which perhaps has too broad a definition for the far-reaching complexities of its mission.

“I’m not sure GeoCONOPS should be looked at as a specific program or policies to try to get to interoperability,” said James McConnell, assistant commissioner of strategic data for the New York City Office of Emergency Management. “Sharing—we’re doing a lot of that—but I’m not sure it falls under the title GeoCONOPS.”

This is a model for GeoCONOPS, which is overseen by the Department of Homeland Security’s Geospatial Management Office.

Yet when Hurricane Sandy struck New York and New Jersey in October 2012, the Federal Emergency Management Agency (FEMA) dispatched a GIS unit from Baltimore to assist in relief efforts. “They basically took a copy of our entire database, which we were happy to give them, as their base for working in New York,” McConnell said.

GeoCONOPS has its roots in 9/11, when first responders lacked the maps and data needed to navigate the labyrinth of the Pentagon. Four years later, first responders viewed the aftermath of Hurricane Katrina via commercial satellite imagery, but lacked the tools to communicate about what they were seeing.

“I think that’s really when people started to wake up to this concept of location as a critical element of their operations,” said Chris Vaughan, who was then deployed in support of FEMA’s Urban Search and Rescue Team providing on-the-ground geospatial support in New Orleans and is now the agency’s geospatial information officer.

The Hurricane Katrina disaster and others before it prompted a three-day meeting in Washington, D.C., of first responders, government, industry, and academia that generated a 2007 National Academies report titled “Successful Response Starts with a Map: Improving Geospatial Support for Disaster Management.”

The report acknowledged growing geospatial capability, but warned, “The effectiveness of a technology is as much about the human system in which it is embedded as about the technology itself. Issues of training, coordination, planning and preparedness, and resources invested in technology need to be addressed if future responses are to be effective.”

This statement embodies the intent behind GeoCONOPS.

“There was a feeling that we didn’t know what we didn’t know, and we had gaps we couldn’t identify,” said Nathan Smith, a contract program manager for GeoCONOPS. “A lot of that was a perception that geospatial wasn’t reaching its potential, and that it was constrained by a lack of coordination within the geospatial community.”

First published June 30, 2009, GeoCONOPS underwent six updates through Jan. 18, 2015, and was met with varying degrees of success. While federal agencies worked toward data sharing, many potential state and local stakeholders looked askance at the 228-page document from Washington. Today, GeoCONOPS is hosted online via geoplatform.gov. A second, more secure site is planned to facilitate shared access to more sensitive data.

“The moment something is printed, it’s obsolete,” said David Lilley, acting director of the GMO. “So we moved to the web, a dynamic mode of delivery, and it puts the content media in an environment that’s of more use to our readers. We are more able to keep the content current and add searches so users can drive directly to what they are looking for in a matter of clicks, instead of searching through 100 pages.”

Realizing What Could Be

Lilley is working to foster a more complete understanding of GeoCONOPS. According to him, GeoCONOPS shows not only how geospatial data is currently supporting the mission at hand, but also what geospatial data is available to the community and how it could support other missions.

Realizing what “could be” is perhaps the most important message, especially for those with data that could help FEMA, or state and local governments who could benefit from sharing data with one another. Lilley’s outreach is bringing more data and registered systems into the GeoCONOPS community. In doing so, he seeks to foster a cultural change across all echelons.

“I think through GeoCONOPS, people are identifying the concept that ‘the more people are using my data, the better I can justify sustaining the program (that gleans the data),’” Lilley said. “That’s a fundamental shift, because it used to be that ‘my data is mine, my power is my information.’ They still control it, but letting more people into the data makes it more powerful.”

Tightening budgets are also leading more partners to GeoCONOPS.

“People are more apt to re-leverage an existing capability for their mission need through the CONOPS than always building their own,” Lilley said.

Monetary constraints, technological evolution, and more persistent threats are creating a public safety landscape ripe for more widespread adoption of GeoCONOPS.

“Technology became easier at about the same time data became more prevalent,” said Vaughan, adding that GeoCONOPS has been prominent in FEMA exercises such as Gotham Shield, which in April simulated a nuclear explosion in the New York/New Jersey area.

Public safety experts at many levels said GeoCONOPS should also be used as a roadmap for preparedness and resiliency, not just natural disaster response.

“If effective, [GeoCONOPS] is really being used to support preparedness activities—planning, exercises,” said Rebecca Harned, director of National & Federal for the National Alliance for Public Safety GIS (NAPSG) Foundation. “It’s not something you want to try to access for the first time when the ‘big one’ hits.”

The post Roadmap for Nationwide Geospatial Data Sharing appeared first on Trajectory Magazine.

]]>
GeoQ Meets GitHub http://trajectorymagazine.com/geoq-meets-github/ Wed, 01 Nov 2017 13:02:06 +0000 http://trajectorymagazine.com/?p=35128 The power of the crowd builds upon NGA’s open-source platform to better equip first responders with geospatial information

The post GeoQ Meets GitHub appeared first on Trajectory Magazine.

]]>
Accurate, up-to-date information is a first responder’s biggest asset. Data about infrastructure, passable roads, regional populations, and supplies is essential in a crisis, and can be more difficult to obtain in underdeveloped countries. Without immediate access to the right data, first responders scramble to assess damage and lose valuable time that would otherwise be spent helping people.

To assist with relief efforts in both domestic and international disasters, the National Geospatial-Intelligence Agency (NGA) developed an open-source web application that collects unclassified imagery from nontraditional sources. Called GeoQ, the tool is accessible on any internet browser and pulls together geo-tagged data from social media, maps, news, Earth imaging satellites, and more to provide response teams with a holistic picture of disaster areas in real time.
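One common way to fuse such heterogeneous inputs is to normalize each geo-tagged item into a GeoJSON feature so a web map can render them all the same way. The short Python sketch below illustrates the idea only; it is not GeoQ’s actual data model, and the sources and coordinates are invented.

```python
# Illustrative only -- not GeoQ's internal data model. Normalizing each
# geo-tagged item into a GeoJSON Feature lets a web map render mixed sources
# (social media posts, news clips, imagery footprints) uniformly.

def to_feature(source, lon, lat, properties):
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": dict(source=source, **properties),
    }

features = [
    to_feature("social_media", -97.49, 35.34, {"text": "Roof damage near SW 4th St"}),
    to_feature("news", -97.48, 35.33, {"headline": "Helicopter footage shows debris field"}),
]
feature_collection = {"type": "FeatureCollection", "features": features}
print(len(feature_collection["features"]), "features ready for the map")
```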

The problem we realized was a lot of people didn’t have this GIS or remote sensing background. They wanted something easy and intuitive to use, and that’s where GeoQ comes into play.
 
—John Mills, Penn State Applied Research Laboratory

Since its launch on code-sharing site GitHub in April 2014, GeoQ has been deployed for relief management efforts in more than 35 natural disasters, including tornadoes in Oklahoma, earthquakes in Nepal and Japan, typhoons in the Philippines, and the Ebola outbreak in West Africa.

Traditional damage evaluations can take up to 72 hours—during which relief agencies operate mostly “blind” on the ground. But GeoQ can provide a thorough damage assessment within 24 hours of an event, according to Ray Bauer, NGA’s innovation lead and GeoQ project manager.

In the first ever applied use of GeoQ—a 2013 tornado in Moore, Okla.—“We were able to have 90 percent of the damage assessment done before we could get imagery from traditional sources,” Bauer said, referring to the period just after a disaster when relief agencies rush to compile data before deploying response teams.

Local Power

As data pops up online—such as geo-tagged photos on Instagram or helicopter footage from live news broadcasts—GeoQ’s crowdsourced workflow allows users to quickly receive and filter information to annotate at-risk areas. Emergency volunteers working online from relief agencies around the world are assigned manageable cells of land in the affected region and pore over the data, placing markers for things such as roadblocks and flood perimeters. 
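The workcell idea can be sketched simply: slice an affected area’s bounding box into roughly one-kilometer cells and hand each cell to a volunteer. The Python below is a conceptual illustration only, not GeoQ’s implementation, and the analyst names and coordinates are placeholders.

```python
# A conceptual sketch of workcell assignment -- GeoQ itself is a fuller web
# application; the analyst names and the area below are placeholders.
from itertools import cycle

def make_cells(min_lon, min_lat, max_lon, max_lat, step_deg=0.009):
    """Split a bounding box into cells; 0.009 degrees of latitude is roughly 1 km."""
    cells = []
    lat = min_lat
    while lat < max_lat:
        lon = min_lon
        while lon < max_lon:
            cells.append((round(lon, 4), round(lat, 4),
                          round(lon + step_deg, 4), round(lat + step_deg, 4)))
            lon += step_deg
        lat += step_deg
    return cells

volunteers = cycle(["analyst_a", "analyst_b", "analyst_c"])
assignments = {cell: next(volunteers)
               for cell in make_cells(-97.52, 35.30, -97.46, 35.36)}
print(len(assignments), "cells assigned")
```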

Responding agencies can pull up the crowdsourced analysis on their computers or mobile devices, and can share information directly with other agencies. That shared accessibility is one of GeoQ’s primary benefits.

“In working with [federal, state, and local partners], we realized the inefficiencies of everyone doing their work a little bit differently,” Bauer said. “If you looked at the houses after Hurricane Sandy, they got marked with three or four Xs. Different organizations would come through and put a red X on the door … to show that they’ve already accounted for this property.”

With GeoQ, NGA hopes to standardize responder workflows and reduce that kind of overlap and resource waste to establish a more collaborative model of disaster relief.

Because of its access to tools and bandwidth for damage analysis, the federal government typically leads major disaster response efforts when requested by state and local authorities. GeoQ’s open-source approach helps give similar bandwidth to local responders so time isn’t lost communicating up the chain of command. Another benefit is that geospatial intelligence (GEOINT) data held locally is often far more detailed and up-to-date than federal data.

“All disasters are local,” Bauer said, meaning that because disasters are primarily community-based in their impact, relief efforts should begin at the local level rather than under the current model, which puts most of the responsibility on federal agencies.

Bauer wants to flip the script with GeoQ to give more power to local entities such as fire departments and volunteer organizations, which are in a better position to provide immediate help but often lack sophisticated analytic technology.

“We’re giving them the fishing pole and teaching them how to fish,” Bauer said.

Members of Penn State’s Applied Research Lab (ARL) pose with National Geospatial-Intelligence Agency Director Robert Cardillo in the ARL booth at USGIF’s GEOINT 2017 Symposium. (Photo credit: PSU ARL)

NGA’s desire to share this local-first concept with the rest of the Intelligence Community and beyond is what led it to release GeoQ code on GitHub for free download and unrestricted use.

This means a user not affiliated with NGA could identify inefficiencies with the platform, alter GeoQ’s code, and upload the updated version to GitHub. If NGA approved the solution, it could be added to the source code. NGA hopes this low barrier to entry will encourage non-governmental organizations and private companies to participate.

“We’ve had several companies who have pulled the software down and have taken some of the ideas from GeoQ and started to implement it in their own software,” Bauer said. “That’s awesome. It’s about being open, transparent, and sharing ideas.”

Such a high level of transparency has led to significant leaps for GeoQ in the past three years.

Building Partnerships

GeoHuntsville, a nonprofit initiative in Alabama that unites organizations to improve disaster management, led an effort beginning in 2014 to integrate GeoQ with the operations of nearly every response agency within the municipality. This includes law enforcement, fire and rescue, medical, dispatch, civil air patrol, and more.

According to GeoHuntsville CTO Chris Johnson, “[GeoHuntsville] working groups were seeking a technology platform that would both visualize spatial data and capture tactical activities going on during an event.”

The organization wanted every Huntsville responder sent into a damage-prone area to be able to answer four questions: Who am I? Where am I? How am I? And how can I report my activity back to the rest of the responding community?

“We started using GeoQ to address the four questions, and also to help us break down workload, which it turns out GeoQ does very well,” Johnson said.

Now, GeoHuntsville utilizes its “Responders Working Group”—a collective of public safety specialists—to address prospective real-world challenges using GeoQ. GeoHuntsville’s technical unit, the “Geospatial Intelligence Working Group,” develops pilot programs and functional experiments based on those challenges to stress-test emerging tools and capabilities within GeoQ. NGA analysts as well as Federal Emergency Management Agency teams have participated directly in a number of these GeoHuntsville pilots.

“Through this working collaboration, we’ve been able to add a lot of features to GeoQ. And the wonderful thing about that is it doesn’t just benefit us in Huntsville,” Johnson said. “We are sharing [these capabilities] with everyone through GitHub.”

In August 2016, GeoHuntsville teamed with the National Oceanic and Atmospheric Administration and the National Weather Service to explore the use of unmanned aircraft systems as a platform to deliver live imagery to first responders on the ground. That intake of real-time surveillance paired with the ability to track the unmanned vehicle was new to GeoQ.

In the same exercise, GeoHuntsville developed a YouTube filter within GeoQ. Now, an operator can pull up an affected area on his or her screen and query YouTube for a specific keyword, timestamp, or location to pull real-time video data as soon as civilians post it online. Such data could be instrumental in determining where to direct resources and in avoiding repeat coverage.
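For a sense of what such a filter might query, the public YouTube Data API v3 supports keyword, location, and publish-time parameters. The Python sketch below is an assumption about how a similar capability could be built, not a description of GeoHuntsville’s implementation; the coordinates and API key are placeholders.

```python
# A sketch of the kind of query a YouTube filter could issue via the public
# YouTube Data API v3. Not GeoHuntsville's code; values are placeholders and a
# valid API key is required for the request to succeed.
import requests

params = {
    "part": "snippet",
    "q": "tornado damage",                     # keyword
    "type": "video",                           # required when filtering by location
    "location": "34.73,-86.59",                # lat,lng near the affected area
    "locationRadius": "10km",
    "publishedAfter": "2016-08-01T00:00:00Z",  # timestamp filter
    "key": "YOUR_API_KEY",                     # placeholder
}
resp = requests.get("https://www.googleapis.com/youtube/v3/search", params=params)
for item in resp.json().get("items", []):
    print(item["snippet"]["publishedAt"], item["snippet"]["title"])
```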

Pennsylvania State University has also contributed to GeoQ’s field testing and open-source development.

John Mills, a technologist with Penn State’s Applied Research Laboratory (PSU ARL), worked alongside Bauer on NGA’s “Map of the World,” and took a lead in enhancing GeoQ’s automation and data analytics when it first launched.

Students in Penn State’s Red Cell Analytics Lab work with high-tech equipment to simulate threats and analyze information. (Photo credit: PSU ARL)

“The problem we realized was a lot of people didn’t have this GIS or remote sensing background,” Mills said. “They wanted something that’s easy and intuitive to use, and that’s where GeoQ comes into play.”

PSU ARL joined forces with the PSU College of Information Sciences and Technology’s Red Cell Analytics Lab to focus on predictive analytics and implementation of open-source software into local, state, and federal GIS workflows. PSU students test GeoQ in the field, with student-run analytics teams evaluating and managing security threats at events such as Penn State football games at Beaver Stadium and THON, the world’s largest student-run philanthropic event.

According to Mills, the Red Cell teams have focused primarily on two initiatives: exploiting social media to access data, and supplementing GeoQ with other open-source projects such as NGA’s Mobile Awareness GEOINT Environment (MAGE) app. MAGE allows users to create geo-tagged data reports containing observable photo, video, or audio records, and to share those reports instantly with other team members.
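A geo-tagged field report of the kind MAGE produces maps naturally onto a GeoJSON feature whose properties reference media attachments. The structure below is illustrative only and is not MAGE’s actual schema; every field name and value is invented.

```python
# Illustrative structure only -- not MAGE's actual schema. A geo-tagged field
# observation expressed as a GeoJSON Feature with media attachment references.
observation = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-77.856, 40.812]},  # placeholder point
    "properties": {
        "reportedBy": "red_cell_team_3",                  # hypothetical team name
        "timestamp": "2017-09-09T18:42:00Z",
        "category": "crowd congestion",
        "attachments": [{"type": "photo", "file": "obs_1042.jpg"}],  # placeholder file
    },
}
print(observation["properties"]["category"], "at", observation["geometry"]["coordinates"])
```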

“I call it the Red Cell Army,” Mills said. “They were able to go out and use MAGE to do event observable collects, and then in real time, GeoQ was in the emergency operations center in Beaver Stadium and you could see all these [MAGE] data sets popping up. That allowed emergency response folks to better do force deployment.”

Additionally, Mills continued, PSU ARL supervisors and Red Cell Analytics Lab members meet with government stakeholders—including NGA—to observe workflows and brainstorm ways the process could be automated to improve GeoQ’s efficiency and efficacy.

Though the application’s development has been primarily focused on disaster relief, GeoQ’s collaborative model has broader possibilities. The tool is designed to be applied internationally and in other industries.

People on six continents have downloaded or shown interest in GeoQ on GitHub. For example, an insurance company reached out to NGA about using GeoQ for after-damage reports to show where agents made adjustments.

Archaeologists have shown interest as well, according to Bauer. GeoQ currently divides land into one-kilometer cells, but perhaps, he said, the program could divide land into centimeter-scale cells to support the examination and analysis of historic excavation sites.

The Next Level

For the next generation of GeoQ, NGA is exploring gamification to incentivize more people in the GEOINT Community to use the program. For now, GeoQ still requires an entry-level background in damage analysis and data management to be used productively.

To encourage engagement, NGA released gamification code within the program in late 2014 that rewards volunteer analysts with badges and points based on feature creation within GeoQ. For example, a contributor might gain five points for marking five damaged houses within their assigned cell; once they acquire 10 points, they’d earn a badge. Accumulating badges leads to higher clearance to assist in more intense disaster relief.

Badges and other user awards can be exported into a folder called the “Open Badges Backpack,” where contributors can show off their expertise.
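The point-and-badge mechanics described above can be captured in a few lines. The Python sketch below uses the figures from the example (one point per feature, 10 points per badge) but is otherwise hypothetical; it is not NGA’s gamification code.

```python
# A toy illustration of the badge logic described above -- not NGA's actual
# gamification code. One point per feature and 10 points per badge follow the
# article's example; everything else is hypothetical.
BADGE_THRESHOLD = 10

def award_points(profile, features_created):
    profile["points"] += features_created  # one point per feature created
    # Grant a new badge each time another multiple of the threshold is reached.
    while profile["points"] >= BADGE_THRESHOLD * (len(profile["badges"]) + 1):
        profile["badges"].append("badge_%d" % (len(profile["badges"]) + 1))
    return profile

analyst = {"points": 0, "badges": []}
award_points(analyst, features_created=5)   # e.g., five damaged houses marked
award_points(analyst, features_created=5)   # reaches 10 points -> first badge
print(analyst)                              # {'points': 10, 'badges': ['badge_1']}
```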

Bauer joked about his children’s enthusiasm for virtual games. “We can see how powerful this gamification is—now imagine if we can start to use it for good,” he said. 

According to Bauer, tests of this gamification technique during real-world events have engaged analysts working side-by-side in friendly competition to earn more points and badges.

Bauer said perhaps by incorporating GeoQ into emergency response training programs for the public “[NGA] could start to develop a community in the future where we have civilians participate in first response.”

Through open-source code, GeoQ and similar applications give first responders and volunteers unprecedented speed and ease in sharing data. The advent of open-source tools will help keep first responders informed and unified in their assessments of danger and damage, enabling superior aid and ultimately saving more lives.

Featured image: GeoQ allows anyone with a web browser and an understanding of geospatial tools like Google Earth and Esri ArcGIS products to support a project. Contributors focus on information both within the image and outside the frame to rapidly assess the impacts and changes of disasters over large geographic areas, quickly producing detailed features from traditional and nontraditional data sources. (Credit: NGA)

The post GeoQ Meets GitHub appeared first on Trajectory Magazine.

]]>