Software – Trajectory Magazine
Trajectory Magazine is the official publication of the United States Geospatial Intelligence Foundation (USGIF), the nonprofit, educational organization supporting the geospatial intelligence tradecraft.

Weekly GEOINT Community News
Nov. 20, 2017
Radiant Solutions Announces Plan of Operations; General Atomics Acquires Surrey Satellite Technology U.S.; Planet Imagery Made Available in SpyMeSat App; USGS Publishes Global Crop Map; & More

Radiant Solutions Announces Plan of Operations

Maxar Technologies’ geospatial business unit Radiant Solutions will combine former service brands RadiantBlue, HumanGeo, MDA Information Systems, and DigitalGlobe Intelligence Solutions into one commercial provider. The business will be organized into three missions: sensor and ground modernization; data to insight; and agile intelligence. This convergence of data-gathering sensors, cloud computing, open source, big data, and machine learning is intended to give customers a robust, end-to-end way to support national security missions.

General Atomics Acquires Surrey Satellite Technology U.S.

General Atomics acquired the majority of the assets of Surrey Satellite Technology U.S., a Colorado-based provider of small satellite technologies, systems, and services. The assets and workforce will be integrated into General Atomics’ Electromagnetic Systems Group to support the organization’s growth initiatives focused on the development and delivery of small satellite and advanced payload systems.

Planet Imagery Made Available in SpyMeSat App

Planet reached an agreement with Orbit Logic, allowing users of Orbit Logic’s SpyMeSat mobile app to access Planet’s daily satellite imagery. SpyMeSat provides on-demand access to recently archived imagery and the ability to request tasking over specific areas. Planet images available in the app cover 625 square kilometers at 3.7-meter resolution and cost less than $1.30 per square kilometer, while new tasking options begin at $375.

Esri Partners with Mobileye on Driver Assistance

Esri announced a collaboration with Intel’s Mobileye, a provider of driver-assistance software, to integrate Esri’s analysis and visualization capabilities with Mobileye’s Shield+ system. A network of sensors placed on the vehicle will record real-time data like pedestrian or cyclist detection in blind spots—that data will be uploaded into Esri’s ArcGIS platform and viewed on the Mobileye dashboard. Municipal buses and other public transport will be outfitted with this technology, making for safer commutes and communities.

USGS Publishes Global Crop Map

The United States Geological Survey released a new high-resolution map of croplands around the world. The map identifies 1.87 billion total hectares of farmland—India has the highest net cropland area, followed by the U.S., China, and Russia. The map was built using Landsat imagery at 30-meter resolution, the highest resolution of any global agricultural dataset.

Loft Orbital Raises Funding for Condo Constellation

Loft Orbital has raised $3.2 million in seed funding to create a constellation of satellites carrying multiple payloads from different customers. Spacecraft would weigh between 100 and 200 kg to keep launch prices low enough to dissuade customers from purchasing and operating satellites of their own. Loft will manage satellite procurement, launch, operations, and data downlink, while customers will task their own payloads. Loft is targeting a first mission for the second half of 2019.

Boundless Rebrands GIS Software

Boundless announced the rebranding of its flagship GIS software from Boundless Suite to Boundless Server. The new enterprise package will feature enhanced styling and increased compatibility with Esri’s ArcGIS. The software’s flexible architecture allows users to manage and publish location data with ease.

Luciad Launches Data Management Software Updates

Luciad announced the new V2017.1 version of its software suite, particularly the LuciadFusion data management platform. Luciad bills the suite as a “one-minute data manager”—data setup, publishing, visualization, discovery, and analysis tasks can each be completed in about 60 seconds.

Pitney Bowes Launches Collaborative Online GIS Community

Pitney Bowes launched its Li360 Community, a global online community of GIS professionals, clients, and customers collaborating on business tools and capabilities. The community serves as a way to promote innovation from the geospatial industry as companies realize the benefits of location intelligence and begin using it to drive sales.

ODNI Re-launches Intelligence.gov

The Office of the Director of National Intelligence announced the re-launch of Intelligence.gov, a central website for the U.S. Intelligence Community (IC). The move is rooted in a community-wide effort to standardize transparency about the IC’s activities. Users can browse public data, documents, and products, and can link to other resources such as the websites of specific intelligence agencies.

Photo Credit: USGS

The Genesis of Google Earth
Nov. 1, 2017
The history and future of the software that made GEOINT mainstream and changed the way we view the world

In August 2005, Hurricane Katrina ravaged the Gulf Coast of the United States, bursting levees throughout Louisiana and Mississippi and submerging the streets of south Florida. According to the National Hurricane Center, it was the deadliest U.S. hurricane since 1928, claiming at least 1,800 lives and causing more than $108 billion in damage.

The U.S. Navy, Coast Guard, and other federal relief groups deployed helicopter teams to rescue people stranded in New Orleans without the resources to escape or survive in their homes. Hurricane victims dialed 911 for urgent help at specific street addresses, but it was impossible for first responders to find them without precise GPS coordinates—street signs and house numbers were invisible beneath the deluge. In the absence of traditional situational awareness, responders were operating blind.

In California, a team from the recently minted Google Earth program launched into action, creating real-time imagery overlays of heavily affected areas on top of its existing 3D globe platform. Fly-by aerial photos from the National Oceanic and Atmospheric Administration (NOAA) and satellite imagery from DigitalGlobe—one of Google Earth’s primary providers—revealed the scope of the hurricane’s destruction. Google Earth made this data publicly available and responders had eyes again.

Now, they could input a caller’s location into Google Earth paired with case-specific details—for example, a victim trapped in a two-story house with a clay roof next to an oak tree. Equipped with up-to-date imagery from Google Earth, relief teams saved thousands of people from Katrina’s aftermath.

Years later, the Louisiana Governor’s Office of Homeland Security and Emergency Preparedness would pair internal data with Google Earth Enterprise (GEE)—the desktop software suite for private or offline use of Google Earth—to create 3D globes for emergency response and infrastructure planning.

Today, Google Earth is among the most popular geospatial applications in the world, boasting upward of one billion downloads. With it, students take virtual tours of the world’s wonders from their classrooms, house hunters evaluate prospective properties without leaving home, and much more. The U.S. military employs GEE for secure mission planning and intelligence professionals use it to visualize points of interest and detect change. Google’s spinning globe truly represents the democratization of geospatial intelligence.

In the case of GEE, government and military organizations became so dependent on the software’s private storage and visualization capabilities that not even a deprecation announcement from Google two years ago stopped them from using the platform.

As a result of the community’s reliance on GEE, earlier this year Google decided to make the software’s code open source and available for public download on GitHub.

With its future in the hands of its users, GEE is poised to remain at the center of mission planning and situational awareness efforts for the defense and intelligence communities—at least until a supported platform of equal utility arises.

A Giant’s Infancy

At the time Hurricane Katrina made landfall, Google Earth software had been available to the public for only three months. But the story of Google Earth began to take shape 10 years earlier at a computer hardware company called Silicon Graphics (SGI).

Michael T. Jones, then a member of SGI’s computer engineering team, had developed an invention that would revolutionize the firm’s 3D graphics offering, which at the time was used primarily for flight simulation.

“It was called clip mapping. That’s the fundamental hardware feature SGI had that let it do this amazing, smooth flight around the world,” said Jones, now a managing partner at Seraphim Capital.

Jones’ technique displayed a small region of graphics—the region under examination—in high resolution while the peripheral regions were displayed in low resolution. Jones, along with SGI engineers Chris Tanner, Chris Migdal, and James Foran, patented the method in 1998. Clip mapping required powerful supercomputers to run, but enabled a high-fidelity texture map that became the centerpiece of SGI’s final graphics system, Infinite Reality, which at the time boasted the fastest 3D graphics in the world.
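Conceptually, the technique keeps full detail only in a small window around the point of interest and a heavily downsampled copy of everything else. The toy Python sketch below illustrates that idea only—the names, sizes, and data structures are invented for illustration, and SGI's hardware implementation was far more sophisticated:

```python
# Toy illustration of the clip-mapping idea (not SGI's implementation):
# keep full resolution in a small "clip window" around the focus point,
# and a downsampled copy of the texture everywhere else.

def downsample(texture, factor):
    """Keep every `factor`-th texel in each dimension (low-res periphery)."""
    return [row[::factor] for row in texture[::factor]]

def clip_window(texture, cx, cy, half):
    """Extract the full-resolution region around the focus point (cx, cy)."""
    rows = texture[max(0, cy - half):cy + half]
    return [row[max(0, cx - half):cx + half] for row in rows]

def clip_map(texture, cx, cy, half=2, factor=4):
    """Return the two levels a clip-mapped renderer would sample from."""
    return {
        "detail": clip_window(texture, cx, cy, half),  # high-res near focus
        "coarse": downsample(texture, factor),         # low-res elsewhere
    }

# A 16x16 "texture" whose texel values encode their own coordinates.
tex = [[(x, y) for x in range(16)] for y in range(16)]
levels = clip_map(tex, cx=8, cy=8)
```

The memory savings are the point: the renderer stores a small high-resolution window plus a coarse global level, rather than the full texture at full resolution everywhere.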

Federal agencies such as the National Geospatial-Intelligence Agency (NGA) and the National Reconnaissance Office (NRO) would later follow suit, Jones said, using clip mapping to build data visualization platforms of their own.

To demonstrate the vastness of Infinite Reality’s capabilities, SGI created a demo called “Space to Your Face.” It began with a wide view of Earth from space, slowly zooming into Europe. When Lake Geneva became visible, the program would focus on the Matterhorn in the Swiss Alps. It would continue to zoom until reaching a 3D model of a Nintendo 64 console on the mountainside. Then it would zoom in even more, settling on the Nintendo’s MIPS r4000 graphics chip—a microprocessor created by SGI—before snapping smoothly back to space.

The demo was well received. Educators were excited to see an interactive, classroom-friendly global map tool, and video game developers had never seen such fluid graphics.

Seeking a new home for their brainchild, Jones, Tanner, and former SGI engineers Remi Arnaud and Brian McClendon founded a company of their own. Called Intrinsic Graphics, it focused on developing high-quality 3D graphics for personal computers and video games.

In October 1999, Tanner took the concept further when he designed a software version of the clip mapping feature that allowed a user to “fly” within a 3D visualization of Earth.

“People were blown away,” Jones said. “They were looking at Google Earth.”

Though the software platform wasn’t Intrinsic’s primary product—the graphics themselves were—Jones was intrigued and continued refining the spinning globe.

Yet running the software required expensive and highly specialized computing hardware not available to most of the private tech industry, let alone the commercial user.

“That machine cost $250,000. We wanted to be able to offer this without the specialized hardware,” said McClendon, now a research professor at the University of Kansas. “To be able to get that performance out of a PC meant we could share it with the world. The moment you realize you can transmit this data over the internet, you begin to realize the impact. A group of us at Intrinsic thought, ‘We need to build a company around this.’”

And before long, yet another company was founded. In 2000, Jones, McClendon, and a few others spun out the software from Intrinsic Graphics to launch Keyhole. In early 2001, Keyhole raised first round funding from NVIDIA and Sony Digital Media Ventures, making official its existence as a standalone company. Keyhole’s first product, EarthViewer 1.0, was the true precursor to Google Earth.

Using public data gathered from NASA’s Landsat constellation, IKONOS imagery, and aerial photos of major U.S. cities, Keyhole built a complete digital Earth. Though pixels were beginning to proliferate, high-resolution imagery was mostly limited to U.S. metropolitan areas.

Under the direction of newly appointed Keyhole CEO John Hanke, the company marketed EarthViewer to the commercial real estate and travel industries. Civil engineers also purchased it for the ability to sketch out location information when planning construction projects. 

“Intelligence agencies wanted this capability as well, but they wanted to use their own data,” McClendon said.

The Intelligence Community (IC) was intrigued, but wanted to use classified geospatial data gathered through National Technical Means rather than the data on Keyhole’s public server. To accommodate such buyers, Keyhole began offering an enterprise version of its software, allowing large-scale users to stand up private network servers and host their own data on a replica of EarthViewer’s 3D globe.

NIMA Backing

The National Imagery and Mapping Agency (NIMA) was the first agency to take note of this unprecedented capability. Under the leadership of then director James Clapper and deputy director Joanne Isham in 2001, NIMA launched a research and development directorate known as InnoVision. The new directorate sought to leverage state-of-the-art technologies from industry to help the IC adapt to the changing face of conflict in the aftermath of 9/11.

Isham, a former CIA employee, was well versed in In-Q-Tel, the CIA’s nonprofit venture capital initiative. She approached Robert Zitz, InnoVision’s first director, about collaborating with In-Q-Tel to find partner companies.

“We sat down together with In-Q-Tel and went over what our most urgent requirements were,” said Zitz, now senior vice president and chief strategy officer of SSL MDA Government Systems. “In-Q-Tel started trying to locate companies and [in 2002] discovered Keyhole.”

In-Q-Tel was impressed by the low barrier to entry and EarthViewer’s ease of use.

[Users] will create data files … rapidly and not to spec, put them in Google Earth, and they’ll run somehow. That’s really the reason why no other applications have been able to enter this space as dominantly as Google Earth.

— Air Force Lt. Col. Mike Russell, NGA

“With [EarthViewer], you just click on the icon and all of a sudden you’re flying around the globe,” said Chris Tucker, In-Q-Tel’s founding chief strategic officer and now the principal of Yale House Ventures. “There had been some way earlier-era, very expensive defense contract iterations [of a 3D digital Earth], but none at a consumer level that a regular analyst could make sense of without being a missile defense expert or some other technical user.”

In 2003, In-Q-Tel invested in Keyhole using NIMA funding. It was the first time an intelligence agency other than the CIA had employed In-Q-Tel. NIMA experienced an immediate return on its investment. Within two weeks, the U.S. military launched Operation Iraqi Freedom, which Keyhole supported in its first mission as a government contractor.

“We wanted a capability that would help military planners visualize and seamlessly move through datasets pertaining to particular target areas,” Zitz said. “We also wanted the ability to rapidly conduct battle damage assessments. NIMA was supporting joint staff in the Pentagon, and to sense how effective a strike was after-the-fact was very labor and imagery intensive. With Keyhole, we were able to streamline that process.”

EarthViewer quickly gained public exposure through TV news coverage using its battlefield imagery.

One of McClendon’s junior high school classmates, Gordon Castle, was CNN’s vice president of technologies. McClendon approached Castle with his EarthViewer demos. Castle was wowed, and CNN became one of Keyhole’s first media customers. The network routinely used EarthViewer to preview story locations during broadcasts. When the U.S. invaded Iraq, CNN used the software heavily—sometimes several times an hour—to show troop movement or combat locations.

The Big Break

Realizing its technology could improve people’s understanding of the planet, widespread commercialization became Keyhole’s mission. But Keyhole was a small company, and scaling up its computing infrastructure to handle more traffic was expensive. An annual EarthViewer Pro subscription still cost $599—a price justified by the company’s high operating costs. Keyhole’s bottom line stood in the way of its goal.

“[We wanted] everybody that opened the app to be able to find their house,” McClendon said. “It’s the first thing everybody searches for. If that experience isn’t good, the user thinks the product isn’t good.”

That first step required high-quality coverage of the entire land surface of Earth—a seemingly unattainable achievement for Keyhole’s 29 employees, even with In-Q-Tel backing. And the startup’s network bandwidth wasn’t strong enough to offer a high-resolution 3D globe to millions of consumers worldwide. McClendon recalled making regular trips to Fry’s electronics store to purchase hard drives, struggling to keep up with demand.

“To provide high-resolution data for the whole world was an epic undertaking … that would’ve taken us probably a decade to build up on our own,” he said.

For its vision to materialize, Keyhole needed more capital to scale up imagery procurement and to build powerful data infrastructure to store high volumes of imagery. In 2004, as if on cue, along came Google—one of the few companies powerful enough to manifest Keyhole’s mission. And they wanted to buy.

“It seemed like a tough road. Everybody was impressed with what we had done, but there was going to be competition and we needed to move quickly,” Jones said. “So we sold to Google because our dream would happen.”

As part of the acquisition, the Keyhole team maintained control of the program as it evolved. Most personnel, including McClendon and Jones (Tanner had since departed Keyhole), became executives at Google, developing their software unrestricted by the need to keep a startup afloat.

Once at Google, the program began to operate on an entirely different scale. Instead of acquiring licensing deals for small portions of a vendor’s imagery at a time, Google bought out all the imagery a vendor had available at once. Google also provided access to a rapidly growing user base already hooked on its web search platform.

Before debuting a Google-branded product, the former Keyhole team had to rewrite EarthViewer’s service code to run within Google’s infrastructure. Additionally, pre-release engineering refinements focused on adding data around the globe, making the program accessible to non-English speaking users, and simplifying features. Finally, Google Earth launched in June 2005.

The software exploded in the commercial marketplace. Where Keyhole’s consumer version of EarthViewer was too expensive for most casual civilian users, Google Earth was downloadable for free.

“We had millions of users in the first few days and tens of millions in the first year,” McClendon said.

Keyhole brought to Google a new form of interactive information that mimicked the real world and helped people understand their place in it. A GEOINT tool had finally made it to the mainstream.

In 2006, Google released Google Earth Enterprise for organizations seeking the capabilities of Google Earth but with private data in a secure, offline environment. The GEE suite included three software components: Fusion, the processing engine that merged imagery and user data into one 3D globe; the Earth server that hosted the private globes built by Fusion; and Client, the JavaScript API used to view these globes.

Whether to disseminate that data after creating proprietary globes in GEE was, and still is, up to the user. This was the final evolution of the EarthViewer enterprise suite used by the Pentagon at the outset of the Iraq war.

GEE in Action

In the years following its launch, government agencies, businesses, and state municipalities began to deploy GEE at internal data centers to produce 3D globes using sensitive or classified data.

The city of Washington, D.C., for example, has used GEE to model and visualize public safety data including crime, vehicle and fire hydrant locations, and evacuation routes.

Arguably the largest user of GEE is the U.S. Department of Defense (DoD). When Google Earth was first released, military customers had an explicit need for this capability to function in a highly secure private network.

For example, the Army Test and Evaluation Command (ATEC) uses private data on enterprise servers such as Google’s to evaluate a wide range of weapon systems as well as ground and air operations.

At ATEC’s Yuma Proving Ground (YPG) in Arizona, proprietary terrain data, imagery, and operations maps are overlaid on Google Earth and used to plan and schedule launches.

“Knowing where everyone is and moving in a secure range and air space is important to our operations,” said Ruben Hernandez, an Army civilian in the YPG’s engineering support branch. “Much of this data is also used for range awareness display.”

For example, prior to an indirect fire artillery test, personnel use YPG data within GEE to assess the safest positions on base to conduct the test—when to fire, where to fire from, and what to fire at. That information is disseminated throughout YPG for awareness.

“Many of these munitions have extensive footprints. We want to find out how much air and land space [the blast] is going to consume. Safety is a big component of how these overlays are planned,” Hernandez said.

NGA is another major GEE stakeholder. In 2008, the agency’s new GEOINT Visualization Services (GVS) program invested in the enterprise server. GVS has since produced a proprietary version of Google Earth for warfighters featuring classified NGA data.

According to GVS program manager Air Force Lt. Col. Mike Russell, “GVS was built around providing a version of Google Earth in the secret and top secret domains so users could visualize classified information geospatially and temporally in a common operating picture.”

Now, NGA’s private Google Earth globes are mission critical for more than 30,000 customers daily, including DoD Combatant Commands, the FBI, CIA, NRO, National Security Agency, and Federal Emergency Management Agency. NGA’s current release is the second largest Google Earth globe in the world and is used across the DoD and IC for common situational awareness, tracking vehicles and personnel, delivering intelligence briefings, and more.

Russell praised Google’s efficient rendering of data files in the Keyhole Markup Language (KML) format. KML was created for file building in Keyhole’s EarthViewer platform and has since become an industry standard for visualizing geospatial data.

“[Users] will create data files like the location of an IED or a live dynamic track of an aircraft. They can build these files rapidly and not to spec, put them in Google Earth, and they’ll run somehow. [Competitors] can only render smaller KMLs or those built to spec. That’s really the reason why no other applications have been able to enter this space as dominantly as Google Earth,” Russell said.
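For readers unfamiliar with the format, a KML file is just a small XML document. The sketch below builds a single placemark of the kind Russell describes using Python's standard library; the element names follow the OGC KML 2.2 schema, while the placemark name and coordinates are invented examples:

```python
# Build a minimal KML document containing one point placemark.
# Element names follow the OGC KML 2.2 schema; the name and the
# coordinates below are hypothetical examples.
import xml.etree.ElementTree as ET

def make_placemark_kml(name, lon, lat, alt=0.0):
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    placemark = ET.SubElement(kml, "Placemark")
    ET.SubElement(placemark, "name").text = name
    point = ET.SubElement(placemark, "Point")
    # KML orders coordinates as longitude,latitude,altitude
    ET.SubElement(point, "coordinates").text = f"{lon},{lat},{alt}"
    return ET.tostring(kml, encoding="unicode")

doc = make_placemark_kml("Reported location", -90.07, 29.95)
```

A file like this, dropped into Google Earth, renders as a clickable point—which is exactly the low-friction, "not to spec but it still runs" workflow Russell credits for the platform's dominance.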

The Unbundling

GEE served a far more specific client and purpose than the commercial Google Earth services, but its rate of adoption was noticeably low compared to most Google products.

According to McClendon, “Continuing to innovate on a hosted service exclusively for the enterprise community was not financially viable.”

In March 2015, Google announced the deprecation of GEE. After a two-year transitional maintenance period, the company stopped supporting GEE software in March 2017. Though it was being phased out of Google’s product line, GEE remained in use by invested customers relying on it to meet mission demands and house their data.

Hernandez recalled pushback from teams at Yuma who were not keen to change their data storage and visualization system. According to Russell, GVS feared losing its primary product and stranding customers without an application to replace it.

To accommodate the ongoing need, Google announced in January it would publish all 470,000 lines of GEE’s code on GitHub, allowing customers to continue using the software they’d grown loyal to and to improve the product independently.

For customers who prefer transitioning to a supported enterprise software, Google has coordinated with Esri to offer free software and training for GEE customers who migrate to Esri’s ArcGIS platform. 

The open-source GEE (GEE-OS) suite includes the Earth server, Fusion, and a portable server allowing users to run GEE on a mobile device or desktop computer not connected to a centralized server. The GEE Client software, which is required to connect to the Earth server and view 3D globes, was not carried forward into the open-source environment. Instead, it will continue to be maintained and provided by commercial Google Earth.

Thermopylae Sciences and Technology (TST), NT Concepts, and Navigis—three longtime Google partners—supported GEE’s transition to open source. In the spring, each of the three companies sent a developer to Google in Mountain View, Calif., to spend several weeks learning the code from Google developers who had been maintaining the software baseline. 

TST began a partnership with Google in 2007 through a series of federal government customer engagements supporting Thermopylae’s own Google Earth-based tracking console. When the open-source announcement was made, TST’s Earth Engineering team was reassigned to the company’s Open Source Development Office to create the GEE GitHub site and migrate the source code.

On Sept. 14, TST’s open source team released GEE-OS version 5.2.0, which matches the last proprietary release and fixes bugs that emerged during the two-year deprecation period.

“When we pulled the code out from [Google’s] proprietary side, there were a lot of things that needed to be built back up or replaced with open-source components,” said Thermopylae CEO AJ Clark. “Really these first few months are just about providing feature parity with where the code was at its last state inside Google.”

TST’s team aims to release GEE-OS 5.2.1 by the end of 2017.

Now that parity is achieved and the program’s performance is stabilized, developers will begin submitting expanded code contributions. According to Clark, the first value-add propositions will most likely begin to flow in early 2018. Meanwhile, DoD and IC users are eager to discover how they can further adapt the software for their specific missions.

Chris Powell, CTO of NT Concepts, said the company is working with its defense and intelligence community customers to support GEE and their transition to the GEE-OS baseline. 

“We’re also actively looking for opportunities to contribute back to the open source baseline for feature improvements and capabilities,” Powell said, adding some possibilities are scaling the GEE processing power to a larger compute platform and examining how the software can be optimized for the cloud.

Hernandez said the planning crew at Yuma is looking forward to new software capabilities that could be built out at the request of the test community. Among these features, he said, is the ability to “grab geospatial objects and collaborate on them between multiple users; to grab, extend, and change the shape of a [weapon] footprint in 2D or 3D; and to provide a simulation of an object’s line trajectory.”
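The last of those requests—simulating an object's line trajectory—reduces in its simplest form to ballistic motion. The sketch below is a minimal illustration assuming constant gravity and no aerodynamic drag; it is not the test community's actual model, and all numbers are arbitrary examples:

```python
# Toy ballistic trajectory: constant gravity, no drag. An illustration
# of "simulating an object's line trajectory," not YPG's real model.
import math

G = 9.81  # gravitational acceleration, m/s^2

def trajectory(speed, angle_deg, dt=0.05):
    """Return (x, y) points of a projectile until it returns to ground level."""
    angle = math.radians(angle_deg)
    vx, vy = speed * math.cos(angle), speed * math.sin(angle)
    x = y = t = 0.0
    points = [(x, y)]
    while True:
        t += dt
        x, y = vx * t, vy * t - 0.5 * G * t * t
        if y < 0:
            break
        points.append((x, y))
    return points

def max_range(speed, angle_deg):
    """Closed-form range on flat ground: v^2 * sin(2*theta) / g."""
    return speed ** 2 * math.sin(math.radians(2 * angle_deg)) / G

path = trajectory(100.0, 45.0)  # e.g., 100 m/s launch at 45 degrees
```

Overlaying a path like this on terrain imagery is the kind of safety-footprint visualization the quoted requests point toward.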

According to Jon Estridge, director of NGA’s Expeditionary GEOINT Office, the agency has committed to providing enhancements and ongoing sustainment to open-source GEE on GitHub through at least 2022.

“A few specific examples would be multi-threading the fusion process to support massive terrain and imagery updates, enhanced 3D mesh management, and inclusion of ground-based GEOINT content like Street View,” Estridge said. 

Open source means more customizability for users with niche wants and needs. No two proprietary Google Earth globes look the same, and teams will have more command over the unique data they store, visualize, and analyze within the program.

“It’s very positive,” Russell said. “[Open source is] an opportunity for NGA to partner with Thermopylae to tie the proprietary and non-proprietary pieces together, and it allows us to sustain Google Earth for our user community for a longer period of time.” 

The decision to make GEE code open source only improves the program’s accessibility and potential use cases, and will bolster the software’s longevity. Code sharing is a growing trend in the IC, and Google has provided government, military, and industry unlimited access and control of one of the most useful enterprise GEOINT tools on the market. 

Providing Community ROI with Geospatial Tools
Nov. 1, 2017
The logistical demands of providing emergency services to large crowds

The month of June brings the Wichita Riverfest to Sedgwick County, Kan. For more than a week, concerts, art shows, athletic events, and more draw crowds of up to several hundred thousand to enjoy themselves and support the community along the Arkansas River.

Handling the logistical demands of providing emergency services to large crowds, concentrated within a several-block radius, is the responsibility of Scott Hadley, director of Sedgwick County EMS. His agency handles all services for the 1,008-square-mile area.

“Riverfest requires extra coordination, along with the approximately 170 calls per day that are our normal operations,” Hadley explained, adding that the tools his agency invests in allow daily operations and special events to run more smoothly.

For daily operations, Sedgwick County EMS uses a proprietary computer-aided dispatch system along with the MARVLIS system to staff 15 posts throughout the county. The agency tracks and analyzes operational performance, call volume and type, cardiac arrest and survival rates, and financial performance metrics. GEOINT analysis is integrated into these metrics.

Sedgwick County also employs FirstWatch during Riverfest. FirstWatch provides real-time surveillance and analysis to warn agencies of trends and patterns in a selected area. It does this using “triggers,” a set of user-defined filter criteria tailored to the specific event. Various data sources can be integrated with FirstWatch, making it very useful for events such as the Super Bowl, large conferences, festivals, and more.

Using FirstWatch at Riverfest, Sedgwick County EMS sets a geo-fenced area within which the incident command is deployed. Bike, ATV, and other responder teams staff the event, and patients who need to be taken to the hospital are transferred to an assigned point at the periphery of the geo-fenced area.
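At the core of a geo-fenced trigger like this is a point-in-polygon test: does an incident's coordinate fall inside the event footprint? FirstWatch's actual implementation is not public; the sketch below uses the standard ray-casting algorithm with hypothetical fence coordinates:

```python
# Sketch of the core test behind a geo-fenced trigger: is an incident's
# coordinate inside the event polygon? Standard ray-casting algorithm;
# the fence coordinates below are hypothetical.

def point_in_polygon(lon, lat, polygon):
    """Count crossings of a ray cast from the point toward increasing lon."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):
            # Longitude where this edge crosses the point's latitude
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Hypothetical fence around a riverfront festival footprint
fence = [(-97.345, 37.680), (-97.330, 37.680),
         (-97.330, 37.695), (-97.345, 37.695)]
```

An incident geocoded inside the fence would count toward the event's trigger criteria; one outside it would fall to normal county operations.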

Geospatial tools are critical to efficient EMS operations, even more so when everyday operations are complicated by a special event or disaster.

Hadley views acquiring and using these tools “not as a cost, but as an investment.” The return on investment for geospatial technology, he said, provides Sedgwick County’s residents with cost-effective, patient-centered emergency care.

Featured image: Sedgwick County Riverfest, 2016. (Photo credit: Sedgwick County)

Return to feature story: Enabling Rapid Response

HPE: Revolutionizing IT http://trajectorymagazine.com/hpe-revolutionizing-it/ Wed, 16 Aug 2017 20:11:39 +0000 http://trajectorymagazine.com/?p=34460 Q&A with Ken Bruni, director, Advanced Programs Group; and Howard Clifford, distinguished technologist

The post HPE: Revolutionizing IT appeared first on Trajectory Magazine.

Q: How does HPE support the Intelligence Community (IC)?

Hewlett Packard Enterprise (HPE) has had a strong, long-term relationship with the IC, engineering and building information technology (IT), and providing consulting services in support of their unique and challenging missions. Additionally, HPE has worked with the IC to define technologies and techniques to address cyber vulnerabilities such as the advanced persistent threat, and as a result, HPE has created highly secure IT infrastructure. As part of HPE’s cyber strategy, the company is implementing the National Institute of Standards and Technology (NIST) Cybersecurity Framework and NIST 800-171 to secure HPE’s supply chain. Finally, HPE has cleared support technologists worldwide and secure facilities in order to support the IC globally.

Q: What is the background on HPE’s Enterprise Services spin off to DXC Technology? How will this change HPE?

On April 1, we completed the spin-merge of our Enterprise Services business with Computer Sciences Corp. to form DXC Technology. We believe this was an important move for HPE to create a more focused company dedicated to the solutions our customers and partners tell us they want most.

HPE will retain and continue to invest in Pointnext, its technology services organization, made up of more than 25,000 specialists in 80 countries to support customers across advisory and transformation services, professional services, and operational services. These teams collaborate with businesses worldwide to speed their adoption of emerging technologies, including cloud computing and hybrid IT, big data and analytics, the intelligent edge, and the Internet of Things (IoT).

Q: How is HPE innovating in the GEOINT space?

HPE is innovating across IT, from the core to the edge. One focus area is what we call “hybrid IT.” HPE recognizes some workloads are best deployed in public or private clouds, while others are best deployed in traditional IT infrastructure. Building and helping to create hybrid IT is a core strategy of HPE, since that is what our customers are asking for. To deliver on that strategy, HPE has engineered and built new hardware and software technologies to deliver the same dynamic configuration flexibility and economics of cloud across traditional computing, storage, and networking solutions. This innovation allows our customers to deploy the right workload on the right platform within the right economic model. Most importantly, this directly supports the GEOINT Community’s desire for rapid development and widely-shared apps and data hosted in the cloud while keeping data collection, high performance data processing, and mature workloads on traditional infrastructure.

Another major innovation is in the area of mobility with HPE’s Aruba Wi-Fi hardware and software. The IC now has its community cloud and HPE has worked with the IC to create a National Security Agency-approved way of handling sensitive and classified data over Wi-Fi. While Wi-Fi is likely not appropriate for use everywhere in the IC, it does have its place and its use will grow over time.

Q: What are your thoughts on how IT will transform in the next five years?

A huge change is already underway and will become more apparent in the next several years. If you look at the IT industry since its inception, there have been several tectonic shifts and we are at the beginning of a fourth shift. Now, we are rapidly moving toward a world where everything imaginable has some kind of connectivity and processing. This is the Internet of Things, where processing is decentralized and pushed out to the edge close to where data is created, whether by autonomous cars and planes, smart cities, or sensors adorning nearly every item imaginable. With IoT the number of “users” or data creators could reach the hundreds of trillions and the resulting amount of data generated will grow exponentially.

The computers we rely on today, from smartphones to supercomputers, are hitting a wall in terms of physical size, efficiency, and computing capacity, because today’s computers are based on an architecture that’s more than 60 years old. To address this challenge, HPE envisioned an entirely new computing architecture called “memory-driven computing,” which enables a massive leap in our ability to process data. It allows the development of new ways of extracting knowledge and insights from large, complex data sources. Massive performance gains can be obtained from rethinking and re-architecting how data is processed and analyzed. All of this has huge implications for the IC, allowing the community to leverage the power of the IoT.

Machine learning will cease to be a novelty and will soon become a necessity as the data volumes continue to grow beyond what human eyes can view and analyze. And, the IC will need to learn how to protect its own IoT from exploitation as well as how to exploit the intelligent things deployed by adversaries. For the IC, our adversaries’ secrets hide in plain sight within that ocean of data, and it’s critical they have the systems and know-how to discover those secrets.

Q: What benefits has HPE seen from its USGIF Organizational Membership?

HPE has maintained a great relationship with USGIF. The GEOINT Symposium is one of HPE Federal’s most important shows to attend. The breakout sessions, networking events, and access to senior executives within IC leadership are outstanding. HPE also greatly benefited from attending USGIF’s Powering GEOINT Analytics: Big Data from Small Sats workshop in April at NGA Campus East in Virginia. The theme of collecting data from small satellites was right on target and of great interest to HPE. We see computing at the intelligent edge as a significant area of opportunity for many years ahead.

Red Hat: An Open Platform for Democratized Data http://trajectorymagazine.com/red-hat-open-platform-democratized-data/ Fri, 11 Aug 2017 20:39:43 +0000 http://trajectorymagazine.com/?p=34570 Q&A with Adam Clater, chief architect, public sector

The post Red Hat: An Open Platform for Democratized Data appeared first on Trajectory Magazine.

How has Red Hat’s role evolved since it was founded in the early ’90s?

For a long time, Red Hat endeavored to be a boxed software company—the kind of company where you would buy software along with an agreement for a finite amount of time in a retail channel. That was Red Hat’s goal.

Eventually, we realized we needed to take a hard look at what was actually going on in the software space and the competitive landscape. A variety of Linux distributions were coming onto the market, and it became difficult to differentiate the true value of buying a Red Hat Linux distribution (as it was referred to) as opposed to buying any other Linux distribution or even using a free one.

The decision was made around 2003 to essentially get rid of that boxed software business and to go full enterprise. We wanted to lead the community still in that consumer-hobbyist marketplace, but take all the things that came from the early endeavors and put them into Red Hat Enterprise Linux. Today, we have this bifurcated mode where our community efforts are coalesced via Fedora Linux and our customers use Red Hat Enterprise Linux. This is a model that we’ve found works well for the development of open source software.

What does Red Hat offer with regard to GEOINT services and capabilities?

As you go about building GEOINT capabilities, you want a very consistent baseline for an operating system on which to build that infrastructure. Throughout the Intelligence Community, many GIS providers, and the federal government, Red Hat Enterprise Linux has become the de facto standard for building enterprise applications and mission-critical workloads.

Customers don’t want to have to re-baseline and re-certify their applications every time a new version of an operating system comes out. If I’m building a GIS capability, I want to be able to write that GIS capability on an operating system that I know is going to be supported in the long-term. We support Red Hat Enterprise Linux for 10 to 13 years depending on the subscription. We drive that ability and that value throughout the entire application development space.

Who are some of Red Hat’s current customers?

Red Hat is quite active throughout the entire Intelligence Community, especially for GIS workloads, as well as with other federal agencies. Our customers also include a large share of the banking institutions, airlines, and health care organizations in the Fortune 500.

What differentiates Red Hat from other software companies, specifically in the open source field?

Red Hat is a 100 percent open source company. For every product we sell, the source code is available to our customers. That’s different from a lot of other open source companies that may be pursuing things like open core, and models where they have an open source component, but there’s some sort of software intellectual property that they see as their value-add.

At Red Hat, we see ourselves as the value-add. When you buy Red Hat Enterprise Linux or Red Hat JBoss solutions, we are going to support those technologies by making sure they conform to certain standards—like FIPS 140-2, which is highly important for government agencies, as well as common criteria certification. Red Hat Enterprise Linux is common criteria EAL-4 certified with virtualization and containers, which is valuable for our customers within the DoD and intelligence space.

In the mid-2000s, we partnered with the National Security Agency in the development of security-enhanced Linux. SE-Linux is a core component of how we achieve a lot of those security certifications and it assures we can run multiple security levels on a single operating system. It’s been incredibly valuable within the Intelligence Community for achieving certifications and assuring we are able to implement the security as defined by those organizations.

What GEOINT trends are you seeing right now, and how is Red Hat responding to them?

What’s amazing is that, if you looked at the GEOINT industry 10-15 years ago, you found yourself going to just a handful (or fewer) of key vendors who were the only ones defining how geo-data was being made available, how it was being used, and what you could do with it. Now, we’re witnessing the democratization of that data. It started with fairly simple things like Google Maps. The explosion of capabilities now available in these communities is incredible.

The spread of open source communities has led to that democratization. At Red Hat, we’ve shown the world how to make participation in open source communities both valuable and sustainable. We’ve shown how to do this in an open way and it is core to our creed that we be open and stay open. That’s incredibly valuable when you begin to talk about individuals making contributions of their time and effort to collect, codify, and aggregate data in a single place. There’s an implicit reciprocity that, when I make a contribution of my data or my capability, I’m going to get access to data and capabilities that others have made. Without that, I don’t think we’d have seen the explosion of data and capabilities in the GIS market and in GIS communities that we see today.

Multilingual Intelligence http://trajectorymagazine.com/multilingual-intelligence/ Thu, 08 Jun 2017 03:12:15 +0000 http://trajectorymagazine.com/?p=34139 SDLGov implements machine learning to understand language

The post Multilingual Intelligence appeared first on Trajectory Magazine.

At GEOINT 2017, SDL Government (SDLGov) showcased its platform of commercial-off-the-shelf products and integrated solutions that enable understanding of foreign languages. According to the company, more and more information in the sea of GEOINT content contains foreign languages or informal language, slang, acronyms, and a mixture of dialects—especially in social media.

SDLGov’s translation content management technologies are deployed across federal agencies and into the hands of warfighters. Its technology is embedded within systems and programs that ingest live video and audio, including platforms in theater. As a result, SDLGov enables the translation and understanding of multiple dialects of Arabic, as well as the French dialects spoken in the Sahel region of Africa, Russian social media content, and more.

SDLGov provided demonstrations at its booth of its integrated Machine Translation platform, which ingests multilingual audio and provides near real-time translations. The company also showcased entity extraction from a diverse set of foreign language content.

“Our products and solutions allow geospatial analysts to take control of multilingual content by enabling the centralizing of translation efforts into single, streamlined processes,” said SDLGov CEO S. Danny Rajan. “Through the use of powerful translation memories, terminology managers, and customizable workflows, our translation offerings will reduce the time and cost of translation across a broad range of content types, including social media, signals intelligence, documents, and multimedia.”
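
As a rough illustration of what a translation memory does (reusing previously translated segments when a new segment is similar enough), here is a minimal sketch using Python's standard difflib. The segment pairs, language pair, and similarity threshold are all invented for the example; SDLGov's actual engines are far more capable.

```python
import difflib

# Toy translation memory: previously translated segments and their translations.
# Illustrative only; a production TM stores millions of segment pairs with
# metadata and uses much more sophisticated fuzzy matching.
TM = {
    "road closed ahead": "route barrée devant",
    "meeting at noon": "réunion à midi",
}

def tm_lookup(segment, threshold=0.8):
    """Return (matched segment, stored translation, similarity score) for the
    closest stored segment, or None if nothing is similar enough to reuse."""
    best, best_score = None, 0.0
    for src in TM:
        score = difflib.SequenceMatcher(None, segment.lower(), src).ratio()
        if score > best_score:
            best, best_score = src, score
    if best_score >= threshold:
        return best, TM[best], best_score
    return None
```

An exact repeat scores 1.0 and is reused outright; near-repeats above the threshold are offered to a linguist for review, which is where the time and cost savings come from.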

Accelerating Innovation http://trajectorymagazine.com/accelerating-innovation/ Thu, 08 Jun 2017 03:05:17 +0000 http://trajectorymagazine.com/?p=34153 DigitalGlobe showcased earth imaging, machine learning, and more

The post Accelerating Innovation appeared first on Trajectory Magazine.

At GEOINT 2017, DigitalGlobe demonstrated its advanced imaging capabilities as well as its developing machine learning and open-source software initiatives.

The DigitalGlobe booth offered demonstrations of products such as SecureWatch, a cloud-based imagery interface. DigitalGlobe announced this week the launch of SecureWatch Sites, a monitoring service available within the interface that will make use of new high-resolution imagery from SI Imaging Services’ KOMPSAT-3 and KOMPSAT-3A satellites. Two new products—GeoNews and Human Landscape—will be integrated with SecureWatch as well. GeoNews offers SecureWatch subscribers access to timely global news articles. Human Landscape adds five geospatial information layers to existing DigitalGlobe imagery: political boundaries, military installations, airports, seaports, and other features of interest.

The company also demonstrated GBDX, its big data library and analysis tool suite. A recent agreement between DigitalGlobe and MDA will make a RADARSAT-2 data set available on the GBDX platform, providing users with synthetic aperture radar data for both day and nighttime Earth imaging as well as a change-detection capability.
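
Change detection on SAR imagery is commonly done with a log-ratio of two co-registered acquisitions, which handles SAR's multiplicative speckle noise better than a plain pixel difference. The sketch below illustrates that general technique only; it is not the GBDX API, and the tiny arrays are stand-ins for real scenes.

```python
import numpy as np

# Log-ratio change detection, a standard technique for SAR amplitude imagery.
# The arrays here are hypothetical stand-ins for two co-registered acquisitions.

def sar_change_map(before, after, threshold=1.0, eps=1e-6):
    """Return a boolean mask of pixels whose backscatter changed strongly.

    |log(after / before)| is robust to SAR's multiplicative speckle,
    unlike a simple subtraction of the two images.
    """
    ratio = np.abs(np.log((after + eps) / (before + eps)))
    return ratio > threshold

before = np.array([[1.0, 1.0], [1.0, 1.0]])
after  = np.array([[1.0, 5.0], [1.0, 1.1]])
mask = sar_change_map(before, after)  # only the pixel that jumped 5x flags
```

The threshold and epsilon are illustrative; real workflows calibrate them against speckle statistics and apply spatial filtering before thresholding.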

DigitalGlobe transformed part of its booth into a small theater presenting “DigitalGlobe Live,” a series of 10-minute “Smart Talks” on the company’s offerings.

“DigitalGlobe is funding development to accelerate innovation in machine learning through the SpaceNet Challenge, which [we highlighted at GEOINT 2017] in one of the Smart Talks at our booth,” said Taner Kodanaz, industry leadership champion with DigitalGlobe.

Image courtesy of DigitalGlobe.

Analyzing Cyber Threat http://trajectorymagazine.com/analyzing-cyber-threat/ Wed, 07 Jun 2017 05:25:34 +0000 http://trajectorymagazine.com/?p=34036 Tom Sawyer Software demonstrates new products alongside Oracle

The post Analyzing Cyber Threat appeared first on Trajectory Magazine.

Tom Sawyer Software (Booth 1526) provides organizations with sophisticated graph and data visualization for geospatial intelligence analysis. The company works with defense systems integrators and U.S. federal customers on mission-critical projects.

“Our company name, Tom Sawyer Software, says a lot about our philosophy and what it means to try to build a great company—it’s a long journey down a winding river,” said CEO Brendan Madden. “Great things don’t come easily; we believe in building something that lasts.”

The company’s flagship product is Tom Sawyer Perspectives, which helps clients analyze cyber threats and criminal activity on a global scale. U.S. federal customers have used the technology for several years to filter, visualize, and analyze more than one trillion entities—people, events, places, and activities—from foreign and domestic sources.

At its GEOINT 2017 booth, Tom Sawyer Software highlights how to visualize, navigate, and analyze GEOINT data in desktop and web-enabled applications. Additionally, Oracle (Booth 1939) and Tom Sawyer Software are both showcasing a cyber threat analysis solution jointly developed by the two companies.

Tom Sawyer Software is also highlighting its new Tom Sawyer Maps functionality that combines the power of the company’s existing rule-based drawing views and the OpenLayers map library.

Image courtesy of Tom Sawyer Software.

Bringing GEOINT to the Cloud http://trajectorymagazine.com/bringing-geoint-cloud/ Sun, 04 Jun 2017 20:01:39 +0000 http://trajectorymagazine.com/?p=33711 Hexagon U.S. Federal to demonstrate real-time analytics with espionage game

The post Bringing GEOINT to the Cloud appeared first on Trajectory Magazine.

Hexagon U.S. Federal (Booth 1051), which recently rebranded from Intergraph Government Services, has more than three decades of experience serving the GEOINT Community.

The company, a subsidiary of Hexagon AB, offers imaging software, data integration, asset management, and information analytics to its government customers.

“We can bring more solutions to bear on a specific problem than any one company,” said Rob Mott, Hexagon U.S. Federal’s vice president of geospatial solutions.

The company’s theme for its GEOINT 2017 booth is “bringing GEOINT to the cloud.” Hexagon U.S. Federal highlights a number of emerging technologies, including a cloud-based exploitation solution consisting of an analysis platform called M.App eX and a web application for image and data management called Web GLT.

The company is also demoing a real-time analytics capability using what Mott referred to as a “convergence platform.” This comes in the form of the Global Espionage Challenge: a game in which Symposium attendees try to “catch an enemy spy.” Participants answer a series of geography-oriented questions via text message each day, and their responses are processed by the convergence platform. Real-time analysis displays the results on a global map in the Hexagon U.S. Federal booth. The map will reveal a daily winner based on how close the participant got to identifying the location of the enemy “spy.” Each daily winner will be awarded an Apple Watch Series 2.
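
Scoring "how close" a guess landed comes down to great-circle distance, and the haversine formula is the standard way to compute it from two latitude/longitude pairs. The sketch below is a generic illustration with invented coordinates and player names, not Hexagon's actual scoring logic.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Rank guesses by distance to the hidden "spy" location (illustrative data).
spy = (48.8566, 2.3522)  # e.g., Paris
guesses = {"alice": (51.5074, -0.1278), "bob": (48.85, 2.35)}
winner = min(guesses, key=lambda g: haversine_km(*guesses[g], *spy))
```

The closest guess wins; ties and daily resets would be handled by the surrounding game logic.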

Additionally, the company is showcasing a web-based GIS technology called GeoMedia SmartClient. According to Mott, this is more than just a virtual desktop of GIS applications—it’s a highly configurable interface meant for mobile, field-based users that need to collect and input data along stringent guidelines and quality checks, often in low-bandwidth areas.

“We’re most looking forward to hearing from thought leaders about emerging trends and key challenges,” Mott said. “You really get a sense, through tone and passion for certain topics, what’s important to them—that’s the most important takeaway for us. It helps us shape our future strategies.”

Photo courtesy of Hexagon U.S. Federal.

Weekly GEOINT Community News http://trajectorymagazine.com/weekly-geoint-community-news-4/ Mon, 17 Apr 2017 19:03:19 +0000 http://trajectorymagazine.com/?p=31986 NGA Seeks Proposals for Airborne SAR BAA; HPE Software Achieves Information Processing Standard; Astro Digital Secures $16.65M in Series A Funding

The post Weekly GEOINT Community News appeared first on Trajectory Magazine.

NGA Seeks Proposals for Airborne SAR BAA

The National Geospatial-Intelligence Agency (NGA) seeks proposals for platforms designed to enhance image processing used with airborne synthetic aperture radar (SAR) by expanding the types and number of SAR sensor data sources that can be integrated with the GEOINT standard. According to the agency, the solicitation falls under the fifth topic of the agency’s three-year Boosting Innovative GEOINT (BIG) Broad Agency Announcement (BAA) initiative. The target value for the contract award is not to exceed $430,140 with a performance period of one year or less. Proposals are due May 1.

HPE Software Achieves Information Processing Standard

Hewlett Packard Enterprise (HPE) announced its HPE SecureData software achieved the Federal Information Processing Standard (FIPS) 140-2 validation of Format-Preserving Encryption (FPE). HPE SecureData with Hyper FPE delivers a NIST-standardized method of protecting data at-rest, in-motion, and in-use, and maintains the format, meaning, value, and logic in the data.
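
"Format-preserving" means the ciphertext keeps the shape of the plaintext (a 16-digit number encrypts to another 16-digit number), so databases and downstream systems need no schema changes. The toy keyed digit substitution below illustrates only that property; it is not HPE's Hyper FPE, which implements a NIST-standardized construction (SP 800-38G defines the FF1 mode), and it is not secure in any way.

```python
import random

# Toy illustration of *format preservation* only: a keyed digit substitution
# maps each digit to another digit, so length and character class survive.
# Real FPE is a very different, secure construction; never use this sketch
# for actual data protection.

def digit_permutation(key):
    """Derive a deterministic digit-to-digit mapping from a key string."""
    digits = list("0123456789")
    random.Random(key).shuffle(digits)
    return {str(i): digits[i] for i in range(10)}

def toy_encrypt(number, key):
    table = digit_permutation(key)
    return "".join(table[d] for d in number)

def toy_decrypt(cipher, key):
    table = {v: k for k, v in digit_permutation(key).items()}
    return "".join(table[d] for d in cipher)

card = "4111111111111111"          # 16-digit input...
enc = toy_encrypt(card, "secret-key")  # ...encrypts to another 16-digit string
```

The point is the invariant: `enc` is still a string of 16 digits, so a column typed for card numbers can hold it unchanged while the plaintext stays protected elsewhere.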

Astro Digital Secures $16.65M in Series A Funding

Astro Digital announced it has secured $16.65 million in Series A funding. The funding will accelerate the launch cycle for Astro Digital’s Landmapper constellation and development of the company’s existing analytics platform, and in turn will enable the company to provide almost daily imaging of the globe by the end of 2017.

Peer Intel

Dewberry promoted Dan Bubser to senior associate at the firm’s Tampa office. Bubser has more than 16 years of experience supporting remote sensing, geographic information systems, photogrammetry, and photo interpretation services for local, state, and federal clients.

Photo Credit: NASA JPL
