Weekly GEOINT Community News http://trajectorymagazine.com/weekly-geoint-community-news-29/ Mon, 30 Oct 2017: NGA Seeks Cloud Development Support; Engility Wins DIA Contract; IARPA Announces Updates to Functional Map of the World Challenge; ESA and Radiant.Earth Partner to Support Sustainable Development

NGA Seeks Cloud Development Support

The National Geospatial-Intelligence Agency (NGA) is seeking unclassified cloud development vendors to deliver support to 40 individuals in St. Louis, Mo. The agency aims to award a one-year base contract with four option years. Responses to the notice are due Oct. 31.

Engility Wins Defense Intelligence Agency Contract

Engility won a digital forensics contract renewal with the Defense Intelligence Agency (DIA). Engility will analyze multimedia files extracted from digital devices to support DIA missions across the defense, intelligence, and law enforcement communities, and will provide IT infrastructure including cloud solutions and enterprise technology. DIA first awarded the contract in 2011. The $14 million renewal has a one-year base and four option years.

IARPA Announces Updates to Functional Map of the World Challenge

The Intelligence Advanced Research Projects Activity (IARPA) announced the scoring process for its Functional Map of the World Challenge, which seeks algorithms to detect and label points of interest in satellite imagery. Participants will be scored based on their ability to accurately categorize portions of imagery and can submit solutions through Dec. 31. In concert with this announcement, IARPA released one of its largest annotated imagery datasets that participants will use to train and test their algorithms.

ESA and Radiant.Earth Partner to Support Sustainable Development

The European Space Agency and Radiant.Earth announced a partnership to better track progress toward the United Nations’ Sustainable Development Goals (SDGs). The partners will work to strengthen data literacy, using shared platforms and satellite imagery to analyze SDG objectives.

CACI Awarded Army ISR Task Order

CACI was awarded a $91 million task order to provide support to the U.S. Army Communications-Electronics Research, Development, and Engineering Center Flight Activity. This four-year task order, awarded under the Rapid Response-Third Generation contract vehicle, represents continuing work in the company’s surveillance and reconnaissance market area.

Berico Technologies Wins Place on Multi-Award NGA Contract

Berico Technologies won a place on an analytical support contract for the National Geospatial-Intelligence Agency (NGA). Berico is one of several partners who will provide data science analysis and collection management support to NGA and other government agencies. The contract is worth a combined maximum of $977.6 million.

Photo Credit: ESA

Weekly GEOINT Community News http://trajectorymagazine.com/weekly-geoint-community-news-27/ Mon, 16 Oct 2017: NGA Awards Leidos IT Management Contract; DXC Reaches Three-Way Deal with Vencore and Keypoint; Northern California Wildfire Relief; Vricon Awarded USSOCOM Contract; Engility Wins DoD Security Contract; & More

NGA Awards Leidos IT Management Contract

The National Geospatial-Intelligence Agency (NGA) awarded Leidos a prime contract under the Information Technology Enterprise Management Services (ITEMS) User Facing Services (UFS) program. The contract is a single-award, five-year IDIQ agreement valued at up to $988 million. Leidos will take over management of NGA’s user-facing IT services and will transition UFS work away from current NGA contracts.

DXC Reaches Three-Way Deal with Vencore and Keypoint

To accommodate a three-way deal with Veritas Capital’s Vencore and KeyPoint Government Solutions, DXC Technology is spinning out its U.S. public sector business. The three businesses will combine to create a new publicly traded government contracting company. The merger is expected to close by March 31, 2018. The resulting company will focus on cyber, big data analytics, cloud computing, and enterprise IT.

Northern California Wildfire Relief

The GEOINT Community is using data to help fight the devastating wildfires in Northern California. DigitalGlobe released high-resolution imagery of affected areas in Santa Rosa, as well as a more extensive collection of before-and-after imagery available to emergency responders on the Santa Rosa wildfires page.

Esri released an interactive ArcGIS map visualizing active fire locations, traffic alerts, and other near-real-time data from the U.S. Department of Agriculture, USGS, and Waze. The map provides an accurate, up-to-date picture of the situation on the ground in California.

Vricon Awarded USSOCOM Contract

U.S. Special Operations Command awarded Vricon Systems a $1,271,765 sole source contract for commercial data and software testing. Under the contract, Vricon will work to increase the resolution and accuracy of SOCOM’s 3D geospatial data and will automate workflows to reduce time-consuming manual work.

Engility Wins DoD Security Contract 

Engility announced it will partner with the DoD to protect critical acquisition processes and weapons systems from foreign hacking. Engility will address existing vulnerabilities and mitigate future cyber attacks primarily in support of the DoD’s Damage Assessment Management Office and Joint Acquisition Protection and Exploitation Cell. The contract has a one-year base with four option years and is valued at $28 million. 

PAR Technology Subsidiary Awarded Air Force Subcontract

Rome Research Corporation, a subsidiary of PAR Technology, was awarded a subcontract from Croop-LaFrance to provide client support services to the U.S. Air Force’s 72nd Air Base Wing Communications Directorate at Tinker AFB, Okla. The contract is valued at $11.9 million and has a one-year base with up to four option years.

University of Missouri Uses Deep Learning to Detect Chinese Missile Sites

The University of Missouri’s Center for Geospatial Intelligence, a USGIF-accredited school, used machine learning to help human analysts parse large volumes of imagery in search of surface-to-air missile sites in southeast China. This deep learning approach delivered an average search time of 42 minutes over a 90,000-square-kilometer area—more than 80 times faster than human visual search. The research study was published in a special issue of the SPIE Journal of Applied Remote Sensing.

Peer Intel

Northrop Grumman named Kathy Warden its next president and COO following the retirement of current president and COO Gloria Flach. Additionally, corporate VP of government relations Sid Ashworth will retire at the end of this year. Mark Caylor will become corporate VP and president for mission systems, Shawn Purvis will become corporate VP and president for enterprise services, and Lesley Kalan will become corporate VP for government relations. Warden will oversee these three branches of operation as well as the integration of Orbital ATK once the merger is complete.

Photo Credit: DigitalGlobe

HPE: Revolutionizing IT http://trajectorymagazine.com/hpe-revolutionizing-it/ Wed, 16 Aug 2017: Q&A with Ken Bruni, director, Advanced Programs Group; and Howard Clifford, distinguished technologist

Q: How does HPE support the Intelligence Community (IC)?

Hewlett Packard Enterprise (HPE) has had a strong, long-term relationship with the IC, engineering and building information technology (IT) and providing consulting services in support of its unique and challenging missions. Additionally, HPE has worked with the IC to define technologies and techniques to address cyber vulnerabilities such as the advanced persistent threat, and as a result has created highly secure IT infrastructure. As part of its cyber strategy, the company is implementing the National Institute of Standards and Technology (NIST) Cybersecurity Framework and NIST 800-171 to secure HPE’s supply chain. Finally, HPE maintains cleared support technologists and secure facilities worldwide in order to support the IC globally.

Q: What is the background on HPE’s Enterprise Services spin off to DXC Technology? How will this change HPE?

On April 1, we completed the spin-merge of our Enterprise Services business with Computer Sciences Corp. to form DXC Technology. We believe this was an important move for HPE to create a more focused company dedicated to the solutions our customers and partners tell us they want most.

HPE will retain and continue to invest in Pointnext, its technology services organization, made up of more than 25,000 specialists in 80 countries to support customers across advisory and transformation services, professional services, and operational services. These teams collaborate with businesses worldwide to speed their adoption of emerging technologies, including cloud computing and hybrid IT, big data and analytics, the intelligent edge, and the Internet of Things (IoT).

Q: How is HPE innovating in the GEOINT space?

HPE is innovating across IT, from the core to the edge. One focus area is what we call “hybrid IT.” HPE recognizes some workloads are best deployed in public or private clouds, while others are best deployed in traditional IT infrastructure. Building and helping to create hybrid IT is a core strategy of HPE, since that is what our customers are asking for. To deliver on that strategy, HPE has engineered and built new hardware and software technologies to deliver the same dynamic configuration flexibility and economics of cloud across traditional computing, storage, and networking solutions. This innovation allows our customers to deploy the right workload on the right platform within the right economic model. Most importantly, this directly supports the GEOINT Community’s desire for rapid development and widely shared apps and data hosted in the cloud while keeping data collection, high-performance data processing, and mature workloads on traditional infrastructure.

Another major innovation is in the area of mobility with HPE’s Aruba Wi-Fi hardware and software. The IC now has its community cloud and HPE has worked with the IC to create a National Security Agency-approved way of handling sensitive and classified data over Wi-Fi. While Wi-Fi is likely not appropriate for use everywhere in the IC, it does have its place and its use will grow over time.

Q: What are your thoughts on how IT will transform in the next five years?

A huge change is already underway and will become more apparent in the next several years. If you look at the IT industry since its inception, there have been three tectonic shifts, and we are at the beginning of a fourth. Now, we are rapidly moving toward a world where everything imaginable has some kind of connectivity and processing. This is the Internet of Things, where processing is decentralized and pushed out to the edge close to where data is created, whether by autonomous cars and planes, smart cities, or sensors adorning nearly every item imaginable. With IoT, the number of “users” or data creators could reach the hundreds of trillions, and the resulting amount of data generated will grow exponentially.

The computers we rely on today, from smartphones to supercomputers, are hitting a wall in terms of physical size, efficiency, and computing capacity, because today’s computers are based on an architecture that’s more than 60 years old. To address this challenge, HPE envisioned an entirely new computing architecture called “memory-driven computing,” which enables a massive leap in our ability to process data. It allows the development of new ways of extracting knowledge and insights from large, complex data sources. Massive performance gains can be obtained from rethinking and re-architecting how data is processed and analyzed. All of this has huge implications for the IC, allowing the community to leverage the power of the IoT.

Machine learning will cease to be a novelty and will soon become a necessity as the data volumes continue to grow beyond what human eyes can view and analyze. And, the IC will need to learn how to protect its own IoT from exploitation as well as how to exploit the intelligent things deployed by adversaries. For the IC, our adversaries’ secrets hide in plain sight within that ocean of data, and it’s critical they have the systems and know-how to discover those secrets.

Q: What benefits has HPE seen from its USGIF Organizational Membership?

HPE has maintained a great relationship with USGIF. The GEOINT Symposium is one of HPE Federal’s most important shows to attend. The breakout sessions, networking events, and access to senior executives within IC leadership are outstanding. HPE also greatly benefited from attending USGIF’s Powering GEOINT Analytics: Big Data from Small Sats workshop in April at NGA Campus East in Virginia. The theme of collecting data from small satellites was right on target and of great interest to HPE. We see computing at the intelligent edge as a significant area of opportunity for many years ahead.

Lt. Joseph Flynn: Making GEOINT Connections http://trajectorymagazine.com/lt-joseph-flynn-making-geoint-connections/ Wed, 16 Aug 2017: How USGIF Membership is opening new doors for one Fairfax County law enforcement officer


Lt. Joseph Flynn, Fairfax County Police

Lt. Joseph Flynn is assistant commander of the Fairfax County Police Department’s Criminal Intelligence Division and deputy director of the Northern Virginia Regional Intelligence Center (NVRIC).

In 24 years with the Fairfax County Police Department, Flynn has held many roles—patrol officer, air and SWAT paramedic, and more. He has also held leadership positions in case and branch management. Recently, Flynn was elected chair of the Metropolitan Washington Council of Governments’ Subcommittee for Intelligence. Flynn has been a USGIF Individual Member since fall 2016.

Q: What led you to become a USGIF Individual Member?

When I transferred into the Criminal Intelligence Division, one of the big things I noticed was NVRIC analysts and staff were very isolated. The NVRIC itself has numerous analysts—cyber, critical infrastructure and key resources, threat assessment, gangs, narcotics. I wanted to see what else was out there in the intelligence world—other organizations or groups we could tap into to possibly expand our resources. Through the good graces of Google, USGIF came up.

I learned USGIF was very involved with the Defense Department and the federal side of geospatial technology, so I reached out to see if they would allow U.S. law enforcement into the organization and be interested in partnering with law enforcement agencies. My email received a prompt response from USGIF CEO Keith Masback, and he actually visited us with USGIF staff. They spoke with our analysts and our commander to explain what the Foundation does and to share more about some of the outlets they could provide to us. They also wanted to learn about the trends law enforcement is following with regard to GEOINT. I wanted my analysts to have opportunities for networking and outreach and to see other technologies out there that they may be unaware of.

Q: How do Fairfax County police use GEOINT to prevent crime and protect the community?

We have several different layers of crime analysts throughout the county. A lot of stuff we do is related to GPS search warrant information we’re allowed to receive from that type of data dump. We also use a lot of cellphone tower information when dealing with specific cases. For plotting information, we use a system called Tableau to highlight where events are happening.

There are two avenues we go down with geospatial information. The first is plotting an event and the historical marker of it. That information is used to help highlight, for example, whether the event occurred in a high accident traffic area. Then we’d push our efforts that way. Or to determine whether the event occurred in an area with high gang activity. And we’d push our activity that way. We break it down into the specifics of the crime and then determine what resources we’re going to direct to that area to help reduce crime and have more of a presence.

The second part of our geospatial aspect is plotting evidence data to reveal a timetable of how an individual person is moving and discover correlation between one or more targets to determine if there’s a relationship. This is where companies are starting to come to us to see if they can help or if we can help them with a product that performs the geospatial evidentiary role.

Q: What advice do you have for students and young professionals hoping to join or who recently joined the law enforcement community?

Sit back and determine what type of law enforcement you want to do. I enjoyed starting my career as a beat cop, going out, pounding the street, driving, and meeting people and investigating certain levels of crime. There are those who want to go straight into working the crime scene processing or the forensics. You also have people who don’t want to get their hands dirty but are very analytical and think deeply—the people that can correlate and see the bigger picture and bring it into perspective. Decide whether that’s something you want to go into. Also, technology is still big in all aspects; you have to get very comfortable with the current technology and always think forward. If you’re not doing that, you’re going to handcuff yourself from advancing your career and your abilities.

Q: How have you benefited from USGIF Membership?

Professionally, it’s opening up eyes and doors. There are opportunities for law enforcement intelligence folks to meet and network with people who are experts in the field and are willing to assist us. I’m bringing geospatial intelligence specialists into NVRIC to talk with our analysts and to see how the workflows go and how they set their goals. Then, we can ask those outside groups for advice on how we can improve. USGIF is opening doors for us to people and technologies that we may not have thought of in the past.

Weekly GEOINT Community News http://trajectorymagazine.com/weekly-geoint-community-news-12/ Mon, 03 Jul 2017: Woolpert to Provide EPA with Geospatial Infrastructure Support; Harris Produces Weather Satellite Instrument for South Korea; Altran and Luciad to Work with Dassault Aviation; Unmanned Underwater Vehicles Market to be Worth $5.2 Billion by 2022; Peer Intel

Woolpert to Provide EPA with Geospatial Infrastructure Support

Woolpert will supply the U.S. Environmental Protection Agency (EPA) with geographic information system (GIS) services, remote sensing data, and related consulting services in support of a five-year infrastructure information technology contract with CSRA. The EPA will consolidate services to include data center management, application hosting, application deployment and maintenance, geospatial service support, network security, cybersecurity, cloud computing, continuity of operations services, and more. The contract is valued at $266 million.

Harris Produces Weather Satellite Instrument for South Korea

Harris Corp. delivered an advanced digital weather satellite instrument to the Korea Aerospace Research Institute, helping forecasters protect people in the region from severe weather. The Harris-built Advanced Meteorological Imager (AMI) will be integrated into the next-generation GEO-KOMPSAT-2A weather satellite, scheduled to launch in 2018. According to the Harris press release, AMI will deliver images with three times more data and four times the resolution at refresh rates five times faster than currently available in the region.

Altran and Luciad to Work with Dassault Aviation

French aerospace company Dassault Aviation has partnered with Altran and Luciad to design Dassault’s Mission Preparation Systems, known as OPERA, for its RAFALE and Mirage 2000 fighter jets. OPERA is used for training simulations and combat missions. The system prepares pilots by supplying them with aerial photographs, satellite images of terrain, aeronautical charts, 2D/3D topography, and other tactical and geographic data. OPERA then converts the data to a dedicated format for uploading to onboard mission computers.

Unmanned Underwater Vehicles Market to be Worth $5.2 Billion by 2022

The Unmanned Underwater Vehicles market is estimated at $2.69 billion in 2017 and is projected to reach $5.2 billion by 2022. This research was published in the report “Unmanned Underwater Vehicles (UUV) Market by Type (Remotely Operated Vehicle & Autonomous Underwater Vehicles), ROV & AUV Market by Application, Product, Propulsion System, Payload, and Region – Global Forecasts to 2022.” The growth of the market can be attributed to the rising number of deep-water offshore oil and gas production activities and increasing maritime security threats.

Peer Intel

President Trump nominated Susan M. Gordon to serve as the next Principal Deputy Director of National Intelligence. Gordon is currently deputy director at the National Geospatial-Intelligence Agency and previously spent much of her career with the CIA.

USGIF CEO Keith J. Masback was appointed to the International Spy Museum’s Advisory Board of Directors. The board is comprised of leading intelligence experts, scholars, and practitioners.

Defense Intelligence Agency Director Lt. Gen. Vincent Stewart was nominated to become deputy commander of U.S. Cyber Command. Stewart formerly served as the head of Marine Forces Cyber.

Photo Credit: Luciad

Service-Driven Culture http://trajectorymagazine.com/service-driven-culture/ Thu, 08 Jun 2017: InTec provides expert tech consulting and management to federal agencies

InTec is a veteran-owned small business providing full-service IT consulting to the Intelligence Community and DoD. InTec formed in 2004 with a single technical contributor on contract with the National Geospatial-Intelligence Agency (NGA) and has since expanded its capabilities to include geospatial data analysis, training development, and program management.

According to InTec Senior Vice President and COO Nate Copeland, the company’s focus at GEOINT 2017 was simple. Rather than demo a new novelty application or gadget, InTec promoted its service-driven corporate culture and customer-centric approach.

InTec primarily employs professionals who have dedicated much of their career to supporting federal intelligence agencies. The company prides itself on maintaining what Copeland refers to as a “non-bureaucratic environment” in which trained, high-value experts continue their service.

“We’ve been nationally recognized for that focus and continue to grow because our customers benefit from happy, focused, and dedicated industry business partners,” Copeland said.

Image courtesy of InTec.

Building Better Security http://trajectorymagazine.com/building-better-security/ Sun, 04 Jun 2017: AECOM advocates “building information modeling and management” to increase efficiency, security at IC facilities

To collect, analyze, and distribute information effectively, the Intelligence Community (IC) needs not only talented and dedicated people, but also secure and efficient facilities, according to multinational engineering firm AECOM (Booth 425). At GEOINT 2017, AECOM is showcasing the tool it says is best positioned to help the IC optimize its physical infrastructure: building information modeling and management, otherwise known as BIM.

“A BIM model is the digital representation of physical and functional characteristics of a built facility or structure,” explained Stuart Harrison, senior vice president leading the Infrastructure & Engineering sector for AECOM’s Management Services Group. “The BIM model is therefore a drawing—usually three-dimensional—with a digital library of product information embedded within each element. Because the model contains such a widespread array of information, it creates a shared knowledge resource about the building or structure and forms a reliable basis for decisions not only during the early design and construction phases, but also during the operational stage.”

AECOM is illustrating BIM’s value to the IC by demonstrating digital models it has built for two high-value facilities currently using BIM to securely monitor, manage, and upgrade their infrastructure: Denver International Airport in Colorado and the Sydney Opera House in Australia.

“Building a BIM model is not like drawing lines on paper or on AutoCAD and assigning information to them,” Harrison concluded. “BIM modeling is more analogous to constructing a LEGO model of individual bricks, each containing within them information on their size, connections, technical specifications, costs, etc. With AutoCAD you draw lines. With BIM you assemble components. And it is these components that contain a family of information.”
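
Harrison’s LEGO analogy maps naturally onto a component data structure. As a rough sketch (our illustration, not AECOM software; every class and field name is invented), a BIM element can be modeled as a part carrying its own embedded product information, so a model becomes an assembly of self-describing components rather than bare geometry:

```python
from dataclasses import dataclass, field

@dataclass
class BIMComponent:
    """One 'brick' in a BIM model: geometry plus embedded product data."""
    element_id: str
    category: str                      # e.g., "door", "duct", "wall panel"
    dimensions_mm: tuple               # (width, height, depth)
    connections: list = field(default_factory=list)  # IDs of adjoining elements
    spec: dict = field(default_factory=dict)         # technical specifications
    cost_usd: float = 0.0

# Assembling a model is composing components, not drawing lines:
door = BIMComponent("D-101", "door", (900, 2100, 45),
                    spec={"fire_rating": "FD30", "manufacturer": "ACME"},
                    cost_usd=420.0)
wall = BIMComponent("W-014", "wall panel", (3000, 2700, 150),
                    connections=["D-101"])

model = {c.element_id: c for c in (door, wall)}

# The embedded data supports decisions long after construction,
# e.g., totaling replacement cost for one category:
doors_cost = sum(c.cost_usd for c in model.values() if c.category == "door")
print(doors_cost)
```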

Photo courtesy of AECOM

Return on Interoperability: The Benefits of Horizontal Integration http://trajectorymagazine.com/return-interoperability-benefits-horizontal-integration/ Wed, 03 May 2017: Collaborating across organizations to make the knowable known

Interoperability is more than an IT buzzword. Investment in interoperability will yield significant returns for the defense intelligence enterprise.

“There are more things to do than we have money to go and do them with,” said Todd Probert, vice president of mission sustainment and modernization at Raytheon. “So, we have to make the best use of what we have, including not only data, but also all the underpinnings that allow maximum use of that data.”

It’s a simple concept: As sharing increases, duplication and spending decrease. Efficiency, meanwhile, surges. That leads to an even more important benefit of interoperability: speed, which is a key tenet of the Third Offset, the Pentagon’s strategy to ensure the long-term competitive advantage of the U.S. military.

“The principle of the Third Offset is really important,” Probert said. “A tenet of the offset is speed, and you can’t have speed if you don’t have the ability to talk to each other.”

Communicating standardized data via shared systems also bears critical mission fruit. When ISR systems are interoperable, for example, they can accept and integrate a host of different inputs, giving intelligence analysts access to a more holistic picture that enables better and faster decision-making, according to Sean Love, director of business development at Northrop Grumman.

“When you’re bringing together the diversity of an imagery analyst with the expertise of a SIGINT analyst with the reach of a HUMINT analyst, all of a sudden you’ve got a picture that is painted a little more clearly and a lot more rapidly,” Love said.

Consider, for example, a combat scenario in which SIGINT sensors detect potential activity from enemy forces. Interoperability ensures SIGINT sensors can cue IMINT sensors on a separate platform to confirm the presence of hostile forces before a bomb is dropped.

“Interoperability is only possible if those two sensors know about each other, if they have a data format that’s compatible, and if they have the ability to communicate with one another,” Love explained.
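
Love’s three conditions (mutual awareness, a compatible data format, and a communication path) can be sketched in a few lines of code. This is a hypothetical illustration, not any fielded system; all names are invented:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Minimal shared data format both sensors agree on."""
    source: str        # originating sensor ID
    lat: float
    lon: float
    confidence: float  # 0.0 to 1.0

class SensorRegistry:
    """Mutual awareness: sensors discover each other here."""
    def __init__(self):
        self._sensors = {}

    def register(self, sensor_id, handler):
        self._sensors[sensor_id] = handler

    def cue(self, target_id, detection):
        # The communication path: hand the detection to another sensor.
        self._sensors[target_id](detection)

registry = SensorRegistry()

def imint_handler(det):
    # An IMINT platform tasked to confirm the SIGINT hit.
    print(f"Tasking imagery at ({det.lat}, {det.lon}) from {det.source}")

registry.register("imint-1", imint_handler)

# A SIGINT sensor detects possible hostile activity and cues imagery:
hit = Detection(source="sigint-7", lat=34.52, lon=69.17, confidence=0.8)
registry.cue("imint-1", hit)
```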

What makes the intelligence picture truly complete isn’t merely that sensors are interoperable; it’s that the services are, too.

“The idea of collaborating across organizations is something that enables us to make the knowable known,” said retired Marine Col. Phillip Chudoba, assistant director of intelligence at Marine Corps headquarters. “Sometimes, there is specialized intelligence work that I need right now; I should not have to produce that myself if it already exists somewhere else. Having that kind of analytic and production transparency across organizational boundaries is incredibly powerful in an environment where decision-making has to be supported rapidly.”

Capt. Jeffrey Czerewko, who serves in the Navy’s newly formed Office of Digital Warfare within the Office of the Chief of Naval Operations, echoed support of joint intelligence.

“Being able to be interoperable across the services increases our capacity, obviously,” Czerewko said. “And in certain cases it increases our capability—especially for niche intelligence collection requirements.”

When that increased capability reaches the tactical edge, the case for interoperability is clear.

“In a strike group, I need to sense the environment in a distributed manner,” Czerewko concluded. “With an analytic engine [that’s interoperable] I get the ability to do a fairly effective first-pass look. That provides a deep value to the leading edge because you get more effective intelligence far forward in a more timely manner. I see the enemy and I have a decent idea now what their intent is at the forward edge.”


Embracing the Enterprise http://trajectorymagazine.com/embracing-the-enterprise/ Wed, 03 May 2017: After three decades of incremental integration, the defense intelligence community is leveraging IT advancements to pursue a new era of interoperability and agility

In modern business parlance, “stovepipe” is a dirty word. Believe it or not, however, stovepipes used to be useful. When they became commonplace in the 19th century, wood-burning stoves were connected to literal stovepipes that drew smoke out of the stove’s belly into a flue or chimney, which coughed it into the sky. In business and government, figurative stovepipes likewise move information from the bottom of an organization to the top. Like smoke from a wood-burning stove, the information flows upward to senior leaders, then out in the form of streamlined decision-making. Because it’s rigid and linear, the stovepipe promotes security, ensures accountability, and reinforces the chain of command, all of which can yield benefits in highly regimented organizations.

There’s just one problem: Stovepipes only flow in one direction. If you’re trying to move smoke through a chimney, that’s ideal. If you’re trying to move information through an enterprise, however, it’s problematic, as vertical processes are prone to inefficiency, duplication, and myopia. In that case, stovepipes don’t always eliminate smoke; often, they create more of it.

The Department of Defense (DoD) came to this realization when it transitioned from analog to digital imagery for airborne intelligence, surveillance, and reconnaissance (ISR), said Ralph Wade, a former Air Force imagery analyst and now vice president of Booz Allen Hamilton’s Strategic Innovation Group.

“The technology for digital sensors came about in the mid-1980s, when electronic communication made it possible to send information digitally from an airplane to a ground station in near real time,” explained Wade, who served as program manager for one of the first and largest such ground stations. “It was a huge increase in capability.”

Marines assigned to Special Operations Task Force-West in Herat province, Afghanistan, review images taken by the team during an area assessment. DCGS-MC provides Marine intelligence analysts capabilities for enterprise search, content discovery, collaboration, and workflow management. Photo credit: U.S. Marine Corps Staff Sgt. Rasheen A. Douglas

It was also a huge increase in cost, as each newly acquired platform in turn acquired its own dedicated data link and ground station.

“What you started seeing in the late 1980s and early 1990s was a proliferation of platforms with one-of-a-kind ground stations that were stovepiped,” Wade said. “Every time you wanted to put a sensor onboard an airplane, you had people reinventing the wheel by building custom systems. Congress looked at that and began challenging the Department of Defense: Why aren’t we getting more commonality?”

When Operation Desert Storm exposed a need for more and better imagery, the DoD began asking itself the same question. And when Congress subsequently reduced defense spending under President Bill Clinton, it felt compelled to answer it.

“Budgets were being cut and ISR was on the chopping block because … many of the services at that time didn’t see ISR as their core mission,” Wade recalled. “At the same time, a lot of new technology was coming along—particularly, unmanned vehicles—that wasn’t getting enough attention.”

To protect and prioritize airborne ISR funding, in 1993 the DoD created the Defense Airborne Reconnaissance Office (DARO) to develop and acquire department-wide airborne ISR capabilities.

The objective is the same now as it was then: interoperability. And it’s getting nearer every day, thanks to ongoing horizontal integration efforts within and among the services.

Embracing the Enterprise

Although the business case for interoperability is clear today, it wasn’t always apparent at the outset of DARO. Fortunately, the Air Force had already sown the seeds.

“It started somewhat by accident,” recalled Col. Jason Brown, commander of the Air Force’s 480th ISR Wing. “When digital imagery platforms came about, the Air Force put a digital imagery sensor on the U-2 so that the ones and zeros, if you will, would go down to a ground station … They later decided to put signals intelligence sensors on the same U-2, which went down to the same ground station. So, here you had imagery analysts and signals intelligence analysts all working in the same spot.”

It was an unorthodox but effective arrangement.

“You had folks who didn’t normally work together working together, which was a very powerful capability,” Brown continued.

DCGS was the first attempt at saying: When data comes off a sensor and gets processed, it’s got to be made ‘enterprise-able’—which means, it’s got to be made for use by all.

—John Snevely, DCGS FoS Leader, OUSD(I)

When the Cold War ended, the Air Force saw an opportunity to exploit that capability even further. Therefore, in 1992 it established the Contingency Airborne Reconnaissance System (CARS). Encompassing mobile ground stations in deployable vans, the system migrated in 1994 to a trio of permanent shelters that collected, processed, and exploited data from multiple airborne ISR platforms, then distributed it through a federated architecture to sites across the globe. In 1996, the permanent shelters—Distributed Ground Stations 1, 2, and 3—became known holistically as the Air Force Distributed Common Ground System (AF DCGS), which now includes 27 regionally aligned, globally networked sites around the world.

By leveraging multi-source inputs and federated architecture, AF DCGS had solved the problems posed by stovepiped Air Force ground stations. As a result, the DoD sought sister systems to work in the same fashion across all services. And so was born the Distributed Common Ground System (DCGS) Family of Systems (FoS), consisting of AF DCGS, DCGS-A, DCGS-N, DCGS-MC, and DCGS-SOF—belonging to the Air Force, Army, Navy, Marine Corps, and Special Operations Forces, respectively—each of which integrates with the next via a common software construct known as the DCGS Integration Backbone, or DIB.

“The DIB is essentially a data architecture that everybody publishes to and exploits from,” said Todd Probert, vice president of mission sustainment and modernization at Raytheon, which along with Northrop Grumman, Lockheed Martin, and other industry partners has contracted with the DoD to build the systems necessary to achieve interoperability. “It’s foundational, and without that foundation it’s difficult to do sharing at speed.”

In humans, the spine integrates the body’s various anatomical systems via a shared nervous system through which they can communicate and share resources while still performing their own independent functions. In the DCGS FoS, the DIB is the spine. Although each service-specific DCGS architecture has its own functions and applications, it must be configured to store and share data through the DIB.

“DCGS was the first attempt at saying: When data comes off a sensor and gets processed, it’s got to be made ‘enterprise-able’—which means, it’s got to be made for use by all,” explained John Snevely, who leads the DCGS FoS at the Office of the Under Secretary of Defense for Intelligence (OUSD(I)). “It took us away from proprietary intelligence data and forced us to start meeting and testing to standards.”
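
As a loose illustration of the publish-and-exploit pattern Probert and Snevely describe (invented names throughout, not the actual DIB interfaces), each service-specific system writes products to a shared layer and can query across everyone else’s:

```python
class SharedBackbone:
    """Toy stand-in for a DIB-like shared data layer: publish and query."""
    def __init__(self):
        self._products = []

    def publish(self, service, product_type, payload):
        self._products.append(
            {"service": service, "type": product_type, "payload": payload})

    def query(self, product_type):
        # Any connected system can discover products from all the others.
        return [p for p in self._products if p["type"] == product_type]

dib = SharedBackbone()
dib.publish("DCGS-A", "GEOINT", "annotated image #1")
dib.publish("DCGS-N", "GEOINT", "coastal mosaic #7")
dib.publish("AF DCGS", "SIGINT", "emitter report #3")

# A Marine Corps analyst pulls GEOINT published by the Army and Navy:
for product in dib.query("GEOINT"):
    print(product["service"], "->", product["payload"])
```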

Sailors aboard the USS Carl Vinson experiment with Naval Integrated Tactical-Cloud Reference for Operational Superiority (NITROS) capability within various systems, including DCGS-N Increment 2 and the Maritime Tactical Command and Control system. Photo credit: U.S. Navy

Inspired by the Air Force, DARO and the National Imagery and Mapping Agency (NGA’s predecessor) began to consider the idea of interoperable ground stations across the services in 1994. The idea didn’t fully mature, however, until some time after the dissolution of DARO, when OUSD(I) assumed the work of standing up the DCGS FoS.

Prior to DARO’s termination in 1998, each of the services had formed a DCGS program office. Under OUSD(I) oversight, the program managers transitioned from an informal working group to a formal structure called the Multi-Service Execution Team (MET). MET collaborated—and still does—to determine the requirements and configuration of the DIB software. When this work was completed in 2003, OUSD(I) issued a mandate requiring the services to develop and acquire technology to DIB standards, which was made easier on the services by the provision of extra resources.

“We invested in enterprise governance,” Snevely said. “We paid for engineering support at the enterprise level so the program offices didn’t have to figure things out for themselves. It was done for them. All they had to do was take the technology off the shelf and implement it.”

The result was seamless discovery and dissemination.

“Sometimes, I anecdotally call DCGS ‘the Napster for intelligence,’” said retired Marine Col. Phillip Chudoba, assistant director of intelligence at Marine Corps headquarters, recalling the popular music-sharing platform of the early aughts. “That file-sharing platform allowed you to look on my computer and see what music files existed there that you might want to have. The same kind of logic exists with DCGS. A user at the tactical level theoretically has the ability via DCGS and the DIB to look across the joint services and see what information products and data are available.”

Current State: Operational Interoperability

Since OUSD(I) issued its DIB mandate, implementation of interoperability in general—and DCGS in particular—has unfolded in different ways and at different speeds across the services. In the last five years, however, the maturation of mobile computing and cloud architecture has allowed the defense intelligence community to enter a new phase of execution toward horizontal integration.

“You used to have intelligence analysts sitting in very specific seats doing very specific things with very specific intelligence types,” said Sean Love, director of business development at Northrop Grumman. “And that was fine, because the technology—the bandwidth and sheer connectivity—didn’t exist to do a whole lot more than that. Now that those barriers are coming down very quickly, you’re starting to see a lot more cross-sharing.”

The evolution of DCGS from concept to reality began with GEOINT. For example, within the Marine Corps—which initiated its DCGS-MC program in 2007—the first DIB-enabled systems were the Tactical Exploitation Group, an imagery system, and the Topographic Production Capability, which provides topographic and mapping capabilities.

“The GEOINT layer is the first intelligence capability that we elected to pursue in DCGS because the GEOINT layer offers us tremendous potential for enhancing our decision support to commanders,” explained Chudoba.

He added the Marine Corps wants to move from what he calls a “mall cop” environment—intelligence analysts trying to make cognitive sense of a single, limited-view input, like a mall cop monitoring a security-camera feed—to a multi-INT environment wherein analysts can get a more holistic view.

“We want to have a single integrated system consisting of a synoptic GEOINT layer on top of which we can toggle all the other intelligence disciplines in order to look at a problem from different dimensions and make good, timely, accurate decisions.”

With foundation GEOINT in place, the Marine Corps can now pursue DIB-enabled capabilities for other intelligence disciplines.

“DCGS started with sharing only GEOINT,” Snevely said. “We’ve since taken that model and used it to establish sharing in HUMINT, MASINT, and SIGINT. Each of those threads is growing and has its own level of success.”

What we’ve seen is an explosive growth in collected data. Historically, we’ve done what we had to do, which is throwing a ton of manpower at the problem. But we’re starting to realize that we need automation to assist. The goal is to remove hay from the haystack.

—Capt. Jeffrey Czerewko, Office of Digital Warfare within the Office of the Chief of Naval Operations

The Navy is focused on data fusion as it develops the next generation of DCGS-N. The Navy’s forthcoming upgrade, DCGS-N Increment 2, which recently entered its initial development phase, will likewise allow users to synchronize intelligence data from multiple sources within a single computing environment.

“I’m taking tools that sailors have seen, and I’m integrating them at the data layer so the individual can use them from a single work page without having to jump from product to product,” said Capt. Mark Kempf, program manager for the Navy’s Battlespace Awareness and Information Operations Program Office, which oversees DCGS-N.

Unlocking Agility

In many ways, DCGS-N Increment 2 represents the future of the DCGS FoS in that it will embrace automation.

“What we’ve seen is an explosive growth in collected data,” said Capt. Jeffrey Czerewko, who serves in the Navy’s newly formed Office of Digital Warfare within the Office of the Chief of Naval Operations. “Historically, we’ve done what we had to do, which is throwing a ton of manpower at the problem. But we’re starting to realize that we need automation to assist. The goal is to remove hay from the haystack.”

DCGS-N Increment 2 will “remove hay” via real-time automated aggregation, correlation, and fusion of all-source intelligence.

“I want the analyst to be able to do analysis instead of having to do production,” Kempf said. “The button pushing should all be automated.”
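
One hedged way to picture the automation Czerewko and Kempf describe is a correlation pass that clusters near-duplicate reports in space and time before an analyst ever sees them. The sketch below is illustrative only; the thresholds, record fields, and crude degrees-to-kilometers conversion are all assumptions:

```python
def correlate(reports, max_km=5.0, max_minutes=30):
    """Cluster all-source reports that fall close together in space and time."""
    clusters = []
    for r in sorted(reports, key=lambda rep: rep["minute"]):
        for cluster in clusters:
            head = cluster[0]
            near = (abs(head["lat"] - r["lat"]) * 111 <= max_km and
                    abs(head["lon"] - r["lon"]) * 111 <= max_km)  # rough deg->km
            if near and r["minute"] - head["minute"] <= max_minutes:
                cluster.append(r)  # same activity: fuse, don't re-report
                break
        else:
            clusters.append([r])   # genuinely new event
    return clusters

reports = [
    {"src": "SIGINT", "lat": 12.001, "lon": 44.100, "minute": 0},
    {"src": "IMINT",  "lat": 12.002, "lon": 44.102, "minute": 9},
    {"src": "HUMINT", "lat": 14.500, "lon": 44.900, "minute": 12},
]
for cluster in correlate(reports):
    print([r["src"] for r in cluster])  # ['SIGINT', 'IMINT'] then ['HUMINT']
```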

Automation isn’t the only forward-looking aspect of DCGS-N Increment 2. Another is the way in which the program is being delivered: using an agile software development framework whereby new capabilities are tested by and delivered to users on a rolling basis through a series of incremental releases.

That approach to developing and acquiring capabilities is the future of DCGS and the key to DoD ISR interoperability, according to Wade, who says the entire defense intelligence community must go the way of the Navy by transitioning the focus from hardware to software. Consider, for example, the difference between navigating in your car using a dash-mounted GPS unit, like a Garmin or TomTom, versus a smartphone app such as Google Maps or Waze.

“The way we buy things right now in DoD is we buy Garmin- and TomTom-type systems. These are single-capability systems,” explained Wade, who said such hardware takes the DoD many years and extensive manpower to design, develop, manufacture, test, deploy, install, integrate, and maintain. “Contrast that to the Waze application that provides the same capability, but can be developed by a handful of people and deployed on millions of smartphones around the world in a matter of minutes.”

What’s missing, according to Wade, is the common IT platform—the DoD version of Apple’s iOS or Google’s Android—on which to run the software. “When you talk about the future vision for DCGS, what you want to have is a common ISR IT platform that you can rapidly build out with applications and services.”

That’s exactly where the Air Force intends to take its DCGS platform, according to Col. Kristofer Gifford, chief of the Air Intelligence Staff’s Multi-Domain Operations Division.

“Historically, the way we’ve acquired and fielded DCGS is like acquiring and fielding an aircraft carrier or a fighter jet, which is a five- to seven-year process of block fielding,” Gifford said. “At the end of that you get one thing: Everything from the tires to the software to the navigation system and the weapons is all rolled up. If you acquired [DCGS like a consumer acquires an] iPhone [then downloads apps], you’d break it apart into bits and pieces, and you’d field the separate pieces as you go.”

Across the services, the key to breaking DCGS apart is breaking it open. As in, open architecture.

Although the DIB and its core component, the Distributed Data Framework, unlocked the door to open architecture, they didn’t completely open it, according to Jerry Mamrol, director of ISR systems at Lockheed Martin, which helped develop the DIB.

“The DIB took an important step toward an open architecture by providing a standardized method to query and access finished intel product data,” Mamrol said. “This provided some degree of openness by enabling interoperability and sharing of finished products between the services via the DDF. For the architecture to be fully open, it also needs a standardized, common infrastructure that allows applications to be developed and ‘plugged in’ by different providers.”
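
A minimal sketch of that plug-in contract, assuming nothing about the real DIB or DDF interfaces (the class names and the .run() convention here are ours), shows how a common infrastructure can accept applications from different providers:

```python
class OpenArchitecture:
    """Common infrastructure that third-party applications plug in to."""
    def __init__(self):
        self._apps = {}

    def plug_in(self, name, app):
        # Any provider's app is accepted if it implements .run(data).
        if not callable(getattr(app, "run", None)):
            raise TypeError(f"{name} does not meet the plug-in contract")
        self._apps[name] = app

    def execute(self, name, data):
        return self._apps[name].run(data)

class VendorAGeoTool:
    def run(self, data):
        return f"geospatial analysis of {data}"

class VendorBLightTool:
    def run(self, data):
        return f"low-bandwidth summary of {data}"

arch = OpenArchitecture()
arch.plug_in("geo", VendorAGeoTool())
arch.plug_in("light", VendorBLightTool())  # swap tools mission by mission
print(arch.execute("light", "sensor feed"))
```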

A plug-and-play infrastructure will activate a whole new level of interoperability by way of flexibility.

“If you have an open architecture, you can horse trade what tools you like better for any given mission,” Love said. “You’re not going to send a really geospatial-heavy system out into the field, for example, because you won’t have the power you need and you won’t have the bandwidth. So, being able to use something that’s a lot lighter without having to change your data standards to make it happen is absolutely key.”

The weapons that matter most in the next war won’t be hardware…they will be software and data, and our decisive advantage will be how quickly our airmen can access, leverage, develop, and create those software and data.

—Col. Jason Brown, Commander, 480th ISR Wing, U.S. Air Force

Ultimately, DCGS open architecture will be similar to that of a smart home environment, according to Love. “There are five or six different standards out there for [connected home devices]. If you put all those in your house and you don’t have a way for them to interconnect, you’re going to need four different pieces of software to control your house, which is super irritating,” he continued. “Now there’s a single hub out there that accepts all the different signals so you can control your entire house with one app. It’s truly a system of systems.”

This plug-and-play approach allows data to flow freely between service- and mission-specific applications that can be created cheaper and deployed faster, according to Brown, who said the Air Force is currently piloting an “open architecture” version of AF DCGS, known as OADCGS, that allows airmen to develop their own scripts and apps.

“The weapons that matter most in the next war won’t be hardware—a stealth aircraft, a ship, or a tank,” Brown said. “They will be software and data, and our decisive advantage will be how quickly our airmen can access, leverage, develop, and create those software and data.”

Next-Level Integration

Although technology will continue to advance the DCGS FoS, strategic governance will drive it. While there are several constructs through which the services manage ISR interoperability, principal among them is the Defense Intelligence Information Enterprise (DI2E), an umbrella under which OUSD(I) organizes and unites disparate defense intelligence systems, including the DCGS FoS. The DCGS Multi-Service Execution Team, made up of the DCGS program managers from each of the services, meets regularly to prioritize, establish, and resolve issues with DCGS standards, specifications, and architecture. This group operates under the auspices of a high-level governance group known as the DI2E Council.

“We use the DI2E Council to bring all the services together along with the [intelligence] agencies—anybody who has a role to play in DCGS—to make sure we’re [aligned],” said retired Air Force intelligence analyst Jack Jones, director of ISR infrastructure at OUSD(I). “Because if everybody’s in charge and has their own unique budget set and their own idea about where they want to go, then nobody’s in charge and you end up with non-compatible solutions.”

The aircraft carrier USS Carl Vinson and the Arleigh Burke-class guided-missile destroyer USS Wayne E. Meyer transit the East China Sea March 9 with the Japan Maritime Self-Defense Force. DCGS-N is the primary conduit for intelligence support to deployed U.S. Naval forces around the world. Photo credit: U.S. Navy Mass Communication Specialist 2nd Class Sean M. Castellano

As the fountainhead of DCGS objectives, the DI2E Council is responsible in large part for the services’ drive toward open architecture, having laid out the standards by which such architectures will be executed. Likewise, it’s the driving force behind the next major milestone in DoD ISR interoperability: IT integration with the larger defense enterprise—via the Joint Information Environment (JIE)—and with the Intelligence Community (IC) via the IC Information Technology Enterprise (IC ITE).

“The challenge is making sure that as these large enterprise deliveries and concepts get put in place that they don’t ignore the need for interoperability to go all the way down to the Joint Task Force-level and below, which is where DCGS is,” Snevely said. “We spend a lot of time ensuring that IC ITE standards and specifications, and JIE vision, are going to be executable at the DCGS level.”

The challenge is significant, but so are the promised returns, according to Chudoba, who said a number of IC organizations already share intelligence products and data across the DCGS FoS via their own versions of the DIB.

“Stuff I previously had to request through formal processes and linear channels now can be exposed to me through the same methodology as commercial file-sharing capabilities,” Chudoba said. “The power there is incredible.”

Progress is incremental. Eventually, the IT standards enabling interoperability across the defense intelligence community will enable interoperability at a global scale, uniting the DoD, the IC, and even their international mission partners through shared data.

“We’re looking for ways for intelligence information to be readily shared at the appropriate level with partners in all regions of the world,” explained Snevely, who said such sharing would happen by automatically extracting intelligence from DCGS and distributing it within the combatant commands via the U.S. Battlefield Information Collection and Exploitation Systems program. “It’s very difficult to do, but that’s the future.”

A ‘Fungible’ Future

Good governance and cutting-edge technology have turned interoperability from an ethereal goal into a tangible reality. As a result, stovepipes are crumbling. And yet, work remains.

“I think we’re doing OK, but we have a long way to go,” Jones said, citing DoD’s size, complexity, and culture as major challenges to overcome on the way to increased interoperability. “We’re in an environment that’s used to building planes, ships, and tanks. Even with our ISR capabilities, we build a collector, a sensor, a link, and a ground station—a point-to-point solution. Instead, we need to be more focused on data as an asset. If we do that, then build backward, it won’t be about the collector; it will be about what we’re trying to do with the data. That, in turn, will help us get better synchronized.”

When that happens, DoD ISR will truly become a team sport.

“We’re evolving into an enterprise construct that makes intelligence capability and capacity fungible,” Chudoba concluded. “By that, I mean systems like DCGS give us the ability to play [young children’s] soccer when a problem arises. Most people say that in a pejorative fashion. To me it’s a positive thing. When a problem arises—when we see the ball—we can get everyone to converge around it and kick it into the goal. That’s what [interoperability] does for us, and that’s how we want to operate.”

EDITOR’S NOTE

The Army has not yet decided how to modernize DCGS-A amid ongoing litigation with Palantir Technologies Inc. The service declined an interview request from trajectory.

Smart IT for Smart Cities http://trajectorymagazine.com/smart-smart-cities/ Wed, 26 Apr 2017: Location and geospatial technology enable precise mapping of utility assets, urban properties, transportation infrastructure, and government facilities.


John Renard serves as president for Europe, the Middle East, and Africa as well as business unit head for utilities and geospatial with Cyient, a global provider of engineering, data analytics, network, and operations solutions. Renard lives in London and holds a master’s degree in Geography and Management Studies from the University of Cambridge. Guest posts are intended to foster discussion, and do not represent the official position of USGIF or trajectory magazine.

In January 2016, the United Nations’ 17 Sustainable Development Goals came into force, setting targets that member countries must meet by 2030. Most of these goals involve climate action and socioeconomic change. Yet one really stands out: making cities and human settlements inclusive, safe, resilient, and sustainable. As the global population continues to expand rapidly, urbanization is a megatrend across the globe, particularly in Africa and Asia. More than half of the world’s population already lives in urban spaces, and by 2030 that number is expected to reach about five billion. In this context, sustainable development, as envisaged by the United Nations, will be unattainable without sustainable urban development. And that’s where the concept of a “smart city,” though around for at least two decades, becomes incredibly significant.

Many of the world’s leading governments are engaged in making smart cities part of their strategy for both the present and the future. For instance, the European Innovation Partnership on Smart Cities and Communities, supported by the European Commission, is bringing together cities, industry, and citizens to improve urban life and make communities more competitive through integrated and sustainable solutions. The initiative supports cities in finding the right partners and solutions to achieve social, environmental, and economic sustainability across Europe. Meanwhile, the Indian government launched its Smart Cities Mission in 2015 to enhance the quality of urban life in 100 cities across the country, and the UK government last year backed a smart city trade mission to Malaysia and Singapore to learn more about smart city environments.

While these pockets of initiatives thus far should be commended, intelligent digital systems have typically only been deployed in cities across the world in a piecemeal fashion, with varying degrees of success. To become truly smart, it is imperative that city councils and governments integrate the human, physical, and digital systems operating within their built environment to enable urban reform.

Why an Integrated Approach is Critical

While IT provides a holistic framework for smart cities’ infrastructure, data and information are the key ingredients in achieving this reform. Fundamentally, a smart city is one that holistically unifies data from a wide range of sources – authoritative data sources, embedded sensors, public services, citizen reports, utility companies, and more – to generate actionable intelligence that facilitates improved governance and citizen services through better decision-making.

One method for unifying data is to aggregate all of the different data streams in a city under a single roof, in the form of an operations center. Such centers act as unifying hubs that break down the silos in city administration. Another way to bring about this integration is by co-locating different infrastructure components. Constructed and equipped over the last four decades, Prague’s underground utility tunnels are a great example. Spanning 90 kilometers underneath the city including its historical center, these tunnels (called kolektory in Czech) house everything from gas pipes, steam pipes, water mains, high and low voltage cables, data cables, telecommunication cables, and special networks connecting individual companies. The system has been built to better manage urban space for the next 200 years.
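
The operations-center idea reduces, in miniature, to many feeds writing into one queryable store. The sketch below is a toy illustration with invented sources and fields, not a reference design:

```python
class OperationsCenter:
    """Unifying hub: many city data streams, one shared view."""
    def __init__(self):
        self._events = []

    def ingest(self, source, event):
        self._events.append({"source": source, **event})

    def view(self, **filters):
        # One query spans silos that would otherwise be separate systems.
        return [e for e in self._events
                if all(e.get(k) == v for k, v in filters.items())]

ops = OperationsCenter()
ops.ingest("utility", {"kind": "outage", "district": "north"})
ops.ingest("citizen_report", {"kind": "pothole", "district": "north"})
ops.ingest("sensor", {"kind": "air_quality", "district": "south"})

# Everything affecting the north district, regardless of origin:
print(ops.view(district="north"))
```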

Geo-enabled Platforms

One important aspect that acts as a force multiplier in smart cities is the geographical context of the data. According to a 2015 study by Dalberg in association with the Confederation of Indian Industry and Google, smart maps (geospatial data) can help India gain upward of $8 billion in savings, save 13,000 lives, and reduce one million metric tons of carbon emissions a year, and this is in cities alone. Location is a key enabler in these solutions.

Location and geospatial technology enable precise mapping of utility assets, urban properties, transportation infrastructure, and government facilities. When this data is integrated with non-spatial data from disparate and multiple sub-systems of a city using a GIS-enabled enterprise information system, it allows city agencies to integrate various subsystems and put the data into a precise context. This context provides them with the ability to derive insights, visualize, and extract actionable intelligence to respond to every situation holistically. This ability makes location data a unique and powerful unifying component in a city enterprise, and is critical to increase the smartness index of a city.
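
To make the unifying role of location concrete, here is a small, hypothetical illustration (made-up coordinates and record layouts): a non-spatial service report becomes actionable once joined by proximity to a precisely mapped asset:

```python
import math

assets = [  # precisely mapped city assets (the GIS layer)
    {"id": "hydrant-17", "lat": 51.5074, "lon": -0.1278},
    {"id": "substation-3", "lat": 51.5155, "lon": -0.0922},
]

reports = [  # records from a non-spatial subsystem, carrying only a location
    {"text": "low water pressure", "lat": 51.5076, "lon": -0.1275},
]

def distance_m(a, b):
    # Small-area flat-earth approximation; fine for a city-scale sketch.
    dx = (a["lon"] - b["lon"]) * 111_320 * math.cos(math.radians(a["lat"]))
    dy = (a["lat"] - b["lat"]) * 111_320
    return math.hypot(dx, dy)

# Location is the join key between otherwise disconnected subsystems:
for r in reports:
    nearest = min(assets, key=lambda a: distance_m(a, r))
    d = distance_m(nearest, r)
    print(f"{r['text']!r} is nearest to {nearest['id']} ({d:.0f} m away)")
```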

Cities Making the Most of Location

City governments are cognizant of the significance of location technology and are using it in smart city programs. Barcelona was recognized as the smartest city in the world in 2015, having harnessed technology extensively to transform itself with data-driven urban systems. The city integrated ongoing projects and identified 12 areas for intervention, including transportation, water, energy, waste, and open government, as well as initiated 22 programs in which location technology played a foundational role.

Public safety is of paramount importance for cities, especially around major events, and city authorities are effectively using social media, one of the richest sources of location data, to ensure the safety of their citizens. For example, one police department in the U.S. used Geofeedia (a platform for analyzing location data from social media) in real time to better anticipate crime during the U.S. Open of Surfing. Meanwhile, city councilors in Chicago monitor social media to understand public sentiment around specific services, and send relevant geo-tagged posts to the concerned agencies for follow-up.

However, investments in location technology and other IT tools to create economic, social, and environmental improvements for citizens is only part of the smart city story. While it is necessary, it is not sufficient to make cities truly “smart” as we understand it. There is an urgent need for political, administrative, and social groups, both at a domestic and international level, to come together to debate the appropriate policies and measures required to ensure a positive outcome from technological investments and the equal provision of resources to citizens in the urban space. Without such an exercise, technology solutions to the smart city conundrum will remain locked away in expensive ivory towers distant from future cities.

Photo Credit: The City of London 
