Social media presents new opportunities and challenges for deriving open source intelligence
In the aftermath of Haiti’s 7.0 earthquake in 2010, Twitter, Facebook, YouTube, Flickr, and other social media earned their bona fides as data-generating vehicles in support of disaster relief. They also showed their warts. A year later, groups in Libya revolted against the government of Col. Moammar Gadhafi. Social media was there, guiding the United Nations humanitarian support as well as reporting and, to some extent, fomenting the rebellion. Tweeting became an act of bravery when the Libyan government began to locate sources.
Social media offers data that can be good, bad, even ugly, but its impact on today’s society, largely because it’s driven by that society, is too great to ignore. Those seeking to harness social media to make it part of the open source intelligence (OSINT) aggregate are operating in a wildfire-like confluence of technology, political upheaval, and natural disaster.
Social media is the latest in a continuum of OSINT elements that go as far back as the earliest newspapers and speeches, and later, radio, television and public government documents, including budgets.
The value of OSINT was seen vividly in World War II, when the Foreign Broadcast Information Service (FBIS) was created to record and analyze shortwave broadcasts aimed at the U.S. by Axis powers. The FBIS moved from the Federal Communications Commission to the CIA in 1948, and continued to collect publicly available information, adding television.
The advent of the Internet altered OSINT once again, offering databases, online news sources, blogs, and quicker access to public information. But it also complicated the already difficult task of separating fact from opinion and propaganda, still a key concern among OSINT operatives, in an era bursting with possible facts to balance against one another in the effort to provide the clearest picture possible.
Add social media and the phenomenon of crowd-sourced information to that mix, and the task looms both mountainous and frenetic. Twitter’s emergence in 2007 provided a significant forum for public information, albeit with considerable noise. It also added an immediacy that was unprecedented by any other medium.
“The big guys, like CNN and MSNBC and Fox News, still exist, but they’re definitely not as fast as the sensor network of millions of people communicating by using these informal social media,” said Abe Usher, who leads technology development for HumanGeo.
Some companies, such as HumanGeo, are developing geospatial applications and tools to synthesize, manage and exploit large data sets derived from social media.
There is an ongoing rush to keep up with advancing technology for the social media mix. Google Goggles, for example, provides geospatial information about the pictures and video viewed on a smartphone.
OSINT operatives in and out of government continuously strive to keep abreast of emerging technologies as well. The FBIS became the DNI Open Source Center in 2005, and two years later established an Emerging Media Group to study and learn to exploit social media.
The DNI Open Source Center (OSC) was harnessing social media “before it was cool,” quipped Doug Naquin, formerly director of FBIS and recently retired as director of the OSC.
“As you can imagine, it opened up a whole new realm of possibilities,” Naquin said.
But can a medium like Twitter, which can often be fixated on trivial topics such as Justin Bieber’s latest haircut, also be taken seriously as an intelligence source? Can it add to the geospatial construct necessary to tell a story that can be acted upon?
Following the earthquake in Haiti, volunteer groups, in particular OpenStreetMap, used a geospatial wiki to build a street map of Haiti in two weeks—a task that ordinarily would have taken a year.
As more data streamed in, new maps were created. It was a challenge to piece together the geospatial information accompanying tweets and other social media messages. Still, the maps provided relief agencies with the story of a Haitian population on the move from the devastation in Port-au-Prince, seeking help.
Disaster relief efforts in Haiti proved that the method could, and should, be taken seriously, when social media messages revealed the locations of people still alive under the rubble days after the earthquake. Videos posted online showed some of the hardest-hit neighborhoods, and those reports were confirmed when aid workers rushed in with supplies.
The combination of traditional OSINT with geo-tagged social media data provided an invaluable resource to the large conglomerate of response organizations and those watching around the world.
In early 2008, Ushahidi was formed as a website to map reports of violence in Kenya after the post-election fallout. Today, Ushahidi is a full-blown nonprofit tech company, specializing in information collection, visualization, and interactive mapping. Its platform is among the most commonly used in public geospatial mapping, and encourages the public to contribute information to an open online resource.
Following the earthquake in Haiti, Ushahidi joined with the Fletcher School of Law and Diplomacy at Tufts University, the United Nations, and the International Network of Crisis Mappers, to gather reports from social media and other sources. It then sent the data to U.S. Southern Command, which combined the geospatially enabled information with more traditional intelligence sources to help target relief efforts. The beauty of this resource was its openness: non-governmental organizations were able to immediately access and use the information, too.
The International Network of Crisis Mappers, co-founded and co-directed by Jen Ziemke and Patrick Meier, was created in 2009, just months before the disaster in Haiti. The network meets annually to share ideas and accomplishments, and experts applauded its work with Ushahidi in Haiti.
“High-ranking [military] officers and officials from the [United Nations] Office for the Coordination of Humanitarian Affairs came to our conference in 2010,” said Ziemke. “They said, ‘What you did in Haiti was amazing.'”
Volunteer crisis mappers turned out some 40,000 reports of 4,000 events. But with all of this crowd-sourced, geo-tagged information, the data threatened to overwhelm what was at first an ad hoc, and very new, system.
“To be honest, Haiti was a double-edged sword,” said Shadrock Roberts, a former USGIF scholarship winner who is launching the USA GeoCenter as a contractor with the U.S. Agency for International Development (USAID). Roberts gathered data in Haiti and points to the United Nations after-action report, Disaster Relief 2.0, which acknowledged the technological shortcomings in the Haiti relief effort. “These sorts of new methods, tools and technologies received an awful lot of visibility,” Roberts said. “At the same time, I don’t think the utility of these tools and methods in Haiti matched the visibility. It became sort of an information circus that probably caused as many headaches as it solved.”
Developing Best Practices
In the wake of the Haiti response, GIS experts are developing new processes to make the best use of the vast amounts of data coming out of social and crowd-sourced media. “We are all now trying to show fresh examples of how social media can be used,” Roberts said. “There’s a real interest in best practices.”
Standby Task Force, an on-call volunteer group, was created after the Haiti earthquake to collect, process and analyze social media as needed, but in a more organized way than before. Meier of the International Network of Crisis Mappers was a founder, and Ziemke is also a member.
The task force has also played a role throughout the Arab Spring, as social media reported public unrest and government atrocities that led to successful revolutions in Tunisia and Egypt, helped embolden Libyans to overthrow Gadhafi, and continues to influence events in Syria.
“Social media was a weapon for people who didn’t have weapons,” said Anthony Stefanidis, director of the Center for Geospatial Intelligence at George Mason University. “The government had conventional weapons, and the people had tweets. There was a cartoon in which there was a gun, and a kid said, ‘Stop, or I’ll tweet.’ It showed that this was a very powerful weapon in the hands of people who didn’t have power yet.”
Early in March 2011, the United Nations Office for the Coordination of Humanitarian Affairs (UN OCHA) activated the Standby Task Force to assist with the upheaval in Libya. The task force responded by launching LibyaCrisisMap.net within 48 hours. As more data streamed in from social media, aerial photographs, video, and other sources, layered maps revealed an unfolding civil war and a population fleeing to get out of the way.
The crisis maps offered the United Nations a look at what was going on inside of Libya. But the same information was also available to Gadhafi’s regime, and that diluted some of the data gatherers’ naïveté by injecting ethical issues into the process.
After learning of the problem, UN OCHA ran two websites: one with data accessible only to limited and approved parties; the other open, but with no information that could endanger a source, and with a 24-hour delay on data.
“We’re operating at that strange nexus between, on the one hand, philosophically and fundamentally believing in open data and openness in general: open software and open systems,” said Ziemke. “On the other hand, we also operate in the real world, where security trackers can track where people are tweeting, ‘My brother was killed at a prison where he was tortured.’ These tools can be used by people any way they want.”
Verification of Data
The use of social media for intelligence purposes is often unwieldy, and draws skepticism from those who are concerned with the verification of such information.
Those concerns begin with taking social media seriously, a never-ending challenge. For example, in early July, the Army posted Army Techniques Publication 2-22.9, Open Source Intelligence, establishing doctrine that “highlights the characterization of OSINT as an intelligence discipline, its interrelationship with other intelligence disciplines and its applicability to unified land operations,” according to the service’s Intelligence Center of Excellence. However, an Army source said the classified publication “does not address social media.”
Verification of social media data can be especially trying when compared to traditional OSINT resources like broadcasts, newspapers, and miscellaneous documents. Add geospatial information to this social media, like user-selected check-ins and GPS-enabled applications, and you’ve injected another variable that can be difficult to substantiate before that data can be massaged into actual, actionable intelligence.
Verification techniques run the gamut from relying more heavily on proven OSINT contributors, to getting often-used sources to support information offered by new people, to challenging sources to provide their own corroboration, including pictures or video to support texted or tweeted claims. When the latter technique is used, often one social medium is applied to verify another.
In essence, verification involves techniques of journalism and law enforcement, but uses new tools in newer ways.
“My father was a police officer, a detective with almost 30 years of service, and these are the things he does,” Roberts said. “Journalism, anthropology, detective work. You use all of these things.”
Stefanidis is a preponderance-of-evidence advocate. “You cannot cry wolf in social media,” he said. “If you tweet about something and other people aren’t tweeting about it, it’s worthless. I cannot start a revolution by myself.”
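Stefanidis’ preponderance-of-evidence rule lends itself to a simple automated filter. The sketch below is illustrative only and not drawn from any tool named in this article; the `Report` structure, the distance radius, the time window, and the source threshold are all hypothetical choices. The idea is that a report counts only when enough independent sources place a similar observation nearby in space and time.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Report:
    source: str      # account or outlet the report came from
    lat: float
    lon: float
    minutes: float   # minutes since some shared reference time

def km_apart(a: Report, b: Report) -> float:
    """Great-circle distance between two reports (haversine formula)."""
    dlat = radians(b.lat - a.lat)
    dlon = radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))  # Earth radius ~6371 km

def corroborated(report: Report, others: list[Report],
                 max_km: float = 2.0, max_minutes: float = 60.0,
                 min_sources: int = 3) -> bool:
    """Trust a report only if enough *independent* sources place a
    similar observation nearby, within a short time window."""
    nearby_sources = {
        r.source for r in others
        if r.source != report.source
        and km_apart(report, r) <= max_km
        and abs(r.minutes - report.minutes) <= max_minutes
    }
    # The report itself supplies one source, so require min_sources - 1 others.
    return len(nearby_sources) >= min_sources - 1
```

Under this scheme a lone uncorroborated tweet is discarded, while several distinct accounts reporting the same thing at roughly the same place and time pass the filter, which is the "I cannot start a revolution by myself" point in code form.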
Experts estimate more than 100 companies and other organizations are working on technology to help gather, verify, sort, and analyze data in what is largely a labor-intensive field. Ushahidi is chief among them, with SwiftRiver, a platform to process and integrate large amounts of seemingly disparate data in short amounts of time. This capability is particularly valuable during the social media torrents within the first 24 hours of a disaster.
Perhaps more problematic than developing tools is herding the more than 2,000 members of the International Network of Crisis Mappers, each with ideas of how the process should work. “There are a lot of people with questionable qualifications doing work because it’s an open field,” Naquin said. “It’s like a land rush, like Oklahoma. Everybody is trying to stake a claim before they know what they are doing.”
While organizations such as the crisis mappers and Standby Task Force work on best practices, others, such as the Woodrow Wilson International Center for Scholars, seek to develop ethical standards and policies for the use of social media in generating OSINT, especially for governmental use.
“We’re lagging behind very seriously in a policy framework for using this data,” said Roberts, who recently addressed the Wilson Center on a USAID exercise that used crowd sourcing to verify developmental data. That exercise incorporated many of the lessons learned from data gathering in Haiti, such as management, communication, practice runs, and accuracy assessment. “It’s not that we can’t catch up with what’s happening,” Roberts said. “At the end of the day, this isn’t magic. It’s not a black box. We just need to do it. That’s the problem with all geospatial data. People should create metadata for all of their geospatial files. Generally they don’t. We just need to change the culture a little bit.”
Metadata or not, historically, the military hasn’t been a strong advocate of OSINT.
“There’s a huge amount of suspicion or lack of trust in outside information,” said a civilian who asked not to be identified because he works with special operations in Afghanistan. “People often don’t trust information when it doesn’t come from their own organization. And there’s not a lot of trust by the military in organizations like the Standby Task Force, which deals in the international community.” The irony is that the military, with its thousands of young soldiers, sailors, airmen, and Marines, understands social media better than most. Harnessing OSINT derived from social media for its overall intelligence package would require a cultural shift at the command level, but that could be just around the corner.
Some say the trust issue is melting away, albeit slowly. Such a step is the product of a realization that OSINT has progressed well beyond reading the International Herald Tribune and Stars and Stripes. However, the recent Army publication covering OSINT still doesn’t mention social media.
But Roberts suggested that perhaps disaster response could be the catalyst for military intelligence to realize the power of geo-enabled social media. He said that USAID is often approached by intelligence members in the military, as disaster response is a growing U.S. military mission. “In all of those cases, those individuals have been interested in supporting disaster response, in supporting development,” Roberts said.
Still, if the military becomes more invested in OSINT derived from social media, it would be one step closer to what many see as the future of the medium: helping to forecast conditions, perhaps even conflict.
After the office of the United Nations Secretary General was caught off guard by the worldwide financial crisis in 2008, it asked why the UN was dealing in data sets that were five to eight years old, when there was access to real-time data that could have drawn attention to the coming crisis, according to Ziemke. This led to the development of a program called UN Global Pulse.
Global Pulse seeks to forecast crises by fishing from an ocean of social media data, rather than waiting for economic indicators to offer situational awareness.
But forecasting conflict? It’s still early in the game for that, according to some in the space.
“To do that, you really have to have a good idea beforehand of the questions you need to address,” said Naquin. “And you have to have very smart people who are familiar with statistics and data analysis to establish indicators. What I’m seeing happening is that people were kind of hoping that technology would be able to take a bunch of data and, with the use of tools, press F9 and find out where the next revolution was going to happen. I found that to be a little bit overly hopeful.”
It’s still early in what could be a long trip on a fast-moving train, but there’s a rush to get on board, tempered by some restraints that most experts believe can be resolved through imagination, creativity, and innovation.
“It’s exciting,” Usher said. “It’s moving in the direction of pro-activity pretty quickly, but at this point it’s still more of an art than a science.”
In many ways, the art was displayed in Haiti and Libya. The science will develop when time and location are routinely part of all open source data, and when geospatial applications are just as routinely the result.