
Crowd-Sourcing the Treaty Verification Problem

07.23.15 | 5 min read | Text by Steven Aftergood

Verification of international treaties and arms control agreements such as the pending Iran nuclear deal has traditionally relied upon formal inspections backed by clandestine intelligence collection.

But today, the widespread availability of sensors of many types, complemented by social media, creates the possibility of crowd-sourcing the verification task using open source data.

“Never before has so much information and analysis been so widely and openly available. The opportunities for addressing future [treaty] monitoring challenges include the ability to track activity, materials and components in far more detail than previously, both for purposes of treaty verification and to counter WMD proliferation,” according to a recent study from the JASON defense science advisory panel. See Open and Crowd-Sourced Data for Treaty Verification, October 2014.

“The exponential increase in data volume and connectivity, and the relentless evolution toward inexpensive — therefore ubiquitous — sensors provide a rapidly changing landscape for monitoring and verifying international treaties,” the JASONs said.

Commercial satellite imagery, personal radiation detectors, seismic monitors, cell phone imagery and video, and other collection devices and systems combine to create the possibility of Public Treaty Monitoring, or PTM.

“The public unveiling and confirmation of the Natanz nuclear site in Iran was an important early example of PTM,” the report said. “In December 2002, the Institute for Science and International Security (ISIS), an independent policy institute in Washington, DC, released commercial satellite images of Natanz, and based on these images assessed it to be a gas-centrifuge plant for uranium isotope separation, which turned out to be correct.”

An earlier archetype of public arms control monitoring was the joint verification project initiated in the late 1980s by the Natural Resources Defense Council, the Federation of American Scientists and the Soviet Academy of Sciences to devise acceptable methods for verifying deep reductions in nuclear weapons. The NRDC actually installed seismic monitors around Soviet nuclear test sites.

In the 1990s, John Pike’s Public Eye project pioneered the use of declassified satellite images and newly available commercial imagery for public interest applications.

Recently, among many other examples, commercial satellite imagery has been used to track development of China’s submarine fleet, as discussed in this report by Hans Kristensen of FAS.

Jeffrey Lewis and his colleagues associated with the Armscontrolwonk.com website are regularly using satellite imagery and related analytic tools to advance public understanding of international military programs.

“The reason that open source data gathering [for treaty verification] is an easier problem is that, increasingly, at least some of it will happen as a matter of course,” the JASONs said. “More and more people carry mobile phones, are connected to the Internet, and actively use social media. Even with no specific effort to create incentives or crowd-sourcing mechanisms there is likely to be a wealth of images and sensor data publicly and freely shared from practically every country and region in the world.”

The flood of open source data naturally does not solve all verification problems. Much of the data collected may be of poor quality, and some of it may be erroneous or even deliberately deceptive.

There are “enormous difficulties still to be faced and work to be done before claiming confidence in the reliability of data obtained from open sources,” the JASON report said.

“The validity of the original data is especially problematic with social media, in that eyewitnesses are notoriously unreliable (especially when reporting on unanticipated or extreme events) and there can be strong reinforcement of messages — whether true or not — when a topic ‘goes viral’ on the Internet.”

“In addition to such errors, taken here to be honest mistakes and limitations of sensors, there is the possibility of willful deception and spoofing of the raw data, including through conscious insertion or duplication of false information.”

“Key to validating the findings of open sources will be confirming the independence of two or more of the sources. Multi-reporting — ‘retweeting’ — or posting of the same information is no substitute for establishing credibility.”
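The accounting principle is easy to illustrate. The following toy Python sketch is not drawn from the JASON report; the sample reports, user handles, similarity threshold, and helper function are all invented for the example. It counts independent sources by collapsing near-duplicate reports, so a retweet adds nothing to the corroboration tally:

```python
from difflib import SequenceMatcher

# Illustrative only: toy "reports" about the same claimed event.
# A retweet or repost is near-identical text and should not count
# as an independent confirmation.
reports = [
    ("@alice", "Large convoy of trucks seen near the facility at dawn."),
    ("@bob",   "RT @alice: Large convoy of trucks seen near the facility at dawn."),
    ("@carol", "Commercial imagery from this morning shows ~20 vehicles at the site gate."),
]

def is_duplicate(a: str, b: str, threshold: float = 0.8) -> bool:
    """Crude text-similarity test standing in for real provenance analysis."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

independent: list[str] = []
for _, text in reports:
    # Only count a report if it does not merely repeat one already seen.
    if not any(is_duplicate(text, seen) for seen in independent):
        independent.append(text)

print(f"{len(reports)} reports, {len(independent)} independent sources")
# -> 3 reports, 2 independent sources
```

Real provenance analysis would of course look at far more than text similarity, such as timestamps, account histories, and sensor metadata, but the bookkeeping idea is the same: repetition is not corroboration.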

On the other hand, the frequent repetition of information “can provide an indication of importance or urgency… The occurrence of the 2008 Wenchuan earthquake was first noted in the United States as an anomalous increase in text-messaging.” Because the information moved at near light speed, it arrived before the seismic waves could reach US seismic stations.
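A rough back-of-the-envelope calculation shows why that is physically plausible. The figures below are illustrative approximations, not taken from the report:

```python
# Back-of-the-envelope check (illustrative numbers, not from the JASON report):
# why network traffic about an earthquake can beat the seismic waves themselves.

distance_km = 11_000        # rough Wenchuan-to-continental-US great-circle distance
p_wave_speed_km_s = 10.0    # rough effective P-wave speed over that path
network_latency_s = 30.0    # generous allowance for texting, routing, and detection

seismic_travel_s = distance_km / p_wave_speed_km_s
print(f"Seismic waves:  ~{seismic_travel_s / 60:.0f} minutes")
print(f"Network signal: ~{network_latency_s:.0f} seconds (plus human reaction time)")
# Seismic waves need on the order of 15-20 minutes to cross that distance,
# so an anomalous spike in messaging can plausibly arrive first.
```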

“Commercial satellites have also provided valuable data for analysis of (non-)compliance with the Nuclear Non-Proliferation Treaty, and open sources proved valuable in detecting the use of chemical weapons in Aleppo, and in subsequent steps to remove such weapons from Syria. They also informed the world about the Russian troop movements and threats to Ukraine.”

While the US Government should take steps to promote and exploit open source data collection, the report said, it should do so cautiously. “It is crucial that citizens or groups not be put at risk by encouraging open-source activities that might be interpreted as espionage. The line between open source sensors and ‘spy gear’ is thin.”

In short, the JASON report concluded, “Rapid advances in technology have led to the global proliferation of inexpensive, networked sensors that are now providing significant new levels of societal transparency. As a result of the increase in quality, quantity, connectivity and availability of open information and crowd-sourced analysis, the landscape for verifying compliance with international treaties has been greatly broadened.”

*     *     *

Among its recommendations, the JASON report urged the government to “promote transparency and [data] validation by… keeping open-source information and analysis open to the maximum degree possible and appropriate.”

Within the U.S. intelligence community, such transparency has notably been embraced by the National Geospatial-Intelligence Agency under its director Robert Cardillo.

But the Central Intelligence Agency has chosen to move in the opposite direction by shutting off much of the limited public access to open source materials that previously existed. Generations of non-governmental analysts who were raised on products of the Foreign Broadcast Information Service now must turn elsewhere, since CIA terminated public access to the DNI Open Source Center in 2013.

FAS has asked the Obama Administration to restore and increase public access to open source analyses from the DNI Open Source Center as part of the forthcoming Open Government National Action Plan.