A Forum for Classified Research on Cybersecurity
By definition, scientists who perform classified research cannot take full advantage of the standard practice of peer review and publication to assure the quality of their work and to disseminate their findings. Instead, military and intelligence agencies tend to provide limited disclosure of classified research to a select, security-cleared audience.
In 2013, the US intelligence community created a new classified journal on cybersecurity called the Journal of Sensitive Cyber Research and Engineering (JSCoRE).
The National Security Agency has just released a redacted version of the tables of contents of the first three volumes of JSCoRE in response to a request under the Freedom of Information Act.
JSCoRE “provides a forum to balance exchange of scientific information while protecting sensitive information detail,” according to the ODNI budget justification book for FY2014 (at p. 233). “Until now, authors conducting non-public cybersecurity research had no widely-recognized high-quality secure venue in which to publish their results. JSCoRE is the first of its kind peer-reviewed journal advancing such engineering results and case studies.”
The titles listed in the newly disclosed JSCoRE tables of contents are not very informative — e.g. “Flexible Adaptive Policy Enforcement for Cross Domain Solutions” — and many of them have been redacted.
However, one title that NSA withheld from release under FOIA was publicly cited in a Government Accountability Office report last year: “The Darkness of Things: Anticipating Obstacles to Intelligence Community Realization of the Internet of Things Opportunity,” JSCoRE, vol. 3, no. 1 (2015) (TS//SI//NF).
“JSCoRE may reside where few can lay eyes on it, but it has plenty of company,” wrote David Malakoff in Science Magazine in 2013. “Worldwide, intelligence services and military forces have long published secret journals” — such as DARPA’s old Journal of Defense Research — “that often touch on technical topics. The demand for restricted outlets is bound to grow as governments classify more information.”