Science Experiments Blocked Due to Safety Risks

The U.S. government blocked dozens of life science experiments over the past decade because they were deemed to pose undue risks to public health and safety.

Between 2006 and 2013, researchers submitted 618 potentially restricted experiment proposals for review by the Centers for Disease Control and Prevention (CDC) Division of Select Agents and Toxins (DSAT), according to a new study published in the journal Health Security.

Ninety-one of those proposals (about 15 percent) were found to meet the regulatory definition of a “restricted experiment.” Thirty-one of them were nevertheless approved because they included appropriate safety measures.

But “DSAT did not approve 60 restricted experiment requests due to potentially serious biosafety risks to public health and safety,” researchers found. “All 60 denied restricted experiments proposed inserting drug resistance traits into select agents that could compromise the control of disease.”

See Review of Restricted Experiment Requests, Division of Select Agents and Toxins, Centers for Disease Control and Prevention, 2006-2013 (abstract only) by Jacinta Smith, Denise Gangadharan, and Robbin Weyant, Health Security, Vol. 13, No. 5, September 2015: 307-316.

Regulatory restrictions on research can infringe on academic freedom and may have the unintended consequence of foreclosing important — and beneficial — avenues of scientific investigation.

But the risks involved in genetic manipulation of biological agents are so profound that almost everyone agrees that some limits are necessary and appropriate.

“A product resulting from a restricted experiment has the potential to be directly misapplied by others to pose a threat to public health and safety, agricultural crops and other plants, animals and/or the environment,” the authors wrote. “In addition, the accidental release of a product of a restricted experiment may compromise the control or treatment of the disease agent in humans, animals, and/or plants.”

There have been four reported cases involving violations of restricted experiment regulations in recent years, the authors noted. Two of the restricted experiment violations resulted in civil penalties ranging from $40,000 to $1 million.

Some say the existing regulatory regime does not go far enough to restrict hazardous research.

“In the current Wild West of otherwise completely unregulated, and otherwise nearly completely unmonitored, US pathogens research, the requirement for review of ‘restricted experiments’ under the select agent rule is the one small bright spot,” said Richard H. Ebright, a molecular biologist at Rutgers University.

He noted that current regulations specify only two categories of potentially restricted experiments, which leaves much research on pathogens beyond regulatory control or oversight.

“The most effective avenue for the [US government] to implement a requirement for review of other pathogen research projects–for example, to implement a requirement for review of pathogen research projects that create new potential pandemic pathogens–would be to add additional ‘restricted experiments’ to the select agent rule,” Dr. Ebright said.

“Controlled Unclassified Information” Is Coming

After years of preparation, the executive branch is poised to adopt a government-wide system for designating and safeguarding unclassified information that is to be withheld from public disclosure.

The new system of “controlled unclassified information” (CUI) will replace the dozens of improvised control markings used by various agencies that have created confusion and impeded information sharing inside and outside of government. A proposed rule on CUI was published for public comment on May 8 in the Federal Register.

While CUI is by definition unclassified, it is nevertheless understood to require protection against public disclosure on the basis of statute, regulation, or agency policy. In many or most cases, the categories of information that qualify as CUI are non-controversial, and include sensitive information related to law enforcement, nuclear security, grand jury proceedings, and so on.

Until now, “more than 100 different markings for such information existed across the executive branch. This ad hoc, agency-specific approach created inefficiency and confusion, led to a patchwork system that failed to adequately safeguard information requiring protection, and unnecessarily restricted information sharing,” the proposed rule said.

One of the striking features of the new CUI program is that it limits the prevailing autonomy of individual agencies and obliges them to conform to a consistent government-wide standard.

“CUI categories and subcategories are the exclusive means of designating CUI throughout the executive branch,” the proposed rule states. “Agencies may not control any unclassified information outside of the CUI Program.”

Nor do agencies get to decide on their own what qualifies as CUI. That status must be approved by the CUI Executive Agent (who is the director of the Information Security Oversight Office) based on an existing statutory or regulatory requirement, or on a legitimate agency policy. And it must be published in the online CUI Registry. There are to be no “secret” CUI categories.

Importantly, the CUI Program offers a way of validating agency information control practices pertaining to unclassified information. (A comparable procedure for externally validating agency classification practices does not exist.) But CUI status itself is not intended to become an additional barrier to disclosure.

“The mere fact that information is designated as CUI has no bearing on determinations pursuant to any law requiring the disclosure of information or permitting disclosure as a matter of discretion,” the new proposed rule said. The possibility that CUI information could or should be publicly disclosed on an authorized basis is not precluded.

More specifically, a CUI marking in itself does not constitute an exemption to the Freedom of Information Act, the rule said. However, a statutory restriction that justifies designating information as CUI would also likely make it exempt from release under FOIA.

One complication is that simply removing CUI controls does not equate to or imply public release.

“Decontrolling CUI relieves authorized holders from requirements to handle the information under the CUI Program, but does not constitute authorization for public release,” the rule said. Instead, disclosure is only permitted “in accordance with existing agency policies on the public release of information.”

The upshot is that while there can be “controlled unclassified information” that is publicly releasable, there can also be non-CUI (or former CUI) information that is not releasable. The latter category might include unclassified deliberative materials, for example, that are not controlled as CUI but are still exempt from disclosure under the Freedom of Information Act.

More subtly, noted John P. Fitzpatrick, the director of the Information Security Oversight Office, there is a large mass of material that is neither CUI nor non-CUI until someone looks at it and makes an assessment. In all such cases (other than voluntary disclosure by an agency), public access would be governed by the provisions and exemptions of the FOIA.

The genealogy of the CUI Program dates back at least to a December 16, 2005 memorandum in which President George W. Bush directed that procedures for handling what was called “sensitive but unclassified” information “must be standardized across the Federal Government.”

At that time, the impetus for standardization (which never came to fruition) was based on the need for improved sharing of homeland security and terrorism-related information. The initiative was broadened and developed in the 2010 Obama executive order 13556, which eventually led to the current proposed rule. Public comments are due by July 7.

Growing Data Collection Inspires Openness at NGA

A flood of information from the ongoing proliferation of space-based sensors and ground-based data collection devices is promoting a new era of transparency in at least one corner of the U.S. intelligence community.

The “explosion” of geospatial information “makes geospatial intelligence increasingly transparent because of the huge number and diversity of commercial and open sources of information,” said Robert Cardillo, director of the National Geospatial-Intelligence Agency (NGA), in a speech last month.

Hundreds of small satellites are expected to be launched within the next three years — what Mr. Cardillo called a “darkening of the skies” — and they will provide continuous, commercially available coverage of the entire Earth’s surface.

“The challenges of taking advantage of all of that data are daunting for all of us,” Mr. Cardillo said.

Meanwhile, the emerging “Internet of Things” is “spreading rapidly as more people carry more handheld devices to more places,” generating an abundance of geolocation data.

This is, of course, a matter of intelligence interest since “Every local, regional, and global challenge — violent extremism in the Middle East and Africa, Russian aggression, the rise of China, Iranian and North Korean nuclear weapons, cyber security, energy resources, and many more — has geolocation at its heart.”

Consequently, “We must open up GEOINT far more toward the unclassified world,” Director Cardillo said in another speech last week.

“In the past, we have excelled in our closed system. We enjoyed a monopoly on sources and methods. That monopoly has long since ended. Today and in the future, we must thrive and excel in the open.”

NGA has already distinguished itself in the area of disaster relief, Mr. Cardillo said.

“Consider Team NGA’s response to the Ebola crisis. We are the first intelligence agency to create a World Wide Web site with access to our relevant unclassified content. It is open to everyone — no passwords, no closed groups.”

NGA provided “more than a terabyte of up-to-date commercial imagery.”

“You can imagine how important it is for the Liberian government to have accurate maps of the areas hardest hit by the Ebola epidemic as well as the medical and transportation infrastructure to combat the disease,” Mr. Cardillo said.

But there are caveats. Just because information is unclassified does not mean that it is freely available.

“Although 99 percent of all of our Ebola data is unclassified, most of that is restricted by our agreements [with commercial providers],” Mr. Cardillo said. “We are negotiating with many sources to release more data.”

Last week, Director Cardillo announced a new project called GEOINT Pathfinder that will attempt “to answer key intelligence questions using only unclassified data.”

When it comes to transparency, the Office of the Director of National Intelligence recently expressed the view that the U.S. intelligence community should make “information publicly available in a manner that enhances public understanding of intelligence activities, while continuing to protect information when disclosure would harm national security.”

But some intelligence agencies have chosen a different path.

At the CIA, for example, public access to unclassified translations and analytical products of the Open Source Center was abruptly terminated at the end of 2013. Such materials from the OSC and its predecessor, the Foreign Broadcast Information Service, had provided invaluable support to generations of scholars, students, and foreign policy specialists. But that is no longer the case.

Making Government Accountability Work

The U.S. Constitution does not explicitly recognize a “public right to know.” But without reliable public access to government information, many features of constitutional government would not make sense. Citizens would not be able to evaluate the performance of their elected officials. Freedom of speech and freedom of the press would be impoverished. Americans’ ability to hold their government accountable for its actions would be neutered.

The conditions that make government accountability possible and meaningful are the subject of the new book Reclaiming Accountability by Heidi Kitrosser (University of Chicago Press, 2015).

The author introduces the term “substantive accountability,” which she contrasts with mere “formal accountability.” While formal accountability includes such things as the right to vote, substantive accountability requires that people “have multiple opportunities to discover information relevant to their votes….”

This may seem obvious, but the trappings of formal accountability are often unsupported by the information that is needed to provide the substance of accountability, especially in matters of national security.

Kitrosser, a professor of law at the University of Minnesota Law School, shows that the principles of substantive accountability are deeply rooted in the text, structure and history of the Constitution. She uses those principles to provide a framework for evaluating contemporary assertions of presidential power over information, including executive privilege, state secrets, secret law, and prosecutions of unauthorized disclosures.

It cannot be the case, for example, that unauthorized disclosures of classified information are categorically prohibited by law and also that the President has discretion to classify information as he sees fit. If that were so, she explains, then the President would have unbounded authority to criminalize disclosure of information at will, and the classification system would have swallowed the First Amendment. As she writes: “The First Amendment’s promise would be empty indeed if its protections did not extend to information that the president wishes to keep secret.”

Kitrosser reviews the relevant case law to find openings and lines of argument that could be used to bolster the case for substantive accountability. She notes that Supreme Court rulings over the years “contain the seeds of an affirmative case for strongly protecting classified speakers.” In a 1940 ruling in Thornhill v. Alabama, for example, the Court declared that “The freedom of speech and of the press guaranteed by the Constitution embraces at the least the liberty to discuss publicly and truthfully all matters of public concern without previous restraint or fear of subsequent punishment.”

There is, of course, an opposing school of thought which posits a largely unconstrained presidential authority over government information. Moreover, this presidentialist view has been on “an upward historical trajectory” in recent decades. Leak investigations and prosecutions have risen markedly, and so have assertions of the state secrets privilege. Secret law blossomed after 9/11. The very term “executive privilege” is a modern formulation that only dates back to 1958 (as noted by Mark Rozell).

One of the deeply satisfying features of Kitrosser’s book (which is a work of scholarship, not a polemic) is her scrupulous and nuanced presentation of the presidential supremacist perspective. Her purpose is not to ridicule its weakest arguments, but to engage its strongest ones. To that end, she traces its origins and development, and its various shades of interpretation. She goes on to explain where and how substantive accountability is incompatible with presidential supremacy, and she argues that the supremacist viewpoint misreads constitutional history and is internally inconsistent.

The book adds analytical rigor and insight to current debates over secrecy and accountability, which it ultimately aims to inspire and inform.

“We can seek to harness and support those aspects of American law, politics, and culture that advance substantive accountability,” she writes.

“Reclaiming accountability is no single act. From internal challenges or external leaks by civil servants, to journalistic inquiries and reports, to congressional oversight, to FOIA requests, accountability is claimed and reclaimed every day by countless actors in myriad ways.”

Big Data: Stealth Control

Everyone who uses the Internet is implicated in a web of data collection, which relies on user data to produce the tailored advertising revenue that supports growth and free use. This digital profiling produces “the black box society,” in which basic societal functions are performed in deliberate obscurity via the collection and algorithmic manipulation of personal data such as location, age, and political affiliation. In a new study, law professor Frank Pasquale examines how these algorithms control money and information, and how algorithmic decision-making is taking society to a dangerous place.

Steven Aftergood, Director of the Government Secrecy Project, examines Pasquale’s study and the impacts of personal data collection in a new article published in Nature. Hiding black-box practices from public evaluation and inspection hinders independent oversight, error correction, and free-market competition.

Read the article here. 

Classification May Impede Treatment for Vets

National security secrecy can be an impediment to veterans who are seeking treatment for traumas suffered during military service yet who are technically prohibited from disclosing classified information related to their experience to uncleared physicians or therapists.

The problem was epitomized by the case of U.S. Army Sgt. Daniel Somers, who participated in classified Special Operations missions in Iraq. He returned with significant physical, mental and psychological damage. He killed himself in June 2013.

Secrecy, among other factors, appears to have exacerbated his condition, according to Rep. Kyrsten Sinema (D-AZ).

“One of the struggles Daniel faced was as an individual who had served in classified service,” Rep. Sinema said at a hearing last July. “He was unable to participate in group therapy because he was not able to share [what] he experienced while in service.”

To address this problem, Rep. Sinema last week re-introduced the Classified Veterans Access to Care Act, HR 421.

“The Classified Veterans Access to Care Act ensures that veterans with classified experiences have appropriate access to mental health services from the Department of Veterans Affairs,” she said in a release.

The bill itself would require the Secretary of Veterans Affairs “to ensure that each covered veteran may access mental health care provided by the Secretary in a manner that fully accommodates the obligation of the veteran to not improperly disclose classified information.”

The Classified Veterans Access to Care Act was originally introduced in October 2013 (as HR 3387). But although it had, and has, bipartisan support, it was not acted on in the 113th Congress. Nor are its prospects for passage in the new Congress clear. Still, there is nothing to prevent the Department of Veterans Affairs from addressing the underlying issue, and fixing the problem, without awaiting the formal enactment of Rep. Sinema’s legislation.

“The V.A. welcomes criticism but also needs constructive ideas to succeed,” wrote Drs. Marsden McGuire and Paula Schnurr in a letter to the New York Times last week. “The V.A. is actively engaging community partners, academia, advocates, the private sector and, most important, veterans and their families, to improve services.”

The parents of Sgt. Daniel Somers described his experience, and theirs, in “On Losing a Veteran Son to a Broken System,” New York Times, November 11, 2013.

According to the latest Department of Defense annual report on suicide, “The suicide rate per 100,000 [military personnel] in 2013 was 18.7 for active component service members, 23.4 for reserve component and 28.9 for National Guard.”

That is a decline from the rate the year before. But figures from early 2014 indicate a further increase in suicide among active duty service members.

New Literature on Secrecy

National security secrecy, which remains a source of conflict and consternation, inspires a steady flow of books and journal articles. As in other policy-related fields, much of this literature is tendentious, derivative or dull. Some of it is insightful, original or usefully provocative.

Most works naturally occupy a middle ground including both virtues and defects. Two highly original works on secrecy in recent years — Daniel Patrick Moynihan’s Secrecy: The American Experience and Garry Wills’ Bomb Power — also have significant conceptual flaws and factual errors. That is to say, it is hard to write a good book about secrecy.

With that in mind, here are some notable recent additions to the literature.

**     Secrecy in the Sunshine Era: The Promise and Failure of U.S. Open Government Laws by Jason Ross Arnold (University Press of Kansas, 542 pages, 2014).

This is a study of the impact of laws such as the Freedom of Information Act and the Federal Advisory Committee Act. It presents a survey of how these open government laws were implemented through successive administrations and how they were sometimes circumvented.

“The sunshine laws of the 1970s substantially revised the way information flowed through the American political system,” writes Arnold. “It is hard to deny that the new legal framework placed serious constraints on executive branch officials.”

Nevertheless, “excessive secrecy still reigned in the sunshine era,” he concludes. “All administrations did what they could… to twist around the statutes when they deemed it necessary. All diverged from their own pro-transparency rhetoric and rules.”

**     A Proposal to Reduce Government Overclassification of Information Related to National Security by Herbert Lin, Journal of National Security Law & Policy, Vol. 7, No. 3, 2014.

This article focuses on the perennial problem of overclassification and proposes a solution. It would seek to alter the incentives that currently favor (over)classification by establishing new incentives to reduce classification.

“Classification should not be a free good,” Lin writes. He defines a classification cost metric that would reflect the relative importance of different classified documents, and that would make it possible to “budget” for classification.

Through the application of appropriate incentives, “Those who actually make decisions about classification should benefit from reductions in the amount of classified information produced.”
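Lin’s article is a policy proposal rather than a technical specification, but the incentive mechanism he describes can be pictured as a simple accounting exercise. The minimal Python sketch below is only an illustration of the general idea of a classification “budget”; the cost weights, document categories, and budget figure are assumptions invented for the example, not values drawn from the article.

```python
# Hypothetical sketch of a "classification budget" in the spirit of Lin's
# proposal. All names, weights, and numbers here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class ClassifiedDocument:
    title: str
    level: str   # e.g. "CONFIDENTIAL", "SECRET", "TOP SECRET"
    pages: int


# Assumed cost weights: higher classification levels "cost" more per page,
# so an office draws down its budget faster the more highly it classifies.
COST_PER_PAGE = {
    "CONFIDENTIAL": 1.0,
    "SECRET": 3.0,
    "TOP SECRET": 10.0,
}


def classification_cost(doc: ClassifiedDocument) -> float:
    """Return the notional cost of one classification decision."""
    return COST_PER_PAGE[doc.level] * doc.pages


def remaining_budget(budget: float, docs: list[ClassifiedDocument]) -> float:
    """Deduct the cost of each classification decision from an office's budget."""
    return budget - sum(classification_cost(d) for d in docs)


if __name__ == "__main__":
    docs = [
        ClassifiedDocument("Program review", "SECRET", 40),
        ClassifiedDocument("Threat assessment", "TOP SECRET", 12),
    ]
    # 1000 - (3.0*40 + 10.0*12) = 760.0 remaining
    print(remaining_budget(1000.0, docs))
```

In such a scheme, an office that classified less, or at lower levels, would keep more of its budget, which is the incentive reversal Lin argues for.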

The author anticipates several objections to his idea, and offers responses to them.

**     Lords of Secrecy: The National Security Elite and America’s Stealth Warfare by Scott Horton (Nation Books, 272 pages, 2015).

Sometimes secrecy is not simply an annoying artifact of national security bureaucracy, but is itself a weapon in the struggle for power. The use of secrecy in this way is corrosive and has now become disabling to American democracy, according to author Scott Horton.

While most national security attention is focused on threats from abroad, Horton says “the more serious threats to American democracy are internal. They stem from a steady transfer of democratic decision making and authority away from the people and to unelected elites. This has occurred both with respect to the disproportionate grasp of power by wealthy super elites, and by the rise of national security elites who increasingly take the key decisions about national security matters without involving the people in any meaningfully democratic process.”

“More effectively than before, they use secrecy not only to cover up their past mistakes but also to wrest from the public decisions about the future that properly belong to the people.”

Invention Secrecy Orders Reach a 20-Year High

On October 27, 1977, Dr. Gerald F. Ross filed a patent application for a new invention he had devised to defeat the jamming of electromagnetic transmissions at specified frequencies. But it was not until June 17, 2014 — nearly 37 years later — that his patent was finally granted (Anti-jam apparatus for baseband radar systems, patent number 8,754,801).

In the interim, Dr. Ross’s patent application had been subject to a secrecy order under the Invention Secrecy Act of 1951, which both prevented issuance of the patent and prohibited its public disclosure.

At the end of Fiscal Year 2014 (on September 30), there were 5,520 such invention secrecy orders in effect, according to statistics released by the U.S. Patent and Trademark Office under the Freedom of Information Act.

That is the highest number of invention secrecy orders in effect since 1994. It is unclear whether this reflects growing innovation in sensitive technology areas, or a more restrictive approach to disclosure by government agencies.

In fact, the overwhelming majority of current secrecy orders were issued in prior years, but 97 new secrecy orders were imposed in FY 2014. Meanwhile, 22 existing orders were rescinded, including the one concerning Dr. Ross’s invention.

Under the Invention Secrecy Act, secrecy orders may be imposed whenever, in the judgment of an executive branch agency, the disclosure of a patent application would be “detrimental to the national security.” This is a lower, less demanding standard than that for national security classification (which applies to information that could “cause damage to national security”) and not all secret inventions are classified. Some may be unclassified but export controlled, or otherwise restricted.

Other newly disclosed inventions whose secrecy orders were rescinded by the government during the past year include the following (according to data obtained from the Patent and Trademark Office):

Method of producing warheads containing explosives, patent number 8,689,669

Method of treating a net made from ultra-high-molecular-weight polyethylene, patent number 8,808,602

Ballistic modification and solventless double propellant, and method thereof, patent number 8,828,161

Ballistic modifier formulation for double base propellant, patent number 8,864,923

Synthetic aperture radar smearing, patent number 8,836,569

NARA Backs Away from CIA Email Destruction Proposal

The National Archives and Records Administration told the Central Intelligence Agency last week that it was withholding approval of a CIA proposal to allow the destruction of the email records of all but 22 senior Agency officials.

“NARA intends to reassess the Central Intelligence Agency (CIA) proposal for the disposition of non-senior email accounts,” wrote Paul M. Wester, Jr., Chief Records Officer at NARA in a November 20 letter to Joseph Lambert, Director of Information Management Services at CIA.

“Based on comments from Members of the U.S. Senate Select Committee on Intelligence and a number of public interest groups, we are concerned about the scope of the proposed schedule and the proposed retention periods,” Mr. Wester wrote.

Based on a preliminary review of the CIA proposal, NARA had initially recommended approval of the plan, Secrecy News reported last month. (“CIA Asks to Destroy Email of Non-Senior Officials,” October 1.)

But critical comments that were submitted to NARA — from the Federation of American Scientists and other public interest groups and individuals, from the Department of Defense Chief Defense Counsel, and especially from Senators Feinstein and Chambliss, the leaders of the Senate Intelligence Committee, and Senators Wyden, Udall and Heinrich, Members of the Committee — turned the tide and blocked the proposal in its current form.

“We will hold a public meeting on this schedule in the coming months to address the comments raise[d] by you and others and to share how NARA is moving forward,” wrote Margaret Hawkins of NARA Records Management Services in an email message today. “This meeting will be announced in the Federal Register and will be open to all commenters and the public.”

For related coverage, see: “The CIA Wants To Delete Old Email; Critics Say ‘Not So Fast'” by David Welna, NPR All Things Considered, November 20; “Top Senators Oppose CIA Move to Destroy Email” by Siobhan Gorman, Wall Street Journal, November 19; “National Archives: Ok, So Maybe Letting The CIA Destroy Emails Wasn’t A Great Idea” by Ali Watkins, Huffington Post, November 21; and “Furor Over CIA Shake-Up of Email System” by Adam Klasfeld, Courthouse News Service, November 7.

CIA Asks to Destroy Email of Non-Senior Agency Officials

The Central Intelligence Agency has asked for authority to destroy email messages sent by non-senior officials of the Agency. The National Archives and Records Administration (NARA) has tentatively approved the proposal.

In an August 18 appraisal of the CIA request, Meredith Scheiber of NARA wrote that any permanently valuable material in the emails would almost certainly be captured in other permanent CIA records.

“It is unlikely that permanent records will be found in these email accounts that is not filed in other appropriate files appraised as permanent,” the appraisal said.

“There are multiple records systems to capture the actions and decisions of employees and multiple internal controls in place in the event an employee was engaged in malicious activities.”

Any “remaining email not captured in other recordkeeping systems is routine or administrative in nature; transitory; or personal in nature.”

The NARA appraisal of the CIA proposal noted in passing that “The Agency’s current email policy is to print and file” rather than to save permanently valuable email in softcopy format.

“The average career of an Agency employee is 22 years,” the NARA appraisal also observed.

The CIA proposal for email disposal authority and the accompanying NARA appraisal were announced for public comment in the Federal Register on September 17.