Govt Will Not Declassify 2001 Opinion on Surveillance

The Department of Justice refused this month to declassify a 2001 Office of Legal Counsel opinion by John C. Yoo concerning the legality of the Bush Administration’s warrantless surveillance program.

The redacted information in the OLC opinion “is classified, covered by non-disclosure provisions contained in other federal statutes, and is protected by the deliberative process privilege,” wrote Paul P. Colborn, Special Counsel at OLC.

The document had been requested by researcher Matthew M. Aid, who writes on NSA and surveillance policy.

Eight partial sentences from the 21-page opinion were released, including a previously declassified assertion that “unless Congress made a clear statement in FISA that it sought to restrict presidential authority to conduct warrantless searches in the national security area — which it has not — then the statute must be construed to avoid such a reading.”

That claim alone has drawn criticism from some members of Congress.

“I cannot reconcile the plain language of FISA that it is the exclusive procedure for electronic surveillance of Americans with the OLC opinion saying Congress didn’t say that,” Sen. Sheldon Whitehouse told the Washington Post in a May 23, 2008 story. “Once again, behind the veil of secrecy, OLC appears to have cooked up extravagant or misguided legal theories which would never survive the light of day.”

“The 2001 statement addressing FISA does not reflect the current analysis of the department,” wrote Justice official Brian A. Benczkowski, quoted in the Post.

Court Denies Motions to Dismiss Kim Leak Case

A federal court yesterday rejected (pdf) multiple defense motions to dismiss Espionage Act charges against former State Department contractor Stephen Kim, who is accused of leaking classified information to a Fox News reporter.

Mr. Kim’s defense team had marshaled a series of seemingly ingenious arguments for dismissal.  The use of the Espionage Act to punish “political crimes” such as leaking is prohibited by the Constitution’s Treason Clause, one defense motion said.  Further, the language of the statute appears to prohibit the unauthorized disclosure of tangible items, such as documents, but not of “information,” which cannot be surrendered on demand.  Also, the defense argued, the Espionage Act is impermissibly vague and ambiguous with respect to oral disclosures.  Finally, prohibitions against leaks are enforced and prosecuted rarely and unpredictably, rendering those rare cases intrinsically unfair.

None of these arguments gained any traction with the court, though the defense discussion of the Treason Clause was deemed “compelling and eloquent.” Judge Colleen Kollar-Kotelly dismissed all of the defendant’s objections in a memorandum opinion on August 24.

“To the extent that Defendant intends to argue that the information he is charged with leaking was previously disclosed or was not properly classified, he may do so as part of his defense,” Judge Kollar-Kotelly wrote, “but such arguments do not render the statute vague” or otherwise invalid.

The new ruling probably does not come as a big surprise to any of the parties.  The surprise would have been if the court had overturned or reinterpreted the Espionage Act.  Instead, the ruling leaves the existing legal apparatus in place, and clears the path for trial.

The Kim case is one of five Espionage Act prosecutions brought by the Obama Administration for alleged unauthorized disclosures of classified information.  A New York Times editorial today scolded the Administration for what it called the “misguided” use of the Act.

“With no allegation of a motive or intended harm to the US, the government’s use of Stephen as an example to deter the leaking of information is inappropriate,” according to a statement on Mr. Kim’s own web site.

A Correction on Nuclear Secrecy

On August 22 (“Some New Wrinkles in Nuclear Weapons Secrecy”), Secrecy News mistakenly wrote that the SILEX uranium enrichment process is “a unique case in which information that was privately generated was nevertheless classified by the government.  As far as could be determined, the decision to classify this non-governmental information under the Atomic Energy Act is the first and only time that such authority has been exercised.”  That was inaccurate.

Dr. Andrew Weston-Dawkes, the director of the Department of Energy Office of Classification, said that offhand he was aware of at least one other such case of classification of privately-generated information.  It involved “an AVLIS-like technology,” he said, referring to “atomic vapor laser isotope separation.”

Bryan Siebert, the former director of the DOE Office of Classification, said his recollection was that some of the laser fusion technology developed by the private company KMS Fusion in the early 1970s was also considered to be classified, “a long time before SILEX.”  An account of the KMS Fusion case —  which, he said, is “inaccurate in many ways” — is available from Wikipedia here.

Beyond that, said Dr. Weston-Dawkes, “there’s a long history of us going out to people [in the private sector] saying ‘you’re doing stuff’ [that needs to be reviewed for classification].”

He pointed to a 1972 public notice (pdf) issued by the Atomic Energy Commission.  It instructed “any person” working on isotope separation techniques to notify the Commission whenever a separation process has been demonstrated “so the Commission can give him appropriate classification and reporting guidance.”

There are many other instances in which individual authors have tangled with Department of Energy classification officials concerning the publication of information that DOE believed to be classified, such as Howard Morland’s article on the H-Bomb that was the subject of The Progressive case in 1979.  But those disputes involved previously generated and previously classified information, not qualitatively new inventions or developments.

NRO Has “Most Aggressive” Launch Record in 25 Years

The National Reconnaissance Office (NRO), which builds, launches and operates the nation’s intelligence satellites, has been unusually active over the past year.

“We are nearly through the most aggressive launch campaign in over 25 years,” said Betty J. Sapp, the NRO Principal Deputy Director, at a March 15, 2011 hearing of the House Armed Services Committee.  The record of that hearing was published (pdf) last month.

“We have successfully launched five satellites into orbit in the last six months, with one more launch planned next month,” she said in March.  “These successful launches have been a very important and visible reminder of the space reconnaissance mission NRO started 50 years ago, and continues with such great success today.”

The full record of the March 15 hearing provides an unclassified overview of national security space programs.  See “Budget Request for National Security Space Activities,” House Armed Services Committee.

Among other interesting points raised at the hearing, Gen. William L. Shelton of Air Force Space Command discussed the Air Force’s reliance on NASA’s aging Advanced Composition Explorer (ACE) satellite to detect disruptive solar activity.

“Located at a stationary point approximately 1 million miles between the Earth and the Sun, it gives us 30-90 minutes warning before the detected solar disturbance reaches the Earth and our space assets,” Gen. Shelton said in response to a question for the record.  “This enables us to implement measures to protect our space systems and services.”
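A rough back-of-the-envelope calculation (not part of the hearing record) shows where the 30-90 minute figure comes from: it is simply the distance from the L1 observation point to Earth divided by the speed of the solar wind. The sketch below assumes typical solar wind speeds of roughly 300-800 km/s, values that are not given in the testimony.

```python
# Illustrative only: warning time from an L1 solar wind monitor is roughly
# the Earth-L1 distance divided by the solar wind speed. The speed range
# used here (300-800 km/s) is an assumed "typical" range, not a figure
# from Gen. Shelton's testimony.

L1_DISTANCE_KM = 1.5e6  # about 1 million miles sunward of Earth

for speed_km_s in (300, 500, 800):  # slow, average, and fast solar wind
    warning_minutes = L1_DISTANCE_KM / speed_km_s / 60
    print(f"{speed_km_s} km/s solar wind -> ~{warning_minutes:.0f} minutes of warning")

# Prints roughly 83, 50 and 31 minutes, consistent with the 30-90 minute
# window cited in the hearing.
```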

This year’s 50th anniversary of the NRO (established in 1961) will be accompanied by some new declassification activity.  “Almost all” of the historical intelligence imagery from the KH-9 satellite (1971-1986) will be declassified within a few months, said Douglas G. Richards of the Joint Staff at an August 23 forum sponsored by the National Declassification Center.

Open Up Open Source Intelligence

If the Obama Administration wants to advance the cause of open government, one particularly fruitful way to do so would be to share unclassified open source intelligence publications with the public.

The Federation of American Scientists offered that suggestion in response to a White House call for public input into the development of the pending Open Government Plan.

“The U.S. Government should adopt a policy of publishing all non-sensitive products generated by the Director of National Intelligence Open Source Center,” we wrote. “Doing so would serve to enrich the online domain with uniquely high-value content on a broad range of national security and foreign policy topics. It would foster increased public awareness and understanding of national security and foreign policy affairs. And it would provide the public with a tangible ‘return on investment’ in this vital area of national policy.”

The U.S. Open Government Plan is being developed as part of the multi-national Open Government Partnership that is to be launched next month.  The White House solicited public input to the process in an August 8 blog posting.

The Institutionalization of Open Source Intelligence

The battle for public access to open source intelligence may have been lost before most people even knew it began, judging from the new book, “No More Secrets: Open Source Information and the Reshaping of U.S. Intelligence” by Hamilton Bean (Praeger, 2011).

“No More Secrets” is an academic work, not an exposé.  But it is an exceptionally stimulating one that brings the theoretical principles of organization management and communications theory to bear on intelligence policy in original and insightful ways.

As Bean shows in depth, the meaning of “open source” has been fiercely contested, beginning with the very definition of the term (which generally refers to policy-relevant information that can be acquired legally).  Other disputed questions include: Whom does open source serve?  Is it only for policymakers, or also the public?  Who should perform the open source mission?  Should it be housed within the intelligence community or outside of it?  Which aspect of “open source intelligence” dominates?  Is it the logic of openness or the logic of secrecy?

For the most part, these questions have now been answered, at least provisionally.  Open source intelligence is for policymakers, not the public.  It is part of the intelligence community, not separate from it.  The logic of secrecy, not openness, is primary.  “Intelligence officials have successfully marginalized” those who would argue differently, Bean says.

Among several fateful turning points in the current institutionalization of open source intelligence, Bean highlights a conflict between Robert Steele, a former CIA officer and Marine Corps open source advocate, and Eliot Jardines, who served as the senior ODNI official on open source.

While Steele favored an open, expansive and inclusive vision of open source intelligence, “Jardines sought to institutionalize the collection and analysis of open source within the U.S. intelligence community in ways that did not overtly challenge the dominant institutional logic of secrecy.”  In 2005 or thereabouts, Jardines won that battle, and “those who share Steele’s vision of an independent open source agency find their ability to affect change similarly constrained,” the author says.  (Steele’s own review of the book is here.)

Of course, there is no reason why the status quo must be perpetuated indefinitely.  In fact, Prof. Bean notes, “many stakeholders… are still, to this day, actively struggling to institutionalize their preferred meanings of open source….”

Secrecy News is cited a couple of times in the book and the Federation of American Scientists makes an appearance in this peculiar sentence:  “A principal reason that WikiLeaks, Public Intelligence, Cryptome, and FAS are controversial is because they threaten to rupture distinctions between open and secret information and destabilize conventional notions of authority, expertise and control.”

Some New Wrinkles in Nuclear Weapons Secrecy

A newly released intelligence guide to document classification markings explains the meaning and proper use of control markings to designate classified information.  See “Authorized Classification and Control Markings Register” (pdf), CAPCO, Volume 4, Edition 2, May 31, 2011.  (See also the associated Implementation Manual of the same date.)

This material is very detailed, comprehensive and quite informative, with only a few redacted passages pertaining to some code word usages.

But though it is only three months old, it is already out of date due to the constant churning within the classification system that regularly generates new marking requirements and cancels old, familiar ones.  This has been particularly true lately with respect to changes in markings for “Restricted Data,” or classified nuclear weapons information.

Thus, the intelligence guide to classification marking refers to the so-called “Sigma” system for marking Restricted Data.  Each Sigma level refers to a particular aspect of nuclear weapons design.  According to the intelligence community guide, the Sigma system extends from Sigma 1 to Sigma 15 and also Sigma 20.  But that is no longer accurate.

In a July 2011 order (pdf), the Department of Energy determined that Sigma levels 1 through 5 and 9 through 13 are now obsolete.  So they have been disestablished.  Meanwhile, a new Sigma category, Sigma 18, has been created to address “Control of Complete Designs” and to protect “past and present U.S. nuclear weapons, nuclear devices and weapon designs.”  See “Control of Nuclear Weapon Data,” DoE Order 452.8, July 21, 2011.

At this late date in the nuclear era, there are still other “innovations” in nuclear technology and nuclear secrecy.  The New York Times reported last weekend on an apparent breakthrough in the use of lasers to enrich uranium.  This laser enrichment process, known as SILEX (Separation of Isotopes by Laser Excitation), also poses new proliferation issues.  See “Laser Advances in Nuclear Fuel Stir Terror Fear” by William J. Broad, August 21.

Though the Times story did not mention it, the SILEX process is also a unique case in which information that was privately generated was nevertheless classified by the government.  As far as could be determined, the decision to classify this non-governmental information under the Atomic Energy Act is the first and only time that such authority has been exercised.  [Update, August 25: This is not correct. See A Correction on Nuclear Secrecy.] See this 2001 “Record of Decision to Classify Certain Elements of the SILEX Process as Privately Generated Restricted Data.”  (See also “A Glimpse of the SILEX Uranium Enrichment Process,” Secrecy News, August 22, 2007.)

For its part, the Department of Defense issued a new Instruction last week on “Disclosure of Atomic Information to Foreign Governments and Regional Defense Organizations” (DoDI 5030.14, August 17, 2011).

Sterling Defense Argues Against Secret Evidence

Prosecutors in the case of former CIA officer Jeffrey Sterling, who is accused of leaking classified information, should not be permitted to present their evidence at trial in modified or redacted form and should also not be able to employ other extraordinary security measures, defense attorneys argued in an August 19 pleading (pdf).

Specifically, the defense team said that prosecutors should not be allowed to use the provisions of the Classified Information Procedures Act (CIPA) to introduce unclassified substitutions for classified evidence that they wish to present.

The purpose of CIPA, the defense said, is to allow the defendant to present exculpatory classified evidence in an unclassified form while preventing “graymail,” i.e. the threat to disclose classified information as a tactic for evading prosecution.

But CIPA does not entitle prosecutors to introduce their own classified evidence in redacted form, the defense argued, particularly since “the Government… cannot ‘graymail’ itself.”  Instead, the prosecution “must either declassify information it wishes to use in its case-in-chief or forego using that information.”

“What CIPA does not provide is the ability of the Government to prosecute a defendant using substitute or redacted evidence against him in its case-in-chief,” the defense said.

The Sterling defense also objected to the prosecutors’ proposed use of the “silent witness” rule, by which classified information is shared with the jury but not disclosed in open court.

The silent witness rule is fundamentally unfair, is not authorized by law, is possibly unconstitutional, and should not be approved by the court, the defense said.  (The proposal to invoke the silent witness rule was first reported by Josh Gerstein in Politico on August 10.)

“It will be impossible effectively to contest and challenge the Government’s evidence before a jury if the Court permits use of the silent witness rule, which would impermissibly provide the stamp of secrecy and national security importance to information that the Government has elected to disclose in a criminal trial where those very issues are contested.  The Court must decline this invitation to conduct an unfair and constitutionally impermissible trial,” the defense said.

For similar reasons, the defense also objected to the proposed use of security measures such as initials and screens to conceal the identities of government witnesses.

“The Department of Justice, surely after consultations with the CIA, approved this prosecution,” the defense pleading said.  “In doing so, it should have expected an open and public trial that featured all of the Constitutional protections afforded a defendant in the United States.”

Navy: Excessive Security Can Degrade Effectiveness

There can be such a thing as too much security, the Navy said in a new Instruction on “Operations Security” (pdf) or OPSEC.

OPSEC refers to the control of unclassified indicators that an adversary could use to derive “critical information” (CI) concerning military or intelligence programs.

“Properly applied, OPSEC contributes directly to operational effectiveness by withholding CI from an adversary, thereby forcing an adversary’s decisions to be based on information friendly forces choose to release,” the new Navy Instruction said. “Inadequate OPSEC planning or poor execution degrades operational effectiveness by hindering the achievement of surprise.”

But even if adequately planned and executed, not all OPSEC is necessary or useful;  sometimes it is actually counterproductive.

“Excessive OPSEC countermeasures… can degrade operational effectiveness by interfering with the required activities such as coordination, training and logistical support,” the Instruction said.  See “Operations Security,” OPNAV Instruction 3432.1A, 4 August 2011.

Unfortunately, the Instruction does not and perhaps cannot provide criteria for distinguishing between proper OPSEC and excessive OPSEC.  Instead, it directs commanders and program managers to “evaluate” each operation and draw the appropriate conclusions.  What if the program manager is shortsighted or simply makes a mistake?  What if OPSEC is justified from a security perspective, but also undermines government accountability or public confidence in government integrity?  The Instruction has nothing to say about that.

Because of the subjective element in such decisions, the use of OPSEC (like the application of national security classification controls) is often arbitrary and disputed.

After 30 U.S. servicemen, including 17 Navy SEALs, were killed in Afghanistan on August 6 when their helicopter was shot down, U.S. Special Operations Command asked that the names of the SEALs not be disclosed for security reasons.  Secretary of Defense Leon Panetta rejected that view and the names were released by the Pentagon yesterday.

But in a questionable nod to OPSEC, the name of the unit to which the SEALs were attached — the Naval Special Warfare Development Group (DEVGRU) — was not cited by the Pentagon, Bloomberg News reported.  Instead, the DoD press release referred only to “an East Coast-based Naval Special Warfare unit.”  Yet the Navy itself has previously acknowledged and referred by name to the same SEAL unit.  See “Pentagon Releases Identities of SEALs Killed, Not Unit Name” by Tony Capaccio, Bloomberg News, August 11.

Information Sharing Still a Work in Progress

While information sharing among government agencies has increased dramatically over the past decade, it still falls short in some areas.

Due to “impediments to intelligence information sharing between U.S. forces and coalition partners,” information sharing with U.S. allies in Afghanistan has faltered to the detriment of the military mission, the Inspector General of the Department of Defense said in a mostly classified report last month.

Continuing impediments have “resulted in information not being tactically useful by the time it is authorized for release,” the Inspector General said.  See “Results in Brief: Improvements Needed in Sharing Tactical Intelligence with the International Security Assistance Force Afghanistan,” excerpted from DoD Inspector General Report 11-INTEL-13, July 18, 2011.

The 2011 Annual Report on the DNI Information Sharing Environment (pdf) said that “steady progress has been made” in information sharing, especially with respect to homeland security and law enforcement.

Among other things, the Report noted that the intelligence community intranet called Intelink “recently crossed the 100 million document threshold for records exposed to Intelink search services…. In one month alone this year, Intelink recorded over two million searches.”  Such datapoints “highlight the ability of IC personnel to access more information quicker and more effectively, enabling them to better share information and thus perform their missions,” the Report said.

Another recent report from the Government Accountability Office said the Information Sharing Environment still had not identified its desired “end state.”  Six years after it was created, “there is not a clear definition of what the ISE is intended to achieve and include.”  See “Information Sharing Environment: Better Road Map Needed to Guide Implementation and Investments” (pdf), Government Accountability Office report GAO-11-455, July 2011.

It should be understood that “information sharing” is quite different from “information disclosure,” and the two practices are usually at odds.  In fact, the prerequisite for most so-called information sharing is an official assurance that the information to be shared will not be disclosed to unauthorized persons such as members of the general public.

Executive Order Responding to WikiLeaks Due Shortly

The Obama Administration is putting the finishing touches on a new executive order that is intended to improve the security of classified information in government computer networks as part of the government’s response to WikiLeaks.

The order is supposed to reduce the feasibility and the likelihood of the sort of unauthorized releases of classified U.S. government information that have been published by WikiLeaks in the past year.

According to an official who has reviewed recent drafts, the order addresses gaps in policy for information systems security, including characterization and detection of the insider threat to information security.  It does not define new security standards, nor does it impose the security practices of intelligence agencies on other agencies.  (“It doesn’t say, ‘go polygraph everybody’,” the official said.)

Rather, the order establishes new mechanisms for “governance” and continuing development of security policies for information systems.  Among other things, it builds upon the framework established — but not fully implemented — by the 1990 National Security Directive 42 (pdf), the official said.

The order, developed on a relatively fast track over the past nine months, has already gone through two rounds of interagency coordination and is expected to be issued within a matter of weeks.

Offensive Cyber Tools to Get Legal Review, Air Force Says

Even the most highly classified offensive cyberwar capabilities that are acquired by the Air Force for use against enemy computer systems will be subject to “a thorough and accurate legal review,” the U.S. Air Force said in a new policy directive (pdf).

The directive assigns the Judge Advocate General to “ensure all cyber capabilities being developed, bought, built, modified or otherwise acquired by the Air Force that are not within a Special Access Program are reviewed for legality under LOAC [Law of Armed Conflict], domestic law and international law prior to their acquisition for use in a conflict or other military operation.”

In the case of cyber weapons developed in tightly secured Special Access Programs, the review is to be performed by the Air Force General Counsel, the directive said.  See “Legal Reviews of Weapons and Cyber Capabilities,” Air Force Instruction 51-402, 27 July 2011.

The Air Force directive is somewhat more candid than most other official publications on the subject of offensive cyber warfare.

Thus, “for the purposes of this Instruction, an Air Force cyber capability requiring a legal review prior to employment is any device or software payload intended to disrupt, deny, degrade, negate, impair or destroy adversarial computer systems, data, activities or capabilities.”

On the other hand, cyber capabilities requiring legal review “do not include a device or software that is solely intended to provide access to an adversarial computer system for data exploitation,” the directive said.

One challenge facing such legal reviews is that law and policy in the relatively new field of cyberwar are not fully articulated.  Another challenge is that where applicable law and policy do exist, they may be inconsistent with the use of offensive cyber tools.

In response to a question (pdf) on cyberwarfare from the Senate Armed Services Committee at his confirmation hearing last year, Lt. Gen. Keith Alexander of U.S. Cyber Command said: “President Obama’s cybersecurity sixty-day study highlighted the mismatch between our technical capabilities to conduct operations and the governing laws and policies, and our civilian leadership is working hard to resolve the mismatch.” (page 9)

But he added: “Given current operations, there are sufficient law, policy, and authorities to govern DOD cyberspace operations. If confirmed, I will operate within applicable laws, policies, and authorities. I will also identify any gaps in doctrine, policy and law that may prevent national objectives from being fully realized or executed to the Commander, U.S. Strategic Command and the Secretary of Defense.”

Asked whether DoD possesses “significant capabilities to conduct military operations in cyberspace,” Gen. Alexander would only provide an answer on a classified basis.

The Pentagon does not often acknowledge the existence of offensive cyber capabilities.  The “Department of Defense Strategy for Operating in Cyberspace” (pdf) that was released in unclassified form last month does not address offensive cyber warfare at all.