“Too Mild a Nuclear Option”? National Security in the 1970s

U.S. nuclear weapons strategy evolved during the Nixon administration from a reflexive policy of massive retaliation against a Soviet attack to a diverse range of options for more limited nuclear strikes. The transition was not without some bumps.

A declassified 1974 memo recorded that National Security Adviser Henry Kissinger at first needed some persuading about the efficacy of limited strikes.

Kissinger “expressed concern that many of the options appeared to him as too timid. He judged that nuclear use must have a decisive military effect in order to achieve the desired political goal– convince enemy to stop.”

“Too mild a nuclear option is likely to convince the enemy to persevere, or respond tit for tat, or both,” Kissinger said, as paraphrased in the 1974 Pentagon memo.

The formerly Top Secret memo (document 36) is one of many that appeared in a richly informative new 1,000-page volume of the State Department’s Foreign Relations of the United States (FRUS) series on National Security Policy, 1973-1976, released this week.

Kissinger was soon convinced of the need for greater flexibility, and presented the argument himself to President Nixon.

“The concept that we could ‘win’ a war through virtually unlimited nuclear exchanges has become increasingly irrational as the Soviets acquired the capability to destroy the United States– even if the U.S. were to strike first,” he wrote in a memorandum to the President (document 30). “This has resulted in concern that such a strategy is no longer credible and that it detracts from our overall deterrent.”

The proposed new nuclear policy would therefore provide “for the development of a broad range of limited options aimed at terminating war on terms acceptable to the U.S. at the lowest level of conflict feasible.” Still, it would preserve “the major SIOP-type options in the event that escalation cannot be controlled.”

Kissinger asked President Nixon to approve the proposed steps and “authorize me to sign” the new nuclear weapons policy. Nixon did approve, but he wrote that “RN will sign.”

The FRUS volume is full of impressive, candid and chatty source documents on the diverse national security issues of the time, including anti-satellite weapons, the notorious “Team B” competitive analysis project that challenged CIA assessments of Soviet military strength, the Glomar Explorer effort to raise a sunken Soviet submarine, and the growing threat of Soviet surveillance and interception of U.S. communications.

The fear that Soviets were monitoring U.S. telephone communications inspired a concerted effort to improve communications security against espionage and the invasion of privacy.

“The President… recognizes that U.S. citizens and institutions should have a reasonable expectation of privacy from foreign or domestic intercept when using the public telephone system,” according to National Security Decision Memorandum 338 of September 1, 1976 (document 180).

The Foreign Relations of the United States series has been an important driver of the declassification process, identifying high-value historical records for declassification review. While it sometimes represents the state of the art in declassification, at other times it lags behind, probably due to the painfully slow pace of the review and production process. (The latest volume was under declassification review from 2007 to 2014.)

In some peculiar cases, FRUS both leads and lags in declassification. So, for example, the new FRUS volume includes a copy of the 1976 National Security Decision Memorandum 333 on “Enhanced Survivability of Critical U.S. Military and Intelligence Space Systems” (document 91). The newly published document includes two declassified paragraphs that had been withheld from public release as recently as 2008. Incongruously, however, the new FRUS version of NSDM 333 also withholds two lines concerning threats against U.S. satellites that it mistakenly says were “not declassified.” In fact, those lines were declassified years ago in the NSDM 333 that is available from the Ford Presidential Library. The two contrasting and complementary versions of NSDM 333 can be viewed here and here.

Newly Declassified Intelligence Satellite Imagery is Hard to Access

The declassification of historical intelligence satellite imagery has been a boon to scientists, environmentalists and other researchers since it began with President Clinton’s Executive Order 12951. So, for example, “The declassification of imagery from CORONA and subsequent intelligence satellite programs has inspired a revolution in landscape archaeology in the Near East,” wrote archaeologist Jason Ur.

But last year’s declassification of imagery from the KH-9 HEXAGON intelligence satellite will be slower to generate any such revolutionary impact because the newly declassified images are so hard to access and to use.

The KH-9 imagery was successfully transferred from the National Geospatial-Intelligence Agency to the National Archives. But in order to protect the perishable film it must be maintained in cold storage, and so it was all sent to a National Archives facility in Lenexa, Kansas. Researchers must make their best guess as to what images they are seeking, and then order the originals to be transferred from cold storage. It’s a slow and cumbersome process.

The larger policy issue is that the archival burden on the National Archives and Records Administration is growing faster than the available resources. The task of curating the nation’s documentary heritage appears to be escalating out of control. Meanwhile, the Archives is literally running out of space. Last month, Archivist of the United States David S. Ferriero announced the closure of three NARA facilities “as part of ongoing budget adjustments.”

*    *    *

Recently, one concerned researcher shared his frustrations about the current procedures for obtaining declassified satellite imagery. Secrecy News forwarded his comments to the National Archives and Records Administration, and a NARA official provided an annotated response, reproduced below.

Researcher: Since the [KH-9 HEXAGON] film is original negative, it was all shipped to Lenexa, Kansas.

NARA: Correct.  There is a potential that some of the film was not acetate and as such didn’t require cold storage but we did not have the resources to review each of the 14,685 cans to determine the base format and we erred on the side of caution in determining where to store it.

Researcher: NGA DID make available to NARA under the MOU [Memorandum of Understanding] the imagery, and finding aids, which are image mosaic overlays on 1:100,000 maps.  These are completely useless.

NARA: There was no MOU for this particular transfer.  Previous transfers had MOUs because there were multiple sets of records which were being distributed between NARA, NGA, and USGS. I think that there is some confusion between the past transfers and this one.  For this transfer we were provided with frame metadata.  The overlays referenced here do not index KH-9 film, they only index the airborne imagery previously transferred from NGA.

Researcher: There is also a CD-ROM which can be loaded onto a flash drive containing an ASCII file with mission date, pass frame, lat-long footprints, in an Excel format. But there is no way to know if the images are fully cloud-covered or not until the film arrives.

NARA: The CD provided for access as described in the KH-9 reference guide is what was provided to us by NGA.  We know we can make it better but it will likely never provide information on cloud cover by image.  All of our film, except for that indexed by the overlays, requires looking at it to determine quality and potential cloud cover.

Researcher: One must submit that data to an archivist who then converts the info into Original Negative Can numbers.  The researcher then must submit a second request including the ON number and the cold storage numbers to an Archivist, who quality controls it and submits the request to NARA Lenexa.

NARA: As with any other transfer of imagery, there is a process involved in going from whatever index exists to identifying the cans of imagery.  In the case of KH-9, once researchers identify imagery from the frame metadata, we have a can locator which converts the information for missions, dates, etc. to an actual can of imagery.  This can locator is available for copying by researchers, and is available through the consultant in the research room who can provide the necessary information.  It is also available on a hard drive for researchers to use themselves.

There is a need to fill out a pull slip for documentation of use and a Lenexa request form but that is done at the same time and does not require much effort other than writing a can number and barcode.

Researcher: The cans show up a few days later, and an Archivist must then quality control the cans for “supply chain management.”

I have spent a week at College Park just to find this out, and I have yet to actually order a can and see imagery.

NARA: The process for requesting cans from Lenexa is the same for any record stored there. We submit the requests on a daily basis, the Lenexa staff pulls the items and ships them out the next day. They are potentially available two days after the initial request. We do have to take time to document where the cans are every step of the way in order to ensure the security of the holdings but that does not slow the process down significantly.

The biggest issues are those simply related to having records stored offsite: the timing of requests, the ability of the staff pulling the items to find the correct ones, and the weather, which affects shipments during both winter and tornado season. There are sometimes preservation issues identified early, before the records are used, but that is very rare and they are generally addressed quickly so the researcher does not have to wait.

Researcher: By the next Friday, the researcher can only have the film checked out for 3 business days, Friday, Saturday, and Monday, then the film must be flown back to cold storage.

NARA: All of the research rooms have a 3 business day hold for records.  This is simply to ensure that records are looked at in a timely manner and are available for other researchers.  There is always the opportunity to extend the period of retention but the researcher needs to communicate a need for that.

The NARA official added a rough estimate of the cost of creating a duplicate set of KH-9 imagery to facilitate user access:

“At 14,685 cans, and an estimate of $800 worth of film stock per can, the cost is likely more than 11 million dollars.  In addition, we estimate it would take a dedicated employee some 8 years to perform the work (roughly 5 cans/day).”

“Digitization of course avoids the cost of the film stock, but has its own costs and challenges,” the official said. “We have to try and figure out where we focus our limited resources.”
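The official’s arithmetic can be checked directly; a quick back-of-envelope sketch (assuming a 365-day year, which is what the “8 years at roughly 5 cans/day” figure implies):

```python
# Back-of-envelope check of NARA's KH-9 duplication estimate.
cans = 14_685
film_cost_per_can = 800   # dollars of film stock per can (NARA's estimate)
cans_per_day = 5          # one dedicated employee's throughput

film_cost = cans * film_cost_per_can
years = (cans / cans_per_day) / 365   # assumes work every calendar day

print(f"Film stock: ${film_cost:,}")    # Film stock: $11,748,000
print(f"Duration: {years:.1f} years")   # Duration: 8.0 years
```

At 5 cans per working day (roughly 250 days a year) the job would stretch to nearly 12 years, so the 8-year figure evidently assumes continuous, year-round work.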

Putting Declassified Records to Good Use

The final, climactic step in the declassification of government records is not the formal removal of classification markings or even the transfer of the declassified documents to public archives. The culmination of the declassification process is when the records are finally examined by an interested reader and their contents are absorbed into the body of public knowledge.

The records themselves are mute. It is the reader who interprets them, assigns them their significance, and thereby adds value to them.

Declassification of government records can be a tedious bureaucratic process.  But at its most successful, it can also be an electrifying, revelatory source of fundamental new insights.

So, for example, “The declassification of imagery from CORONA and subsequent intelligence satellite programs has inspired a revolution in landscape archaeology in the Near East,” wrote Harvard archaeologist Jason Ur in a book chapter last year.

Support for archaeological research was never intended or imagined by those who built or operated Cold War intelligence satellites. Yet “CORONA has emerged as an irreplaceable source for reconstructing ancient landscapes.”

Declassified CORONA satellite imagery “allows virtual survey of regions where ground observation would be difficult or impossible” and it has already yielded a near doubling of the number of archaeological sites of interest, Dr. Ur wrote.

See Spying on the Past: Declassified Intelligence Satellite Photographs and Near Eastern Landscapes, Near Eastern Archaeology, volume 76, no. 1 (2013).

In another promising new initiative using declassified government records, historians, statisticians, computer scientists and others at Columbia University have joined forces to try to develop new ways to derive insights from such records.

Their project, known as the Declassification Engine, works to apply statistical tools and machine learning to cast new light on declassified record collections. With such tools, the project believes it will be able to characterize declassified records in meaningful new ways.

Near-term objectives include the attribution of authorship to anonymous documents, identifying patterns of secrecy in previously redacted text, and correlating the production of (de)classified diplomatic cables with international events in order to help uncover significant events that may have gone unrecognized. Another seemingly mundane but vital goal that is coming within reach is to enable the cost-effective digitization of documents that are in non-standard formats or that are not entirely legible.
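On a toy scale, the authorship-attribution idea can be illustrated with nothing more than word-frequency profiles and cosine similarity. The sketch below is purely illustrative (the author samples and memo text are invented, and the project’s actual statistical methods are far more sophisticated):

```python
from collections import Counter
import math

def profile(text):
    # Crude stylometric fingerprint: relative word frequencies,
    # which tend to reflect individual writing habits.
    words = text.lower().split()
    counts = Counter(words)
    return {w: c / len(words) for w, c in counts.items()}

def cosine(p, q):
    dot = sum(p[w] * q.get(w, 0.0) for w in p)
    norm = math.sqrt(sum(v * v for v in p.values())) * \
           math.sqrt(sum(v * v for v in q.values()))
    return dot / norm if norm else 0.0

def attribute(anonymous, candidates):
    # Pick the candidate author whose fingerprint is closest to the memo's.
    anon = profile(anonymous)
    return max(candidates, key=lambda a: cosine(anon, profile(candidates[a])))

# Hypothetical writing samples, invented for illustration only.
candidates = {
    "Author A": "it is the judgment of this office that the proposal must be "
                "declined and that it is not in the interest of the department "
                "to proceed",
    "Author B": "we should move fast on this one because there is real value "
                "here and we can always revisit the details later on",
}
memo = "it is the judgment of the undersigned that the department must not proceed"
print(attribute(memo, candidates))  # Author A
```

Real stylometry would work with function-word rates, character n-grams and proper statistical models, but the nearest-profile logic is the same.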

“The long-range goal is to create a cloud-based virtual archive,” according to the project website. “It would aggregate the digitized documents now scattered across dozens of different repositories, offer a place for scholars and journalists to upload their own archival finds, and provide a range of visualization and attribution tools to advance research on the history, and future, of world politics.”

See also The Ghost Files by David J. Craig, Columbia Magazine, Winter 2013-14.

For now, however, these kinds of innovative approaches to the exploitation of classified documents stand out as novelties. They are still exceptions to the conventional rule.

Even when declassification is successfully accomplished, many — probably most — declassified records go unexamined by researchers and other members of the public.

This is partly a resource issue, said William J. Bosanko, the chief operating officer of the National Archives and Records Administration. NARA’s holdings have quadrupled in the last few decades, while its staff support has remained close to level. As a result, archivists have been unable to produce detailed indexing of many incoming records so as to make them easily “discoverable.”

At the same time, there seem to be fewer and fewer individual researchers who are inclined to delve deeply into archived collections of hardcopy records. It appears that many of them — many of us — have become habituated instead to the instant gratification of online access. (There are, however, backlogs of FOIA and mandatory declassification review requests.)

The upshot is that “there are lots of [record] series never used by the public,” said Mr. Bosanko. He noted that this is true of both declassified records and of records that were never classified.

This neglect is not a reflection on the contents of those records, which are endlessly rich. “There is a huge, vast treasure trove of fascinating stories waiting to be revealed” at the National Archives, Mr. Bosanko said. But they continue to wait.

Another persistent problem is the erratic, often illogical character of the declassification process.

The Department of Defense recently sought to redact the well-known fact that there were U.S. missiles deployed in Turkey during the Cuban Missile Crisis in 1962. This and other “inane and contradictory declassification actions” were highlighted recently by the non-profit National Security Archive.

“It is a waste of resources and a sign of a seriously defective declassification system when reviewers redact 50-year-old documents when nothing about them is sensitive,” wrote William Burr of the National Security Archive.  See Dubious Secrets of the Cuban Missile Crisis, February 21, 2014.

As with classification, so too with declassification: new oversight procedures are needed to prevent egregious errors and to promote more discriminating judgment.

December 2013 Declassification Deadline Passes– And?

In a December 2009 memorandum, President Obama directed the newly established National Declassification Center (NDC) to process an estimated 400 million page backlog of historical records at the National Archives “in a manner that will permit public access to all declassified records from this backlog no later than December 31, 2013.”

That December 31 deadline has now come and gone. So what happened? There are conflicting views about that.

The task was successfully completed, NDC Director Sheryl J. Shenberger told her staff in a December 31 email message. “I am delighted to announce your successful retirement of the 352M page backlog [according to the final page count, down from 400 million]…. Your work will be directly responsible for providing access to historical records of all types and topics to the American people,” she wrote.

But one dissenting observer said that any assertion that the backlog has now been declassified is “an elaborate subterfuge to re-brand failure as success. It is a study in mendacity.” The observer noted that a huge chunk of the backlog has not been declassified, and that the majority of it is not accessible to the public in any case.

Beneath these contrasting assessments are disagreements about what the President actually ordered in 2009, what was supposed to be accomplished by the end of 2013, what was accomplished in fact, and what it all means.

The confusion probably stems in part from the wording of the 2009 Presidential memorandum, the relevant section of which was captioned “Declassification of Records of Permanent Historical Value.”

Despite that promising title, the memo did not order declassification of any records at all.

Rather, NARA officials said, the presidential memorandum was intended to respond to a very specific problem that hardly anyone outside government was aware of. The problem was that hundreds of millions of pages of records that had already gone through declassification review were not reaching the public shelves, because agencies kept insisting on additional referrals and re-reviews to search for information that might have been missed on previous occasions.

The purpose of the Obama memo, they said, was to stop this endless merry-go-round of declassification review. The memo specified that any further referrals of records for agency review would be justified only in the case of records that could reveal a confidential human source or key design concepts for weapons of mass destruction. And even in those two cases, the follow-on reviews were to be completed by a date certain– December 31, 2013.

This task appears to have been accomplished by the deadline.

But other things were not done:

Not all of the historical backlog was declassified as a result of the four-year process (especially since no new declassification reviews were performed). As of last June, the NDC was reporting a public release rate of 61% of documents that had completed all processing.

Moreover, not all of the documents in the backlog that were declassified were made accessible to the public. An estimated 128 million pages of records reached the public shelves at the National Archives, though with abbreviated indexing of documents compared to past practice. This was done to expedite the reduction of the backlog, but at a cost to the clarity, coherence and utility of the collections. More than 100 million other pages still require indexing and other archival processing.

This may be disappointing, but it should not be surprising. The notion that all or most of the backlog would be declassified and made available to the public was never a realistic possibility, said William J. Bosanko, Chief Operating Officer at the National Archives.

“That expectation was not aligned with the resources that were brought to bear,” he said.  Declassification does not equate to public release, he noted, and it could take years just to screen the declassified records for personal privacy information and other data that might be exempt from public release. “If someone doesn’t want to see this as a victory, they can find lots of reasons to criticize it.”

But he said that the implementation of the President’s memo was in fact a remarkable achievement. It constituted a breakthrough in declassification policy because it required agencies to adopt an increased tolerance for risk of inadvertent disclosure of classified information.

Since repeated referrals of records for additional review were now blocked by the Obama policy (except in the two permitted cases), “agencies had no choice but to accept some risk” that information they preferred to withhold would be released, he said.

Even in those cases of backlog records suspected of containing WMD or human source information, reviewers only examined a tiny percentage of the relevant collections, not the entire set, and in all likelihood they missed some records that would have been exempt from disclosure.

“This is the first time that the Administration has accepted some finite risk” of inadvertent disclosure, Mr. Bosanko said. It could therefore represent a long-deferred turning point in declassification policy.

To the extent that agencies are willing (or are compelled) to renounce their claims to review records in which they have an interest (or an “equity”), the declassification process can be streamlined and made somewhat more rational.

Last year, the National Security Staff formally waived its equity in various National Security Council and White House records through the first Clinton Administration, according to a June 2013 notice from the Information Security Oversight Office. Refreshingly, the NSC instructed executive branch declassification reviewers not to refer most such records (with some exceptions) back to the White House for additional review.

The National Declassification Center and the CIA hosted a symposium today on Berlin in the Cold War, accompanied by the release of 11,000 pages of newly declassified records which are available online.

A History of History: The Story of the FRUS Series

The Foreign Relations of the United States (FRUS) series is the official documentary record of U.S. foreign policy published by the U.S. Department of State. The origins, development and continuing evolution of the FRUS series are explored in a massive new history prepared by the State Department Office of the Historian. See “Toward ‘Thorough, Accurate and Reliable’: A History of the Foreign Relations of the United States Series” by William B. McAllister, Joshua Botts, Peter Cozzens, and Aaron W. Marrs, Department of State, December 19, 2013.

Dating back to the Civil War — the Abraham Lincoln Administration — FRUS long predates the existing national security classification and declassification regimes.  But from the start it has manifested and reinforced the impulse towards open government to a remarkable if imperfect degree. It appears to surpass any comparable effort to systematically and publicly document foreign policy by any other government in the world.

But more than a mere expression of open government, the FRUS series has been a battleground on which fundamental issues of secrecy and disclosure have been fought. Generations of officials, historians, journalists and others have disputed the timeliness of FRUS publications and their completeness, and weighed the demands of national security against the imperatives of historical integrity, with outcomes that shifted and diverged through the series.

“One might imagine individual FRUS volumes as akin to tree rings: each iteration records the environmental conditions from which it emerged; a broader story unfolds by examining change over time,” wrote historians William B. McAllister and Joshua Botts.

The advances, compromises and setbacks that characterized the evolution of the FRUS series are recounted in impressive and illuminating detail in the new historical study.

One of the themes that emerges is that the series progressed “dialectically,” in a continuing clash between conflicting interests in secrecy and disclosure.

So, for example, one of the main factors in the post-World War II development of FRUS was the unauthorized disclosure of a classified compilation known as the Yalta Papers, a study of FDR’s wartime diplomacy.  The leak of the Yalta Papers by a FRUS historian in 1954 (which in some respects prefigured the Vietnam-era leak of the Pentagon Papers) catalyzed changes in the production, timeliness and oversight of the FRUS series (see Chapter 7).

Meanwhile, excesses of secrecy generated their own corrective reactions. The suppression of information about US covert action in a FRUS volume on Iran, for example, helped instigate a statutory requirement that the FRUS series must be “thorough, accurate and reliable,” thereby strengthening the hand of openness advocates inside and outside the Department (Chapter 11).

The new history of FRUS is not a polemic or a piece of advocacy. It is a scrupulous account of the multiple and diverse perspectives that generated the FRUS series throughout its history. (And those who care about the series or participated in its development will find much of it gripping reading.)

But after hundreds of pages, the State Department authors allow the conclusion that in the conflict between secrecy and disclosure, it is secrecy that has been the greater problem for FRUS, for the Department and for the US Government:

“The most significant negative repercussions attributable to the FRUS series have not involved damaging releases of potentially-sensitive national security or intelligence information. Rather, the reputation of the U.S. Government has suffered primarily from failures of the series to document significant historical events or acknowledge past actions.”

“FRUS realizes its promise when it fulfills global expectations for openness that promote democracy and encourage human freedom.”

The new FRUS history will be the subject of a panel discussion at the upcoming Meeting of the American Historical Association on January 4 in Washington, DC.

Two-Decade Review Yields History of Covert Action in Congo

After a declassification review that lasted nearly twenty years, the history of CIA covert action in the Congo from 1960 to 1968 was finally published last week by the State Department, filling an awkward gap in the historical record.

“In August 1960, the U.S. Government launched a covert political program in the Congo lasting almost 7 years, initially aimed at eliminating [Prime Minister Patrice] Lumumba from power and replacing him with a more moderate, pro-Western leader,” an editorial note introducing the new publication stated. See Foreign Relations of the United States (FRUS), 1964-1968, Volume XXII, Congo, 1960-1968.

“The U.S. Government provided advice and financial subsidies…. These funds were to be channeled in such a way as to conceal the U.S. Government as a source.”

“At the same time, based on authorization from President Eisenhower’s statements at an NSC meeting on August 18, 1960, discussions began to develop highly sensitive, tightly-held plans to assassinate Lumumba. After Lumumba’s death at the hands of Congolese rivals in January 1961, the U.S. Government authorized the provision of paramilitary and air support to the new Congolese Government….”

“In addition, the covert program included organizing mass demonstrations, distributing anti-Communist pamphlets, and providing propaganda material for broadcasts,” the editorial introduction said.

The new publication supplements previously published official histories of U.S. policy during the Congo Crisis, which were harshly criticized by historians and others for withholding documentary evidence of U.S. covert action.

By excluding CIA covert action, the 1994 FRUS volume on the Congo Crisis “omitted vital information, suppressed details concerning US intervention, and generally provided a misleading account of the Congo crisis,” wrote David N. Gibbs, a political scientist at the University of Arizona, in a review entitled “Misrepresenting the Congo Crisis” (African Affairs: Journal of the Royal African Society, vol. 95, no. 380, pp. 453-459, 1996).

In another 1995 paper on Secrecy and International Relations, Prof. Gibbs said the persistent classification of the Congo covert action exemplified the use of secrecy to evade the democratic process.

“According to this approach, governments seek to conceal potentially controversial activities or ones that could generate public opposition,” he wrote. “In the Congo case secrecy successfully concealed government activities (such as the efforts to assassinate Lumumba) that were potentially very controversial.”

Historian Philip Zelikow told his colleagues on the State Department Historical Advisory Committee in 1999 that by refusing to admit the role of covert action, the earlier Congo volume “did enormous damage to the credibility of the Foreign Relations series and of the CIA.”

The current Historian of the State Department, Dr. Stephen P. Randolph, acknowledged that the earlier FRUS volumes “did not… contain documentation of the U.S. covert political action program. There were also no records in the two volumes concerning U.S. planning and preparation for the possible assassination of Patrice Lumumba.”

“This volume consists of a selection of the most significant of those previously unavailable documents,” Dr. Randolph wrote in the Preface to the new FRUS volume.

The first part of the new volume “contains numerous CIA cables to and from the Station in Leopoldville, which documents the chaotic nature of the Congo crisis and the pervasive influence of U.S. Government covert actions in the newly independent nation,” he wrote.

The second part “documents the continuation of the U.S. covert political action programs and their role in providing paramilitary and air support to the Congolese Government in an effort to quell provincial rebellions.”

Astonishingly, “The declassification review of this volume began in 1994 and was finally completed in 2013.”  Even so, it resulted in a number of redactions, some of which are not very credible.

The Central Intelligence Agency insisted on censoring cost figures for its covert action programs, even when they are half a century old. So, for example, document 170 in the new collection states that “To date covert support of Adoula’s government has cost a total of [dollar amount not declassified].”

A helpful editorial note (at p. 5), however, supplies some of the missing information: “The Special Group/303 Committee-approved aggregate budget for covert action in the Congo for the years 1960-1968 totaled approximately $11,702,000 (Political Action, $5,842,000; Air Program, $3,285,000; and Maritime Program, $2,575,000).”
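The component figures in the editorial note in fact sum exactly to the stated aggregate, a trivial check:

```python
# Components of the Special Group/303 Committee-approved covert action
# budget for the Congo, 1960-1968, as given in the FRUS editorial note (p. 5).
budget = {
    "Political Action": 5_842_000,
    "Air Program": 3_285_000,
    "Maritime Program": 2_575_000,
}
total = sum(budget.values())
print(f"${total:,}")  # $11,702,000 -- matches the note's aggregate figure
```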

The State Department Historical Advisory Committee, composed of non-governmental historians, advised and supervised the preparation of the final manuscript, and ultimately recommended its publication.

Even with the remaining redactions, the Committee said it “assesses the volume as a reliable guide to the trajectory of U.S. policy toward the Congo from 1960 until 1968 and an exceptionally valuable addition to the historical record.”

Declassification as a Confidence-Building Measure

In order to restore public trust, the U.S. intelligence community ought to be “aggressive” about reducing classification, former intelligence officials said last week.

Secrecy “is an enormous problem,” said Michael Leiter, who directed the National Counterterrorism Center from 2007 to 2011. “I hope the DNI is very aggressive about moving towards less classification and more effective security clearances.”

He didn’t specify what information he thought should cease to be classified, or exactly how a policy of less classification should be implemented. But he said that current secrecy policies have eroded public confidence.

“I think what Snowden has really illustrated better than anything else is [that] the trust that we need to have in a democratic society between those elements which should remain secret and its public is broken,” Mr. Leiter said. He spoke at a December 11 program on The Current State of Intelligence Reform held at the Bipartisan Policy Center.

“We struck a balance basically in the early 70s — with the FISA Court, Church-Pike — that has broken down,” he said. “And we no longer trust that the HPSCI, or the SSCI, or the FISA Court as it’s currently constructed or the Privacy and Civil Liberties Oversight Board or the President’s Intelligence Advisory Board — people don’t know those names, they don’t know those acronyms, and they don’t trust the US Intelligence Community because people don’t believe those organizations are conducting the oversight they should.”

“So I hope out of this Snowden affair we end up with a new modern form of oversight which provides the trust that we need to do the things that have to remain secret,” he said.

Michael Allen, former staff director of the House Intelligence Committee, defended congressional oversight and said that criticism of the quality of intelligence oversight masked a policy disagreement.

“I think Congress has been working better in these respects in the past few years,” he said.

“The reason I think you see some potshots about congressional oversight as related to the Snowden matter is because Congress knew about the [bulk collection] programs and people who don’t like the programs are mad at Congress for going along with them in the first place. So they assault congressional oversight writ large, when they’re really making a policy judgment that they don’t support the underlying programs, and they’re casting aspersions against anyone who might have known or been comfortable with them,” he said.

But Mr. Allen agreed that the intelligence community needed to provide the public with more insight into its activities and with more access to its products.

“I think for the intelligence community to be able to survive and [for] many of the collection programs to be able to be successful, the intelligence community has got to rethink its declassification policy,” he said.

The IC should “consider trying to put out more examples of where its collection programs have been successful in order to fight for the authorities to keep them.” Likewise, it “might consider putting out more of its analysis so that a broader swath of people can have an appreciation for what they [intelligence agencies] do,” Mr. Allen said.

While there has been significant declassification of records concerning NSA surveillance programs, a comparable reassessment of intelligence classification policy in other topical areas remains to be accomplished.

As for public disclosure of other intelligence analysis, that currently seems remote. The ODNI Open Source Center stubbornly refuses to release even unclassified, uncopyrighted products that it generates (though some do leak from time to time).

In fact, current trends point in the opposite direction, towards reduced disclosure.  Later this month the Open Source Center, which is managed by CIA, will terminate longstanding public access to translations of foreign news reports which have long been available (to paid subscribers) through the NTIS World News Connection.

Last week, the Obama Administration argued in court that the CIA should not be obliged to publicly release a 30-year-old draft history of the 1961 Bay of Pigs episode.

White House Sets New Goals for Open Government

In a new Open Government National Action Plan that was released today, the White House affirmed its support for open government values, and set an agenda for the remainder of the current Administration.

“The new plan includes a wide range of actions the Administration will take over the next two years, including commitments that build upon past successes as well as several new initiatives,” the Plan stated. “The Administration will work with the public and civil society organizations to implement each of these commitments over the next two years.”

With respect to national security secrecy, the Plan includes a new commitment to “transform the security classification system” based on the principle that “classification must… be kept to the minimum required to meet national security needs….”

Towards that end, a new interagency Classification Review Committee is being established with White House leadership to evaluate proposals for classification reform, and to coordinate their implementation throughout the executive branch.  The creation of such a body was the primary recommendation of the Public Interest Declassification Board last year, and it was strongly endorsed by public interest groups.

Both because of its interagency character and especially due to its White House leadership, the new Committee has the potential to overcome the autonomous classification practices of individual agencies that have contributed to the explosive growth in secrecy.

Positive results are naturally not guaranteed.  The Administration has not embraced an explicit theory of how overclassification occurs, or even how overclassification is to be defined, and therefore it is not yet well-equipped to address the problem.

The new Plan notes that in June of this year President Obama directed the Intelligence Community to declassify and make public “as much information as possible” about intelligence surveillance programs. But in an optimally functioning classification system, the President’s directive would have been redundant and unnecessary; the system would already be declassifying as much information as possible.

Of course, the existing classification system is not functioning optimally. That is the problem.  So either the President needs to issue individualized directives to all agencies on every conceivable classified topic to “declassify as much as possible,” or else the new White House interagency Committee needs to find alternate means to effectively communicate the same imperative.

“The Obama Administration remains fully committed to building a 21st-Century Open Government and fundamentally improving the relationship between citizens and government,” the new Plan said.

Not everyone has gotten that message, though.  The Central Intelligence Agency is determined to cut off public access to foreign news reports and translations gathered through its Open Source Center (formerly the Foreign Broadcast Information Service) and marketed to subscribers through the NTIS World News Connection. At the end of this month, this legendary resource will cease to be available to the public after more than half a century. (“CIA Halts Public Access to Open Source Service,” Secrecy News, October 8).

A CIA official suggested that anyone who is interested in foreign news can “use the internet” instead.

Prioritizing Topics for Declassification

The Public Interest Declassification Board, which advises the President on classification and declassification policy, is proposing to recommend that certain historically significant topics and events be prioritized for expedited declassification.

The Board has invited public input into the formulation of its recommendations for prioritization, which currently fall into five broad categories:  Topics 25 Years Old and Older, Topics 25 Years Old and Younger, Topics Related to Formerly Restricted Data (FRD) Information, General Topics of Interest, and Topics Specifically Gathered from Presidential Libraries.

The working list of potential declassification topics that are less than 25 years old includes many worthy subjects, including, for example, 9/11 Commission records and “Guantanamo / Detainee issues.”  On the other hand, it does not yet include many other high-priority items for declassification, such as the Senate Intelligence Committee’s massive report on CIA interrogation practices.

“We invite the public to comment on these topics and offer its own suggestions on what should be on this list of topics younger than 25 years,” the Board statement said. “We hope the List will serve as a guide to aid agencies in reviewing the information the public wants to see.  This is your opportunity to spark a much-needed conversation about the sustainability of the current declassification system and what our priorities collectively should be to make the most impact.”  Comments can be submitted through the Board’s blog, Transforming Classification.

But Is Prioritizing Declassification Topics the Right Approach?

There is a longstanding disagreement over whether it is appropriate to prioritize some areas for declassification because of their topicality, or whether it is better to gradually declassify everything in an orderly and systematic way.  (Or whether, as I have thought, the right answer is to do some of both.)

Some have argued that prioritization of special declassification projects is the wrong way to go.

“If effective, routine, comprehensive systematic declassification review were in place for all agencies, and if the public believed in the integrity and thoroughness of those review processes, then important documentation… would be routinely reviewed and declassified without an expensive special search,” said Rutgers historian Warren F. Kimball at a 2000 hearing of the Senate Committee on Governmental Affairs.

“Those boutique declassification efforts… devour resources that should go to systematic declassification review,” Prof. Kimball said then. “Some of those special searches have been legitimate. Some have been trivial. Many have been repetitive and unrewarding…. All have been exorbitantly expensive in both money and work hours. All were or should have been unnecessary.”

Not only are topic-based “special searches” more resource-intensive than regular, systematic declassification, but they may also subtly distort the historical record by removing individual documents from their context, and by favoring “popular” topics over others whose deeper significance may be unrecognized.

That may all be true, say proponents of prioritization.  But the reality is that many records that are supposedly “historically valuable” are of no interest to anyone, and will not be read even if they are declassified.  And besides, the current systematic declassification review program cannot keep up with the current and anticipated declassification workload.  So in practice, there is little choice but to prioritize.

Whichever argument seems more persuasive, the Public Interest Declassification Board, composed of presidential and congressional appointees, has now tipped the balance in favor of prioritization.  The support of the Board doesn’t guarantee that it will happen, but it makes the issue a newly live one.

If prioritization of particular declassification topics does go forward, then there are important questions to consider beyond the identification of the topics themselves. One question is, how can the declassification of the prioritized topics be made as productive as possible?  Another question is, what happens to all the documents that are not prioritized?

Revise the Standards for Prioritized Declassification

Specialized declassification projects have the greatest impact when they do more than simply move a particular topic to the front of the queue for declassification. The best of them, like the one performed by the JFK Assassination Records Review Board, also involve revised standards for declassifying the prioritized information in order to maximize disclosure.

Interestingly, it appears that agencies already tend to be more forthcoming in declassification projects that they initiate themselves than they are when applying legacy declassification standards in response to FOIA requests. This is true even (or especially) in the case of secrecy-intensive organizations like CIA or NSA. (The CIA and the National Declassification Center will sponsor a symposium in January on the history of the Berlin Wall featuring some newly declassified documents.)

In any event, the utility of the prioritization approach to declassification could be maximized if the adoption of a prioritized topic were accompanied by an appropriate revision of declassification criteria to ensure that the minimum amount of information relevant to the topic will be withheld. (Ideally, such a revision of project-related declassification standards would be performed or supervised by an independent third party, such as the Interagency Security Classification Appeals Panel.)

An updated review of classification and declassification criteria is clearly necessary in order to overcome residual, obsolete barriers to disclosure.

When DNI James Clapper released voluminous records concerning foreign intelligence surveillance programs last week, he noted that “President Obama directed me to declassify and make public as much information as possible about certain sensitive programs while being mindful of the need to protect sensitive classified intelligence activities and national security.”

The tacit implication was that without the President’s direction, the DNI would not “declassify and make public as much information as possible….”  Similar direction to “declassify as much as possible” ought to be applied in the case of each prioritized declassification project.

Set a Drop Dead Date for Classification to Expire

A necessary consequence of prioritization of some records for declassification is that other records will be pushed back in the queue. What this means is that, without remedial action, more and more records may never be declassified.

President Obama’s Executive Order 13526 declared for the first time that “No information may remain classified indefinitely” (section 1.5d).  But it is not clear how that dictum is to be translated into actual declassification policy.

Records that were exempted from “automatic declassification” at 25 years were supposed to be automatically declassified beginning at the end of this year, when they turn 50 years old.  Exceptions had been provided for records that revealed the identities of human intelligence sources or key design concepts for weapons of mass destruction.  In practice, however, it appears that much more than such narrow categories will now be withheld.  According to a January 23, 2013 notice from the Information Security Oversight Office, numerous agencies have been granted authority to exempt records from declassification even at the 50-year point.

Unfortunately, this continuing deferral of declassification compounds the problem and may take it beyond any practical resolution.

What is needed instead is a “drop dead date” beyond which classification controls will simply expire.  Records of a certain age would not need to be “reviewed” for declassification. In fact, they would not need to be formally “declassified” at all. Rather, their status as classified records would just terminate.

A drop dead date would be consistent with the President’s direction that classification cannot continue indefinitely.  And as backlogs of classified records continue to accumulate, this approach would finally cut through the endless and increasingly intractable cycle of declassification review.

If, as the PIDB recommends, some records are going to be prioritized for declassification, then new consideration should be given to a drop dead date for all of those classified records that remain “unprioritized” for decade after decade.

In its version of the pending FY 2014 intelligence authorization bill (section 307), the Senate Intelligence Committee proposed to extend the charter of the Public Interest Declassification Board from December 2014, when it would otherwise expire, to December 2018.