Declassification Advances, But Will Miss Goal

The latest report from the National Declassification Center features notable improvements in interagency collaboration in declassifying records, along with increased efficiency and steadily growing productivity.  Even so, the declassification program will almost certainly miss its presidentially mandated goal of eliminating the backlog of 25-year-old records awaiting declassification by December 2013.

The new NDC report puts on a brave face and presents an upbeat account of its achievements to date.

“As of June 30, 2012, we have assessed 90% of the backlog.  Quality assurance evaluation and processing for declassification prior to final segregation and indexing have been completed on 55% of that 90%,” the report says.  Of the records that have been fully processed, 82% have been approved for public release.

Yet the awkward fact remains that only around 50 million pages of the original 370 million page backlog have been fully processed in the past two and a half years.  The prospect that declassification of the remaining 320 million pages will somehow be completed in the next 18 months as ordered by President Obama in 2009 is quickly receding.
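
The arithmetic behind that conclusion is simple. Here is a back-of-the-envelope sketch in Python, using only the round figures cited above (a 370 million page backlog, roughly 50 million pages fully processed in two and a half years, and 18 months remaining):

```python
# Back-of-the-envelope comparison of the declassification rate achieved
# so far with the rate the December 2013 deadline would require.
# All figures are the approximate ones quoted above.

backlog_pages = 370e6      # original backlog, in pages
processed_pages = 50e6     # fully processed to date
elapsed_years = 2.5        # time taken to process them
remaining_years = 1.5      # 18 months until the deadline

achieved_rate = processed_pages / elapsed_years
required_rate = (backlog_pages - processed_pages) / remaining_years

print(f"Rate achieved:   {achieved_rate / 1e6:.0f} million pages/year")
print(f"Rate required:   {required_rate / 1e6:.0f} million pages/year")
print(f"Implied speedup: {required_rate / achieved_rate:.0f}x")
# Rate achieved:   20 million pages/year
# Rate required:   213 million pages/year
# Implied speedup: 11x
```

On these figures, meeting the deadline would require processing records more than ten times faster than the program has managed to date.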

It is shocking — or it ought to be — that the classification system is not fully responsive to presidential authority.  Beyond that, the impending failure to reach the assigned goal is an indication that current declassification procedures are inadequate to the task at hand.

While the NDC has already achieved some difficult changes in declassification policy, something more is evidently needed.

Potential changes that could be adopted include self-canceling classification markings that require no active declassification;  depopulation of the obsolete Formerly Restricted Data category for certain types of nuclear weapons information, which complicates declassification without any added security benefit;  and the surrender of agency “equity” or ownership in government records after a period of time so as to enable third-party (or automatic) declassification of the records.

These and other changes in declassification policy could be placed on the action agenda by the forthcoming report to the President from the Public Interest Declassification Board.

New Pentagon Statement on Leak Policy

Following a closed House Armed Services Committee hearing on leaks yesterday, the Department of Defense issued a statement outlining its multi-pronged effort to deter, detect and punish unauthorized disclosures of classified information.

“The Department of Defense has taken a comprehensive approach to address the issue of national security leaks,” the statement said.  “Personnel in all components are continuously working to protect classified information and identify those who do not uphold their obligations to protect national defense information.”

Several of the steps announced have previously been described and implemented, such as new guidance on protection of classified information and physical restrictions on use of portable media to download classified data.  Other measures involve new tracking and reporting mechanisms, and the ongoing implementation of an “insider threat” detection program.

Although many of these changes originated in response to WikiLeaks-type disclosures of DoD information two years ago, their repackaging now might serve to defuse congressional anger over more recent high-profile leaks, and to preempt more extreme legislative responses.

The new DoD statement does not admit any valid role for unauthorized disclosures under any circumstances.

To the contrary, the Secretary of Defense affirmed that the Assistant Secretary for Public Affairs is the “sole release authority for all DoD information to news media in Washington.”

In other words, DoD Public Affairs is the only legitimate source for defense news and information.  It follows that freedom of the press means the unfettered ability of reporters to write about what the DoD Public Affairs Officer says.

New Army Doctrine Seeks to Minimize Civilian Casualties

Both as a matter of humanitarian principle and as sound military strategy, U.S. military forces should strive to minimize civilian casualties in military operations, according to new U.S. Army doctrine published on Wednesday.

“In their efforts to defeat enemies, Army units and their partners must ensure that they are not creating even more adversaries in the process,” the new publication states.

“Focused attention on CIVCAS [civilian casualty] mitigation is an important investment to maintain legitimacy and ensure eventual success.  Failure to prevent CIVCASs will undermine national policy objectives as well as the mission of Army units, while assisting adversaries.”

So, for example, “When Army units are establishing and maintaining wide area security, it may be more important to minimize CIVCAS than to defeat a particular enemy.”

However, “While CIVCAS mitigation efforts can greatly reduce CIVCASs, it is unreasonable to expect that CIVCASs can be completely eliminated in all instances.  When CIVCASs occur, the most important part of the response is to determine the facts of the incident, including the numbers and severity of CIVCASs.”

“Recognizing that they are in a constant information battle with their adversaries regarding CIVCASs and other issues, Army units should maintain a consistent pattern of truthfulness and timeliness.”

“Army investigations [of civilian casualty incidents] should strive for integrity, credibility, and inclusion of external perspectives…. Immediate and broad denial of reports without complete and accurate information in hand can undermine credibility, especially if the investigation finds reports [of civilian casualties] were correct.”

See “Civilian Casualty Mitigation,” ATTP 3-37.31, July 2012.

Social Motivations in a Cyber World

“The major threat to security, our way of life, to prosperity is not kinetic warfare or terrorism, but is in fact espionage and crime and cheating,” according to Ben Hammersley, technologist and Wired editor at large. Hammersley spoke on “Adding Fuel to the Wi-Fire: What is the Nexus between Social Media, Emerging Technologies and Digital Radicalization” at the Brookings Institution on Tuesday, July 17.

Hammersley explained that he is mainly a “Futurist … get[ting] paid to live six months in the future and tell stories about it.” He often referred to Moore’s Law, the observation that every 12 to 18 months computing hardware delivers roughly twice the computing power at a given cost, or, equivalently, the same computing power at half the cost. As the current British ambassador to East London Tech City, essentially the British Silicon Valley, he knows what he’s talking about.
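
As a minimal illustration (Python; the 18-month doubling period below is one common statement of Moore’s Law, not a figure from the talk), the rule compounds quickly:

```python
# Moore's Law as a doubling rule: capability per dollar doubles every
# 12-18 months, so the cost of a fixed amount of computing halves
# over the same period.

def capability_multiple(years: float, doubling_months: float = 18) -> float:
    """How many times more computing a fixed budget buys after `years`."""
    return 2 ** (years * 12 / doubling_months)

for years in (1.5, 3, 6, 10):
    print(f"{years:>4} years -> {capability_multiple(years):6.1f}x computing per dollar")
# 1.5 years ->  2.0x ... 10 years -> ~101.6x
```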

Punishing Leaks of Classified Information

The first new legislative initiative to combat leaks of classified information is a bill called the Deterring Public Disclosure of Covert Actions Act of 2012, which was introduced July 10 by Sen. Richard Burr (R-NC).

“This act will ensure that those who disclose or talk about covert actions by the United States will no longer be eligible for Federal Government security clearance. It is novel. It is very simple. If you talk about covert actions you will have your clearance revoked and you will never get another one,” Sen. Burr said.

As justification for the measure, he cited “a series of articles published in the media that have described and in some cases provided extensive details about highly classified unilateral and joint intelligence operations, including covert actions.”

But this assumes certain facts that are not in evidence.  As Walter Pincus wrote in the Washington Post today, there are numerous official and unofficial sources of information about the Stuxnet covert action story, for example, including private sector companies and foreign sources that do not hold security clearances.  From that point of view, the Burr bill does not seem well-suited to achieve its intended purpose.

But the most peculiar thing about the new legislation is that it appears to validate the spurious notion of an “authorized leak.”

Thus, the text of the bill would revoke the security clearance of persons who publicly disclose or discuss classified details of covert actions — unless they have “prior authorization from an original classification authority.”

This seems to mean that classified information about covert actions need not be specifically declassified in order to be publicly released, but only that its disclosure must be “authorized.”

The question of imposing criminal penalties for disclosure of classified information to the press was discussed lately by Morton H. Halperin, who has been involved as a consultant or an expert witness for the defense in many or most of the leak prosecutions from the Ellsberg case in the 1970s to the present.

“Starting from the premise that more information must be made public and that the government has the right to keep some information secret in the name of national security, we need a public and congressional dialogue about what set of measures would be most effective in meeting these two equally important objectives. Reducing government secrecy must be a key component of any such measures,” he wrote in “Leaks and the Public Right to Know,” Huffington Post, July 16.  See also a longer paper by Halperin on “Criminal Penalties for Disclosing Classified Information to the Press in the United States.”

Midnight Rulemaking, and More from CRS

New and updated reports from the Congressional Research Service that Congress has declined to make publicly available online include these.

Midnight Rulemaking, July 18, 2012

An Analysis of the Distribution of Wealth Across Households, 1989-2010, July 17, 2012

Oil Sands and the Keystone XL Pipeline: Background and Selected Environmental Issues, July 16, 2012

Defense Surplus Equipment Disposal: Background Information, July 18, 2012

Nigeria: Current Issues and U.S. Policy, July 18, 2012

The United Arab Emirates (UAE): Issues for U.S. Policy, July 17, 2012

Timor-Leste: Political Dynamics, Development, and International Involvement, July 3, 2012

The History of the Soviet Biological Weapons Program

In 1972, the United States, the Soviet Union and other nations signed the Biological and Toxin Weapons Convention that was supposed to ban biological weapons.  At that very time, however, the Soviet Union was embarking on a massive expansion of its offensive biological weapons program, which began in the 1920s and continued under the Russian Federation at least into the 1990s.

The astonishing story of the Soviet biological weapons enterprise is told in an encyclopedic new work entitled “The Soviet Biological Weapons Program: A History” by Milton Leitenberg and Raymond A. Zilinskas (Harvard University Press, 2012).

The Soviet biological weapons (BW) program was by far the largest and most sophisticated such program ever undertaken by any nation.  It was also intensely secretive, and was masked by layers of classification, deception and misdirection.

“The program’s most important facilities remain inaccessible to outsiders to this day,” Leitenberg and Zilinskas write, “and it has been made a crime for anyone in present-day Russia to divulge information about the former offensive BW program.”  Needless to say, official archives are closed and Russian government officials are uncommunicative on the subject, or deny the existence of the program altogether.

Over a period of a decade or so, Leitenberg and Zilinskas were able to interview about a dozen former Soviet scientists who were involved in the Soviet BW program, along with dozens of other sources.  Their revelations inform the authors’ analysis and serve to advance public knowledge of the subject far beyond previous reports.  Even relatively well-known incidents like the 1979 Sverdlovsk anthrax epidemic are cast in a new light.  Many other aspects of the program will be entirely unfamiliar to most readers.

Much of the book is devoted to a description of the vast infrastructure of Soviet BW research and production, including descriptions of the various institutes, their history, their workforce and the nature of their research, as far as it could be discerned.  Along the way, many fascinating and sometimes horrific topics are addressed.  For example:

  •     In an effort to enhance the weapons-related properties of BW agents, Soviet scientists spent years working to create a viral “chimera,” which is an organism that contains genetic material from two or more other organisms.
  •     Other scientists worked to eliminate the “epitopes” on the surface of existing BW agents in order to make them unrecognizable to regular diagnostic techniques.  By using such a modified agent, “the Soviets would have made it considerably more difficult for the attacked population to identify the causative pathogen of the resulting disease outbreak and begin timely treatment.”
  •     A project codenamed Hunter (Okhotnik) sought to develop hybrids of bacteria and viruses such that use of an antibiotic to kill the bacteria would trigger release of the virus.  “Unlike other national BW programs, which without exception used only classical or traditional applied microbiology techniques to weaponize agents, the post-1972 Soviet program had a futuristic aspect. By employing genetic manipulation and other molecular biology techniques, its scientists were able to breach barriers separating species….”
  •     The Soviet BW program appears to have taken advantage of the declassification in the 1970s of a large number of documents from the United States BW program.  Thus, the design of the Soviet Gshch-304 BW bomblet was found to closely resemble that of the declassified US E-130R2 bomblet.  In 2001, the US Government moved to reclassify many documents on the US BW program, but “nothing could be done about recalling reports that had been distributed relatively freely for more than 35 years.”
  •     The quality of US intelligence about the Soviet BW program left much to be desired.  “Intelligence about Soviet BW-related activities is relatively thin for the pre-1972 period; meager and often of dubious value during 1970-1979; and a little less meager and of better quality during 1980-1990.” After 1990, little has been declassified.  “There is an unknown number of still-classified reports concerning the Soviet BW program produced by the CIA and perhaps by other agencies that we do not have,” the authors write.  The state of declassification is such that “we have been able to collect far more information” about the history of Soviet BW activities from interviews with former Soviet scientists and others than from declassified official records.
  •     In what the authors term “a horrendous mistake by the United States,” the US government undertook a covert deception and disinformation program aimed at the Soviet Union in the late 1960s which implied falsely that the US had a clandestine biological weapons program.  This unfortunate campaign may have reinforced an existing Soviet belief that the US had never terminated its own offensive BW program, a belief that lent impetus, if not legitimacy, to the Soviet BW program.
  •     Today, the situation with respect to BW in the former Soviet Union is “ambiguous and unsatisfactory,” Leitenberg and Zilinskas write. “There remains the possibility that Russia maintains portions of an offensive BW program in violation of the BWC.” Alternatively, “since we do not actually know what is and has been taking place within the three [Ministry of Defense BW] facilities since 1992, perhaps the situation is better than might be feared.”

In 23 chapters, the authors painstakingly examine many facets of the history, structure and operation of the Soviet BW program.  They scrupulously cite prior scholarship on the subject, while sorting out verifiable fact, plausible inference, dubious speculation, and error or fabrication.  (Thus, “No SS-18 ICBM bomblet delivery system was ever completed, none was ever tested, and obviously none could ever have been employed.”)

But even after 900 pages of often dense text, “there are large gaps in our understanding of the Soviet BW program” and “readers are cautioned that much remains to be discovered.”

“We have not been able to resolve definitively some of the most important questions,” they observe.  Unanswered questions involve basic issues such as the motivation and purpose of the program.  Why did the Soviet Union pursue the development and acquisition of biological weapons?  Who was to be targeted by Soviet biological weapons – the US?  China?  Europe? – and under what conceivable circumstances?  And what happens now?

Following a brief period during the Yeltsin years during which Russian officials acknowledged this activity, “Russia’s current official position is that no offensive BW program had existed in the Soviet Union.”

*    *    *

“The Soviet Biological Weapons Program: A History” was reviewed by author David E. Hoffman in Foreign Policy last month.

In 2010 the US Government signed an agreement with the former Soviet Republic of Armenia to cooperate in the control or destruction of dangerous pathogens, and in other efforts to prevent proliferation of biological weapons.  The agreement, one of several such documents, was published earlier this year.

FY2013 Defense Authorization and Appropriations, and More from CRS

New and updated reports from the Congressional Research Service that have not been made readily available to the public include the following.

Defense: FY2013 Authorization and Appropriations, July 13, 2012

The Unified Command Plan and Combatant Commands: Background and Issues for Congress, July 17, 2012

LIBOR: Frequently Asked Questions, July 16, 2012

The 2001 and 2003 Bush Tax Cuts and Deficit Reduction, July 16, 2012

Guatemala: Political, Security, and Socio-Economic Conditions and U.S. Relations, June 26, 2012

Publishing Scientific Papers with Potential Security Risks

The recent controversy over publication of scientific papers concerning the transmissibility of bird flu virus was reviewed in a new report by the Congressional Research Service. The report cautiously elucidates the relevant policy implications and considers the responses available to Congress.

“Because of the complexity of dual-use issues, analysis of a topic according to one set of policy priorities may lead to unforeseen complications due to its intersection with other policy priorities,” the report says. “For example, maximizing security may lead to detriments in public health and scientific advancement, while maximizing scientific advancement may lead to security risks.”

See Publishing Scientific Papers with Potential Security Risks: Issues for Congress, July 12, 2012.

Up for Debate: Laser Isotope Separation

Prof. Mark Raizen and Dr. Francis Slakey debate the benefits and risks of dual-use scientific research. Is the promise of tapping into the rare isotopes of the elements worth risking the threat of nuclear proliferation?

Debate: The Risks and Benefits of Laser Isotope Separation (LIS)

In April 2012, Nature published “Controversial Research: Good Science Bad Science,” which focused on types of scientific research that could seriously damage global security and pose ethical dilemmas. The article called for an open and frank debate about dual-use scientific research. One of the examples in the article was the use of lasers to separate radioactive isotopes.

Laser isotope separation (LIS) could be used to efficiently produce fuel for nuclear power reactors and to produce radioactive isotopes for medical use. But there is a downside. LIS could also be used to produce the fissile material, particularly highly enriched uranium, needed to build nuclear weapons. A number of lawmakers and nuclear-proliferation experts have voiced concern that a commercially successful laser-enrichment program in the United States could reinvigorate research on the technology on a global scale.

Dr. Mark Raizen and Dr. Francis Slakey debate the benefits and risks of this technology below.

Dr. Mark Raizen, University of Texas at Austin

Our planet contains vast natural resources, still largely untapped. These resources hold the promise of detecting and treating cancer, saving energy, making new materials, and advancing basic science. What are these valuable resources? Where can they be found? How can we make them available?

The answer to the first question is that the resources are rare isotopes of the elements. The answer to the second question is easy: these isotopes are literally in our midst, within the elements that make up our planet.  The third question is the crux of the matter; isolating rare isotopes of elements has been extremely difficult because they have nearly the same physical and chemical properties as other, more common, isotopes of the same element. This is the reason that many rare isotopes are the most expensive commodity on earth, with a price that can be over one thousand times that of gold! This prohibitive cost severely limits the exploration of new applications and therapies.

Here are just two examples of rare isotopes that could be widely used if only they were less expensive.  Nickel-64 is a stable isotope with a natural abundance of only 1 percent.  It can be converted in a medical accelerator to Copper-64, a short-lived radio-isotope with great promise for PET scans and cancer therapy.  Calcium-48 is a stable isotope with a natural abundance of 0.2 percent.  It is used as a diagnostic for osteoporosis in women and bone development in children, and for a basic physics experiment that may determine the mass of the neutrino.
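
The cost problem follows directly from those abundances. As a rough illustration (a Python sketch of my own; it assumes a perfectly efficient separator and ignores the small difference between atom and mass fractions), even an ideal process must handle hundreds of grams of natural feedstock per gram of product:

```python
# Lower-bound feedstock requirements implied by natural abundance alone,
# assuming a perfectly efficient separator (real processes need more).

abundances = {
    "Ni-64": 0.01,    # ~1% of natural nickel
    "Ca-48": 0.002,   # ~0.2% of natural calcium
}

target_grams = 1.0
for isotope, fraction in abundances.items():
    feedstock = target_grams / fraction
    print(f"{isotope}: at least {feedstock:,.0f} g of natural element per gram of isotope")
# Ni-64: at least 100 g; Ca-48: at least 500 g
```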

The only method for separating such isotopes dates back more than eighty years. This method, known as the Calutron, relies on electron ionization of atoms followed by separation according to charge-to-mass ratio. Although Calutrons were first used in the 1930s for separating uranium, they were replaced for that purpose by the gas centrifuge, which is limited mostly to that element.  Calutrons remained in service as general-purpose, though inefficient, isotope separators.  Today, these machines operate only in Russia, with an obsolete technology that faces imminent shut-down. Without an alternative approach, most rare isotopes will not be available in the future at any price. The looming shortage of crucial isotopes is a national priority, as indicated by a 2009 report of the Nuclear Science Advisory Committee to the Department of Energy, “Isotopes for the Nation’s Future.”

I recommend this report to anyone with an interest in the scope and uses of stable and radio-isotopes.  One topic discussed in this report is laser isotope separation. Although isotopes of an element are almost identical in every other respect, the wavelengths of their atomic transitions are slightly shifted from one another.

This “isotope shift” makes it possible to excite only one isotope with a narrow-band laser, leaving the others unaffected.  The common wisdom until now has been that one must use lasers to selectively ionize the desired atoms. However, it turns out that in order to have a large probability of ionization, very high laser power at multiple colors is required. The scale is so large that it required a government effort with one dedicated goal: laser isotope separation of uranium.  That effort was ultimately terminated in 1999, mainly due to the high cost and complexity of the lasers, and to the best of my knowledge it is not being pursued.  Laser separation of a molecular compound of uranium is still being pursued commercially by GE-Hitachi.  I have followed this work from a distance, and have always felt there must be a solution that would be simple and cost-effective for the many smaller-scale isotopes that are needed.  It came from an unexpected direction.
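
To see why a small shift is enough, consider how sharply off-resonant excitation falls off for a Lorentzian line. The Python sketch below is illustrative only; the linewidth and isotope-shift values are assumptions of mine, not numbers from this essay:

```python
# Off-resonant suppression for a Lorentzian transition of full width
# Gamma: the excitation rate at detuning delta falls as
# 1 / (1 + (2*delta/Gamma)**2). Here delta is the isotope shift.

def relative_excitation(delta_mhz: float, gamma_mhz: float) -> float:
    """Excitation rate at detuning delta, relative to the on-resonance rate."""
    return 1.0 / (1.0 + (2.0 * delta_mhz / gamma_mhz) ** 2)

gamma_mhz = 32.0    # assumed natural linewidth (MHz)
shift_mhz = 1000.0  # assumed GHz-scale isotope shift (MHz)

suppression = relative_excitation(shift_mhz, gamma_mhz)
print(f"Off-isotope excitation: {suppression:.1e} of the resonant rate")
# ~2.6e-4: the unwanted isotopes scatter thousands of times fewer
# photons, which is what makes single-isotope addressing possible.
```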

Over the past few years, my research has focused on developing general methods for controlling the motion of atoms in the gas phase.  The successful realization of these methods uses single photons to control the magnetic state of each atom, followed by magnetic manipulation.  It has brought to reality a thought experiment by James Clerk Maxwell from 1870 known as Maxwell’s Demon.  This work is reviewed in an article that I wrote for Scientific American, “Demons, Entropy and the Quest for Absolute Zero,” published in the March 2011 issue.  I realized that these very same methods can also be used for efficient isotope separation with low-power solid-state lasers, a paradigm shift away from ionization.  We are pursuing this avenue with a proof-of-principle experiment, soon to be completed.  This will then be applied commercially towards production of important medical isotopes, where the need is most urgent.  In fact, this could save your life!
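
For intuition about how single-photon selection can enrich a rare isotope, here is a toy Monte Carlo sketch. It is my own construction, not Prof. Raizen’s actual apparatus, and the abundance and flip probabilities are assumed for illustration: each atom that scatters the pump photon is flipped into a magnetically trappable state, and a magnetic gate keeps only flipped atoms.

```python
# Toy Monte Carlo of single-photon magnetic sorting. Assumed numbers:
# the pump flips the target isotope with probability p_on and other
# isotopes (off-resonant) with probability p_off; a magnetic gate then
# collects only flipped atoms.

import random

random.seed(0)
N = 1_000_000
natural_abundance = 0.002   # e.g. a Ca-48-like isotope at ~0.2%
p_on, p_off = 0.95, 1e-4    # assumed flip probabilities

kept_target = kept_other = 0
for _ in range(N):
    is_target = random.random() < natural_abundance
    flipped = random.random() < (p_on if is_target else p_off)
    if flipped:                      # the magnetic gate keeps flipped atoms
        kept_target += is_target
        kept_other += not is_target

purity = kept_target / (kept_target + kept_other)
print(f"Collected purity: {purity:.1%} (feed was {natural_abundance:.1%})")
# Roughly 95% purity from a 0.2% feed in a single pass, in this toy model.
```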

Dr. Francis Slakey, American Physical Society

Over the last 15 years I’ve criss-crossed the globe and witnessed its full range of stories. And when you see dust kick up from the bare feet of a tribeswoman walking 5 miles to get water, you realize that we face enormous global challenges, including climate change, pandemics and access to clean water, to name just a few. Regardless of our individual views on any of those issues, I’m sure that we can all agree on one thing: let’s not add more challenges to the list. We have enough to deal with.

So, when the research that we carry out has the possibility of creating significant risks, then we should pause, reflect, and make sure that we don’t add yet another burden to an already challenged world.

Biologists did just that – pause and reflect – in exemplary fashion a few months ago when they confronted the H5N1 issue.  Concerned about potential security risks associated with publishing particular work on airborne transmission of avian flu, the relevant community of biologists put a self-imposed pause on research to consider the implications and challenges.  It was thoughtfully done, with only modest reluctance from some scientists, and with benefit to all.

We are now at a moment when it would be fruitful for the relevant members of the physics and engineering communities to carry out a similar examination of the risks and benefits of some areas of isotope separation research.

So far, we’ve gotten lucky in uncovering when countries are developing nuclear weapons programs. However, new isotope separation technologies are emerging that are smaller, more efficient and harder, if not impossible, to detect. The technologies are in various phases of development, from basic research to commercialization. Consider this:

Global Laser Enrichment, a joint venture of General Electric-Hitachi, is constructing and evaluating a laser-based method of uranium enrichment (SILEX) that is substantially more efficient and, if stolen or otherwise acquired by a rogue group, could leave little prospect for detection.

Professor Raizen has developed a single-photon isotope separation technique that uses a magnetic trap and low-power laser excitation to produce much-needed medical isotopes more efficiently. His technique isn’t intended to enrich uranium, although the potential may well be there.

These developments raise the same issue: the on-going push for greater efficiency in isotope separation carries associated proliferation risks.

These risks of more efficient isotope separation are well known to the U.S. government. For example, the SILEX technology under development in North Carolina was the subject of a multi-agency proliferation-assessment report. The report conceded that “Laser-based enrichment processes have always been of concern from the perspective of nuclear proliferation… a laser enrichment facility might be easier to build without detection and could be a more efficient producer of high enriched uranium for a nuclear weapons program.”

The report ominously stated that it seemed likely that the technology would “renew interest in laser enrichment by nations with benign intent as well as by proliferants with an interest in finding an easier route to acquiring fissile material for nuclear weapons.”

So the risks of enrichment technology are well documented, and the consequences of the proliferation of the technology are clear and present, most immediately in Iran.

Of course, the easiest path for our research community would be to claim that these risks are someone else’s responsibility – we are scientists after all, not police. Yet, the biologists didn’t take that easy path. They broadened their sense of responsibility outside of the lab. They paused, considered, deliberated. And there is a practical reason for doing this. If scientists don’t consider the risks, we leave it to others to decide. And we may not like what they conclude.

What would we conclude from pausing and carrying out our own “stress test”? I can’t predict the outcome. In the case of the biologists, they strengthened their system with a centerpiece, the National Science Advisory Board for Biosecurity, which monitors “dual-use research of concern” and has received enthusiastic endorsements from scientists. The biologists came out of the process stronger. So can we.


Dr. Mark Raizen:

We live in exciting times, as we learn to control the physical world on the atomic and molecular scale. Efficient isotope separation is one example of a wider range of capabilities that include the assembly of new materials on the nanoscale, one atom at a time. These are powerful developments that can bring many benefits to mankind, but can also be intimidating to some. In particular, the topic of efficient isotope separation can evoke a fear of nuclear proliferation, but is that really true?

In fact, our methods will actually be used to reduce the risk of proliferation. How can that be? Consider Technetium-99m (Tc-99m).

This short-lived radio-isotope is used for medical imaging and is a major tool in nuclear medicine. Today, all Tc-99m is produced using weapon-grade uranium as a target in a nuclear reactor. The need to use such weapon-grade uranium poses a serious risk of proliferation, and the U.S. has led a worldwide effort to halt this mode of production by 2016. An alternative is to enrich a stable isotope, Molybdenum-100, which can be converted to Tc-99m by a clean nuclear process. You can read more about this topic in an excellent article by Dr. Tom Ruth. Our method of laser isotope separation can be used to produce enriched Molybdenum-100, and will therefore be an important tool in stopping nuclear proliferation.
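
The supply-chain urgency follows from Tc-99m’s short half-life. A quick decay calculation (Python; the roughly six-hour half-life is a standard reference value, not a figure from this essay) shows how little activity survives shipping delays:

```python
# Fraction of Tc-99m activity remaining after a given delay,
# using the standard exponential-decay law and a ~6 hour half-life.

import math

HALF_LIFE_HOURS = 6.0  # approximate Tc-99m half-life

def fraction_remaining(hours: float) -> float:
    return math.exp(-math.log(2) * hours / HALF_LIFE_HOURS)

for delay in (6, 12, 24, 48):
    print(f"After {delay:>2} h: {fraction_remaining(delay):6.1%} of activity remains")
# 6 h -> 50.0%, 24 h -> 6.2%, 48 h -> 0.4%
```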

Could our method be used for enrichment of uranium? That is a valid concern, and we should certainly pause and reflect, as suggested by Dr. Slakey. My best guess is that the application of our method to uranium is unlikely to be competitive with existing methods.

The basis for our approach is laser activation of the magnetic state of an atom, requiring a relatively simple atomic structure. Uranium has a very complex structure which may not be amenable to this new process. It is perhaps tempting to say that a method for enriching one isotope could also be applied to another. However, each element is unique in its physical and chemical properties. For example, the starting point for most atomic laser separation projects is to heat the solid material and vaporize it, forming an atomic beam. According to unclassified documents on the laser uranium separation project, it took years to find materials that do not react chemically with hot uranium metal. In contrast, many elements, such as calcium or ytterbium, are routinely used in atomic beams in research laboratories and do not have those problems. Similarly, the atomic structure and required lasers are unique to every atom.  A good analogy is to say that not all fruit are the same. In fact, uranium and calcium are as different from each other as apples and oranges.

In conclusion, with so many evident benefits, we should not fear the future. We should look instead to the past and be inspired by the words of the great Marie Curie, who said: “I am one of those who think like Nobel, that humanity will draw more good than evil from new discoveries.”

About the Debaters:

Mark Raizen earned his Ph.D. from the University of Texas at Austin in 1989 under the joint supervision of Prof. Steven Weinberg and Prof. H. J. Kimble. He has held the Sid W. Richardson Chair in Physics since 2000. He is a Fellow of the American Physical Society and the Optical Society of America. He was a National Science Foundation Young Investigator, an Office of Naval Research Young Investigator, and a Sloan Foundation Fellow. Prof. Raizen and his group pioneered the development of general methods for cooling and trapping of atoms, applicable to most atoms in the periodic table. This work opens new directions for basic science, and also has world-changing applications: efficient isotope separation.

Francis Slakey received his PhD in physics from the University of Illinois, Urbana-Champaign in 1992. He is the Associate Director of Public Affairs for the American Physical Society, where he oversees APS legislative activities, specializing in energy and security policy. He is also The Upjohn Lecturer on Physics and Public Policy and the Co-Director of the Program on Science in the Public Interest at Georgetown University. He is a Fellow of the APS, a Fellow of the AAAS, a MacArthur Scholar, and a Lemelson Research Associate of the Smithsonian Institution. He is also the first person in history to both summit the highest mountain on every continent and surf every ocean, a global journey that is the subject of his best-selling adventure memoir “To The Last Breath.”


About Up for Debate:

In Up For Debate, FAS invites knowledgeable outside contributors to discuss science policy and security issues. The debate among experts is conducted via email and posted on the FAS website. FAS invites a demographically and ideologically diverse group to comment – a unique FAS feature that allows readers to reach conclusions based on both sides of an argument rather than just one point of view.


Please read the guidelines for the official debate and rebuttal policy for participants of FAS’s ‘Up For Debate.’ All participants are required to follow these rules. Each opinion must stay on topic and feature relevant content, or be a rebuttal. No ad hominem or personal attacks, name-calling, libel, or defamation are allowed, and proper citations must be given.