DoD Agencies Have Prohibited Waterboarding, IG Says
Defense agencies have complied with a recommendation to prohibit the use of military survival training techniques — such as waterboarding — in prisoner interrogation, the DoD Inspector General confirmed in a report (pdf) last year.
In response to a previous Inspector General report (pdf), a 2008 DoD directive (pdf) stated that “Use of SERE [survival, evasion, resistance, and escape] techniques against a person in the custody or effective control of the Department of Defense or detained in a DoD facility is prohibited.” Likewise, a 2009 memorandum for the military services and the Special Operations Command specified that use of “SERE techniques for interrogations of personnel in DoD custody or control is prohibited.”
The 2010 IG report found that “all US Air Force, US Army, US Navy, US Marine Corps, and [Joint Personnel Recovery Agency] SERE training programs included, as part of their curriculum, a prohibition against the use of SERE techniques for interrogation of personnel in DoD custody or control.” See “Field Verification-Interrogation and Survival, Evasion, Resistance, and Escape (SERE) Techniques Recommendation,” DoD Inspector General Report 10-INTEL-05, April 16, 2010 (released under FOIA in December 2010).
SERE training provides “a reasonable means to train [U.S. military personnel] for the most challenging captivity environment where captors do not abide by the Geneva Conventions,” the IG report said. But “the physical and psychological pressures developed for… SERE training were not intended for real-world interrogations. Intelligence resistance training does not qualify a SERE Specialist instructor to conduct interrogations or provide subject matter expertise to those who are trained in that specialty.”