“Air Force intelligence components do not engage in experimentation involving human subjects for intelligence purposes,” a new Air Force Instruction (pdf) states categorically.
Except for the exceptions.
“Any exception would require approval by the Secretary or Under Secretary of the Air Force and would be undertaken only with the informed consent of the subject and in accordance with procedures established by AF/SG to safeguard the welfare of subjects.”
The new Instruction presents a generally scrupulous account of the regulatory framework within which Air Force intelligence operates. It addresses domestic search and surveillance, imagery collection and dissemination, mail covers, and other intelligence activities.
See Air Force Instruction 14-104, “Oversight of Intelligence Activities,” 16 April 2007.
Some other noteworthy new Air Force Instructions include these (both pdf):
AFI 10-2604, “Disease Containment Planning Guidance,” 6 April 2007.
AFI 40-201, “Managing Radioactive Materials in the U.S. Air Force,” 13 April 2007.