Crowdsourcing the Verification of Nuclear Safeguards
A new report from Sandia National Laboratories explores the possibility of mobilizing members of the public to collect information — “crowdsourcing” — to enhance verification of international nuclear safeguards.
“Our analysis indicates that there are ways for the IAEA [International Atomic Energy Agency] to utilize data from crowdsourcing activities to support safeguards verification,” the authors conclude.
But there are a variety of hurdles to overcome.
“Some implementations of crowdsourcing for safeguards are legally or ethically uncertain, and must be carefully considered prior to adoption.”
Crowdsourced information, like other information, has to be independently verified, particularly since it is susceptible to error, manipulation and deception.
The report builds on previous work cited in a bibliography. See Power of the People: A Technical, Ethical and Experimental Examination of the Use of Crowdsourcing to Support International Nuclear Safeguards Verification by Zoe N. Gastelum et al., Sandia National Laboratories, October 2017.