A Fair Artificial Intelligence Research & Regulation (FAIRR) Bureau
Summary
Artificial intelligence (AI) is transforming our everyday reality, and it has the potential to save lives or to cost them. Innovation is advancing at a breakneck pace, with technology developers engaging in de facto policy-setting through their decisions about the use of data and the biases embedded in their algorithms. Policymakers must keep up; otherwise, we cede decision-making authority to technology companies and face the rising threat of becoming a technocracy. Given the potential benefits and threats of AI to US national security, the economy, health, and beyond, a comprehensive and independent agency is needed to lead research, anticipate the challenges posed by AI, and make policy recommendations in response. The Biden-Harris Administration should create the Fair Artificial Intelligence Research & Regulation (FAIRR) Bureau, which will bring together experts in technology, human behavior, and public policy from all sectors – public, private, nonprofit, and academic – to research and develop policies that enable the United States to leverage AI as a positive force for national security, economic growth, and equity. The FAIRR Bureau will adopt the interdisciplinary, evidence-based approach to AI regulation and policy needed to address this unprecedented challenge.