Technology and NEPA: A Roadmap for Innovation
Improving American competitiveness, security, and prosperity depends on private and public stakeholders’ ability to responsibly site, build, and deploy proposed critical energy, infrastructure, and environmental restoration projects. Some of these projects must undergo some level of National Environmental Policy Act (NEPA) review, a process that requires federal agencies to consider the environmental impacts of their decisions.
Technology and data play an important role in NEPA processes and ultimately dictate how agencies, project developers, practitioners, and the public engage with them. Unfortunately, the status quo of permitting technology falls far short of what existing technology makes possible. Through a workstream focused on technology and NEPA, the Federation of American Scientists (FAS) and the Environmental Policy Innovation Center (EPIC) have described how technology is currently used in permitting processes, highlighted pockets of innovation, and made recommendations for improvement.
Key findings, described in more detail below, include:
- Systems and digital tools play an important role at every stage of the permitting process and ultimately dictate how federal employees, permit applicants, and constituents engage with NEPA processes and related requirements.
- Developing data standards and a data fabric should be a high priority to support agency innovation and collaboration.
- Case management systems and a cohesive NEPA database are essential for supporting policy decisions and ensuring that data generated through NEPA is reusable.
- Product management practices can and should be applied broadly across the permitting ecosystem to identify where technology investments can yield the highest gains in productivity.
- User research methods and investments can ensure that NEPA technologies are easier for agencies, applicants, and constituents to use.
Introduction
The Federation of American Scientists (FAS) is a nonprofit, nonpartisan organization that works to embed science, technology, innovation, and experience into government and public discourse. The Environmental Policy Innovation Center (EPIC) is a nonprofit, nonpartisan organization focused on building policies that deliver spectacular improvement in the speed of environmental progress.
FAS and EPIC have partnered to evaluate how agencies use technology in permitting processes required by NEPA. We’ve highlighted pockets of innovation, talked to stakeholders working to streamline NEPA processes, and made evidence-based recommendations for improved technology practices in government. This work has substantiated our hypothesis that technology has untapped potential to improve the efficiency and utility of NEPA processes and data.
Here, we share challenges that surfaced through our work and actionable solutions that stakeholders can take to achieve a more effective permitting process.
Background
NEPA was designed in the 1970s to address widespread industrial contamination and habitat loss. Today, it often creates obstacles to solving the very problems it was designed to address. This is in part because of an emphasis on adhering to an expanding list of requirements that adds to administrative burdens and encourages risk aversion.
Digital systems and tools play an important role at every stage of the permitting process and ultimately dictate how federal employees, permit applicants, and constituents engage with NEPA processes and related requirements. From project siting and design to permit application steps and post-permit activities, agencies use digital tools for an array of tasks throughout the permitting “life-cycle”—including for things like permit data collection and application development; analysis, surveys, and impact assessments; and public comment processes and post-permit monitoring.
Unfortunately, the current technology landscape of NEPA comprises fragmented and outdated data, sub-par tools, and insufficient accessibility. Agencies, project developers, practitioners and the public alike should have easy access to information about proposed projects, similar previous projects, public input, and up-to-date environmental and programmatic data to design better projects.
Our work has largely been focused on center-of-government agencies and actions agencies can take that have benefits across government.
Key actors include:
- The Permitting Council. Established in 2015 through the Fixing America’s Surface Transportation Act (known as FAST-41), the Permitting Council is charged with facilitating coordination of qualified infrastructure projects subject to NEPA as well as serving as a center of excellence for permitting across the federal government. Administrative functions and salaries are supported primarily by annual appropriations. Infrastructure Investment and Jobs Act (IIJA) funding enables “ongoing operation of, maintenance of, and improvements to the Federal permitting dashboard” while Inflation Reduction Act (IRA) funding supports the center of excellence and coordination functions.
- The Council on Environmental Quality (CEQ). CEQ is an office within the Executive Office of the President established in 1969 through the National Environmental Policy Act. Executive Order 11991, issued in 1977, gave CEQ the authority to issue regulations under NEPA. However, President Trump rescinded that EO in January 2025 and issued a new Executive Order on Unleashing American Energy. This new Executive Order directs the Chair of CEQ to provide “guidance on implementing the National Environmental Policy Act…and propose rescinding CEQ’s NEPA regulations found at 40 CFR 1500 et seq.” CEQ has received annual appropriations to support staff as well as supplemental funding. The IRA provided CEQ with $32.5 million to “support environmental and climate data collection efforts” and $30 million more to “support efficient and effective environmental reviews.”
Below, we outline key challenges identified through our work and propose actionable solutions to achieve a more efficient, effective, and transparent NEPA process.
Challenges and Solutions
Product management practices are not being applied broadly to the development of technology tools used in NEPA processes.
Applying product management practices and frameworks has potential to drastically improve the return on investment in permitting technology and process reform. Product managers help shepherd the concept for what a project is trying to achieve and get it to the finish line, while project managers ensure that activities are completed on time and on budget. In a recent blog post, Jennifer Pahlka (Senior Fellow at the Federation of American Scientists and the Niskanen Center) contrasts the project and product funding models in government. Product models, executed by a team with product management skills, facilitate iterative development of software and other tools that are responsive to the needs of users.
Throughout our work, the importance of product management as a tool for improving permitting technology has become abundantly clear; however, there is substantial work to be done to institutionalize product management practices in policy, technology, procurement, and programmatic settings.
Solutions:
- Create process maps for the permitting process – in detail – within and across agencies. Once processes are mapped, agencies can develop tailored technology solutions to alleviate identified administrative burdens by either removing, streamlining, or automating steps where possible and appropriate. As part of this process, agencies should evaluate existing software assets, use these insights to streamline approval processes, and expand access to the most critical applications. Agencies can work independently or in collaboration to inventory their software assets. Mapping should be a collaborative, iterative effort between project leads and practitioners. Mapping leads should consider whether the co-development of user journeys with practitioners who play different roles in the permitting process, such as applicants, environmental specialists (federal employees), and public commenters, would be a useful first step to help scope the effort.
- Hire product management and customer experience specialists in strategic roles. Agencies and center-of-government leaders should carefully consider where product management and customer experience expertise could support innovation. For example, the Permitting Council could hire a product management specialist or customer experience expert to consult with agencies on their technology development projects. Fellowship programs like the Presidential Innovation Fellows (PIF) or U.S. Digital Corps can be leveraged to provide agencies with expertise for specific projects.
- Strategically leverage existing product management guidance and resources. Agencies should use existing resources to support product management in government. The 18F unit, part of the General Services Administration (GSA)’s Technology Transformation Services (TTS), helps federal agencies build, share, and buy technology products. 18F offers a number of tools to support agencies with product management. GSA’s IT Modernization Centers of Excellence can support agency staff in using a product-focused approach. The Centers focused on Contact Center, Customer Experience, and Data and Analytics may be most relevant for agencies building permitting technology. In addition, the U.S. Digital Service (USDS) “collaborates with public servants throughout the government”; their staff can assist with product, strategy, and operations as well as procurement and user experience. 18F and USDS could work together to provide product management training for relevant staff at agencies with a NEPA focus. 18F or USDS could create product management guidance specifically for agencies working on permitting, expanding on the 18F Product Guide. These resources could also explore how agencies can make decisions about building or buying when developing permitting technology. Agencies can also look to the private sector and NGOs for compelling examples of product development.
- Learn from successes at other agencies. We have written about how agencies have successfully applied product management approaches inside and outside of the NEPA space.
Siloed, fragmented data and systems cost money and time for governments and industry
As one partner said, “NEPA is where environmental data goes to die.” Data is needed to inform both risk analysis and decisions; data can and should be reused for these purposes. However, data used and generated through the NEPA process is often siloed and can’t be meaningfully used across agencies or across similar projects. Consequently, applicants and federal employees spend time and money collecting environmental data that is not meaningfully reused in subsequent decisions.
Solutions:
- Develop a data fabric and taxonomy for NEPA-related data. CEQ’s Report to Congress on the Potential for Online and Digital Technologies to Address Delays in Reviews and Improve Public Accessibility and Transparency, delivered in July 2024, recommends standards that would give agencies and the public the ability to track a project from start to finish, know specifically what type of project is being proposed, and understand the complexity of that project. The federal government should pilot interagency programs to coordinate permitting data for existing and future needs. Chief Environmental Review and Permitting Officers (CERPOs) should invest in this process and engage their staff where applicable and appropriate.
- Establish a Digital Service for the Planet to work with agencies specifically on how environmental data is collected and shared across agencies. The Administration should create a Digital Service for the Planet (DSP) that is staffed with specialists who have prior experience working on environmental projects. The DSP should support cross-agency technology development and improve digital infrastructure to better foster collaboration and reduce duplication of federal environmental efforts to achieve a more integrated approach to technology—one that makes it easier for all stakeholders to meet environmental, health, justice, and other goals for the American people.
- Centralize access to NEPA documents and ensure that a user-friendly platform is available to facilitate public engagement. The federal government should ensure public access to a centralized repository of NEPA documents and a searchable, user-friendly platform to explore and analyze those documents. Efforts to develop a user-friendly platform should include dedicated digital infrastructure to continually update centralized datasets and an associated dashboard. Centralizing searchable historical NEPA documents and related agency actions would make it easier for interested parties to understand the environmental assessments, analyses, and decisions that shape projects. Congress can require and provide resources to support this, agencies can invest staff time in participation, and agency leaders can set an expectation for participation in the effort.
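To make the data fabric and taxonomy recommendation above more concrete, the sketch below shows one hypothetical shape a standardized NEPA project record could take. The field names, categories, and milestone structure are illustrative assumptions for discussion, not an existing federal schema or the standard recommended in CEQ's report.

```python
# Hypothetical sketch of a standardized NEPA project record.
# Field names and categories are illustrative assumptions, not an existing
# federal schema or the standard recommended in CEQ's report to Congress.
from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class ReviewLevel(Enum):
    CATEGORICAL_EXCLUSION = "CE"
    ENVIRONMENTAL_ASSESSMENT = "EA"
    ENVIRONMENTAL_IMPACT_STATEMENT = "EIS"


@dataclass
class Milestone:
    name: str            # e.g., "Notice of Intent published"
    completed_on: date


@dataclass
class NepaProjectRecord:
    project_id: str      # stable identifier reusable across agencies and systems
    title: str
    lead_agency: str
    project_type: str    # drawn from a shared taxonomy, e.g., "transmission line"
    review_level: ReviewLevel
    location: tuple[float, float]                    # (latitude, longitude) of project centroid
    milestones: list[Milestone] = field(default_factory=list)

    def elapsed_days(self) -> int | None:
        """Days between the earliest and latest recorded milestones."""
        if len(self.milestones) < 2:
            return None
        dates = sorted(m.completed_on for m in self.milestones)
        return (dates[-1] - dates[0]).days
```

Even a minimal shared structure like this would let agencies, researchers, and the public compare project types and review timelines across agencies without bespoke data cleaning for every study.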
Technology tools used in NEPA processes fall far short of their potential
The status quo of permitting technology falls far short of what is possible in light of existing technology. Permitting tools we identified in our inventory range widely in intended use cases and maturity levels. Opportunities exist to reduce feature fragmentation across these tools and improve the reliability of their content. Additionally, many software tools are built and used by a single agency, instead of being efficiently shared across agencies. Consequently, technology is not realizing its potential to improve environmental decision-making and mitigation through the NEPA process.
Solutions:
- Set more ambitious modernization goals. We have the technological capabilities to go above and beyond data fabric and taxonomy. CEQ and the Permitting Council can focus on helping agencies scale successful permitting technology projects and develop decision support tools. This could include supporting agency tools to bring e-permitting into the modern era, which shortens processing times and saves staff time. Agency tools to enhance could include USACE’s Regulatory Request System and tool for tracking wetland mitigation credits (RIBITS), USFWS’s tool for Endangered Species Act consultation (IPaC), the Permitting Council’s FAST-41 dashboard, and CEQ’s eNEPA tool. Policymakers and staff working to improve permitting technology should consider replicating the functionality of successful existing tools, and automating the determination of “application completeness”, which has frequently been cited as a source of delays (a minimal sketch of such a check appears after this list).
- Institutionalize human-centered design (HCD) principles and processes. Agencies should encourage and incentivize deployment of HCD processes. The Permitting Council, GSA, and agency leadership can play a key role in institutionalizing these principles through agency guidance and staff training. Applying human-centered design can ensure thoughtful, well-designed automation of tasks, freeing up staff members to focus their limited time and attention on matters that require their judgment and, crucially, increasing the number of NEPA decisions the federal government is able to reach in a designated period of time.
- Prioritize development of digital applications with easy-to-use forms. Application systems may look different from agency to agency depending on their specific needs, but all should prioritize easy-to-use forms, working collaboratively where applicable. Relevant HCD principles include entering data once, user-friendly templates or visual aids, and auto-populating information. Eventually, more advanced features could be incorporated into such forms—including features like AI-generated suggestions for application improvements, fast-tracking reviews for submissions that use templates, and highlighting deviations from templates for review by counsel.
- Create better pre-design tools to give applicants more information about where they can site projects. Improved pre-design tools can help applicants anticipate characteristics of a site that may come up in environmental reviews, such as the presence of endangered species. Examples include Vibrant Planet’s landscape resilience tool and the USFWS IPaC platform. These platforms can be developed by agencies or by private-sector and nonprofit organizations. Agencies should seek opportunities to invest in tools that meet multiple needs or provide shared services. The Permitting Council and/or CEQ could lead an interagency task force on modernizing permitting and establish a cross-agency workflow to prevent the siloing of these tools and support agencies in pursuing shared services approaches where applicable.
- Invest in decision-support tools to better equip federal employees. Many regulators lack the technical skillset, the confidence, or both to efficiently and effectively review permit applications to the extent needed. Decision-support tools are needed to lay out the options a reviewer should be aware of (e.g., existing categorical exclusions or nationwide permits that fit the project) so that informed, timely decisions do not depend solely on institutional knowledge. These types of decision-support tools can also help create more consistency across reviews. CEQ and/or the Permitting Council could establish a cross-agency workflow to prevent siloing of these tools and support agencies in pursuing shared services approaches where applicable.
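As a concrete illustration of the automated “application completeness” determination mentioned in the first solution above, here is a minimal, rule-based sketch. The required fields and validation thresholds are hypothetical placeholders; a production system would encode each agency's actual submission requirements and would likely combine rules with reviewer judgment.

```python
# Minimal, rule-based sketch of an automated "application completeness" check.
# The required fields and validation rules are hypothetical placeholders, not
# any agency's actual submission requirements.
REQUIRED_FIELDS = {
    "applicant_name": lambda v: bool(v and v.strip()),
    "project_description": lambda v: bool(v and len(v) >= 100),  # require a substantive narrative
    "project_boundary_geojson": lambda v: v is not None,         # geospatial footprint attached
    "proposed_start_date": lambda v: v is not None,
}


def check_completeness(application: dict) -> tuple[bool, list[str]]:
    """Return (is_complete, list of missing or invalid fields)."""
    problems = []
    for field_name, is_valid in REQUIRED_FIELDS.items():
        if not is_valid(application.get(field_name)):
            problems.append(field_name)
    return (not problems, problems)


# Example: flag an application that lacks a geospatial boundary.
complete, missing = check_completeness({
    "applicant_name": "Example Transmission LLC",
    "project_description": "A 30-mile 345 kV transmission line ..." + " detail" * 20,
    "proposed_start_date": "2026-04-01",
})
print(complete, missing)   # False ['project_boundary_geojson']
```

Even a simple check like this, run at submission time, could give applicants immediate feedback rather than a deficiency notice weeks later.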
Existing NEPA technology tools are difficult for agencies, applicants, and constituents to use
Agencies generally do not conduct sufficient user research in the development of permitting technology, often because they lack the resources to hire product management expertise or to train staff in product management approaches. Consequently, agencies may engage users only at the very end (if at all), or fail to think expansively about the range of users when developing technology for NEPA applications. Advocacy groups and permit applicants are not well considered as tools are being developed, and permitting forms and other tools end up insufficiently customized for their sectors and audiences.
Solutions:
- Incorporate user research into existing projects. Agencies can build user experience activities and funding into project plans and staffing for bespoke permitting tool development. There are resources available to agencies to incorporate user research if they don’t have the talent in-house (as many don’t). These include the 18F unit, GSA’s IT Modernization Centers of Excellence, USDS, and the Presidential Innovation Fellows program.
- Elevate case studies of agencies using user research to improve product delivery. As a center of excellence, the Permitting Council can highlight agencies that are using user research effectively. CEQ can also support sharing both challenges and opportunities across agencies. CERPOs can exchange ideas and elevate case studies to explore what is working.
- Launch a regulatory sandbox for permitting. A sandbox would allow testing of different forms and other small interventions. The sandbox would provide an environment for intentional A/B testing (e.g., testing a new permitting form with ten applicants). The sandbox could be managed by the Permitting Council or another agency, but responsibility for overseeing the sandbox should sit within a single agency. This office should be empowered to offer waivers or exemptions. Ideally, a customer experience specialist would lead the activities of the sandbox. Improving forms that project proponents or public commenters might encounter during the NEPA process is low-hanging fruit that could be a first focus area for the sandbox. Better forms would not only make processes simpler for applicants but also make it possible for agencies to receive and manage associated geospatial and environmental data with applications.
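To illustrate how sandbox results might be evaluated, here is a minimal sketch of an A/B comparison between two form designs. The counts are invented for illustration, and the two-proportion z-test shown is just one reasonable analysis choice; in practice a sandbox pilot might be far smaller (like the ten applicants mentioned above) and lean as heavily on qualitative feedback as on statistics.

```python
# Hypothetical A/B comparison of two permitting form designs in a sandbox.
# Counts are invented for illustration; any standard two-proportion test works.
from math import sqrt
from statistics import NormalDist


def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Return (z statistic, two-sided p-value) for H0: the two rates are equal."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value


# Example: 62 of 80 applicants completed the redesigned form without a
# deficiency notice, versus 45 of 80 applicants on the legacy form.
z, p = two_proportion_z_test(62, 80, 45, 80)
print(f"z = {z:.2f}, p = {p:.4f}")
```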
Poor understanding of the costs and benefits of NEPA processes
Costs and benefits of the federal permitting sector have to date been poorly quantified, which makes it difficult to decide where to invest in technology, process reform, talent, or a combination. Applying technology solutions in the wrong place or at the wrong time could make processes more complicated and expensive, not less. For instance, automating a process that simply should not exist would be a waste of resources. At the same time, eliminating processes that provide critical certainty and consistency for developers while delivering substantial environmental benefits would work against goals of achieving greater efficiency and effectiveness.
A more reliable, comprehensive accounting of NEPA costs and benefits will help us design solutions that cost less for taxpayers, better account for public input, and enable rapid yet responsible deployment of energy infrastructure and other critical projects.
Solutions:
- Equip agencies with case management systems that automatically collect data needed for process evaluation. Case management software systems support coordination across multiple stakeholders working on a shared task (e.g., an Environmental Impact Statement). Equipping agencies with these systems would enable automatic capture of data needed to conduct rigorous cost-benefit assessments, providing researchers with rich data to study the impacts of policy interventions on staff time and document quality. Automatic data collection would also drastically reduce the need for expensive and time-consuming retrospective data gathering and analysis efforts. (A minimal sketch of the kind of analysis such automatically captured data could enable appears after this list of solutions.)
- Rapidly execute on a permitting research agenda to support innovation. Establishing a robust case management system may take time. In the interim, agencies, philanthropy, nonprofits, and others can undertake research projects that inform nearer-term decisions about NEPA. Collaborations with user researchers, designers, and product managers will make this research agenda successful. Key gaps a research agenda could address include:
- Research Gap 1: Money and Time Federal Agencies Spend on NEPA Tasks
- How many staff whose primary responsibility is permitting-related work does each agency employ at the national, regional, and field levels? The study scope could start with the agencies on the Permitting Council, as they have relatively large roles in the permitting process.
- How do staffing levels correspond with the number and kind of permitting actions by region and field office? Sources for agency staffing data could include General Services Administration employment classifications and agency NEPA offices.
- What is each agency’s total budget allocated for NEPA review? Do budget codes accurately reflect permitting work?
- Research Gap 2: Private Sector Cost and Scale. The NEPA sector is larger than just the federal government. For example, private-sector consulting firms sometimes help project sponsors prepare their applications and navigate federal processes. A number of private sector entities support the permitting process through government contracts. Questions include:
- What is the total market size of the permitting private sector (dollar amount and employees)?
- What percent is spent on federally mandated permits? How does this break down by task? What are the most expensive labor components and why?
- Research Gap 3: Technology-related Costs
- Building on FAS and EPIC’s permitting inventory, what is the annual technology budget for each agency’s major permit tracking system? Answers to this question should include both internal and external staff costs.
- How many years has each system been in operation? How did the application receive initial funding (e.g., appropriation, general fund, permitting-specific budget)? This helps us know 1) which systems are likely most in need of an upgrade and 2) how likely it is that funding will be available in the future to modernize.
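Referring back to the case management recommendation above, the sketch below shows the kind of analysis that automatically captured event data could support. The event names and log structure are hypothetical and are not drawn from any agency system.

```python
# Minimal sketch of analysis that case-management event logs could enable.
# Event names and log structure are hypothetical, not any agency's actual system.
from collections import defaultdict
from datetime import date

# Each event: (case_id, milestone, timestamp) captured automatically as staff work.
events = [
    ("EA-0042", "application_received", date(2024, 1, 8)),
    ("EA-0042", "scoping_complete",     date(2024, 3, 1)),
    ("EA-0042", "draft_published",      date(2024, 7, 15)),
    ("EA-0042", "decision_signed",      date(2024, 11, 4)),
]


def phase_durations(case_events):
    """Days spent between consecutive milestones for each case."""
    by_case = defaultdict(list)
    for case_id, milestone, when in case_events:
        by_case[case_id].append((when, milestone))
    durations = {}
    for case_id, stamps in by_case.items():
        stamps.sort()
        durations[case_id] = [
            (prev[1] + " -> " + cur[1], (cur[0] - prev[0]).days)
            for prev, cur in zip(stamps, stamps[1:])
        ]
    return durations


print(phase_durations(events))
```

With events captured as a byproduct of normal casework, questions like how long scoping takes for Environmental Assessments in a given region become routine queries rather than expensive retrospective research projects.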
Conclusion
Policymakers, agencies, and permitting stakeholders should recognize the important role that systems and digital tools play in every stage of the permitting process and take steps to ensure that these technologies meet user needs. Developing data standards and a data fabric should be a high priority to support agency innovation and collaboration, while case management systems and a cohesive NEPA database are essential for supporting policy decisions and ensuring that data generated through NEPA is reusable. Leveraging technology in the right place at the right time can support permitting innovation that improves American competitiveness, security, and prosperity.
ARPA-I National Listening Tour Report
The United States is in the midst of a once-in-a-generation effort to rebuild its transportation and mobility systems. Through the Infrastructure Investment and Jobs Act (IIJA) of 2021, hundreds of billions of dollars of new investments are flowing into American highway, rail, transit, aviation, port, pipeline, and active transportation projects.
This transportation infrastructure will be tested by new and emerging threats ranging from increased risk of flooding and heatwaves to supply chain disruptions and cyberattacks. Citizens are also rightly demanding more from the transportation sector—enhanced safety, faster project delivery, lower costs, increased sustainability, efficiency, resiliency, and a more equitable system for all users.
Meeting this moment will require bold investments in new and emerging transportation technologies—new materials, construction techniques, operations systems, planning tools, advanced sensing, computation, and beyond. Authorized in the IIJA, the Advanced Research Projects Agency – Infrastructure (ARPA-I), a new agency within the U.S. Department of Transportation (DOT), is poised to catalyze the transportation innovation ecosystem and accelerate the development and commercialization of essential technologies that solve persistent infrastructure problems.
To inform its research agenda, ARPA-I embarked on a National Listening Tour from September 2023 through June 2024 to gather insights from a wide range of stakeholders from across the transportation ecosystem. With convenings held in the Pacific Northwest, Southeast, Midwest, and Mid-Atlantic, the tour provided several opportunities for ARPA-I to engage with 280 leading transportation experts. The goal was to ensure that ARPA-I heard both the priorities and capabilities of a broad range of transportation and infrastructure stakeholders from across the ecosystem. Of the 280 participants, 99 (35%) were from academia; 58 (21%) were from private corporations; 42 (15%) were from policy and nonprofit organizations; 38 (14%) were from federal, state, and local transportation agencies; 37 (13%) were from startups; and 6 (2%) were financial investors. The ideas shared by these participants will help shape ARPA-I’s future research agenda and will provide a framework for the Agency’s ambitions that will be achieved in part with the participation of and input from this broad ecosystem of stakeholders.
Purpose and Organization of this Report
This report summarizes the insights collected over the course of the ARPA-I National Listening Tour in 2023 and 2024 and the inaugural ARPA-I expert community convening in Washington, DC in December 2022. Insights were gathered from 280 participants in the form of written worksheets and transcribed notes from discussions during the Workshops. The insights summarized in sections 4-7 of this report are intended to inform potential areas for future innovative advanced research and development (R&D) programs to be funded and managed by ARPA–I.
This report is organized into the following sections:
- Laying the Groundwork for ARPA-I
- ARPA-I National Listening Tour Overview
- Promising Ideas for Future ARPA-I R&D Programs
- Challenges Facing U.S. Transportation Infrastructure
- Opportunities to Solve U.S. Transportation Infrastructure Challenges
- Predictions for U.S. Transportation Infrastructure in 10-20 Years
- Conclusion
- Acknowledgements
Laying the Groundwork for ARPA-I
In November 2021, Congress passed the Infrastructure Investment and Jobs Act (IIJA), allocating $550 billion in new funding for various programs within the U.S. Department of Transportation (DOT), including the establishment of the Advanced Research Projects Agency-Infrastructure (ARPA-I). Modeled after highly successful agencies like the Defense Advanced Research Projects Agency (DARPA) and the Advanced Research Projects Agency-Energy (ARPA-E), ARPA-I aims to address significant transportation infrastructure challenges through innovative technology solutions.
ARPA-I’s mission involves funding high-risk, high-reward next-generation innovative technologies with the potential to revolutionize U.S. transportation infrastructure. The agency aims to develop innovative solutions that reduce long-term infrastructure costs, minimize environmental impacts, enhance the safe and efficient movement of goods and people, and improve infrastructure resilience against physical and cyber threats.
To achieve its goals, ARPA-I will support projects that advance early-stage novel research with practical applications, translate conceptual technologies to prototype and testing stages, develop advanced manufacturing processes, and accelerate technological advancements in areas where industry may not invest due to technical and financial uncertainties.
ARPA-I continues a legacy of ARPAs that have delivered breakthrough innovations in a number of sectors. DARPA, established in 1958 in response to the Soviet Sputnik launch, has led to significant technological advances, including the internet, GPS, and mRNA vaccines. Inspired by DARPA’s success, the government created similar agencies for other critical sectors, including the Intelligence Advanced Research Projects Activity (IARPA), ARPA-E, and the Advanced Research Projects Agency-Health (ARPA-H). ARPA-I will adopt many of the core cultural traits and rigorous ideation processes honed by prior ARPAs to seek similar breakthroughs for the transportation infrastructure sector.
To lay the groundwork for ARPA-I’s future, the White House Office of Science and Technology Policy (OSTP) and DOT hosted an inaugural ARPA-I Summit in June 2023 during which a series of announcements were made on future ARPA-I activities. These announcements included:
- Supercharging Infrastructure R&D Made Possible by the Bipartisan Infrastructure Law Programs: ARPA-I announced plans to work with DOT program offices to develop an innovative research agenda that complements flagship investment areas in the Bipartisan Infrastructure Law, including the $5 billion Safe Streets and Roads for All program, the $8.7 billion PROTECT resilient infrastructure program, and the $7.5 billion National Electric Vehicle Infrastructure program. This research agenda will identify technical chokepoints in each domain that could be overcome through a focused R&D initiative.
- Partnering with Communities Across the Nation: ARPA-I is launching a national listening tour with leading researchers, entrepreneurs, companies, and transportation advocates to ensure that ARPA-I reflects the priorities and capabilities of transportation and infrastructure R&D stakeholders across the ecosystem. The listening tour will begin in the Pacific Northwest and will feature locations across the country.
- Request for Information (RFI): ARPA-I invites the public and experts across a variety of modes, sectors and disciplines to provide their ideas and input on high-potential areas for ARPA-I to explore.
- Advancing the Intersection Safety Challenge: ARPA-I is highlighting the USDOT Intersection Safety Challenge, a multi-phased challenge that began with a $6 million prize competition that leveraged machine vision, sensor fusion, and real-time decision making to create safer conditions for pedestrians, cyclists, and drivers at intersections. The first phase of the Intersection Safety Challenge was initiated in 2023. It featured a number of characteristics that have been successful aspects of other ARPA programs, including performance-based procurement, stage-gated programs, cross-disciplinary teams and expertise, high-impact domains, and open innovation ecosystems.
Building upon the initiatives announced in June 2023, DOT has since undertaken additional efforts related to key transportation infrastructure areas. In January 2024, DOT announced the winners of the first phase of the U.S. DOT Intersection Safety Challenge, a call for opportunities “to transform roadway intersection safety by incentivizing new and emerging technologies that identify and address unsafe conditions involving vehicles, and vulnerable road users at intersections.” In February 2024, DOT announced an RFI about opportunities and challenges of artificial intelligence (AI) in transportation, along with the $15 million Complete Streets AI Initiative, a new opportunity for American small businesses to leverage advancements in AI to improve transportation.
Since its authorization, ARPA-I has made steady progress in gathering insights from transportation infrastructure experts across the country and is prepared to pursue future projects that will be both appropriately ambitious and focused on our largest transportation problems.
ARPA-I National Listening Tour Overview
The ARPA-I National Listening Tour was the continuation and expansion of an inaugural ARPA-I expert community convening held at DOT headquarters in Washington, DC in December 2022 titled Transportation, Mobility, and the Future of Infrastructure. The National Listening Tour stops included:
- Pacific Northwest (hosted by the University of Washington in Seattle, WA) – September 2023
- Southeast (hosted by the Georgia Institute of Technology in Atlanta, GA) – February 2024
- Midwest (hosted by Newlab in Detroit, MI) – March 2024
- Mid-Atlantic (hosted by Cornell Tech in New York, NY) – June 2024
The purpose of the ARPA-I National Listening Tour was to convene leading researchers, entrepreneurs, companies, and other transportation innovators and initiate a dialogue to ensure that ARPA-I reflects the priorities and capabilities of transportation and infrastructure R&D stakeholders across the ecosystem.
Each Workshop followed a consistent format: plenary presentations from DOT leadership highlighted the opportunity and imperative of this community’s involvement and provided background on the unique cultural and structural components that set ARPA agencies up to seed breakthrough innovations, how ARPAs ideate advanced research program designs, and the crucial roles that partners outside of government can play in ARPA-I programs.
Beyond the plenary presentations, the bulk of each Workshop consisted of breakout activities and discussions that focused on some combination of ideal future visions for what U.S. transportation infrastructure could look like in 10-20 years, the biggest problems facing U.S. transportation infrastructure, and novel technological breakthroughs and opportunities that have the potential to solve those fundamental problems.
In total, 280 transportation infrastructure experts participated in the four Workshops and the inaugural convening in Washington, DC. These experts included representatives from academia; corporations; policy and nonprofit organizations; federal, state, and local departments of transportation; venture capital (VC) and other investment firms; and startups.
Promising Ideas for Future ARPA-I R&D Programs
While the DOT continues to engage in strategy development and seek expert inputs to help shape ARPA-I’s initial set of research priorities once fully resourced, several promising ideas were uncovered during the ARPA-I National Listening Tour. Those ideas are described below in no particular order, along with the potential impact they could have on U.S. transportation infrastructure systems at scale. These ideas are not meant to serve as a current representation of priorities or research agendas of ARPA-I, but instead as a showcase of the creative and transformative solutions that the U.S. transportation infrastructure expert community is capable of envisioning.
AI-enabled efficiency throughout the infrastructure lifecycle
One of the most pervasive challenges in U.S. transportation infrastructure is the inefficiency of current project planning, design, and construction processes. Traditional methods often involve lengthy timelines, costly overruns, and frequent delays due to a lack of coordination and integration between stakeholders, planners, and contractors. These inefficiencies contribute to significant cost burdens on public funds and delay the delivery of much-needed transportation maintenance and improvements.
To address this challenge, one concept raised during the Workshops was that of a fully integrated, AI-enhanced project planning, design, scheduling, and construction schema. This idea leverages advancements in AI and “digital twin” technologies to streamline the entire lifecycle of transportation infrastructure projects—from conception to completion. By incorporating digital twins at all scales (geographic, structural, and temporal), the approach could give stakeholders, planners, and contractors a shared, continuously updated model of each project, creating a powerful opportunity to coordinate work across the project lifecycle.
Implementing this AI-enhanced system would significantly accelerate the delivery of transportation infrastructure projects across the U.S., reducing both project completion times and overall costs. By enabling predictive analysis and continuous optimization, it would also lead to better resource allocation, reducing material waste and minimizing environmental impact. The result would be more efficient, resilient infrastructure systems that can adapt to future demands more effectively.
Priming physical road infrastructure for a digital future
The rigidity of traditional road infrastructure design remains a key issue contributing to inefficient traffic management, safety concerns, and costly, time-consuming physical infrastructure adaptations. Physical road infrastructure like bike lanes, vehicle lanes, and curbs is static and not reflective of the fluid nature of transportation needs, particularly in urban environments. A suite of technologies such as AI, drones, automated systems, and sensor-equipped barriers could be used to create smart lanes and barriers that adjust according to live traffic data, weather conditions, or sudden hazards, by rerouting traffic or narrowing lanes to optimize for current conditions.
If the U.S. could develop systems like this that have been piloted in other countries (e.g., Spain), the impact on transportation infrastructure would be transformative. City planners would have the ability to program infrastructure dynamically, creating safer, more adaptive environments for cyclists and other road users. The result would be fewer accidents, better traffic management, and more efficient use of space, with the additional benefit of reducing emissions by promoting cycling over car travel. The flexible nature of this infrastructure could also support emerging technologies such as connected vehicles and autonomous driving systems and allow the U.S. to design more sustainable, future-proof cities that prioritize adaptability, safety, and user-centric design.
Automated maintenance and on-site manufacturing
ARPA-I National Listening Tour Workshop participants repeatedly raised concerns around transportation construction and repair challenges, including labor shortages, inconsistent funding, slow project timelines, inefficiencies in traditional construction methods, and the high carbon emissions associated with these methods.
Technologies including AI, robotics, and large-scale additive manufacturing were noted for their potential to solve these challenges when applied at scale. AI-powered systems can monitor roads and bridges in real time, predicting failures and enabling proactive repairs. Once a problem is detected, autonomous drones and robots can perform immediate repairs, reducing downtime and keeping workers safe by eliminating the need for human intervention in unsafe environments. Simultaneously, on-site manufacturing, using 3D printing and generative design, can produce infrastructure components directly at construction sites, reducing the need for long-distance transportation and cutting carbon emissions.
These solutions, especially if applied in tandem with one another, have the potential to make infrastructure maintenance autonomous, continuous, safer, and more cost-effective. On-site manufacturing would speed up construction projects and minimize logistical challenges while reducing the substantial impact the transportation sector currently has on carbon emissions.
New and emerging alternative PNT technologies
Positioning, Navigation, and Timing (PNT) services are essential to the nation’s critical infrastructure, enabling the safe, secure, and efficient operation of transportation systems for federal, state, commercial, and private entities across the U.S., including tribal lands and territories. These services provide crucial data that supports air and maritime supply chains, freight logistics, efficient roadway operations, crash prevention, and shared road use among vehicles and pedestrians. PNT services also ensure the safety and efficiency of aviation operations. For decades, the Global Positioning System (GPS) has been the backbone of PNT, continually evolving to improve accuracy, integrity, and security while expanding its applications.
However, despite the emergence of technologies like inertial navigation systems and Light Detection and Ranging (LiDAR) to improve the reliability of PNT data, these tools still have limitations. GPS, for example, relies on satellites, making it vulnerable to space weather disturbances and adversarial actions. Additionally, GPS systems can be compromised by threats such as jamming and spoofing.
Quantum sensors offer a promising solution, providing navigation capabilities in areas where GPS signals are weak or unavailable. These sensors use the principles of quantum mechanics to measure physical properties like time, acceleration, and magnetic fields with unprecedented accuracy. Quantum clocks, for instance, provide exceptional timekeeping precision, critical for synchronizing networks and systems. Quantum inertial sensors deliver highly accurate position and velocity measurements, making them invaluable during extended periods where GPS is unavailable. Meanwhile, quantum gravimeters and quantum magnetometers, which are passive systems, can operate under all weather conditions, at any time, and in featureless terrains such as oceans. These sensors enable gravitational anomaly-aided navigation (GravNav) and magnetic anomaly-aided navigation (MagNav). They collectively offer pathways to make our PNT systems more precise, more reliable and resilient.
Repurposing transportation rights-of-way for infrastructure of the future
Another critical challenge raised throughout the Workshops was that of meeting the growing demand for electric vehicles (EVs) and the integration of modern, sustainable technologies at scale. The current network, designed for traditional internal combustion engines, lacks the infrastructure to support widespread EV adoption and smart technologies like connected and autonomous vehicles.
One solution posed during Workshops (and as part of the White House’s Net-Zero Game Changers Initiative) is to repurpose existing transportation rights-of-way (ROWs) along highways and railroads for dual use, enabling the development of EV charging infrastructure and clean energy transmission without the need for costly land acquisition or major structural changes. With 48,000 miles of interstate highways and 140,000 miles of freight railroads, the U.S. has a vast network of ROWs that can be leveraged for new infrastructure. These ROWs can host charging stations, and in some cases, technologies like inductive charging embedded within roadways, allowing vehicles to recharge as they drive. This would make long-distance EV travel more feasible, eliminating range anxiety and encouraging broader adoption of electric vehicles.
Technological innovations like high-voltage underground cables and modular interconnection power electronics would help ensure grid stability while integrating energy and transportation infrastructure. These tools would allow the grid to balance the energy demands of electric vehicles in real time, creating a smarter, more resilient transportation network.
Challenges Facing U.S. Transportation Infrastructure
A portion of each Workshop breakout discussion focused on identifying the fundamental problems and challenges facing U.S. transportation infrastructure. Throughout these discussions, 353 individual responses were gathered. Themes that emerged consisted of problems related to safety, aging infrastructure and maintenance, data access and other data issues, environmental sustainability and resilience, financial constraints, operational inefficiencies related to existing structures and processes, equity and accessibility, technology adoption and scaling, energy and electrification, and community and planning challenges. Each of these broad themes that arose is broken down in further detail below.
Categories of Transportation Infrastructure Challenges
Safety (66 mentions): Safety remains a paramount issue, with transportation-related fatalities once again topping 40,000 in the U.S. during 2023. Factors raised by participants as contributing to safety risks include impaired driving, driver distractions, excessive speeds, and inadequate infrastructure accommodations for vulnerable road users like pedestrians and cyclists. Unsafe street and intersection designs were also a critical concern, with participants observing that “the majority of intersections outside of urban centers are not designed to accommodate pedestrians safely.” The rise in the size and weight of vehicles, along with the general car-centric nature of much of U.S. transportation infrastructure, was frequently noted as an impediment to pedestrian safety. Multiple participants noted that current infrastructure prioritizes vehicle speed and efficiency over driver, pedestrian, and cyclist safety.
Aging Infrastructure (56 mentions): Aging physical infrastructure continues to be a top concern for many experts. Challenges participants pointed to included failing bridges, deteriorating road conditions, and corrosion of materials, resulting in considerable maintenance backlogs. Participants also frequently pointed to outdated designs that do not accommodate modern transportation needs and struggle to adapt to contemporary demands, including electrification needs and the ability to withstand climate-related events.
Data Inadequacies (42 mentions): Many participants made reference to a significant lack of real-time and comprehensive data for the transportation planning the U.S. needs. For instance, data on pedestrian and bike activities to analyze crash risks effectively is often historic, segmented, inaccurate, and inaccessible. Data collection on vehicle collisions takes up to a year, delaying critical safety decisions. Digital infrastructure gaps were a frequent concern, with participants noting that “many areas lack the necessary digital infrastructure for modern transportation systems.” The need for better people-oriented data was also emphasized, specifically data on pedestrian and cyclist movements to improve safety.
Environmental Sustainability and Resilience (37 mentions): The transportation sector is the largest contributor (28%) to U.S. greenhouse gas emissions. Accordingly, participants called out these emissions rates, the environmental impact of larger vehicles, the carbon intensive materials and processes currently used in infrastructure construction, and the need for influencing travel behavior towards low carbon trips. Alongside sustainability concerns, infrastructure resilience was another common challenge raised.
Financial Constraints (34 mentions): According to participants, high costs associated with constructing, repairing, and maintaining transportation infrastructure limit the scope and speed of improvements. Public agencies face difficulties in efficient procurement and funding allocation, exacerbated by a backlog in maintenance and capital improvements. There were also mentions of profit-driven interests in car infrastructure, limited revenue from fares, and the need for funding catalysts and public-private collaborations.
Operational Inefficiencies (34 mentions): Operational inefficiencies were a recurring theme in discussions, specifically fragmented management, with comments like “the lack of coordination between agencies results in redundant efforts.” Inefficiencies in resource utilization were highlighted, along with a lack of coordination across transportation modes.
Equity and Accessibility (25 mentions): Disparities in access to transportation infrastructure affect low-income and rural communities, as well as vulnerable populations. Car-centric infrastructure impacts and social inequities are major concerns. There are also challenges related to accessibility for people with disabilities, rural area connectivity, and access to reliable wireless connectivity for digital infrastructure advancements.
Technological Integration (21 mentions): Slow adoption of new technologies and integration with legacy systems are major hurdles, with participants asserting that there is resistance to adopting new technologies in the transportation sector. Challenges with autonomous vehicle integration, maritime and port digital integration, and procurement of new technologies are significant. There is also a need for improved cybersecurity alongside digital infrastructure buildout.
Energy and Electrification (20 mentions): Grid limitations in rural areas and the need for improved electric vehicle (EV) charging infrastructure were common themes in discussions of our largest transportation infrastructure challenges. Specifically, participants pointed to a mismatch between the transportation sector and the electric grid, as the current grid is not designed to handle the demands of electric transportation.
Community and Planning Issues (18 mentions): Slow community engagement processes and the need for adaptable project designs were frequently cited issues. Encouraging transit ridership and the need for inclusive transportation planning were also mentioned.
Opportunities to Solve U.S. Transportation Infrastructure Challenges
Building upon the ARPA-I National Listening Tour breakout discussions of U.S. transportation infrastructure’s biggest challenges, Workshop participants shifted their attention to solving these problems. To do so, they brainstormed and discussed current and coming opportunities that must be capitalized upon in order to solve the most pressing challenges highlighted in the section above.
Throughout these discussions, 415 individual responses were gathered. Participant insights included policy suggestions, stakeholder engagement strategies, and infrastructure improvements, but the primary focus of these discussions, given ARPA-I’s scope, was on technological opportunities (accounting for 334 of the 415 responses). Key themes raised include the integration of cutting-edge technologies such as artificial intelligence (AI) and machine learning (ML), sensor technology, the internet of things (IoT), digital twins, and edge computing to enhance data collection, processing, and real-time decision-making. Participants also emphasized the need for interoperable systems, standardization of data, and the creation of national repositories to streamline information sharing and improve infrastructure planning.
Technology Opportunities
Given ARPA-I’s focus on technology solutions, the primary opportunity types identified throughout National Listening Tour Workshops were cross-cutting technologies, described in detail below.
Sensors and Sensor-related Technologies (71 mentions): Sensors were the most frequently cited technology opportunity for addressing transportation infrastructure problems, enabling real-time detection of hazards, monitoring of infrastructure wear and tear, and the accurate, continuous data collection that is currently lacking. Examples of specific sensor technologies or applications included:
- Multiple sensors on new vehicles to increase accessible and real-time data
- High-tech cameras to detect pedestrians and reduce the risk of accidents in urban areas
- Low-cost sensors for infrastructure condition assessment to identify early signs of infrastructure failure and reduce the risk of catastrophic failures
- Sensor systems on bridges and within technology like geosynthetics
AI and Machine Learning (AI/ML) (50 mentions): AI and ML were mentioned throughout the National Listening Tour Workshops as necessary pieces to solve complex problems related to traffic management, predictive maintenance, and autonomous vehicle operations. AI and ML can improve the accuracy of predictions, optimize system performance, and automate decision-making processes. Examples of specific types and applications include:
- AI that can model and predict behavior
- Generative AI to address the issue of unforeseen scenarios in autonomous vehicle operations by generating solutions in real-time
- AI/ML to accelerate new materials and structures
- Physics-informed AI to accurately model infrastructure impacts
Data Standardization and Management (32 mentions): One opportunity that may not have one specific technology tool to point to, but is inherently a technological opportunity, is data standardization and management. This would tackle the issue of fragmented data systems, ensuring consistency and interoperability across various modes, networks, and stakeholders. This would facilitate better decision-making and more efficient infrastructure management. Examples of specific standardizations and management tools include:
- Standardized data formats
- National data repositories to create centralized databases
- Federated data change management
- Data version control
Autonomous Vehicles (AVs) (28 mentions): Participants pointed to advancement and widespread rollout of AV technology as perhaps the biggest transformation to come in U.S. transportation systems. Achieving this maturation and scale could address the problem of human error in driving, which is a leading cause of crashes. AVs could optimize traffic flow and reduce congestion, making transportation systems both safer and more efficient. Examples of AV applications and associated technologies include:
- Dedicated AV lanes between major transportation centers
- AV-ready road network
- Fully capable and scalable AV technology
Internet of Things (IoT) (27 mentions): IoT technology can help solve the problem of disconnected systems by enabling communication between various elements of the transportation network. Participants described it as a way to achieve real-time monitoring, improve safety, and increase efficiency. Examples of IoT applications include:
- IoT and AI integration reducing the need for hardware devices
- Leveraging IoT to reduce urban congestion by creating connected spaces that optimize mobility
Digital Twins (23 mentions): Digital twins have the potential to address inefficient planning and maintenance by providing accurate virtual representations of physical infrastructure. This would allow for better simulation, monitoring, and optimization, leading to more informed decision-making. Examples of applications from Workshop participants include:
- Digital twin immersive testing to identify potential unexpected failures before physical implementation
- Digital twins of infrastructure and traffic models to improve traffic management by simulating real-world conditions and enabling proactive responses to potential issues
Edge Computing (19 mentions): According to many participants, edge computing could completely fix the issue of latency in data processing, which is critical for applications requiring immediate responses, such as AVs and real-time traffic management systems. Examples of edge computing applications include:
- Real-time processing and decision-making
- Onboard and edge computing capabilities to enable vehicle and infrastructure to operate autonomously
Electric Vehicles (EVs) (19 mentions): Participants pointed to the readily apparent opportunity that EVs represent if the technology, costs, and infrastructure can be achieved at scale. To get to scale, it was widely acknowledged that we need a more efficient EV charging experience, with infrastructure that allows for rapid charging at service areas. Another challenge for EV adoption is the need to build a widespread, reliable charging network that can meet the demands of all users, including long-haul trucks.
Modeling-related Technologies (17 mentions): Modeling technologies are essential for predicting infrastructure performance, traffic patterns, and the impact of various interventions. Examples of modeling-related technologies raised by participants were:
- Predictive modeling to optimize safety planning and real-time traffic management
- Behavioral modeling to improve AV prediction and safety
5G/6G Communication (15 mentions): Advanced communication technologies like 5G and 6G were mentioned as crucial for enabling real-time data exchange between vehicles, infrastructure, and central systems. Examples of applications include:
- 6G integrated sensing and communication to provide ultra-low latency necessary for AV operations and other critical applications
- 6G support for point-to-point communications in vehicle-to-everything (V2X) systems
- Pervasive broadband and level of service
Carbon Capture Technologies (10 mentions): Carbon capture technologies are critical for reducing greenhouse gas emissions across the transportation sector. Examples mentioned by participants include:
- Carbon capture methods such as amine-based technology, solid pellets, portable units, and direct air capture
- Use renewable hydrogen to decarbonize steel production
- Apply carbon capture, utilization and storage to cement
Drones and Related Technologies (10 mentions): Workshop discussions around drones noted that they offer innovative solutions for infrastructure monitoring, logistics, and transportation management. Examples raised by participants include:
- 3D mobility enhancement through drones for last-mile delivery
- Swarms of drones used for large-scale infrastructure inspection
Low Carbon-related Innovations (9 mentions): Similar to carbon capture technologies, low carbon-related innovations will be important in reducing carbon emissions. Examples of technologies raised during breakout discussions include:
- Low carbon materials to reduce the carbon footprint of physical infrastructure like roads and bridges
- Energy-storing materials (multifunctional concrete)
Quantum Computing (4 mentions): Quantum computing was raised a few times throughout discussions as a means for developing quantum material sensors and for quantum sensing that offers unprecedented precision in infrastructure monitoring.
Non-technology Opportunities
Policy suggestions: Policy-related opportunities focused on creating a supportive regulatory framework, aligning financial investments with long-term infrastructure goals, and ensuring that policies are inclusive and equitable. These policies were intended to address challenges in planning, funding, and implementing large-scale infrastructure projects. Specifically, participants discussed opportunities to craft policies to ensure that transportation investments prioritize underserved communities, addressing the problem of unequal access to transportation resources.
Stakeholder engagement strategies: Participants emphasized the opportunity presented by involving diverse groups, particularly those traditionally underserved or impacted by infrastructure projects, to raise the voices of all community members so that infrastructure development is responsive to their needs.
Physical infrastructure improvements: Non-technology-specific infrastructure improvements highlighted throughout the Workshops included enhancing the physical aspects of transportation systems, such as road design and public transit accessibility.
For instance, increasing the use of roundabouts can help solve the problem of high accident rates at traditional intersections. Additionally, developing complete streets infrastructure that supports all users, including pedestrians and cyclists, addresses the problem of limited accessibility and safety on roads designed primarily for vehicles.
Financial and economic strategies: Opportunities raised centered on creating sustainable funding mechanisms for transportation infrastructure projects. These strategies address the challenges of securing adequate, long-term financing for both new projects and the maintenance of existing infrastructure. Ideas included alternative funding mechanisms such as value capture or tolling.
Predictions for U.S. Transportation Infrastructure in 10-20 Years
One of the key criteria for any good ARPA project is that it is appropriately ambitious. To set that level of ambition, ARPA-I National Listening Tour participants were asked to make a prediction for U.S. transportation infrastructure 10-20 years from now, taking an ambitious perspective and focusing on what could be. Throughout these discussions, 291 individual predictions were gathered. Although the predictions were wide-ranging and sometimes narrowly focused, recurring themes emerged that help reveal a shared vision for what participants want to see in the transportation systems of tomorrow.
Participants frequently emphasized the importance of widespread adoption and integration of advanced technologies such as autonomous vehicles, AI-driven traffic management, and real-time data communication to create a connected and efficient transportation network. Sustainability also emerged as a central theme, with many predicting a shift toward carbon-neutral transportation systems powered by renewable energy sources. Ideas included dynamic wireless charging for electric vehicles, infrastructure embedded with carbon sequestration capabilities, and the widespread adoption of alternative fuels like hydrogen. The focus on sustainability also extended to construction practices, with participants envisioning the use of self-healing and recyclable materials to build resilient infrastructure that can adapt to climate challenges.
Equity and accessibility were also frequently mentioned, with a strong emphasis on creating a transportation system that serves everyone, regardless of location or socioeconomic status. Visions included universal Americans with Disabilities Act (ADA) compliance, free public transportation, and the development of equity-based planning tools to ensure that investments benefit all communities. The overarching ambition is to build a transportation network that is not only technologically advanced and environmentally sustainable but also inclusive and equitable, providing reliable access to mobility for all.
Categories of Predictions for U.S. Transportation Infrastructure in 10-20 years
Automation and Autonomy (36 mentions): Participants imagined a future dominated by autonomous systems, including self-driving cars, autonomous shuttles, and urban air mobility. These systems would be interconnected, allowing for seamless operation and enhanced efficiency. Ideas included autonomous vehicle fleets, virtual “rails” for guiding autonomous vehicles, and the deployment of autonomous drones for transport. The overarching vision was a transportation ecosystem where human error is minimized, and transportation becomes more efficient and safer.
Connectivity (31 mentions): Connectivity was seen as the backbone of transportation infrastructure in the future, with participants envisioning a fully interconnected system where vehicles, infrastructure, and personal devices communicate seamlessly. Advanced technologies such as quantum sensing, AI-driven traffic management, and universal communication standards were proposed to create a hyperconnected network. This would enable real-time traffic management, reduce congestion, and optimize the flow of goods and people.
Sustainability and Environmental Impact (29 mentions): Participants proposed a wide range of ideas aimed at reducing the environmental impact of transportation in the future. These included self-healing materials that extend infrastructure life and reduce the emissions associated with frequent maintenance and construction, carbon sequestration integrated into construction materials, and the development of energy-generating infrastructure such as smart trails that power electric bikes. Participants also discussed the potential of dynamic wireless charging for electric vehicles and the large-scale adoption of alternative fuels, ensuring that the transportation system is not only sustainable but actively contributes to environmental restoration.
Safety (28 mentions): Safety innovations were a frequent theme, with participants proposing strategies to achieve a near-zero fatality transportation system. Ideas included expanding Vision Zero strategies, integrating AI for real-time risk detection, and deploying automated enforcement systems to prevent unsafe driving behaviors. The reimagining of urban spaces to prioritize pedestrian and cyclist safety was also highlighted, with infrastructure designed to protect the most vulnerable road users.
Climate Resilience (25 mentions): Participants emphasized the importance of building infrastructure that can withstand and adapt to the impacts of climate change. Ideas included the use of self-healing and self-sensing materials, modular designs for easier repairs and upgrades, and infrastructure that is resilient to extreme weather events. The vision was for transportation infrastructure that is not only durable but also adaptable to future climate challenges, contributing to overall climate resilience.
Equity and Accessibility (24 mentions): Ensuring equity and accessibility was a priority for participants when thinking about the future, with ideas like universal ADA compliance, free public transportation, and the development of equity-based asset management tools. The goal was to create a transportation system that serves all communities, including those that are traditionally underserved, such as rural areas and marginalized populations.
Public Transit (22 mentions): Public transit was envisioned as a fully integrated, multimodal system that is easy to use and highly efficient. Participants proposed the elimination of car-centric roadways in urban areas, replacing them with dedicated lanes for public transit, cycling, and walking. High-speed rail development, particularly in key regions, and the expansion of micro-mobility options were also highlighted as ways to enhance last-mile connectivity and reduce reliance on personal vehicles.
Innovation and Technology Adoption (18 mentions): Participants predicted a future where technologies like AI, quantum computing, and digital twins have been adopted at scale to optimize infrastructure performance and maintenance. These technologies at scale would allow for real-time monitoring, predictive maintenance, and dynamic system adjustments, and contribute to transportation systems being more efficient, resilient, and future-proofed.
Data-driven Planning (17 mentions): Participants envisioned the use of AI for real-time data analysis, along with the establishment of national data standards to ensure seamless integration across different modes, organizations, and stakeholders.
Policies and Partnerships (13 mentions): Despite not being the focus of ARPA-I, policy and governance were mentioned as enablers for many of the ambitious future visions. Participants proposed establishing public-private partnerships to fund and manage infrastructure projects and allow for more flexible and innovative financing models. They also envisioned equity-based asset management tools and policies, increased cross-jurisdictional collaboration, and policies to set national guidelines for climate adaptation in infrastructure design and construction.
Multimodal Mobility (11 mentions): The collective vision for mobility included creating a seamless multimodal transportation system that allows users to easily switch between different modes such as bikes, trains, and buses. Participants hope for the development of unified payment systems to enhance the user experience and reduce reliance on personal vehicles.
Energy (10 mentions): Energy integration predictions centered on using renewable energy sources, bidirectional EV charging for grid stability, and dynamic load management systems. Overall, participants want a future where transportation systems are tightly integrated with energy networks, ensuring reliability while reducing the carbon footprint of the transportation sector.
Urban Planning (10 mentions): Urban planning-related visions included the removal of highways in cities, replacing them with green spaces, light rail, and pedestrian-friendly streets. Participants also predicted the development of high-speed rail corridors connecting major cities, reducing the need for air travel and long car journeys, and integrating land use with transportation planning for more sustainable urban growth.
Materials and Construction (10 mentions): Participants want to see increased use of self-healing and self-sensing materials to extend infrastructure lifespans by centuries. Modular construction techniques and adopting low-carbon materials were also noted as ways to reduce construction times, costs, and environmental impact.
Community and Health Impacts (7 mentions): Transportation infrastructure was envisioned by some as a tool for improving community health and well-being in the future. Ideas included designing infrastructure that supports physical and mental health, reducing the “pink tax” on transportation for women and families, and creating transportation systems that enhance social cohesion and community safety.
Conclusion
As ARPA-I continues its mission to revolutionize transportation infrastructure, it is essential to sustain and expand the support of the transportation expert community and stakeholders across the country. ARPA-I’s success will depend on the collective effort of researchers, innovators, policymakers, and industry leaders who recognize the agency’s potential to drive breakthrough solutions. To truly tackle our biggest transportation infrastructure challenges, we must deepen our commitment to collaboration, align resources strategically, and remain focused on innovative, high-impact outcomes. The expert community should continue to engage with DOT and ARPA-I, push the boundaries of what is possible in their own work, and seek to build the support necessary to turn ARPA-I’s ambitions into reality.
For a full list of organizers, facilitators, and participant organizations, please see the full PDF-version of the report here.
Improving Public Awareness and Understanding of Advisory Committees
From January 2024 to July 2024, the Federation of American Scientists interviewed 30 current and former Advisory Committee (AdComm) members. Based on these discussions, we were able to source potential policy recommendations that may assist with enhancing the FDA’s ability to obtain valuable advice for evidence-based decision-making. The results of these discussions are presented in case study format detailing the recurring themes that emerged and policy recommendations for improvement.
As a federal agency, the FDA holds one of the most important roles: ensuring public safety when approving vaccines, medical devices, and medicines. The approval of these products usually requires extensive trials with data that support their safety and efficacy. Because most of these decisions are complex and multifaceted, the FDA enlists the support of Advisory Committees to assist with its decision-making process. The primary role of FDA Advisory Committee members is to provide the FDA with informed advice and recommendations on issues spanning science, regulatory policy, and the evaluation of products under the FDA’s jurisdiction. Because AdComm members serve the FDA in an advisory capacity, their recommendations are non-binding; they do not have the final say in the regulatory approval process.
However, over the years, it has been made evident that the public is unaware of the role of Advisory Committees and ways in which they can engage with the FDA. In this case study, FAS hopes to share the current problem and actionable recommendations to combat public misconceptions regarding FDA AdComm roles and provide guidance on increasing FDA engagement with the public and other relevant stakeholders throughout the regulatory process.
Public Awareness Problems
While AdComm members are experts in their respective fields who volunteer their time to provide advice to the FDA, the agency must weigh multiple factors before making official decisions. The recommendations provided during Advisory Committee meetings are just one aspect considered in regulatory decision-making and do not guarantee an official approval or denial of a product by the FDA. During AdComm meetings, the FDA allows the general public to make public comments to the Agency and the AdComm regarding the topic being addressed. Despite this, members of the general public have expressed that, on many occasions, they are unaware AdComm meetings are occurring, which deprives them of the opportunity to communicate directly with the FDA and the AdComm. Additionally, they feel the FDA fails to engage them adequately, thereby limiting opportunities for participatory engagement. It has also been noted that most members of the general public are unaware that FDA Advisory Committees exist, and those who are aware are often unclear about the committees’ role within the regulatory process.
For these reasons, the FDA must take measures to enhance public understanding in an effort to combat misinformation, educate, and raise awareness on the existence of Committees and their purpose.
Communicating AdComms to the Public
Improving Public Awareness of Advisory Committees and their Role
Improving public awareness of the existence of FDA Advisory Committees and their purpose would assist the FDA with improving public trust and debunking myths and misinformation related to the approval of medical products. Advisory committees operate as independent parties, and their recommendations assist with guiding regulatory decision-making. However, their recommendations are non-binding, and FDA leadership must consider additional factors before granting approvals or denials of medical products.
To increase public awareness of Advisory Committees, it should be made clear that AdComm recommendations are not conclusive, as the FDA considers multiple factors in its official decisions. The FDA can leverage social media platforms to increase awareness and understanding of AdComms by disseminating information through ads and active social media engagement. A survey conducted by Pew Research Center found that eight in ten Americans believe social media platforms are an effective way to raise awareness. In addition, disclaimers should be included on all public-facing materials referencing AdComms to indicate their purpose. Clearly communicating this to the public will dispel myths that AdComms make the final call on the approval of medical products.
Improving Communication about Advisory Committee Meetings
Encouraging public participation in Advisory Committee meetings will help foster a collaborative and engaged general public who can contribute valuable life experience to the regulatory process. FAS has identified ways in which the FDA can better communicate with the public to inform them of Advisory Committee meetings. First, the FDA can develop a webpage that allows people to receive notifications of upcoming AdComm meetings. The FDA can also establish relationships with state and local public health agencies, as well as advocacy organizations, to spread awareness. Through these relationships, the various agencies and organizations can use their networks to disseminate information on AdComm meetings widely. Public health agencies and advocacy organizations can gauge the best ways in which these communities would like the FDA to engage with them. This understanding of the communities they serve makes them ideal partners for fostering continuous engagement.
Policy Recommendations
In an effort to improve public awareness and understanding of AdComms, the potential policy recommendations are as follows:
- FDA can leverage social media platforms to increase awareness and understanding of AdComms by disseminating information (e.g., through ads and active engagement)
- FDA should include a disclaimer on all communications and marketing materials regarding AdComms
- These disclaimers should be made at all public meetings
- Disclaimers should emphasize the purpose of AdComm votes. Disclaimers should state that votes allow AdComm members to provide an official stance to the FDA as experts, but votes are non-binding.
- Develop a webpage that allows people to be placed on a listserv regarding upcoming meetings
- Partner with state and local public health agencies and advocacy organizations to spread awareness
Conclusion
Advisory Committees are essential to the FDA regulatory decision-making process. It is imperative that their role be understood both by committee members themselves and by the general public in order to move the needle forward. While the FDA currently allows the public to provide public comment at Committee meetings, that alone cannot be considered engaging the community. The FDA must create new opportunities for interpersonal communication, which will create an environment of mutual trust and understanding between both parties.
The Role of Patient Advocacy in the AdComm Process
From January 2024 to July 2024, the Federation of American Scientists interviewed 30 current and former Advisory Committee (AdComm) members. Based on these discussions, we were able to source potential policy recommendations that may assist with enhancing the FDA’s ability to obtain valuable advice for evidence-based decision-making. The results of these discussions are presented in case study format detailing the recurring themes that emerged and policy recommendations for improvement.
The regulation of medical products is the responsibility of the Food and Drug Administration (FDA). To ensure effective decision-making regarding these products, the FDA recognizes the importance of patient advocacy and the perspectives of patients. In 1988, the FDA initiated the patient engagement process through the Office of AIDS Coordination, and within five years, the first patient representative was appointed to an FDA Advisory Committee. Since then, the FDA has significantly enhanced its methods of engaging patients, caregivers, and patient advocates. This includes the establishment of various offices, programs, collaboratives, listening sessions, public guidance, and more.
The FDA employs several avenues to engage patients in the regulatory process. Some avenues include the Patient Engagement Advisory Committee (PEAC), public comment, the Patient Focused Drug Development Initiative (PFDD), the Patient Listening Session Program, Patient Engagement Collaborative (PEC), and the Patient Representative Program (PRP). This list does not cover all the ways in which the FDA engages patients and advocates but provides an overview of the key operations involved in patient engagement efforts.
The Patient Engagement Advisory Committee is the only Committee that is completely composed of caregivers, patients, and patient representatives from various organizations, as a way to ensure that the lived experiences of these populations and their opinions are included in the deliberations and regulatory decision-making of medical products. Public comment is a requirement by law for federal agencies, allowing the public to provide feedback on proposed actions or new rules and regulations. Public comment is also sought during Advisory Committee meetings to gather information and perspectives from the public. PFDD meetings provide a platform for the FDA to obtain insights from patients on specific diseases and their treatments. To identify the issues most important to patients, the FDA has a series of guidance documents that are used specifically for PFDD meetings.
The Patient Listening Session Program facilitates informal meetings between patients, their representatives, and FDA staff. These sessions cover a range of topics, including treatment preferences, quality of life, unmet medical needs, and the impact of diseases and their symptoms. The PEC offers a forum for patients, caregivers, and advocates to discuss patient engagement operations. Lastly, the PRP allows patients, caregivers, and advocates who serve as special government employees the opportunity to provide advice to the FDA’s Commissioner or a designated representative on matters related to medical devices and their regulation.
Although there are various avenues for patient engagement and advocacy participation in the medical product regulation process, there are also ways in which these avenues can be expanded or improved.
Patient Advocacy Problems
For many years, patients and their caregivers have not seen significant or sustainable treatments developed for many illnesses and diseases. Some treatments have proven to be ineffective yet still made it to market approval. On the other hand, some treatments met safety and efficacy standards but were not approved, and others are simply not affordable to the populations that need them most. In many of these scenarios, the patient representative voice was lost because patients did not have the opportunity to express their concerns or perspectives on certain treatments to decision makers. This underscores that patient advocacy, and space for the patient representative voice, is crucial to the regulation of medical products.
At the moment, the patient voice is not always heard because there are some FDA Advisory Committees that do not have a patient representative.
While the role of patient advocacy is crucial, it is important to note that there should be boundaries on how the patient perspective is weighed in decision-making. Although patients and their advocates seek treatments that better address their needs, this desire can sometimes cloud their judgment concerning long-term treatment effectiveness. Frequently, patients and their supporters present powerful arguments to Advisory Committees and the FDA for the approval of particular medical products, which can lead to expedited approval in the absence of supportive evidence.
The approval of eteplirsen, which was intended for the treatment of Duchenne muscular dystrophy (DMD), illustrates this point. Despite a 7 to 3 vote by the FDA’s Advisory Committee against approval due to insufficient evidence of the drug’s benefits, the then-leader of the FDA’s Center for Drug Evaluation and Research overruled the committee, resulting in the drug’s authorization. This sparked substantial internal and public criticism and led Dr. Ellis Unger of the FDA’s Office of Drug Evaluation to challenge the approval decision. Dr. Unger emphasized that “patient-focused drug development is about listening to patient perspectives about what matters to them; it is not about basing drug approvals on anecdotal testimony that is not corroborated by data.”
This approval was perceived by many as having been heavily influenced by patient advocacy and raised concerns about potential long-term implications for patient health. It also signaled a need to further examine both patient education and the appropriate limits of patient involvement in the regulatory process. This could have been mitigated had there been a list of criteria in place to be followed for public comment.
Incorporating Patient Perspectives
The Food and Drug Administration (FDA) is committed to understanding the balance of benefits and risks acceptable to patients as they relate to medical products. The FDA defines the role of patient representatives that serve on Advisory Committees as “Special Government Employees” who provide direct input to agency staff and share valuable insight on their experiences with various diseases, conditions, and devices while gaining access to confidential information. These representatives are selected by the FDA to serve on Advisory Committees using a specific set of criteria including, but not limited to:
- “Personal experience with the disease either as a patient or primary caregiver”
- “Knowledge about most treatment options and research for their areas of experience they are representing”
- “Impartiality and compliance with Federal ethics requirements (for example, financial interest, such as stock, in companies that may be affected by FDA decisions)”
These criteria ensure the FDA will understand the patient perspective as it relates to various medical products and that those selected to serve on Advisory Committees are knowledgeable about the areas in which they are aiming to provide guidance. Currently, there are some FDA Advisory Committees that do not have a patient representative. Further, the patient representatives serving on committees do not always have voting privileges. The absence of consistent voting privileges for some patient representatives on Advisory Committees and the lack of a standing patient representative on all committees hinder these individuals from providing an official stance on behalf of the community they represent. Additionally, public comment plays a significant role at Advisory Committee meetings by permitting individuals—including patients, caregivers, and advocacy organizations—to highlight concerns and propose solutions that may not have been previously considered by decision-makers. This process also helps the committee and agency gauge patient acceptance or opposition related to medical products, thereby enhancing their ability to make decisions that more accurately reflect public needs.
When Sarepta was seeking approval of eteplirsen for the treatment of DMD, a patient advocacy organization brought hundreds of patients, caregivers, and other advocates to the Advisory Committee convening so they could make a public comment to the Committee and the agency. The drug subsequently received a swift approval. Although the approval generated much controversy within the agency and among the public, it showed how influential patient advocacy can be. Personal lived experiences, compelling stories of debilitating illnesses, and experiences with current treatment have the ability to impact regulatory decision-making.
The role of patient advocacy remains important in the Advisory Committee process and the FDA’s regulatory decision-making because it assists with decisions that affect the American public. Patient advocacy can take the form of patient representatives who serve on Advisory Committees, those who make public comments during Advisory Committee convenings, and various outreach programs by advocacy organizations. Advocacy gives patients and caregivers support, promotes and protects their rights, and allows broader visibility for the issues that are most important to them. All of these avenues for patient perspective are important for understanding how treatments perform, the current needs of the patient population, and how to tailor care for these populations by truly understanding their condition, diagnosis, and current management. The patient voice is therefore critical to truly understanding how various medical products will benefit their population, how patients will access and afford these products, and how the products will fill an unmet medical need.
Policy Recommendations
In an effort to better leverage Advisory Committee membership, the potential policy recommendations are as follows:
- Dedicate staff to identifying crucial public comments from patient advocates that should be considered for regulatory decision-making
- Include a patient representative on all committees that are reviewing medical products
- If possible, ensure that patient representatives selected have basic knowledge of the federal regulation process
- Establish criteria by which all public comments must abide
- This can be done by creating a checklist for the public to review and consider before forming their comments, one that clearly delineates the mandatory criteria a medical product must meet to be considered for approval
- A disclaimer can also be added stating that a specific threshold or sample size of a population must benefit from the medical product
- Promote patient-focused medical product development by incorporating patient perspectives into the life cycle of the regulatory process
- This can be done by hosting patient town hall discussions for areas such as rare diseases and by expanding initiatives such as the Patient Focused Drug Development (PFDD) public meetings
- Make patient engagement ongoing, rather than only allowing patient engagement quarterly or annually as done with most FDA-led programs and initiatives
Conclusion
The landscape of disease burden and associated symptoms is ever-evolving. To ensure the FDA is best prepared for this changing landscape, patient advocacy and amplifying the patient voice should be considered vital to the development and regulation of medical products. Involving those who are the most impacted by these products is essential. The FDA can further promote the patient perspective and advance patient-centered health through incorporating patient representatives on all Committees that are reviewing medical products, making patient engagement an ongoing process, hosting town halls for patients to allow a broader audience the opportunity to voice opinions, and having dedicated staff to sort through public comments from patients.
FDA Staff and Leadership Disagreements and the Role of the AdComm in the Regulatory Process
From January 2024 to July 2024, the Federation of American Scientists interviewed 30 current and former Advisory Committee (AdComm) members. Based on these discussions, we were able to source potential policy recommendations that may assist with enhancing the FDA’s ability to obtain valuable advice for evidence-based decision-making. The results of these discussions are presented in case study format detailing the recurring themes that emerged and policy recommendations for improvement.
The FDA relies on its scientific staff and Advisory Committees to provide conclusions from trial and study data, which aid in the process of regulating and approving medical products. However, there are instances when disagreements arise between the agency’s scientists, statisticians, Advisory Committees, and leadership on the accelerated or full approval of medical products. The resolution of these disagreements presents a growing concern about FDA leadership overruling the expert opinions of scientific staff and proceeding with official approvals, thus undermining staff expertise, decreasing agency morale, and potentially diminishing public trust.
When Disagreements Between FDA and AdComms Arise
The Federal Advisory Committee Act governs the FDA’s Advisory Committees and establishes a process by which the FDA can seek expert advice on various issues related to science, regulatory policy, and the evaluation of products under the FDA’s jurisdiction. When there are differing opinions within the FDA on the safety and efficacy requirements of medical products, certain products may be referred to an Advisory Committee for further data review by an impartial entity. To aid in this matter, the FDA has developed guidance detailing the process for assembling Advisory Committees. Since the resources required for convening these committees are significant, the FDA ensures there is substantial uncertainty or disagreement regarding the data before referring a product. Advisory Committees discuss the evidence and provide feedback with the goal of producing the optimal evidence-based resolution. This part of the process is crucial to the agency’s regulatory decision-making because it involves unbiased parties and supports transparency, public safety, and public trust.
However, there are times when FDA leadership disagrees with the votes of its Committees and proceeds with controversial approvals. One example of this scenario was the approval of the drug Aduhelm. Aduhelm, which was marketed as a treatment for Alzheimer’s, received an overwhelmingly negative vote from the Advisory Committee on moving forward with its approval for market distribution. Ten of eleven Committee members stated the data did not support adequate efficacy for approval. Nonetheless, the FDA granted accelerated approval, sparking resignations from a third of the Committee and outrage among the public. This disregard for expert opinion was viewed as the FDA exhibiting an approval bias, a perception of the FDA that the public currently maintains.
In 2023, various organizations and coalitions, such as Doctors for America, the Jacobs Institute of Women’s Health, the National Center for Health Research, the TMJ Association, and others, publicly expressed concerns regarding FDA leadership and its approach to drug approvals in a letter addressed to the FDA’s Chief Scientist, Dr. Namandje N. Bumpus. They highlighted that FDA leadership had ignored claims by scientific staff that safety and efficacy standards were not being met for the drug Elevidys. Manufactured by Sarepta Therapeutics, Elevidys was granted accelerated approval by FDA leadership. Dr. Bumpus responded to the letter by defending the FDA’s approval of Elevidys, referring to the agency’s comprehensive review of the data and its consideration of the potential risks of the approved treatment, the nature of the disease and its impact on patients, and the limited number of therapies available. However, the response did little to quell public scrutiny of the controversial approval. Critics were apt to point out that the FDA had a record of approving products in this manner. Years prior, the FDA had approved the drug eteplirsen for the treatment of Duchenne muscular dystrophy (DMD), despite the objections of its scientific staff, who cited a lack of evidence for the efficacy of the treatment. Criticism was also directed at FDA leadership, suggesting that the approval of some medical products was potentially due to favoritism toward the companies seeking approval.
Resolving Internal FDA Disagreements
While it is acknowledged that the FDA must consider technological and political implications alongside scientific evidence in decision-making, it is essential to address the concerns of these organizations and coalitions. As part of an ongoing project to reform the FDA’s Advisory Committee system and assist the FDA with getting the advice needed to make the best evidence-based decisions, FAS engaged in discussions with current and former Advisory Committee members to seek their input on resolving disagreements between FDA staff and leadership. These conversations highlighted the breach of trust between the FDA, its staff, and the public.
“FDA leadership needs to make it clear what data was used and why they’re moving forward when there is opposition.”
“Disagreements should be addressed by a non-biased source because it affects the public safety.”
“There will be times where there are disagreements between staff and leadership. However, there’s a critical need for transparency within the FDA about why decisions are made. These are not decisions about evidence only. Ever.”
“Disagreements should be a matter of public scrutiny. There should be transparency that doesn’t jeopardize confidentiality.”
The FDA’s Commissioner has acknowledged the lack of trust within the institution and expressed a commitment to address the issue. There are many reasons for the lack of trust. However, one reason stems from the FDA’s approval of medical products despite clear opposition from its scientists and sometimes Advisory Committee members.
This raises the question of how the FDA intends to address these internal disagreements, which have the potential to impact the health and safety of the American public. Currently, the FDA handles such conflicts through its Scientific Dispute Resolution (SDR) guidance. The guidance was initially developed in 2009 and updated in 2021 to outline the process for communication regarding internal scientific disputes within FDA Centers. It defines scientific disputes as disagreements that arise from the interpretation of science and the resulting decisions. This definition clearly distinguishes the circumstances in which the guidance should be used to resolve discrepancies between FDA scientists, statisticians, and their respective Center leadership. The guidance offers valuable examples of best practices for resolving formal and informal scientific disputes within the agency. Some of those best practices include, but are not limited to:
- Centers employing various mechanisms to disseminate dispute resolution Standard Operating Procedures (SOPs) to employees.
- Requiring Center SOPs for dispute resolution to clearly state that written documentation is the only avenue in which a formal dispute can be triggered (this step ensures appropriate historical documentation).
- Developing several avenues to address scientific issues through the use of regulatory briefings, advisory committees, internal discussions with Center Directors, standing subject matter committees, and multi-disciplinary teams.
However, it should be noted that these best practice recommendations are not obligatory, and their adoption is left to the discretion of the individual Centers. Furthermore, the document provides a process by which any of the internal parties involved can appeal an unsatisfactory resolution of their dispute through the Office of Research Integrity at the Department of Health & Human Services (HHS).
Policy Recommendations
To increase morale and improve the approach and resolution of internal disagreements within the agency, the policy recommendations are as follows:
- Ensure that all FDA staff and leadership are fully cognizant of the existence and details of the Scientific Dispute Resolution at FDA guidance and the process for submitting disputes for review.
- Incorporate the Scientific Dispute Resolution at FDA guidance into FDA regulations
- Note: To incorporate this guidance into FDA regulations, the FDA will propose the regulation for OMB review. OMB will review and open the regulation up for public comment through the Federal Register. Responses to comments will be developed, and a final draft submitted to OMB for review. If approved, the final regulation will be published in the Code of Federal Regulations (Congressional Research Service, 2013). The involvement of Congress will not be necessary.
- Amend the Scientific Dispute Resolution at FDA guidance to dictate the mandatory execution of best practices within the dispute resolution process.
- This guidance should identify additional non-biased parties (that may not be government-affiliated) to provide impartial guidance on complex scientific matters affecting public safety.
- Develop guidance that clearly explains a transparent process for communicating effectively with AdComm members regarding decision-making when parties have opposing viewpoints
- Implementing a transparent process to communicate with AdComm members regarding differences between the agency and the AdComm will not only assist in improving morale between both parties but also encourage continued support of the AdComm.
- This recommendation is also supported by a survey conducted by 3D Communications with 400+ AdComm members where 94% of members concurred that the FDA should develop a process to communicate their reasoning for decisions in opposition to Committee recommendations (3D Communications, 2024).
Conclusion
While FDA leadership ultimately holds the authority to grant approvals, it is crucial that the perspectives of all experts are duly considered. This includes the valuable input from the agency’s scientists, statisticians, and advisory committees. To regain public trust and restore integrity, it is imperative to first rebuild trust internally among the dedicated public servants within the FDA. Adoption of the aforementioned recommendations would start the trust-rebuilding process and lead to increased safety and precaution measures when approving drugs and medical devices.
Leveraging AdComm Membership
From January 2024 to July 2024, the Federation of American Scientists interviewed 30 current and former Advisory Committee (AdComm) members. Based on these discussions, we were able to source potential policy recommendations that may assist with enhancing the FDA’s ability to obtain valuable advice for evidence-based decision-making. The results of these discussions are presented in case study format detailing the recurring themes that emerged and policy recommendations for improvement.
The FDA relies on its scientific staff and Advisory Committees to provide conclusions from trial and study data, which aid in the process of regulating and approving medical products. Discussions have centered on how to appropriately leverage the membership of Advisory Committee experts to assist with areas of difficulty surrounding the safety and efficacy of medical products. Nonetheless, the methods by which these committees currently generate the evidence the government needs can be improved. This case study focuses on five key areas that we believe can help fully utilize the capacity in which AdComms serve and improve overall engagement with AdComm membership.
AdComm Membership Problems
Advisory Committees serve as the core for expert engagement in the Food and Drug Administration’s (FDA) decision-making processes and are composed of medical professionals, industry representatives, patient advocates, and scientific experts. Their primary role is to provide the FDA with informed advice and recommendations on issues spanning science, regulatory policy, and the evaluation of products under the FDA’s jurisdiction.
The intricacies of being an effective AdComm member, however, have been somewhat overlooked. Conversations with current and former members have highlighted areas for enhancement that would strengthen the function of AdComms and enrich the advice provided. Feedback indicated a lack of transparency in the FDA’s recruitment methods for committee positions, insufficient orientation or training for new members, limited understanding of regulatory procedures among members, and an onerous conflict of interest protocol that served as a deterrent for some members who were asked to return or renew their membership.
Pathways to Improving AdComms Membership
Committee Composition
The composition of Advisory Committees varies depending on the charter that has been set in place. In some cases, committee composition has been set by law. However, where there is flexibility in determining the composition of a committee, consideration should be given to all categories of expertise that should be included and to the diversity of voices selected to participate in these meetings. Committee composition should reflect the diversity of the world and of the populations whom the committees’ recommendations could potentially affect. For this reason, discussions with current and former Advisory Committee members indicated three additional areas of expertise that should be included on all Committees. These discussions revealed that all Committees should include a patient representative, who brings knowledge from lived experience and an understanding of how treatments affect day-to-day life. This recommendation was further corroborated by a 3D Communications survey conducted with 400+ FDA AdComm members, in which 48% agreed there should be a patient and consumer representative on all Committees. Members also stated that pharmacists should be included because drugs and devices eventually pass through their hands before reaching patients, and that pharmacologists should serve on Committees because of their knowledge of the clinical application of drugs and devices. Finally, a roster of temporary members should be created for varying categories of expertise, to be used when additional expertise is needed on a Committee because of a conflict of interest or when a certain skillset or knowledge base is lacking on the current Committee.
Role of the Advisory Committee Chair
The FDA describes the purpose of an Advisory Committee chair as one who will “preside at committee meetings and ensure that all rules of order and conduct are maintained during each session”. The chair also has the responsibility of ensuring all recommendations and advice from AdComm members are clear and evidence-based. Moreover, the role of the chair should be used to enhance the overall committee experience as well as be of service to the FDA. Despite these requirements, there’s an underutilization of the chair’s role in terms of communication and stakeholder coordination, as evidenced by the chair not being listed as a primary point of contact for the Advisory Committee and a lack of coordination amongst stakeholders.
Chairpersons are usually selected due to their critical domain knowledge, understanding of best practices, ability to identify risks and keep members engaged, and expansive relationships within their industry. Maximizing the chairperson’s role requires discussion of how to utilize their valuable domain knowledge and professional networks. Chairs possess extensive networks that could support the identification of permanent or temporary expert participants for AdComms, aiding the FDA’s mission to recruit top talent and helping ensure its continued success in attracting the brightest minds in the industry to provide advice. Additionally, chairs should have a role in identifying relevant issues or products for their respective committees to appraise, giving the FDA another way to keep abreast of critical public concerns through appropriate committee evaluation.
Training
Training is a significant part of many Federal Government service positions. However, besides ethics and conflict of interest trainings, there is no set training program in place for most new Advisory Committee members. Considering Advisory Committee members come from different professional backgrounds with varying levels of expertise, the FDA should develop an onboarding training program to assist with acclimating all new AdComm members into their roles. Many former and current AdComm members mentioned that no formal training was provided as part of the onboarding process. Some members who were new to the FDA AdComm process or who were not physicians or scientists stated they had no knowledge of statistical analyses, clinical trial design, or how the FDA views the role of the AdComm in the regulatory process.
A foundational training, covering these aspects, would greatly benefit those members such as consumer and patient representatives who may lack this shared base of expertise. An investment in such an onboarding experience would promote stronger rapport among members and guarantee their preparedness in analyzing scientific and technical submissions.
Learning about the FDA Regulatory Process
The Food & Drug Administration was established with the purpose of regulating drugs and medical devices to ensure their safety and effectiveness for all citizens in the United States. Many Advisory Committee members join these committees without basic knowledge of the FDA’s regulatory process. During FAS’ discussions with current and former AdComm members, approximately 71% of members stated that they lacked basic knowledge of the regulatory process and of how the FDA makes its decisions.
Providing AdComm members with an introductory course on the FDA’s regulatory process could enhance their comprehension, potentially allowing them to make more effective contributions and informed clinical decisions (based on their occupation). Although the FDA provides some online resources about its processes, like FAQs and guidelines, an expansion of this material should be considered for inclusion in AdComm orientation activities.
Conflict of Interest (COI) Process
18 U.S.C. 208(a) prohibits Advisory Committee members who are designated as special government employees (SGEs) from participating in federal advisory committee service, or any other form of Federal Government service, that will have a “direct and predictable effect” on their financial interests. Similarly, the FDA describes a conflict of interest as an occurrence “when an individual selected to serve on an advisory committee has financial interests that may be impacted by the individual’s work on the advisory committee”. The auditing process for conflicts of interest is designed to confirm that the members of the advisory committee maintain impartiality and to protect the integrity of public health safety. Prior to any committee gathering, the FDA mandates that each participant, classified as either an AdComm member or SGE, complete an FDA 3410 form that reveals all financial connections that could be seen as potential COIs.
However, what happens after the 3410 form has been completed is ambiguous. In 2007, the FDA submitted draft guidance to the Federal Register for comment, entitled Guidance for the Public, FDA Advisory Committee Members, and FDA Staff on Procedures for Determining Conflict of Interest and Eligibility for Participation in FDA Advisory Committees, in an effort to determine whether an inappropriate COI should exclude a member from participating in a committee meeting. Moreover, the official guidance is not easily accessible. A detailed list of considerations to be given when examining conflicts of interest can be found in another draft guidance, the FDA’s Procedures for Evaluating Appearance Issues and Granting Authorizations for Participation in FDA Advisory Committees.
Discussions with current and former AdComm members about the COI auditing process sparked varying views on whether flexibility should be exercised for COIs. Eighty-two percent of members concurred that while they recognize the necessity for such a system, it tends to be overly demanding due to repetitive paperwork, especially when their circumstances remain unchanged. The strenuous nature of this routine has even deterred some from continuing their membership each year and remains a key reason why members choose to end their service.
Despite having a COI process in place, there are loopholes that allow members with conflicts of interest to remain voting members for specific AdComm meetings. One incident involved an Advisory Committee meeting in which 10 members with financial ties to the sponsor were allowed to participate. These individuals ultimately took part in endorsing the TriClip G4 System by Abbott and unanimously agreed that its benefits outweighed the potential risks. To further complicate matters, this information was not disclosed to the public at the time of approval.
While the COI process has resulted in members being rightfully disqualified from meetings due to actual or apparent conflicts, there is room to refine how these conflicts are identified and the standards employed to judge permissible COIs.
Policy Recommendations
In an effort to better leverage Advisory Committee membership, the potential policy recommendations are as follows:
Committee Composition
- If there is flexibility and committee composition is not bound by law, include a patient representative and pharmacist and/or pharmacologist on each Committee
- Patient representatives provide a needed perspective due to lived experiences and understanding of how specific drugs and devices affect their day-to-day life
- Drugs and devices will usually pass through the hands of pharmacists and pharmacologists. Therefore, they should have the opportunity to serve on these Committees and provide feedback. Pharmacologists also understand the clinical application of drugs.
- Develop a register of temporary members that can be utilized when additional expertise is needed for Committee meetings or when there is a conflict of interest (COI)
Role of the Chair
- Expand the role of the Committee chair to encompass recruiting both standing and ad hoc members, as well as identifying prominent issues and products for Committee consideration, thereby allowing for specialized input from their Committee
Training and Regulatory Process
- Institute a basic 101 training for all newly appointed Advisory Committee members that covers statistical analysis and clinical trial design and elucidates the partnership between the FDA and AdComms
- Include an overview of the regulatory process and how the FDA’s decision-making process is performed
COI Auditing Process
- Develop a process that can quickly replace individuals who have a known conflict of interest
- Clearly delineate the criteria for accepting committee service from individuals with potential or actual conflicts of interest
- Streamline the COI process to prevent duplicative work that may act as a deterrent to experts volunteering to serve on the AdComm (work with GSA on this matter if necessary)
Conclusion
Advisory Committees are pivotal to maintaining trust with the public. It is essential for public safety to ensure that the most qualified experts are selected to serve on these Committees and that they have the tools to provide the FDA with informed and evidence-based recommendations. In an effort to increase public health safety, the FDA should enhance the AdComm structure by expanding the chair’s role, creating training programs for all new Advisory Committee members, and revising the conflict of interest procedures.
The Future of Voting for FDA Advisory Committees
From January 2024 to July 2024, the Federation of American Scientists interviewed 30 current and former Advisory Committee (AdComm) members. Based on these discussions, we were able to source potential policy recommendations that may assist with enhancing the FDA’s ability to obtain valuable advice for evidence-based decision-making. The results of these discussions are presented in case study format detailing the recurring themes that emerged and policy recommendations for improvement.
Advisory Committees (AdComms) serve as the core for expert engagement in the Food and Drug Administration’s (FDA) decision-making processes. These committees are composed of medical professionals, industry representatives, patient advocates, and scientific experts. Their primary role is to provide the FDA with informed advice and recommendations on issues spanning science, regulatory policy, and the evaluation of products under the FDA’s jurisdiction. Public meetings led by the FDA with these committees are instrumental in facilitating transparent deliberation between the FDA, the advisory body, and the American public, and they help cultivate a collaborative environment among the FDA, the AdComms, and the public. AdComm recommendations are integral to strengthening public trust and reinforcing the FDA’s credibility, a relationship reinforced when the FDA’s regulatory actions align with the counsel of these independent entities.
Key Problems Facing Advisory Committees
A critical component of the AdComm structure is its voting mechanism, the method by which hand-selected experts offer advice and recommendations on questions posed by the FDA to inform its formal regulatory decision-making. These questions span a broad range of topics, from evaluating post-market safety data to assessing pre-market product risks and benefits and gauging whether a product should be approved or withdrawn from the market. The outcomes of the votes serve as barometers of an AdComm’s official stance on products and provide the FDA with a comprehensive and collective viewpoint. However, the recommendations proposed by AdComms are advisory rather than prescriptive, ultimately leaving the final decision to FDA leadership.
Recent patterns indicate a reduction in the convening of AdComm meetings. In 2010, 55% of FDA-approved drugs were referred to an advisory committee; by 2021, that share had dropped to 6%. The decline in meetings eliminates opportunities for evidence-based evaluation and deliberation that could affect the health and well-being of Americans. Furthermore, the diminishing of these crucial interactions between the FDA, AdComms, and the public exacerbates distrust and erodes transparency. Notably, while most committees take definitive votes supported by explicit justifications for approving or rejecting items under review, FDA Commissioner Robert Califf has suggested in multiple interviews that AdComm votes can be useful but are not mandatory for every meeting. This viewpoint raises concerns that voting could be removed from the reform agenda, which would undermine AdComms’ capacity to evaluate intricate topics that affect the American public. In addition, a survey conducted by 3D Communications of more than 400 current and former AdComm members asked about the importance of voting; 95% of respondents believe that voting should be retained when reviewing the benefits and risks of medical products. Reform discussions have materialized because of these factors, as well as the FDA’s accelerated approval of Aduhelm (aducanumab) despite clear AdComm opposition. Demand for reform is growing, and many are urging an increased number of AdComm meetings and a thorough reorganization of advisory committee operations and voting protocols. Such reform is not only administrative but also symbolic: it affirms that decisions affecting public health should be informed and shaped by multidisciplinary expertise, and it re-establishes the pivotal role of public input in regulatory affairs, an indispensable component of maintaining the American public’s trust.
Significance of AdComm Voting
In response to this call for AdComm reform, a project spearheaded by FAS has sought feedback from AdComm members regarding their views on the significance of voting. The intent of this engagement is to understand members’ experiences as experts and their perspectives on voting by asking the following questions:
- Describe your Advisory Committee’s voting process.
- On a 1-5 scale, how crucial is it for you to vote on products?
These questions aimed to measure the variability in voting mechanisms across committees and the value members place on voting. Results from 30 participants demonstrate a consensus on the critical role of voting in formulating committee recommendations, with 87% of members rating the importance of voting a five (very important).
Policy Recommendations
To uphold the FDA’s integrity and regain public confidence, retaining voting at AdComm meetings is essential in addition to other recommendations to enhance the advisory committee process. The recommendations are as follows:
- Maintain voting as an integral function, allowing FDA Advisory Committee members to convey their collective expertise and advice, aiding the FDA in informed decision-making on scientific and regulatory matters.
- Revise the guidance for FDA Advisory Committee Members and FDA Staff to explicitly define the circumstances under which voting should occur and to eliminate sequential voting
Conclusion
The recommendations of FDA Advisory Committee members are a pivotal component to the FDA’s regulatory decision-making process. Maintaining the voting protocol for Advisory Committee meetings is essential as members strive toward the continued provision of precise, impartial, and evidence-based counsel to the FDA. This voting mechanism guarantees the inclusion of each member’s perspective and ensures that an official committee stance is taken, offering the FDA definitive and straightforward guidance.
Scaling Effective Methods across Federal Agencies: Looking Back at the Expanded Use of Incentive Prizes between 2010 and 2020
Policy entrepreneurs inside and outside of government, as well as other stakeholders and advocates, are often interested in expanding the use of effective methods across many or all federal agencies, because how the government accomplishes its mission shapes the outcomes it can deliver for the public it serves. Adoption and use of promising new methods by federal agencies can be slowed by a number of factors that discourage risk-taking and experimentation and instead encourage compliance and standardization, too often as a false proxy for accountability. As a result, many agency-specific and government-wide authorities for promising methods go under-considered and under-utilized.
Policy entrepreneurs within center-of-government agencies (e.g., the Executive Office of the President) are well-positioned to use a variety of policy levers and actions to encourage and accelerate federal agency adoption of promising and effective methods. Some interventions by center-of-government agencies are better suited to driving initial adoption, others to accelerating or maintaining momentum, and yet others to codifying adoption and making it durable once widespread. Therefore, a policy entrepreneur interested in expanding adoption of a given method should first seek to understand the “adoption maturity” of that method and then undertake interventions appropriate for that stage of adoption. The arc of agency adoption of new methods can be long, measured in years and decades rather than weeks and months, and policy entrepreneurs should be prepared to support adoption over similar timescales. In considering the adoption maturity of a method of interest, policy entrepreneurs can also reference the ideas of Tom Kalil in a July 2024 Federation of American Scientists blog post, “Increasing the ‘Policy Readiness’ of Ideas,” which offers sample questions to ask about “the policy landscape surrounding a particular idea.”
As a case study for driving federal adoption of a new method, this paper looks back at actions that supported the widespread adoption of incentive prizes by most federal agencies over the course of fiscal years 2010 through 2020. Federal agency use of prizes increased from several incentive prize competitions being offered by a handful of agencies in the early 2000s to more than 2,000 prize competitions offered by over 100 federal agencies by the end of fiscal year 2022. These incentive prize competitions have helped federal agencies identify novel solutions and technologies, establish new industry benchmarks, pay only for results, and engage new talent and organizations.
A summary framework below includes types of actions that can be taken by policy entrepreneurs within center-of-government agencies to support awareness, piloting, and ongoing use of new methods by federal agencies in the years ahead. (Federal agency program and project managers who seek to scale up innovative methods within their agencies are encouraged to reference related resources such as this article by Jenn Gustetic in the Winter 2018 Issues in Science and Technology: “Scaling Up Policy Innovations in the Federal Government: Lessons from the Trenches.”)
Efforts to expand federal capacity through new and promising methods are worthwhile to ensure the federal government can use a full and robust toolbox of tactics to meet its varied goals and missions.
OPPORTUNITIES AND CHALLENGES IN FEDERAL ADOPTION OF NEW METHODS
Opportunities for federal adoption and use of promising and effective methods
To address national priorities, solve tough challenges, or better meet federal missions to serve the public, a policy entrepreneur may aim to pilot, scale, and make lasting federal use of a specific method.
A policy entrepreneur’s goals might include new ways for federal agencies to, for example:
- Catalyze the development, demonstration, and deployment of technology and novel solutions;
- Acquire or commercialize products and services that meet government or national needs;
- Engage and seek input from communities and the public;
- Deliver more effective, efficient, and equitable services and benefits;
- Provide technical assistance to state, local, Tribal, and territorial governments;
- Retain and recruit talent for mission critical occupations or to fill federal skills gaps;
- Assess and evaluate organizational health and performance or program-level outcomes; or
- Translate evidence to practice.
To support these and other goals, an array of promising methods exists and has been demonstrated in other sectors (such as philanthropy, industry, and civil society), in state, local, Tribal, or territorial governments and communities, or in one or several federal agencies, with potential for beneficial impact if more federal agencies adopted these practices. Many methods are either specifically supported or generally allowable under existing government-wide or agency-specific authorities.
Center-of-government agencies include components of the Executive Office of the President (EOP) like the Office of Management and Budget (OMB) and the Office of Science and Technology Policy (OSTP), as well as the Office of Personnel Management (OPM) and the General Services Administration (GSA). These agencies direct, guide, convene, support, and influence the implementation of law, regulation, and the President’s policies across all Federal agencies, especially the executive departments. An August 2016 report by the Partnership for Public Service and the IBM Center for the Business of Government noted that, “The Office of Management and Budget and other ‘center of government’ agencies are often viewed as adding processes that inhibit positive change—however, they can also drive innovation forward across the government.”
A policy entrepreneur interested in expanding adoption of a given method through actions driven or coordinated by one or more center-of-government agencies should first seek to understand the “adoption maturity” of a given method of interest by assessing: (1) the extent that adoption of the method has already occurred across the federal interagency; (2) any real or perceived barriers to adoption and use; and (3) the robustness of existing policy frameworks and agency-specific and government-wide infrastructure and resources that support agency use of the method.
Challenges in federal adoption and use of new methods
Policy entrepreneurs are usually interested in expanding federal adoption of new methods for good reason: a focus on supporting and expanding beneficial outcomes. Effective leaders and managers across sectors understand the importance of matching appropriate and creative tactics with well-defined problems and opportunities. Ideally, leaders are picking which tactic or tool to use based on their expert understanding of the target problem or opportunity, not using a method solely because it is novel or because it is the way work has always been done in the past. Design of effective program strategies is supported by access to a robust and well-stocked toolbox of tactics.
However, many currently authorized and allowable methods for achieving federal goals are generally underutilized in the implementation strategies and day-to-day tactics of federal agencies. Looking at the wide variety of existing authorities in law and the various flexibilities allowed for in regulation and guidance, one might expect agency tactics for common activities like acquisition or public comment to be varied, diverse, iterative, and even experimental in nature, where appropriate. In practice, however, agency methods are often remarkably homogeneous, repeated, and standardized.
This underutilization of existing authorities and allowable flexibilities is due to factors such as:
- Comfort with existing methods among program and legal staff (“but that’s how we have always done it!”);
- In turn, limited internal expertise on how to deploy specific new methods (“who in our agency knows how to do this well?”);
- Unclear legal authorities or lack of established agency policies or processes (“are we allowed to do that and, if so, where is that authority written?”);
- Unclear permission authorities and approval roles (“whose review and sign-off do I need to do that?”);
- Difficulties clearly defining the opportunity or problem to be addressed at an actionable level of specificity (“what does success look like?”);
- Concerns about perceived risks, such as the risk of funds going unawarded or effective solutions not being identified (“what if no one/nothing meets our target?”);
- Oversight processes that seek out failure, flaws, and non-compliance and overreliance on strict procedures to ensure accountability (“who is responsible for this failure?” instead of “what can we learn from this for the future?”, and “have all the boxes been checked?” instead of “where might we start and how will we learn along the way?”); and
- Reluctance to define and implement meaningful performance indicators and assessment methods at the start of program design (“how will we know if this new method improves our outcomes compared to the status quo?”).
Strategies for addressing challenges in federal adoption and use of new methods
Attention and action by center-of-government agencies often is needed to address the factors cited above that slow the adoption and use of new methods across federal agencies and to build momentum. The following strategies are further explored in the case study on federal use of incentive prizes that follows:
- clarifying government-wide and agency-specific policies and processes;
- building awareness and fostering leadership and staff buy-in;
- offering case studies, examples, and “how-to” playbooks;
- creating connections among a federal community of practice;
- engaging external experts and practitioners;
- removing identified barriers;
- increasing ambition through iterative experimentation; and
- fostering an enterprise-wide learning culture that encourages experimentation, invests in evaluation, and manages risk.
Additional strategies can be deployed within federal agencies to address agency-level barriers and scale promising methods—see, for example, this article by Jenn Gustetic in the Winter 2018 Issues in Science and Technology: “Scaling Up Policy Innovations in the Federal Government: Lessons from the Trenches.”
LOOKING BACK: A DECADE OF POLICY ACTIONS SUPPORTING EXPANDED FEDERAL USE OF INCENTIVE PRIZES
The use of incentive prizes is one method for open innovation that has been adopted broadly by most federal agencies, with extensive bipartisan support in Congress and with White House engagement across multiple administrations. In contrast to recognition prizes, such as the Nobel Prize or various presidential medals, which reward past accomplishments, incentive prizes specify a target, establish a judging process (ideally as objective as possible), and use a monetary prize purse and/or non-monetary incentives (such as media and online recognition, access to development and commercialization facilities, resources, or experts, or even qualification for certain regulatory flexibility) to induce new efforts by solvers competing for the prize.
The use of incentive prizes by governments (and by high-net-worth individuals) to catalyze novel solutions is certainly not new. In 1795, Napoleon offered 12,000 francs to improve upon the prevailing food preservation methods of the time, with the goal of better feeding his army. Fifteen years later, confectioner Nicolas François Appert claimed the prize for his method of heating, boiling, and sealing food in airtight glass jars, the same basic technology still used to can foods. Dava Sobel’s book Longitude details how the rulers of Spain, the Netherlands, and Britain all offered separate prizes, starting in 1567, for methods of determining longitude at sea; John Harrison was finally awarded Britain’s top longitude prize in 1773. In 1919, Raymond Orteig, a French-American hotelier, aviation enthusiast, and philanthropist, offered a $25,000 prize for the first person to perform a nonstop flight between New York and Paris. The prize initially expired in 1924 without anyone claiming it. Given technological advances and the number of pilots engaged in trying to win the prize, Orteig extended the deadline by five years. By 1926, nine teams had come forward to formally compete, and the prize went to a little-known aviator named Charles Lindbergh, who completed the flight in 1927 in a custom-built plane known as the “Spirit of St. Louis.”
The U.S. Government did not begin to adopt the use of incentive prizes until the early 21st century, following a 1999 National Academy of Engineering workshop about the use of prizes as an innovation tool. In the first decade of the 2000s, the Defense Advanced Research Projects Agency (DARPA), the National Aeronautics and Space Administration (NASA), and the Department of Energy conducted a small number of pilot prize competitions. These early agency-led prizes focused on autonomous vehicles, space exploration, and energy efficiency, demonstrating a range of benefits to federal agency missions.

Federal use of incentive prizes did not accelerate until, in the America COMPETES Reauthorization Act of 2010, Congress granted all federal agencies the authority to conduct prize competitions (15 USC § 3719). With that new authority in place, and with the support of a variety of other policy actions, federal use of incentive prizes reached scale, with over 2,000 prize competitions offered on Challenge.gov by over 100 federal agencies between the fiscal years 2010 and 2022.
There certainly remains extensive opportunity to improve the design, rigor, ambition, and effectiveness of federal prize competitions. That said, there are informative lessons to be drawn from how incentive prizes evolved in the United States from a method used primarily outside of government, with limited pilots among a handful of early-adopter federal agencies, to a method being tried by many civil servants across an active interagency community of practice and lauded by administration leaders, bipartisan members of Congress, and external stakeholders alike.
A summary follows of the strategies and tactics used by policy entrepreneurs within the EOP—with support and engagement from Congress as well as program managers and legal staff across federal agencies—that led to increased adoption and use of incentive prizes in the federal government.

Summary of strategies and policy levers supporting expanded use of incentive prizes
In considering how best to expand awareness, adoption, and use among federal agencies of promising methods, policy entrepreneurs might consider utilizing some or all of the strategies and policy levers described below in the incentive prizes example. Those strategies and levers are summarized generally in the table that follows. Some of the listed levers can advance multiple strategies and goals. This framework is intended to be flexible and to spark brainstorming among policy entrepreneurs, as they build momentum in the use of particular innovation methods.
Policy entrepreneurs are advised to consider and monitor the maturity level of federal awareness, adoption, and use, and to adjust their strategies and tactics accordingly. They are encouraged to return to earlier strategies and policy levers as needed, should adoption and momentum lag, should agency ambition in design and implementation of initiatives be insufficient, or should concerns regarding risk management be raised by agencies, Congress, or stakeholders.
Stage of Federal Adoption | Strategy | Types of Center-of-Government Policy Levers |
---|---|---|
Early – No or few Federal agencies using method | Understand federal opportunities to use method, and identify barriers and challenges | * Connect with early adopters across federal agencies to understand use of agency-specific authorities, identify pain points and lessons learned, and capture case studies (e.g., 2000-2009) * Engage stakeholder community of contractors, experts, researchers, and philanthropy * Look to and learn from use of method in other sectors (such as by philanthropy, industry, or academia) and document (or encourage third-party documentation of) that use and its known benefits and attributes (e.g., April 1999, July 2009) * Encourage research, analysis, reports, and evidence-building by National Academies, academia, think tanks, and other stakeholders (e.g., April 1999, July 2009, June 2014) * Discuss method with OMB Office of General Counsel and other relevant agency counsel * Discuss method with relevant Congressional authorizing committee staff * Host convenings that connect interested federal agency representatives with experts * Support and connect nascent federal “community of interest” |
Early – No or few Federal agencies using method | Build interest among federal agencies | * Designate primary policy point of contact/dedicated staff member in the EOP (e.g., 2009-2017, 2017-2021) * Designate a primary implementation point of contact/dedicated staff at GSA and/or OPM * Identify leads in all or certain federal agencies * Connect topic to other administration policy agendas and strategies * Highlight early adopters within agencies in communications from center-of-government agencies to other federal agencies (and to external audiences) * Offer congressional briefings and foster bipartisan collaboration (e.g., 2015) |
Early – No or few Federal agencies using method | Establish legal authorities and general administration policy | * Engage the OMB Office of General Counsel and OMB Legislative Review Division, as well as other relevant OMB offices and EOP policy councils * Identify existing general authorities and regulations that could support federal agency use of method (e.g., March 2010) * Establish general policy guidelines, including by leveraging Presidential authorities through executive orders or memoranda (e.g., January 2009) * Issue OMB directives on specific follow-on agency actions or guidance to support agency implementation (“M-Memos” or similar) (e.g., December 2009, March 2010, August 2011, March 2012) * Provide technical assistance to Congress regarding government-wide or agency-specific authority (or authorities) (e.g., June-July 2010, January 2011) * Delegate existing authorities within agencies (e.g., October 2011) * Encourage issuance of agency-specific guidance (e.g., October 2011, February 2014) * Include direction to agencies as part of broader Administration policy agendas (e.g., September 2009, 2011-2016) |
Early – No or few Federal agencies using method | Remove barriers and “make it easier” | * Create a central government website with information for federal agency practitioners (such as toolkits, case studies, and trainings) and for the public (e.g., September 2010) * Create dedicated GSA schedule of vendors (e.g., July 2011) * Establish an interagency center of excellence (e.g., September 2011) * Encourage use of interagency agreements on design or implementation of pilot initiatives (e.g., September 2011) * Request agency budget submissions to OMB to support pilot use in President’s budget (e.g., December 2013) |
Adoption well underway – Many federal agencies have begun to use method | Connect practitioners | * Launch a federal “community of practice” with support from GSA for meetings, listserv, and collaborative projects (e.g., April 2010, 2016, June 2019) * Host regular events, workshops, and conferences with federal agency and, where appropriate and allowable, seek philanthropic or nonprofit co-hosts (e.g., April 2010, June 2012, April 2015, March 2018, May 2022) |
Adoption well underway – Many federal agencies have begun to use method | Strengthen agency infrastructure | * Foster leadership buy-in through briefings from White House/EOP to agency leadership, including members of the career senior executive service * Encourage agencies to dedicate agency staff and invest in prize design support within agencies * Encourage agencies to create contract vehicles as needed to support collaboration with vendors/experts * Encourage agencies to develop intra-agency networks of practitioners and to provide external communications support and platforms for outreach * Request agency budget submissions to OMB for investments in agency infrastructure and expansion of use, to include in the President's budget where needed (e.g., 2012-2013), and request agencies otherwise accommodate lower-dollar support (such as allocation of FTEs) where possible within their budget toplines |
Adoption well underway – Many federal agencies have begun to use method | Clarify existing policies and authorities | * Issue updated OMB, OSTP, or agency-specific policy guidance and memoranda as needed based on engagement with agencies and stakeholders (e.g., August 2011, March 2012) * Provide technical assistance to Congress on any needed updates to government-wide or agency-specific authorities (e.g., January 2017) |
Adoption prevalent – Most if not all federal agencies have adopted, with a need to maintain use and momentum over time | Highlight progress and capture lessons learned | * Require regular reporting from agencies to EOP (OSTP, OMB, or similar) (e.g., April 2012, May 2022) * Require and take full advantage of regular reports to Congress (e.g., April 2012, December 2013, May 2014, May 2015, August 2016, June 2019, May 2022, April 2024) * Continue to capture and publish federal-use case studies in multiple formats online (e.g., June 2012) * Undertake research, evaluation, and evidence-building * Co-develop practitioner toolkit with federal agency experts (e.g., December 2016) * Continue to feature promising examples on White House/EOP blogs and communication channels (e.g., October 2015, August 2020) * Engage media and seek both general interest and targeted press coverage, including through external awards/honorifics (e.g., December 2013) |
Adoption prevalent – Most if not all federal agencies have adopted, with a need to maintain use and momentum over time | Prepare for presidential transitions and document opportunities for future administrations | * Integrate go-forward proposals and lessons learned into presidential transition planning and transition briefings (e.g., June 2016-January 2017) * Brief external stakeholders and Congressional supporters on progress and future opportunities * Connect use of method to other, broader policy objectives and national priorities (e.g., August 2020, May 2022, April 2024) |
Phases and timeline of policy actions advancing the adoption of incentive prizes by federal agencies
- Growing number of incentive prizes offered outside government (early 2000s)
At the close of the 20th century, federal use of incentive prizes to induce activity toward targeted solutions was limited, though the federal government regularly used recognition prizes to reward past accomplishment. In October 2004, the $10 million Ansari XPRIZE, first announced in May 1996, was awarded by the XPRIZE Foundation for the successful flights of SpaceShipOne by Scaled Composites. Following the awarding of the Ansari XPRIZE and the extensive resulting news coverage, philanthropists and high-net-worth individuals began to offer prize purses to incentivize action on a wide variety of technology and social challenges. A variety of new online challenge platforms sprang up, and new vendors began offering consulting services for designing and hosting challenges, trends that lowered the cost of prize competition administration and broadened participation among thousands of diverse solvers around the world. This growth in the use of prizes by philanthropists and the private sector increased the federal government’s interest in trying incentive prizes to help meet agency missions and solve national challenges. Actions during this period to support federal use of incentive prizes include:
- EXTERNAL REPORT/ANALYSIS (April 1999): In response to a request from the Clinton-Gore Administration’s National Economic Council, the National Academy of Engineering (NAE), with funding from the National Science Foundation, convened a workshop in April 1999 to “assess the potential value of federally sponsored prizes and contests in advancing science and technology in the public interest” and issued a brief summary report. The report recommended “limited experiments” in the use of federally sponsored incentive prizes, encouraged both Congress and federal agencies “to take a flexible approach to the design and administration” of such prizes, and recommended that their use be “evaluated at specified intervals by the agencies involved to determine their effectiveness and impact.”
- AGENCY-SPECIFIC AUTHORITIES and PILOTS (early 2000s): During this period, only a few federal agencies had (and still have) flexible agency-specific prize authorities that allowed them to pilot the use of incentive prizes to advance their missions, scan markets for new solutions, engage new solvers, and solve long-standing problems. Early examples include:
- Because of prize authority provided to DARPA by Congress under 10 U.S.C. 2374a, enacted as part of the National Defense Authorization Act for Fiscal Year 2000 in October 1999, DARPA was an early federal adopter of incentive prizes, offering the DARPA Grand Challenges, a series of competitions in 2004 and 2005 that demonstrated the capabilities of autonomous vehicles.
- With authority from Congress, NASA began offering its ongoing series of Centennial Challenges starting in 2005, to directly engage the public in the process of advanced technology development related to problems of interest to NASA and the nation.
- Congress also saw opportunity for the Department of Energy (DOE) to make progress on energy challenges through the use of incentive prizes, giving DOE authority through the Energy Independence and Security Act of 2007 to run a prize focused on efficient lighting, called the “L-Prize,” and a series of “H-Prizes” to encourage research into the use of hydrogen as an energy carrier in a hydrogen economy.
- Obama-Biden Administration Seeks to Expand Federal Prizes Through Administrative Action (2009-2010)
From the start of the Obama-Biden Administration, OSTP and OMB took a series of policy steps to expand the use of incentive prizes across federal agencies and build federal capacity to support those open-innovation efforts. Bipartisan support in Congress for these actions soon led to new legislation to further advance agency adoption of incentive prizes. Actions during this period to support federal use of incentive prizes include:
- PRESIDENTIAL DIRECTIVE (January 2009): On January 21, 2009, the first full day of the Obama-Biden Administration, President Barack Obama signed a Memorandum on Transparency and Open Government, committing the Administration to creating a more transparent, participatory, and collaborative government. The memorandum directed that federal agencies “should offer Americans increased opportunities to participate in policymaking and to provide their Government with the benefits of their collective expertise and information.”
- EXTERNAL REPORT/ANALYSIS (July 2009): In July 2009, with funding from the Templeton Foundation, McKinsey issued a report called And the Winner Is… that documented the recent resurgence of incentive prizes and noted that over the past decade total prize purses across the large incentive prizes being offered had tripled to surpass $375 million. This report provided synthesis of learnings from recent prizes. For example, the report found that, “As Ken Davidian, formerly of the NASA Challenges, puts it, there are at least four core rewards that drive participants to compete for prizes: ‘goal, glory, guts, and gold—and gold is usually last.’ Or to be more precise (if less memorable), competitors are motivated by the intrinsic interest of a challenge, the recognition or prestige accompanying a winner, the challenge of the problem-solving process itself, and any material incentive. Which motives matter most, and in what mix, will vary depending on the problem—and the problem solver.”
- INCLUSION IN ADMINISTRATION POLICY AGENDA (September 2009): In addition, in September 2009, President Obama released his Strategy for American Innovation, developed by the National Economic Council (NEC) and OSTP. In that strategy, the President called for federal agencies to “take advantage of the expertise and insight of people both inside and outside the federal government, use high-risk, high-reward policy tools such as prizes and challenges to solve tough problems.”
- OMB DIRECTIVE (December 2009): Responding to President Obama’s open government memorandum, on December 8, 2009, the OMB Director issued an Open Government Directive, which required executive departments and agencies to take specific actions to further the principles established by the open government memorandum. The directive charged the OMB Deputy Director for Management to “issue, through separate guidance or as part of any planned comprehensive management guidance, a framework for how agencies can use challenges, prizes, and other incentive-backed strategies to find innovative or cost-effective solutions to improving open government.” The directive also charged federal agencies to include in agency Open Government Plans “innovative methods, such as prizes and competitions, to obtain ideas from and to increase collaboration with those in the private sector, non-profit, and academic communities.”
- EOP LEADERSHIP ROLES (January 2009-January 2017): To support the development and implementation of these and other open-innovation policies, a member of the OSTP policy staff served as policy lead for open innovation, reporting to OSTP Deputy Director Tom Kalil, with additional leadership backing from OSTP Director John Holdren, the inaugural U.S. Chief Technology Officer (CTO) Aneesh Chopra, and then Deputy U.S. CTO Beth Noveck. During the Obama Administration, this OSTP open-innovation policy role was filled by Robynn Sturm Steffen from 2009-2011, the author (Cristin Dorgelo) from 2011-2014 until she became OSTP Chief of Staff, Jenn Gustetic on detail from NASA from 2014-2016, and Christofer Nelson from 2016 through the end of the Administration in January 2017. Sonal Shah, the inaugural director of the White House Office of Social Innovation and Civic Participation in the Domestic Policy Council (DPC), and her successor Jonathan Greenblatt also provided helpful leadership and led key stakeholder engagement efforts.
- OMB GUIDANCE (March 2010): Consistent with the Open Government Directive and the Strategy for American Innovation, OSTP worked closely with the OMB Office of the Deputy Director for Management and the OMB Office of General Counsel on developing guidance on incentive prizes for federal agencies. In March 2010, OMB issued OMB Memorandum M-10-11, Guidance on the Use of Challenges and Prizes to Promote Open Government. This memorandum included clarifications for federal agencies regarding what authorities they could use to offer prize purses, host and sponsor prize competitions, and engage third parties to operate such competitions. OMB Memorandum M-10-11 also established as Administration policy that agencies should:
- Utilize prizes and challenges as tools for advancing open government, innovation, and the agency’s mission;
- Identify and proactively address legal, regulatory, technical, and other barriers to the use of prizes and challenges;
- Select one or more individuals to identify and implement prizes and challenges, potentially in partnership with outside organizations, and to participate in a governmentwide “community of practice” led by OMB and OSTP; and
- Increase their capacity to support, design, and manage prizes, potentially in collaboration with external partners.
- CONVENING and COMMUNITY OF PRACTICE (April 2010): In April 2010, the White House (OSTP and the DPC Office of Social Innovation and Civic Participation) with the Case Foundation convened experts in incentive prize design and administration to share private-sector success stories with nearly 200 representatives from more than 35 federal agencies. This Summit on Promoting Innovation: Prizes, Challenges and Open Grantmaking served as a formal kickoff to a federal prizes and challenges community of practice, which GSA administered, and for which OSTP and OMB provided strategic direction, substantive agenda setting, and resourcing. With GSA’s support, this community of practice remains more than a decade later a valuable network for federal prize practitioners to connect and exchange promising practices and lessons learned.
- TECHNICAL ASSISTANCE TO CONGRESS TO INFORM NEW LEGAL AUTHORITY (June-July 2010): Throughout this period, OSTP and OMB were collaborating with Congress to advance government-wide prize authority. On June 24, 2010, Senators Mark Pryor and Mark Warner introduced in the 111th Congress S.3530, the Reward Innovation in America Act of 2010. Drawing from this introduced bill, in July 2010, the Senate Commerce Committee approved the America COMPETES Reauthorization Act of 2010 with a provision providing government-wide prize authority. Specifically, P.L. 111-358 added Section 24 to the Stevenson-Wydler Technology Innovation Act of 1980 (15 USC § 3719). On July 27, 2010, the Director of OSTP thanked Senators Pryor and Warner for their leadership in a letter that OSTP also published on its blog, highlighting the steps the Administration was taking to set the stage for agencies to take full advantage of prizes authority should Congress move the authority forward.
- SHARED GOVERNMENT WEBSITE (September 2010 to Present): Responding to directives in M-10-11, with support from OSTP and OMB and with leadership from internal champions, GSA launched a government-wide website, Challenge.gov, in September 2010 to provide one place for citizen solvers to find the challenges being offered by federal agencies. Over time, Challenge.gov developed back-end capabilities to help federal program managers administer certain types of prize competitions and expanded to serve as a knowledge repository for federal program managers looking for more information about designing and administering incentive prizes. Challenge.gov remains the primary online hub for federally hosted prize competitions today.
- Implementing New Government-Wide Prizes Authority Provided by the America COMPETES Act (2011-2016)
During this period of expansion in the federal use of incentive prizes, supported by new government-wide prize authority provided by Congress, the Obama-Biden Administration continued to emphasize its commitment to the model, including as a key method for accomplishing administration priorities related to open government and evidence-based decision making. Actions during this period to support federal use of incentive prizes include:
- NEW AUTHORITIES THROUGH LEGISLATION (January 2011): On January 4, 2011, President Obama signed into law the America COMPETES Reauthorization Act of 2010 (COMPETES Act), granting all agencies broad authority to conduct prize competitions to spur innovation, solve tough problems, and advance their core missions (Public Law 111-358).
- GSA CONTRACT VEHICLE (July 2011 to Present): In July 2011, as called for by the new law, GSA established a contract vehicle—originally, Sub-Schedule 541 4G, now maintained as the Multiple Award Schedule, 541613, Professional Services – Marketing and Public Relations—to help federal agencies access private-sector technical assistance and consulting support for incentive prizes. Because prize competitions were still an emerging practice both inside and outside of the federal government, this contract vehicle allowed federal agencies to access experts and consultants who were building capacity for prizes across sectors and identifying what works in prize design and operations.
- OMB GUIDANCE (August 2011): In August 2011, OMB’s General Counsel and Chief Information Officer issued a memorandum on the new COMPETES prize authority to agency general counsels and CIOs, which OMB developed hand-in-hand with OSTP. The memorandum included a concise summary of the COMPETES Act’s new prizes authorities and requirements, and it provided guidance to agencies in their implementation of the prize authority found in this legislation. It also addressed an array of frequently asked questions raised by agencies, including agency questions about the new authority to conduct prizes up to $50 million with existing appropriations, as well as the new authorities to: accept private-sector funds for the design, administration, or prize purse of a competition; to partner with nonprofits and tap the expertise of for-profits for successful implementation; and to co-sponsor with another agency.
- AGENCY-SPECIFIC POLICIES and DELEGATION OF AUTHORITY (October 2011): Following issuance of this government-wide guidance, with the support of OSTP and OMB, agencies began to establish strategies and policies to further accelerate widespread use of the new prize authority granted to them under COMPETES. For example, the Department of Health and Human Services (HHS) was at the forefront of agency implementation efforts. On October 12, 2011, HHS Secretary Kathleen Sebelius issued a memorandum notifying the Department of the new prize authority provided under the America COMPETES Reauthorization Act, outlining the Department’s strategy to optimize the use of prize competitions, and calling on the heads of HHS operating and staff divisions to forecast their future use of prize competitions to stimulate innovation in advancing the agency’s mission. Secretary Sebelius also issued a formal delegation of the new prize authority in the Federal Register.
- CENTER OF EXCELLENCE and INTERAGENCY AGREEMENTS (November 2011): In November 2011, OSTP worked with NASA to launch a Center of Excellence for Collaborative Innovation (CoECI), which was co-founded by Jason Crusan and Jeff Davis. As NASA continued to mature the use of challenges and crowdsourcing methods as a new tool in its toolkit, OSTP encouraged NASA to assist other federal agencies in the use of crowdsourced challenges to solve tough, mission-critical problems. This included the new Center working with federal agencies on challenge design through interagency agreements, and supporting those agencies with accessing NASA’s contracts with prize platforms, including for algorithm and apps challenges and for ideation challenges.
- OMB GUIDANCE (March 2012): Throughout 2011, OSTP met regularly with agencies, holding regular phone calls and in-person meetings—often at agency offices—with agency leadership, counsel, and program managers who were considering the use of prizes or undertaking prize design. In these interactions, OSTP listened to agencies, asked clarifying questions, and tracked common issues and challenges. For example, OSTP heard in these agency conversations a variety of agency questions regarding the Paperwork Reduction Act and its intersection with prize authorities. OSTP then collaborated with OMB to assess these issues and determine how clarifying guidance could support agency implementation of prizes. On March 1, 2012, OMB Office of Information and Regulatory Affairs (OIRA) issued a Frequently Asked Questions summary to address common agency questions.
- AGENCY REPORTING TO EOP and REPORT TO CONGRESS (April 2012): In April 2012, the Obama-Biden Administration released a first report to Congress on federal use of incentive prizes in fiscal year 2011, as required by the America COMPETES Act. These reports to Congress and related reporting from federal agencies to OSTP—initially on an annual cycle and now biennial—have been an essential mechanism for tracking federal agency prize activity and capturing case studies and outcomes. The work led by OSTP with GSA to standardize how federal agencies tracked indicators and metrics regarding federal incentive prizes supported both progress tracking over time as well as storytelling by the Administration and prize supporters in Congress. The reports have also recorded steps taken by OSTP, GSA, and other agencies like NASA to build government-wide capacity and infrastructure related to prizes, and the steps taken within agencies to establish policies and processes to ease the use of prizes and remove barriers.
- CONVENING and ONLINE CASE STUDIES AND RESOURCES (June 2012): In June 2012, OSTP, the Case Foundation, and the Joyce Foundation hosted a day-long conference called Collaborative Innovation: Public Sector Prizes, which brought together hundreds of public- and private-sector practitioners to share case studies, research, and lessons learned. The event was partially live-streamed and produced a large number of case studies and video resources that were made available on the Case Foundation website, now archived.
- INCLUSION IN ADMINISTRATION POLICY AGENDA (2011-2016): The use of incentive prizes was included and encouraged throughout a variety of Administration policy agendas, year over year. For example, commitments related to prizes and challenges were included in each of the United States Open Government biennial National Action Plans issued during the Obama-Biden Administration, to maintain momentum and highlight this body of work to the international open government community.
- GUIDANCE ON AGENCY BUDGET PROPOSALS (2012-2013): OSTP also collaborated with OMB to ensure that agencies considered and submitted to OMB budget proposals to expand their use of incentive prizes. On May 18, 2012, OMB issued Memorandum M-12-14 on the Use of Evidence and Evaluation in the 2014 Budget, which directed, “Agencies should also consider using the new authority under the America COMPETES legislation to support incentive prizes of up to $50 million. Like Pay for Success, well designed prizes and challenges can yield a very high return on the taxpayer dollar.” For agencies seeking to learn more about prizes, the memo noted, “The Office of Science and Technology Policy has created a ‘community of practice’ for agency personnel involved in designing and managing incentive prizes.” In Memorandum M-13-17 on Next Steps in the Evidence and Innovation Agenda, issued on July 26, 2013, OMB encouraged federal agencies to develop budget proposals “that focus Federal dollars on effective practices while also encouraging innovation in service delivery” and specifically mentioned incentive prizes as an encouraged pay-for-performance strategy.
- REPORT TO CONGRESS (December 2013): On December 27, 2013, OSTP released a second report to Congress on federal use of incentive prizes under the COMPETES Act authority and other authorities in fiscal year 2012.
- LIFTING UP EFFORT FOR EXTERNAL RECOGNITION (December 2013): On January 24, 2014, Harvard University’s Ash Center for Democratic Governance and Innovation announced Challenge.gov as winner of the prestigious 2013 “Innovations in American Government Award” in honor of exemplary service and creativity in the public interest. The Obama Administration through GSA nominated Challenge.gov for this external honor to raise awareness and increase attention for federal prizes.
- AGENCY-SPECIFIC POLICIES (February 2014): Agencies continued to state and clarify their internal policies and processes related to incentive prizes. For example, on February 12, 2014, the NASA Administrator issued an agency-wide policy directive—still in place today and last updated in June 2023—to encourage “the use of challenges, prize competitions, and crowdsourcing activities at all levels of the Agency to further its mission.”
- REPORT TO CONGRESS (May 2014): On May 7, 2014, OSTP released a third report to Congress on federal use of incentive prizes, focused on fiscal year 2013. This report found an 85 percent annual increase in prizes conducted under all legal authorities; the number of prizes conducted under the authority provided by COMPETES increased by over 50 percent compared to fiscal year 2012 (and nearly six-fold compared to 2011); and the size of agency-sponsored prize purses grew as well, with 11 prizes offering purses of $100,000 or greater in fiscal year 2013.
- EXTERNAL REPORT/ANALYSIS (June 2014): On June 19, 2014, Deloitte University Press released a report, informed by research involving prize practitioners across government, covering in depth the lessons learned and best practices identified from over 350 prizes conducted by the federal government and over 50 prizes conducted by state, local, and philanthropic leaders. The report, titled The Craft of Prize Design: Lessons from the Public Sector, was produced by Doblin (Deloitte’s innovation practice) in collaboration with Bloomberg Philanthropies, the Case Foundation, the Joyce Foundation, the Knight Foundation, the Kresge Foundation, and the Rockefeller Foundation.
- REPORT TO CONGRESS (May 2015): On May 8, 2015, OSTP released a fourth report to Congress on federal use of incentive prizes, focused on fiscal year 2014. This report highlighted steps agencies were taking to support the use of incentive prizes across their components and divisions, including streamlining access to vendors through contract vehicles to support the design and implementation of prize competitions, creating internal working groups, designating points of contact, and creating internal and external communications tools.
- CONVENING and WHITE HOUSE MICROSITE (October 2015): On October 7, 2015, five years after the launch of Challenge.gov, the White House, in conjunction with the Case Foundation, the Joyce Foundation, and Georgetown University, hosted a conference called “All Hands on Deck: Solving Complex Problems through Prizes and Challenges” to convene federal prize practitioners and catalyze the next generation of ambitious federal prizes. The following day, GSA brought together the federal community to recognize progress with an awards ceremony. By that point, more than 440 federal prizes had been offered, engaging more than 200,000 citizen solvers. Also in October 2015, OSTP and DPC collaborated on the launch of a WhiteHouse.gov microsite with information on federal use of incentive prizes.
- FOSTER BIPARTISAN CONGRESSIONAL SUPPORT (2015): During this period, bipartisan support in Congress for the use of incentive prizes by federal agencies continued. In 2015, a Congressional Prize Caucus was launched with bipartisan sponsorship to increase awareness and encourage the use of prize competitions. Numerous pieces of legislation supporting prize competitions to fuel medical research were also passed (e.g., the 21st Century Cures Act [Public Law 114-255] included a provision on EUREKA Prize Competitions [42 U.S.C. 284 et seq.] that authorized the National Institutes of Health in the Department of Health and Human Services to conduct prize competitions to fuel medical research).
- REPORT TO CONGRESS and TRAINING (August 2016): In August 2016, OSTP released a fifth report to Congress on the federal use of prizes in fiscal year 2015. OSTP noted that Challenge.gov had, by August 2016, “featured more than 700 prize competitions and challenges—conducted under the authority provided by COMPETES and other authorities—from more than 100 Federal agencies, departments, and bureaus.” OSTP and GSA together had engaged more than 1,500 federal professionals in training on prize design and operations.
- PRACTITIONER TOOLKIT (December 2016): In December 2016, building on the robust body of federal knowledge on prizes, OSTP and GSA, with the federal Community of Practice on Prizes and Challenges, issued a practitioner’s toolkit on Challenge.gov with extensive how-to information and practical case studies. The toolkit was developed by an interagency team using insights drawn from experts across federal agencies.

- Maintaining Momentum in New Presidential Administrations
Support for federal use of incentive prizes continued beyond the Obama-Biden Administration’s foundational efforts. Leadership by federal agency prize leads was particularly important in sustaining this momentum from administration to administration. Actions during the Trump-Pence and Biden-Harris Administrations to support federal use of incentive prizes include:
- INTEGRATION INTO PRESIDENTIAL TRANSITION PLANNING (June 2016 – January 2017): As the end of the Obama-Biden Administration neared, OSTP worked with GSA and federal agency prize leads to prepare for the upcoming presidential transition and to ensure the agency leads felt prepared, empowered, and supported with agency-level policies and processes so they could continue to design and launch prize competitions as part of their regular course of business. OSTP also integrated incentive prizes into its transition communications as a recommendation for the Trump-Pence Administration to continue. For example, in its list of 100 examples of the Obama-Biden Administration putting science in its rightful place, issued in June 2016, OSTP included the following:
Harnessed American ingenuity through increased use of incentive prizes. Since 2010, more than 80 Federal agencies have engaged 250,000 Americans through more than 700 challenges on Challenge.gov to address tough problems ranging from fighting Ebola, to decreasing the cost of solar energy, to blocking illegal robocalls. These competitions have made more than $220 million available to entrepreneurs and innovators and have led to the formation of over 275 startup companies with over $70 million in follow-on funding, creating over 1,000 new jobs.
In addition, in January 2017, the Obama-Biden Administration OSTP mentioned the use of incentive prizes in its public “exit memo” as a key “pay-for-performance” method in agency science and technology strategies that “can deliver better results at lower cost for the American people,” and also noted:
Harnessing the ingenuity of citizen solvers and citizen scientists. The Obama Administration has harnessed American ingenuity, driven local innovation, and engaged citizen solvers in communities across the Nation by increasing the use of open-innovation approaches including crowdsourcing, citizen science, and incentive prizes. Following guidance and legislation in 2010, over 700 incentive prize competitions have been featured on Challenge.gov from over 100 Federal agencies, with steady growth every year.
- TECHNICAL ASSISTANCE TO CONGRESS ON UPDATES TO AUTHORITIES (January 2017): On January 6, 2017, the American Innovation and Competitiveness Act (AICA) was signed into law by President Obama [Public Law 114-329]. Reflecting extensive, multi-year staff-level engagement among Congressional staff and experts at the Obama-Biden Administration’s OSTP and OMB, the AICA updated the government-wide authority that previously had been granted to federal agencies by the COMPETES Act. The aim of the updates was to encourage more ambitious interagency and cross-sector partnerships (and co-funding, with explicit authority to solicit funds for federal prizes from beyond the federal government) in the design and administration of prize competitions, and to eliminate unnecessary administrative burden, among other changes.
- EOP LEADERSHIP ROLES (2017-2021): During the Trump-Pence Administration, support for federal agency use of prizes and challenges continued. In the EOP, Matt Lira, then Special Assistant to the President for Innovation Policy in the White House Office of American Innovation, Michael Kratsios, then Deputy Assistant to the President and Deputy U.S. Chief Technology Officer, and others in OSTP engaged with agencies to maintain momentum and identify new opportunities for the effective application of the COMPETES Act prize authority.
- CONVENING (March 2018): In March 2018, the White House hosted a “Fostering Innovation with Prizes and Challenges” roundtable with then Secretary of Energy Rick Perry and other leaders. The White House confirmed to participants that, “the Trump Administration strongly supports efforts by Federal agencies to host prizes and challenges, particularly those that leverage COMPETES Act authority, to address some of the Nation’s most pressing issues.”
- REPORT TO CONGRESS AND COMMUNITY OF PRACTICE (June 2019): During the Trump-Pence Administration, the federal prizes and challenges community of practice supported by GSA continued to maintain a network and active email list of more than 730 current and prospective challenge managers across the federal government. These agency prize practitioners were and continue to be essential to forward progress in the use of incentive prizes, in the federal government and beyond. OSTP’s fiscal year 2017-2018 biennial report to Congress on agency use of prizes, issued in June 2019 and the sixth such report, noted that, “monitoring the proliferation of State and local crowdsourcing initiatives, Challenge.gov expanded the email list to State and local government prize practitioners in 2018, inviting exchange and opening avenues for partnership.”
- CONNECTIONS TO NATIONAL PRIORITIES AND LEVERAGING WHITE HOUSE COMMUNICATION CHANNELS (August 2020): As the nation faced the COVID-19 pandemic, and as federal agencies responded to emerging challenges and sought to meet urgent needs during the ongoing public health crisis, they turned to incentive prizes as one tool for connecting with solvers across the country and identifying promising solutions. On August 12, 2020, then Director of OSTP Kelvin Droegemeier issued a memorandum to federal agencies highlighting nine prize competitions launched by agencies related to COVID-19 and calling on agencies to “double-down” on their deployment of prizes to meet the challenges of COVID-19. The memo also noted that OSTP was convening open innovation working groups to support these efforts and planning to host a series of webinars with Challenge.gov. The White House also issued a Fact Sheet communicating agency prize competition and open innovation activities related to COVID-19, with incentive prizes being used to catalyze advances in testing technologies, computational models, mental health services, and ventilators, and to address the needs of frontline health care workers.
- CONVENING (May 2022): OSTP and GSA collaborated on hosting an Open Innovation Forum to bring together practitioners of incentive prizes, citizen science, and crowdsourcing from across government and other sectors.
- AGENCY REPORTING TO EOP, REPORT TO CONGRESS, AND CONNECTIONS TO POLICY AGENDAS (May 2022 and April 2024): The Biden-Harris Administration has supported the continued use of incentive prizes by federal agencies as part of its commitment to expanding and improving public engagement in the work of the federal government.
- On May 4, 2022, OSTP released its seventh report to Congress on federal incentive prize competitions (and the second that also included a focus on citizen science and crowdsourcing activities alongside incentive prizes). In releasing the report, OSTP noted in a blog post, “This new report details recent Federal efforts to stimulate innovation and partnership and expand the American public’s participation in science. These developments are aligned with the Biden-Harris Administration’s commitment to advancing equity in the science and technology ecosystem, including OSTP’s Time is Now Initiative, and recently released Equity Action Plan.” This report also reflected a new and more robust survey approach used by OSTP and GSA to collect information from agencies about federal incentive prizes, as well as continued efforts among federal agencies to streamline the use of incentive prizes and reduce or remove barriers.
- On April 16, 2024, OSTP released its eighth report to Congress on Federal incentive prize competitions (and the third that also included a focus on citizen science and crowdsourcing). The report connected the continued growth in the use of incentive prizes by federal agencies to a broader Administration-wide “movement towards improving and expanding participation and engagement in not only government research and development, but also government processes more broadly.”
By the end of fiscal year 2022, federal agencies had hosted over 2,000 prize competitions on Challenge.gov since its launch in 2010. OSTP, GSA, and NASA CoECI had provided training to well over 2,000 federal practitioners during that same period.
Number of Federal Prize Competitions by Authority, FY14-FY22
Source: Office of Science and Technology Policy, Implementation of Federal Prize and Citizen Science Authority: Fiscal Years 2021-22 (April 2024).
Federal Agency Practices to Support the Use of Prize Competitions
Source: Office of Science and Technology Policy, Implementation of Federal Prize and Citizen Science Authority: Fiscal Years 2019-20 (March 2022).
CONCLUSION
Over the span of a decade, incentive prizes had moved from a tool used primarily outside of the federal government to one used commonly across federal agencies, due to a concerted, multi-pronged effort led by policy entrepreneurs and incentive prize practitioners in the EOP and across federal agencies, with bipartisan congressional support, crossing several presidential administrations. And yet, the work to support the use of prizes by federal agencies is not complete: there remains extensive opportunity to further improve the design, rigor, ambition, and effectiveness of federal prize competitions; to move beyond “ideas challenges” to increase the use of incentive prizes to demonstrate technologies and solutions in testbeds and real-world deployment scenarios; to train additional federal personnel on the use of incentive prizes; to learn from the results of federal incentive prize competitions; and to apply this method to address pressing and emerging challenges facing the nation.
In applying these lessons to efforts to expand the use of other promising methods, policy entrepreneurs in center-of-government federal agencies should be strategic in the policy actions they take to encourage and scale method adoption: first by seeking to understand the adoption maturity of that method (as well as the relevant policy readiness), and then by undertaking interventions appropriate for that stage of adoption. With attention and action by policy entrepreneurs to address factors that discourage risk-taking, experimentation, and piloting of new methods, federal agencies will be able to draw on a further-expanded strategic portfolio of methods to catalyze the development, demonstration, and deployment of technology and innovative solutions to meet agency missions, solve long-standing problems, and address grand challenges facing our nation.
Making the Most of OSHA’s Extreme Heat Rule
KEY TAKEAWAYS
- OSHA’s proposed heat safety standard is a critical step towards protecting millions of workers, but its success depends on substantial infrastructure investment.
- Effective implementation requires a multifaceted approach that improves workforce development, employer and industry resources, regulatory capacity, healthcare access, and community support.
- The federal government plays a pivotal role through funding, grants, technical assistance, and interagency collaboration to protect workers from the effects of extreme heat.
- Investing in heat safety infrastructure offers multiple benefits: lives saved, injuries prevented, economic protection, and enhanced climate resilience.
- Challenges to implementation include regulatory delays, insufficient funding, financial constraints for small businesses, the diversity of work settings, rural infrastructure limitations, and lack of awareness.
- Overcoming these challenges requires dedicated funding sources, financial incentives, tailored solutions, and comprehensive education campaigns.
- The success of the OSHA standard hinges on prioritizing these infrastructure investments to create a comprehensive, well-resourced system for heat safety.
This article is informed by extensive research and stakeholder engagement conducted by the Federation of American Scientists, including a comprehensive literature review and interviews with experts in the field. Much of this work also shaped our recent publication, which can be found here.
The Imperative for Infrastructure Investment
As climate change intensifies, the need for robust heat safety measures for outdoor workers has never been more pressing. The Occupational Safety and Health Administration has taken a significant step forward in protecting workers from extreme heat by proposing a new safety standard. The proposed rule aims to protect approximately 36 million workers in indoor and outdoor settings from heat-related illnesses and fatalities. As we move forward, the rule’s success hinges on substantial investments to bridge the gap between policy and practice. It is crucial to examine how the federal government can create the necessary infrastructure to support and maximize the effectiveness of this potentially groundbreaking standard.
The need for these investments is underscored by the significant economic and human costs of heat-related illnesses and fatalities. A study by the Atlantic Council estimates that extreme heat costs the U.S. economy $100 billion annually, with agricultural workers being among the most affected. Proper implementation of safety measures could potentially prevent many of these fatalities and reduce substantial economic losses.
Key Areas for Infrastructure Development to Meet OSHA’s Heat Safety Rules
The outdoor occupational sector, employing tens of millions of workers across diverse landscapes and industries, faces unique challenges in properly implementing heat safety measures. From vast open fields to enclosed processing facilities, the infrastructure needs are as varied as the sector itself. Without targeted investments, the OSHA standard risks becoming an unfunded mandate, unable to fulfill its life-saving potential.
The effective implementation of OSHA’s proposed standard requires a multifaceted approach to infrastructure development. To maximize the impact of the proposed rule, investments must be strategically directed across several key areas, creating a robust framework that supports the standard’s goals and protects outdoor workers across diverse settings and conditions. These areas represent a broad overview and are not exhaustive; comprehensive stakeholder engagement is essential to tailor solutions to specific needs across different states, regions, industries, and employers.
Workforce
Developing a resilient and well-prepared workforce is a cornerstone of effective safety measures. Key investments in training, access to facilities, and health monitoring ensure that workers are equipped to handle extreme heat conditions, safeguarding their health and productivity.
- Training & Education. Developing multilingual, interactive training modules accessible to all workers is crucial. These programs must include ongoing education to ensure workers are continually updated on best practices for heat safety.
- Access to Infrastructure. Installing hydration stations and shaded rest areas is essential to provide necessary relief from extreme heat. These facilities enable workers to stay hydrated and take breaks, significantly reducing the risk of heat exhaustion and heat stroke.
- Personal Protective Equipment. Providing cooling vests, lightweight clothing, and sunscreen to protect workers from heat stress is another critical component. PPE must be tailored to the specific needs of workers, offering protection without hindering productivity.
- Health Insurance. Ensuring workers have access to adequate health insurance is crucial, particularly for those in rural and underserved areas. This includes addressing the unique challenges faced by workers with complex immigration statuses, who may be hesitant to seek medical care or face barriers in obtaining insurance coverage.
- Awareness. Implementing acclimatization programs and regular health screenings can help monitor workers’ health and identify early signs of heat stress. This includes educating workers about recognizing early signs of heat stress in themselves and colleagues, and understanding the importance of gradual adaptation to hot working conditions.
- Migrant Worker Vulnerabilities. Undocumented workers face unique challenges in accessing heat safety protections, such as fear of retaliation for reporting unsafe conditions, which can lead to underreporting of incidents. This vulnerability highlights the need for stronger protections and outreach strategies specifically tailored to this population.
Employer & Industry
Employers and industries play a critical role in implementing heat safety standards. By investing in infrastructure, regulatory compliance, and technological innovations, they can create safer working environments and ensure the sustainability of their operations.
- Financial Assistance. Offering grants, subsidies, and tax incentives can support employers in implementing necessary safety measures. Financial support can alleviate the burden on small and medium-sized enterprises, ensuring that all employers can invest in heat safety infrastructure.
- Physical Infrastructure. Employers must invest in the necessary infrastructure, including hydration stations, shaded rest areas, and cooling systems. These investments are essential for creating a safe working environment and ensuring compliance with the proposed standards.
- Regulatory Compliance Support. Developing clear guidelines and compliance tools can help employers adhere to the new standards. Providing technical assistance and resources for compliance can simplify the process and encourage widespread adoption of safety measures.
- Technology & Innovation. Utilizing weather monitoring systems, wearable heat sensors, and mobile health applications can enhance worker safety. These technologies enable real-time tracking of heat exposure and facilitate timely interventions, reducing the risk of heat-related illnesses.
- Rural Infrastructure. Many agricultural operations are in rural areas with limited resources and infrastructure. This includes a lack of nearby healthcare facilities, making it difficult to quickly respond to heat-related illnesses in the workplace. Investments in rural infrastructure and targeted support can address these limitations.
Regulatory Agencies
Regulatory agencies are essential in enforcing heat safety standards. Increased resources, staffing, and technical expertise, along with robust data collection and public outreach, are necessary to support compliance and drive continuous improvement in safety measures.
- Resources & Staffing. Adequate staffing is essential to enforce the new standards effectively. Increased financial resources would support hiring additional staff, enhance the technological capabilities for monitoring compliance, and ensure that there are adequate resources to investigate and address non-compliance.
- Training & Expertise. Ensuring regulatory agencies possess the necessary technical and operational expertise through ongoing training for inspectors and regulatory staff to stay updated on the latest heat safety technologies, practices, and research.
- Data Collection & Analysis. Developing incident reporting systems, syndromic surveillance, and integration of data with a centralized health and safety database can inform policy decisions and improve safety measures.
- Public Outreach & Education. Implementing awareness campaigns and community engagement initiatives, and distributing educational materials, can increase awareness of heat safety practices and requirements.
- Research & Development. Funding for research collaborations with academic institutions and pilot programs to test new heat safety technologies and strategies is vital.
- Whistleblower Protections. To ensure the effectiveness of heat safety measures, it’s crucial that all workers, including undocumented workers, can report dangerous conditions without fear of retaliation. Strengthening and enforcing whistleblower protections is essential to create a culture of safety and compliance.
Healthcare
A robust healthcare infrastructure is vital to support the prevention, early detection, and treatment of heat-related illnesses among outdoor workers. Investments in medical facilities, telemedicine, emergency response systems, and healthcare worker training are crucial to providing timely and effective care.
- Access to Healthcare. Strengthening access to healthcare is crucial, especially in rural and underserved areas. This involves expanding medical facilities and ensuring workers have access to qualified healthcare professionals and affordable treatment options tailored to heat-related conditions.
- Telemedicine Infrastructure. Developing robust telemedicine platforms enables remote consultations for workers in remote areas. This provides timely healthcare interventions without the need for extensive travel.
- Emergency Response Systems. Bolstering emergency response capabilities ensures that medical aid is swiftly available during critical heat-related incidents. This reduces potential health complications and improves outcomes for affected workers.
- Healthcare Worker Training. Training healthcare professionals in the specifics of heat-related illnesses prepares them to offer effective treatment and preventative care. This enhances the overall response to heat stress conditions and improves patient outcomes.
- Data Sharing & Coordination. Creating data-sharing frameworks between healthcare providers, emergency services, and public health agencies ensures a coordinated response to heat-related health issues. This enhances overall healthcare efficacy and enables better tracking and management of heat-related incidents.
Community & Advocacy Groups
Community and advocacy groups play a pivotal role in bridging the gap between policy and practice. By supporting local networks, grassroots education programs, and worker advocacy efforts, these groups can significantly enhance the effectiveness of heat safety initiatives. Their involvement ensures that programs are culturally appropriate, widely understood, and effectively implemented on the ground.
- Worker Education. Implementing wide-reaching education and advocacy programs helps raise awareness about heat risks. These efforts promote community-wide preventive measures and empower workers to protect themselves.
- Advocacy. Ensuring direct worker representation in policy discussions and implementation planning is crucial. Their firsthand experiences are invaluable in creating effective, practical safety measures that address real-world challenges.
- Local Heat Safety Networks. Supporting the creation of community networks ensures the distribution of heat safety resources. These networks enhance preparedness and response to heat risks at the local level.
- Worker Advocacy Support. Providing resources to advocacy groups enables effective representation of workers’ safety interests. This ensures that policies are worker-centered and address the actual needs of those most affected by heat hazards.
- Community Resilience Planning. Collaborating with community groups to develop localized resilience strategies strengthens community preparedness against heat impacts. This approach integrates workplace safety measures with broader community resilience efforts.
The Federal Government’s Role in Facilitating Investments
Successful implementation of OSHA’s heat safety standard requires substantial federal support and coordination. The government must actively facilitate and incentivize necessary investments to create a robust heat safety infrastructure. By leveraging its resources, the federal government can catalyze nationwide improvements. Key actions include:
- Program Investment. The federal government must invest significantly in agencies like OSHA and HHS to enhance their capacity to implement and enforce the safety program. This includes financial resources for hiring additional staff, improving technological capabilities, and offering comprehensive training and support to employers.
- Providing Financial Incentives. It should provide targeted grants, subsidies, and tax incentives. These financial aids will alleviate the burden on small and medium-sized enterprises, fostering widespread adoption of advanced heat safety measures.
- Capacity Building. It must develop and support comprehensive educational programs and training workshops to enhance the capabilities of the workforce. This will ensure that workers are well-informed and equipped to effectively navigate and implement complex safety regulations.
- Public-Private Partnerships. It must encourage collaboration between the public sector and private enterprises, leveraging private innovation alongside public resources to ensure that safety solutions are comprehensive and widely accessible.
- Interagency Coordination. This involves pooling resources, expertise, and efforts from diverse federal agencies to support and enforce the heat safety regulations efficiently. Agencies should identify and allocate resources within their scope to contribute to a broad-based support network—ranging from funding and manpower to specific programmatic initiatives, as well as data-sharing and surveillance.
- Overcoming Bureaucratic Inertia. Delays and resistance within government agencies can impede the timely adoption and enforcement of new regulations. Streamlined processes and clear mandates can help overcome this inertia.
The Benefits of Investing in Heat Safety Infrastructure
Investing in heat safety infrastructure yields numerous benefits, including:
- Lives Saved, Improved Worker Health & Safety. Investing in proper heat safety infrastructure significantly reduces the incidence of heat-related illnesses, such as heat exhaustion and heat stroke, which can be fatal. This reduction cascades into numerous health and safety benefits:
- The most immediate and crucial benefit is the preservation of human life and health
- Enhances workplace safety culture
- Reduces long-term health complications from chronic heat exposure
- Enables better management of pre-existing health conditions exacerbated by heat
- Improves public health outcomes in heat-vulnerable communities
- Reduces inequality by protecting vulnerable worker populations
- Example: a study reported a 91% decrease in heat-related illnesses following the implementation of safety measures.
- Economic Benefits. Heat safety investments stimulate economic growth through multiple channels, creating a positive ripple effect across businesses and communities. Key economic advantages include:
- Increased workforce productivity and efficiency
- Reduced absenteeism and turnover rates
- Stimulation of local economies through infrastructure investments
- Reduced healthcare costs for both employers and the broader healthcare system
- Improved job satisfaction and worker morale
- Enhanced employer reputation and ability to attract/retain talent
- Example: the same study saw heat-related illness claims drop from 30 per 1,000 workers to zero, eliminating workers’ compensation claims entirely.
- Climate Resilience. As global temperatures rise, building infrastructure to withstand extreme heat conditions becomes crucial for overall climate resilience. This proactive approach offers several strategic advantages:
- Increases adaptability to rising global temperatures
- Enables integration with broader climate adaptation strategies
- Reduces energy consumption through efficient cooling methods
- Enhances business continuity during extreme weather events
- Reduces risk of legal liabilities and regulatory penalties
- Enhances organizational preparedness for climate change impacts
Moving Forward
As we face the escalating challenges of climate change, the urgency to protect our workforce cannot be overstated. The proposed OSHA heat safety standard marks a crucial advancement in safeguarding our agricultural workers from increasingly extreme heat conditions. While some may express concerns about the costs and regulatory burden of these investments, it is important to consider the long-term benefits. The initial expenses are outweighed by reduced healthcare costs, increased productivity, and avoided workers’ compensation claims. These measures protect businesses from potential legal liabilities and reputational damage associated with worker heat-related illnesses or fatalities. Moreover, investing in federal infrastructure to support this standard is a strategic imperative that will yield significant returns in public health, economic productivity, and climate resilience.
By thoughtfully allocating resources, the federal government can create a powerful framework for implementing and maximizing the impact of the proposed standard. The health and safety of millions of workers, particularly in high-risk sectors like agriculture, depend on our ability to create a comprehensive, well-resourced system. Every stakeholder from policymakers to industry leaders must now rise to the occasion. It is imperative that we channel collective efforts and resources before another heatwave claims more lives. The consequences of inaction are too severe to ignore.
For specific actions you can take to protect our outdoor workers, please refer to the strategies outlined in Appendix A: Call to Action Guide.
Appendix A. Call to Action Guide
This guide offers strategies for various stakeholders to support and enhance the implementation of OSHA’s heat safety rule.
For Policymakers
- Prioritize Funding. Expedite allocation of funds for heat safety infrastructure development.
- Facilitate Collaboration. Promote interagency cooperation to streamline rule implementation.
- Supportive Legislation. Enact laws that reinforce and expand heat safety protections.
- Oversight. Conduct regular reviews of the rule’s implementation and effectiveness.
For Employers
- Proactive Adoption. Implement heat safety measures ahead of the OSHA rule finalization.
- Infrastructure Investment. Allocate resources for necessary heat safety equipment and facilities.
- Training Programs. Develop comprehensive heat safety education for all employees.
- Best Practices. Engage with industry associations to share effective strategies.
For Workers and Advocacy Groups
- Active Participation. Engage in public comment periods and local heat safety initiatives.
- Collaboration. Work with employers to identify and address workplace heat risks.
- Community Education. Raise awareness about heat safety rights and available resources.
- Reporting. Encourage the use of whistleblower protections to report unsafe conditions.
For Healthcare Providers
- Emergency Preparedness. Enhance readiness for heat-related illnesses, especially in underserved areas.
- Ongoing Training. Participate in regular updates on heat illness prevention and treatment.
- Community Outreach. Partner with employers and local organizations to promote heat safety awareness.
- Data Sharing. Contribute to heat-related illness surveillance efforts to inform policy and practice.
For Researchers and Academic Institutions
- Effectiveness Studies. Evaluate various heat safety measures and emerging technologies.
- Innovation. Develop new solutions for heat stress prevention and management.
- Industry Partnerships. Collaborate with businesses to apply research findings in real-world settings.
- Policy Guidance. Provide evidence-based recommendations to inform future regulations.
Understanding the U.S. Bioeconomy: Agency Perspectives
The U.S. bioeconomy—defined by the National Institute of Standards and Technology (NIST) as “economic activity derived from the life sciences, particularly in the areas of biotechnology and biomanufacturing, including industries, products, services, and the workforce” and valued by some at ~$1 trillion—has been a major focus of policy development over the past few years. These policy advances include the White House Executive Order on “Advancing Biotechnology and Biomanufacturing Innovation for a Sustainable, Safe, and Secure American Bioeconomy” (Bioeconomy EO), the CHIPS & Science Act, and the Inflation Reduction Act (IRA). In March 2024, the Office of Science and Technology Policy (OSTP) announced the launch of the National Bioeconomy Board (NBB). The board will “partner across the public and private sectors to advance societal well-being, national security, sustainability, economic productivity, and competitiveness through biotechnology and biomanufacturing,” highlighting the Biden Administration’s commitment to future-proofing an economically sustainable U.S. bioeconomy.
Despite these advances, the vast intersectionality inherent to the bioeconomy (e.g., with health, clean energy, national security, climate change, economic development) poses unique challenges for the U.S. government. This complexity makes it difficult for the various agencies to coordinate and even more difficult for the general public to understand the government’s approach to the bioeconomy. Nonetheless, to maintain the continued growth within the bioeconomy that has resulted from these policy advances, it will be imperative to clarify a strategic vision that coordinates and publicizes governmental efforts that support the burgeoning U.S. bioeconomy.
The NBB can play an important role in promoting this strategic vision. As directed by the Bioeconomy EO, the Executive Office of the President established the NBB to promote interagency coordination and collaboration on the bioeconomy. The NBB is co-chaired by OSTP, the Department of Commerce (DOC), and the Department of Defense (DOD), and nine other agencies round out the board. Other agencies not represented on the NBB itself, including the Environmental Protection Agency (EPA), work with the NBB through various working groups and play an integral role.
To understand the range of governmental priorities for the bioeconomy, the overarching strategy, the work underway, the various programs within the agencies, and the role of environmental sustainability, our team at the Federation of American Scientists (FAS) spoke with key agencies represented on the NBB to collect their perspectives.
The perspectives summarized below demonstrate that the agencies align bioeconomy-related initiatives to their varied mission areas and, through the NBB and other interagency activities, are working together to develop a shared vision. However, the summaries also show the diversity in focus that informs how agencies approach the bioeconomy. The agency views encompass the broader bioeconomy landscape, including biotechnologies from commodity fuels and agriculture to individualized therapeutics, and biomanufacturing solutions from biomass production to final product. This range highlights both the important role that each agency plays in supporting the U.S. bioeconomy as well as the challenge in coordinating their activities and programs across the federal government.
Approach
To collect perspectives on the U.S. bioeconomy from the agencies represented on the NBB, FAS conducted semi-structured interviews with key NBB officials from OSTP, DOC, DOD, the Department of Energy (DOE), the Department of Health and Human Services (HHS), and the U.S. Department of Agriculture (USDA) from May 2024 through June 2024. With the exception of USDA, all agency interviews were conducted over Zoom and answers were documented by note-taking. All summaries have been reviewed by agency representatives to confirm consent and validity. The USDA perspective was summarized using publicly available reports and has also been confirmed for validity by an agency representative.
Perspectives from these agencies on the Bioeconomy EO deliverables, bioeconomy-related programs, coordination, goals, hurdles, and the role of environmental sustainability are summarized below. The full list of questions used in the semi-structured interviews can be found in Appendix A.
Agency Perspectives
Office of Science & Technology Policy
For OSTP’s perspective, FAS conducted a semi-structured interview with Dr. Sarah Glaven, principal assistant director for biotechnology and biomanufacturing.
The Office of Science & Technology Policy plays an important role in interagency coordination for topics, like the bioeconomy, that cut across many different agencies, and is one of the co-chairs for the NBB. In adherence with the Bioeconomy EO, OSTP has coordinated interagency efforts and published several reports on the bioeconomy: Bold Goals for U.S. Biotechnology and Biomanufacturing, Building the Bioworkforce of the Future, and Visions, Needs & Proposed Actions for Data for the Bioeconomy Initiative. OSTP is currently working with interagency groups on several activities, including one that recently published a report, in conjunction with USDA and other agencies, that recommended revisions to the North American Industry Classification System (NAICS) and the North American Product Classification System (NAPCS) to better capture economic activity related to the bioeconomy. The creation of the NBB itself fulfills directives from both the Bioeconomy EO as well as the CHIPS & Science Act, which called on OSTP to establish a coordination office on these topics. Currently, due to a lack of funding, OSTP has not established a formal coordination office but will coordinate these activities through the NBB.
According to OSTP, the Bioeconomy EO reflects the whole-of-government approach that will be needed to support the bioeconomy. For the near term, OSTP plans to show the value and utility of the NBB, execute policy from the Bioeconomy EO, prioritize specific actions from the resulting Bioeconomy EO reports, highlight significant investments, and produce a report on the NAICS and NAPCS codes. In the long term, OSTP hopes that the NBB will become a sustainable government entity that drives a clear national strategy to move the bioeconomy forward and enables the United States to work collaboratively with global partners.
A key challenge is measuring the bioeconomy. It is difficult to prioritize, strategize, or advocate for additional resources in the absence of baseline economic metrics to track impact or estimate the potential return on investment. Ultimately, OSTP believes it is important to clarify the definition of the bioeconomy in order to create measurements and classifications.
A challenge for OSTP is continuity as it experiences staff turnover and administration changes. However, the NBB and coordination of the bioeconomy portfolio will be well positioned to persist, in part by relying on the NBB’s co-chairs. Also, the Bioeconomy EO allowed OSTP to create principal and assistant director positions for the bioeconomy portfolio, which can help ensure that it remains a high priority. At OSTP, this portfolio sits within the Industrial Innovation Group, which also houses coordination efforts for semiconductors and clean energy. OSTP leadership understands the importance of the bioeconomy and is keen to see the intersections of biomanufacturing with other initiatives, like the DOE’s Earth Shot programs and other clean energy initiatives.
On the issue of environmental sustainability and the bioeconomy, OSTP highlights efforts by DOE to push for sustainable aviation fuels and USDA’s sustainable biomass supply chain framework as initiatives that are setting the pace for sustainability. There is also an opportunity to consider how biomanufacturing and biosynthesis fit into the broader sustainable chemistry landscape.
Department of Commerce
For DOC’s perspective, FAS conducted a semi-structured interview with Dr. Christopher Szakal, acting director, program coordination office at the National Institute of Standards and Technology.
The Department of Commerce is one of the NBB co-chairs. DOC is sector-agnostic and is interested in the bioeconomy as a way to support the broader economy, remain competitive, and solve broader challenges, such as those related to supply chain resilience. In response to the Bioeconomy EO, the DOC has released the bioeconomy lexicon from NIST and the Feasibility Study for measuring the bioeconomy from the Bureau of Economic Analysis. It has also participated in several interagency activities, including development of OSTP’s Bold Goals report and USDA’s Biomass Supply Chain report, as well as ongoing working groups focusing on updating systems for measuring economic activities (e.g., the NAICS and NAPCS codes) and on biological data and cybersecurity. Separate from the executive order, the Inflation Reduction Act provided significant investments for the Economic Development Administration in biotechnology-related regional technology hubs. Other ongoing activities at DOC in support of the bioeconomy include efforts to support biotechnology and biomanufacturing standards development at NIST, supply chain analyses at the International Trade Administration, work at the Bureau of Industry and Security and at the Patent and Trademark Office to ensure a safe and fair market, and the Workforce Development Strategy.
By nature, DOC keeps a broad perspective and tries to understand how the bioeconomy intersects with other parts of the economy and how technological developments may impact progress. There are important intersections of the bioeconomy with artificial intelligence (AI) and with data security, and policy development in these other areas will have implications for the bioeconomy. For example, the October 2023 Executive Order on AI called for significant new requirements for providers of synthetic nucleic acids to conduct biosecurity screening, which will have implications for biotechnology and biomanufacturing. NIST is tasked with developing standards for this new policy. The intersectional nature of the bioeconomy requires coordination both within the DOC and across the U.S. government. A key challenge is the need for sustained funding because coordination requires time and effort.
On environmental sustainability, the DOC prioritizes the market and what U.S. companies will find profitable in both the near term and the long term. Elevating sustainability has been challenging because there is uncertainty in how sustainability is measured. Additionally, market drivers have been inconsistent relative to the level needed to address the uncertainty. DOC is looking to utilize the NBB to help provide clarity on how to achieve more consistent market forces in support of sustainability to drive growth of the bioeconomy.
Department of Defense
For DOD’s perspective, FAS conducted a semi-structured interview with Dr. Peter Emanuel, senior research scientist, bioengineering at U.S. Army Combat Capabilities Development Command.
The Department of Defense is one of the NBB co-chairs. In September 2022 (before the Bioeconomy EO was announced), DOD announced a $1.2 billion investment in biomanufacturing. In March 2023, DOD released a Biomanufacturing Strategy, which was informed by both the National Defense Authorization Act for Fiscal Year 2023 and the Bioeconomy EO. In support of this strategy and the investments made by DOD, the Department’s Defense Production Act Investments (DPAI) Office published an open Request for Information that sought input from industry on biomanufactured products and process capabilities that could help address defense needs. Significant additional investments in biomanufacturing are likely to be forthcoming.
The bioeconomy portfolio is a tiny portion of the overall programmatic budget for DOD. Previously, the DOD’s interest in biology and biotechnology was limited to military medicine and chemical and biological defense, but the department is increasingly focused on nonmedical biomanufacturing applications and believes that they will be key for ensuring national security. The department also acknowledges the importance of workforce development and the need for standardization and infrastructure for the bioeconomy and strongly supports these areas. This commitment can be seen in DOD’s large 2020 investments in BioMADE, a Manufacturing Innovation Institute focused on creating a sustainable, domestic end-to-end bioindustrial manufacturing ecosystem.
In the future, DOD hopes to take advantage of biomanufacturing’s potential to support defense objectives beyond just medical countermeasures and other human health-related advances, such as production of bio-based materials, chemicals, and foods. However, DOD faces challenges both internally and externally in communicating the full potential of the bioeconomy and biomanufacturing for DOD.
On environmental sustainability, DOD believes that economic and environmental sustainability for the bioeconomy go hand-in-hand. For example, a company that could make chemicals without waste would have a significant economic advantage and would support environmental sustainability. Historically, DOD has seen significant costs due to polluted sites, and so understands the value of cleaner products and processes. In addition, DOD is investing in different technologies that would valorize waste streams.
Department of Energy
For DOE’s perspective, FAS conducted a semi-structured interview with Dr. Valerie Reed, director, Bioenergy Technologies Office.
The Department of Energy has many goals for advancing the bioeconomy, with the common denominator being to decarbonize America’s transportation and fuel sectors and to build resilient clean energy for generations to come. In response to the Bioeconomy EO, the DOE contributed to the OSTP Bold Goals report and was tasked with working with other agencies to write reports on National Security Recommendations for Federal Procurement (forthcoming) and best practices for cybersecurity documentation. DOE also played a large role in an upcoming biotechnology and biomanufacturing report mandated by the Bioeconomy EO. Outside of the direct requirements from the EO, the DOE plays a crucial role in supporting industrial biotechnology through additional reports and its involvement in ongoing interagency activities. For example, the Billion Ton Report provides a comprehensive assessment of biomass availability today and how to sustainably produce more than one billion tons of biomass per year to meet the demand for sustainable aviation fuel production.
DOE’s bioeconomy efforts are concentrated within the Bioenergy Technologies Office (BETO) and the Office of Science. BETO aims to utilize biomass for sustainable and renewable fuel and chemical production, while the Office of Science supports fundamental research that enables the bioeconomy, including synthetic biology and thermochemical conversion. Under the Inflation Reduction Act and CHIPS & Science Act, significant support was given to bioenergy solutions and clean energy demonstrations, including DOE tax incentives aimed at carbon reduction in fuel production.
In the short term, DOE is focused on prioritizing the use of biomass for Sustainable Aviation Fuel (SAF) and marine fuel production, as well as supporting renewable diesel and ethanol for medium- and heavy-duty vehicles. Long-term goals include transitioning to electrification using biomass, achieving substantial SAF production by 2035 through the SAF Grand Challenge, and scaling up the production of specific chemicals by 2035 as part of the industrial decarbonization strategy. Additionally, in coordination with the USDA, there are focused efforts to increase cultivation of purpose-grown energy crops.
One of the major hurdles the DOE currently faces, and may continue to face in the future, is ensuring sustained funding levels that support ongoing development. Currently, biomass is seen as an expensive feedstock. While the IRA provided an initial policy bridge, it is essential to establish a longer-term incentive to meet market demand, like the 40B (SAF production) and 45Z (clean fuel production) tax credits.
On environmental sustainability, the DOE is very focused on goals for decarbonization of transportation and fuels, including replacing petroleum-based products with sustainable biomass solutions and conducting life cycle assessments (LCAs) to measure sustainability impacts throughout the supply chain. DOE created the GREET Model for LCAs, which was updated recently, to reduce ambiguity and to help standardize the process for measuring carbon emissions. Additionally, DOE’s Clean Fuels and Products Earthshot is an important cross-agency collaboration that supports accelerating bio-based fuels and chemicals production and decarbonizing both the fuel and chemical industry.
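To give a sense of the kind of accounting an LCA standardizes, the sketch below walks through a deliberately simplified carbon-intensity calculation in Python. The stage values and the baseline are hypothetical, chosen only for illustration, and the calculation is far coarser than the GREET model itself: a fuel’s life-cycle carbon intensity is approximated by summing emissions across supply chain stages and comparing the total against a conventional-fuel baseline.

```python
# Simplified life-cycle carbon intensity sketch (illustrative only; the stage
# values below are hypothetical and much coarser than the GREET model).
stage_emissions_g_co2e_per_mj = {
    "feedstock_cultivation": 20.0,   # farming inputs, fertilizer, land use
    "feedstock_transport": 3.0,
    "conversion_to_fuel": 25.0,      # biorefinery energy and process emissions
    "fuel_distribution": 2.0,
    "combustion": 1.0,               # biogenic CO2 largely netted out
}

carbon_intensity = sum(stage_emissions_g_co2e_per_mj.values())

# Approximate baseline for conventional petroleum jet fuel, used here only as
# an illustrative point of comparison.
petroleum_jet_baseline = 89.0  # gCO2e/MJ

reduction = 1 - carbon_intensity / petroleum_jet_baseline
print(f"Life-cycle carbon intensity: {carbon_intensity:.0f} gCO2e/MJ")
print(f"Reduction vs. conventional jet fuel baseline: {reduction:.0%}")
```

A standardized model matters because each of these stage values depends on contested measurement choices (land-use change, co-product credits, process energy sources); agreeing on one methodology makes the resulting carbon intensities comparable across producers.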
Department of Health & Human Services
For HHS’s perspective, FAS conducted a semi-structured interview with Dr. Lyric Jorgenson, associate director for science policy and the director of the Office of Science Policy at the National Institutes of Health (NIH), and Dr. Julia Limage, director, Office of Strategy, Policy, and Requirements in the Administration for Strategic Preparedness and Response (ASPR).
The Department of Health & Human Services has two representatives on the NBB, one from NIH and one from ASPR. HHS has a broad mission in support of human health, and many of its programs could be considered part of the bioeconomy. However, the Bioeconomy EO outlined a set of priorities that called for additional focus at HHS on advances specific to biotechnology and biomanufacturing, many of which were included in the OSTP Bold Goals report. The EO also tasked HHS with leading the establishment of a Biosafety and Biosecurity Innovation Initiative; a strategic plan for this initiative will be available soon. Another area of intersection of HHS and the Bioeconomy EO is on the regulatory side: the Food and Drug Administration worked with USDA and EPA to provide updates on the regulatory system as deliverables for the EO. Many of the activities related to the EO draw on interagency working groups and other ongoing activities—for example, the work toward pandemic preparedness and biodefense, as well as collaborations between NIH and NSF on health-relevant research.
In the near future, HHS will focus on advancing biotechnologies such as multi-omic medicine, gene editing, and other therapeutics tailored to individual patients. Biomanufacturing and scale-up are another key focus to increase the speed and availability of key medicines. With regard to public health, the COVID pandemic highlighted the need for fast and secure biomanufacturing for vaccine production. The Biomedical Advanced Research and Development Authority (BARDA) in ASPR has made significant investments in biomanufacturing for this reason. ASPR also has an Office of Industrial Base Management and Supply Chain to support domestic biomanufacturing in case of public health emergencies.
For HHS, activities related to the bioeconomy directly and unambiguously support the department’s mission and will continue to be prioritized. A key challenge for HHS is the need for sustained funding, especially for coordination, which requires time and effort above and beyond programmatic work. To be effective, activities initiated by the Bioeconomy EO will need to be funded. Some HHS activities, including some related to biomanufacturing of medical countermeasures, were funded with COVID supplemental funding that will soon run out.
On environmental sustainability, HHS has not had any significant focus. However, there have been efforts to decrease the use of single-use plastics and equipment in research and public health activities.
United States Department of Agriculture
For USDA’s perspective, FAS gathered information from publicly available reports and documents, with guidance and direction from Herrick Fox, USDA’s bioeconomy coordinator in the Office of the Chief Economist, and Greg Jaffe, senior advisor in the Office of the Secretary.
The Bioeconomy EO tasked USDA with a wide range of deliverables, and USDA has released many related reports and products that reflect its bioeconomy-related priorities. One set of deliverables focuses on biomass and feedstocks, and supports the strategic vision outlined for agriculture in OSTP’s Bold Goals report. This includes the report on Building a Resilient Biomass Supply—A Plan to Enable the Bioeconomy in America, along with an Implementation Framework. USDA also has a long-standing focus on bio-based products, including support of the BioPreferred Program, a program created by the 2002 Farm Bill to increase the purchase of bio-based products and reauthorized in the 2018 Farm Bill. Their recent Economic Impact Analysis of the U.S. Biobased Products Industry report summarizes the status of bio-based products, an important component of the bioeconomy.
USDA also plays a central role in regulating biotechnology products, and the Bioeconomy EO called for updates to the regulatory system. In response, USDA (along with the FDA and the EPA) conducted stakeholder outreach, which is summarized in a report on Ambiguities, Gaps, and Uncertainties in Regulation of Biotechnology Under the Coordinated Framework. USDA also released a Plan for Regulatory Reform under the Coordinated Framework for the Regulation of Biotechnology and produced an updated Coordinated Framework website. Activities to improve coordination across the three major regulatory agencies are ongoing.
Unlike most federal departments and agencies, most programs and activities at USDA have a link with the life sciences, including those that support food and fiber, forests and grasslands, and other natural resources, as well as the manufacturing of numerous bio-based products and biofuels from these resources and the R&D and infrastructure that supports it. From USDA’s perspective, the department has served the bioeconomy since its founding in 1862. This broad focus provides many opportunities for strategic partnerships with other parts of the U.S. government working on the bioeconomy, and there are many different ways that USDA can contribute to the NBB.
On environmental sustainability, USDA has demonstrated its commitment to developing a circular bioeconomy, which is reflected in its Biomass Plan, and in its support for bio-based products and sustainable agriculture initiatives.
Conclusion
The agencies that make up the NBB highlight the complex nature of the U.S. bioeconomy and the various sectors that fall under it. Despite this complexity, the NBB is providing a whole-of-government approach to enable agencies to better support the burgeoning U.S. bioeconomy. The work underway is underpinned by the agencies’ priorities and programmatic expertise, but comes together to build the foundational base needed to support and grow the U.S. bioeconomy. Most agencies also address environmental sustainability, with some, like DOE, DOD, and USDA, placing a stronger emphasis on it due to their direct connections with the environment. Finally, agencies also agree on the need for more data on the bioeconomy’s impact as the different sectors evolve and the need for sustained funding to promote coordination, which takes time and effort beyond programmatic work.
Appendix A. Interview Questions
- In response to the September 2022 Bioeconomy EO, your agency has produced some reports and other deliverables on the bioeconomy.
- Are we missing any deliverables? Are there any other reports or activities that are already completed or still to come in response to the EO?
- Are there programs or other deliverables relevant to the bioeconomy that your agency has pursued under the Inflation Reduction Act or the CHIPS & Science Act?
- Are there other activities within your agency that you believe support the bioeconomy? Is the bioeconomy broader than what was captured by the EO and these other efforts?
- What does your agency hope to achieve in the foreseeable future and in the more distant future regarding the U.S. bioeconomy?
- Are these goals related to the OSTP Bold Goals Report or other deliverables for the Bioeconomy EO?
- To what extent will this progress be prioritized within your agency? How central to your agency is progress in the bioeconomy – now and into the future?
- What are the major hurdles your agency currently faces or may face in the future in reaching these goals?
- How does your agency tackle the issue of creating an environmentally sustainable bioeconomy and/or a circular bioeconomy?
- Are there any initiatives in place currently or coming up in the near future that speak towards this?
Critical Thinking on Critical Minerals
How the U.S. Government Can Support the Development of Domestic Production Capacity for the Battery Supply Chain
Access to critical minerals supply chains will be crucial to the clean energy transition in the United States. Batteries for electric vehicles, in particular, will require the U.S. to consume an order of magnitude more lithium, nickel, cobalt, and graphite than it currently consumes. Currently, these materials are sourced from around the world. Mining of critical minerals is concentrated in just a few countries for each material, but is becoming increasingly geographically diverse as global demand incentivizes new exploration and development. Processing of critical minerals, however, is heavily concentrated in a single country—China—raising the risk of supply chain disruption.
To address this, the U.S. government has signaled its desire to onshore and diversify critical minerals supply chains through key legislation, such as the Bipartisan Infrastructure Law and the Inflation Reduction Act, and trade policies. The development of new mining and processing projects entails significant costs, however, and project financiers require developers to demonstrate certainty that projects will generate profit through securing long-term offtake agreements with buyers. This is made difficult by two factors: critical minerals markets are volatile, and, without subsidies or trade protections, domestically-produced critical minerals have trouble competing against low-priced imports, making it difficult for producers and potential buyers to negotiate a mutually agreeable price (or price floor). As a result, progress in expanding the domestic critical minerals supply may not occur fast enough to catch up to the growing consumption of critical minerals.
To accelerate project financing and development, the Department of Energy (DOE) should help generate demand certainty through backstopping the offtake of processed, battery-grade critical minerals at a minimum price floor. Ideally, this would be accomplished by paying producers the difference between the market price and the price floor, allowing them to sign offtake agreements and sell their products at a competitive market price. Offtake agreements, in turn, allow developers to secure project financing and proceed at full speed with development.
While demand-side support can help address the challenges faced by individual developers, market-wide issues with price volatility and transparency require additional solutions. Currently, the pricing mechanisms available for battery-grade critical minerals are limited to either third-party price assessments with opaque sources or the market exchange traded price of imperfect proxies. Concerns have been raised about the reliability of these existing mechanisms, hindering market participation and complicating discussions on pricing.
As the North American critical minerals industry and market develop, DOE should support the parallel development of more transparent, North America-based pricing mechanisms to improve price discovery and reduce uncertainty. In the short and medium term, this could be accomplished through government-backed auctions, which could be combined with offtake backstop agreements. Auctions are effective mechanisms for price discovery, and data from them can help improve market price assessments. In the long term, DOE could support the creation of new market exchanges for trading critical minerals in North America. Exchange trading enables greater price transparency and provides opportunities for hedging against price volatility.
Through this two-pronged approach, DOE would simultaneously accelerate the development of the domestic critical minerals supply chain through addressing short-term market needs, while building a more transparent and reliable marketplace for the future.
Introduction
The global transportation system is currently undergoing a transition to electric vehicles (EVs) that will fundamentally transform not only our transportation system, but also domestic manufacturing and supply chains. Demand for lithium-ion batteries, the most important and expensive component of EVs, is expected to grow 600% by 2030 compared to 2023, and the U.S. currently imports a majority of its lithium batteries. To ensure a stable and successful transition to EVs, the U.S. needs to reduce its import dependence and build out its domestic supply chain for critical minerals and battery manufacturing.
Crucial to that will be securing access to battery-grade critical minerals. Lithium, nickel, cobalt, and graphite are the primary critical minerals used in EV batteries. All four were included in the 2023 Department of Energy (DOE) Critical Minerals List. Cobalt and graphite are considered at risk of shortage in the short-term (2020-2025), while all four materials are at risk in the medium-term (2025-2030).
As shown in Figure 1, the domestic supply chain for batteries and critical minerals consists primarily of downstream buyers like automakers and battery assemblers, though there are a growing number of battery cell manufacturers thanks to domestic sourcing requirements in the Inflation Reduction Act (IRA) incentives. The U.S. has major gaps in upstream and midstream activities—mining of critical minerals, refining/processing, and the production of active materials and battery components. These industries are concentrated globally in a small number of countries, presenting supply chain risks. By developing new domestic industries within these gaps, the federal government can help build out new, resilient clean energy supply chains.
This report is organized into three main sections. The first section provides an overview of current global supply chains and the process of converting different raw materials into battery-grade critical minerals. The second section delves into the pricing and offtake challenges that projects face and proposes demand-side support solutions to provide the price and volume certainty necessary to obtain project financing. The final section takes a look at existing pricing mechanisms and proposes two approaches that the government can take to facilitate price discovery and transparency, with an eye towards mitigating market volatility in the long term. Given DOE’s central role in supporting the development of domestic clean energy industries, the policies proposed in this report were designed with DOE in mind as the main implementer.

Adapted from Li-BRIDGE
Segments highlighted in light blue indicate gaps in U.S. supply chains. See the original graphic from Li-BRIDGE for more information.
Section 1. Understanding Critical Minerals Supply Chains
Global Critical Minerals Sources
Globally, 65% or more of processed lithium, cobalt, and graphite originates from a single country: China (Figure 2). This concentration is particularly acute for graphite, 91% of which was processed by China in 2023. This market concentration has made downstream buyers in the U.S. overly dependent on sourcing from a single country. The concentration of supply chains in any one country makes them vulnerable to disruptions within that country—whether they be natural disasters, pandemics, geopolitical conflict, or macroeconomic changes. Moreover, lithium, nickel, cobalt, and graphite are all expected to experience shortages over the next decade. In the case of future shortages, concentration in other countries puts U.S. access to critical minerals at risk. Rocky foreign relations and competition between the U.S. and China over the past few years have put further strain on this dependence. In October 2023, in response to U.S. export restrictions on semiconductor chips to China and other "foreign entities of concern" (FEOC), China announced new export controls on graphite, though it has not yet restricted supply.
Expanding domestic processing of critical minerals and manufacturing of battery components can help reduce dependence on Chinese sources and ensure access to critical minerals in future shortages. However, these efforts will hurt Chinese businesses, so the U.S. will also need to anticipate additional protectionist measures from China.
On the other hand, mining of critical minerals—with the exception of graphite and rare earth elements—occurs primarily outside of China. These operations are also concentrated in a small handful of countries, shown in Figure 3. Consequently, geopolitical disruptions affecting any of those primary countries can significantly affect the price and supply of the material globally. For example, Russia is the third largest producer of nickel. In the aftermath of Russia’s invasion of Ukraine at the beginning of 2022, expectations of shortages triggered a historic short squeeze of nickel on the London Metal Exchange (LME), the primary global trading platform, significantly disrupting the global market.
To address global supply chain concentration, new incentives and grant programs were passed in the IRA and the Bipartisan Infrastructure Law. These include the 30D clean vehicle tax credit, the 45X advanced manufacturing production credit, and the Battery Materials Processing Grants Program (see Domestic Price Premium section for further discussion). Thanks to these policies, there are now on the order of a hundred North American projects in mining, processing, and active1 material manufacturing in development. The success of these and future projects will help create new domestic sources of critical minerals and batteries to feed the EV transition in the U.S. However, success is not guaranteed. A number of challenges to investment in the critical minerals supply chain will need to be addressed first.
Battery Materials Supply Chain
Critical minerals are used to make battery electrodes. These electrodes require specific forms of critical minerals for their production processes: typically lithium hydroxide or carbonate, nickel sulfate, cobalt sulfate, and a blend of coated spherical graphite and synthetic graphite.2

Lithium hydroxide/carbonate typically comes from two sources: spodumene, a hard rock ore that is mined primarily in Australia, and lithium brine, which is primarily found in South America (Figure 3). Traditionally, lithium brine must be evaporated in large open-air pools before the lithium can be extracted, but new technologies are emerging for direct lithium extraction that significantly reduce the need for evaporation. Whereas spodumene mining and refining are typically conducted by separate entities, lithium brine operations are typically fully integrated. A third source of lithium that has yet to be put into commercial production is lithium clay. The U.S. is leading the development of projects to extract and refine lithium from clay deposits.

Nickel sulfate can be made from either nickel metal, which was historically the preferred feedstock, or directly from nickel intermediate products, such as mixed hydroxide precipitate and nickel matte, which are the feedstocks that most Chinese producers have switched to in the past few years (Figure 4). Though demand from batteries is driving much of the nickel project development in the U.S., nickel metal has a much larger market than nickel sulfate, so developers are designing their projects with the flexibility to produce either product.

Cobalt is primarily produced in the Democratic Republic of the Congo from cobalt-copper ore. Cobalt can also be found in lesser amounts in nickel and other metallic ores. Cobalt concentrate is extracted from cobalt-bearing ore and then processed into cobalt hydroxide. At this point, the cobalt hydroxide can be further processed into either cobalt sulfate for batteries or cobalt metal and other chemicals for other purposes.

Battery cathodes come in a variety of chemistries: lithium nickel manganese cobalt (NMC) is the most common in lithium-ion batteries thanks to its higher energy density, while lithium iron phosphate is growing in popularity for its affordability and use of more abundantly available materials, but is not as energy dense. Cathode active material (CAM) manufacturers purchase lithium hydroxide/carbonate, nickel sulfate, and cobalt sulfate and then convert them into CAM powders. These powders are then sold to battery cell manufacturers, who coat them onto aluminum foil current collectors to produce cathodes.

Graphite can be synthesized from petroleum needle coke, a fossil fuel waste material, or mined from natural deposits. Natural graphite typically comes in the form of flakes and is reshaped into spherical graphite to reduce its particle size and improve its material properties. Spherical graphite is then coated with a protective layer to prevent unwanted chemical reactions when charging and discharging the battery.

The majority of battery anodes on the market are made using just graphite, so there is no intermediate step between processors and battery cell manufacturers. Producers of battery-grade synthetic graphite and coated spherical graphite sell these materials directly to cell manufacturers, who coat them onto copper foil current collectors to make anodes. These battery-grade forms of graphite are also referred to as graphite anode powder or, more generally, as anode active materials. Thus, the terms graphite processor and graphite anode manufacturer are interchangeable.
Section 2. Building Out Domestic Production Capacity
Challenges Facing Project Developers
Offtake Agreements
Offtake agreements (a.k.a. supply agreements or contracts) are agreements between a producer and a buyer to purchase a future product. They are a key requirement for project financing because they provide lenders and investors with the certainty that if a project is built, there will be revenue generated from sales to pay back the loan and justify the valuation of the business. The vast majority of feedstocks and battery-grade materials are sold under offtake agreements, though small amounts are also sold on the spot market in one-off transactions. Offtake agreements are made at every step of the supply chain: between miners and processors (if they're not vertically integrated), between processors and component manufacturers, and between component manufacturers and cell manufacturers. Due to domestic automakers' concerns about potential material shortages upstream and the desire to secure IRA incentives, many of them have also been entering into offtake agreements directly with North American miners and processors. Tesla has even started constructing its own domestic lithium processing plant.
Historically, these offtake agreements were structured as fixed-price deals. However, when spot market prices rise well above the contract price, sellers often find a way to exit the contract, and conversely, when spot prices fall well below it, buyers often do the same. As a result, more and more offtake agreements for battery-grade lithium, nickel, and cobalt have become indexed to spot prices, with price floors and/or ceilings set as guardrails and adjustments for premiums and discounts based on other factors (e.g. IRA compliance, risk from a greenfield producer, etc.).
Graphite is the one exception where buyers and suppliers have mostly stuck to fixed-price agreements. There are two main reasons for this: graphite pricing is opaque and products exhibit much more variation, complicating attempts to index the price. As a result, cell manufacturers don’t consider the available price indexes to accurately reflect the value of the specific products they are buying.
Offtake agreements for battery cells are also typically partially indexed on the price of the critical minerals used to manufacture them. In other words, a certain amount of the price per unit of battery cell is fixed in the agreement, while the rest is variable based on the index price of critical minerals at the time of transaction.
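To make that structure concrete, the following is a minimal sketch of how a partially indexed cell price could be computed. The split between fixed and indexed components, the dollar figures, and the index values are all hypothetical; actual terms vary by contract.

```python
# Illustrative sketch of a partially indexed battery cell price.
# All numbers are hypothetical; real offtake terms are negotiated per contract.

def cell_price_per_kwh(fixed_usd: float, minerals_usd_at_signing: float,
                       index_now: float, index_at_signing: float) -> float:
    """Per-kWh cell price = fixed component + minerals component that floats
    with the critical minerals price index relative to its level at signing."""
    return fixed_usd + minerals_usd_at_signing * (index_now / index_at_signing)

# Hypothetical terms: $60/kWh fixed, $40/kWh minerals pass-through at signing.
# If the minerals index has fallen 25% since signing, the cell price adjusts down.
print(cell_price_per_kwh(60.0, 40.0, index_now=75.0, index_at_signing=100.0))  # 90.0
```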
Domestic critical minerals projects face two key challenges to securing investment and offtake agreements: market volatility and a lack of price competitiveness. The price difference between materials produced domestically and those produced internationally stems from two underlying causes: the current oversupply from Chinese-owned companies and the domestic price premium.
Market Volatility
Lithium, cobalt, and graphite have relatively low-volume markets with a small customer base compared to traditional commodities. Low-volume products experience low liquidity, meaning it can be difficult to buy or sell quickly, so slight changes in supply and demand can result in sharp price swings, creating a volatile market. Because of the higher risk and smaller market, companies and investors tend to prefer mining and processing of base metals, such as copper, which have much larger markets, resulting in underinvestment in production capacity.
In comparison, nickel is a base metal commodity, primarily used for stainless steel production. However, due to its rapidly growing use in battery production, its price has become increasingly linked to other battery materials, resulting in greater volatility than other base metals. Moreover, the short squeeze in 2022 forced LME to suspend trading and cancel transactions for the first time in three decades. As a result, trust in the price of nickel on LME faltered, many market participants dropped out, and volatility grew due to low trading volumes.
For all four of these materials, prices reached record highs in 2022 and subsequently crashed in 2023 (Figure 4). Nickel, cobalt, and graphite experienced price declines of 30-45%, while lithium prices dropped by an enormous 75%. As discussed above, market volatility discourages investment into critical minerals production capacity. The current low prices have caused some domestic projects to be paused or canceled. For example, Jervois halted operation of its Idaho cobalt mine in March 2023 due to cobalt prices dropping below its operating costs. In January 2024, lithium giant Albemarle announced that it was delaying plans to begin construction on a new South Carolina lithium hydroxide processing plant.
Retrospective analysis suggests that mining companies, battery investors, and automakers had all made overly optimistic demand projections and ramped up production too quickly. These projections assumed that EV demand would keep growing as fast as it did immediately after the pandemic and that China's lifting of pandemic restrictions would unlock even faster growth in the largest EV market. Instead, China, which makes up over 60% of the EV market, entered an economic downturn, and global demand elsewhere didn't grow as fast as projected, as backlogs built up during the pandemic were cleared. (It is important to note that the EV market is still growing at significant rates—global EV sales increased by 35% from 2022 to 2023—just not as fast as companies had hoped.) Consequently, supply has temporarily outpaced demand. Midstream and upstream companies stopped receiving new purchase orders while automakers worked through their stock build-up. Prices fell rapidly as a result and are now bottoming out. Some companies are waiting for prices to recover before they restart construction and operation of existing projects or invest in expanding production further.
While companies are responding to short-term market signals, the U.S. government needs to act in anticipation of long-term demand growth outpacing current planned capacity. Price volatility in critical minerals markets will need to be addressed to ensure that companies and financiers continue investing in expanding production capacity. Otherwise, demand projections suggest that the supply chain will experience new shortages later this decade.
Oversupply
The current oversupply of critical minerals has been exacerbated by below-market-rate financing and subsidies from the Chinese government. Many of these policies began in 2009, incentivizing a wave of investment not just in China, but also in mineral-rich countries. These subsidies played a large role in the 2010s in building out nascent battery critical minerals supply chains. Now, however, they are causing overproduction from Chinese-owned companies, which threatens to push out competitors from other countries.
Overproduction begins with mining. Chinese companies are the primary financial backers for 80% of both the Democratic Republic of the Congo’s cobalt mines and Indonesia’s nickel mines. Chinese companies have also expanded their reach in lithium, buying half of all the lithium mines offered for sale since 2018, in addition to domestically mining 18% of global lithium. For graphite, 82% of natural graphite was mined directly in China in 2023, and nearly all natural and synthetic graphite is processed in China.
After the price crash in 2023, while other companies pulled back their production volume significantly, Chinese-owned companies pulled back much less and in some cases continued to expand their production, generating an oversupply of lithium, cobalt, nickel, and natural and synthetic graphite. Government policies enabled these decisions by making it financially viable for Chinese companies to sell materials at low prices that would otherwise be unsustainable.
Domestic Price Premium (and Current Policies Addressing It)
Domestically produced critical minerals and battery electrode active materials come with a higher cost of production than imported materials due to higher wages and stricter environmental regulations in the U.S. The IRA's new 30D and 45X tax credits, along with upcoming section 301 tariffs, help address this problem by creating financial incentives for using domestically produced materials, allowing them to compete on a more even playing field with imported materials.
The 30D New Clean Vehicle Tax Credit provides up to $7,500 per EV purchased, but it requires eligible EVs to be manufactured from critical minerals and battery components that are FEOC-compliant, meaning they cannot be sourced from companies with relationships to China, North Korea, Russia, and Iran. It also requires that an increasing percentage of critical minerals used to make the EV batteries be extracted or processed in the U.S. or a Free Trade Agreement country. These two requirements apply to lithium, nickel, cobalt, and graphite. For graphite, however, since nearly all processing occurs in China and there is currently no domestic supply, the U.S. Treasury has chosen to exempt it from the 30D tax credit's FEOC and domestic sourcing requirements until 2027 to give automakers time to develop alternate supply chains.
The 45X Advanced Manufacturing Production Tax Credit subsidizes 10% of the production cost for each unit of critical minerals processed. The Internal Revenue Service's proposed regulations for this tax credit interpret the legislation for 45X as applying only to the value-added production cost, meaning that the cost of purchasing raw materials and processing chemicals is not included in the covered production costs. This limits the amount of subsidy that will be provided to processors. The strength of 45X, though, is that unlike the 30D tax credit, there is no sunset clause for critical minerals, providing a long-term guarantee of support.
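As a rough illustration of how the value-added interpretation limits the subsidy, the sketch below uses hypothetical per-tonne costs; the actual covered costs depend on the final IRS regulations.

```python
# Illustrative sketch of the 45X critical minerals credit under the proposed
# IRS interpretation (value-added cost only). All figures are hypothetical.

def credit_45x_per_unit(total_production_cost: float,
                        raw_material_cost: float,
                        processing_chemicals_cost: float) -> float:
    """Credit = 10% of the value-added production cost, i.e. total cost minus
    purchased raw materials and processing chemicals."""
    value_added_cost = total_production_cost - raw_material_cost - processing_chemicals_cost
    return 0.10 * value_added_cost

# Hypothetical: $10,000/tonne total cost, $6,000 feedstock, $1,000 chemicals
# -> credit of $300/tonne, versus $1,000/tonne if the full cost were covered.
print(credit_45x_per_unit(10_000, 6_000, 1_000))  # 300.0
```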
In terms of tariffs, the Biden administration announced in May 2024 a new set of section 301 tariffs on Chinese products, including EVs, batteries, battery components, and critical minerals. The critical minerals tariffs include a 25% tariff on cobalt ores and concentrates that will go into effect in 2024 and a 25% tariff on natural flake graphite that will go into effect in 2026. In addition, there are preexisting 25% section 301 tariffs on natural and synthetic graphite anode powder. These tariffs were previously waived to give automakers time to diversify their supply chains, but the U.S. Trade Representative (USTR) announced in May 2024 that the exemptions would expire for good on June 14, 2024, citing the lack of progress from automakers as a reason for not extending them.
Current State of Supply Chain Development
For lithium, despite market volatility, offtake demand for existing domestic projects has remained strong thanks to IRA incentives. Based on industry conversations, many of the projects that are developed enough to make offtake agreements have either signed away their full output capacity or are actively in the process of negotiating agreements. Strong demand combined with tax incentives has enabled producers to negotiate offtake agreements that guarantee a price floor at or above their capital and operating costs. Lithium is the only material for which the current planned mining and processing capacity for North America is expected to meet demand from planned U.S. gigafactories.
Graphite project developers report that the 25% tariff coming into force will be sufficient to close the price gap between domestically produced materials and imported materials, enabling them to secure offtake agreements at a sustainable price. Furthermore, the Internal Revenue Service will require 30D tax credit recipients to submit periodic reports on the progress they are making on sourcing graphite outside of China. If automakers take these reports and the 2027 exemption deadline seriously, there will be even more motivation to work with domestic graphite producers. However, the current planned production capacity for North America still falls significantly short of demand from planned U.S. battery gigafactories. Processing capacity is the bottleneck for production output, so there is room for additional investment in processing capacity.
Pricing has been a challenge for cobalt, though. Jervois briefly opened the only primary cobalt mine in the U.S. before shutting down a few months later due to the price crash. Jervois has said that as soon as prices for standard-grade cobalt rise above $20/pound, it will be able to reopen the mine, but that has yet to happen. Moreover, the real bottleneck is in cobalt processing, which has attracted less attention and investment than other critical minerals in the U.S. There are currently no cobalt sulfate refineries in North America; only one or two are in development in the U.S. and a few more in Canada.3
Nickel sulfate is also facing pricing challenges, and, similar to cobalt, there is an insufficient amount of nickel sulfate processing capacity being developed domestically. There is one processing plant being developed in the U.S. that will be able to produce either nickel metal or nickel sulfate and a few more nickel sulfate refineries being developed in Canada.
Policy Solutions to Support the Development of Processing Capacity
The U.S. government should prioritize the expansion of processing capacity for lithium, graphite, cobalt, and nickel. Demand from domestic battery manufacturing is expected to outpace the current planned capacity for all of these materials, and processing capacity is the key bottleneck in the supply chain. Tariffs and tax incentives have resulted in favorable pricing for lithium and graphite project developers, but cobalt and nickel processing has gotten less support and attention.
DOE should provide demand-side support for processed, battery-grade critical minerals to accelerate the development of processing capacity and address cobalt and nickel pricing needs. The Office of Manufacturing and Energy Supply Chains (MESC) within DOE would be the ideal entity to administer such a program, given its mandate to address vulnerabilities in U.S. energy supply chains. In the immediate term, funding could come from MESC’s Battery Materials Processing Grants program, which has roughly $1.9B in remaining, uncommitted funds. Below we propose a few demand-support mechanisms that MESC could consider.
Long term, the Bipartisan Policy Center proposes that Congress establish and appropriate funding for a new government corporation that would take on the responsibility of administering demand-support mechanisms as necessary to mitigate volume and price uncertainty and ensure that domestic processing capacity grows to sufficiently meet critical minerals needs.
Offtake Backstops
Offtake backstops would commit MESC to guaranteeing the purchase of a specific amount of materials at a minimum negotiated price if producers are unable to find buyers at that price. This essentially creates a price floor for specific producers while also providing a volume guarantee. Offtake backstops help derisk project development and enable developers to access project financing. Backstop agreements should be made for at least the first five years of a plant’s operations, similar to a regular offtake agreement. Ideally, MESC should prioritize funding for critical minerals with the largest expected shortages based on current planned capacity—i.e., nickel, cobalt, and graphite.
There are two primary ways that DOE could implement offtake backstops:
First. The simplest approach would be for DOE to pay processors the difference between the spot price index (adjusted for premiums and discounts) and the pre-negotiated price floor for each unit of material, similar to how a pay-for-difference or one-sided contract-for-difference would work.4 This would enable processors to sign offtake agreements with no price floor, accelerating negotiations and thus the pace of project development. Processors could also choose to keep some of their output capacity uncommitted so that they can sell their products on the spot market without worrying about prices collapsing in the future.
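A minimal sketch of this pay-for-difference mechanic is shown below, assuming a hypothetical negotiated floor, adjusted spot index, and sales volume rather than actual program terms.

```python
# Illustrative sketch of a one-sided pay-for-difference backstop payment.
# The floor, adjusted spot index, and volume are hypothetical, not program terms.

def pay_for_difference(price_floor: float, adjusted_spot_index: float,
                       units_sold: float) -> float:
    """DOE pays the shortfall between the negotiated floor and the
    premium/discount-adjusted spot index; nothing is paid when the
    market is at or above the floor."""
    shortfall = max(0.0, price_floor - adjusted_spot_index)
    return shortfall * units_sold

# Hypothetical: $18/lb floor, adjusted spot at $15/lb, 1,000,000 lb sold.
print(pay_for_difference(18.0, 15.0, 1_000_000))  # 3000000.0
# If the spot index recovers to $20/lb, no payment is owed.
print(pay_for_difference(18.0, 20.0, 1_000_000))  # 0.0
```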
A more limited form of this could look like DOE subsidizing the price floor for specific offtake agreements between a processor and a buyer. This type of intervention requires a bit more preliminary work from processors, since they would have to identify and bring a buyer to the table before applying for support.
Second. Purchasing the actual materials would be a more complex route for DOE to take, since the agency would have to be ready to receive delivery of the materials. The agency could do this by either setting up a system of warehouses suitable for storing battery-grade critical minerals or using "virtual warehousing," as proposed by the Bipartisan Policy Center. An actual warehousing system could be set up by contracting with existing U.S. warehouses, such as those in LME and CME's networks, to expand or upgrade their facilities to store critical minerals. These warehouses could also be made available for companies to store their private stockpiles, increasing the utility of the warehousing system and justifying the cost of setting it up. Virtual warehousing would entail DOE paying producers to store materials on-site at their processing plants.
The physical reserve provides an additional opportunity for DOE to address market volatility by choosing when it sells materials from the reserve. For example, DOE could pause sales of a material when there is an oversupply on the market and prices dip or ramp up sales when there is a shortage and prices spike. However, this can only be used to address short-term fluctuations in supply and demand (e.g. a few months to a few years at most), since these chemicals have limited shelf lives.
A third way to implement offtake backstops that would also support price discovery and transparency is discussed in Section 3.
Section 3. Creating Stable and Transparent Markets
Concerns about Pricing Mechanisms
Market volatility in critical minerals markets has raised concerns about just how reliable the current pricing mechanisms for these markets are. There are two main ways that prices in a market are determined: third-party price assessments and market exchanges. A third approach that has attracted renewed attention this year is auctions. Below, we walk through these three approaches and propose potential solutions for addressing challenges in price discovery and transparency.
Index Pricing
Price reporting agencies like Fastmarkets and Benchmark Mineral Intelligence offer subscription services to help market participants assess the price of commodities in a region. These agencies develop rosters of companies for each commodity, which regularly contribute information on transaction prices. That information is then used to generate price indexes. Fastmarkets and Benchmark's indexes are primarily based on prices provided by large, high-volume sellers and buyers. Smaller buyers may pay prices above the index.
It can be hard to establish reliable price indexes in immature markets if there is an insufficient volume of transactions or if the majority of transactions are made by a small set of companies. For example, lithium processing is concentrated among a small number of companies in China and spot transactions are a minority share of the market. New entrants and smaller producers have raised concern that these companies have significant control over Asian spot prices reported by Fastmarkets and Benchmark, which are used to set offtake agreement prices, and that the price indexes are not sufficiently transparent.
Exchange Trading
Market exchanges are a key feature of mature markets that helps reduce market volatility. They allow for a wider range of participants, improving market liquidity, and enable price discovery and transparency. Companies up and down the supply chain can use physically-delivered futures and options contracts to hedge against price volatility and gain visibility into expectations for the market's general direction to help inform decision-making. This can help derisk the effect of market volatility on investments in new production capacity.
Of the materials we’ve discussed, nickel and cobalt metal are the only two that are physically traded on a market exchange, specifically LME. Metals make good exchange commodities due to their fungibility. Other forms of nickel and cobalt are typically priced as a percentage of the payable price for nickel and cobalt metal. LME’s nickel price is used as the global benchmark for many nickel products, while the in-warehouse price of cobalt metal in Rotterdam, Europe’s largest seaport, is used as the global benchmark for many cobalt products. These pricing relationships enable companies to use nickel and cobalt metal as proxies for hedging related materials.
After nickel trading volumes plummeted on LME in the wake of the short squeeze, doubts were raised about LME’s ability to accurately benchmark its price, sparking interest in alternative exchanges. In April 2024, UK-based Global Commodities Holdings Ltd (GCHL) launched a new trading platform for nickel metal that is only available to producers, consumers, and merchants directly involved in the physical market, excluding speculative traders. The trading platform will deliver globally “from Baltimore to Yokohama.” GCHL is using the prices on the platform to publish its own price index and is also working with Intercontinental Exchange to create cash-settled derivatives contracts. This new platform could potentially expand to other metals and critical minerals.
In addition to LME's troubles, though, changes in the battery supply chain have led to a growing divergence between the nickel and cobalt metal traded on exchanges and the actual chemicals used to make batteries. Chinese processors, who produce most of the global supply of nickel sulfate, have mostly switched from nickel metal to cheaper nickel intermediate products as their primary feedstock. Consequently, market participants say that the LME exchange price for nickel metal, which is mostly driven by stainless steel, no longer reflects market conditions for the battery sector, raising the need for new tradeable contracts and pricing mechanisms. For the cobalt industry, 75% of demand comes from batteries, which use cobalt sulfate. Cobalt metal makes up only 18% of the market, of which only 10-15% is traded on the spot market. As a result, cobalt chemicals producers have transitioned away from using the metal reference price towards fixed prices or cobalt sulfate payables.
These trends motivate the development of new exchange contracts for physically trading nickel and cobalt chemicals that can enable price discovery separate from the metals markets. There is also a need to develop exchange contracts for materials like lithium and graphite with immature markets that exhibit significant volatility.
However, exchange trading of these materials is complicated by their nature as specialty chemicals: they have limited shelf lives and more complex storage requirements, unlike metal commodities. Lithium and graphite products also exhibit significant variations that affect how buyers can use them. For example, depending on the types and level of impurities in lithium hydroxide/carbonate, manufacturers of cathode active materials may need to conduct different chemical processes to remove them. Offtakers may also require that products meet additional specifications based on the characteristics they need for their CAM and battery chemistries.
For these reasons, major exchanges like LME, the Chicago Mercantile Exchange (CME), and the Singapore Exchange (SGX) have instead chosen to launch cash-settled contracts for lithium hydroxide/carbonate and cobalt hydroxide that allow for financial trading, but require buyers and sellers to arrange physical delivery separately from the exchange. Large firms have begun to participate increasingly in these derivatives markets to hedge against market volatility, but the lack of physical settlement limits their utility to producers who still need to physically deliver their products in order to make a profit. Nevertheless, CME’s contracts for lithium and cobalt have seen significant growth in transaction volume. LME, CME, and SGX all use Fastmarkets’ price indexes as the basis for their cash-settled contracts.
As regional industries mature and products become more standardized, these exchanges may begin to add physically settled contracts for battery-grade critical minerals. For example, the Guangzhou Futures Exchange (GFEX) in China, where the vast majority of lithium refining currently occurs, began offering physically settled contracts for lithium carbonate in August 2023. Though the exchange exhibited significant volatility in its first few months, raising concerns, the first round of physical deliveries in January 2024 occurred successfully, and trading volumes have been substantial this year. Access to GFEX is currently limited to Chinese entities and their affiliates, but another trading platform could come to do the same for North America over the next few decades as lithium production volume grows and a spot market emerges. Abaxx Exchange, a Singapore-based startup, has also launched a physically settled futures contract for nickel sulfate with delivery points in Singapore and Rotterdam. A North American delivery point could be added as the North American supply chain matures.
No market exchange for graphite currently exists, since products in the industry vary even more widely than other materials. Even the currently available price indexes are not seen as sufficiently robust for offtake pricing.
Auctions
In the absence of a globally accessible market exchange for lithium and given concerns about the transparency of index pricing, Albemarle, the top producer of lithium worldwide, has turned to auctions of spodumene concentrate and lithium carbonate as a means to improve market transparency and an "approach to price discovery that can lead to fair product valuation." Albemarle's first auction, held in March for spodumene concentrate in China, closed at a price of $1,200/ton, which was in line with spot prices reported by Asian Metal, but about 10% greater than prices provided by other price reporting agencies like Fastmarkets. Plans are in place to continue conducting regular auctions at a rate of about one per week in China and other locations like Australia. Lithium hydroxide will be auctioned as well. Auction data will be provided to Fastmarkets and other price reporting agencies to be formulated into publicly available price indexes.
Auctions are not a new concept: in 2021 and 2022, Pilbara Minerals regularly conducted auctions of spodumene on its own platform, Battery Metals Exchange, helping to improve market sentiment. Now, though, the company says that most of its material is committed to offtakers, so auctions have mostly stopped, though it did hold an auction for spodumene concentrate in March. If other lithium producers join Albemarle in conducting auctions, the data could help improve the accuracy and transparency of price indexes. Auctions could also be used to inform the pricing of other battery-grade critical minerals.
Policy Solutions to Support Price Discovery and Transparency Across the Market
Right now, the only pricing mechanisms available to domestic project developers are spot price indexes for battery-grade critical minerals in Asia or global benchmarks for proxies like nickel and cobalt metal. Long-term, the development of new pricing mechanisms for North America will be crucial to price discovery and transparency in this new market. There are two ways that DOE could help facilitate this: one that could be implemented immediately for some materials and one that will require domestic production volume to scale up first.
First. Government-Backed Auctions: Auctions require project developers to keep a portion of their expected output uncommitted to any offtakers. However, there is a risk that future auctions won't generate a price sufficient to offset capital and operating expenses, so processors are unlikely to do this on their own, especially for their first domestic project. MESC could address this by providing a backstop guarantee for the portion of a producer's output that they commit to regularly auctioning for a set timespan. If, in the future, auctions are unable to generate a price above a pre-negotiated price floor, then DOE would pay sellers the difference between the price floor and the highest auction price for each unit sold. Such an agreement could be made using DOE's Other Transaction Authority. DOE could separately contract with a platform such as MetalsHub to conduct the auction.
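The sketch below illustrates how such a backstop-auction top-up could be computed. The bids, auctioned volume, and price floor are hypothetical values chosen for illustration, not actual program parameters.

```python
# Illustrative sketch of a backstop-auction top-up payment.
# Bids, volume, and the price floor are hypothetical, not program parameters.

def auction_backstop(price_floor: float, bids: list, units_auctioned: float):
    """The lot clears at the highest bid; if that clearing price is below the
    pre-negotiated floor, DOE tops up the per-unit difference."""
    clearing_price = max(bids)
    top_up = max(0.0, price_floor - clearing_price) * units_auctioned
    return clearing_price, top_up

# Hypothetical: $14,000/t floor, best bid $12,500/t, 500 t auctioned
# -> the producer receives $12,500/t from the buyer plus a $750,000 DOE top-up.
print(auction_backstop(14_000, [11_800, 12_500, 12_200], 500))  # (12500, 750000.0)
```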
Government-backed auctions would enable the discovery of a true North American price for different battery-grade critical minerals and the raw materials used to make them, generating a useful comparison point with Asian spot prices. Such a scheme would also help address developers’ price and demand needs for project financing. These backstop-auction agreements could be complementary to the other types of backstop agreements proposed earlier and potentially more appealing than physically offtaking materials since the government would not have to receive delivery of the materials and there would be a built-in mechanism to sell the materials to an appropriate buyer. If successful, companies could continue to conduct auctions independently after the agreements expire.
Second. New Benchmark Contracts: Employ America has proposed that the Loan Programs Office (LPO) could use Section 1703 to guarantee lending to a market exchange to develop new, physically settled benchmark contracts for battery-grade critical minerals. The development of new contracts should include producers in the entire North American region. Canada also has a significant number of mines and processing plants in development. Including those projects would increase the number of participants, market volume, and liquidity of new benchmark contracts.
In order for auctions or new benchmark contracts to operate successfully, three prerequisites must be met:
- There must be a sufficient volume of materials available for sale (i.e. production output that is not committed to an offtaker).
- There must be sufficient product standardization in the industry such that materials produced by different companies can be used interchangeably by a significant number of buyers.
- There must be a sufficient volume of demand from buyers, brokers, and traders.
Market exchanges typically conduct research into stakeholders to understand whether or not the market is mature enough to meet these requirements before they launch a new contract. Interest from buyers and sellers must indicate that there would be sufficient trading volume for the exchange to make a profit greater than the cost of setting up the new contract. A loan from LPO under Section 1703 can help offset some of those upfront costs and potentially make it worthwhile for an exchange to launch a new contract in a less mature market than they typically would.
Government-backed auctions, on the other hand, solve the first prerequisite by offering guarantees to producers for keeping a portion of their production output uncommitted. Product standardization can also be less stringent, since each producer can hold separate auctions, with varying material specifications, unlike market exchanges where there must be a single set of product standards.
Given current market conditions, no battery-grade critical minerals can meet the above prerequisites for new benchmark contracts, primarily due to a lack of available volume, though there are also issues with product standardization for certain materials. However, nickel, cobalt, lithium, and graphite could be good candidates for government-backed auctions. DOE should start engaging with project developers that have yet to fully commit their output to offtakers and gauge their interest in backstop-auction agreements.
Nickel and Cobalt
As discussed above, there are only a handful of nickel and cobalt sulfate refineries currently being developed in North America, making it difficult to establish a benchmark contract for North America. None of the project developers have yet signed offtake agreements covering their full production capacity, so backstop-auction agreements could be appealing to project developers and their investors. Given that more than half of the projects in development are located in Canada, MESC and DOE's Office of International Affairs should collaborate with the Canadian government in designing and implementing government-backed auctions.
Lithium
Domestic companies have expressed interest in establishing North American-based spot markets and price indexes for lithium hydroxide and carbonate, but say that it will take several years before production volume is large enough to warrant them. Product variation has also been a concern from lithium processors when the idea of a market exchange or public auction has been raised. Lessons could be learned from the GFEX battery-grade lithium carbonate contracts. GFEX set standards on purity, moisture, loss on ignition, and the maximum content of different impurities. Some Chinese companies were able to meet these standards, while others were not, preventing them from participating in the futures market or requiring them to trade their materials as lower-purity industrial-grade lithium carbonate, which sells for a discounted price. Other companies, producing lithium of much higher quality than the GFEX standards, opted to continue selling on the spot market because they could charge a premium on the standard price. Despite some companies choosing not to participate, trading volumes on GFEX have been substantial, and the exchange was able to weather initial concerns of a short squeeze, suggesting that challenges with product variation can be overcome through standardization.
Analysts have proposed that spodumene could be a better candidate for exchange trading, since it is fungible and does not have the limited shelf-life or storage requirements of lithium salts. 60% of global lithium comes from spodumene, and the U.S. has some of the largest spodumene deposits in the world, so spodumene would be a good proxy for lithium salts in North America. However, the two domestic developers of spodumene mines are planning to construct processing plants to convert the spodumene into battery-grade lithium on-site. Similarly, the two Canadian mines that currently produce spodumene are also planning to build their own processing plants. These vertical integration plans mean that there is unlikely to be large amounts of spodumene available for sale on a market exchange in the near future.
DOE could, however, work with miners and processors to sign backstop-auction agreements for smaller amounts of lithium hydroxide/carbonate and spodumene that they have yet to commit to offtakers. This may be especially appealing to companies that have announced delays to project development due to current low market prices, and it could help derisk moving those timelines forward. Interest in these future auctions could also help gauge the potential for developing new benchmark contracts for lithium hydroxide/carbonate further down the line.
Graphite
Natural and synthetic graphite anode material products currently exhibit a great range of variation and insufficient product standardization, so a market exchange would not be viable at the moment. As the domestic graphite industry develops, DOE should work with graphite anode material producers and battery manufacturers to understand the types and degree of variations that exist across products and discuss avenues towards product standardization. Government-backed auctions could be a smaller-scale way to test the viability of product standards developed from that process, perhaps using several tiers or categories to group products. Natural and synthetic graphite would have to be treated separately, of course.
Conclusion
The current global critical minerals supply chain partially reflects the results of over a decade of focused, industrial policies implemented by the Chinese government. If the U.S. wants to lead the clean energy transition, critical minerals will also need to become a cornerstone of U.S. industrial policy. Developing a robust North American critical minerals industry would bolster U.S. energy security and independence and ensure a smooth energy transition.
Promising progress has already been made in lithium, with planned processing capacity expected to meet demand from future battery manufacturing. However, market and pricing challenges remain for battery-grade nickel, cobalt, and graphite, which will fall far short of future demand without additional intervention. This report proposes that DOE take a two-pronged approach to supporting the critical minerals industry through offtake backstops, which address project developers’ current pricing dilemmas, and the development of more reliable and transparent pricing mechanisms such as government-backed auctions, which will set up markets for the future.
While the solutions proposed in this report focus on DOE as the primary implementer, Congress also has a role to play in authorizing and appropriating new funding necessary to execute a cohesive industrial strategy on critical minerals. The policies proposed in this report can also be applied to other critical minerals crucial for the energy transition and our national security. Similar analysis of other critical minerals markets and end uses should be conducted to understand how these solutions can be tailored to those industry needs.
Building a Whole-of-Government Strategy to Address Extreme Heat
Comprehensive recommendations from +85 experts to enable a heat-resilient nation
From August 2023 to March 2024, the Federation of American Scientists (FAS) talked with +85 experts to source 20 high-demand opportunity areas for ready policy innovation and 65 policy ideas. In response, FAS recruited 33 authors to work on +18 policy memos through our Extreme Heat Policy Sprint from January 2024 to April 2024, generating an additional +100 policy recommendations to address extreme heat. Our experts’ full recommendations can be found here. In total, FAS has collected +165 recommendations for 34 offices and/or agencies. Key opportunity areas are described below and link out to a set of featured recommendations. Find the 165 policy ideas developed through expert engagement here.
America is barreling toward another record-hot summer. While we wait for a national strategy, states, counties, and cities around the country have taken up the charge of addressing extreme heat in their communities and are experimenting on the fly. California has announced $200 million to build resilience centers that protect communities from extreme heat and has created an all-of-government action plan to address extreme heat. Arizona, New Jersey, and Maryland are all actively developing extreme heat action plans of their own. Miami-Dade County considered passing some of the strictest workplace heat rules (although the measure ultimately failed). Additionally, New York City and Los Angeles have driven cool roof adoption through funding programs and local ordinances, which can reduce energy demands, improve indoor comfort, and potentially lower local outside air temperatures.
While state and local governments can make significant advances, national extreme heat resilience requires a “whole of government” federal approach, as it intersects health, energy, housing, homeland and national security, international relations, and many more policy domains. The federal government plays a critical role in scaling up heat resilience interventions through research and development, regulations, standards, guidance, funding sources, and other policy levers. But what are the transformational policy opportunities for action?
Sourcing Opportunities and Ideas for Policy Innovation
During Fall 2023, FAS engaged +85 experts in conversations around federal policies needed to address extreme heat. Our stakeholders included: 22 academic researchers, 33 non-profit organization leaders, 12 city and state government employees, 3 private company leaders, 2 current or former Congressional staffers, 3 National Labs leaders, and 10 current or former federal government employees. Our conversations were guided by the following four questions:
- What work are you currently doing to address extreme heat?
- What do you see as some of the opportunity areas to address extreme heat?
- What are the existing challenges to managing and responding to extreme heat?
- What actions should the federal government take to address extreme heat?
Our conversations with experts sourced 20 high-demand opportunity areas for policy innovation and 65 policy ideas. To go deeper, FAS recruited 33 authors to work on +18 policy memos through our Extreme Heat Policy Sprint, generating an additional +100 policy recommendations to address extreme heat’s impacts and build community resilience. Our policy memos from the Extreme Heat Policy Sprint, published in April 2024, provide a more comprehensive dive into many of the key policy opportunities articulated in this report. Overall, FAS’ work scoping the policy landscape, understanding the needs of key actors, identifying demand signals, and responding to these demands has generated +165 policy recommendations for 34 offices and/or agencies.
Opportunities for Extreme Heat Policy Innovation
The following 20 “opportunity areas” are not exhaustive, yet can serve as inspiration for the building blocks of a future strategic initiative.
Facilitate Government-Wide Coordination
The first opportunity is an overarching call to action: the need for a government-wide extreme heat strategic initiative. This can build upon the National Integrated Heat Health Information System's (NIHHIS) National Heat Strategy, set to release this year. This strategy would define the problems to solve, create targets and galvanizing goals, set and assign priorities for federal agencies, review available resources for financial assistance, assess regulatory and rulemaking authority where applicable, highlight legislative action, and include evaluation metrics and a timeline for review, adjustment, and renewal of programs. In creating this strategy, one interviewee recommended there should be a comprehensive review of "heat exposure settings" and federal actors that can safeguard Americans in these settings: homes, workplaces, schools and childcare facilities, transit, senior living facilities, correctional facilities, and outdoor public spaces. Through scoping potential regulations, standards, guidelines, planning processes, research agendas, and financial assistance, the federal government will then be prepared to support its intergovernmental actors and communities.
Infrastructure and the Built Environment
Accelerate Resilient Cooling Technologies, Building Codes, and Urban Infrastructure
On average, Americans spend 90% of their time indoors, making the built environment a critical site for heat exposure mitigation. To keep cool, especially in parts of the U.S. not accustomed to extreme heat, buildings are increasingly reliant on mechanical cooling interventions. While a life-saving necessity, air conditioning (AC) consumes significant amounts of electricity, putting high demands on aging grid infrastructure during the hottest days. Excess heat from air conditioners can lead to higher outdoor temperatures and even more AC demand. Finally, ACs are useless if there's no power, an increasing risk due to growing energy poverty and grid failure. In these scenarios, much of our current building stock is likely to fail at keeping residents cool.
Resilient cooling strategies, like high-efficiency cooling systems, demand-response systems, and passive cooling interventions, need policy actions to rapidly scale for a warming world. For example, cool roofs, walls, and surfaces can keep buildings cool and less reliant on mechanical cooling, but are often not considered a part of weatherization audits and upgrades. District cooling, such as through networked geothermal, can keep entire neighborhoods cool while relying on little electricity, but is still in the demonstration project phase in the United States. Heat pumps are also still out of reach for many Americans, making it essential to design technologies that work for different housing types (i.e. affordable housing construction). Initiatives like the Department of Energy's (DOE) Affordable Home Energy Shot can bring these technologies into reach for millions of Americans, but only if given sufficient financial resources. The FY25 budget request from DOE's Office of Clean Energy Demonstrations and Office of State and Community Energy Programs to strengthen heat resilience in disadvantaged communities through energy solutions could be a step towards realizing innovative heat technologies. Further, the Environmental Protection Agency's Energy Star program can incentivize low-power and resilient cooling technologies — if rebates are designed to take advantage of these technologies.
Thermal resilience of buildings must also be considered, for both day-to-day operations and emergency blackout scenarios. DOE can work with stakeholders to create "cool" building standards and metrics with human health and safety in mind, and integrate them into building codes like ASHRAE 189.1 and the 90 series. These codes are "win-wins" for building designers, creating buildings that consume far less electricity while keeping inhabitants safe from the heat. DOE can also assist in conducting more demonstration projects for building strategies that ensure indoor survivability in both everyday and extreme conditions.
Intervention efficacy and applicability are still evolving for community-scale heat resilience interventions, such as cool pavements, urban greening, shading, ventilation corridors, and development regulations (e.g., solar orientation). Individual interventions and their interactions need more evidence on costs and benefits, potential tradeoffs, and maladaptations. The National Institute of Standards and Technology works on building and urban planning standards for other natural hazards, such as through its National Windstorm Impact Reduction Program (NWIRP) and its Community Resilience program, and could serve as a "technology test-bed" for heat resilience practices, advancing our understanding of their effectiveness as well as how to measure and account for benefits and costs. This could be done in partnership with the National Science Foundation, which has been dedicating funding to use-inspired research and technology development for climate resilience.
Finally, the U.S. government is the largest landlord in the nation. As the General Services Administration rapidly decarbonizes its buildings, it can also serve as a test site for new technologies, building designs, planning approaches, and resilience metrics development and analysis.
Adapt Transportation to the Heat
Public transportation is a site of high exposure to extreme heat. While the Department of Transportation's Promoting Resilient Operations for Transformative, Efficient, and Cost-saving Transportation (PROTECT) grants are intended for "surface transportation resilience," several of our local and regional government interviewees described difficulty successfully applying to these grants for "cooling" infrastructure, like water fountains, shade, and air-conditioned bus shelters. DOT should make extreme heat resilience explicit in its eligibility requirements and review how the benefit-cost analysis (BCA) formula might disadvantage cool infrastructure.
Asphalt and concrete roadways contribute to the urban heat island effect, and hotter weather makes asphalt in particular more vulnerable to cracking. DOT should leverage its research and development (R&D) capabilities to develop and deploy reflective and cool materials as part of transportation infrastructure improvements. DOT should also consider the levers available to incentivize cool surfaces and cool materials in transportation construction.
Create More Heat-Resilient Schools for Sustained Learning
Higher temperatures combined with minimal or no air conditioning in older school buildings have led to an increase in the number of "heat days," or school closures due to dangerous temperatures. Pulling children out of the classroom not only harms them academically, but also strains families that rely on schools for childcare. Even when school is in session, many students are attempting to learn in classrooms exceeding 80°F, a threshold above which studies have repeatedly shown students struggle to learn and fall short of their academic potential. Heat reduces cognitive function and the ability to concentrate, both essential to learning. Learning loss from rising heat will only compound the learning losses from the COVID-19 pandemic. The Environmental Protection Agency predicts that the total lost future income attributable to heat-related learning losses may reach $6.9 billion at 2°C of warming (a threshold we are well on the way to meeting) and $13.4 billion at 4°C. Schools need guidance on how to deal with the heat crisis at hand, along with support to plan the climate adaptations needed for a hotter world.
At a minimum, schools should be encouraged to formalize heat preparedness plans that protect both students' health and their learning. No federal heat safety recommendations for schools yet exist; they will need to be created by the Department of Education (Ed), EPA, FEMA, the National Oceanic and Atmospheric Administration (NOAA), and others. Title I grants, in alignment with Justice40, could then assist schools in adapting to climate change, including research-based guidance on ways to cool students indoors, outdoors, and through behavioral management. Further, school system leaders need a better system for tracking how schools are currently experiencing extreme heat and which strategies could be employed in response (closing schools, informed behavioral interventions to manage heat exposure, green infrastructure to build resilience, etc.). Federal involvement is essential for creating this tool. Finally, to address the root causes of excessive classroom heat, schools will need to transform their infrastructure through HVAC investments and improvements, greening, playground material changes, and shading. HVAC costs alone are expected to total $40 billion across all U.S. schools needing infrastructure improvements. While Inflation Reduction Act (IRA) tax credits are available for updating HVAC systems, many low-wealth schools will not be able to finance the gap between the credit coverage and the true cost and will need additional financial assistance.
Make Housing and Eviction Policy More Climate-Aware and Resilient
Most of the U.S. has no minimum cooling requirements for buildings and no requirement that a cooling device even exist within a property. Adoption of the latest building energy codes, despite their previously described limitations, can still be a cost-saving and life-saving advancement, according to research by DOE. For new properties, the Federal Housing Finance Agency could require adherence to the latest energy codes as a condition of receiving a mortgage from the Government Sponsored Enterprises, a requirement already under consideration by Housing and Urban Development (HUD) and the U.S. Department of Agriculture (USDA) for their mortgage products. For older construction, adequate cooling could be required at the point of sale.
For all property types, weatherization audits through the Weatherization Assistance Program (WAP) and the Low-Income Home Energy Assistance Program (LIHEAP) can be expanded to consider a property's heat resilience and cooling efficiency and then identify upgrades such as more efficient HVAC, building envelope improvements, cool roofs, cool walls, shade, and other infrastructure. If cooling an entire property is infeasible or too costly, homeowners could benefit from creating "climate safe rooms" that are guaranteed to remain safe during a heat wave. DOE and HUD could collaborate to demonstrate climate safe rooms in affordable housing, where many residents lack access to consistent cooling.
Some housing types are riskier than others. In Arizona, people living in manufactured homes have been 6 to 8 times more likely to die indoors from extreme heat, a result of poorly functioning or completely defunct cooling systems and/or inability to pay electric bills. Manufactured home park landlords can also set a variety of rules for homeowners, including banning cooling devices like window ACs and shade systems. While states like Arizona have now outlawed these bans, there is a need for a nationwide policy securing access to cooling. HUD does not regulate manufactured home parks, but it does finance them through Section 207 mortgages and could stipulate that park owners guarantee resident safety. Finally, HUD could update the Manufactured Home Construction and Safety Standards so that HVAC and other cooling regulations in local building codes apply to manufactured homes, as they do to other forms of housing, and require that homes perform to a certain cooling level under high heat conditions.
Renter’s are another highly vulnerable population. Most states do not require landlords to provide cooling devices to tenants or keep housing below risky temperatures. HUD for example does not require cooling devices in public housing, although regulations exist for heating. HUD could implement similar guarantees of a “right to cool”. Evictions in the summer months are also on the rise, due to rising rents compounded with rising energy costs, putting people out in the deadly heat. Keeping people in housing should be of the utmost importance, yet implementation remains fractured across the nation. Eviction moratoriums at a national level have been challenged by the Supreme Court, which overturned the CDC’s COVID-19 moratorium.
Address Communities’ Needs for Long-Term Infrastructure Funding Support
Heat vulnerability mapping has advanced significantly in the past few years. Federal programs like NIHHIS's Urban Heat Island Mapping Campaigns have mapped more than 60 communities in the United States, and the results have guided city policy. The Census Bureau's new product, Community Resilience Estimates (CRE) for Heat, assesses vulnerability at the level of individuals and households. Finally, researchers and nonprofit organizations have been developing tools that assess risk and aid individual or local decision-making, such as the Climate Health and Risk Tool and Heat Factor®.
Advancements in our understanding of heat's impacts and potential interventions have not translated into sustained resources for transformative infrastructure development. As one interviewee put it, "communities that have mapped their urban heat islands are still waiting on funding opportunities to build relevant infrastructure projects." Federal grants for mitigation and resilience may or may not consider heat resilience projects "cost-effective" and aligned with grant-making objectives, leading to rejection.
FEMA’s Hazard Mitigation Grants (HMGP), made available only after a federally-declared disaster, can only be used for extreme heat in specific circumstances and recommends that cost-effective heat mitigation projects will also “reduce risks of other hazards”. Another example, FEMA’s BRIC grant has rejected cooling centers, HVAC upgrades, and weatherization activities, all strategies with some benefit to preventing morbidity and mortality. Green infrastructure projects, with co-benefits such as flood mitigation, have been more successful, often because the BCA is based on the property-damaging hazard, flooding. Only one FEMA BRIC project has been funded with heat as the main hazard, an urban greening project in Portland, Oregon. This unknown regarding grant success can lead to communities not applying with a heat-focused project, when time could be better spent securing grants for other community priorities. FEMA’s announcement that it will fund net-zero projects, including passive heating and cooling, through its HMGP and BRIC programs and Public Assistance could shift the paradigm, yet communities will likely need more guidance and technical assistance to execute these projects.
To invest in resilience to the growing risk of heat, policymakers will need to create a dedicated and reliable funding resource. Federal stakeholders can look to the states for models. California’s Integrated Climate Adaptation and Resiliency Program’s Extreme Heat and Community Resilience grants are currently slated to allocate $118 million to 20-40 communities for planning and implementation grants over three rounds. To start, FEMA could replicate this program, similar to its specific programs for wildfires, providing $50,000 to $5 million to a wide range of heat resilience projects, and make it eligible for joint funding through BRIC. DOE’s $105 million FY25 budget request for a program for planning, development, and demonstration of community-scale solutions to mitigate extreme heat in low-income communities is a step in the right direction. If funded, the program would benefit from coordinating with FEMA’s BRIC program on high-impact solutions.
Workforce Safety And Development
Set Indoor and Outdoor Temperature Standards and Workplace Protections to Protect Human Health
Our understanding of when heat becomes risky to human health and begins to affect daily governance is still developing. Our interviewees shared that there is not yet consensus on 1) the threshold at which outdoor and indoor temperature risks begin and 2) the level of continued exposure that should trigger action, such as implementing breaks for workers or deploying rapid emergency cooling to residents. For workplaces, guidelines are coming soon: the Occupational Safety and Health Administration (OSHA) is set to release its heat standard for indoor and outdoor workers by the end of 2024, which will advance heat safety for workers across the country. For all other settings (such as residences and schools), the jury is still out on a valid threshold and a regulatory mechanism to establish it.
Enforcement of standards is necessary to realize their full potential. In preparation for a workplace heat standard, interviewees recommended that the Department of Labor create an advanced Hazard Alert System for Heat (using an evolved data standard discussed in a later section) to better target regulatory enforcement. Small businesses will also need help preparing to comply with the new standard. DOL and the Small Business Administration should consider setting up a navigator program for resourcing energy-efficient, worker-centric cooling strategies, leveraging IRA funds where applicable.
Build the Extreme Heat Resilience Workforce
Extreme heat is not just a challenge to worker health; it is also a challenge to workforce ability and capacity. As heat becomes a threat to the entire nation, many fields need to rapidly adapt to entirely new knowledge bases. For example, much of the health workforce (doctors, nurses, public health workers) receives little to no education on climate change and its health impacts. Programs are beginning to crop up, such as Harvard's C-Change program, yet they will need support to scale. With the federal government being the nation's largest single funder of graduate medical education, there are many levers at its disposal to develop, incentivize, and even require climate and health education. The U.S. Public Health Service Commissioned Corps is another program that could mobilize a climate-aware health workforce, placing professionals with a deep awareness of climate change's impact on health in local communities.
The weatherization and decarbonization workforce must also be made aware of heat's growing impacts and ready to deploy emerging strategies for building- and community-scale resilience. While promising strategies exist for heat mitigation, such as cool walls and roofs, these interventions are largely not considered during weatherization and energy efficiency audits. Tax credits created by the IRA and the Bipartisan Infrastructure Law (BIL) could be used for passive or low-energy cooling interventions, yet a lack of clarity prevents their uptake. For example, EPA's Energy Star program certified roofing products before that certification sunsetted in 2022. Stakeholders at DOE and EPA should consider their role in workforce readiness for extreme heat, collaborating with third-party entities to build awareness of these promising strategies.
Navigating all of the benefits of the IRA and BIL is challenging for resource-strapped communities and households. Program navigators for weatherization assistance and resilience could be an incredible asset to low-resource communities and could draw on IRA technical assistance resources as well as the newly created American Climate Corps.
Finally, the federal government workforce is being stretched thin by the sheer number of new mandates in the IRA and BIL. To meet the moment, agencies have used flexible hiring mechanisms like Intergovernmental Personnel Act (IPA) assignments and, for some offices, BIL- and IRA-connected Direct Hire Authority to make critical talent decisions and staff their agencies. DOE, for example, has exceeded its goals, hiring over 1,000 new employees to date. But not all agencies and offices have access to Direct Hire Authority, and it is set to expire between 2025 (for IRA) and 2027 (for BIL). Congress should be encouraged to expand this authority, extend it beyond 2025 and 2027 respectively, and remove the limit on the number of staff allowed. Further, agencies should be encouraged to use other flexible hiring mechanisms like IPAs and other term positions. The federal government should have the talent needed to meet its current mandates and be prepared to solve problems like extreme heat.
Public Health, Preparedness, And Health Security
Build Healthcare System Preparedness
Years of underinvestment in preparedness have degraded U.S. health infrastructure's surveillance, data collection, and workforce capacity to respond to emerging climate threats like extreme heat. The Administration for Strategic Preparedness and Response's Hospital Preparedness Program, which prepares healthcare systems for emergencies, saw its budget reduced by 67% from FY2002 to FY2022, adjusting for inflation. Further, the Centers for Disease Control and Prevention (CDC) has seen a 20% budget reduction over the same period. The CDC's Climate Ready States and Cities Initiative can support only nine states, one city, and one county, despite 40 jurisdictions having applied. The Trust for America's Health (TFAH) found that increasing funding from $10 million to $110 million is required to support all states and improve climate surveillance. TFAH also found that an additional $75 million is needed to extend the CDC's National Environmental Public Health Tracking Program, which tracks threats and plans interventions, to every state. Finally, the Office of Climate Change and Health Equity, the sole office within Health and Human Services dedicated solely to the intersection of climate and health, has yet to receive direct appropriations to support its work.
The Centers for Medicare & Medicaid Services (CMS) and the Health Resources and Services Administration (HRSA) provide critical investments in healthcare facilities, operations, care provision, and the medical workforce, yet have no publicly available programs dedicated to building climate resilience in the face of rising temperatures. The Veterans Health Administration (VHA), the largest integrated healthcare system in the U.S., includes responding to heat wave exposure in its agency Climate Action Plan and has committed to developing biosurveillance systems that incorporate external data on air quality, temperature, heat index, and weather, as well as to upgrading medical center infrastructure. This is critical, as 62% of VHA medical centers are exposed to extreme heat and the VHA is seeing a rise in heat-related illness in the Veteran population. Given its sheer size, system changes like these at the VHA can drive real change in healthcare practice.
To build extreme heat resilience within healthcare systems, our interviews and literature review highlighted three actions as most critical: 1) increasing surveillance and tracking of heat-related illness through improvements to medical diagnosis and coding practices and to technological systems (e.g., electronic health records); 2) leveraging healthcare financing for preventative treatments (e.g., cooling devices), incentives for climate-change preparedness, accurate coding and treatment, quality care delivery, and requirements for accreditation and reimbursements; and 3) fostering capacity building through grants, technical assistance, planning support and guidance, and emergency preparedness.
Design Activation Thresholds for Public Health, Medical, and Emergency Responses
Despite the fact that extreme heat events have overwhelmed local capacity and triggered local disaster declarations, heat is not explicitly required in healthcare preparedness efforts authorized under the Pandemic and All-Hazards Preparedness Act (PAHPA), is insufficiently included (or not included at all) in the local and state hazard mitigation plans required by FEMA, and has never been the subject of a federal disaster declaration. This inhibits the deployment of federal resources for mitigation, planning, and response that states and local jurisdictions rely on for other hazards. Our interviewees recommended better "activation thresholds" for heat, i.e., markers that the hazard has reached a level of impact requiring additional capacity and resources. Most thresholds set today rely only on high temperatures, not the risk factors that exacerbate heat's impacts. Data inputs into locally relevant thresholds can include wet-bulb globe temperature (which accounts for humidity), heat stress risk, level of acclimatization, nighttime temperatures, building conditions and cooling device uptake, work situations, compounding health risks like wildfire smoke, and other factors, as sketched below. These activation thresholds should also be designed around the most heat-vulnerable populations, such as children, the elderly, pregnant people, and those with comorbidities.
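To make the idea of a composite activation threshold concrete, the sketch below combines an outdoor wet-bulb globe temperature (WBGT) reading with a few of the community risk factors named above. The WBGT weighting is the standard outdoor formula; every numeric cut-off, field name, and the `should_activate` helper are hypothetical placeholders for illustration, not recommended or federally established thresholds.

```python
# Illustrative sketch: combining weather readings and community risk factors
# into a local "activation threshold" for extreme heat response.
# The 0.7/0.2/0.1 WBGT weighting is the standard outdoor formula; every other
# threshold below is a hypothetical placeholder, not a federal standard.

from dataclasses import dataclass


def outdoor_wbgt(t_wet_bulb_c: float, t_globe_c: float, t_dry_bulb_c: float) -> float:
    """Outdoor wet-bulb globe temperature in degrees C."""
    return 0.7 * t_wet_bulb_c + 0.2 * t_globe_c + 0.1 * t_dry_bulb_c


@dataclass
class CommunityRisk:
    nighttime_low_c: float   # overnight minimum temperature (no relief if high)
    share_without_ac: float  # fraction of households lacking working cooling
    wildfire_smoke: bool     # compounding air-quality hazard


def should_activate(wbgt_c: float, risk: CommunityRisk) -> bool:
    """Return True when the illustrative triggers suggest surging response capacity."""
    if wbgt_c >= 32.0:  # severe heat stress for most people (placeholder cut-off)
        return True
    if wbgt_c >= 29.0 and risk.nighttime_low_c >= 26.0:  # hot days with no overnight relief
        return True
    if wbgt_c >= 29.0 and (risk.share_without_ac > 0.25 or risk.wildfire_smoke):
        return True
    return False


if __name__ == "__main__":
    wbgt = outdoor_wbgt(t_wet_bulb_c=27.0, t_globe_c=45.0, t_dry_bulb_c=38.0)
    risk = CommunityRisk(nighttime_low_c=27.5, share_without_ac=0.30, wildfire_smoke=False)
    print(f"WBGT = {wbgt:.1f} C, activate = {should_activate(wbgt, risk)}")
```

The design point is simply that temperature alone is one input among several; the same WBGT reading could warrant activation in one community and not another, depending on cooling access, overnight relief, and compounding hazards.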
Increased transmission and spread of viral pathogens is also a growing risk of hotter average temperatures that needs more attention. Increased pathogen surveillance, correlated with prevailing climate conditions, would greatly strengthen U.S. pandemic and endemic disease surveillance. Finally, no program to date at the Biomedical Advanced Research and Development Authority has focused on creating climate-aware medical countermeasures, and its 2022-2026 strategic plan includes no mention of climate change.
Reduce Energy Burdens, Utility Insecurity, and Grid Insecurity
As temperatures rise, so do energy bills, and Americans are facing an ever-growing burden of energy debt. 16% of U.S. households (20.9 million) are behind on their energy bills, increasing the risk of utility shut-offs for non-payment. LIHEAP exists to relieve energy burdens, yet it was designed primarily for heating assistance, so its formulas advantage states with historically frigid climates. Further, most states use their LIHEAP budgets for heating first, leaving what remains for cooling assistance (or not offering cooling assistance at all). As a result, only 5% of energy assistance nationwide went to cooling from 2001 to 2019. Finally, LIHEAP is massively oversubscribed and can serve only a fraction of families in need. To adapt to a hotter world, LIHEAP's budget must increase, and its allocation formulas will need to be made more "cooling"-aware and equitable for hot-weather states. The FY25 presidential budget keeps LIHEAP's funding level at $4.1 billion while also proposing to expand eligible activities that will draw on available resources. A recent analysis by the National Energy Assistance Directors Association found that this funding level could cut roughly 1.5 million families from the program and cut program benefits like cooling assistance.
Another key issue is that 31 states have no policy preventing energy shut-offs during excessive heat events, and even the states that have policies vary widely in their cut-off points. These policies are all set at the state level, and there is an ongoing need to identify best practices that save lives. While the Public Utility Regulatory Policies Act of 1978 (PURPA) prohibits electric utilities from shutting off home electricity for overdue bills when doing so would endanger someone's health, it does not include explicit protections for extreme weather (hot or cold). Reforms to PURPA could be considered that require utilities to place moratoriums on energy shut-offs during extreme heat seasons.
Finally, grid resilience will become even more essential in a hotter climate. Power outages and blackouts during extreme heat events are deadly: if a blackout were to occur in Phoenix, Arizona during the summer, nearly 900,000 people would need immediate medical attention. Rising use of AC is itself a risk factor for blackouts because of increased energy demand. The North American Electric Reliability Corporation (NERC), a regulatory organization that works to reduce risks to power grid infrastructure, has issued a dire warning that two-thirds of the U.S. faces reliability challenges because of heat waves. Ensuring grids are ready for the climate to come should be a top priority for DOE, the Federal Emergency Management Agency (FEMA), and the Federal Energy Regulatory Commission (FERC). Given the risks to human health, the CDC should work with public health organizations to prepare for blackouts and grid failure events.
Address Critical Needs of Confined Populations Facing Heat
Confined populations, whether confined because of medical status or legal status, are vulnerable to extreme heat indoors. Long-term care facilities are required by law to keep indoor temperatures between 71°F and 81°F. Yet long-term care facilities report challenges actually meeting residents' needs in a disaster, such as a power outage, pointing to a need for more coordination with CMS.
Incarcerated populations, on the other hand, are not guaranteed any cooling, even as summers become more brutal. This directly increases deaths: 45% of U.S. detention facilities saw spikes in deaths on hazardous heat days between 1982 and 2020. Despite arguments that this lack of sufficient cooling constitutes "cruel and unusual" punishment, there has been no public activity to date from the Department of Justice to secure cooling infrastructure for federal prisons or to work with state prisons to expand cooling infrastructure. The National Institute of Corrections does recommend the ASHRAE 55 Thermal Environmental Conditions for Human Occupancy standard to corrections institutions, though this standard needs to be updated to reflect our evolving understanding of extreme heat's risks to human health.
Food Security And Multi-Hazard Resilience
Anticipate and Prevent Supply Chain Disruptions
Hotter temperatures are changing the landscape of American and global food production: 70% of global agriculture is expected to be affected by heat stress by 2045. Recent heat waves have already killed crops and livestock en masse, leading to lower yields and even shortages of certain products, such as olive oil, potatoes, coffee, rice, and fruits. Rising heat is also poised to reshape local and state economies that rely on their climatic capacity to produce certain crops. Oranges, a $5 billion industry for Florida, are struggling in the heat, which stresses the trees and provides fertile ground for pathogens; as a result, Florida is facing its worst citrus yield since the Great Depression. A decrease in winter chill is another growing risk, as many perennial crops have adapted to certain amounts of accumulated winter chill to develop and bloom; wintertime heat is shaking up plants' biological clocks, decreasing quality and yield. Overall, extreme heat is hitting American households' bottom lines in the short and long term through heat-exacerbated earning losses and spiking food prices.
Ensuring ongoing access to critical commodity and specialty agricultural products in a future of higher temperatures is a national security priority. Resilience of products to extreme heat could be included as a future requirement in the Federal Supplier Climate Risks and Resilience Rule that governs Federal Acquisition Regulations. Further, FAS' work scoping the federal landscape has shown there are few federal research and development programs, financial assistance opportunities, or incentives for heat resilience, and our interviewees concurred with that assessment. USDA can prepare farmers for future climate risks and hotter temperatures, ensuring consistent food production and reducing losses and the economic payouts required of USDA through crop insurance and disaster assistance. USDA can accelerate advances in biotechnology and genetic engineering to improve the heat resilience of agricultural products while also encouraging practices like shade, effective water management, and soil regeneration that build system-wide resilience. As Congress continues to consider reauthorizations and appropriations for the Farm Bill, it should consider fully funding the Agriculture Advanced Research and Development Authority to advance resilient agriculture R&D while also increasing funding for the USDA Climate Hubs to support the roll-out of heat resilient practices.
Connect Drought Resilience and Heat Resilience Strategies
Hotter winters have literal downstream consequences. Warming is shrinking the snowpack that feeds rivers, increasing reliance on groundwater and straining aquifers toward collapse. Warmer temperatures also cause more surface water to evaporate, leaving less to seep into the ground and replenish overstressed aquifers. Rising temperatures also mean that plants need more water, as they evapotranspire at greater rates to keep their internal temperatures in check. All of these factors compound the growing risk of drought facing American communities. Drought, now made worse by high heat, accounts for a significant portion of annual agricultural losses: 80% of the emergency disaster designations declared by USDA in 2023 were for drought and/or excessive heat. Securing access to water is an escalating challenge, and addressing it requires a national strategy that accounts for future hotter temperatures and the strain they will put on the water supplies necessary to sustain agricultural production and human habitation.
Heat and dry weather/drought also combine to create prime conditions for megawildfires. The smoke generated by these fires compounds the health impacts of extreme heat, with research showing that the concurrent effects of heat and smoke drive up hospitalizations and deaths. More funding from Congress is needed to improve wildfire forecasting and threat intelligence in this era of compounding hazards.
Planning And Response
Reform the Benefit-Cost Analysis
Benefit-cost analysis (BCA) is a critical tool for guiding infrastructure investments, yet it is not set up to account for the benefits of heat mitigation investments. When the focus of the BCA is mitigating property damage and loss of life, it discounts impacts that go beyond those damages, such as economic losses, learning losses, wage losses, and healthcare costs. Research will likely be needed to generate pre-calculated benefits for heat mitigation infrastructure, such as avoided heat illness, deaths, and wage losses and the prevention of widespread power failures (a growing risk). Further, strategies that enhance an equitable response, as articulated in the recent update to the Office of Management and Budget's Circular A-4, need to be quantified. This could include response efforts that protect the populations most vulnerable to extreme heat, such as checking in on heat-sensitive households identified by the CRE for Heat. Developing these metrics will take time and should be done in partnership with agencies like DOE, EPA, and the CDC. Finally, FEMA's BCA is often based on a single hazard, the one with the highest BCA ratio, making it harder to pursue multi-hazard resilience. FEMA should develop BCA methods that account for an infrastructure investment's contribution to community resilience across many hazards (like resilience hubs), as illustrated below.
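As a toy illustration of what a multi-hazard, heat-aware BCA could look like, the sketch below discounts annual benefit streams (including avoided illness, deaths, and wage losses) and sums them across hazards before dividing by project cost. The dollar figures, hazard categories, 3% discount rate, and the `bca_ratio` helper are hypothetical placeholders, not FEMA-approved values or methods.

```python
# Illustrative sketch: a benefit-cost ratio that counts heat-specific benefits
# (avoided illness, deaths, wage losses) alongside property-damage reduction,
# aggregated across hazards. All figures are hypothetical placeholders.

def present_value(annual_benefit: float, years: int, discount_rate: float) -> float:
    """Discount a constant annual benefit stream to present value."""
    return sum(annual_benefit / (1 + discount_rate) ** t for t in range(1, years + 1))


def bca_ratio(annual_benefits_by_hazard: dict[str, float],
              project_cost: float,
              years: int = 30,
              discount_rate: float = 0.03) -> float:
    """Benefit-cost ratio summed over every hazard the project mitigates."""
    total_pv = sum(present_value(b, years, discount_rate)
                   for b in annual_benefits_by_hazard.values())
    return total_pv / project_cost


if __name__ == "__main__":
    # Hypothetical resilience hub: cooling benefits plus flood-damage reduction.
    benefits = {
        "heat: avoided ER visits and deaths": 180_000,
        "heat: avoided wage and learning losses": 90_000,
        "flood: avoided property damage": 120_000,
    }
    ratio = bca_ratio(benefits, project_cost=4_500_000)
    print(f"Multi-hazard BCA ratio: {ratio:.2f}")  # > 1.0 means benefits exceed costs
```

The point of the sketch is structural: a project whose flood benefits alone fall short of its cost can clear the bar once monetized heat benefits are allowed to count toward the same ratio.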
Create the “Plan” for How the Federal Emergency Management Agency and Others Should Respond to an Extreme Heat Disaster
Extreme heat’s extended duration, from a few days to several months, poses a significant challenge to existing disaster policy’s focus on acute events that damage property. An acute focus on infrastructure damages by FEMA has been an insurmountable barrier to all past attempts to declare extreme heat as a disaster and receive federal disaster assistance. Because in theory, FEMA can reimburse state and local governments for any disaster response effort that exceeds local resources, including heat waves. Our interviewees acknowledged that federal recognition that heat waves are disasters will only come with extending the definition of what a disaster is.
New governance models will need to be created for climate and health hazards like extreme heat, focusing on an adaptation-forward, people-centered disaster response given the outsized impact of heat hazards on human health and economic productivity. Such a shift will challenge the federal government's existing authorities under national disaster law, the Stafford Act, which at present does not consider "human damages" beyond loss of life. As a result, we fail to capture how existing infrastructure loses critical functions during heat hazard events, such as safe learning, safe workplaces, continuity of municipal operations, and healthcare delivery, and how those failures strain or exceed local response resources. Quantifying more of these damages would create an incentive to design responses that address current impacts and to plan for and mitigate future ones.
Finally, there are high-risk heat disasters for which we need to be running planning scenarios, specifically an extended power outage in a city under high-heat conditions. A summer power outage in Phoenix would send an estimated 800,000 people to the emergency room, which would very likely overwhelm local resources and those of all surrounding jurisdictions. A power outage during an extended heat wave should be an included planning scenario for emergency management exercises led by state and local governments, and FEMA should produce a comprehensive list of everything a city needs to be prepared for a catastrophic power outage.
Spur Insurance and Financing Innovation
While insurance is the country's largest industry, few insurance products and services exist in the U.S. to cover losses from extreme heat. The U.S. Department of the Treasury recently acknowledged this lack of comprehensive insurance for extreme heat's impacts in its report on how climate change worsens household finances. Heat insurance for individuals could take a variety of forms: protection from utility cost spikes during extreme weather events, real estate assessment and scoring for future heat risk, "worker safety" coverage to protect wages on extremely hot days when it may be unsafe to work, protection for household items and resources lost during an extended blackout or power outage, and full coverage for healthcare expenses caused or exacerbated by heat waves. California is currently leading the country in thinking through the insurance industry's role in mitigating extreme heat's impacts, and federal stakeholders should watch it as a model for what can be scaled and replicated across the nation.
Further, it is important that investments made today are resilient to the climate conditions of tomorrow. The Office of Management and Budget's November 2023 memo on climate-smart infrastructure, currently being implemented, provides technical guidance on how federal financial assistance programs can and should invest in climate resilience. A yet unexplored financial lever for climate resilience identified in our interviews is federally backed municipal bonds. Climate change is undermining this once-stable investment, as cities and local governments struggle to pay back interest amid the rising costs of addressing hazards. The municipal bond market could price climate risk when setting interest payments and give beneficial rates to jurisdictions that have fully analyzed their risks and taken steps toward resilience.
Finally, there is a need to update the assessments of heat risk used to make insurance and financial decisions. Recent research by DOE found that FEMA National Risk Index (NRI) property damage data appear to be deficient, underestimating damages when compared to published values for recent U.S. extreme temperature events. To start, FEMA should consider including metrics in the NRI that characterize the building stock (e.g., adherence to certain building codes), its thermal comfort levels (even with cooling devices), and its thermal resilience.
Incorporate Future Climate Projections into Planning at All Levels
Recent research has shown that cities and counties are barreling toward temperature thresholds at which it would be dangerous to operate municipal services, affecting the operations of daily life. Yet little of this future risk is accounted for in the various planning activities (for public health, emergency preparedness, grid security, transportation, urban design, etc.) undertaken by local and state governments. Our interviewees noted that because many plans are based on historical and current risk data, there is little anticipation of the future impacts of hotter temperatures in current planning choices.
One example stood out around nature-based solutions (NBS): while NBS have received over a billion dollars in federal funding and are promoted as an approach to mitigating extreme heat's impacts, planners are not always considering whether the trees planted today will survive 20 to 30 years of additional warming. Reporting has shown that Southern Nevada is at risk of losing many of its shade trees due to inadequate species selection, as temperatures exceed the heat tolerance of trees that once thrived in that climate.
Changes are being made to some federally-required planning processes to require assessment of future risk. FEMA’s National Mitigation Planning Program now requires state and local governments to plan for future risks caused by climate change, land use, and population change to receive emergency disaster funds and mitigation funding. While extreme heat is a noteworthy future risk, it is not explicitly required in the new guidelines. As of April 2023, only half of U.S. states had a section dedicated to extreme heat in their Hazard Mitigation Plans.
Climate.gov, operated by NOAA, was recommended as a starting place for a library of future climate files that can be brought into planning processes and resilience analysis. Technical assistance and decision-making tools that help planners run predictive analyses based on future extreme temperature conditions can inform the design of resilient transportation systems, infrastructure investments, public health activities, and grids, and ensure accurate estimates of investment cost-effectiveness over the lifetime of a measure.
Data And Indices
Set Standards for Data Collection and Analysis
While official CDC-reported deaths from heat, approximately 1,670 in 2022, exceed those from any other natural hazard, experts widely agree this number is an undercount; true mortality is likely closer to 10,000 deaths a year under current climate conditions. Many factors compound this systematic undercount: hospitals often do not consider extreme heat in their hazard preparedness plans, there is a lack of awareness around ICD-10 coding for heat illness, and deaths exacerbated or caused by heat are often attributed to other causes. Retraining the healthcare workforce and modernizing death counting for climate change will take time, our interviewees acknowledged. Thus, decision makers need better data and surveillance systems now to address this growing public health crisis. Excess deaths analysis could provide a proxy for the true number of heat deaths and has already been employed by California to assess the impact of past heat waves. The CDC has used excess death methods in tracking the COVID-19 pandemic and could apply the same analysis to "climate killers" like extreme heat to inform healthcare system planning and forecasting tools like HeatRisk ahead of Summer 2024; the sketch below shows the basic arithmetic. It will be critical to set a standard methodology so that heat's impacts can be compared across communities in the United States. True mortality figures are also essential to strengthening the benefit-cost analysis for heat mitigation and resilience.
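For readers unfamiliar with the method, the sketch below shows the core of an excess-deaths estimate: weekly deaths observed during a heat event compared against a baseline built from the same calendar weeks in prior years. The weekly counts, the three-year baseline window, and the helper names are fabricated placeholders, not CDC methodology or real surveillance data.

```python
# Illustrative sketch of an excess-deaths calculation as a proxy for heat
# mortality: observed deaths during a heat wave minus an expected baseline
# built from the same calendar weeks in prior years. Numbers are made up.

from statistics import mean


def expected_baseline(prior_year_counts: list[list[int]]) -> list[float]:
    """Expected deaths per week: the mean across prior years, week by week."""
    return [mean(week) for week in zip(*prior_year_counts)]


def excess_deaths(observed: list[int], baseline: list[float]) -> float:
    """Total deaths above the expected baseline over the event window."""
    return sum(max(o - b, 0.0) for o, b in zip(observed, baseline))


if __name__ == "__main__":
    # Weekly all-cause deaths for the same four summer weeks in three prior years.
    prior_years = [
        [210, 215, 220, 212],
        [205, 218, 216, 210],
        [212, 214, 221, 208],
    ]
    observed_this_year = [230, 290, 310, 240]  # heat wave spans weeks 2-3
    baseline = expected_baseline(prior_years)
    print(f"Estimated excess deaths: {excess_deaths(observed_this_year, baseline):.0f}")
```

Because the baseline captures what mortality would normally look like in those weeks, the difference picks up heat-attributable deaths even when individual death certificates never mention heat.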
Our conversations also highlighted data gaps in counting worker injuries and deaths due to extreme heat. Work-related heat-health impacts are often counted only if there is a hospital admission that triggers a required report; heat-exacerbated injuries (e.g., falls) are frequently not coded as heat-related; and harms off the job (e.g., long-term kidney damage) go unnoticed. Studies estimate that California alone sees 20,000 heat-related injuries a year, while the U.S. Department of Labor (DOL) reports only 3,400 such injuries a year nationally. DOL could track how overall workplace injuries correlate with temperature to develop a methodology that yields much more accurate estimates of heat's true impacts; a simple illustration follows.
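The sketch below illustrates one basic version of that approach: correlating daily maximum temperature with total injury counts and using a linear fit to estimate how many injuries on hot days exceed what a mild-day baseline would predict. The temperatures, injury counts, 80°F reference point, and 95°F hot-day cut-off are fabricated placeholders for illustration, not DOL data or an endorsed methodology.

```python
# Illustrative sketch: correlating daily maximum temperature with total
# workplace injury counts to approximate heat-attributable injuries, even
# when individual reports are not coded as heat-related. Data are fabricated.

import numpy as np

# Daily max temperature (deg F) and total reported injuries for one region.
temps = np.array([78, 82, 85, 88, 91, 95, 98, 101, 96, 90, 84, 80])
injuries = np.array([40, 42, 44, 47, 51, 56, 60, 66, 58, 50, 45, 41])

# Pearson correlation between temperature and injury counts.
r = np.corrcoef(temps, injuries)[0, 1]

# Simple linear fit: injuries ~ intercept + slope * temperature.
slope, intercept = np.polyfit(temps, injuries, deg=1)

print(f"correlation r = {r:.2f}")
print(f"each +1 deg F is associated with ~{slope:.1f} additional injuries/day")

# Injuries above what the fit predicts at a mild-day reference of 80 F give a
# rough count of heat-attributable injuries on very hot days.
reference = intercept + slope * 80
excess_on_hot_days = (injuries[temps >= 95] - reference).sum()
print(f"rough heat-attributable injuries on days >= 95 F: {excess_on_hot_days:.0f}")
```

A real analysis would control for seasonality, workforce size, and industry mix, but even this simple correlation makes visible a heat signal that injury coding alone misses.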
Finally, anticipating the full risks of heat due to factors like existing infrastructure, social vulnerability, and levels of community resilience remains a work in progress. For example, FEMA's National Risk Index (which informs environmental justice tools like the Climate and Economic Justice Screening Tool and the Community Disaster Resilience Zones program) has notable limitations due to its reliance on historical weather data and its narrow focus on mortality reduction, leading to underestimates of damages when compared to published values for recent U.S. extreme temperature events. There is a major opportunity to develop a standard data set for extreme heat risks and vulnerabilities under current and anticipated future climate conditions. Such a data set could feed high-quality, relevant tools for community decision making (like FEMA's Flood Maps) and inform federal screening tools and funding decisions.
Create Regulatory Oversight Infrastructure for Extreme Heat
Only a few regulatory levers are currently in place or in the regulatory pipeline to protect Americans from growing heat and to build more heat-resilient communities. These include the temperature standards for senior living facilities set by CMS and OSHA's upcoming heat standard. Many more common settings still need regulation: homes, schools and childcare facilities, transit, correctional facilities, and outdoor public spaces. Regulations will also need expanded enforcement, including better monitoring of outdoor and indoor temperatures. HUD, EPA, and NOAA should work to identify opportunities to expand indoor and outdoor air temperature monitoring, seeking additional funding from Congress where needed.
Future regulations for mitigating extreme heat exposure can be conceptualized in three ways: technology standards (the required presence of a cooling and/or thermal-regulating technology), behavioral guidelines and expectations (required actions to avert overexposure), and performance standards (requirements that heat exposure cannot cross a certain threshold). These potential regulations will need to be conceptualized, reviewed, and implemented by several federal agencies, as authority over different aspects of heat exposure is fragmented across the federal government. Some examples of regulatory levers identified through our interviews (and introduced in previous sections) include:
- HUD could set building performance standards that include thermal comfort and safety for its properties, backed mortgages, and the public housing it supports, as well as requirements for reducing building waste heat.
- DOE could expand its performance assessment and certification of energy efficiency products to those that also enhance thermal comfort and resilience.
- FEMA could require individuals, local governments, and state governments to conduct mitigation planning for extreme heat, and then make resources available to build community-scale thermal resilience.
- DOT could implement requirements that infrastructure projects not increase urban areas' urban heat island (UHI) effects.
- EPA could further its analysis of the compounding effects of hot air and air pollution, and consider hotter air temperatures (such as those in UHIs) a risk to guaranteeing clean air.
- The Administration for Strategic Preparedness and Response (ASPR) could require hospitals to plan for surges in heat illness during heat waves in order to receive Hospital Preparedness Program funding.
Conclusion
Extreme heat, both acute and chronic, is a growing threat to American livelihoods, affecting household incomes, students' learning, worker safety, food security, and health and wellbeing. While the policy landscape for addressing heat is nascent, this report offers recommendations for near- and long-term solutions that policymakers can consider. Complementary to FAS' Extreme Heat Policy Sprint, we hope this report can serve as a toolkit of realistic potential actions.