Establish data collaboratives to foster meaningful public involvement
Federal agencies are striving to expand the role of the public, including members of marginalized communities, in developing regulatory policy. At the same time, agencies are considering how to mobilize data of increasing size and complexity to ensure that policies are equitable and evidence-based. However, community engagement has rarely been extended to the process of examining and interpreting data. This is a missed opportunity: community members can offer critical context to quantitative data, ground-truth data analyses, and suggest ways of looking at data that could inform policy responses to pressing problems in their lives. Realizing this opportunity requires a structure for public participation in which community members can expect both support from agency staff in accessing and understanding data and genuine openness to new perspectives on quantitative analysis.
To deepen community involvement in developing evidence-based policy, federal agencies should form Data Collaboratives in which staff and members of the public engage in mutual learning about available datasets and their affordances for clarifying policy problems.
Details
Executive Order 14094 and the Office of Management and Budget’s subsequent guidance memo direct federal agencies to broaden public participation and community engagement in the federal regulatory process. Among the aims of this policy are to establish two-way communications and promote trust between government agencies and the public, particularly members of historically underserved communities. Under the Executive Order, the federal government also seeks to involve communities earlier in the policy process. This new attention to community engagement can seem disconnected from the federal government’s long-standing commitment to evidence-based policy and efforts to ensure that data available to agencies support equity in policy-making; assessing data and evidence is usually considered a job for people with highly specialized, quantitative skills. However, lack of transparency about the collection and uses of data can undermine public trust in government decision-making. Further, communities may have vital knowledge that credentialed experts don’t, knowledge that could help put data in context and make analyses more relevant to problems on the ground.
For the federal government to achieve its goals of broadened participation and equitable data, opportunities must be created for members of the public and underserved communities to help shape how data are used to inform public policy. Data Collaboratives would provide such an opportunity. Data Collaboratives would consist of agency staff and individuals affected by the agency’s policies. Each member of a Data Collaborative would be regarded as someone with valuable knowledge and insight; staff members’ role would not be to explain or educate but to learn alongside community participants. To foster mutual learning, Data Collaboratives would meet regularly and frequently (e.g., every other week) for a year or more.
Each Data Collaborative would focus on a policy problem that an agency wishes to address. The Environmental Protection Agency might, for example, form a Data Collaborative on pollution prevention in the oil and gas sector. Depending on the policy problem, staff from multiple agencies may be involved alongside community participants. The Data Collaborative’s goal would be to surface the datasets potentially relevant to the policy problem, understand how they could inform the problem, and identify their limitations. Data Collaboratives would not make formal recommendations or seek consensus; however, ongoing deliberations about the datasets and their affordances can be expected to create a more robust foundation both for using data in policy development and for building out additional data resources.
Recommendations
The Office of Management and Budget should
- Establish a government-wide Data Collaboratives program in consultation with the Chief Data Officers Council.
- Work with leadership at federal agencies to identify policy problems that would benefit from consideration by a Data Collaborative. It is expected that deputy administrators, heads of equity and diversity offices, and chief data officers would be among those consulted.
- Hire a full-time director of Data Collaboratives to lead such tasks as coordinating with public participants, facilitating meetings, and ensuring that relevant data resources are available to all collaborative members.
- Ensure agencies’ ability to provide the material support necessary to secure the participation of underrepresented community members in Data Collaboratives, such as stipends, childcare, and transportation.
- Support agencies in highlighting the activities and accomplishments of Data Collaboratives through social media, press releases, open houses, and other means.
Conclusion
Data Collaboratives would move public participation and community engagement upstream in the policy process by creating opportunities for community members to contribute their lived experience to the assessment of data and the framing of policy problems. This would in turn foster two-way communication and trusting relationships between government and the public. Data Collaboratives would also help ensure that data and their uses in federal government are equitable, by inviting a broader range of perspectives on how data analysis can promote equity and where relevant data are missing. Finally, Data Collaboratives would be one vehicle for enabling individuals to participate in science, technology, engineering, math, and medicine activities throughout their lives, increasing the quality of American science and the competitiveness of American industry.
Make publishing more efficient and equitable by supporting a “publish, then review” model
Preprinting – a process in which researchers upload manuscripts to online servers prior to the completion of formal peer review – has proven to be a valuable tool for disseminating preliminary scientific findings. This model has the potential to speed up the process of discovery, enhance rigor through broad discussion, support equitable access to publishing, and promote transparency of the peer review process. Yet the model’s use and expansion are limited by a lack of explicit recognition within funding agency assessment practices.
The federal government should take action to support preprinting, preprint review, and “no-pay” publishing models in order to make scholarly publishing of federal outputs more rapid, rigorous, and cost-efficient.
Details
In 2022, the Office of Science and Technology Policy (OSTP) issued the “Ensuring Free, Immediate, and Equitable Access to Federally Funded Research” memo, written by Dr. Alondra Nelson, directing federal funding agencies to make the results of taxpayer-supported research immediately accessible to readers at no cost. This important development extended John P. Holdren’s 2013 “Increasing Access to the Results of Federally Funded Scientific Research” memo by covering all federal agencies and removing 12-month embargoes on free access, and it mirrored developments such as the open access provisions of Horizon 2020 in Europe.
One of the key provisions of the Nelson memo is that federal agencies should “allow researchers to include reasonable publication costs … as allowable expenses in all research budgets,” signaling support for the Article Processing Charge (APC) model. Because APCs shift publication costs onto authors, the memo risks creating barriers to equitable publishing for researchers with limited access to funds. Furthermore, leaving the definition of “reasonable costs” open to interpretation creates the risk that an increasing proportion of federal research funds will be siphoned off by publishing. In 2022, OSTP estimated that American taxpayers are already paying $390 to $798 million annually to publish federally funded research.
Without further interventions, these costs are likely to rise: publishers have historically responded to increasing demand for open access publishing by shifting from a subscription model to one in which authors pay to publish through APCs. APCs increased by 50 percent from 2010 to 2019, for example.
The “no pay” model
In May 2023, the European Union’s council of ministers called for a “no pay” academic publishing model, in which costs are paid directly by institutions and funders to ensure equitable access to read and publish scholarship. There are several routes to achieve the no pay model, including transitioning journals to ‘Diamond’ Open Access models, in which neither authors nor readers are charged.
However, in contrast to models that rely on transforming journal publishing, an alternative approach builds on the burgeoning preprint system. Preprints are manuscripts posted online by authors to a repository, without charge to authors or readers. Over the past decade, their use across the scientific enterprise has grown dramatically, offering unique flexibility and speed to scientists and encouraging dynamic conversation. More recently, preprints have been paired with a new system of preprint peer review. In this model, organizations like Peer Community In, Review Commons, and RR\ID organize expert review of preprints from the community. These reviews are posted publicly, independent of any specific publisher’s or journal’s process.
Despite the growing popularity of this approach, its uptake is limited by a lack of support and incorporation into science funding and evaluation models. Federal action to encourage the “publish, then review” model offers several benefits:
- Research is available sooner, and society benefits more rapidly from new scientific findings. With preprints, researchers share their work with the community months or years ahead of journal publication, allowing others to build off their advances.
- Peer review is more efficient and rigorous because the content of the review reports (though not necessarily the identity of the reviewers) is open. Readers are able to understand the level of scrutiny that went into the review process. Furthermore, an open review process enables anyone in the community to join the conversation and bring in perspectives and expertise that are currently excluded. The review process is less wasteful since reviews are not discarded with journal rejection, making better use of researchers’ time.
- Taxpayer research dollars are used more effectively. Disentangling transparent fees for dissemination and peer reviews from a publishing market driven largely by prestige would result in lower publishing costs, enabling additional funds to be used for research.
Recommendations
To support preprint-based publishing and equitable access to research:
Congress should
- Commission a report from the National Academies of Sciences, Engineering, and Medicine on benefits, risks, and projected costs to American taxpayers of supporting alternative scholarly publishing approaches, including open infrastructure for the “publish, then review” model.
OSTP should
- Coordinate support for key disciplinary infrastructures and peer review service providers with partnerships, discoverability initiatives, and funding.
- Draft a policy in which agencies require that papers resulting from federal funding are preprinted at or before submission to peer review and updated with each subsequent version; then work with agencies to conduct a feasibility study and stakeholder engagement to understand opportunities (including cost savings) and obstacles to implementation.
Science funding agencies should
- Recognize preprints and public peer review. Following the lead of the National Institutes of Health’s 2017 Guide Notice on Reporting Preprints and Other Interim Research Products, revise grant application and report guidelines to allow researchers to cite preprints. Extend this provision to include publicly accessible reviews they have received and authored. Provide guidance to reviewers on evaluating these outputs as scientific contributions within an applicant’s body of work.
Establish grant supplements for open science infrastructure security
Open science infrastructure (OSI), such as platforms for sharing research products or conducting analyses, is vulnerable to security threats and misappropriation. Because these systems are designed to be inclusive and accessible, they often require few credentials of their users, a quality that also puts OSI at risk of attack and misuse. Seeking to provide quality tools to their users, OSI builders dedicate their often scant funding resources to addressing these security issues, sometimes delaying other important software work.
To support these teams and allow for timely resolution to security problems, science funders should offer security-focused grant supplements to funded OSI projects.
Details
Existing federal policy and funding programs recognize the importance of security to scholarly infrastructure like OSI. For example, in October 2023, President Biden issued an Executive Order to manage the risks of artificial intelligence (AI) and ensure these technologies are safe, secure, and trustworthy. Under the Secure and Trustworthy Cyberspace program, the National Science Foundation (NSF) provides grants to ensure the security of cyberinfrastructure and asks scholars who collect data to plan for its secure storage and sharing. Furthermore, agencies like NSF and the National Institutes of Health (NIH) already offer supplements for existing grants. What is still missing is a mechanism for rapid disbursement of funds to address unanticipated security concerns across scientific domains.
Risks like secure shell (SSH) attacks, data poisoning, and the proliferation of mis/disinformation on OSI threaten the utility, sustainability, and reputation of OSI. These concerns are urgent. New access to powerful generative AI tools, for instance, makes it easy to create disinformation that can convincingly mimic the rigorous science shared via OSI. In fact, increased open access to science can accelerate the proliferation of AI-generated scholarly disinformation by improving the accuracy of the models that generate it.
OSI is commonly funded by grants that afford little support for the maintenance work that could stop misappropriation and security threats. Without financial resources and an explicit commitment to a funder, it is difficult for software teams to prioritize these efforts. To ensure uptake of OSI and its continued utility, these teams must have greater access to financial resources and relevant talent to address these security concerns and norms violations.
Recommendations
Security concerns may be unanticipated and urgent, not aligning with calls for research proposals. To provide timely support for OSI facing security risks, executive action should be taken through the federal agencies that fund science infrastructure (NSF, NIH, NASA, DOE, DOD, NOAA). These agencies should offer research supplements to address OSI misappropriation and security threats. Supplement requests would be subject to internal review by funding agencies but not to peer review, allowing teams to bypass the lengthier review process required for a full grant proposal. Research supplements, unlike full grant proposals, will allow researchers to respond nimbly to novel security concerns that arise after they receive their initial funding. Additionally, researchers who provide OSI but are less familiar with security issues may not anticipate all relevant threats when the project is conceived and initial funding is distributed (managers of from-scratch science gateways are one possible example). Supplying funds through supplements when the need arises can protect sensitive data and infrastructure.
These research supplements should be made available to principal investigators and co-principal investigators with active awards. Supplements may be used to support additional or existing personnel, allowing OSI builders to bring new expertise to their teams as necessary. To ensure that funds can address unanticipated security issues in OSI across a variety of scholarly domains, supplement recipients need not be funded under an existing program that explicitly supports open science infrastructure (e.g., NSF’s POSE program).
To minimize the administrative burden of review, applications for supplements should be kept short (e.g., no more than five pages, excluding budget) and should include the following:
- A description of the security issue to be addressed
- A convincing argument that the infrastructure has goals of increasing the inclusion, accessibility, and/or transparency of science and that those goals are exacerbating the relevant security threat
- A description of the steps to be taken to address the security issue and a well-supported argument that the funded researchers have the expertise and tools necessary to carry those steps out
- A brief description of the original grant’s scope, making evident that the supplemental funding will support work outside of the original scope
- An explanation as to why a grant supplement is more appropriate for their circumstances than a new full grant application
- A budget for the work
An annual appropriation of $3 million across federal science funders would support 40 supplemental awards of $75,000 each for OSI projects. While the budget needed to address each security issue will vary, this estimate demonstrates the reach that these supplements could have.
Research software like OSI often struggles to find funding for maintenance. These much-needed supplemental funds will ensure that OSI developers can speedily prioritize important security-related work without doing so at the expense of other planned software work. Without this funding, we risk compromising the reputation of open science, consuming precious development resources allocated to other tasks, and negatively affecting OSI users’ experience. Grant supplements to address OSI security threats and misappropriation ensure the sustainability of OSI going forward.
Expand capacity and coordination to better integrate community data into environmental governance
Frontline communities bear the brunt of harms created by climate change and environmental pollution, but they also increasingly generate their own data, providing critical social and environmental context often not present in research or agency-collected data. However, community data collectors face many obstacles to integrating this data into federal systems: they must navigate complex local and federal policies within dense legal landscapes, and even when there is interest or demonstrated need, agencies and researchers may lack the capacity to find or integrate this data responsibly.
Federal research and regulatory agencies, as well as the White House, are increasingly supporting community-led environmental justice initiatives, presenting an opportunity to better integrate local and contextualized information into more effective and responsive environmental policy.
The Environmental Protection Agency (EPA) should better integrate community data into environmental research and governance by building internal capacity for recognizing and applying such data, facilitating connections between data communities, and addressing misalignments with data standards.
Details
Community science and monitoring are often overlooked yet vital facets of open science. Community science collaborations and their resulting data have led to historic environmental justice victories that underscore the importance of contextualized community-generated data in environmental problem-solving and evidence-informed policy-making.
Momentum around integrating community-generated environmental data has been building at the federal level for the past decade. In 2016, the report “A Vision for Citizen Science at EPA,” produced by the National Advisory Council for Environmental Policy and Technology (NACEPT), thoroughly diagnosed the need for a clear framework for moving community-generated environmental data and information into governance processes. Since then, EPA has developed additional participatory science resources, including a participatory science vision, policy guidelines, and equipment loan programs. More recently, in 2022, the EPA created an Equity Action Plan in alignment with its 2022–2026 Strategic Plan and established an Office of Environmental Justice and External Civil Rights (OEJECR). And in 2023, as part of the cross-agency Year of Open Science, the National Aeronautics and Space Administration’s (NASA) Transform to Open Science (TOPS) program listed “broadening participation by historically excluded communities” as a requisite part of its strategic objectives.
It is evident that the EPA and research funding agencies like NASA have a strategic and mission-driven interest in collaborating with communities bearing the brunt of environmental and climate injustice to unlock the potential of their data. It is also clear that current methods aren’t working. Communities that collect and use environmental data still must navigate disjointed reporting policies and data standards and face a dearth of resources on how to share data with relevant stakeholders within the federal government. There is a critical lack of capacity and coordination directed at cross-agency integration of community data and the infrastructure that could enable the use of this data in regulatory and policy-making processes.
Recommendations
To build government capacity to integrate community-generated data into environmental governance, the EPA should:
- Create a memorandum of understanding between the EPA’s OEJECR, National Environmental Justice Advisory Council (NEJAC), Office of Management and Budget (OMB), United States Digital Service (USDS), and relevant research agencies, including NASA, the National Oceanic and Atmospheric Administration (NOAA), and the National Science Foundation (NSF), to develop a collaborative framework for building internal capacity for generating and applying community-generated data, as well as managing it to enable its broader responsible reuse.
- Develop and distribute guidance on responsible scientific collaboration with communities that prioritizes ethical open science and data-sharing practices that center community and environmental justice priorities.
- Create a capacity-building program, designed by and with environmental justice and data-collecting communities, focused on building translational and intermediary roles within the EPA that can facilitate connections and responsible sharing between data holders and seekers. Incentivize the application of the aforementioned guidance within federally funded research by recruiting and training program staff to act as translational liaisons situated between the OEJECR, regional EPA offices, and relevant research funding agencies, including NASA, NOAA, and NSF.
To facilitate connections between communities generating data, the EPA should:
- Expand the scope of the current Environmental Information Exchange Network (EN) to include facilitation of environmental data sharing by community-based organizations and community science initiatives.
- Initiate a working group including representatives from data-generating community organizations to develop recommendations on how EN might accommodate community data and how its data governance processes can center community and environmental justice priorities.
- Provide grant funding within the EN earmarked for community-based organizations to support data integration with the EN platform. This could include hiring contractors with technical data management expertise to support data uploading within established standards or to build capacity internally within community-based organizations to collect and manage data according to EN standards.
- Expand the resources available for EN partners that support data quality assurance, advocacy, and sharing, for example by providing technical assistance through regional EPA offices trained through the aforementioned capacity-building program.
To address misaligned data standards, the EPA, in partnership with USDS and the OMB, should:
- Update and promote guidance resources for communities and community-based organizations aiming to apply the data standards EPA uses to integrate data in regulatory decisions.
- Initiate a collaborative co-design process for new data standards that can accommodate community-generated data, with representation from communities that collect environmental data. This may require the creation of maps or crosswalks to facilitate translation between standards, including research data standards (see the illustrative sketch below), as well as internal capacity to maintain these crosswalks.
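To make the idea of a crosswalk concrete, the sketch below shows one minimal way a community air-monitoring dataset could be translated into an agency-style schema. All field names, units, and the target schema are hypothetical illustrations, not actual EPA Exchange Network standards; a real crosswalk would be co-designed with data-collecting communities and EN partners.

```python
# Hypothetical crosswalk: community sensor fields -> illustrative agency-style fields.
# Nothing here reflects real EPA Exchange Network schemas; it only shows the mechanics.

COMMUNITY_TO_AGENCY = {
    # community field      (agency-style field,   unit conversion)
    "pm25_ugm3":           ("PM25_CONCENTRATION", lambda v: v),                 # already in µg/m³
    "temp_f":              ("AMBIENT_TEMP_C",     lambda v: (v - 32) * 5 / 9),  # °F -> °C
    "sensor_lat":          ("LATITUDE_DD",        lambda v: v),
    "sensor_lon":          ("LONGITUDE_DD",       lambda v: v),
}

def translate_record(community_record: dict) -> dict:
    """Translate one community observation into the agency-style schema."""
    translated = {}
    for field, value in community_record.items():
        mapping = COMMUNITY_TO_AGENCY.get(field)
        if mapping is None:
            continue  # in practice, unmapped fields should be flagged for review, not silently dropped
        agency_field, convert = mapping
        translated[agency_field] = convert(value)
    return translated

# Example: translate_record({"pm25_ugm3": 12.4, "temp_f": 68.0})
# -> {"PM25_CONCENTRATION": 12.4, "AMBIENT_TEMP_C": 20.0}
```

Maintaining mappings like this as standards evolve is exactly the kind of work the proposed internal capacity and EN grant funding would support.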
Community-generated data provides contextualized environmental information essential for evidence-based policy-making and regulation, which in turn reduces wasteful spending by enabling the design of effective programs. Moreover, healthcare costs for the general public will fall if better evidence is used to address pollution, and climate adaptation costs could be reduced if more localized and granular data are used to address pressing environmental and climate issues now rather than in the future.
Our recommendations call for the addition of at least 10 full-time employees for each regional EPA office. The additional positions proposed could fill existing vacancies in newly established offices like the OEJECR. Additional budgetary allocations can also be made to the EPA’s EN to support technical infrastructure alterations and grant-making.
While there is substantial momentum around and attention to community environmental data, our proposed capacity stimulus can make existing EPA processes more effective at achieving their mission and support rebuilding trust in agencies that are meant to serve the public.
A Matter of Trust: Helping the Bioeconomy Reach Its Full Potential with Translational Governance
The promise of the bioeconomy is massive and fast-growing—offering new jobs, enhanced supply chains, novel technologies, and sustainable bioproducts valued at a projected $4 trillion over the next 16 years. Although the United States has been a global leader, advancements in the bioeconomy—whether investing in specialized infrastructural hardware or building a multidisciplinary STEM workforce—depend on public trust. In fact, public trust is the key to unlocking the full potential of the bioeconomy, and without it, the United States may fall short of long-term economic goals and even fall behind peer nations as a bioeconomy leader. Recent failures of the federal regulatory system for biotechnology threaten public trust, and recent regulations have been criticized for their lack of transparency. As a result, cross-sector efforts aim not just to reimagine the bioeconomy but to create a coordinated regulatory system for it. Yet amid declining public trust in the federal government, even the most coordinated regulatory system will fail to boost the bioeconomy if it cannot earn that trust.
In response, the Biden-Harris Administration should direct a Bioeconomy Initiative Coordination Office (BICO) to establish a public engagement mechanism that runs in parallel with the biotechnology regulatory system. Citizen engagement and transparency are key to building public trust, yet current public engagement mechanisms cannot build trust with a public skeptical of a biotechnology’s rewards in light of its perceived risks. Bioeconomy coordination efforts should therefore prioritize public trust by adopting a new public-facing biotechnology evaluation program that collects data from nontraditional audiences via participatory technology assessments (pTA) and Multi-Criteria Decision Analysis (MCDA/MCDM) and provides insight that addresses the limitations of current engagement mechanisms. In accordance with the CHIPS and Science Act, the public engagement program will provide a mechanism for a BICO to build public trust while advancing the bioeconomy.
The public engagement program will serve as a decision-making resource for the Coordinated Framework for the Regulation of Biotechnology (CFRB) and a data repository for evaluating public acceptance in the present and future bioeconomy.
Challenge and Opportunity
While policymakers have been addressing the challenge of sharing regulatory space among the three key agencies—the Environmental Protection Agency (EPA), the Food and Drug Administration (FDA), and the U.S. Department of Agriculture (USDA)—transparency and public trust remain challenges for federal agencies, small to midsize developers, and even the public at large. The government plays a vital role in the regulatory process by providing guidelines that govern the interactions between the developers and consumers of biotechnology. For over 30 years, product developers have depended on strategic alliances with the government to ensure the market success of biotechnology. The marketplace and regulatory oversight are tightly linked, and their impacts on public confidence in the bioeconomy cannot be separated.
When it comes to a consumer’s purchase of a biotechnology product, the pivotal factor is often not price but trust. In 2016, the National Academies of Sciences, Engineering, and Medicine released recommendations on aligning public values with gene drive research. The report revealed that public engagement that promotes a “bi-directional exchange of information and perspectives” can increase public trust. Moreover, a 2022 report on Gene Drives in Agriculture highlights the importance of considering public perception and acceptance in risk-based decision-making.
The CHIPS and Science Act provides an opportunity to address transparency and public trust within the federal regulatory system for biotechnology by directing the Office of Science and Technology Policy (OSTP) to establish a Coordination Office for the National Engineering Biology Research and Development Initiative. The coordination office (i.e., BICO) will serve as a point of contact for cross-sector engagement and create a junction for the exchange of technical and programmatic information. Additionally, the office will conduct public outreach and produce recommendations for strengthening the bioeconomy.
This policy window presents a novel opportunity to create a regulatory system for the bioeconomy that also encompasses the voice of the general public. The history of requests for information, public hearings, and cross-sector partnerships demonstrates the public’s capacity—or at least that of specific subsets of experts within it—to fill gaps, oversights, and ambiguities within biotechnology regulations.
While expert opinion is essential for developing regulation, so too are the opinions of the general public. Historically, discussions about values, sentiments, and opinions on biotechnology have been dominated by technical experts (for example, through debates on product vs. process, genetically engineered vs. genetically modified organisms, and perceived safety). Biotechnology discourse has primarily been restricted to these traditional, technical audiences, and as a result, public calls to address concerns about biotechnology are drowned out by expert opinions. We need a mechanism for public engagement that prioritizes collecting data from nontraditional audiences. This will ensure sustainable and responsible advancements in the bioeconomy.
If we want to establish a bioeconomy that increases national competitiveness, then we need to increase the participation of nontraditional audiences. Although some public concerns are unlikely to be allayed through policy change (e.g., addressing calls for a ban on genetically engineered or modified products), a public engagement program could identify the underlying issue(s) for these concerns. This would enable the adoption of comprehensive strategies that increase public trust, further sustaining the bioeconomy.
Research shows that public comment and notice periods are less likely to hear from nontraditional audiences—that is, members of underserved communities, workers, smaller market entities, and new firms. Despite the statutory and capacity-based obstacles federal agencies face in increasing public participation, the Executive Office of the President seeks to broaden public participation and community engagement in the federal regulatory process. Public engagement programs provide a platform to interact with interested parties that represent a wide range of perspectives. Information gathered from public engagement could thus inform future proposed updates to the CFRB and the regulatory pathways for new products. In this way, public opinions and sentiments can be incorporated into a translational governance framework that brings translational value to the bioeconomy. Since increasing public trust is complementary to advancing the bioeconomy, there is translational value in strategically integrating a collective perception of risk and safety into future biotechnology regulation. In this case, translational governance allows for regulation that is informed by science and responsive to the values of citizens, effectively introducing a policy lever that improves the adoption of, and investment in, the U.S. bioeconomy.
The future of biotechnology regulation is an emerging innovation ecosystem. Accomplishing economic goals within this ecosystem requires a new public-facing engagement framework that satisfies congressional directives and advances the bioeconomy. This framework provides a BICO with the scaffolding necessary to create an infrastructure that invites public input and community reflection, with the potential to decrease the number of biotechnologies that fail to reach the market. The proposed public engagement mechanism will work alongside the current regulatory system for biotechnology to enhance public trust, improve interagency coordination, and strengthen the bioeconomy.
Plan of Action
To reach our national bioeconomic policy goals, the BICO should use a public engagement program to solicit perspectives and develop an understanding of non-economic values, such as deeply held beliefs about the relationship between humans and the environment or personal or cultural perspectives related to specific biotechnologies. The BICO should devote $10 million over five years to public engagement programs and advisory board activities that (a) report to the BICO but are carried out through external partnerships; (b) provide meaningful social data for biotechnology regulation while running parallel to the CFRB regulatory system; and (c) produce a repository of public acceptance data for horizon scanning. These programs will inform regulatory decision-making, increase public trust, and achieve the congressional directives outlined in Sec. 10402 of the CHIPS & Science Act.
Recommendation 1. Establish a Bioeconomy Initiative Coordination Office (BICO) as a home office for interagency coordination.
The BICO should be housed within the Office of Science and Technology Policy (OSTP). The creation of a BICO is in alignment with the mandates of Executive Order (EO) 14081, Advancing Biotechnology and Biomanufacturing Innovation for a Sustainable, Safe, and Secure Bioeconomy, and the statutory authority granted to the OSTP through the CHIPS and Science Act.
Congress should allocate $2 million annually for five years to the BICO to carry out a public engagement program and advisory board activities in coordination with the EPA, FDA, and USDA.
The public engagement program would be housed within the BICO as a public-facing data-generating mechanism that parallels the current federal regulatory system for biotechnology.
The bioeconomy cuts across sectors (e.g., agriculture, health, materials, energy) and actively creates new connections and opportunities for national competitiveness. A thriving bioeconomy must ensure regulatory policy coherence and alignment among these sectors, and the BICO should be able to comprehensively integrate information from multiple sectors into a strategy that increases public awareness and acceptance of bioeconomy-related products and services. Public engagement should be used to build a data ecosystem of values related to biotechnologies that advance the bioeconomy.
Recommendation 2. Establish a process for public engagement and produce a large repository of public acceptance data.
Public acceptance data will be collected alongside the “biological data ecosystem,” referenced by the Biden Administration in EO 14081, that advances innovation in the bioeconomy. To provide an expert element to the public engagement process, an advisory board should be involved in translating public acceptance data (opinions on how biotechnologies align with values) into policy suggestions and recommendations for regulatory agencies. The advisory board should be a formal entity recognizable under the Federal Advisory Committee Act (FACA) and the Freedom of Information Act (FOIA). It should be diverse but not so large that it becomes inefficient in fulfilling its mandate. Striking a balance between compositional diversity and operational efficiency is critical to ensuring the board provides valuable insights and recommendations to the BICO. The advisory board should consist of up to 25 members, reflect NSF data on diversity and STEM, and include a diverse range of citizens, from everyday consumers (such as parents, young adults, and patients from different ethnic backgrounds) to practitioners of various occupations and disciplines (such as biologists, philosophers, hair stylists, sanitation workers, social workers, and dietitians). To promote transparency and increase public trust, data will be subject to FACA and FOIA regulations, and advisory board meetings must be accessible to the public, with their details published in the Federal Register. Additionally, the management and applications of any data collected should employ the CARE (Collective Benefit, Authority to Control, Responsibility, Ethics) Principles for Indigenous Data Governance, which complement the FAIR (Findable, Accessible, Interoperable, and Reusable) Principles. Adopting CARE brings a people-and-purpose orientation to data governance and is rooted in Indigenous Peoples’ sovereignty.
The BICO can look to the National Science and Technology Council’s published requests for information (RFIs) and public meetings as a model for public engagement. The BICO should work with an inclusive network of external partners to design workshops for collecting public acceptance data. Using pTA methods, the BICO will fund public engagement activities such as open-framing focus groups, workshops, and forums that prioritize input from nontraditional public audiences. The BICO should use pre-submission data, past technologies, near-term biotechnologies, and, where helpful, imaginative scenarios to produce case studies to engage these audiences. Public engagement should be hosted by external grantees who maintain a wide-ranging network of interdisciplinary specialists and interested citizens to facilitate activities.
Qualitative and quantitative data will be used to reveal themes, public values, and rationales, which will aid product developers and others in the bioeconomy as they decide on new directions and potential products. This process will also serve as the primary data source for the public acceptance data repository. Evolving risk pathways are a considerable concern for any regulatory system, especially one tasked with regulating biotechnologies. How risks are managed is subject to many factors (history, knowledge, experience, product) and has a lasting impact on public trust. Advancing the bioeconomy requires a transparent decision-making process that integrates public input and allows society to redefine risks and safety as a collective. Public acceptance data should inform an understanding of values, risks, and safety and improve horizon-scanning capabilities; the sketch below illustrates one way such data could feed a multi-criteria analysis.
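As a minimal illustration of how Multi-Criteria Decision Analysis (MCDA), one of the methods proposed above, could combine public acceptance data with other considerations, the sketch below computes a simple weighted-sum score. The criteria, weights, scores, and product names are hypothetical placeholders; a real BICO process would elicit them through pTA workshops and advisory board deliberation.

```python
# Weighted-sum MCDA sketch. All criteria, weights, and scores are hypothetical
# placeholders standing in for values elicited through public engagement.

# Criterion weights (sum to 1.0), e.g., derived from pTA workshop priorities.
WEIGHTS = {
    "public_acceptance": 0.30,
    "environmental_safety": 0.25,  # scored so that higher = safer
    "health_benefit": 0.25,
    "economic_value": 0.20,
}

# Normalized 0-1 scores for two illustrative biotechnology products.
ALTERNATIVES = {
    "engineered_microbial_fertilizer": {
        "public_acceptance": 0.6, "environmental_safety": 0.7,
        "health_benefit": 0.4, "economic_value": 0.8,
    },
    "gene_edited_crop": {
        "public_acceptance": 0.5, "environmental_safety": 0.8,
        "health_benefit": 0.6, "economic_value": 0.7,
    },
}

def weighted_score(scores: dict) -> float:
    """Aggregate criterion scores into a single value using the elicited weights."""
    return sum(WEIGHTS[criterion] * value for criterion, value in scores.items())

for product, scores in ALTERNATIVES.items():
    print(f"{product}: {weighted_score(scores):.2f}")
```

In practice, outputs like these would be one input among many for the advisory board and the CFRB agencies, not a substitute for their judgment.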
Conclusion
As the use of biotechnology continues to expand, policymakers must remain adaptive in their regulatory approach to ensure that public trust is acquired and maintained. Recent federal action to boost the bioeconomy provides an opportunity for policymakers to expand public engagement and improve public acceptance of biotechnology. By centralizing coordination and integrating public input, policymakers can create a responsive regulatory convention that advances the bioeconomy while also building public trust. To achieve this, the public engagement program will combine elements of community-based participatory research, value-based assessments, pTA, and CARE Principles for Indigenous Data Governance. This approach will create a translational mechanism that improves interagency coordination and builds public trust. As the government works to create a regulatory framework for the bioeconomy, the need for public participation will only increase. By leveraging the expertise and perspectives of a diverse range of interested parties, policymakers can ensure that the regulatory framework is aligned with public values and concerns while promoting innovation and progress in the U.S. bioeconomy.
Translational governance focuses on expediting the implementation of regulations to safeguard human health and the environment while simultaneously encouraging innovation. This approach involves integrating non-economic values into decision-making processes to enhance scientific and statutory criteria of risk and safety by considering public perceptions of risk and safety. Essentially, it is the policy and regulatory equivalent of translational research, which strives to bring healthcare discoveries to market swiftly and safely.
The Office of Science and Technology Policy (OSTP) should use translational governance through public engagement as a backbone of the National Engineering Biology Research and Development Initiative. Following the designation of an interagency committee by the OSTP—and once established under the scope of direction outlined in Sec. 10403 of the CHIPS and Science Act—the Initiative Coordination Office should use a public engagement program to support the following National Engineering Biology Research and Development Initiative congressional directives (Sec. 10402):
- 1) Supporting social and behavioral sciences and economics research that advances the field of engineering biology and contributes to the development and public understanding of new products, processes, and technologies.
- 2) Improving the scientific and lay public’s understanding of engineering biology and supporting greater evidence-based public discourse about its benefits and risks.
- 3) Supporting research relating to the risks and benefits of engineering biology, including under subsection (d): [Ensuring, through the agencies and departments that participate in the Initiative, that public input and outreach are integrated into the Initiative by the convening of regular and ongoing public discussions through mechanisms such as workshops, consensus conferences, and educational events, as appropriate].
- 4) Expanding the number of researchers, educators, and students and a retooled workforce with engineering biology training, including from traditionally underrepresented and underserved populations.
- 5) Accelerating the translation and commercialization of engineering biology and biomanufacturing research and development by the private sector.
- 6) Improving the interagency planning and coordination of federal government activities related to engineering biology.
In 1986, the newly issued regulatory system for biotechnology products faced significant statutory challenges in establishing the jurisdiction of the three key regulatory agencies (EPA, FDA, and USDA). In those early days, agency coordination allowed for the successful regulation of products that had shared jurisdiction. For example, one agency would regulate the plant in the field (USDA), and another would regulate the feed or food produced by the plant (FDA/DHHS). However, as the biotechnology product landscape has advanced, so has the complexity of agency coordination. For example, at the time of their commercialization, plants modified to exhibit pesticidal traits, specific microbial products, and certain genetically modified organisms cut across product use-specific regulations that were organized according to agency (i.e., field plants, food, pesticides). In response, the three key agencies have traditionally implemented their own rules and regulations (e.g., FDA’s Generally Recognized as Safe designation, USDA’s Am I Regulated? process, and USDA’s SECURE Rule). While such policy action is within their statutory authorities, it has resulted in policy resistance, reinforcing the confusion and lack of transparency within the regulatory process.
Since its formal debut on June 26, 1986, the CFRB has undergone two major updates, in 1992 and 2017. Additionally, the CFRB has been subject to multiple memoranda of understanding as well as two executive orders across two consecutive administrations (Trump and Biden). With the arrival of the CHIPS and Science Act (2022) and Executive Order 14081, the CFRB will likely undertake one of its most extensive updates—modernization for the bioeconomy.
According to the EPA, when the CFRB was issued in 1986, the expectation was that the framework would respond to the experiences of the industry and the agencies and that modifications would be accomplished through administrative or legislative actions. Moreover, upon releasing the 2017 updates to the CFRB, the Obama Administration described the CFRB as a “flexible regulatory structure that provides appropriate oversight for all products of modern biotechnology.” By this understanding, the CFRB is designed to be iterative and responsive to change. However, as this memo and other reports demonstrate, not all products of modern biotechnology are subject to appropriate oversight. The lag between identifying regulatory concerns and adopting the regulations needed to fully capitalize on the evolving biotechnology landscape represents a costly delay. The CFRB is falling behind biotechnology in a manner that hampers the bioeconomy and, likely, the future economy.
Automating Scientific Discovery: A Research Agenda for Advancing Self-Driving Labs
Despite significant advances in scientific tools and methods, the traditional, labor-intensive model of scientific research in materials discovery has seen little innovation. The reliance on highly skilled but underpaid graduate students as labor to run experiments hinders the labor productivity of our scientific ecosystem. An emerging technology platform known as Self-Driving Labs (SDLs), which use commoditized robotics and artificial intelligence for automated experimentation, presents a potential solution to these challenges.
SDLs are not just theoretical constructs but have already been implemented at small scales in a few labs. An ARPA-E-funded Grand Challenge could drive funding, innovation, and development of SDLs, accelerating their integration into the scientific process. A Focused Research Organization (FRO) can also help create more modular and open-source components for SDLs and can be funded by philanthropies or the Department of Energy’s (DOE) new foundation. With additional funding, DOE national labs can also establish user facilities for scientists across the country to gain more experience working with autonomous scientific discovery platforms. In an era of strategic competition, funding emerging technology platforms like SDLs is all the more important to help the United States maintain its lead in materials innovation.
Challenge and Opportunity
New scientific ideas are critical for technological progress. These ideas often form the seed insight to creating new technologies: lighter cars that are more energy efficient, stronger submarines to support national security, and more efficient clean energy like solar panels and offshore wind. While the past several centuries have seen incredible progress in scientific understanding, the fundamental labor structure of how we do science has not changed. Our microscopes have become far more sophisticated, yet the actual synthesizing and testing of new materials is still laboriously done in university laboratories by highly knowledgeable graduate students. The lack of innovation in how we have historically used scientific labor may account for the stagnation of research labor productivity, a primary cause of concerns about the slowing of scientific progress. Indeed, analysis of scientific literature suggests that scientific papers are becoming less disruptive over time and that new ideas are getting harder to find. The slowing rate of new scientific ideas, particularly in the discovery of new materials or advances in materials efficiency, poses a substantial risk, potentially costing billions of dollars in economic value and jeopardizing global competitiveness. However, incredible advances in artificial intelligence (AI) coupled with the rise of cheap but robust robot arms are leading to a promising new paradigm of materials discovery and innovation: Self-Driving Labs. An SDL is a platform where material synthesis and characterization is done by robots, with AI models intelligently selecting new material designs to test based on previous experimental results. These platforms enable researchers to rapidly explore and optimize designs within otherwise unfeasibly large search spaces.
Today, most material science labs are organized around a faculty member or principal investigator (PI), who manages a team of graduate students. Each graduate student designs experiments and hypotheses in collaboration with a PI and then executes the experiment, synthesizing the material and characterizing its properties. Unfortunately, that last step is often laborious and the most time-consuming. This sequential method of material discovery, in which highly knowledgeable graduate students spend large portions of their time doing manual wet lab work, rate-limits the number of experiments and potential discoveries by a given lab group. SDLs can significantly improve the labor productivity of our scientific enterprise, freeing highly skilled graduate students from menial experimental labor to craft new theories or distill novel insights from autonomously collected data. Additionally, they yield more reproducible outcomes, as experiments are run by code-driven motors rather than by humans, who may forget to include certain experimental details or have natural variations between procedures.
Self-Driving Labs are not a pipe dream. The biotech industry has spent decades developing advanced high-throughput synthesis and automation. For instance, while in the 1970s statins (one of the most successful cholesterol-lowering drug families) were discovered in part by a researcher testing 3800 cultures manually over a year, today, companies like AstraZeneca invest millions of dollars in automation and high-throughput research equipment (see figure 1). While drug and material discovery share some characteristics (e.g., combinatorially large search spaces and high impact of discovery), materials R&D has historically seen fewer capital investments in automation, primarily because it sits further upstream from where private investments anticipate predictable returns. There are, however, a few notable examples of SDLs being developed today. For instance, researchers at Boston University used a robot arm to test 3D-printed designs for uniaxial compression energy absorption, an important mechanical property for designing stronger structures in civil engineering and aerospace. A Bayesian optimizer was then used to iterate over 25,000 designs in a search space with trillions of possible candidates, which led to an optimized structure with the highest recorded mechanical energy absorption to date. Researchers at North Carolina State University used a microfluidic platform to autonomously synthesize >100 quantum dots, discovering formulations that were better than the previous state of the art in that material family.
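To make the closed loop at the heart of an SDL concrete, the sketch below pairs a Bayesian optimizer with placeholder synthesis-and-measurement steps. The design space, the toy objective, and the use of scikit-optimize are illustrative assumptions, not the setup of the Boston University or NC State systems described above; a real SDL would replace run_experiment with calls to its robotic hardware.

```python
# Minimal propose-make-measure-update loop for a Self-Driving Lab (illustrative only).
from skopt import Optimizer

# Hypothetical design space: two composition fractions and an annealing temperature (deg C).
design_space = [(0.0, 1.0), (0.0, 1.0), (300.0, 900.0)]
opt = Optimizer(dimensions=design_space, base_estimator="GP", random_state=0)

def run_experiment(design):
    """Stand-in for robotic synthesis and characterization.

    In a real SDL this would command the synthesis hardware and return a measured
    property (e.g., energy absorption). A toy function keeps the sketch runnable.
    """
    frac_a, frac_b, temp_c = design
    measured = -((frac_a - 0.3) ** 2 + (frac_b - 0.7) ** 2) - abs(temp_c - 600.0) / 1000.0
    return -measured  # negate because the optimizer minimizes, but we want to maximize

for _ in range(30):                      # budget of 30 autonomous experiments
    candidate = opt.ask()                # AI model proposes the next design to test
    outcome = run_experiment(candidate)  # robots would synthesize and measure it
    opt.tell(candidate, outcome)         # surrogate model updates on the new data point

best_design = opt.Xi[opt.yi.index(min(opt.yi))]
print("Best design found:", best_design)
```

The feature that distinguishes this loop from high-throughput screening is the ask/tell cycle: each new experiment is chosen using everything learned from the previous ones.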
These first-of-a-kind SDLs have shown exciting initial results, demonstrating their ability to discover new material designs in a haystack of thousands to trillions of possible designs, which would be too large for any human researcher to grasp. However, SDLs are still an emerging technology platform. In order to scale up and realize their full potential, the federal government will need to make significant and coordinated research investments to derisk this materials innovation platform and demonstrate the return on capital before the private sector is willing to invest.
Other nations are beginning to recognize the importance of a structured approach to funding SDLs: the University of Toronto’s Alan Aspuru-Guzik, a former Harvard professor who left the United States in 2018, has created an Acceleration Consortium to deploy these SDLs, and it recently received $200 million in research funding, Canada’s largest-ever research grant. In an era of strategic competition and climate challenges, maintaining U.S. competitiveness in materials innovation is more important than ever. Building a strong research program to fund, build, and deploy SDLs in research labs should be a part of the U.S. innovation portfolio.
Plan of Action
While several labs in the United States are working on SDLs, they have all received small, ad-hoc grants that are not coordinated in any way. A federal government funding program dedicated to self-driving labs does not currently exist. As a result, the SDLs are constrained to low-hanging material systems (e.g., microfluidics), with the lack of patient capital hindering labs’ ability to scale these systems and realize their true potential. A coordinated U.S. research program for Self-Driving Labs should:
Initiate an ARPA-E SDL Grand Challenge: Drawing inspiration from DARPA’s previous grand challenges that have catalyzed advancements in self-driving vehicles, ARPA-E should establish a Grand Challenge to catalyze state-of-the-art advancements in SDLs for scientific research. This challenge would involve an open call for teams to submit proposals for SDL projects, with a transparent set of performance metrics and benchmarks. Successful applicants would then receive funding to develop SDLs that demonstrate breakthroughs in automated scientific research. A projected budget for this initiative is $30 million, divided among six selected teams, each receiving $5 million over a four-year period to build and validate their SDL concepts. While ARPA-E is best positioned in terms of authority and funding flexibility, other institutions like the National Science Foundation (NSF) or DARPA itself could also fund similar programs.
Establish a Focused Research Organization to open-source SDL components: This FRO would be responsible for developing modular, open-source hardware and software specifically designed for SDL applications. Creating common standards for both the hardware and software needed for SDLs will make such technology more accessible and encourage wider adoption. The FRO would also conduct research on how automation via SDLs is likely to reshape labor roles within scientific research and provide best practices on how to incorporate SDLs into scientific workflows. A proposed operational timeframe for this organization is five years, with an estimated budget of $18 million over that period. The organization would work on prototyping SDL-specific hardware solutions and make them available on an open-source basis to foster wider community participation and iterative improvement. An FRO could be spun out of the DOE’s new Foundation for Energy Security and Innovation (FESI), which would continue to establish the DOE’s role as an innovative science funder and be an exciting opportunity for FESI to work with nontraditional technical organizations. Using FESI would not require any new authorities and could leverage philanthropic funding rather than requiring congressional appropriations.
Provide dedicated funding for the DOE national labs to build self-driving lab user facilities, so the United States can build institutional expertise in SDL operations and allow other U.S. scientists to familiarize themselves with these platforms. This funding can be specifically set aside by the DOE Office of Science or through line-item appropriations from Congress. Existing prototype SDLs, like the Argonne National Lab Rapid Prototyping Lab or Berkeley Lab’s A-Lab, that have emerged in the past several years lack sustained DOE funding but could be scaled up and supported with only $50 million in total funding over the next five years. SDLs are also one of the primary applications identified by the national labs in the “AI for Science, Energy, and Security” report, demonstrating willingness to build out this infrastructure and underscoring the recognized strategic importance of SDLs by the scientific research community.
As with any new laboratory technique, SDLs are not necessarily an appropriate tool for everything. Given that their main benefit lies in automation and the ability to rapidly iterate through designs experimentally, SDLs are likely best suited for:
- Material families with combinatorially large design spaces that lack clear design theories or numerical models (e.g., metal organic frameworks, perovskites)
- Experiments where synthesis and characterization are either relatively quick or cheap and are amenable to automated handling (e.g., UV-vis spectroscopy is a relatively simple in situ characterization technique)
- Scientific fields where numerical models are not accurate enough to use for training surrogate models or where there is a lack of experimental data repositories (e.g., the challenges of using density functional theory in material science as a reliable surrogate model)
While these heuristics are suggested as guidelines, it will take a full-fledged program with actual results to determine what systems are most amenable to SDL disruption.
When it comes to exciting new technologies, there can be incentives to misuse terms. Self-Driving Labs can be precisely defined as the automation of both material synthesis and characterization, with some degree of intelligent, automated decision-making in the loop. Based on this definition, here are common classes of experiments that are not SDLs:
- High-throughput synthesis, where synthesis automation allows for the rapid synthesis of many different material formulations in parallel (lacks characterization and AI-in-the-loop)
- Using AI surrogate models trained on numerical simulations, which rely on software-only results. Using an AI surrogate model to make material predictions and then synthesizing an optimal material is also not an SDL, though it is certainly still quite an accomplishment for AI in science (this approach lacks discovery of synthesis procedures and requires numerical models or preexisting data, neither of which is always readily available in the material sciences).
SDLs, like every other technology that we have adopted over the years, eliminate routine tasks that scientists must currently spend their time on. They will allow scientists to spend more time understanding scientific data, validating theories, and developing models for further experiments. They can automate routine tasks but not the job of being a scientist.
However, because SDLs require more firmware and software, they may favor larger facilities that can retain long-term technicians and engineers to maintain and customize SDL platforms for various applications. An FRO could help address this asymmetry by developing open-source, modular software that smaller labs can adopt more easily.
Finding True North: How Community Navigator Programs Can Forward Distributional Justice
State, local, and Tribal governments still face major capacity issues when it comes to accessing federal funding opportunities – even with the sheer number of programs started since the Bipartisan Infrastructure Law (BIL) and Inflation Reduction Act (IRA) were passed. Communities need more technical assistance if implementation of those bills is going to reach its full potential, but the federal agencies charged with distributing funding can’t offer the amount of assistance needed to get resources where they need to go quickly, effectively, and equitably.
Community navigator programs offer a potential solution. Navigators are local and regional experts with a deep understanding of the climate and clean energy challenges and opportunities in their area. These navigators can be trained in federal funding requirements, clean energy technologies, permitting processes, and more – allowing them to share that knowledge with their communities and boost capacity.
Federal agencies like the Department of Energy (DOE) should invest in standing up these programs by collecting feedback on specific capacity needs from regional partners and attaching the programs to existing technical assistance funding. These programs can look different from place to place, but agencies should consider specific goals and desired outcomes, identify appropriate regional and local partners, and explore additional flexible funding opportunities to get them off the ground.
Community navigator programs can provide much-needed capacity combined with deep place-based knowledge to create local champions with expertise in accessing federal funding – relieving agencies of technical assistance burdens and smoothing grant-writing processes for local and state partners. Agencies should quickly take advantage of these programs to implement funding more effectively.
Challenge
BIL/IRA implementation is well under way, with countless programs being stood up at record speed by federal agencies. Of course, the sheer size of the packages means that there is still quite a bit of funding on the table at DOE that risks not being distributed effectively or equitably in the allotted time frame. While the agency is making huge strides to roll out its resources—which include state-level block grants, loan guarantee programs, and tax rebates—it has limited capacity to fully understand the unique needs of individual cities and communities and to support each location effectively in accessing funding opportunities and implementing related programs.
Subnational actors own the burden of distributing and applying for funding. States, cities, and communities want to support distribution, but they are not equally prepared to access federal funding quickly. They lack what officials call absorptive capacity: the ability to apply for, distribute, and implement funding packages. Agencies don’t have comprehensive knowledge of the barriers to implementation across tens of thousands of communities and can’t provide the individualized technical assistance that is needed.
Two recent research projects identified several key ways that cities, state governments, and technical assistance organizations need support from federal agencies:
- Identifying appropriate federal funding opportunities and matching with projects and stakeholders
- Understanding complex federal requirements and processes for accessing those opportunities
- Assistance with completing applications quickly and accurately
- Assembling the necessary technical capacity during the pre-application period to develop a quality application with a higher likelihood of funding
- Guidance on allowable expenditures from federal funding that support the overall technical capacity or coordinating capability of a subnational entity to collect, analyze, and securely share data on project outcomes
While this research focuses on several BIL/IRA agencies, the Department of Energy in particular has been charged with distributing hundreds of billions of dollars to communities over the past few years. DOE faces an additional challenge: up until 2020, the agency was mainly focused on conducting basic science research. With the advent of BIL, IRA, and the CHIPS and Science Act, it has had to adjust quickly to conduct more deployment and loan guarantee activities.
In order to meet community needs, DOE needs help – and at its core, this problem is one of talent and capacity. Since the passage of BIL, DOE has increased its hiring and bolstered its offices through the Clean Energy Corps.
Yet even if DOE could hire faster and more effectively, the sheer scope of the problem outweighs any number of federal employees. Candidates need not only certain skills but also knowledge specific to each community. To fully meet the needs of the localities and individuals it aims to reach, DOE would need to develop thorough community competency for the entire country. With over 29,000 defined communities in the United States – with about half being classified as ‘low capacity’ – it’s simply impossible to hire enough people or identify and overcome the barriers each one faces in the short amount of time allotted to implementation of BIL/IRA. Government needs outside support in order to distribute funds quickly and equitably.
Opportunity
DOE, the rest of the federal government, and the national labs are keen to provide significant technical assistance for their programs. DOE’s Office of State and Community Energy Programs has put considerable time and energy into expanding its community support efforts, including the recently stood up Office of Community Engagement and the Community Energy Fellows program.
National labs have been engaging communities for a long time – the National Renewable Energy Laboratory (NREL) conducts trainings and information sessions, answers questions, and connects communities with regional and federal resources. Colorado and Alaska, for example, were well-positioned to take advantage of federal funding when BIL/IRA funds were released, thanks to federal training opportunities from NREL, DOE, and other institutions, as well as coordinated local and regional approaches to preparation. Their absorptive capacity has helped them successfully access opportunities – but only because communities, cities, and Tribal governments in those regions have spent the last decade preparing for clean energy opportunities.
While this type of long-term technical assistance and training is necessary, there are resources available right now that are at risk of not being used if states, cities, and communities can’t develop capacity quickly. As DOE continues to flex its deployment and demonstration muscles, the agency needs to invest in community engagement and regional capacity to ensure long-term success across the country.
A key way that DOE can help meet the needs of states and cities that are implementing funding is by standing up community navigator programs. These programs take many forms, but broadly, they leverage the expertise of individuals or organizations within a state or community that can act as guides to the barriers and opportunities within that place.
Community navigators themselves have several benefits. They can act as a catalytic resource by delivering quality technical assistance where federal agencies may not have capacity. In DOE’s case, this could help communities understand funding opportunities and requirements, identify appropriate funding opportunities, explore new clean energy technologies that might meet the needs of the community, and actually complete applications for funding quickly and accurately. They understand regional assets and available capital and have strong existing relationships. Further, community navigators can help build networks – connecting community-based organizations, start-ups, and subnational government agencies based on focus areas.
The DOE and other agencies with BIL/IRA mandates should design programs to leverage these navigators in order to better support state and local organizations with implementation. Programs that leverage community navigators will increase the efficiency of federal technical assistance resources, stretching them further, and will help build capacity within subnational organizations to sustain climate and clean energy initiatives longer term.
These programs can target a range of issues. In the past, they have been used to support access to individual benefits, but expanding their scope could lead to broader results for communities. Training community organizations, and by extension individuals, on how to engage with federal funding and assess capital, development, and infrastructure improvement opportunities in their own regions can help federal agencies take a more holistic approach to implementation and supporting communities. Applying for funding takes work, and navigators can help – but they can also support the rollout of proposed programs once funding is awarded and ensure the projects are seen through their life cycles. For example, understanding broader federal guidance on funding opportunities like the Office of Management and Budget’s proposed revisions to the Uniform Grants Guidance can give navigators and communities additional tools for monitoring and evaluation and administrative capacity.
Benefits of these programs aren’t limited to funding opportunities and program implementation – they can help smooth permitting processes as well. Navigators can act as ready-made champions for and experts on clean energy technologies and potential community concerns. In some communities, distrust of clean energy sources, companies, and government officials can slow permitting, especially for emerging technologies that are subject to misinformation or lack of wider recognition. Supporting community champions that understand the technologies, can advocate on their behalf, and can facilitate relationship building between developers and community members can reduce opposition to clean energy projects.
Further, community navigator programs could help alleviate cost-recovery concerns from permitting teams. Permitting staff within agencies understand that communities need support, especially in the pre-application period, but in the interest of being good stewards of taxpayer dollars they are often reluctant to invest in applications that may not turn into projects. Navigators can provide that pre-application support without drawing on permitting budgets, reducing the risk that agency resources are spent on applications that never advance.
Overall, these programs have major potential for expanding the technical assistance resources of federal agencies and the capacity of state and local governments and community-based organizations. Federal agencies with a BIL/IRA mandate should design and stand up these programs alongside the rollout of funding opportunities.
Plan of Action
With the Biden Administration’s focus on community engagement and climate and energy justice, agencies have a window of opportunity in which to expand these programs. In order to effectively expand community navigator programs, offices should:
Build community navigator programs into existing technical assistance budgets.
Offices at agencies and subcomponents with BIL/IRA funding like the Department of Energy, the Bureau of Ocean Energy Management, the Bureau of Land Management (BLM), and the Environmental Protection Agency (EPA) have expanded their technical assistance programs and introduced new initiatives using that same funding. Community navigator programs are primarily models for providing technical assistance – and can use programmatic funding. Offices should assess funding capabilities and explore flexible funding mechanisms like the ones below.
Some existing programs are attached to large block grant funding, like DOE’s Community Energy Programs attached to the Energy Efficiency and Conservation Block Grant Program. This is a useful practice as the funding source has broad goals and is relatively large and regionally nonspecific.
Collect feedback from regional partners on specific challenges and capacity needs to appropriately tailor community navigator programs.
Before setting up a program, offices should convene local and regional partners to assess major challenges in communities and better design a program. Feedback collection can take the form of journey mapping, listening sessions, convenings, or other structures. These meetings should rely on partners’ expertise and understanding of the opportunities specific to their communities.
For example, if there’s sufficient capacity for grant-writing but a lack of expertise in specific clean energy technologies that a region is interested in, that would inform the goals, curricula, and partners of a particular program. It also would help determine where the program should sit: if it’s targeted at developing clean energy expertise in order to overcome permitting hurdles, it might fit better at the BLM or be a good candidate for a partnership between a DOE office and BLM.
Partner with other federal agencies to develop more holistic programs.
The goals of these programs often speak to the mission of several agencies – for example, the goal of just and equitable technical assistance has already led to the Environmental Justice Thriving Communities Technical Assistance Centers program, a collaboration between EPA and DOE. By combining resources, agencies and offices can even further expand the capacity of a region and increase accessibility to more federal funding opportunities.
A good example of offices collaborating on these programs is below, with the Arctic Energy Ambassadors, funded by the Office of State and Community Energy Programs (SCEP) and the Arctic Energy Office.
Roadmap for Success
There are several initial considerations for building out a program, including solidifying the program’s goals, ensuring available funding sources and mechanisms, and identifying regional and local partners to ensure it is sustainable and effective. Community navigator programs should:
Identify a need and outline clear goals for the program.
Offices should clearly understand the goals of a program. This should go without saying, but given the inconsistency in needs, capacity, and readiness across different communities, it’s key to develop a program that has defined what success looks like for the participants and region. For example, community navigator programs could specifically work to help a region navigate permitting processes; develop several projects around a single clean energy technology; or understand how to apply for federal grants effectively. Just one of those goals could underpin an entire program.
Ideally, community navigator programs would offer a more holistic approach – working with regional organizations or training participants who understand the challenges and opportunities within their region to identify and assess federal funding opportunities and work together to develop projects from start to finish. But agencies just setting up programs should start with a more directed approach and seek to understand what would be most helpful for an area.
Source and secure available funding, including considerations for flexible mechanisms.
There are a number of available models using different funding and structural mechanisms. Part of the benefit of these programs is that they don’t rely solely on hiring new technical assistance staff, and offices can use programmatic funds more flexibly to work with partners. Rather than hiring staff to work directly for an agency, offices can work with local and regional organizations to administer programs, train other individuals and organizations, and augment local and community capacity.
Further, offices should aim to work across the agency and identify opportunities to pool resources. The IRA provided a significant amount of funding for technical assistance across the agency – for example, the State Energy Program funding at SCEP, the Energy Improvements in Rural and Remote Areas funding at the Office of Clean Energy Demonstrations (OCED), and the Environmental Justice Thriving Communities Technical Assistance Centers program from the EPA/Department of Energy partnership could all be used to fund these programs or award funding to organizations that could administer programs.
Community navigator programs could also be good candidates for entities like FESI, the DOE’s newly authorized Foundation for Energy Security and Innovation. Although FESI must be set up by DOE, once formally established it becomes a 501(c)(3) organization and can combine congressionally appropriated funding with philanthropic or private investments, making it a more flexible tool for collaborative projects. FESI is a good tool for the partnerships described above – it could hold funding from various sources and support partners overseeing programs while convening with their federal counterparts.
Finally, DOE is also exploring the expanded use of Partnership Intermediary Agreements (PIAs), public-private partnership tools that are explicitly targeted at nontraditional partners. As the DOE continues to announce and distribute BIL/IRA funds, these agreements could be used to administer community navigator programs.
Build relationships and partner with appropriate local and regional stakeholders.
Funding shouldn’t be the only consideration. Agency offices need to ensure they identify appropriate local and regional partners, both for administration and funding. Partners should be their own form of community navigators – they should understand the region’s clean energy ecosystem and the unique needs of the communities within. In different places, the reach and existence of these partners may vary – not every locality will have a dedicated nonprofit or institution supporting clean energy development, environmental justice, or workforce, for example. In those cases, there could be regional or county-level partners that have broader scope and more capacity and would be more effective federal partners. Partner organizations should not only understand community needs but have a baseline level of experience in working with the federal government in order to effectively function as the link between the two entities. Finding the right balance of community understanding and experience with federal funding is key.
This approach is not foolproof. NREL’s Clean Energy to Communities (C2C) peer-learning cohorts can help local champions share challenges and best practices across states and communities and are useful tools for enhancing local capacity. But the program faces challenges similar to those of other technical assistance programs: participants engage with federal institutions that provide training and technical expertise that may not directly speak to local experience. It may be more effective to train a local or regional organization with a deeper understanding of the specific challenges and opportunities of a place and greater immediate buy-in from the community. It is also challenging for NREL to identify the best candidates in communities across the country without that in-depth knowledge of a region.
Additional federal technical assistance support is sorely needed if BIL/IRA funds are to be distributed equitably and quickly. Federal agencies are moving faster than ever before but don’t have the capacity to assess state and local needs. Developing models for state and local partners can help agencies get funding out the door and where it needs to go to support communities moving towards a clean energy transition.
Case Study: DOE’s Arctic Energy Ambassadors
DOE’s Arctic Energy Office (AEO) has been training state level champions for years but recently introduced the Arctic Energy Ambassadors program, using community navigators to expand clean energy project development.
The program, announced in late October 2023, will support regional champions of clean energy with training and resources to help expand their impact in their communities and across Alaska. The ambassadors’ ultimate goal is clean energy project development: helping local practitioners access federal resources, identify appropriate funding opportunities, and address their communities’ specific clean energy challenges.
The Arctic Energy Office is leading the program with help from several federal and subnational organizations. DOE’s Office of State and Community Energy Programs and Office of Energy Efficiency and Renewable Energy are also providing funding.
On the ground, the Denali Commission will oversee distribution of funding, and the Alaska Municipal League will administer the program. The combination of comparative advantages is what will hopefully make this program successful. The Denali Commission, in addition to receiving congressionally appropriated funding, can receive funds from other nonfederal sources in service of its mission. This could help the Commission sustain the ambassadors over the longer term and use funds more flexibly. The Commission also has closer relationships with state-level and Tribal governments and can provide insight into regional clean energy needs.
The Alaska Municipal League (AML) brings additional value as a partner; its role in supporting local governments across Alaska gives it a strong sense of community strengths and needs. AML will recruit, assess, and identify the 12 ambassadors and coordinate program logistics and travel for programming. Identifying the right candidates for the program requires in-depth knowledge of Alaskan communities, including more rural and remote ones.
For its own part, the AEO will provide the content and technical expertise for the program. DOE continues to host an incredible wealth of subject matter knowledge on cutting-edge clean energy technologies, and its leadership in this area combined with the local understanding and administration by AML and Denali Commission will help the Arctic Energy Ambassadors succeed in the years to come.
In all, strong local and regional partners, diverse funding sources with flexible mechanisms for delivering them, and clear goals are key to the successful administration of community navigator programs. The Arctic Energy Ambassadors program represents one model that other agencies can look to for success.
Case Study: SCEP’s Community Energy Fellows Program
DOE’s State and Community Energy Programs office has been working tirelessly to implement BIL and IRA, and last year as part of those efforts it introduced the Community Energy Fellows Program (CEFP).
This program aims to support local and Tribal governments with their projects funded by the Energy Efficiency and Conservation Block Grants. CEFP matches midcareer energy professionals with host organizations to provide support and technical assistance on projects as well as learn more about how clean energy development happens.
Because the program has a much broader scope than the Arctic Energy Ambassadors, it solicits and assesses host institutions as well as Fellows. This allows SCEP to more effectively match the two based on issue areas, expertise, and specific skillsets. This structure allows for multiple community navigators – the host institution understands the needs of its community, and the Fellow brings expertise in federal programs and clean energy development. Both parties gain from the fellowship.
In addition, SCEP has added another resource: Clean Energy Coaches, who provide another layer of expertise to the host institution and the Fellow. These coaches will help develop the Fellows’ skills as they work to support the host institution and community.
Some of the awards are already being rolled out, with a second call for host institutions and Fellows out now. Communities in southern Maine participating in the program are optimistic about the support that navigators will provide – and their project leads have a keen sense of the challenges in their communities.
As the program continues to grow, it can provide a great opportunity for other agencies and offices to learn from its success.
Laying the Foundation for the Low-Carbon Cement and Concrete Industry
This report is part of a series on underinvested clean energy technologies, the challenges they face, and how the Department of Energy can use its Other Transaction Authority to implement programs custom tailored to those challenges.
Cement and concrete production is one of the hardest industries to decarbonize. Solutions for low-emissions cement and concrete are much less mature than those for other green technologies like solar and wind energy and electric vehicles. Nevertheless, over the past few years, young companies have achieved significant milestones in piloting their technologies and certifying their performance and emissions reductions. In order to finance new manufacturing facilities and scale promising solutions, companies will need to demonstrate consistent demand for their products at a financially sustainable price. Demand support from the Department of Energy (DOE) can help companies meet this requirement and unlock private financing for commercial-scale projects. Using its Other Transactions Authority, DOE could design a demand-support program involving double-sided auctions, contracts for difference, or price and volume guarantees. To fund such a program using existing funds, the DOE could incorporate it into the Industrial Demonstrations Program. However, additional funding from Congress would allow the DOE to implement a more robust program. Through such an initiative, the government would accelerate the adoption of low-emissions cement and concrete, providing emissions reductions benefits across the country while setting the United States up for success in the future clean industrial economy.
Introduction
Besides water, concrete is the most consumed material in the world. It is the material of choice for construction thanks to its durability, versatility, and affordability. As of 2022, the cement and concrete sector accounted for nine percent of global carbon emissions. The vast majority of the embodied emissions of concrete come from the production of Portland cement. Cement production emits carbon through the burning of fossil fuels to heat kilns (40% of emissions) and the chemical process of turning limestone and clay into cement using that heat (60% of emissions). Electrifying production facilities and making them more energy efficient can help decarbonize the former but not the latter, which requires deeper innovation.
Current solutions on the market substitute a portion of the cement used in concrete mixtures with Supplementary Cementitious Materials (SCMs) like fly ash, slag, or unprocessed limestone, reducing the embodied emissions of the resulting concrete. But these SCMs cannot replace all of the cement in concrete, and currently there is an insufficient supply of readily usable fly ash and slag for wider adoption across the industry.
The next generation of ultra-low-carbon, carbon-neutral, and even carbon-negative solutions seeks to develop alternative feedstocks and processes for producing cement or cementitious materials that can replace cement entirely and to capture carbon in aggregates and wet concrete. The DOE reports that testing and scaling these new technologies is crucial to fully eliminate emissions from concrete by 2050. Bringing these new technologies to the market will not only help the United States meet its climate goals but also promote U.S. leadership in manufacturing.
A number of companies have established pilot facilities or are in the process of constructing them. These companies have successfully produced near-carbon-neutral and even carbon-negative concrete. Building off of these milestones, companies will need to secure financing to build full-scale commercial facilities and increase their manufacturing capacity.
Challenges Facing Low-Carbon Cement and Concrete
A key requirement for accessing both private-sector and government financing for new facilities is that companies obtain long-term offtake agreements, which assure financiers that there will be a steady source of revenue once the facility is built. But the boom-and-bust nature of the construction industry discourages construction companies and intermediaries from entering into long-term financial commitments in case there won’t be a project to use the materials for. Cement, aggregates, and other concrete inputs also take up significant volume, so it would be difficult and costly for potential offtakers to store excess amounts during construction lulls. For these reasons, construction contractors procure concrete on an as-needed, project-specific basis.
Adding to the complexity, structural features of the cement and concrete market increase the difficulty of securing long-term offtake agreements:
- Long, fragmented supply chain: While the supply chain is highly concentrated at either end, there are multiple intermediaries between the actual producers of cement, aggregates, and other inputs and the final construction customers. These include the thousands of ready-mix concrete producers, along with materials dealers, construction contractors, and subcontractors. As a result, construction customers usually aren’t buying materials themselves, and their contractors or subcontractors often aren’t buying materials directly from cement producers.
- Regional fragmentation: Cement, aggregates, and other concrete inputs are heavy products, which entail high freight costs and embodied emissions from transportation, so producers have a limited range in which they are willing to ship their product. After these products are shipped to a ready-mix concrete facility, the fresh concrete must then be delivered to the construction site within 60 to 90 minutes or the concrete will harden. As a result, the localization of supply chains limits the potential customers for a new manufacturing plant.
- Low margins: The cement and concrete markets operate with very low margins, so buyers are highly sensitive to price. Consequently, low-carbon cement and concrete may struggle to compete against conventional options due to their green premiums.
Luckily, private construction is not the only customer for concrete. The U.S. government (federal, state, and local combined) accounts for roughly 50% of all concrete procurement in the country. Used correctly, the government’s purchasing power can be a powerful lever for spurring the adoption of decarbonized cement and concrete. However, the government faces barriers similar to the private sector’s when it comes to entering into long-term offtake agreements. Government procurement of concrete goes through multiple intermediaries and operates on an as-needed, project-specific basis: government agencies like the General Services Administration (GSA) enter into agreements with construction contractors for specific projects, and then the contractors or their subcontractors make the ultimate purchasing decisions for concrete.
Federal Support
The Federal Buy Clean Initiative, launched in 2021 by the Biden Administration, is starting to address the procurement challenge for low-carbon cement and concrete. Among the initiative’s programs is the allocation of $4.5 billion from the Inflation Reduction Act (IRA) for the GSA and the Department of Transportation (DOT) to use lower-carbon construction materials. Under the initiative, the GSA is piloting direct procurement of low-embodied-carbon materials for federal construction projects. To qualify as low-embodied-carbon concrete under the GSA’s interim requirements, concrete mixtures only have to achieve a roughly 25–50% reduction in carbon content,1 depending on the compressive strength. The requirement may be even less stringent if no concrete meeting this standard is available near the project site. Since the bar is only slightly below that of traditional concrete, young companies developing the solutions to fully decarbonize concrete will have trouble competing on price against companies producing more well-established but higher-emission solutions like fly ash, slag, and limestone concrete mixtures to secure procurement contracts. Moreover, the just-in-time, project-specific nature of these procurement contracts means they still don’t address young companies’ need for long-term price and customer security in order to scale up.
The ideal solution for this is a demand-support program. The DOE Office of Clean Energy Demonstrations (OCED) is developing a demand-support program for the Hydrogen Hubs initiative, setting aside $1 billion for demand-support to accompany the $7 billion in direct funding to regional Hydrogen Hubs. In its request for proposals, OCED says that the hydrogen demand-support program will address the “fundamental mismatch in [the market] between producers, who need long-term certainty of high-volume demand in order to secure financing to build a project, and buyers, who often prefer to buy on a short-term basis at more modest volumes, especially for products that have yet to be produced at scale and [are] expected to see cost decreases.”
A demand-support program could do the same for low-carbon cement and concrete, addressing the market challenges that grants alone cannot. OCED is reviewing applications for the $6.3 billion Industrial Demonstrations Program. Similar to the Hydrogen Hubs, OCED could consider setting aside $500 million to $1 billion of the program funds to implement demand-support programs for the two highest-emitting heavy industries, low-carbon cement/concrete and steel, at $250 million to $500 million each.
Additional funding from Congress would allow DOE to implement a more robust demand-support program. Federal investment in industrial decarbonization grew from $1.5 billion in FY21 to over $10 billion in FY23, thanks largely to new funding from BIL and IRA. However, the sector remains underfunded relative to its emissions, contributing 23% of the country’s emissions while receiving less than 12% of federal climate innovation funding. A promising piece of legislation that was recently introduced is the Concrete and Asphalt Innovation Act of 2023, which would, among other things, direct the DOE to establish a program of research, development, demonstration, and commercial application of low-emissions cement, concrete, asphalt binder, and asphalt mixture. This would include a demonstration initiative authorized at $200 million and the production of a five-year strategic plan to identify new programs and resources needed to carry out the mission. If the legislation is passed, the DOE could propose a demand-support program in its strategic plan and request funding from Congress to set it up. The faster route, however, would be for Congress to add a section to the Act that directly establishes a demand-support program within DOE and authorizes funding for it.
Other Transactions Authority
BIL and IRA gave DOE an expanded mandate to support innovative technologies from early-stage research through commercialization. In order to do so, DOE must be just as innovative in its use of its available authorities and resources. Tackling the challenge of bringing technologies from pilot to commercialization requires DOE to look beyond traditional grant, loan, and procurement mechanisms. Previously, we have identified the DOE’s Other Transaction Authority (OTA) as an underleveraged tool for accelerating clean energy technologies.
OTA is defined in legislation as the authority to enter into transactions that are not government grants or contracts in order to advance an agency’s mission. This negative definition provides DOE with significant freedom to design and implement flexible financial agreements that can be tailored to address the unique challenges that different technologies face. DOE plans to use OTA to implement the hydrogen demand-support program, and it could also be used for a demand-support program for low-carbon cement and concrete. The DOE’s new Guide to Other Transactions provides official guidance on how DOE personnel can use the flexibilities provided by OTA.
Defining Products for Demand Support
Before setting up a demand-support program, DOE first needs to define what a low-carbon cement or concrete product is and the value it provides in emissions avoided. This is not straightforward due to (1) the heterogeneity of solutions, which prevents apples-to-apples comparisons in price, and (2) variations in the amount of avoided emissions that different solutions can provide. To address the first issue, for products that are not ready-mix concrete, the DOE should calculate the cost of a unit of concrete made using the product, based on a standardized mix ratio of a specific compressive strength and market prices for the other components of the concrete mix. To address the second issue, the DOE should then divide the calculated price per unit of concrete (e.g., $/m3) by the amount of CO2 emissions avoided per unit of concrete compared to the NRMCA’s industry average (e.g., kg CO2/m3) to determine the effective price per unit of CO2 emissions avoided. The DOE can then fairly compare bids from different projects using this metric. Such an approach would result in the government providing demand support for the products that are most cost-effective at reducing carbon emissions, rather than solely the cheapest.
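To illustrate the arithmetic, here is a minimal sketch in Python of the effective-price metric described above. The prices, embodied-emissions figures, and baseline are hypothetical placeholders rather than DOE or NRMCA values; a real program would draw them from standardized mix designs and verified Environmental Product Declarations.

```python
# Minimal sketch of the effective-price metric described above. All figures
# (prices, embodied emissions, baseline) are hypothetical placeholders,
# not DOE or NRMCA values.

BASELINE_EMISSIONS = 410.0  # assumed industry-average kg CO2e per m^3 of concrete

def effective_price_per_kg_co2(price_per_m3, emissions_per_m3, baseline=BASELINE_EMISSIONS):
    """Dollars per kg of CO2 avoided relative to the industry-average baseline."""
    avoided = baseline - emissions_per_m3
    if avoided <= 0:
        raise ValueError("product does not avoid emissions relative to the baseline")
    return price_per_m3 / avoided

# Two hypothetical bids: a novel binder priced into a standardized mix, and a
# ready-mix product bid directly.
bids = {
    "binder_A": {"price_per_m3": 145.0, "emissions_per_m3": 180.0},
    "ready_mix_B": {"price_per_m3": 160.0, "emissions_per_m3": 90.0},
}

for name, bid in bids.items():
    print(f"{name}: ${effective_price_per_kg_co2(**bid):.3f} per kg CO2 avoided")
# binder_A works out to about $0.63/kg avoided and ready_mix_B to about $0.50/kg,
# so the pricier product ranks better because it avoids more emissions.
```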
Furthermore, the DOE should set an upper limit on the amount of embodied carbon that the concrete product, or concrete made with the product, may contain in order to qualify as “low carbon.” We suggest that the DOE use the limits established by the First Movers Coalition, an international corporate advanced market commitment for concrete and other hard-to-abate industries organized by the World Economic Forum. The limits were developed through conversations with incumbent suppliers, start-ups, nonprofits, and intergovernmental organizations on what would be achievable by 2030. The limits were designed to help move the needle towards commercializing solutions that enable full decarbonization.
Companies that participate in a DOE demand-support program should be required after one or two years of operations to confirm that their product meets these limits through an Environmental Product Declaration.2 Using carbon offsets to reach that limit should not be allowed, since the goal is to spur the innovation and scaling of technologies that can eventually fully decarbonize the cement and concrete industry.
Below are some ideas for how DOE can set up a demand-support program for low-carbon cement and concrete.
Program Proposals
Double-Sided Auction
Double-sided auctions are designed to support the development of production capacity for green technologies and products and the creation of a market by providing long-term price certainty to suppliers and facilitating the sale of their products to buyers. As the name suggests, a double-sided auction consists of two phases: First, the government or an intermediary organization holds a reverse auction for long-term purchase agreements (e.g., 10 years) for the product from suppliers, who are incentivized to bid the lowest possible price in order to win. Next, the government conducts annual auctions of short-term sales agreements to buyers of the product. Once sales agreements are finalized, the product is delivered directly from the supplier to the buyer, with the government acting as a transparent intermediary. The government thus serves as a market maker by coordinating the purchase and sale of the product from producers to buyers. Government funding covers the difference between the original purchase price and the final sale price, reducing the impact of the green premium for buyers and sellers.
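As a rough illustration of the budget mechanics, the sketch below (in Python, with hypothetical prices and volumes) shows how the annual subsidy under a double-sided auction equals the gap between the contracted long-term purchase price and the price cleared in that year's sales auction.

```python
# Illustrative settlement math for a double-sided auction. Prices and volumes
# are hypothetical; a real program would add multi-year schedules, delivery
# verification, and contract terms.

def annual_subsidy(purchase_price, sale_price, volume):
    """Government outlay bridging the long-term purchase price and the
    price cleared in this year's sales auction, for the matched volume."""
    return max(purchase_price - sale_price, 0.0) * volume

# Hypothetical: a supplier won a 10-year purchase agreement at $150/tonne, and
# this year's buyer-side auction cleared 20,000 tonnes at $115/tonne.
outlay = annual_subsidy(purchase_price=150.0, sale_price=115.0, volume=20_000)
print(f"Government covers ${outlay:,.0f} this year")  # $700,000
# As buyer willingness to pay rises toward the purchase price over time, the
# required subsidy shrinks automatically.
```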
While the federal government has not yet implemented a double-sided auction program, OCED is considering setting up the hydrogen demand-support measure as a “market maker” that provides a “ready purchaser/seller for clean hydrogen.” Such a market maker program could be implemented most efficiently through double-sided auctions.
Germany was the first to conceive of and develop the double-sided auction scheme. The H2Global initiative was established in 2021 to support the development of production capacity for green hydrogen and its derivative products. The program is implemented by Hintco, an intermediary company, which is currently evaluating bids for its first auction for the purchase of green ammonia, methanol, and e-fuels, with final contracts expected to be announced as soon as this month. Products will start to be delivered by the end of 2024.

[Figure: H2Global’s double-sided auction mechanism. Source: H2Global]
A double-sided auction scheme for low-carbon cement and concrete would address producers’ need for long-term offtake agreements while matching buyers’ short-term procurement needs. The auctions would also help develop transparent market prices for low-carbon cement and concrete products.
All bids for purchase agreements should include detailed technical specifications and/or certifications for the product, the desired price per unit, and a robust, third-party life-cycle assessment of the amount of embodied carbon per unit of concrete made with the product, at different compressive strengths. Additionally, bids of ready-mix concrete should include the location(s) of their production facility or facilities, and bids of cement and other concrete inputs should include information on the locations of ready-mix concrete facilities capable of producing concrete using their products. The DOE should then select bids through a pure reverse auction using the calculated effective price per unit of CO2 emissions avoided. To account for regional fragmentation, the DOE could conduct separate auctions for each region of the country.
A double-sided auction presents similar benefits to the low-carbon cement and concrete industry as an advance market commitment would. However, the addition of an efficient, built-in system for the government to then sell that cement or concrete allotment to a buyer means that the government is not obligated to use the cement or concrete itself. This is important because the logistics of matching cement or concrete production to a suitable government construction project can be difficult due to regional fragmentation, and the DOE is not a major procurer of cement and concrete.3 Instead, under this scheme, federal, state, or local agencies working on a construction project or their contractors could check the double-sided auction program each year to see if there is a product offering in their region that matches their project needs and sustainability goals for that year, and if so, submit a bid to procure it. In fact, this should be encouraged as a part of the Federal Buy Clean Initiative, since the government is such an important consumer of cement and concrete products.
Contracts for Difference
Contracts for difference (CfD, or sometimes called two-way CfD) programs aim to provide price certainty for green technology projects and close the gap between the price that producers need and the price that buyers are willing to offer. CfD have been used by the United Kingdom and France primarily to support the development of large-scale renewable energy projects. However, CfD can also be used to support the development of production capacity for other green technologies. OCED is considering CfD (also known as pay-for-difference contracts) for its hydrogen demand-support program.
CfD are long-term contracts signed between the government or a government-sponsored entity and companies looking to expand production capacity for a green product.4 The contract guarantees that once the production facility comes online, the government will ensure a steady price by paying suppliers the difference between the market price for which they are able to sell their product and a predetermined “strike price.” On the other hand, if the market price rises above the strike price, the supplier will pay the difference back to the government. This prevents the public from funding any potential windfall profits.
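The settlement logic of a two-way CfD is simple enough to sketch directly. In the hypothetical example below, the strike price, market prices, and volume are illustrative only; the point is that payments flow toward the supplier when the market price is below the strike and back to the government when it is above.

```python
# Minimal sketch of two-way CfD settlement. The strike price, market prices,
# and volume are hypothetical.

def cfd_settlement(strike, market_price, volume):
    """Positive result: government pays the supplier; negative result: the
    supplier pays the difference back to the government."""
    return (strike - market_price) * volume

strike = 140.0  # $/tonne, hypothetical strike set at auction
for market_price in (120.0, 140.0, 155.0):
    payment = cfd_settlement(strike, market_price, volume=10_000)
    if payment > 0:
        direction = "government pays supplier"
    elif payment < 0:
        direction = "supplier pays government"
    else:
        direction = "no transfer"
    print(f"market ${market_price:.0f}/t -> {direction}: ${abs(payment):,.0f}")
```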


[Figure: How a contract for difference works. Source: Canadian Climate Institute]
A CfD program could provide a source of demand certainty for low-carbon cement and concrete companies looking to finance the construction of pilot- and commercial-scale manufacturing plants or the retrofitting of existing plants. The selection of recipients and strike prices should be determined through annual reverse auctions. In a typical reverse auction for CfD, the government sets a cap on the maximum number of units of product and the max strike price they’re willing to accept. Each project candidate then places a sealed bid for a unit price and the amount of product they plan to produce. The bids are ranked by unit price, and projects are accepted from low to high unit price until either the max total capacity or max strike price is reached. The last project accepted sets the strike price for all accepted projects. The strike price is adjusted annually for inflation but otherwise fixed over the course of the contract. Compared to traditional subsidy programs, a CfD program can be much more cost-efficient thanks to the reverse auction process. The UK’s CfD program has seen the strike price fall with each successive round of auctions.
Applying this to the low-carbon cement and concrete industry requires some adjustments, since there are a variety of products for decarbonizing cement and concrete. As discussed prior, the DOE should compare project bids according to the effective price per unit CO2 abated when the product is used to make concrete. The DOE should also set a cap on the maximum volume of CO2 it wishes to abate and the maximum effective price per unit of CO2 abated that it is willing to pay. Bids can then be accepted from low to high price until one of those caps is hit. Instead of establishing a single strike price, the DOE should use the accepted project’s bid price as the strike price to account for the variation in types of products.
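A minimal sketch of this adapted selection process is shown below. The bids, abatement volumes, and caps are hypothetical, and the treatment of the marginal project (a partial award) is an illustrative design choice rather than a settled program rule.

```python
# Sketch of the adapted reverse auction: bids are ranked by effective price per
# tonne of CO2 abated, accepted from cheapest upward until either the abatement
# cap or the price cap is hit, and each accepted project keeps its own bid as
# its strike price (pay-as-bid). All numbers are hypothetical.

MAX_ABATEMENT = 500_000  # tonnes CO2 per year the program will support (assumed)
MAX_PRICE = 180.0        # $ per tonne CO2 abated the program will pay (assumed)

# (project, bid in $ per tonne CO2 abated, tonnes CO2 abated per year)
bids = [
    ("project_A", 95.0, 150_000),
    ("project_B", 130.0, 250_000),
    ("project_C", 160.0, 200_000),
    ("project_D", 210.0, 100_000),  # above the price cap, so it is rejected
]

accepted, remaining = [], MAX_ABATEMENT
for name, price, abatement in sorted(bids, key=lambda b: b[1]):
    if price > MAX_PRICE or remaining <= 0:
        break
    awarded = min(abatement, remaining)  # marginal project gets a partial award
    accepted.append((name, price, awarded))
    remaining -= awarded

for name, strike, awarded in accepted:
    print(f"{name}: strike ${strike:.0f}/t CO2 for {awarded:,} t/yr")
```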
Backstop Price Guarantee
A CfD program could be designed as a backstop price guarantee if one removes the requirement that suppliers pay the government back when market prices rise above the strike price. In this case, the DOE would set a lower maximum strike price for CO2 abatement, knowing that suppliers will be willing to bid lower strike prices, since there is now the opportunity for unrestricted profits above the strike price. The DOE would then only pay in the worst-case scenario when the market price falls below the strike price, which would operate as an effective price floor.
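A one-line change to the CfD settlement captures this variant: with the clawback removed, the strike operates purely as a price floor. The sketch below uses hypothetical numbers.

```python
# Sketch of the one-sided (backstop) variant: the government pays only when the
# market price falls below the strike, so the strike acts as a price floor.
# Numbers are hypothetical.

def backstop_payment(strike, market_price, volume):
    """Government outlay under a price floor; never negative."""
    return max(strike - market_price, 0.0) * volume

for market_price in (100.0, 125.0, 150.0):
    print(f"market ${market_price:.0f}/t -> government pays "
          f"${backstop_payment(125.0, market_price, 10_000):,.0f}")
# Above the $125/t floor, the supplier keeps the upside and the government pays nothing.
```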
Backstop Volume Guarantee
Alternatively, the DOE could address demand uncertainty by providing a volume guarantee. In this case, the DOE could conduct a reverse auction for volume guarantee agreements with manufacturers, wherein the DOE would commit to purchasing any units of product short of the volume guarantee that the company is unable to sell each year for a certain price, and the company would commit to a ceiling on the price they will charge buyers.5 Using OTA, the DOE could implement such a program in collaboration with DOT or GSA, wherein DOE would purchase the materials and DOT or GSA would use the materials for their construction needs.
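The sketch below illustrates, with hypothetical numbers, how the government's annual outlay under a backstop volume guarantee would be computed: DOE absorbs only the shortfall between the guaranteed volume and what the company actually sells.

```python
# Sketch of a backstop volume guarantee with hypothetical numbers: DOE buys
# whatever portion of the guaranteed volume the company cannot sell in a given
# year, at the agreed backstop price, while the company honors a price ceiling
# for its buyers.

def government_purchase(guaranteed_volume, units_sold, backstop_price):
    """Dollar outlay for the unsold units DOE must absorb this year."""
    shortfall = max(guaranteed_volume - units_sold, 0.0)
    return shortfall * backstop_price

# Hypothetical: 50,000 tonnes guaranteed at $120/tonne; the company sells
# 38,000 tonnes on the open market, leaving a 12,000-tonne shortfall.
print(f"DOE purchases ${government_purchase(50_000, 38_000, 120.0):,.0f} of product")  # $1,440,000
```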
Other Considerations for Implementation
Rather than directly managing a demand-support program, the DOE should enter into an OT agreement with an external nonprofit entity to administer the contracts.6 The nonprofit entity would then hold auctions and select, manage, and fulfill the contracts. DOE is currently in the process of doing this for the hydrogen demand-support program.
A nonprofit entity could provide two main benefits. First, the logistics of implementing such a program would not be trivial, given the number of different suppliers, intermediaries, and offtakers involved. An external entity would have an easier and faster time hiring staff with the necessary expertise compared to the federal hiring process and limited budget for program direction that the DOE has to contend with. Second, the entity’s independent nature would make it easier to gain lasting bipartisan support for the demand-support program, since the entity would not be directly associated with any one administration.
Coordination with Other DOE Programs
The green premium for near-zero-carbon cement and concrete products is steep, and demand-support programs like the ones proposed in this report should not be considered a cure-all for the industry, since it may be difficult to secure a large enough budget for any one such program to fully address the green premium across the industry. Rather, demand-support programs can complement the multiple existing funding authorities within the DOE by closing the residual gap between emerging technologies and conventional alternatives after other programs have helped to lower the green premium.
The DOE’s Loan Programs Office (LPO) received a significant increase in its lending authority from the IRA and has the ability to provide loans or loan guarantees to innovative clean cement facilities, resulting in cheaper capital financing and providing an effective subsidy. In addition, the IRA and the Bipartisan Infrastructure Law provided substantial new funding for the demonstration of industrial decarbonization technologies through OCED.
Policies like these can be chained together. For example, a clean cement start-up could simultaneously apply to OCED for funding to demonstrate its technology at scale and to LPO for a loan or loan guarantee, subject to due diligence on its business plan. Together, these two programs drive down the green premium and derisk the companies that successfully receive their support, leaving a much more modest price premium that a mechanism like a double-sided auction could affordably cover with less risk.
Successfully chaining policies like this requires deep coordination across DOE offices. OCED and LPO would need to work in lockstep in conducting technical evaluations and due diligence of projects that apply to both and prioritize funding of projects that meet both offices’ criteria for success. The best projects should be offered both demonstration funding from OCED and conditional commitments from LPO, which would provide companies with the confidence that they will receive follow-on funding if the demonstration is successful and other conditions are met, while posing no added risk to LPO since companies will need to meet their conditions first before receiving funds. The assessments should also consider whether the project would be a strong candidate for receiving demand support through a double-sided auction, CfD program, or price/volume guarantee, which would help further derisk the loan/loan guarantee and justify the demonstration funding.
Candidates for receiving support from all three public funding instruments would of course need to be especially rigorously evaluated, since the fiscal risk and potential political backlash of such a project failing are also much greater. If successful, such coordination would ensure that the combination of these programs substantially moves the needle on bringing emerging technologies in green cement and concrete to commercial scale.
Conclusion
Demand support can help address the key barrier that low-carbon cement and concrete companies face in scaling their technologies and financing commercial-scale manufacturing facilities. Whichever approach the DOE chooses to take, the agency should keep in mind (1) the importance of setting an ambitious standard for what qualifies as low-carbon cement and concrete and comparing proposals using a metric that accounts for the range of different product types and embodied emissions, (2) the complex implementation logistics, and (3) the benefits of coordinating a demand-support program with the agency’s demonstration and loan programs. Implemented successfully, such a program would crowd in private investment, accelerate commercialization, and lay the foundation for the clean industrial economy in the United States.
Breaking Ground on Next-Generation Geothermal Energy
This report is part one of a series on underinvested clean energy technologies, the challenges they face, and how the Department of Energy can use its Other Transaction Authority to implement programs custom tailored to those challenges.
The United States has been gifted with an abundance of clean, firm geothermal energy lying below our feet – tens of thousands of times more than the country has in untapped fossil fuels. Geothermal technology is entering a new era, with innovative approaches on their way to commercialization that will unlock access to more types of geothermal resources. However, the development of commercial-scale geothermal projects is an expensive affair, and the U.S. government has severely underinvested in this technology. The Inflation Reduction Act and the Bipartisan Infrastructure Law concentrated clean energy investments in solar and wind, which are great near-term solutions for decarbonization, but neglected to invest sufficiently in solutions like geothermal energy, which are necessary to reach full decarbonization in the long term. With new funding from Congress or potentially the creative (re)allocation of existing funding, the Department of Energy (DOE) could take a number of different approaches to accelerating progress in next-generation geothermal energy, from leasing agency land for project development to providing milestone payments for the costly drilling phases of development.
Introduction
As the United States power grid transitions towards clean energy, the increasing mix of intermittent renewable energy sources like solar and wind must be balanced by sources of clean firm power that are available around the clock in order to ensure grid reliability and reduce the need to overbuild solar, wind, and battery capacity. Geothermal power is a leading contender for addressing this issue.
Conventional geothermal (also known as hydrothermal) power plants tap into existing hot underground aquifers and circulate the hot water to the surface to generate electricity. Thanks to an abundance of geothermal resources close to the earth’s surface in the western part of the country, the United States currently leads the world in geothermal power generation. Conventional geothermal power plants are typically located near geysers and steam vents, which indicate the presence of hydrothermal resources belowground. However, these hydrothermal sites represent just a small fraction of the total untapped geothermal potential beneath our feet — more than the potential of fossil fuel and nuclear fuel reserves combined.
Next-generation geothermal technologies, such as enhanced geothermal systems (EGS), closed-loop or advanced geothermal systems (AGS), and other novel designs, promise to allow access to a wider range of geothermal resources. Some designs can potentially also serve double duty as long-duration energy storage. Rather than tapping into existing hydrothermal reservoirs underground, these technologies drill into hot dry rock, engineer independent reservoirs using either hydraulic stimulation or extensive horizontal drilling, and then introduce new fluids to bring geothermal energy to the surface. These new technologies have benefited from advances in the oil and gas industry, resulting in lower drilling costs and higher success rates. Furthermore, some companies have been developing designs for retrofitting abandoned oil and gas wells to convert them into geothermal power plants. The commonalities between these two sectors present an opportunity not only to leverage the existing workforce, engineering expertise, and supply chain from the oil and gas industry to grow the geothermal industry but also to support a just transition such that current workers employed by the oil and gas industry have an opportunity to help build our clean energy future.
Over the past few years, a number of next-generation geothermal companies have had successful pilot demonstrations, and some are now developing commercial-scale projects. As a result of these successes and the growing demand for clean firm power, power purchase agreements (PPAs) for an unprecedented 1GW of geothermal power have been signed with utilities, community choice aggregators (CCAs), and commercial customers in the United States in 2022 and 2023 combined. In 2023, PPAs for next-generation geothermal projects surpassed those for conventional geothermal projects in terms of capacity. While this is promising, barriers remain to the development of commercial-scale geothermal projects. To meet its goal of net-zero emissions by 2050, the United States will need to invest in overcoming these barriers for next-generation geothermal energy now, lest the technology fail to scale to the level necessary for a fully decarbonized grid.
Meanwhile, conventional hydrothermal still has a role to play in the clean energy transition. The United States needs all the clean firm power that it can get, whether that comes from conventional or next-generation geothermal, in order to retire baseload coal and natural gas plants. The construction of conventional hydrothermal power plants is less expensive and easier to finance, since the technology is tried and tested, and there are still plenty of untapped hydrothermal resources in the western part of the country.
Challenges Facing Geothermal Projects
Funding is the biggest barrier to commercial development of next-generation geothermal projects. There are two types of private financing: equity financing and debt financing. Equity financing is more risk tolerant and is typically the source of funding for start-ups as they move from the R&D to demonstration phases of their technology. But because equity financing has a dilutive effect on the company, debt financing is preferred when it comes to the construction of commercial-scale projects. However, first-of-a-kind commercial projects are almost always precluded from accessing debt financing. It is commonly understood within the industry that private lenders will not take on technology risk, meaning that technologies must be at a Technology Readiness Level (TRL) of 9, where they have been proven to operate at commercial scale, and government lenders like the DOE Loan Programs Office (LPO) generally will not take on any risk that private lenders won’t. Manifestations of technology risk in next-generation geothermal include the possibility that a plant underproduces, which would impact its profitability, or that its capacity declines faster than expected, reducing its operating lifetime. Moving next-generation technologies from the current TRL-7 level to TRL-9 will be key to establishing the reliability of these emerging technologies and unlocking debt financing for future commercial-scale projects.
Underproduction will likely remain a risk, though to a lesser extent, for next-generation projects even after technologies reach TRL-9. This is because uncertainty in the exploration and subsurface characterization process makes it possible for developers to overestimate the temperature gradient and thus the production capacity of a project. Hydrothermal projects also share this risk: the factors determining the production capacity for hydrothermal projects include not only the temperature gradient but also the flow rate and enthalpy of the natural reservoir. In the worst-case scenario, drilling can result in a dry hole that produces no hot fluids at all. This becomes a financial issue if the project is unable to generate as much revenue as expected due to underproduction or additional wells must be drilled to compensate, driving up the total project cost. Thus, underproduction is a risk shared by both next-generation and conventional geothermal projects. Research into improvements to the accuracy and cost of geothermal exploration and subsurface characterization can help mitigate this risk but may not eliminate it entirely, since there is a risk-cost trade-off in how much time is spent on exploration and subsurface characterization.
Another challenge for both next-generation and conventional geothermal projects is that they are more expensive to develop than solar or wind projects. Drilling requires significant upfront capital expenditures, making up about half of the total capital costs of developing a geothermal project, if not more. For example, in EGS projects, the first few wells can cost around $10 million each, while conventional hydrothermal wells, which are shallower, can cost around $3–7 million each. While conventional hydrothermal plants only consist of two to six wells on average, designs for commercial EGS projects can require several times that number of wells. Fortunately, EGS projects benefit from the fact that wells can be drilled identically, so projects expect to move down the learning curve as they drill more wells, resulting in faster and cheaper drilling. Initial data from commercial-scale projects currently being developed suggest that the learning curves may be even steeper than expected. Nevertheless, this will need to be proven at scale across different locations. Some companies have managed to forgo expensive drilling costs by focusing on technologies that can be installed within idle hydrothermal wells or abandoned oil and gas wells to convert them into productive geothermal wells.
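To make the learning-curve effect concrete, the sketch below applies a standard Wright’s-law cost curve to a hypothetical EGS drilling campaign; the per-well cost, well count, and learning rate are illustrative assumptions rather than figures from any project discussed here.

```python
# Illustrative sketch: applying a Wright's-law learning curve to per-well
# drilling costs for a hypothetical EGS well campaign. All numbers are
# assumptions for illustration, not figures from this report.

import math

def well_cost(first_well_cost: float, well_number: int, learning_rate: float) -> float:
    """Cost falls by `learning_rate` for every doubling of cumulative wells drilled."""
    b = math.log2(1 - learning_rate)
    return first_well_cost * well_number ** b

# Assumed inputs: $10M first well, 20-well project, 15% cost reduction per doubling.
first_well, n_wells, lr = 10e6, 20, 0.15

total = sum(well_cost(first_well, n, lr) for n in range(1, n_wells + 1))
print(f"20th well cost: ${well_cost(first_well, n_wells, lr) / 1e6:.1f}M")
print(f"Total drilling cost: ${total / 1e6:.0f}M")
```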
Beyond funding, geothermal projects need to obtain land where there are suitable geothermal resources and permits for each stage of project development. The best geothermal resources in the United States are concentrated in the West, where the federal government owns most of the land. The Bureau of Land Management (BLM) manages much of that land, as well as all subsurface resources on federal land. However, how the BLM leases its land varies by state. While Nevada BLM has been very consistent about holding regular lease sales each year, California BLM has not held a lease sale since 2016. Adding to the complexity is the fact that although BLM manages all subsurface resources on federal land, surface land may sometimes be managed by a different agency, in which case both agencies will need to be involved in the leasing and permitting process.
Last, next-generation geothermal companies face a green premium on electricity produced using their technology, though the green premium does not appear to be as significant a challenge for next-generation geothermal as it is for other green technologies. In states with high renewables penetration, utilities and their regulators are beginning to recognize the extra value that clean firm power provides in terms of grid reliability. For example, the California Public Utilities Commission has issued an order for utilities to procure 1 GW of clean, firm power by 2026, motivating a wave of new demand from utilities and community choice aggregators. As a result of this demand and California’s high electricity prices in general, geothermal projects have successfully signed a flurry of PPAs over the past year. These have included projects located in Nevada and Utah that can transmit electricity to California customers. In most other western states, however, electricity prices are much lower, so utility companies can be reluctant to sign PPAs for next-generation geothermal projects if they aren’t required to, due to the high cost and technology risk. As a result, next-generation geothermal projects in those states have turned to commercial customers, like those operating data centers, who are willing to pay more to meet their sustainability goals.
Federal Support
The federal government is beginning to recognize the important role of next-generation geothermal power in the clean energy transition. In 2023, geothermal energy became eligible for the renewable energy investment and production tax credits for the first time, thanks to technology-neutral language introduced in the Inflation Reduction Act (IRA). Within DOE, the Geothermal Technologies Office (GTO) launched the Enhanced Geothermal Shot in 2022 to reduce the cost of EGS by 90%, to $45/MWh, by 2035 and make geothermal widely available. In 2020, the Frontier Observatory for Research in Geothermal Energy (FORGE), a dedicated underground field laboratory for EGS research, drilling, and technology testing established by GTO in 2014, drilled its first well using new approaches and tools the lab had developed. This year, GTO announced funding for seven EGS pilot demonstrations from the Bipartisan Infrastructure Law (BIL), for which it is currently reviewing the first round of applications. GTO also awarded the Geothermal Energy from Oil and Gas Demonstrated Engineering (GEODE) grant to a consortium formed by Project Innerspace, the Society of Petroleum Engineers International, and Geothermal Rising, with over 100 partner entities, to transfer best practices from the oil and gas industry to geothermal, support demonstrations and deployments, identify barriers to growth in the industry, and encourage workforce adoption.
While these initiatives are a good start, significantly more funding from Congress is necessary to support the development of pilot demonstrations and commercial-scale projects and enable wider adoption of geothermal energy. The BIL notably expanded the DOE’s mission area in supporting the deployment of clean energy technologies, including establishing the Office of Clean Energy Demonstrations (OCED) and funding demonstration programs from the Energy Division of BIL and the Energy Act of 2020. However, the $84 million in funding authorized for geothermal pilot demonstrations was only a fraction of the funding that other programs received from BIL and not commensurate with the actual cost of next-generation geothermal projects. Congress should be investing an order of magnitude more in next-generation geothermal projects in order to maintain U.S. leadership in geothermal energy and reap the many benefits to the grid, the climate, and the economy.
Another key issue is that DOE has, both now and in the past, limited all of its funding for next-generation geothermal to EGS technologies. As a result, companies pursuing closed-loop/AGS and other next-generation technologies cannot qualify, leading some projects to be moved abroad. Given GTO’s historically limited budget, it’s possible that this was a strategic decision to concentrate funding on one technology rather than diluting it across several. However, given that none of these technologies has been successfully commercialized at wide scale yet, DOE may be missing the opportunity to invest in the full range of viable approaches. DOE appears to be aware of this, as the agency currently has a working group on AGS. New funding from Congress would allow DOE to diversify its investments to support the demonstration and commercial application of other next-generation geothermal technologies.
Alternatively, there are a number of OCED programs with funding from BIL that have not yet been fully spent (Table 1). Congress could reallocate some of that funding towards a new program supporting next-generation geothermal projects within OCED. Though not ideal, this may be a more palatable near-term solution for the current Congress than appropriating new funding.
A third option is that DOE could use some of the funding for the Energy Improvements in Rural and Remote Areas program, of which $635 million remains unallocated, to support geothermal projects. Though the program’s authorization does not explicitly mention geothermal energy, geothermal is a good candidate given the abundance of geothermal production potential in rural and remote areas in the West. Moreover, as a clean firm power source, geothermal has a comparative advantage over other renewable energy sources in improving energy reliability.
Other Transactions Authority
BIL and IRA gave DOE an expanded mandate to support innovative technologies from early-stage research through commercialization. To do so, DOE will need to be just as innovative in its use of available authorities and resources. Tackling the challenge of scaling technologies from pilot to commercialization will require DOE to look beyond traditional grant, loan, and procurement mechanisms. Previously, we identified the DOE’s Other Transaction Authority (OTA) as an underleveraged tool for accelerating clean energy technologies.
OTA is defined in legislation as the authority to enter into any transaction that is not a government grant or contract. This negative definition provides DOE with significant freedom to design and implement flexible financial agreements that can be tailored to the unique challenges that different technologies face. OT agreements allow DOE to be more creative, and potentially more cost-effective, in how it supports the commercialization of new technologies, such as facilitating the development of new markets, mitigating risks and market failures, and providing innovative new types of demand-side “pull” funding and supply-side “push” funding. The DOE’s new Guide to Other Transactions provides official guidance on how DOE personnel can use the flexibilities provided by OTA.
With additional funding from Congress, the DOE could use OT agreements to address the unique barriers that geothermal projects face in ways that may not be possible through other mechanisms. Below are four proposals for how the DOE can do so. We chose to focus on supporting next-generation geothermal projects, since the young industry currently requires more governmental support to grow, but we included ideas that would benefit conventional hydrothermal projects as well.
Program Proposals
Geothermal Development on Agency Land
This year, the Defense Innovation Unit issued its first funding opportunity specifically for geothermal energy. The four winning projects will aim to develop innovative geothermal power projects on Department of Defense (DoD) bases for both direct consumption by the base and sale to the local grid. OT agreements were used for this program to develop mutually beneficial custom terms. For project developers, DoD provided funding for surveying, design, and proposal development in addition to land for the actual project development. The agreement terms also gave companies permission to use the technology and information gained from the project for other commercial use. For DoD, these projects are an opportunity to improve the energy resilience and independence of its bases while also reducing emissions. By implementing the prototype agreement using OTA, DoD will have the option to enter into a follow-on OT agreement with project developers without further competition, expediting future processes.
DOE could implement a similar program for its 2.4 million acres of land. In particular, the DOE’s land in Idaho and other western states has favorable geothermal resources, which the DOE has considered leasing. By providing some funding for surveying and proposal development like the DoD, the DOE can increase the odds of successful project development, compared to simply leasing the land without funding support. The DOE could also offer technical support to projects from its national labs.
With such a program, much of the value that the DOE would provide is the land itself, of which the agency currently has more than it has funding for geothermal energy. The funding needed for surveying and proposal development is much less than would be needed to support the actual construction of demonstration projects, so GTO could feasibly request funding for such a program through the annual appropriations process. Depending on the program outcomes and the resulting proposals, the DOE could then go back to Congress to request follow-on funding to support actual project construction.
Drilling Cost-Share Program
To help defray the high cost of drilling, the DOE could implement a milestone-based cost-share program. There is precedent for government cost-share programs for geothermal: in 1973, before the DOE was even established, Congress passed the Geothermal Loan Guarantee Program to provide “investment security to the public and private sectors to exploit geothermal resources” in the early days of the industry. Later, the DOE funded the Cascades I and II Cost Shared Programs. Then, from 2000 to 2007, the DOE ran the Geothermal Resource Exploration and Definitions (GRED) I, II, and III Cost-Share Programs. This year, the DOE launched its EGS Pilot Demonstrations program.
A milestone payment structure could be favorable for supporting expensive next-generation geothermal projects because the government takes on less risk compared to providing all of the funding upfront. Initial funding could be provided for drilling the first few wells. Successful and on-time completion of drilling could then unlock additional funding to drill more wells, and so on. In the past, both the DoD and the National Aeronautics and Space Administration (NASA) have structured their OT agreements using milestone payments, most famously between NASA and SpaceX for the development of the Falcon 9 space launch vehicle. The NASA and SpaceX agreement included not just technical but also financial milestones for the investment of additional private capital into the project. The DOE could do the same and include both technical and financial milestones in a geothermal cost-share program.
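As a purely illustrative sketch of how a milestone-based agreement might be structured, the snippet below encodes a hypothetical schedule mixing technical and financial milestones, with payments unlocking sequentially; the milestone descriptions and dollar amounts are assumptions, not terms of any existing OT agreement.

```python
# Hypothetical milestone-based cost-share schedule for a next-generation
# geothermal project. Payments unlock only when a milestone is verified.
# All milestones and amounts are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Milestone:
    description: str
    payment: float       # federal cost-share released on completion ($)
    completed: bool = False

schedule = [
    Milestone("Complete first two wells on schedule (technical)", 5_000_000),
    Milestone("Raise matching private capital (financial)", 3_000_000),
    Milestone("Complete remaining production wells (technical)", 7_000_000),
    Milestone("Demonstrate 90 days of continuous generation (technical)", 5_000_000),
]

def disbursed(schedule: list[Milestone]) -> float:
    """Payments unlock sequentially; funding stops at the first incomplete milestone."""
    total = 0.0
    for m in schedule:
        if not m.completed:
            break
        total += m.payment
    return total

schedule[0].completed = True
print(f"Disbursed so far: ${disbursed(schedule):,.0f}")  # $5,000,000
```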
Risk Insurance Program
Longer term, the DOE could implement a risk insurance program for conventional hydrothermal and next-generation geothermal projects. Insuring against underproduction could make it easier and cheaper for projects to be financed, since the potential downside for investors would be capped. The DOE could initially offer insurance just for conventional hydrothermal, since there is already extensive data on past commercial projects that can inform how the insurance is designed. In order to design insurance for next-generation technologies, more commercial-scale projects will first need to be built to collect the data necessary to assess the underproduction risk of different approaches.
France has administered a successful Geothermal Public Risk Insurance Fund for conventional hydrothermal projects since 1982. The insurance originally consisted of two parts: a Short-Term Fund to cover the risk of underproduction and a Long-Term Fund to cover uncertain long-term behavior over the operating lifetime of the geothermal plant. The Short-Term Fund asked project owners to pay a premium of 1.5% of the maximum guaranteed amount. In return, the Short-Term Fund provided a 20% subsidy for the cost of drilling the first well and, in the case of reduced output or a dry hole, compensation between 20% and 90% of the maximum guaranteed amount (inclusive of the subsidy already paid). The exact compensation was determined based on a formula for the amount necessary to restore the project’s profitability with its reduced output. The Short-Term Fund relied on a high success rate, especially in the Paris Basin where there are known to be good hydrothermal resources, to fund the costs of failures. Geothermal developers that chose to get coverage from the Short-Term Fund were required to also get coverage from the Long-Term Fund, which was designed to hedge against the possibility of unexpected geological or geothermal changes within the wells, such as if their output declined faster than expected or severe corrosion or scaling occurred, over the geothermal plant’s operating lifetime. The Long-Term Fund ended in 2015, but a new iteration of the Short-Term Fund was approved in 2023.
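The sketch below encodes only the parameters described above (the 1.5% premium, the 20% first-well subsidy, and the 20–90% compensation bounds); the actual French compensation formula depends on well characteristics not reproduced here, so the profitability-restoration amount is treated as an input, and the project figures are hypothetical.

```python
# Illustrative sketch of the French Short-Term Fund parameters described above.
# The true compensation formula depends on well characteristics; here the amount
# needed to restore profitability is supplied as an input (assumption).

def premium(max_guaranteed: float) -> float:
    return 0.015 * max_guaranteed                      # 1.5% premium

def drilling_subsidy(first_well_cost: float) -> float:
    return 0.20 * first_well_cost                      # 20% subsidy on first well

def compensation(restore_profitability_amount: float,
                 max_guaranteed: float,
                 subsidy_already_paid: float) -> float:
    """Total payout (including the subsidy already paid) is bounded between
    20% and 90% of the maximum guaranteed amount."""
    total = max(0.20 * max_guaranteed,
                min(restore_profitability_amount, 0.90 * max_guaranteed))
    return max(total - subsidy_already_paid, 0.0)

# Hypothetical project: $10M maximum guaranteed amount, $5M first well.
mg, first_well = 10e6, 5e6
paid_subsidy = drilling_subsidy(first_well)            # $1.0M paid upfront
print(f"Premium: ${premium(mg):,.0f}")                 # $150,000
print(f"Additional payout: ${compensation(4e6, mg, paid_subsidy):,.0f}")  # $3,000,000
```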
The Netherlands has successfully run a similar program to the Short-Term Fund since the 2000s. Private-sector attempts at setting up geothermal risk insurance packages in Europe and around the world have mostly failed, though. The premiums were often too high, costing up to 25–30% of the cost of drilling, and were established in developing markets where not enough projects were being developed to mutualize the risk.
To implement such a program at the DOE, projects seeking coverage would first submit an application consisting of the technical plan, timeline, expected costs, and expected output. The DOE would then conduct rigorous due diligence to ensure that the project’s proposal is reasonable. Once accepted, projects would pay a small premium upfront; the DOE should keep in mind the failed attempts at private-sector insurance packages and ensure that the premium is affordable. In the case that either the installed capacity is much lower than expected or the output capacity declines significantly over the course of the first year of operations, the Fund would compensate the project based on the level of underproduction and the amount necessary to restore the project’s profitability with a reduced output. The French Short-Term Fund calculated compensation based on characteristics of the hydrothermal wells; the DOE would need to develop its own formulas reflective of the costs and characteristics of different next-generation geothermal technologies once commercial data actually exists.
Before setting up a geothermal insurance fund, the DOE should investigate whether there are enough geothermal projects being developed across the country to ensure the mutualization of risk and whether there is enough commercial data to properly evaluate the risk. Another concern for next-generation geothermal is that a high failure rate could deplete the fund. To mitigate this, the DOE will need to analyze future commercial data for different next-generation technologies to assess whether each technology is mature enough for a sustainable insurance program. Last, poor state capacity could impede the feasibility of implementing such a program. The DOE will need personnel on staff who are sufficiently knowledgeable about the range of emerging technologies to properly evaluate technical plans, understand their risks, and design an appropriate insurance package.
Production Subsidy
While the green premium for next-generation geothermal has not been an issue in California, it may be slowing down project development in other states with lower electricity prices. The Inflation Reduction Act introduced a new clean energy Production Tax Credit that included geothermal energy for the first time. However, due to the higher development costs of next-generation geothermal projects compared to other renewable energy projects, that subsidy is insufficient to fully bridge the green premium. DOE could use OTA to introduce a production subsidy for next-generation geothermal energy with varied rates depending on the state that the electricity is sold to and its average baseload electricity price (e.g., the production subsidy likely would not apply to California). This would help address variations in the green premium across different states and expand the number of states in which it is financially viable to develop next-generation geothermal projects.
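One way such a state-varying subsidy could be parameterized is sketched below: the subsidy tops up a state’s average baseload price toward an assumed viability threshold and falls to zero in high-price states such as California. The threshold and state prices are placeholder assumptions, not DOE or market figures.

```python
# Illustrative sketch: a production subsidy that varies with the average
# baseload electricity price of the state the power is sold into.
# The viability threshold and state prices are hypothetical assumptions.

VIABILITY_PRICE = 75.0   # $/MWh a next-gen geothermal project might need (assumption)

avg_baseload_price = {   # hypothetical state averages, $/MWh
    "CA": 90.0,
    "NV": 55.0,
    "UT": 50.0,
}

def subsidy_rate(state: str) -> float:
    """Pay the gap between the viability threshold and the state's baseload price;
    no subsidy where market prices already exceed the threshold (e.g., California)."""
    gap = VIABILITY_PRICE - avg_baseload_price[state]
    return max(gap, 0.0)

for state in avg_baseload_price:
    print(state, f"${subsidy_rate(state):.0f}/MWh")
# CA $0/MWh, NV $20/MWh, UT $25/MWh
```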
Conclusion
The United States is well-positioned to lead the next-generation geothermal industry, with its abundance of geothermal resources and opportunities to leverage the knowledge and workforce of the domestic oil and gas industry. The responsibility is on Congress to ensure that DOE has the necessary funding to support the full range of innovative technologies being pursued by this young industry. With more funding, DOE can take advantage of the flexibility offered by OTA to create agreements tailored to the unique challenges that the geothermal industry faces as it begins to scale. Successful commercialization would pave the way to unlocking access to 24/7 clean energy almost anywhere in the country and help future-proof the transition to a fully decarbonized power grid.
Bio x AI: Policy Recommendations for a New Frontier
Artificial intelligence (AI) is likely to yield tremendous advances in our basic understanding of biological systems, as well as significant benefits for health, agriculture, and the broader bioeconomy. However, AI tools, if misused or developed irresponsibly, can also pose risks to biosecurity. The landscape of biosecurity risks related to AI is complex and rapidly changing, and understanding the range of issues requires diverse perspectives and expertise. To better understand and address these challenges, FAS initiated the Bio x AI Policy Development Sprint to solicit creative recommendations from subject matter experts in the life sciences, biosecurity, and governance of emerging technologies. Through a competitive selection process, FAS identified six promising ideas and, over the course of seven weeks, worked closely with the authors to develop them into the recommendations included here. These recommendations cover a diverse range of topics to match the diversity of challenges that AI poses in the life sciences. We believe they will help inform policy development on these topics, including the work of the National Security Commission on Emerging Biotechnology.
AI tool developers and others have put significant effort into establishing frameworks to evaluate and reduce risks, including biological risks, that might arise from “foundation” models (i.e., large models designed to be used for many different purposes). These include voluntary commitments from major industry stakeholders and several efforts to develop methods for evaluating these models. The Biden Administration’s recent Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence (AI EO) furthers this work and establishes a framework for evaluating and reducing risks related to AI.
However, the U.S. government will need creative solutions to establish oversight for biodesign tools (i.e., more specialized AI models that are trained on biological data and provide insight into biological systems). Although experts, including those who participated in this Policy Sprint, hold differing views about the magnitude of the risks these tools pose, the tools are undoubtedly an important part of the landscape of biosecurity risks that may arise from AI. Three of the submissions to this Policy Sprint address the need for oversight of these tools. Oliver Crook, a postdoctoral researcher at the University of Oxford and a machine learning expert, calls on the U.S. government to ensure responsible development of biodesign tools by instituting a framework for checklist-based, institutional oversight of these tools. Richard Moulange, AI-Biosecurity Fellow at the Centre for Long-Term Resilience, and Sophie Rose, Senior Biosecurity Policy Advisor at the Centre for Long-Term Resilience, expand on the Executive Order on AI with recommendations for establishing standards for evaluating the risks of these tools. In his submission, Samuel Curtis, an AI Governance Associate at The Future Society, takes a more open-science approach, recommending the expansion of internationally accessible, cloud-based computational infrastructure to promote critical advances in biodesign tools while establishing norms for responsible development.
Two of the submissions to this Policy Sprint work to improve biosecurity at the interface where digital designs might become biological reality. Shrestha Rath, a scientist and biosecurity researcher, focuses on biosecurity screening of synthetic DNA, which the Executive Order on AI highlights as a key safeguard, and offers recommendations for improving screening methods to better prepare for designs produced using AI. Tessa Alexanian, a biosecurity and bioweapons expert, calls for the U.S. government to issue guidance on biosecurity practices for automated laboratories, sometimes called “cloud labs,” that can generate organisms and other biological agents.
This Policy Sprint highlights the diversity of perspectives and expertise that will be needed to fully explore the intersections of AI with the life sciences, and the wide range of approaches that will be required to address their biosecurity risks. Each of these recommendations represents an opportunity for the U.S. government to reduce risks related to AI, solidify the U.S. as a global leader in AI governance, and ensure a safer and more secure future.
Recommendations
- Develop a Screening Framework Guidance for AI-Enabled Automated Labs by Tessa Alexanian
- An Evidence-Based Approach to Identifying and Mitigating Biological Risks From AI-Enabled Biological Tools by Richard Moulange & Sophie Rose
- A Path to Self-governance of AI-Enabled Biology by Oliver Crook
- A Global Compute Cloud to Advance Safe Science and Innovation by Samuel Curtis
- Establish Collaboration Between Developers of Gene Synthesis Screening Tools and AI Tools Trained on Biological Data by Shrestha Rath
- Responsible and Secure AI in Production Agriculture by Jennifer Clarke
Develop a Screening Framework Guidance for AI-Enabled Automated Labs
Tessa Alexanian
Protecting against the risk that AI is used to engineer dangerous biological materials is a key priority in the Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence (AI EO). AI-engineered biological materials only become dangerous after digital designs are converted into physical biological agents, and biosecurity organizations have recommended safeguarding this digital-to-physical transition. In Section 4.4(b), the AI EO targets this transition by calling for standards and incentives that ensure appropriate screening of synthetic nucleic acids. This should be complemented by screening at another digital-to-physical interface: AI-enabled automated labs, such as cloud labs and self-driving labs.1
Laboratory biorisk management does not need to be reinvented for AI-enabled labs; existing U.S. biosafety practices and Dual Use Research of Concern (DURC) oversight can be adapted and applied, as can emerging best practices for AI safety. However, the U.S. government should develop guidance that addresses two unique aspects of AI-enabled automated labs:
- Remote access to laboratory equipment may allow equipment to be misused by actors who would find it difficult to purchase or program it themselves.
- Unsupervised engineering of biological materials could produce dangerous agents without appropriate safeguards (e.g., if a viral vector regains transmissibility during autonomous experiments).
In short, guidance should ensure that providers of labs are aware of who is using the lab (customer screening), and what it is being used for (experiment screening).
It’s unclear precisely if and when automated labs will become broadly accessible to biologists, though a 2021 WHO horizon scan described them as posing dual-use concerns within five years. Because these concerns are dual-use, policymakers must also consider the benefits that remotely accessible, high-throughput, AI-driven labs offer for scientific discovery and biomedical innovation. The Australia Group discussed potential policy responses to cloud labs in 2019, including customer screening, experiment screening, and cybersecurity, though no guidance has been released. This is the right moment to develop screening guidance because automated labs are not yet widely used but are attracting increasing investment and attention.
Recommendations
The evolution of U.S. policy on nucleic acid synthesis screening shows how the government can proactively identify best practices, issue voluntary guidance, allow stakeholders to test the guidance, and eventually require that federally-funded researchers procure from providers that follow a framework derived from the guidance.
Recommendation 1. Convene stakeholders to identify screening best practices
Major cloud lab companies already implement some screening and monitoring, and developers of self-driving labs recognize risks associated with them, but security practices are not standardized. The government should bring together industry and academic stakeholders to assess which capabilities of AI-enabled automated labs pose the most risk and share best practices for appropriate management of these risks.
As a starting point, aspects of the Administration for Strategic Preparedness and Response’s (ASPR) Screening Framework Guidance for synthetic nucleic acids can be adapted for AI-enabled automated labs. Labs that offer remote access could follow a similar process for customer screening, including verifying identity for all customers and verifying legitimacy for work that poses elevated dual-use concerns. If an AI system operating an autonomous or self-driving lab places a synthesis order for a sequence of concern, this could trigger a layer of human-in-the-loop approval.
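A minimal sketch of that human-in-the-loop gate is shown below; the sequence-of-concern check is a placeholder standing in for whatever screening pipeline a provider actually uses, and the order-handling logic is an illustrative assumption rather than a prescribed workflow.

```python
# Illustrative sketch of a human-in-the-loop gate for synthesis orders placed by
# an autonomous or self-driving lab system. The screening function is a
# placeholder; real providers would use their own sequence-of-concern pipeline.

def sequence_of_concern(sequence: str) -> bool:
    """Placeholder for the provider's sequence screening (assumption)."""
    flagged_motifs = ["EXAMPLE_MOTIF"]      # stand-in, not a real screen
    return any(motif in sequence for motif in flagged_motifs)

def handle_synthesis_order(sequence: str, ordered_by_ai_system: bool) -> str:
    if sequence_of_concern(sequence):
        if ordered_by_ai_system:
            # Orders of concern from an autonomous system are never auto-approved.
            return "HOLD: route to human biosafety officer for review"
        return "HOLD: verify customer legitimacy before fulfilling"
    return "APPROVE: proceed with synthesis"

print(handle_synthesis_order("ATGC" * 10, ordered_by_ai_system=True))
```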
Best practices will require cross-domain collaboration between experts in machine learning, laboratory automation, autonomous science, biosafety, and biosecurity. Consortia such as the International Gene Synthesis Consortium and the Global Biofoundry Alliance already have U.S. cloud labs among their members and may be a useful starting point for stakeholder identification.
Recommendation 2. Develop guidance based on these best practices
The Director of the Office of Science and Technology Policy (OSTP) should lead an interagency policy development process to create screening guidance for AI-enabled automated labs. The guidance will build upon stakeholder consultations conducted under Recommendation 1, as well as the recent ASPR-led update to the Screening Framework Guidance, ongoing OSTP-led consultations on DURC oversight, and OSTP-led development of a nucleic acid synthesis screening framework under Section 4.4(b) of the AI EO.
The guidance should describe processes for customer screening and experiment screening. It should address biosafety and biosecurity risks associated with unsupervised engineering of biological materials, including recommended practices for:
- Dual-use review for automated protocols. Automated protocols typically undergo human review because operators of automated labs don’t want to run experiments that fail. Guidance should outline when protocols should undergo additional review for dual-use; the categories of experiments in the DURC policy provide a starting point.
- Identifying biological agents in automated labs. When agents are received from customers, their DNA should be sequenced to ensure they have been labeled correctly. Agents engineered through unsupervised experiments should also be screened after some number of closed-loop experimental cycles.
Recommendation 3. Invest in predictive biology for risk mitigation
The Department of Homeland Security (DHS) and Department of Defense (DOD), building off the evaluation they will conduct under 4.4(a)(i) of the AI EO, should fund programs to develop predictive models to improve biorisk management in AI-enabled automated labs.
It is presently difficult to predict the behavior of biological systems, and there is little focus specifically on predictive biology for risk mitigation. AI could perform real-time risk evaluations and anomaly detection in self-driving labs; for example, autonomous science researchers have highlighted the need to develop models that can recognize novel compounds with potentially harmful properties. The government can actively contribute to innovation in this area; the IARPA Fun GCAT program, which developed methods to assess whether DNA sequences pose a threat, is an example of relevant government-funded AI capability development.
An Evidence-Based Approach to Identifying and Mitigating Biological Risks From AI-Enabled Biological Tools
Richard Moulange & Sophie Rose
Both AI-enabled biological tools and large language models (LLMs) have advanced rapidly in a short time. While these tools have immense potential to drive innovation, they could also threaten the United States’ national security.
AI-enabled biological tools refer to AI tools trained on biological data using machine learning techniques, such as deep neural networks. They can already design novel proteins, viral vectors, and other biological agents, and may in the future be able to fully automate parts of the biomedical research and development process.
Sophisticated state and non-state actors could potentially use AI-enabled tools to more easily develop biological weapons (BW) or design them to evade existing countermeasures. As these tools become more accessible and easier to use, a broader pool of actors will be able to misuse them.
This threat was recognized by the recent Executive Order on Safe AI, which calls for evaluation of all AI models (not just LLMs) for capabilities enabling chemical, biological, radiological and nuclear (CBRN) threats, and recommendations for how to mitigate identified risks.
Developing novel AI-enabled biological tool evaluation systems within 270 days, as directed by the Executive Order §4.1(b), will be incredibly challenging, because:
- There appears to have been little progress on developing benchmarks or evaluations for AI-enabled biological tools in academia or industry, and government capacity (in the U.S. and the UK) has so far focused on model evaluations for LLMs, not AI-enabled biological tools.
- Capabilities are entirely dual-use: for example, tools that can predict which viral mutations improve vaccine targeting can very likely identify mutations that increase vaccine evasion.
To achieve this, it will be important to identify and prioritize those AI-enabled biological tools that pose the most urgent risks, and balance these against the potential benefits. However, government agencies and tool developers currently seem to struggle to:
- Specify which AI–bio capabilities are the most concerning;
- Determine the scope of AI–enabled tools that pose significant biosecurity risks; and
- Anticipate how these risks might evolve as more tools are developed and integrated.
Some frontier AI labs have assessed the biological risks associated with LLMs, but there is no public evidence of AI-enabled biological tool evaluation or red-teaming, nor are there currently standards for developing—or requirements to implement—them. The White House Executive Order will build upon industry evaluation efforts for frontier models, addressing the risk posed by LLMs, but analogous efforts are needed for AI-enabled biological tools.
Given the lack of research on AI-enabled biological tool evaluation, the U.S. Government must urgently stand up a specific program to address this gap and meet the Executive Order directives. Without evaluation capabilities, the United States will be unable to scope regulations around the deployment of these tools, and will be vulnerable to strategic surprise. Doing so now is essential to capitalize on the momentum generated by the Executive Order, and comprehensively address the relevant directives within 270 days.
Recommendations
The U.S. Government should urgently acquire the ability to evaluate biological capabilities of AI-enabled biological tools via a specific joint program at the Departments of Energy (DOE) and Homeland Security (DHS), in collaboration with other relevant agencies.
Strengthening the U.S. Government’s ability to evaluate models prior to their deployment is analogous to responsible drug or medical device development: we must ensure novel products do not cause significant harm, before making them available for widespread public use.
The objectives of this program would be to:
- Develop state-of-the-art evaluations for dangerous biological capabilities
- Establish Department of Energy (DOE) sandbox for testing evaluations on a variety of AI-enabled biological tools
- Produce standards for the performance, structure, and securitization of capability evaluations
- Use evaluations of the maturity and capabilities of AI-enabled biological tools to inform U.S. Intelligence Community assessments of potential adversaries’ current bio-weapon capabilities
Implementation
- Standing up and sustaining DOE and DHS’s ‘Bio Capability Evaluations’ program will require an initial investment of $2 million and $2 million per year through 2030. Funding should draw on existing National Intelligence Program appropriations.
- Supporting DOE to establish a sandbox for conducting ongoing evaluations of AI-enabled biological tools will require investment of $10 million annually. This could be appropriated to DOE under the National Defense Authorization Act (Title II: Research, Development, Test and Evaluation), which establishes funding for AI defense programs.
Lead agencies and organizations
- U.S. Department of Energy (DOE) can draw on expertise from National Labs, which often evaluate—and develop risk mitigation measures for—technologies with CBRN implications.
- U.S. Department of Homeland Security (DHS) can inform threat assessments and inform biological risk mitigation strategy and policy.
- National Institute of Standards and Technology (NIST) can develop the standards for the performance, structure, and securitization of dangerous capability evaluations.
- U.S. Department of Health and Human Services (HHS) can leverage its AI Community of Practice (CoP) as an avenue for communicating with biological tool developers and researchers. The National Institutes of Health (NIH) funds relevant research and will therefore need to be involved in evaluations.
They should coordinate with other relevant agencies, including but not limited to the Department of Defense, and the National Counterproliferation and Biosecurity Center.
The benefits of implementing this program include:
Leveraging public-private expertise. Public-private partnerships (involving both academia and industry) will produce comprehensive evaluations that incorporate technical nuances and national security considerations. This allows the U.S. Government to retain access to diverse expertise while safeguarding the sensitive contents and outputs of dangerous capability evaluations, which is harder to guarantee with third-party evaluators.
Enabling evidence-based regulatory decision-making. Evaluating AI tools allows the U.S. Government to identify the models and capabilities that pose the greatest biosecurity risks, enabling effective and appropriately-scoped regulations. Avoiding blanket regulations results in a better balance of the considerations of innovation and economic growth with those of risk mitigation and security.
Broad scope of evaluation application. AI-enabled biological tools vary widely in their application and current state of maturity. Consequently, what constitutes a concerning, or dangerous, capability may vary widely across tools, necessitating the development of tailored evaluations.
A Path to Self-governance of AI-Enabled Biology
Oliver Crook
Artificial intelligence (AI) and machine learning (ML) are being increasingly employed for the design of proteins with specific functions. By adopting these tools, researchers have been able to achieve high success rates designing and generating proteins with certain properties. This will accelerate the design of new medical therapies such as antibodies and vaccines, and biotechnologies such as nanopores. However, AI-enabled biology could also be used for malicious – rather than benevolent – purposes. Despite this potential for misuse, there is little to no oversight over which tools can be developed, the data they can be trained on, and how developed tools can be deployed. While more robust guardrails are needed, any proposed regulation must also be balanced, so that it encourages responsible innovation.
AI-enabled biology is still a specialized methodology that requires significant technical expertise, access to powerful computational resources, and sufficient quantities of data. As the performance of these models increases, their potential for generating significantly harmful agents grows as well. With AI-enabled biology becoming more accessible, the value of guardrails early on in the development of this technology is paramount before widespread technology proliferation makes it challenging – or impossible – to govern. Furthermore, smart policies implemented now can allow us to better monitor the pace of development, and guide reasonable and measured policy in the future.
Here, we propose that fostering self-governance and self-reporting is a scalable approach to this policy challenge. During the research, development and deployment (RDD) phases, practitioners report on a pre-decided checklist and make an ethics declaration. While advancing knowledge is an academic imperative, funders, editors, and institutions need to be fully aware of the risks of some research and have opportunities to adjust the RDD plan, as needed, to ensure that AI models are developed responsibly. Whilst similar policies have already been introduced by some machine learning venues (1, 2, 3), the proposal here seeks to strengthen, formalize and broaden the scope of those proposals. Ultimately, the checklist and ethics declarations seek confirmation from multiple parties during each of the RDD phases that the research is of fundamental public good. We recommend that the National Institutes of Health (NIH) leads on this policy challenge and builds upon decades of experience on related issues.
The recent executive order framework for safe AI provides an opportunity to build upon its initial reporting recommendations with greater specificity on AI-enabled biology. The proposal fits squarely within the executive order’s aim, under Section 4.4, of reducing the misuse of AI to assist in the development and design of biological weapons.
Recommendations
We propose the following recommendations:
Recommendation 1. With leadership from the NIH Office of Science Policy, life sciences funding agencies should coordinate the development of a checklist that evaluates the risks and benefits of a model, in consultation with AI-enabled biology model developers, non-government funders, publishers, and nonprofit organizations.
The checklist should take the form of a list of pre-specified questions and guided free-form text. The questions should gather basic information about the models employed: their size, their compute usage, and the data they were trained on. This will allow them to be characterized in comparison with existing models. The intended use of the model should be stated, along with any dual-use behavior of the model that has already been identified. The document should also indicate whether any strategies have been employed to mitigate the harmful capabilities that the model might demonstrate.
At each stage of the RDD, the predefined checklist for that stage is completed and submitted to the institution.
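To illustrate the kind of structured information such a checklist might capture, the sketch below encodes the fields described above (model size, compute usage, training data, intended use, identified dual-use behaviors, and mitigations) as a single per-phase entry; the field names and example values are hypothetical assumptions, not a prescribed format.

```python
# Illustrative sketch of a per-phase checklist entry for an AI-enabled biology
# model. Field names and example values are hypothetical assumptions.

import json

checklist_entry = {
    "rdd_phase": "development",                 # research / development / deployment
    "model_size_parameters": 650_000_000,
    "training_compute_gpu_hours": 20_000,
    "training_data": "public protein structure and sequence databases",
    "intended_use": "design of antibody binders for a stated therapeutic target",
    "identified_dual_use_behaviors": [
        "can propose protein variants with altered function",
    ],
    "mitigations": [
        "training data excludes sequences of concern",
        "structured access to model weights",
    ],
    "ethics_declaration_signed": True,
}

# Submitted to the institutional committee at the end of the phase.
print(json.dumps(checklist_entry, indent=2))
```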
Recommendation 2. Each institute employing AI-enabled biology across RDD should elect a small, internally led, cross-disciplinary committee to examine and evaluate the submitted checklists at each phase. To reduce workload, only models that fall under the executive order specifications or dual use research of concern (DURC) should be considered. The committee makes recommendations based on the value of the work and posts its meeting proceedings publicly (as Institutional Biosafety Committees do), except for sensitive intellectual property. If the benefit of the work cannot be evaluated or the outcomes are largely unpredictable, the committee should work with the model developer to adjust the RDD plan as needed. The checklist and institutional signature are then made available to NIH and funding agencies and, upon completion of the project, such as at publication, the checklists are made publicly available.
By following these recommendations, high-risk research will be caught at an institutional level and internal recommendations can facilitate timely mitigation of harms. Public release of committee deliberations and ethics checklists will enable third parties to scrutinize model development and raise concerns. This approach ensures a hierarchy of oversight that allows individuals, institutes, funders and governments to identify and address risks before AI models are developed rather than after the work has been completed.
We recommend that $5 million be provided to the NIH Office of Science Policy to implement this policy. This money would cover hiring a ‘Director of Ethics of AI-enabled Biology’ to oversee this research and several full-time researchers/administrators ($1.5 million). These employees should conduct outreach to the institutes to ensure that the policy is understood, to answer any questions, and to facilitate community efforts to develop and update the checklist ($1 million). Additional grants should be made available to allow researchers and non-profit organizations to audit the checklists and committees, evaluate the checklists, and research the socio-technological implications of the checklists ($1.5 million). The rapid pace of development of AI means that the checklists will need to be reevaluated on a yearly basis, with $1 million of funding available to evaluate the impact of these grants. Funding should grow in line with the pace of technological development. Specific subareas of AI-enabled biology may need specific checklists depending on their risk profile.
This recommendation is scalable; once the checklists have been made, the majority of the work is placed in the hands of practitioners rather than government. In addition, these checklists provide valuable information to inform future governance agendas. For example, limiting computational resources to curtail dangerous applications (compute governance) cannot proceed without a detailed understanding of how much compute is required to achieve certain goals. Furthermore, it places responsibility on practitioners, requiring them to engage with the risks that could arise from their work, with institutes having the ability to make recommendations on how to reduce the risks from models. This approach draws on similar frameworks that support self-governance, such as oversight by Institutional Biosafety Committees (IBCs). This self-governance proposal is well complemented by alternative policies around open access of AI-enabled biology tools, as well as policies strengthening DNA synthesis screening protocols to catch misuse at different places along a broadly-defined value chain.
A Global Compute Cloud to Advance Safe Science and Innovation
Samuel Curtis
Advancements in deep learning have ushered in significant progress in the predictive accuracy and design capabilities of biological design tools (BDTs), opening new frontiers in science and medicine through the design of novel functional molecules. However, these same technologies may be misused to create dangerous biological materials. Mitigating the risks of misuse of BDTs is complicated by the need to maintain openness and accessibility among globally-distributed research and development communities. One approach toward balancing both risks of misuse and the accessibility requirements of development communities would be to establish a federally-funded and globally-accessible compute cloud through which developers could provide secure access to their BDTs.
The term “biological design tools” (or “BDTs”) is a neologism referring to “systems trained on biological data that can help design new proteins or other biological agents.” Computational biological design is, in essence, a data-driven optimization problem. Consequently, over the past decade, breakthroughs in deep learning have propelled progress in computational biology. Today, many of the most advanced BDTs incorporate deep learning techniques and are used and developed by networks of academic researchers distributed across the globe. For example, the Rosetta Software Suite, one of the most popular BDT software packages, is used and developed by Rosetta Commons—an academic consortium of over 100 principal investigators spanning five continents.
Contributions of BDTs to science and medicine are difficult to overstate. BDTs are now used to identify new drug targets, design new therapeutics, and construct faster and less expensive drug synthesis techniques. There are already several AI-designed molecules in early-stage clinical trials.
Unfortunately, these same BDTs can be used for harm. They may be used to create pathogens that are more transmissible or virulent than known agents, target specific sub-populations, or evade existing DNA synthesis screening mechanisms. Moreover, developments in other classes of AI systems portend reduced barriers to BDT misuse. One group at RAND Corporation found that language models could provide guidance that could assist in planning and executing a biological attack, and another group from MIT demonstrated how language models could be used to elicit instructions for synthesizing a potentially pandemic pathogen. Similarly, language models could accelerate the acquisition or interpretation of information required to misuse BDTs. Technologies on the horizon, such as multimodal “action transformers,” could help individuals navigate BDT software, further lowering barriers to misuse.
Research points to several measures BDT developers could employ to reduce risks of misuse, such as securing machine learning model weights (the numerical values representing the learned patterns and information that the model has acquired during training), implementing structured access controls, and adopting Know Your Customer (KYC) processes. However, care would have to be taken not to unduly limit access to these tools, which could, in aggregate, impede scientific and medical advancement. For any given tool, access limitations risk diminishing its competitiveness (its available features and performance relative to other tools). These tradeoffs extend to their developers’ interests, whereby stifling the development of tools may jeopardize research, funding, and even career stability. The difficulties of striking a balance in managing risk are compounded by the decentralized, globally-distributed nature of BDT development communities. To suit their needs, risk-mitigation measures should involve minimal, if any, geographic or political restrictions placed on access while simultaneously expanding the ability to monitor for and respond to indicators of risk or patterns of misuse.
One approach that would balance the simultaneous needs for accessibility and security would be for the federal government to establish a global compute cloud for academic research, bearing the costs of running servers and maintaining the security of the cloud infrastructure in the shared interests of advancing public safety and medicine. A compute cloud would enable developers to provide access to their tools through computing infrastructure managed—and held to specific security standards—by U.S. public servants. Such infrastructure could even expand access for researchers, including underserved communities, through fast-tracked grants in the form of computational resources.
However, if computing infrastructure is not designed to reflect the needs of the development community—namely, its global research community—it is unlikely to be adopted in practice. Thus, to fully realize the potential of a compute cloud among BDT development communities, access to the infrastructure should extend beyond U.S. borders. At the same time, the efforts should ensure the cloud has requisite monitoring capabilities to identify risk indicators or patterns of misuse and impose access restrictions flexibly. By balancing oversight with accessibility, a thoughtfully-designed compute cloud could enable transparency and collaboration while mitigating the risks of these emerging technologies.
Recommendations
The U.S. government should establish a federally-funded, globally-accessible compute cloud through which developers could securely provide access to BDTs. In fact, the Biden Administration’s October 2023 “Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence” (the “AI EO”) lays the groundwork by establishing a pilot program of a National AI Research Resource (NAIRR)—a shared research infrastructure providing AI researchers and students with expanded access to computational resources, high-quality data, educational tools, and user support. Moving forward, to increase the pilot program’s potential for adoption by BDT developers and users, relevant federal departments and agencies should take concerted action within the timelines prescribed by the AI EO to address the practical requirements of BDT development communities: the simultaneous need to expand access outside U.S. borders while bolstering the capacity to monitor for misuse.
It is important to note that a federally-funded compute cloud has been years in the making. The National AI Initiative Act of 2020 directed the National Science Foundation (NSF), in consultation with the Office of Science and Technology Policy (OSTP), to establish a task force to create a roadmap for the NAIRR. In January 2023, the NAIRR Task Force released its final report, “Strengthening and Democratizing the U.S. Artificial Intelligence Innovation Ecosystem,” which presented a detailed implementation plan for establishing the NAIRR. The Biden Administration’s AI EO then directed the Director of NSF, in coordination with the heads of agencies deemed appropriate by the Director, to launch a pilot program “consistent with past recommendations of the NAIRR Task Force.”
However, the Task Force’s past recommendations are likely to fall short of the needs of BDT development communities (not to mention other AI development communities). In its report, the Task Force described NAIRR’s primary user groups as “U.S.-based AI researchers and students at U.S. academic institutions, non-profit organizations, Federal agencies or FFRDCs, or startups and small businesses awarded [Small Business Innovation Research] or [Small Business Technology Transfer] funding,” and its resource allocation process is oriented toward this user base. Separately, Stanford University’s Institute for Human-centered AI (HAI) and the National Security Commission on Artificial Intelligence (NSCAI) have proposed institutions, building upon or complementing NAIRR, that would support international research consortiums (a Multilateral AI Research Institute and an International Digital Democracy Initiative, respectively), but the NAIRR Task Force’s report—upon which the AI EO’s pilot program is based—does not substantively address this user base.
In launching the NAIRR pilot program under Sec. 5.2(a)(i), the NSF should put the access and security needs of international research consortiums front and center, conferring with heads of departments and agencies with relevant scope and expertise, such as the Department of State, the US Agency for International Development (USAID), the Department of Education, the National Institutes of Health, and the Department of Energy. The NAIRR Operating Entity (as defined in the Task Force’s report) should investigate how funding, resource allocation, and cybersecurity could be adapted to accommodate researchers outside of U.S. borders. In implementing the NAIRR pilot program, the NSF should incorporate BDTs into its development of guidelines, standards, and best practices for AI safety and security, per Sec. 4.1, which could serve as standards with which NAIRR users are required to comply. Furthermore, the NSF Regional Innovation Engine launched through Sec. 5.2(a)(ii) should consider focusing on international research collaborations, such as those in the realm of biological design.
Besides the NSF, which is charged with piloting NAIRR, relevant departments and agencies should take concerted action in implementing the AI EO to address issues of accessibility and security that are intertwined with international research collaborations. This includes but is not limited to:
- In accordance with Sec. 5.2(a)(i), the departments and agencies listed above should be tasked with investigating the access and security needs of international research collaborations and include these in the reports they are required to submit to the NSF. This should be done in concert with the development of guidelines, standards, and best practices for AI safety and security required by Sec. 4.1.
- In fulfilling the requirements of Sec. 5.2(c-d), the Under Secretary of Commerce for Intellectual Property and Director of the United States Patent and Trademark Office, along with the Secretary of Homeland Security, should, in the reports and guidance on matters related to intellectual property that they are required to develop, clarify ambiguities and preemptively address challenges that might arise in cross-border data use agreements.
- Under the terms of Sec. 5.2(h), the President’s Council of Advisors on Science and Technology should, in its development of “a report on the potential role of AI […] in research aimed at tackling major societal and global challenges,” focus on the nature of decentralized, international collaboration on AI systems used for biological design.
- Pursuant to Sec. 11(a-d), the Secretary of State, the Assistant to the President for National Security Affairs, the Assistant to the President for Economic Policy, and the Director of OSTP should focus on AI used for biological design as a use case for expanding engagements with international allies and partners and for establishing a robust international framework for managing the risks and harnessing the benefits of AI. Furthermore, the Secretary of Commerce should make this use case a key feature of the department’s plan for global engagement in promoting and developing AI standards.
The AI EO provides a window of opportunity for the U.S. to take steps toward mitigating the risks posed by BDT misuse. In doing so, it will be necessary for regulatory agencies to proactively seek to understand and attend to the needs of BDT development communities, which will increase the likelihood that government-supported solutions, such as the NAIRR pilot program—and potentially future fully-fledged iterations enacted via Congress—are adopted by these communities. By making progress toward reducing BDT misuse risk while promoting safe, secure access to cutting-edge tools, the U.S. could affirm its role as a vanguard of responsible innovation in 21st-century science and medicine.
Establish collaboration between developers of gene synthesis screening tools and AI tools trained on biological data
Shrestha Rath
Biological Design Tools (BDTs) are a subset of AI models trained on genetic and/or protein data and developed for use in the life sciences. These tools have recently seen major performance gains, enabling breakthroughs such as AlphaFold2’s accurate protein structure predictions and addressing a longstanding challenge in the life sciences.
While promising for legitimate research, BDTs risk misuse without oversight. Because universal screening of gene synthesis is currently lacking, potential threat agents could be digitally designed with assistance from BDTs and then physically made using gene synthesis. BDTs pose particular challenges because:
- A growing number of gene synthesis orders evade current screening capabilities. Industry experts at gene synthesis companies report that a small but concerning portion of orders for synthetic nucleic acid sequences show little or no homology with known sequences in widely-used genetic databases and so are not captured by current screening techniques (a simplified illustration of this gap follows this list). Advances in BDTs are likely to make such orders more common, exacerbating the risk of misuse of synthetic DNA. The combined use of BDTs and gene synthesis has the potential to aid both the “design” and “build” steps of malicious misuse. Strengthening screening capabilities to keep pace with advances in BDTs is therefore an attractive early intervention point for preventing this misuse.
- Potential for substantial breakthroughs in BDTs. While BDTs for applications beyond protein design face significant challenges and most are not yet mature, companies are likely to invest in generating data to improve these tools because they see significant economic value in doing so. Moreover, some AI experts speculate that training protein language models with computational resources comparable to those used for large language models could significantly improve their performance. There is thus significant uncertainty about how rapidly BDTs will advance and how those advances may affect the potential for misuse.
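To make the screening gap referenced in the first bullet concrete, the toy sketch below shows how similarity-based screening can miss a sequence that shares little overlap with known entries. It is a simplified stand-in, not a description of how commercial screening pipelines work: real screeners use alignment tools such as BLAST against curated databases, and the k-mer size and flagging threshold shown here are arbitrary placeholders.

```python
# Toy illustration of similarity-based screening and its blind spot.
# k-mer size and flagging threshold are arbitrary placeholders.

def kmers(seq: str, k: int = 12) -> set[str]:
    """All overlapping subsequences of length k."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def max_kmer_overlap(order: str, known_sequences: list[str], k: int = 12) -> float:
    """Highest fraction of the order's k-mers shared with any known sequence."""
    order_kmers = kmers(order, k)
    if not order_kmers:
        return 0.0
    return max(
        (len(order_kmers & kmers(ref, k)) / len(order_kmers) for ref in known_sequences),
        default=0.0,
    )

FLAG_THRESHOLD = 0.5  # hypothetical cutoff

def screen_order(order: str, known_sequences: list[str]) -> str:
    """Flag orders that resemble known sequences; low-homology designs pass."""
    score = max_kmer_overlap(order, known_sequences)
    return "flag for review" if score >= FLAG_THRESHOLD else "pass"
```

A BDT-assisted design that encodes a harmful function but shares few subsequences with database entries would score below the threshold and "pass" such a check, which is precisely the gap the recommendations below aim to close.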
BDT development is currently concentrated among a few actors, which makes policy implementation tractable, but over time development, and the attendant risks, will decentralize beyond today’s handful of well-resourced academic labs and private AI companies. The U.S. government should take advantage of this unique window of opportunity to implement policy guardrails while the next generation of advanced BDTs is in development.
In short, it is important that developers of BDTs work together with developers and users of gene synthesis screening tools. This will promote shared understanding of the risks around potential misuse of synthetic nucleic acids, which may be exacerbated by advances in AI.
By bringing together key stakeholders to share information and align on safety standards, the U.S. government can steer these technologies to maximize benefits and minimize widespread harms. Section 4.4(b) of the Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence (henceforth referred to as the “Executive Order”) also emphasizes mitigating risks from the “misuse of synthetic nucleic acids, which could be substantially increased by AI’s capabilities”.
Gene synthesis companies and/or organizations involved in developing gene screening mechanisms are henceforth referred to as “DNA Screener”. Academic and for-profit stakeholders developing BDTs are henceforth referred to as “BDT Developer”. Gene synthesis companies providing synthetic DNA as a service irrespective of their screening capabilities are referred to as “DNA Provider”.
Recommendations
Bringing together key stakeholders (BDT Developers, DNA Screeners, and security experts) to share information and align on safety standards will allow the U.S. government to steer these technologies toward maximizing benefits and minimizing widespread harms. Implementing the recommendations below requires allocating financial resources and coordinating interagency work.
There are near-term and long-term opportunities to improve coordination among DNA Screeners, DNA Providers (that use in-house screening mechanisms), and BDT Developers, not least to avoid potential future backlash, over-regulation, and legal liability. As part of the implementation of Section 4.4 of the Executive Order, the Department of Energy and the National Institute of Standards and Technology should:
Recommendation 1. Convene BDT Developers and DNA Screeners, along with ethics, security, and legal experts, to a) share information on AI model capabilities and their implications for DNA sequence screening; and b) facilitate discussion of shared safeguards and security standards. Technical security standards may include adversarial training to make AI models robust against purposeful misuse, refusals and blacklisting (BDTs declining to follow user requests when the requested action may be harmful), and maintaining user logs; a deliberately simplified sketch of the latter two safeguards follows.
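The sketch below wraps a hypothetical BDT serving endpoint with a blocklist-based refusal check and a user log. The blocklist terms, log fields, and refusal wording are placeholders for discussion, not proposed standards, and a production safeguard would be far richer than simple keyword matching.

```python
# Deliberately simplified sketch of "refusals and blacklisting" plus user logging
# around a hypothetical BDT endpoint. All names and policies are placeholders.
import json
import time

BLOCKLIST = {"toxin", "virulence factor"}  # placeholder terms only

def run_model(request_text: str) -> str:
    """Stand-in for the underlying BDT inference call."""
    return "[model output placeholder]"

def handle_request(user_id: str, request_text: str, log_path: str = "bdt_requests.log") -> str:
    refused = any(term in request_text.lower() for term in BLOCKLIST)
    # Maintain a user log for every request, refused or not, to support later review.
    with open(log_path, "a") as log:
        log.write(json.dumps({
            "timestamp": time.time(),
            "user": user_id,
            "request": request_text,
            "refused": refused,
        }) + "\n")
    if refused:
        return "Request refused: it appears to ask for a potentially harmful design."
    return run_model(request_text)
```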
Recommendation 2. Create an advisory group to investigate metrics that measure the performance of protein BDTs for DNA Screeners, in line with Section 4.4(a)(ii)(A) of the Executive Order. Metrics that capture BDT performance, and thus the risk posed by advanced BDTs, would give DNA Screeners helpful context when screening orders. For example, some current methods for benchmarking AI-enabled protein design focus on sequence recovery: the backbones of natural proteins with known amino-acid sequences are passed as input, and accuracy is measured by the identity between the predicted sequence and the true sequence (illustrated in the sketch below).
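The sketch below expresses the sequence-recovery metric described above as the fraction of positions at which a designed sequence reproduces the native amino acid. It is a minimal illustration, not the benchmarking code of any particular evaluation suite, and the example sequences are hypothetical.

```python
# Minimal illustration of sequence recovery: fraction of positions at which the
# predicted (designed) sequence matches the native sequence for the same backbone.

def sequence_recovery(predicted: str, native: str) -> float:
    if len(predicted) != len(native):
        raise ValueError("sequences must be pre-aligned to the same length")
    matches = sum(p == n for p, n in zip(predicted, native))
    return matches / len(native)

print(sequence_recovery("MKTAYIAK", "MKTAYLAK"))  # 0.875: 7 of 8 residues recovered
```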
Recommendation 3. Support and fund development of AI-enabled DNA screening mechanisms that will keep pace with BDTs. U.S. national laboratories should support these types of efforts because commercial incentives for such tools are lacking. IARPA’s Fun GCAT is an exemplary case in this regard.
Recommendation 4. Conduct structured red teaming of current DNA screening methods to ensure they account for functional variants of Sequences of Concern (SOCs) that may be developed with the help of BDTs and other AI tools. Such red-teaming exercises should include expert stakeholders involved in the development of screening mechanisms as well as the national security community.
Recommendation 5. Establish both policy frameworks and technical safeguards for identifying certifiable origins. Designs produced by BDTs could be required to carry a cryptographically signed certificate detailing the inputs used in the design process behind a synthetic nucleic acid order, providing contextual information that helps DNA Screeners check for harmful intent in the requests made to the model (sketched below).
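The sketch below shows one way such a certificate could work, assuming the widely used Python `cryptography` package: the BDT Developer signs a canonical record of the design inputs with an Ed25519 key, and the DNA Screener verifies the signature before reviewing the order. The certificate fields and the overall workflow are hypothetical illustrations, not an existing standard.

```python
# Hypothetical "certificate of origin" workflow using Ed25519 signatures.
# Field names and flow are illustrative only.
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey, Ed25519PublicKey

def issue_certificate(private_key: Ed25519PrivateKey, design_record: dict) -> dict:
    """BDT Developer signs a canonical JSON record of the design inputs/outputs."""
    payload = json.dumps(design_record, sort_keys=True).encode("utf-8")
    return {"record": design_record, "signature": private_key.sign(payload).hex()}

def verify_certificate(public_key: Ed25519PublicKey, certificate: dict) -> bool:
    """DNA Screener checks that the record was signed by the claimed developer."""
    payload = json.dumps(certificate["record"], sort_keys=True).encode("utf-8")
    try:
        public_key.verify(bytes.fromhex(certificate["signature"]), payload)
        return True
    except InvalidSignature:
        return False

# Example: developer signs at design time; screener verifies at order time.
developer_key = Ed25519PrivateKey.generate()
certificate = issue_certificate(developer_key, {
    "tool": "example-design-model",            # hypothetical tool identifier
    "prompt_hash": "sha256-of-user-request",   # hash, not the raw request text
    "output_hash": "sha256-of-designed-sequence",
})
assert verify_certificate(developer_key.public_key(), certificate)
```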
Recommendation 6. Fund third-party evaluations of BDTs to determine how their use might affect DNA sequence screening and provide this information to those performing screening. Having these evaluations would be helpful for new, small and growing DNA Providers and alleviate the burden on established DNA Providers as screening capabilities become more sophisticated. A similar system exists in the automobile industry where insurance providers conduct their own car safety and crash tests to inform premium-related decisions.
Proliferation of open-source tools accelerates innovation and the democratization of AI, but it is also a growing concern in the context of biological misuse. The recommendations here strengthen biosecurity screening at a key point along the pathway through which these risks could be realized. This framework could be implemented alongside other approaches to reducing the risks that arise from BDTs, including: introducing Know Your Customer (KYC) frameworks that monitor buyers and users of AI tools; requiring BDT developers to undergo training on assessing and mitigating the dual-use risks of their work; and encouraging voluntary guidelines to reduce misuse risks, for instance by conducting model evaluations prior to release and by refraining from publishing preprints or releasing model weights until such evaluations are complete. This multi-pronged approach can help ensure that AI tools are developed responsibly and that biosecurity risks are managed.
Responsible and Secure AI in Production Agriculture
Jennifer Clarke
Agriculture, food, and related industries represent over 5% of domestic GDP. The health of these industries has a direct impact on domestic food security, which in turn bears directly on national security. In other words, food security is biosecurity is national security. As the world population continues to grow and climate change challenges agricultural production, we need an efficiency and productivity revolution in agriculture: using less land and fewer natural resources to produce more food and feed. For decision-makers in agriculture, scarce labor and narrow economic margins are driving interest in automation and in using AI appropriately to increase productivity and reduce waste amid rising costs.
Congress should provide funding to support the establishment of a new office within the USDA to coordinate, enable, and oversee the use of AI in production agriculture and agricultural research.
The agriculture, food, and related industries are turning to AI technologies to enable automation and drive the adoption of precision agriculture. However, the use of AI in agriculture often depends on proprietary approaches that have not been validated through an independent, open process, and it is unclear whether AI tools aimed at the agricultural sector address critical needs as identified by the producer community. The result can be detrimental recommendations and a loss of trust across producer communities, which would impede the adoption of precision agriculture technologies necessary for domestic, sustainable food security.
The industry is promoting AI technologies to help yield healthier crops, control pests, monitor soil and growing conditions, organize data for farmers, help with workload, and improve a wide range of agriculture-related tasks in the entire food supply chain.
However, the use of networked technologies in agriculture poses risks, and AI could compound those risks if not implemented carefully. For example, the use of biased or irrelevant data in AI development can result in poor performance, which erodes producer trust in both extension services and expert systems, hindering adoption. As adoption increases, farmers are likely to rely on a small number of platforms, creating centralized points of failure where a limited attack can cause disproportionate harm. The 2021 cyberattack on JBS, the world’s largest meat processor, and a 2021 ransomware attack on NEW Cooperative, which provides feed grains for 11 million farm animals in the United States, demonstrate the potential risks to agricultural cybersystems. Without established cybersecurity standards for AI systems, systems with broad adoption across agricultural sectors will represent targets of opportunity.
As evidenced by the recent Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence and the AI Safety Summit held at Bletchley Park, considerable interest and attention are being devoted to AI governance and policy by both national and international regulatory bodies. There is a recognition that the risks of AI require more attention and investment in both technical and policy research.
This recognition dovetails with an increase in emphasis on the use of automation and AI in agriculture to enable adoption of new agricultural practices. Increased adoption in the short term is required to reduce greenhouse gas emissions and ensure sustainability of domestic food production. Unfortunately, trust in commercial and governmental entities among agricultural producers is low and has been eroded by corporate data policies. Fortunately, this erosion can be reversed by prompt action on regulation and policy that respects the role of the producer in food and national security. Now is the time to promote the adoption of best practices and responsible development to establish security as a habit among agricultural stakeholders.
Recommendations
To ensure that the future of domestic agriculture and food production leverages the benefits of AI while mitigating its risks, the U.S. government should invest in institutional cooperation; AI research and education; and the development and enforcement of best practices.
Recommendation: An Office focused on AI in Production Agriculture should be established within USDA, and Congress should appropriate $5 million per year over the next 5 years, for a total of $25 million, for this office. Cooperation among multiple institutions (public, private, nonprofit) will be needed to provide oversight of the behavior of AI in production agriculture, including the impact of non-human algorithms and data-sharing agreements (“the algorithmic economy”). This level of funding will encourage both federal and non-federal partners to engage with the Office and support its mission. The Office should establish, and take direction from, an advisory body led by USDA with inclusive representation across stakeholder organizations, including industry (e.g., AgGateway, Microsoft, John Deere), nonprofit organizations (e.g., AgDataTransparent, American Farmland Trust, Farm Bureaus, Ag Data Coalition, Council for Agricultural Science and Technology (CAST), ASABE, ISO), government (e.g., NIST, OSTP), and academia (e.g., APLU, Ag Extension). This advisory body would operate under the Federal Advisory Committee Act (FACA) to identify challenges and recommend solutions, e.g., developing regulations or other oversight specific to agricultural uses of AI, including data use agreements and third-party validation, that reduce uncertainty about risk scenarios and the effect of countermeasures. The Office and its advisory body can solicit broad input on regulation, necessary legislation, incentives and reforms, and enforcement measures through Requests for Information and Dear Colleague letters. The Office should promote the best practices described below, i.e., incentivize responsible use and adoption through equitable data governance, data access, and public-private partnerships. An example of an incentive is providing rebates to producers who purchase equipment that utilizes validated AI technology.
To support the development of best practices for the use of AI in production agriculture, the proposed Office, in partnership with NIH, NSF, DOD, and DOE, should coordinate funding for research and education on the sociotechnical context of AI in agriculture across foundational disciplines, including computer science, mathematics, statistics, psychology, and sociology. This new discipline of applied AI (built on theoretical advances in AI since the 1950s) should provide a foundation for developing best practices for responsible AI development, starting with general, accepted standards (e.g., NIST’s framework). For example, best practices may include transparency through the open-source community and independent validation processes for models and software. AI model training requires an immense amount of data, and AI models for agriculture will require many types of datasets specific to production systems (e.g., weather, soil, and management practices). There is an urgent need for standards around data access and use that balance advances in and adoption of precision agriculture with privacy and cybersecurity concerns.
In support of the work of the proposed Office, Congress should appropriate $20 million per year to USDA to support the development of programs at land-grant universities that provide multidisciplinary training in AI and production agriculture. The national agricultural production cyberinfrastructure (CI) has become critical to food security and carbon capture in the 21st century, and a robust talent pipeline is necessary to support, develop, and implement this CI in preparation for the growth of automation and AI. There is also a critical need for individuals trained in both AI and production agriculture who can lead user-centered design and digital services on behalf of producers. Training must couple foundational knowledge of statistics, computer science, engineering, and the agricultural sciences with experiential learning that provides trainees with opportunities to apply their knowledge to current CI challenges. These opportunities may arise from interagency cooperation at the federal, state, and local levels, in partnership with grower cooperatives, farm bureaus, and land-grant universities, to ensure that training meets pressing and future needs in agricultural systems.
Connecting Utility-Scale Renewable Energy Resources with Rural-Urban Transmission
There is a vast amount of wind and solar power ready to be harvested and moved to market across the United States, but it must be connected through long-distance transmission, which protects against the instability caused by intermittency. Strategically placed long-distance transmission also ensures that rural and urban populations benefit economically from the transition to clean energy.
The Biden-Harris Administration should facilitate the transition to a clean grid by aggressively supporting utility-scale renewable energy resources in rural areas that are connected to urban centers through modernized high-voltage direct current (HVDC) transmission. To move toward total electrification and a decarbonized grid, the Department of the Interior (DOI) and the Bureau of Land Management (BLM) must encourage renewable energy production on federal land through the BLM’s multiple-use mandate. BLM must work in tandem with the Department of Energy (DOE), Department of Transportation (DOT), and the Federal Energy Regulatory Commission (FERC) to transport increased clean power generation through newly constructed HVDC lines that can handle this capacity.
This two-pronged approach will move power from high-generation, low-demand rural areas to low-generation, high-demand (often coastal) urban hubs. As residents in the East arrive home from work and turn on their TVs, the sun is still up in the West and can provide for their energy needs. As residents in the Northwest wake up, grind coffee, and tune into the news, they can rely on power from the Midwest, where the wind is blowing.
Challenge and Opportunity
Utility-Scale Renewable Energy Development on Federal Land
After taking office, the Biden-Harris Administration rejoined the Paris Climate Agreement and committed the United States to reduce greenhouse gas (GHG) emissions by 50–52% below 2005 levels by 2030. The Inflation Reduction Act (IRA) is a positive step toward meeting these GHG emissions goals. The IRA allocated $369 billion to climate and energy security investments, which should be used to bolster development of renewables on federal lands. Together with the Infrastructure Investment and Jobs Act, this funding affords an enormous opportunity.
Building utility-scale renewable energy infrastructure such as wind or solar requires a vast amount of space. A utility-scale solar power plant could require between 5 and 10 acres of land per megawatt of capacity, with each megawatt generating enough energy to power approximately 173 homes.
The federal government owns a vast amount of land, some of which is viable for wind and solar. In total, the federal government owns 640 million acres (nearly one-third of all U.S. land), managed by the Bureau of Land Management (BLM), the Fish and Wildlife Service (FWS), the National Park Service (NPS), the Forest Service (USFS), and the Department of Defense (DOD).
Land owned by the BLM (245 million acres) and the USFS (193 million acres) falls under similar multiple-use, sustained-yield mandates. Of those combined 438 million acres, the 245 million acres under BLM jurisdiction are the primary concern of this memo. According to the Federal Land Policy and Management Act of 1976 (FLPMA), resources and uses on these federal lands must be balanced in a combination that “best meets present and future needs of the American people.” This multiple-use mandate presents an enormous opportunity for the deployment of utility-scale renewable energy resources. The BLM manages over 19 million acres of public lands with excellent solar potential across six states and 20.6 million acres of public lands with excellent wind potential. This land is ripe for utility-scale renewable energy generation and will be critical to achieving the nation’s decarbonization goals; green energy generation on these lands should be prioritized.
Together, the 15 central U.S. states account for the majority of national wind and solar technical potential. However, these states are projected to comprise only a third of the nation’s electrical demand in 2050. Population-dense and predominantly coastal cities have higher energy demand, while the Midwest and Southwest are dominated by rural communities and public land. Transmission lines are needed to transport renewable energy from these central states to the urban centers with large energy markets.
Transmission Development on a Rural-Urban Grid
The U.S. grid is split into three regions: the Western Interconnection, the Eastern Interconnection, and the ERCOT Interconnection (Texas). These three regions are only minimally connected with one another, nationally, regionally, or even across state lines, due to intense localism on the part of utilities, which are not financially incentivized to engage in regional transmission. There are three key utility ownership models in the United States: private investor-owned utilities (IOUs), public power utilities owned by states or municipalities, and nonprofit rural electric cooperatives (co-ops).
The Federal Energy Regulatory Commission is an independent agency that regulates the interstate transmission of electricity. In this capacity, it ensures that regional goals are established and met. Two types of entities established by FERC, regional transmission organizations (RTOs) and independent system operators (ISOs), help to coordinate regional transmission across utilities. RTOs are voluntary bodies of utilities that streamline and coordinate regional transmission initiatives and objectives. ISOs are independent and federally regulated entities that coordinate regional transmission to ensure nondiscriminatory access and streamline regional goals. ISOs and RTOs are similar, but RTOs generally have jurisdiction over a larger geographic area. Two-thirds of the nation’s electricity load is served in ISO/RTO regions. The remainder of the energy market is dominated by vertically integrated utilities that manage both transmission and distribution.
Establishing more connections among the three regional grids will support renewable energy development, reduce GHG emissions, save consumers money, increase resilience, and create jobs. Connecting the power grid across states and time zones is also vital to peak load control. Greater connection mitigates the inherent instability of renewables: if clouds cover the sun in the East, winds will still blow in the Midwest. If those winds die, water will still flow in the Northwest’s rivers.
The best way to make connections between regional and local grids is through high-voltage direct current electrical transmission systems. HVDC transmission allows for the direct current (DC) transfer of power over long distances, which is more energetically efficient than alternating current (AC).
There is precedent and forward momentum on developing interstate transmission, including projects like SunZia in the Southwest, TransWest Express in the Mountain West, Grain Belt Express in the Midwest, and Champlain Hudson Power Express in the Northeast. The Midcontinent Independent System Operator (MISO) recently approved $10.3 billion in regional HVDC lines, a move that is projected to generate up to $52.2 billion in net benefits through mitigated blackouts and increased fuel savings.
Though co-ops account for the smallest percentage of utilities (there are 812 total), they are found in the primarily rural Midwest, where there is high generation potential for solar and wind energy. Here, utility participation in RTOs is low. FERC has expressed disinterest in mandating RTO participation and in taking punitive action. However, it can incentivize regional planning through RTO membership or, where unappealing to local utilities, incentivize regional transmission investment through joint ownership structures.
The Biden-Harris Administration has taken the first steps to address these issues, such as releasing an Action Plan in 2022 to encourage federal agencies to expedite the permitting process of renewable energy. The president should expand on the existing Action Plan to build a larger coalition of contributors and also encourage the following recommendations to facilitate maximum clean-energy transition efficiency. Achieving the Biden-Harris Administration decarbonization targets requires the tandem development of rural utility-scale renewable energy and regional HVDC transmission to carry this energy to urban centers, benefiting people and economies across the United States.
Plan of Action
Recommendation 1. BLM should prioritize renewable energy permit awards near planned HVDC transmission lines and existing rights-of-way.
BLM reported that, compared to FY20, it increased renewable energy permitting activities by 35%, supporting the development of 2,898 MW of onshore solar, wind, and geothermal energy generation capacity. In 2021, BLM received 130 proposals for renewable energy generation projects on public lands and six applications for interconnected transmission lines. The proposed transmission lines would support 17 GW of generation and would also support the transmission of renewable energy on non-federal land across the Southwest.
DOI can directly support renewable energy generation by instructing BLM to ensure that contracts are awarded through the multiple-use, sustained-yield mandate in a specific way. Though Section 50265 of the IRA mandates that oil and gas leases must continue, DOI can plan with an eye to the future. Renewables built on public lands should be constructed in areas closest to planned HVDC transmission, including but not limited to Kansas, Wyoming, and New Mexico. Renewables should always take precedence over coal, oil, and natural gas in areas where existing or future HVDC transmission lines are planned to begin construction or upgrades. Renewables should also always take precedence near railways and federal highways, where HVDC transmission is more easily implemented. Contracts for renewables near planned HVDC interstate transmission lines and existing rights-of-way like railways and highways should be given precedence in the awards process. This will prime the grid for the Biden-Harris Administration’s decarbonization goals and ensure that oil and gas generation is situated closer to legacy lines that are more likely to be retired sooner. DOI has unique considerations due to Section 50265 of the IRA, but it can still coordinate with other federal agencies to manage its constraints and judiciously prioritize transmission-adjacent renewable energy generation sites.
Recommendation 2. FERC should incentivize regional transmission planning by encouraging federal-local partnerships, introducing joint-ownership structures, and amending Order 1000.
FERC should encourage RTOs to prioritize regional transmission planning in order to meet decarbonization goals and accommodate an influx of cheaper, cleaner energy into their portfolios. The FERC-NARUC Task Force, a federal-state task force on electric transmission, is a good starting point for this cooperation and should be expanded upon; it offers a blueprint for how federal objectives for regional planning can work hand-in-hand with local considerations. FERC can highlight positive cases like SB448 in Nevada, which incentivizes long-distance transmission and mandates the state’s participation in an RTO by 2030. FERC should encourage utility participation in RTOs but emphasize that long-distance transmission planning and implementation is the ultimate objective. Where RTO participation is not feasible, FERC can incentivize utility participation in regional transmission planning in other ways.
FERC should incentivize utility participation in regional transmission by encouraging joint-ownership structures, as explored in a 2019 incentives docket. In March 2019, FERC released a Notice of Inquiry seeking comments on “the scope and implementation of its electric transmission incentives regulations and policy.” Commenters supported non-public utility joint-ownership promotion, including equity in transmission lines that can offset customer rates, depending on the financing structure. In February 2023, FERC approved incentives for two of Great River Energy’s interstate transmission projects, in which it will own a 52.3% stake of the Minnesota Iron Range project and 5% of the Big Stone project. In the Iron Range project, Great River can use a 50% equity and 50% debt capital structure, placing the construction expenses on its rate base. The cash flow generated by this capital structure is necessary for the completion of this interstate transmission line, and FERC should encourage similar projects and incentives.
FERC should amend Order 1000—Transmission Planning and Cost Allocation. As former Commissioner Glick has noted, Order 1000 in its current iteration unintentionally encourages the construction of smaller lines over larger-scale regional transmission lines because utilities prefer not to engage in potentially lengthy, expensive competition processes. In April 2022, FERC published a Notice of Proposed Rulemaking (NOPR), which, among other things, attempts to address this perverse incentive by amending the order “to permit the exercise of a federal rights of first refusal for transmission facilities selected in a regional transmission plan for purposes of cost allocation, conditioned on the incumbent transmission provider establishing joint ownership of those facilities.” Amending this rule and allowing federal ROFR for joint ownership structures will encourage partnerships, spread risks across more parties, and allow greater access to large investments that traditionally require an insurmountable capital investment for most investors new to this sector. The NOPR also encouraged long-term regional transmission planning and improved coordination between local and regional entities and implementation goals. The amendment was supported by both utilities and environmental groups. Public comments were closed for submission in summer 2022. Now, over a year later, FERC should act quickly to issue a final rule on amending Order 1000.
In addition to incentivizing more regionally focused transmission planning at the utility level, federal agencies should work together to ensure that HVDC lines are strategically placed to facilitate the delivery of renewable energy to large markets.
Recommendation 3. The Biden-Harris Administration should encourage the Department of Transportation to work with the Grid Deployment Office (GDO) and approve state DOT plans for HVDC lines along existing highways and railroads.
In 2021, the Federal Highway Administration (FHWA) released a memorandum providing guidance that state departments of transportation may leverage “alternative uses” of existing highway rights-of-way (ROW), including for renewable energy, charging stations, transmission lines, and broadband projects, and that the FHWA may approve alternative uses of ROWs so long as they benefit the public and do not impair traffic. The GDO, created by the Biden-Harris Administration, should work directly with state DOTs to plan for future interstate lines. As these departments coordinate, they should use a future highway framework characterized by increased electric vehicle (EV) usage, increased EV charging station needs, and improved mass transit. This will allow DOT to reinterpret what constitutes impeding the “free and safe flow of traffic.” The FHWA should encourage state DOTs to use the SOO Green HVDC Link as a blueprint. The idea of reconciling siting issues by building transmission lines along existing rights-of-way such as highways or railroads is known to this administration, as evidenced by President Biden’s reference in a 2022 White House statement and by FERC’s June 2020 report on barriers and opportunities for HVDC transmission.
Recommendation 4. DOI, the Department of Agriculture (USDA), DOD, DOE, and the Environmental Protection Agency (EPA) should sign a new Memorandum of Understanding (MOU) that builds on their 2022 MOU but includes DOT.
In 2022, DOI, USDA, DOD, DOE, and the EPA signed an MOU to expedite the review process for renewable energy projects on federal lands. DOT, specifically its FHWA and Federal Railroad Administration (FRA), should be included in this memorandum. The president should direct these agencies to sign a second MOU committing them to create a regional and national outline for future transmission lines and to prioritize permit requests that align with that outline. This new MOU should add DOT and illustrate the specific ways that FHWA and FRA can support its goals by repurposing existing transportation rights-of-way.
Recommendation 5. All future covered transmission planning should align with the MOU proposed in Recommendation 4.
Under Section 50152 of the IRA, the DOE received $760 million to distribute federal grants for the development of covered transmission projects. Section 50153 appropriates an additional $100 million to DOE, which is specifically tailored to wind electricity planning and development, both offshore and interregional. The DOE should require that all transmission planning using this federal funding align with the long-term outline created under the MOU recommended above. Additionally, preference should be given to transmission lines (receiving federal funding) that link utility-scale renewable energy projects with large urban centers.
Recommendation 6. The EPA should fund technical and educational training to rural and disadvantaged communities that might benefit from an influx of high-demand green energy jobs.
The federal government should leverage existing funding to ensure that rural and disadvantaged communities directly benefit from economic development opportunities facilitated by the clean energy transition. The EPA should use funds from Section 60107 of the IRA to provide technical and educational assistance to low-income and disadvantaged communities in the form of job training and planning. EPA funding can be used to ensure that local communities have the technical knowledge to take advantage of the jobs and opportunities created by projects like the SOO Green HVDC Link. Because this section of the IRA only funds up to $17 million in job training, this should be allocated to supplement community colleges and other technical training programs that have established curricula and expertise.
To ensure that efforts are successful in the long term, federal agencies, utilities, and other stakeholders must have access to accurate and current information about transmission needs nationwide.
Recommendation 7. Congress should fund regular updates to existing future transmission needs studies.
Congress must continue to approve future research into both halves of the electrification equation: generation and transmission. Congress already approved funding for the NREL Electrification Futures Study and the NREL Interconnections SEAM Study, both published in 2021. These studies allow NREL to determine best-case scenario models and then communicate its research to the RTOs that are best positioned to help IOUs plan for future regional transmission. These studies also guide FERC and the GDO as they determine best-case scenarios for linking rural clean energy resources to urban energy markets.
In addition, Congress must continue to fund the GDO National Transmission Needs Study, which was funded by the Bipartisan Infrastructure Law (BIL). This study researches capacity constraints and congestion on the transmission grid and will help FERC and RTOs determine where future transmission should be planned in order to relieve pressure and meet needs. The final Needs Study was issued in summer 2023, but it must be updated on a regular basis if the country is to actively move toward grid coordination.
The Summer 2023 Needs Study included, for the first time, modeling and discussion of anticipated future capacity constraints and transmission congestion. As the grid continues to evolve and different types of renewable energy are integrated into the grid, future needs studies should continue to include forward-looking models under a variety of renewable energy scenarios.
Conclusion
The Biden-Harris Administration has rejoined the Paris Climate Agreement, affirming its commitment to significant decarbonization goals. To achieve this end, the administration must follow a two-pronged approach that facilitates the installation of utility-scale renewable energy on public lands in the Midwest and Southwest and expedites the implementation of HVDC transmission lines that will link these resources to urban energy markets.
It is impossible to meet the Biden-Harris Administration climate goals without drastic action to encourage further electrification, renewable energy development, and transmission planning. Fortunately, these actions are ripe for bipartisan coordination and are already supported through existing laws like the IRA and BIL. These recommendations will help meet these goals and secure a brighter future for Americans across the rural-urban divide.
FERC has made recent strides toward encouraging transmission modernization through Order No. 2023. While this rule primarily addresses the “largest interconnection queue size in history” and takes steps to accelerate the interconnection process, it does not address the lack of transmission capacity and infrastructure nationally. Order No. 2023 is a vital step forward in interconnection process modernization, and it should be the first of many toward large-scale transmission planning.
As of November 2021, BLM-managed lands produced 12 GW of power from renewable energy sources, through 36 wind, 37 solar, and 48 geothermal permitted projects. To put this number into perspective, 1 GW is enough to power approximately 750,000 homes. Helpfully, BLM maintains a list of planned and approved renewable energy projects on its lands. Additionally, the Wilderness Society maintains an interactive map of energy projects on public lands.
In contrast, BLM manages over 37,000 oil and gas leases, including over 96,000 wells.
Due to their high renewable-energy development potential, Midwest and Southwest states stand to disproportionately gain from a clean energy jobs boom in the fields of construction, management, and the technical trades. Given the West’s and Northeast’s desire for a decarbonized grid and their comparatively greater energy use, these states will benefit by receiving greater amounts of renewable energy to meet their energy needs and decarbonization goals.
The United States lags in the number of HVDC transmission lines, particularly compared to China and Europe. In 2022, only 552 miles of high-voltage transmission were added in the United States. Currently, four regional transmission lines are proposed, two of which are expected to begin construction this year. Of these planned lines, three are in the Midwest and Southwest, and one is in the Northeast. While this is progress, China has recently invested $26 billion in a national network of ultra-high-voltage lines.
Five agencies manage federal land: BLM, USFS, FWS, NPS, and DOD. However, only BLM and USFS operate under multiple-use, sustained-yield mandates, and their land-use mandates are similar. The other agencies’ mandates require them to protect and conserve animals and plants, promote tourism and engagement with public lands, and manage military installations and bases. Accordingly, BLM and USFS are the best candidates for developing utility-scale renewable energy resources through their specific mandates. This memo focuses on the larger of those two entities, which has greater potential for substantial renewable energy development and an established permitting system. As discussed in this USFS and NREL study, the study of renewable-energy construction on national forest system lands is still in its early stages, whereas BLM’s policies and systems are developed.
It is not within the scope of this memo to address issues specific to Tribal lands. However, various federal agencies offer clean energy funding specifically for Tribes, such as the Tribal Energy Loan Guarantee Program. If desired by Tribal communities, the U.S. government should prioritize funding for HVDC transmission lines that link Tribal power generation to Tribal urban centers and utility grids. For Tribes seeking guidance on implementing utility-scale projects, Navajo Nation can serve as one model. Navajo Nation has the highest solar potential of any tribal land in the country. It has successfully constructed the Kayenta Solar Project (55 MW) and has finalized leases for the Cameron Solar Plant (200 MW) and the Red Mesa Tapaha Solar Generation Plant (70 MW). The Cameron project alone will generate $109 million for tribal coffers over the next 30 years through tax revenue, lease payments, and energy transmission payments. Another example is the solar energy portfolio of the Moapa Band of Paiute Indians, which manages a growing portfolio of utility-scale solar projects, including the Moapa Southern Paiute Solar Project (250 MW), the first utility-scale solar installation on tribal land. Currently under development are the Arrow Canyon Solar Project, the Southern Bighorn Solar Project, and the Chuckwalla Solar Projects, all of which feature joint ownership among tribal, federal, and private stakeholders.
Engaging Coal Communities in Decarbonization Through Nuclear Energy
The United States is committed to the ambitious goal of reaching net-zero emissions globally by 2050, requiring rapid deployment of clean energy domestically and across the world. Reducing emissions while meeting energy demand requires firm power sources that produce energy at any time and in adverse weather conditions, unlike solar or wind energy. Advanced nuclear reactors, the newest generation of nuclear power plants, are firm energy sources that offer potential increases in efficiency and safety compared to traditional nuclear plants. Adding more nuclear power plants will help the United States meet energy demand while reducing emissions. Further, building advanced nuclear plants on the sites of former coal plants could create benefits for struggling coal communities and result in significant cost savings for project developers. Realizing these benefits for our environment, coal communities, and utilities requires coordinating and expanding existing efforts. The Foundation for Energy Security and Innovation (FESI), the US Department of Energy (DOE), and Congress should each take actions to align and strengthen advanced nuclear initiatives and engagement with coal communities in the project development process.
Challenge and Opportunity
Reducing carbon emissions while meeting energy demand will require the continued use of firm power sources. Coal power, once a major source of firm energy for the United States, has declined since 2009 due to federal and state commitments to clean energy and competition with other clean energy sources. Power generated from coal plants is expected to drop to half of current levels by 2050 as upwards of 100 plants retire. The DOE found that sites of retiring coal plants are promising candidates for advanced nuclear plants, considering the similarities in site requirements, the ability to reuse existing infrastructure, and the overlap in workforce needs. Advanced nuclear reactors are the next generation of nuclear technology and include both small modular reactors (SMRs), which function similarly to traditional light-water reactors but on a smaller site, and non-light-water reactors, which are also physically smaller but use different methods to control reactor temperature. However, the DOE’s study and additional analysis from the Bipartisan Policy Center also identified significant challenges to constructing new nuclear power plants, including the risk of cost overruns, licensing timeline uncertainties, and opposition from communities around plant sites. Congress took steps to promote advanced nuclear power in the Inflation Reduction Act and the CHIPS and Science Act, but more coordination is needed. To commercialize advanced nuclear in support of our decarbonization goals, the DOE estimates that utilities must commit, by 2025, to deploying at least five advanced nuclear reactors of the same design. There are currently no agreements to do so.
The Case for Coal to Nuclear
Coal-dependent communities and the estimated 37,000 people working in coal power plants could benefit from the construction of advanced nuclear reactors. Benefits include the potential addition of more than 650 jobs, about 15% higher pay on average, and the ability for some of the existing workforce to transition without additional experience, training, or certification. Jobs in nuclear energy also experience fewer fatal accidents, minor injuries, and harmful exposures than jobs in coal plants. Advanced nuclear energy could revitalize coal communities, which have suffered labor shocks and population decline since the 1980s. By embracing advanced nuclear power, these communities can reap economic benefits and create a pathway toward a sustainable and prosperous future. For instance, in one case study by the DOE, replacing a 924 MWe coal plant with nuclear increased regional economic activity by $275 million. Before benefits are realized, project developers must partner with local communities and other stakeholders to align interests and gain public support so that they may secure agreements for coal-to-nuclear transition projects.
Communities living near existing nuclear plants tend to view nuclear power more favorably than those who do not, but gaining acceptance to construct new plants in communities less familiar with nuclear energy is challenging. Past efforts using a top-down approach were met with resistance and created a legacy of mistrust between communities and the nuclear industry. Stakeholders can slow or stop nuclear construction through lawsuits and lengthy studies under the National Environmental Policy Act (NEPA), and 12 states have restrictions or total bans on new nuclear construction. Absent changes to the licensing and regulatory process, project developers must mitigate this risk through a process of meaningful stakeholder and community engagement. A just transition from coal to nuclear energy production requires developers to listen and respond to local communities’ concerns and needs through the process of planning, siting, licensing, design, construction, and eventual decommissioning. Project developers need guidance and collective learning to update the siting process with more earnest practices of engagement with the public and stakeholders. Coal communities also need support in transitioning a workforce for nuclear reactor operations.
Strengthen and Align Existing Efforts
Nuclear energy companies, utilities, the DOE, and researchers are already exploring community engagement and considering labor transitions for advanced nuclear power plants. NuScale Power, TerraPower, and X-energy are leading in both the technical development of advanced nuclear and in considerations of community benefits and stakeholder management. The Utah Associated Municipal Power Systems (UAMPS), which is hosting NuScale’s demonstration SMR, spent decades engaging with communities across 49 utilities over seven states before signing an agreement with NuScale. Their carbon-free power project involved over 200 public meetings, resulting in several member utilities choosing to pursue SMRs. Universities are collaborating with the Idaho National Laboratory to analyze energy markets using a multidisciplinary framework that considers community values, resources, capabilities, and infrastructure. Coordinated efforts by researchers near the TerraPower Natrium demonstration site investigate how local communities view the cost, benefits, procedures, and justice elements of the project.
The DOE also works to improve stakeholder and community engagement across multiple offices and initiatives. Most notably, the Office of Nuclear Energy is using a consent-based siting process, developed with extensive public input, to select sites for interim storage and disposal of spent nuclear fuel. The office distributed $26 million to universities, nonprofits, and private partners to facilitate engagement with communities considering the costs and benefits of hosting a spent fuel site. DOE requires all recipients of funds from the Infrastructure Investment and Jobs Act and the Inflation Reduction Act, including companies hosting advanced nuclear demonstration projects, to submit community benefits plans outlining community and labor organization engagement. The DOE’s new Commercial Liftoff Reports for advanced nuclear and other clean energy technologies are detailed and actionable policy documents strengthened by the inclusion of critical societal considerations.
Through the CHIPS and Science Act, Congress established or expanded DOE programs that promote both the development of advanced nuclear on sites of former coal plants and the research of public engagement for nuclear energy. The Nuclear Energy University Program (NEUP) has funded technical nuclear energy research at universities since 2009. The CHIPS Act expanded the program to include research that supports community engagement, participation, and confidence in nuclear energy. The Act also established, but did not fund, a new advanced nuclear technology development program that prioritizes projects at sites of retiring coal plants and those that include elements of workforce development. An expansion of an existing nuclear energy training program was cut from the final CHIPS Act, but the expansion is proposed again in the Nuclear Fuel Security Act of 2023.
More coordination is required among DOE, the nuclear industry, and utilities. Congress should also take action to fund initiatives authorized by recent legislation that enable the coal-to-nuclear transition.
Plan of Action
Recommendations for Federal Agencies
Recommendation 1. A sizable coordinating body, such as the Foundation for Energy Security and Innovation (FESI) or the Appalachian Regional Commission (ARC), should support project developers’ efforts to include community engagement in the siting, planning, design, and construction of advanced nuclear power plants.
FESI is a new foundation created to help the DOE commercialize energy technology by supporting and coordinating stakeholder groups. ARC is a partnership between the federal government and Appalachian states that supports economic development through grantmaking and research on issues related to the region’s challenges. FESI and ARC are coordinating bodies that can connect disparate efforts by developers, academic experts, and the DOE through enabling and connecting initiatives. These efforts should leverage existing resources on consent-based siting processes developed by the DOE. While those processes are specific to siting spent nuclear fuel storage facilities, the roadmap and sequencing elements can be replicated for other goals. Stage 1 of the DOE’s planning and capacity-building process focuses on building relationships with communities and stakeholders and engaging in mutual learning about the topic. FESI or ARC can establish programs and activities to support planning and capacity building by utilities and the nuclear industry.
FESI could pursue activities such as:
- Hosting a community of practice for public engagement staff at utilities and nuclear energy companies, experts in public engagement methods design, and the Department of Energy
- Conducting activities such as stakeholder analysis, community interest surveys, and engagement to determine community needs and concerns across all coal communities
- Providing technical assistance on community engagement methods and strategies to utilities and nuclear energy companies
ARC could conduct studies such as stakeholder analysis and community interest surveys to determine community needs and concerns across Appalachian coal communities.
Recommendation 2. The DOE should continue expanding the Nuclear Energy University Program (NEUP) to fund nontechnical nuclear research in the social sciences and law that supports community engagement, participation, and confidence in nuclear energy systems, including navigation of the licensing required for advanced reactor deployment.
Evolving processes to include effective community engagement will require new knowledge in the social sciences and a shift in the culture of nuclear education and training. Since 2009, the DOE Office of Nuclear Energy has supported nuclear energy research and equipment upgrades at U.S. colleges and universities through the NEUP. Except for a few recent examples, including the University of Wyoming project cited above, most funded projects have been scientific or technical. Congress recognized the importance of supporting research in nontechnical areas by authorizing the expansion of NEUP to include nontechnical nuclear research in the CHIPS and Science Act. DOE should not wait for additional appropriations to expand this program. Further, NEUP should encourage awardees to participate in communities of practice hosted by FESI or other bodies.
Recommendation 3. The DOE Office of Energy Jobs and the Department of Labor (DOL) should collaborate on the creation and dissemination of training standards for the nuclear plant jobs that require extensive training, licensing, or experience beyond what former coal plant workers typically hold.
Sites of former coal plants are promising candidates for advanced nuclear reactors because most job roles are directly transferable. However, an estimated 23% of nuclear plant jobs—operators, senior managers, and some technicians—require extensive licensing from the Nuclear Regulatory Commission (NRC) and direct experience in nuclear roles. It is possible that an experienced coal plant operator and an entry-level nuclear hire would require the same training path to become an NRC-licensed nuclear plant operator.
Supporting the clean energy workforce transition fits within existing priorities for the DOE’s Office of Energy Jobs and the DOL, as expressed in the memorandum of understanding signed on June 21, 2022. Section V.C. asserts the departments share joint responsibility for “supporting the creation and expansion of high-quality and equitable workforce development programs that connect new, incumbent, and displaced workers with quality energy infrastructure and supply chain jobs.” Job transition pathways and specific training needs will become apparent through additional studies by interested parties and lessons from programs such as the Advanced Reactor Demonstration Program and the Clean Energy Demonstration Program on Current and Former Mine Land. The departments should capture and synthesize this knowledge into standards from which industry and utilities can design targeted job transition programs.
Recommendations for Congress
Recommendation 4. Congress should fully appropriate key provisions of the CHIPS and Science Act to support coal communities’ transition to nuclear energy.
- Appropriate $800 million over FY2024 to FY2027 to establish the DOE Advanced Nuclear Technologies Federal Research, Development, and Demonstration Program: The CHIPS and Science Act established this program to promote the development of advanced nuclear reactors; the program prioritizes projects at sites of retiring coal power plants and projects that include workforce development programs. These critical workforce training programs need direct funding.
- Appropriate an additional $15 million from FY2024 to FY2025 to the NEUP: The CHIPS and Science Act authorizes an additional $15 million from FY2023 to FY2025 to the NEUP within the Office of Nuclear Energy, increasing the annual total from $30 million to $45 million. Since CHIPS included an authorization to expand the program to include nontechnical nuclear research, the expansion should come with increased funding.
Recommendation 5. Congress should expand the Nuclear Energy Graduate Traineeship Subprogram to include workforce development through community colleges, trade schools, apprenticeships, and pre-apprenticeships.
The current Traineeship Subprogram supports workforce development and advanced training through universities only. Extending this direct funding to job training through community colleges, trade schools, apprenticeships, and pre-apprenticeships will support utilities’ and industry’s efforts to transition the coal workforce into advanced nuclear jobs.
Recommendation 6. Congress should amend Section 45U, the Nuclear Production Tax Credit for existing nuclear plants, to include apprenticeship requirements similar to those for future advanced nuclear plants covered under Section 45Y, the Clean Energy Production Tax Credit.
Starting in 2025, new nuclear power plant projects will be eligible for the New Clean Energy Production and Investment Tax Credits if they meet certain apprenticeship requirements. However, plants established before 2025 will not be eligible for these incentives. Congress should add apprenticeship requirements to the Nuclear Production Tax Credit so that activities at existing plants strengthen the total nuclear workforce. Credits should be awarded with priority to companies implementing apprenticeship programs designed for former coal industry workers.
Conclusion
The ambitious goal of reaching net-zero emissions globally requires the rapid deployment of clean energy technologies, in particular firm clean energy such as advanced nuclear power. Since the 1980s, communities around coal power plants have suffered from industry shifts and will continue to accumulate disadvantages without support. Coal-to-nuclear transition projects advance the nation’s decarbonization efforts while creating benefits for developers and revitalizing coal communities. Utilities, the nuclear industry, the DOE, and researchers are advancing community engagement practices and methods, but more effort is required to share best practices and coordinate this emerging work. FESI or other large coordinating bodies should fill this gap by hosting communities of practice, producing knowledge on community values and attitudes, and providing technical assistance. DOE should continue to promote community engagement research and help articulate workforce development needs. Congress should fully fund initiatives authorized by recent legislation to promote the coal-to-nuclear transition. Action now will ensure that our clean firm power needs are met and that coal communities benefit from the clean energy transition.
Transitioning coal miners directly into clean energy jobs is challenging given the differences in skills and labor demand between the sectors. Most efforts to transition coal miners should instead focus on training in fields with similar skill requirements, such as job training for manufacturing roles within the Appalachian Climate Technology Coalition. Congress could also provide funding for unemployed coal miners to pursue education for other employment.
A significant challenge is aligning the construction of advanced nuclear plants with the decommissioning of coal plants. Advanced nuclear project timelines are subject to various delays and uncertainties. For example, one of the first commercial demonstrations of advanced reactor technology in the United States, the TerraPower Natrium plant in Wyoming, has been delayed by constraints in the high-assay low-enriched uranium supply chain. The Nuclear Regulatory Commission’s licensing process also creates uncertainty and extends project timelines.
Methods exist to safely contain radioactive material as it decays to more stable isotopes. The waste is stored on site at the power plant in secure pools in the shorter term and in storage casks capable of containing the material for at least 100 years in the longer term. The DOE must continue pursuing interim consolidated storage solutions as well as a permanent geological repository, but the lack of these facilities should not pose a significant barrier to constructing advanced nuclear power plants. The United States should also continue to pursue recycling spent fuel.
More analysis is required to better understand the waste management impacts of advanced reactors. A study conducted by Argonne National Laboratory found that while the attributes of spent fuel vary with the exact reactor design, overall there are no unique challenges to managing fuel from advanced reactors compared to fuel from traditional reactors. A separate study found that spent fuel from advanced reactors will contain more fissile nuclides, which makes waste management more challenging. As the DOE continues to identify interim and permanent storage sites through a consent-based process, utilities and public engagement efforts must examine the particular waste management challenges of each advanced nuclear technology option under evaluation.
Similar to waste output, the risk of proliferation from advanced reactors varies with the specific technology and requires further study. Some advanced reactor designs, such as the TerraPower Natrium reactor, require fuel that is more enriched than the fuel used in traditional designs. However, the safeguards required for the two types of fuel are not significantly different. Other designs, such as the TerraPower TWR, are expected to be able to use depleted or natural uranium sources, and the NuScale VOYGR models use traditional fuel. All reactors have the capacity to produce fissile material, so as the United States expands its nuclear energy capabilities, efforts should be made to extend current proliferation safeguards to fuel both as it is prepared for plants and after it has been used.