Bio x AI: Policy Recommendations for a New Frontier
Artificial intelligence (AI) is likely to yield tremendous advances in our basic understanding of biological systems, as well as significant benefits for health, agriculture, and the broader bioeconomy. However, AI tools, if misused or developed irresponsibly, can also pose risks to biosecurity. The landscape of biosecurity risks related to AI is complex and rapidly changing, and understanding the range of issues requires diverse perspectives and expertise. To better understand and address these challenges, FAS initiated the Bio x AI Policy Development Sprint to solicit creative recommendations from subject matter experts in the life sciences, biosecurity, and governance of emerging technologies. Through a competitive selection process, FAS identified six promising ideas and, over the course of seven weeks, worked closely with the authors to develop them into the recommendations included here. These recommendations cover a diverse range of topics to match the diversity of challenges that AI poses in the life sciences. We believe these recommendations will help inform policy development on these topics, including the work of the National Security Commission on Emerging Biotechnologies.
AI tool developers and others have put significant effort into establishing frameworks to evaluate and reduce risks, including biological risks, that might arise from “foundation” models (i.e., large models designed to be used for many different purposes). These include voluntary commitments from major industry stakeholders and several efforts to develop methods for evaluating these models. The Biden Administration’s recent Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence (AI EO) furthers this work and establishes a framework for evaluating and reducing risks related to AI.
However, the U.S. government will need creative solutions to establish oversight for biodesign tools (i.e., more specialized AI models that are trained on biological data and provide insight into biological systems). Although experts, including those who participated in this Policy Sprint, hold differing views about the magnitude of the risks these tools pose, the tools are undoubtedly an important part of the landscape of biosecurity risks that may arise from AI. Three of the submissions to this Policy Sprint address the need for oversight of these tools. Oliver Crook, a postdoctoral researcher at the University of Oxford and a machine learning expert, calls on the U.S. government to ensure responsible development of biodesign tools by instituting a framework for checklist-based, institutional oversight of these tools. Richard Moulange, AI-Biosecurity Fellow at the Centre for Long-Term Resilience, and Sophie Rose, Senior Biosecurity Policy Advisor at the Centre for Long-Term Resilience, expand on the Executive Order on AI with recommendations for establishing standards for evaluating these tools’ risks. In his submission, Samuel Curtis, an AI Governance Associate at The Future Society, takes a more open-science approach, recommending that infrastructure for cloud-based computational resources be expanded internationally to promote critical advances in biodesign tools while establishing norms for responsible development.
Two of the submissions to this Policy Sprint work to improve biosecurity at the interface where digital designs might become biological reality. Shrestha Rath, a scientist and biosecurity researcher, focuses on biosecurity screening of synthetic DNA, which the Executive Order on AI highlights as a key safeguard, and recommends ways to improve screening methods to better prepare for designs produced using AI. Tessa Alexanian, a biosecurity and bioweapons expert, calls for the U.S. government to issue guidance on biosecurity practices for automated laboratories, sometimes called “cloud labs,” that can generate organisms and other biological agents.
This Policy Sprint highlights the diversity of perspectives and expertise that will be needed to fully explore the intersections of AI with the life sciences, and the wide range of approaches that will be required to address their biosecurity risks. Each of these recommendations represents an opportunity for the U.S. government to reduce risks related to AI, solidify the U.S. as a global leader in AI governance, and ensure a safer and more secure future.
Recommendations
- Develop a Screening Framework Guidance for AI-Enabled Automated Labs by Tessa Alexanian
- An Evidence-Based Approach to Identifying and Mitigating Biological Risks From AI-Enabled Biological Tools by Richard Moulange & Sophie Rose
- A Path to Self-governance of AI-Enabled Biology by Oliver Crook
- A Global Compute Cloud to Advance Safe Science and Innovation by Samuel Curtis
- Establish Collaboration Between Developers of Gene Synthesis Screening Tools and AI Tools Trained on Biological Data by Shrestha Rath
- Responsible and Secure AI in Production Agriculture by Jennifer Clarke
Develop a Screening Framework Guidance for AI-Enabled Automated Labs
Tessa Alexanian
Protecting against the risk that AI is used to engineer dangerous biological materials is a key priority in the Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence (AI EO). AI-engineered biological materials only become dangerous after digital designs are converted into physical biological agents, and biosecurity organizations have recommended safeguarding this digital-to-physical transition. In Section 4.4(b), the AI EO targets this transition by calling for standards and incentives that ensure appropriate screening of synthetic nucleic acids. This should be complemented by screening at another digital-to-physical interface: AI-enabled automated labs, such as cloud labs and self-driving labs.1
Laboratory biorisk management does not need to be reinvented for AI-enabled labs; existing U.S. biosafety practices and Dual Use Research of Concern (DURC) oversight can be adapted and applied, as can emerging best practices for AI safety. However, the U.S. government should develop guidance that addresses two unique aspects of AI-enabled automated labs:
- Remote access to laboratory equipment may allow equipment to be misused by actors who would find it difficult to purchase or program it themselves.
- Unsupervised engineering of biological materials could produce dangerous agents without appropriate safeguards (e.g., if a viral vector regains transmissibility during autonomous experiments).
In short, guidance should ensure that providers of labs are aware of who is using the lab (customer screening), and what it is being used for (experiment screening).
It is unclear precisely if and when automated labs will become broadly accessible to biologists, though a 2021 WHO horizon scan described them as posing dual-use concerns within five years. Because these concerns are dual-use, policymakers must also weigh the benefits that remotely accessible, high-throughput, AI-driven labs offer for scientific discovery and biomedical innovation. The Australia Group discussed potential policy responses to cloud labs in 2019, including customer screening, experiment screening, and cybersecurity, though no guidance has been released. This is the right moment to develop screening guidance: automated labs are not yet widely used, but they are attracting increasing investment and attention.
Recommendations
The evolution of U.S. policy on nucleic acid synthesis screening shows how the government can proactively identify best practices, issue voluntary guidance, allow stakeholders to test the guidance, and eventually require that federally-funded researchers procure from providers that follow a framework derived from the guidance.
Recommendation 1. Convene stakeholders to identify screening best practices
Major cloud lab companies already implement some screening and monitoring, and developers of self-driving labs recognize risks associated with them, but security practices are not standardized. The government should bring together industry and academic stakeholders to assess which capabilities of AI-enabled automated labs pose the most risk and share best practices for appropriate management of these risks.
As a starting point, aspects of the Administration for Strategic Preparedness and Response’s (ASPR) Screening Framework Guidance for synthetic nucleic acids can be adapted for AI-enabled automated labs. Labs that offer remote access could follow a similar process for customer screening, including verifying identity for all customers and verifying legitimacy for work that poses elevated dual-use concerns. If an AI system operating an autonomous or self-driving lab places a synthesis order for a sequence of concern, this could trigger a layer of human-in-the-loop approval.
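The human-in-the-loop trigger described above can be sketched in code. This is a hypothetical illustration only: the function names, the naive substring matching (real screening uses curated, homology-based tools), and the review-queue mechanism are all assumptions, not descriptions of any existing cloud lab system.

```python
from dataclasses import dataclass

@dataclass
class SynthesisOrder:
    customer_id: str
    sequence: str
    placed_by_ai: bool  # True if an autonomous lab agent placed the order

# Hypothetical stand-in for a curated database of sequences of concern.
SEQUENCES_OF_CONCERN = {"ATGCCC", "GGGTTT"}

def matches_sequence_of_concern(sequence: str) -> bool:
    """Naive substring check standing in for real homology-based screening."""
    return any(s in sequence for s in SEQUENCES_OF_CONCERN)

def screen_order(order: SynthesisOrder, review_queue: list) -> str:
    """Return 'approved' or 'held'. Orders matching a sequence of concern
    are held; AI-placed matches are additionally queued for human review."""
    if not matches_sequence_of_concern(order.sequence):
        return "approved"
    if order.placed_by_ai:
        review_queue.append(order)  # human-in-the-loop approval layer
    return "held"
```

The key design point is that an autonomous system never self-approves a flagged order: a match routes the order out of the closed loop and into a queue that only a human operator can clear.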
Best practices will require cross-domain collaboration among experts in machine learning, laboratory automation, autonomous science, biosafety, and biosecurity. Consortia such as the International Gene Synthesis Consortium and the Global Biofoundry Alliance already have U.S. cloud labs among their members and may be a useful starting point for stakeholder identification.
Recommendation 2. Develop guidance based on these best practices
The Director of the Office of Science and Technology Policy (OSTP) should lead an interagency policy development process to create screening guidance for AI-enabled automated labs. The guidance will build upon stakeholder consultations conducted under Recommendation 1, as well as the recent ASPR-led update to the Screening Framework Guidance, ongoing OSTP-led consultations on DURC oversight, and OSTP-led development of a nucleic acid synthesis screening framework under Section 4.4(b) of the AI EO.
The guidance should describe processes for customer screening and experiment screening. It should address biosafety and biosecurity risks associated with unsupervised engineering of biological materials, including recommended practices for:
- Dual-use review for automated protocols. Automated protocols typically undergo human review because operators of automated labs don’t want to run experiments that fail. Guidance should outline when protocols should undergo additional review for dual-use; the categories of experiments in the DURC policy provide a starting point.
- Identifying biological agents in automated labs. When agents are received from customers, their DNA should be sequenced to ensure they have been labeled correctly. Agents engineered through unsupervised experiments should also be screened after some number of closed-loop experimental cycles.
Recommendation 3. Invest in predictive biology for risk mitigation
The Department of Homeland Security (DHS) and Department of Defense (DOD), building on the evaluation they will conduct under Section 4.4(a)(i) of the AI EO, should fund programs to develop predictive models that improve biorisk management in AI-enabled automated labs.
It is presently difficult to predict the behavior of biological systems, and there is little focus specifically on predictive biology for risk mitigation. AI could perform real-time risk evaluations and anomaly detection in self-driving labs; for example, autonomous science researchers have highlighted the need to develop models that can recognize novel compounds with potentially harmful properties. The government can actively contribute to innovation in this area; the IARPA Fun GCAT program, which developed methods to assess whether DNA sequences pose a threat, is an example of relevant government-funded AI capability development.
An evidence-based approach to identifying and mitigating biological risks from AI-enabled biological tools
Richard Moulange & Sophie Rose
Both AI-enabled biological tools and large language models (LLMs) have advanced rapidly in a short time. While these tools have immense potential to drive innovation, they could also threaten the United States’ national security.
AI-enabled biological tools refer to AI tools trained on biological data using machine learning techniques, such as deep neural networks. They can already design novel proteins, viral vectors and other biological agents, and may in the future be able to fully automate parts of the biomedical research and development process.
Sophisticated state and non-state actors could potentially use AI-enabled tools to more easily develop biological weapons (BW) or design them to evade existing countermeasures. As these tools become more accessible and easier to use, the pool of actors capable of misusing them broadens.
This threat was recognized by the recent Executive Order on AI, which calls for evaluation of all AI models (not just LLMs) for capabilities enabling chemical, biological, radiological and nuclear (CBRN) threats, and recommendations for how to mitigate identified risks.
Developing novel evaluation systems for AI-enabled biological tools within 270 days, as directed by the Executive Order §4.1(b), will be incredibly challenging, because:
- There appears to have been little progress on developing benchmarks or evaluations for AI-enabled biological tools in academia or industry, and government capacity (in the U.S. and the UK) has so far focused on model evaluations for LLMs, not AI-enabled biological tools.
- Capabilities are entirely dual-use: for example, tools that can predict which viral mutations improve vaccine targeting can very likely identify mutations that increase vaccine evasion.
To achieve this, it will be important to identify and prioritize those AI-enabled biological tools that pose the most urgent risks, and balance these against the potential benefits. However, government agencies and tool developers currently seem to struggle to:
- Specify which AI–bio capabilities are the most concerning;
- Determine the scope of AI–enabled tools that pose significant biosecurity risks; and
- Anticipate how these risks might evolve as more tools are developed and integrated.
Some frontier AI labs have assessed the biological risks associated with LLMs, but there is no public evidence of AI-enabled biological tool evaluation or red-teaming, nor are there currently standards for developing—or requirements to implement—such evaluations. The White House Executive Order will build upon industry evaluation efforts for frontier models, addressing the risk posed by LLMs, but analogous efforts are needed for AI-enabled biological tools.
Given the lack of research on AI-enabled biological tool evaluation, the U.S. Government must urgently stand up a specific program to address this gap and meet the Executive Order directives. Without evaluation capabilities, the United States will be unable to scope regulations around the deployment of these tools, and will be vulnerable to strategic surprise. Doing so now is essential to capitalize on the momentum generated by the Executive Order, and comprehensively address the relevant directives within 270 days.
Recommendations
The U.S. Government should urgently acquire the ability to evaluate biological capabilities of AI-enabled biological tools via a specific joint program at the Departments of Energy (DOE) and Homeland Security (DHS), in collaboration with other relevant agencies.
Strengthening the U.S. Government’s ability to evaluate models prior to their deployment is analogous to responsible drug or medical device development: we must ensure novel products do not cause significant harm before making them available for widespread public use.
The objective(s) of this program would be:
- Develop state-of-the-art evaluations for dangerous biological capabilities
- Establish Department of Energy (DOE) sandbox for testing evaluations on a variety of AI-enabled biological tools
- Produce standards for the performance, structure, and securitization of capability evaluations
- Use evaluations of the maturity and capabilities of AI-enabled biological tools to inform U.S. Intelligence Community assessments of potential adversaries’ current bio-weapon capabilities
Implementation
- Standing up DOE and DHS’s ‘Bio Capability Evaluations’ program will require an initial investment of $2 million, plus $2 million per year through 2030 to sustain it. Funding should draw on existing National Intelligence Program appropriations.
- Supporting DOE to establish a sandbox for conducting ongoing evaluations of AI-enabled biological tools will require investment of $10 million annually. This could be appropriated to DOE under the National Defense Authorization Act (Title II: Research, Development, Test and Evaluation), which establishes funding for AI defense programs.
Lead agencies and organizations
- U.S. Department of Energy (DOE) can draw on expertise from National Labs, which often evaluate—and develop risk mitigation measures for—technologies with CBRN implications.
- U.S. Department of Homeland Security (DHS) can inform threat assessments and shape biological risk mitigation strategy and policy.
- National Institute for Standards and Technology (NIST) can develop the standards for the performance, structure and securitization of dangerous capability evaluations.
- U.S. Department of Health and Human Services (HHS) can leverage its AI Community of Practice (CoP) as an avenue for communicating with biological tool developers and researchers. The National Institutes of Health (NIH) funds relevant research and will therefore need to be involved in evaluations.
They should coordinate with other relevant agencies, including but not limited to the Department of Defense, and the National Counterproliferation and Biosecurity Center.
The benefits of implementing this program include:
Leveraging public-private expertise. Public-private partnerships (involving both academia and industry) will produce comprehensive evaluations that incorporate technical nuances and national security considerations. This allows the U.S. Government to retain access to diverse expertise while safeguarding the sensitive contents and outputs of dangerous-capability evaluations—which is harder to guarantee with third-party evaluators.
Enabling evidence-based regulatory decision-making. Evaluating AI tools allows the U.S. Government to identify the models and capabilities that pose the greatest biosecurity risks, enabling effective and appropriately-scoped regulations. Avoiding blanket regulations results in a better balance of the considerations of innovation and economic growth with those of risk mitigation and security.
Broad scope of evaluation application. AI-enabled biological tools vary widely in their application and current state of maturity. Consequently, what constitutes a concerning or dangerous capability may vary widely across tools, necessitating the development of tailored evaluations.
A path to self-governance of AI-enabled biology
Oliver Crook
Artificial intelligence (AI) and machine learning (ML) are being increasingly employed for the design of proteins with specific functions. By adopting these tools, researchers have been able to achieve high success rates designing and generating proteins with certain properties. This will accelerate the design of new medical therapies such as antibodies, vaccines and biotechnologies such as nanopores. However, AI-enabled biology could also be used for malicious – rather than benevolent – purposes. Despite this potential for misuse, there is little to no oversight over what tools can be developed, the data they can be trained on, and how the developed tool can be deployed. While more robust guardrails are needed, any proposed regulation must also be balanced, so that it encourages responsible innovation.
AI-enabled biology is still a specialized methodology that requires significant technical expertise, access to powerful computational resources, and sufficient quantities of data. As the performance of these models increases, their potential for generating significantly harmful agents grows as well. With AI-enabled biology becoming more accessible, the value of guardrails early on in the development of this technology is paramount before widespread technology proliferation makes it challenging – or impossible – to govern. Furthermore, smart policies implemented now can allow us to better monitor the pace of development, and guide reasonable and measured policy in the future.
Here, we propose that fostering self-governance and self-reporting is a scalable approach to this policy challenge. During the research, development, and deployment (RDD) phases, practitioners report on a pre-decided checklist and make an ethics declaration. While advancing knowledge is an academic imperative, funders, editors, and institutions need to be fully aware of the risks of some research and have opportunities to adjust the RDD plan, as needed, to ensure that AI models are developed responsibly. While similar policies have already been introduced by some machine learning venues (1, 2, 3), the proposal here seeks to strengthen, formalize, and broaden the scope of those policies. Ultimately, the checklist and ethics declarations seek confirmation from multiple parties during each of the RDD phases that the research serves a fundamental public good. We recommend that the National Institutes of Health (NIH) lead on this policy challenge, building upon decades of experience on related issues.
The recent Executive Order on AI provides an opportunity to build upon initial recommendations on reporting, but with greater specificity on AI-enabled biology. The proposal fits squarely within Section 4.4 of the Executive Order, which aims to reduce the misuse of AI in the development and design of biological weapons.
Recommendations
We propose the following recommendations:
Recommendation 1. With leadership from the NIH Office of Science Policy, life sciences funding agencies should coordinate development of a checklist in consultation with AI-enabled biology model developers, non-government funders, publishers, and nonprofit organizations that evaluates risks and benefits of the model.
The checklist should take the form of a list of pre-specified questions and guided free-form text. The questions should gather basic information about the models employed: their size, their compute usage and the data they were trained on. This will allow them to be characterized in comparison with existing models. The intended use of the model should be stated along with any dual-use behavior of the model that has already been identified. The document should also reveal whether any strategies have been employed to mitigate the harmful capabilities that the model might demonstrate.
At each stage of the RDD, the predefined checklist for that stage is completed and submitted to the institution.
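To make the submission process concrete, a machine-readable checklist could pair each pre-specified question with an expected answer type, so institutions can automatically flag incomplete filings before committee review. The field names below are purely illustrative assumptions; the actual questions would come out of the consultation process described in Recommendation 1.

```python
# Hypothetical schema for one RDD-phase checklist; field names are
# illustrative only, not a proposed official format.
CHECKLIST_QUESTIONS = {
    "model_size_parameters": int,
    "training_compute_flops": float,
    "training_data_description": str,
    "intended_use": str,
    "known_dual_use_behaviors": str,
    "mitigations_applied": str,
}

def validate_submission(submission: dict) -> list:
    """Return a list of problems; an empty list means the checklist is complete."""
    problems = []
    for field_name, expected_type in CHECKLIST_QUESTIONS.items():
        if field_name not in submission:
            problems.append(f"missing: {field_name}")
        elif not isinstance(submission[field_name], expected_type):
            problems.append(f"wrong type: {field_name}")
    return problems
```

Structured fields like compute usage and model size are what would later enable the comparisons with existing models, and the compute-governance analyses, described below.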
Recommendation 2. Each institute employing AI-enabled biology across RDD should elect a small internally-led, cross-disciplinary committee to examine and evaluate, at each phase, the submitted checklists. To reduce workload, only models that fall under the Executive Order specifications or dual use research of concern (DURC) should be considered. The committee makes recommendations based on the value of the work and posts its meeting proceedings publicly (as Institutional Biosafety Committees do), except for sensitive intellectual property. If the benefit of the work cannot be evaluated, or the outcomes are largely unpredictable, the committee should work with the model developer to adjust the RDD plan as needed. The checklist and institutional signature are then made available to NIH and funding agencies, and, upon completion of the project, such as at publication, the checklists are made publicly available.
By following these recommendations, high-risk research will be caught at an institutional level and internal recommendations can facilitate timely mitigation of harms. Public release of committee deliberations and ethics checklists will enable third parties to scrutinize model development and raise concerns. This approach ensures a hierarchy of oversight that allows individuals, institutes, funders and governments to identify and address risks before AI models are developed rather than after the work has been completed.
We recommend that $5 million be provided to the NIH Office of Science Policy to implement this policy. This money would cover hiring a ‘Director of Ethics of AI-enabled Biology’ to oversee this work, along with several full-time researchers/administrators ($1.5 million). These employees should conduct outreach to the institutes to ensure that the policy is understood, answer any questions, and facilitate community efforts to develop and update the checklist ($1 million). Additional grants should be made available to allow researchers and non-profit organizations to audit the checklists and committees, evaluate the checklists, and research the socio-technological implications of the checklists ($1.5 million). The rapid pace of AI development means that the checklists will need to be reevaluated on a yearly basis, with $1 million of funding available to evaluate the impact of these grants. Funding should grow in line with the pace of technological development. Specific subareas of AI-enabled biology may need specific checklists depending on their risk profile.
This recommendation is scalable; once the checklists have been made, the majority of the work is placed in the hands of practitioners rather than government. In addition, these checklists provide valuable information to inform future governance agendas. For example, limiting computational resources to curtail dangerous applications (compute governance) cannot proceed without detailed understanding of how much compute is required to achieve certain goals. Furthermore, it places responsibility on practitioners requiring them to engage with the risk that could arise from their work, with institutes having the ability to make recommendations on how to reduce the risks from models. This approach draws on similar frameworks that support self-governance, such as oversight by Institutional Biosafety Committees (IBCs). This self-governance proposal is well complemented by alternative policies around open access of AI-enabled biology tools, as well as policies strengthening DNA synthesis screening protocols to catch misuse at different places along a broadly-defined value chain.
A Global Compute Cloud to Advance Safe Science and Innovation
Samuel Curtis
Advancements in deep learning have ushered in significant progress in the predictive accuracy and design capabilities of biological design tools (BDTs), opening new frontiers in science and medicine through the design of novel functional molecules. However, these same technologies may be misused to create dangerous biological materials. Mitigating the risks of misuse of BDTs is complicated by the need to maintain openness and accessibility among globally-distributed research and development communities. One approach toward balancing both risks of misuse and the accessibility requirements of development communities would be to establish a federally-funded and globally-accessible compute cloud through which developers could provide secure access to their BDTs.
The term “biological design tools” (or “BDTs”) is a neologism referring to “systems trained on biological data that can help design new proteins or other biological agents.” Computational biological design is, in essence, a data-driven optimization problem. Consequently, over the past decade, breakthroughs in deep learning have propelled progress in computational biology. Today, many of the most advanced BDTs incorporate deep learning techniques and are used and developed by networks of academic researchers distributed across the globe. For example, the Rosetta Software Suite, one of the most popular BDT software packages, is used and developed by Rosetta Commons—an academic consortium of over 100 principal investigators spanning five continents.
Contributions of BDTs to science and medicine are difficult to overstate. BDTs are now used to identify new drug targets, design new therapeutics, and construct faster and less expensive drug synthesis techniques. There are already several AI-designed molecules in early-stage clinical trials.
Unfortunately, these same BDTs can be used for harm. They may be used to create pathogens that are more transmissible or virulent than known agents, target specific sub-populations, or evade existing DNA synthesis screening mechanisms. Moreover, developments in other classes of AI systems portend reduced barriers to BDT misuse. One group at RAND Corporation found that language models could provide guidance that could assist in planning and executing a biological attack, and another group from MIT demonstrated how language models could be used to elicit instructions for synthesizing a potentially pandemic pathogen. Similarly, language models could accelerate the acquisition or interpretation of information required to misuse BDTs. Technologies on the horizon, such as multimodal “action transformers,” could help individuals navigate BDT software, further lowering barriers to misuse.
Research points to several measures BDT developers could employ to reduce risks of misuse, such as securing machine learning model weights (the numerical values representing the learned patterns and information that the model has acquired during training), implementing structured access controls, and adopting Know Your Customer (KYC) processes. However, precautions would have to be taken not to unduly limit access to these tools, which could, in aggregate, impede scientific and medical advancement. For any given tool, access limitations risk diminishing its competitiveness (its available features and performance relative to other tools). These tradeoffs extend to developers’ interests: stifling the development of tools may jeopardize research, funding, and even career stability. The difficulties of striking a balance in managing risk are compounded by the decentralized, globally-distributed nature of BDT development communities. To suit their needs, risk-mitigation measures should involve minimal, if any, geographic or political restrictions on access while simultaneously expanding the ability to monitor for and respond to indicators of risk or patterns of misuse.
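Structured access with KYC can be thought of as a per-capability authorization gate that also produces an audit trail for misuse monitoring. The sketch below is a minimal illustration under assumed names (the user registry, capability labels, and `authorize` function are all hypothetical), not a description of any deployed BDT platform.

```python
# Hypothetical registry of KYC-verified users and the BDT capabilities
# each is permitted to invoke; entries here are illustrative only.
VERIFIED_USERS = {
    "alice@university.edu": {"design_small_molecules"},
}

def authorize(user: str, capability: str, audit_log: list) -> bool:
    """Grant access only to KYC-verified users with the capability enabled.

    Every request is logged, granted or not, so that operators can later
    monitor for indicators of risk or patterns of misuse.
    """
    granted = capability in VERIFIED_USERS.get(user, set())
    audit_log.append((user, capability, granted))
    return granted
```

Because the gate sits in hosted infrastructure rather than in locally distributed software, access decisions and logging remain enforceable even as the user base spans many countries, which is the property the compute-cloud proposal below relies on.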
One approach that would balance the simultaneous needs for accessibility and security would be for the federal government to establish a global compute cloud for academic research, bearing the costs of running servers and maintaining the security of the cloud infrastructure in the shared interests of advancing public safety and medicine. A compute cloud would enable developers to provide access to their tools through computing infrastructure managed—and held to specific security standards—by U.S. public servants. Such infrastructure could even expand access for researchers, including underserved communities, through fast-tracked grants in the form of computational resources.
However, if computing infrastructure is not designed to reflect the needs of the development community—namely, its global research community—it is unlikely to be adopted in practice. Thus, to fully realize the potential of a compute cloud among BDT development communities, access to the infrastructure should extend beyond U.S. borders. At the same time, the efforts should ensure the cloud has requisite monitoring capabilities to identify risk indicators or patterns of misuse and impose access restrictions flexibly. By balancing oversight with accessibility, a thoughtfully-designed compute cloud could enable transparency and collaboration while mitigating the risks of these emerging technologies.
Recommendations
The U.S. government should establish a federally-funded, globally-accessible compute cloud through which developers could securely provide access to BDTs. In fact, the Biden Administration’s October 2023 “Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence” (the “AI EO”) lays the groundwork by establishing a pilot program of a National AI Research Resource (NAIRR)—a shared research infrastructure providing AI researchers and students with expanded access to computational resources, high-quality data, educational tools, and user support. Moving forward, to increase the pilot program’s potential for adoption by BDT developers and users, relevant federal departments and agencies should take concerted action within the timelines prescribed by the AI EO to address the practical requirements of BDT development communities: the simultaneous need to expand access outside U.S. borders while bolstering the capacity to monitor for misuse.
It is important to note that a federally funded compute cloud has been years in the making. The National AI Initiative Act of 2020 directed the National Science Foundation (NSF), in consultation with the Office of Science and Technology Policy (OSTP), to establish a task force to create a roadmap for the NAIRR. In January 2023, the NAIRR Task Force released its final report, “Strengthening and Democratizing the U.S. Artificial Intelligence Innovation Ecosystem,” which presented a detailed implementation plan for establishing the NAIRR. The Biden Administration’s AI EO then directed the Director of NSF, in coordination with the heads of agencies deemed appropriate by the Director, to launch a pilot program “consistent with past recommendations of the NAIRR Task Force.”
However, the Task Force’s past recommendations are likely to fall short of the needs of BDT development communities (not to mention other AI development communities). In its report, the Task Force described NAIRR’s primary user groups as “U.S.-based AI researchers and students at U.S. academic institutions, non-profit organizations, Federal agencies or FFRDCs, or startups and small businesses awarded [Small Business Innovation Research] or [Small Business Technology Transfer] funding,” and its resource allocation process is oriented toward this user base. Separately, Stanford University’s Institute for Human-Centered AI (HAI) and the National Security Commission on Artificial Intelligence (NSCAI) have proposed institutions, building upon or complementing NAIRR, that would support international research consortiums (a Multilateral AI Research Institute and an International Digital Democracy Initiative, respectively), but the NAIRR Task Force’s report—upon which the AI EO’s pilot program is based—does not substantively address this user base.
In launching the NAIRR pilot program under Sec. 5.2(a)(i), the NSF should put the access and security needs of international research consortiums front and center, conferring with heads of departments and agencies with relevant scope and expertise, such as the Department of State, the U.S. Agency for International Development (USAID), the Department of Education, the National Institutes of Health, and the Department of Energy. The NAIRR Operating Entity (as defined in the Task Force’s report) should investigate how funding, resource allocation, and cybersecurity could be adapted to accommodate researchers outside of U.S. borders. In implementing the NAIRR pilot program, the NSF should incorporate BDTs in its development of guidelines, standards, and best practices for AI safety and security, per Sec. 4.1, which could serve as standards with which NAIRR users should be required to comply. Furthermore, the NSF Regional Innovation Engine launched through Sec. 5.2(a)(ii) should consider focusing on international research collaborations, such as those in the realm of biological design.
Besides the NSF, which is charged with piloting NAIRR, relevant departments and agencies should take concerted action in implementing the AI EO to address issues of accessibility and security that are intertwined with international research collaborations. This includes but is not limited to:
- In accordance with Sec. 5.2(a)(i), the departments and agencies listed above should be tasked with investigating the access and security needs of international research collaborations and include these in the reports they are required to submit to the NSF. This should be done in concert with the development of guidelines, standards, and best practices for AI safety and security required by Sec. 4.1.
- In fulfilling the requirements of Sec. 5.2(c-d), the Under Secretary of Commerce for Intellectual Property, the Director of the United States Patent and Trademark Office, and the Secretary of Homeland Security should, in the reports and guidance on matters related to intellectual property that they are required to develop, clarify ambiguities and preemptively address challenges that might arise in cross-border data use agreements.
- Under the terms of Sec. 5.2(h), the President’s Council of Advisors on Science and Technology should, in its development of “a report on the potential role of AI […] in research aimed at tackling major societal and global challenges,” focus on the nature of decentralized, international collaboration on AI systems used for biological design.
- Pursuant to Sec. 11(a-d), the Secretary of State, the Assistant to the President for National Security Affairs, the Assistant to the President for Economic Policy, and the Director of OSTP should focus on AI used for biological design as a use case for expanding engagements with international allies and partners, and establish a robust international framework for managing the risks and harnessing the benefits of AI. Furthermore, the Secretary of Commerce should make this use case a key feature of its plan for global engagement in promoting and developing AI standards.
The AI EO provides a window of opportunity for the U.S. to take steps toward mitigating the risks posed by BDT misuse. In doing so, it will be necessary for regulatory agencies to proactively seek to understand and attend to the needs of BDT development communities, which will increase the likelihood that government-supported solutions, such as the NAIRR pilot program—and potentially future fully-fledged iterations enacted via Congress—are adopted by these communities. By making progress toward reducing BDT misuse risk while promoting safe, secure access to cutting-edge tools, the U.S. could affirm its role as a vanguard of responsible innovation in 21st-century science and medicine.
Establish collaboration between developers of gene synthesis screening tools and AI tools trained on biological data
Shrestha Rath
Biological Design Tools (BDTs) are a subset of AI models trained on genetic and/or protein data and developed for use in the life sciences. These tools have recently seen major performance gains, enabling breakthroughs like accurate protein structure prediction by AlphaFold2 and addressing a longstanding challenge in the life sciences.
While promising for legitimate research, BDTs risk misuse without oversight. Because universal screening of gene synthesis is currently lacking, potential threat agents could be digitally designed with assistance from BDTs and then physically made using gene synthesis. BDTs pose particular challenges because:
- Growing numbers of gene synthesis orders evade current screening capabilities. Industry experts at gene synthesis companies report that a small but concerning portion of orders for synthetic nucleic acid sequences show little or no homology with known sequences in widely used genetic databases and so are not captured by current screening techniques. Advances in BDTs are likely to make such cases more common, exacerbating the risk of misuse of synthetic DNA. The combined use of BDTs and gene synthesis has the potential to aid both the “design” and “build” steps of malicious misuse. Strengthening screening capabilities to keep pace with advances in BDTs is an attractive early intervention point to prevent this misuse.
- Potential for substantial breakthroughs in BDTs. While BDTs for applications beyond protein design face significant challenges, and most are not yet mature, companies are likely to invest in generating data to improve these tools because they see significant economic value in doing so. Moreover, some AI experts speculate that if protein language models were trained with the same scale of computational resources used for large language models (LLMs), their performance could improve significantly. There is thus significant uncertainty about how rapidly BDTs will advance and how those advances may affect the potential for misuse.
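The homology gap described above can be illustrated with a toy example. Real screening pipelines rely on tools such as BLAST or hidden Markov models; the k-mer overlap score below is a deliberately simplified stand-in, and every sequence and name in it is hypothetical.

```python
# Toy illustration of homology-based screening: an order is flagged only
# if it resembles sequences already in a reference database, so a novel
# BDT-designed sequence with little overlap can slip through.

def kmers(seq: str, k: int = 6) -> set:
    """Return the set of length-k substrings of a DNA sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def homology_score(order: str, database: list, k: int = 6) -> float:
    """Fraction of the order's k-mers found in any known sequence."""
    known = set().union(*(kmers(s, k) for s in database))
    query = kmers(order, k)
    return len(query & known) / len(query) if query else 0.0

# Hypothetical reference database of known sequences.
database = ["ATGGCGTACGTTAGCCTA", "TTGACCGGTACGATCGGA"]

print(homology_score("ATGGCGTACGTTAGC", database))    # high: known k-mers
print(homology_score("CCCTTTAAAGGGCCCTTT", database))  # low: novel sequence
```

A purely homology-based screen would pass the second order without comment, which is the gap the bullet above describes.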
BDT development is currently concentrated among a handful of well-resourced academic labs and private AI companies, which makes policy implementation tractable, but development will decentralize over time. The U.S. government should take advantage of this unique window of opportunity to implement policy guardrails while the next generation of advanced BDTs is in development.
In short, it is important that developers of BDTs work together with developers and users of gene synthesis screening tools. This will promote shared understanding of the risks around potential misuse of synthetic nucleic acids, which may be exacerbated by advances in AI.
Section 4.4(b) of the Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence (henceforth referred to as the “Executive Order”) emphasizes mitigating risks from the “misuse of synthetic nucleic acids, which could be substantially increased by AI’s capabilities”.
Gene synthesis companies and/or organizations involved in developing gene screening mechanisms are henceforth referred to as “DNA Screeners.” Academic and for-profit stakeholders developing BDTs are henceforth referred to as “BDT Developers.” Gene synthesis companies providing synthetic DNA as a service, irrespective of their screening capabilities, are referred to as “DNA Providers.”
Recommendations
By bringing together key stakeholders (BDT developers, DNA screeners, and security experts) to share information and align on safety standards, the U.S. government can steer these technologies to maximize benefits and minimize widespread harms. Implementing these recommendations requires allocating financial resources and coordinating interagency work.
There are near-term and long-term opportunities to improve coordination between DNA Screeners, DNA Providers (those that use in-house screening mechanisms), and BDT Developers, including opportunities to avoid potential future backlash, over-regulation, and legal liability. As part of the implementation of Section 4.4 of the Executive Order, the Department of Energy and the National Institute of Standards and Technology should:
Recommendation 1. Convene BDT Developers and DNA Screeners, along with ethics, security, and legal experts, to a) share information on AI model capabilities and their implications for DNA sequence screening; and b) facilitate discussion on shared safeguards and security standards. Technical security standards may include adversarial training to make AI models robust against purposeful misuse, BDTs refusing user requests when the requested action may be harmful (refusals and blacklisting), and maintaining user logs.
Recommendation 2. Create an advisory group to investigate metrics that measure the performance of protein BDTs for DNA Screeners, in line with Section 4.4(a)(ii)(A) of the Executive Order. Metrics that capture BDT performance, and thus the risk posed by advanced BDTs, would give DNA Screeners helpful context while screening orders. For example, some current methods for benchmarking AI-enabled protein design focus on sequence recovery, where the backbones of natural proteins with known amino-acid sequences are passed as input and the accuracy of the method is measured by the identity between the predicted sequence and the true sequence.
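The sequence-recovery metric described above reduces to a simple per-position comparison. The sketch below shows that calculation; the sequences used are hypothetical and much shorter than real proteins.

```python
# Minimal sketch of sequence recovery: the fraction of positions at which
# the sequence predicted for a natural protein's backbone matches the
# protein's true amino-acid sequence.

def sequence_recovery(predicted: str, native: str) -> float:
    """Per-position identity between predicted and native sequences."""
    if len(predicted) != len(native):
        raise ValueError("sequences must align position-for-position")
    matches = sum(p == n for p, n in zip(predicted, native))
    return matches / len(native)

native    = "MKTAYIAKQR"
predicted = "MKTAYLAKHR"  # differs at positions 6 (I->L) and 9 (Q->H)
print(sequence_recovery(predicted, native))  # 0.8
```

Higher recovery indicates a design method that more faithfully reproduces natural sequences from structure, which is one proxy for its overall capability.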
Recommendation 3. Support and fund development of AI-enabled DNA screening mechanisms that will keep pace with BDTs. U.S. national laboratories should support these efforts because commercial incentives for such tools are lacking. IARPA’s Fun GCAT program (Functional Genomic and Computational Assessment of Threats) is an exemplary case in this regard.
Recommendation 4. Conduct structured red teaming of current DNA screening methods to ensure they account for functional variants of Sequences of Concern (SOCs) that may be developed with the help of BDTs and other AI tools. Such red teaming exercises should include expert stakeholders involved in the development of screening mechanisms as well as the national security community.
Recommendation 5. Establish both policy frameworks and technical safeguards for identification of certifiable origins. Orders for synthetic nucleic acids designed with BDTs could be required to include a cryptographically signed certificate detailing the inputs used in the design process, ultimately providing contextual information that helps DNA Screeners check for harmful intent captured in the requests made to the model.
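The signed-certificate idea above can be sketched in a few lines. A production system would use asymmetric signatures (e.g., Ed25519) with a real public-key infrastructure; this sketch uses an HMAC with a shared secret purely for illustration, and every field name and key below is hypothetical.

```python
# Toy sketch of a signed "design certificate": the BDT attaches a
# signature over the design inputs, and a DNA Screener verifies it
# before processing the order. Tampering with the record breaks
# verification, so the contextual information can be trusted.
import hashlib
import hmac
import json

SECRET = b"demo-key-shared-between-bdt-and-screener"  # hypothetical

def issue_certificate(design: dict) -> dict:
    """BDT side: serialize the design inputs and attach a signature."""
    payload = json.dumps(design, sort_keys=True).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return {"design": design, "signature": sig}

def verify_certificate(cert: dict) -> bool:
    """Screener side: recompute the signature and compare in constant time."""
    payload = json.dumps(cert["design"], sort_keys=True).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cert["signature"])

cert = issue_certificate({"model": "example-bdt-v1",
                          "prompt": "stabilize enzyme X",
                          "sequence": "MKTAYIAKQR"})
print(verify_certificate(cert))            # untampered certificate verifies

cert["design"]["sequence"] = "MKTAYIAKQW"  # tampering breaks verification
print(verify_certificate(cert))
```

The design choice to verify with `hmac.compare_digest` avoids timing side channels; the policy questions (who holds keys, what inputs must be disclosed) remain for the frameworks the recommendation calls for.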
Recommendation 6. Fund third-party evaluations of BDTs to determine how their use might affect DNA sequence screening, and provide this information to those performing screening. These evaluations would be helpful for new, small, and growing DNA Providers and would alleviate the burden on established DNA Providers as screening capabilities become more sophisticated. A similar system exists in the automobile industry, where insurance providers conduct their own car safety and crash tests to inform premium-related decisions.
The proliferation of open-source tools accelerates innovation and the democratization of AI, but it is also a growing concern in the context of biological misuse. The recommendations here strengthen biosecurity screening at a key point in the pathway by which these risks could be realized. This framework could be implemented alongside other approaches to reducing the risks that arise from BDTs, including: introducing Know Your Customer (KYC) frameworks that monitor buyers and users of AI tools; requiring BDT developers to undergo training on assessing and mitigating the dual-use risks of their work; and encouraging voluntary guidelines to reduce misuse risks, for instance by employing model evaluations prior to release and refraining from publishing preprints or releasing model weights until such evaluations are complete. This multi-pronged approach can help ensure that AI tools are developed responsibly and that biosecurity risks are managed.
Responsible and Secure AI in Production Agriculture
Jennifer Clarke
Agriculture, food, and related industries represent over 5% of domestic GDP. The health of these industries has a direct impact on domestic food security, which in turn has a direct impact on national security. In other words, food security is biosecurity is national security. As the world population continues to grow and climate change brings new challenges to agricultural production, we need an efficiency and productivity revolution in agriculture. This means using less land and fewer natural resources to produce more food and feed. For decision-makers in agriculture, labor shortages and narrow economic margins are driving interest in automation and in the appropriate use of AI to increase productivity while decreasing waste amid rising costs.
Congress should provide funding to support the establishment of a new office within the USDA to coordinate, enable, and oversee the use of AI in production agriculture and agricultural research.
The agriculture, food, and related industries are turning to AI technologies to enable automation and drive the adoption of precision agriculture technologies. The use of AI in agriculture often depends on proprietary approaches that have not been validated by an independent, open process. In addition, it is unclear whether AI tools aimed at the agricultural sector will address critical needs as identified by the producer community. This creates the potential for detrimental recommendations and loss of trust across producer communities. Both outcomes will impede adoption of precision agriculture technologies, which is necessary for sustainable domestic food security.
The industry is promoting AI technologies to help yield healthier crops, control pests, monitor soil and growing conditions, organize data for farmers, reduce workloads, and improve a wide range of agriculture-related tasks across the entire food supply chain.
However, the use of networked technologies in agriculture poses risks, and AI could add to these risks if not implemented carefully. For example, the use of biased or irrelevant data in AI development can result in poor performance, which erodes producer trust in both extension services and expert systems, hindering adoption. As adoption increases, it is likely that farmers will use a small number of available platforms; this creates centralized points of failure where a limited attack can cause disproportionate harm. The 2021 cyberattack on JBS, the world’s largest meat processor, and a 2021 ransomware attack on NEW Cooperative, which provides feed grains for 11 million farm animals in the United States, demonstrate the potential risks from agricultural cybersystems. Without established cybersecurity standards for AI systems, those systems with broad adoption across agricultural sectors will represent targets of opportunity.
As evidenced by the recent Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence and the AI Safety Summit held at Bletchley Park, considerable interest and attention are being given to AI governance and policy by both national and international regulatory bodies. There is a recognition that the risks of AI require more attention and investment in both technical and policy research.
This recognition dovetails with an increase in emphasis on the use of automation and AI in agriculture to enable adoption of new agricultural practices. Increased adoption in the short term is required to reduce greenhouse gas emissions and ensure sustainability of domestic food production. Unfortunately, trust in commercial and governmental entities among agricultural producers is low and has been eroded by corporate data policies. Fortunately, this erosion can be reversed by prompt action on regulation and policy that respects the role of the producer in food and national security. Now is the time to promote the adoption of best practices and responsible development to establish security as a habit among agricultural stakeholders.
Recommendations
To ensure that the future of domestic agriculture and food production leverages the benefits of AI while mitigating the risks of AI, the U.S. government should invest in institutional cooperation; AI research and education; and development and enforcement of best practices.
Recommendation: An Office focused on AI in Production Agriculture should be established within USDA, and Congress should appropriate $5 million per year over the next 5 years, for a total of $25 million, for this office. Cooperation among multiple institutions (public, private, nonprofit) will be needed to provide oversight of the behavior of AI in production agriculture, including the impact of non-human algorithms and data sharing agreements (“the algorithmic economy”). This level of funding will encourage both federal and non-federal partners to engage with the Office and support its mission. The Office should establish and take direction from an advisory body led by USDA with inclusive representation across stakeholder organizations, including industry (e.g., AgGateway, Microsoft, John Deere), nonprofit organizations (e.g., AgDataTransparent, American Farmland Trust, Farm Bureaus, Ag Data Coalition, Council for Agricultural Science and Technology (CAST), ASABE, ISO), government (e.g., NIST, OSTP), and academia (e.g., APLU, Ag Extension). This advisory body would operate under the Federal Advisory Committee Act (FACA) to identify challenges and recommend solutions, such as regulations or other oversight specific to agricultural use of AI (including data use agreements and third-party validation) that reduce uncertainty about risk scenarios and the effects of countermeasures. The Office and its advisory body can solicit broad input on regulation, necessary legislation, incentives and reforms, and enforcement measures through Requests for Information and Dear Colleague letters. The Office should also promote the best practices described below, i.e., incentivize responsible use and adoption through equitable data governance, access, and public-private partnerships. One example of an incentive is providing rebates to producers who purchase equipment that uses validated AI technology.
To support the development of best practices for the use of AI in production agriculture, the proposed Office should, in partnership with NIH, NSF, DOD, and DOE, coordinate funding for research and education on the sociotechnical context of AI in agriculture across foundational disciplines, including computer science, mathematics, statistics, psychology, and sociology. This new discipline of applied AI (built on theoretical advances in AI since the 1950s) should provide a foundation for developing best practices for responsible AI development, starting with general, accepted standards (e.g., NIST’s framework). For example, best practices may include transparency through the open-source community and independent validation processes for models and software. AI model training requires an immense amount of data, and AI models for agriculture will require many types of data sets specific to production systems (e.g., weather, soil, management practices). There is an urgent need for standards around data access and use that balance the advancement and adoption of precision agriculture with privacy and cybersecurity concerns.
In support of the work of the proposed Office, Congress should appropriate funding at $20M/year to USDA to support the development of programs at land-grant universities that provide multidisciplinary training in AI and production agriculture. The national agricultural production cyberinfrastructure (CI) has become critical to food security and carbon capture in the 21st century. A robust talent pipeline is necessary to support, develop, and implement this CI in preparation for the growth in automation and AI. There is also a critical need for individuals trained in both AI and production agriculture who can lead user-centered design and digital services on behalf of producers. Training must include foundational knowledge of statistics, computer science, engineering, and agricultural sciences, coupled with experiential learning that provides trainees with opportunities to translate their knowledge to address current CI challenges. These opportunities may arise from interagency cooperation at the federal, state, and local levels, in partnership with grower cooperatives, farm bureaus, and land-grant universities, to ensure that training meets pressing and future needs in agricultural systems.
Connecting Utility-Scale Renewable Energy Resources with Rural-Urban Transmission
There is a vast amount of wind and solar power ready to be harvested and moved to market across the United States, but it must be connected through long-distance transmission to protect against the instability caused by intermittency. Strategically placed long-distance transmission also ensures that rural and urban populations alike benefit economically from the transition to clean energy.
The Biden-Harris Administration should facilitate the transition to a clean grid by aggressively supporting utility-scale renewable energy resources in rural areas that are connected to urban centers through modernized high-voltage direct current (HVDC) transmission. To move toward total electrification and a decarbonized grid, the Department of the Interior (DOI) and the Bureau of Land Management (BLM) must encourage renewable energy production on federal land through the BLM’s multiple-use mandate. BLM must work in tandem with the Department of Energy (DOE), Department of Transportation (DOT), and the Federal Energy Regulatory Commission (FERC) to transport increased clean power generation through newly constructed HVDC lines that can handle this capacity.
This two-pronged approach will move power from high-generation, low-demand rural areas to low-generation, high-demand (often coastal) urban hubs. As residents in the East arrive home from work and turn on their TVs, the sun is still up in the West and can provide for their energy needs. As residents in the Northwest wake up, grind coffee, and tune into the news, they can rely on power from the Midwest, where the wind is blowing.
Challenge and Opportunity
Utility-Scale Renewable Energy Development on Federal Land
After taking office, the Biden-Harris Administration rejoined the Paris Climate Agreement and committed the United States to reduce greenhouse gas (GHG) emissions by 50–52% below 2005 levels by 2030. The Inflation Reduction Act (IRA) is a positive step toward meeting these GHG emissions goals. The IRA allocated $369 billion to climate and energy security investments, which should be used to bolster development of renewables on federal lands. Together with the Infrastructure Investment and Jobs Act, this funding affords an enormous opportunity.
Building utility-scale renewable energy infrastructure such as wind or solar requires a vast amount of space. A utility-scale solar power plant can require between 5 and 10 acres of land per megawatt of generating capacity, with each megawatt producing enough energy to power approximately 173 homes.
The federal government owns a vast amount of land, some of which is viable for wind and solar. To be exact, the federal government owns 640 million acres of land (nearly one-third of all U.S. land), which is managed by the Bureau of Land Management (BLM), the Fish and Wildlife Service (FWS), the National Park Service (NPS), the Forest Service (USFS), and the Department of Defense (DOD).
Land owned by the BLM (245 million acres) and the USFS (193 million acres) falls under similar multiple-use, sustained-yield mandates. The majority of those combined 438 million acres fall under BLM jurisdiction, and those BLM lands are the concern of this memo. According to the Federal Land Policy and Management Act of 1976 (FLPMA), resources and uses on those federal lands must be used in a balanced combination that “best meets present and future needs of the American people.” This multiple-use mandate presents an enormous opportunity for deployment of utility-scale renewable energy resources. The BLM manages over 19 million acres of public lands with excellent solar potential across six states and 20.6 million acres of public lands with excellent wind potential. This land is ripe for utility-scale renewable energy generation and will be critical to achieving the nation’s decarbonization goals. Green energy generation on these lands should be prioritized.
Together, the 15 central U.S. states account for the majority of national wind and solar technical potential. However, these states are projected to account for only a third of the nation’s electrical demand in 2050. Population-dense and predominantly coastal cities have higher energy demand, while the Midwest and Southwest are dominated by rural communities and public land. Transmission lines are needed to transport renewable energy from these central states to the urban centers with large energy markets.
Transmission Development on a Rural-Urban Grid
The U.S. grid is split into three regions: the Western Interconnection, the Eastern Interconnection, and the ERCOT Interconnection (Texas). These three regions are only minimally connected nationally, regionally, or even through interstate connections due to intense localism on the part of utilities that are not financially incentivized to engage in regional transmission. There are three key utility ownership models in the United States: private investor-owned utilities (IOUs), public power utilities owned by states or municipalities, and nonprofit rural electric cooperatives (co-ops).
The Federal Energy Regulatory Commission is an independent agency that regulates the interstate transmission of electricity. In this capacity, it ensures that regional goals are established and met. Two types of entities established by FERC, regional transmission organizations (RTOs) and independent system operators (ISOs), help to coordinate regional transmission across utilities. RTOs are voluntary bodies of utilities that streamline and coordinate regional transmission initiatives and objectives. ISOs are independent and federally regulated entities that coordinate regional transmission to ensure nondiscriminatory access and streamline regional goals. ISOs and RTOs are similar, but RTOs generally have jurisdiction over a larger geographic area. Two-thirds of the nation’s electricity load is served in ISO/RTO regions. The remainder of the energy market is dominated by vertically integrated utilities that manage both transmission and distribution.
Establishing more connections among the three regional grids will support renewable energy development, reduce GHG emissions, save consumers money, increase resilience, and create jobs. Connecting the power grid across states and time zones is also vital to peak load control. Greater connection mitigates the inherent instability of renewables: if clouds cover the sun in the East, winds will still blow in the Midwest. If those winds die, water will still flow in the Northwest’s rivers.
The best way to connect regional and local grids is through high-voltage direct current (HVDC) electrical transmission systems. HVDC transmission allows the direct current (DC) transfer of power over long distances with lower losses than alternating current (AC) transmission.
There is precedent and forward momentum on developing interstate transmission, including projects like SunZia in the Southwest, TransWest Express in the Mountain West, Grain Belt Express in the Midwest, and Champlain Hudson Power Express in the Northeast. The Midcontinent Independent System Operator (MISO) recently approved $10.3 billion in regional HVDC lines, a move that is projected to generate up to $52.2 billion in net benefits through mitigated blackouts and increased fuel savings.
Though co-ops account for the smallest percentage of utilities (there are 812 total), they are found in the primarily rural Midwest, where there is high generation potential for solar and wind energy. Here, utility participation in RTOs is low. FERC has expressed disinterest in mandating RTO participation and in taking punitive action. However, it can incentivize regional planning through RTO membership or, where unappealing to local utilities, incentivize regional transmission investment through joint ownership structures.
The Biden-Harris Administration has taken the first steps to address these issues, such as releasing an Action Plan in 2022 to encourage federal agencies to expedite the permitting process for renewable energy. The president should expand on the existing Action Plan to build a larger coalition of contributors and should also pursue the following recommendations to make the clean-energy transition as efficient as possible. Achieving the Biden-Harris Administration’s decarbonization targets requires the tandem development of rural utility-scale renewable energy and regional HVDC transmission to carry this energy to urban centers, benefiting people and economies across the United States.
Plan of Action
Recommendation 1. BLM should prioritize renewable energy permit awards near planned HVDC transmission lines and existing rights-of-way.
BLM reported that, compared to FY20, it has increased renewable energy permitting activities by 35%, supporting the development of 2,898 MW of onshore solar, wind, and geothermal energy generation capacity. In 2021, BLM received 130 proposals for renewable energy generation projects on public lands and six applications for interconnected transmission lines. The transmission line proposals would support 17 GW of energy and would also support the transmission of renewable energy generated on non-federal land across the Southwest.
DOI can directly support renewable energy generation by instructing BLM to prioritize renewables when awarding contracts under the multiple-use, sustained-yield mandate. Though Section 50265 of the IRA mandates that oil and gas leases must continue, DOI can plan with an eye to the future. Renewables built on public lands should be constructed in areas closest to planned HVDC transmission, including but not limited to Kansas, Wyoming, and New Mexico. Renewables should take precedence over coal, oil, and natural gas in areas where construction or upgrades of HVDC transmission lines are planned, and near railways and federal highways, where HVDC transmission is more easily implemented. Contracts for renewables near planned HVDC interstate transmission lines and existing rights-of-way like railways and highways should be given precedence in the awards process. This will prime the grid for the Biden-Harris Administration’s decarbonization goals and ensure that oil and gas generation is situated closer to legacy lines that are more likely to be retired sooner. DOI faces unique considerations due to Section 50265 of the IRA, but it can still coordinate with other federal agencies to manage those constraints and judiciously prioritize transmission-adjacent renewable energy generation sites.
Recommendation 2. FERC should incentivize regional transmission planning by encouraging federal-local partnerships, introducing joint-ownership structures, and amending Order 1000.
FERC should encourage RTOs to prioritize regional transmission planning in order to meet decarbonization goals and accommodate an influx of cheaper, cleaner energy into their portfolios. The FERC-NARUC Task Force is a good starting point for this cooperation and should be expanded upon. This federal-state task force on electric transmission is a good blueprint for how federal objectives for regional planning can work hand-in-hand with local considerations. FERC can highlight positive cases like SB448 in Nevada, which incentivizes long-distance transmission and mandates the state’s participation in an RTO by 2030. FERC should encourage utility participation in RTOs but emphasize that long-distance transmission planning and implementation is the ultimate objective. Where RTO participation is not feasible, FERC can incentivize utility participation in regional transmission planning in other ways.
FERC should incentivize utility participation in regional transmission by encouraging joint-ownership structures, as explored in a 2019 incentives docket. In March 2019, FERC released a Notice of Inquiry seeking comments on “the scope and implementation of its electric transmission incentives regulations and policy.” Commenters supported non-public utility joint-ownership promotion, including equity in transmission lines that can offset customer rates, depending on the financing structure. In February 2023, FERC approved incentives for two of Great River Energy’s interstate transmission projects, in which it will own a 52.3% stake of the Minnesota Iron Range project and 5% of the Big Stone project. In the Iron Range project, Great River can use a 50% equity and 50% debt capital structure, placing the construction expenses on its rate base. The cash flow generated by this capital structure is necessary for the completion of this interstate transmission line, and FERC should encourage similar projects and incentives.
FERC should amend Order 1000—Transmission Planning and Cost Allocation. As former Commissioner Glick has noted, Order 1000 in its current iteration unintentionally encourages the construction of smaller lines over larger-scale regional transmission lines because utilities prefer not to engage in potentially lengthy, expensive competition processes. In April 2022, FERC published a Notice of Proposed Rulemaking (NOPR), which, among other things, attempts to address this perverse incentive by amending the order “to permit the exercise of a federal rights of first refusal for transmission facilities selected in a regional transmission plan for purposes of cost allocation, conditioned on the incumbent transmission provider establishing joint ownership of those facilities.” Amending this rule and allowing a federal ROFR for joint-ownership structures will encourage partnerships, spread risk across more parties, and open large projects to investors for whom the required capital outlay has traditionally been insurmountable. The NOPR also encouraged long-term regional transmission planning and improved coordination between local and regional entities and implementation goals. The amendment was supported by both utilities and environmental groups. The public comment period closed in summer 2022. Now, over a year later, FERC should act quickly to issue a final rule amending Order 1000.
In addition to incentivizing more regionally focused transmission planning at the utility level, federal agencies should work together to ensure that HVDC lines are strategically placed to facilitate the delivery of renewable energy to large markets.
Recommendation 3. The Biden-Harris Administration should encourage the Department of Transportation to work with the Grid Deployment Office (GDO) and approve state DOT plans for HVDC lines along existing highways and railroads.
In 2021, the Federal Highway Administration (FHWA) released a memorandum providing guidance that state departments of transportation may leverage “alternative uses” of existing highway rights of way (ROW), including for renewable energy, charging stations, transmission lines, and broadband projects, and that the FHWA may approve alternative uses for ROWs so long as they benefit the public and do not impair traffic. The GDO, created by the Biden-Harris Administration, should work directly with state DOTs to plan for future interstate lines. As these departments coordinate, they should use a future highway framework characterized by increased electric vehicle (EV) usage, increased EV charging station needs, and improved mass transit. This framework will allow DOT to reinterpret what constitutes impairing the “free and safe flow of traffic.” The FHWA should encourage state DOTs to use the SOO Green HVDC Link as a blueprint. The idea of reconciling siting issues by building transmission lines along existing rights-of-way such as highways or railroads is known to this administration, as evidenced by President Biden’s reference in a 2022 White House Statement and by FERC’s June 2020 report on barriers and opportunities for HVDC transmission.
Recommendation 4. DOI, the Department of Agriculture (USDA), DOD, DOE, and the Environmental Protection Agency (EPA) should sign a new Memorandum of Understanding (MOU) that builds on their 2022 MOU but includes DOT.
In 2022, DOI, USDA, DOD, DOE, and the EPA signed an MOU that would expedite the review process of renewable energy projects on federal lands. DOT, specifically its FHWA and Federal Railroad Administration (FRA), should be included in this memorandum. The president should direct these agencies to sign a second MOU to work together to create a regional and national outline for future transmission lines and prioritize permit requests that align with that outline. This new MOU should add DOT and illustrate the specific ways that FHWA and FRA can support the MOU’s goals by repurposing existing transportation rights-of-way.
Recommendation 5. All future covered transmission planning should align with the MOU proposed in Recommendation 4.
Under Section 50152 of the IRA, the DOE received $760 million to distribute federal grants for the development of covered transmission projects. Section 50153 appropriates an additional $100 million to DOE, specifically tailored to wind electricity planning and development, both offshore and interregional. The DOE should require that all transmission planning using this federal funding align with the long-term outline created under the MOU recommended above. Additionally, preference should be given to federally funded transmission lines that link utility-scale renewable energy projects with large urban centers.
Recommendation 6. The EPA should fund technical and educational training for rural and disadvantaged communities that might benefit from an influx of high-demand green energy jobs.
The federal government should leverage existing funding to ensure that rural and disadvantaged communities directly benefit from economic development opportunities facilitated by the clean energy transition. The EPA should use funds from Section 60107 of the IRA to provide technical and educational assistance to low-income and disadvantaged communities in the form of job training and planning. EPA funding can be used to ensure that local communities have the technical knowledge to take advantage of the jobs and opportunities created by projects like the SOO Green HVDC Link. Because this section of the IRA funds only up to $17 million in job training, these funds should be allocated to supplement community colleges and other technical training programs that have established curricula and expertise.
To ensure that efforts are successful in the long term, federal agencies, utilities, and other stakeholders must have access to accurate and current information about transmission needs nationwide.
Recommendation 7. Congress should fund regular updates to existing future transmission needs studies.
Congress must continue to approve future research into both halves of the electrification equation: generation and transmission. Congress already approved funding for the NREL Electrification Futures Study and the NREL Interconnections Seam Study, both published in 2021. These studies allow NREL to determine best-case scenario models and then communicate its research to the RTOs that are best positioned to help IOUs plan for future regional transmission. These studies also guide FERC and the GDO as they determine best-case scenarios for linking rural clean energy resources to urban energy markets.
In addition, Congress must continue to fund the GDO National Transmission Needs Study, which was funded by the Bipartisan Infrastructure Law (BIL). This study researches capacity constraints and congestion on the transmission grid and will help FERC and RTOs determine where future transmission should be planned in order to relieve pressure and meet needs. The final Needs Study was issued in summer 2023, but it must be updated on a regular basis if the country is to actively move toward grid coordination.
The Summer 2023 Needs Study included, for the first time, modeling and discussion of anticipated future capacity constraints and transmission congestion. As the grid continues to evolve and different types of renewable energy are integrated into the grid, future needs studies should continue to include forward-looking models under a variety of renewable energy scenarios.
Conclusion
The Biden-Harris Administration has rejoined the Paris Climate Agreement, affirming its commitment to significant decarbonization goals. To achieve this end, the administration must follow a two-pronged approach that facilitates the installation of utility-scale renewable energy on public lands in the Midwest and Southwest and expedites the implementation of HVDC transmission lines that will link these resources to urban energy markets.
It is impossible to meet the Biden-Harris Administration climate goals without drastic action to encourage further electrification, renewable energy development, and transmission planning. Fortunately, these actions are ripe for bipartisan coordination and are already supported through existing laws like the IRA and BIL. These recommendations will help meet these goals and secure a brighter future for Americans across the rural-urban divide.
FERC has made recent strides toward encouraging transmission modernization through Order No. 2023. While this rule primarily addresses the “largest interconnection queue size in history” and takes steps to accelerate the interconnection process, it does not address the lack of transmission capacity and infrastructure nationally. Order No. 2023 is a vital step forward in interconnection process modernization, and it should be the first of many toward large-scale transmission planning.
As of November 2021, BLM-managed lands hosted 12 GW of renewable generation capacity across 121 permitted projects: 36 wind, 37 solar, and 48 geothermal. To put this number into perspective, 1 GW is enough to power approximately 750,000 homes. Helpfully, BLM maintains a list of planned and approved renewable energy projects on its lands. Additionally, the Wilderness Society maintains an interactive map of energy projects on public lands.
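As a back-of-the-envelope check, the conversion above implies the following scale (a sketch; the 750,000-homes-per-GW figure is the rough approximation used in this memo):

```python
# Rough scale check using the figures cited above.
HOMES_PER_GW = 750_000     # approximate homes powered per 1 GW (memo's figure)
blm_renewable_gw = 12      # BLM-managed renewable capacity as of Nov. 2021

homes_equivalent = blm_renewable_gw * HOMES_PER_GW
print(f"{blm_renewable_gw} GW is roughly {homes_equivalent:,} homes")  # 9,000,000 homes
```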
In contrast, BLM manages over 37,000 oil and gas leases, including over 96,000 wells.
Due to their high renewable-energy development potential, Midwest and Southwest states stand to disproportionately gain from a clean energy jobs boom in the fields of construction, management, and the technical trades. Given the West’s and Northeast’s desire for a decarbonized grid and their comparatively greater energy use, these states will benefit by receiving greater amounts of renewable energy to meet their energy needs and decarbonization goals.
The United States lags in the number of HVDC transmission lines, particularly compared to China and Europe. In 2022, only 552 miles of high-voltage transmission were added in the United States. Currently, four regional transmission lines are proposed, two of which are expected to begin construction this year. Of these planned lines, three are in the Midwest and Southwest, and one is in the Northeast. While this is progress, China has recently invested $26 billion in a national network of ultra-high-voltage lines.
Five agencies manage federal land: BLM, the U.S. Forest Service (USFS, housed within USDA), FWS, NPS, and DOD. However, only BLM and USFS operate under multiple-use, sustained-yield mandates (BLM’s under FLPMA), and their land-use mandates are similar. The other agencies’ mandates require them to protect and conserve animals and plants, promote tourism and engagement with public lands, and manage military installations and bases. For this reason, BLM and USFS are the best candidates for developing utility-scale renewable energy resources under their specific mandates. This memo focuses on the larger of those entities, BLM, which has greater potential for substantial renewable energy development and an established permitting system. As a joint USFS and NREL study discusses, the study of renewable-energy resource construction on national forest system lands is still in its early stages, whereas BLM’s policies and systems are well developed.
It is not within the scope of this memo to address issues specific to Tribal lands. However, various federal agencies offer clean energy funding specifically for Tribes, such as the Tribal Energy Loan Guarantee Program. If desired by Tribal communities, the U.S. government should prioritize funding for HVDC transmission lines that link Tribal power generation to Tribal population centers and utility grids. For Tribes seeking guidance on implementing utility-scale projects, Navajo Nation can serve as one model. Navajo Nation has the highest solar potential of any Tribal land in the country. It has successfully constructed the Kayenta Solar Project (55 MW) and has finalized leases for the Cameron Solar Plant (200 MW) and the Red Mesa Tapaha Solar Generation Plant (70 MW). The Cameron project alone will generate $109 million for Tribal coffers over the next 30 years through tax revenue, lease payments, and energy transmission payments. Another example is the solar energy portfolio of the Moapa Band of Paiute Indians. The Tribe manages a growing portfolio of utility-scale solar projects, including the 250 MW Moapa Southern Paiute Solar Project, the first utility-scale solar installation on Tribal land. Currently under development are the Arrow Canyon Solar Project, the Southern Bighorn Solar Project, and the Chuckwalla Solar Projects, all of which feature joint ownership between Tribal, federal, and private stakeholders.
Engaging Coal Communities in Decarbonization Through Nuclear Energy
The United States is committed to the ambitious goal of reaching net-zero emissions globally by 2050, requiring rapid deployment of clean energy domestically and across the world. Reducing emissions while meeting energy demand requires firm power sources that produce energy at any time and in adverse weather conditions, unlike solar or wind energy. Advanced nuclear reactors, the newest generation of nuclear power plants, are firm energy sources that offer potential increases in efficiency and safety compared to traditional nuclear plants. Adding more nuclear power plants will help the United States meet energy demand while reducing emissions. Further, building advanced nuclear plants on the sites of former coal plants could create benefits for struggling coal communities and result in significant cost savings for project developers. Realizing these benefits for our environment, coal communities, and utilities requires coordinating and expanding existing efforts. The Foundation for Energy Security and Innovation (FESI), the U.S. Department of Energy (DOE), and Congress should each take actions to align and strengthen advanced nuclear initiatives and engagement with coal communities in the project development process.
Challenge and Opportunity
Reducing carbon emissions while meeting energy demand will require the continued use of firm power sources. Coal power, once a major source of firm energy for the United States, has declined since 2009, due to federal and state commitments to clean energy and competition with other clean energy sources. Power generated from coal plants is expected to drop to half of current levels by 2050 as upwards of 100 plants retire. The DOE found that sites of retiring coal plants are promising candidates for advanced nuclear plants, considering the similarities in site requirements, the ability to reuse existing infrastructure, and the overlap in workforce needs. Advanced nuclear reactors are the next generation of nuclear technology that includes both small modular reactors (SMRs), which function similarly to traditional light-water reactors but on a smaller site, and non-light-water reactors, which are also physically smaller but use different methods to control reactor temperature. However, the DOE’s study and additional analysis from the Bipartisan Policy Center also identified significant challenges to constructing new nuclear power plants, including the risk of cost overrun, licensing timeline uncertainties, and opposition from communities around plant sites. Congress took steps to promote advanced nuclear power in the Inflation Reduction Act and the CHIPS and Science Act, but more coordination is needed. To commercialize advanced nuclear to support our decarbonization goals, the DOE estimates that utilities must commit to deploying at least five advanced nuclear reactors of the same design by 2025. There are currently no agreements to do so.
The Case for Coal to Nuclear
Coal-dependent communities and the estimated 37,000 people working in coal power plants could benefit from the construction of advanced nuclear reactors. Benefits include the potential addition of more than 650 jobs, about 15% higher pay on average, and the ability for some of the existing workforce to transition without additional experience, training, or certification. Jobs in nuclear energy also experience fewer fatal accidents, minor injuries, and harmful exposures than jobs in coal plants. Advanced nuclear energy could revitalize coal communities, which have suffered labor shocks and population decline since the 1980s. By embracing advanced nuclear power, these communities can reap economic benefits and create a pathway toward a sustainable and prosperous future. For instance, in one DOE case study, replacing a 924 MWe coal plant with nuclear was projected to increase regional economic activity by $275 million. Before benefits are realized, project developers must partner with local communities and other stakeholders to align interests and gain public support so that they may secure agreements for coal-to-nuclear transition projects.
Communities near existing nuclear plants tend to view nuclear power more favorably than communities without one, but gaining acceptance to construct new plants in communities less familiar with nuclear energy is challenging. Past efforts using a top-down approach were met with resistance and created a legacy of mistrust between communities and the nuclear industry. Stakeholders can slow or stop nuclear construction through lawsuits and lengthy studies under the National Environmental Policy Act (NEPA), and 12 states have restrictions or total bans on new nuclear construction. Absent changes to the licensing and regulatory process, project developers must mitigate this risk through a process of meaningful stakeholder and community engagement. A just transition from coal to nuclear energy production requires developers to listen and respond to local communities’ concerns and needs through the process of planning, siting, licensing, design, construction, and eventual decommissioning. Project developers need guidance and collective learning to update the siting process with more earnest practices of engagement with the public and stakeholders. Coal communities also need support in transitioning a workforce for nuclear reactor operations.
Strengthen and Align Existing Efforts
Nuclear energy companies, utilities, the DOE, and researchers are already exploring community engagement and considering labor transitions for advanced nuclear power plants. NuScale Power, TerraPower, and X-energy are leading in both the technical development of advanced nuclear and in considerations of community benefits and stakeholder management. The Utah Associated Municipal Power Systems (UAMPS), which is hosting NuScale’s demonstration SMR, spent decades engaging with communities across 49 utilities over seven states before signing an agreement with NuScale. Its Carbon Free Power Project involved over 200 public meetings, resulting in several member utilities choosing to pursue SMRs. Universities are collaborating with the Idaho National Laboratory to analyze energy markets using a multidisciplinary framework that considers community values, resources, capabilities, and infrastructure. Coordinated efforts by researchers near the TerraPower Natrium demonstration site investigate how local communities view the cost, benefits, procedures, and justice elements of the project.
The DOE also works to improve stakeholder and community engagement across multiple offices and initiatives. Most notably, the Office of Nuclear Energy is using a consent-based siting process, developed with extensive public input, to select sites for interim storage and disposal of spent nuclear fuel. The office distributed $26 million to universities, nonprofits, and private partners to facilitate engagement with communities considering the costs and benefits of hosting a spent fuel site. DOE requires all recipients of funds from the Infrastructure Investment and Jobs Act and the Inflation Reduction Act, including companies hosting advanced nuclear demonstration projects, to submit community benefits plans outlining community and labor organization engagement. The DOE’s new Commercial Liftoff Reports for advanced nuclear and other clean energy technologies are detailed and actionable policy documents strengthened by the inclusion of critical societal considerations.
Through the CHIPS and Science Act, Congress established or expanded DOE programs that promote both the development of advanced nuclear on sites of former coal plants and the research of public engagement for nuclear energy. The Nuclear Energy University Program (NEUP) has funded technical nuclear energy research at universities since 2009. The CHIPS Act expanded the program to include research that supports community engagement, participation, and confidence in nuclear energy. The Act also established, but did not fund, a new advanced nuclear technology development program that prioritizes projects at sites of retiring coal plants and those that include elements of workforce development. An expansion of an existing nuclear energy training program was cut from the final CHIPS Act, but the expansion is proposed again in the Nuclear Fuel Security Act of 2023.
More coordination is required among DOE, the nuclear industry, and utilities. Congress should also take action to fund initiatives authorized by recent legislation that enable the coal-to-nuclear transition.
Plan of Action
Recommendations for Federal Agencies
Recommendation 1. A sizable coordinating body, such as the Foundation for Energy Security and Innovation (FESI) or the Appalachian Regional Commission (ARC), should support project developers’ efforts to include community engagement in the siting, planning, design, and construction of advanced nuclear power plants.
FESI is a new foundation to help the DOE commercialize energy technology by supporting and coordinating stakeholder groups. ARC is a partnership between the federal government and Appalachian states that supports economic development through grantmaking and conducting research on issues related to the region’s challenges. FESI and ARC are coordinating bodies that can connect disparate efforts by developers, academic experts, and the DOE through various enabling and connecting initiatives. Efforts should leverage existing resources on consent-based siting processes developed by the DOE. While these processes are specific to siting spent nuclear fuel storage facilities, the roadmap and sequencing elements can be replicated for other goals. Stage 1 of the DOE’s planning and capacity-building process focuses on building relationships with communities and stakeholders and engaging in mutual learning about the topic. FESI or ARC can establish programs and activities to support planning and capacity building by utilities and the nuclear industry.
FESI could pursue activities such as:
- Hosting a community of practice for public engagement staff at utilities and nuclear energy companies, experts in public engagement methods design, and the Department of Energy
- Conducting activities such as stakeholder analysis, community interest surveys, and engagement to determine community needs and concerns across all coal communities
- Providing technical assistance on community engagement methods and strategies to utilities and nuclear energy companies
ARC could conduct studies such as stakeholder analysis and community interest surveys to determine community needs and concerns across Appalachian coal communities.
Recommendation 2. The DOE should continue expanding the Nuclear Energy University Program (NEUP) to fund nontechnical nuclear research in the social sciences and law that can strengthen community engagement, participation, and confidence in nuclear energy systems, including navigation of the licensing required for advanced reactor deployment.
Evolving processes to include effective community engagement will require new knowledge in the social sciences and shifting the culture of nuclear education and training. Since 2009, the DOE Office of Nuclear Energy has supported nuclear energy research and equipment upgrades at U.S. colleges and universities through the NEUP. Except for a few recent examples, including the University of Wyoming project cited above, most projects funded were scientific or technical. Congress recognized the importance of supporting research in nontechnical areas by authorizing the expansion of NEUP to include nontechnical nuclear research in the CHIPS and Science Act. DOE should not wait for additional appropriations to expand this program. Further, NEUP should encourage awardees to participate in communities of practice hosted by FESI or other bodies.
Recommendation 3. The DOE Office of Energy Jobs and the Department of Labor (DOL) should collaborate on the creation and dissemination of training standards focused on the nuclear plant jobs for which extensive training, licensing, or experience is required for former coal plant workers.
Sites of former coal plants are promising candidates for advanced nuclear reactors because most job roles are directly transferable. However, an estimated 23% of nuclear plant jobs—operators, senior managers, and some technicians—require extensive licensing from the Nuclear Regulatory Commission (NRC) and direct experience in nuclear roles. It is possible that an experienced coal plant operator and an entry-level nuclear hire would require the same training path to become an NRC-licensed nuclear plant operator.
Supporting the clean energy workforce transition fits within existing priorities for the DOE’s Office of Energy Jobs and the DOL, as expressed in the memorandum of understanding signed on June 21, 2022. Section V.C. asserts the departments share joint responsibility for “supporting the creation and expansion of high-quality and equitable workforce development programs that connect new, incumbent, and displaced workers with quality energy infrastructure and supply chain jobs.” Job transition pathways and specific training needs will become apparent through additional studies by interested parties and lessons from programs such as the Advanced Reactor Demonstration Program and the Clean Energy Demonstration Program on Current and Former Mine Land. The departments should capture and synthesize this knowledge into standards from which industry and utilities can design targeted job transition programs.
Recommendations for Congress
Recommendation 4. Congress should fully appropriate key provisions of the CHIPS and Science Act to support coal communities’ transition to nuclear energy.
- Appropriate $800 million over FY2024 to FY2027 to establish the DOE Advanced Nuclear Technologies Federal Research, Development, and Demonstration Program: The CHIPS and Science Act established this program to promote the development of advanced nuclear reactors, prioritizing projects at sites of retiring coal power plants and projects that include workforce development programs. These critical workforce training programs need direct funding.
- Appropriate an additional $15 million from FY2024 to FY2025 to the NEUP: The CHIPS and Science Act authorizes an additional $15 million from FY2023 to FY2025 to the NEUP within the Office of Nuclear Energy, increasing the annual total from $30 million to $45 million. Since CHIPS included an authorization to expand the program to include nontechnical nuclear research, the expansion should come with increased funding.
Recommendation 5. Congress should expand the Nuclear Energy Graduate Traineeship Subprogram to include workforce development through community colleges, trade schools, apprenticeships, and pre-apprenticeships.
The current Traineeship Subprogram supports workforce development and advanced training through universities only. Expanding this direct funding for job training through community colleges, trade schools, and apprenticeships will support utilities’ and industries’ efforts to transition the coal workforce into advanced nuclear jobs.
Recommendation 6. Congress should amend Section 45U, the Nuclear Production Tax Credit for existing nuclear plants, to include apprenticeship requirements similar to those for future advanced nuclear plants covered under Section 45Y, the Clean Energy Production Tax Credit.
Starting in 2025, new nuclear power plant projects will be eligible for the New Clean Energy Production and Investment Tax Credits if they meet certain apprenticeship requirements. However, plants established before 2025 will not be eligible for these incentives. Congress should add apprenticeship requirements to the Nuclear Production Tax Credit so that activities at existing plants strengthen the total nuclear workforce. Credits should be awarded with priority to companies implementing apprenticeship programs designed for former coal industry workers.
Conclusion
The ambitious goal of reaching net-zero emissions globally requires the rapid deployment of clean energy technologies, in particular firm clean energy such as advanced nuclear power. Since the 1980s, communities around coal power plants have suffered from industry shifts and will continue to accumulate disadvantages without support. Coal-to-nuclear transition projects advance the nation’s decarbonization efforts while creating benefits for developers and revitalizing coal communities. Utilities, the nuclear industry, the DOE, and researchers are advancing community engagement practices and methods, but more effort is required to share best practices and ensure coordination in these emerging practices. FESI or other large coordinating bodies should fill this gap by hosting communities of practice, producing knowledge on community values and attitudes, or providing technical assistance. DOE should continue to promote community engagement research and help articulate workforce development needs. Congress should fully fund initiatives authorized by recent legislation to promote the coal-to-nuclear transition. Action now will ensure that our clean firm power needs are met and that coal communities benefit from the clean energy transition.
Transitioning coal miners directly into clean energy is challenging considering the difference in skills and labor demand between the sectors. Most attempts to transition coal miners should focus on training in fields with similar skill requirements, such as job training for manufacturing roles within the Appalachian Climate Technology Coalition. Congress could also provide funding for unemployed coal miners to pursue education for other employment.
A significant challenge is aligning the construction of advanced nuclear plants with the decommissioning of coal plants. Advanced nuclear project timelines are subject to various delays and uncertainties. For example, the first commercial demonstration of small modular reactor technology in the United States, the TerraPower plant in Wyoming, is delayed due to the high-assay low-enriched uranium supply chain. The Nuclear Regulatory Commission’s licensing process also creates uncertainty and extends project timelines.
Methods exist to safely contain radioactive material as it decays to more stable isotopes. The waste is stored on site at the power plant in secure pools in the shorter term and in storage casks capable of containing the material for at least 100 years in the longer term. The DOE must continue pursuing interim consolidated storage solutions as well as a permanent geological repository, but the lack of these facilities should not pose a significant barrier to constructing advanced nuclear power plants. The United States should also continue to pursue recycling spent fuel.
More analysis is required to better understand the waste management implications of advanced reactors. A study conducted by Argonne National Laboratory found that while the attributes of spent fuel vary by the exact design of reactor, overall there are no unique challenges to managing fuel from advanced reactors compared to fuel from traditional reactors. A separate study found that spent fuel from advanced reactors will contain more fissile nuclides, which makes waste management more challenging. As the DOE continues to identify interim and permanent storage sites through a consent-based process, utilities and public engagement efforts must interrogate the unique waste management challenges when evaluating particular advanced nuclear technology options.
Similar to waste output, the proliferation risk of advanced reactors varies with the specific technology and requires further study. Some advanced reactor designs, such as the TerraPower Natrium reactor, require fuel that is more enriched than the fuel used in traditional designs. However, the safeguards required for the two types of fuel are not significantly different. Other designs, such as the TerraPower TWR, are expected to be able to use depleted or natural uranium sources, and the NuScale VOYGR models use traditional fuel. All reactors have the capacity to produce fissile material, so as the United States expands its nuclear energy capabilities, efforts should be made to extend current nonproliferation safeguards to fuel both as it is prepared for plants and after it has been used.
Turning Community Colleges into Engines of Economic Mobility and Dynamism
Community colleges should be drivers of economic mobility, employment, and dynamism in local communities. Unlike four-year institutions, many of which are highly selective and pose significant barriers to entry, two-year colleges are intended to serve people from a wide range of life circumstances. In theory, they are highly egalitarian institutions that enable underserved individuals to access learning, jobs, and opportunities that would otherwise not be available to them.
However, community colleges are asked to do a lot of things with relatively little funding: they serve individuals ranging from highly gifted high school students to prospective transfers to four-year universities to people earning skilled trades certificates. This spreads schools’ attention broadly and is especially problematic given the wide range of non-academic challenges that many of their low-income students face, such as raising dependents and lack of access to reliable transportation. Troublingly, many community college degrees do not result in an economic return on investment (ROI) for their students, and many students will not recoup their investment within five years of completing a community college credential.
To address these issues, policymakers should reform community colleges in two essential ways. First, community colleges should align curricula toward fields with high wages and strong employer demand while increasing the amount of work-based learning. This shift would provide more job-ready graduates and improve student salaries and employment rates, thereby increasing student ROI. Second, the federal government should provide greater financial assistance in the form of Pell Grants and funding for wraparound services such as transportation vouchers and textbooks, allowing more students to access high-quality community college programs and graduate on time. These are interventions with a track record of proven success but require greater funding and support capacity to scale up at a national level.
Challenge and Opportunity
Challenge 1: Community colleges serve a wide range of students, including working parents seeking a better job, students who intend to transfer to four-year universities, and high school students taking dual enrollment classes.
There is no such thing as a typical community college student. As part of the Aspen Prize for Community College Excellence, the Aspen Institute collected demographic and outcomes data for its top 150 community colleges. Within this group, over 30% of students are “nontraditional” students over the age of 25, and 45% are minorities. Moreover, 63% of community college students attend part-time, which poses significant challenges in terms of scheduling and momentum, severely impacting retention, graduation, and transfer rates. By contrast, just 11% of students at four-year flagship institutions are enrolled part-time. Community colleges must juggle these competing priorities, and they must do so with insufficient resources and in the absence of clear guidelines.
Challenge 2: Underprivileged students tend to struggle the most given their financial constraints and insufficient access to wraparound services. At the same time, community colleges are already starved for resources and may not have the capacity to provide those critical services to these students.
Community college students are more likely than their four-year counterparts to come from less wealthy backgrounds. As of 2016, 16% of students in four-year colleges come from impoverished families, with another 17% coming from families near poverty. By contrast, some 23% of dependent community college students and 47% of independent students come from families with less than $20,000 of annual income. Unsurprisingly, two-thirds of community college students work, with roughly one-third working full-time.
Community colleges will be hard-pressed to cover students’ financial shortfalls from their own budgets. On average, community colleges receive $8,700 per full-time equivalent student versus $17,500 per student for four-year colleges (this overstates funding per enrolled student, because more community college students are enrolled part-time). Moreover, over half of community college funding comes from local and state sources. As a result, schools with the highest proportion of low-income students are more likely to have lower funding.
Challenge 3: The United States needs more nurses and allied healthcare workers, IT and cyber professionals, and skilled tradespeople, but the recruiting pipeline and training pathway for these individuals is often understaffed, highly fragmented, and hyperlocal.
Many high-wage industries have experienced, or will experience within the next decade, major talent shortages. For instance, by 2030 the United States will need another 275,000 registered nurses; sitting for the NCLEX licensure exam requires, at minimum, an associate’s degree in nursing (ADN). The country needs another 350,000 cybersecurity professionals, especially in the federal workforce, where half of employees are over the age of 50 and approaching retirement. Finally, and certainly not least, the distributed renewable energy grid will not build itself: 30% of union electricians are between the ages of 50 and 70, and we will need more solar installers, wind technicians, and other skilled trades specialists to enable the green transition.
However, these issues are not easily solved by digitally native solutions rolled out at a national level; they need to be tackled locally. For instance, access to clinical training space can only be arranged through hospitals, while skill development for electricians, installers, and technicians primarily occurs through high school and community college CTE classes and industry-led apprenticeships, all of which require a substantial in-person component. Thus, workforce training to fill shortages will have to be similarly local in nature.
Challenge 4: The value of the two-year associate’s degree and certificates is highly variable and depends on the type of degree or certificate earned.
The Burning Glass Institute studied the career histories of nearly 5 million individuals who graduated between 2010 and 2020 and built a rich dataset that tied salary information to LinkedIn profiles. They then assessed “degree optional” roles (jobs where 50% to 80% of individuals held a degree) and found that a four-year degree provided a 15% wage premium, which was largely driven by the job flexibility provided by the bachelor’s degree. By comparison, they found no such premium for two-year associate’s degrees.
However, these averages hide the economic variance provided by individual degrees. Third Way investigated the economic payback period for graduates of different degree programs, which they defined as the pay increase over the median high school graduate divided by the net tuition cost. Highly technical engineering, healthcare, and computer science associate’s degrees provided exceptional payback periods, with more than 90% recouping their investment in less than five years.
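Third Way’s payback metric described above can be sketched as a simple calculation. The function below uses that definition; the figures in the example are hypothetical illustrations, not data drawn from the study.

```python
def payback_years(grad_earnings, hs_median_earnings, net_tuition_cost):
    """Years needed to recoup net tuition from the earnings premium over a
    median high school graduate (Third Way's payback definition)."""
    premium = grad_earnings - hs_median_earnings
    if premium <= 0:
        return float("inf")  # no earnings premium: the cost is never recouped
    return net_tuition_cost / premium

# Hypothetical figures for illustration: a technical associate's degree
# graduate earning $65,000 vs. a $35,000 high school median, with $8,000
# in net tuition, recoups the investment in well under five years.
print(payback_years(65_000, 35_000, 8_000))
```

A degree whose graduates earn no more than the median high school graduate yields an infinite payback period, which is the “no economic ROI” case discussed below.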
By contrast, other associate’s degrees saw no economic ROI. Some of these degrees, such as general humanities and culinary arts, are unsurprising. However, other fields, such as biological and physical sciences, for which half of students had no ROI, might have had stronger ROIs as bachelor’s degrees.
Similarly, certificate programs have wildly varying ROIs. Nursing and diagnostic and skilled trades generally show a strong ROI, with more than 85% of students recouping their investment within five years. On the other hand, cosmetology, culinary arts, and administrative services are highly likely to receive no ROI, indicative of the low pay in the roles that certificate earners take upon completion of their program.
Together, these studies show that associate’s degree programs and certificates with less-defined career pathways are at risk of value erosion. This may be due in part to real differences in the skills taught in a two-year degree or certificate versus a four-year program. However, it is also clear that highly technical associate’s degrees and certificates designed to meet employer-defined needs have better economic ROIs, suggesting that there is less value erosion in roles with well-defined pathways.
Plan of Action
To address these issues, policymakers, community college leaders, employers, and philanthropic stakeholders should work together to implement five general reforms:
- Reorient community college offerings toward technical associate’s degrees and certificates with a strong, locally proven ROI for students while pruning programs without compelling outcomes. The federal government should allocate funding to programs that have compelling five-year repayment rates and fill jobs with high and persistent skills shortages. In addition, the Department of Education can issue a “Dear Colleague” letter focused on program ROI, and Congress should pass laws strengthening ROI requirements for federal funding.
- Community colleges and local employers should partner to deliver more job training at scale, including apprenticeships and skills-based part-time work. Many of these programs, such as Project Quest, have been established for years, and community colleges can and should play a bigger role in building a student pipeline and delivering in-classroom training that leads to a high-quality credential. In addition, under the Inflation Reduction Act, local employers can receive tax breaks for clean energy projects that use registered apprenticeships. These apprenticeships, which supplement on-the-job training with classroom instruction and are tailored to employer needs, can be provided by community colleges.
- Increase Pell Grant maximums to improve degree affordability and access. For the upcoming 2023–2024 school year, the maximum could be raised by $500, in line with the 2024 President’s Budget. Congress should enact provisions in the president’s budget that would provide $500 million in funding for community college programs that lead to high-paying jobs and $100 million for workforce training, both of which would strengthen post-completion outcomes. In addition, Congress should pass legislation that makes Pell Grants nontaxable, which would enable students to use funding on living expenses.
- Develop and fund high-ROI wraparound solutions that have been shown to improve student outcomes, such as those developed by the Accelerated Study in Associate Programs (ASAP). These include career guidance, textbook assistance, and transportation vouchers, among others. The Department of Education should also allow community colleges to spend funding (for example, some of the increases proposed in the president’s budget) on supports that are not already covered by existing entitlement programs. In addition, state and local governments can earmark special taxes and work closely with philanthropic funders to experiment with and deploy wraparound solutions, helping policymakers further assess the most cost-effective interventions.
- Create comprehensive data tracking mechanisms that track data at state and local levels to evaluate student outcomes and relentlessly funnel public, private, and philanthropic capital toward interventions and degree programs that are shown to result in strong outcomes. In particular, the recommendations in the bipartisan College Transparency Act are a good start given that they would tie Integrated Postsecondary Education Data System (IPEDS) and Internal Revenue Service (IRS) data together.
Recommendation 1. Community colleges should reorient their offerings toward degrees that provide strong employment outcomes and student ROI (e.g., associate’s degrees in nursing and maintenance and installer certificates).
The data is unambiguous: Some programs deliver strong outcomes while others are drains on students’ and taxpayers’ money. Community colleges can better serve students by focusing more time and resources on the programs that deliver strong ROIs within their local economic contexts. As Figures 2–4 show, these programs skew heavily toward nursing and allied health, engineering and computer science, and skilled trades roles. These also dovetail with major labor shortages, suggesting that community colleges can play a significant role in matching labor supply with demand.
This premise sounds deceptively simple but requires a meaningful reimagination of the role that community colleges play. By asking community colleges to refocus toward highly technical associate’s degrees and certificates, they would end up eschewing other aspects of the higher education landscape. In this view, community colleges would de-emphasize the production of liberal arts associate’s degrees. While they would continue to teach core science and humanities courses, the structure and content would be primarily geared toward equipping students with the critical thinking and foundational skills required to excel in higher-level technical courses. Community colleges would thus further increase their role in providing vocational training.
Refocusing community colleges on certain degrees would allow institutions to devote their limited resources to helping students navigate a smaller set of pathways. While it is certainly true that community colleges could improve liberal arts associate’s degree ROIs by helping students transfer to four-year universities, a greater emphasis on vocational degree production would help two-year colleges focus on their core competitive advantage in the higher education market. In the long run, greater focus would reduce administrative burden while helping professors, guidance counselors, and financial aid officers develop expertise in high-demand training and career pathways. In addition, a narrower focus on high-ROI degrees improves the effectiveness of public and philanthropic spending, making large-scale interventions more feasible from political and financial perspectives.
Recommendation 2. At the local level, community colleges should partner with employers to deliver job-specific training at scale (for example, apprenticeships or skills-based part-time work paired with associate’s degrees), helping economies match labor supply and demand while providing students with pay and relevant work experience.
Although increased tuition assistance would significantly improve financial access for many community college students, the reality is that programs such as the Pell Grant, while highly effective, still leave students with major financial gaps. As a result, many community college students end up working: as Figure 1 shows, 47% of independent community college students come from incomes of less than $20,000.
A practical approach would be to ask how we might maximize the value of hours worked rather than how we might avoid work altogether. Many community college students are employed in retail and other frontline roles: 23% of students in the Washington state dataset worked in retail at the start of their academic career, and another 19% worked in accommodation and food service. These entry-level roles pay low salaries, provide poor benefits, and are unlikely to teach skills transferable to high-paying professions.
A better way to provide wages as well as professionally transferable skills would be to increase funding for work-based training programs, including apprenticeships and part-time roles, that are directly related to the student’s course of study. The Department of Labor should fund a large increase in work-based training programs that provide the following characteristics:
- Are tied to an accredited course of study at a community college or other institution of higher learning with proven outcomes
- Are targeted toward roles and industries with job shortages, such as registered nurses
- Are designed in collaboration with employer partners who will ensure that students are learning skills directly related to their role and industry
- Have sufficient funding for key administrative and wraparound expenses, including career counseling and transportation stipends
Research has started to highlight the long-term benefits of well-designed work-based learning programs focused on high-paying jobs. San Antonio-based Project Quest works with individuals in healthcare, IT, and the skilled trades to provide low-income adults with credentials and employment pathways (sometimes through community colleges but also with trade schools and four-year universities providing certificates). In addition to training, Project Quest provides comprehensive wraparound support for its participants, including financial assistance for tuition, transportation, and books, as well as remedial instruction and career counseling.
In 2019, Project Quest published the results of its nine-year longitudinal study, which included a randomized controlled trial of 410 adults, 88% of whom were female, enrolled in healthcare programs. Thus, replicability for other industries may prove challenging. Nonetheless, the study showed highly positive and statistically significant long-term earnings impacts for its participants, results that have not been easily replicated elsewhere.
Properly designed standalone apprenticeships have the potential to deliver large and positive impacts. For example, the Federation of Advanced Manufacturing Education (FAME) has long had an apprenticeship program in Kentucky to develop high-quality automotive manufacturing talent for skilled trades roles, which blends technical training for automotive manufacturing, skills that can be transferred to any industrial setting, and soft skills education. Participants complete an apprenticeship and finish with an associate’s degree in industrial maintenance technology. Within five years of graduation, FAME graduates had average incomes of almost $100,000.
The Inflation Reduction Act, Infrastructure Act, and CHIPS Act have made it clear that reinvesting in America’s industrial base is a key policy priority. At the same time, the private sector has identified major skill shortages in the skilled trades as well as healthcare and IT. Community college administrators can lead the effort to create work-based training solutions for these key roles and coordinate the efforts of various stakeholders, including the Departments of Education and Labor, state governments, and philanthropic organizations seeking to fund high-quality comprehensive solutions such as the ones developed by Project Quest. In doing so, community college leaders can move to the vanguard of outcomes-driven, ROI-based higher education.
Recommendation 3. The federal government should increase Pell Grant funding and ensure that more students receive funds for which they are eligible.
Pell Grants are an essential component of college funding for many low-income college students, without which higher education would be unaffordable. For the 2023–2024 school year, the Pell Grant maximum is $7,395, and on average students receive around $4,250. By contrast, the average tuition at a community college is just under $4,000, with the total cost of attendance at around $13,500. Thus, the average Pell Grant would cover all of tuition but just one-third of the total cost of attendance, assuming that the student was enrolled full-time. Nonetheless, Pell Grants are highly effective tools: the Federal Reserve Bank of Richmond conducted a pilot study of 9,000 students and found that 64% of Pell recipients had graduated, transferred, or persisted in their program within 200% of the normal completion time, as opposed to 51% of non-Pell recipients.
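The coverage arithmetic above can be checked with a quick calculation using the averages quoted in the text (full-time enrollment assumed):

```python
# Averages quoted in the text
avg_pell_award = 4_250
avg_cc_tuition = 4_000            # average community college tuition
total_cost_of_attendance = 13_500  # tuition plus living costs, books, transport

# Share of each covered by the average award
tuition_share = min(avg_pell_award / avg_cc_tuition, 1.0)
total_share = avg_pell_award / total_cost_of_attendance

print(f"Tuition covered: {tuition_share:.0%}")    # all of tuition
print(f"Total cost covered: {total_share:.0%}")   # roughly one-third
```

The gap between those two shares is the non-tuition burden (living expenses, textbooks, transportation) that the recommendations below aim to address.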
Increasing Pell Grant awards will have two important effects. First, additional Pell Grant assistance reduces the out-of-pocket tuition burden, in turn increasing financial capacity for critical expenditures such as living expenses, textbooks, and transportation. Second, students who receive Pell Grant funding in excess of the tuition maximum could directly apply funds to those expenditures. However, under current IRS code, Pell Grant funding that is applied to living expenses is taxable. Congress should pass legislation that makes Pell Grants nontaxable in order to avoid penalizing students who use funds on critical expenses that might otherwise go unfilled or would require funding from an outside organization.
President Biden’s 2024 budget, which proposes a $500 increase in the maximum Pell Grant, is an excellent baseline for increasing access to high-quality community college programs. In all, this is estimated to cost $750 million in 2024 (including four-year college students), with a more ambitious pathway to doubling the grant by 2029. Moreover, the president’s budget calls for $500 million to start a discretionary fund that provides free two-year associate’s degree programs for high-quality degrees. Similar proposals have shown progress at the state level: for instance, Tennessee, a Republican-led state, offers free community or technical college to every high school graduate. Furthermore, tying funding to programs with strong graduation and salary outcomes ensures that funding flows to high-quality programs, improving student ROI and increasing its appeal to taxpayers.
Policymakers should also do more to ensure that students take advantage of funds they are eligible to receive. In 2018, the Wheelhouse Center for Community College Leadership and Research examined data from nearly 320,000 students in California. Over just one semester, they found that students failed to claim $130 million of Pell Grants they were eligible for. Sometimes, students simply forget to apply, but in other cases, financial aid offices put artificial obstacles in the way: half of financial aid officers report asking for additional verification beyond the student list required by the Department of Education. Community colleges should be given more resources to ensure that eligible students apply for grant funding, but financial aid offices can also help by reducing the administrative burden on students and themselves.
Recommendation 4. In addition to expanding Pell Grant uptake, the public sector should fund and distribute wraparound services for community college students focused on high-impact practices, including first-year experiences, guidance counseling and career support, and ancillary benefits, such as textbook vouchers and transportation passes.
In 2014, the Center for Community College Student Engagement assessed 12 community colleges to evaluate three essential outcomes: passing a developmental course in the first year, passing a gatekeeper course in the first year, and persistence in the degree program. They then pinpointed a set of practices that were meaningfully more likely to positively impact one or more of the target outcomes.
One successful intervention that bundles together many of these practices is the Accelerated Study in Associate Programs (ASAP), developed in the City University of New York (CUNY) and eventually expanded to three Ohio community colleges. The ASAP study, a randomized control trial of 896 students at CUNY and 1,501 students in Ohio, provided tuition assistance and wraparound supports such as tutoring, career services, and textbook vouchers.
The program delivered outstanding results: 55% of CUNY students graduated with a two-year or four-year degree versus 44% of the control group. The Ohio results were even more compelling, with the ASAP program improving two-year graduation rates by 15.6 percentage points and four-year registrations by 5.7 percentage points, significant at the 0.01 level.
To scale these programs, the federal government should allow grants to be used for wraparound supports with strong research-based impacts, potentially drawing from (or in addition to) the $500 million community college discretionary fund in the president’s budget. Ideally, this would be done via competitive applications with an emphasis on programs that target disadvantaged communities and focus on high-quality degree programs. Moreover, this could be set up via matching funds that incentivize state and local governments as well as philanthropic players to play a larger role in creating wraparound supports and administrative structures that would allow community colleges to better provide these services in the long term.
Recommendation 5. Federal, state, and local policymakers, working with large grant-writing foundations, should focus funding on interventions proven to result in higher graduation, transfer, and employment rates. As a first step, Congress should pass laws mandating the creation of datasets that merge educational and earnings data, which will help decision-makers and funders link dollars to outcomes.
Despite the success of programs such as ASAP and Project Quest, there is still a dearth of high-quality studies on comprehensive interventions. This is partly because there are relatively few such programs to begin with. Nonetheless, early results seem promising. The question is, how can we ensure that programs are properly measured in order to enable further public, private, and nonprofit financing?
Unlike ASAP and Project Quest, most programs do not rigorously track data over a long period. For instance, the American Association of Community Colleges provides a repository of data on community college apprenticeships, broken out at an aggregate level as well as by school partner. However, a closer look shows that the public-facing dataset is missing rudimentary information: the number of apprentices who complete their programs, which types of programs have high versus low completion rates, and employment outcomes. It lacks, let alone richer data such as background demographics and longitudinal earnings tracking, the information essential to constructing statistically rigorous studies of student ROI.

While there may be more privately held data in their database, the paucity of available public information is indicative of the state of data tracking for community colleges and work-based training programs. In general, institutions are not sufficiently funded to continue data tracking beyond completion or departure, leaving enormous gaps.
One way to get around this issue is to require more rigorous data collection and longitudinal tracking, leveraging existing administrative data where possible. Fortunately, there is already a bill on the floor, the College Transparency Act, which includes provisions requiring the Education Department to match student-level data with IRS tax data to measure post-completion employment rates as well as mean and median earnings by institution, program of study, and credential level. Congress should pass the act, which enjoys bipartisan support. Passing the College Transparency Act would create the much-needed foundation to rigorously compare ROI and enable greater accountability for community colleges and higher ed writ large.
Conclusion
Designed correctly, community colleges can be fonts of economic opportunity, especially for individuals from underserved backgrounds whose primary goal is to enter into a well-paying role upon program completion. By collecting high-quality data, focusing on degrees with strong outcomes, providing quality work-based training, and funding wraparound supports and tuition assistance, community colleges can be much stronger, more effective engines for students and local communities. While these reforms will take time and energy from public policymakers, community college leaders, and employers, they have the potential to deliver compelling outcomes and are worth the investment.
Certain constituents would be negatively impacted: for example, high school dual enrollment students would have fewer options for advanced course offerings, and students who want a physics, biology, economics, or similar degree would need to choose a four-year university. On the other hand, this is likely a healthy outcome. Academically gifted high school students could take AP courses in person at their high school or virtually, while liberal arts students would end up at four-year institutions where there is an appropriate amount of time to master the subject matter and the degree ROI is clearer.
The Ohio ASAP program included the following elements:
- Tutoring: Students were required to attend tutoring if they were taking developmental (remedial) courses, on academic probation, or identified as struggling by a faculty member or adviser.
- Career services: Students were required to meet with campus career services staff or participate in an approved career services event once per semester.
- Tuition waiver: A tuition waiver covered any gap between financial aid and college tuition and fees.
- Monthly incentive: Students were offered a monthly incentive in the form of a $50 gas/grocery gift card, contingent on participation in program services.
- Textbook voucher: A voucher covered textbook costs.
- Course enrollment: Blocked courses and consolidated schedules held seats for program students in specific sections of courses during the first year.
- First-year seminar: New students were required to take a first-year seminar (or “success course”) covering topics such as study skills and note-taking.
- Full-time enrollment: Students were required to attend college full-time during the fall and spring semesters and were encouraged to enroll in summer classes.
Although the program cost an additional $5,500 in direct costs per student (and a further $2,500 because students took more courses), the total cost per degree attained decreased because the program had a significant positive impact on graduation rates. Degree attainment is an essential key performance indicator because there are large differences in economic ROI for graduates and nongraduates, especially at community colleges.
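The logic of falling cost per degree can be made concrete with a quick calculation. In the sketch below, the roughly $8,000 incremental cost per student ($5,500 direct plus $2,500 from extra coursework) comes from the figures above, while the $20,000 baseline cost per student and the two graduation rates are invented assumptions chosen only to illustrate the mechanism: if graduation rates roughly double, cost per degree can fall even as cost per student rises.

```python
# Illustrative cost-per-degree arithmetic. The ~$8,000 incremental program
# cost comes from the text; the $20,000 baseline cost and the graduation
# rates are hypothetical placeholders, not figures from the Ohio evaluation.
def cost_per_degree(cost_per_student, graduation_rate):
    """Total spend per student divided by the share who earn a degree."""
    return cost_per_student / graduation_rate

baseline = cost_per_degree(20_000, 0.20)           # assumed baseline
with_program = cost_per_degree(20_000 + 8_000, 0.40)  # assumed doubled grad rate
print(f"baseline: ${baseline:,.0f} per degree, with program: ${with_program:,.0f} per degree")
```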
Greater experimentation with publicly funded wraparounds, including greater uptake of entitlements for which students might be eligible, will help policymakers identify the most impactful components of the ASAP intervention. Over time, this will reduce direct costs while continuing to improve the cost per degree attained.
Project Quest included the following wraparound supports:
- Financial assistance to cover tuition and fees for classes, books, transportation, uniforms, licensing exams, and tutoring.
- Remedial instruction in math and reading to help individuals pass placement tests.
- Counseling to address personal and academic concerns and provide motivation and emotional support.
- Referrals to outside agencies for assistance with utility bills, childcare, food, and other services as well as direct financial assistance with other supports on an as-needed basis.
- Weekly meetings that focused on life skills, including time management, study skills, critical thinking, and conflict resolution.
- Job placement assistance, including help with writing résumés and interviewing, as well as referrals to employers that are hiring.
A study by Brookings and Opportunity America of graduates between 2010 and 2017 showed dramatic increases in five-year post-completion wages (almost $100,000 for FAME graduates versus slightly over $50,000 for non-FAME participants). Much of the earnings impact can be attributed to differences in graduation rates: 80% of FAME participants graduate, compared to 30% elsewhere. It should be noted that FAME was not evaluated through a randomized controlled trial (unlike Project Quest) but through a matched-pair design that compared each FAME participant with a “similar” individual, and data were available for only 24 of the 143 FAME participants at the five-year post-graduation mark. Nonetheless, research clearly shows that corporations, workforce development agencies, and community colleges can pair the best of Project Quest and FAME (the wraparound support provided by Quest, the broad and high-quality training in FAME, and the focus on high-demand roles in both) to optimize students’ outcomes.
The Workforce Innovation and Opportunity Act (WIOA) youth apprenticeship and Perkins V programs have appropriated funding that could be used to expand work-based training for community college students. For 2022–2023, Congress appropriated $933 million for youth activities under WIOA, while Perkins V provides roughly $1.4 billion in state formula grants for youth and adult training. However, 75% of WIOA funding goes to out-of-school youth, while Perkins funding covers a wide range of career and technical education programs across secondary, postsecondary, and adult learning. Either program could administer additional funding focused on work-based learning tied to a community college degree, but Congress should appropriate or divert funds to serve these needs. Philanthropic funds could also play a role, especially in funding wraparound supports and administrative expenses, but centralized public funding is needed to ensure appropriate funding and rollout.
Contrary to popular belief, working while going to community college does not necessarily detract from student performance. Researcher Mina Dadgar pulled over 40,000 community college student records from the state of Washington and linked them to tax data. Although work did have a statistically significant negative impact on quarterly credits earned and GPA, it did not have a practically significant negative effect on student outcomes.
From the regression analysis above, each additional hour of work reduces quarterly credits earned by .065 credits and grade point average (GPA) by roughly .005 points. Assuming a student works 15 hours per week, the student would be expected to take about one fewer credit per quarter, or three credits assuming they are enrolled throughout the year. This is, in effect, one class per year, which, while not negligible, is not a major loss to academic attainment. Similarly, working 15 hours per week would predict a GPA decline of about .075 points, again not a substantial effect on academic performance.
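These back-of-envelope figures follow directly from the cited coefficients; a few lines make the arithmetic explicit (the 15-hour work week is the assumption used in the text):

```python
# Translate the cited regression coefficients into expected annual impact.
coef_credits = 0.065   # quarterly credits lost per weekly work hour
coef_gpa = 0.005       # GPA points lost per weekly work hour
hours_per_week = 15    # assumed work schedule, as in the text

credits_lost_quarter = hours_per_week * coef_credits  # ~0.98, i.e., ~1 credit
credits_lost_year = credits_lost_quarter * 3          # ~2.9 credits, ~1 class
gpa_decline = hours_per_week * coef_gpa               # ~0.075 GPA points
print(round(credits_lost_quarter, 3), round(credits_lost_year, 3), round(gpa_decline, 3))
```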
One promising structure is the social impact bond (sometimes referred to as pay for success). In this model, private investors provide upfront capital to social intervention programs and are repaid if certain performance targets are met. Although establishing the proper baseline can be challenging, the contract involves payment for reducing the overall cost of service (for instance, interventions that proactively reduce recidivism or hospital visits for chronic disease).
Existing programs focus on the financial returns of “investing” in students’ training and upskilling: for instance, impact financier Social Finance and coding bootcamp General Assembly launched a career impact bond that has funded over 800 underserved individuals seeking a credential in technology. However, there is the potential for much broader assessments of economic value that increase the appeal of comprehensive wraparound solutions. In the case of workforce training, the ideal program design might involve an assessment of the overall reduction in social services associated with individuals trapped in poverty (for instance, increased healthcare costs or extended social services provision) as well as the increase in economic activity and tax receipts from a higher-paying job.
As a result, these types of targets encourage more holistic interventions such as the ones we see in the ASAP and Project Quest programs because investors and program managers benefit from students’ long-term success, not just their short-term success. This also incentivizes rigorous data tracking, which in the long term will provide critical information on intervention packages that have the strongest positive impact while weeding out those that are not as effective in improving outcomes.
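The payment logic of a pay-for-success contract can be pictured as a simple rule: investors are repaid a share of measured savings relative to an agreed baseline, possibly subject to a cap. Everything in the sketch below (the 50% savings share, the cap, and the dollar figures) is a hypothetical illustration, not the terms of any actual impact bond.

```python
# Stylized social-impact-bond payout rule (all parameters are illustrative
# assumptions, not terms from a real contract).
def sib_payout(baseline_cost, observed_cost, savings_share=0.5, cap=None):
    """Outcome payment owed to investors when the program beats its baseline."""
    savings = max(0.0, baseline_cost - observed_cost)
    payment = savings * savings_share
    return min(payment, cap) if cap is not None else payment

# Program reduces service costs from $1.0M to $0.7M; investors receive
# half of the $300k in measured savings.
print(sib_payout(1_000_000, 700_000))    # 150000.0
print(sib_payout(1_000_000, 1_100_000))  # 0.0 (no savings, no payment)
```

The `max(0, ...)` floor captures the defining feature of the model: if the program fails to beat the baseline, investors absorb the loss rather than the public payer.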
BioNETWORK: The Internet of Distributed Biomanufacturing
Summary
The future of United States industrial growth resides in establishing biotechnology as a new pillar of domestic industrial manufacturing, enabling robust supply chains and revolutionary products in materials, pharmaceuticals, food, and energy. The centralized manufacturing of the past is brittle, prone to disruption, and unable to deliver new products that leverage the unique attributes of biology. Today, there exists the opportunity to develop the science, infrastructure, and workforce to establish the BioNETWORK to advance domestic distributed biomanufacturing, strengthen U.S.-based supply chain intermediaries, provide workforce development for underserved communities, and achieve independence and global viability in biomanufacturing. Implementing the BioNETWORK to create an end-to-end distributed biomanufacturing platform will fulfill the Executive Order on Advancing Biotechnology and Biomanufacturing Innovation and the White House Office of Science and Technology Policy (OSTP) Bold Goals for U.S. Biotechnology and Biomanufacturing.
Challenge and Opportunity
Biotechnology harnesses the power of biology to create new services and products, and the economic activity derived from biotechnology and biomanufacturing is referred to as the bioeconomy. Today, biomanufacturing, like most traditional manufacturing, is centralized. Centralized manufacturing is brittle, does not maximize national economic impact or make the best use of national raw materials and resources, and does not capture the innovation enabled by the unique workforce distributed across the United States. Moreover, in an era of supply chain disruptions driven by international competition, climate change, and pandemic-scale threats (both known and unknown), centralized approaches are themselves a major risk: they present a single point of attack or failure and necessarily restrict economic impact to a single locality. While federal government support for biotechnology has increased with recent executive orders and policy papers, the overarching concepts are broad, do not provide actionable steps for the private sector, and do not set the organization and goals needed to drive real manufacturing outcomes, that is, processes or products that directly reach consumers. A new program must be developed with clear milestones and deliverables to address the main challenges of biomanufacturing. Centralized biomanufacturing is less secure and does not deliver on the full potential of biotechnology because it is:
- Reliant on a narrow set of feedstocks and reagents that are not local, introducing supply chain vulnerabilities that can halt bioproduction in its earliest steps of manufacturing.
- Inflexible for determining the most effective, stable, scalable, and safe methods of biomanufacturing needed for multiple products in large facilities.
- Serial in scheduling, which introduces large delays in production and limits capacity and product diversity.
- Bespoke and not easily replicated: the selection and design of microbial strains, cell-free systems, and sequences of known function cannot easily be reproduced outside the facility that made them, limiting the scale-up and reproducibility of biomanufacturing products.
- Creating waste streams because circular economies are not leveraged.
- Vulnerable to personnel shortages due to shifting economic, health, or other circumstances related to undertraining of a biotechnology specialized workforce.
Single-point failures in centralized manufacturing are a root cause of product disruptions, as current events highlight. The COVID-19 pandemic revealed how point failures in the workforce or in raw materials disrupted centralized manufacturing, and shortages of hand sanitizer, rubber gloves, masks, basic medicines, and active pharmaceutical ingredients affected every American. International conflict with China and other adversarial countries has also exposed vulnerabilities in sole-source access to rare earth metals used in electronics, batteries, and displays, driving the need for manufacturing alternatives that do not rely on single points of supply. To offset this situation, the United States has access to workforce, raw materials, and waste streams geographically distributed across the country that biomanufacturing can harness to produce both health and industrial products needed by U.S. consumers. However, there are currently only limited efforts to develop the distributed manufacturing infrastructure needed to process those raw materials locally, leaving societal and economic value unrealized and national security risks unaddressed. Nation-scale parallel production in multiple facilities is needed to robustly create products that meet consumer demand in health, industrial, energy, and food markets.
The BioNETWORK inverts the traditional paradigm of a centralized biomanufacturing facility and its resident expertise. Instead, it delivers a decentralized, resilient network that lets members rapidly access manufacturing facilities, expertise, and data repositories wherever they reside within the system, integrating the substantial existing U.S. bioindustrial capabilities and resources to maximize nationwide outcomes. The BioNETWORK should be constructed as an aggregate of industrial, academic, financial, and nonprofit entities, organized into six regionally aligned nodes of biomanufacturing infrastructure (see figure below for a notional regional distribution) that together form a hub network cultivating collaboration, rapid technology advances, and workforce development in underserved communities. This design aligns with the CHIPS and Science Act's call for new regional technology development initiatives that expand the geographical distribution of innovative activity in the U.S. The BioNETWORK acts as the physical and information layer of manufacturing innovation, generating market forces and leveraging ubiquitous data capture and feedback loops to accelerate the innovation and scale-up necessary for full-scale production of novel biomaterials, polymers, small molecules, or the microbes themselves. As a secure network, BioNETWORK serves as the physical and virtual backbone of the constituent biomanufacturing entities and their customers, providing unified, distributed manufacturing facilities and the digital infrastructure to securely and efficiently exchange information and datasets, and enabling automated process development. Together, the nodes function in an integrated way to adaptively solve biotechnology infrastructure challenges and to load-balance supply chain constraints in real time as needs change.
This includes automated infrastructure provisioning of small, medium, or large biomanufacturing facilities, supply of regional raw materials, customization of process flow across the network, allocation of labor, and optimization of the economic effectiveness. The BioNETWORK also supports the implementation of a national, multi-tenant cloud lab and enables a systematic assessment of supply chain capabilities/vulnerabilities for biomanufacturing.
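In its simplest form, the real-time load balancing described above amounts to greedy assignment of production jobs to whichever node has the most spare capacity. The sketch below is purely illustrative: the node names, capacities, and jobs are invented for the example, and a real BioNETWORK scheduler would also weigh feedstock locality, transport cost, and facility capabilities.

```python
import heapq

# Toy greedy load balancer: send each job to the node with the most remaining
# capacity. Node names, capacities, and job demands are hypothetical.
def assign_jobs(node_capacity, jobs):
    # Max-heap on remaining capacity (negated for Python's min-heap).
    heap = [(-cap, node) for node, cap in node_capacity.items()]
    heapq.heapify(heap)
    assignment = {}
    for job, demand in jobs:
        neg_cap, node = heapq.heappop(heap)
        remaining = -neg_cap - demand
        if remaining < 0:
            # The largest node cannot fit the job, so no node can.
            raise ValueError(f"no node has capacity for {job}")
        assignment[job] = node
        heapq.heappush(heap, (-remaining, node))
    return assignment

nodes = {"East": 100, "Midwest": 80, "West": 60}  # capacity in fermenter-days
jobs = [("enzyme", 50), ("polymer", 40), ("api", 30)]
print(assign_jobs(nodes, jobs))
# → {'enzyme': 'East', 'polymer': 'Midwest', 'api': 'West'}
```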

Plan of Action
Congress should appropriate funding for an interagency coordination office co-chaired by the OSTP and the Department of Commerce (DOC) and provide $500 million to the DOC, Department of Energy (DOE), and Department of Defense (DOD) to initiate the BioNETWORK and use its structure to fulfill economic goals and create industrial growth opportunities within its three themes:
- Provide alternative supply chain pathways via biotechnologies and biomanufacturing to promote economic security. Leverage BioNETWORK R&D opportunities to develop innovative biomanufacturing pathways that could address supply chain bottlenecks for critical drugs, chemicals, and other materials.
- Explore distributed biomanufacturing innovation to enhance supply chain resilience. Leverage BioNETWORK R&D efforts to advance flexible and adaptive biomanufacturing platforms to mitigate the effects of supply chain disruptions.
- Address standards and data infrastructure to support biotechnology and biomanufacturing commercialization and trade. Leverage BioNETWORK R&D needed to enable data interoperability across the network to enable scale-up and increase global competitiveness.
To achieve these goals, the policy Plan of Action includes the following steps:
1. Congress should appropriate $10 million to establish an interagency coordination office within OSTP that is co-chaired by the DOC. This fulfills the White House Executive Order and CHIPS and Science Act mandates for better interagency coordination among the DOE, DOC, DOD, National Institute of Standards and Technology (NIST), and the National Science Foundation (NSF).
2. Congress should then appropriate $500 million to DOC and DOE to fund a biomanufacturing moonshot that includes creating the pilot network of three nodes to form the BioNETWORK in regions of the U.S. within six months of receiving funding. This funding should be managed by the interagency coordination office in collaboration with a not-for-profit organization whose mission is to build, deploy, and manage the BioNETWORK together with the federal entities. The role of the not-for-profit is to ensure that a trusted, unbiased partner (not influenced by outside entities) is involved, such that the interests of the taxpayer, U.S. government, and commercial sectors are all represented in the most beneficial way possible. The mission should include education, workforce development, safety/security, and sustainment as core principles, such that the BioNETWORK can stand alone once established. The new work to build the network should also synergize with the foundational science of the NSF and the national security focus of DOD biotechnology programs.
3. Continued investment of an additional $500 million should be appropriated by Congress to create economic incentives to sustain the BioNETWORK and transition it from public funding to full commercial operation. This step requires evaluation against concrete go/no-go milestones and deliverables to ensure on-time, on-budget operations. The interagency coordination office should work with DOC, DOE, DOD, and other agencies to leverage these incentives and create other opportunities to promote the BioNETWORK, so that it can attract private funding and sustain itself without continued public support.
Create a Pilot Network of Three Nodes
To accelerate beyond current biomanufacturing programs and efforts, the first three nodes of the BioNETWORK should be constructed in three disparate geographic regions (e.g., East, Midwest, West, or other locations with relevant feedstocks, workforce, or component infrastructure) to demonstrate the networking capabilities for distributed manufacturing. The scale of funding required to design, construct, and deploy the first three nodes is $500 million. The initiation and construction of the BioNETWORK should commence within six months. The DOE should lead the initiation and deployment of the technical construction of the BioNETWORK through Theme 2 of its Biomanufacturing goals, which “seeks alternative processes to produce chemicals and materials from renewable biomass and intermediate feedstocks by developing low-carbon-intensity product pathways and promoting a circular economy for materials.” Each node should create regional partnerships that have four entities (a physical manufacturing facility, a cell programming entity, an academic research and development entity, and a workforce/resource entity). All four entities will contain both physical facilities, such as industrial fermentation and wet lab space, and the workforce needed to run them. On top of the pilot nodes, a science and technology/engineering integrator of the system should be identified to coordinate the effort and lead security/safety efforts for the physical network. Construction of the initial BioNETWORK should be completed within two years.
Achievement of the BioNETWORK goals requires the design plan to:
- Leverage and use regional feedstocks and reagents across the U.S. as inputs to bioproduction to create robustness in the earliest steps of manufacturing.
- Automate the integrated use of small, intermediate, and large-scale biomanufacturing facilities so that they are effective, stable, scalable, and safe for biomanufacturing demand.
- Parallelize scheduling of infrastructure and resources to minimize delays in production and maximize capacity and product diversity.
- Incorporate methods for replication when it comes to selection and design of microbial strains, cell free systems, and sequences of known function.
- Reuse waste streams to create circular economies.
- Include infrastructure biomanufacturing standards from NIST.
The BioNETWORK construction milestones should fulfill the White House OSTP bold goals through new capabilities delivered via distributed manufacturing infrastructure:
- Networked data for distributed biomanufacturing—“establishing a Data Initiative to ensure that high-quality, wide-ranging, easily accessible, and secure biological data sets can drive breakthroughs for the U.S. bioeconomy.”
- Domestic distributed biomanufacturing infrastructure—“expanding domestic capacity to manufacture all the biotechnology products we invent in the United States and support a resilient supply chain.”
- Local hubs for workforce development—“growing training and education opportunities for the biotechnology and biomanufacturing workforce of the future.”
Full Network: Plan for Sustainability
Congress and executive branch agencies should establish economic incentives for commercial entities, state/local governments, and consumers of bioindustrial manufacturing products to create commercialization pathways that enhance local economies while also supporting the national network. These include tax credits, tax breaks, low-interest loans, and underwritten loans as a starting point. To facilitate tech transition, unique lab-to-market mechanisms and proven tools to address market failures and applied-technology gaps should be used in conjunction with those in the Inflation Reduction Act. These include prize and challenge competitions, market-shaping procurement or loan programs, streamlined funding of open, cross-disciplinary research, and funding at the state and local levels.
A new public-private partnership could coordinate across multiple efforts to ensure they drive toward rapid technology deployment and integration. This includes implementing a convertible debt plan that rewards BioNETWORK members with equity after reaching key milestones, providing an opportunity for discounted buyout by other investors during rounds of funding, and working with the federal government to design market-shaping mechanisms such as advance market commitments to guarantee purchase of a bioproduction company’s spec-meeting product.
Additionally, the BioNETWORK should be required to expand the repertoire of domestic renewable raw materials into a suite of high-demand, industry-ready products as prescribed in the DOC’s biomanufacturing goals. This will ensure that all regions have support for commercial goods, that domestic supply chain capabilities and vulnerabilities can be assessed automatically, and that compensatory remediation is provided on demand. The full BioNETWORK consists of six nodes, aligned to each of the major geographic regions and/or EDA regions in the United States, which have unique raw materials, workforce, infrastructure, and consumption of products that contribute to supporting overall network functionality. The full BioNETWORK should be active within five years of project initiation and be evaluated against phased milestones throughout.
Conclusion
Networked solutions are resilient and enduring. A single factory is at risk of transfer to foreign ownership, closure, or obsolescence. The BioNETWORK creates connectivity among distributed biomanufacturing physical infrastructure to form a network with a robust domestic value chain. Today’s biomanufacturing investments suffer from the need to vertically integrate due to lack of flexible capacity across the value chain, which raises capital requirements and overall risk. The BioNETWORK drives horizontal integration through the network nodes via new infrastructure, connecting physical infrastructure of the nodes within the system. The result is a multi-sided marketplace for biotechnology innovation, products, and commercialization.
The federal government should initiate a new program and select performers within the next six months to begin the research, development, and construction of the first three nodes of the BioNETWORK. Taking action to establish the BioNETWORK ensures that the United States has the necessary physical and virtual infrastructure to grow the bioeconomy and its international leadership in biotechnology. The BioNETWORK creates new job opportunities for people across the country where training in biotechnology expands the skill sets of people with broad-spectrum applicability from trades to advanced degrees. The BioNETWORK drives circular economies where raw materials from rural and urban centers enter the network and are transformed into high-value products such as advanced materials, pharmaceuticals, food, and energy. The BioNETWORK protects U.S. supply chain resiliency through distributed manufacturing and links regional development into a national capability to establish biomanufacturing as a pillar of economic and technological growth for today and into the 22nd century.
Establishment of the BioNETWORK scales, connects, and networks the impact of a hub and tailors it to the needs of bioindustrial manufacturing, which requires regional feedstocks and integration of small-, intermediate-, and large-scale industrial fermentation facilities scattered across the United States to form an end-to-end distributed biomanufacturing platform. Similar to the goals of the EDA hub program, the BioNETWORK will accelerate regional economic activity, workforce development, and re-establishment of domestic manufacturing. Leveraging activity of the EDA and NSF Biofoundries program is an opportunity for coordination across the interagency.
Retrofitting existing small-, intermediate-, and large-scale biomanufacturing facilities/plants is necessary to construct the connected BioNETWORK. This includes new/modified fermentation equipment, scale-up and purification hardware, software/communications for networking, transportation, load-balancing, and security infrastructure.
Clear, measurable intermediate milestones and deliverables are required to ensure that the BioNETWORK is on track. Every three months, key performance metrics and indicators should be used to demonstrate technical functionality. Planned economic and workforce targets should be established every year and tracked for performance. Adjustments to the technical and business plans should be implemented if needed to ensure the overarching goals are achieved.
A major outcome of the BioNETWORK program is that biomanufacturing in the United States achieves parity with the other traditional pillars of manufacturing, such as chemicals, food, and electronics. Workforce retraining to support this industry leads to new high-paying jobs as well as new consumer product sectors and markets with new avenues for economic growth. Failure to deploy the BioNETWORK leaves the United States vulnerable to supply chain disruption, little to no growth in manufacturing, and being outcompeted by China and other peer nations that are investing in and growing biotechnology.
Secondary milestones include key performance indicators such as increased capacity, decreased production time, robustness (more uptime versus downtime), lower costs, and better use of regional raw materials.
Towards a Well-Being Economy: Establishing an American Mental Wealth Observatory
Summary
Countries are facing dynamic, multidimensional, and interconnected crises. The pandemic, climate change, rising economic inequalities, food and energy insecurity, political polarization, increasing prevalence of youth mental and substance use disorders, and misinformation are converging, with enormous sociopolitical and economic consequences that are weakening democracies, corroding the social fabric of communities, and threatening social stability and national security. Globalization and digitalization are synchronizing, amplifying, and accelerating these crises globally by facilitating the rapid spread of disinformation through social media platforms, enabling the swift transmission of infectious diseases across borders, exacerbating environmental degradation through increased consumption and production, and intensifying economic inequalities as digital advancements reshape job markets and access to opportunities.
Systemic action is needed to address these interconnected threats to American well-being.
A pathway to addressing these issues lies in transitioning to a Well-Being Economy: one that better aligns and balances the interests of collective well-being and social prosperity with traditional economic and commercial interests. This paradigm shift encompasses a ‘Mental Wealth’ approach to national progress, recognizing that sustainable national prosperity requires more than economic growth and instead elevates and integrates social prosperity and inclusivity with economic prosperity. To embark on this transformative journey, we propose establishing an American Mental Wealth Observatory: a translational research entity that will provide the capacity to quantify and track the nation’s Mental Wealth and generate the transdisciplinary science needed to empower decision makers to achieve multisystem resilience, social and economic stability, and sustainable, inclusive national prosperity.
Challenge and Opportunity
America is facing challenges that pose significant threats to economic security and social stability. Income and wealth inequalities have risen significantly over the last 40 years, with the top 10% of the population capturing 45.5% of the total income and 70.7% of the total wealth of the nation in 2020. Loneliness, isolation, and lack of connection are a public health crisis affecting nearly half of adults in the U.S. In addition to increasing the risk of premature mortality, loneliness is associated with a three-fold greater risk of dementia.
Gun-related suicides and homicides have risen sharply over the last decade. Mental disorders are highly prevalent. Currently, more than 32% of adults and 47% of young people (18–29 years) report experiencing symptoms of anxiety and depression. The COVID-19 pandemic compounded the burden, with a 25–30% upsurge in the prevalence of depressive and anxiety disorders. America is experiencing a social deterioration that threatens its continued prosperity, as evidenced by escalating hate crimes, racial tensions, conflicts, and deepening political polarization.
To reverse these alarming trends in America and globally, policymakers must first acknowledge that these problems are interconnected and cannot effectively be tackled in isolation. For example, despite the tireless efforts of prominent stakeholder groups and policymakers, the burden of mental disorders persists, with no substantial reduction in global burden since the 1990s. This lack of progress is evident even in high-income countries where investments in and access to mental health care have increased.
Strengthening or reforming mental health systems, developing more effective models of care, addressing workforce capacity challenges, leveraging technology for scalability, and advancing pharmaceuticals are all vital for enhancing recovery rates among individuals grappling with mental health and substance use issues. However, policymakers must also better understand the root causes of these challenges so we can reshape the economic and social environments that give rise to common mental disorders.
Understanding and Addressing the Root Causes
Prevention research and action often focus on understanding and addressing the social determinants of health and well-being. However, this approach lacks focus. For example, traditional analytic approaches have delivered an extensive array of social determinants of mental health and well-being, which are presented to policymakers as imperatives for investment. These include (but are not limited to):
- Adverse early life exposures (abuse and neglect)
- Substance misuse
- Domestic, family, and community violence
- Unemployment
- Poverty and inequality
- Poor education quality
- Homelessness
- Social disconnection
- Food insecurity
- Pollution
- Natural disasters and climate change
This practice is replicated across other public health and social challenges, such as obesity, child health and development, and specific infectious and chronic diseases. When long lists of social determinants are each lobbied for investment, policymakers conclude that nations simply can’t afford to invest sufficiently to solve these health and social challenges.
However, it is likely that many of these determinants and challenges are merely symptoms of a more systemic problem. Treating the ongoing symptoms therefore only perpetuates a cycle of temporary relief, diverts resources away from nurturing innovation, and impedes genuine progress.
To create environments that foster mental health and well-being, where children can thrive and fulfill their potential, where people can pursue meaningful vocation and feel connected and supported to give back to communities, and where Americans can live a healthy, active, and purposeful life, policymakers must recognize that human flourishing and the prosperity of nations depend on a delicate balance of interconnected systems.
The Rise of Gross Domestic Product: An Imperfect Measure for Assessing the Success and Wealth of Nations
To understand the roots of our current challenges, we need to look at the history of the foundational economic metric, gross domestic product (GDP). While the concept of GDP had been established decades earlier, it was during a 1960 meeting of the Organization for Economic Co-operation and Development that economic growth became a primary ambition of nations. In the shadow of two world wars and the Great Depression, member countries pledged to achieve the highest sustainable economic growth, employment, efficiency, and development of the world economy as their top priority (Articles 1 & 2).
GDP growth became the definitive measure of a government’s economic management and its people’s welfare. Over subsequent decades, economists and governments worldwide designed policies and implemented reforms aimed at maximizing economic efficiency and optimizing macroeconomic structures to ensure consistent GDP growth. The belief was that by optimizing the economic system, prosperity could be achieved for all, allowing governments to afford investments in other crucial areas. However, prioritizing the optimization of one system above all others can have unintended consequences, destabilizing interconnected systems and leading to a host of symptoms we currently recognize as the social determinants of health.
As a result of the relentless focus on optimizing processes, streamlining resources, and maximizing worker productivity and output, our health, social, political, and environmental systems are fragile and deteriorating. By neglecting the necessary buffers, redundancies, and adaptive capacities that foster resilience, organizations and nations have unwittingly left themselves exposed to shocks and disruptions. Americans face a multitude of interconnected crises, which will profoundly impact life expectancy, healthy development and aging, social stability, individual and collective well-being, and our very ability to respond resiliently to global threats. Prioritizing economic growth has led to neglect and destabilization of other vital systems critical to human flourishing.
Shifting Paradigms: Building the Nation’s Mental Wealth
The system of national accounts that underpins the calculation of GDP is a significant human achievement, providing a global standard for measuring economic activity. It has evolved over time to encompass a wider range of activities based on what is considered productive to an economy. As recently as 1993, finance was deemed “explicitly productive” and included in GDP. More recently, Biden-Harris Administration leaders have advanced guidance for accounting for ecosystem services in benefit-cost analyses for regulatory decision-making and a roadmap for including natural capital in the nation’s economic accounts. This shows the potential to expand what counts as beneficial to the American economy—and what should be measured as a part of economic growth.
While many alternative indices and indicators of well-being and national prosperity have been proposed, such as the genuine progress indicator, the vast majority of policy decisions and priorities remain focused on growing GDP. Further, these alternative metrics often fail to build on the inherent value of the system of national accounts on which GDP is based. Mental Wealth addresses this by expanding the inputs of GDP to include well-being indicators. In addition to economic production metrics, Mental Wealth includes both unpaid activities that contribute to the social fabric of nations and social investments that build community resilience. These unpaid activities (Figure 1, social contributions, Cs) include volunteering, caregiving, civic participation, environmental restoration, and stewardship, and are collectively called social production. Mental Wealth also includes the sum of investment in community infrastructure that enables engagement in socially productive activities (Figure 1, social investment, Is). This more holistic indicator of national prosperity provides an opportunity to shift policy priorities towards greater balance between the economy and broader societal goals and is a measure of the strength of a Well-Being Economy.

Mental Wealth is a more comprehensive measure of national prosperity that monetizes the value generated by a nation’s economic and social productivity.
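The definition above can be written as a stylized identity. This is an illustrative formalization using Figure 1’s notation, not the initiative’s published specification:

```latex
\mathrm{MW}_t \;=\; \mathrm{GDP}_t \;+\; \underbrace{\sum_i p_i\, C_{s,i,t}}_{\text{social production}} \;+\; I_{s,t}
```

where \(C_{s,i,t}\) denotes hours devoted to unpaid socially productive activity \(i\) in year \(t\), \(p_i\) is an imputed wage rate used to monetize that activity, and \(I_{s,t}\) is social investment in community infrastructure.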
Valuing social production also promotes a more inclusive narrative of a contributing life, and it helps to rebalance societal focus from individual self-interest to collective responsibilities. A recent report suggests that, in 2021, Americans contributed more than $2.293 trillion in social production, equating to 9.8% of GDP that year. Yet social production is significantly underestimated due to data gaps. More data collection is needed to analyze the extent and trends of social production, estimate the nation’s Mental Wealth, and assess the impact of policies on the balance between social and economic production.
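The reported figures can be sanity-checked with simple arithmetic. This is a back-of-envelope sketch using only the two values cited above (the $2.293 trillion of social production and its 9.8% share of GDP):

```python
# Back-of-envelope check of the reported 2021 social production figures.
# Both input values come from the report cited in the text above.
social_production_usd = 2.293e12   # reported value of social production, 2021
share_of_gdp = 0.098               # reported share of 2021 GDP

# The implied GDP these two figures assume, and the resulting Mental Wealth
# (GDP plus social production, per the definition used in this memo).
implied_gdp = social_production_usd / share_of_gdp
mental_wealth = implied_gdp + social_production_usd

print(f"Implied 2021 GDP: ${implied_gdp / 1e12:.1f} trillion")
print(f"Mental Wealth:    ${mental_wealth / 1e12:.1f} trillion")
```

The implied GDP of roughly $23.4 trillion is consistent with official 2021 U.S. GDP of about $23 trillion, so the two reported figures cohere.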
Unlocking Policy Insights through Systems Modeling and Simulation
Systems modeling plays a vital role in the transition to a Well-Being Economy by providing an understanding of the complex interdependencies between economic, social, environmental, and health systems and by guiding policy actions. Systems modeling brings together expertise in mathematics, biostatistics, social science, psychology, economics, and more, along with disparate datasets and the best available evidence across multiple disciplines, to better understand which policies, across which sectors, will deliver the greatest benefits to the economy and society in balance. Simulation allows policymakers to anticipate the impacts of different policies, identify strategic leverage points, assess trade-offs and synergies, and make more informed decisions in pursuit of a Well-Being Economy. Forecasting and future projections are a long-standing staple of infectious disease epidemiologists, business and economic strategists, and government agencies such as the National Oceanic and Atmospheric Administration, whose projections help prepare the nation for the economic realities of climate change.
Plan of Action
An American Mental Wealth Observatory to Support Transition to a Well-Being Economy
Given the social deterioration that is threatening America’s resilience, stability, and sustainable economic prosperity, the federal government must systemically redress the imbalance by establishing a framework that privileges an inclusive, holistic, and balanced approach to development. The government should invest in an American Mental Wealth Observatory (Table 1) as critical infrastructure to guide this transition. The Observatory will report regularly on the strength of the Well-Being Economy as a part of economic reporting (see Table 1, Stream 1); generate the transdisciplinary science needed to inform systemic reforms and coordinated policies that optimize economic, environmental, health and social sectors in balance such as adding Mental Wealth to the system of national accounts (Streams 2–4); and engage in the communication and diplomacy needed to achieve national and international cooperation in transitioning to a Well-Being Economy (Streams 5–6).
This transformative endeavor demands the combined instruments of science, policy, politics, public resolve, social legislation, and international cooperation. It recognizes the interconnectedness of systems and the importance of a systemic and balanced approach to social and economic development in order to build equitable long-term resilience, a current federal interagency priority. The Observatory will make better use of available data from across multiple sectors to provide evidence-based analysis, guidance, and advice. The Observatory will bring together leading scientists (across disciplines of economics, social science, implementation science, psychology, mathematics, biostatistics, business, and complex systems science), policy experts, and industry partners through public-private partnerships to rapidly develop tools, technologies, and insights to inform policy and planning at national, state, and local levels. Importantly, the Observatory will also build coalitions between key cross-sectoral stakeholders and seek mandates for change at national and international levels.
The American Mental Wealth Observatory should be chartered by the National Science and Technology Council, building off the work of the White House Report on Mental Health Research Priorities. Federal partners should include, at a minimum, the Department of Health and Human Services (HHS) Office of the Assistant Secretary for Health (OASH), specifically the Office of the Surgeon General (OSG) and Office of Disease Prevention and Health Promotion (ODPHP); the Substance Abuse and Mental Health Services Administration (SAMHSA); the Office of Management and Budget; the Council of Economic Advisers (CEA); and the Department of Commerce (DOC), alongside strong research capacity provided by the National Science Foundation (NSF) and the National Institutes of Health (NIH).
Operationalizing the American Mental Wealth Observatory will require an annual investment of $12 million from diverse sources, including government appropriations, private foundations, and philanthropy. This funding would be used to implement a comprehensive range of priority initiatives spanning the six streams of activity (Table 2) coordinated by the American Mental Wealth Observatory leadership. Acknowledging the critical role of brain capital in upholding America’s prosperity and security, this investment offers considerable returns for the American people.
Conclusion
America stands at a pivotal moment, facing the aftermath of a pandemic, a pressing crisis in youth mental and substance use disorders, and a growing sense of disconnection and loneliness. The fragility of our health, social, environmental, and political systems has come into sharp focus, and global threats of climate change and generative AI loom large. There is a growing sense that the current path is unsustainable.
After six decades of optimizing the economic system for growth in GDP, Americans are reaching a tipping point where losses due to systemic fragility, disruption, instability, and civil unrest will outweigh the benefits. The United States government and private sector leaders must forge a new path. The models and approaches that guided us through the 20th century are ill-equipped to guide us through the challenges and threats of the 21st century.
This realization presents an extraordinary opportunity to transition to a Well-Being Economy and rebuild the Mental Wealth of the nation. An American Mental Wealth Observatory will provide the data and science capacity to help shape a new generation grounded in enlightened global citizenship, civic-mindedness, and human understanding and equipped with the cognitive, emotional, and social resources to address global challenges with unity, creativity, and resilience.
The University of Sydney’s Mental Wealth Initiative thanks the following organizations for their support in drafting this memo: FAS, OECD, Rice University’s Baker Institute for Public Policy, Boston University School of Public Health, the Brain Capital Alliance, and CSART.
Brain capital is a collective term for brain skills and brain health, which are fundamental drivers of economic and social prosperity. Brain capital comprises (1) brain skills, which include the ability to think, feel, work together, be creative, and solve complex problems, and (2) brain health, which includes mental health, well-being, and the neurological disorders that critically impact the ability to use brain skills effectively, to build and maintain positive relationships with others, and to remain resilient against challenges and uncertainties.
Social production is the glue that holds society together. These unpaid social contributions foster community well-being, support our economic productivity, improve environmental wellbeing, and help make us more prosperous and resilient as a nation.
Social production includes volunteering and charity work, educating and caring for children, participating in community groups, and environmental restoration—basically any activity that contributes to the social fabric and community well-being.
Making the value of social production visible helps us track how economic policies are affecting social prosperity and allows governments to act to prevent an erosion of our social fabric. So instead of just measuring our economic well-being through GDP, measuring and reporting social production as well gives us a more holistic picture of our national welfare. The two combined (GDP plus social production) make up what we call the overall Mental Wealth of the nation, which is a measure of the strength of a Well-Being Economy.
The Mental Wealth metric extends GDP to include not only the value generated by our economic productivity but also the value of this social productivity. In essence, it is a single measure of the strength of a Well-Being Economy. Without a Mental Wealth assessment, we won’t know how we are tracking overall in transitioning to such an economy.
Furthermore, GDP only includes the value created by those in the labor market. The exclusion of socially productive activities sends a signal that society does not value the contributions made by those not in the formal labor market. Privileging employment as a legitimate social role and indicator of societal integration leads to the structural and social marginalization of the unemployed, older adults, and the disabled, which in turn leads to lower social participation, intergenerational dependence, and the erosion of mental health and well-being.
Well-being frameworks are an important evolution in our journey to understand national prosperity and progress in more holistic terms. Dashboards of 50–80 indicators like those proposed in Australia, Scotland, New Zealand, Iceland, Wales, and Finland include things like health, education, housing, income and wealth distribution, life satisfaction, and more, which help track some important contributors to social well-being.
However, these sorts of dashboards are unlikely to compete with topline economic measures like GDP as a policy focus. Some indicators will go up, some will go down, some will remain steady, so dashboards lack the ability to provide a clear statement of overall progress to drive policy change.
We need an overarching measure. Measurement of the value of social production can be integrated into the system of national accounts so that we can regularly report on the nation’s overall economic and social well-being (or Mental Wealth). Mental Wealth provides a dynamic measure of the strength (and good management) of a Well-Being Economy. By adopting Mental Wealth as an overarching indicator, we also gain an improved understanding of the interdependence of a healthy economy and a healthy society.
Developing a Mentor-Protégé Program for Fintech SBLC Lenders
Summary
The Biden Administration has recognized that small businesses, particularly minority-owned small businesses, lack adequate access to capital. While SBA has operated its 7(a) Loan Program for multiple decades, the program has historically shown poor results in reaching minority-owned businesses and those in low- and moderate-income communities. Recently, the SBA has leveraged innovative fintech lenders to help fill this gap.
While the agency has finalized a rule that would allow fintech companies to participate in the 7(a) Loan Program, there are significant concerns that new entrants would put the program at risk due to a lack of internal controls and transparent evaluation. To help increase lending to low- and moderate-income communities while not increasing the overall risk to the 7(a) Loan Program, SBA should establish a mentor-protégé program and conditional certification regime for innovative financial technology companies to participate responsibly in the SBA’s 7(a) Loan Program and ensure that SBA adequately preserves the safety and soundness of the program.
The Challenge
The Biden Administration has recognized that small businesses, particularly minority-owned small businesses, lack adequate access to capital. While SBA has operated its 7(a) Loan Program for multiple decades, the program has historically shown poor results in reaching minority-owned businesses and those in low- and moderate-income communities. According to a 2022 Congressional Research Service report, “[i]n FY2021, 30.1% of 7(a) loan approvals ($10.98 billion of $36.54 billion) were [made] to minority-owned businesses (20.8% Asian, 6.0% Hispanic, 2.6% African American, and 0.7% American Indian)”.
SBA has previously made a concerted effort to increase 7(a) small business lending to underserved communities by establishing the Community Advantage (CA) 7(a) loan initiative. Launched as a pilot program in 2011 and subsequently reauthorized, the CA loan initiative has been successful in encouraging mission-driven nonprofit lenders to lend to underserved communities; however, the impact has been relatively small when compared to the traditional 7(a) loan program. In FY 2022, the CA Pilot Program approved just 722 loans totaling $114.8 million, whereas the general SBA 7(a) Loan Program approved 3,501 loans totaling $3,498,234,800—an order-of-magnitude difference.
The COVID-19 pandemic created an unprecedented demand for assistance to the country’s small businesses, as they were forced to close their doors and saw their revenues dwindle. Congress responded to this demand by passing the Coronavirus Aid, Relief, and Economic Security (CARES) Act, which established one of the largest government-backed lending programs ever, the Paycheck Protection Program (PPP). During the PPP, fintech lenders, which for this policy memo includes technology-savvy banks and nonbank financial institutions that operate online and through mobile applications, proved uniquely adept at serving small businesses in traditionally underserved communities, even without specific guidance to do so from the SBA.
Many of the borrowers assisted by fintech lenders did not have pre-existing borrowing relationships with a financial institution and were therefore deprioritized by traditional financial institutions offering PPP loans, which favored lending to small businesses with existing relationships. Previously published research showed that fintech lenders not only received more applications from Black- and Hispanic-owned businesses but also extended a significant amount of lending to these businesses. Fintech lenders therefore expanded the impact of the PPP to underserved borrowers and successfully bolstered the efforts of mission-driven lenders, such as Community Development Financial Institutions and Minority Depository Institutions. For example, Unity National Bank of Houston, a Minority Depository Institution, partnered with Cross River, a tech-focused bank that partners with fintech companies, to increase its lending from 500 loans to nearly 200,000 loans by leveraging Cross River’s lending technology. Similarly, Accion Opportunity Fund, a large Community Development Financial Institution, partnered with Lending Club, another tech-focused lender, to improve both entities’ lending operations to borrowers that were underserved during the first round of the PPP. However, Community Development Financial Institutions and Minority Depository Institutions often face challenges procuring and implementing the technology needed to help scale their nontraditional lending activities, which limits the efficacy of their mission-driven lending in an increasingly internet-based lending environment.
In an effort to increase access to capital and build on the efforts of fintech companies that successfully provided capital to small businesses in the Paycheck Protection Program, SBA has proposed lifting its moratorium on non-depository lenders participating in the program. SBA and the Biden Administration have shown real progress in removing the moratorium on Small Business Lending Company (SBLC) licenses, which would expand the eligible participants in the program to include fintech companies for the first time in 40 years.
Expanding access to capital and support for small businesses is a key priority for the Biden Administration. Specifically, the Administration noted the importance of expanding underserved small businesses’ access to capital. It recommended expanding the SBA’s 7(a) program by extending SBLC licenses to nonbank lenders, which include fintech companies, as one promising strategy. To this end, SBA has established a strategy of expanding its lending network by leveraging fintech companies. The SBA previously issued a proposed rulemaking to remove the moratorium on SBLC licenses and add three new categories of SBLC licenses.
However, policymakers and some industry participants have cast serious doubt on fintech companies’ participation in SBA’s 7(a) Loan Program, citing the weak internal controls of unpartnered fintech companies and the fraud experienced during the Paycheck Protection Program. These critics have also questioned the agency’s ability to properly oversee fintech companies and to manage the fraud risks of developing or expanding a lending program that includes unpartnered fintech companies. Overall, both SBA and fintech companies need to improve how they engage with each other to ensure that the program’s many requirements are adhered to and that SBA strengthens its ability to mitigate potentially new or unique risks to the 7(a) Loan Program.
The Plan of Action
To solve the aforementioned issues, SBA should establish a mentor-protégé program and conditional certification regime for innovative nonbank financial technology companies to participate responsibly in the SBA’s 7(a) Loan Program. By creating a mentor-protégé program and conditional certification regime, SBA can continue to encourage the expansion of the 7(a) Loan Program to lenders that have shown their willingness and ability to lend to traditionally underserved small business borrowers, while ensuring that the agency adequately preserves the safety and soundness of the 7(a) Loan Program.
In the proposed mentor-protégé program, SBA would conduct an initial assessment of the fintech applicant and provide a conditional certification contingent on the fintech’s participation in the mentor-protégé program. To ensure that only the most well-suited fintech companies are allowed to engage in the 7(a) program, SBA should conduct a fair lending assessment. This would include a gap analysis of the company’s lending processes, akin to the existing interagency fair lending examinations conducted by the federal banking regulators. Further, SBA should require fintech companies to complete a “Community Lending Plan” detailing the specific small business lending activities the fintech company intends to complete in traditionally underserved areas. SBA would conduct a review of applications it receives and match them with banks that are established 7(a) lenders.
To help ensure that both mentors and protégés develop throughout the program, SBA would need to create program criteria for both mentor banks and protégé fintech companies. Mentor criteria should focus on ensuring that mentor banks assist and grow the knowledge of their fintech protégés. Thus, both mentors and protégés should be required to complete periodic progress reports. Further, mentors should conduct their own periodic assessments of the fintech protégé’s compliance and lending processes to ensure that the fintech is able to comply with existing 7(a) Loan Program requirements and does not create an undue risk to the program. These criteria should be determined based on the expertise of the Office of Capital Access and Office of Credit Risk Management, with advice from SBA’s 8(a) Business Development Program staff. Lastly, to ensure that mentors and protégés can speak candidly about their experience with the other participant, SBA would need to create communication portals for both entities that are walled off from review by the other participant.
Recognizing the apprehension existing 7(a) lenders may feel about increased competition in the 7(a) lending market, the SBA would need to incentivize banks to provide mentorship services to fintech companies by providing participating mentor banks with Community Reinvestment Act (CRA) credit and an increased SBA guarantee threshold for the bank’s 7(a) loans. By pursuing these two incentives, the SBA would provide banks with clear business and regulatory benefits from participating in the mentor-protégé program.
Based on a review of the SBA’s 2023 Congressional Budget Justification, SBA has accounted for much of the increased cost that would stem from expanding the 7(a) Loan Program to additional SBLCs. SBA noted that part of its $93.6 million request for fiscal year 2023 was to attract new lenders that participated in the Paycheck Protection Program. Similarly, SBA identified the need to continue building its oversight of Paycheck Protection Program and Community Advantage lenders. To this end, SBA requested an additional $13.9 million for small business lender oversight. Establishing the 7(a) mentor-protégé program would likely require only a small amount of additional funds relative to the 2023 requested amount. To account for the additional programmatic and administrative requirements needed to establish the 7(a) mentor-protégé program, SBA should include an additional $500,000 to $1 million in its future Congressional Budget Justifications.
Success of the mentor-protégé program depends on robust program requirements and continuous monitoring to ensure the participants are adhering to the goal of responsibly expanding capital access to underserved small businesses. To accomplish this endeavor, SBA should leverage the internal expertise of its Office of Capital Access and Office of Credit Risk Management, while also coordinating with prudential and state financial services regulators to adequately understand the novel business models of fintech companies applying to and participating in the program. Interagency coordination between state and federal regulators will ensure that the 7(a) program’s integrity is maintained at the macro and micro levels.
Conclusion
Expanding small business lending to low- and moderate-income communities is an especially important endeavor. Few opportunities for real social and economic growth exist in these traditionally underserved communities without robust access to small business credit. While the importance of expanding access is clear, SBA has a responsibility to ensure that its flagship 7(a) Loan Program remains safe, sound, and available for the benefit of all small businesses. The recent decision to finalize rulemaking that would expand allowable lenders in the 7(a) Loan Program must come with careful consideration of which lenders should be able to participate. Incorporating fintech lenders presents an opportunity to solve the issues of small business lending to traditionally underserved communities. However, given the concerns identified throughout the rulemaking process and after its finalization, SBA should work diligently to ensure that only the best-suited entities are allowed to become 7(a) lenders. To help ensure that this occurs, SBA should create a mentor-protégé program that will afford fintech companies the best opportunity to succeed in the program while maintaining the safety and soundness that is so important to the overall success of the 7(a) Loan Program.
Expanding access to capital and support for small businesses is a key priority for the Biden Administration. Specifically, the Administration noted the importance of expanding underserved small businesses’ access to capital by expanding the SBA’s 7(a) program through extending SBLC licenses to nonbank lenders, which include fintech companies. To this end, SBA established a strategy of expanding its lending network by leveraging fintech companies. The SBA has since issued and finalized a rulemaking to remove the moratorium on SBLC licenses and add three new categories of SBLC licenses.
The SBA can conduct an initial assessment of the fintech applicant and provide a conditional certification contingent on the fintech’s participation in the mentor-protégé program. Further, the SBA should develop program criteria for both mentor banks and protégé fintech companies, along with application portals for both entities. SBA would conduct a review of the applications it receives. The SBA would incentivize banks to provide mentorship services to fintech companies seeking SBLC certification by providing banks with CRA credit and an increased SBA guarantee threshold for the bank’s 7(a) loans.
Providing equitable access to capital for underserved communities in our country will require actions beyond the scope of this policy recommendation, including changes to the regulations that govern community banks, fintech lenders, CDFIs, and other mission-driven lenders. Fintech lenders have a proven ability to contribute to this expansion of capital access, given their collective performance as PPP lenders. In addition, fintech lenders have an ability to scale the solutions that they provide quickly, something that CDFIs and other mission-led lenders have traditionally struggled to do well.
Fintech lenders compete with conventional lenders for market share; the SBA should take care not to create programs that give one competing group an advantage over another. Creating a bespoke program, tailored to the needs of fintech lenders, would run the risk of creating more than an incidental competitive advantage. Instead, this program proposal advocates for utilizing a mentorship model that helps build strategic partnerships to accelerate access to capital for underserved groups, without creating separate rules or carve-outs.
Leveraging Positive Tipping Points to Accelerate Decarbonization
Summary
The Biden Administration has committed the United States to net-zero emissions by 2050. Meeting this commitment requires drastic decarbonization transitions across all sectors of society at a pace never seen before. This can be made possible by positive tipping points, which demarcate thresholds in decarbonization transitions that, once crossed, ensure rapid progress towards completion. A new generation of economic models enables the analysis of these tipping points and the evaluation of effective policy interventions.
The Biden Administration should undertake a three-pronged strategy for leveraging the power of positive tipping points to create a larger-than-anticipated return on investment in the transition to a clean energy future. First, the President’s Council of Advisors on Science and Technology (PCAST) and the Council of Economic Advisers (CEA) should evaluate new economic models and make recommendations for how agencies can incorporate such models into their decision-making processes. Second, federal agencies should integrate positive tipping points into the research agendas of existing research centers and programs to uncover additional decarbonization opportunities. Finally, federal agencies should develop decarbonization strategies and policies based on insights from this research.
Challenge and Opportunity
Climate change brings us closer each year to triggering negative tipping points, such as the collapse of the West Antarctic ice sheet or the Atlantic Meridional Overturning Circulation. These negative tipping points, driven by self-reinforcing environmental feedback loops, significantly accelerate the pace of climate change.
Meeting the Biden Administration’s commitment to net-zero emissions by 2050 will reduce the risk of these negative tipping points but requires the United States to significantly accelerate the current pace of decarbonization. Traditional economic models used by the federal government and organizations such as the International Energy Agency consistently underestimate the progress of zero-emission technologies and the return on investment of policies that enable a faster transition, resulting in the agency’s “largest ever upwards revision” last year. A new school of thought presents “evidence-based hope” for rapidly accelerating the pace of decarbonization transitions. Researchers point out that our society consists of complex and interconnected social, economic, and technological systems that do not change linearly under a transition, as traditional models assume; rather, when a positive tipping point is crossed, changes made to the system can lead to disproportionately large effects. A new generation of economic models has emerged to support policymakers in understanding these complex systems in transition and identifying the best policies for driving cost-effective decarbonization.
At COP26 in 2021, leaders of countries responsible for 73% of world emissions, including the United States, committed to work together to reach positive tipping points under the Breakthrough Agenda. The United Kingdom and other European countries have led the movement thus far, but there is an opportunity for the United States to join as a leader in implementing policies that intentionally leverage positive tipping points and benefit from the shared learnings of other nations.
Domestically, the Inflation Reduction Act (IRA) and the Infrastructure Investment and Jobs Act (IIJA) include some of the strongest climate policies that the country has ever seen. The implementation of these policies presents a natural experiment for studying the impact of different policy interventions on progress towards positive tipping points.
How do positive tipping points work?
Figure 1. Diagram of a system and its positive tipping point. The levers for change on the left push the system away from the current high-emission state and towards a new net-zero state. As the system moves away from the current state, the self-reinforcing feedback loops in the system become stronger and accelerate the transition. At the positive tipping point, the feedback loops become strong enough to drive the system towards the new state without further support from the levers for change. Thus, policy interventions for decarbonization transitions are most crucial in the lead up to a positive tipping point. (Adapted from the Green Futures Network.)
Just as negative tipping points in the environment accelerate the pace of climate change, positive tipping points in our social, economic, and technological systems hold the potential to rapidly accelerate the pace of decarbonization (Figure 1). These positive tipping points are driven by feedback loops that generate increasing returns to adoption and make new consumers more likely to adopt (Figure 2):
- Learning by doing: As manufacturers produce more of a technology, they learn how to produce the technology better and cheaper, incentivizing new consumers to adopt it.
- Economies of scale: Manufacturers are able to realize cost savings as they increase their production capacity, which they pass on to consumers as lower prices that spur more demand.
- Social contagion: The more people adopt a new technology, the more likely other people will imitate and adopt it.
- Complementary technology reinforcement: As a technology is adopted more widely, complementary technology and infrastructure emerge to make it more useful and accessible.
The right set of policies can harness this phenomenon to realize significantly greater returns on investment and trigger positive tipping points that give zero-emission technologies a serious boost over incumbent fossil-based technologies.
Figure 2. Examples of positive feedback loops: (a) learning by doing, (b) social contagion, and (c) complementary technology reinforcement.
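The “learning by doing” loop above is often summarized quantitatively by Wright’s law, under which unit cost falls by a fixed percentage (the “learning rate”) with every doubling of cumulative production. The sketch below is purely illustrative; the cost figure and learning rate are hypothetical inputs, not data from this report:

```python
import math

def wright_cost(initial_cost: float, cumulative_units: float, learning_rate: float) -> float:
    """Unit cost under Wright's law after producing `cumulative_units`,
    starting from a cumulative production of 1 unit.

    `learning_rate` is the fractional cost drop per doubling of cumulative
    production (e.g., 0.20 means each doubling multiplies cost by 0.80).
    """
    b = -math.log2(1 - learning_rate)  # experience-curve exponent
    return initial_cost * cumulative_units ** (-b)

# Hypothetical example: a 20% learning rate applied over ten doublings
# (cumulative production grows from 1 to 2**10 = 1024 units).
cost = wright_cost(initial_cost=100.0, cumulative_units=2 ** 10, learning_rate=0.20)
print(round(cost, 1))  # → 10.7, i.e., 100 * 0.8**10
```

A 20% learning rate is roughly in the range historically reported for technologies like solar photovoltaics and lithium-ion batteries; at that rate, ten doublings of cumulative output cut unit cost by nearly 90%, which is the mechanism behind the “increasing returns to adoption” described above.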
One way of visualizing progress towards a positive tipping point is the S-curve, where the adoption of a new zero-emission technology grows exponentially and then saturates at full adoption. This S-curve behavior is characteristic of many historic energy and infrastructure technologies (Figure 3). From these historic examples, researchers have identified that the positive tipping point occurs between 10% and 40% adoption. Crossing this adoption threshold is difficult to reverse and typically guarantees that a technology will complete the S-curve.
Figure 3. The historic adoption of a sample of infrastructure and energy systems (top) and manufactured goods (bottom). Note that the sharpness of the S-curve can vary significantly. (Source: Systemiq)
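The S-curve dynamic can be captured in a toy model in which each year’s adoption growth is proportional both to the current share of adopters (the reinforcing feedback loops) and to the remaining pool of non-adopters (saturation). This is a simplified sketch with hypothetical parameters, not a forecast:

```python
def s_curve(years: int, initial_share: float = 0.01, growth: float = 0.5) -> list[float]:
    """Simulate adoption share year by year under logistic growth.

    Growth is proportional to share * (1 - share): weak at low adoption,
    fastest mid-transition, and saturating as share approaches 1.
    """
    share = initial_share
    path = [share]
    for _ in range(years):
        share += growth * share * (1 - share)  # reinforcing feedback, capped at full adoption
        path.append(share)
    return path

path = s_curve(20)
# Year in which adoption first crosses a hypothetical 10% tipping threshold.
tip_year = next(i for i, s in enumerate(path) if s >= 0.10)
```

In this toy run, adoption takes several years to climb from 1% to the assumed 10% threshold, then completes most of the transition within the following decade, mirroring the slow-then-fast shape of the historical curves in Figure 3.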
For example, over the past two decades, the Norwegian government helped build electric vehicle (EV) charging infrastructure (complementary technology) and used taxes and subsidies to lower the price of EVs below that of gas vehicles. As a result, consumers began purchasing the cheaper EVs, and over time manufacturers introduced new models of EVs that were cheaper and more appealing than previous models (learning by doing and economies of scale). This led to EVs skyrocketing to 88% of new car sales in 2022. Norway has since announced that it would start easing its subsidies for EVs by introducing two new EV taxes for 2023, yet EV sales have continued to grow, taking up 90% of total sales so far in 2023, demonstrating the difficult-to-reverse nature of positive tipping points. Norway is now on track to reach a second tipping point that will occur when EVs reach price parity with gas vehicles without assistance from taxes or subsidies.
Due to the interconnected nature of social and technological systems, triggering one positive tipping point can potentially increase the odds of another tipping point at a greater scale, resulting in “upward-scaling tipping cascades.” Upward-scaling tipping cascades can occur in two ways: (1) from a smaller system to a larger system (e.g., as more states reach their tipping point for EV adoption, the nation as a whole gets closer to its tipping point) and (2) from one sector to another. For the latter, researchers have identified three super-leverage points that policymakers can use to trigger tipping cascades across multiple sectors:
- Light-duty EVs → heavy-duty EVs and renewable energy storage: The development of cheaper batteries for light-duty EVs will enable cheaper heavy-duty EVs and renewable energy storage thanks to shared underlying battery technology. The build-out of charging infrastructure for light-duty EVs will also facilitate the deployment of heavy-duty EVs.
- Green ammonia → heavy industries, shipping, and aviation: The production of green ammonia requires green hydrogen as an input, so the growth of the former will spur the growth of the latter. Greater production of green hydrogen and green ammonia will catalyze the decarbonization of the heavy industries, shipping, and aviation sectors, which use these chemicals as fuel inputs.
- Traditional and alternative plant proteins → land use: Widespread consumption of traditional and alternative plant proteins over animal protein will reduce pressure on land-use change for agriculture and potentially restore significant amounts of land for conservation and carbon sequestration.
The potential for this multiplier effect makes positive tipping points all the more promising and critical to understand.
Further research to identify positive tipping points and tipping cascades and to improve models for evaluating policy impacts holds great potential for uncovering additional decarbonization opportunities. Policymakers should take full advantage of this growing field of research by integrating its models and insights into the climate policy decision-making process and translating insights from researchers into evidence-based policies.
Plan of Action
In order for the government to leverage positive tipping points, policymakers must be able to (1) identify positive tipping points and tipping cascades before they occur, (2) understand which policies or sequences of policies may be most cost-effective and impactful in enabling positive tipping points, and (3) integrate that insight into policy decision-making. The following recommendations would create the foundations of this process.
Recommendation 1. Evaluate and adopt new economic models
The President’s Council of Advisors on Science and Technology (PCAST) and the Council of Economic Advisers (CEA) should conduct a joint evaluation of new economic models and case studies to identify where new models have proven more accurate for modeling decarbonization transitions and where gaps remain. They should then issue a report recommending opportunities for funding further research on positive tipping points and new economic models, and advise subagencies responsible for modeling and projections, such as the Energy Information Administration within the Department of Energy (DOE), on how to adopt these new economic models.
Recommendation 2. Integrate positive tipping points into the research agenda of federally funded research centers and programs.
There is a growing body of research coming primarily from Europe, led by the Global Systems Institute and the Economics of Energy Innovation and Systems Transition at the University of Exeter and Systemiq, that is investigating global progress towards positive tipping points and different potential policy interventions. The federal government should foster the growth of this research area within the United States in order to study positive tipping points and develop models and forecasts for the U.S. context.
There are several existing government-funded research programs and centers that align well with positive tipping points and would benefit synergistically from adding this to their research agenda:
- Power, buildings, and heavy industries: The National Renewable Energy Laboratory (NREL), funded by the DOE, has an energy analysis research program that conducts analyses of future systems scenarios, market and policy impacts, sustainability, and techno-economics. This research program would be a good fit for taking on research on positive tipping points in the power, buildings, and heavy industries sectors.
- Transportation: The National Center for Sustainable Transportation (NCST), funded by the Department of Transportation, conducts research around four research themes—Environmentally Responsible Infrastructure and Operations, Multi-Modal Travel and Sustainable Land Use, Zero-Emission Vehicle and Fuel Technologies, and Institutional Change—in order to “address the most pressing policy questions and ensure our research results are incorporated into the policy-making process.” The policy-oriented focus of the NCST makes it a good candidate for conducting research on positive tipping points in the transportation sector under the theme of Institutional Change and translating research results into policy briefs.
- Food and agriculture: The Agriculture and Food Research Initiative’s Sustainable Agriculture Systems program funds research projects that take a systems approach to studying how to promote transformational change in the U.S. food and agriculture system in the face of a changing climate. In the next Request for Applications, the program should include research on positive tipping points in food and agricultural systems as a topic that the program will fund.
- General: The Science and Technology Policy Institute (STPI), a Federally Funded Research and Development Center (FFRDC) sponsored by the National Science Foundation, conducts research to inform policy decisions by the White House Office of Science and Technology Policy. STPI is another potential candidate for conducting research on positive tipping points in a variety of sectors.
Recommendation 3. Use insights from positive tipping points research to develop and implement policies to accelerate progress towards positive tipping points
Researchers have already identified three super-leverage points around which the federal government should consider developing and implementing policies. As future research is published, PCAST should make further recommendations on actions that the federal government can take to leverage positive tipping points.
Super-Leverage Point #1: Mandating Zero-Emission Vehicles (ZEVs)
ZEV mandates require car manufacturers to sell a rising proportion of ZEVs within their light-duty vehicle sales. Ensuring a growing supply of ZEVs results in falling costs and rising demand. Evidence of the effect of such policies in U.S. states, Canadian provinces, and China, along with future projections, suggests that ZEV mandates are a crucial policy lever for ensuring a full EV transition. Such policies rely on the reallocation of private capital rather than government spending, making them particularly cost-effective. Combined with the investments in EV manufacturing and public charging infrastructure in the IRA and IIJA, a national ZEV mandate could radically boost the EV transition.
A national ZEV mandate is unlikely to pass Congress anytime soon. However, the recently proposed Environmental Protection Agency (EPA) greenhouse gas emissions standards for passenger cars and trucks would effectively require 67% of car sales to be ZEVs by 2032 in order for car manufacturers to comply with the regulations. The proposed standards would provide regulatory strength behind the Biden Administration’s goal that 50% of new cars sold by 2030 be ZEVs. The EPA should finalize these standards as soon as possible at or above the currently proposed stringency.
The proposed EPA standards are projected to result in a 50% reduction in the price of EV batteries by 2035. This will have knock-on effects on the cost of batteries for renewable energy storage and battery electric trucks and other heavy-duty vehicles, which would likely bring forward the cost parity tipping point for these technologies by a number of years.
Super-Leverage Point #2: Mandating Green Ammonia Use in Fertilizer Production
Ammonia is the primary ingredient for producing nitrogen-based fertilizer and requires hydrogen as an input. Traditionally, this hydrogen is produced from natural gas, and the production of hydrogen for ammonia accounts for 1% of global CO2 emissions. Green hydrogen produced from water and powered by renewable energy would enable the production of green ammonia for nitrogen-based fertilizers.
Based on a DOE tipping point analysis, green ammonia production is one of the most promising areas for initial large-scale deployment of green hydrogen, thanks to its ability to use established ammonia supply chains and economies of scale. Green ammonia production also has one of the lowest green premia in the hydrogen economy. Green ammonia production will enable infrastructure development and cost reductions for green hydrogen to decarbonize other sectors, including shipping, aviation, and heavy industries like steel.
The Biden Administration should set a target for green ammonia production for domestic fertilizer in the Federal Sustainability Plan similar to India’s draft hydrogen strategy requiring 20% green ammonia production by 2027–2028. The EPA should then propose Clean Air Act carbon emission limits and guidelines for nitrogen-based fertilizer production plants, similar to the recently proposed standards for coal and natural gas power plants, to provide regulatory strength behind that target. These limits would effectively require fertilizer plants to blend a growing percentage of green ammonia into their production line in order to meet emission limits. According to the DOE, the clean hydrogen production tax credit in the IRA has enabled cost parity between green ammonia and fossil-based ammonia, so the EPA should be able to set such limits without increasing food production costs.
Super-Leverage Point #3: Public Procurement to Promote Plant and Alternative Proteins
Shifting protein consumption from meat to plant and alternative proteins can reduce emissions from livestock farming and reduce land-use change for meat production. Plant proteins refer to protein-rich plants, such as nuts and legumes, and traditional products made from those plants, such as tofu and tempeh. Alternative proteins currently on the market include plant- and fermentation-based protein products intended to mimic the taste and texture of meat. Studies show that if plant and alternative proteins reach a tipping point of 20% market share, 7–15% of land currently used for agriculture could be freed for conservation and restored as a carbon sink.
Public procurement of alternative proteins for federal food programs leverages government spending power to support this nascent market and introduce new consumers to alternative proteins, thus increasing their accessibility and social traction. Last year, the National Defense Authorization Act established a three-year pilot program for the U.S. Navy to offer alternative protein options. The California state legislature also invested $700 million to support schools in procuring more plant-based foods and training staff on how to prepare plant-based meals.
The United States Department of Agriculture (USDA) is a major procurer of food through collaboration between the Agricultural Marketing Service (AMS) and the Food and Nutrition Service (FNS) and distributes the majority of procured food through the Child Nutrition Programs (CNPs), especially the National School Lunch Program (NSLP). Currently, AMS does not procure any traditional or alternative protein products made from plant protein, but USDA guidelines do allow traditional and alternative protein products to fulfill meat/meat alternate requirements for CNPs. The AMS should develop product specifications and requirements for procuring these types of products and assist traditional and alternative protein companies in becoming USDA food vendors. The FNS should then launch a pilot program spending, for example, 1% of its procurement budget on traditional and alternative protein products. This should be supported by education and training of food service workers at schools that participate in the NSLP on how to prepare meals using traditional and alternative proteins.
Conclusion
The sooner that positive tipping points that accelerate desired transitions are triggered, the sooner that decarbonization transitions will be realized and net-zero goals will be met. Early intervention is crucial for supporting the growth and adoption of new zero-emission technologies. The recommendations above present the foundations of a strategy for leveraging positive tipping points and accelerating climate action.
Acknowledgements
I’d like to acknowledge Erica Goldman for her generous feedback and advice on this piece and for her thought leadership on this topic at FAS.
The key conditions for triggering a positive tipping point are affordability, attractiveness, and accessibility of new zero-emission technologies compared to incumbents. Affordability is often the most crucial condition: achieving price parity with incumbent technologies (with and then without the support of taxes and subsidies) can unlock rapid growth and adoption. Attractiveness refers to consumer preferences about a new technology’s performance, complementary features, or ability to signal social values. Accessibility refers to whether supporting infrastructure or knowledge, such as charging stations for EVs or recipes for cooking alternative proteins, is commonly available to support adoption. Due to the relative nature of these conditions, policymakers can influence them either by making the new technology more affordable, attractive, and accessible or by making the incumbent technology less affordable, attractive, and accessible. Often, a combination of both approaches is required to achieve the optimal effect.
States can cooperate to identify and coordinate policies that activate upward-scaling tipping cascades into other states and eventually the federal government. A promising example of this is the growing adoption of California’s Advanced Clean Cars II EV sales mandate by Vermont, New York, Washington, Oregon, Rhode Island, New Jersey, Maryland, and soon Colorado, Massachusetts, and Delaware.
Social contagion, mentioned above, is a powerful type of feedback loop that can drive the spread of not just technology adoption but also new behaviors, opinions, knowledge, and social norms. Through social contagion, social movements can form that wield greater influence than the sum of their individual members. That influence can then translate into demands for government and industry action to decarbonize. A prime example is Greta Thunberg and the Fridays for Future student movement. Another example is the Social Tipping Point Coalition, which in 2021 rallied over 100 scientists, universities, nongovernmental and grassroots organizations, and other individuals to petition the new Dutch parliament to implement new climate policies.
Industry has a direct hand in creating the conditions for a positive tipping point through their business models, technological development, and production. Industries are more likely to invest in adopting and improving low- and zero-carbon technologies and practices if the government clearly signals that it will back the transition, resulting in positive, reinforcing “ambition loops” between government climate policy and industry climate action. Industry coordination is also key to ensuring that new technologies are complementary and that infrastructure supporting a technology is developed alongside the technology itself. For example, coordination between EV companies is necessary to develop compatible charging mechanisms across manufacturers. Coordination between charging companies and EV companies can help charging companies identify which geographies have the greatest demand for chargers.
International coordination strengthens positive feedback loops and accelerates cost reductions for green technologies. For example, a recent study suggests that if the three largest car markets—the United States, Europe, and China—implement zero-emission vehicle (ZEV) sales mandates (i.e., requirements that an increasing percentage of each car manufacturer’s sales must be EVs), EVs will be able to reach cost parity with gas vehicles five years sooner than in the scenario without those ZEV mandates.
The U.S. Global Change Research Program’s 2022–2031 Strategic Plan includes tipping points and nonlinear changes in complex systems as two of its research priorities. Specifically, the Strategic Plan highlights the need to investigate “the potential for beneficial tipping points” and incorporate research on nonlinearity in economics-based models to evaluate societal decisions in future National Climate Assessments. However, it will take another four to five years to produce the next National Climate Assessment under this strategic plan. (The fifth National Climate Assessment, which is expected to be published this fall, was drafted before the new strategic plan was published.) Thus, additional executive and agency action is necessary to operationalize positive tipping points in the federal government before the next National Climate Assessment is released.
The federal government currently collects some data on the sales and adoption rates of the relatively more mature clean energy technologies, such as electric vehicles. A 2022 Bloomberg report attempted to identify “early-stage tipping points,” at around 5% adoption, for 10 clean energy technologies, reflecting when adoption becomes measurably exponential, and to compare adoption curves across countries globally. Beyond adoption rates, a number of additional factors indicate progress towards positive tipping points, such as the number of companies investing in a zero-emission technology or the number of states adopting regulations or incentives that support zero-emission technologies in a sector. Tracking these indicators can help policymakers sense when a system is approaching a positive tipping point. The nonprofit Systems Change Lab currently tracks the adoption of decarbonization technologies and factors that affect decarbonization transitions on a global scale. Philanthropic funding or a public-private partnership with the Systems Change Lab could leverage their existing infrastructure to track tipping point indicators on a national scale for the United States.
Approaching a positive tipping point first requires a system to become destabilized in order to make change possible. Once a positive tipping point is crossed, the system then accelerates towards a new state and begins to restabilize. However, the destabilization during the transition can have unintended consequences due to the rapid shift in how social, economic, and technological systems are organized and how resources are distributed within those systems. Potential risks include economic precarity for people employed in rapidly declining industries and resulting social instability and backlash. This can potentially exacerbate inequality and undesirable social division. As such, policies ensuring a just transition must be implemented alongside policies to accelerate positive tipping points. Research on the interaction between these policies is currently ongoing. It is essential that decisions to develop policies that accelerate movement towards positive tipping points always consider and evaluate the potential for unintended consequences.
Building the Talent Pipeline for the Energy Transition: Aligning U.S. Workforce Investment for Energy Security and Supply Chain Resilience
Summary
With the passage of the Infrastructure Investment and Jobs Act (IIJA), the CHIPS and Science Act, and the Inflation Reduction Act (IRA), the United States has outlined a de facto industrial policy to facilitate and accelerate the energy transition while seeking energy security and supply chain resilience. The rapid pace of industrial transformation driven by the energy transition will manifest as a human capital challenge, and workforce investment must be realigned to the industrial policy that is rapidly transforming the labor market. The energy transition, combined with nearshoring, will rapidly retool the global economy and, with it, the skills and expertise necessary for workers to succeed in the labor market. A rapid, massive, and ongoing overhaul of workforce development systems will allow today’s and tomorrow’s workers to power the transition to energy security, resilient supply chains, and the new energy economy, but only if workers have training opportunities scaled to match the needs of industry.
Policymakers and legislators recognize this challenge, yet strategies and programs often sit in disparate parts of government agencies in labor, trade, commerce, and education. A single strategy that coordinates a diverse range of government policies and programs dedicated to training this emerging workforce can transform how young people prepare for and access the labor market and equip them with the tools to have a chance at economic security and well-being.
Modeled after the U.S. Department of Labor’s (DOL) Trade Adjustment Assistance Community College and Career Training (TAACCCT) program, we propose the Energy Security Workforce Training (ESWT) Initiative to align existing U.S. government support for education and training focused on the jobs powering the energy transition. The Biden-Harris Administration should name an ESWT Coordinator to manage and align domestic investments in training and workforce across the federal government. The coordinator will spearhead efforts to identify skills gaps with industry, host an ESWT White House Summit to galvanize private and social sector commitments, encourage data normalization and sharing between training programs to identify what works, and ensure funds from existing programs scale evidence-based, sector-specific training programs. ESWT should also include an international component for nearshored supply chains, performing a similar function in target countries like Mexico and promoting two-way learning between domestic and international agencies on successful workforce training investments in clean energy and advanced manufacturing.
Challenge and Opportunity
With the passage of the Infrastructure Investment and Jobs Act and the Inflation Reduction Act, the United States has a de facto industrial policy to facilitate and accelerate the energy transition while seeking energy security and supply chain resilience. However, our current workforce investments are not focused on the growing green skills gap. We require workforce investment aligned to the industrial policy that is rapidly transforming the labor market, to support both domestic jobs and the foreign supply chains that domestic jobs depend on.
Preparing Americans to Power the Energy Transition
The rapid pace of industrial transformation driven by the energy transition will manifest as a human capital challenge. The energy transition will transform and create new jobs, requiring a massive investment to skill up the workers who will power it. Driving this rapid transition are billions of dollars slated for incentives and tax credits for renewable energy and infrastructure, advanced manufacturing, and supply chain creation for goods like electric vehicle batteries over the coming years. The vast upheaval caused by the energy transition combined with nearshoring is transforming both current jobs and the labor market young people will enter over the coming decade. The jobs created by the energy transition have the potential to shift a whole generation into the middle class while providing meaningful, engaging work.
Moving low-income students into the middle class over the next 10 years will require that education and training institutions meet the rapid pace of industrial transformation required by the energy transition. Education and training providers struggle to keep up with the rapid pace of industrial transformation, resulting in skills gaps. Skills gaps are the distance between the skills graduates leave education and training with and the skills required by industry. Skills gaps rob young people of opportunities and firms of productivity. And according to LinkedIn’s latest Green Economy report, we are facing a green skills gap—with the demand for green skills outpacing the supply in the labor force. Firms have cited skills gaps in diverse sectors related to the energy transition, including infrastructure, direct air capture, electromobility, and geothermal power.
Graduates with market-relevant skills earn between two and six times what their peers earn, based on evaluations of International Youth Foundation (IYF) programming. In addition, effective workforce development lowers recruitment, selection, and training costs for firms—thereby lowering the transaction costs of moving people, at scale, into the positions needed to power the energy transition. Industrial transformation for the energy transition involves automation, remote sensing, and networked processes that are changing the role of the technician, who is no longer required to execute tasks directly but instead to manage the automated processes and robots that now execute them. This shift changes the fundamental skills required of technicians to include higher-order skills for managing processes and robots.
We will not be able to transform industry or seize the opportunities of the new energy future without overhauling education and training systems to build the skills required by this transformation and the industries that will power it. Developing higher-order thinking skills means changing not only what is taught but how teaching happens. For example, students may be asked to evaluate and make actionable recommendations to improve energy efficiency at their school. Because many of these new jobs require higher-order thinking skills, policy investment can play a crucial role in supporting workers and those entering the workforce to be competitive for these jobs.
Creating Resilient Supply Chains, Facilitating Energy Security, and Promoting Global Stability in Strategic Markets
Moving young people into good jobs during this dramatic economic transformation will be critical not only in the United States but also abroad, promoting our interests by (1) creating resilient supply chains, (2) securing critical minerals, and (3) avoiding extreme labor market disruptions in the face of a global youth bulge.
Concerns about supply chain resilience are driving the nearshoring of industrial production—shifting the demand for industrial workers across geographies at striking scale and speed—as more manufacturing and heavy industry moves back into the United States' sphere of influence. The energy transition combined with nearshoring will rapidly retool the global economy, reworking complex, multinational supply chains before our eyes. For example, the port of entry in Santa Teresa, New Mexico, is undergoing rapid expansion in anticipation of explosive growth in imports of spare parts for electric vehicles manufactured in Mexico. These shifting supply chains will require the strategic development of a new workforce, which in turn demands a rapid, massive, and ongoing overhaul of workforce development systems at home and abroad.
The United States requires compelling models to increase its soft power as it secures critical minerals for the energy transition. Securing these minerals will again reshape energy supply chains, as the mineral deposits needed for the energy transition are not necessarily located in the countries with large oil, gas, or coal deposits; instead, they are concentrated in China, the Democratic Republic of the Congo, Australia, Chile, Russia, and South Africa. The United States therefore needs additional levers to establish productive relationships with these countries, and workforce investments can be an important source of soft power.
Today's 1.2 billion young people make up the largest and most educated generation the world has ever seen, or will ever see, yet they face unemployment rates nearly triple those of adults: globally, the youth unemployment rate is 17.93%, versus 6.18% for adults. (The youth unemployment rate refers to young people aged 15–24 who are available for or seeking employment but who are unemployed.) While rich countries have already passed through their own baby booms, with accompanying "youth bulges," and collected their demographic dividends to power economic growth and wealth, much of the developing world is going through its own demographic transition. While South Korea experienced sustained prosperity once its baby boomers entered the labor force in the early 2000s, Latin America's youth bulge is just entering the labor force, and in regions like Central America this demographic change is fueling a wave of outmigration. In Sub-Saharan Africa, the youth bulge is making its way through compulsory education amid increasing demands for government policy to address high rates of youth unemployment. It is an open question whether today's youth bulges will drive prosperity as they enter the labor market; policymakers must shape labor force training and government policy rooted in demonstrable industry needs to meet this challenge. Meanwhile, green jobs are already among the most rapidly growing occupations. The International Energy Agency (IEA) projects that adopting clean energy technologies will generate 14 million jobs by 2030, with 16 million more to retrofit and construct energy-efficient buildings and manufacture new energy vehicles, and the World Economic Forum's 2023 Future of Jobs Report cites the green transition as the key driver of job growth. However, the developing world is not making the corresponding investments in training programs for the green jobs that are driving this growth.
Alignment with Existing Initiatives
The Biden-Harris Administration's approach to the energy transition, supply chain resilience, and energy security must address this human capital challenge. Systemic approaches to building skills for the energy transition through education and training complement the IRA's incentivized apprenticeships and focus investments from the IIJA by building out a complete technical and vocational education and training system oriented toward the skills the energy transition requires. We propose a whole-of-government approach that focuses public investment in workforce training on the energy transition and nearshoring, using effective workforce development practices to address the growing green skills gap that endangers youth employment, the energy transition, energy security, and supply chain resilience.
The Biden-Harris Administration's Roadmap to Support Good Jobs demonstrates a commitment to building employment and job training into the Investing in America Agenda. The Roadmap catalogs programs throughout the federal government, authorized in recent legislation, that address employment and workforce training and are meant to create more opportunities for workers to engage with new technology, advanced manufacturing, and clean energy. Some programs have cross-sector reach, like the Good Jobs Challenge, authorized in the American Rescue Plan to invest in workforce partnerships, which reached 32 states and territories; others are more targeted to specific industries, like the Battery Workforce Initiative, which engages industry in developing a battery manufacturing workforce. The Roadmap's clearinghouse of related workforce activities across the federal ecosystem presents a meaningful opportunity to advance this commitment by coordinating and strategically implementing these programs under a single set of objectives and metrics.
Identifying evidence-driven training programs can also help fill the gap between practicums and market-based job needs by giving more students access to practical training than apprenticeships alone can reach, since apprenticeships can carry high individual transaction costs for grantees to coordinate. Additionally, programs like the Good Jobs Challenge required grantees to complete a skills-gap analysis to ensure their programs fit market needs. The Administration should embed the capability to conduct skills-gap analyses before competitive grants are requested and issued, both to inform program and grant design from the beginning and to share that learning with the broader workforce training community. By using a coordinated initiative to engage across these programs and legislative mandates, the Administration can create a more catalytic, scalable whole-of-government approach to workforce training.
Collaborating on metrics can also help identify which programs are most effective at meeting the core metrics of workforce training—increased income and job placements—which often are not met in workforce programs. This initiative could be measured across programs and agencies by (1) the successful hiring of workers into quality green jobs, (2) the reduction of employer recruitment and training costs for green jobs, and (3) demonstrable decreases in identified skills gaps—as opposed to a diversity of measures without clear comparability that correspond to the myriad agencies and congressional committees that oversee current workforce investments. Better transferable data measured against comparable metrics can empower agencies and Congress to direct continued funds toward what works to ensure workforce programs are effective.
The DOL's Trade Adjustment Assistance Community College and Career Training (TAACCCT) program provides a model of how the United States has successfully invested in workforce development to respond to labor market shocks in the past. Building on TAACCCT's legacy and lessons learned, we propose focusing investment in workforce training on identified skills gaps in partnership with industry, engaging employers from day one, rather than primarily targeting investment based on participant eligibility. When investing to bridge critical skills gaps in the labor market, strategies and programs must be designed to work with the most marginalized communities (including rural, tribal, and Justice40 communities) to ensure equitable access and participation.
Increased interagency collaboration is required to meet the labor market demands of the energy transition, both for domestic production in the United States and for the greening of international supply chains from Mexico to South Africa. Our proposed global youth workforce strategy, the Energy Security Workforce Training Initiative outlined below, provides a timely opportunity for the Administration to make progress on its economic development, workforce, and climate goals.
Plan of Action
We propose a new Energy Security Workforce Training (ESWT) Initiative to coordinate youth workforce development training investments across the federal government, focused on the critical and nearshored supply chains that will power energy security. ESWT will be charged with coordinating U.S. government workforce strategies to build the pipeline of young people into the jobs powering the energy transition. It will rework existing education and training institutions to build critical skills and to transform how young people are oriented to, prepared for, and connected to those jobs. ESWT will play a critical role in cross-sector and intergovernmental learning, investing in what works and ensuring that federal workforce investments, made in collaboration with industry, address the skills gaps identified in the labor market for the energy transition and resilient supply chains. Research and industry confirmation would inform investments by the Department of Energy (DOE), Department of Education (ED), Department of Commerce (DOC), and Department of Labor (DOL) toward building identified critical skills through scalable means, with marginalized communities in mind. A key facet of ESWT will be to normalize and align the metrics by which federal, state, and local partners measure program effectiveness, allowing better comparability and long-term potential for scaling the most evidence-driven programs.
The ESWT should be coordinated by the National Economic Council (NEC) and DOC, particularly the Economic Development Administration. Once established, ESWT should also include an international component focused on workforce investments that build resilience in the nearshored supply chains on which U.S. manufacturing and energy security rely. Mexico should serve as the initial pilot for this global initiative because of its intertwined relationship with U.S. supply chains for products like EV batteries. Piloting a novel international workforce training program through private sector collaboration and U.S. Agency for International Development (USAID), DOL, and U.S. International Development Finance Corporation (DFC) investments could help bolster resilience for domestic jobs and manufacturing. Based on these results, ESWT could expand into other geographies with critical supply chains, such as Chile and Brazil. To launch ESWT, the Biden Administration should pursue the following steps.
Recommendation 1. The NEC should name an ESWT Initiative Coordinator in conjunction with a DOC or DOL lead who will spearhead coordination between different agency workforce training activities.
With limited growth in government funding expected over the coming years, a key challenge will be coordinating existing programs and funds more effectively in service of training young people for demonstrated skills gaps in the marketplace. As new programs are implemented through existing legislation, a central entity in charge of coordinating implementation, learning, and investments can best ensure that funds are directed equitably and effectively. This initial designation can also lay the groundwork for building capacity within the federal government to conduct market analyses and consult with industries, better informing program design and grant-making across the country. The DOC and its Economic Development Administration seem best positioned to lead this effort, with an existing track record through the Good Jobs Challenge and the capacity to engage fully with industry to build trust that curricula and training are delivered by people whom employers verify as experts. However, the DOL could also take a co-lead role due to authorities established under the Workforce Innovation and Opportunity Act (WIOA). In selecting lead agencies for ESWT, these criteria should be followed:
- Access to emerging business intelligence regarding industry-critical skills—DOC, DOE
- Combined international and domestic remit—DOE/DOL, DOC (ITA)
- Remit that allows department to focus investment on demonstrated skills gaps, indicated by higher wages and churn—DOC
- Permitted to convene advisory committees from the private sector under the Federal Advisory Committee Act—DOC
Recommendation 2. The DOC and NEC, working with partner agencies, should collaborate to identify and analyze skills gaps and establish private-sector feedback councils to consult on real-time industry needs.
As a first step, DOC should commission or conduct research to identify quantitative and qualitative skills gaps related to the energy transition in critical supply chains, both domestically and in key international markets: energy efficiency in advanced manufacturing, electric vehicle production, steel, batteries, rare earth minerals, construction, infrastructure, and clean energy. DOC should budget for 20 skills gap assessments covering critical occupational groups (those with a high volume of jobs and uncertainty about required, relevant skills) in these sectors. At roughly $100,000 per assessment, the total investment would be $2 million over a six-to-twelve-month period. Each skills gap assessment will determine the critical and scarce skills in a labor market for a given occupation and the degree to which existing education and training providers meet the demand for those skills.
This research is central to forming effective programs to ensure investments align with industry skills needs and to lower direct costs on education providers, who often lack direct expertise in this form of analysis. Commissioning these studies can help build a robust ecosystem of labor market skills gap analysts and build capacity within the federal government to conduct these studies. By completing analysis in advance of competitive grant processes, federal grants can be better directed to training based on high-need industry skill sets to ensure participating students have market-driven employment opportunities on completion. The initial research phase would occur over a six-month timeline, including staffing and procurement. The ESWT coordinator would work with DOC, ED, and DOL to procure curricula, enrollment, and foreign labor market data. Partner agencies in this effort should also include the Departments of Education, Labor, and Energy. The research would draw upon existing research on the topic conducted by Jobs for the Future, IYF, the Project on Workforce at Harvard, and LinkedIn’s Economic Graph.
Recommendation 3. Host the Energy Security Workforce Development White House Summit to galvanize public, private, and social sector partners to address identified skills gaps.
The ESWT coordinator would present the identified quantitative and qualitative skills gaps at an action-oriented White House Summit with industry, state and local government partners, education providers, and philanthropic institutions. The Summit could serve as a youth-led gathering focused on workforce development and upskilling for critical new industries and galvanize a call to action across sectors and localities. Participants would be asked to prioritize among potential choices based on research findings, available funding mechanisms, and the imperative to transform education and training systems at scale and at the pace of industrial transformation. Addressing the identified skills gaps will require partnering with, and securing the buy-in of, both educational institutions and industry groups to identify which skills unlock opportunities in given labor markets, develop demand-driven training, and expand the capacity of education and training providers. Aligning interests as well as curricula gives key players the incentives and capacity to continually update curricula—creating lasting change at scale. The Summit would also serve as a call to action for private sector partnerships to invest in reskilling workers and would establish buy-in from the public and civil society actors.
Recommendation 4. Establish standards and data sharing processes for linking existing training funds and programs with industry needs by convening state and local grantees, state agencies, and federal government partners.
ESWT should lay out a common series of metrics by which the federal government will assess workforce training programs to better equip efforts to scale successful programs with comparable evidence and empower policymakers to invest in what works. We recommend using the following metrics:
- Successful hiring of workers into quality green jobs
- The reduction of employer recruitment and training costs for green jobs
- Demonstrable decreases in identified skills gaps
The second and third metrics will rely on ongoing industry consultations as well as data from the Bureau of Labor Statistics. Because existing skills gap analyses are diffuse across federal grantees and workforce training programs, ESWT should serve as a convenor for learning between jurisdictions. Federal data clearinghouse models could be effective, as could direct sharing of evidence and results between education providers against a common set of metrics.
Recommendation 5. Ensure grants and investments in workforce training are tied to addressing specific identified skills gaps, not just to regional employment rates.
A key function of ESWT would be to determine feasible and impactful strategies to address skills gaps in critical supply chains, given the identified gaps, existing funding mechanisms, the buy-in of critical actors in key labor markets (both domestic and international), agency priorities, and the imperative to make transformative change at scale. The coordinator could help spur agencies to pursue flexible procurement and grant-making focused on outcomes and tied to clear skills gap criteria, ensuring that training demonstrably develops the skills the market needs for the energy transition and growing domestic supply chains. While the Good Jobs Challenge required a skills gap analysis of grantees, advance analyses by the ESWT Initiative could inform grant requirements to ensure federal funds are directed to high-need programs. Because many of these fields are new, innovative funding mechanisms could be used to meet identified skills gaps and experiment with new training programs through tiered evidence models. Establishing criteria for successful workforce training programs could also serve as a demand-pull signal that the federal government is willing and able to invest in certain types of training, crowding in new players and private sector resources to create programs tailored to the skills industry needs.
Depending on the local context, the key players, and the nature of the strategy to bridge the skills gap for each supply chain, the coordinating department will determine what financing mechanism and issuing agency is most appropriate: compacts, grants, cooperative agreements, or contracts. For example, to develop skills related to worker safety in rare-earth mineral mines in South Africa or South America, the DOL could issue a grant under the Bureau of International Labor Affairs. To develop the data science skills critical for industrial and residential energy efficiency, the ED could issue a grants program to replicate Los Angeles Unified School District’s Common Core-aligned data science curriculum.
Recommendation 6. Congress should authorize flexible workforce training grants to disburse—based on identified industry needs—toward evidence-driven, scalable training models, and should fund ESWT within the DOC to facilitate continued industry skills needs assessments.
Congress should establish dedicated staff and infrastructure for ESWT to oversee workforce training investments and actively analyze industry needs to inform federal workforce investment strategies. Congress and the Administration should also explore how to incentivize public-private partnerships and requirements for energy, manufacturing, and supply chain companies to engage in curriculum development efforts or provide technical expertise to access tax credits included in the IRA or CHIPS.
Recommendation 7. ESWT could also incorporate an international perspective for nearshored supply chains critical to energy security and advanced manufacturing.
To pilot this model, we recommend:
- Bilateral coordination of federal workforce and training investments across agencies like State, USAID, and DFC: Mexico could serve as an ideal pilot country due to its close ties with U.S. supply chains and growth in the manufacturing sector. This coordination effort should direct USAID and other government funding toward workforce training for industries critical to domestic supply chains for energy security and green jobs.
- Two-way learning between domestic and international workforce programs: As ESWT develops effective strategies to address the skills gap for the energy transition, the interagency initiative will identify opportunities for two-way learning. For example, as curricula for electric vehicle assembly are developed and piloted in Mexico with support from USAID, they could inform U.S.-based community colleges' work with the DOL and DOE.
- If successful, expand to additional aligned countries, including Brazil, India, and South Africa and nations throughout the Americas that source energy and manufacturing inputs for the green economy: ESWT could facilitate scalable public-private partnership vehicles for partner country governments, private corporations, philanthropies, and nongovernmental organizations to collaborate on and fund country-dedicated programs to train their energy and climate workforces. This step could be done in conjunction with naming a Special Envoy at the State Department to coordinate diplomatic engagement with partner countries. The Envoy and Coordinator should have expertise and experience in North and South American economic relations and diplomacy, as well as labor market economics. Congress could incorporate dedicated funds for ESWT into annual appropriations at State.
Conclusion
The transition from an economy fueled by human and animal labor to one fueled by fossil fuels took roughly 200 years (1760–1960) and was associated with massive labor market disruptions as society and workers adjusted to a retooled economy. Avoiding similar disruptions as we transition off fossil fuels over decades, not centuries, will require concentrated, coordinated action. The Energy Security Workforce Training Initiative will overhaul education and training systems to develop the skills needed to reduce greenhouse gas emissions in the labor markets central to long-term U.S. energy security and to ensure that supply chains are resilient to shocks. Such a coordinated investment in training will lower recruitment, selection, and training costs for firms while increasing productivity and moving people into the middle class through the jobs fueling the energy transition.
By focusing federal workforce funding on addressing the green skills gap, we will be able to address the human capital challenges implicit in scaling the infrastructure, manufacturing overhaul, and supply chain reconfiguration necessary to secure a just transition, both at home and abroad. By building in critical international supply chains both for manufacturing and energy security from day one, the ESWT Initiative incorporates two-way learning as a means to knit together strategic supply chains through bilateral investments in equitable workforce initiatives.
Existing investments in workforce development are fragmented and are not oriented toward building the workforce needed for a net-zero carbon world with secure energy supplies and resilient supply chains. This collaboration model ensures that workforce investments are aligned with the aim of net-zero carbon by 2050 and are targeted to the domestic and international labor markets essential to achieving that aim, energy security, and supply chain resilience.
The coordinator role follows the model of the Feed the Future Coordinator, created in 2009 in response to global food insecurity, after the G8's L'Aquila Joint Statement on Global Food Security recognized the need for a greater focus on food security and mobilized $20 billion over three years for global agriculture and development.
This role would ensure that programs are aligned around a common goal and that progress toward that goal is measured. The NEC would oversee the Coordinator's work, and ultimately the Coordinator would work with Congress and the NEC to develop authorization language.
Instead of creating a new fund or program requiring congressional authorization, the ESWT strategy would align existing workforce investments across government with the Administration’s aim of net-zero greenhouse gas emissions by 2050.
Skills gaps are a persistent problem around the world, as education and training systems struggle to keep up with changing demands for skills. Simply investing in training systems, without addressing the underlying causes of skills gaps, will not close them. Instead, investment must be tied to the development of market-demanded skills. In IYF's experience, this requires understanding quantitative and qualitative skills gaps, developing an industry consensus around priority skills, and driving changes to curricula, teaching practices, and student services to orient and train young people for opportunities.
Our proposed unified approach to workforce development for the energy transition aligns with the priorities of the former Congress's House Subcommittee on Higher Education and Workforce Investment, the U.S. Strategy to Combat Climate Change through International Development, and the Congressional Action Plan for a Clean Energy Economy and a Healthy, Resilient, and Just America.
Systemic workforce approaches that engage the public, private, and civil sectors spur catalytic investments and bring new partners to the table in line with USAID’s commitment to drive progress, not simply development programs. However, there has been little concentrated investment to build the necessary skills for the energy transition. A coordinated investment strategy to support systemic approaches to build the workforce also aligns with USAID’s localization agenda by:
- Building the capacity of local Technical Vocational Education and Training systems to develop the workforce that each country needs to meet its zero-emission commitments while continuing to grow its economy.
- Developing the capacity of local organizations, whose mission will be to facilitate workforce development efforts between the public, private and civil sectors.
- Incentivizing industrial policy changes to include workforce considerations in decarbonization plans.
- Creating increased opportunities to generate and share evidence on successful workforce strategies and programs. To keep up with this rapid transformation of the economy, it will be essential to share information, lessons learned, and effective approaches across international, multilateral, and bilateral organizations and through public-private partnerships. For example, the Inter-American Development Bank has identified the Just Transition as a strategic priority and is working with LinkedIn to identify critical skills. As Abby Finkenauer, the State Department's Special Envoy for Global Youth Issues, has long championed, bringing domestic and international lessons together will be critical to making a more inclusive decarbonized economy possible.
Climate-Smart Cattle: US Research and Development Will Improve Animal Productivity, Address Greenhouse Gases, and Hasten Additional Market Solutions
Summary
Cattle in the United States release the greenhouse gas methane ("enteric methane") from their digestive systems, in an amount equivalent to the methane that leaks from fossil fuel infrastructure. Addressing enteric methane in cattle represents an opportunity to reduce the U.S. greenhouse gas footprint by 3% while simultaneously improving cattle productivity by roughly 6%. However, current solutions address, at most, 10% of these emissions, and the United States has spent under $5 million per year on R&D over the past five years to address this critical climate area.

Therefore, to establish long-term U.S. leadership and export competitiveness, we recommend regulatory simplification and an $82 million per year U.S. Department of Agriculture research and innovation program. These common-sense recommendations would create a win for producers and a win for the environment by advancing solutions that drop easily into existing farm practices and convert avoided methane into increased milk and meat production.
Challenge & Opportunity
Cattle and other ruminants digest their food via anaerobic (oxygen-free) fermentation. This unique system allows them to digest roughage such as grasses and other forage and transform it into meat and milk. But it also generates methane. Cattle release on average 6% of the calories they eat as methane, a substantial loss in their potential meat and milk productivity. This methane is in addition to the methane emitted by their manure.
An invisible and odorless gas, methane is a powerful greenhouse gas responsible for 0.5°C of the 1°C of modern global warming (based on the 2010–2019 average). One-third of U.S. anthropogenic methane emissions come from cattle and other ruminants. It may be possible to develop solutions that disrupt enteric methane production while also increasing cattle productivity, which would help reduce global temperatures and benefit both producers and consumers. Currently, there are a few tested and marketable solutions that use chemicals to disrupt methane-creating microbes in the cattle's first stomach (the rumen). These are important solutions that need to be evaluated for regulatory approval. However, additional research and development is needed to address the majority of emissions that lack available solutions, particularly from cattle grazing in pastures, and to continue developing solutions that consistently deliver a productivity benefit. Focused scientific research could deepen our understanding of cattle metabolism and advance new solutions for reducing enteric methane further.
Progress on this front also requires improved research tools to measure how much methane cattle emit and to relate those emissions to productivity and feed and forage intake. Access to such tools enables researchers and innovators to develop and evaluate new solutions. Methane emission rates vary widely between cattle of the same breed on the same farm, as well as across breeds. Currently these tools are expensive and not widely available: the primary tool measures about twenty cattle per day, costs ~$100,000, and can be found at only a handful of research institutions. That presents a practical access problem not only for producers but also for non-specialist scientific innovators. Making these tools more accessible, for example via fee-for-service centers at leading U.S. land-grant institutions, would make them more affordable for producers and researchers. That would help unlock the creativity of U.S. innovators and provide evidence that their solutions have a positive climate impact, are feasible for producers, and are acceptable to consumers.
Even when new solutions are found and proven, innovators still face a 10-year FDA approval process, which is uncompetitive and restrictive compared to other countries. Since much faster approval is possible in Australia, Brazil, and Europe, innovators have an incentive to launch their products and build their businesses there rather than in the United States. And as climate-aware export markets develop, slow FDA approval will cost U.S. producers market share and market opportunity. We therefore recommend that the FDA be given authority and direction to evaluate new methane-reducing products for safety on an accelerated timeline, while maintaining critical human and animal safety standards. This would help position the U.S. as a global leader in a potential multi-billion-dollar market while upholding its climate commitments.

Minimizing peak temperatures requires livestock enteric methane research today.
Plan of Action
I. FUND BASIC & APPLIED LIVESTOCK ENTERIC METHANE RESEARCH
Total Funds Needed: $50,000,000 per year
Developing science-based, effective livestock enteric methane solutions depends on a detailed understanding of cattle microbiology as well as a practical understanding of what makes solutions easy to adopt. These solutions could not only decrease enteric methane emissions but also unlock a new frontier of efficiency for the U.S. livestock sector, helping build a more resilient and productive food system. Increased funding for basic and applied research could accelerate development of new methods and rapidly build a portfolio of scalable potential solutions. Capacity funding will increase the near- and long-term throughput of solution development and shorten the idea-to-market timeline for these products. Competitive funding will drive innovation in sectors and geographies with significant implementation barriers, such as pasture operations, and can accelerate adoption of proven solutions. The Committee on Appropriations has recognized the innovation that increased public funds can make possible and has encouraged USDA-NIFA to prioritize advancement of enteric fermentation solutions.
We recommend that competitive and capacity funding within USDA-NIFA, including AFRI, Hatch, Animal Health and Disease, and other programs, be appropriated for:
Basic research in livestock methane microbiology to create a knowledge base that will support development of new win-win solutions and accelerate our understanding of host-microbiome interactions.
Applied research into livestock methane solutions, building on that knowledge base. This work should prioritize solutions that reduce methane in new ways; that simultaneously increase the production of milk or meat; and that have the potential to be delivered in a long-duration (e.g., once-per-year) product formulation compatible with grazing cattle. Such technology already exists for cattle nutrition and disease prevention.
Surveys and other social science research to understand barriers and opportunities for low-cost, low-complexity implementation by American producers and ranchers. This research will help guide the development of new solutions and tailor their design and deployment across the diversity of U.S. operations. Together, this will maximize the global market potential of U.S. innovation.
We also recommend that Congress request from USDA a full accounting and report of its current spending on enteric methane R&D across all of its programs.
II. CREATE PUBLIC FEE-FOR-SERVICE TESTING FACILITIES FOR LIVESTOCK METHANE
Funds Needed: $15,000,000 per year
Access to methane testing facilities, from the laboratory to the dairy barn, is a bottleneck: it limits how many innovative ideas can be tested. Only a small number of institutions worldwide have the tools needed to measure methane, and outside access to those tools is limited. We recommend that funding for innovation-enabling research infrastructure be authorized and appropriated to USDA-ARS through USDA Equipment Grants and USDA-AFRI. This funding would:
Establish a nationwide network of fee-for-access livestock methane research facilities. This would equip USDA-ARS laboratories with measurement equipment and technical staff by partnering with U.S. land-grant universities that already possess the necessary research-cattle management expertise. Joint investment with those universities, plus partial support from research users, would quickly make the U.S. an international leader in livestock methane research.
Develop a national center for pre-livestock testing and screening of potential products, operated as a user facility. Specialized cattle researchers should not be the only ones able to test new ideas for reducing livestock enteric methane; accessible facilities can unlock innovation from the U.S.'s world-leading biology researchers.

Livestock methane production is invisible: current livestock methane measurement equipment costs about $100,000 for a system that measures 20-30 cattle per day.
III. FUND DEVELOPMENT OF LOW-COST CATTLE METHANE MEASUREMENT TECHNOLOGY
Funds needed: $15,000,000 per year
What is measured guides innovation and management: what we measure easily and consistently, we improve. Producers measure milk production on every cow, every day, which has contributed to a 300% productivity increase since 1950. But for all producers and most researchers, livestock methane production is invisible: current measurement equipment costs about $100,000 for a system that measures 20-30 cattle per day. We recommend authorizing and appropriating $15 million per year to USDA-NIFA, Division of Animal Systems, in order to:
Develop lower-cost measurement systems so every research barn can measure livestock methane. U.S. land-grant universities have over ten thousand research cattle; equipped with measurement systems, they could all generate livestock methane research data.
Develop farm-integrable measurement systems that make methane emissions and their costs visible to U.S. producers, enabling them to experiment and innovate. Methane is a loss for livestock production; if producers can see it, they will work to decrease methane and improve their bottom line.
A $15 million annual budget for this technology development would drive rapid improvements. Part of the funding would support interdisciplinary projects that bring together engineers from across industry and livestock experts. We recommend that another part be framed as a grand challenge with cost and performance targets, connected to a government procurement market-shaping program.
IV. MODERNIZE THE U.S. FOOD, DRUG, AND COSMETIC ACT
Funds Needed: $2,000,000 per year
Current anti-methane feed additives are regulated as drugs, requiring a ten-year approval process. As European export markets increasingly regulate emissions, this may leave U.S. products uncompetitive. To address this, Congress asked the FDA to review options for accelerating the approval of environmentally beneficial additives. One mechanism for shortening the regulatory timeline is to amend the existing approval pathway for feed additives. Legislation has been introduced (the Innovative Feed Enhancement and Economic Development Act of 2023) that would, in part, amend the Federal Food, Drug, and Cosmetic Act to include Zootechnical Animal Feed Substances as a category under the feed additive petition process. This could reduce the approval timeline for environmentally beneficial additives five-fold.
We recommend that Congress continue to support modernization of the Federal Food, Drug, and Cosmetic Act, and authorize and appropriate an additional $2 million per year to the Food and Drug Administration's Center for Veterinary Medicine for the personnel and infrastructure needed to robustly evaluate new anti-methane solutions for safety and efficacy and make them available to farmers.
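As a consistency check, the funded recommendations above sum to the $82 million annual figure cited in the summary (Recommendation V, below, requests no additional funding). A minimal sketch; program labels are paraphrased for brevity:

```python
# Annual funding requests (USD) from the recommendations, per the text.
programs = {
    "I. Basic & applied enteric methane research (USDA-NIFA)": 50_000_000,
    "II. Fee-for-service testing facilities (USDA-ARS)": 15_000_000,
    "III. Low-cost measurement technology (USDA-NIFA)": 15_000_000,
    "IV. FDA Center for Veterinary Medicine capacity": 2_000_000,
}

total = sum(programs.values())
print(f"Total: ${total:,} per year")  # matches the $82 million summary figure
```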
V. SUPPORT ADOPTION OF ENTERIC METHANE MITIGATION STRATEGIES THROUGH EXISTING PROGRAMS
Funds Needed: No Additional Funding
In a recent survey, fewer than 30% of U.S. producers indicated they would be willing to adopt an enteric methane solution if they had to bear the cost. Government or other funding assistance was the second most important factor influencing use of potential solutions, behind increased productivity. The Environmental Quality Incentives Program (EQIP), the flagship program administered by the USDA Natural Resources Conservation Service, can provide financial assistance for implementing conservation practices, including practices that reduce greenhouse gases. To promote adoption of enteric methane mitigation solutions, we recommend USDA-NRCS:
Review conservation practice standards to include new enteric methane mitigation solutions where applicable, and add mechanisms to incentivize established methods of reducing enteric methane (e.g., lipid supplementation). Update practice standards regularly so that new solutions are rapidly incorporated as they are approved for use, and train technical assistance providers on implementing enteric methane mitigation strategies.
Enteric methane is responsible for ~15%, or 0.16°C, of current warming. Protein production from animal agriculture is expected to increase in the coming decades to meet growing total and per capita consumption. Early research on methane-mitigating feed additives has demonstrated enteric methane reductions of up to 90% in animal trials, and technology nearing regulatory approval has demonstrated 20-30% reductions. However, these solutions are not yet applicable to grazing cattle. With increased research and deployment efforts, enteric methane mitigation can help meet future protein demand with fewer animals and reduce overall warming by more than 10%.
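The "more than 10%" figure follows from simple arithmetic on the numbers above. This is an illustrative sketch only: it treats warming contribution as proportional to emissions cuts, a simplification (though a reasonable one over multi-decade horizons for a short-lived gas like methane):

```python
# Figures from the text: enteric methane accounts for ~15% (~0.16 °C) of
# current warming; near-approval technologies cut emissions 20-30%, and
# early-stage research has shown reductions of up to 90% in animal trials.
enteric_share_of_warming = 0.15

for efficacy in (0.20, 0.30, 0.90):
    overall_reduction = enteric_share_of_warming * efficacy
    print(f"{efficacy:.0%} enteric cut -> ~{overall_reduction:.1%} less overall warming")
```

At 90% efficacy with broad adoption, the overall warming reduction works out to roughly 13.5%, consistent with the "more than 10%" claim.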
Today, no products are approved by the FDA to reduce enteric methane emissions. However, some nutritional approaches are effective, including feeding higher amounts of lipids, which increase the calories available to the animal without promoting methane production. But lipids can be expensive for producers, and to protect animal health, no more than a few percent of an animal's diet can come from lipids.
Other products currently being investigated include chemicals and natural products such as 3-NOP, seaweed, and even probiotics. While dietary lipid modification and supplementation with feed additives show promise in feedlot and confined-operation settings, none of the emerging solutions are applicable to grazing systems. Research areas of interest include breeding strategies for low-methane-producing animals, anti-methane vaccines, and novel delivery mechanisms for grazing animals.
Methane emissions from manure depend largely on whether the manure is exposed to air (methane-producing microbes are not productive in oxygen-rich environments). Grazing animals, for instance, generate very little manure methane because their manure is deposited over large areas and exposed to open air. In confined operations like large dairies, manure is often flushed with water or scraped into a holding pond before it is applied to fields as fertilizer. These liquid manure lagoons quickly become anaerobic (oxygen-free) and are an ideal environment for methane-producing microbes.
Some enteric methane mitigation compounds could in theory also reduce manure lagoon emissions; however, the compounds would have to survive the animal's digestive tract. It is also possible that some compounds could decrease enteric emissions but increase manure emissions. While this has not been demonstrated, prudent experimentalists include it in research studies. Growing efforts to reduce methane from large manure lagoons include covering the lagoon and capturing the renewable biogas for use as transportation fuel or electricity production, or processing the manure to separate solids from liquids and composting the solids to reduce emissions.
Using Other Transactions at DOE to Accelerate the Clean Energy Transition
Summary
The Department of Energy (DOE) should leverage its congressionally granted other transaction authority to its full statutory extent to accelerate the demonstration and deployment of innovations critical to the clean energy transition. To do so, the Secretary of Energy should encourage DOE staff to consider using other transactions to advance the agency's core missions, and DOE's Office of Acquisition Management should provide resources to educate program and contracting staff on the opportunity that other transactions present. Doing so would unlock a less-used but important tool for demonstrating and accelerating critical technology developments at scale with industry.
Challenge and Opportunity
OTs are an underleveraged tool for accelerating energy technology.
Our global and national clean energy transition requires advancing novel technology innovations across transportation, electricity generation, industrial production, carbon capture and storage, and more. If we hope to hit our net-zero emissions benchmarks by 2030 and 2050, we must do a far better job commercializing innovations, mitigating the risk of market failures, and using public dollars to crowd in private investment behind projects.
The Biden Administration and the Department of Energy, empowered by Congress through the Inflation Reduction Act (IRA) and the Bipartisan Infrastructure Law (BIL), have taken significant steps to meet these challenges. The Loan Programs Office, the Office of Clean Energy Demonstrations, the Office of Technology Transitions, and many more dedicated public servants are working hard towards the mission set forward by Congress and the administration. They are deploying a range of grants, procurement contracts, and tax credits to achieve their goals, and there are more tools at their disposal to accelerate a just, clean energy transition. The large sums of money appropriated under BIL and IRA require new ways of thinking about contracting and agreements.
Congress has given several federal agencies the authority to use flexible agreements known as other transactions (OTs). Importantly, OTs are defined by what they are not: they are not government contracts or grants, and thus are not governed by the Federal Acquisition Regulation (FAR). Historically, NASA and DoD have been the most frequent users of other transaction authorities, including for projects like NASA's Commercial Orbital Transportation Services program, which developed the Falcon 9 launch vehicle, and the Global Hawk program at DARPA.
In contrast, the Department of Energy has used OTs infrequently, and even when it has, the programs have achieved no notable outcomes in support of the agency's mission. When DOE has used OTs, it has interpreted its authority as constraining it to cost-sharing research agreements. This restricts the creativity of agency staff in executing OTs; all the law says is that an OT is not a grant or contract. By limiting itself to cost-sharing research agreements, DOE is preemptively foreclosing all other kinds of novel partnerships. This matters because some nascent climate-solution technologies may face a significant market failure or a set of misaligned incentives that a traditional research and development (R&D) transaction cannot fix.
This interpretation has hampered DOE’s use of OTs, limited its ability to engage small businesses and nontraditional contractors, and prevented DOE from fully pursuing its agency mission and the administration’s climate goals.
Exploring further use of OTs would open up a range of possibilities for the agency to help address critical market failures, help U.S. firms bridge the well-documented valleys of death in technology development, and fulfill the benchmarks laid out in the DOE’s Pathways to Commercial Liftoff.
According to a 2016 GAO report, DOE has used OTs only a handful of times since its authority was updated in 2005, nearly two decades ago. Compare DOE's use of OTs to that of other agencies over the four-year period in the table below (the most recent for which there is open data).

Table 1: Agency use of other transaction agreements (from GAO-16-209)
Almost every other agency uses OTs at a significantly higher rate, including agencies that have smaller overall budgets. While quantity of agreements is not the only metric to rely on, the magnitude of the discrepancy is significant.
Other agencies have made significant changes since 2014, most notably the Department of Defense. A 2020 CSIS report found that DoD's use of OTs for R&D increased by 712% since FY2015, including a 75% increase in FY2019 alone. This represents billions of dollars in awards, much of which went to consortia, covering both prototyping and production transactions. While DOE does not have the same budget or mission as DoD, the sea change in culture among DoD officials willing to use OTs over the past few years is instructive. While DoD did receive expanded authority in the FY2015 and FY2016 NDAAs, this alone does not account for the massive increase. A cultural shift drove program staff to look at OTs as ways to quickly prototype and deploy solutions that could advance their missions, and support from leadership enabled staff to learn how and when to use OTs successfully.
The Department of Transportation (DOT) uses OTs in only two agencies: the Federal Aviation Administration (FAA) and the Pipeline and Hazardous Materials Safety Administration (PHMSA). Like DOE, the FAA is not restricted in what it can and cannot use OTs for; it is authorized to “carry out the functions of the Administrator and the Administration…on such terms and conditions as the Administrator may consider appropriate.” Unlike DOE, the FAA and DOT have used their authority for several dozen transactions a year, totaling $1.45 billion in awards between 2010 and 2014.

The GAO data (Table 1) make clear that ARPA-E likewise deploys very few OTs in support of its mission. Although ARPA-E was originally envisioned as a high-potential, high-impact funder of technology too early in the R&D process for private investors to support, the most recent data show that it does not use OTs flexibly toward that end.
The same GAO report cited above stated that:
“DOE’s regulations—because they are based on DOD’s regulations—include requirements that limit DOE’s use of other transaction agreements…. Officials told us they plan to seek approval from the Office of Management and Budget to modify the agency’s other transaction regulations to better reflect DOE’s mission, consistent with its statutory authority. According to DOE officials, if the changes are approved, DOE may increase its use of other transaction agreements.”
That report was published in 2016, but it is unclear whether any changes were sought or approved, though DOE likely does not need to change any regulations at all to make use of its authority.1 The realm of the possible is quite large, and DOE has yet to fully explore the potential benefits to its mission that OTs provide.
DOE can use OTs without any further authority to drive progress in critical technologies.
The good news is that DOE can use OTs without further guidance from Congress or formal changes to any guidelines. Recognizing its full statutory authority would open up use cases for OTs that would help DOE make meaningful progress toward its agency mission and the administration's climate goals.
For example, the DOE could use OTs in the following ways:
- Using demand-side support mechanisms to reduce the “green premium” for promising technologies like hydrogen, carbon capture, sustainable aviation fuel, enhanced geothermal, and low-embodied-carbon construction materials like steel and concrete
- Coordinating the demonstration of promising technologies through consortia with private industry in support of their commercial liftoff goals
- Organizing joint demonstration projects with the DOT or other agencies with overlapping solution sets
- Rapidly and sustainably meeting critical energy infrastructure needs across rural areas
Given the exigencies of climate change and the need to rapidly decarbonize our society and economy, there are very real instances in which traditional research contracts or grants are not enough to move the needle or unlock a significant market opportunity for a technology. Forward contract acquisitions, pay-for-delivery contracts, and other nonstandard forms of transactions that are critical to supporting technology development are covered under this authority.
One promising area where DOE is already using this approach is the hydrogen hubs initiative. Recently, DOE announced a $1 billion initiative for demand-side support mechanisms to mitigate the risk of market failures and accelerate the commercialization of clean hydrogen technologies. The Funding Opportunity Announcement (FOA) for the H2Hubs program notes that “other FOA launches or use of Other Transaction Authorities may also be used to solicit new technologies, capabilities, end-uses, or partners.” DOE could use OTs more frequently to advance other critical commercial liftoff strategies or to maximize the impact of dollars appropriated for implementation of the BIL and IRA. Areas ripe for creative uses of other transactions include:
- Critical Minerals Consortium: A critical minerals consortium of vendors, nonprofits, academics, and others all managed by a single entity could do more than just mineral processing improvement and R&D. It could take long-term offtake agreements and do forward purchasing of mineral supplies like graphite that are essential to the production of electric vehicle (EV) batteries and other products. This could function similarly to the successful forward contract acquisition for the Strategic Petroleum Reserve executed in June 2023 by DOE.
This demand-pull would complement other recent actions taken to bolster critical minerals like the clean vehicle tax credit and the Loan Program Office’s loans to mineral processing facilities. Such a consortium could come from the existing critical materials institute or be formed by separate entities.
- Geothermal Consortium: Enhanced geothermal systems technology has received neither the attention nor the investment it deserves in proportion to its potential benefits as a path toward decarbonizing the electric grid. At the same time, the legacy oil and natural gas industries have workforces, equipment, and experience that can translate readily to growing the geothermal energy industry. Recently, DOE funded the first cross-industry consortium with the $165 million GEODE competition grant awarded to Project Innerspace, Geothermal Rising, and the Society of Petroleum Engineers.
DOE could use other transactions to further support this nascent consortium and increase the demonstration and deployment of geothermal projects. The agency could also use other transactions to organize the sharing of critical subsurface data and resources through a single entity.
- Direct Air Capture (DAC): The carbon removal market faces extremely uncertain long-term demand, and unproven technological innovations may or may not present economically viable means of pulling carbon out of the atmosphere. In order to accelerate the pace of carbon removal innovation, aggregated demand structures like Frontier’s advanced market commitment have stepped up to organize pre-purchasing and offtake agreements between buyers and suppliers. The scale of the problem is such that government intervention will be necessary to capture carbon at a rate that experts believe is necessary to mitigate the worst warming pathways.
A carbon removal purchasing agreement for the DOE’s Regional Direct Air Capture Hubs could function much the same as the proposed hydrogen hubs initiative. It also could take the shape of a consortium of DAC vendors, nonprofits, scientists, and others managed by a single entity that can set standards for purchase agreements. This would cut the negotiation time among potential parties by a significant amount, allowing for cost saving and faster decarbonization.
- Rural Energy Improvements Consortium: The IRA appropriated $1 billion for energy improvements in rural and remote areas. Because of inherent resource limitations, DOE will not be able to fund every compelling project that applies to this grant program. To keep the program from making one-off grants for narrowly tailored projects, it could focus on funding projects with demonstrated catalytic impact. Through OTs, DOE could encourage promising project developers to form a rural energy improvement consortium that would not only create efficiencies in renewable energy development and novel resilient local structures but also attract private investment at a scale that no individual developer could attract independently.
- Hydrogen Fuel Cell Lifespan Consortium: As hydrogen fuel cells gain traction across transportation, shipping, and industrial applications, a consortium for fuel cell R&D could provide new insights into fuel cell degradation, organize uniform standards for fuel cell recycling, and offer unique financial incentives to hedge against early uncertainty about fuel cell lifespans. As firms invest in new fleets of fuel cell vehicles, it would help them to know what value they can expect to recover from assets reaching the end of their lifespans; a backstopped guarantee of a minimum price for assets meeting certain criteria could reduce that uncertainty. Such a consortium could be led by the Office of Manufacturing and Energy Supply Chains (MESC) and complement existing initiatives to accelerate domestic manufacturing.
- Long Duration Energy Storage (LDES): The DOE commercial liftoff pathway highlights the need to intervene to address stakeholder and investor uncertainty by providing long-term market signals for LDES. The tools they highlight to do so are carbon pricing, greenhouse gas (GHG) reduction targets, and transmission expansion. While these provide generalizable long-term signals, DOE could leverage OTs to provide more concrete signals for the value of LDES.
DOE could organize an advance market commitment for long-duration energy storage capabilities on federal properties that meet certain storage-hour and grid-integration requirements. Such a commitment could include DoD and the General Services Administration (GSA), which own and operate a large portfolio of federal properties, including bases and facilities in hard-to-reach locations that could benefit from more predictable and secure energy infrastructure. Early procurement of capability-meeting but expensive systems could help diversify the market and drive the technology down the cost curve toward the targets of $650 per kW and 75% round-trip efficiency (RTE) for intra-day storage and $1,100 per kW and 55-60% RTE for multi-day storage.
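For illustration, the cited cost and performance targets can be expressed as a simple screening check such as an advance market commitment might apply. The threshold numbers come from the text; the function itself and its structure are hypothetical:

```python
def meets_ldes_target(cost_per_kw: float, rte: float, duration: str) -> bool:
    """Illustrative screen against the DOE liftoff targets cited above:
    $650/kW and 75% round-trip efficiency (RTE) for intra-day storage;
    $1,100/kW and 55-60% RTE for multi-day storage."""
    if duration == "intra-day":
        return cost_per_kw <= 650 and rte >= 0.75
    if duration == "multi-day":
        return cost_per_kw <= 1100 and rte >= 0.55
    raise ValueError(f"unknown duration class: {duration}")

print(meets_ldes_target(600, 0.78, "intra-day"))   # True
print(meets_ldes_target(1300, 0.60, "multi-day"))  # False: above cost target
```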
To use OTs more frequently, the DOE needs to focus on culture and education.
As noted, DOE does not need additional authorization or congressional legislation to use OTs more frequently. The agency received this authority in its original charter in 1977, codified at 42 U.S. Code § 7256, which states:
“The Secretary is authorized to enter into and perform such contracts, leases, cooperative agreements, or other similar transactions with public agencies and private organizations and persons, and to make such payments (in lump sum or installments, and by way of advance or reimbursement) as he may deem to be necessary or appropriate to carry out functions now or hereafter vested in the Secretary.” [emphasis added]
This and other legislation gives DOE the authority to use OTs as the Secretary deems necessary.
Later implementation guidelines state that other DOE officials who have been presidentially appointed and Senate-confirmed may also execute these transactions. DOE's Office of Acquisition Management, Office of General Counsel, and any other legal bodies involved should update any unnecessarily restrictive guidelines, or affirm that they will follow the original authority granted in the agency's 1977 charter.
While that would resolve any implementation questions about the ability to use OTs at DOE, the agency ultimately needs strong leadership and buy-in from the Secretary to take full advantage of them. As many observers of DoD's expanding use of OTs note, culture matters most. DOE should take the following actions to ensure that changing these guidelines empowers its public servants to their full potential:
- The Secretary should make clear to DOE leadership and staff that increased use of OTs is not only permissible but actively encouraged.
- The Secretary should provide internal written guidance to DOE leadership and program-level staff on what criteria need to be met for her to sign off on an OT, if needed. These criteria should be driven by DOE mission needs, technology readiness, and other resources like the commercial liftoff reports.
- The Office of Acquisition Management should collaboratively educate relevant program staff, not just contracting staff, on the use of OTs, including by providing cross-agency learning opportunities from peers at DARPA, NASA, DoD, DHS, and DOT.
- DOE should provide an internal process for designing and drawing up an OT agreement for staff to get constructive feedback from multiple levels of experienced professionals.
- DOE should issue a yearly report on how many OTs it agrees to, along with basic details of the agreements. After four years, GAO should evaluate DOE's use of OTs and communicate any areas for improvement. Since OTs are not subject to normal contracting disclosure requirements, some form of public disclosure is critical for accountability.
Mitigating risk
Finally, there are many ways to address the potential risks involved in executing new OTs for clean energy solutions. While there are no legal contracting risks (as OTs are not governed by the FAR), DOE staff should consider how to enter into agreements most judiciously and appropriately. As one resource, they can draw on the eight recent reports produced by four different offices of inspector general on agencies' use of other transactions to understand best practices. Other important risk-limiting activities include:
- Using consortia to gather critical industry partners around shared challenges, as DoD commonly does in areas such as advanced manufacturing, mobility, enterprise healthcare innovations, and more.
- Educating relevant parties and modeling agreements on successful DARPA and NASA OTs. These resources are in many cases publicly available online and provide ready-made templates (for example, the NIH also offers a 500-page training guide with example agreements).
Conclusion
The DOE should use the full authority granted to it by Congress in executing other transactions to advance the clean energy transition and develop secure energy infrastructure in line with its agency mission. DOE does not need additional authorization or legislation from Congress to do so. GAO reports have highlighted the limitations of DOE’s OT use and the discrepancy in usage between agencies. Making this change would bring DOE into alignment with peer agencies and push the country toward more meaningful progress on net-zero goals.
The following examples are pulled from a GAO report but should not be regarded as the only model for potential agreements.
Examples of Past OTs at DOE
“In 2010, ARPA-E entered into an other transaction agreement with a commercial oil and energy company to research and develop new drilling technology to access geothermal energy. Specifically, according to agency documentation, the technology being tested was designed to drill into hard rock more quickly and efficiently using a hardware system to transmit high-powered lasers over long distances via fiber optic cables and integrating the laser power with a mechanical drill bit. According to ARPA-E documents, this technology could provide access to an estimated 100,000 or more megawatts of geothermal electrical power in the United States by 2050, which would help ARPA-E meet its mission to enhance the economic and energy security of the United States through the development of energy technologies.
According to ARPA-E officials, an other transaction agreement was used due to the company’s concerns about protecting its intellectual property rights, in case the company was purchased by a different company in the future. Specifically, one type of intellectual property protection known as “march-in rights” allows federal agencies to take control of a patent when certain conditions have not been met, such as when the entity has not made efforts to commercialize the invention within an agreed upon time frame. Under the terms of ARPA-E’s other transaction agreement, march-in rights were modified so that if the company itself was sold, it could choose to pay the government and retain the rights to the technology developed under the agreement. Additionally, according to DOE officials, ARPA-E included a United States competitive clause in the agreement that required any invention developed under the agreement to be substantially manufactured in the United States, provided products were also sold in the United States, unless the company showed that it was not commercially feasible to do so. This agreement lasted until fiscal year 2013, and ARPA-E obligated about $9 million to it.”
Examples at DoD
“In 2011, DOD entered into a 2-year other transaction agreement with a nontraditional contractor for the development of a new military sensor system. According to the agreement documentation, this military sensor system was intended to demonstrate DOD’s ability to quickly react to emerging critical needs through rapid prototyping and deployment of sensing capabilities. By using an other transaction agreement, DOD planned to use commercial technology, development techniques, and approaches to accelerate the sensor system development process. The agreement noted that commercial products change quickly, with major technology changes occurring in less than 2 years. In contrast, according to the agreement, under the typical DOD process, military sensor systems take 3 to 8 years to complete, and may not match evolving mission needs by the time the system is complete. According to an official, DOD obligated $8 million to this agreement.”
Other interpretations of the statute have prevented DOE from leveraging OTs, and there seems to be confusion about what is allowed. For example, a commonly cited OTA explainer implies that DOE is statutorily limited to “RD&D projects. Cost sharing agreement required.”
But nowhere in the original statute does Congress require DOE to exclusively use cost sharing agreements, nor is this the case at other agencies where OTs are common practice.
However, the Energy Policy Act of 2005 did require DOE to issue guidelines for the use of OTs within 90 days of the law’s passage, and this is where it gets complicated. DOE did so, and according to a 2008 GAO report, it issued guidelines built around a specific model called a technology investment agreement (TIA). These guidelines were modeled on DoD’s then-current guidelines for OTs and TIAs, mandating cost sharing agreements “to the maximum extent practicable” between the federal government and nonfederal parties to an agreement. An Acquisition/Financial Assistance Letter issued by senior DOE procurement officials in 2021 defines this explicitly: “Other Transaction Agreement, as used in this AL/FAL, means Technology Investment Agreement as codified at 10 C.F.R., Part 603, pursuant to DOE’s Other Transaction Authority of 42 U.S.C. § 7256.” However, DOE’s authority as codified in 42 U.S.C. § 7256 (a) and (g) does not define OTs as TIAs; the definition is just a DOE guideline and could be changed.
Technology Investment Agreements are used to reduce the barrier to commercial and nontraditional firms’ involvement with mission-critical research needs at DOE. They are particularly useful in that they do not require traditional government accounting systems, which can be burdensome for small or new firms to implement. But that does not mean they are the only instrument that should be used. The law says that TIAs for research projects should involve cost sharing to the “maximum extent practicable.” This does not mean that cost sharing must always occur. There could be many forms of transactions other than grants and contracts in which cost sharing is neither practicable nor feasible.
Furthermore, the DOE is empowered to use OTs for research, applied research, development, and demonstration projects. Development and demonstration projects would not fit neatly in the category of research projects covered by TIAs, so subjecting them to the same guidelines is unduly restrictive.
Consortia are single entities that manage a group of members (including private firms, academics, nonprofits, and more) aligned around a specific challenge or topic. The government can execute other transactions with the consortium manager, who then organizes the members around an agreed scope. MITRE provides a longer explainer and a list of consortia.
Coordinating the U.S. Government Approach to the Bioeconomy
Summary
The bioeconomy touches nearly every function of the U.S. government. The products of the bioeconomy compete in an international marketplace and include medicines, foods, fuels, materials, and novel solutions to broad challenges including climate and sustainability. The infrastructure, tools, and capabilities that drive the bioeconomy must be safeguarded to maintain U.S. leadership and to protect against misuse. The vast scale of these issues requires a cross-governmental approach that draws on input and engagement with industry, academia, nongovernmental organizations, and other stakeholders across the bioeconomy.
To achieve a durable and strategic interagency approach to the bioeconomy, the Office of Science and Technology Policy (OSTP) should establish and Congress should fund a Bioeconomy Initiative Coordination Office (BICO) to coordinate strategic U.S. government investments in the bioeconomy; facilitate efficient oversight and commercialization of biotechnology products; safeguard biotechnology infrastructure, tools and capabilities; and serve as a focal point for government engagement with nongovernmental partners and experts.
Challenge and Opportunity
Executive Order 14081, “Establishing a National Biomanufacturing and Biotechnology Initiative,” was released in September 2022. Since then, OSTP has worked to coordinate this Initiative and has made significant progress with the March 2023 release of “Bold Goals for Biotechnology and Biomanufacturing,” which describes how government agencies will support and benefit from investments in the bioeconomy; an implementation plan is forthcoming. EO 14081 also initiated interagency efforts to better measure and track the bioeconomy, prepare the regulatory system for future biotechnology products, and establish a Biosafety and Biosecurity Innovation Initiative. Although these efforts are laudable, we need a more strategic, longer-term, and outward-facing approach to ensure that the United States remains the world leader in biomanufacturing and biotechnology development. Expert reports over several years, including those from the National Academies, support the formation of a strategic coordinating body within the U.S. government that focuses on the bioeconomy and more strategic planning for its investments in these areas.
The CHIPS and Science Act of 2022 provides a critical opportunity for improved interagency coordination. Division B, Title IV calls for the formation of a National Engineering Biology Research and Development Initiative coordinated by an interagency committee, co-chaired by OSTP, and supported by an Initiative Coordination Office (ICO) with a director and full-time staff. This bill stipulates that this coordination office should:
- Serve as “the point of contact on Federal engineering biology activities for government organizations, academia, industry, professional societies, State governments, interested citizen groups, and others to exchange technical and programmatic information”;
- Oversee “interagency coordination of the Initiative”; and
- Promote “access to, and early application of, the technologies, innovations, and expertise derived from Initiative activities to agency missions and systems across the Federal Government, and to United States industry, including startup companies.”
A recent bipartisan letter from Congressman Jake Auchincloss of Massachusetts confirms Congress’s intent that the ICO described in the legislation incorporate the Initiative as described in Executive Order 14081.
An ICO focused on the bioeconomy would be analogous to other congressionally mandated National Coordination Offices that drive effective interagency coordination at OSTP, including those for the U.S. Global Change Research Program (USGCRP), the National Nanotechnology Initiative (NNI), and the Networking and Information Technology Research and Development (NITRD) Program. These offices have several features in common, including:
- Support for regular interagency strategic planning and assessment mechanisms, including budget cross-cuts of relevant U.S. government activities;
- A focus on topics related to horizon scanning, technology development, and responsible innovation; and
- Robust outreach and engagement with non-government stakeholders, including industry partners.
Now is the time for OSTP to establish and for Congress to fund a durable and well-staffed Bioeconomy Initiative Coordination Office (BICO) that leads ongoing, strategic, interagency coordination across the government to support the bioeconomy. The BICO should not replace current interagency committees and processes. Instead, it should coordinate bioeconomy-related efforts that reach across multiple domains, ensure a durable and long-term approach to the bioeconomy, and serve as a focal point and doorway for U.S. government engagement with industry, academia, and other stakeholders.
Plan of Action
OSTP should establish the BICO within the next year. Its focus should be on (1) biomanufacturing, including infrastructure and capacity, pre-competitive industry issues (e.g., standards), and workforce; and (2) development and commercialization of biotechnology products, tools, and capabilities, with a particular focus on those developed for nontherapeutic uses. The interagency committee that drives the BICO should be established under the National Science and Technology Council, should be co-chaired by OSTP and the Department of Commerce, and should include as participants every agency listed in EO 14081, including:
- Department of Agriculture
- Department of Commerce
- Department of Defense
- Department of Energy
- Department of Health and Human Services
- Environmental Protection Agency
- National Economic Council
- National Science Foundation
- National Security Council
- Office of Management and Budget
- Office of Science and Technology Policy
Congress should provide an appropriation of at least $4 million per year to ensure funding for at least six full-time employees (director, lead for strategic planning and assessment, lead for regulation, lead for safeguarding, outreach coordinator, and administrator), plus office expenses, events, outreach, and other costs. Absent a specific appropriation from Congress, the BICO should follow the funding model of other congressionally mandated coordination offices, including the USGCRP and the NITRD Coordination Office: each participating agency would contribute a small percentage of its total expenditures on biomanufacturing and biotechnologies to support the BICO.
The office should be tasked with:
- Coordinating strategic planning for U.S. government investments in the bioeconomy;
- Establishing a “single door approach” for the biotechnology regulatory system that product developers can use to efficiently get actionable answers about their products’ regulatory pathways;
- Leading an interagency process focused on safeguarding the bioeconomy by ensuring that infrastructure and supply chains are secure and by reducing the risk that biotechnology tools and capabilities are accidentally or deliberately misused; and
- Conducting extensive public outreach and engagement on these topics that includes academia, industry, nongovernmental organizations, and other bioeconomy stakeholders.
Strategic Planning
To maintain U.S. leadership in the bioeconomy, the federal government has made significant investments in biomanufacturing and biotechnology development over many years, and EO 14081 provided an important step toward a more strategic approach to these investments. However, we need a more robust and ongoing structure that incorporates opportunities for assessment of the rapidly changing bioeconomy and iteration of planning activities. The BICO should coordinate a strategic approach that includes:
- A strategic planning process with regular updates (e.g., every three years) to revisit goals, assess progress, and renew commitments;
- Budget cross-cuts, published annually, that track U.S. government investments in the bioeconomy; and
- A National Bioeconomy Assessment that incorporates advances in biotechnology products, tools, and capabilities; biomanufacturing infrastructure and workforce; and trends in public and private investment. Because the bioeconomy is rapidly changing, this assessment should incorporate a Living Evidence approach that identifies and incorporates relevant updates as they arise.
To generate an accurate National Bioeconomy Assessment and ensure that it captures updated, relevant information, the BICO should actively seek external stakeholder input and engagement that includes academia, industry, manufacturing institutes (such as BioMADE or NIIMBL), state and local governments, nongovernmental organizations, and others. In addition to informing federal investment in the bioeconomy, this ongoing assessment will be valuable for other activities within the BICO by providing insight into the types of novel products that regulators might expect to see and highlighting priority topics for outreach and engagement on safeguarding biotechnology tools and capabilities. If a Living Evidence approach is not feasible, then National Bioeconomy Assessments could be generated in a more traditional format, with publication of updated assessments at regular intervals (e.g., every three years, offset from the strategic planning publication cycle).
A “Single Door Approach” for the Biotechnology Regulatory System
For the bioeconomy to flourish, the biotechnology regulatory system must allow low-risk products to be developed and marketed quickly and efficiently. At the same time, regulatory oversight is essential to identify and limit risks to human health and the environment. The BICO should establish and support a “single door approach” for the biotechnology regulatory system to reduce ambiguities and uncertainties in the system and to better prepare the regulatory agencies for future products. (See FAQs for more information.)
An effective single door approach requires robust interagency coordination that includes OSTP and high-level decision makers from each of the primary regulatory agencies: Environmental Protection Agency (EPA), Food and Drug Administration (FDA), and U.S. Department of Agriculture (USDA). Future biotechnology products will include a wide range of applications in the environment, so governmental entities responsible for environmental protection should also be included in this interagency process, including U.S. Fish and Wildlife Service, the National Marine Fisheries Service, and the Council on Environmental Quality. These groups have rarely engaged on issues related to the use of genetically engineered organisms in the environment but should prepare for this type of decision-making. Representation in this interagency process should include lawyers from the Offices of the General Counsel of EPA, FDA, and USDA who can work together within a reasonable time frame (e.g., three months) to determine which agency should lead when novel products arise. To support this single door approach, the BICO should:
- Work with the regulatory agencies to establish a method or portal through which product developers can submit information and requests about their products;
- Facilitate interagency meetings and discussion, including with product developers, as needed;
- Track submissions and timelines, work to address bottlenecks to decision-making, and maintain accountability by publishing summaries of decision-making efficiency; and
- Conduct outreach to relevant product developers and industry groups, including a website, to raise awareness and provide information about the regulatory system.
When possible and as experience is gained, the BICO should work with the agencies to distill generalizable principles or summaries of decisions to provide guidance to product developers and the broader bioeconomy on how different types of products are likely to be regulated.
Safeguarding the Bioeconomy
As the bioeconomy grows, the United States must ensure that its investments are protected and that biotechnology tools and capabilities are not accidentally or intentionally misused. In addition to coordinating discussions among U.S. government agencies on these topics, a key role for the BICO will be to conduct outreach and engagement with the broader bioeconomy community. There are several areas where outreach, particularly to industry partners, will be critical to maintaining U.S. competitiveness and leadership in the bioeconomy. Many industry standards and practices are not well-established, and there are opportunities for the federal government to work with industry partners to protect U.S. assets and keep biomanufacturing and biotechnology development securely within the United States. The BICO, in collaboration with the National Security Council, should facilitate engagement on topics such as:
- Supplies and capabilities that create key bottlenecks for U.S. companies;
- Industry best practices for securing biomanufacturing infrastructure from cyberattacks;
- Industry best practices that enable secure sharing of materials and data between partnering companies or entities, including legal approaches (e.g., contracting and subcontracting arrangements) and technical solutions such as developing standards for APIs (application programming interfaces); and
- How to navigate venture capital interest and investment from other countries, including China, which is particularly difficult for smaller companies and start-ups.
Safeguarding the bioeconomy also requires a process to better understand the potential for accidental or deliberate misuse of biotechnology tools, services, and capabilities to cause harm, and to support development of resources and best practices to reduce these risks. The BICO should develop an engagement strategy that includes opportunities for public discussion of risks related to misuse and strategies to reduce those risks; a publicly accessible portal for experts outside the government to raise concerns or suggest topics for further scrutiny; and a protected venue in which companies can securely share more sensitive information about business products, interactions, or practices. The BICO will maximize the benefit of this forum by conducting outreach and raising awareness of these opportunities, particularly by targeting industry partners.
Currently, the U.S. government does not have a venue or forum for multi-stakeholder engagement on risks related to potential misuse of biotechnology tools and capabilities. As the bioeconomy grows, a wide range of tools and capabilities will be developed to make biology easier to engineer, including many enabled by artificial intelligence. Accidental or deliberate misuse of these rapidly expanding capabilities could pose risks that will be difficult to anticipate and mitigate. By establishing a process for ongoing and robust engagement to better understand and manage these risks, the BICO can help address these biosecurity needs.
Conclusion
By establishing the BICO as a focal point for coordination among the interagency community and outreach to the broader bioeconomy, OSTP can ensure a long-term and robust U.S. government commitment to biomanufacturing and biotechnology. This commitment includes a strategic approach to investments that can be tracked over time, improvements to the regulatory system that will enable safe and useful products to be more easily commercialized, and activities and engagement to better safeguard advances in the bioeconomy. With appropriate funding, the BICO will form the foundation for a true cross-governmental approach that will ensure U.S. leadership and competitiveness and will ultimately enable the bioeconomy to flourish.
Because OSTP is part of the Executive Office of the President, there is a risk that establishing the BICO at OSTP will make it vulnerable to changes in funding or priorities, particularly during presidential transitions. However, there are several reasons for the BICO to be at OSTP. A critical factor is that the CHIPS and Science Act names OSTP as co-chair for the interagency engineering biology initiative that is described in the Act. Executive Order 14081 also names OSTP as a key point of coordination for U.S. government activities on the bioeconomy. Some aspects of the bioeconomy, particularly the regulation of biotechnology, have been coordinated by OSTP for decades. Furthermore, OSTP plays a key role in multiple science- and technology-rich initiatives that are supported by Coordination Offices, including the USGCRP, NITRD, and NNI.
If it is not feasible for OSTP to establish the BICO, coordination could instead be established by lead agencies committed to supporting the bioeconomy. A model for this type of coordination is the Wildland Fire Leadership Council, which was established by a Memorandum of Understanding among the Secretary of the Interior, Secretary of Agriculture, Secretary of Defense, and Secretary of Homeland Security. However, to capture the full scope of coordination needed for the bioeconomy, this approach may require negotiating multiple MOUs among different sets of government agencies.
Currently, the U.S. government regulates biotechnology products based on the Coordinated Framework for the Regulation of Biotechnology, established in 1986 and most recently updated in 2017. Under this system, agencies regulate biotechnology products based on their product-based authorities (e.g., drugs are regulated by FDA; pesticides are regulated by EPA). However, there are gaps, ambiguities, and uncertainties in the regulatory system that will be compounded by the accelerating pace of biotechnology development, expanding range of applications, and potential novelty of new products. Often, developers of novel products struggle to determine which agency (FDA, EPA, or USDA) has primary responsibility for regulation of their product and can receive conflicting information from the agencies over the course of months or years. Several reports, including from the National Academies and from PCAST, have called for improved interagency coordination and a single door approach to the regulatory system that would enable product developers to contact a single entity within the U.S. government and receive an actionable answer about their product’s regulatory path. Importantly, this approach will not require changes to the underlying statutes that define regulatory authorities or to the regulations that define how these authorities are applied. Instead, it calls for efficient decision-making among the agencies to decide which agency will take the lead for novel products as they arise.
To use the single door approach, product developers would submit basic information about their products for the regulatory agencies to consider. At its simplest, this portal could be a submission system similar to that used by the federal government when requesting information from the public through regulations.gov (though product developer submissions would not be released publicly). A more secure system could be modeled on the Case Management System used for companies to share documents with the Committee on Foreign Investment in the U.S.
A more robust interagency process could also drive efforts to better harmonize regulatory approaches across agencies. For example, in 2017 the National Academies recommended ways the agencies could streamline oversight of familiar and low-risk products while focusing resources on products that are novel or require more complex risk assessments. The BICO should facilitate coordination on these topics, including progress agencies have made since 2017, lessons learned, and opportunities for improvements. The BICO should also conduct horizon scanning activities (e.g., as part of its National Bioeconomy Assessment or in public meetings focused on specific product types) so that regulators can best anticipate novel products and prepare for future decision-making. Expert groups, including PCAST, have also identified a need for training of regulators and opportunities for engagement between regulators and the broader bioeconomy; the BICO will be well-positioned to coordinate these activities.
DNA synthesis is one type of biotechnology for which the risks of misuse are well described, frameworks for reducing risk are already being developed and applied (including the 2010 HHS Screening Framework Guidance and efforts toward international harmonization), and best practices among responsible companies are established. An interagency process to update the Screening Framework Guidance is nearing completion; this process would have benefited from additional opportunities for engagement between the U.S. government and the DNA synthesis industry. The BICO should provide a forum for this type of engagement in support of future policy development.