Make government-funded hardware open source by default
While scientific publications and data are increasingly made publicly accessible, designs and documentation for scientific hardware — another key output of federal funding and driver of innovation — remain largely closed from view. This status quo can lead to redundancy, slowed innovation, and increased costs. Existing standards and certifications for open source hardware provide a framework for bringing the openness of scientific tools in line with that of other research outputs. Doing so would encourage the collective development of research hardware, reduce wasteful parallel creation of basic tools, and simplify the process of reproducing research. The resulting open hardware would be available to the public, researchers, and federal agencies, accelerating the pace of innovation and ensuring that each community receives the full benefit of federally funded research.
Federal grantmakers should establish a default expectation that hardware developed as part of federally supported research be released as open hardware. To retain current incentives for translation and commercialization, grantmakers should design exceptions to this policy for researchers who intend to patent their hardware.
Details
Federal funding plays an important role in setting norms around open access to research. The recent memorandum from the White House Office of Science and Technology Policy (OSTP), Ensuring Free, Immediate, and Equitable Access to Federally Funded Research, makes it clear that open access is a cornerstone of a scientific culture that values collaboration and data sharing. OSTP’s recent report on open access publishing further declares that “[b]road and expeditious sharing of federally funded research is fundamental for accelerating discovery on critical science and policy questions.”
These efforts have been instrumental in providing the public with access to scientific papers and data — two of the foundational outputs of federally funded research. Yet hardware, another key input and output of science and innovation, remains largely hidden from view. To continue the move towards an accessible, collaborative, and efficient scientific enterprise, public access policies should be expanded to include hardware. Specifically, making federally funded hardware open source by default would have a number of specific and immediate benefits:
Reduce Wasteful Reinvention. Researchers are often forced to develop testing and operational hardware to support their research. In many cases, unbeknownst to those researchers, this hardware has already been developed by other researchers in other labs as part of other projects. However, because that original hardware was not openly documented and licensed, subsequent researchers cannot learn from and build upon the previous work. The lack of open documentation and licensing is also a barrier to more intentional, collaborative development of standardized testing equipment for research.
Increase Access to Information. As the OSTP memo makes clear, open access to federally funded research allows all Americans to benefit from our collective investment. This broad and expeditious sharing strengthens our ability to be a critical leader and partner on issues of open science around the world. Immediate sharing of research results and data is key to ensuring that benefit. Explicit guidance on sharing the hardware developed as part of that research is the next logical step towards those goals.
Alternative Paths to Recognition. Evaluating a researcher’s impact often includes an assessment of the number of patents they can claim, in large part because patents are easy to quantify. However, this focus on patents creates a perverse incentive for researchers to erect barriers to follow-on research even if they have no intention of using patents to commercialize their work. Encouraging researchers to open source the hardware developed as part of their research creates an alternative path for evaluating their impact, especially as those pieces of open source hardware are adopted and improved by others. Uptake of researchers’ open hardware could be weighed in assessments on par with any patented work, recognizing contributions to a collective research enterprise.
Verifiability. Open access to data and research are important steps toward allowing third parties to verify research conclusions. However, these tools can be limited if the hardware used to generate the data and produce the research is not itself open. Open sourcing hardware simplifies the process of repeating studies under comparable conditions, allowing for third-party validation of important conclusions.
Recommendations
Federal grantmaking agencies should establish a default presumption that recipients of research funds make hardware developed with those funds available on open terms. This policy would apply to hardware built as part of the research process, as well as hardware that is part of the final output. Grantees should be able to opt out of this requirement with regard to hardware that is expected to be patented; such an exception would provide an alternative path for researchers to share their work without undermining existing patent-based development pathways.
To establish this policy, OSTP should conduct a study and produce a report on the current state of federally funded scientific hardware and opportunities for open source hardware policy.
- As part of the study, OSTP should coordinate and convene stakeholders to discuss and align on policy implementation details — including relevant researchers, funding agencies, U.S. Patent and Trademark Office officials, and leaders from university tech transfer offices.
- The report should provide a detailed and widely applicable definition of open source hardware, drawing on definitions established in the community — in particular, the definition maintained by the Open Source Hardware Association, which has been in use for over a decade and is based on the widely recognized definition of open source software maintained by the Open Source Initiative.
- It should also lay out a broadly acceptable policy approach for encouraging open source by default, and provide guidance to agencies on implementation. The policy framework should include recommendations for:
- Minimally burdensome additions to grant applications and progress reports that capture relevant information about hardware and ensure planning for, and compliance with, open source release of outputs
- A clear and well-defined opportunity for researchers to opt out of this mandate when they intend to patent their hardware
The Office of Management and Budget (OMB) should issue a memorandum establishing a policy on open source hardware in federal research funding. The memorandum should include:
- The rationale for encouraging open source hardware by default in federally funded scientific research, drawing on the motivation of public access policies for publications and data
- A finalized definition of open source hardware to be used by agencies in policy implementation
- The Open Source Scientific Hardware Policy itself, in alignment with the OSTP report and its recommendations
Conclusion
The U.S. government and taxpayers are already paying to develop hardware created as part of research grants. In fact, because there is not currently an obligation to make that hardware openly available, the federal government and taxpayers are likely paying to develop identical hardware over and over again.
Grantees have already proven that existing open publication and open data obligations promote research and innovation without unduly restricting important research activities. Expanding these obligations to include the hardware developed under these grants is the natural next step.
Promoting reproducible research to maximize the benefits of government investments in science
Scientific research is the foundation of progress, creating innovations like new treatments for melanoma and providing behavioral insights to guide policy in responding to events like the COVID-19 pandemic. This potential for real-world impact is best realized when research is rigorous, credible, and subject to external confirmation. However, evidence suggests that, too often, research findings are not reproducible or trustworthy, preventing policymakers, practitioners, researchers, and the public from fully capitalizing on the promise of science to improve social outcomes in domains like health and education.
To build on existing federal efforts supporting scientific rigor and integrity, funding agencies should study and pilot new programs to incentivize researchers’ engagement in credibility-enhancing practices that are presently undervalued in the scientific enterprise.
Details
Federal science agencies have a long-standing commitment to ensuring the rigor and reproducibility of scientific research for the purposes of accelerating discovery and innovation, informing evidence-based policymaking and decision-making, and fostering public trust in science. In the past 10 years alone, policymakers have commissioned three National Academies reports, a Government Accountability Office (GAO) study, and a National Science and Technology Council (NSTC) report exploring these and related issues. Unfortunately, flawed, untrustworthy, and potentially fraudulent studies continue to affect the scientific enterprise.
The U.S. government and the scientific community have increasingly recognized that open science practices — like sharing research code and data, preregistering study protocols, and supporting independent replication efforts — hold great promise for ensuring the rigor and replicability of scientific research. Many U.S. science agencies have accordingly launched efforts to encourage these practices in recent decades. Perhaps the most well-known example is the creation of clinicaltrials.gov and the requirements that publicly and privately funded trials be preregistered (in 2000 and 2007, respectively), leading, in some cases, to fewer trials reporting positive results.
More recent federal actions have focused on facilitating sharing of research data and materials and supporting open science-related education. These efforts seek to build on areas of consensus given the diversity of the scientific ecosystem and the resulting difficulty of setting appropriate and generalizable standards for methodological rigor. However, further steps are warranted. Many key practices that could enhance the government’s efforts to increase the rigor and reproducibility of scientific practice — such as the preregistration of confirmatory studies and replication of influential or decision-relevant findings — remain far too rare. A key challenge is the weak incentive to engage in these practices. Researchers perceive them as costly or undervalued given the professional rewards created by the current funding and promotion system, which encourages exploratory searches for new “discoveries” that frequently fail to replicate. Absent structural change to these incentives, uptake is likely to remain limited.
Recommendations
To fully capitalize on the government’s investments in education and infrastructure for open science, we recommend that federal funding agencies launch pilot initiatives to incentivize and reward researchers’ pursuit of transparent, rigorous, and public good-oriented practices. Such efforts could enhance the quality and impact of federally funded research at relatively low cost, encourage alignment of priorities and incentive structures with other scientific actors, and help science and scientists better deliver on the promise of research to benefit society. Specifically, the National Institutes of Health (NIH) and the National Science Foundation (NSF) should:
Establish discipline-specific offices to launch initiatives around rigor and reproducibility
- Use the National Institute for Neurological Disorders and Stroke’s Office of Research Quality (ORQ) as a model; similar ORQs would encourage uptake of under-incentivized practices through both internal initiatives and external funding programs.
- To ensure that programs are tailored to fit the priorities of a single disciplinary context, offices should be established within individual NIH institutes and within individual NSF directorates.
Incorporate assessments of transparent and credible research methods into their learning agendas
- Include questions to better understand existing practices, such as “How frequently and effectively do [agency]-funded researchers across disciplines engage in open science practices — e.g., preregistration, publication of null results, and external replication — and how do these practices relate to future funding and research outcomes?”
- Include questions to inform new policies and initiatives, such as “What steps could [agency] take to incentivize broader uptake of open science practices, and which ones — e.g., funding programs, application questions, standards, and evaluation models — are most effective?”
- To answer these questions, solicit feedback from applicants, reviewers, and program officers, and partner with external “science of science management” researchers to design rigorous prospective and retrospective studies; use the information obtained to develop new processes to incentivize and reward open science practices in funded research.
Expand support for third-party replications
- Allocate a consistent proportion of funds to support independent replications of key findings through non-grant mechanisms — e.g., prizes, cooperative agreements, and contracts. The high value placed on scientific novelty discourages such studies, even though they could provide valuable information to inform treatment, policy, regulatory approval, or future scientific inquiry. A combination of agency prioritization and public requests for information should be used to identify topics for which additional supportive or contradictory evidence would provide significant societal and/or scientific benefit.
- The NSF, in partnership with an independent third-party organization like the Institute for Replication, should run a pilot study to assess the utility of commissioning targeted and/or randomized replication studies for advancing research rigor and informing future funding.
Build capacity for agency use of open science hardware
When creating, using, and buying tools for agency science, federal agencies rely almost entirely on proprietary instruments. This is a missed opportunity: open source hardware — machines, devices, and other physical things whose designs have been released to the public so that anyone can make, modify, distribute, and use them — offers significant benefits to federal agencies, to the creators and users of scientific tools, and to the scientific ecosystem.
In scientific work in the service of agency missions, the federal government should use and contribute to open source hardware.
Details
Open source has transformative potential for science and for government. Open source tools are generally lower cost, promote reuse and customization, and can avoid dependency on a particular vendor for products. Open source engenders transparency and authenticity and builds public trust in science. Open source tools and approaches build communities of technologists, designers, and users, and they enable co-design and public engagement with scientific tools. Because of these myriad benefits, the U.S. government has made significant strides in using open source software for digital solutions. For example, 18F, an office within the General Services Administration (GSA) that acts as a digital services consultancy for agency partners, defaults to open source for software created in-house with agency staff as well as in contracts it negotiates.
Open science hardware, as defined by the Gathering for Open Science Hardware, is any physical tool used for scientific investigations that can be obtained, assembled, used, studied, modified, shared, and sold by anyone. It includes standard lab equipment as well as auxiliary materials, such as sensors, biological reagents, and analog and digital electronic components. Beyond a set of scientific tools, open science hardware is an alternative approach to the scientific community’s reliance on expensive and proprietary equipment, tools, and supplies. Open science hardware is growing quickly in academia, with new networks, journals, publications, and events crossing institutions and disciplines. There is a strong case for open science hardware in the service of the United Nations’ Sustainable Development Goals, as a collaborative solution to challenges in environmental monitoring, and to increase the impact of research through technology transfer. Although limited so far, some federal agencies support open science hardware, such as an open source Build-It-Yourself Rover; the development of infrastructure, including NIH 3D, a platform for sharing 3D printing files and documentation; and programs such as the National Science Foundation’s Pathways to Enable Open-Source Ecosystems.
If federal agencies regularly used and contributed to open science hardware for agency science, it would have a transformative effect on the scientific ecosystem.
Federal agency procurement practices are complex, time-intensive, and difficult to navigate. Like other small businesses and organizations, the developers and users of open science hardware often lack the capacity and specialized staff needed to compete for federal procurement opportunities. Recent innovations demonstrate how the federal government can change how it buys and uses equipment and supplies. Agency Innovation Labs at the Department of Defense, Department of Homeland Security, National Oceanic and Atmospheric Administration (NOAA), National Aeronautics and Space Administration, National Institute of Standards and Technology, and the Census Bureau have developed innovative procurement strategies that allow for more flexible and responsive government purchasing and provide in-house expertise to procurement officers on using these models in agency contexts. These teams provide much-needed infrastructure for continuing to expand the understanding and use of creative, mission-oriented procurement approaches, which can also support open science hardware for agency missions.
Agencies such as the Environmental Protection Agency (EPA), NOAA, and the Department of Agriculture (USDA) are well positioned both to benefit greatly from and to make essential contributions to the open source ecosystem. These agencies have already demonstrated interest in open source tools; for example, the NOAA Technology Partnerships Office has supported the commercialization of open science hardware included in the NOAA Technology Marketplace, such as an open source ocean temperature and depth logger and a sea temperature sensor designed by NOAA researchers and partners. These agencies have significant need for scientific instrumentation and often develop and use custom solutions for agency science. Each has a demonstrated commitment to broadening public participation in science, which open science hardware supports; for example, EPA’s Air Sensor Loan Programs bring air sensor technology to the public for monitoring and education. Moreover, these agencies’ missions invite public engagement, and a commitment to open source instrumentation and tools would build a shared infrastructure for progress in the public good.
Recommendations
We recommend that the GSA take the following steps to build capacity for the use of open science hardware across government:
- Create an Interagency Community of Practice for federal staff working on open source–related topics.
- Direct the Technology Transformation Services to create boilerplate language for procurement of open source hardware that is compliant with the Federal Acquisition Regulation and the America COMPETES Reauthorization Act of 2010.
- Conduct training on open source and open science hardware for procurement professionals across government.
We also recommend that EPA, NOAA, and USDA take the following steps to build capacity for agency use of open science hardware:
- Task agency representatives to identify agency scientific instrumentation needs that are most amenable to open source solutions. For example, the EPA Office of Research and Development could use and contribute to open source air quality sensors for research on spatial and temporal variation in air quality, and the USDA could use and contribute to an open source soil testing kit.
- Task agency challenge and prize coordinators with working across their agencies on a challenge or prize competition to create an open source option for one of the scientific instruments or sensors identified above that meets agency quality requirements.
- Support agency staff in using open source approaches when creating and using scientific instrumentation. Include open source scientific instrumentation in internal communication products, highlight staff efforts to create and use open science hardware, and provide training to agency staff on its development and use.
- Integrate open source hardware into Procurement Innovation Labs or agency procurement offices. This may include training acquisition professionals on open science hardware so that they understand its benefits and can better support agency staff, covering options for using open source designs and how to understand and apply open source licenses.
Conclusion
Defaulting to open science hardware for agency science will produce an open library of scientific tools that are replicable and customizable, yielding a much higher return on investment. Beyond that, prioritizing open science hardware in agency science would allow all kinds of institutions, organizations, communities, and individuals to contribute to agency science goals in a way that builds on one another’s efforts.
Open scientific grant proposals to advance innovation, collaboration, and evidence-based policy
Grant writing is a significant part of a scientist’s work. While time-consuming, this process generates a wealth of innovative ideas and in-depth knowledge. However, much of this valuable intellectual output — particularly from the roughly 70% of proposals that go unfunded — remains unseen and underutilized. The default secrecy of scientific proposals is based on many valid concerns, yet it represents a significant loss of potential progress and a deviation from government priorities around openness and transparency in science policy. Making grant proposals publicly accessible could transform them into a rich resource for collaboration, learning, and scientific discovery, significantly enhancing the overall impact and efficiency of scientific research efforts.
We recommend that funding agencies implement a process by which researchers can opt to make their grant proposals publicly available. This would enhance transparency in research, encourage collaboration, and optimize the public-good impacts of the federal funding process.
Details
Scientists spend a great deal of time, energy, and effort writing applications for grant funding. Writing grants has been estimated to take roughly 15% of a researcher’s working hours and involves putting together an extensive assessment of the state of knowledge, identifying key gaps in understanding that the researcher is well-positioned to fill, and producing a detailed roadmap for how they plan to fill that knowledge gap over a span of (typically) two to five years. At major federal funding agencies like the National Institutes of Health (NIH) and National Science Foundation (NSF), the success rate for research grant applications tends to fall in the range of 20%–30%.
The upfront labor required of scientists to pursue funding, and the low success rates of applications, has led some to estimate that ~10% of scientists’ working hours are “wasted.” Other scholars argue that the act of grant writing is itself a valuable and generative process that produces spillover benefits by incentivizing research effort and informing future scholarship. Under either viewpoint, one approach to reducing the “waste” and dramatically increasing the benefits of grant writing is to encourage proposals — both funded and unfunded — to be released as public goods, thus unlocking the knowledge, frontier ideas, and roadmaps for future research that are currently hidden from view.
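The ~10% figure can be roughly reconstructed from the numbers above (a back-of-the-envelope sketch, assuming grant writing consumes ~15% of working hours and ~70% of proposals go unfunded):

$$0.15 \times 0.70 \approx 0.10$$

That is, about a tenth of researchers’ total working hours goes into proposals that are never funded.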
The idea of grant proposals being made public is a sensitive one. Indeed, there are valid reasons for keeping proposals confidential, particularly when they contain intellectual property or proprietary information, or when they are in the early stages of development. However, these reasons do not apply to all proposals, and many potential concerns only apply for a short time frame. Therefore, neither full disclosure nor full secrecy are optimal; a more flexible approach that encourages researchers to choose when and how to share their proposals could yield significant benefits with minimal risks.
The potential benefits to the scientific community and science funders include:
- Encouraging collaboration by making promising unfunded ideas and shared interests discoverable by disparate researchers
- Supporting early-career scientists by giving them access to a rich collection of successful and unsuccessful proposals from which to learn
- Facilitating cutting-edge science-of-science research to unlock policy-relevant knowledge about research programs and scientific grantmaking
- Allowing for philanthropic crowd-in by creating a transparent and searchable marketplace of grant proposals that can attract additional or alternative funding
- Promoting efficiency in the research planning and budgeting process by increasing transparency
- Giving scientists, science funders, and the public a view into the whole of early-stage scientific thought, above and beyond the outputs of completed projects
Recommendations
Federal funding agencies should develop a process to allow and encourage researchers to share their grant proposals publicly, within existing infrastructures for grant reporting (e.g., NIH RePORTER). Sharing should be minimally burdensome and incorporated into existing application frameworks. The process should be flexible, allowing researchers to opt in or out — and to specify other characteristics like embargoes — to ensure applicants’ privacy and intellectual property concerns are mitigated.
The White House Office of Management and Budget (OMB) should develop a framework for publicly sharing grant proposals.
- OMB’s Evidence Team — in partnership with federal funding agencies (e.g., NIH, NSF, NASA, DOE) — should review statutory and regulatory frameworks to determine whether there are legal obstacles to sharing proposal content for extramural grant applications with applicant permission.
- OMB should then issue a memo clarifying the manner in which agencies can make proposals public and directing agencies to develop plans to allow and encourage the public availability of scientific grant proposals, in alignment with the Foundations for Evidence-Based Policymaking Act and the “Memorandum on Restoring Trust in Government Through Scientific Integrity and Evidence-Based Policymaking.”
The NSF should run a focused pilot program to assess opportunities and obstacles for proposal sharing across disciplines.
- NSF’s Division of Institution and Award Support (DIAS) should work with at least three directorates to launch a pilot study assessing applicants’ perspectives on proposal sharing, their perceived risks and concerns, and disciplinary differences in applicants’ views.
- The National Science Board (NSB) should produce a report outlining the findings of the pilot study and the implications for optimal approaches to facilitating public access of grant proposals.
Based on the NSB’s report, OSTP and OMB should work with federal funding agencies to refine and implement a proposal-sharing process across agencies.
- OSTP should work with funding agencies to develop a unified application section where researchers can indicate their release preferences. The group should agree on a set of shared parameters to align the request across agencies. For example, the guidelines should establish:
- A set of embargo options, such that applicants can choose to make their proposal available after, for example, 2, 5, or 10 years
- Whether the sharing of proposals can be made conditional on acceptance/rejection
- When in the application process applicants should be asked to opt in or out, and an approach for allowing applicants to revise their decision following submission
- OMB should include public access for grant proposals as a budget priority, emphasizing its potential benefits for bolstering innovation, efficiency, and government data availability. It should also provide guidance and technical assistance to agencies on how to implement the open grants process and require agencies to provide evidence of their plans to do so.
Establish data collaboratives to foster meaningful public involvement
Federal agencies are striving to expand the role of the public, including members of marginalized communities, in developing regulatory policy. At the same time, agencies are considering how to mobilize data of increasing size and complexity to ensure that policies are equitable and evidence-based. However, community engagement has rarely been extended to the process of examining and interpreting data. This is a missed opportunity: community members can offer critical context to quantitative data, ground-truth data analyses, and suggest ways of looking at data that could inform policy responses to pressing problems in their lives. Realizing this opportunity requires a structure for public participation in which community members can expect both support from agency staff in accessing and understanding data and genuine openness to new perspectives on quantitative analysis.
To deepen community involvement in developing evidence-based policy, federal agencies should form Data Collaboratives in which staff and members of the public engage in mutual learning about available datasets and their affordances for clarifying policy problems.
Details
Executive Order 14094 and the Office of Management and Budget’s subsequent guidance memo direct federal agencies to broaden public participation and community engagement in the federal regulatory process. Among the aims of this policy are to establish two-way communications and promote trust between government agencies and the public, particularly members of historically underserved communities. Under the Executive Order, the federal government also seeks to involve communities earlier in the policy process. This new attention to community engagement can seem disconnected from the federal government’s long-standing commitment to evidence-based policy and efforts to ensure that data available to agencies support equity in policy-making; assessing data and evidence is usually considered a job for people with highly specialized, quantitative skills. However, lack of transparency about the collection and uses of data can undermine public trust in government decision-making. Further, communities may have vital knowledge that credentialed experts don’t, knowledge that could help put data in context and make analyses more relevant to problems on the ground.
For the federal government to achieve its goals of broadened participation and equitable data, opportunities must be created for members of the public and underserved communities to help shape how data are used to inform public policy. Data Collaboratives would provide such an opportunity. Data Collaboratives would consist of agency staff and individuals affected by the agency’s policies. Each member of a Data Collaborative would be regarded as someone with valuable knowledge and insight; staff members’ role would not be to explain or educate but to learn alongside community participants. To foster mutual learning, Data Collaboratives would meet regularly and frequently (e.g., every other week) for a year or more.
Each Data Collaborative would focus on a policy problem that an agency wishes to address. The Environmental Protection Agency might, for example, form a Data Collaborative on pollution prevention in the oil and gas sector. Depending on the policy problem, staff from multiple agencies may be involved alongside community participants. The Data Collaborative’s goal would be to surface the datasets potentially relevant to the policy problem, understand how they could inform the problem, and identify their limitations. Data Collaboratives would not make formal recommendations or seek consensus; however, ongoing deliberations about the datasets and their affordances can be expected to create a more robust foundation for the use of data in policy development and the development of additional data resources.
Recommendations
The Office of Management and Budget should
- Establish a government-wide Data Collaboratives program in consultation with the Chief Data Officers Council.
- Work with leadership at federal agencies to identify policy problems that would benefit from consideration by a Data Collaborative. It is expected that deputy administrators, heads of equity and diversity offices, and chief data officers would be among those consulted.
- Hire a full-time director of Data Collaboratives to lead such tasks as coordinating with public participants, facilitating meetings, and ensuring that relevant data resources are available to all collaborative members.
- Ensure agencies’ ability to provide the material support necessary to secure the participation of underrepresented community members in Data Collaboratives, such as stipends, childcare, and transportation.
- Support agencies in highlighting the activities and accomplishments of Data Collaboratives through social media, press releases, open houses, and other means.
Conclusion
Data Collaboratives would move public participation and community engagement upstream in the policy process by creating opportunities for community members to contribute their lived experience to the assessment of data and the framing of policy problems. This would in turn foster two-way communication and trusting relationships between government and the public. Data Collaboratives would also help ensure that data and their uses in federal government are equitable, by inviting a broader range of perspectives on how data analysis can promote equity and where relevant data are missing. Finally, Data Collaboratives would be one vehicle for enabling individuals to participate in science, technology, engineering, math, and medicine activities throughout their lives, increasing the quality of American science and the competitiveness of American industry.
Make publishing more efficient and equitable by supporting a “publish, then review” model
Preprinting — a process in which researchers upload manuscripts to online servers prior to the completion of formal peer review — has proven to be a valuable tool for disseminating preliminary scientific findings. This model has the potential to speed the process of discovery, enhance rigor through broad discussion, support equitable access to publishing, and promote transparency in peer review. Yet the model’s use and expansion are limited by a lack of explicit recognition within funding agencies’ assessment practices.
The federal government should take action to support preprinting, preprint review, and “no-pay” publishing models in order to make scholarly publishing of federal outputs more rapid, rigorous, and cost-efficient.
Details
In 2022, the Office of Science and Technology Policy (OSTP) memo “Ensuring Free, Immediate, and Equitable Access to Federally Funded Research,” written by Dr. Alondra Nelson, directed federal funding agencies to make the results of taxpayer-supported research immediately accessible to readers at no cost. This important development extended John P. Holdren’s 2013 memo “Increasing Access to the Results of Federally Funded Scientific Research” by covering all federal agencies and removing 12-month embargoes on free access, and it mirrored developments abroad such as the open access provisions of Horizon 2020 in Europe.
One of the key provisions of the Nelson memo is that federal agencies should “allow researchers to include reasonable publication costs … as allowable expenses in all research budgets,” signaling support for the article processing charge (APC) model. Because APCs shift publication costs onto authors, this provision creates barriers to equitable publishing for researchers with limited access to funds. Furthermore, leaving the definition of “reasonable costs” open to interpretation creates the risk that an increasing share of federal research funds will be absorbed by publishing fees. In 2022, OSTP estimated that American taxpayers already pay $390 to $798 million annually to publish federally funded research.
Without further intervention, these costs are likely to rise: publishers have historically responded to increasing demand for open access publishing by shifting from a subscription model to one in which authors pay to publish through APCs. For example, APCs increased by 50 percent from 2010 to 2019.
The “no pay” model
In May 2023, the European Union’s council of ministers called for a “no pay” academic publishing model, in which costs are paid directly by institutions and funders to ensure equitable access to read and publish scholarship. There are several routes to achieve the no pay model, including transitioning journals to ‘Diamond’ Open Access models, in which neither authors nor readers are charged.
However, in contrast to models that rely on transforming journal publishing, an alternative approach relies on the burgeoning preprint system. Preprints are manuscripts posted online by authors to a repository, without charge to authors or readers. Over the past decade, their use across the scientific enterprise has grown dramatically, offering unique flexibility and speed to scientists and encouraging dynamic conversation. More recently, preprints have been paired with a new system of preprint peer review. In this model, organizations like Peer Community In, Review Commons, and RR\ID organize expert review of preprints from the community. These reviews are posted publicly and independent of a specific publisher or journal’s process.
Despite the growing popularity of this approach, its uptake is limited by a lack of support and incorporation into science funding and evaluation models. Federal action to encourage the “publish, then review” model offers several benefits:
- Research is available sooner, and society benefits more rapidly from new scientific findings. With preprints, researchers share their work with the community months or years ahead of journal publication, allowing others to build off their advances.
- Peer review is more efficient and rigorous because the content of the review reports (though not necessarily the identity of the reviewers) is open. Readers are able to understand the level of scrutiny that went into the review process. Furthermore, an open review process enables anyone in the community to join the conversation and bring in perspectives and expertise that are currently excluded. The review process is less wasteful since reviews are not discarded with journal rejection, making better use of researchers’ time.
- Taxpayer research dollars are used more effectively. Disentangling transparent fees for dissemination and peer reviews from a publishing market driven largely by prestige would result in lower publishing costs, enabling additional funds to be used for research.
Recommendations
To support preprint-based publishing and equitable access to research:
Congress should
- Commission a report from the National Academies of Sciences, Engineering, and Medicine on benefits, risks, and projected costs to American taxpayers of supporting alternative scholarly publishing approaches, including open infrastructure for the “publish, then review” model.
OSTP should
- Coordinate support for key disciplinary infrastructures and peer review service providers with partnerships, discoverability initiatives, and funding.
- Draft a policy in which agencies require that papers resulting from federal funding be preprinted at or before submission for peer review and updated with each subsequent version; then work with agencies to conduct a feasibility study and stakeholder engagement to understand opportunities (including cost savings) and obstacles to implementation.
Science funding agencies should
- Recognize preprints and public peer review. Following the lead of the National Institutes of Health’s 2017 Guide Notice on Reporting Preprints and Other Interim Research Products, revise grant application and report guidelines to allow researchers to cite preprints. Extend this provision to include publicly accessible reviews they have received and authored. Provide guidance to reviewers on evaluating these outputs as scientific contributions within an applicant’s body of work.
Establish grant supplements for open science infrastructure security
Open science infrastructure (OSI), such as platforms for sharing research products or conducting analyses, is vulnerable to security threats and misappropriation. Because these systems are designed to be inclusive and accessible, they often require few credentials of their users. However, this quality also puts OSI at risk of attack and misuse. Seeking to provide quality tools to their users, OSI builders dedicate their often scant funding to addressing these security issues, sometimes delaying other important software work.
To support these teams and allow for timely resolution of security problems, science funders should offer security-focused grant supplements to funded OSI projects.
Details
Existing federal policy and funding programs recognize the importance of security to scholarly infrastructure like OSI. For example, in October 2023, President Biden issued an Executive Order to manage the risks of artificial intelligence (AI) and ensure these technologies are safe, secure, and trustworthy. Under the Secure and Trustworthy Cyberspace program, the National Science Foundation (NSF) provides grants to ensure the security of cyberinfrastructure and asks scholars who collect data to plan for its secure storage and sharing. Furthermore, agencies like NSF and the National Institutes of Health (NIH) already offer supplements for existing grants. What is still needed is rapid disbursement of funds to address unanticipated security concerns across scientific domains.
Risks like secure shell (SSH) attacks, data poisoning, and the proliferation of mis/disinformation on OSI threaten the utility, sustainability, and reputation of OSI. These concerns are urgent. New access to powerful generative AI tools, for instance, makes it easy to create disinformation that can convincingly mimic the rigorous science shared via OSI. In fact, increased open access to science can accelerate the proliferation of AI-generated scholarly disinformation by improving the accuracy of the models that generate it.
OSI is commonly funded by grants that afford little support for the maintenance work that could stop misappropriation and security threats. Without financial resources and an explicit commitment to a funder, it is difficult for software teams to prioritize these efforts. To ensure uptake of OSI and its continued utility, these teams must have greater access to financial resources and relevant talent to address these security concerns and norms violations.
Recommendations
Security concerns may be unanticipated and urgent, and they rarely align with scheduled calls for research proposals. To support OSI facing security risks in a timely manner, executive action should be taken through the federal agencies that fund science infrastructure (NSF, NIH, NASA, DOE, DOD, NOAA). These agencies should offer research supplements to address OSI misappropriation and security threats. Supplement requests would be subject to internal review by funding agencies but not to peer review, allowing teams to circumvent the lengthier review process required for a full grant proposal. Unlike full grant proposals, research supplements would allow researchers to respond nimbly to novel security concerns that arise after initial funding is received. Additionally, researchers who provide OSI but are less familiar with security issues may not anticipate all relevant threats when a project is conceived and initial funding is distributed (managers of from-scratch science gateways are one possible example). Supplying funds through supplements when the need arises can protect sensitive data and infrastructure.
These research supplements can be made available to principal investigators and co-principal investigators with active awards. Supplements may be used to support additional or existing personnel, allowing OSI builders to bring new expertise to their teams as necessary. To ensure that funds can address unanticipated security issues in OSI from a variety of scholarly domains, supplement recipients need not be funded under an existing program to explicitly support open science infrastructure (e.g., NSF’s POSE program).
To minimize the administrative burden of review, applications for supplements should be kept short (e.g., no more than five pages, excluding budget) and should include the following:
- A description of the security issue to be addressed
- A convincing argument that the infrastructure has goals of increasing the inclusion, accessibility, and/or transparency of science, and that pursuing those goals exacerbates the relevant security threat
- A description of the steps to be taken to address the security issue and a well-supported argument that the funded researchers have the expertise and tools necessary to carry those steps out
- A brief description of the original grant’s scope, making evident that the supplemental funding will support work outside of the original scope
- An explanation as to why a grant supplement is more appropriate for their circumstances than a new full grant application
- A budget for the work
An annual appropriation of $3 million across federal science funders would support 40 supplemental awards of $75,000 each for OSI projects. While the budget needed to address each security issue will vary, this estimate demonstrates the reach these supplements could have.
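The arithmetic behind that sizing is straightforward (assuming a uniform award size):

$$40 \text{ awards} \times \$75{,}000 \text{ per award} = \$3{,}000{,}000 \text{ per year}$$

A different award size would trade off the number of projects reached against the depth of support per project.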
Research software like OSI often struggles to find funding for maintenance. These much-needed supplemental funds will ensure that OSI developers can speedily prioritize important security-related work without doing so at the expense of other planned software work. Without this funding, we risk compromising the reputation of open science, consuming precious development resources allocated to other tasks, and negatively affecting OSI users’ experience. Grant supplements to address OSI security threats and misappropriation ensure the sustainability of OSI going forward.
Expand capacity and coordination to better integrate community data into environmental governance
Frontline communities bear the brunt of harms created by climate change and environmental pollution, but they also increasingly generate their own data, providing critical social and environmental context often not present in research or agency-collected data. However, community data collectors face many obstacles to integrating this data into federal systems: they must navigate complex local and federal policies within dense legal landscapes, and even when there is interest or demonstrated need, agencies and researchers may lack the capacity to find or integrate this data responsibly.
Federal research and regulatory agencies, as well as the White House, are increasingly supporting community-led environmental justice initiatives, presenting an opportunity to better integrate local and contextualized information into more effective and responsive environmental policy.
The Environmental Protection Agency (EPA) should better integrate community data into environmental research and governance by building internal capacity for recognizing and applying such data, facilitating connections between data communities, and addressing misalignments with data standards.
Details
Community science and monitoring are often overlooked yet vital facets of open science. Community science collaborations and their resulting data have led to historic environmental justice victories that underscore the importance of contextualized community-generated data in environmental problem-solving and evidence-informed policy-making.
Momentum around integrating community-generated environmental data has been building at the federal level for the past decade. In 2016, the report “A Vision for Citizen Science at EPA,” produced by the National Advisory Council for Environmental Policy and Technology (NACEPT), thoroughly diagnosed the need for a clear framework for moving community-generated environmental data and information into governance processes. Since then, EPA has developed additional participatory science resources, including a participatory science vision, policy guidelines, and equipment loan programs. More recently, in 2022, the EPA created an Equity Action Plan in alignment with its 2022–2026 Strategic Plan and established an Office of Environmental Justice and External Civil Rights (OEJECR). And in 2023, as part of the cross-agency Year of Open Science, the National Aeronautics and Space Administration (NASA) listed “broadening participation by historically excluded communities” among the strategic objectives of its Transform to Open Science (TOPS) program.
It is evident that the EPA and research funding agencies like NASA have a strategic and mission-driven interest in collaborating with communities bearing the brunt of environmental and climate injustice to unlock the potential of their data. It is also clear that current methods aren’t working. Communities that collect and use environmental data still must navigate disjointed reporting policies and data standards and face a dearth of resources on how to share data with relevant stakeholders within the federal government. There is a critical lack of capacity and coordination directed at cross-agency integration of community data and the infrastructure that could enable the use of this data in regulatory and policy-making processes.
Recommendations
To build government capacity to integrate community-generated data into environmental governance, the EPA should:
- Create a memorandum of understanding among the EPA’s OEJECR, the National Environmental Justice Advisory Council (NEJAC), the Office of Management and Budget (OMB), the United States Digital Service (USDS), and relevant research agencies — including NASA, the National Oceanic and Atmospheric Administration (NOAA), and the National Science Foundation (NSF) — to develop a collaborative framework for building internal capacity for generating and applying community-generated data, as well as managing it to enable its broader responsible reuse.
- Develop and distribute guidance on responsible scientific collaboration with communities that prioritizes ethical open science and data-sharing practices that center community and environmental justice priorities.
- Create a capacity-building program, designed by and with environmental justice and data-collecting communities, focused on building translational and intermediary roles within the EPA that can facilitate connections and responsible sharing between data holders and seekers. Incentivize the application of the aforementioned guidance within federally funded research by recruiting and training program staff to act as translational liaisons situated between the OEJECR, regional EPA offices, and relevant research funding agencies, including NASA, NOAA, and NSF.
To facilitate connections between communities generating data, the EPA should:
- Expand the scope of the current Environmental Information Exchange Network (EN) to include facilitation of environmental data sharing by community-based organizations and community science initiatives.
- Initiate a working group including representatives from data-generating community organizations to develop recommendations on how EN might accommodate community data and how its data governance processes can center community and environmental justice priorities.
- Provide grant funding within the EN earmarked for community-based organizations to support data integration with the EN platform. This could include hiring contractors with technical data management expertise to support data uploading within established standards or to build capacity internally within community-based organizations to collect and manage data according to EN standards.
- Expand the resources available for EN partners that support data quality assurance, advocacy, and sharing, for example by providing technical assistance through regional EPA offices trained through the aforementioned capacity-building program.
To address misaligned data standards, the EPA, in partnership with USDS and the OMB, should:
- Update and promote guidance resources for communities and community-based organizations aiming to apply the data standards EPA uses to integrate data in regulatory decisions.
- Initiate a collaborative co-design process for new data standards that can accommodate community-generated data, with representation from communities who collect environmental data. This may require the creation of maps or crosswalks to facilitate translation between standards, including research data standards, as well as internal capacity to maintain these crosswalks.
Community-generated data provides contextualized environmental information essential for evidence-based policy-making and regulation, which in turn reduces wasteful spending through better-designed programs. Moreover, healthcare costs for the general public will fall if better evidence is used to address pollution, and climate adaptation costs could be reduced by using more localized and granular data to address pressing environmental and climate issues now rather than in the future.
Our recommendations call for the addition of at least 10 full-time employees for each regional EPA office. The additional positions proposed could fill existing vacancies in newly established offices like the OEJECR. Additional budgetary allocations can also be made to the EPA’s EN to support technical infrastructure alterations and grant-making.
While there is substantial momentum and attention around community environmental data, our proposed capacity stimulus can make existing EPA processes more effective at achieving their mission and can help rebuild trust in agencies that are meant to serve the public.
A matter of trust: helping the bioeconomy reach its full potential with translational governance
The promise of the bioeconomy is massive and fast-growing — offering new jobs, enhanced supply chains, novel technologies, and sustainable bioproducts valued at a projected $4 trillion over the next 16 years. Although the United States has been a global leader, advancements in the bioeconomy — whether investing in specialized infrastructural hardware or building a multidisciplinary STEM workforce — depend on public trust. In fact, public trust is the key to unlocking the full potential of the bioeconomy; without it, the United States may fall short of long-term economic goals and even fall behind peer nations as a bioeconomy leader. Recent failures of the federal regulatory system for biotechnology threaten public trust, and recent regulations have been criticized for their lack of transparency. As a result, cross-sector efforts aim not just to reimagine the bioeconomy but to create a coordinated regulatory system for it. Given declining public trust in the federal government, however, even the most coordinated regulatory system will fail to boost the bioeconomy if it cannot earn that trust.
In response, the Biden-Harris Administration should direct a Bioeconomy Initiative Coordination Office (BICO) to establish a public engagement mechanism that operates in parallel with the biotechnology regulatory system. Citizen engagement and transparency are key to building public trust, yet current public engagement mechanisms cannot build trust among a public skeptical of a biotechnology’s rewards in light of its perceived risks. Bioeconomy coordination efforts should therefore prioritize public trust by adopting a new public-facing biotechnology evaluation program that collects data from nontraditional audiences via participatory technology assessments (pTA) and multi-criteria decision analysis (MCDA/MCDM) and provides insight that addresses the limitations of current mechanisms. In accordance with the CHIPS and Science Act, the public engagement program would provide a mechanism for a BICO to build public trust while advancing the bioeconomy.
The public engagement program will serve as a decision-making resource for the Coordinated Framework for the Regulation of Biotechnology (CFRB) and a data repository for evaluating public acceptance in the present and future bioeconomy.
Challenge and Opportunity
While policymakers have been addressing the challenge of sharing regulatory space among the three key agencies—Environmental Protection Agency (EPA), Food and Drug Administration (FDA), and USDA—transparency and public trust remain challenges for federal agencies, small to midsize developers, and even the public at large. The government plays a vital role in the regulatory process by providing guidelines that govern the interactions between the developers and consumers of biotechnology. For over 30 years, product developers have depended on strategic alliances with the government to ensure the market success of biotechnology. The marketplace and regulatory oversight are tightly linked, and their impacts on public confidence in the bioeconomy cannot be separated.
When it comes to a consumer’s purchase of a biotechnology product, the pivotal factor is often not price but trust. In 2016, the National Academies of Sciences, Engineering, and Medicine released recommendations on aligning public values with gene drive research. The report revealed that public engagement that promotes a “bi-directional exchange of information and perspectives” can increase public trust. Moreover, a 2022 report on Gene Drives in Agriculture highlights the importance of considering public perception and acceptance in risk-based decision-making.
The CHIPS and Science Act provides an opportunity to address transparency and public trust within the federal regulatory system for biotechnology by directing the Office of Science and Technology Policy (OSTP) to establish a Coordination Office for the National Engineering Biology Research and Development Initiative. The coordination office (i.e., BICO) will serve as a point of contact for cross-sector engagement and create a junction for the exchange of technical and programmatic information. Additionally, the office will conduct public outreach and produce recommendations for strengthening the bioeconomy.
This policy window presents a novel opportunity to create a regulatory system for the bioeconomy that also encompasses the voice of the general public. The history of requests for information, public hearings, and cross-sector partnerships demonstrates the capacity of the public—or at least of specific expert subsets within it—to fill gaps, oversights, and ambiguities within biotechnology regulations.
While expert opinion is essential for developing regulation, so too are the opinions of the general public. Historically, discussions about values, sentiments, and opinions on biotechnology have been dominated by technical experts (for example, through debates on product vs. process, genetically engineered vs. genetically modified organisms, and perceived safety). Biotechnology discourse has primarily been restricted to these traditional, technical audiences, and as a result, public calls to address concerns about biotechnology are drowned out by expert opinions. We need a mechanism for public engagement that prioritizes collecting data from nontraditional audiences. This will ensure sustainable and responsible advancements in the bioeconomy.
If we want to establish a bioeconomy that increases national competitiveness, then we need to increase the participation of nontraditional audiences. Although some public concerns are unlikely to be allayed through policy change (e.g., addressing calls for a ban on genetically engineered or modified products), a public engagement program could identify the underlying issue(s) for these concerns. This would enable the adoption of comprehensive strategies that increase public trust, further sustaining the bioeconomy.
Research shows that public comment and notice periods are less likely to hear from nontraditional audiences—that is, members of underserved communities, workers, smaller market entities, and new firms. Despite the statutory and capacity-based obstacles federal agencies face in increasing public participation, the Executive Office of the President seeks to broaden public participation and community engagement in the federal regulatory process. Public engagement programs provide a platform to interact with interested parties representing a wide range of perspectives. Information gathered from public engagement could thus inform future proposed updates to the CFRB and the regulatory pathways for new products. In this way, public opinions and sentiments can be incorporated into a translational governance framework. Since increasing public trust is complementary to advancing the bioeconomy, there is translational value in strategically integrating a collective perception of risk and safety into future biotechnology regulation. Translational governance thus allows for regulation that is informed by science and responsive to the values of citizens, effectively introducing a policy lever that improves the adoption of, and investment in, the U.S. bioeconomy.
The future of biotechnology regulation lies within an emerging innovation ecosystem. The path to accomplishing economic goals within this ecosystem requires a new public-facing engagement framework that satisfies congressional directives and advances the bioeconomy. This framework provides a BICO with the scaffolding necessary to create an infrastructure that invites public input and community reflection, with the potential to decrease the number of biotechnologies that fail to reach the market. The proposed public engagement mechanism will work alongside the current regulatory system for biotechnology to enhance public trust, improve interagency coordination, and strengthen the bioeconomy.
Plan of Action
To reach our national bioeconomic policy goals, the BICO should use a public engagement program to solicit perspectives and develop an understanding of non-economic values, such as deeply held beliefs about the relationship between humans and the environment or personal or cultural perspectives related to specific biotechnologies. The BICO should devote $10 million over five years to public engagement programs and advisory board activities that (a) report to the BICO but are carried out through external partnerships; (b) provide meaningful social data for biotechnology regulation while running parallel to the CFRB regulatory system; and (c) produce a repository of public acceptance data for horizon scanning. These programs will inform regulatory decision-making, increase public trust, and achieve the congressional directives outlined in Sec. 10402 of the CHIPS & Science Act.
Recommendation 1. Establish a Bioeconomy Initiative Coordination Office (BICO) as a home office for interagency coordination.
The BICO should be housed within the Office of Science and Technology Policy (OSTP). The creation of a BICO is in alignment with the mandates of Executive Order (EO) 14081, Advancing Biotechnology and Biomanufacturing Innovation for a Sustainable, Safe, and Secure Bioeconomy, and the statutory authority granted to the OSTP through the CHIPS and Science Act.
Congress should allocate $2 million annually for five years to the BICO to carry out a public engagement program and advisory board activities in coordination with the EPA, FDA, and USDA.
The public engagement program would be housed within the BICO as a public-facing data-generating mechanism that parallels the current federal regulatory system for biotechnology.
The bioeconomy cuts across sectors (e.g., agriculture, health, materials, energy) and actively creates new connections and opportunities for national competitiveness. A thriving bioeconomy must ensure regulatory policy coherence and alignment among these sectors, and the BICO should be able to comprehensively integrate information from multiple sectors into a strategy that increases public awareness and acceptance of bioeconomy-related products and services. Public engagement should be used to build a data ecosystem of values related to biotechnologies that advance the bioeconomy.
Recommendation 2. Establish a process for public engagement and produce a large repository of public acceptance data.
Public acceptance data will be collected alongside the “biological data ecosystem,” as referenced by the Biden Administration in EO 14081, that advances innovation in the bioeconomy. To provide an expert element to the public engagement process, an advisory board should be involved in translating public acceptance data (opinions on how biotechnologies align with values) into policy suggestions and recommendations for regulatory agencies. The advisory board should be a formal entity recognized under the Federal Advisory Committee Act (FACA) and subject to the Freedom of Information Act (FOIA). It should be diverse but not so large that it becomes inefficient in fulfilling its mandate; striking a balance between compositional diversity and operational efficiency is critical to ensuring the board provides valuable insights and recommendations to the BICO. The advisory board should consist of up to 25 members, reflect NSF data on diversity and STEM, and include a diverse range of citizens, from everyday consumers (such as parents, young adults, and patients from different ethnic backgrounds) to specialists in various disciplines (such as biologists, philosophers, hair stylists, sanitation workers, social workers, and dietitians). To promote transparency and increase public trust, data will be subject to FACA and FOIA regulations, advisory board meetings must be accessible to the public, and meeting details must be published in the Federal Register. Additionally, management and application of any data collected should employ the CARE (Collective Benefit, Authority to Control, Responsibility, Ethics) Principles for Indigenous Data Governance, which complement the FAIR (Findable, Accessible, Interoperable, and Reusable) Principles. Adopting CARE brings a people-and-purpose orientation to data governance and is rooted in Indigenous Peoples’ sovereignty.
The BICO can look to the National Science and Technology Council’s published requests for information (RFIs) and public meetings as a model for public engagement. The BICO should work with an inclusive network of external partners to design workshops for collecting public acceptance data. Using participatory technology assessment (pTA) methods, the BICO will fund public engagement activities such as open-framing focus groups, workshops, and forums that prioritize input from nontraditional public audiences. The BICO should use pre-submission data, past technologies, near-term biotechnologies, and, where helpful, imaginative scenarios to produce case studies to engage with these audiences. Public engagement should be hosted by external grantees who maintain a wide-ranging network of interdisciplinary specialists and interested citizens to facilitate activities.
Qualitative and quantitative data will be used to reveal themes, public values, and rationales, which will aid product developers and others in the bioeconomy as they decide on new directions and potential products. This process will also serve as the primary data source for the public acceptance data repository. Evolving risk pathways are a considerable concern for any regulatory system, especially one tasked with regulating biotechnologies. How risks are managed is subject to many factors (history, knowledge, experience, product) and has a lasting impact on public trust. Advancing the bioeconomy requires a transparent decision-making process that integrates public input and allows society to redefine risks and safety as a collective. Public acceptance data should inform an understanding of values, risks, and safety and improve horizon-scanning capabilities.
Conclusion
As the use of biotechnology continues to expand, policymakers must remain adaptive in their regulatory approach to ensure that public trust is acquired and maintained. Recent federal action to boost the bioeconomy provides an opportunity for policymakers to expand public engagement and improve public acceptance of biotechnology. By centralizing coordination and integrating public input, policymakers can create a responsive regulatory convention that advances the bioeconomy while also building public trust. To achieve this, the public engagement program will combine elements of community-based participatory research, value-based assessments, pTA, and CARE Principles for Indigenous Data Governance. This approach will create a translational mechanism that improves interagency coordination and builds public trust. As the government works to create a regulatory framework for the bioeconomy, the need for public participation will only increase. By leveraging the expertise and perspectives of a diverse range of interested parties, policymakers can ensure that the regulatory framework is aligned with public values and concerns while promoting innovation and progress in the U.S. bioeconomy.
Translational governance focuses on expediting the implementation of regulations to safeguard human health and the environment while simultaneously encouraging innovation. This approach involves integrating non-economic values into decision-making processes to enhance scientific and statutory criteria of risk and safety by considering public perceptions of risk and safety. Essentially, it is the policy and regulatory equivalent of translational research, which strives to bring healthcare discoveries to market swiftly and safely.
The Office of Science and Technology Policy (OSTP) should use translational governance through public engagement as a backbone of the National Engineering Biology Research and Development Initiative. Following the designation of an interagency committee by the OSTP—and once established under the scope of direction outlined in Sec. 10403 of the CHIPS and Science Act—the Initiative Coordination Office should use a public engagement program to support the following National Engineering Biology Research and Development Initiative congressional directives (Sec. 10402):
- 1) Supporting social and behavioral sciences and economics research that advances the field of engineering biology and contributes to the development and public understanding of new products, processes, and technologies.
- 2) Improving the scientific and lay public’s understanding of engineering biology and supporting greater evidence-based public discourse about its benefits and risks.
- 3) Supporting research relating to the risks and benefits of engineering biology, including, under subsection (d), ensuring, through the agencies and departments that participate in the Initiative, that public input and outreach are integrated into the Initiative by the convening of regular and ongoing public discussions through mechanisms such as workshops, consensus conferences, and educational events, as appropriate.
- 4) Expanding the number of researchers, educators, and students, and building a retooled workforce, with engineering biology training, including from traditionally underrepresented and underserved populations.
- 5) Accelerating the translation and commercialization of engineering biology and biomanufacturing research and development by the private sector.
- 6) Improving the interagency planning and coordination of federal government activities related to engineering biology.
In 1986, the newly issued regulatory system for biotechnology products faced significant statutory challenges in establishing the jurisdiction of the three key regulatory agencies (EPA, FDA, and USDA). In those early days, agency coordination allowed for the successful regulation of products that had shared jurisdiction. For example, one agency would regulate the plant in the field (USDA), and another would regulate the feed or food produced by the plant (FDA/DHHS). However, as the biotechnology product landscape has advanced, so has the complexity of agency coordination. For example, at the time of their commercialization, plants that were modified to exhibit pesticidal traits, specific microbial products, and certain genetically modified organisms cut across product use-specific regulations that were organized according to agency (i.e., field plants, food, pesticides). In response, the three key agencies have traditionally implemented their own rules and regulations (e.g., FDA’s Generally Recognized as Safe designation, USDA’s Am I Regulated? process, and USDA’s SECURE Rule). While such policy action is within their statutory authorities, it has resulted in policy resistance, reinforcing the confusion and lack of transparency within the regulatory process.
Since its formal debut on June 26, 1986, the CFRB has undergone two major updates, in 1992 and 2017. Additionally, the CFRB has been subject to multiple memoranda of understanding as well as two executive orders across two consecutive administrations (Trump and Biden). With the arrival of the CHIPS and Science Act (2022) and Executive Order 14081, the CFRB will likely undertake one of its most extensive updates yet—modernization for the bioeconomy.
According to the EPA, when the CFRB was issued in 1986, the expectation was that the framework would respond to the experiences of industry and the agencies and that modifications would be accomplished through administrative or legislative actions. Moreover, upon releasing the 2017 updates to the CFRB, the Obama Administration described the CFRB as a “flexible regulatory structure that provides appropriate oversight for all products of modern biotechnology.” With this understanding, the CFRB is designed to be iterative and responsive to change. However, as this memo and other reports demonstrate, not all products of modern biotechnology are subject to appropriate oversight. The lag between identifying regulatory concerns and enacting the regulations necessary to fully capitalize on the evolving biotechnology landscape represents a costly delay. The CFRB is falling behind biotechnology in a manner that hampers the bioeconomy—and, likely, the future economy.
Automating Scientific Discovery: A Research Agenda for Advancing Self-Driving Labs
Despite significant advances in scientific tools and methods, the traditional, labor-intensive model of scientific research in materials discovery has seen little innovation. The reliance on highly skilled but underpaid graduate students as labor to run experiments hinders the labor productivity of our scientific ecosystem. An emerging technology platform known as Self-Driving Labs (SDLs), which use commoditized robotics and artificial intelligence for automated experimentation, presents a potential solution to these challenges.
SDLs are not just theoretical constructs but have already been implemented at small scales in a few labs. An ARPA-E-funded Grand Challenge could drive funding, innovation, and development of SDLs, accelerating their integration into the scientific process. A Focused Research Organization (FRO) can also help create more modular and open-source components for SDLs and can be funded by philanthropies or the Department of Energy’s (DOE) new foundation. With additional funding, DOE national labs can also establish user facilities for scientists across the country to gain more experience working with autonomous scientific discovery platforms. In an era of strategic competition, funding emerging technology platforms like SDLs is all the more important to help the United States maintain its lead in materials innovation.
Challenge and Opportunity
New scientific ideas are critical for technological progress. These ideas often form the seed insight to creating new technologies: lighter cars that are more energy efficient, stronger submarines to support national security, and more efficient clean energy like solar panels and offshore wind. While the past several centuries have seen incredible progress in scientific understanding, the fundamental labor structure of how we do science has not changed. Our microscopes have become far more sophisticated, yet the actual synthesizing and testing of new materials is still laboriously done in university laboratories by highly knowledgeable graduate students. This lack of innovation in how scientific labor is organized may account for the stagnation of research labor productivity, a primary cause of concerns about the slowing of scientific progress. Indeed, analysis of scientific literature suggests that scientific papers are becoming less disruptive over time and that new ideas are getting harder to find. The slowing rate of new scientific ideas, particularly in the discovery of new materials or advances in materials efficiency, poses a substantial risk, potentially costing billions of dollars in economic value and jeopardizing global competitiveness. However, incredible advances in artificial intelligence (AI) coupled with the rise of cheap but robust robot arms are leading to a promising new paradigm of materials discovery and innovation: Self-Driving Labs. An SDL is a platform where material synthesis and characterization is done by robots, with AI models intelligently selecting new material designs to test based on previous experimental results. These platforms enable researchers to rapidly explore and optimize designs within otherwise unfeasibly large search spaces.
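To illustrate the closed loop at the heart of an SDL, here is a minimal sketch in Python: a Gaussian-process surrogate model proposes the next candidate design via an expected-improvement rule, and a stub function stands in for robotic synthesis and characterization. The design space, objective function, and parameters are illustrative assumptions, not a description of any deployed platform.

```python
"""Minimal sketch of a Self-Driving Lab loop. A Gaussian-process surrogate
proposes the next candidate via expected improvement; a stub stands in for
robotic synthesis and characterization. Illustrative only."""
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def run_robot_experiment(x):
    # Placeholder for robotic synthesis + characterization of design x;
    # a real SDL would drive hardware and parse instrument output here.
    return float(-np.sum((x - 0.6) ** 2) + rng.normal(scale=0.01))

# Candidate design space, e.g., normalized precursor concentrations.
candidates = rng.uniform(0, 1, size=(5000, 3))

# Seed the loop with a handful of random experiments.
X = rng.uniform(0, 1, size=(5, 3))
y = np.array([run_robot_experiment(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(20):  # fixed experimental budget
    gp.fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    # Expected improvement: balance promising vs. uncertain designs.
    z = (mu - y.max()) / np.maximum(sigma, 1e-9)
    ei = (mu - y.max()) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = candidates[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, run_robot_experiment(x_next))

print("Best design:", X[np.argmax(y)], "objective:", round(y.max(), 4))
```

The optimization logic shown here is the comparatively mature piece; the hard engineering in a real SDL is the hardware integration hidden inside the stub.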
Today, most materials science labs are organized around a faculty member or principal investigator (PI), who manages a team of graduate students. Each graduate student designs experiments and hypotheses in collaboration with a PI and then executes the experiment, synthesizing the material and characterizing its properties. Unfortunately, that last step is often laborious and the most time-consuming. This sequential method of material discovery, in which highly knowledgeable graduate students spend large portions of their time doing manual wet-lab work, rate-limits the number of experiments and potential discoveries a given lab group can make. SDLs can significantly improve the labor productivity of our scientific enterprise, freeing highly skilled graduate students from menial experimental labor to craft new theories or distill novel insights from autonomously collected data. Additionally, they yield more reproducible outcomes, as experiments are run by code-driven motors rather than by humans, who may forget to include certain experimental details or have natural variations between procedures.
Self-Driving Labs are not a pipe dream. The biotech industry has spent decades developing advanced high-throughput synthesis and automation. For instance, while in the 1970s statins (one of the most successful cholesterol-lowering drug families) were discovered in part by a researcher manually testing 3,800 cultures over a year, today companies like AstraZeneca invest millions of dollars in automation and high-throughput research equipment (see figure 1). While drug and material discovery share some characteristics (e.g., combinatorially large search spaces and high impact of discovery), materials R&D has historically seen fewer capital investments in automation, primarily because it sits further upstream from where private investments anticipate predictable returns. There are, however, a few notable examples of SDLs being developed today. For instance, researchers at Boston University used a robot arm to test 3D-printed designs for uniaxial compression energy absorption, an important mechanical property for designing stronger structures in civil engineering and aerospace. A Bayesian optimizer was then used to iterate over 25,000 designs in a search space with trillions of possible candidates, which led to an optimized structure with the highest recorded mechanical energy absorption to date. Researchers at North Carolina State University used a microfluidic platform to autonomously synthesize more than 100 quantum dots, discovering formulations that were better than the previous state of the art in that material family.
These first-of-a-kind SDLs have shown exciting initial results, demonstrating their ability to discover new material designs in a haystack of thousands to trillions of possible candidates, far too many for any human researcher to grasp. However, SDLs are still an emerging technology platform. In order to scale them up and realize their full potential, the federal government will need to make significant and coordinated research investments to derisk this materials innovation platform and demonstrate the return on capital before the private sector is willing to invest.
Other nations are beginning to recognize the importance of a structured approach to funding SDLs: the University of Toronto’s Alan Aspuru-Guzik, a former Harvard professor who left the United States in 2018, has created an Acceleration Consortium to deploy SDLs and recently received $200 million in research funding, Canada’s largest-ever research grant. In an era of strategic competition and climate challenges, maintaining U.S. competitiveness in materials innovation is more important than ever. Building a strong research program to fund, build, and deploy SDLs in research labs should be a part of the U.S. innovation portfolio.
Plan of Action
While several labs in the United States are working on SDLs, they have all received small, ad hoc grants that are not coordinated in any way. No federal funding program dedicated to self-driving labs currently exists. As a result, SDLs remain constrained to the most tractable material systems (e.g., microfluidics), and the lack of patient capital hinders labs’ ability to scale these systems and realize their true potential. A coordinated U.S. research program for Self-Driving Labs should:
Initiate an ARPA-E SDL Grand Challenge: Drawing inspiration from DARPA’s previous grand challenges that catalyzed advancements in self-driving vehicles, ARPA-E should establish a Grand Challenge to catalyze state-of-the-art advancements in SDLs for scientific research. This challenge would involve an open call for teams to submit proposals for SDL projects, with a transparent set of performance metrics and benchmarks. Successful applicants would then receive funding to develop SDLs that demonstrate breakthroughs in automated scientific research. A projected budget for this initiative is $30 million, divided among six selected teams, each receiving $5 million over a four-year period to build and validate their SDL concepts. While ARPA-E is best positioned in terms of authority and funding flexibility, other institutions like the National Science Foundation (NSF) or DARPA itself could also fund similar programs.
Establish a Focused Research Organization to open-source SDL components: This FRO would be responsible for developing modular, open-source hardware and software specifically designed for SDL applications. Creating common standards for both the hardware and software needed for SDLs will make such technology more accessible and encourage wider adoption. The FRO would also conduct research on how automation via SDLs is likely to reshape labor roles within scientific research and provide best practices on how to incorporate SDLs into scientific workflows. A proposed operational timeframe for this organization is five years, with an estimated budget of $18 million over that time period. The organization would work on prototyping SDL-specific hardware solutions and make them available on an open-source basis to foster wider community participation and iterative improvement. An FRO could be spun out of the DOE’s new Foundation for Energy Security and Innovation (FESI), which would further establish the DOE’s role as an innovative science funder and be an exciting opportunity for FESI to work with nontraditional technical organizations. Using FESI would not require any new authorities and could leverage philanthropic funding, rather than requiring congressional appropriations.
Provide dedicated funding for the DOE national labs to build self-driving lab user facilities, so that the United States can build institutional expertise in SDL operations and other U.S. scientists can familiarize themselves with these platforms. This funding can be specifically set aside by the DOE Office of Science or through line-item appropriations from Congress. Existing prototype SDLs that have emerged in the past several years, like Argonne National Laboratory’s Rapid Prototyping Lab or Berkeley Lab’s A-Lab, lack sustained DOE funding but could be scaled up and supported with only $50 million in total funding over the next five years. SDLs are also one of the primary applications identified by the national labs in the “AI for Science, Energy, and Security” report, demonstrating the labs’ willingness to build out this infrastructure and underscoring the strategic importance the scientific research community assigns to SDLs.
As with any new laboratory technique, SDLs are not necessarily an appropriate tool for everything. Given that their main benefit lies in automation and the ability to rapidly iterate through designs experimentally, SDLs are likely best suited for:
- Material families with combinatorially large design spaces that lack clear design theories or numerical models (e.g., metal-organic frameworks, perovskites)
- Experiments where synthesis and characterization are either relatively quick or cheap and are amenable to automated handling (e.g., UV-vis spectroscopy is a relatively simple in-situ characterization technique)
- Scientific fields where numerical models are not accurate enough to train surrogate models or where experimental data repositories are lacking (e.g., density functional theory is often not a reliable enough surrogate model in materials science)
While these heuristics are suggested as guidelines, it will take a full-fledged program with actual results to determine what systems are most amenable to SDL disruption.
When it comes to exciting new technologies, there can be incentives to misuse terms. A Self-Driving Lab can be precisely defined as the automation of both material synthesis and characterization, with some degree of intelligent, automated decision-making in the loop. Based on this definition, here are common classes of experiments that are not SDLs:
- High-throughput synthesis, where synthesis automation allows for the rapid synthesis of many different material formulations in parallel (lacks characterization and AI-in-the-loop)
- Using an AI surrogate trained over numerical models, which is based on software-only results. Using an AI surrogate model to make material predictions and then synthesizing an optimal material is also not an SDL, though it is certainly still quite an accomplishment for AI in science (this approach lacks discovery of synthesis procedures and requires numerical models or prior existing data, neither of which is always readily available in the material sciences).
SDLs, like every other technology that we have adopted over the years, eliminate routine tasks that scientists must currently spend their time on. They will allow scientists to spend more time understanding scientific data, validating theories, and developing models for further experiments. They can automate routine tasks but not the job of being a scientist.
However, because SDLs require more firmware and software, they may favor larger facilities that can maintain long-term technicians and engineers who maintain and customize SDL platforms for various applications. An FRO could help address this asymmetry by developing open-source and modular software that smaller labs can adopt more easily upfront.
Finding True North with Community Navigator Programs
State, local, and Tribal governments still face major capacity issues when it comes to accessing federal funding opportunities – even with the sheer number of programs started since the Bipartisan Infrastructure Law (BIL) and Inflation Reduction Act (IRA) were passed. Communities need more technical assistance if implementation of those bills is going to reach its full potential, but the federal agencies charged with distributing funding can’t offer the amount of assistance needed to get resources where they need to go quickly, effectively, and equitably.
Community navigator programs offer a potential solution. Navigators are local and regional experts with a deep understanding of the climate and clean energy challenges and opportunities in their area. These navigators can be trained in federal funding requirements, clean energy technologies, permitting processes, and more – allowing them to share that knowledge with their communities and boost capacity.
Federal agencies like the Department of Energy (DOE) should invest in standing up these programs by collecting feedback on specific capacity needs from regional partners and attaching the programs to existing technical assistance funding. These programs can take different forms, but agencies should consider specific goals and desired outcomes, identify appropriate regional and local partners, and explore additional flexible funding opportunities to get them off the ground.
Community navigator programs can provide much-needed capacity combined with deep place-based knowledge to create local champions with expertise in accessing federal funding – relieving agencies of technical assistance burdens and smoothing grant-writing processes for local and state partners. Agencies should quickly take advantage of these programs to implement funding more effectively.
Challenge
BIL/IRA implementation is well under way, with countless programs being stood up at record speed by federal agencies. Of course, the sheer size of the packages means that there is still quite a bit of funding on the table at DOE that risks not being distributed effectively or equitably in the allotted time frame. While the agency is making huge strides to roll out its resources—which include state-level block grants, loan guarantee programs, and tax rebates—it has limited capacity to fully understand the unique needs of individual cities and communities and to support each location effectively in accessing funding opportunities and implementing related programs.
Subnational actors own the burden of distributing and applying for funding. States, cities, and communities want to support distribution, but they are not equally prepared to access federal funding quickly. They lack what officials call absorptive capacity: the ability to apply for, distribute, and implement funding packages. Agencies don’t have comprehensive knowledge of the barriers to implementation across the tens of thousands of communities and can’t provide the individualized technical assistance that is needed.
Two recent research projects identified several key ways that cities, state governments, and technical assistance organizations need support from federal agencies:
- Identifying appropriate federal funding opportunities and matching with projects and stakeholders
- Understanding complex federal requirements and processes for accessing those opportunities
- Assistance with completing applications quickly and accurately
- Assembling the necessary technical capacity during the pre-application period to develop a quality application with a higher likelihood of funding
- Guidance on allowable expenditures from federal funding that support the overall technical capacity or coordinating capability of a subnational entity to collect, analyze, and securely share data on project outcomes
While this research focuses on several BIL/IRA agencies, the Department of Energy in particular has distributed hundreds of billions of dollars to communities over the past few years. DOE faces an additional challenge: up until 2020, the agency was mainly focused on conducting basic science research. With the advent of BIL, IRA, and the CHIPS and Science Act, it had to adjust quickly to conduct more deployment and loan guarantee activities.
In order to meet community needs, DOE needs help – and at its core, this problem is one of talent and capacity. Since the passage of BIL, DOE has increased its hiring and bolstered its offices through the Clean Energy Corps.
Yet even if DOE could hire faster and more effectively, the sheer scope of the problem outweighs any number of federal employees. Candidates need not only certain skills but also knowledge specific to each community. To fully meet the needs of the localities and individuals it aims to reach, DOE would need to develop thorough community competency for the entire country. With over 29,000 defined communities in the United States – about half of which are classified as ‘low capacity’ – it’s simply impossible to hire enough people or identify and overcome the barriers each one faces in the short amount of time allotted to implementation of BIL/IRA. Government needs outside support in order to distribute funds quickly and equitably.
Opportunity
DOE, the rest of the federal government, and the national labs are keen to provide significant technical assistance for their programs. DOE’s Office of State and Community Energy Programs has put considerable time and energy into expanding its community support efforts, including the recently stood up Office of Community Engagement and the Community Energy Fellows program.
National labs have been engaging communities for a long time – the National Renewable Energy Laboratory (NREL) conducts trainings and information sessions, answers questions, and connects communities with regional and federal resources. Colorado and Alaska, for example, were well-positioned to take advantage of federal funding when BIL/IRA were released as a result of federal training opportunities from the NREL, DOE, and other institutions, as well as local and regional coordinated approaches to preparing. Their absorptive capacity has helped them successfully access opportunities – but only because communities, cities, and Tribal governments in those regions have spent the last decade preparing for clean energy opportunities.
While this type of long-term technical assistance and training is necessary, there are resources available right now that are at risk of not being used if states, cities, and communities can’t develop capacity quickly. As DOE continues to flex its deployment and demonstration muscles, the agency needs to invest in community engagement and regional capacity to ensure long-term success across the country.
A key way that DOE can help meet the needs of states and cities that are implementing funding is by standing up community navigator programs. These programs take many forms, but broadly, they leverage the expertise of individuals or organizations within a state or community that can act as guides to the barriers and opportunities within that place.
Community navigators themselves have several benefits. They can act as a catalytic resource by delivering quality technical assistance where federal agencies may not have capacity. In DOE’s case, this could help communities understand funding opportunities and requirements, identify appropriate funding opportunities, explore new clean energy technologies that might meet the needs of the community, and actually complete applications for funding quickly and accurately. They understand regional assets and available capital and have strong existing relationships. Further, community navigators can help build networks – connecting community-based organizations, start-ups, and subnational government agencies based on focus areas.
The DOE and other agencies with BIL/IRA mandates should design programs to leverage these navigators in order to better support state and local organizations with implementation. Programs that leverage community navigators will increase the efficiency of federal technical assistance resources, stretching them further, and will help build capacity within subnational organizations to sustain climate and clean energy initiatives longer term.
These programs can target a range of issues. In the past, they have been used to support access to individual benefits, but expanding their scope could lead to broader results for communities. Training community organizations, and by extension individuals, on how to engage with federal funding and assess capital, development, and infrastructure improvement opportunities in their own regions can help federal agencies take a more holistic approach to implementation and supporting communities. Applying for funding takes work, and navigators can help – but they can also support the rollout of proposed programs once funding is awarded and ensure the projects are seen through their life cycles. For example, understanding broader federal guidance on funding opportunities like the Office of Management and Budget’s proposed revisions to the Uniform Grants Guidance can give navigators and communities additional tools for monitoring and evaluation and administrative capacity.
Benefits of these programs aren’t limited to funding opportunities and program implementation – they can help smooth permitting processes as well. Navigators can act as ready-made champions for and experts on clean energy technologies and potential community concerns. In some communities, distrust of clean energy sources, companies, and government officials can slow permitting, especially for emerging technologies that are subject to misinformation or lack of wider recognition. Supporting community champions that understand the technologies, can advocate on their behalf, and can facilitate relationship building between developers and community members can reduce opposition to clean energy projects.
Further, community navigator programs could help alleviate cost-recovery concerns from permitting teams. Permitting staff within agencies understand that communities need support, especially in the pre-application period, but in the interest of being good stewards of taxpayer dollars they are often reluctant to invest in applications that may not turn into projects.
Overall, these programs have major potential for expanding the technical assistance resources of federal agencies and the capacity of state and local governments and community-based organizations. Federal agencies with a BIL/IRA mandate should design and stand up these programs alongside the rollout of funding opportunities.
Plan of Action
With the Biden Administration’s focus on community engagement and climate and energy justice, agencies have a window of opportunity in which to expand these programs. In order to effectively expand community navigator programs, offices should:
Build community navigator programs into existing technical assistance budgets.
Offices at agencies and subcomponents with BIL/IRA funding like the Department of Energy, the Bureau of Ocean Energy Management, the Bureau of Land Management (BLM), and the Environmental Protection Agency (EPA) have expanded their technical assistance programs alongside introducing new initiatives from that same funding. Community navigator programs are primarily models for providing technical assistance – and can use programmatic funding. Offices should assess funding capabilities and explore flexible funding mechanisms like the ones below.
Some existing programs draw on large block grant funding, like DOE’s Community Energy Programs attached to the Energy Efficiency and Conservation Block Grant Program. This is a useful practice, as the funding source has broad goals and is relatively large and regionally nonspecific.
Collect feedback from regional partners on specific challenges and capacity needs to appropriately tailor community navigator programs.
Before setting up a program, offices should convene local and regional partners to assess major challenges in communities and better design a program. Feedback collection can take the form of journey mapping, listening sessions, convenings, or other structures. These meetings should rely on partners’ expertise and understanding of the opportunities specific to their communities.
For example, if there’s sufficient capacity for grant-writing but a lack of expertise in specific clean energy technologies that a region is interested in, that would inform the goals, curricula, and partners of a particular program. It also would help determine where the program should sit: if it’s targeted at developing clean energy expertise in order to overcome permitting hurdles, it might fit better at the BLM or be a good candidate for a partnership between a DOE office and BLM.
Partner with other federal agencies to develop more holistic programs.
The goals of these programs often speak to the mission of several agencies – for example, the goal of just and equitable technical assistance has already led to the Environmental Justice Thriving Communities Technical Assistance Centers program, a collaboration between EPA and DOE. By combining resources, agencies and offices can even further expand the capacity of a region and increase accessibility to more federal funding opportunities.
A good example of offices collaborating on these programs is below, with the Arctic Energy Ambassadors, funded by the Office of State and Community Energy Programs (SCEP) and the Arctic Energy Office.
Roadmap for Success
There are several initial considerations for building out a program, including solidifying the program’s goals, ensuring available funding sources and mechanisms, and identifying regional and local partners to ensure it is sustainable and effective. Community navigator programs should:
Identify a need and outline clear goals for the program.
Offices should clearly understand the goals of a program. This should go without saying, but given the inconsistency in needs, capacity, and readiness across different communities, it’s key to develop a program that has defined what success looks like for the participants and region. For example, community navigator programs could specifically work to help a region navigate permitting processes; develop several projects of a singular clean energy technology; or understand how to apply for federal grants effectively. Just one of those goals could underpin an entire program.
Ideally, community navigator programs would offer a more holistic approach – working with regional organizations or training participants who understand the challenges and opportunities within their region to identify and assess federal funding opportunities and work together to develop projects from start to finish. But agencies just setting up programs should start with a more directed approach and seek to understand what would be most helpful for an area.
Source and secure available funding, including considerations for flexible mechanisms.
There are a number of available models using different funding and structural mechanisms. Part of the benefit of these programs is that they don’t rely solely on hiring new technical assistance staff, and offices can use programmatic funds more flexibly to work with partners. Rather than hiring staff to work directly for an agency, offices can work with local and regional organizations to administer programs, train other individuals and organizations, and augment local and community capacity.
Further, offices should aim to work across the agency and identify opportunities to pool resources. The IRA provided a significant amount of funding for technical assistance across the agency – for example, the State Energy Program funding at SCEP, the Energy Improvements in Rural and Remote Areas funding at the Office of Clean Energy Demonstrations (OCED), and the Environmental Justice Thriving Communities Technical Assistance Centers program from the EPA/DOE partnership could all be used to fund these programs or award funding to organizations that could administer programs.
Community navigator programs could also be good candidates for entities like FESI, the DOE’s newly authorized Foundation for Energy Security and Innovation. Although FESI must be set up by DOE, once formally established it becomes a 501(c)(3) organization and can combine congressionally appropriated funding with philanthropic or private investments, making it a more flexible tool for collaborative projects. FESI is a good tool for the partnerships described above – it could hold funding from various sources and support partners overseeing programs while convening with their federal counterparts.
Finally, DOE is also exploring the expanded use of Partnership Intermediary Agreements (PIAs), public-private partnership tools that are explicitly targeted at nontraditional partners. As the DOE continues to announce and distribute BIL/IRA funds, these agreements could be used to administer community navigator programs.
Build relationships and partner with appropriate local and regional stakeholders.
Funding shouldn’t be the only consideration. Agency offices need to ensure they identify appropriate local and regional partners, both for administration and funding. Partners should be their own form of community navigators – they should understand the region’s clean energy ecosystem and the unique needs of the communities within. In different places, the reach and existence of these partners may vary – not every locality will have a dedicated nonprofit or institution supporting clean energy development, environmental justice, or workforce, for example. In those cases, there could be regional or county-level partners that have broader scope and more capacity and would be more effective federal partners. Partner organizations should not only understand community needs but have a baseline level of experience in working with the federal government in order to effectively function as the link between the two entities. Finding the right balance of community understanding and experience with federal funding is key.
This is not foolproof. NREL’s ‘Community to Clean Energy (C2C) Peer Learning Cohorts’ can help local champions share challenges and best practices across states and communities and are useful tools for enhancing local capacity. But this program faces similar challenges as other technical assistance programs: participants engage with federal institutions that provide training and technical expertise that may not directly speak to local experience. It may be more effective to train a local or regional organization with a deeper understanding of the specific challenges and opportunities of a place and greater immediate buy-in from the community. It’s challenging for NREL as well to identify the best candidates in communities across the country without that in-depth knowledge of a region.
Additional federal technical assistance support is sorely needed if BIL/IRA funds are to be distributed equitably and quickly. Federal agencies are moving faster than ever before but don’t have the capacity to assess state and local needs. Developing models for state and local partners can help agencies get funding out the door and where it needs to go to support communities moving towards a clean energy transition.
Case Study: DOE’s Arctic Energy Ambassadors
DOE’s Arctic Energy Office (AEO) has been training state level champions for years but recently introduced the Arctic Energy Ambassadors program, using community navigators to expand clean energy project development.
The program, announced in late October 2023, will support regional champions of clean energy with training and resources to help expand their impact in their communities and across Alaska. The ambassadors’ ultimate goal is clean energy project development: helping local practitioners access federal resources, identify appropriate funding opportunities, and address their communities’ specific clean energy challenges.
The Arctic Energy Office is leading the program with help from several federal and subnational organizations. DOE’s Office of State and Community Energy Programs and Office of Energy Efficiency and Renewable Energy are also providing funding.
On the ground, the Denali Commission will oversee distribution of funding, and the Alaska Municipal League will administer the program. The combination of comparative advantages is what will hopefully make this program successful. The Denali Commission, in addition to receiving congressionally appropriated funding, can receive funds from other nonfederal sources in service of its mission. This could help the Commission sustain the ambassadors over the longer term and use funds more flexibly. The Commission also has closer relationships with state-level and Tribal governments and can provide insight into regional clean energy needs.
The Alaska Municipal League (AML) brings additional value as a partner; its role in supporting local governments across Alaska gives it a strong sense of community strengths and needs. AML will recruit, assess, and identify the 12 ambassadors and coordinate program logistics and travel for programming. Identifying the right candidates for the program requires in-depth knowledge of Alaskan communities, including more rural and remote ones.
For its own part, the AEO will provide the content and technical expertise for the program. DOE continues to host an incredible wealth of subject matter knowledge on cutting-edge clean energy technologies, and its leadership in this area combined with the local understanding and administration by AML and Denali Commission will help the Arctic Energy Ambassadors succeed in the years to come.
In all, strong local and regional partners, diverse funding sources and flexible mechanisms for delivering it, and clear goals for community navigator programs are key for successful administration. The Arctic Energy Ambassadors represents one model that other agencies can look to for success.
Case study: SCEP’s Community Energy Fellows Program
DOE’s State and Community Energy Programs office has been working tirelessly to implement BIL and IRA, and last year as part of those efforts it introduced the Community Energy Fellows Program (CEFP).
This program aims to support local and Tribal governments with their projects funded by the Energy Efficiency and Conservation Block Grants. CEFP matches midcareer energy professionals with host organizations to provide support and technical assistance on projects as well as learn more about how clean energy development happens.
Because the program has a much broader scope than the Arctic Energy Ambassadors, it solicits and assesses host institutions as well as Fellows. This allows SCEP to more effectively match the two based on issue areas, expertise, and specific skillsets. This structure allows for multiple community navigators – the host institution understands the needs of its community, and the Fellow brings expertise in federal programs and clean energy development. Both parties gain from the fellowship.
In addition, SCEP has added another resource: Clean Energy Coaches, who provide another layer of expertise to the host institution and the Fellow. These coaches will help develop the Fellows’ skills as they work to support the host institution and community.
Some of the awards are already being rolled out, with a second call for host institutions and Fellows out now. Communities in southern Maine participating in the program are optimistic about the support that navigators will provide – and their project leads have a keen sense of the challenges in their communities.
As the program continues to grow, it can provide a great opportunity for other agencies and offices to learn from its success.
Laying the Foundation for the Low-Carbon Cement and Concrete Industry
This report is part of a series on underinvested clean energy technologies, the challenges they face, and how the Department of Energy can use its Other Transaction Authority to implement programs custom tailored to those challenges.
Cement and concrete production is one of the hardest industries to decarbonize. Solutions for low-emissions cement and concrete are much less mature than those for other green technologies like solar and wind energy and electric vehicles. Nevertheless, over the past few years, young companies have achieved significant milestones in piloting their technologies and certifying their performance and emissions reductions. In order to finance new manufacturing facilities and scale promising solutions, companies will need to demonstrate consistent demand for their products at a financially sustainable price. Demand support from the Department of Energy (DOE) can help companies meet this requirement and unlock private financing for commercial-scale projects. Using its Other Transaction Authority, DOE could design a demand-support program involving double-sided auctions, contracts for difference, or price and volume guarantees. To fund such a program using existing funds, the DOE could incorporate it into the Industrial Demonstrations Program. However, additional funding from Congress would allow the DOE to implement a more robust program. Through such an initiative, the government would accelerate the adoption of low-emissions cement and concrete, providing emissions reductions benefits across the country while setting the United States up for success in the future clean industrial economy.
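Among the mechanisms named above, a contract for difference is perhaps the simplest to illustrate. The sketch below shows the settlement logic of a hypothetical two-way CfD for low-carbon cement; the strike price, volumes, and market prices are illustrative assumptions, not DOE program parameters.

```python
# Sketch of two-way contract-for-difference (CfD) settlement for
# low-carbon cement. Strike price, volumes, and market prices are
# hypothetical; they are not DOE program parameters.

STRIKE_PRICE = 180.0  # $/ton, illustrative guaranteed price

def cfd_settlement(market_price, tons_sold):
    """Payment to the producer; negative means the producer pays back.

    The producer always nets the strike price per ton: the program
    tops up low market prices and claws back high ones, creating the
    predictable revenue stream that project lenders require.
    """
    return (STRIKE_PRICE - market_price) * tons_sold

for market_price in (150.0, 180.0, 210.0):
    payment = cfd_settlement(market_price, tons_sold=10_000)
    print(f"market ${market_price:>5.0f}/ton -> settlement ${payment:>10,.0f}")
```

Because the producer’s net price is fixed at the strike regardless of where the market clears, a CfD converts volatile commodity revenue into the bankable cash flow that offtake agreements are meant to provide.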
Introduction
Besides water, concrete is the most consumed material in the world. It is the material of choice for construction thanks to its durability, versatility, and affordability. As of 2022, the cement and concrete sector accounted for nine percent of global carbon emissions. The vast majority of the embodied emissions of concrete come from the production of Portland cement. Cement production emits carbon through the burning of fossil fuels to heat kilns (40% of emissions) and the chemical process of turning limestone and clay into cement using that heat (60% of emissions). Electrifying production facilities and making them more energy efficient can help decarbonize the former but not the latter, which requires deeper innovation.
Current solutions on the market substitute a portion of the cement used in concrete mixtures with Supplementary Cementitious Materials (SCMs) like fly ash, slag, or unprocessed limestone, reducing the embodied emissions of the resulting concrete. But these SCMs cannot replace all of the cement in concrete, and currently there is an insufficient supply of readily usable fly ash and slag for wider adoption across the industry.
The next generation of ultra-low-carbon, carbon-neutral, and even carbon-negative solutions seeks to develop alternative feedstocks and processes for producing cement or cementitious materials that can replace cement entirely, as well as ways to capture carbon in aggregates and wet concrete. The DOE reports that testing and scaling these new technologies is crucial to fully eliminating emissions from concrete by 2050. Bringing these new technologies to market will not only help the United States meet its climate goals but also promote U.S. leadership in manufacturing.
A number of companies have established pilot facilities or are in the process of constructing them. These companies have successfully produced near-carbon-neutral and even carbon-negative concrete. Building on these milestones, companies will need to secure financing to build full-scale commercial facilities and increase their manufacturing capacity.
Challenges Facing Low-Carbon Cement and Concrete
A key requirement for accessing both private-sector and government financing for new facilities is that companies obtain long-term offtake agreements, which assure financiers that there will be a steady source of revenue once the facility is built. But the boom-and-bust nature of the construction industry discourages construction companies and intermediaries from entering into long-term financial commitments, since there may be no project on hand to use the materials for. Cement, aggregates, and other concrete inputs also take up significant volume, so it would be difficult and costly for potential offtakers to store excess amounts during construction lulls. For these reasons, construction contractors procure concrete on an as-needed, project-specific basis.
Adding to the complexity, structural features of the cement and concrete market increase the difficulty of securing long-term offtake agreements:
- Long, fragmented supply chain: While the supply chain is highly concentrated at either end, there are multiple intermediaries between the actual producers of cement, aggregates, and other inputs and the final construction customers. These include the thousands of ready-mix concrete producers, along with materials dealers, construction contractors, and subcontractors. As a result, construction customers usually aren’t buying materials themselves, and their contractors or subcontractors often aren’t buying materials directly from cement producers.
- Regional fragmentation: Cement, aggregates, and other concrete inputs are heavy products that entail high freight costs and embodied transportation emissions, so producers have a limited range in which they are willing to ship their product. After these products are shipped to a ready-mix concrete facility, the fresh concrete must then be delivered to the construction site within 60 to 90 minutes or it will harden. As a result, this localization of supply chains limits the potential customers for a new manufacturing plant.
- Low margins: The cement and concrete markets operate with very low margins, so buyers are highly sensitive to price. Consequently, low-carbon cement and concrete may struggle to compete against conventional options due to their green premiums.
Luckily, private construction is not the only customer for concrete. The U.S. government (federal, state, and local combined) accounts for roughly 50% of all concrete procurement in the country. Used correctly, the government’s purchasing power can be a powerful lever for spurring the adoption of decarbonized cement and concrete. However, the government faces barriers to entering into long-term offtake agreements similar to the private sector’s. Government procurement of concrete goes through multiple intermediaries and operates on an as-needed, project-specific basis: agencies like the General Services Administration (GSA) enter into agreements with construction contractors for specific projects, and the contractors or their subcontractors make the ultimate purchasing decisions for concrete.
Federal Support
The Federal Buy Clean Initiative, launched in 2021 by the Biden Administration, is starting to address the procurement challenge for low-carbon cement and concrete. Among the initiative’s programs is the allocation of $4.5 billion from the Inflation Reduction Act (IRA) for the GSA and the Department of Transportation (DOT) to use lower-carbon construction materials. Under the initiative, the GSA is piloting direct procurement of low-embodied-carbon materials for federal construction projects. To qualify as low-embodied-carbon concrete under the GSA’s interim requirements, concrete mixtures only have to achieve a roughly 25–50% reduction in carbon content,1 depending on the compressive strength. The requirement may be even lower if no concrete meeting this standard is available near the project site. Since the bar is set only slightly below that of traditional concrete, young companies developing the solutions to fully decarbonize concrete will have trouble competing on price for procurement contracts against companies producing more well-established but higher-emission solutions like fly ash, slag, and limestone concrete mixtures. Moreover, the just-in-time, project-specific nature of these procurement contracts means they still don’t address young companies’ need for long-term price and customer security in order to scale up.
The ideal solution for this is a demand-support program. The DOE Office of Clean Energy Demonstrations (OCED) is developing a demand-support program for the Hydrogen Hubs initiative, setting aside $1 billion for demand-support to accompany the $7 billion in direct funding to regional Hydrogen Hubs. In its request for proposals, OCED says that the hydrogen demand-support program will address the “fundamental mismatch in [the market] between producers, who need long-term certainty of high-volume demand in order to secure financing to build a project, and buyers, who often prefer to buy on a short-term basis at more modest volumes, especially for products that have yet to be produced at scale and [are] expected to see cost decreases.”
A demand-support program could do the same for low-carbon cement and concrete, addressing the market challenges that grants alone cannot. OCED is reviewing applications for the $6.3 billion Industrial Demonstrations Program. As with the Hydrogen Hubs, OCED could consider setting aside $500 million to $1 billion of the program funds to implement demand-support programs for the two highest-emitting heavy industries, cement/concrete and steel, at $250 million to $500 million each.
Additional funding from Congress would allow DOE to implement a more robust demand-support program. Federal investment in industrial decarbonization grew from $1.5 billion in FY21 to over $10 billion in FY23, thanks largely to new funding from BIL and IRA. However, the sector remains underfunded relative to its emissions, contributing 23% of the country’s emissions while receiving less than 12% of federal climate innovation funding. A promising piece of recently introduced legislation is the Concrete and Asphalt Innovation Act of 2023, which would, among other things, direct the DOE to establish a program of research, development, demonstration, and commercial application for low-emissions cement, concrete, asphalt binder, and asphalt mixture. This would include a demonstration initiative authorized at $200 million and a five-year strategic plan identifying new programs and resources needed to carry out the mission. If the legislation passes, the DOE could propose a demand-support program in its strategic plan and request funding from Congress to set it up. The faster route, though, would be for Congress to add a section to the Act directly establishing a demand-support program within DOE and authorizing funding for it.
Other Transaction Authority
BIL and IRA gave DOE an expanded mandate to support innovative technologies from early-stage research through commercialization. In order to do so, DOE must be just as innovative in its use of its available authorities and resources. Tackling the challenge of bringing technologies from pilot to commercialization requires DOE to look beyond traditional grant, loan, and procurement mechanisms. Previously, we have identified the DOE’s Other Transaction Authority (OTA) as an underleveraged tool for accelerating clean energy technologies.
OTA is defined in legislation as the authority to enter into transactions that are not government grants or contracts in order to advance an agency’s mission. This negative definition provides DOE with significant freedom to design and implement flexible financial agreements that can be tailored to address the unique challenges that different technologies face. DOE plans to use OTA to implement the hydrogen demand-support program, and it could also be used for a demand-support program for low-carbon cement and concrete. The DOE’s new Guide to Other Transactions provides official guidance on how DOE personnel can use the flexibilities provided by OTA.
Defining Products for Demand Support
Before setting up a demand-support program, DOE first needs to define what a low-carbon cement or concrete product is and the value it provides in emissions avoided. This is not straightforward due to (1) the heterogeneity of solutions, which prevents apples-to-apples price comparisons, and (2) variations in the amount of avoided emissions that different solutions can provide. To address the first issue, for products that are not ready-mix concrete, the DOE should calculate the cost of a unit of concrete made using the product, based on a standardized mix ratio of a specific compressive strength and market prices for the other components of the concrete mix. To address the second issue, the DOE should then divide the calculated price per unit of concrete (e.g., $/m3) by the amount of CO2 emissions avoided per unit of concrete compared to the NRMCA’s industry average (e.g., kg/m3) to determine the effective price per unit of CO2 emissions avoided. The DOE can then fairly compare bids from different projects using this metric. Such an approach would result in the government providing demand support for the products that are most cost-effective at reducing carbon emissions, rather than simply the cheapest.
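To make the arithmetic concrete, the sketch below implements this metric in Python. The baseline, prices, and emissions figures are invented for illustration; actual values would come from the NRMCA industry-average benchmark and bidders’ life-cycle assessments.

```python
# Sketch: ranking bids by effective price per kg of CO2 avoided.
# All figures are illustrative placeholders, not DOE or NRMCA data.

BASELINE_EMBODIED_CO2 = 410.0  # kg CO2e per m3; stand-in for the industry-average benchmark

def effective_price_per_kg_co2(concrete_cost_per_m3: float, embodied_co2_per_m3: float) -> float:
    """Dollars of concrete cost per kg of CO2 avoided versus the industry average."""
    avoided = BASELINE_EMBODIED_CO2 - embodied_co2_per_m3
    if avoided <= 0:
        raise ValueError("Product must avoid emissions relative to the baseline.")
    return concrete_cost_per_m3 / avoided

# Two hypothetical bids: a cheap mix with modest savings vs. a pricier near-zero mix.
bid_a = effective_price_per_kg_co2(concrete_cost_per_m3=140.0, embodied_co2_per_m3=300.0)
bid_b = effective_price_per_kg_co2(concrete_cost_per_m3=185.0, embodied_co2_per_m3=60.0)
print(f"Bid A: ${bid_a:.2f}/kg CO2 avoided")  # ~$1.27
print(f"Bid B: ${bid_b:.2f}/kg CO2 avoided")  # ~$0.53
```

The point of the metric shows up in the example: the more expensive near-zero mix is the better buy per kilogram of CO2 avoided, which is exactly the behavior the metric is meant to reward.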
Furthermore, the DOE should set an upper limit on the embodied carbon that the concrete product, or concrete made with the product, must stay below in order to qualify as “low carbon.” We suggest that the DOE use the limits established by the First Movers Coalition, an international corporate advanced market commitment for concrete and other hard-to-abate industries organized by the World Economic Forum. The limits were developed through conversations with incumbent suppliers, start-ups, nonprofits, and intergovernmental organizations about what would be achievable by 2030, and they were designed to help move the needle towards commercializing solutions that enable full decarbonization.
Companies that participate in a DOE demand-support program should be required, after one or two years of operation, to confirm that their product meets these limits through an Environmental Product Declaration.2 Using carbon offsets to reach the limit should not be allowed, since the goal is to spur the innovation and scaling of technologies that can eventually fully decarbonize the cement and concrete industry.
Below are some ideas for how DOE can set up a demand-support program for low-carbon cement and concrete.
Program Proposals
Double-Sided Auction
Double-sided auctions are designed to support the development of production capacity for green technologies and products and the creation of a market by providing long-term price certainty to suppliers and facilitating the sale of their products to buyers. As the name suggests, a double-sided auction consists of two phases: First, the government or an intermediary organization holds a reverse auction for long-term purchase agreements (e.g., 10 years) for the product from suppliers, who are incentivized to bid the lowest possible price in order to win. Next, the government conducts annual auctions of short-term sales agreements to buyers of the product. Once sales agreements are finalized, the product is delivered directly from the supplier to the buyer, with the government acting as a transparent intermediary. The government thus serves as a market maker by coordinating the purchase and sale of the product from producers to buyers. Government funding covers the difference between the original purchase price and the final sale price, reducing the impact of the green premium for buyers and sellers.
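As a rough sketch of the cash flows involved, assuming hypothetical contract terms, the government’s annual outlay is simply the gap between the long-term purchase price it committed to and the price at which that year’s sales auction clears:

```python
# Sketch of double-sided auction settlement (illustrative terms, not the H2Global rules).

from dataclasses import dataclass

@dataclass
class SupplyContract:
    supplier: str
    purchase_price: float  # $/tonne won in the long-term reverse auction
    annual_volume: float   # tonnes per year

def settle_year(contract: SupplyContract, clearing_sale_price: float) -> float:
    """Government outlay for one year: the gap between what it pays the
    supplier and what buyers pay in the short-term sales auction."""
    subsidy_per_tonne = max(contract.purchase_price - clearing_sale_price, 0.0)
    return subsidy_per_tonne * contract.annual_volume

# Hypothetical: a 10-year purchase at $160/t; this year's sales auction clears at $120/t.
contract = SupplyContract("LowCarbonCementCo", purchase_price=160.0, annual_volume=50_000)
print(f"Public funding needed this year: ${settle_year(contract, 120.0):,.0f}")  # $2,000,000
```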
While the federal government has not yet implemented a double-sided auction program, OCED is considering setting up the hydrogen demand-support measure as a “market maker” that provides a “ready purchaser/seller for clean hydrogen.” Such a market maker program could be implemented most efficiently through double-sided auctions.
Germany was the first to conceive of and develop the double-sided auction scheme. The H2Global initiative was established in 2021 to support the development of production capacity for green hydrogen and its derivative products. The program is implemented by Hintco, an intermediary company, which is currently evaluating bids for its first auction for the purchase of green ammonia, methanol, and e-fuels, with final contracts expected to be announced as soon as this month. Products will start to be delivered by the end of 2024.
[Figure: H2Global’s double-sided auction model, in which the intermediary buys green products under long-term contracts and resells them through short-term auctions. Source: H2Global]
A double-sided auction scheme for low-carbon cement and concrete would address producers’ need for long-term offtake agreements while matching buyers’ short-term procurement needs. The auctions would also help develop transparent market prices for low-carbon cement and concrete products.
All bids for purchase agreements should include detailed technical specifications and/or certifications for the product, the desired price per unit, and a robust, third-party life-cycle assessment of the embodied carbon per unit of concrete made with the product at different compressive strengths. Additionally, bids for ready-mix concrete should include the location(s) of the production facility or facilities, and bids for cement and other concrete inputs should include information on the locations of ready-mix concrete facilities capable of producing concrete using those products. The DOE should then select bids through a pure reverse auction using the calculated effective price per unit of CO2 emissions avoided. To account for regional fragmentation, the DOE could conduct separate auctions for each region of the country.
A double-sided auction presents similar benefits to the low-carbon cement and concrete industry as an advance market commitment would. However, the addition of an efficient, built-in system for the government to then sell that cement or concrete allotment to a buyer means that the government is not obligated to use the cement or concrete itself. This is important because the logistics of matching cement or concrete production to a suitable government construction project can be difficult due to regional fragmentation, and the DOE is not a major procurer of cement and concrete.3 Instead, under this scheme, federal, state, or local agencies working on a construction project or their contractors could check the double-sided auction program each year to see if there is a product offering in their region that matches their project needs and sustainability goals for that year, and if so, submit a bid to procure it. In fact, this should be encouraged as a part of the Federal Buy Clean Initiative, since the government is such an important consumer of cement and concrete products.
Contracts for Difference
Contracts for difference (CfD, sometimes called two-way CfD) aim to provide price certainty for green technology projects and close the gap between the price that producers need and the price that buyers are willing to offer. CfD have been used by the United Kingdom and France primarily to support the development of large-scale renewable energy projects, but they can also be used to support the development of production capacity for other green technologies. OCED is considering CfD (also known as pay-for-difference contracts) for its hydrogen demand-support program.
CfD are long-term contracts signed between the government or a government-sponsored entity and companies looking to expand production capacity for a green product.4 The contract guarantees that once the production facility comes online, the government will ensure a steady price by paying suppliers the difference between the market price for which they are able to sell their product and a predetermined “strike price.” On the other hand, if the market price rises above the strike price, the supplier will pay the difference back to the government. This prevents the public from funding any potential windfall profits.
[Figure: two-way contract for difference payments, with the government topping up suppliers when the market price is below the strike price and recouping the difference when it is above. Source: Canadian Climate Institute]
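The settlement logic of a two-way CfD is compact enough to state directly. The sketch below uses hypothetical prices and volumes:

```python
# Sketch of two-way CfD settlement per reporting period (illustrative figures).

def cfd_payment(strike_price: float, market_price: float, volume: float) -> float:
    """Positive: government pays the supplier. Negative: supplier pays the government back."""
    return (strike_price - market_price) * volume

# Hypothetical strike price of $150/t for a low-carbon cement product.
print(cfd_payment(150.0, 130.0, 10_000))  # +200,000: government tops up a weak market
print(cfd_payment(150.0, 165.0, 10_000))  # -150,000: windfall is returned to the public
```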
A CfD program could provide a source of demand certainty for low-carbon cement and concrete companies looking to finance the construction of pilot- and commercial-scale manufacturing plants or the retrofitting of existing plants. The selection of recipients and strike prices should be determined through annual reverse auctions. In a typical reverse auction for CfD, the government sets a cap on the maximum number of units of product and the maximum strike price it is willing to accept. Each project candidate then places a sealed bid with a unit price and the amount of product it plans to produce. The bids are ranked by unit price, and projects are accepted from low to high unit price until either the maximum total capacity or the maximum strike price is reached. The last project accepted sets the strike price for all accepted projects. The strike price is adjusted annually for inflation but is otherwise fixed over the course of the contract. Compared to traditional subsidy programs, a CfD program can be much more cost-efficient thanks to the reverse auction process: the UK’s CfD program has seen the strike price fall with each successive round of auctions.
Applying this to the low-carbon cement and concrete industry requires some adjustments, since there are a variety of products for decarbonizing cement and concrete. As discussed prior, the DOE should compare project bids according to the effective price per unit CO2 abated when the product is used to make concrete. The DOE should also set a cap on the maximum volume of CO2 it wishes to abate and the maximum effective price per unit of CO2 abated that it is willing to pay. Bids can then be accepted from low to high price until one of those caps is hit. Instead of establishing a single strike price, the DOE should use the accepted project’s bid price as the strike price to account for the variation in types of products.
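A minimal sketch of this adjusted, pay-as-bid clearing logic follows; the project names, bids, and caps are invented for illustration:

```python
# Sketch of the adjusted pay-as-bid reverse auction for CfD (illustrative bids).

from dataclasses import dataclass

@dataclass
class Bid:
    project: str
    price_per_tonne_co2: float  # effective $/t CO2 abated, per the metric above
    annual_co2_abated: float    # tonnes CO2/year the project expects to abate

def clear_auction(bids: list[Bid], max_price: float, max_co2: float) -> list[Bid]:
    """Accept bids cheapest-first until the CO2 cap or the price cap is hit.
    Each winner's own bid becomes its strike price (pay-as-bid)."""
    accepted, total = [], 0.0
    for bid in sorted(bids, key=lambda b: b.price_per_tonne_co2):
        if bid.price_per_tonne_co2 > max_price or total + bid.annual_co2_abated > max_co2:
            break
        accepted.append(bid)
        total += bid.annual_co2_abated
    return accepted

bids = [
    Bid("SCM-Alternative", 45.0, 80_000),
    Bid("Carbon-Cured", 70.0, 60_000),
    Bid("Electrochemical", 120.0, 100_000),
]
winners = clear_auction(bids, max_price=100.0, max_co2=200_000)
print([w.project for w in winners])  # ['SCM-Alternative', 'Carbon-Cured']
```

Keeping each winner’s own bid as its strike price accommodates the wide cost differences between product types, at the expense of the single clearing price a standard auction would produce.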
Backstop Price Guarantee
A CfD program could be designed as a backstop price guarantee if one removes the requirement that suppliers pay the government back when market prices rise above the strike price. In this case, the DOE would set a lower maximum strike price for CO2 abatement, knowing that suppliers will be willing to bid lower strike prices, since there is now the opportunity for unrestricted profits above the strike price. The DOE would then only pay in the worst-case scenario when the market price falls below the strike price, which would operate as an effective price floor.
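Relative to the two-way CfD above, the only change is clamping the payment at zero so money never flows back to the government, as in this illustrative sketch:

```python
# Backstop variant: same CfD math, but payments flow one way only (illustrative figures).

def backstop_payment(strike_price: float, market_price: float, volume: float) -> float:
    """Government pays only when the market falls below the floor;
    any upside above the strike price stays with the supplier."""
    return max(strike_price - market_price, 0.0) * volume

print(backstop_payment(130.0, 150.0, 10_000))  # 0: strong market, no public outlay
print(backstop_payment(130.0, 110.0, 10_000))  # 200,000: the price floor kicks in
```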
Backstop Volume Guarantee
Alternatively, the DOE could address demand uncertainty by providing a volume guarantee. In this case, the DOE could conduct a reverse auction for volume guarantee agreements with manufacturers: the DOE would commit to purchasing, at a set price, any units that the company is unable to sell each year, up to the guaranteed volume, and the company would commit to a ceiling on the price it charges buyers.5 Using OTA, the DOE could implement such a program in collaboration with DOT or GSA, wherein DOE would purchase the materials and DOT or GSA would use them for their construction needs.
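Under assumed contract terms, the annual settlement could be computed as in the sketch below (all names and figures are hypothetical):

```python
# Sketch of a backstop volume guarantee settlement (illustrative terms).

def volume_guarantee_outlay(
    guaranteed_volume: float,  # tonnes/year DOE commits to backstop
    units_sold: float,         # tonnes the company sold on the open market
    backstop_price: float,     # $/tonne DOE pays for unsold units
) -> float:
    """DOE buys whatever the company could not sell, up to the guarantee."""
    unsold = max(guaranteed_volume - units_sold, 0.0)
    return unsold * backstop_price

# Hypothetical: 40,000 t guaranteed, 32,000 t sold commercially at or below the
# agreed price ceiling; DOE purchases the 8,000 t shortfall at $140/t.
print(f"${volume_guarantee_outlay(40_000, 32_000, 140.0):,.0f}")  # $1,120,000
```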
Other Considerations for Implementation
Rather than directly managing a demand-support program, the DOE should enter into an OT agreement with an external nonprofit entity to administer the contracts.6 The nonprofit entity would then hold auctions and select, manage, and fulfill the contracts. DOE is currently in the process of doing this for the hydrogen demand-support program.
A nonprofit entity could provide two main benefits. First, the logistics of implementing such a program would not be trivial, given the number of different suppliers, intermediaries, and offtakers involved. An external entity could hire staff with the necessary expertise more easily and quickly than the DOE, which must contend with the federal hiring process and a limited budget for program direction. Second, the entity’s independent nature would make it easier to gain lasting bipartisan support for the demand-support program, since the entity would not be directly associated with any one administration.
Coordination with Other DOE Programs
The green premium for near-zero-carbon cement and concrete products is steep, and demand-support programs like the ones proposed in this report should not be considered a cure-all for the industry, since it may be difficult to secure a large enough budget for any one such program to fully address the green premium across the industry. Rather, demand-support programs can complement the multiple existing funding authorities within the DOE by closing the residual gap between emerging technologies and conventional alternatives after other programs have helped to lower the green premium.
The DOE’s Loan Programs Office (LPO) received a significant increase in its lending authority from the IRA and can provide loans or loan guarantees for innovative clean cement facilities, resulting in cheaper capital financing and an effective subsidy. In addition, the IRA and BIL provided substantial new funding for demonstrating industrial decarbonization technologies through OCED.
Policies like these can be chained together. For example, a clean cement start-up could simultaneously apply to OCED for funding to demonstrate its technology at scale and to LPO for a loan or loan guarantee following due diligence on its business plan. Together, these two programs drive down the green premium and derisk the companies that successfully receive their support, leaving a much more modest price premium that a mechanism like a double-sided auction could affordably cover with less risk.
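To illustrate the intuition with entirely hypothetical numbers, the sketch below shows how layered support could whittle a large green premium down to a residual small enough for a demand-support mechanism to cover:

```python
# Illustrative arithmetic for stacking support (every number here is hypothetical).

conventional_price = 120.0        # $/t cement, assumed market benchmark
unsupported_green_price = 180.0   # $/t an unsupported first-of-a-kind clean plant would need

# Assumed effects of each program on the producer's required price:
oced_grant_savings = 25.0         # $/t from demonstration cost-share
lpo_loan_savings = 15.0           # $/t from cheaper debt via an LPO loan guarantee

residual_premium = (unsupported_green_price
                    - oced_grant_savings
                    - lpo_loan_savings
                    - conventional_price)
print(f"Residual green premium for demand support: ${residual_premium:.0f}/t")  # $20/t
```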
Successfully chaining policies like this requires deep coordination across DOE offices. OCED and LPO would need to work in lockstep in conducting technical evaluations and due diligence of projects that apply to both and prioritize funding of projects that meet both offices’ criteria for success. The best projects should be offered both demonstration funding from OCED and conditional commitments from LPO, which would provide companies with the confidence that they will receive follow-on funding if the demonstration is successful and other conditions are met, while posing no added risk to LPO since companies will need to meet their conditions first before receiving funds. The assessments should also consider whether the project would be a strong candidate for receiving demand support through a double-sided auction, CfD program, or price/volume guarantee, which would help further derisk the loan/loan guarantee and justify the demonstration funding.
Candidates for receiving support from all three public funding instruments would of course need to be evaluated especially rigorously, since the fiscal risk and potential political backlash of such a project failing are also much greater. If successful, such coordination would ensure that the combination of these programs substantially moves the needle on bringing emerging technologies in green cement and concrete to commercial scale.
Conclusion
Demand support can help address the key barrier that low-carbon cement and concrete companies face in scaling their technologies and financing commercial-scale manufacturing facilities. Whichever approach the DOE chooses to take, the agency should keep in mind (1) the importance of setting an ambitious standard for what qualifies as low-carbon cement and concrete and comparing proposals using a metric that accounts for the range of different product types and embodied emissions, (2) the complex implementation logistics, and (3) the benefits of coordinating a demand-support program with the agency’s demonstration and loan programs. Implemented successfully, such a program would crowd in private investment, accelerate commercialization, and lay the foundation for the clean industrial economy in the United States.