Meeting Agricultural Sustainability Goals by Increasing Federal Funding for Research on Genetically Engineered Organisms

Summary

Ensuring the sustainability and resiliency of American food systems is an urgent priority, especially in the face of challenges presented by climate change and international geopolitical conflicts. To address these issues, increased federal investment in new, sustainability-oriented agricultural technology is necessary in order to bring greater resource conservation and stress tolerance to American farms and fields. Ongoing advances in bioengineering research and development (R&D) offer a diverse suite of genetically engineered organisms, including crops, animals, and microbes. Given the paramount importance of a secure food supply for national well-being, federal actors should promote the development of genetically engineered organisms for agricultural applications. 

Two crucial opportunities are imminent. First, directives in the Biden Administration’s bioeconomy executive order provide the U.S. Department of Agriculture (USDA) a channel through which to request funding for sustainability-oriented R&D in genetically engineered organisms. Second, renewal of the Farm Bill in 2023 provides a venue for congressional legislators to highlight genetic engineering as a funding focus area of existing research grant programs. Direct beneficiaries of the proposed federal funding will predominantly be nonprofit research organizations such as land grant universities; innovations resulting from the funded research will provide a public good that benefits producers and consumers alike. 

Challenge and Opportunity

The resiliency of American agriculture faces undeniable challenges in the coming decades. The first is resource availability, which includes scarcities of fertile land due to soil degradation and of water due to overuse and drought. Resource availability is also vulnerable to acute shocks, as revealed by the impact of the COVID-19 pandemic and the Russia-Ukraine war on the supply of vital inputs such as fertilizer and gas. The second set of challenges comprises environmental stressors, many of which are exacerbated by climate change. Flooding can wipe out an entire harvest, while the spread of pathogens poses existential risks not only to individual livelihoods but also to the global markets for crops such as citrus, cacao, and banana. Such losses would be devastating for both consumers and producers, especially those in the Global South.

Ongoing advances in bioengineering R&D provide technological solutions in the form of a diverse suite of genetically engineered organisms. These organisms have the potential to address many of the aforementioned challenges by increasing yields, minimizing inputs, and boosting resilience to drought, flooding, and pathogens. Indeed, existing transgenic crops, such as virus-resistant papaya and flood-tolerant rice, demonstrate the ability of genetically engineered organisms to address agricultural challenges. They can also advance other national priorities such as climate change mitigation and nutrition by enhancing carbon sequestration and improving the nutritional profile of food.

Recent breakthroughs in modifying and sequencing DNA have greatly accelerated the development of new, commercializable bioengineered varieties and expanded the spectrum of traits and plants that can be engineered. This progress has been especially expedited by the use of CRISPR gene-editing technology; the European Sustainable Agriculture Through Genome Editing (EU-SAGE) database documents more than 500 instances of gene-edited crops developed in research laboratories to target traits for sustainable, climate-resilient agriculture. There is thus vast potential for genetically engineered organisms to contribute to sustainable agriculture.

More broadly, this moment can be leveraged to bring about a turning point in the public perception of genetically engineered organisms. Past generations of genetically engineered organisms have been met with significant public backlash, despite the pervasiveness of inter-organism gene transfer throughout the history of life on Earth (see FAQ). Reasons for negative public perception are complex but include the association of genetically engineered organisms with industry profit, as well as an embrace of the precautionary principle to a degree that far exceeds its application to other products, such as pharmaceuticals and artificial intelligence. Furthermore, persistent misinformation and antagonistic activism have engendered entrenched consumer distrust. The prior industry focus on herbicide-resistance traits also contributed to the misconception that the technology is only used to increase the use of harmful chemicals in the environment.

Now, however, a new generation of genetically engineered organisms features traits beyond herbicide resistance, such as reduced spoilage, that address sustainability issues. Breakthroughs in DNA sequencing, as well as other analytical tools, have increased our understanding of the properties of newly developed organisms. There is pervasive buy-in for agricultural sustainability goals across many stakeholder sectors, including individual producers, companies, consumers, and legislators on both sides of the aisle. There is thus great potential for genetically engineered organisms to be accepted by the public as a solution to a widely recognized problem. Dedicated federal funding will be vital to seeing that this potential is realized.

Plan of Action

Recommendation 1: Fund genetically engineered organisms pursuant to the Executive Order on the bioeconomy.

Despite the importance of agriculture for the nation’s basic survival and the clear impact of agricultural innovation, USDA’s R&D spending pales in comparison to that of other agencies and other expenditures. In 2022, for example, USDA’s R&D budget was a mere 6% of the National Institutes of Health’s R&D budget, and R&D comprised only 9.6% of USDA’s overall discretionary budget. The Biden Administration’s September 2022 executive order provides an opportunity to address this funding shortfall, especially for genetically engineered organisms.

The Executive Order on Advancing Biotechnology and Biomanufacturing Innovation for a Sustainable, Safe, and Secure American Bioeconomy explicitly embraces an increased role for biotechnology in agriculture. Among the policy objectives outlined is the call to “boost sustainable biomass production and create climate-smart incentives for American agricultural producers and forest landowners.” 

Pursuant to this objective, the EO directs the USDA to submit a plan comprising programs and budget proposals to “support the resilience of the United States biomass supply chain [and] encourage climate-smart production” by September 2023. This plan provides the chance for the USDA to secure funding for agricultural R&D in a number of areas. Here, we recommend (1) USDA collaboration in Department of Energy (DoE) research programs amended under the CHIPS and Science Act and (2) funding for startup seed grants. 

CHIPS and Science Act

The 2022 CHIPS and Science Act aims to accelerate American innovation in a number of technology focus areas, including engineering biology. To support this goal, the Act established a new National Engineering Biology Research and Development Initiative (Section 10402). As part of this initiative, the USDA was tasked with supporting “research and development in engineering biology through the Agricultural Research Service, the National Institute of Food and Agriculture programs and grants, and the Office of the Chief Scientist.” Many of the initiative’s priorities are sustainability-oriented and could benefit from genetic engineering contributions. 

A highlight is the designation of an interagency committee to coordinate activities. To leverage and fulfill this mandate, we recommend that the USDA better coordinate with the DoE on bioengineering research. Specifically, the USDA should be involved in the decision-making process for awarding research grants relating to two DoE programs amended by the Act.

The first program is the Biological and Environmental Research Program, which includes carbon sequestration, gene editing, and bioenergy. (See the Appendix for a table summarizing examples of how genetic engineering can contribute sustainability-oriented technologies to these key focus areas.)

The second program is the Basic Energy Sciences Program, which has authorized funding for a Carbon Sequestration Research and Geologic Computational Science Initiative under the DoE. Carbon sequestration via agriculture is not explicitly mentioned in this section, but this initiative presents another opportunity for the USDA to collaborate with the DoE and secure funding for agricultural climate solutions. Congress should make appropriating funding for this program a priority.

Seed Grants

The USDA should pilot a seed grant program to accelerate technology transfer, a step that often poses a bottleneck. The inherent risk of R&D and entrepreneurship in a cutting-edge field may pose a barrier to entry for academic researchers as well as small agricultural biotech companies. Funding lowers this barrier to entry, increasing the diversity of players in the field. Such funding can take the form of zero-equity seed grants. Modeled on the National Science Foundation (NSF) seed grant program, which awards more than $200 million in R&D funding to about 400 startups, these grants would provide startups with funding without the risks attached to venture capital (such as founders being ousted from company leadership). The NSF’s funding is spread across numerous disciplines, so a separate USDA initiative dedicated to supporting small agricultural biotech companies would be beneficial. These seed grants would meet a need unmet by USDA’s existing small business grant programs, which are awarded only to established companies.

Together, the funding areas outlined above would greatly empower the USDA to execute the EO’s objective of promoting climate-smart American agriculture.

Recommendation 2: Allocate funding through the 2023 Farm Bill.

The Farm Bill, the primary tool by which the federal government sets agricultural policy, will be renewed in 2023. Several existing mandates for USDA research programs, administered through the National Institute of Food and Agriculture as competitive grants, have been allocated federal funding. Congressional legislators should introduce amendments to the mandates for these programs such that the language explicitly highlights R&D of genetically engineered organisms for sustainable agriculture applications. Such programs include the Agriculture and Food Research Initiative, a major competitive grant program, as well as the Specialty Crop Research Initiative and the Agricultural Genome to Phenome Initiative. Suggested legislative text for these amendments is provided in the Appendix. Promoting R&D of genetically engineered organisms via existing programs circumvents the difficulty of securing appropriations for new initiatives while also presenting genetically engineered organisms as a critically important category of agricultural innovation.

Additionally, Congress should appropriate funding for the Agriculture Advanced Research and Development Authority (AgARDA) at its full $50 million authorization. Similar to its counterparts in other agencies, such as ARPA-E and DARPA, AgARDA would enable “moonshot” R&D projects that are high-reward but high-risk or have a long timeline, such as genetically engineered organisms with genetically complex traits. This funding can be especially valuable for promoting the development of sustainability-oriented crop traits: though such traits are a clear public good, they may be less profitable and/or marketable than consumer-targeted traits such as sweetness or color, and as such profit-driven companies may be dissuaded from investing in their development. The USDA recently published its implementation strategy for AgARDA. Congress must now fully fund AgARDA so that it can execute this strategy and fuel much-needed innovation in agricultural biotechnology.

Conclusion

Current federal funding for genetically engineered organism R&D does not reflect the substantial role these organisms can play in ensuring a sustainable, climate-smart future for American agriculture, with applications ranging from increasing resource-use efficiency in bioproduction to enhancing the resilience of food systems against environmental and manmade crises. Recent technological breakthroughs have opened many frontiers in engineering biology, but free-market dynamics alone are not sufficient to guarantee that these breakthroughs are applied in the service of the public good in a timely manner. The USDA and Congress should therefore take advantage of upcoming opportunities to secure funding for genetic engineering research projects.

Appendix

Biological and Environmental Research Program Examples 

Research focus area added in CHIPS and Science Act | Example of genetic engineering contribution
Bioenergy and biofuel | Optimizing biomass composition of bioenergy crops
Non-food bioproducts | Lab-grown cotton; engineering plants and microbes to produce medicines
Carbon sequestration | Improving photosynthetic efficiency; enhancing carbon storage in plant roots
Plant and microbe interactions | Engineering microbes to counter plant pathogens; engineering microbes to make nutrients more accessible to plants
Bioremediation | Engineering plants and microbes to sequester and/or break down contaminants in soil and groundwater
Gene editing | Engineering plants for increased nutrient content, disease resistance, and storage performance
New characterization tools | Creating molecular reporters of plant response to abiotic and biotic environmental dynamics

Farm Bill Amendments 

Agriculture and Food Research Initiative

One of the Agriculture and Food Research Initiative (AFRI)’s focus areas is Sustainable Agricultural Systems, with topics including “advanced technology,” which supports “cutting-edge research to help farmers produce higher quantities of safer and better quality food, fiber, and fuel to meet the needs of a growing population.” Furthermore, AFRI’s Foundational and Applied Science Program supports grants in priority areas including plant health, bioenergy, natural resources, and environment. The 2023 Farm Bill could amend the Competitive, Special, and Facilities Research Grant Act (7 U.S.C. 3157) to highlight the potential of genetic engineering in the pursuit of AFRI’s goals. 

Example text: 

Subsection (b)(2) of the Competitive, Special, and Facilities Research Grant Act (7 U.S.C. 3157(b)(2)) is amended—

(1) in subparagraph (A)—

(A) in clause (ii), by striking the semicolon at the end and inserting “including genetic engineering methods to make modifications (deletions and/or insertions of DNA) to plant genomes for improved food quality, improved yield under diverse growth conditions, and improved conservation of resource inputs such as water, nitrogen, and carbon;”;

(B) in clause (vi), by striking “and”;

(C) in clause (vii), by striking the period at the end and inserting “; and”; and

(D) by adding at the end the following: 

“(viii) plant-microbe interactions, including the identification and/or genetic engineering of microbes beneficial for plant health”

(2) in subparagraph (C), clause (iii), by inserting “production and” at the beginning;

(3) in subparagraph (D)– 

(A) in clause (vii), by striking “and”;

(B) in clause (viii), by striking the period at the end and inserting “; and”; and

(C) by adding at the end the following: 

“(ix) carbon sequestration”.

Agricultural Genome to Phenome Initiative

The goal of this initiative is to understand the function of plant genes, which is critical to crop genetic engineering for sustainability. The ability to efficiently insert and edit genes, as well as to precisely control gene expression (a core tenet of synthetic biology), would facilitate this goal.

Example text:

Section 1671(a) of the Food, Agriculture, Conservation, and Trade Act of 1990 (7 U.S.C. 5924(a)) is amended—

  1. In paragraph (4), by inserting “and environmental” after “achieve advances in crops and animals that generate societal”; and
  2. In paragraph (5), by inserting “genetic engineering, synthetic biology,” after “to combine fields such as genetics, genomics,”

Specialty Crop Research Initiative

Specialty crops can be a particularly fertile ground for research. There is a paucity of genetic engineering tools for specialty crops as compared to major crops (e.g., wheat and corn). At the same time, specialty crops such as fruit trees offer the opportunity to effect larger sustainability impacts: as perennials, they remain in the soil for many years, with particular implications for water conservation and carbon sequestration. Finally, economically important specialty crops such as oranges are under extreme disease threat, as identified by the Emergency Citrus Disease Research and Extension Program. Genetic engineering offers potential solutions that could be accelerated with funding.

Example text:

Section 412(b) of the Agricultural Research, Extension, and Education Reform Act of 1998 (7 U.S.C. 7632(b)) is amended—

  1. In paragraph (1), by inserting “transgenics, gene editing, synthetic biology” after “research in plant breeding, genetics,” and—
    1. In subparagraph (B), by inserting “and enhanced carbon sequestration capacity” after “size-controlling rootstock systems”; and
    2. In subparagraph (C), by striking the semi-colon at the end and inserting “, including water-use efficiency;”
Frequently Asked Questions
What is the definition of a genetically engineered organism? What is the difference between genetically engineered, genetically modified, transgenic, gene-edited, and bioengineered?

Scientists usually use the term “genetic engineering” as a catch-all phrase for the myriad methods of changing an organism’s DNA outside of traditional breeding, but this is not necessarily reflected in usage by regulatory agencies. The USDA’s glossary, which is not regulatorily binding, defines “genetic engineering” as “manipulation of an organism’s genes by introducing, eliminating or rearranging specific genes using the methods of modern molecular biology, particularly those techniques referred to as recombinant DNA techniques.” Meanwhile, the USDA’s Animal and Plant Health Inspection Service (APHIS)’s 2020 SECURE rule defines “genetic engineering” as “techniques that use recombinant, synthesized, or amplified nucleic acids to modify or create a genome.” The USDA’s glossary defines “genetic modification” as “the production of heritable improvements in plants or animals for specific uses, via either genetic engineering or other more traditional methods”; however, the USDA National Organic Program has used “genetic engineering” and “genetic modification” interchangeably.


“Transgenic” organisms can be considered a subset of genetically engineered organisms and result from the insertion of genetic material from another organism using recombinant DNA techniques. “Gene editing” or “genome editing” refers to biotechnology techniques like CRISPR that make changes in a specific location in an organism’s DNA. 


The term “bioengineered” does carry regulatory weight. The USDA-AMS’s National Bioengineered Food Disclosure Standard (NBFDS), published in 2018 and effective as of 2019, defines “bioengineered” as “contains genetic material that has been modified through in vitro recombinant deoxyribonucleic acid (DNA) techniques; and for which the modification could not otherwise be obtained through conventional breeding or found in nature.” Most gene-edited crops currently in development, such as those where the introduced gene is known to occur in the species naturally, are exempt from regulation under both the AMS’s NBFDS and APHIS’s SECURE rule.

What are some examples of genetic engineering methods?

Though “genetic engineering” has only entered the popular lexicon in the last several decades, humans have modified the genomes of plants for millennia, in many different ways. Through genetic changes introduced via traditional breeding, teosinte became maize 10,000 years ago in Mesoamerica, and hybrid rice was developed in 20th-century China. Irradiation has been used to generate random mutations in crops for decades, and the resulting varieties have never been subject to any special regulation.


In fact, transfer of genes between organisms occurs all the time in nature. Bacteria often transfer DNA to other bacteria, and some bacteria can insert genes into plants. Indeed, one of the most common “genetic engineering” approaches used today, Agrobacterium-mediated gene insertion, was inspired by that natural phenomenon. Other methods of DNA delivery include biolistics (“gene gun”) and viral vectors. Each gene-transfer method has many variations and varies greatly in its mode of action and capabilities. This is key for the future of plant engineering: there is a spectrum, not a binary division, of methods, and evaluations of engineered plants should focus on the end product.

How are genetically engineered organisms regulated in the United States?

Genetically engineered organisms are chiefly regulated by USDA-APHIS, the EPA, and the FDA as established by the 1986 Coordinated Framework for the Regulation of Biotechnology. They oversee experimental testing, approval, and commercial release. The Framework’s regulatory approach is grounded in the judgment that the potential risks associated with genetically engineered organisms can be evaluated the same way as those associated with traditionally bred organisms. This is in line with its focus on “the characteristics of the product and the environment into which it is being introduced, not the process by which the product is created.”


USDA-APHIS regulates the movement and release of organisms produced through biotechnology to ensure that they do not pose a plant pest risk. Developers can request that individual organisms, including transgenics, be deregulated via the Regulatory Status Review process.


The EPA regulates the distribution, sale, use, and testing of all pesticidal substances produced in plants and microbes, regardless of method of production or mode of action. Products must be registered before distribution. 


The FDA applies the same safety standards to foods derived from genetically engineered organisms as it does to all foods under the Federal Food, Drug, and Cosmetic Act. The agency provides a voluntary consultation process to help developers ensure that all safety and regulatory concerns, such as toxicity, allergenicity, and nutrient content, are resolved prior to marketing.

How do genetically engineered crops work?

Mechanisms of action vary depending on the specific trait. Here, we explain the science behind two types of transgenic crops that have been widespread in the U.S. market for decades. 


Bt crops: Three of the major crops grown in the United States have transgenic Bt varieties: cotton, corn, and soybean. Bt crops are genetically engineered such that their genome contains a gene from the bacterium Bacillus thuringiensis. This enables Bt crops to produce a protein, normally only produced by the Bt bacterium, that is toxic to a few specific plant pests but harmless to humans, other mammals, birds, and beneficial insects. In fact, the bacterium itself is approved for use as an organic insecticide. However, organic applications of Bt insecticides are limited in efficacy: since the bacterium must be topically applied to the crop, the protein it produces is ineffective against insects that have penetrated the plant or are attacking the roots; in addition, the bacterium can die or be washed away by rain.


Engineering the crop itself to produce the insecticidal protein reduces crop loss due to pest damage more reliably, which also minimizes the need for other, often more broadly toxic systemic pesticides. Increased yield allows for more efficient use of existing agricultural land. In addition, decreased use of pesticides reduces the energy cost associated with their production and application while also preserving wildlife biodiversity. With regard to concerns surrounding insecticide resistance, the EPA requires farmers who employ Bt, both as a transgenic crop and as an organic spray, to also plant a refuge field of non-Bt crops, which prevents pests from developing resistance to the Bt protein.


The only substantive difference between Bt crops and non-Bt crops is that the former produce an insecticide already permitted by USDA organic regulations.


Ringspot-resistant rainbow papaya: The transgenic rainbow papaya is another example of the benefits of genetic engineering in agriculture. Papaya plantations were ravaged by the papaya ringspot virus in the 1990s, forcing many farmers to abandon their lands and careers. In response, scientists developed the rainbow papaya, which contains a gene from the virus itself that allows it to express a protein that counters viral infection. This transgenic papaya was determined to be equivalent in nutrition and all other aspects to the original papaya. The rainbow papaya, with its single gene insertion, is widely considered to have saved Hawaii’s papaya industry, which in 2013 accounted for nearly 25% of Hawaii’s food exports. Transgenic papaya now makes up about 80% of the Hawaiian papaya acreage. The remaining acreage comprises non-GMO varieties, which would have gone locally extinct had transgenic papayas not prevented the spread of the virus. The rainbow papaya’s success clearly demonstrates that transgenic crops can preserve the genetic diversity of American crops and preserve yield without spraying synthetic pesticides, both of which are stated goals of the USDA Organic Program. However, the National Organic Program’s regulations currently forbid organic farmers from growing virus-resistant transgenic papaya.

How have recent biotechnology breakthroughs accelerated the development of new crops?

With the advent of CRISPR gene-editing technology, which allows scientists to make precise, targeted changes in an organism’s DNA, new genetically engineered crops are being developed at an unprecedented pace. These new varieties will encompass a wider variety of qualities than previously seen in the field of crop biotechnology. Many varieties are directly aimed at shoring up agricultural resilience in the face of climate change, with traits including tolerance to heat, cold, and drought. At the same time, the cost of sequencing an organism’s DNA continues to decrease. This makes it easier to confirm the insertion of multiple transgenes into a plant, as would be necessary to engineer crops to produce a natural herbicide. Such a crop, similar to Bt crops but targeting weeds instead of insects, would reduce reliance on synthetic herbicides while enabling no-till practices that promote soil health. Furthermore, cheap DNA sequencing facilitates access to information about the genomes of many wild relatives of modern crops. Scientists can then use genetic engineering to make wild relatives more productive or introduce wild traits like drought resilience into domesticated varieties. This would increase the genetic diversity of crops available to farmers and help avoid issues inherent to monocultures, most notably the uncontrollable spread of plant diseases. 


At present, most crops engineered with CRISPR technology do not contain genes from a different organism (i.e., they are not transgenic) and thus do not face the additional regulatory hurdles that transgenics like Bt crops did. However, crops developed via CRISPR are still excluded from organic farming.

What are examples of genetically engineered organisms currently on the market or in active development that address sustainability issues?

  • Improving sustainability and land conservation: potatoes that are slower to spoil, wheat with enhanced carbon sequestration capacity 

  • Increasing food quality and nutrition: vegetables with elevated micronutrient content 

  • Increasing and protecting agricultural yields: higher-yield fish, flood-tolerant rice

  • Protecting against plant and animal pests and diseases: blight-resistant chestnut, HLB-resistant citrus

  • Cultivating alternative food sources: bacteria for animal-free production of protein

Which agricultural stakeholders are engaged in genetic engineering R&D and will benefit from federal funding?

The pool of producers of genetically engineered crops is increasingly diverse. In fact, of the 37 new crops evaluated by APHIS’s Biotechnology Regulatory Service under the updated guidelines since 2021, only three were produced by large (>300 employees) for-profit corporations. Many were produced by startups and/or not-for-profit research institutions. USDA NIFA research grants predominantly fund land-grant universities; other awardees include private nonprofit organizations, private universities, and, in select cases (such as small business grants), private for-profit companies.

Why are GMOs so often vilified?

Historically, the concept of GMOs has been associated with giant multinational corporations, the so-called Big Ag. The most prevalent GMOs in the last several decades have indeed been produced by industry giants such as Dow, Bayer, and Monsanto. This association has fueled the negative public perception of GMOs in several ways, including: 



  • Some companies, such as Dow, were responsible for producing the notorious chemical Agent Orange, used to devastating effect in the Vietnam War. While this is an unfortunate shadow on the company, it is unrelated to the properties of genetically engineered crops.

  • Companies have been accused of financially disadvantaging farmers by upholding patents on GMO seeds, which prevents farmers from saving seeds from one year’s crop to plant the next season. Companies have indeed enforced seed patents (which generally last about 20 years), but it is important to note that (1) seed-saving has not been standard practice on many American farms for many decades, since the advent of (nonbioengineered) hybrid crops, from which saved seeds will produce an inferior crop, and (2) bioengineered seeds are not the only seeds that can be and are patented.

How to Replicate the Success of Operation Warp Speed

Operation Warp Speed (OWS) was a public-private partnership that produced COVID-19 vaccines in the unprecedented timeline of less than one year. This unique success among typical government research and development (R&D) programs is attributed to OWS’s strong public-private partnerships, effective coordination, and command leadership structure. Policy entrepreneurs, leaders of federal agencies, and issue advocates will benefit from understanding what policy interventions were used and how they can be replicated. Those looking to replicate this success should evaluate the stakeholder landscape and state of the fundamental science before designing a portfolio of policy mechanisms.

Challenge and Opportunity

Development of a vaccine to protect against COVID-19 began when China first shared the genetic sequence in January 2020. In May, the Trump Administration announced OWS to dramatically accelerate development and distribution. Through the concerted efforts of federal agencies and private entities, a vaccine was ready for the public in January 2021, beating the previous record for vaccine development by about three years. OWS released over 63 million doses within one year, and to date more than 613 million doses have been administered in the United States. By many accounts, OWS was the most effective government-led R&D effort in a generation.

Policy entrepreneurs, leaders of federal agencies, and issue advocates are interested in replicating similarly rapid R&D to solve problems such as climate change and domestic manufacturing. But not all challenges are suited for the OWS treatment. Replicating its success requires an understanding of the unique factors that made OWS possible, which are addressed in Recommendation 1. With this understanding, the mechanisms described in Recommendation 2 can be valuable interventions when used in a portfolio or individually.

Plan of Action

Recommendation 1. Assess whether (1) the majority of existing stakeholders agree on an urgent and specific goal and (2) the fundamental research is already established. 

Criterion 1. The majority of stakeholders (including relevant portions of the public, federal leaders, and private partners) agree on an urgent and specific goal.

The OWS approach is most appropriate for major national challenges that are self-evidently important and urgent. Experts in different aspects of the problem space, including agency leaders, should assess the problem to set ambitious and time-bound goals. For example, OWS was conceptualized in April 2020 and announced in May 2020, with the specific goal of distributing 300 million vaccine doses by January 2021.

Leaders should begin by assessing the stakeholder landscape, including relevant portions of the public, other federal leaders, and private partners. This assessment must include adoption forecasts that consider the political, regulatory, and behavioral contexts. Community engagement—at this stage and throughout the process—should inform goal-setting and program strategy. Achieving ambitious goals will require commitment from multiple federal agencies and the presidential administration. At this stage, understanding the private sector is helpful, but these stakeholders can be motivated further with mechanisms discussed later. Throughout the program, leaders must communicate the timeline and standards for success with expert communities and the public.

Example Challenge: Building Capability for Domestic Rare Earth Element Extraction and Processing
Rare earth elements (REEs) have unique properties that make them valuable across many sectors, including consumer electronics manufacturing, renewable and nonrenewable energy generation, and scientific research. The U.S. relies heavily on China for the extraction and processing of REEs, and the U.S. Geological Survey reports that 78% of our REEs were imported from China from 2017 to 2020. Disruption to this supply chain, particularly in the case of export controls enacted by China as foreign policy, would significantly impair the production of consumer electronics and energy generation equipment critical to the U.S. economy. Export controls on REEs would create an urgent national problem, making it suitable for an OWS-like effort to build capacity for domestic extraction and processing.

Criterion 2. Fundamental research is already established, and the goal requires advancing that research for a specific use case at scale.

Efforts modeled after OWS should center on advancing or scaling established fundamental research into a product. For example, two of the four vaccine platforms selected for development in OWS were mRNA and replication-defective live-vector platforms, which had been extensively studied despite never having been used in FDA-licensed vaccines. Research was advanced enough to give leaders confidence to bet on these platforms as candidates for a COVID-19 vaccine. To mitigate risk, two more-established platforms were also selected.

Technology readiness levels (TRLs) are maturity level assessments of technologies for government acquisition. This framework can be used to assess whether a candidate technology should be scaled with an OWS-like approach. A TRL of at least five means the technology was successfully demonstrated in a laboratory environment as part of an integrated or partially integrated system. In evaluating and selecting candidate technologies, risk is unavoidable, but decisions should be made based on existing science, data, and demonstrated capabilities.
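As a minimal illustration, a screening step based on this framework might look like the following sketch. This is hypothetical, not an official assessment tool; the first two candidate entries are loosely drawn from the desalination example below, the third is invented, and the threshold of five follows the guidance above.

```python
# Hypothetical sketch of TRL-based screening for an OWS-style program.
# The threshold of 5 reflects the guidance above: candidates should at
# least have been demonstrated in a laboratory environment as part of
# an integrated or partially integrated system.

from dataclasses import dataclass

TRL_THRESHOLD = 5

@dataclass
class Candidate:
    name: str
    trl: int  # technology readiness level, 1 (basic principles) to 9 (proven in operation)

def screen(candidates: list[Candidate]) -> list[Candidate]:
    """Keep candidates mature enough to scale, most ready first."""
    ready = [c for c in candidates if c.trl >= TRL_THRESHOLD]
    return sorted(ready, key=lambda c: c.trl, reverse=True)

if __name__ == "__main__":
    portfolio = [
        Candidate("membrane distillation", 6),
        Candidate("advanced pretreatment", 5),
        Candidate("novel experimental membrane", 3),  # screened out: too early to scale
    ]
    for c in screen(portfolio):
        print(f"{c.name}: TRL {c.trl}")
```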

Example Challenge: Scaling Desalination to Meet Changing Water Demand
Increases in efficiency and conservation efforts have largely kept the U.S.’s total water use flat since the 1980s, but drought and climate variability are challenging our water systems. Desalination, a well-understood process to turn seawater into freshwater, could help address our changing water supply. However, all current desalination technologies applied in the U.S. are energy intensive and may negatively impact coastal ecosystems. Advanced desalination technologies—such as membrane distillation, advanced pretreatment, and advanced membrane cleaning, all of which are at technology readiness levels of 5–6—would reduce the total carbon footprint of a desalination plant. An OWS for desalination could increase the footprint of efficient and low-carbon desalination plants by speeding up development and commercialization of advanced technologies.

Recommendation 2: Design a program with mechanisms most needed to achieve the goal: (1) establish a leadership team across federal agencies, (2) coordinate federal agencies and the private sector, (3) activate latent private-sector capacities for labor and manufacturing, (4) shape markets with demand-pull mechanisms, and (5) reduce risk with diversity and redundancy.

Design a program using a combination of the mechanisms below, informed by the stakeholder and technology assessment. The organization of R&D, manufacturing, and deployment should follow an agile methodology in which more risk than normal is accepted. The program framework should include criteria for success at the end of each sprint. During OWS, vaccine candidates were advanced to the next stage based on the preclinical or early-stage clinical trial data on efficacy; the potential to meet large-scale clinical trial benchmarks; and criteria for efficient manufacturing.
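To make such sprint-gating concrete, the sketch below shows a hypothetical weighted rubric. The three criteria mirror the OWS advancement criteria just described, but the weights, scores, and threshold are invented for illustration.

```python
# Hypothetical weighted rubric for deciding which candidates advance to
# the next sprint. The criteria mirror those used during OWS (early
# efficacy data, large-scale trial readiness, efficient manufacturing);
# the weights and the advancement threshold are illustrative assumptions.

CRITERIA_WEIGHTS = {
    "early_trial_efficacy": 0.4,
    "large_trial_readiness": 0.3,
    "manufacturability": 0.3,
}

ADVANCE_THRESHOLD = 0.7  # assumed minimum weighted score to advance

def weighted_score(scores: dict[str, float]) -> float:
    """Combine per-criterion scores (each 0.0 to 1.0) into a weighted total."""
    return sum(weight * scores[criterion]
               for criterion, weight in CRITERIA_WEIGHTS.items())

def advance(portfolio: dict[str, dict[str, float]]) -> list[str]:
    """Return the names of candidates whose weighted score clears the threshold."""
    return [name for name, scores in portfolio.items()
            if weighted_score(scores) >= ADVANCE_THRESHOLD]

if __name__ == "__main__":
    portfolio = {
        "candidate A": {"early_trial_efficacy": 0.9,
                        "large_trial_readiness": 0.8,
                        "manufacturability": 0.7},  # weighted score 0.81: advances
        "candidate B": {"early_trial_efficacy": 0.5,
                        "large_trial_readiness": 0.6,
                        "manufacturability": 0.9},  # weighted score 0.65: does not advance
    }
    print(advance(portfolio))  # ['candidate A']
```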

Mechanism 1: Establish a leadership team across federal agencies

Establish an integrated command structure co-led by a chief scientific or technical advisor and a chief operating officer, a small oversight board, and leadership from federal agencies. The team should commit to operate as a single cohesive unit despite individual affiliations. Since many agencies have limited experience in collaborating on program operations, a chief operating officer with private-sector experience can help coordinate and manage agency biases. Ideally, the team should have decision-making authority and report directly to the president. Leaders should thoughtfully delegate tasks, give appropriate credit for success, hold themselves and others accountable, and empower others to act.

The OWS team was led by personnel from the Department of Health and Human Services (HHS), the Department of Defense (DOD), and the vaccine industry. It included several HHS offices at different stages: the Centers for Disease Control and Prevention (CDC), the Food and Drug Administration (FDA), the National Institutes of Health (NIH), and the Biomedical Advanced Research and Development Authority (BARDA). This structure combined expertise in science and manufacturing with the power and resources of the DOD. The team assigned clear roles to agencies and offices to establish a chain of command.

Example Challenge: Managing Wildland Fire with Uncrewed Aerial Systems (UAS)
Wildland fire is a natural and normal ecological process, but the changing climate and our policy responses are causing more frequent, intense, and destructive fires. Reducing harm requires real-time monitoring of fires with better detection technology and modernized equipment such as UAS. Wildfire management is a complex policy and regulatory landscape with functions spanning multiple federal, state, and local entities. Several interagency coordination bodies exist, including the National Wildfire Coordinating Group, Wildland Fire Leadership Council, and the Wildland Fire Mitigation and Management Commission, but most of these efforts rely on consensus-based coordination models. The status quo and historical biases against agencies have created silos of effort and prevented technology from scaling to the level required. An OWS for wildland fire UAS would establish a public-private partnership led by experienced leaders from federal agencies, state and local agencies, and the private sector to advance this technology development. The team would motivate commitment to the challenge across government, academia, nonprofits, and the private sector to deliver technology that meets ambitious goals. Appropriate teams across agencies would be empowered to refocus their efforts for the duration of the challenge.

Mechanism 2: Coordinate federal agencies and the private sector

Coordinate agencies and the private sector on R&D, manufacturing, and distribution, and assign responsibilities based on core capabilities rather than political or financial considerations. Identify efficiency improvements by mapping processes across the program. This may include accelerating regulatory approval by facilitating communication between the private sector and regulators or by speeding up agency operations. Certain regulations may be suspended entirely if the risks are considered acceptable relative to the urgency of the goal. Coordinators should identify processes that can occur in parallel rather than sequentially. Leaders can work with industry so that operations occur under minimal conditions to ensure worker and product safety.

The OWS team worked with the FDA to compress traditional approval timelines by simultaneously running certain steps of the clinical trial process. This allowed manufacturers to begin industrial-scale vaccine production before full demonstration of efficacy and safety. The team continuously sent data to the FDA, which completed regulatory procedures in active communication with vaccine companies. Direct lines of communication permitted parallel work streams that significantly reduced the normal vaccine approval timeline.

Example Challenge: Public Transportation and Interstate Rail
Much of the infrastructure across the United States needs expensive repairs, but the U.S. has some of the highest infrastructure construction costs relative to GDP and some of the longest construction times. A major contributor to costs and time is the approval process, with its extensive documentation requirements, such as preparing an environmental impact study to comply with the National Environmental Policy Act. An OWS-like coordinating body could identify key pieces of national infrastructure eligible for support, particularly near-end-of-lifespan infrastructure or major transportation arteries. Reducing regulatory burden for selected projects could be achieved by coordinating regulatory approval in close collaboration with the Department of Transportation, the Environmental Protection Agency, and state agencies. The program would need to identify and set a precedent for differentiating between expeditable regulations and key regulations, such as structural reviews, that could serve as bottlenecks.

Mechanism 3: Activate latent private-sector capacities for labor and manufacturing

Activate private-sector capabilities for production, supply chain management, deployment infrastructure, and workforce. Minimize physical infrastructure requirements, establish contracts with companies that have existing infrastructure, and fund construction to expand facilities where necessary. Coordinate with the Department of State to expedite visa approval for foreign talent, and borrow personnel from other agencies to fill key roles temporarily. Train staff quickly with boot camps or accelerators. Efforts to build morale and ensure commitment are critical, as staff may need to work holidays or perform beyond normal expectations. Map supply chains, identify critical components, and coordinate supply. Critical supply chain nodes should be managed by a technical expert in close partnership with suppliers. Use the Defense Production Act sparingly to require providers to prioritize contracts for procurement, import, and delivery of equipment and supplies. Map the distribution chain from the manufacturer to the endpoint, actively coordinate each step, and anticipate points of failure.

During OWS, the Army Corps of Engineers oversaw construction projects to expand vaccine manufacturing capacity. Expedited visa approval brought in key technicians and engineers for installing, testing, and certifying equipment. Sixteen DOD staff also served in temporary quality-control positions at manufacturing sites. The program established partnerships between manufacturers and the government to address supply chain challenges. Experts from BARDA worked with the private sector to create a list of critical supplies. With this supply chain mapping, the DOD placed prioritized ratings on 18 contracts using the Defense Production Act. OWS also coordinated with DOD and U.S. Customs to expedite supply import. OWS leveraged existing clinics at pharmacies across the country and shipped vaccines in packages that included all supplies needed for administration, including masks, syringes, bandages, and paper record cards.

Example Challenge: EV Charging Network
Electric vehicles (EVs) are becoming increasingly popular due to high gas prices and lower EV prices, stimulated by tax credits for both automakers and consumers in the Inflation Reduction Act. Replacing internal combustion engine vehicles with EVs is aligned with our current climate commitments and reduces overall carbon emissions, even when the vehicles are charged with energy from nonrenewable sources. Studies suggest that current public charging infrastructure has too few functional chargers to meet the demand of EVs currently on the road. Reliable and available public chargers are needed to increase public confidence in EVs as practical replacements for gas vehicles. Leveraging latent private-sector capacity could include expanding the operations of existing charger manufacturers, coordinating the deployment and installation of charging stations and requisite infrastructure, and building a skilled workforce to repair and maintain this new infrastructure. In February 2023 the Biden Administration announced actions to expand charger availability through partnerships with over 15 companies.

Mechanism 4: Shape markets with demand-pull mechanisms

Use contracts and demand-pull mechanisms to create demand and minimize risks for private partners. Other Transaction Authority can also be used to procure capabilities quickly by bypassing elements of the Federal Acquisition Regulation. Demand-pull mechanisms available to agencies include prize competitions, challenge-based acquisitions, volume guarantees, advance purchase agreements, advance market commitments, and milestone payments (see FAQ).

HHS used demand-pull mechanisms to develop the vaccine candidates during OWS. This included funding large-scale manufacturing and committing to purchase successful vaccines. HHS made up to $483 million in support available for Phase 1 trials of Moderna’s mRNA candidate vaccine. This agreement was increased by $472 million for late-stage clinical development and Phase 3 clinical trials. Several months later, HHS committed up to $1.5 billion for Moderna’s large-scale manufacturing and delivery efforts. Ultimately the U.S. government owned the resulting 100 million doses of vaccines and reserved the option to acquire more. Similar agreements were created with other manufacturers, leading to three vaccine candidates receiving FDA emergency use authorization.

Example Challenge: Space Debris
Low-earth orbit includes dead satellites and other debris that pose risks for existing and future space infrastructure. Increased interest in commercialization of low-earth orbit will exacerbate a debris count that is already considered unstable. Since national space policy generally requires some degree of engagement with commercial providers, the U.S. would need to include the industry in this effort. The cost of active space debris removal, satellite decommissioning and recycling, and other cleanup activities are largely unknown, which dissuades novel business ventures. Nevertheless, large debris objects that pose the greatest collision risks need to be prioritized for decommission. Demand-pull mechanisms could be used to create a market for sustained space debris mitigation, such as an advanced market commitment for the removal of large debris items. Commitments for removal could be paired with a study across the DOD and NASA to identify large, high-priority items for removal. Another mechanism that could be considered is fixed milestone payments, which NASA has used in past partnerships with commercial partners, most notably SpaceX, to develop commercial orbital transportation systems.

Mechanism 5: Reduce risk with diversity and redundancy

Engage multiple private partners on the same goal to enable competition and minimize the risk of overall program failure. Since resources are not infinite, the program should incorporate evidence-based decision-making with strict criteria and a rubric. A rubric and clear criteria also ensure fair competition and avoid creating a single national champion. 

During OWS, four vaccine platform technologies were considered for development: mRNA, replication-defective live-vector, recombinant-subunit-adjuvanted protein, and attenuated replicating live-vector. The first two had never been used in FDA-licensed vaccines but showed promise, while the latter two were established in FDA-licensed vaccines. Following a risk assessment, six vaccine candidates using three of the four platforms were advanced. Redundancy was incorporated in two dimensions: three different vaccine platforms and two separate candidates per platform. The manufacturing strategy also included redundancy, as several companies were awarded contracts to produce needles and syringes. Diversifying sources for common vaccination supplies reduced the overall risk of failure at each node in the supply chain.

Example Challenge: Alternative Battery Technology
Building infrastructure to capture energy from renewable sources requires long-term energy storage to manage the variability of renewable energy generation. Lithium-ion batteries, commonly used in consumer electronics and electric vehicles, are a potential candidate, as research and development has driven significant cost declines since the technology’s introduction in the 1990s. However, performance declines when storing energy over long periods, and the extraction of critical minerals remains relatively expensive and harmful to the environment. The limitations of lithium-ion batteries could be addressed by investing in several promising alternative battery technologies that use cheaper materials such as sodium, sulfur, and iron. This portfolio approach would enable competition and increase the chance that at least one option succeeds.

Conclusion

Operation Warp Speed was a historic accomplishment on the level of the Manhattan Project and the Apollo program, but the unique approach is not appropriate for every challenge. The methods and mechanisms are best suited for challenges in which stakeholders agree on an urgent and specific goal, and the goal requires scaling a technology with established fundamental research. Nonetheless, the individual mechanisms of OWS can effectively address smaller challenges. Those looking to replicate the success of OWS should deeply evaluate the stakeholder and technology landscape to determine which mechanisms are required or feasible.

Acknowledgments

This memo was developed from notes on presentations, panel discussions, and breakout conversations at the Operation Warp Speed 2.0 Conference, hosted on November 17, 2022, by the Federation of American Scientists, 1Day Sooner, and the Institute for Progress to recount the success of OWS and consider future applications of the mechanisms. The attendees included leadership from the original OWS team, agency leaders, Congressional staffers, researchers, and vaccine industry leaders. Thank you to Michael A. Fisher, FAS senior fellow, who contributed significantly to the development of this memo through January 2023. Thank you to the following FAS staff for additional contributions: Dan Correa, chief executive officer; Jamie Graybeal, director, Defense Budgeting Project (through September 2022); Sruthi Katakam, Scoville Peace Fellow; Vijay Iyer, program associate, science policy; Kai Etheridge, intern (through August 2022).

Frequently Asked Questions
When is the OWS approach not appropriate?

The OWS approach is unlikely to succeed for challenges that are too broad or too politically polarizing. Consider curing cancer: while a cure is incredibly urgent and the goal is unifying, too many variations of cancer exist, each posing unique research and development challenges. Climate change is another example: particular climate challenges may be too politically polarizing to motivate the commitment required.

Can the OWS mechanisms work for politicized topics?

No topic is immune to politicization, but some issues have existing political biases that will hinder application of the mechanisms. Challenges with bipartisan agreement and public support should be prioritized, but politicization can be managed with a comprehensive understanding of the stakeholder landscape.

Can the OWS mechanisms be used broadly to improve interagency coordination?

The pandemic created an emergency environment that likely motivated behavior change at agencies, but OWS demonstrated that better agency coordination is possible.

How do you define and include relevant stakeholders?

In addition to using processes like stakeholder mapping, the leadership team must include experts across the problem space that are deeply familiar with key stakeholder groups and existing power dynamics. The problem space includes impacted portions of the public; federal agencies and offices; the administration; state, local, Tribal, and territorial governments; and private partners. 


OWS socialized the vaccination effort through HHS’s Office of Intergovernmental and External Affairs, which established communication with hospitals, healthcare providers, nursing homes, community health centers, health insurance companies, and more. HHS also worked with state, local, Tribal, and territorial partners, as well as organizations representing minority populations, to address health disparities and ensure equity in vaccination efforts. Despite this, OWS leaders expressed that better communication with expert communities was needed, as the public was confused by contradictory statements from experts who were unaware of the program details.

How can future OWS-like efforts include better communication and collaboration with the public?

Future efforts should create channels for bottom-up communication from state, local, Tribal, and territorial governments to federal partners. Encouraging feedback through community engagement can help inform distribution strategies and ensure adoption of the solution. Formalized data-sharing protocols may also help gain buy-in and confidence from relevant expert communities.

Can the OWS mechanisms be used internationally?

Possibly, but it would require more coordination and alignment between the countries involved. This could include applying the mechanisms within existing international institutions to achieve existing goals. The mechanisms could apply with revisions, such as coordination among national delegations and nongovernmental organizations, activating nongovernmental capacity, and creating geopolitical incentives for adoption.

Who was on the Operation Warp Speed leadership team?

The team included HHS Secretary Alex Azar; Secretary of Defense Mark Esper; Dr. Moncef Slaoui, former head of vaccines at GlaxoSmithKline; and General Gustave F. Perna, former commanding general of U.S. Army Materiel Command. This core team combined scientific and technical expertise with military and logistical backgrounds. Dr. Slaoui’s familiarity with the pharmaceutical industry and the vaccine development process allowed OWS to develop realistic goals and benchmarks for its work. This connection was also critical in forging robust public-private partnerships with the vaccine companies.

Which demand-pull mechanisms are most effective?

It depends on the challenge. Determining which mechanism to use for a particular project requires a deep understanding of the particular R&D, manufacturing, and supply chain landscapes in order to diagnose the market gaps. For example, if manufacturing process technologies are needed, prize competitions or challenge-based acquisitions may be most effective. If manufacturing volume must increase, volume guarantees or advance purchase agreements may be more appropriate. Advance market commitments or milestone payments can motivate industry to increase efficiency. OWS used a combination of volume guarantees and advance market commitments to fund the development of vaccine candidates and secure supply.

Enabling Faster Funding Timelines in the National Institutes of Health

Summary

The National Institutes of Health (NIH) funds some of the world’s most innovative biomedical research, but rising administrative burden and extended wait times—even in crisis—have shown that its funding system is in desperate need of modernization. Promising alternative models exist: in the last two years, private “fast science funding” initiatives such as Fast Grants and Impetus Grants have delivered breakthroughs in pandemic response and aging research on timelines of days to one month, dramatically faster than the NIH’s yearly funding cycles. In response to the COVID-19 pandemic, the NIH implemented a temporary fast funding program called RADx, indicating a willingness to adopt such practices during acute crises. Research on other critical health challenges like aging, the opioid epidemic, and pandemic preparedness deserves similar urgency. We therefore believe it is critical that the NIH formalize and expand its institutional capacity for rapid funding of high-potential research.

Using the learnings of these fast funding programs, this memo proposes actions that the NIH could take to accelerate research outcomes and reduce administrative burden. Specifically, the NIH director should consider pursuing one of the following approaches to integrate faster funding mechanisms into its extramural research programs:
  • Approach 1: Develop an expedited peer review process for the existing R21 grant mechanism, bringing it in line with the NIH’s own goals of funding high-reward, rapid-turnaround research.
  • Approach 2: Direct NIH institutes and centers to independently develop and deploy programs with faster funding timelines using Other Transaction Authority.

Future efforts by the NIH and other federal policymakers to respond to crises like the COVID-19 pandemic would also benefit from a clearer understanding of the impact of the decision-making process and actions taken by the NIH during the earliest weeks of the pandemic. To that end, we also recommend that Congress initiate a report from the Government Accountability Office to illuminate the outcomes and learnings of fast governmental programs during COVID-19, such as RADx.

Challenge and Opportunity

The urgency of the COVID-19 pandemic spurred adaptations not only in how we structure our daily lives but also in how we develop therapeutics and fund science. Starting in 2020, the public saw a rapid emergence of nongovernmental programs like Fast Grants, Impetus Grants, and Reproductive Grants to fund both big clinical trials and proof-of-concept scientific studies within timelines that were previously thought impossible. Within the government, the NIH launched RADx, a program for the rapid development of coronavirus diagnostics with significantly accelerated approval timelines. Though the sudden onset of the pandemic was unique, we believe that an array of other biomedical crises deserve the same sense of urgency and innovation. It is therefore vital that the new NIH director permanently integrate fast funding programs like RADx into the NIH in order to better respond to these crises and accelerate research progress for the future.

To demonstrate why, we must remember that the coronavirus is far from an outlier: in the last 20 years, humanity has weathered several major outbreaks, notably swine flu, SARS-CoV-1, and Ebola. Based on the long-observed history of infectious diseases, the risk of a pandemic with an impact similar to that of COVID-19 is about two percent in any given year. A related, slower-moving crisis is the ongoing epidemic of opioid use and addiction. The rapidly changing landscape of opioid use—with overdose rates growing rapidly and synthetic opioid formulations becoming more common—makes slow, incremental grantmaking ill-suited to the task. The counterfactual impact of providing some awards via faster funding mechanisms in these cases is clear: having tests, trials, and interventions earlier saves lives and money without requiring additional resources.
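
To put that annual figure in perspective, here is a minimal sketch of how the risk compounds, assuming (as the estimate above implies) an independent two percent probability each year:

```python
# Cumulative probability of at least one COVID-scale pandemic, assuming
# an independent ~2% chance in any given year (the estimate cited above).
annual_risk = 0.02

for years in (10, 25, 50):
    p_at_least_one = 1 - (1 - annual_risk) ** years
    print(f"{years:>2} years: {p_at_least_one:.0%} chance of at least one pandemic")
```

Under this assumption, the chance of at least one such event over 25 years is roughly 40 percent.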

Beyond acute crises, there are strong longer-term public health motivations for achieving faster funding of science. In about 10 years, the United States will have more seniors (people aged 65+) than children. This will place substantial stress on the U.S. healthcare system, especially given that two-thirds of seniors suffer from more than one chronic disease. New disease treatments may help, but it often takes years to translate the results of basic research into approved drugs. The idiosyncrasies of drug discovery and clinical trials make them difficult to accelerate at scale, but we can reliably accelerate drug timelines on the front end by reducing the time researchers spend writing and reviewing grants—potentially easing the long-term stress on U.S. healthcare.

The existing science funding system developed over time with the best intentions, but for a variety of reasons—partly because the supply of federal dollars has not kept up with demand—administrative requirements have become a major challenge for many researchers. According to surveys, working scientists now spend 44% of their research time on administrative activities and compliance, with roughly half of that time spent on pre-award activities. Over 60% of scientists say administrative burden compromises research productivity, and many fear it discourages students from pursuing science careers. In addition, the wait for funding can be extensive: one of the major NIH grants, the R01, takes more than three months to write and around 8–20 months to receive (see FAQ). Even proof-of-concept ideas face onerous review processes and take at least a year to fund. This can bottleneck potentially transformative ideas, as when Katalin Karikó famously struggled to get funding for her breakthrough mRNA vaccine work in its early stages. These issues have been of interest to science policymakers for more than two decades, but with little to show for it.

Though several nongovernmental organizations have attempted to address this need, the model of private citizens continuously fundraising to enable fast science is neither sustainable nor substantial enough compared to the impact of the NIH. We believe that a coordinated governmental effort is needed to revitalize American research productivity and ensure a prompt response to national—and international—health challenges like naturally occurring pandemics and imminent demographic pressure from age-related diseases. The new NIH director has an opportunity to take bold action by making faster funding programs a priority under their leadership and a keystone of their legacy. 

The government’s own track record with such programs gives grounds for optimism. In addition to the aforementioned RADx program at NIH, the National Science Foundation (NSF) runs the Early-Concept Grants for Exploratory Research (EAGER) and Rapid Response Research (RAPID) programs, which can have response times of a matter of weeks. Going back further in history, during World War II the National Defense Research Committee maintained a one-week review process.

Faster grant review processes can be either integrated into existing grant programs or rolled out by institutes in temporary grant initiatives responding to pressing needs, as the RADx program was. For example, when faced with data falsification around the beta amyloid hypothesis, the National Institute on Aging (NIA) could leverage fast grant review infrastructure to quickly fund replication studies of key papers, without waiting for the next funding cycle. In the case of threats to human health from toxins, the National Institute of Environmental Health Sciences (NIEHS) could rapidly fund studies on risk assessment and prevention, issuing evidence-based public recommendations without delay. Finally, empowering the National Institute of Allergy and Infectious Diseases (NIAID) to quickly fund science would prepare us for pandemics yet to come.

Plan of Action

The NIH is a decentralized organization, with institutes and centers (ICs) that each have their own mission and focus areas. While the NIH Office of the Director sets general policies and guidelines for research grants, individual ICs have the authority to create their own grant programs and define their goals and scope. The Center for Scientific Review (CSR) is responsible for the peer review process used to review grants across the NIH and recently published new guidelines to simplify the review criteria. Given this organizational structure, we propose that the NIH Office of the Director, particularly the Office of Extramural Research, assess opportunities for both NIH-wide and institute-specific fast funding mechanisms and direct the CSR, institutes, and centers to produce proposed plans for fast funding mechanisms within one year. The Director’s Office should consider the following approaches. 

Approach 1. Develop an expedited peer review process for the existing R21 grant mechanism to bring it more in line with the NIH’s own goals of funding high-reward, rapid-turnaround research. 

The R21 program is designed to support high-risk, high-reward, rapid-turnaround, proof-of-concept research. However, it has been historically less popular among applicants than the NIH’s traditional research mechanism, the R01. This is in part because its application and review process is known to be only slightly less burdensome than the R01’s, despite providing less than half of the financial and temporal support. Reforming the application and peer review process for the R21 program to make it a fast grant–style award would therefore both bring the program in line with its own goals and potentially make it more attractive to applicants.

All ICs follow identical yearly cycles for major grant programs like the R21, and the CSR centrally manages the peer review process for these grant applications. Thus, changes to the R21 grant review process must be spearheaded by the NIH director and coordinated in a centralized manner with all parties involved in the review process: the CSR, program directors and managers at the ICs, and the advisory councils at the ICs. 

The track record of federal and private fast funding initiatives demonstrates that faster funding timelines can be feasible and successful (see FAQ). Among the key learnings and observations of these efforts that the NIH could implement are:
  • short applications (e.g., a three-page limit) that reduce the administrative burden on applicants and reviewers;
  • dedicated review panels that meet briefly and frequently, rather than for one to two days three times a year;
  • review turnaround measured in days to weeks rather than months.

If these changes prove successful, the NIH should consider applying similar changes to other major research grant programs.

Approach 2. Direct NIH institutes and centers to independently develop and deploy programs with faster funding timelines using Other Transaction Authority (OTA).

Compared to reforming an existing mechanism, the creation of institute-specific fast funding programs would allow for context-specific implementation and cross-institute comparison. This could be accomplished using OTA—the same authority used by the NIH to implement COVID-19 response programs. Since 2020, all ICs at the NIH have had this authority and may implement programs using OTA with approval from the director of NIH, though many have yet to make use of it.

As the discussion above suggests, the NIA, the National Institute on Drug Abuse (NIDA), and NIAID would be prime candidates for the roll-out of faster funding. In particular, these new programs could focus on responding to time-sensitive research needs within each institute or center’s area of focus—such as health crises or replication of linchpin findings—that would provide large public benefits. To maintain this focus, these programs could restrict investigator-initiated applications and only issue funding opportunity announcements for areas of pressing need.

To enable faster peer review of applications, ICs should establish one or more new study sections within their Scientific Review Branch dedicated to rapid review, similar to how the RADx program had its own dedicated review committees. Reviewers who join these study sections would commit to short meetings on a monthly or bimonthly basis rather than meeting three times a year for one to two days as traditional study sections do. Additionally, as recommended above, these new programs should have a three-page limit on applications to reduce the administrative burden on both applicants and reviewers.

In this framework, we propose that the ICs be encouraged to direct at least one percent of their budgets to establishing new research programs with faster funding processes. Even one percent of an annual budget is sufficient to launch an initial fast grants program. For example, the National Institute on Aging had an operating budget of $4 billion in the 2022 fiscal year; one percent of this budget would constitute $40 million for faster funding initiatives, on the order of the initial budgets of Impetus Grants and Fast Grants ($25 million and $50 million, respectively).
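
As a back-of-the-envelope check using only the figures cited in this memo, the proposed set-aside compares as follows:

```python
# One percent of the NIA's FY2022 operating budget versus the initial
# budgets of Impetus Grants and Fast Grants (all figures from this memo).
nia_budget_fy2022 = 4_000_000_000        # ~$4 billion
set_aside = 0.01 * nia_budget_fy2022     # proposed one percent allocation

print(f"NIA one percent set-aside:     ${set_aside / 1e6:.0f}M")
print("Impetus Grants initial budget: $25M")
print("Fast Grants initial budget:    $50M")
```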

NIH ICs should develop success criteria in advance of launching new fast funding programs. If the success criteria are met, they should gradually increase the budget and expand the scope of the program by allowing for investigator-initiated applications, making it a real alternative to R01 grants. A precedent for this type of grant program growth is the Maximizing Investigators’ Research Award (MIRA) (R35) grant program within the National Institute of General Medical Sciences (NIGMS), which set the goal of funding 60% of all R01 equivalent grants through MIRA by 2025. In the spirit of fast grants, we recommend setting a deadline on how long each institute can take to establish a fast grants program to ensure that the process does not extend for too many years.

Additional recommendation. Congress should initiate a Government Accountability Office report to illuminate the outcomes and learnings of governmental fast funding programs during COVID-19, such as RADx.

While a number of published papers cite RADx funding, the program’s overall impact and efficiency haven’t yet been assessed. The agency’s response during the earliest weeks of the pandemic likewise isn’t yet well understood, though it likely played an important role in the national response. Illuminating the learnings of these interventions would greatly benefit future emergency fast funding programs.

Conclusion

The NIH should become a reliable agent for quickly mobilizing funding to address emergencies and accelerating solutions for longer-term pressing issues. At present, no funding mechanisms within the NIH or its institutes enable them to react to such matters rapidly. However, both private and governmental initiatives show that fast funding programs are not only possible but can be extremely successful. Given this, we propose the creation of permanent fast grants programs within the NIH and its institutes, based on learnings from past initiatives.

The changes proposed here are part of a larger effort from the scientific community to modernize and accelerate research funding across the U.S. government. In the current climate of rapidly advancing technology and increasing global challenges, it is more important than ever for U.S. agencies to stay at the forefront of science and innovation. A fast funding mechanism would enable the NIH to be more agile and responsive to the needs of the scientific community and would greatly benefit the public through the advancement of human health and safety.

Frequently Asked Questions
What actions, besides RADx, did the NIH take in response to the COVID-19 pandemic?

The NIH released a number of Notices of Special Interest to allow emergency revision to existing grants (e.g., PA-20-135 and PA-18-591) and a quicker path for commercialization of life-saving COVID technologies (NOT-EB-20-008). Unfortunately, repurposing existing grants reportedly took several months, significantly delaying impactful research.

What does the current review process look like?

The current scientific review process at the NIH involves multiple stakeholders. There are two stages of review, the first conducted by a Scientific Review Group that consists primarily of nonfederal scientists. Typically, Center for Scientific Review committees meet three times a year for one or two days, so initial review begins only about four months after proposal submission. Special Emphasis Panel meetings, which are not recurring, take even longer due to panel recruitment and scheduling. The institute and center National Advisory Councils or Boards are responsible for the second stage of review, which usually happens after revision and appeals, taking the total timeline to approximately a year.
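
A rough reconstruction of that end-to-end timeline from the stage durations described above; the revision and council durations are assumptions chosen to be consistent with the roughly one-year total, not NIH figures:

```python
# Approximate conventional NIH review timeline, in months. The initial
# review wait is the figure cited above; the other stages are assumptions.
stages_months = {
    "wait for initial study section review": 4,
    "revision and appeals": 4,          # assumption: one revision cycle
    "advisory council review and award": 4,  # assumption
}
total = sum(stages_months.values())
print(f"Approximate time from submission to award: {total} months")
```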

Is there evidence for the NIH’s current approach to scientific review?

Because of the difficulty of empirically studying drivers of scientific impact, there has been little research evaluating peer review’s effects on scientific quality. A Cochrane systematic review from 2007 found no studies directly assessing review’s effects on scientific quality, and a 2018 RAND review of the literature found a similar lack of empirical evidence. A few more recent studies have found modest associations between NIH peer review scores and research impact, suggesting that peer review may indeed successfully identify innovative projects. However, such a relationship still falls short of demonstrating that the current model of grant review reliably leads to better funding outcomes than alternative models. Additionally, some studies have demonstrated that the current model leads to variable and conservative assessments. Taken together, we think that experimentation with models of peer review that are less burdensome for applicants and reviewers is warranted.

One concern with faster reviews is lower-quality science. How do you ensure high-quality science while keeping fast response times and short proposals?

Intuitively, it seems that having longer grant applications and longer review processes ensures that both researchers and reviewers expend great effort to address pitfalls and failure modes before research starts. However, systematic reviews of the literature have found that reducing the length and complexity of applications has minimal effects on funding decisions, suggesting that the quality of resulting science is unlikely to be affected. 


Historical examples also suggest that the quality of an endeavor is largely uncorrelated with its planning time. It took Moderna just 45 days from the publication of the COVID-19 genome to submit the mRNA-1273 vaccine to the NIH for use in its Phase 1 clinical study. Such examples exist within government too: during World War II, the National Defense Research Committee set a record by reviewing and authorizing grants within one week, funding work that led to the DUKW, Project Pigeon, the proximity fuze, and radar.

Recent fast grant initiatives have also produced high-quality outcomes. With its short applications and next-day response times, Fast Grants enabled:

  • detection of new concerning COVID-19 variants before other sources of funding became available;
  • work showing that saliva-based COVID-19 tests can perform just as well as those using nasopharyngeal swabs;
  • drug-repurposing clinical trials, one of which identified a generic drug that reduced hospitalization from COVID-19 by ~40%;
  • research into “Long COVID,” which is now being followed up with a clinical trial on the ability of COVID-19 vaccines to improve symptoms.

Impetus Grants focused on projects with longer timelines but still led to a number of important preprints within a year of application.

With the heavy toll that resource-intensive approaches to peer review take on the speed and innovative potential of science—and the early signs that fast grants lead to important and high-quality work—we feel that the evidentiary burden should be placed on current onerous methods rather than the proposed streamlined approaches. Without strong reason to believe that the status quo produces vastly improved science, we feel there is no reason to add years of grant writing and wait times to the process.

Why focus on the NIH, as opposed to other science funding agencies?

The adoption of faster funding mechanisms would indeed be valuable across a range of federal funding agencies. Here, we focus on the NIH because its budget for extramural research (over $30 billion per year) represents the single largest source of science funding in the United States. Additionally, the NIH’s umbrella of health and medical science includes many domains that would be well-served by faster research timelines for proof-of-concept studies—including pandemics, aging, opioid addiction, mental health, cancer, etc.

Project BOoST: A Biomanufacturing Test Facility Network for Bioprocess Optimization, Scaling, and Training

Summary

The U.S. bioeconomy commands millions of liters of bioproduction capacity, but only a tiny fraction of this capacity supports process optimization. Companies of all sizes face intense competitive and financial pressures that limit their ability to commit resources to these important efforts. Consequently, the biomanufacturing industry is often forced to juggle sensitive, brittle production processes that don’t scale easily and are prone to disruption. As recent failures at prominent companies demonstrate, this increases risk for the entire bioeconomy, and especially for the development of new companies and products.

To remedy this, the Department of Commerce should first allocate $80 million to seed a bioproduction R&D facility network that provides process optimization capability to the greater bioeconomy, followed by a $30 million process optimization challenge wherein participating facilities compete at workflow optimization, scaling, and transfer. Part one of the proposal requires rapid development, with the initial R&D facility network of four sites starting bioprocessing operations within 12 months of award. In addition to training workers for the greater bioeconomy, the facility network’s services would be available on a contract basis to any company at any stage of maturity. This work could include optimization for yield, scaling, process resilience, and/or technology transfer—all critical needs across the sector. After federal government startup funding, the network would transition toward financial independence, increasingly running on revenue from process optimization work, workforce training, and state/local support.

Part two of the plan lays out a biomanufacturing “Grand Challenge” in which participating network facilities compete to optimize a standardized biomanufacturing process. Prioritizing process resilience, security, and transferability in addition to yield, this effort would help set a new industry standard for what process optimization really means, while also demonstrating what the network facilities can accomplish. With this demonstration of value, demand for facility services in other geographic locations would increase, spurring the growth of the facility network across the country.

By the end of the program, the U.S. biomanufacturing sector would see a number of benefits, including easier process innovation, a larger and better trained workforce, shortened product time to market, and reduced production risks.

Challenge & Opportunity

Biological products are, by definition, made by means of complex biological processes carried out by sensitive—some might even say fickle—microorganisms and cell lines. Determining the right steps and conditions to induce a microbe into producing a given product at a worthwhile yield is an arduous task. And once produced, the product needs to be extensively processed to make it pure, stable, and safe enough for shipping and use. Working out this entire production workflow takes a great deal of time, energy, and expertise, and the complexity of production workflows increases alongside the complexity of biological products. Many products fail at this point in development, keeping beneficial products out of the hands of end users and cutting off constructive contributions—revenue, jobs—to the larger bioeconomy. 

Once a bioproduction process is worked out at an R&D level, it must be scaled up to a much larger commercial level—another major point of failure for academic and commercial programs. Scaling up requires different equipment with its own controls and idiosyncrasies, each generating additional, sometimes unpredictable, complexities that must be corrected for or managed. The biomanufacturing industry has been asking for help with process scaling for years, and recent national initiatives, such as the National Institute for Innovation in Manufacturing Biopharmaceuticals (NIIMBL) and the BioIndustrial Manufacturing and Design Ecosystem (BioMADE), have sought to address this strategic need.

Each step on this road to the end market represents a chance for failure, and the risks are so high that the road is littered with failed companies that had a promising product that just couldn’t be made reliably, or a brittle production process that blew up when performed at scale. The overarching competitive commercial environment doesn’t help, as new companies must rush from concept to production, often cutting corners along the way. Meanwhile, mature biomanufacturing companies often nurse small profit margins and must aggressively guard existing revenue streams, leaving little or no spare capacity to innovate and improve processes. All of these factors result in production workflows that are hastily constructed, poorly optimized, prone to scaling difficulties, and vulnerable to failure at multiple points. When—not if—process failures occur, the entire economy suffers, often in catastrophic ways. In the last several years alone, such failures have been witnessed at Emergent BioSolutions, Dr. Reddy’s, and Abbott, with any number of downstream effects. Society, too, misses out when more sustainable, environmentally friendly production methods are overlooked in favor of older, less efficient but more familiar ones.

There is an urgent need for a national network of biomanufacturing facilities dedicated to process optimization and scaling—critical efforts that are too often overlooked or hastily glossed over, to the detriment of the entire bioeconomy. Available to any company at any stage of maturity, this facility network would operate on a contract basis to optimize biological production processes for stability, resilience, and technology transfer. The facilities would also assist with yield optimization and incorporate specialized equipment designed to aid scale-up risk analysis. Once established with government funding, the facility network would stand on its own, running on contract fees for process optimization and scale-up efforts. As demand for services grows, the facility network model could spread geographically to support other markets.

This is a highly opportune time for such a program. The COVID-19 pandemic has highlighted the essential importance of biomanufacturing capabilities—extending to the geopolitical level—as well as the fragility of many supply chains and processes. In response, the CHIPS and Science Act and the Executive Order on Advancing Biotechnology and Biomanufacturing, among others, have provided directives to shore up U.S. biomanufacturing capacity and resilience. Project BOoST seeks to meet those directives while also building a workforce to support broader participation in a strong national bioeconomy.

Plan of Action

Project BOoST encompasses a $110 million ask spread out over four years and two overlapping phases: a first phase that quickly stands up a four-facility network to perform biomanufacturing process optimization, and a second phase that establishes a biomanufacturing “Grand Challenge” wherein facilities compete in the optimization of a standardized bioproduction process. 

Phase I: Establishing the facility network

The Department of Commerce should allocate $80 million over three years to establish the initial facility network at four sites in different regions of the country. The program would be structured as a competitive contract, with a preference for contract bidders who:

Possible funding pathways include one of the bio-related Manufacturing Innovation Institutes (MIIs), such as NIIMBL, BioMADE, or BioFabUSA. At a minimum, partnerships would be established with these MIIs to disseminate helpful information gained from the facility network. The National Institute of Standards and Technology (NIST) could also be helpful in establishing data standards for technology transfer. The Bioeconomy Information Sharing and Analysis Center (BIO-ISAC) would be another important partner organization, helping to inform the facilities’ efforts to increase both cyber resilience of workflows and industry information sharing.

Funds would be earmarked for initial startup expenditures, including lease/purchase of appropriate buildings, equipment purchases, and initial salaries/benefits of operating personnel, trainers, and program support. Funding milestones would be configured to encourage rapid movement, including:

Since no actual product made in these facilities would be directed toward regulated use (e.g., food, medical), there would likely be reduced need to build and operate the facilities at full Current Good Manufacturing Practice (CGMP) specification, allowing for significant time and cost savings. Of course, the ultimate intent is for optimized and scaled production processes to migrate back to regulated use where relevant, but process optimization need not be done in the same environment. Regardless, the facilities would be instrumented so as to facilitate bidirectional technology transfer. With detailed telemetry of processes and data traffic collected in a standardized manner from the network’s sites, organizations would have a much easier time bringing optimized, scaled processes from these facilities out to commercial production. This would result in faster parameter optimization, improved scale-up, increased workflow resilience, better product assurance, and streamlined tech transfer—all of which are major impediments and risks to growth of the U.S. bioeconomy.
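
To make the idea of standardized telemetry concrete, below is a minimal sketch of what a shared record format might look like. Every field name here is a hypothetical assumption for illustration, not a proposed standard:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class BioprocessTelemetryRecord:
    # All fields are illustrative assumptions, not a proposed standard.
    site_id: str             # anonymized facility identifier
    run_id: str              # unique identifier for a production run
    timestamp: str           # ISO 8601, UTC
    step: str                # e.g., "fermentation", "purification"
    vessel_volume_l: float   # working volume, liters
    temperature_c: float     # broth temperature, degrees Celsius
    ph: float
    dissolved_oxygen_pct: float

record = BioprocessTelemetryRecord(
    site_id="site-03",
    run_id="run-0142",
    timestamp=datetime.now(timezone.utc).isoformat(),
    step="fermentation",
    vessel_volume_l=1000.0,
    temperature_c=30.2,
    ph=6.8,
    dissolved_oxygen_pct=35.0,
)
print(json.dumps(asdict(record), indent=2))
```

Capturing the same fields in the same format at every site is what would make cross-facility comparison and bidirectional technology transfer tractable.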

Process optimization and scaling work would be accomplished on a contract basis with industry clients, with robust intellectual property protections in place to guard trade secrets. At the same time, anonymized information and techniques gathered from optimization and scaling efforts would be automatically shared with other sites in the network, enabling shared learning across the network and more rapid method development. These lessons learned would also be organized and published to the relevant industry organizations, allowing these efforts to lift all boats across the bioeconomy. In this way, even a facility that failed to achieve economic self-sustainability would still make significant contributions to the industry knowledge base.

Focused on execution speed, each facility would be a public-private consortium, bringing together regional companies, universities, state and local governments, and others to create a locus of education, technology development, and job creation. In addition to hewing to provisions within the CHIPS and Science Act, this facility network would also match the “biomanufacturing infrastructure hubs” recommendation from the President’s Council of Advisors on Science and Technology.

Using the Regional Technology and Innovation Hubs model laid out in the CHIPS and Science Act, the facilities would be located near to, but not within, leading biotechnology centers, with an eye to benefiting small and rural communities where possible. All the aforementioned stakeholders would have a say in site location, with location criteria including: 

Although some MIIs have innovation acceleration and/or improving production availability within their charters, to date no production capacity has been built specifically to address the critical issues of process optimization and scaling. Project BOoST would complement the ongoing work of the bio-focused MIIs. And since the aforementioned risks to the bioeconomy represent a strategic threat today, this execution plan is intentionally designed to move rapidly. Locating network facilities outside of costly metropolitan areas and avoiding the need for full CGMP certification means that an individual facility could be spun up in months rather than years, and at much lower cost. These facilities would quickly be able to offer their benefits to industry, local economies, and workers looking to train into a growing job sector.

Phase II: Scale-up challenge

Approximately 30 months after program start, facilities that meet the aforementioned funding milestones and demonstrate continuous movement toward financial self-sustainability (as measured by a shift from federal to state, local, and industry support) would be eligible to participate in an additional $30 million, 18-month scale-up challenge. Participating facilities would receive a reference production workflow and compete at workflow optimization, scaling, and transfer.

In contrast to previous Grand Challenges, which typically have a unifying theme (e.g., cancer, clean energy) but relatively open goals and means, Project BOoST would be hyperfocused to ensure a high degree of applicability and benefit to the biomanufacturing industry. The starting reference production workflow would be provided at lab scale, with specifications of materials, processing steps, and instrument settings. From this starting point, participating facilities would be asked to characterize and optimize the starting workflow to produce maximal yield across a broad range of conditions; scale the workflow to a 1,000 L batch level, again maximizing yield; and transfer the workflows at both scales to a competing facility, both for verification and as proof of transferability.

In addition, all competing workflows would be subject to red-teaming by an independent group of biomanufacturing and cybersecurity experts. This examination would serve as an important assessment of workflow resilience in the setting of equipment failure, supply chain issues, cyberattack, and other scenarios.

The winning facilities—represented by their workflows—would be determined by a combination of factors, including optimized yield at both lab and 1,000 L scales, successful transfer and verification of the workflow at a competing facility, and demonstrated resilience under red-team examination.

The end result would be the practical demonstration and independent verification of the successful optimization, scale-up, and transfer of a production process—a major opportunity for learning and knowledge sharing across the entire industry.
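
One way the judges could combine these factors into a single ranking is a weighted composite score. The sketch below is illustrative only; the factor names follow the challenge description above, and the weights are assumptions, not proposed challenge rules:

```python
# Hypothetical composite scoring for the scale-up challenge.
# Weights are illustrative assumptions only.
WEIGHTS = {
    "lab_scale_yield": 0.25,
    "scaled_yield_1000l": 0.30,
    "transfer_success": 0.25,
    "red_team_resilience": 0.20,
}

def challenge_score(factor_scores: dict) -> float:
    """Weighted sum of factor scores, each normalized to the 0-1 range."""
    return sum(WEIGHTS[name] * factor_scores[name] for name in WEIGHTS)

example = {
    "lab_scale_yield": 0.9,
    "scaled_yield_1000l": 0.7,
    "transfer_success": 0.8,
    "red_team_resilience": 0.6,
}
print(f"Composite score: {challenge_score(example):.2f}")
```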

Conclusion

Scientific innovation and advanced automation in biomanufacturing represent a potent double-edged sword. While they have allowed for incredible advances in biomanufacturing capability and capacity—to the benefit of all—they have also created complexities and dependencies that together constitute a strategic risk to the bioeconomy. This risk is significant, with process failures already making national headlines through company collapses and congressional investigations. We propose to act now to create a biomanufacturing facility network dedicated to making production workflows more robust, resilient, and scalable, with a plan strongly biased toward rapid execution. Bringing together commercial entities, educational institutions, and multiple levels of government, Project BOoST will quickly create jobs, provide workforce development opportunities, and strengthen the bioeconomy as a whole.

Frequently Asked Questions
What differentiates Project BOoST from other facilities and networks proposed by MIIs, current Centers for Innovation in Advanced Development and Manufacturing (CIADMs), and the Department of Defense (DoD) authorization to support bioindustrial R&D included in the National Defense Authorization Act?

Time frame to start of facility operations
  • Project BOoST: estimated 12 months from funding
  • MIIs: unknown—as of yet no new ground broken
  • CIADMs: already operational, although only one facility surviving
  • DoD/NDAA: unknown—plan to meet the goals of the act due June 2023

Geographic location
  • Project BOoST: targeting small and rural communities
  • MIIs: unknown
  • CIADMs: mix of urban and less urban
  • DoD/NDAA: unknown

Scope
  • Project BOoST: process optimization, resilience, and scaling, including scale-up risk assessment
  • MIIs: TRL acceleration in nonmedical products (DoD MII); accelerating biopharmaceutical innovation (DOC MII)
  • CIADMs: maintenance of critical product stockpiles; reserve production capacity
  • DoD/NDAA: research into new methods, capacity building, scaling

Financial model
  • Project BOoST: initial government funding with transition to self-sufficiency
  • MIIs: government funding plus partner contributions
  • CIADMs: persistent government funding
  • DoD/NDAA: unknown

Will this effort address supply chain threats?

Yes. Supply chain resilience will be a constant evaluation criterion through the program. A more resilient workflow in this context might include onshoring and/or dual sourcing of critical reagent supplies, establishing on-site reserves of single-point-of-failure equipment, maintaining backups of important digital resources (e.g., software, firmware, ladder logic), and explicitly rehearsing failure recovery procedures.

What kind of workforce training opportunities would be available at these facilities?

While the specifics will be left up to the contract bidders, we recommend training programs ranging from short, focused trainings for people already in the biomanufacturing industry to longer certificate programs that can give a trainee the basic suite of skills needed to function in a skilled biomanufacturing role.

Why can’t industry address these issues on its own?

They would if they could. On a fundamental level, the U.S. economic system makes the biomanufacturing industry intensely competitive. Industry organizations, whether large or small, must be primarily concerned with some combination of developing new products and producing those products. They are unable to devote resources to more strategic efforts like resilience, data standards, and process assurance simply because energy and dollars spent there mean less to put toward new product development or increased production capacity. (Contrast this with a country like China, where the government can more easily direct industry efforts in a given direction.) Revolutionary change and progress in U.S. biomanufacturing require the public sector to step up and solve some of these holistic, longer-term challenges.

Advancing the U.S. Bioindustrial Production Sector

Summary

The bioindustrial production of chemicals and other goods is a critical sector of the U.S. bioeconomy. However, the sector faces challenges such as drawn-out research and development timelines, low profit margins, the requirement to produce and sell product in vast quantities over long periods of time, and barriers to accessing scale-up capacity. Companies can find it challenging to rapidly exchange helpful knowledge, attract early-stage investors, access pilot-scale infrastructure to generate evidence that forecasts cost-effective production at scale, and obtain the financing to build or access domestic commercial-scale bioproduction, biomanufacturing, and downstream bioprocessing infrastructure and facilities.

The federal government has already recognized the need to take action to sustain and extend U.S. leadership in biotech and biomanufacturing. The recent Executive Order on advancing the U.S. bioeconomy and relevant provisions in the CHIPS and Science Act and the Inflation Reduction Act have put forward high aspirations, as well as some funding, that could help stimulate the biotech and biomanufacturing ecosystem.

The U.S. government should create a Bio for America Program Office (BAPO) at the National Institute for Standards and Technology (NIST) to house a suite of initiatives that would lead to the creation of more well-paying U.S.-based biomanufacturing jobs, spur economic growth and development in areas of the country that haven’t historically benefited from biotech or biomanufacturing, and ensure more resilient U.S. supply chains, the more sustainable production of chemicals and other goods, and enhanced U.S. competitiveness.

Challenge and Opportunity

The bioeconomy—the part of the economy driven by the life sciences and biotech, and enabled by engineering, computing, and information science—has the potential to revolutionize human health, climate and energy, food security and sustainability, and supply chain stability, as well as support economic growth and well-paying jobs across the country. Indeed, the sector has already produced many breakthroughs, such as mRNA vaccines that help counter the devastating impacts of COVID-19 and genetically engineered microbes that provide nutrients to crops without the pollution associated with traditional fertilizers. Valued at over $950 billion, the U.S. bioeconomy accounts for more than five percent of the U.S. gross domestic product—more than the contribution from the construction industry and on par with the information sector.

However, without sufficient federal support and coordination, the U.S. risks ceding economic, national security, and societal benefits provided by a strong bioeconomy to competitors that are implementing cohesive strategies to advance their bioeconomies. For example, China aims to dominate the 21st-century bioeconomy and has prioritized the growth of its bioeconomy in its five-year plans. From 2016 to July 2021, the market value of publicly listed biopharmaceutical innovators from China increased approximately 127-fold across several major stock exchanges, to more than $380 billion, with biotechnology companies accounting for more than 47 percent of that valuation.

Bioindustrial manufacturing (nonpharmaceutical) is a critical segment of the bioeconomy but faces low profit margins combined with the need to produce and sell product in vast quantities over long timelines. It is challenging for companies to translate research and development into commercially viable efforts and attract investors to finance access to or construction of domestic bioproduction, biomanufacturing, and downstream bioprocessing infrastructure and facilities such as fermentation tanks and bioreactors. Furthermore, many biotech and synthetic biology companies face difficulty acquiring capital for scale-up, whether that requires building custom demonstration- or commercial-scale infrastructure, contracting with fee-for-service bioproduction organizations to outsource manufacturing in external facilities, or retooling existing equipment.

All this has the potential to lead to yet more instances of “designed in America, made elsewhere”: microbes engineered by U.S. companies to fabricate chemicals or other products could end up being used for commercial-scale production abroad, a poor recipe for domestic economic growth and quality of life. Domestic manufacturers should be executing bioindustrial production so that more well-paying jobs are accessible in the U.S., with the added benefit of a more stable supply chain that bolsters U.S. national and economic security.

The federal government has recognized the need for U.S. leadership in biotech and biomanufacturing: the recent Executive Order on advancing the U.S. bioeconomy and relevant provisions in the CHIPS and Science Act and the Inflation Reduction Act (IRA) provide high-level aspirations and some actual dollars to bolster the biotech and biomanufacturing ecosystem. Indeed, some funds appropriated in the IRA could be used to meet biomanufacturing goals set in the CHIPS and Science Act and the EO.

To reach its full potential, bioindustrial manufacturing requires additional support at various levels and may need hundreds of billions of dollars in capital. There is an opportunity for the U.S. government to be intentional about accelerating the growth of the bioindustrial manufacturing sector and reaping the economic, national security, and societal benefits that would come with it.

Public-private partnerships aimed at providing resources and capital for experimental development at early-stage companies, as well as for techno-economically sound bioindustrial production scale-up and commercialization projects, would be a strong signal that the federal government is serious about leveraging bioindustry to meet human health, climate and energy, food security and sustainability, and supply chain stability needs, as well as to support economic growth and well-paying jobs. Many of the investments by U.S. taxpayers would be matched and multiplied by investments from nongovernment sources, amplifying the impact and generating a high return on investment for Americans in the form of well-paying jobs, breakthrough products, and more stable supply chains. Furthermore, the investment would show that the U.S. is committed to leveraging advanced manufacturing to raise quality of life for Americans and retain leadership in biotech and biomanufacturing.

Plan of Action

This plan focuses on four initiatives that address specific challenge points:
  • a Bioindustrial Production Precompetitive Consortium to address the measurement, tool, and standards needs of the sector;
  • a U.S. Bioindustrial Production Investment Portfolio to channel early-stage capital toward domestic companies;
  • a Bioindustrial Production Scale-up Infrastructure Group to assess and expand pilot- and intermediate-scale capacity; and
  • a Bioindustrial Production Loan Programs Office to finance demonstration- and commercial-scale infrastructure.

The proposed initiatives could all be housed in a new office at NIST called the Bio for America Program Office (BAPO), which would collaborate closely with the Office of the Secretary of Commerce and the Under Secretary of Commerce for Standards and Technology, as well as additional government and nongovernmental stakeholders as appropriate. NIST would be an effective home for the BAPO given that it harbors cross-disciplinary expertise in engineering and the physical, information, chemical, and biological sciences; is a nonregulatory agency of the U.S. Department of Commerce, whose mission it is “to drive U.S. economic competitiveness, strengthen domestic industry, and spur the growth of quality jobs in all communities across the country”; and serves as a neutral convener for industry consortia, standards development organizations, federal labs, universities, public workshops, and interlaboratory comparability testing.

Bioindustrial Production Precompetitive Consortium

NIST should establish a Consortium, coordinated out of BAPO, to address the measurements, tools, and standards needed to advance both research and commercial bioindustrial products. The Consortium would convene industry, academia, and government to identify and address measurement, tool, and standards needs; enable members to work with NIST to develop those solutions and standards; leverage NIST expertise; and collaborate with related programs at other federal agencies. The Consortium could rapidly develop relationships with organizations such as the Bioindustrial Manufacturing and Design Ecosystem (BioMADE, a Manufacturing Innovation Institute that is part of the Manufacturing USA network), the Engineering Biology Research Consortium (EBRC), SynBioBeta, the Alternative Fuels & Chemicals Coalition (AFCC), the Synthetic Biology Coalition, the Joint BioEnergy Institute, the Advanced Biofuels and Bioproducts Process Development Unit at Lawrence Berkeley National Laboratory, and the National Renewable Energy Laboratory’s pilot-scale Integrated Biorefinery Research Facility. It would also be useful to communicate with efforts in the biopharmaceutical space such as the Biomedical Advanced Research and Development Authority (BARDA) and the National Institute for Innovation in Manufacturing Biopharmaceuticals.

The benefits to members would include access to a neutral forum for addressing precompetitive needs; participation in the development of experimental benchmarks, guidelines, and terminology; access to tools developed by the consortium ahead of public release; and institutional representation on the consortium steering committee. Members would contribute an annual fee of, for example, $20,000 or in-kind support of equivalent value, as well as sign a Cooperative Research and Development Agreement.

Congress should initially appropriate $20 million over five years to support the Consortium’s activities, and the Consortium could launch by putting forward a Notice of Consortium establishment and letter of interest form.
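
A quick sketch of how membership fees could supplement the proposed appropriation over time; the membership counts below are hypothetical:

```python
# Federal support is $20M over five years; members pay $20,000 per year
# (both figures from this memo). Membership counts are hypothetical.
annual_appropriation = 20_000_000 / 5   # $4M per year
fee_per_member = 20_000

for members in (50, 100, 200):
    fee_revenue = members * fee_per_member
    share = fee_revenue / annual_appropriation
    print(f"{members:>3} members -> ${fee_revenue / 1e6:.1f}M/yr "
          f"({share:.0%} of the annual federal contribution)")
```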

U.S. Bioindustrial Production Investment Portfolio

Early-stage companies are the engine of U.S. job creation, regional economic development, and technological innovation. A more consistent, yet scrupulous, source of funding for nascent companies in the bioindustrial production space would be catalytic. Using the BARDA Ventures–Global Health Investment Corporation (GHIC) Global Health Security Portfolio public-private partnership as a model, the U.S. government, coordinating both existing and new appropriations via BAPO Ventures, should seed a nonprofit partnership manager to launch a U.S. Bioindustrial Production Investment Portfolio. The portfolio would crowd in additional capital and invest in early-stage domestic bioindustrial production companies whose sound metrics and credible techno-economic analyses show they are on the path to product commercialization and profitability.

The portfolio’s nonprofit partnership manager should be empowered to crowd in capital using return augmentation and risk mitigation incentives as it sees fit. Measures that a venture fund could take to incentivize coinvestment could include but are not limited to:

Launching a U.S. Bioindustrial Production Investment Portfolio requires the following four steps.

1. Both existing and newly appropriated federal funds should be used to seed investments in the U.S. Bioindustrial Production Investment Portfolio.

Existing appropriations: Appropriations have already been made to some federal agencies through the IRA and other vehicles that could be used to seed the portfolio. BAPO Ventures should coordinate with the interagency, and agencies with available funds could contribute to directly seeding the portfolio. Some examples of existing funds to coordinate include:

New appropriations: Congress should appropriate $500 million in new funding for BAPO Ventures over five years to support BAPO Ventures personnel and operations and augment the portfolio. These funds would be critical since they could be applied for all-purpose venture capital investments in early-stage bioindustrial production companies. Congress should also grant BAPO Ventures, as well as other agencies or programs, any authority needed to transfer funds to the portfolio for these purposes.

2. Identify the nonprofit partnership manager.

BAPO Ventures should solicit proposals for an existing nonprofit partner to manage the U.S. Bioindustrial Production Investment Portfolio. Selection should be based on demonstrated track record of experience with and successful venture investments in bioindustrial manufacturing or a closely related space. Potential nonprofit partners include Breakthrough Energy Catalyst or America’s Frontier Fund. GHIC should also be consulted. 

3. Transfer funds from the appropriate U.S. government programs to funds within the portfolio and support the nonprofit partnership manager in crowding in additional capital.

The nonprofit partnership manager will recruit capital from nonfederal sources into the portfolio’s different funds, with the aim of matching or exceeding the dedicated public funds to generate a multiplier effect and access even more capital. Capital from investors willing to take on risk comparable to venture capital would be the most viable target.
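
The intended multiplier effect can be made concrete with a simple sketch, using the $500 million in proposed new appropriations as the public seed; the match ratios are hypothetical, not targets:

```python
# Total capital deployed as a function of the nonfederal match ratio.
# Public seed is the $500M proposed above; ratios are hypothetical.
public_seed = 500_000_000

for match_ratio in (1.0, 2.0, 3.0):   # nonfederal dollars per federal dollar
    total = public_seed * (1 + match_ratio)
    print(f"{match_ratio:.0f}:1 match -> ${total / 1e9:.1f}B total capital")
```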

4. The nonprofit partnership manager will use the portfolio’s funds to invest in U.S. early-stage bioindustrial production companies on the basis of sound techno-economic analyses and robust metrics.

The nonprofit partnership manager would invest in bioindustrial production companies that commit to hiring and manufacturing domestically and to making products useful to Americans and the country, basing investment decisions on robust techno-economic analyses of the companies’ commercial potential and the expectation of returns. BAPO Ventures, as well as NIST writ large, would be accessible for technical assistance if necessary. The nonprofit partnership manager would structure investments with co-funding from additional nonfederal investors. As this public-private partnership generates investment returns, proceeds from the BAPO Ventures funding would be returned to the portfolio and its funds for reinvestment and sustainment of BAPO Ventures. If this evergreen fund begins to compete with, rather than incentivize, private market funding, or otherwise becomes unproductive, it should be tapered off and/or sunset.

Bioindustrial Production Scale-up Infrastructure Group

It is critical for early-stage bioindustrial production companies to gather evidence on whether their production processes have the potential to be commercially viable at scale. To learn this, companies need access to pilot- and intermediate-scale bioindustrial production infrastructure like fermenters and bioreactors, as well as modern downstream bioprocessing equipment. BAPO should house a Bioindustrial Production Scale-up Infrastructure Group (BPSIG), which, as an initial step, would work with both the interagency and nonfederal partners to conduct a comprehensive analysis of the U.S. bioindustrial production pilot- and intermediate-scale infrastructure landscape, with the aim of informing a precision strategy for most effectively leveraging federal resources.

The BPSIG would aim to complete the landscape analysis in three months, seeking to understand deficiencies in capacities such as the different volumes of fermenters and bioreactors that are accessible (and the costs associated with their use) and modular downstream bioprocessing equipment accessibility. They should also identify existing facilities that have accessible capacity, such as corporations’ sites where capacity might be rented, toll facilities, or facilities that could be retooled or rehabilitated to provide the necessary pilot-scale capacity. BPSIG should engage with organizations such as Capacitor, the Bioprocess to Product Network, Royal DSM, DuPont, Cargill, BioMADE, Battelle, MITRE, and the Advanced Biofuels and Bioproducts Process Development Unit at Lawrence Berkeley National Laboratory when performing this evaluation.

If the assessment concludes that retooling existing sites or building new pilot- or intermediate-scale infrastructure is necessary, and that government support would be catalytic, some funds would already be available via existing appropriations, and new appropriations might also be necessary. Appropriations have already been made to some federal agencies through the IRA and other vehicles that could be coordinated by the BPSIG. BPSIG should coordinate with the interagency, and agencies with available funds could contribute directly to building the network. Existing funds to leverage include:

Additionally, Congress may need to make appropriations directly to BPSIG, which BPSIG could then allocate to other federal financing programs for retooling or building any additional pilot- or intermediate-scale bioindustrial production infrastructure projects outside the scope of existing pools of already-appropriated funds.

Bioindustrial Production Loan Programs Office

To ensure that techno-economically sound bioindustrial production companies can secure financing for demonstration- or commercial-scale infrastructure and equipment needs, Congress should establish an initiative within BAPO called the Bioindustrial Production Loan Programs Office (BPLPO) that replicates and improves upon the DOE Loan Programs Office (LPO) model. The BPLPO would be tailored to the bioindustrial production segment, without other agencies’ science or technology mission constraints (for instance, energy), offering flexible debt instruments and supporting large-scale projects. For example, assistance in the form of loan guarantees would help underwrite debt associated with launching bioproduction plants.

Coordination with the DOE LPO, the DOE Office of Clean Energy Demonstrations, the U.S. Small Business Administration, the relevant U.S. Department of Agriculture loan programs, and other government agencies and offices would be key to avoid duplicating efforts and to incorporate lessons learned and best practices from existing efforts. Congress should appropriate an initial $5 billion for the BPLPO, authorizing the program for an initial 10 years.

Conclusion

Launching a suite of public-private partnerships to advance domestic bioproduction would create more well-paying biomanufacturing jobs in the U.S., expand economic opportunity across the country by spreading the biotech and biomanufacturing footprint into nontraditional areas, produce more high-quality chemicals and goods in the U.S., and help meet national and economic security needs, such as strengthened supply chains and more sustainable production methods.

Frequently Asked Questions
Is there precedent for a federal agency–nonprofit venture capital organization public-private partnership in biotech or biomanufacturing?

BARDA, situated within the Department of Health and Human Services Office of the Assistant Secretary for Preparedness and Response, launched BARDA Ventures in June 2021 to “accelerate development and commercialization of technologies and medical products needed to respond to or prevent public health emergencies, such as pandemics, and other health security threats.” BARDA has provided the nonprofit Global Health Investment Corporation (GHIC) tens of millions of dollars. GHIC launched and manages a global health security fund with matching capital from other investors. This partnership allows direct linkage with the investment community and establishes a sustained, long-term effort to identify, nurture, and commercialize technologies that help the U.S. respond effectively to future health security threats.

Could the BARDA-GHIC model be applied to other sectors in addition to bioindustrial manufacturing?

Yes. The BARDA-GHIC model can be considered when there is underinvestment from the capital markets in a particular early-stage commercial area.

Why the focus on coordinating existing funds from DoD and DOE?

Some funds have already been appropriated to DoD and DOE that could be used to advance U.S. bioindustrial production. Both agencies are stakeholders in bioindustrial manufacturing, and their missions would benefit from virtually any domestic bioindustrial manufacturing effort.

What are some other opportunities for capital for bioindustrial production?

Capital could be targeted from strategic investors, venture investors with long-term outlooks, and private equity, with growth equity of particular interest. Examples of strategic investors that could be pursued include IndianOil Corp, Petronas, Brookfield, and BASF. Venture investors with longer-term funds, such as Breakthrough Energy Catalyst, would also be candidates for recruiting capital.


The scale-up and commercialization of some bioindustrial production capabilities can be capital intensive; however, standing up bioproduction facilities can cost two to 2,000 times less than chemical facilities, and operating expenses for a bioproduction facility are relatively low, making the return on capital more attractive to capital markets. Even so, returns on these investments should be expected to be long-term in nature. Investments now could help some bioindustrial production operations reach profitability by the mid- to late 2020s, with positive returns on investment likely. In addition to acquiring equity in bioindustrial production companies, some investors may contribute to commercializing the bioindustrial production of those operations’ chemicals or other goods in their regions of influence.

What’s a potential starting point for the equitable and strategic placement of pilot- and intermediate-scale bioindustrial production facilities?

Potential regional targets include the counties of Suffolk, Massachusetts, and Albany, New York, in the Northeast; Warren, Ohio, Johnson, Kansas, and Porter, Indiana, in the Midwest; Denton, Texas, Wake, North Carolina, and Canadian, Oklahoma, in the South; and Yavapai, Arizona, and Honolulu, Hawaii, in the West.

Tilling the Federal SOIL for Transformative R&D: The Solution Oriented Innovation Liaison

Summary 

The federal government is increasingly embracing Advanced Research Projects Agencies (ARPAs) and other transformative research and engagement enterprises (TREEs) to connect innovators and create the breakthroughs needed to solve complex problems. Our innovation ecosystem needs more of these TREEs, especially for societal challenges that have not historically benefited from solution-oriented research and development. And because the challenges we face are so interwoven, these TREEs must work and grow together in a solution-oriented mode.

The National Science Foundation (NSF)’s new Directorate for Technology, Innovation and Partnerships should establish a new Office of the Solution-Oriented Innovation Liaison (SOIL) to help TREEs share knowledge about complementary initiatives, establish a community of practice among breakthrough innovators, and seed a culture for exploring new models of research and development within the federal government. The SOIL would have two primary goals: (1) provide data, information, and knowledge-sharing services across existing TREEs; and (2) explore opportunities to pilot R&D models of the future and embed breakthrough innovation models in underleveraged agencies.

Challenge and Opportunity

Climate change. Food security. Social justice. There is no shortage of complex challenges before us—all intersecting, all demanding civil action, and all waiting for us to share knowledge. Such challenges remain intractable because they are broader than the particular mental models that any one individual or organization holds. To develop solutions, we need science that is more connected to social needs and to other ways of knowing. Our problem is not a deficit of scientific capital. It is a deficit of connection.

Connectivity is what defines a growing number of approaches to the public administration of science and technology, alternatively labeled as transformative innovation, mission-oriented innovation, or solutions R&D. Connectivity is what makes DARPA, IARPA, and ARPA-E work, and it is why new ARPAs are being created for health and proposed for infrastructure, labor, and education. Connectivity is also a common element among an explosion of emerging R&D models, including Focused Research Organizations (FROs) and Distributed Autonomous Organizations (DAOs). And connectivity is the purpose of NSF’s new Directorate for Technology, Innovation and Partnerships (TIP), which includes “fostering innovation ecosystems” in its mission. New transformative research and engagement enterprises (TREEs) could be especially valuable in research domains at the margins, where “the benefits of innovation do not simply trickle down.”

The history of ARPAs and other TREEs shows that solutions R&D is successfully conducted by entities that combine both research and engagement. If grown carefully, such organisms bear fruit. So why just plant one here or there when we could grow an entire forest? The metaphor is apt. To grow an innovation ecosystem, we must intentionally sow the seeds of TREEs, nurture their growth, and cultivate symbiotic relationships—all while giving each the space to thrive.

Plan of Action

NSF’s TIP directorate should create the new Office of the Solution-Oriented Innovation Liaison (SOIL) to foster a thriving community of TREEs. SOIL would have two primary goals: (1) nurture more TREEs of more varieties in more mission spaces; and (2) facilitate more symbiosis among TREEs of increasing number and variety.

Goal 1: More TREEs of more varieties in more mission spaces

SOIL would shepherd the creation of TREEs wherever they are needed, whether in a federal department, a state or local agency, or in the private, nonprofit, or academic sectors. Key to this is codifying the lessons of successful TREEs and translating them to new contexts. Not all such knowledge is codifiable; much is tacit. As such, SOIL would draw upon a cadre of research-management specialists who have deep familiarity with different organizational forms (e.g., ARPAs, FROs, DAOs) and could work with the leaders of departments, businesses, universities, consortia, etc. to determine which form best suits the needs of the entity in question and provide technical assistance in establishing it.

An essential part of this work would be helping institutions create mission-appropriate governance models and cultures. Administering TREEs is neither easy nor typical. Indeed, the very fact that they are managed differently from normal R&D programs makes them special. Former DARPA Director Arati Prabhakar has emphasized the importance of such tailored structures to the success of TREEs. To this end, SOIL would also create a Community of Cultivators comprising former TREE leaders, principal investigators (PIs), and staff. Members of this community would provide those seeking to establish new TREEs with guidance during the scoping, launch, and management phases.

SOIL would also provide opportunities for staff at different TREEs to connect with each other and with collective resources. It could, for example, host dedicated liaison officers at agencies (as DARPA has with its service lines) to coordinate access to SOIL resources and other TREEs and support the documentation of lessons learned for broader use. SOIL could also organize periodic TREE conventions for affiliates to discuss strategic directions and possibly set cross-cutting goals. Similar to the SBIR office at the Small Business Administration, SOIL would also report annually to Congress on the state of the TREE system, as well as make policy recommendations.

Goal 2: More symbiosis among TREEs of increasing number and variety

Success for SOIL would be a community of TREEs that is more than the sum of its parts. It is already clear how the defense and intelligence missions of DARPA and IARPA intersect. There are also energy programs at DARPA that might benefit from deeper engagement with programs at ARPA-E. In the future, transportation-infrastructure programs at ARPA-E could work alongside similar programs at an ARPA for infrastructure. Fostering stronger connections between entities with overlapping missions would minimize redundant efforts and yield shared platform technologies that enable sector-specific advances.

Indeed, symbiotic relationships could spawn untold possibilities. What if researchers across different TREEs could build knowledge together? Exchange findings, data, algorithms, and ideas? Co-create shared models of complex phenomena and put competing models to the test against evidence? Collaborate across projects, and with stakeholders, to develop and apply digital technologies as well as practices to govern their use? A common digital infrastructure and virtual research commons would enable faster, more reliable production (and reproduction) of research across domains. This is the logic underlying the Center for Open Science and the National Secure Data Service.

To this end, SOIL should build a digital Mycelial Network (MyNet), a common virtual space that would harness the cognitive diversity across TREEs for more robust knowledge and tools. MyNet would offer a set of digital services and resources that could be accessed by TREE managers, staff, and PIs. Its most basic function could be to depict the ecosystem of challenges and solutions, search for partners, and deconflict programs. Once partnerships are made, higher-level functions would include secure data sharing, co-creation of solutions, and semantic interconnection. MyNet could replace the current multitude of ad hoc, sector-specific systems for sharing research resources, giving more researchers access to more knowledge about complex systems and fewer obstacles from paywalls. And the larger the network, the bigger the network effects. If the MyNet infrastructure proves successful for TREEs, it could ultimately be expanded more broadly to all research institutions—just as ARPAnet expanded into the public internet. 

For users, MyNet would have three layers:

  • a data layer, in which researchers across programs could pool and combine datasets;

  • an information layer, in which they could collaborate on building and evaluating quantitative models; and

  • a knowledge layer, in which they could work together to validate claims against evidence.

These functions would collectively require:

How might MyNet be applied? Consider three hypothetical programs, all focused on microplastics: a medical program that maps how microplastics are metabolized and impact health; a food-security program that maps how microplastics flow through food webs and supply chains; and a social justice program that maps which communities produce and consume microplastics. In the data layer, researchers at the three programs could combine data on health records, supply logistics, food inspections, municipal records, and demographics. In the information layer, they might collaborate on coding and evaluating quantitative models. Finally, in the knowledge layer, they could work together to validate claims regarding who is impacted, how much, and by what means.
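
To make the three layers concrete, below is a minimal, hypothetical Python sketch of how MyNet resources for the microplastics example might be represented. Every name here is illustrative; this proposal does not specify an implementation.

    # Hypothetical data model for MyNet's three layers (illustrative only).
    from dataclasses import dataclass, field
    from typing import Callable

    @dataclass
    class Dataset:                  # data layer: pooled raw resources
        name: str
        contributed_by: str         # which TREE program shared it

    @dataclass
    class Model:                    # information layer: co-developed models
        name: str
        inputs: list                # names of the Datasets it consumes
        run: Callable               # the quantitative model itself

    @dataclass
    class Claim:                    # knowledge layer: claims tested against evidence
        statement: str
        supported_by: list          # Models whose outputs back the claim
        validated_by: set = field(default_factory=set)

    # Three programs pool data, co-develop a model, and co-validate a claim.
    health = Dataset("health_records", contributed_by="medical program")
    flows = Dataset("supply_logistics", contributed_by="food-security program")
    exposure_model = Model("exposure", inputs=[health.name, flows.name],
                           run=lambda data: {"exposure_index": 0.7})
    claim = Claim("Community X faces elevated microplastics exposure",
                  supported_by=[exposure_model.name])
    claim.validated_by.update({"medical", "food-security", "social-justice"})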

Initial Steps

First, Congress should authorize the new Office of the Solution-Oriented Innovation Liaison and appropriate $500 million over four years to the NSF TIP Directorate to run it. Congress should view SOIL as an opportunity to create a shared service among emergent, transformative federal R&D efforts that will empower—rather than bureaucratically stifle—the science and technological advances we need most. This mission fits squarely under the NSF TIP Directorate’s mandate to “mobilize the collective power of the nation” by serving as “a crosscutting platform that collaboratively integrates with NSF’s existing directorates and fosters partnerships—with government, industry, nonprofits, civil society and communities of practice—to leverage, energize and rapidly bring to society use-inspired research and innovation.”

Once funding is authorized and appropriated to begin intentionally growing a network of TREEs, NSF’s TIP Directorate should focus on a four-year plan for SOIL. TIP should begin by choosing an appropriate leader for SOIL, such as a former director or office director of an ARPA (or other TREE). SOIL would be tasked with first engaging the management of existing ARPAs in the federal government, such as those at the Departments of Defense and Energy, to form an advisory board. The advisory board would in turn guide the creation of experience-informed operating procedures for SOIL to use in establishing and aiding new TREEs. These procedures should reflect best practices and mechanisms for operating rapid, solutions-focused R&D programs across the following functions:

Beyond these structural aspects, the board must also incorporate important cultural aspects of TREEs into best practices. In my own research into the managerial heuristics that guide TREEs, I found that managers must be encouraged to “drive change” (critique the status quo, dream big, take action), “be better” (embrace difference, attract excellence, stand out from the crowd), “herd nerds” (focus the creative talent of scientists and engineers), “gather support” (forge relationships with research conductors and potential adversaries), “try and err” (take diverse approaches, expect to fail, learn from failure), and “make it matter” (direct activities to realize outcomes for society, not for science).

The board would also recommend a governance structure and implementation strategy for MyNet. In its first year, SOIL could also start to grow the Community of Cultivators, potentially starting with members of the advisory board. The board chair, in partnership with the White House Office of Science and Technology Policy, would also convene an initial series of interagency working groups (IWGs) focused on establishing a community of practice around TREEs, including but not limited to representatives from the following R&D agencies, offices, and programs: 

In years two and three, SOIL would focus on growing three to five new TREEs at organizations that have not had solutions-oriented innovation programs before but need them. 

SOIL would also start to build a pilot version of MyNet as a resource for these new TREEs, with a goal of including existing ARPAs and other TREEs as quickly as possible. In establishing MyNet, SOIL should focus on implementing the most appropriate system of data governance by first understanding the nature of the collaborative activities intended. Digital research collaborations can apply and mix a range of governance patterns, with different degrees of availability and freedom with respect to digital resources. MyNet should be flexible enough to meet a range of needs for openness and security. To this end, SOIL should coordinate with the recently created National Secure Data Service and apply its lessons in creating an accessible, secure, and ethical information-sharing environment.

Year four and beyond would be characterized by scaling up. Building on the lessons learned in the prior two years of pilot programs, SOIL would coordinate with new and legacy TREEs to refresh operating procedures and governance structures. It would then work with an even broader set of organizations to increase the number of TREEs beyond the three to five pilots and continue to build out MyNet as well as the Community of Cultivators. Periodic evaluations of SOIL’s programmatic success would shape its evolution after this point. These should be framed in terms of its capacity to create and support programs that yield meaningful technological and socioeconomic outcomes, not just produce traditional research metrics. As such, in its creation of new TREEs, SOIL should apply a major lesson of the National Academies’ evaluation of ARPA-E: explicitly align the (necessarily) robust performance management systems at the project level with strategy and evaluation systems at the program, portfolio, and agency levels. The long-term viability of SOIL and TREEs will depend on their ability to demonstrate value to the public.

Frequently Asked Questions
What is the transformative research model? What makes it different from a typical R&D model?

The transformative research model typically works like this:



  • Engage with stakeholders to understand their needs and set audacious goals for addressing them.

  • Establish lean projects run by teams of diverse experts assembled just long enough to succeed or fail in one approach.

  • Continuously evaluate projects, build on what works, kill what doesn’t, and repeat as necessary.


In a nutshell, transformative research enterprises exist solely to solve a particular problem, rather than to grow a program or amass a stock of scientific capital.


To get more specific, Bonvillian and Van Atta (2011) identify the unique factors that contribute to the innovative nature of ARPAs. On the personnel front, ARPA program managers are talented managers, experienced in business, and appointed for limited terms. They are “translators,” as opposed to subject-matter experts, who actively engage with allies, rivals, and others. They have great power to choose projects, hire, fire, and contract. On the structure front, projects are driven by specific challenges or visions—co-developed with stakeholders—and designed around plausible implementation pathways. Projects are executed extramurally and managed as portfolios, with clear metrics to assess risk and reward. Success for ARPAs means developing products and services that achieve broad uptake and cost-efficacy, so finding first adopters and creating markets is part of the work.

What kinds of TREEs could SOIL help to create?

Some examples come from other Day One proposals. SOIL could work with the Department of Labor to establish a Labor ARPA. It could work with the Department of Education on an Education ARPA. We could imagine a Justice Department ARPA with a program for criminal justice reform, one at Housing and Urban Development aimed at solving homelessness, or one at the State Department for innovations in diplomacy. And there are myriad opportunities beyond the federal government.

What kind of authority over TREEs should SOIL have? Since TREEs are designed to be nimble and independent, wouldn’t SOIL oversight inhibit their operations with an extra layer of bureaucracy?

TREEs thrive on their independence and flexibility, so SOIL’s functions must be designed to impose minimal interference. Other than ensuring that the TREEs it supports are effectively administered as transformative, mission-oriented organizations, SOIL would be very hands-off. SOIL would help establish TREEs and set them up so they do not operate as typical R&D units. SOIL would give TREE projects and staff the means to connect cross-organizationally with other projects and staff in areas of mutual interest (e.g., via MyNet, the Community of Cultivators, and periodic convenings). And, like the SBIR office at the Small Business Administration, SOIL would report annually to Congress on its operations and progress toward goals.

What is the estimated cost of SOIL and its component initiatives? How would it be funded?

An excellent model for SOIL is the Small Business Innovation Research (SBIR) system. SBIR is funded by redirecting a small percentage of the budgets of agencies that spend $100 million or more on extramural R&D. Given that SOIL is intended to be relevant to all federal mission spaces, we recommend that SOIL be funded by a small fraction (between 0.1 and 1.0%) of the budgets of all agencies with $1 billion or more in total discretionary spending. At the upper end, this would yield about $15 billion to support SOIL in growing and connecting new TREEs in a vastly widened set of mission spaces.
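
As a rough illustration (assuming, since the base figure is not stated here, that the qualifying agencies’ combined discretionary budgets total on the order of $1.5 trillion):

    0.1% set-aside: 0.001 × $1.5 trillion ≈ $1.5 billion per year
    1.0% set-aside: 0.010 × $1.5 trillion ≈ $15 billion per year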


The risk is the opportunity cost of this budget reallocation to each funding agency. It is worth noting, though, that changes of 0.1–1.0% are less than the amount that the average agency sees as annual perturbations in its budget. Moreover, redirecting these funds may well be worth the opportunity cost, especially as an investment in solving the compounding problems that federal agencies face. By redirecting this small fraction of funds, we can keep agency operations 99–99.9% as effective while simultaneously creating a robust, interconnected, solutions-oriented R&D system.

Saving 3.1 Million Lives a Year with a President’s Emergency Plan to Combat Acute Childhood Malnutrition

Summary

Like HIV/AIDS, acute childhood malnutrition is deadly but easily treatable when the right approach is taken. Building on the success of PEPFAR, the Biden-Harris Administration should launch a global cross-agency effort to better fund, coordinate, research, and implement malnutrition prevention and treatment programs to save millions of children’s lives annually and eventually eliminate severe acute malnutrition.

Children with untreated severe acute malnutrition are 9 to 11 times more likely to die than their peers and, if they survive, suffer permanent setbacks to their neurodevelopment, immune systems, and future earnings potential. Effective programs can treat children for around $60 per child with greater than 90 percent recovery rates. However, globally, only about 25–30 percent of children with moderate and severe acute malnutrition have access to treatment. Every year, 3.1 million children die of malnutrition-related causes, and 45 percent of all deaths of children under five are related to malnutrition, making it the leading cause of under-five deaths.

In 2003, a similar predicament existed: the HIV/AIDS epidemic was causing millions of deaths in sub-Saharan Africa and around the world, despite the existence of highly effective treatment and prevention methods. In response, the Bush Administration created the President’s Emergency Plan for AIDS Relief (PEPFAR). PEPFAR has proven a major global health success, saving an estimated 30 million lives since 2003 through over $100 billion in funding. 

The Biden-Harris Administration should establish a President’s Emergency Plan for Acute Childhood Malnutrition (PEPFAM) in the Office of Global Food Security at the State Department to clearly elevate the problem of acute childhood malnutrition, leverage new and existing food security and health programs to serve U.S. national security and humanitarian interests, and save the lives of up to 3.1 million children around the world, every year. PEPFAM could serve as a catalytic initiative to harmonize the fight against malnutrition and direct currently fragmented resources toward greater impact.

Challenge and Opportunity

United Nations Sustainable Development Goal (SDG) 2.2 outlines goals for reducing acute malnutrition, ambitiously targeting global rates of 5 percent by 2025 and 3 percent (a “virtual elimination”) by 2030. Due to climate change, the COVID-19 pandemic, and conflicts like the war in Ukraine, global rates of malnutrition remain at 8 percent and are forecast to become worse, not better. Globally, 45.4 million children suffer from acute malnutrition, 13.6 million of whom are severely acutely malnourished (SAM). If current trends persist until 2030, an estimated 109 million children will suffer from permanent cognitive or physiological stunting, despite the existence of highly effective and relatively cheap treatment. 

Providing life-saving treatment around the world serves a core American value of humanitarianism and helps meet commitments to the SDGs. The United States Agency for International Development (USAID) recently announced a commitment to purchase ready-to-use therapeutic food (RUTF), a life-saving food, on the sidelines of the UN General Assembly, demonstrating a prioritization of global food security. Food security is also a priority for the Biden Administration’s approach to national security. The newly released National Security Strategy dedicates an entire section to food insecurity, highlighting the urgency of the problem and calling on the United States and its global partners to work to address acute needs and tackle the extraordinary humanitarian burden posed by malnutrition. The Office of Global Food Security at the U.S. Department of State also prioritizes food security as an issue of national security, leading and coordinating diplomatic engagement in bilateral, multilateral, and regional contexts. At a time when the United States is competing for its vision of a free, open, and prosperous world, addressing childhood malnutrition could serve as a catalyst to achieve the vision articulated in the National Security Strategy and at the State Department.

“People all over the world are struggling to cope with the effects of shared challenges that cross borders—whether it is climate change, food insecurity, communicable diseases, terrorism, energy shortages, or inflation. These shared challenges are not marginal issues that are secondary to geopolitics. They are at the very core of national and international security and must be treated as such.” 

U.S. 2022 National Security Strategy 

Tested, scalable, and low-cost solutions exist to treat children with acute malnutrition, yet the platform and urgency to deliver interventions at scale do not. Solutions such as community management of acute malnutrition (CMAM), the gold-standard approach to malnutrition treatment, and other intentional strategies like biofortification could dramatically lower the burden of global childhood malnutrition. Despite the 3.1 million preventable deaths that occur annually related to childhood malnutrition and the clear threat that food insecurity poses to U.S. national security, we lack an urgent platform to bring these low-cost solutions to bear.

While U.S. government assistance to combat food insecurity and malnutrition is a priority, funding and coordination are not centralized. The U.S. has committed over $10 billion to address global food insecurity, allocating dollars to USAID, Feed the Future, the U.S. Department of Agriculture (USDA), and others. Through the recently signed Global Malnutrition Prevention and Treatment Act of 2021, Congress took a step forward by granting USAID greater authority to target nutrition aid to areas of greatest need and greater flexibility to coordinate activities across the agency and its partners. In accordance with the agency’s Global Nutrition Coordination Plan, Congress also established the Nutrition Leadership Council, chaired by the Bureau for Resilience and Food Security, to coordinate and integrate activities solely within USAID. Multilateral and private sector partners also dedicate resources to food security: the Gates Foundation committed $922 million toward global nutrition and food systems, and UNICEF created a Nutrition Match Fund to incentivize funding to combat severe acute malnutrition. These lines of effort are each individually important but could be more impactful if aligned. A President’s Emergency Plan for malnutrition could harmonize these separate funding streams and authorities and mobilize multilateral and private sector partners to prevent and treat malnutrition and food insecurity.

Drawing on the strengths of the PEPFAR model to combat HIV/AIDS at scale while driving down costs for treatment, PEPFAM could revolutionize how resources are spent while scaling sustainable and cost-effective solutions to childhood malnutrition, saving millions of lives every year. Under this model, significantly more—and, optimally, all—children suffering from acute malnutrition would have access to treatment. This would make dramatic progress toward global food security and U.S. national security priorities.

Plan of Action

President Biden should declare a global childhood malnutrition emergency and announce the creation of the President’s Emergency Plan for Acute Childhood Malnutrition. Using PEPFAR as a model, PEPFAM could catalyze cost-effective solutions to save millions of lives every year. When President Bush mobilized support for PEPFAR in his 2003 State of the Union, he declared, “We must remember our calling, as a blessed country, is to make the world better,” and called for interagency support for an “Emergency Plan” for HIV/AIDS relief and Congressional support to commit $15 billion over the next five years to launch PEPFAR.

President Biden should follow a similar path and announce PEPFAM in a similarly high-profile speech—the 2023 State of the Union address, for example—to elevate the problem of acute childhood malnutrition to the American people and the U.S. government and offer a clear call to action through an executive order directing an interagency task force to develop a 24-month strategic plan within 180 days. The initial stages of PEPFAM and corresponding executive branch activities can be guided by the following recommendations.

Recommendation 1. Name a White House PEPFAM czar and task the Office of Global Food Security at the State Department with coordinating cross-agency support, specifying the intended personnel, agencies, and roles involved.

A Senior Advisor on the White House’s National Security Team at the Office of Science and Technology Policy would serve as the White House czar for PEPFAM and would (1) steer and lead the initiative, (2) organize an interagency task force, and (3) coordinate PEPFAM’s strategic focus by engaging multiple federal agencies, including: 

The Office of the Global AIDS Coordinator and Health Diplomacy at the State Department (OGAC) manages the high-level execution of PEPFAR by dictating strategic direction and coordinating agencies. The PEPFAM executive order will set up a similar infrastructure at the Office of Global Food Security at the State Department to: 

USAID is also well positioned to play a leading role given its current support of global food and nutrition programming. Several of USAID’s portfolios are central to PEPFAM’s aims, including Agriculture and Food Security, Nutrition, Global Health, Water and Sanitation, and Humanitarian Assistance. The offices that support these portfolios should provide technical expertise in food and nutrition, existing connections to good program implementers in various country contexts, monitoring and evaluation capacity to track implementers’ progress toward goals, and strategic direction.

The Office of Global Food Security and the PEPFAM czar should delegate authority for the program across government agencies, private partners (e.g., Gates Foundation), and multilateral organizations (e.g., World Food Programme). The Office would coordinate interagency action to support PEPFAM’s implementation and evaluation as well as identify agencies that are best placed to lead each component of the effort. 

Recommendation 2. Present an initial strategic action plan to build and sustain PEPFAM.

The PEPFAM interagency task force, described above, should develop a strategic plan targeting an initial set of actions to align with existing global food security and childhood malnutrition priorities and identify opportunities to redirect existing resources toward scalable, high-impact solutions like CMAM. USAID already invests millions of dollars each year in initiatives like Feed the Future that support global food security while overseeing cross-agency implementation and harmonization of the Global Food Security Strategy. These efforts and funding should be rolled under the umbrella of PEPFAM to better align treatment and prevention interventions, strategically coordinate resources across the government, and sharpen the focus on impact.

Recommendation 3. Announce discrete, evidence-driven goals for PEPFAM.

These goals include:

Recommendation 4. Establish a coordination framework between PEPFAM, multilateral agencies, and private sector partners to mobilize and harmonize resources.

The Office of Global Food Security and USAID should build on current momentum to bring multilateral and private partners behind PEPFAM. USAID has recently announced a series of partnerships with large philanthropic organizations like the Gates Foundation, Aliko Dangote Foundation, and Eleanor Crook Foundation (to name a few), as well as other countries and multilateral organizations at UNGA. Much like with PEPFAR, PEPFAM could rely on the support of external partners as well as federal funds to maximize the impact of the program. 

Recommendation 5. Create an international council to set technical standards so that money goes to the most effective programs possible. 

The Office of Global Food Security, USAID, and PEPFAM should spearhead the development of an international technical council (housed under the UN, the World Health Organization, or independently) to set standards for malnutrition prevention and treatment programming. Malnutrition treatment is already cost-effective, but it could be made even cheaper and more effective through innovation. Even when promising new interventions are identified, the process of disseminating and scaling existing, proven best practices and innovations doesn’t function optimally.

Treatment guidelines issued by the WHO and national governments are slow to be updated, meaning that highly effective interventions can take years to be adopted and, even then, are adopted in a piecemeal fashion. Other implementers may be too wedded to their operational practices to consider making a change unless standards are updated or innovations from other implementers are actively socialized. 

An international technical council would disseminate and scale best practices discovered in the processes of implementation and research. If funders like the U.S. government commit to only funding organizations that promptly adopt these standards, they can maximize the impact of existing funding by ensuring that every dollar goes toward the most cost-effective ways of saving lives. This body could ideally speed the sharing and implementation of practices that could allow more children to be treated effectively, at lower costs.

Recommendation 6. Direct existing child malnutrition assistance through PEPFAM to ensure coordinated impact and seek permanent funding from Congress for PEPFAM.

The executive order will create the momentum to establish PEPFAM, but legislative authorization is required to make it sustainable. The strategic plan should lay out efforts to build Congressional support for funding legislation.

Congress will play a key role in PEPFAM implementation by appropriating funds. Under PEPFAR, Congress appropriates money directly to OGAC at the Department of State, which disburses it to other agencies. In 2003, Congress supported President Bush’s request for $15 billion in PEPFAR funding by passing the Leadership Act, which authorized yearly contributions to the Global Fund from 2004 to 2008. Congress has subsequently reauthorized the program through FY2023. Each year, OGAC presents a request for the funding needed for recipient countries and programs to the President, who then forwards the request to Congress for appropriation. The PEPFAM process should mirror this structure.

At the UNGA in 2022, President Biden announced over $2.9 billion in new assistance to address global food insecurity, building on the $6.9 billion in U.S. government assistance already committed in 2021. Last year, President Biden also announced a $10 billion, multiyear investment to promote food systems transformation, including a $5 billion commitment to Feed the Future specifically. Instead of fractured funding to different initiatives, these funds should be harmonized under PEPFAM, with dollars allocated to the PEPFAM task force to create a centralized two-year strategy to combat malnutrition. 

Conclusion 

This program would have a series of positive effects. First, and most obviously, PEPFAM would save up to 3.1 million lives every year and bring together resources and goals around food security that are currently fractured across the federal government, increasing the effectiveness of U.S. aid dollars globally. Second, PEPFAM, like PEPFAR, would make existing interventions more effective by unlocking cost savings and innovation at scale. Third, at a time when the United States is competing for its vision of a free, open, and prosperous world, PEPFAM could play a key role in achieving the mission of the National Security Strategy.

Over time, more comprehensive treatment coverage and prevention efforts could also lead to the elimination of severe acute malnutrition by preventing cases and catching those that approach moderate acute malnutrition or have already fallen into it. PEPFAM would save an estimated 27.9 million lives over the same time scale as PEPFAR. Millions of children die every year while a cheap and effective solution exists. PEPFAM could change that.

Frequently Asked Questions
How does PEPFAM compare to PEPFAR in terms of funding and effectiveness at scale?

From 2003 to the present, PEPFAR has spent billions of dollars and saved millions of lives. The table below compares the estimated costs and outcomes of PEPFAR with PEPFAM. Because malnutrition treatment is cheaper than HIV/AIDS treatment and the caseload is higher, there is a high-leverage opportunity to save lives.

                                        PEPFAR (HIV/AIDS)    PEPFAM (Childhood Malnutrition)
Average Cost of Treatment per Person    $367,134             $60
Number of Cases                         38.4 million         45.4 million
Program Cost (estimated yearly)         $5.7 billion (USD)   $4 billion (USD)
Lives Saved (estimated yearly)          1.6 million          1.5 million
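
As a rough consistency check on the PEPFAM column (illustrative only, using the table’s own figures):

    45.4 million children × $60 per child ≈ $2.7 billion in direct treatment costs
    $4 billion − $2.7 billion ≈ $1.3 billion for prevention, delivery, monitoring, and administration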

Costs for PEPFAM are difficult to project precisely, because the program is likely to become more cost-effective over time as prevention efforts start to work and as research and development yield cheaper, more effective treatment. The projections above use the most pessimistic assumption: that no improvements to cost or effectiveness are made over time. The graph below illustrates a similar dynamic for PEPFAR, whose services expanded even under flat budgets thanks to this same self-improvement.


PEPFAR funding graph. (Source: Department of State)


PEPFAM is similar: more comprehensive treatment coverage and prevention efforts could lead to the elimination of severe acute malnutrition by preventing cases and catching those that approach moderate acute malnutrition or have already fallen into it. That means that the program should become cheaper over time, as more cases are identified earlier when they are cheaper to treat, and more cases are prevented, both by prevention programs and general economic development. Research and innovation can similarly cut down on the costs and improve the effectiveness of malnutrition treatment and prevention over time.

Why should the U.S. declare food security and childhood malnutrition a global emergency?

The lack of attention to childhood malnutrition in non-emergency/non-crisis zones results in millions of preventable deaths each year. Declaring an emergency would put pressure on other organizations, media outlets, and NGOs to devote more resources to food security. The international community is keen to respond to food crises in emergency contexts, especially among children. USAID and the UN recently committed millions of dollars for the procurement of ready-to-use therapeutic food (RUTF) to combat emergency risks like the war in Ukraine and conflicts in places like Ethiopia. But the unfortunate truth is that even outside of newsworthy emergencies, acute malnutrition remains a daily emergency in many places around the world. Malnutrition rates are just as high in states and countries that neighbor emergency zones as in the crisis-hit places themselves, partly as a result of the movement of internally displaced people. While funding acute malnutrition treatment in relatively mundane circumstances (e.g., poverty-stricken states in Nigeria) may generate fewer headlines than emergency food aid, it is equally needed.

How much U.S. global health funding is currently put toward nutrition?

Currently, only 1 percent of U.S. global health spending is put toward nutrition. Only 25–30 percent of children globally have access to treatment as a result of underfunded programs and a subsequent lack of resources and geographic coverage.

What is the current state of investment in quality treatment implementation?

Treatment is only effective if implemented well. Right now, funding goes to a range of programs that fail to meet the Sphere standard of a 75 percent recovery rate. Large-scale funders like UNICEF have internal commitments to spend a certain amount of their budgets on RUTF each year, which means their hands are tied when working in contexts with poor implementing partners (e.g., corrupt governments). At the same time, NGOs like the Alliance for International Medical Action and Médecins Sans Frontières achieve recovery rates of more than 95 percent. More investment in quality implementation capacity is needed; otherwise, scarce existing resources will continue to be wasted.

Is there a robust evidence base for malnutrition prevention?

There’s a growing movement to implement interventions that catch children on the border of malnutrition or improve the conditions that lead to malnutrition in the first place (e.g., infant and young child feeding circles, exclusive breastfeeding counseling). These programs are exciting, but the evidence base for their impact is so far minimal. It’s much cheaper to catch a child before they fall into malnutrition than to treat them afterward, not to mention the health benefits to the child of averting the condition altogether. More work needs to be done to test and validate the most cost-effective prevention methods to ensure that only those that actually generate impact are scaled.

What agencies play a role in PEPFAR?
Where are current efforts to combat malnutrition focused?

Childhood malnutrition sits at the intersection of public health and nutrition/agricultural programming. Current efforts are spread across the U.S. government and multilateral partners with little coordination toward desired outcomes. Funding that hypothetically targets childhood malnutrition can come from a variety of players in the U.S. government, ranging from the Department of Defense to USAID to the Department of Agriculture. While some coordination exists at USAID through programs like Feed the Future, these programs are not yet results- or outcome-based. Coordination should involve measuring the impact of collective aid across agencies on outcomes like recovery rates or the number of children suffering from malnutrition in a given geographic area.

What outcomes does PEPFAM target?

Strengthening Policy by Bringing Evidence to Life

Summary

In a 2021 memorandum, President Biden instructed all federal executive departments and agencies to “make evidence-based decisions guided by the best available science and data.” This policy is sound in theory but increasingly difficult to implement in practice. With millions of new scientific papers published every year, parsing and acting on research insights presents a formidable challenge.

A solution, and one that has proven successful in helping clinicians effectively treat COVID-19, is to take a “living” approach to evidence synthesis. Conventional systematic reviews, meta-analyses, and associated guidelines and standards are published as static products and are updated infrequently (e.g., every four to five years), if at all. This approach is inefficient and produces evidence products that quickly go out of date. It also leads to research waste and poorly allocated research funding.

By contrast, emerging “Living Evidence” models treat knowledge synthesis as an ongoing endeavor. By combining (i) established, scientific methods of summarizing science with (ii) continuous workflows and technology-based solutions for information discovery and processing, Living Evidence approaches yield systematic reviews—and other evidence and guidance—products that are always current. 

The recent launch of the White House Year of Evidence for Action provides a pivotal opportunity to harness the Living Evidence model to accelerate research translation and advance evidence-based policymaking. The federal government should consider a two-part strategy to embrace and promote Living Evidence. The first part of this strategy positions the U.S. government to lead by example by embedding Living Evidence within federal agencies. The second part focuses on supporting external actors in launching and maintaining Living Evidence resources for the public good.

Challenge and Opportunity

We live in a time of veritable “scientific overload”. The number of scientific papers in the world has surged exponentially over the past several decades (Figure 1), and millions of new scientific papers are published every year. Making sense of this deluge of documents presents a formidable challenge. For any given topic, experts have to (i) scour the scientific literature for studies on that topic, (ii) separate out low-quality (or even fraudulent) research, (iii) weigh and reconcile contradictory findings from different studies, and (iv) synthesize study results into a product that can usefully inform both societal decision-making and future scientific inquiry.

This process has evolved over several decades into a scientific method known as “systematic review” or “meta-analysis”. Systematic reviews and meta-analyses are detailed and credible, but often take over a year to produce and rapidly go out of date once published. Experts often compensate by drawing attention to the latest research in blog posts, op-eds, “narrative” reviews, informal memos, and the like. But while such “quick and dirty” scanning of the literature is timely, it lacks scientific rigor. Hence those relying on “the best available science” to make informed decisions must choose between summaries of science that are reliable or current…but not both.

The lack of trustworthy and up-to-date summaries of science constrains efforts, including efforts championed by the White House, to promote evidence-informed policymaking. It also leads to research waste when scientists conduct research that is duplicative and unnecessary, and degrades the efficiency of the scientific ecosystem when funders support research that does not address true knowledge gaps.

Figure 1

Total number of scientific papers published over time, according to the Microsoft Academic Graph (MAG) dataset. (Source: Herrmannova and Knoth, 2016)

The emerging Living Evidence paradigm solves these problems by treating knowledge synthesis as an ongoing rather than static endeavor. By combining (i) established, scientific methods of summarizing science with (ii) continuous workflows and technology-based solutions for information discovery and processing, Living Evidence approaches yield systematic reviews that are always up to date with the latest research. An opinion piece published in The New York Times called this approach “a quiet revolution to surface the best-available research and make it accessible for all.”

To take a Living Evidence approach, multidisciplinary teams of subject-matter experts and methods experts (e.g., information specialists and data scientists) first develop an evidence resource—such as a systematic review—using standard approaches. But the teams then commit to regular updates of the evidence resource at a frequency that makes sense for their end users (e.g., once a month). Using technologies such as natural-language processing and machine learning, the teams continually monitor online databases to identify new research. Any new research is rapidly incorporated into the evidence resource using established methods for high-quality evidence synthesis. Figure 2 illustrates how Living Evidence builds on and improves traditional approaches for evidence-informed development of guidelines, standards, and other policy instruments.
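
As an illustration of the continuous-workflow step, here is a minimal, hypothetical Python sketch of the kind of monthly screening cycle a Living Evidence team might automate. The function names and the keyword rule are stand-ins for a real bibliographic-database client and a trained classifier; nothing here describes an existing system.

    # Hypothetical sketch of a "living" evidence-screening cycle (illustrative only).
    from dataclasses import dataclass

    @dataclass
    class Record:
        title: str
        abstract: str

    def fetch_new_records(since: str) -> list:
        # Stand-in for querying a bibliographic database for records indexed
        # after `since`; returns canned examples here.
        return [
            Record("New stroke rehabilitation RCT", "A randomized trial of ..."),
            Record("Sediment transport study", "An unrelated geology paper ..."),
        ]

    def relevance_score(record: Record) -> float:
        # Stand-in for a trained NLP classifier; here, a crude keyword rule.
        return 1.0 if "randomized trial" in record.abstract.lower() else 0.0

    def monthly_update(last_run: str, threshold: float = 0.8) -> list:
        # 1. Continuously monitor: pull everything indexed since the last run.
        candidates = fetch_new_records(since=last_run)
        # 2. Machine triage: flag records the model rates as likely relevant.
        flagged = [r for r in candidates if relevance_score(r) >= threshold]
        # 3. Human synthesis: flagged records go to subject-matter experts, who
        #    apply the review's inclusion criteria and update the living review.
        return flagged

    print([r.title for r in monthly_update(last_run="2022-10-01")])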

Figure 2

Illustration of how a Living Evidence approach to development of evidence-informed policies (such as clinical guidelines) is more current and reliable than traditional approaches. (Source: Author-developed graphic)

Living Evidence products are more trusted by stakeholders, enjoy greater engagement (up to a 300% increase in access/use, based on internal data from the Australian Stroke Foundation), and support improved translation of research into practice and policy. Living Evidence holds particular value for domains in which research evidence is emerging rapidly, current evidence is uncertain, and new research might change policy or practice. For example, Nature has credited Living Evidence with “help[ing] chart a route out” of the worst stages of the COVID-19 pandemic. The World Health Organization (WHO) has since committed to using the Living Evidence approach as the organization’s “main platform” for knowledge synthesis and guideline development across all health issues. 

Yet Living Evidence approaches remain underutilized in most domains. Many scientists are unaware of Living Evidence approaches. The minority who are familiar often lack the tools and incentives to carry out Living Evidence projects directly. The result is an “evidence to action” pipeline far leakier than it needs to be. Entities like government agencies need credible and up-to-date evidence to efficiently and effectively translate knowledge into impact.

It is time to change the status quo. The 2019 Foundations for Evidence-Based Policymaking Act (“Evidence Act”) advances “a vision for a nation that relies on evidence and data to make decisions at all levels of government.” The Biden Administration’s “Year of Evidence” push has generated significant momentum around evidence-informed policymaking. Demonstrated successes of Living Evidence approaches with respect to COVID-19 have sparked interest in these approaches specifically. The time is ripe for the federal government to position Living Evidence as the “gold standard” of evidence products—and the United States as a leader in knowledge discovery and synthesis.

Plan of Action

The federal government should consider a two-part strategy to embrace and promote Living Evidence. The first part of this strategy positions the U.S. government to lead by example by embedding Living Evidence within federal agencies. The second part focuses on supporting external actors in launching and maintaining Living Evidence resources for the public good. 

Part 1. Embedding Living Evidence within federal agencies

Federal science agencies are well positioned to carry out Living Evidence approaches directly. Living Evidence requires “a sustained commitment for the period that the review remains living.” Federal agencies can support the continuous workflows and multidisciplinary project teams needed for excellent Living Evidence products.

In addition, Living Evidence projects can be very powerful mechanisms for building effective, multi-stakeholder partnerships that last—a key objective for a federal government seeking to bolster the U.S. scientific enterprise. A recent example is Wellcome Trust’s decision to fund suites of living systematic reviews in mental health as a foundational investment in its new mental-health strategy, recognizing this as an important opportunity to build a global research community around a shared knowledge source. 

Greater interagency coordination and external collaboration will facilitate implementation of Living Evidence across government. As such, President Biden should issue an Executive Order establishing a Living Evidence Interagency Policy Committee (LEIPC) modeled on the effective Interagency Arctic Research Policy Committee (IARPC). The LEIPC would be chartered as an Interagency Working Group of the National Science and Technology Council (NSTC) Committee on Science and Technology Enterprise, and chaired by the Director of the White House Office of Science and Technology Policy (OSTP; or their delegate). Membership would comprise representatives from federal science agencies, including agencies that currently create and maintain evidence clearinghouses; other agencies deeply invested in evidence-informed decision making; and non-governmental experts with deep experience in the practice of Living Evidence and/or associated capabilities (e.g., information science, machine learning).

Supporting federal implementation of Living Evidence

Widely accepted guidance for living systematic reviews (LSRs), one type of Living Evidence product, has been published. The LEIPC—working closely with OSTP, the White House Office of Management and Budget (OMB), and the federal Evaluation Officer Council (EOC)—should adapt this guidance for the U.S. federal context, resulting in an informational resource for federal agencies seeking to launch or fund Living Evidence projects. The guidance should also be used to update systematic-review processes used by federal agencies and organizations contributing to national evidence clearinghouses.

Once the federally tailored guidance has been developed, the White House should direct federal agencies to consider and pursue opportunities to embed Living Evidence within their programs and operations. The policy directive could take the form of a Presidential Memorandum, a joint management memo from the heads of OSTP and OMB, or similar. This directive would (i) emphasize the national benefits that Living Evidence could deliver, and (ii) provide agencies with high-level justification for using discretionary funding on Living Evidence projects and for making decisions based on Living Evidence insights.

Identifying priority areas and opportunities for federally managed Living Evidence projects

The LEIPC—again working closely with OSTP, OMB, and the EOC—should survey the federal government for opportunities to deploy Living Evidence internally. Box 1 provides examples of opportunities that the LEIPC could consider.

The product of this exercise should be a report that describes each of the opportunities identified, and recommends priority projects to pursue. In developing its priority list, the LEIPC should account for both the likely impact of a potential Living Evidence project as well as the near-term feasibility of that project. While the report could outline visions for ambitious Living Evidence undertakings that would require a significant time investment to realize fully (e.g., transitioning the entire National Climate Assessment into a frequently updated “living” mode), it should also scope projects that could be completed within two years and serve as pilots/proofs of concept. Lessons learned from the pilots could ultimately inform a national strategy for incorporating Living Evidence into federal government more systematically. Successful pilots could continue and grow beyond the end of the two-year period, as appropriate.

Fostering greater collaboration between government and external stakeholders

The LEIPC should create an online “LEIPC Collaborations” platform that connects researchers, practitioners, and other stakeholders both inside and outside government. The platform would emulate IARPC Collaborations, which has built out a community of more than 3,000 members and dozens of communities of practice dedicated to the holistic advancement of Arctic science. As one stakeholder has explained:

LEIPC Collaborations could deliver the same participatory opportunities and benefits for members of the evidence community, facilitating holistic advancement of Living Evidence.

Part 2. Make it easier for scientists and researchers to develop LSRs

Many government efforts could be supported by internal Living Evidence initiatives, but not every valuable Living Evidence effort should be conducted by government. Many useful Living Evidence programs will require deep domain knowledge and specialized skills that teams of scientists and researchers working outside of government are best positioned to deliver.

But experts interested in pursuing Living Evidence efforts face two major difficulties. The first is securing funding. Very little research funding is awarded for the sole purpose of conducting systematic reviews and other types of evidence syntheses. The funding that is available is typically not commensurate with the resource and personnel needs of a high-quality synthesis. Living Evidence demands efficient knowledge discovery and the involvement of multidisciplinary teams possessing overlapping skill sets. Yet federal research grants are often structured in a way that precludes principal investigators from hiring research software engineers or from founding co-led research groups.

The second is misaligned incentives. Systematic reviews and other types of evidence syntheses are often not recognized as “true” research outputs by funding agencies or university tenure committees—i.e., they are often not given the same weight in research metrics, despite (i) utilizing well-established scientific methodologies involving detailed protocols and advanced data and statistical techniques, and (ii) resulting in new knowledge. The result is that talented experts are discouraged from investing their time in projects that could contribute significant new insights and dramatically improve the efficiency and impact of our nation’s research enterprise.

To begin addressing these problems, the two biggest STEM-funding agencies—NIH and NSF—should consider the following actions:

  1. Perform a landscape analysis of federal funding for evidence synthesis. Rigorously documenting the funding opportunities available (or lack thereof) for researchers wishing to pursue evidence synthesis will help NIH and NSF determine where to focus potential new opportunities. The landscape analysis should consider currently available funding opportunities for systematic, scoping, and rapid reviews, and could also include surveys and focus groups to assess the appetite in the research community for pursuing additional evidence-synthesis activities if supported.
  2. Establish new grant opportunities designed to support Living Evidence projects. The goal of these grant opportunities would be to deliver definitive and always up-to-date summaries of research evidence and associated data in specified topics. The opportunities could align with particular research focuses (for instance, a living systematic review on tissue-electronic interfacing could facilitate progress on bionic limb development under NSF’s current “Enhancing Opportunities for Persons with Disabilities” Convergence Accelerator track). The opportunities could also be topic-agnostic, but require applicants to justify a proposed project by demonstrating that (i) the research evidence is emerging rapidly, (ii) current evidence is uncertain, and (iii) new research might materially change policy or practice.
  3. Increase support for career research staff in academia. Although contributors to Living Evidence projects can cycle in and out (analogous to turnover in large research collaboratives), such projects benefit from longevity in a portion of the team. With this core team in place, Living Evidence projects are excellent avenues for graduate students to build core research skills, including in research study design. 
  4. Leverage prestigious existing grant programs and awards to incentivize work on Living Evidence. For instance, NSF could encourage early-career faculty to propose LSRs in applications for CAREER grants.
  5. Recognize evidence syntheses as research outputs. In all assessments of scientific track record (particularly research-funding schemes), systematic reviews and other types of rigorous evidence synthesis should be recognized as research outputs equivalent to “primary” research. 

Conclusion

Policymaking can only be meaningfully informed by evidence if underpinning systems for evidence synthesis are robust. The Biden administration’s Year of Evidence for Action provides a pivotal opportunity to pursue concrete actions that strengthen use of science for the betterment of the American people. Federal investment in Living Evidence is one such action. 

Living Evidence has emerged as a powerful mechanism for translating scientific discoveries into policy and practice. The Living Evidence approach is being rapidly embraced by international actors, and the United States has an opportunity to position itself as a leader. A federal initiative on Living Evidence will contribute additional energy and momentum to the Year of Evidence for Action, ensure that our nation does not fall behind on evidence-informed policymaking, and arm federal agencies with the most current and best-available scientific evidence as they pursue their statutory missions.

Frequently Asked Questions
Which sectors and scientific fields can use Living Evidence?
The Living Evidence model can be applied to any sector or scientific field. While the Living Evidence model has so far been most widely applied to the health sector, Living Evidence initiatives are also underway in other fields, such as education and climate sciences. Living Evidence is domain-agnostic: it is simply an approach that builds on existing, rigorous evidence-synthesis methods with a novel workflow of frequent and rapid updating.
What is needed to run a successful Living Evidence project?
It does not take long for teams to develop sufficient experience and expertise to apply the Living Evidence model. The key to a successful Living Evidence project is a team that possesses experience in conventional evidence synthesis, strong project-management skills, an orientation towards innovation and experimentation, and investment in building stakeholder engagement.
How much does Living Evidence cost?
As with evidence synthesis in general, cost depends on topic scope and the complexity of the evidence being appraised. Budgeting for Living Evidence projects should distinguish the higher cost of conducting an initial “baseline” systematic review from the lower cost of maintaining the project thereafter. Teams initiating a Living Evidence project for the first time should also budget for the inevitable experimentation and training required.
Do Living Evidence initiatives require recurrent funding?
No. Living Evidence initiatives are analogous to other significant scientific programs that may extend over many years, but receive funding in discrete, time-bound project periods with clear deliverables and the opportunity to apply for continuation funding. 


Living Evidence projects do require funding for enough time to complete the initial “baseline” systematic review (typically 3–12 months, depending on scope and complexity), transition to maintenance (“living”) mode, and continue in living mode for sufficient time (usually about 6–12 months) for all involved to become familiar with maintaining and using the living resource. Hence Living Evidence projects work best when fully funded for a minimum of two years.
If there is support for funding beyond this minimum period, there are operational advantages to putting follow-on funding in place before the previous funding period concludes. If follow-on funding is not immediately available, Living Evidence resources can simply revert to a conventional static form unless and until follow-on funding becomes available.

Is Living Evidence sustainable?
Living Evidence is rapidly gaining momentum as organizations conclude that the conventional model of evidence synthesis is no longer sustainable because the volume of research that must be reviewed and synthesized for each update has grown beyond the capacity of typical project teams. Organizations that transition their evidence resources into “living” mode typically find the dynamic synthesis model to be more consistent, more feasible, easier to manage, and easier to plan for and resource. If the conventional model of intermittent synthesis is like climbing a series of mountains, the Living Evidence approach is like hiking up to and then walking across a plateau.
How can organizations that are already struggling to develop and update conventional evidence resources take on a Living Evidence project?
New initiatives usually need specific resourcing; Living Evidence is no different. The best approach is to identify a champion within the organization who has an innovation orientation and sufficient authority to effect change. The champion plays a key role in building organizational buy-in, particularly from senior leaders, key influencers within the main evidence program, and major partners, stakeholders, and end users. Ultimately, the champion (or their surrogate) should be empowered and resourced to establish 1–3 Living Evidence pilots running alongside the organization’s existing evidence activities. Risk can be reduced by starting small and building a “minimum viable product” Living Evidence resource (i.e., by finding a topic area that is relatively modest in scope, of importance to stakeholders, and characterized by evidence uncertainty as well as relatively rapid movement in the relevant research field). Funding should be structured to enable experimentation and iteration, and then to move quickly to scale up, increasing the scope of evidence moving into living mode as organizational and stakeholder experience and support build.
Living Evidence sounds neverending. Wouldn’t that lead to burnout in the project team?
One of the advantages of the Living Evidence model is that the project team can gradually evolve over time (members can join and leave as their interests and circumstances change). This is analogous to the evolution of an ongoing scientific network or research collaborative. In contrast, the spikes in workload required for intermittent updates of conventional evidence products often lead to burnout and loss of institutional memory. Furthermore, teams working on Living Evidence are often motivated by participation in an innovative approach to evidence and pride in contributing to a definitive, high-quality, and highly impactful scientific initiative.
How is Living Evidence disseminated?
While dissemination of conventional evidence products involves sharing several dozen key messages in a once-in-several-years communications push, dissemination of Living Evidence amounts to a regular cycle of “what’s new” updates (typically one to two key insights). Living Evidence dissemination feeds become known and trusted by end users, inspiring confidence that end users can “keep up” with the implications of new research. Publication of Living Evidence can take many forms. Typically, the core evidence resource is housed in an organizational website that can be easily and frequently updated, sometimes with an ability for users to access previous versions of the resource. Living Evidence may also be published as articles in academic journals. These could take the form of intermittent overviews of the evidence resource with links back to the main Living Evidence summaries, or (more ambitiously) a series of frequently updated, logically linked versions of an article. Multiple academic journals are innovating to better support “living” publications.

If Living Evidence products are continually updated, doesn’t that confuse end users with constantly changing conclusions?
Living Evidence requires continual monitoring for new research, as well as frequent and rapid incorporation of new research into existing evidence products. The volume of research identified and incorporated can vary from dozens of studies each month to a few each year, depending on the topic scope and research activity.


Even across broad topics in fast-moving research fields, though, the overall findings and conclusions of Living Evidence products change infrequently, since the threshold for changing a conclusion drawn from a whole body of evidence is high. The largest Living Evidence projects in existence yield only about one or two new major findings or recommendations per update. Furthermore, any good evidence-synthesis product will contextualize its conclusions and recommendations with an assessment of confidence in the underlying evidence.
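
To make this update workflow concrete, here is a minimal sketch, in Python, of the cycle described above: each update screens newly identified studies into the evidence base but revises conclusions only when the new evidence clears a high bar. The class, the helper methods, and the two-study threshold are all illustrative assumptions, not an established standard; real Living Evidence projects follow formal protocols for search, screening, appraisal, and synthesis.

```python
# Minimal, illustrative sketch of one Living Evidence update cycle.
from dataclasses import dataclass, field


@dataclass
class LivingReview:
    topic: str
    included_studies: list = field(default_factory=list)
    conclusions: str = "Baseline synthesis pending."

    def update(self, candidates: list, is_relevant) -> bool:
        """Screen new studies into the review; return True if conclusions changed."""
        admitted = [s for s in candidates if is_relevant(s)]
        self.included_studies.extend(admitted)
        # The bar for revising a whole-of-evidence conclusion is high,
        # which is why living products change their findings infrequently.
        if self._shift_is_material(admitted):
            self.conclusions = self._resynthesize()
            return True
        return False

    def _shift_is_material(self, admitted: list) -> bool:
        return len(admitted) >= 2  # placeholder threshold, not a standard

    def _resynthesize(self) -> str:
        return f"Synthesis of {len(self.included_studies)} studies on {self.topic}."


# Example cycle: two relevant studies arrive, so the conclusion is refreshed.
review = LivingReview(topic="intervention X")
changed = review.update(
    ["trial of X", "cohort study of X", "unrelated report"],
    is_relevant=lambda s: "X" in s,
)
print(changed, "->", review.conclusions)
```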

What are the implications of Living Evidence for stakeholder engagement?
Living Evidence projects, due to their persistent nature, are great opportunities for building partnerships with stakeholders. Stakeholders tend to be energized and engaged in an innovative project that gives them, their staff, and their constituencies a tractable mechanism by which to engage with the “current state of science”. In addition, the ongoing nature of a Living Evidence project means that project partnerships are always active. Stakeholders are continually engaged in meaningful, collaborative discussions and activities around the current evidence. Finally, this ongoing, always-active nature of Living Evidence projects creates “accumulative” partnerships that gradually broaden and deepen over time.
What are the equity implications of taking a Living Evidence approach?
Living Evidence resources make the latest science available to all. Conventionally, the lack of high-quality summaries of science has meant the latest science is discovered and adopted by those closest to centers of excellence and expertise. Rapid incorporation of the latest science into Living Evidence resources—as well as the wide promotion and dissemination of that science—means that the immediate benefits of science can be shared much more broadly, contributing to equity of access to science and its benefits.
What are the implications of Living Evidence for knowledge translation?
The activities that use research outputs and evidence resources (such as Living Evidence) to change practice and policy are often referred to as “knowledge translation”. These activities are substantial and often multifaceted interventions that identify and address the complex structural, organizational, and cultural barriers that impede knowledge use. 


Living Evidence has the potential to accelerate knowledge translation: not because of any changes to the knowledge-translation enterprise, but because Living Evidence identifies earlier the high-certainty evidence that underpins knowledge-translation activities.

Living Evidence may also enhance knowledge translation in two ways. First, Living Evidence is a better evidence product and has been shown to increase trust, engagement, and intention to use among stakeholders. Second, as mentioned above, Living Evidence creates opportunities for deep and effective partnerships. Together, these advantages could position Living Evidence to yield a more effective “enabling environment” for knowledge translation.

Does Living Evidence require use of technologies like machine learning?
Technologies such as natural language processing, machine learning, and citizen science (crowdsourcing), as well as efforts to build common data structures (and create Findable, Accessible, Interoperable and Reusable (FAIR) data), are advancing alongside Living Evidence. These technologies are often described as “enablers” of Living Evidence. While such technologies are commonly used and developed in Living Evidence projects, they are not essential. Nevertheless, over the longer term, such technologies will likely be indispensable for creating sustainable systems that make sense of science.
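
As a concrete, deliberately simplified illustration of one such enabler, the sketch below uses scikit-learn to train a classifier that triages newly published citations by predicted relevance, so human screeners review the likeliest matches first. The abstracts, labels, and ranking scheme are invented for demonstration; production screening systems are considerably more sophisticated.

```python
# Illustrative sketch: machine-learning triage of citations for screening.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Abstracts previously screened by humans (1 = relevant to the review).
abstracts = [
    "randomized trial of drug X for condition Y",
    "protein folding dynamics in yeast",
    "meta-analysis of condition Y interventions",
    "survey of undergraduate teaching methods",
]
labels = [1, 0, 1, 0]

triage = make_pipeline(TfidfVectorizer(), LogisticRegression())
triage.fit(abstracts, labels)

# Rank this month's new citations by predicted relevance.
new_citations = ["pilot study of drug X in condition Y patients"]
scores = triage.predict_proba(new_citations)[:, 1]
for text, score in sorted(zip(new_citations, scores), key=lambda t: -t[1]):
    print(f"{score:.2f}  {text}")
```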

Creating a Digital Service for the Planet

Summary

Challenge and Opportunity

The Biden administration—through directives such as Executive Order 14008 on Tackling the Climate Crises at Home and Abroad and President Biden’s Memorandum on Restoring Trust in Government Through Scientific Integrity and Evidence-Based Policymaking, as well as through initiatives such as Justice40 and America the Beautiful (30×30)—has laid the blueprint for a data-driven environmental agenda. 

However, the data to advance this agenda are held and managed by multiple agencies, making them difficult to standardize, share, and use to their full potential. For example, water data are collected by 25 federal entities across 57 data platforms and 462 different data types. Permitting for wetlands, forest fuel treatments, and other important natural-resource management tasks still involves a significant amount of manual data entry, and protocols for handling relevant data vary by region or district. Staff at environmental agencies have privately noted that it can take weeks or months to receive necessary data from colleagues in other agencies, and that they have trouble knowing what data exist at other agencies. Accelerating the success and breadth of environmental initiatives requires digitized, timely, and accessible information for planning and execution of agency strategies.
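
To illustrate the kind of reconciliation work this fragmentation forces, the sketch below (in Python, using pandas) merges streamflow records from two hypothetical agency feeds that use different column names and units. Every station ID, field name, and value is invented for demonstration purposes.

```python
# Illustrative sketch: harmonizing water data from two hypothetical agency
# feeds with different schemas and units. All names and values are invented.
import pandas as pd

# "Agency A" reports flow in cubic feet per second with verbose column names.
agency_a = pd.DataFrame({
    "station_identifier": ["A-001", "A-002"],
    "observation_date": ["2022-06-01", "2022-06-01"],
    "streamflow_cfs": [152.0, 87.5],
})

# "Agency B" reports the same quantity in cubic meters per second.
agency_b = pd.DataFrame({
    "site": ["B-17", "B-42"],
    "date": ["2022-06-01", "2022-06-01"],
    "flow_m3s": [4.3, 2.1],
})

CFS_PER_M3S = 35.3147  # unit-conversion factor

# Rename columns to one shared schema and convert units before combining.
standardized = pd.concat([
    agency_a.rename(columns={
        "station_identifier": "site_id",
        "observation_date": "date",
        "streamflow_cfs": "flow_cfs",
    }),
    agency_b.rename(columns={"site": "site_id", "flow_m3s": "flow_cfs"})
            .assign(flow_cfs=lambda df: df["flow_cfs"] * CFS_PER_M3S),
], ignore_index=True)

print(standardized)
```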

The state of federal environmental data today echoes the state of public-health data in 2014, when President Obama recognized that the Department of Health and Human Services lacked the technical skill sets and capacity needed to stand up Healthcare.gov. The Obama administration responded by creating the U.S. Digital Service (USDS), which provides federal agencies with on-demand access to the technical expertise they need to design, procure, and deploy technology for the public good. Over the past eight years, USDS has developed a scalable and replicable model of working across government agencies. Projects that USDS has been involved in—like improving federal procurement and hiring processes, deploying Healthcare.gov, and modernizing administrative tasks for veterans and immigrants—have saved agencies such as the Department of Veterans Affairs millions of dollars.

But USDS lacks the specialized capacity, skills, experience, and specific directives needed to fully meet the shared digital-infrastructure needs of environmental agencies. The Climate and Economic Justice Screening Tool (CEJST) is an example of how crucial digital-service capacity is for tackling the nation’s environmental priorities, and of the need for a Digital Service for the Planet (DSP). While USDS was instrumental in getting the tool off the ground, several issues with the launch point to a lack of specialized environmental capabilities and expertise within USDS. Many known environmental-justice issues—including wildfire, drought, and flooding—were not reflected in the tool’s first iteration. In addition, the CEJST was slated for publication in July 2021, but the beta version was not released until February 2022. A DSP familiar with environmental data would have started with a stronger foundation to help anticipate and incorporate such key environmental concerns, and may have been able to deliver the tool on a tighter timeline.

There is hope in this challenge. Because many environmental programs across multiple federal agencies have overlapping data and technology needs, a centralized and dedicated team focused on addressing those needs could significantly and cost-effectively advance the capacities of environmental agencies.

Plan of Action

To best position federal agencies to meet environmental goals, the Biden administration should establish a “Digital Service for the Planet (DSP).” The DSP would build off the successes of USDS to provide support across three key areas for environmental agencies:

  1. Strategic planning and procurement. Scoping, designing, and procuring technology solutions for programmatic goals. For example, a DSP could help the Fish and Wildlife Service (FWS) accelerate updates to the National Wetlands Inventory, which are currently estimated to take 10 years and cost $20 million.
  2. Technical development. Implementing targeted technical-development activities to achieve mission-related goals in collaboration with agency staff. For example, a DSP could help improve the accessibility and utility of the many government tools that the public relies on heavily, such as the Army Corps system that tracks mitigation banks (the Regulatory In lieu fee and Bank Information Tracking System, or RIBITS).
  3. Cross-agency coordination on digital infrastructure. Facilitating data inventory and sharing, and development of the databases, tools, and technological processes that make cross-agency efforts possible. A DSP could be a helpful partner for facilitating information sharing among agencies that monitor interrelated events, environments, or problems, including droughts, wildfires, and algal blooms. 

The DSP could be established either as a new branch of USDS, or as a new and separate but parallel entity housed within the White House Office of Management and Budget. The former option would enable DSP to leverage the accumulated knowledge and existing structures of USDS. The latter option would enable DSP to be established with a more focused mandate, and would also provide a clear entry point for federal agencies seeking data and technology support specific to environmental issues.

Regardless of the organizational structure selected, DSP should include the essential elements that have helped USDS succeed—per the following recommendations.

Recommendation 1. The DSP should emulate the USDS’s staffing model and position within the Executive Office of the President (EOP).

The USDS hires employees on short-term contracts, with each contract term lasting between six months and four years. This contract-based model enables USDS to attract high-level technologists, product designers, and programmers who are interested in public service, but not necessarily committed to careers in government. USDS’s staffing model also ensures that the Service does not take over core agency capacities, but rather is deployed to design and procure tech solutions that agencies will ultimately operate in-house (i.e., without USDS involvement). USDS’s position within the EOP makes USDS an attractive place for top-level talent to work, gives staff access to high-level government officials, and enables the Service to work flexibly across agencies.

Recommendation 2. Staff the DSP with specialists who have prior experience working on environmental projects.

Working on data and technology issues within environmental contexts requires specialized skill sets and experience. For example, geospatial data and analysis are fundamental to environmental protection and conservation, but this has not been a focal point of USDS hiring. In addition, a DSP staff fluent in the vast and specific terminologies used in environmental fields (such as water management) will be better able to communicate with the many subject-matter experts and data stewards working in environmental agencies. 

Recommendation 3. Place interagency collaboration at the core of the DSP mission.

Most USDS projects focus on a single federal agency, but environmental initiatives—and the data and tech needs they present—almost always involve multiple agencies. Major national challenges, including flood-risk management, harmful algal blooms, and environmental justice, all demand an integrated approach to realize cross-agency benefits. For example, EPA-funded green stormwater infrastructure could reduce flood risk for housing units subsidized by the Department of Housing and Urban Development. DSP should be explicitly tasked with devising approaches for tackling complex data and technology issues that cut across agencies. Fulfilling this mandate may require DSP to bring on additional expertise in core competencies such as data sharing and integration.

Recommendation 4. Actively promote the DSP to relevant federal agencies.

Despite USDS’s eight-year existence, many staff members at agencies involved in environmental initiatives know little about the Service and what it can do for them. To avoid underutilization due to lack of awareness, the DSP’s launch should include an outreach campaign targeted at key agencies, including but not limited to the U.S. Army Corps of Engineers (USACE), the Department of Energy (DOE), the Department of the Interior (DOI), the Environmental Protection Agency (EPA), the National Aeronautics and Space Administration (NASA), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Department of Agriculture, and the U.S. Global Change Research Program (USGCRP).

Conclusion

A new Digital Service for the Planet could accelerate progress on environmental and natural-resource challenges through better use of data and technology. USDS has shown that a relatively small and flexible team can have a profound and lasting effect on how agencies operate, save taxpayer money, and encourage new ways of thinking about longstanding problems. However, current capacity at USDS is limited and not specifically tailored to the needs of environmental agencies. On issues ranging from water management to environmental justice, ensuring better use of technology and data will yield benefits for generations to come. Establishing a DSP is an important step toward making the federal government a better buyer, partner, and consumer of the data, technology, and innovations necessary to support the country’s conservation, water, and stewardship priorities.

Frequently Asked Questions
How would the DSP differ from the U.S. Digital Service?

The DSP would build on the successful USDS model, but would have two distinguishing characteristics. First, the DSP would employ staff experienced in using or managing environmental data and possessing special expertise in geospatial technologies, remote sensing, and other environmentally relevant tech capabilities. Second, DSP would have an explicit mandate to develop processes for tackling data and technology issues that frequently cut across agencies. For example, the Internet of Water found that at least 25 different federal entities collect water data, while the USGCRP has identified at least 217 examples of earth observation efforts spanning many agencies. USDS is not designed to work with so many agencies at once on a single project—but DSP would be.

Would establishing the DSP prohibit agencies from independently improving their data and tech practices? 

Not in most cases. The DSP would focus on meeting data and technology needs shared by multiple agencies. Agencies would still be free—and encouraged!—to pursue agency-specific data- and tech-improvement projects independently.


Indeed, a hope would be that by showcasing the value of digital services for environmental projects on a cross-agency basis, the DSP would inspire individual agencies to establish their own digital-service teams. Precedent for this evolution exists: USDS provided initial resources to solve digital challenges for Healthcare.gov and the Department of Veterans Affairs, and the Department of Veterans Affairs and the Department of Defense have since launched their own internal digital-service teams. However, even with agency-based digital-service teams, there will always be a need for a team with a cross-agency view, especially given that so many environmental problems and solutions extend well beyond the borders of a single agency. Digital-service teams at multiple levels can be complementary, focusing on different project scopes and groups of users. For example, agency-specific digital-service teams would be much better positioned to sustain agency-specific components of an effort established by DSP.

How much would this proposal cost?

We propose the DSP start with a mid-sized team of twenty to thirty full-time equivalent employees (FTEs) and a budget of approximately $8 million. These personnel and financial allocations are in line with those of USDS. DSP could be scaled up over time if needed, just as USDS grew from approximately 12 FTEs in fiscal year (FY) 2014 to over 200 FTEs in FY 2022. The long-term target size of the DSP team should be informed by the uptake and success of DSP-led work.

Why would agencies want a DSP? Why would they see it as beneficial?

From our conversations with agency staff, we (the authors) have heard time and again that agencies see immense value in a DSP, and find that two scenarios often inhibit improved adoption of environmental data and technology. The first scenario is that environmental-agency staff see the value in pursuing a technology solution to make their program more effective, but do not have the authority or resources to implement the idea, or are not aware of the avenues available to do so. DSP can help agency staff design and implement modern solutions to realize their vision and coordinate with important stakeholders to facilitate the process.


The second scenario is that environmental-agency staff are trained experts in environmental science, but not in evaluating technology solutions. As such, they are poorly equipped to evaluate the integrity of proposed solutions from external vendors. If they end up trialing a solution that is a poor fit, they may become risk-averse to technology at large. In this scenario, there is tremendous value in having a dedicated team of experts within the government available to help agencies source the appropriate technology or technologies for their programmatic goals.

CLimate Improvements through Modern Biotechnology (CLIMB) — A National Center for Bioengineering Solutions to Climate Change and Environmental Challenges

Summary

Tackling pressing environmental challenges — such as climate change, biodiversity loss, environmental toxins and pollution — requires bold, novel approaches that can act at the scale and expediency needed to stop irreversible damage. Environmental biotechnology can provide viable and effective solutions. The America COMPETES Act, if passed, would establish a National Engineering Biology Research and Development Initiative. To lead the way in innovative environmental protection, a center should be created within this initiative that focuses on applying biotechnology and bioengineering to environmental challenges. The CLimate Improvements through Modern Biotechnology (CLIMB) Center will fast-track our nation’s ability to meet domestic and international decarbonization goals, remediate contaminated habitats, detect toxins and pathogens, and deliver on environmental-justice goals. 

The CLIMB Center would (i) provide competitive grant funding across three key tracks — bioremediation, biomonitoring, and carbon capture — to catalyze comprehensive environmental biotechnology research; (ii) house a bioethics council to develop and update guidelines for safe, equitable environmental biotechnology use; (iii) manage testbeds to efficiently prototype environmental biotechnology solutions; and (iv) facilitate public-private partnerships to help transition solutions from prototype to commercial scale. Investing in the development of environmental biotechnology through the CLIMB Center will overall advance U.S. leadership on biotechnology and environmental stewardship, while helping the Biden-Harris Administration deliver on its climate and environmental-justice goals. 

Challenge and Opportunity

The rapidly advancing field of biotechnology has considerable potential to aid the fight against climate change and other pressing environmental challenges. Fast and inexpensive genetic sequencing of bacterial populations, for instance, allows researchers to identify genes that enable microorganisms to degrade pollutants and synthesize toxins. Existing tools like CRISPR, as well as up-and-coming techniques such as retron-library recombineering, allow researchers to effectively design microorganisms that can break down pollutants more efficiently or capture more carbon. Biotechnology as a sector has been growing rapidly over the past two decades, with the global market value estimated to be worth nearly $3.5 trillion by 2030. These and numerous other biotechnological advances are already being used to transform sectors like medicine (which comprises nearly 50% of the biotechnology sector), but have to date been underutilized in the fight for a more sustainable world. 

One reason why biotechnology and bioengineering approaches have not been widely applied to advance climate and environmental goals is that returns on investment are too uncertain, too delayed, or too small to motivate private capital — even if solving pressing environmental issues through biotechnology would deliver massive societal benefits. The federal government can act to address this market failure by creating a designated environmental-biotechnology research center as part of the National Engineering Biology Research and Development Initiative (America COMPETES Act, Sec. 10403). Doing so will help the Biden-Harris Administration achieve its ambitious targets for climate action and environmental justice.

Plan of Action

The America COMPETES Act would establish a National Engineering Biology Research and Development Initiative “to establish new research directions and technology goals, improve interagency coordination and planning processes, drive technology transfer to the private sector, and help ensure optimal returns on the Federal investment.” The Initiative is set to be funded through agency contributions and White House Office of Science and Technology Policy (OSTP) budget requests. The America COMPETES Act also calls for creation of undesignated research centers within the Initiative. We propose creating such a center focused on environmental-biotechnology research: the CLimate Improvements through Modern Biotechnology (CLIMB) Center. The Center would be housed under the new National Science Foundation (NSF) Directorate for Technology, Innovation and Partnerships and co-led by the NSF Directorate for Biological Sciences. The Center would take a multipronged approach to supporting biotechnological and bioengineering solutions to environmental and climate challenges and rapid technology deployment. 

We propose the Center be funded with an initial commitment of $60 million, with continuing funds of $300 million over five years. The main contributing federal agencies and research offices would be determined by OSTP, but should at minimum include NSF; the Departments of Agriculture, Defense, and Energy (USDA, DOD, and DOE); the Environmental Protection Agency (EPA); the National Oceanic and Atmospheric Administration (NOAA); and the U.S. Geological Survey (USGS).

Specifically, the CLIMB Center would: 

  1. Provide competitive grant funding across three key tracks — bioremediation, biomonitoring, and carbon capture — to catalyze comprehensive environmental-biotechnology research.
  2. House a bioethics council to develop and update guidelines for safe, equitable environmental-biotechnology use.
  3. Manage testbeds to efficiently prototype environmental-biotechnology solutions. 
  4. Facilitate public-private partnerships to help transition solutions from prototype to commercial scale.

More detail on each of these components is provided below.

Component 1: Provide competitive grant funding across key tracks to catalyze comprehensive environmental biotechnology research.

The CLIMB Center will competitively fund research proposals related to (i) bioremediation, (ii) biomonitoring, and (iii) carbon capture. These three research tracks were chosen to span the full range of approaches to environmental problems, from prevention and monitoring to large-scale remediation. Within these tracks, the Center’s research portfolio will span the entire technology-development pathway, from early-stage research to market-ready applications.

Track 1: Bioremediation

Environmental pollutants are detrimental to ecosystems and human health. While the Biden-Harris Administration has taken strides to prevent the release of pollutants such as per- and polyfluoroalkyl substances (PFAS), many pollutants that have already been released into the environment persist for years or even decades. Bioremediation is the use of biological processes to degrade contaminants within the environment. It is either done within a contaminated site (in-situ bioremediation) or away from it (ex-situ). Traditional in-situ bioremediation is primarily accomplished by bioaugmentation (addition of pollutant-degrading microbes) or by biostimulation (supplying oxygen or nutrients to stimulate the growth of pollutant-degrading microbes that are already present). While these approaches work, they are costly, time-consuming, and cannot be done at large spatial scales. 

Environmental biotechnology can enhance the ability of microbes to degrade contaminants quickly and at scale. Environmental-biotechnology approaches have produced bacteria that are better able to break down toxic chemicals, decompose plastic waste, and process wastewater. But the potential of environmental biotechnology to improve bioremediation is still largely untapped, as the technologies and regulatory regimes needed for widespread use are still being developed. CLIMB Center research grants could support the early discovery phase to identify more gene targets for bioremediation, as well as efforts to test more developed bioremediation technologies for scalability.

Track 2: Biomonitoring

Optimizing responses to environmental challenges requires collecting data on pollutant levels, toxin prevalence, the spread of invasive species, and much more. Conventional approaches to environmental monitoring (like mass spectrometry or DNA amplification) require specialized equipment, are low-throughput, and need highly trained personnel. In contrast, biosensors—devices that use biological molecules to detect compounds of interest—provide rapid, cost-effective, and user-friendly alternatives for measuring materials of interest. Because of these characteristics, biosensors enable users to sample more frequently and across larger spatial scales, resulting in more accurate datasets and enhancing our ability to respond. Detection of DNA or RNA is key for identifying pathogens, invasive species, and toxin-producing organisms. Standard DNA- and RNA-detection techniques like polymerase chain reaction (PCR) require specialized equipment and are slow. By contrast, biosensors detect minuscule amounts of DNA and RNA in minutes (rather than hours) and without the need for DNA/RNA amplification. SHERLOCK and DETECTR are two examples of highly successful, marketed tools used for diagnostic applications such as detecting SARS-CoV-2 and for ecological purposes such as distinguishing invasive fish species from similar-looking native species. Moving forward, these technologies could be repurposed for other environmental applications, such as monitoring for the presence of algal toxins in water used for drinking, recreation, agriculture, or aquaculture. Furthermore, while existing biosensors can detect DNA and RNA, detecting compounds like pesticides, DNA-damaging compounds, and heavy metals requires a different class of biosensor. CLIMB Center research grants could support development of new biosensors as well as modification of existing biomonitoring tools for new applications.

Track 3: Carbon capture

Rising atmospheric levels of greenhouse gases like carbon dioxide are driving irreversible climate change. The problem has become so severe that merely reducing future emissions is no longer sufficient—limiting average global warming to below 2°C by 2100 will require achieving negative emissions through capture and removal of atmospheric carbon. A number of carbon-capture approaches are currently being developed. These range from engineered approaches such as direct air capture, chemical weathering, and geologic sequestration to biological approaches such as reforestation, soil amendment, algal cultivation, and ocean fertilization.

Environmental-biotechnology approaches such as synthetic biology (“designed biology”) can vastly increase the amount of carbon that could be captured by natural processes. For instance, plants and crops can be engineered to produce larger root biomass that sequesters more carbon into the soil, or to store more carbon in harder-to-break-down molecules such as lignin, suberin, or sporopollenin instead of more easily metabolized sugars and cellulose. Alternatively, carbon-capture efficiency can be improved by modifying enzymes in the photosynthetic pathway or limiting photorespiration through synthetic biology. Microalgae in particular hold great promise for enhanced carbon capture: they can be bioengineered not only to capture more carbon but also to produce a greater density of lipids that can be used for biofuel. The potential for synthetic biology and other environmental-biotechnology approaches to enhance carbon capture is vast, largely unexplored, and certainly undercommercialized. CLIMB Center research grants could propel such approaches quickly. 

Component 2: House a bioethics council to develop and update guidelines for safe, equitable environmental-biotechnology use.

The ethical, ecological, and social implications of environmental biotechnology must be carefully considered and proactively addressed to avoid unintended damage and to ensure that benefits are distributed equitably. As such, the CLIMB Center should assemble a bioethics council comprising representatives from relevant stakeholder groups.

The bioethics council will identify key ethical and equity issues surrounding emerging environmental biotechnologies. The council will then develop guidelines to ensure transparency of research to the public, engagement of key stakeholders, and safe and equitable technology deployment. These guidelines will ensure that there is a framework for the use of field-ready environmental-biotechnology devices, and that risk assessment is built consistently into regulatory-approval processes. The council’s findings and guidelines will be reported to the National Engineering Biology Research and Development Initiative’s interagency governance committee, which will work with federal and state regulatory agencies to incorporate guidance and streamline regulation and oversight of environmental-biotechnology products.

Component 3. Manage testbeds to efficiently prototype environmental-biotechnology solutions. 

The “valley of death” separating early research and prototyping from commercialization is a well-known bottleneck hampering innovation. This bottleneck could certainly inhibit innovation in environmental biotechnology, given that environmental-biotechnology tools are often intended for use in complex natural environments that are difficult to replicate in a lab. The CLIMB Center should serve as a centralized node connecting researchers with testing facilities and test sites where environmental biotechnologies can be properly validated and risk-assessed. Numerous federal facilities could be leveraged as environmental-biotechnology testbeds.

The CLIMB Center could also work with industry, state, and local partners to establish other environmental-biotechnology testbeds. Access to these testbeds could be provided to researchers and technology developers as follow-on opportunities to CLIMB Center research grants and/or through stand-alone testing programs managed by the CLIMB Center. 

Component 4: Facilitate public-private partnerships to help transition solutions from prototype to commercial scale.

Public-private partnerships have been highly successful in advancing biotechnology for medicine. Operation Warp Speed, to cite one recent and salient example, enabled research, development, testing, and distribution of vaccines against SARS-CoV-2 at unprecedented speeds. Public-private partnerships could play a similarly key role in advancing the efficient deployment of market-ready environmental-biotechnology devices. To this end, the CLIMB Center can reduce barriers to negotiating partnerships between environmental engineers and biotechnology manufacturers. For example, the CLIMB Center can develop templates for Memoranda of Understanding (MOUs) and collaborative research agreements to facilitate the initial establishment of partnerships, as well as help connect interested parties. The CLIMB Center could also facilitate access for both smaller companies and researchers to existing government infrastructure necessary to deploy these technologies. For example, an established public-private partnership team could have access to government-managed gene and protein libraries, microbial strain collections, sequencing platforms, computing power, and other specialized equipment. The Center could further negotiate with companies to identify resources (equipment, safety data, and access to employee experts) they are willing to provide. Finally, the Center could identify and fast-track opportunities where the federal government would be uniquely suited to serve as an end user of biotechnology products. For instance, in the bioremediation space, the EPA’s purview over management and cleanup of Superfund sites would benefit immensely from novel, safe, and effective tools to quickly address pollution and restore habitats.

Conclusion

Environmental and climate challenges are some of the most pressing problems facing society today. Fortunately, advances in biotechnology that enable manipulation, acceleration, and improvement of natural processes offer powerful tools to tackle these challenges. The federal government can accelerate capabilities and applications of environmental biotechnology by establishing the CLimate Improvements through Modern Biotechnology (CLIMB) Center. This center, established as part of the National Engineering Biology Research and Development Initiative, will be dedicated to advancing research, development, and commercialization of environmental biotechnology. CLIMB Center research grants will focus on advances in bioremediation, biomonitoring, and biologically assisted carbon capture, while other CLIMB Center activities will scale and commercialize emerging environmental biotechnologies safely, responsibly, and equitably. Overall, the CLIMB Center will further solidify U.S. leadership in biotechnology while helping the Biden-Harris Administration meet its ambitious climate, energy, and environmental-justice goals. 

Frequently Asked Questions
Why should the federal government take the lead in environmental biotechnology solutions?

Environmental biotechnology can help address wide-reaching, interdisciplinary issues with huge benefits for society. Many of the applications for environmental biotechnology are within realms where the federal government is an interested or responsible party. For instance, bioremediation largely falls within governmental purview. Creating regulatory guidelines in parallel with the development of these new technologies will enable an expedited rollout. Furthermore, environmental-biotechnology approaches are still novel, and using them at wide scale in natural environments will require careful handling, testing, and regulation to prevent unintended harm. Here again, the federal government can play a key role in helping validate and test technologies before they are approved for wide-scale use.


Finally, the largest benefits from environmental biotechnology will be societal. The development of such technology should hence be largely driven by its potential to improve environmental quality and address environmental injustices, even if these are not profitable. As such, federal investments are better suited than private investments to help develop and scale these technologies, especially during early stages when returns are too small, too uncertain, and too future-oriented.

How do we mitigate security risks of bioengineered products?

Bioengineered products already exist and are in use, and bioengineering innovations and technology will continue to grow over the next century. Rather than forgo developing these tools and lag behind other nations that will continue to do so, it is better to develop a robust regulatory framework that addresses the critical ethical and safety concerns surrounding their use. Importantly, each bioengineered product will present its own set of risks and challenges. For instance, a bacterial species that has been genetically engineered to metabolize a toxin is very different from an enzyme or DNA probe that could be used as a biosensor. The bacteria are living, can reproduce, and can impact other organisms around them, especially when released into the environment. In contrast, the biosensor probe would contain biological parts (not a living organism) and would only exist in a device. It is thus critical to ensure that every biotechnology product, with its unique characteristics, is properly tested, validated, and designed to minimize its environmental impact and maximize societal benefits. The CLIMB Center will greatly enhance the safety of environmental-biotechnology products by facilitating access to testbeds and the scientific infrastructure necessary to quantify these risk-benefit trade-offs.

How would the CLIMB Center address the Biden-Harris Administration’s goals for environmental justice?

The Biden-Harris Administration has recognized the vast disparity in environmental quality and exposure to contaminants that exists across communities in the United States. Communities of color are more likely to be exposed to environmental hazards and bear the burden of climate change-related events. For example, the closer the distance to a Superfund site—a site deemed contaminated enough to warrant federal oversight—the higher the proportion of Black and the lower the proportion of White families. To address these disparities, the Administration issued Executive Order 14008 to advance environmental justice efforts. Through this order, President Biden created an Environmental Justice Advisory Council and launched the Justice40 initiative, which mandates that 40% of the benefits from climate investments be delivered to underserved communities. The Justice40 initiative includes priorities such as the “remediation and reduction of legacy pollution, and the development of critical clean water infrastructure.” The Executive Order also calls for the creation of a “community notification program to monitor and provide real-time data to the public on current environmental pollution…in frontline and fenceline communities — places with the most significant exposure to such pollution.” Environmental biotechnology offers an incredible opportunity to advance these goals by enhancing water treatment and bioremediation and enabling rapid and efficient monitoring of environmental contaminants.

How would the CLIMB Center address the Biden-Harris Administration’s goals for climate change?

President Biden has set targets for a 50–52% reduction (relative to 2005 levels) in net greenhouse-gas pollution by the year 2030, and has directed federal government operations to reach 100% carbon-pollution-free electricity by 2030 (Executive Order 14057). It is well established that meeting such climate goals and limiting global warming to less than 2°C will require negative emissions technologies (carbon capture) in addition to reducing the amount of emissions created by energy and other sectors. Carbon-capture technologies will need to be widely available, cost-effective, and scalable. Environmental biotechnology can help address these needs by enhancing our capacity for biological carbon capture through the use of organisms such as microalgae and macroalgae, which can even serve the dual role of producing biofuels, feedstock, and other products in a carbon-neutral or carbon-negative way. The CLIMB Center can establish the United States as the global leader in advancing both biotechnology and the many untapped environmental and climate solutions it can offer.

What are the current federal funding mechanisms available for the research and development of bioengineered environmental solutions?

There are multiple avenues for funding foundational research and development in bioengineering. Federal agencies and offices that currently fund bioengineering with an environmental focus include (but are not necessarily limited to):

  • DOE’s Office of Science’s various research programs, ARPA-E, and DOE’s Bioenergy Technologies Office
  • EPA’s Office of Research and Development, Science to Achieve Results (STAR) Program
  • The National Science Foundation’s Biological Sciences and Engineering Directorates
  • USDA’s National Institute of Food and Agriculture, Biotechnology Risk Assessment Research Grants Program
  • NOAA’s Office of Ocean Exploration and Research
  • NASA’s Space Technology Mission Directorate
  • The National Institutes of Health’s National Institute of Environmental Health Sciences (NIEHS) and National Institute of Biomedical Imaging and Bioengineering (NIBIB)
  • DOD’s DARPA Biological Technologies Office

Research funding provided by these offices often includes a biomedical focus. The research and development funding provided by the CLIMB Center would seek to build upon these efforts and help coordinate directed research towards environmental-biotechnology applications.

How could biosensors inform management and policy decisions?

Compared to conventional analytical techniques, biosensors are fast, cost-effective, easy to use, and largely portable. However, biosensors are not always poised to take over from conventional techniques. In many cases, regulatory bodies have approved analytical techniques that can be used for compliance. Novel biosensors are rarely included in the suite of approved techniques, even though biosensors can complement conventional techniques—such as by allowing regulators to rapidly screen more samples to prioritize which require further processing using approved conventional methods. Moreover, conventional methods can only provide snapshot measurements, potentially missing critical time periods during which toxins, contaminants, or pathogens go unnoticed. Biosensors, on the other hand, could be used to continuously monitor a given area. For example, algae can accumulate (bloom) and produce potent toxins that accumulate in seafood. To protect human health, seafood is tested using analytical chemical approaches (direct measurement of toxins) or biological assays (health monitoring in exposed laboratory animals). This requires regulators to decide when it is best to sample. However, if a biosensor were deployed in a monitoring array out in the ocean or made available to people who collect the seafood, it could serve as an early-detection system for the presence of these toxins. This application will become especially important moving forward, since climate change has altered the geographic distribution and seasonality of these algal blooms, making it harder to forecast when it is best to measure seawater and seafood for these toxins.
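
The early-warning pattern described above can be summarized in a few lines of code. The snippet below, a minimal illustration in Python, scans a stream of hypothetical biosensor readings and flags any value crossing an action level; the threshold, units, and readings are invented for illustration, and a real deployment would trigger confirmatory sampling with an approved conventional method rather than simply printing a message.

```python
# Minimal sketch of biosensor-based early warning (values are invented).
from typing import Iterable

ACTION_LEVEL_UG_PER_L = 0.8  # hypothetical toxin action level


def monitor(readings: Iterable[float]) -> None:
    """Scan a stream of biosensor readings and flag exceedances."""
    for hour, value in enumerate(readings):
        if value >= ACTION_LEVEL_UG_PER_L:
            # In practice, an exceedance would prompt confirmatory sampling
            # with an approved conventional method, per the complementary
            # role described above.
            print(f"hour {hour}: {value:.2f} ug/L exceeds action level; "
                  "prioritize confirmatory lab analysis")
        else:
            print(f"hour {hour}: {value:.2f} ug/L within limits")


monitor([0.12, 0.35, 0.95, 0.41])
```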

How do we ensure that benefits from environmental biotechnologies extend equitably to historically excluded populations?

Communities of color are more likely to live near Superfund sites, be disproportionately exposed to pollutants, and bear the heaviest burdens from the effects of climate change. These communities have also been disproportionately affected by unethical environmental and medical-research practices. It is imperative that novel tools designed to improve environmental outcomes benefit these communities and do not cause unintended harm. Guidelines established by the CLIMB Center’s bioethics council coupled with evaluation of environmental biotechnologies in realistic testbeds will help ensure that this is the case.

Putting Redlines in the Green: Economic Revitalization Through Innovative Neighborhood Markets

Summary

The systemic effects of past redlining in more than 200 U.S. cities continue to persist. Redlining was a 20th-century policy that explicitly denied Black Americans the opportunity to secure federal mortgage loans and, with them, future wealth. Adverse impacts of redlining not only reduce quality of life for communities of color and low-income communities, but also have spillover effects that cost taxpayers upwards of $308 million per year.

The Biden-Harris administration can combat the impacts of redlining through a new place-based program called “Putting Redlines in the Green”. Through this program, the federal government would repurpose a fraction of its thousands of excess and underutilized properties as rent-free or rent-subsidized sites for Innovative Neighborhood Markets (INMs): multipurpose, community-operated spaces that serve as grocery-delivery hubs, house culturally significant businesses, and support local entrepreneurs in historically redlined areas. While recent federal initiatives (such as the Opportunity Zone and Promise Zone programs) have sought to stimulate development in economically distressed communities through top-down grants and tax incentives, “Putting Redlines in the Green” will give historically redlined communities access to a key asset—real estate—needed to spur revitalization from the bottom up.

Challenge and Opportunity

The term “redlining” derives from racially discriminatory practices carried out by government homeownership programs in the 1930s. The pernicious systemic effects of historical redlining continue to be felt today. Historically redlined areas, for instance, possess less urban-forest cover (and thus suffer from higher summer temperatures and greater pollution), experience poorer health outcomes and decreased earning potential, and are exploited by predatory lending practices that make it nearly impossible to rebuild wealth. Historic redlining can also be linked directly to the prevalence and distribution of “food deserts” and “food apartheid” in U.S. cities.

In 2021, the Department of Justice (DOJ)—in collaboration with the Consumer Financial Protection Bureau (CFPB) and the Office of the Comptroller of the Currency (OCC)—launched the Combating Redlining Initiative to ensure equal credit opportunity for communities of color. While laudable, this effort seeks to forestall future instances of redlining rather than to combat inequities associated with redlining in the past. Recent federal initiatives—such as the Trump-era Opportunity Zone program, the Obama-era Promise Zone program, the Bush II-era Renewal Community program, and the Clinton-era Empowerment Zone program—have aimed to spur revitalization in economically distressed communities, including historically redlined communities, through grants and/or tax incentives. The success of this approach has proven mixed at best. Opportunity Zones, for instance, have been criticized for subsidizing gentrification and funneling benefits to wealthy private investors. Community leaders in designated Promise Zones have struggled to productively integrate federal grants into comprehensive, synergistic initiatives. Finally, the pattern of different administrations layering similar programs on top of each other has created confusion and lack of sustained buy-in among stakeholders. It is time for a new approach. The Plan of Action below describes a new vision for federal investment in historically redlined areas: one that relies on repurposing federal assets to empower community-driven enterprises.

Plan of Action

The Biden-Harris administration should launch “Putting Redlines in the Green”, a new, interagency, place-based program to combat the inequities of historical redlining. Historically redlined communities suffer from underinvestment and inequitable access to capital. Through “Putting Redlines in the Green”, excess and underutilized (E&U) federal properties in historically redlined communities would be repurposed as rent-free or rent-subsidized sites for Innovative Neighborhood Markets (INMs). INMs are envisioned as multipurpose, community-operated spaces designed to spur revitalization from the bottom up by combining elements of farmers’ markets, community banks, and business improvement districts (BIDs). For instance, INMs could provide hubs for farm-to-market grocery-delivery services (see Activity 5, below), house culturally significant businesses threatened by the impacts of gentrification and the COVID-19 pandemic, and give local entrepreneurs the retail and co-working space needed to launch and grow new ventures. 

A stepwise plan of action for the program is outlined below.

Activity 1. Assemble an interagency task force to define program targets and criteria.

The Department of Housing and Urban Development (HUD)’s Office of Community Planning and Development is well-placed to identify redlined communities where INMs could deliver especially large impacts. The Environmental Protection Agency (EPA)’s Office of Community Revitalization (OCR) is already experienced in supporting locally led, community-driven efforts to protect the environment, expand economic opportunity, and revitalize neighborhoods. These two offices should jointly assemble and chair a task force comprising representatives from relevant federal agencies (e.g., the Departments of Agriculture, Commerce, and Justice (USDA, DOC, and DOJ); the General Services Administration (GSA)) and external stakeholder groups (e.g., civic groups, environmental-justice organizations, fair-housing experts). The task force would lay the foundation for “Putting Redlines in the Green” by (i) developing a map of priority areas for INM investment, (ii) developing criteria for evaluating candidate INM sites, and (iii) engaging prospective funders and community groups.

Activity 2. Conduct a review to identify E&U federal properties that could be repurposed as INM sites.

The portfolio of federally owned real property in the United States includes thousands of E&U properties. While the number of E&U properties catalogued in the Federal Real Property Profile (FRPP) fluctuates from year to year (due to changes in government operations, acquisition and disposal of various properties, and inconsistencies in data reporting, among other factors), the pandemic induced a notable spike, with approximately 15,000 E&U properties catalogued in FY 2020 (Figure 1). With virtual and hybrid work now firmly embedded across the federal government even as the acute phase of the pandemic has ended, it is likely that a significant fraction of these properties will not return to full utilization. With maintenance of E&U federal properties costing taxpayers tens of millions of dollars annually, there is a timely opportunity to augment ongoing processes for federal property reallocation.

Figure 1. Changes in federal property utilization from 2019 (top) to 2020 (bottom). Source: Federal Real Property Profile Summary Data Set.

The task force should work with the GSA to review the federal government’s inventory of excess and underutilized properties to identify sites that could be repurposed as INMs. The goal of this review would be to generate a list of 10–15 sites for near-term repurposing and investment to pilot the INM concept, as well as a longer list of additional candidate sites that could be considered for INMs in the future. A first step for the review would be to crosswalk the E&U properties logged in the FRPP database with the map of priority areas developed in Activity 1. E&U properties located in priority areas should then be downselected by building type. For instance, E&U hospital and lab buildings, as likely poor candidate INM sites, could be excluded, while E&U housing, office, and warehouse space could be retained. Next, the remaining candidate sites should be screened against the criteria developed in Activity 1. This stage would also be an appropriate time to identify and eliminate highly problematic candidate sites: for instance, sites that are in badly deteriorated condition or that have already proven uniquely difficult to repurpose. Finally, the task force should prioritize the final list of candidate sites for investment. Prioritization should consider factors such as geographic location (striving to achieve an equitable distribution of INMs nationwide) and buy-in from funders and community groups engaged as part of Activity 1. A sketch of this downselection logic appears below.
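The following is a minimal sketch of how the Activity 2 downselection might be scripted against an export of the FRPP inventory. All file and column names (e.g., frpp_export.csv, in_priority_area, community_buy_in_score) are hypothetical placeholders, since the actual FRPP schema differs; the sketch only illustrates the filter–screen–prioritize sequence described above.

```python
# Hypothetical sketch of the Activity 2 downselection sequence.
# Column and file names are illustrative, not the real FRPP schema.
import pandas as pd

RETAINED_TYPES = {"housing", "office", "warehouse"}  # likely viable INM sites

frpp = pd.read_csv("frpp_export.csv")  # hypothetical export of the FRPP

candidates = frpp[
    frpp["status"].isin({"excess", "underutilized"})
    & frpp["in_priority_area"]                   # crosswalk with Activity 1 map
    & frpp["building_type"].isin(RETAINED_TYPES)
    & ~frpp["badly_deteriorated"]                # screen out problem sites
]

# Prioritize with an eye to equitable geographic distribution:
# take the highest-scoring candidates within each region.
shortlist = (
    candidates.sort_values("community_buy_in_score", ascending=False)
              .groupby("region")
              .head(3)
)
print(shortlist.head(15))  # target: 10-15 pilot sites
```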

Activity 3. Pilot the INM model in an initial 10–15 sites. 

HUD and EPA should lead on repurposing the 10–15 sites identified in Activity 2 into a network of INMs distributed across historically redlined communities nationwide. This process will involve (i) acquiring ownership of the sites; (ii) acquiring necessary permits; (iii) performing requisite site inspections and remediation; (iv) performing requisite construction and demolition needed to transform the sites into usable INM spaces; (v) establishing site-specific governance structures; and (vi) soliciting, selecting, and following through on locally led business proposals for the INMs. HUD and EPA should strive to have the initial suite of INMs operational within three years of program launch, and the federal government should allocate $1 million per site to achieve this goal. Funding could come from the bipartisan Infrastructure Investment and Jobs Act (specifically, through the Act’s $1.5 billion RAISE grant program), the Justice40 initiative, and/or from already-existing allocations at HUD, EPA, and partner federal agencies for activities related to economic development, community revitalization, and business/entrepreneurship. Funding could be leveraged with matching funds and/or in-kind support from philanthropies, nonprofits, local governments, and community organizations.

Activity 4. Ensure that E&U federal properties that become available in the future are systematically evaluated for suitability as INM sites.

Federal law governs the disposal process for properties no longer needed by federal agencies to carry out their program responsibilities. The first step in this process is for GSA to offer “excess property to other federal agencies that may have a program need for it.” The task force should work with GSA to ensure that the “Putting Redlines in the Green” program is incorporated into the federal-agency stage of this process. The task force should also develop internal processes for efficiently evaluating newly available E&U properties as candidate sites for INMs. These internal processes would likely mirror the steps of the larger review conducted in Activity 2.

Activity 5. Launch an INM-centered “farm to neighborhood” model of grocery delivery.

To combat the specific issue of “food apartheid” in historically redlined communities, USDA’s Office of the Assistant Secretary for Civil Rights (OASCR) should spearhead creation of an INM-centered “farm to neighborhood” model (F2NM) of grocery delivery. In the F2NM, federal agencies would partner with local government and non-governmental organizations to support community gardens and nearby (within a defined radius) farms. Support, which could come in the form of subsidized crop insurance or equipment grants, would be provided to community gardeners and farmers in exchange for pledges to sell produced crops and other foods (e.g., eggs and meat) at INMs. USDA and EPA could also consider subsidizing distributors to sell key foodstuffs that cannot be produced locally (e.g., due to agricultural or logistical limitations) at affordable prices at INMs. Finally, USDA and EPA could consider working with local partners (e.g., the Detroit Black Community Food Security Network; the Center for Environmental Farming Systems [CEFS]’s Committee on Racial Equity in the Food System) to launch meal-kit services that provide community subscribers with INM-sourced ingredients and accompanying recipes. Such services will expand access to locally produced food while promoting healthier lifestyles.

Conclusion

The 11 million+ Americans who currently live in historically redlined areas deserve attention from policymakers. Historic redlining perpetuates food deserts, lead exposure, discriminatory practices, and other adversities, and encourages predatory markets. 

Implementation of “Putting Redlines in the Green” will empower historically redlined areas through profit-driven, self-sustaining community enterprises (INMs). “Putting Redlines in the Green” would also reinforce the Combating Redlining Initiative in ensuring that historically redlined neighborhoods receive “fair and equal access” to the lending opportunities that are—and always have been—available to non-redlined, and majority-White, neighborhoods. Ultimately, transforming excess and underutilized federal properties into INMs will strengthen urban sustainability, reduce taxpayer burdens, and promote restorative, economic, and environmental justice. “Putting Redlines in the Green” will therefore not only provide restitution for historically redlined communities, but will enfranchise the people and revitalize the place. 

Frequently Asked Questions
What does the federal government do with its excess and underutilized (E&U) properties now?

The figure below, created by the GSA, diagrams the disposal process. Generally speaking, E&U federal properties are first assessed for possible public purposes, then made available to private individuals and companies by competitive bid. Note that not every E&U federal property goes through every step of the process illustrated below.

Has anything like “Putting Redlines in the Green” been tried before?

Community organizations such as the Oakland Community Land Trust (CLT) in California and the Dudley Street Neighborhood Initiative (DSNI) in Boston, MA have revitalized their once economically distressed communities from the bottom up. Even initiatives such as the Wynwood Business Improvement District (BID) in Miami, which became susceptible to extreme gentrification following the recent removal of its Arts & Entertainment district status, succeeded in economically revitalizing an area once heralded as the “Crime Center of Miami.” However, no urban policy has yet attempted to recreate the success of these localized initiatives within distressed areas across the United States, and no governmental effort has attempted to achieve urban revitalization of distressed areas through a framework of financial empowerment, community autonomy, and community-owned enterprise. “Putting Redlines in the Green” is the first to amalgamate the best elements of community-driven initiatives like those cited above and convert them into implementable urban policy.

Could “Putting Redlines in the Green” spur gentrification? How would it ensure that the INMs it creates remain community-based and -oriented?

Gentrification occurs when new development in an area displaces current residents and businesses within that area through economic pressures (such as rising rents, mortgages, and property taxes). Gentrification requires urban revitalization, but urban revitalization does not inevitably lead to gentrification. “Putting Redlines in the Green” would promote “development without displacement.” To ensure that Innovative Neighborhood Markets (INMs) remain community-based and -oriented leading up to and after their launch, “Putting Redlines in the Green” would empower residents through a community-governance structure that controls development, creates economic opportunity, and vastly mitigates the likelihood of gentrification. The Dudley Street Neighborhood Initiative (DSNI) is one example of such a governance structure that has succeeded.

How will “Putting Redlines in the Green” establish relationships with and attract buy-in from funders?

History suggests that the creation of community enterprise within areas susceptible to gentrification (i.e., historically redlined neighborhoods) will systematically attract buy-in. As some economists, scholars, and historians have postulated since the early 1900s, gentrification is a consumer cycle heavily driven by the movement of money (usually in the form of affluent individuals seeking the newest housing stock) into areas nearing the end of their economic life. Thus, the new development associated with INMs will likely attract funders and buy-in from external parties.

Is there competition within the disposal process that could make procuring sites for INMs difficult?

According to the U.S. General Services Administration (GSA)’s Office of Real Property Utilization and Disposal (ORPUD), most excess property does not get transferred between the 34 federal agencies due to the “specificity” of the buildings. Thus, there is limited interagency competition for disposed government property. In fact, most E&U federal properties move on to the surplus-property stage, where they may be acquired by state and local governments (i.e., “public benefit conveyance”).


At the public benefit conveyance stage, there are currently 12 legislative actions that grant special consideration for transfer or conveyance of surplus real and related personal property to state governments, local governments, and certain nonprofits at up to a 100% discount for public benefit use. Because these conveyances create substantial competition for properties once they reach the surplus stage, it is preferable that E&U sites for INMs be acquired during the federal stage of the disposal process.

What are some examples of regional partners that would support “Putting Redlines in the Green”? What roles would regional partners play within INMs?

Regional partners could include nonprofits (e.g., the Center for Environmental Farming Systems [CEFS]’s Curriculum on Racial Equity [CORE], which could advise on best practices for expanding access to locally produced food while promoting healthier lifestyles) and private-sector entities (e.g., Community Development Financial Institutions [CDFIs], which could advise on how to help local entrepreneurs achieve their financial goals and how INMs can support business development by leveraging legislation like the Community Reinvestment Act of 1977). Regardless of size or sector, the role of regional partners would be to empower the communities participating in “Putting Redlines in the Green” as they help shape, launch, and maintain INMs.

How does “Putting Redlines in the Green” differ from existing economic-development programs, such as EPA’s Smart Growth Program? What about economic-revitalization efforts launched under previous administrations, such as Opportunity Zones or Promise Zones?

“Putting Redlines in the Green” could be accurately described as a specialized smart-growth technical-assistance program that specifically addresses sustainable development in redlined communities. “Putting Redlines in the Green” could also be accurately described as an economic-revitalization effort. But while other federally sponsored economic-development and -revitalization programs have relied heavily on top-down grants and tax incentives, “Putting Redlines in the Green” will take a bottom-up approach based on community-led transformation of excess and underutilized federal properties into vibrant, locally grounded business enterprises.

Establishing the AYA Research Institute: Increasing Data Capacity and Community Engagement for Environmental-Justice Tools

Summary

Environmental justice (EJ) is a priority issue for the Biden Administration, yet the federal government lacks the capacity to collect and maintain the data needed to adequately identify and respond to EJ issues. EJ tools meant to resolve these issues — especially the Environmental Protection Agency (EPA)’s EJSCREEN tool — are gaining national recognition. But knowledge gaps and a dearth of EJ-trained scientists are preventing EJSCREEN from reaching its full potential. To address these issues, the Administration should allocate a portion of the EPA’s Justice40 funding to create the “AYA Research Institute”, a think tank under EPA’s jurisdiction. AYA is the name of an Adinkra symbol meaning “resourcefulness and defiance against oppression.” The AYA Research Institute will functionally address EJSCREEN’s limitations as well as increase federal capacity to identify and effectively resolve existing and future EJ issues.

Challenge and Opportunity

Approximately 200,000 people in the United States die every year of pollution-related causes. These deaths are concentrated in underresourced, vulnerable, and/or minority communities. The EPA created the Office of Environmental Justice (OEJ) in 1992 to address systematic disparities in environmental outcomes among different communities. The primary tool that OEJ relies on to consider and address EJ concerns is EJSCREEN. EJSCREEN integrates a variety of environmental and demographic data into a layered map that identifies communities disproportionately impacted by environmental harms. This tool is available for public use and is the primary screening mechanism for many initiatives at state and local levels. Unfortunately, EJSCREEN has three major limitations:

  1. Missing indicators. EJSCREEN omits crucial environmental indicators such as drinking-water quality and indoor air quality. OEJ states that these crucial indicators are not included due to a lack of resources available to collect underlying data at the appropriate quality, spatial range, and resolution. 
  2. Lower accuracy for small areas. There is considerable uncertainty in EJSCREEN’s environmental and demographic estimates at the census block group (CBG) level. This is because (i) EJSCREEN’s assessments of environmental indicators can rely on data collected at scales less granular than the CBG, and (ii) some of EJSCREEN’s demographic estimates are derived from surveys (as opposed to census data) and are therefore less consistent.
  3. Deficiencies in a single dataset can propagate across EJSCREEN analyses. Environmental indicators and health outcomes are inherently interconnected. This means that subpar data on certain indicators — such as emissions levels, ambient pollutant levels in air, individual exposure, and pollutant toxicity — can compromise the reliability of EJSCREEN results on multiple fronts. 

These limitations must be addressed to unlock the full potential of EJSCREEN as a tool for informing research and policy. More robust, accurate, and comprehensive environmental and demographic data are needed to power EJSCREEN. Community-driven initiatives are a powerful but underutilized way to source such data. Yet limited time, funding, rapport, and knowledge tend to discourage scientists from engaging in community-based research collaborations. In addition, effectively operationalizing data-based EJ initiatives at a national scale requires the involvement of specialists trained at the intersection of EJ and science, technology, engineering, and math (STEM). Unfortunately, relatively poor compensation discourages scientists from pursuing EJ work — and scientists who work on other topics but have interest in EJ can rarely commit the time needed to sustain long-term collaborations with EJ organizations. It is time to augment the federal government’s past and existing EJ work with redoubled investment in community-based data and training.

Plan of Action

EPA should dedicate $20 million of its Justice40 funding to establish the AYA Research Institute: an in-house think tank designed to functionally address EJSCREEN’s limitations as well as increase federal capacity to identify and effectively resolve existing and future EJ issues. The word AYA is the formal name for the Adinkra symbol meaning “resourcefulness and defiance against oppression” — concepts that define the fight for environmental justice.

The Research Institute will comprise three arms. The first arm will increase federal EJ data capacity through an expert advisory group tasked with providing and updating recommendations to inform federal collection and use of EJ data. The advisory group will focus specifically on (i) reviewing and recommending updates to environmental and demographic indicators included in EJSCREEN, and (ii) identifying opportunities for community-based initiatives that could help close key gaps in the data upon which EJSCREEN relies.

The second arm will help grow the pipeline of EJ-focused scientists through a three-year fellowship program supporting doctoral students in applied research projects that exclusively address EJ issues in U.S. municipalities and counties identified as frontline communities. The program will be three years long so that participants are able to conduct much-needed longitudinal studies that are rare in the EJ space. To be eligible, doctoral students will need to (i) demonstrate how their projects will help strengthen EJSCREEN and/or leverage EJSCREEN insights, and (ii) present a clear plan for interacting with and considering recommendations from local EJ grassroots organization(s). Selected students will be matched with grassroots EJ organizations distributed across five U.S. geographic regions (Northeast, Southeast, Midwest, Southwest, and West) for mentorship and implementation support. The fellowship will support participants in achieving their academic goals while also providing them with experience working with community-based data, building community-engagement and science-communication skills, and learning how to scale science policymaking from local to federal systems. As such, the fellowship will help grow the pipeline of STEM talent knowledgeable about and committed to working on EJ issues in the United States.

The third arm will embed EJ expertise into federal decision making by sponsoring a permanent staff of resident experts, supported by “visitors” (i.e., the doctoral fellows), to produce policy recommendations, studies, surveys, qualitative analyses, and quantitative analyses centered around EJ. This model will rely on the resident staff to maintain strong relationships with federal government and extragovernmental partners and to ensure continuity across projects, while the fellows provide ancillary support as appropriate based on their skills/interests and Institute needs. The fellowship will act as a screening tool for hiring future members of the resident staff.

Taken together, these arms of the AYA Research Institute will help advance Justice40’s goal of improving training and workforce development, as well as the Biden Administration’s goal of better preparing the United States to adapt and respond to the impacts of climate change. The AYA Research Institute can be launched with $10 million of this allocation: $4 million to establish the fellowship program with an initial cohort of 10 doctoral students (receiving stipends commensurate with typical doctoral stipends at U.S. universities), and $6 million to cover administrative expenses and expert staff salaries. Additional funding will be needed to maintain the Institute if it proves successful after launch. Funding for the Institute could come from Justice40 funds allocated to EPA. Alternatively, EPA’s fiscal year (FY) 2022 budget for science and technology clearly states a goal of prioritizing EJ — funds from this budget could hence be allocated towards the Institute using existing authority. Finally, EPA’s FY 2022 budget for environmental programs and management dedicates approximately $6 million to EJSCREEN — a portion of these funds could be reallocated to the Institute as well.

Conclusion

The Biden-Harris Administration is making unprecedented investments in environmental justice. The AYA Research Institute is designed to be a force multiplier for those investments. Federally sponsored EJ efforts involve multiple programs and management tools that directly rely on the usability and accuracy of EJSCREEN. The AYA Research Institute will increase federal data capacity and help resolve the largest gaps in the data upon which EJSCREEN depends in order to increase the tool’s effectiveness. The Institute will also advance data-driven environmental-justice efforts more broadly by (i) growing the pipeline of EJ-focused researchers experienced in working with data, and (ii) embedding EJ expertise into federal decision making. In sum, the AYA Research Institute will strengthen the federal government’s capacity to strategically and meaningfully advance EJ nationwide. 

Frequently Asked Questions
How does this proposal align with grassroots EJ efforts?

Many grassroots EJ efforts are focused on working with scientists to better collect and use data to understand the scope of environmental injustices. The AYA Research Institute would allocate in-kind support to advance such efforts and would help ensure that data collected through community-based initiatives is used as appropriate to strengthen federal decision-making tools like EJSCREEN.

How does this proposal align with the Climate and Economic Justice Screening Tool (CEJST) recently announced by the Biden administration?

EJSCREEN and CEJST are meant to be used in tandem. As the White House explains, “EJSCREEN and CEJST complement each other — the former provides a tool to screen for potential disproportionate environmental burdens and harms at the community level, while the latter defines and maps disadvantaged communities for the purpose of informing how Federal agencies guide the benefits of certain programs, including through the Justice40 Initiative.” As such, improvements to EJSCREEN will inevitably strengthen deployment of CEJST.

Has a think tank ever been embedded in a federal government agency before?

Yes. Examples include the U.S. Army War College Strategic Studies Institute and the Asian-Pacific Center for Security Studies. Both entities have been successful and serve as primary research facilities.

What criteria would the AYA Research Institute use to evaluate doctoral students who apply to its fellowship program?

To be eligible for the fellowship program, applicants must have completed one year of their doctoral program and be current students in a STEM department. Fellows must propose a research project that would help strengthen EJSCREEN and/or leverage EJSCREEN insights to address a particular EJ issue. Fellows must also clearly demonstrate how they would work with community-based organizations on their proposed projects. Priority would be given to candidates proposing the types of longitudinal studies that are rare but badly needed in the EJ space. To ensure that fellows are well equipped to perform deep community engagement, additional selection criteria for the AYA Research Institute fellowship program could draw from the criteria presented in the rubric for the Harvard Climate Advocacy Fellowship.

What can be done to avoid politicizing the AYA Research Institute, and to ensure the Institute’s longevity across administrations?

A key step will be grounding the Institute in the expertise of salaried, career staff. This will offset potential politicization of research outputs.

What existing data does EJSCREEN use?

EJSCREEN 2.0 largely uses data from the U.S. Census Bureau’s 2020 American Community Survey, as well as many other sources (e.g., the Department of Transportation (DOT) National Transportation Atlas Database and the Community Multiscale Air Quality (CMAQ) modeling system). The EJSCREEN Technical Document details the data sources that EJSCREEN relies on.

What are the demographic and environmental indicators included in EJSCREEN?

The demographic indicators are: people of color, low income, unemployment rate, linguistic isolation, less than high school education, under age 5 and over age 64. The environmental indicators are: particulate matter 2.5, ozone, diesel particulate matter, air toxics cancer risk, air toxics respiratory hazard index, traffic proximity and volume, lead paint, Superfund proximity, risk management plan facility proximity, hazardous waste proximity, underground storage tanks and leaking UST, and wastewater discharge.
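As a purely illustrative sketch (not EJSCREEN’s actual methodology), screening tools of this kind typically combine an environmental indicator with a demographic index to rank areas for attention. The example data, the percentile ranking, and the two-variable demographic index below are simplifying assumptions chosen only to show the mechanics.

```python
# Simplified, hypothetical illustration of combining an environmental
# indicator with a demographic index for screening. Not EJSCREEN's formula.
import pandas as pd

blocks = pd.DataFrame({
    "block_group": ["A", "B", "C", "D"],
    "pm25": [7.2, 11.5, 9.8, 13.1],               # illustrative PM2.5 levels
    "pct_low_income": [0.12, 0.48, 0.33, 0.61],   # illustrative shares
    "pct_people_of_color": [0.20, 0.55, 0.40, 0.72],
})

# Demographic index: average of the two demographic shares.
blocks["demo_index"] = blocks[["pct_low_income", "pct_people_of_color"]].mean(axis=1)

# Screening score: environmental percentile weighted by the demographic index.
blocks["pm25_pctile"] = blocks["pm25"].rank(pct=True)
blocks["screen_score"] = blocks["pm25_pctile"] * blocks["demo_index"]

print(blocks.sort_values("screen_score", ascending=False))
```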