Unlocking Federal Grant Data To Inform Evidence-Based Science Funding

Summary

Federal science-funding agencies spend tens of billions of dollars each year on extramural research. There is growing concern that this funding may be inefficiently awarded (e.g., by under-allocating grants to early-career researchers or to high-risk, high-reward projects). But because there is a dearth of empirical evidence on best practices for funding research, much of this concern is anecdotal or speculative at best.

The National Institutes of Health (NIH) and the National Science Foundation (NSF), as the two largest funders of basic science in the United States, should therefore develop a platform to provide researchers with structured access to historical federal data on grant review, scoring, and funding. This action would build on momentum from both the legislative and executive branches surrounding evidence-based policymaking, as well as on ample support from the research community. And though grantmaking data are often sensitive, there are numerous successful models from other sectors for sharing sensitive data responsibly. Applying these models to grantmaking data would strengthen the incorporation of evidence into grantmaking policy while also guiding future research (such as larger-scale randomized controlled trials) on efficient science funding.

Challenge and Opportunity

The NIH and NSF together disburse tens of billions of dollars each year in the form of competitive research grants. At a high level, the funding process typically works like this: researchers submit detailed proposals for scientific studies, often to particular program areas or topics that have designated funding. Then, expert panels assembled by the funding agency read and score the proposals. These scores are used to decide which proposals will or will not receive funding. (The FAQ provides more details on how the NIH and NSF review competitive research grants.) 

A growing number of scholars have advocated for reforming this process to address perceived inefficiencies and biases. Citing evidence that the NIH has become increasingly incremental in its funding decisions, for instance, commentators have called on federal funding agencies to explicitly fund riskier science. These calls grew louder following the success of mRNA vaccines against COVID-19, a technology that struggled for years to receive federal funding due to its high-risk profile.

Others are concerned that the average NIH grant-winner has become too old, especially in light of research suggesting that some scientists do their best work before turning 40. Still others lament the “crippling demands” that grant applications exert on scientists’ time, and argue that a better approach could be to replace or supplement conventional peer-review evaluations with lottery-based mechanisms.

These hypotheses are all reasonable and thought-provoking. Yet there exists surprisingly little empirical evidence to support these theories. If we want to effectively reimagine—or even just tweak—the way the United States funds science, we need better data on how well various funding policies work.

Academics and policymakers interested in the science of science have rightly called for increased experimentation with grantmaking policies in order to build this evidence base. But, realistically, such experiments would likely need to be conducted hand-in-hand with the institutions that fund and support science, investigating how changes in policies and practices shape outcomes. While such experimentation is making progress toward becoming a reality, the knowledge gap about how best to support science would ideally be filled sooner rather than later.

Fortunately, we need not wait that long for new insights. The NIH and NSF have a powerful resource at their disposal: decades of historical data on grant proposals, scores, funding status, and eventual research outcomes. These data hold immense value for those investigating the comparative benefits of various science-funding strategies. Indeed, these data have already supported excellent and policy-relevant research. Examples include Ginther et al. (2011), which studies how race and ethnicity affect the probability of receiving an NIH award, and Myers (2020), which studies whether scientists are willing to change the direction of their research in response to increased resources. And there is potential for more. While randomized controlled trials (RCTs) remain the gold standard for causal inference, economists have for decades been developing methods for drawing causal conclusions from observational data. Applying these methods to federal grantmaking data could quickly and cheaply yield evidence-based recommendations for optimizing federal science funding.

Opening up federal grantmaking data by providing a structured and streamlined access protocol would increase the supply of valuable studies such as those cited above. It would also build on growing governmental interest in evidence-based policymaking. Since its first week in office, the Biden-Harris administration has emphasized the importance of ensuring that “policy and program decisions are informed by the best-available facts, data and research-backed information.” Landmark guidance issued in August 2022 by the White House Office of Science and Technology Policy directs agencies to ensure that federally funded research—and underlying research data—are freely available to the public (i.e., not paywalled) at the time of publication.

On the legislative side, the 2018 Foundations for Evidence-Based Policymaking Act (popularly known as the Evidence Act) calls on federal agencies to develop a “systematic plan for identifying and addressing policy questions” relevant to their missions. The Evidence Act specifies that the general public and researchers should be included in developing these plans. The Evidence Act also calls on agencies to “engage the public in using public data assets [and] providing the public with the opportunity to request specific data assets to be prioritized for disclosure.” The recently proposed Secure Research Data Network Act calls for building exactly the type of infrastructure that would be necessary to share federal grantmaking data in a secure and structured way.

Plan of Action

There is clearly appetite to expand access to and use of federally held evidence assets. Below, we recommend four actions for unlocking the insights contained in NIH- and NSF-held grantmaking data—and applying those insights to improve how federal agencies fund science.

Recommendation 1. Review legal and regulatory frameworks applicable to federally held grantmaking data.

The White House Office of Management and Budget (OMB)’s Evidence Team, working with the NIH’s Office of Data Science Strategy and the NSF’s Evaluation and Assessment Capability, should review existing statutory and regulatory frameworks to see whether there are any legal obstacles to sharing federal grantmaking data. If the review team finds that the NIH and NSF face significant legal constraints when it comes to sharing these data, then the White House should work with Congress to amend prevailing law. Otherwise, OMB—in a possible joint capacity with the White House Office of Science and Technology Policy (OSTP)—should issue a memo clarifying that agencies are generally permitted to share federal grantmaking data in a secure, structured way, and stating any categorical exceptions.

Recommendation 2. Build the infrastructure to provide external stakeholders with secure, structured access to federally held grantmaking data for research. 

Federal grantmaking data are inherently sensitive, containing information that could jeopardize personal privacy or compromise the integrity of review processes. But even sensitive data can be responsibly shared. The NIH has previously shared historical grantmaking data with some researchers, but the next step is for the NIH and NSF to develop a system that enables broader and easier researcher access. Other federal agencies have developed strategies for handling highly sensitive data in a systematic fashion, which can provide helpful precedent and lessons. Examples include:

  1. The U.S. Census Bureau (USCB)’s Longitudinal Employer-Household Data. These data link individual workers to their respective firms, and provide information on salary, job characteristics, and worker and firm location. Approved researchers have relied on these data to better understand labor-market trends.
  2. The Department of Transportation (DOT)’s Secure Data Commons. The Secure Data Commons allows third-party firms (such as Uber, Lyft, and Waze) to provide individual-level mobility data on trips taken. Approved researchers have used these data to understand mobility patterns in cities.

In both cases, the data in question are available to external researchers contingent on agency approval of a research request that clearly explains the purpose of a proposed study, why the requested data are needed, and how those data will be managed. Federal agencies managing access to sensitive data have also implemented additional security and privacy-preserving measures.

Building on these precedents, the NIH and NSF should (ideally jointly) develop secure repositories to house grantmaking data. This action aligns closely with recommendations from the U.S. Commission on Evidence-Based Policymaking, as well as with the above-referenced Secure Research Data Network Act (SRDNA). Both the Commission recommendations and the SRDNA advocate for secure ways to share data between agencies. Creating one or more repositories for federal grantmaking data would be an action that is simultaneously narrower and broader in scope (narrower in terms of the types of data included, broader in terms of the parties eligible for access). As such, this action could be considered either a precursor to or an expansion of the SRDNA, and could be logically pursued alongside SRDNA passage.

Once a secure repository is created, the NIH and NSF should (again, ideally jointly) develop protocols for researchers seeking access. These protocols should clearly specify who is eligible to submit a data-access request, the types of requests that are likely to be granted, and technical capabilities that the requester will need in order to access and use the data. Data requests should be evaluated by a small committee at the NIH and/or NSF (depending on the precise data being requested). In reviewing the requests, the committee should consider questions such as:

  1. How important and policy-relevant is the question that the researcher is seeking to answer? If policymakers knew the answer, what would they do with that information? Would it inform policy in a meaningful way? 
  2. How well can the researcher answer the question using the data they are requesting? Can they establish a clear causal relationship? Would we be comfortable relying on their conclusions to inform policy?

Finally, NIH and NSF should consider including right-to-review clauses in agreements governing sharing of grantmaking data. Such clauses are typical when using personally identifiable data, as they give the data provider (here, the NIH and NSF) the chance to ensure that all data presented in the final research product have been properly aggregated and no individuals are identifiable. The Census Bureau’s Disclosure Review Board can provide some helpful guidance for NIH and NSF to follow on this front.

Recommendation 3. Encourage researchers to utilize these newly available data, and draw on the resulting research to inform possible improvements to grant funding.

The NIH and NSF frequently face questions and trade-offs when deciding if and how to change existing grantmaking processes.

Typically, these agencies have very little academic or empirical evidence to draw on for answers. A large part of the problem has been researchers’ lack of access to the data needed to conduct relevant studies. Expanding access, per Recommendations 1 and 2 above, is a necessary part of the solution, but it is not sufficient on its own. Agencies must also invest in attracting researchers to use the data in a socially useful way.

Broadly advertising the new data will be critical. Announcing a new request for proposals (RFP) through the NIH and/or the NSF for projects explicitly using the data could also help. These RFPs could guide researchers toward the highest-impact and most policy-relevant questions, such as those above. The NSF’s “Science of Science: Discovery, Communication and Impact” program would be a natural fit to take the lead on encouraging researchers to use these data.

The goal is to create funding opportunities and programs that give academics clarity on the key issues and questions on which federal grantmaking agencies need guidance; in turn, the evidence academics build should help inform grantmaking policy.

Conclusion

Basic science is a critical input into innovation, which in turn fuels economic growth, health, prosperity, and national security. The NIH and NSF were founded with these critical missions in mind. To fully realize their missions, the NIH and NSF must understand how to maximize scientific return on federal research spending. And to help, researchers need to be able to analyze federal grantmaking data. Thoughtfully expanding access to this key evidence resource is a straightforward, low-cost way to grow the efficiency—and hence impact—of our federally backed national scientific enterprise.

Frequently Asked Questions
How does the NIH currently select research proposals for funding?

For an excellent discussion of this question, see Li (2017). Briefly, the NIH is organized into 27 Institutes and Centers (ICs), which typically correspond to disease areas or body systems. Each IC has an annual budget set by Congress. Research proposals are first evaluated by around 180 different “study sections,” which are committees organized by scientific area or method. After being evaluated by the study sections, proposals are returned to their respective ICs. The highest-scoring proposals in each IC are funded, up to budget limits.
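The selection logic described above amounts to a greedy allocation within each IC. The sketch below is an illustrative toy model only: the IC names, costs, and budgets are hypothetical, scores are treated as higher-is-better for simplicity (actual NIH impact scores run the other way), and real payline rules are more involved.

```python
# Toy sketch of IC-level greedy selection: within each IC, fund the
# highest-scoring proposals until that IC's budget is exhausted.
# All names, scores, costs, and budgets here are hypothetical.

def select_awards(proposals, budgets):
    """proposals: list of (ic, score, cost); budgets: dict ic -> dollars."""
    funded = []
    remaining = dict(budgets)
    # Walk proposals IC by IC, best score first within each IC.
    for ic, score, cost in sorted(proposals, key=lambda p: (p[0], -p[1])):
        if cost <= remaining.get(ic, 0):
            remaining[ic] -= cost
            funded.append((ic, score))
    return funded

proposals = [
    ("NCI", 95, 400_000), ("NCI", 90, 350_000), ("NCI", 80, 300_000),
    ("NIA", 88, 500_000), ("NIA", 70, 450_000),
]
budgets = {"NCI": 800_000, "NIA": 500_000}
print(select_awards(proposals, budgets))
# → [('NCI', 95), ('NCI', 90), ('NIA', 88)]
```

The third NCI proposal scores well but exceeds the IC's remaining budget, so it goes unfunded, mirroring how budget limits, not scores alone, determine awards.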

How does the NSF currently select research proposals for funding?

Research proposals are typically submitted in response to announced funding opportunities, which are organized around different programs (topics). Each proposal is sent by the Program Officer to at least three independent reviewers who do not work at the NSF. These reviewers judge the proposal on its Intellectual Merit and Broader Impacts. The Program Officer then uses the independent reviews to make a funding recommendation to the Division Director, who makes the final award/decline decision. More details can be found on the NSF’s webpage.

What data on grant funding at the NIH and NSF are currently (publicly) available?

The NIH and NSF both provide data on approved proposals. These data can be found on the RePORTER site for the NIH and award search site for the NSF. However, these data do not provide any information on the rejected applications, nor do they provide information on the underlying scores of approved proposals.

Masks via Mail: Maintaining Critical COVID-19 Infrastructure for Future Public Health Threats

Summary

To protect against future infectious disease outbreaks, the Department of Health and Human Services (HHS) Coordination Operations and Response Element (H-CORE) should develop and maintain the capacity to regularly deliver N95 respirator masks to every home using a mail delivery system. H-CORE previously developed a mailing system to provide free, rapid antigen tests to homes across the U.S. in response to the COVID-19 pandemic. H-CORE can build upon this system to supply the American public with additional disease prevention equipment––notably face masks. H-CORE can helm this expanded mail-delivery system by (i) gathering technical expertise from partnering federal agencies, (ii) deciding which masks are appropriate for public use, (iii) pulling from a rotating face-mask inventory at the Strategic National Stockpile (SNS), and (iv) centralizing subsequent equipment shipping and delivery. In doing so, H-CORE will fortify the pandemic response infrastructure established during the COVID-19 pandemic, allowing the U.S. government to face future pathogens with preparedness and resilience.

Challenge and Opportunity

The infrastructure put in place to respond to COVID-19 should be maintained and improved to better prepare for and respond to the next pandemic. As the federal government thinks about the future of COVID-19 response programs, it should prioritize maintaining systems that can be flexibly used to address a variety of health threats. One critical capability to maintain is the ability to quickly deliver medical countermeasures across the U.S. This was already done to provide the American public with COVID-19 rapid tests, but additional medical countermeasures––such as N95 respirators––should also be included.

N95s are an incredibly effective means of preventing deadly infectious disease spread. Wearing an N95 respirator reduces the odds of testing positive for COVID-19 by 83%, compared to 66% for surgical masks and 56% for cloth masks. The significant difference between N95 respirators and other face coverings means that N95 respirators can provide real public health benefits against a variety of biothreats, not just COVID-19. Adding N95 respirators to H-CORE’s mailing program would increase public access to a highly effective medical countermeasure that protects against a variety of harmful diseases. Providing equitable access to N95 masks can also protect the United States against other dangerous public health emergencies, not just pandemics. Additionally, N95s protect individuals from harmful, wildfire-smoke-derived airborne particles, providing another use case beyond protection against viruses.

Beyond the benefit of expanding access to masks in particular, it is important to have an active public health mailing system that can be quickly scaled up to respond to emergencies. In times of need, this established mailing system could distribute a wide array of medical countermeasures, medicines, information, and personal protective equipment––including N95s. Thankfully, the agencies needed to coordinate this effort are already primed to do so. These authorities already have the momentum, expertise, and experience to convert existing COVID-19 response programs and pandemic preparedness investments into permanent health response infrastructure.

Plan of Action

The newly-elevated Administration for Strategic Preparedness and Response (ASPR) should house the N95 respirator mailing system, granting H-CORE key management and distribution responsibilities. Evolving out of the operational capacities built from Operation Warp Speed, H-CORE has demonstrated strong logistical capabilities in distributing COVID-19 vaccines, therapeutics, and at-home tests across the United States. H-CORE should continue operating some of these preparedness programs to increase public access to key medical countermeasures. At the same time, it should also maintain the flexibility to pivot and scale up these response programs as soon as the next public health emergency arises. 

H-CORE should bolster its free COVID-19 test mailing program and include the option to order one box of 10 free N95 respirator masks every quarter. 

H-CORE partnered with the U.S. Postal Service (USPS) to develop an unprecedented initiative––creating an online ordering system for rapid COVID-19 testing to be sent via mail to American households. ASPR should maintain its relationships with USPS and other shipping companies to distribute other needed medical supplies––like N95s. To ensure public comfort, a simple N95 ordering website could be designed to mimic the COVID-19 test ordering site.

An N95-distribution program has already been piloted and proven successful. Thanks to ASPR and the National Institute for Occupational Safety and Health (NIOSH), masks previously held at SNS were made available to the public at select retail pharmacies. This program should be made permanent and expanded to maximize the convenience of obtaining medical countermeasures, like masks. Doing so will likely increase the chance that the general population will acquire and use them. Additionally––if supplies are sourced primarily from domestic mask manufacturers––this program can stabilize demand and incentivize further manufacturing within the United States. Keeping this production at a steady base level will also make it easier to scale up quickly, should America face another pandemic or other public health crisis.

H-CORE and ASPR should coordinate with the SNS to provide N95 respirators through a rotating inventory system.  

As evidenced by the 2009 H1N1 influenza pandemic and the COVID-19 pandemic, statically stockpiling large quantities of masks is not an effective way to prepare for the next bio-incident.

Congress has long recognized the need to shift the stockpiling status quo within HHS, including within the SNS. Recent draft legislation––including the Protecting Providers Everywhere (PPE) in America Act and the PREVENT Pandemics Act––has advocated for a rotating stock system, and the concept is also mentioned in the National Strategy for a Resilient Public Health Supply Chain. While the concept appears in these documents, there are few details on what the system would look like in practice or a timeline for its implementation.

Ultimately, the SNS should use a rotating inventory system where its stored masks get rotated out to other uses in the supply chain using a “first in, first out” approach. This will prevent N95s from being stored beyond their recommended shelf-life and encourage continual replenishment of the SNS’ mask stockpile.
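Mechanically, “first in, first out” rotation is just a queue: the oldest lots leave the stockpile first (e.g., to a mailing program), and fresh lots join at the back. The sketch below is a hypothetical illustration of that bookkeeping, not the SNS’s actual inventory system; lot names and quantities are invented.

```python
# Minimal FIFO rotating-inventory sketch (illustrative only; lot IDs and
# quantities are hypothetical, not real SNS data).
from collections import deque

class RotatingStockpile:
    def __init__(self):
        self.lots = deque()  # oldest lot at the left, newest at the right

    def restock(self, lot_id, quantity):
        self.lots.append((lot_id, quantity))  # new stock joins the back

    def rotate_out(self, quantity):
        """Release the oldest stock first, e.g., to the mail program."""
        released = []
        while quantity > 0 and self.lots:
            lot_id, qty = self.lots.popleft()
            take = min(qty, quantity)
            released.append((lot_id, take))
            if qty > take:  # return the unreleased remainder to the front
                self.lots.appendleft((lot_id, qty - take))
            quantity -= take
        return released

sns = RotatingStockpile()
sns.restock("lot-2022", 100)
sns.restock("lot-2023", 100)
print(sns.rotate_out(150))  # → [('lot-2022', 100), ('lot-2023', 50)]
```

Because releases always draw from the oldest lot, no lot lingers long enough to expire, and each release creates room for a fresh restock, the two properties the memo argues a rotating inventory should deliver.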

To make this new rotating inventory system possible, ASPR should pilot rotating inventory through this H-CORE mask mailing program while they decide if and how rotating inventory could be implemented in larger quantities (e.g. rotating out to Veterans Affairs, the Department of Defense, and other purchasers). To pilot a rotating inventory system, the Secretary of HHS may enter into contracts and cooperative agreements with vendors, through the SNS contracting mechanisms, and structure the contracts to include maintaining a constant supply and re-stock capacity of the stated product in such quantities as required by the contract. As a guide, the SNS can model these agreements after select pharmaceutical contracts, especially those that have stipulated similar rotating inventory systems (i.e., the radiological countermeasure Neupogen).

The N95 mail-delivery system will allow ASPR, H-CORE, and the SNS to test the rotating stock model in a way that avoids serious risk or negative consequences. The small quantity of N95s needed for the pilot program should not tax the SNS’ supply at large. After all, the aforementioned H-CORE/NIOSH mask-distribution programs are similar in design to this pilot, and they do not disrupt the SNS supply for healthcare workers.

Conclusion

To be fully prepared for the next public health emergency, the United States must learn from its experience with COVID-19 and continue building the public health infrastructure that proved effective during this pandemic. Widespread distribution of COVID-19 rapid diagnostic tests is one such success story. The logistics and protocols that made this resource dispersal possible should be extended to other flexible medical countermeasures, like N95 respirators. After all, while the need for COVID-19 tests may wane over time, the relevance of N95 respirators will not.

HHS should therefore distribute N95 respirators to the general public through H-CORE to (i) maintain the existing mailing infrastructure and (ii) increase access to a medical countermeasure that efficiently impedes transmission for many diseases. The masks for this effort should be sourced from the Strategic National Stockpile. This will not only prevent stock expiration, but also pilot rotating inventory as a strategy for larger-scale integration into the SNS. These actions will together equip the public with medical countermeasures relevant to a variety of diseases and strengthen a critical distribution program that should be maintained for future pandemic response.

Frequently Asked Questions
What are medical countermeasures?

Medical countermeasures (MCMs) can include both pharmaceutical interventions (such as vaccines, antimicrobials, antivirals, etc.) and non-pharmaceutical interventions (such as ventilators, diagnostics, personal protective equipment, etc.) that are used to prevent, mitigate, or treat the adverse health effects of a public health emergency. Examples of MCM deployment during the COVID-19 pandemic include the COVID-19 vaccines, therapeutics for COVID-19-hospitalized patients (e.g., antivirals and monoclonal antibodies), and personal protective equipment (e.g., respirators and gloves) deployed to healthcare providers and the public.

Why should the N95 mask delivery system be housed under HHS and managed through ASPR and H-CORE?

This proposal would build on capabilities already being executed under the Department of Health and Human Services, Administration for Strategic Preparedness and Response (HHS ASPR). ASPR oversees both H-CORE and the Strategic National Stockpile (SNS) and was recently reclassified from a staff division to an operating division. This change allowed ASPR to better mobilize and respond to health-related emergencies. ASPR established H-CORE at the beginning of 2022 to create a permanent team responsible for coordinating medical countermeasures and strengthening preparedness for future pandemics. While H-CORE is currently focused on providing COVID-19 countermeasures––including vaccines, therapeutics, masks, and test kits––its longer-term mission is to augment capabilities within HHS to solve emerging health threats. As such, its ingrained mission and expertise match those required to successfully launch an N95 mail-delivery system.

How many masks would be needed for this program?

Presently, 270 million masks have been made available to the U.S. population. It is estimated that this same number of masks would be enough for American households to receive 10 masks per quarter, assuming a 50% participation rate in the program.

How much will the N95 delivery system cost?

The total annual cost of this program is an estimated $280 million to purchase 270 million masks and facilitate shipping across the United States.
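As a back-of-the-envelope check on these source figures, the estimate implies a per-mask cost of roughly a dollar, inclusive of shipping (the per-mask figure is derived here, not quoted from any budget document):

```python
# Figures quoted in this memo; the per-mask cost is a derived estimate.
total_annual_cost = 280_000_000  # dollars: mask purchases plus shipping
masks_per_year = 270_000_000     # masks purchased annually
cost_per_mask = total_annual_cost / masks_per_year
print(f"${cost_per_mask:.2f} per mask")  # roughly $1.04, shipping included
```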

How should the N95 delivery system be funded?

There are several ways this initiative could be funded. Initial funding to purchase and mail COVID-19 tests to homes came from the American Rescue Plan. By passing the COVID Supplemental Appropriations Act, Congress could provide supplemental funds to maintain standing COVID-19 programs and help pivot them to address evolving and future health threats.


The FY2023 President’s Budget for HHS also provides ample funding for H-CORE, the SNS, and ASPR, meaning it could also provide alternative funding for an N95 mail-delivery system. Presently, the budget asks for $133 million for H-CORE and mentions its role in making masks available nationwide. Additionally, $975 million has been allotted to the SNS, which includes coordination with HHS and maintaining the stockpile. Furthermore, it petitions for ASPR to receive $12 billion to generally prepare for pandemics and other future biological threats (and here it also specifically recommends strong coordination with HHS agency efforts).

Why are N95 masks important?

N95 respirators have a number of benefits that make them a critical defense strategy in a public health emergency. First, they are pathogen-agnostic countermeasures that filter airborne particles very efficiently, meaning they can impede transmission for a variety of diseases––especially airborne and aerosolized ones. This is important, since these two disease categories are the most likely naturally occurring and intentional biothreats. Second, N95 respirators are useful beyond pandemic response, also protecting against wildfire smoke. Third, N95 masks have a long shelf-life. The ability to quickly and widely distribute N95s is therefore a critical public health preparedness measure.

Why should the U.S. government fund increased N95 manufacturing capacity?

Domestic mask manufacturers have frequently experienced boom-and-bust cycles, as public demand for masks can change rapidly and without warning. This inconsistent market makes it difficult for manufacturers to invest in increased manufacturing capacity over the long term. One example is the company Prestige Ameritech, which invested over $1 million in new equipment and hired 150 new workers to produce masks in response to the 2009 swine flu outbreak. However, by the time production was ready, demand for masks had dropped and the company almost went bankrupt. Given the overwhelmingly positive benefits of having mask manufacturing capacity available when needed, it is worthwhile for the government to provide some ongoing demand certainty.


Furthermore, making masks free and easily available to the general public could increase mask usage during the annual flu season and other periods of sickness. While personal protective equipment has decreased in cost since the peak of the pandemic, making masks as accessible as possible will disproportionately increase access for low-income citizens and help ensure equitable access to protective medical countermeasures.

Can N95 respirators be deployed to the public if they are only approved for use in a healthcare setting?

It is true that N95s are not regulated outside of healthcare settings, but that shouldn’t dissuade public use. Presently, no federal agency is tasked with regulating respiratory protection for the public. The Food and Drug Administration (FDA) and the Centers for Disease Control and Prevention (CDC) National Institute for Occupational Safety and Health (NIOSH) currently have a Memorandum of Understanding (MOU) coordinating regulatory authority over N95 respirators for medical use. Neither the FDA nor NIOSH, though, has jurisdiction over mask use in a non-medical, non-occupational setting. Using an N95 respirator outside of a medical setting does not satisfy all of the regulatory requirements, like undergoing a fit-test to ensure proper seal. However, using N95 respirators for everyday respiratory protection (i) provides better protection than no mask, a cloth mask, or a surgical mask, and (ii) realistically should not need to meet the same regulatory standards as medical use, as people are not regularly exposed to the same level of risk as medical professionals.

Who currently regulates N95 safety standards?

Presently, there is no central regulator for public respiratory protection in general. In fact, the National Academies of Science Engineering and Medicine recently issued a recommendation for Congress to “expeditiously establish a coordinating entity within the Department of Health and Human Services (HHS) with the necessary responsibility, authority, and resources (financial, personnel, and infrastructure) to provide a unified and authoritative source of information and effective oversight in the development, approval, and use of respiratory protective devices that can meet the needs of the public and protect the public health.”


Moving forward, NIOSH alone should regulate N95 use for the public just as they do in occupational settings. The approval process used by other regulators––like the FDA––is more restrictive than necessary for public use. The FDA’s standards for medical protection understandably need to be high in order to protect doctors, nurses, and other medical professionals against a wide variety of dangerous exposure situations. NIOSH can provide alternative regulation and guidance for the general public, who realistically are unlikely to be in similar circumstances.


Aside from federal agencies, professional scientific societies have also provided their input in regulating N95s. The American Society for Testing and Materials (ASTM), for example, recently published standards for barrier face coverings not intended for medical use or currently regulated under NIOSH standards. While ASTM does not have any regulatory or enforcement authority, HHS could use these standards for protection, comfort, and usability as a starting point for developing guidelines for respirators suitable for public distribution and use.

Why use a rotating inventory system?

After the 2009 H1N1 influenza pandemic and the COVID-19 pandemic, it became evident that the SNS must change its stockpile-management practices. The stockpile’s reserves of N95 respirators were not sufficiently replenished after the 2009 H1N1 pandemic, in large part due to the significant up-front cost of restocking supplies. During the early days of the COVID-19 response, many states received expired respirators and broken ventilators from the SNS. These incidents revealed a number of issues with the current stockpiling paradigm. Shifting to a rotating inventory system would prevent issues with expiration, smooth out the costs of large periodic restocks, and help maintain a capable and responsive manufacturing base.

Strengthening Policy by Bringing Evidence to Life

Summary

In a 2021 memorandum, President Biden instructed all federal executive departments and agencies to “make evidence-based decisions guided by the best available science and data.” This policy is sound in theory but increasingly difficult to implement in practice. With millions of new scientific papers published every year, parsing and acting on research insights presents a formidable challenge.

A solution, and one that has proven successful in helping clinicians effectively treat COVID-19, is to take a “living” approach to evidence synthesis. Conventional systematic reviews, meta-analyses, and associated guidelines and standards are published as static products and are updated infrequently (e.g., every four to five years), if at all. This approach is inefficient and produces evidence products that quickly go out of date. It also leads to research waste and poorly allocated research funding.

By contrast, emerging “Living Evidence” models treat knowledge synthesis as an ongoing endeavor. By combining (i) established, scientific methods of summarizing science with (ii) continuous workflows and technology-based solutions for information discovery and processing, Living Evidence approaches yield systematic reviews (and other evidence and guidance products) that are always current.

The recent launch of the White House Year of Evidence for Action provides a pivotal opportunity to harness the Living Evidence model to accelerate research translation and advance evidence-based policymaking. The federal government should consider a two-part strategy to embrace and promote Living Evidence. The first part of this strategy positions the U.S. government to lead by example by embedding Living Evidence within federal agencies. The second part focuses on supporting external actors in launching and maintaining Living Evidence resources for the public good.

Challenge and Opportunity

We live in a time of veritable “scientific overload”. The number of scientific papers in the world has surged exponentially over the past several decades (Figure 1), and millions of new scientific papers are published every year. Making sense of this deluge of documents presents a formidable challenge. For any given topic, experts have to (i) scour the scientific literature for studies on that topic, (ii) separate out low-quality (or even fraudulent) research, (iii) weigh and reconcile contradictory findings from different studies, and (iv) synthesize study results into a product that can usefully inform both societal decision-making and future scientific inquiry.

This process has evolved over several decades into a scientific method known as “systematic review” or “meta-analysis”. Systematic reviews and meta-analyses are detailed and credible, but often take over a year to produce and rapidly go out of date once published. Experts often compensate by drawing attention to the latest research in blog posts, op-eds, “narrative” reviews, informal memos, and the like. But while such “quick and dirty” scanning of the literature is timely, it lacks scientific rigor. Hence those relying on “the best available science” to make informed decisions must choose between summaries of science that are reliable or current…but not both.

The lack of trustworthy and up-to-date summaries of science constrains efforts, including efforts championed by the White House, to promote evidence-informed policymaking. It also leads to research waste when scientists conduct research that is duplicative and unnecessary, and degrades the efficiency of the scientific ecosystem when funders support research that does not address true knowledge gaps.

Figure 1

Total number of scientific papers published over time, according to the Microsoft Academic Graph (MAG) dataset. (Source: Herrmannova and Knoth, 2016)

The emerging Living Evidence paradigm solves these problems by treating knowledge synthesis as an ongoing rather than static endeavor. By combining (i) established, scientific methods of summarizing science with (ii) continuous workflows and technology-based solutions for information discovery and processing, Living Evidence approaches yield systematic reviews that are always up to date with the latest research. An opinion piece published in The New York Times called this approach “a quiet revolution to surface the best-available research and make it accessible for all.”

To take a Living Evidence approach, multidisciplinary teams of subject-matter experts and methods experts (e.g., information specialists and data scientists) first develop an evidence resource—such as a systematic review—using standard approaches. But the teams then commit to regular updates of the evidence resource at a frequency that makes sense for their end users (e.g., once a month). Using technologies such as natural-language processing and machine learning, the teams continually monitor online databases to identify new research. Any new research is rapidly incorporated into the evidence resource using established methods for high-quality evidence synthesis. Figure 2 illustrates how Living Evidence builds on and improves traditional approaches for evidence-informed development of guidelines, standards, and other policy instruments.
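To make the screening step of this workflow concrete, the minimal Python sketch below filters newly retrieved records to those published since the last update and ranks them with a simple keyword-overlap score standing in for the natural-language-processing classifiers that real projects use. All names (`Paper`, `screen_new_papers`) and records are invented for illustration; this is not any agency’s or project’s actual pipeline.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Paper:
    title: str
    abstract: str
    published: date

def relevance(paper, keywords):
    """Toy stand-in for a trained relevance classifier: the fraction of
    review keywords that appear in the title or abstract."""
    text = f"{paper.title} {paper.abstract}".lower()
    hits = sum(1 for kw in keywords if kw in text)
    return hits / len(keywords) if keywords else 0.0

def screen_new_papers(papers, last_update, keywords, threshold=0.5):
    """Keep papers published since the last update whose relevance score
    meets the threshold, sorted most-relevant first for human screening."""
    scored = [(relevance(p, keywords), p) for p in papers if p.published > last_update]
    return [p for score, p in sorted(scored, key=lambda t: -t[0]) if score >= threshold]

# Demo data (entirely fictional records).
papers = [
    Paper("Stroke rehabilitation trial", "a randomized trial of stroke therapy", date(2022, 5, 1)),
    Paper("Alloy fatigue study", "metal fatigue under cyclic load", date(2022, 5, 2)),
    Paper("Older stroke review", "summary of stroke therapy evidence", date(2020, 1, 1)),
]
to_screen = screen_new_papers(papers, last_update=date(2022, 1, 1), keywords={"stroke", "therapy"})
```

In practice, the keyword score would be replaced by a trained classifier and the candidate list routed to human reviewers, preserving the established evidence-synthesis methods described above.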

Figure 2

Illustration of how a Living Evidence approach to development of evidence-informed policies (such as clinical guidelines) is more current and reliable than traditional approaches. (Source: Author-developed graphic)

Living Evidence products are more trusted by stakeholders, enjoy greater engagement (up to a 300% increase in access/use, based on internal data from the Australian Stroke Foundation), and support improved translation of research into practice and policy. Living Evidence holds particular value for domains in which research evidence is emerging rapidly, current evidence is uncertain, and new research might change policy or practice. For example, Nature has credited Living Evidence with “help[ing] chart a route out” of the worst stages of the COVID-19 pandemic. The World Health Organization (WHO) has since committed to using the Living Evidence approach as the organization’s “main platform” for knowledge synthesis and guideline development across all health issues. 

Yet Living Evidence approaches remain underutilized in most domains. Many scientists are unaware of Living Evidence approaches. The minority who are familiar often lack the tools and incentives to carry out Living Evidence projects directly. The result is an “evidence to action” pipeline far leakier than it needs to be, even though entities like government agencies need credible and up-to-date evidence to translate knowledge into impact efficiently and effectively.

It is time to change the status quo. The 2019 Foundations for Evidence-Based Policymaking Act (“Evidence Act”) advances “a vision for a nation that relies on evidence and data to make decisions at all levels of government.” The Biden Administration’s “Year of Evidence” push has generated significant momentum around evidence-informed policymaking. Demonstrated successes of Living Evidence approaches with respect to COVID-19 have sparked interest in these approaches specifically. The time is ripe for the federal government to position Living Evidence as the “gold standard” of evidence products—and the United States as a leader in knowledge discovery and synthesis.

Plan of Action

The federal government should consider a two-part strategy to embrace and promote Living Evidence. The first part of this strategy positions the U.S. government to lead by example by embedding Living Evidence within federal agencies. The second part focuses on supporting external actors in launching and maintaining Living Evidence resources for the public good. 

Part 1. Embedding Living Evidence within federal agencies

Federal science agencies are well positioned to carry out Living Evidence approaches directly. Living Evidence requires “a sustained commitment for the period that the review remains living.” Federal agencies can support the continuous workflows and multidisciplinary project teams needed for excellent Living Evidence products.

In addition, Living Evidence projects can be very powerful mechanisms for building effective, multi-stakeholder partnerships that last—a key objective for a federal government seeking to bolster the U.S. scientific enterprise. A recent example is Wellcome Trust’s decision to fund suites of living systematic reviews in mental health as a foundational investment in its new mental-health strategy, recognizing this as an important opportunity to build a global research community around a shared knowledge source. 

Greater interagency coordination and external collaboration will facilitate implementation of Living Evidence across government. As such, President Biden should issue an Executive Order establishing a Living Evidence Interagency Policy Committee (LEIPC) modeled on the effective Interagency Arctic Research Policy Committee (IARPC). The LEIPC would be chartered as an Interagency Working Group of the National Science and Technology Council (NSTC) Committee on Science and Technology Enterprise, and chaired by the Director of the White House Office of Science and Technology Policy (OSTP; or their delegate). Membership would comprise representatives from federal science agencies, including agencies that currently create and maintain evidence clearinghouses, other agencies deeply invested in evidence-informed decision making, and non-governmental experts with deep experience in the practice of Living Evidence and/or associated capabilities (e.g., information science, machine learning).

Supporting federal implementation of Living Evidence

Widely accepted guidance for living systematic reviews (LSRs), one type of Living Evidence product, has been published. The LEIPC, working closely with OSTP, the White House Office of Management and Budget (OMB), and the federal Evaluation Officer Council (EOC), should adapt this guidance for the U.S. federal context, resulting in an informational resource for federal agencies seeking to launch or fund Living Evidence projects. The guidance should also be used to update systematic-review processes used by federal agencies and organizations contributing to national evidence clearinghouses.

Once the federally tailored guidance has been developed, the White House should direct federal agencies to consider and pursue opportunities to embed Living Evidence within their programs and operations. The policy directive could take the form of a Presidential Memorandum, a joint management memo from the heads of OSTP and OMB, or similar. This directive would (i) emphasize the national benefits that Living Evidence could deliver, and (ii) provide agencies with high-level justification for using discretionary funding on Living Evidence projects and for making decisions based on Living Evidence insights.

Identifying priority areas and opportunities for federally managed Living Evidence projects

The LEIPC—again working closely with OSTP, OMB, and the EOC—should survey the federal government for opportunities to deploy Living Evidence internally. Box 1 provides examples of opportunities that the LEIPC could consider.

The product of this exercise should be a report that describes each of the opportunities identified, and recommends priority projects to pursue. In developing its priority list, the LEIPC should account for both the likely impact of a potential Living Evidence project as well as the near-term feasibility of that project. While the report could outline visions for ambitious Living Evidence undertakings that would require a significant time investment to realize fully (e.g., transitioning the entire National Climate Assessment into a frequently updated “living” mode), it should also scope projects that could be completed within two years and serve as pilots/proofs of concept. Lessons learned from the pilots could ultimately inform a national strategy for incorporating Living Evidence into federal government more systematically. Successful pilots could continue and grow beyond the end of the two-year period, as appropriate.

Fostering greater collaboration between government and external stakeholders

The LEIPC should create an online “LEIPC Collaborations” platform that connects researchers, practitioners, and other stakeholders both inside and outside government. The platform would emulate IARPC Collaborations, which has built a community of more than 3,000 members and dozens of communities of practice dedicated to the holistic advancement of Arctic science.

LEIPC Collaborations could deliver the same participatory opportunities and benefits for members of the evidence community, facilitating holistic advancement of Living Evidence.

Part 2. Making it easier for scientists and researchers to develop Living Evidence resources

Many government efforts could be supported by internal Living Evidence initiatives, but not every valuable Living Evidence effort should be conducted by government. Many useful Living Evidence programs will require deep domain knowledge and specialized skills that teams of scientists and researchers working outside of government are best positioned to deliver.

But experts interested in pursuing Living Evidence efforts face two major difficulties. The first is securing funding. Very little research funding is awarded for the sole purpose of conducting systematic reviews and other types of evidence syntheses. The funding that is available is typically not commensurate with the resource and personnel needs of a high-quality synthesis. Living Evidence demands efficient knowledge discovery and the involvement of multidisciplinary teams possessing overlapping skill sets. Yet federal research grants are often structured in a way that precludes principal investigators from hiring research software engineers or from founding co-led research groups.

The second is misaligned incentives. Systematic reviews and other types of evidence syntheses are often not recognized as “true” research outputs by funding agencies or university tenure committees—i.e., they are often not given the same weight in research metrics, despite (i) utilizing well-established scientific methodologies involving detailed protocols and advanced data and statistical techniques, and (ii) resulting in new knowledge. The result is that talented experts are discouraged from investing their time in projects that could contribute significant new insights and dramatically improve the efficiency and impact of our nation’s research enterprise.

To begin addressing these problems, the two biggest STEM-funding agencies—NIH and NSF—should consider the following actions:

  1. Perform a landscape analysis of federal funding for evidence synthesis. Rigorously documenting the funding opportunities available (or lack thereof) for researchers wishing to pursue evidence synthesis will help NIH and NSF determine where to focus potential new opportunities. The landscape analysis should consider currently available funding opportunities for systematic, scoping, and rapid reviews, and could also include surveys and focus groups to assess the appetite in the research community for pursuing additional evidence-synthesis activities if supported.
  2. Establish new grant opportunities designed to support Living Evidence projects. The goal of these grant opportunities would be to deliver definitive and always up-to-date summaries of research evidence and associated data in specified topics. The opportunities could align with particular research focuses (for instance, a living systematic review on tissue-electronic interfacing could facilitate progress on bionic limb development under NSF’s current “Enhancing Opportunities for Persons with Disabilities” Convergence Accelerator track). The opportunities could also be topic-agnostic, but require applicants to justify a proposed project by demonstrating that (i) the research evidence is emerging rapidly, (ii) current evidence is uncertain, and (iii) new research might materially change policy or practice.
  3. Increase support for career research staff in academia. Although contributors to Living Evidence projects can cycle in and out (analogous to turnover in large research collaboratives), such projects benefit from longevity in a portion of the team. With this core team in place, Living Evidence projects are excellent avenues for graduate students to build core research skills, including in research study design.
  4. Leverage prestigious existing grant programs and awards to incentivize work on Living Evidence. For instance, NSF could encourage early-career faculty to propose LSRs in applications for CAREER grants.
  5. Recognize evidence syntheses as research outputs. In all assessments of scientific track record (particularly research-funding schemes), systematic reviews and other types of rigorous evidence synthesis should be recognized as research outputs equivalent to “primary” research. 

Conclusion

Policymaking can only be meaningfully informed by evidence if underpinning systems for evidence synthesis are robust. The Biden administration’s Year of Evidence for Action provides a pivotal opportunity to pursue concrete actions that strengthen use of science for the betterment of the American people. Federal investment in Living Evidence is one such action. 

Living Evidence has emerged as a powerful mechanism for translating scientific discoveries into policy and practice. The Living Evidence approach is being rapidly embraced by international actors, and the United States has an opportunity to position itself as a leader. A federal initiative on Living Evidence will contribute additional energy and momentum to the Year of Evidence for Action, ensure that our nation does not fall behind on evidence-informed policymaking, and arm federal agencies with the most current and best-available scientific evidence as they pursue their statutory missions.

Frequently Asked Questions
Which sectors and scientific fields can use Living Evidence?
The Living Evidence model can be applied to any sector or scientific field. While the Living Evidence model has so far been most widely applied to the health sector, Living Evidence initiatives are also underway in other fields, such as education and climate sciences. Living Evidence is domain-agnostic: it is simply an approach that builds on existing, rigorous evidence-synthesis methods with a novel workflow of frequent and rapid updating.
What is needed to run a successful Living Evidence project?
It does not take long for teams to develop sufficient experience and expertise to apply the Living Evidence model. The key to a successful Living Evidence project is a team that possesses experience in conventional evidence synthesis, strong project-management skills, an orientation towards innovation and experimentation, and investment in building stakeholder engagement.
How much does Living Evidence cost?
As with evidence synthesis in general, cost depends on topic scope and the complexity of the evidence being appraised. Budgeting for Living Evidence projects should distinguish the higher cost of conducting an initial “baseline” systematic review from the lower cost of maintaining the project thereafter. Teams initiating a Living Evidence project for the first time should also budget for the inevitable experimentation and training required.
Do Living Evidence initiatives require recurrent funding?
No. Living Evidence initiatives are analogous to other significant scientific programs that may extend over many years, but receive funding in discrete, time-bound project periods with clear deliverables and the opportunity to apply for continuation funding. 


Living Evidence projects do require funding for enough time to complete the initial “baseline” systematic review (typically 3–12 months, depending on scope and complexity), transition to maintenance (“living”) mode, and continue in living mode for sufficient time (usually about 6–12 months) for all involved to become familiar with maintaining and using the living resource. Hence Living Evidence projects work best when fully funded for a minimum of two years.
If there is support for funding beyond this minimum period, there are operational advantages of instantiating the follow-on funding before the previous funding period concludes. If follow-on funding is not immediately available, Living Evidence resources can simply revert to a conventional static form until and if follow-on funding becomes available.

Is Living Evidence sustainable?
Living Evidence is rapidly gaining momentum as organizations conclude that the conventional model of evidence synthesis is no longer sustainable because the volume of research that must be reviewed and synthesized for each update has grown beyond the capacity of typical project teams. Organizations that transition their evidence resources into “living” mode typically find the dynamic synthesis model to be more consistent, more feasible, easier to manage, and easier to plan for and resource. If the conventional model of intermittent synthesis is like climbing a series of mountains, the Living Evidence approach is like hiking up to and then walking across a plateau.
How can organizations that are already struggling to develop and update conventional evidence resources take on a Living Evidence project?
New initiatives usually need specific resourcing; Living Evidence is no different. The best approach is to identify a champion within the organization who has an innovation orientation and sufficient authority to effect change. The champion plays a key role in building organizational buy-in, particularly from senior leaders, key influencers within the main evidence program, and major partners, stakeholders, and end users. Ultimately, the champion (or their surrogate) should be empowered and resourced to establish 1–3 Living Evidence pilots running alongside the organization’s existing evidence activities. Risk can be reduced by starting small and building a “minimum viable product” Living Evidence resource (i.e., by finding a topic area that is relatively modest in scope, of importance to stakeholders, and characterized by evidence uncertainty as well as relatively rapid movement in the relevant research field). Funding should be structured to enable experimentation and iteration, and then to move quickly to scale up, increasing the scope of evidence moving into living mode as organizational and stakeholder experience and support builds.
Living Evidence sounds neverending. Wouldn’t that lead to burnout in the project team?
One of the advantages of the Living Evidence model is that the project team can gradually evolve over time (members can join and leave as their interests and circumstances change). This is analogous to the evolution of an ongoing scientific network or research collaborative. In contrast, the spikes in workload required for intermittent updates of conventional evidence products often lead to burnout and loss of institutional memory. Furthermore, teams working on Living Evidence are often motivated by participation in an innovative approach to evidence and pride in contributing to a definitive, high-quality, and highly impactful scientific initiative.
How is Living Evidence disseminated?

While dissemination of conventional evidence products involves sharing several dozen key messages in a once-in-several-years communications push, dissemination of Living Evidence amounts to a regular cycle of “what’s new” updates (typically one to two key insights). Living Evidence dissemination feeds become known and trusted by end users, inspiring confidence that end users can “keep up” with the implications of new research. Publication of Living Evidence can take many forms. Typically, the core evidence resource is housed on an organizational website that can be easily and frequently updated, sometimes with an ability for users to access previous versions of the resource. Living Evidence may also be published as articles in academic journals. This could take the form of intermittent overviews of the evidence resource with links back to the main Living Evidence summaries, or (more ambitiously) a series of frequently updated versions of an article that are logically linked. Multiple academic journals are innovating to better support “living” publications.

If Living Evidence products are continually updated, doesn’t that confuse end users with constantly changing conclusions?
Living Evidence requires continual monitoring for new research, as well as frequent and rapid incorporation of new research into existing evidence products. The volume of research identified and incorporated can vary from dozens of studies each month to a few each year, depending on the topic scope and research activity.


Even across broad topics in fast-moving research fields, though, the overall findings and conclusions of Living Evidence products change infrequently, since the threshold for changing a conclusion drawn from a whole body of evidence is high. The largest Living Evidence projects in existence only yield about one to two new major findings or recommendations each update. Furthermore, any good evidence-synthesis product will contextualize conclusions and recommendations with an assessment of confidence.

What are the implications of Living Evidence for stakeholder engagement?
Living Evidence projects, due to their persistent nature, are great opportunities for building partnerships with stakeholders. Stakeholders tend to be energized and engaged in an innovative project that gives them, their staff, and their constituencies a tractable mechanism by which to engage with the “current state of science”. In addition, the ongoing nature of a Living Evidence project means that project partnerships are always active. Stakeholders are continually engaged in meaningful, collaborative discussions and activities around the current evidence. Finally, this ongoing, always-active nature of Living Evidence projects creates “accumulative” partnerships that gradually broaden and deepen over time.
What are the equity implications of taking a Living Evidence approach?
Living Evidence resources make the latest science available to all. Conventionally, the lack of high-quality summaries of science has meant the latest science is discovered and adopted by those closest to centers of excellence and expertise. Rapid incorporation of the latest science into Living Evidence resources—as well as the wide promotion and dissemination of that science—means that the immediate benefits of science can be shared much more broadly, contributing to equity of access to science and its benefits.
What are the implications of Living Evidence for knowledge translation?
The activities that use research outputs and evidence resources (such as Living Evidence) to change practice and policy are often referred to as “knowledge translation”. These activities are substantial and often multifaceted interventions that identify and address the complex structural, organizational, and cultural barriers that impede knowledge use. 


Living Evidence has the potential to accelerate knowledge translation: not because of any changes to the knowledge-translation enterprise, but because Living Evidence identifies earlier the high-certainty evidence that underpins knowledge-translation activities.

Living Evidence may also enhance knowledge translation in two ways. First, Living Evidence is a better evidence product and has been shown to increase trust, engagement, and intention to use among stakeholders. Second, as mentioned above, Living Evidence creates opportunities for deep and effective partnerships. Together, these advantages could position Living Evidence to yield a more effective “enabling environment” for knowledge translation.

Does Living Evidence require use of technologies like machine learning?
Technologies such as natural language processing, machine learning, and citizen science (crowdsourcing), as well as efforts to build common data structures (and create Findable, Accessible, Interoperable, and Reusable (FAIR) data), are advancing alongside Living Evidence. These technologies are often described as “enablers” of Living Evidence. While such technologies are commonly used and developed in Living Evidence projects, they are not essential. Nevertheless, over the longer term, such technologies will likely be indispensable for creating sustainable systems that make sense of science.
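As one concrete example of such an enabler, record deduplication (collapsing the same paper retrieved from multiple databases) is a routine step that simple automation handles well. The Python sketch below is illustrative only: the field names and matching rules are assumptions, and real pipelines use more sophisticated fuzzy matching.

```python
import re

def normalize_title(title):
    """Lowercase and collapse punctuation/whitespace so trivially different
    renderings of the same title compare equal."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

def deduplicate(records):
    """Collapse records describing the same paper, preferring a DOI match
    and falling back to a normalized-title match."""
    seen = {}
    for rec in records:
        key = rec.get("doi") or normalize_title(rec.get("title", ""))
        if key not in seen:
            seen[key] = rec
    return list(seen.values())

# Fictional records, as might be returned by two different databases.
records = [
    {"doi": "10.1000/x1", "title": "Living Evidence: A Review"},
    {"doi": "10.1000/x1", "title": "Living evidence - a review"},  # same DOI, different rendering
    {"title": "Another Study of Synthesis"},
    {"title": "another study of synthesis."},                      # same title, no DOI
]
unique = deduplicate(records)
```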

Creating a Digital Service for the Planet

Summary

Challenge and Opportunity

The Biden administration—through directives such as Executive Order 14008 on Tackling the Climate Crisis at Home and Abroad and President Biden’s Memorandum on Restoring Trust in Government Through Scientific Integrity and Evidence-Based Policymaking, as well as through initiatives such as Justice40 and America the Beautiful (30×30)—has laid the blueprint for a data-driven environmental agenda.

However, the data to advance this agenda are held and managed by multiple agencies, making them difficult to standardize, share, and use to their full potential. For example, water data are collected by 25 federal entities across 57 data platforms and 462 different data types. Permitting for wetlands, forest fuel treatments, and other important natural-resource management tasks still involves a significant amount of manual data entry, and protocols for handling relevant data vary by region or district. Staff at environmental agencies have privately noted that it can take weeks or months to receive necessary data from colleagues in other agencies, and that they have trouble knowing what data exist at other agencies. Accelerating the success and breadth of environmental initiatives requires digitized, timely, and accessible information for planning and execution of agency strategies.
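To make the standardization challenge concrete, the Python sketch below shows one common integration pattern: per-source field mappings plus unit conversion into a shared schema. The source labels, field names, and records are invented for illustration and do not reflect any actual federal data platform.

```python
# Per-source mappings from native field names to a shared schema.
# All names here are hypothetical.
FIELD_MAPS = {
    "agency_a": {"site": "station_id", "flow_cfs": "discharge_cfs"},
    "agency_b": {"stationCode": "station_id", "discharge_m3s": "discharge_m3s"},
}

CFS_PER_M3S = 35.3147  # cubic feet per second in one cubic metre per second

def harmonize(record, source):
    """Rename source-specific fields to the shared schema and convert
    discharge readings to a single unit (m^3/s)."""
    mapping = FIELD_MAPS[source]
    out = {mapping[k]: v for k, v in record.items() if k in mapping}
    if "discharge_cfs" in out:
        out["discharge_m3s"] = out.pop("discharge_cfs") / CFS_PER_M3S
    return out

a = harmonize({"site": "USGS-01", "flow_cfs": 353.147}, "agency_a")
b = harmonize({"stationCode": "EPA-9", "discharge_m3s": 2.5}, "agency_b")
```

Real-world harmonization across dozens of platforms and hundreds of data types is far more involved, but the pattern of declared mappings into a common schema is what a shared digital-service team could build once and reuse across agencies.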

The state of federal environmental data today echoes the state of public-health data in 2014, when President Obama recognized that the Department of Health and Human Services lacked the technical skill sets and capacity needed to stand up HealthCare.gov. The Obama administration responded by creating the U.S. Digital Service (USDS), which provides federal agencies with on-demand access to the technical expertise they need to design, procure, and deploy technology for the public good. Over the past eight years, USDS has developed a scalable and replicable model of working across government agencies. Projects that USDS has been involved in—like improving federal procurement and hiring processes, deploying HealthCare.gov, and modernizing administrative tasks for veterans and immigrants—have saved agencies such as the Department of Veterans Affairs millions of dollars.

But USDS lacks the specialized capacity, skills, experience, and specific directives needed to fully meet the shared digital-infrastructure needs of environmental agencies. The Climate and Economic Justice Screening Tool (CEJST) illustrates how crucial digital-service capacity is for tackling the nation’s environmental priorities, and why a dedicated Digital Service for the Planet (DSP) is needed. While USDS was instrumental in getting the tool off the ground, several issues with the launch point to a lack of specialized environmental capabilities and expertise within USDS. Many known environmental-justice issues—including wildfire, drought, and flooding—were not reflected in the tool’s first iteration. In addition, the CEJST should have been published in July 2021, but the beta version was not released until February 2022. A DSP familiar with environmental data would have started with a stronger foundation to help anticipate and incorporate such key environmental concerns, and might have been able to deliver the tool on a tighter timeline.

There is hope in this challenge. Because many environmental programs across multiple federal agencies have overlapping data and technology needs, a centralized and dedicated team focused on addressing these needs could significantly and cost-effectively advance the capacities of environmental agencies.

Plan of Action

To best position federal agencies to meet environmental goals, the Biden administration should establish a “Digital Service for the Planet” (DSP). The DSP would build on the successes of USDS to provide support across three key areas for environmental agencies:

  1. Strategic planning and procurement. Scoping, designing, and procuring technology solutions for programmatic goals. For example, a DSP could help the Fish and Wildlife Service (FWS) accelerate updates to the National Wetlands Inventory, which are currently estimated to take 10 years and cost $20 million.
  2. Technical development. Implementing targeted technical-development activities to achieve mission-related goals in collaboration with agency staff. For example, a DSP could help improve the accessibility and utility of many government tools on which the public relies heavily, such as the Army Corps system that tracks mitigation banks (known as the Regulatory In lieu fee and Bank Information Tracking System (RIBITS)).
  3. Cross-agency coordination on digital infrastructure. Facilitating data inventory and sharing, and development of the databases, tools, and technological processes that make cross-agency efforts possible. A DSP could be a helpful partner for facilitating information sharing among agencies that monitor interrelated events, environments, or problems, including droughts, wildfires, and algal blooms. 

The DSP could be established either as a new branch of USDS, or as a new and separate but parallel entity housed within the White House Office of Management and Budget. The former option would enable DSP to leverage the accumulated knowledge and existing structures of USDS. The latter option would enable DSP to be established with a more focused mandate, and would also provide a clear entry point for federal agencies seeking data and technology support specific to environmental issues.

Regardless of the organizational structure selected, DSP should include the essential elements that have helped USDS succeed—per the following recommendations.

Recommendation 1. The DSP should emulate the USDS’s staffing model and position within the Executive Office of the President (EOP).

The USDS hires employees on short-term contracts, with each contract term lasting between six months and four years. This contract-based model enables USDS to attract high-level technologists, product designers, and programmers who are interested in public service, but not necessarily committed to careers in government. USDS’s staffing model also ensures that the Service does not take over core agency capacities, but rather is deployed to design and procure tech solutions that agencies will ultimately operate in-house (i.e., without USDS involvement). USDS’s position within the EOP makes USDS an attractive place for top-level talent to work, gives staff access to high-level government officials, and enables the Service to work flexibly across agencies.

Recommendation 2. Staff the DSP with specialists who have prior experience working on environmental projects.

Working on data and technology issues within environmental contexts requires specialized skill sets and experience. For example, geospatial data and analysis are fundamental to environmental protection and conservation, but geospatial expertise has not been a focal point of USDS hiring. In addition, a DSP staff fluent in the vast and specific terminologies used in environmental fields (such as water management) will be better able to communicate with the many subject-matter experts and data stewards working in environmental agencies. 

Recommendation 3. Place interagency collaboration at the core of the DSP mission.

Most USDS projects focus on a single federal agency, but environmental initiatives—and the data and tech needs they present—almost always involve multiple agencies. Major national challenges, including flood-risk management, harmful algal blooms, and environmental justice, all demand an integrated approach to realize cross-agency benefits. For example, EPA-funded green stormwater infrastructure could reduce flood risk for housing units subsidized by the Department of Housing and Urban Development. DSP should be explicitly tasked with devising approaches for tackling complex data and technology issues that cut across agencies. Fulfilling this mandate may require DSP to bring on additional expertise in core competencies such as data sharing and integration.

Recommendation 4. Actively promote the DSP to relevant federal agencies.

Despite USDS’s eight-year existence, many staff members at agencies involved in environmental initiatives know little about the Service and what it can do for them. To avoid underutilization due to lack of awareness, the DSP’s launch should include an outreach campaign targeted at key agencies, including but not limited to the U.S. Army Corps of Engineers (USACE), the Department of Energy (DOE), the Department of the Interior (DOI), the Environmental Protection Agency (EPA), the National Aeronautics and Space Administration (NASA), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Department of Agriculture (USDA), and the U.S. Global Change Research Program (USGCRP).

Conclusion

A new Digital Service for the Planet could accelerate progress on environmental and natural-resource challenges through better use of data and technology. USDS has shown that a relatively small and flexible team can have a profound and lasting effect on how agencies operate, save taxpayer money, and encourage new ways of thinking about longstanding problems. However, current capacity at USDS is limited and not specifically tailored to the needs of environmental agencies. On issues ranging from water management to environmental justice, ensuring better use of technology and data will yield benefits for generations to come. Establishing a DSP is an important step toward making the federal government a better buyer, partner, and consumer of the data, technology, and innovations necessary to support the country’s conservation, water, and stewardship priorities.

Frequently Asked Questions
How would the DSP differ from the U.S. Digital Service?

The DSP would build on the successful USDS model, but would have two distinguishing characteristics. First, the DSP would employ staff experienced in using or managing environmental data and possessing special expertise in geospatial technologies, remote sensing, and other environmentally relevant tech capabilities. Second, DSP would have an explicit mandate to develop processes for tackling data and technology issues that frequently cut across agencies. For example, the Internet of Water found that at least 25 different federal entities collect water data, while the USGCRP has identified at least 217 examples of earth observation efforts spanning many agencies. USDS is not designed to work with so many agencies at once on a single project—but DSP would be.

Would establishing the DSP prohibit agencies from independently improving their data and tech practices? 

Not in most cases. The DSP would focus on meeting data and technology needs shared by multiple agencies. Agencies would still be free—and encouraged!—to pursue agency-specific data- and tech-improvement projects independently.


Indeed, a hope would be that by showcasing the value of digital services for environmental projects on a cross-agency basis, the DSP would inspire individual agencies to establish their own digital-service teams. Precedent for this evolution exists: the USDS provided initial resources to solve digital challenges for Healthcare.gov and the Department of Veterans Affairs, and the Department of Veterans Affairs and the Department of Defense have since started their own internal digital-service teams. However, even with agency-based digital-service teams, there will always be a need for a team with a cross-agency view, especially given that so many environmental problems and solutions extend well beyond the borders of a single agency. Digital-service teams at multiple levels can be complementary and would focus on different project scopes and groups of users. For example, agency-specific digital-service teams would be much better positioned to help sustain agency-specific components of an effort established by DSP.

How much would this proposal cost?

We propose the DSP start with a mid-sized team of twenty to thirty full-time equivalent employees (FTEs) and a budget of around $8 million. These personnel and financial allocations are in line with allocations for USDS. DSP could be scaled up over time if needed, just as USDS grew from approximately 12 FTEs in fiscal year (FY) 2014 to over 200 FTEs in FY 2022. The long-term target size of the DSP team should be informed by the uptake and success of DSP-led work.

Why would agencies want a DSP? Why would they see it as beneficial?

From our conversations with agency staff, we (the authors) have heard time and again that agencies see immense value in a DSP. We have also found that two scenarios often inhibit better adoption of environmental data and technology. The first scenario is that environmental-agency staff see the value in pursuing a technology solution to make their program more effective, but do not have the authority or resources to implement the idea, or are not aware of the avenues available to do so. A DSP could help agency staff design and implement modern solutions to realize their vision, and coordinate with important stakeholders to facilitate the process.


The second scenario is that environmental-agency staff are trained experts in environmental science, but not in evaluating technology solutions. As such, they are poorly equipped to evaluate the integrity of proposed solutions from external vendors. If they end up trialing a solution that is a poor fit, they may become risk-averse to technology at large. In this scenario, there is tremendous value in having a dedicated team of experts within the government available to help agencies source the appropriate technology or technologies for their programmatic goals.

Expanding Pathways for Career Research Scientists in Academia

Summary

The U.S. university research enterprise is plagued by an odd bug: it encourages experts in science, technology, engineering, and math (STEM) to leave it at the very moment they become recognized as experts. People who pursue advanced degrees in STEM are often compelled by deep interest in research. But upon graduation from master’s, Ph.D., or postdoctoral programs, these research-oriented individuals face a difficult choice: largely cede hands-on involvement in research to pursue faculty positions (which increasingly demand that a majority of time be spent on managerial responsibilities, such as applying for grants), give up the higher pay and prestige of the tenure track in order to continue “doing the science” via lower-status staff positions (e.g., lab manager, research software engineer), or leave the academic sector altogether. 

Many choose the last option. And when that happens at scale, it harms the broader U.S. scientific enterprise by (i) decreasing federal returns on investment in training STEM researchers, and (ii) slowing scientific progress by creating a dearth of experienced personnel conducting basic research in university labs and mentoring the next generation of researchers. The solution is to strengthen and elevate the role of the career research scientist1—the highly trained senior research-group member who is hands-on and in the lab every day—in the university ecosystem. This is, fundamentally, a fairly straightforward workforce-pipeline issue that federal STEM-funding agencies have the power to address. The National Institutes of Health (NIH) and the National Science Foundation (NSF) — two of the largest sources of academic research funding — could begin by hosting high-level discussions around the problem: specifically, through an NSF-led workshop and an NIH-led task force. In parallel, the two agencies can launch immediately tractable efforts to begin making headway in addressing the problem. NSF, for instance, could increase visibility and funding for research software engineers, while NSF and/or NIH could consider providing grants to support “co-founded” research labs jointly led by an established professor or principal investigator (PI) working alongside an experienced career research scientist.

The collective goal of these activities is to infuse technical expertise into the day-to-day ideation and execution of science (especially basic research), thereby accelerating scientific progress and helping the United States retain world scientific leadership.

Challenge and Opportunity

The scientific status quo in the United States is increasingly diverting STEM experts away from direct research opportunities at universities. STEM graduate students interested in hands-on research have few attractive career opportunities in academia: those working as staff scientists, lab managers, research software engineers, and similar forego the higher pay and status of the tenure track, while those working as faculty members find themselves encumbered by tasks that are largely unrelated to research. 

Making it difficult for STEM experts to pursue hands-on research in university settings harms the broader U.S. scientific enterprise in two ways. First, the federal government disburses huge amounts of money every year—via fellowship funding, research grants, tuition support, and other avenues—to help train early-career STEM researchers. This expenditure is warranted because, as the Association of American Universities explains, “There is broad consensus that university research is a long-term national investment in the future.” This investment hinges on university contributions to basic research; universities and colleges account for just 13% of overall U.S. research and development (R&D) activity, but nearly half (48%) of basic research. Limited career opportunities for talented STEM researchers to continue “doing the science” in academic settings therefore limit our national returns on investment in these researchers.

Box 1. Productivity benefits of senior researchers in software-driven fields.
Cutting-edge research in nearly all STEM fields increasingly depends on software. Indeed, NSF observes that software is “directly responsible for increased scientific productivity and significant enhancement of researchers’ capabilities.” Problematically, there is minimal support within academia for development and ongoing maintenance of software. It is all too common for a promising research project at a university lab to wither when the graduate student who wrote the code upon which the project depends finishes their degree and leaves.

The field of deep learning (a branch of artificial intelligence (AI) and machine learning) underscores the value of research software. Progress in deep learning was slow and stuttering until development of user-friendly software tools in the mid-2010s: a development spurred mostly by private-sector investment. The result has been an explosion of productivity in deep learning. Even now, top AI research teams cite software-engineering talent as a critical input upon which their scientific output depends. But while research software engineers are some of the most in-demand and valuable team members in the private sector, career positions for research software engineers are uncommon at academic institutions. How much potential scientific discovery are U.S. university labs failing to recognize as a result of this underinvestment?

Second, attrition of STEM talent from academia slows the pace of U.S. scientific progress because most hands-on research activities are conducted by graduate students rather than by more experienced personnel, even though senior researchers are far more scientifically productive. With years of experience under their belts, senior researchers possess tacit knowledge of how to effectively get research done in a field, can help a team avoid repeating mistakes, and can provide the technical mentorship needed for graduate students to acquire research skills quickly and well. And with graduate students and postdocs typically remaining with a research group for only a few years, career research scientists also provide important continuity across projects. The productivity boosts that senior researchers can deliver are especially well established for software-driven fields (see Box 1).

The absence of attractive job opportunities for career research scientists at most academic institutions is an anomaly. Such opportunities are prevalent in the private sector, at national labs (e.g., those run by the NIH and the Department of Energy) and other government institutions, and in select well-endowed university labs that enjoy more discretionary spending ability. As the dominant funder of university research in the United States, the federal government has massive leverage over the structure of research labs. With some small changes in grant-funding incentives, federal agencies can address this anomaly and bring more senior research scientists into the academic research system. 

Plan of Action

Federal STEM-funding agencies — led by NSF and NIH, as the two largest sources of federal funding for academic research — should explore and pursue strategies for changing grant-funding incentives in ways that strengthen and elevate the role of the career research scientist in academia. We split our recommendations into two parts. 

The first part focuses on encouraging discussion. The problem of limited career options for trained STEM professionals who want to engage in hands-on research in the academic sector currently flies beneath the radar of many extremely knowledgeable stakeholders inside and outside of the federal government. Bringing these stakeholders together might result in excellent actionable suggestions on how to retain talented research scientists in academia. The second part suggests two specific projects to make headway on the problem: (i) further support for research software engineers and (ii) a pilot program supporting co-founded research labs. While the recommendations below are targeted to NSF and NIH, other federal STEM-funding agencies (e.g., the Departments of Energy and Defense) can and should consider similar actions. 

Part 1. Identify needs, opportunities, and options for federal actions to support and incentivize career research scientists.2

Shifting academic employment towards a model more welcoming to career research scientists will require a mix of specific new programs and small and large changes to existing funding structures. However, it is not yet clear which reforms should be prioritized. Our first set of suggestions is designed to start the necessary discussion.

Specifically, NSF should start by convening key community members at a workshop (modeled on previous NSF-sponsored workshops, such as the workshop on a National Network of Research Institutes [NNRI]) focused on how the agency can encourage creation of additional career research scientist positions at universities. The workshop should also (i) discuss strategies for publicizing such positions and encouraging outstanding STEM talent to pursue them, (ii) identify barriers that discourage universities from creating career research scientist positions, and (iii) brainstorm solutions to these barriers. Workshop participants should include representatives from federal agencies that sponsor national labs as well as from industry sectors (software, biotech, etc.) that conduct extensive R&D, as these entities are more experienced employers of career research scientists. The workshop should address key questions about how such positions can be created, funded, and sustained.

The primary audience for the workshop will be NSF leadership and policymakers. The output of the workshop should be a report suggesting a clear, actionable path forward for those stakeholders to pursue.

NIH should pursue an analogous fact-finding effort, possibly structured as a working group of the Advisory Committee to the Director. This working group would identify strategies for incentivizing labs to hire professional staff members, including expert lab technicians, professional biostatisticians, and research software engineers (RSEs). It would ultimately recommend to the NIH Director actions that the agency can take to expand the roles of career research scientists in the academic sector. The working group would address questions similar to those explored in the NSF workshop.

Part 2. Launch two pilot projects to begin expanding opportunities for career research scientists.

Pilot 1. Create a new NSF initiative to solicit and fund requests for research software engineer (RSE) support. 

Research software engineers (RSEs) build and maintain research software, and train scientists to use that software. Incentivizing the creation of long-term RSE positions at universities will increase scientific productivity and build the infrastructure for sustained scientific progress in the academic sector. Though a wide range of STEM disciplines could benefit from RSE involvement, NSF’s Computer and Information Science and Engineering (CISE) Directorate is a good place to start expanding support for RSEs in academic projects. 

CISE has previously invested in nascent support structures for professional staff in software and computing fields. The CISE Research Initiation Initiative (CRII) was created to build research independence among early-career researchers working in CISE-related fields by funding graduate-student appointments. Much CRII-funded work involves producing — and in turn, depends on — shared community software. Similarly, the Campus Research Computing Consortium (CaRCC) and RCD Nexus are NSF-supported programs focused on creating guidelines and resources for campus research computing operations and infrastructure. Through these two programs, NSF is helping universities build a foundation of (i) software production and (ii) computing hardware and infrastructure needed to support that software. 

However, effective RSEs are crucial for progress in scientific fields outside of CISE’s domain. For example, one of this memo’s authors has personal experience with NSF-funded geosciences research. PIs working in this field are desperate for funding to hire RSEs, but do not have access to funding for that purpose. Instead, they depend almost entirely on graduate students.

As a component of the workshop recommended above, NSF should highlight other research areas hamstrung by an acute need for RSEs. In addition, CISE should create a follow-on CISE Software Infrastructure Initiative (CSII) that solicits and funds requests from pre-tenure academic researchers in a variety of fields for RSE support. Requests should explain how the requested RSE would (i) catalyze cutting-edge research, and (ii) maintain critical community open-source scientific software. Moreover, because academia severely lacks strong mentorship in software engineering, a specific goal of CSII funding should be to support at least a 1:3 ratio of RSEs to graduate students in funded labs. The ultimate goal of this initiative is a community of university researchers productively using software created and supported by RSEs hired through CSII funding; creative evaluation mechanisms will be needed to assess progress toward that goal. 

Pilot 2. Provide grants to support “co-founded” research labs jointly led by an established professor or principal investigator (PI) working alongside an experienced career research scientist.

Academic PIs (typically faculty) normally lead their labs and research groups alone. This state of affairs contributes to high rates of burnout, which in turn can undermine research success. In some cases, starting an ambitious new project or company with a co-founder makes the endeavor more likely to succeed while being less stressful and isolating. A co-founder can provide a complementary set of skills. For example, the startup incubator Y Combinator is well known for favoring teams that pair a visionary CEO and manager with a builder and designer CTO. By contrast, academic PIs are expected to be talented at all aspects of running a modern scientific lab. Developing mechanisms to help scientists come together and benefit from complementary skill sets should be a high priority for science-funding agencies.

We recommend that NSF and/or NIH create a pilot grant program to fund co-founded research labs at universities. Formally co-founded research groups have been successful across scientific domains (e.g., the AbuGoot Lab at MIT and the Carpenter-Singh Lab at the Broad Institute), but remain quite rare. Federal grants for co-founded research labs would build on this proof of concept by competitively awarding 5–7 years of salary and equipment funding to support a lab jointly run by an early-career PI and a career research scientist. A key anticipated benefit of this grant program is increased retention of outstanding researchers in positions that enable them to keep “doing the science.” Currently, career research scientist positions simply cannot offer competitive levels of compensation and prestige, so the most talented STEM researchers become faculty members or leave academia altogether. Creating a new, high-profile, grant-funded opportunity for STEM talent to remain in hands-on university lab positions could help shift the status quo. Creating a pathway for co-founded and co-led research labs would also help PIs avoid isolation and burnout while building more robust, healthy, and successful research teams.

Conclusion

Many improvements to the scientific enterprise require massive funding and national coordination. This one does not. All that needs to be done is to allow expert research scientists to do the hands-on work that they’ve been trained to do. The scientific status quo prevents our nation’s basic research enterprise from achieving its full potential, and from harnessing that potential for the common good. Strengthening and elevating the role of career research scientists in the academic sector will empower existing STEM talent to drive scientific progress forward.

Frequently Asked Questions
Are there places where research scientists are common?

Yes. The tech sector is a good example. Multiple tech companies have developed senior individual contributor (IC) career paths, which allow people to grow their influence while remaining largely in hands-on technical roles. The most common role of a senior software-engineering IC is that of the “tech lead,” guiding the technical decision-making and execution of a team. Other paths might involve prototyping and architecting a critical new system, or diving in and solving an emergency problem. For more details on this kind of career, see the Staff Engineer book and accompanying discussion.

Why is now the time for federal STEM-funding agencies to increase support for career research scientists?

The United States has long been the international leader in scientific progress, but that position is being threatened as more countries develop the human capital and infrastructure to compete in a knowledge-oriented economy. In an era where humankind faces mounting existential risks requiring scientific innovation, maintaining U.S. scientific leadership is more important than ever. This requires retaining high-level scientific talent in hands-on, basic research activities. But that goal is undermined by the current structure of employment in American academic science.

Which other federal agencies fund scientific research, and could consider actions similar to those proposed in this memo for NSF and NIH?

Key federal STEM-funding agencies that could also consider ways to support and elevate career research scientist positions include the Departments of Agriculture, Defense, and Energy, as well as the National Aeronautics and Space Administration (NASA).

Regulating Use of Mobile Sentry Devices by U.S. Customs and Border Protection

Summary

Robotic and automated systems have the potential to remove humans from dangerous situations, but their current intended use as aids or replacements for human officers conducting border patrols raises ethical concerns if not regulated to ensure that this use “promot[es] the safety of the officer/agent and the public” (emphasis added). U.S. Customs and Border Protection (CBP) should update its use-of-force policy to cover the use of robotic and other autonomous systems for CBP-specific applications that differ from the military applications assumed in existing regulations. The most relevant existing regulation, Department of Defense Directive 3000.09, governs how autonomous and semi-autonomous weapon systems may be used to engage enemy combatants in the context of war. This use case is quite different from mobile sentry duty, which may include interactions with civilians (whether U.S. citizens or migrants). With robotic and automated systems about to come into regular use at CBP, the agency should proactively issue regulations to forestall adverse effects—specifically, by only permitting use of these systems in ways that presume all encountered humans to be non-combatants. 

Challenge and Opportunity

CBP is currently developing mobile sentry devices as a new technology to force-multiply its presence at the border. Mobile sentry devices, such as legged and flying robots, have the potential to reduce deaths at the border by making it easier to locate and provide aid to migrants in distress. According to an American Civil Liberties Union (ACLU) report, 22% of migrant deaths between 2010 and 2021 that involved an on-duty CBP agent or officer were caused by medical distress that began before the agent or officer arrived on the scene. However, the eventual use cases, rules of engagement, and functionalities of these robots are unclear. If not properly regulated, mobile sentry devices could also be used to harm or threaten people at the border—thereby contributing to the 44% of deaths that occurred as a direct result of vehicular or foot pursuit by a CBP agent. Regulations on mobile sentry device use—rather than merely acquisition—are needed because even originally unarmed devices can be weaponized after purchase. And even devices that remain unarmed can harm civilians with a limb or propeller. 

Existing Department of Homeland Security (DHS) regulations governing autonomous systems seek to minimize technological bias in artificially intelligent risk-assessment systems. Existing military regulations seek to minimize risks of misused or misunderstood capabilities for autonomous systems. However, no existing federal regulations govern how uncrewed vehicles, whether remotely controlled or autonomous, can be used by CBP. The answer is not as simple as extending military regulations to CBP. Military regulations governing autonomous systems assume that the robots in question are armed and interacting with enemy combatants. This assumption does not apply to most, if not all, possible CBP use cases.

With the CBP already testing robotic dogs for deployment on the Southwestern border, the need for tailored regulation is pressing. Recent backlash over the New York Police Department testing similar autonomous systems makes this topic even more timely. While the robots used by CBP are currently unarmed, the same company that developed the robots being tested by CBP is working with another company to mount weapons on them. The rapid innovation and manufacturing of these systems requires implementation of policies governing their use by CBP before CBP has fully incorporated such systems into its workflows, and before the companies that build these systems have formed a powerful enough lobby to resist appropriate oversight. 

Plan of Action

CBP should immediately update its Use of Force policy to include restrictions on use of force by mobile sentry devices. Specifically, CBP should add a chapter to the policy with the following language:

These regulations should go into effect before mobile sentry devices are moved from the testing phase to the deployment phase. Related new technology, whether it increases capabilities for surveillance or autonomous mobility, should undergo review by a committee that includes representatives from the National Use of Force Review Board, migrant rights groups, and citizens living along the border. This review should mirror the process laid out in the Community Control over Police Surveillance project, which has already been successfully implemented in multiple cities.

Conclusion

U.S. Customs and Border Protection (CBP) is developing an application for legged robots as mobile sentry devices at the southwest border. However, the use cases, functionality, and rules of engagement for these robots remain unclear. New regulations are needed to forestall adverse effects of autonomous robots used by the federal government for non-military applications, such as those envisioned by CBP. These regulations should specify that mobile sentry devices can only be used as humanitarian aids, and must use de-escalation methods to indicate that they are not threatening. Regulations should further mandate that mobile sentry devices maintain clear distance from human targets, that use of force by mobile sentry devices is never considered “reasonable,” and that mobile sentry devices may never be used to pursue, detain, or arrest humans. Such regulations will help ensure that the legged robots currently being tested as mobile sentry devices by CBP—as well as any future mobile sentry devices—are used ethically and in line with CBP’s goals, alleviating the concerns of migrant advocates and citizens along the border.

Frequently Asked Questions
What is the purpose of regulating CBP use of autonomous robots as mobile sentry devices rather than purchasing of autonomous robots?

Regulations on purchasing are not sufficient to prevent mobile sentry device technology from being weaponized after it is purchased. However, DHS could certainly also consider updating its acquisition regulations to include clauses resulting in fines when mobile sentry devices acquired by the CBP are not used for humanitarian purposes.

Why is Department of Defense (DOD) Directive 3000.09 not sufficient to regulate the use of force by all government agencies?

DOD Directive 3000.09 regulates the use of autonomous weapons systems in the context of war. For an autonomous, semi-autonomous, or remotely controlled system that is deployed with the intention to be a weapon in an active battlefield, this regulation makes sense. But applications of robotic and automated systems currently being developed by DHS are oriented towards mobile sentry duty along stretches of American land where civilians are likely to be found. This sentry duty is likely to be performed by uncrewed ground robots following GPS breadcrumb trails on predetermined, regular patrol routes along the border. Under Directive 3000.09, the use of a robot to kill or harm a person during a routine patrol along the border would not be a violation as long as a human had “meaningful control” over the robot at that time. The upshot is that mobile sentry devices used by CBP should be subject to stricter regulations.

What standards do robotics companies have on the use of their technologies?

Most companies selling legged robots in the United States have explicit end-user policies prohibiting the use of their machines to harm or intimidate humans or animals. Some companies selling quadcopter drones have similar policies. But these policies lack any enforcement mechanism. As such, there is a regulatory gap that the federal government must fill.

Is updating its Use of Force policy the only way for CBP to regulate its use of mobile sentry devices?

No, but it is an immediately actionable strategy. An alternative—albeit more time-consuming—option would be for CBP to form a committee comprising representatives from the National Use of Force Review Board, the military, migrant-rights activist groups, and experts on ethics to develop a directive for CBP’s use of mobile sentry devices. This directive should be modeled after DOD Directive 3000.09, which regulates the use of lethal autonomous weapons systems by the military. As the autonomous systems in DOD Directive 3000.09 are assumed to be interacting with enemy combatants while CBP’s jurisdiction consists mostly of civilians, the CBP directive should be considerably more stringent than Directive 3000.09.

Would the policies proposed in this memo vary with the degree of autonomy possessed by the robot in question?

The policies proposed in this memo govern what mobile sentry devices are and are not permitted to do, regardless of the extent to which humans are involved in device operation and/or the degree of autonomy possessed by the technology in question. The policies proposed in this memo could therefore be applied consistently as the technology continues to be developed. AI is always changing and improving, and by creating policies that are tech-agnostic, CBP can avoid repeatedly updating regulations as mobile sentry device technology evolves.

CLimate Improvements through Modern Biotechnology (CLIMB) — A National Center for Bioengineering Solutions to Climate Change and Environmental Challenges

Summary

Tackling pressing environmental challenges — such as climate change, biodiversity loss, environmental toxins, and pollution — requires bold, novel approaches that can act at the scale and speed needed to stop irreversible damage. Environmental biotechnology can provide viable and effective solutions. The America COMPETES Act, if passed, would establish a National Engineering Biology Research and Development Initiative. To lead the way in innovative environmental protection, a center should be created within this initiative that focuses on applying biotechnology and bioengineering to environmental challenges. The CLimate Improvements through Modern Biotechnology (CLIMB) Center will fast-track our nation’s ability to meet domestic and international decarbonization goals, remediate contaminated habitats, detect toxins and pathogens, and deliver on environmental-justice goals. 

The CLIMB Center would (i) provide competitive grant funding across three key tracks — bioremediation, biomonitoring, and carbon capture — to catalyze comprehensive environmental biotechnology research; (ii) house a bioethics council to develop and update guidelines for safe, equitable environmental biotechnology use; (iii) manage testbeds to efficiently prototype environmental biotechnology solutions; and (iv) facilitate public-private partnerships to help transition solutions from prototype to commercial scale. Investing in the development of environmental biotechnology through the CLIMB Center will overall advance U.S. leadership on biotechnology and environmental stewardship, while helping the Biden-Harris Administration deliver on its climate and environmental-justice goals. 

Challenge and Opportunity

The rapidly advancing field of biotechnology has considerable potential to aid the fight against climate change and other pressing environmental challenges. Fast and inexpensive genetic sequencing of bacterial populations, for instance, allows researchers to identify genes that enable microorganisms to degrade pollutants and synthesize toxins. Existing tools like CRISPR, as well as up-and-coming techniques such as retron-library recombineering, allow researchers to effectively design microorganisms that can break down pollutants more efficiently or capture more carbon. Biotechnology as a sector has been growing rapidly over the past two decades, with the global market estimated to reach nearly $3.5 trillion by 2030. These and numerous other biotechnological advances are already being used to transform sectors like medicine (which comprises nearly 50% of the biotechnology sector), but have to date been underutilized in the fight for a more sustainable world. 

One reason why biotechnology and bioengineering approaches have not been widely applied to advance climate and environmental goals is that returns on investment are too uncertain, too delayed, or too small to motivate private capital — even if solving pressing environmental issues through biotechnology would deliver massive societal benefits. The federal government can act to address this market failure by creating a designated environmental-biotechnology research center as part of the National Engineering Biology Research and Development Initiative (America COMPETES Act, Sec. 10403). Doing so will help the Biden-Harris Administration achieve its ambitious targets for climate action and environmental justice.

Plan of Action

The America COMPETES Act would establish a National Engineering Biology Research and Development Initiative “to establish new research directions and technology goals, improve interagency coordination and planning processes, drive technology transfer to the private sector, and help ensure optimal returns on the Federal investment.” The Initiative is set to be funded through agency contributions and White House Office of Science and Technology Policy (OSTP) budget requests. The America COMPETES Act also calls for the creation of undesignated research centers within the Initiative. We propose creating such a center focused on environmental-biotechnology research: the CLimate Improvements through Modern Biotechnology (CLIMB) Center. The Center would be housed under the new National Science Foundation (NSF) Directorate for Technology, Innovation and Partnerships and co-led by the NSF Directorate for Biological Sciences. The Center would take a multipronged approach to supporting biotechnological and bioengineering solutions to environmental and climate challenges and their rapid deployment. 

We propose that the Center be funded with an initial commitment of $60 million, with continuing funds of $300 million over five years. The main contributing federal agencies and research offices would be determined by OSTP, but should at minimum include NSF; the Departments of Agriculture, Defense, and Energy (USDA, DOD, and DOE); the Environmental Protection Agency (EPA); the National Oceanic and Atmospheric Administration (NOAA); and the U.S. Geological Survey (USGS).

Specifically, the CLIMB Center would: 

  1. Provide competitive grant funding across three key tracks — bioremediation, biomonitoring, and carbon capture — to catalyze comprehensive environmental-biotechnology research.
  2. House a bioethics council to develop and update guidelines for safe, equitable environmental-biotechnology use.
  3. Manage testbeds to efficiently prototype environmental-biotechnology solutions. 
  4. Facilitate public-private partnerships to help transition solutions from prototype to commercial scale.

More detail on each of these components is provided below.

Component 1: Provide competitive grant funding across key tracks to catalyze comprehensive environmental biotechnology research.

The CLIMB Center will competitively fund research proposals related to (i) bioremediation, (ii) biomonitoring, and (iii) carbon capture. These three key research tracks were chosen to span the range of approaches to environmental problems, from prevention and monitoring to large-scale remediation. Within these tracks, the Center’s research portfolio will span the entire technology-development pathway, from early-stage research to market-ready applications.

Track 1: Bioremediation

Environmental pollutants are detrimental to ecosystems and human health. While the Biden-Harris Administration has taken strides to prevent the release of pollutants such as per- and polyfluoroalkyl substances (PFAS), many pollutants that have already been released into the environment persist for years or even decades. Bioremediation is the use of biological processes to degrade contaminants within the environment. It is either done within a contaminated site (in-situ bioremediation) or away from it (ex-situ). Traditional in-situ bioremediation is primarily accomplished by bioaugmentation (addition of pollutant-degrading microbes) or by biostimulation (supplying oxygen or nutrients to stimulate the growth of pollutant-degrading microbes that are already present). While these approaches work, they are costly, time-consuming, and cannot be done at large spatial scales. 

Environmental biotechnology can enhance the ability of microbes to degrade contaminants quickly and at scale. Environmental-biotechnology approaches have produced bacteria that are better able to break down toxic chemicals, decompose plastic waste, and process wastewater. But the potential of environmental biotechnology to improve bioremediation is still largely untapped, as both the technologies themselves and the regulatory regimes governing them still need to be developed to enable widespread use. CLIMB Center research grants could support the early discovery phase to identify more gene targets for bioremediation, as well as efforts to test more developed bioremediation technologies for scalability.

Track 2: Biomonitoring

Optimizing responses to environmental challenges requires collection of data on pollutant levels, toxin prevalence, spread of invasive species, and much more. Conventional approaches to environmental monitoring (like mass spectrometry or DNA amplification) require specialized equipment, are low-throughput, and need highly trained personnel. In contrast, biosensors—devices that use biological molecules to detect compounds of interest—provide rapid, cost-effective, and user-friendly alternatives for measuring materials of interest. Due to these characteristics, biosensors enable users to sample more frequently and across larger spatial scales, resulting in more accurate datasets and enhancing our ability to respond. Detection of DNA or RNA is key for identifying pathogens, invasive species, and toxin-producing organisms. Standard DNA- and RNA-detection techniques like polymerase chain reaction (PCR) require specialized equipment and are slow. By contrast, biosensors can detect minuscule amounts of DNA and RNA in minutes (rather than hours) and without the need for DNA/RNA amplification. SHERLOCK and DETECTR are two examples of highly successful, marketed tools used for diagnostic applications such as detecting SARS-CoV-2 and for ecological purposes such as distinguishing invasive fish species from similar-looking native species. Moving forward, these technologies could be repurposed for other environmental applications, such as monitoring for the presence of algal toxins in water used for drinking, recreation, agriculture, or aquaculture. Furthermore, while existing biosensors can detect DNA and RNA, detecting compounds like pesticides, DNA-damaging compounds, and heavy metals requires a different class of biosensor. CLIMB Center research grants could support development of new biosensors as well as modification of existing biomonitoring tools for new applications.  

Track 3: Carbon capture

Rising atmospheric levels of greenhouse gases like carbon dioxide are driving irreversible climate change. The problem has become so bad that it is no longer sufficient to merely reduce future emissions—limiting average global warming below 2°C by 2100 will require achieving negative emissions through capture and removal of atmospheric carbon. A number of carbon-capture approaches are currently being developed. These range from engineered approaches such as direct air capture, chemical weathering, and geologic sequestration to biological approaches such as reforestation, soil amendment, algal cultivation, and ocean fertilization.  

Environmental-biotechnology approaches such as synthetic biology (“designed biology”) can vastly increase the amount of carbon that could be captured by natural processes. For instance, plants and crops can be engineered to produce larger root biomass that sequesters more carbon into the soil, or to store more carbon in harder-to-break-down molecules such as lignin, suberin, or sporopollenin instead of more easily metabolized sugars and cellulose. Alternatively, carbon-capture efficiency can be improved by modifying enzymes in the photosynthetic pathway or limiting photorespiration through synthetic biology. Microalgae in particular hold great promise for enhanced carbon capture. Microalgae can be bioengineered to not only capture more carbon but also produce a greater density of lipids that can be used for biofuel. The potential for synthetic biology and other environmental-biotechnology approaches to enhance carbon capture is vast, largely unexplored, and certainly under-commercialized. CLIMB Center research grants could quickly propel such approaches. 

Component 2: House a bioethics council to develop and update guidelines for safe, equitable environmental-biotechnology use.

The ethical, ecological, and social implications of environmental biotechnology must be carefully considered and proactively addressed to avoid unintended damage and to ensure that benefits are distributed equitably. As such, the CLIMB Center should assemble a bioethics council comprising representatives from:

The bioethics council will identify key ethical and equity issues surrounding emerging environmental biotechnologies. The council will then develop guidelines to ensure transparency of research to the public, engagement of key stakeholders, and safe and equitable technology deployment. These guidelines will ensure that there is a framework for the use of field-ready environmental-biotechnology devices, and that risk assessment is built consistently into regulatory-approval processes. The council’s findings and guidelines will be reported to the National Engineering Biology Research and Development Initiative’s interagency governance committee, which will work with federal and state regulatory agencies to incorporate the guidance and streamline regulation and oversight of environmental-biotechnology products. 

Component 3. Manage testbeds to efficiently prototype environmental-biotechnology solutions. 

The “valley of death” separating early-stage research and prototyping from commercialization is a well-known bottleneck hampering innovation. This bottleneck could certainly inhibit innovation in environmental biotechnology, given that environmental-biotechnology tools are often intended for use in complex natural environments that are difficult to replicate in a lab. The CLIMB Center should serve as a centralized node connecting researchers with testing facilities and test sites where environmental biotechnologies can be properly validated and risk-assessed. There are numerous federal facilities that could be leveraged as environmental-biotechnology testbeds, including: 

The CLIMB Center could also work with industry, state, and local partners to establish other environmental-biotechnology testbeds. Access to these testbeds could be provided to researchers and technology developers as follow-on opportunities to CLIMB Center research grants and/or through stand-alone testing programs managed by the CLIMB Center. 

Component 4: Facilitate public-private partnerships to help transition solutions from prototype to commercial scale.

Public-private partnerships have been highly successful in advancing biotechnology for medicine. Operation Warp Speed, to cite one recent and salient example, enabled research, development, testing, and distribution of vaccines against SARS-CoV-2 at unprecedented speeds. Public-private partnerships could play a similarly key role in advancing the efficient deployment of market-ready environmental-biotechnology devices. To this end, the CLIMB Center can reduce barriers to negotiating partnerships between environmental engineers and biotechnology manufacturers. For example, the CLIMB Center can develop templates for Memoranda of Understanding (MOUs) and Collaborative Research Agreements (CRAs) to facilitate the initial establishment of partnerships, as well as help connect interested parties. The CLIMB Center could also facilitate access for both smaller companies and researchers to existing government infrastructure necessary to deploy these technologies. For example, an established public-private partnership team could have access to government-managed gene and protein libraries, microbial strain collections, sequencing platforms, computing power, and other specialized equipment. The Center could further negotiate with companies to identify resources (equipment, safety data, and access to employee experts) they are willing to provide. Finally, the Center could identify and fast-track opportunities where the federal government would be uniquely suited to serve as an end user of biotechnology products. For instance, in the bioremediation space, the EPA’s purview over management and cleanup of Superfund sites would benefit immensely from novel, safe, and effective tools to quickly address pollution and restore habitats.

Conclusion

Environmental and climate challenges are some of the most pressing problems facing society today. Fortunately, advances in biotechnology that enable manipulation, acceleration, and improvement of natural processes offer powerful tools to tackle these challenges. The federal government can accelerate capabilities and applications of environmental biotechnology by establishing the CLimate Improvements through Modern Biotechnology (CLIMB) Center. This center, established as part of the National Engineering Biology Research and Development Initiative, will be dedicated to advancing research, development, and commercialization of environmental biotechnology. CLIMB Center research grants will focus on advances in bioremediation, biomonitoring, and biologically assisted carbon capture, while other CLIMB Center activities will scale and commercialize emerging environmental biotechnologies safely, responsibly, and equitably. Overall, the CLIMB Center will further solidify U.S. leadership in biotechnology while helping the Biden-Harris Administration meet its ambitious climate, energy, and environmental-justice goals. 

Frequently Asked Questions
Why should the federal government take the lead in environmental biotechnology solutions?

Environmental biotechnology can help address wide-reaching, interdisciplinary issues with huge benefits for society. Many of the applications for environmental biotechnology are within realms where the federal government is an interested or responsible party. For instance, bioremediation largely falls within governmental purview. Creating regulatory guidelines in parallel with the development of these new technologies will enable an expedited rollout. Furthermore, environmental-biotechnology approaches are still novel, and using them on a wide scale in our natural environments will require careful handling, testing, and regulation to prevent unintended harm. Here again, the federal government can play a key role in helping validate and test technologies before they are approved for use on a wide scale.


Finally, the largest benefits from environmental biotechnology will be societal. The development of such technology should hence be driven largely by its potential to improve environmental quality and address environmental injustices, even if these applications are not profitable. As such, federal investments are better suited than private investments to help develop and scale these technologies, especially during early stages when returns are too small, too uncertain, and too far in the future.

How do we mitigate security risks of bioengineered products?

Bioengineered products already exist and are in use, and bioengineering innovations and technology will continue to grow over the next century. Rather than not develop these tools and lag behind other nations that will continue to do so, it is better to develop a robust regulatory framework that will address the critical ethical and safety concerns surrounding their uses. Importantly, each bioengineered product will present its own set of risks and challenges. For instance, a bacterial species that has been genetically engineered to metabolize a toxin is very different from an enzyme or DNA probe that could be used as a biosensor. The bacteria are living, can reproduce, and can impact other organisms around them, especially when released into the environment. In contrast, the biosensor probe would contain biological parts (not a living organism) and would only exist in a device. It is thus critical to ensure that every biotechnology product, with its unique characteristics, is properly tested, validated, and designed to minimize its environmental impact and maximize societal benefits. The CLIMB Center will greatly enhance the safety of environmental-biotechnology products by facilitating access to test beds and the scientific infrastructure necessary to quantify these risk-benefit trade-offs.

How would the CLIMB Center address the Biden-Harris Administration’s goals for environmental justice?

The Biden-Harris Administration has recognized the vast disparities in environmental quality and exposure to contaminants that exist across communities in the United States. Communities of color are more likely to be exposed to environmental hazards and bear the burden of climate change-related events. For example, the closer a community is to a Superfund site—a site deemed contaminated enough to warrant federal oversight—the higher the proportion of Black families and the lower the proportion of White families. To address these disparities, the Administration issued Executive Order 14008 to advance environmental-justice efforts. Through this order, President Biden created an Environmental Justice Advisory Council and launched the Justice40 initiative, which mandates that 40% of the benefits from climate investments be delivered to underserved communities. The Justice40 initiative includes priorities such as the “remediation and reduction of legacy pollution, and the development of critical clean water infrastructure.” The Executive Order also calls for the creation of a “community notification program to monitor and provide real-time data to the public on current environmental pollution…in frontline and fenceline communities — places with the most significant exposure to such pollution.” Environmental biotechnology offers an incredible opportunity to advance these goals by enhancing water treatment and bioremediation and enabling rapid and efficient monitoring of environmental contaminants.

How would the CLIMB Center address the Biden-Harris Administration’s goals for climate change?

President Biden has set targets for a 50–52% reduction (relative to 2005 levels) in net greenhouse-gas pollution by the year 2030, and has directed federal government operations to reach 100% carbon-pollution-free electricity by 2030 (Executive Order 14057). It is well established that meeting such climate goals and limiting global warming to less than 2°C will require negative emissions technologies (carbon capture) in addition to reducing the amount of emissions created by energy and other sectors. Carbon-capture technologies will need to be widely available, cost-effective, and scalable. Environmental biotechnology can help address these needs by enhancing our capacity for biological carbon capture through the use of organisms such as microalgae and macroalgae, which can even serve the dual role of producing biofuels, feedstock, and other products in a carbon-neutral or carbon-negative way. The CLIMB Center can establish the United States as the global leader in advancing both biotechnology and the many untapped environmental and climate solutions it can offer.

What are the current federal funding mechanisms available for the research and development of bioengineered environmental solutions?

There are multiple avenues for funding foundational research and development in bioengineering. Federal agencies and offices that currently fund bioengineering with an environmental focus include (but are not necessarily limited to):



  • DOE’s Office of Science’s various research programs, ARPA-E, and DOE’s Bioenergy Technologies Office

  • EPA’s Office of Research and Development, Science to Achieve Results (STAR) Program

  • National Science Foundation’s Biological Sciences and Engineering Directorates

  • USDA’s National Institute of Food and Agriculture, Biotechnology Risk Assessment Research Grants Program

  • NOAA’s Office of Ocean Exploration and Research

  • NASA’s Space Technology Mission Directorate

  • The National Institutes of Health’s National Institute of Environmental Health Sciences and National Institute of Biomedical Imaging and Bioengineering

  • DOD’s DARPA, Biological Technologies Office


Research funding provided by these offices often includes a biomedical focus. The research and development funding provided by the CLIMB Center would seek to build upon these efforts and help coordinate directed research towards environmental-biotechnology applications.

How could biosensors inform management and policy decisions?

Compared to conventional analytical techniques, biosensors are fast, cost-effective, easy to use, and largely portable. However, biosensors are not always poised to replace conventional techniques. In many cases, regulatory bodies have approved specific analytical techniques that can be used for compliance, and novel biosensors are rarely included in the suite of approved techniques—even though biosensors can complement conventional techniques, such as by allowing regulators to rapidly screen more samples to prioritize which require further processing using approved conventional methods. Moreover, conventional methods can only provide snapshot measurements, potentially missing critical time periods during which toxins, contaminants, or pathogens go unnoticed. Biosensors, on the other hand, could be used to continuously monitor a given area. For example, algae can accumulate (bloom) and produce potent toxins that accumulate in seafood. To protect human health, seafood is tested using analytical chemical approaches (direct measurement of toxins) or biological assays (health monitoring in exposed laboratory animals). This requires regulators to decide when it is best to sample. However, if a biosensor were deployed in a monitoring array out in the ocean or made available to people who collect the seafood, it could serve as an early detection system for the presence of these toxins. This application will become especially important moving forward, since climate change has altered the geographic distribution and seasonality of these algal blooms, making it harder to forecast when it is best to measure seawater and seafood for these toxins.

How do we ensure that benefits from environmental biotechnologies extend equitably to historically excluded populations?

Communities of color are more likely to live near Superfund sites, be disproportionately exposed to pollutants, and bear the heaviest burdens from the effects of climate change. These communities have also been disproportionately affected by unethical environmental and medical-research practices. It is imperative that novel tools designed to improve environmental outcomes benefit these communities and do not cause unintended harm. Guidelines established by the CLIMB Center’s bioethics council coupled with evaluation of environmental biotechnologies in realistic testbeds will help ensure that this is the case.

Putting Redlines in the Green: Economic Revitalization Through Innovative Neighborhood Markets

Summary

The systemic effects of past redlining in more than 200 U.S. cities persist to this day. Redlining was a 20th-century policy that explicitly denied Black Americans the opportunity to secure federal mortgage loans and, with them, future wealth. Adverse impacts of redlining not only reduce quality of life for communities of color and low-income communities, but also have spillover effects that cost taxpayers upwards of $308 million per year.

The Biden-Harris administration can combat the impacts of redlining through a new place-based program called “Putting Redlines in the Green”. Through this program, the federal government would repurpose a fraction of its thousands of excess and underutilized properties as rent-free or rent-subsidized sites for Innovative Neighborhood Markets (INMs): multipurpose, community-operated spaces that serve as grocery-delivery hubs, house culturally significant businesses, and support local entrepreneurs in historically redlined areas. While recent federal initiatives (such as the Opportunity Zone and Promise Zone programs) have sought to stimulate development in economically distressed communities through top-down grants and tax incentives, “Putting Redlines in the Green” will give historically redlined communities access to a key asset—real estate—needed to spur revitalization from the bottom up.

Challenge and Opportunity

The term “redlining” derives from racially discriminatory practices carried out by government homeownership programs in the 1930s. The pernicious systemic effects of historical redlining continue to be felt today. Historically redlined areas, for instance, possess less urban-forest cover (and thus suffer from higher summer temperatures and greater pollution), experience poorer health outcomes and decreased earning potential, and are exploited by predatory lending practices that make it nearly impossible to rebuild wealth. Historic redlining can also be linked directly to the prevalence and distribution of “food deserts” and “food apartheid” in U.S. cities.

In 2021, the Department of Justice (DOJ)—in collaboration with the Consumer Financial Protection Bureau (CFPB) and the Office of the Comptroller of the Currency (OCC)—launched the Combating Redlining Initiative to ensure equal credit opportunity for communities of color. While laudable, this effort seeks to forestall future instances of redlining rather than to combat inequities associated with redlining in the past. Recent federal initiatives—such as the Trump-era Opportunity Zone program, the Obama-era Promise Zone program, the Bush II-era Renewal Community program, and the Clinton-era Empowerment Zone program—have aimed to spur revitalization in economically distressed communities, including historically redlined communities, through grants and/or tax incentives. The success of this approach has proven mixed at best. Opportunity Zones, for instance, have been criticized for subsidizing gentrification and funneling benefits to wealthy private investors. Community leaders in designated Promise Zones have struggled to productively integrate federal grants into comprehensive, synergistic initiatives. Finally, the pattern of different administrations layering similar programs on top of each other has created confusion and lack of sustained buy-in among stakeholders. It is time for a new approach. The Plan of Action below describes a new vision for federal investment in historically redlined areas: one that relies on repurposing federal assets to empower community-driven enterprises.

Plan of Action

The Biden-Harris administration should jointly launch “Putting Redlines in the Green”, a new, interagency, place-based program to combat the inequities of historical redlining. Historically redlined communities suffer from chronic underinvestment and inequitable access to capital. Through “Putting Redlines in the Green”, excess and underutilized (E&U) federal properties in historically redlined communities would be repurposed as rent-free or rent-subsidized sites for Innovative Neighborhood Markets (INMs). INMs are envisioned as multipurpose, community-operated spaces designed to spur revitalization from the bottom up by combining elements of farmers’ markets, community banks, and business improvement districts (BIDs). For instance, INMs could provide hubs for farm-to-market grocery-delivery services (see Activity 5, below), house culturally significant businesses threatened by the impacts of gentrification and the COVID-19 pandemic, and give local entrepreneurs the retail and co-working space needed to launch and grow new ventures.

A stepwise plan of action for the program is outlined below.

Activity 1. Assemble an interagency task force to define program targets and criteria.

The Department of Housing and Urban Development (HUD)’s Office of Community Planning and Development is well-placed to identify redlined communities where INMs could deliver especially large impacts. The Environmental Protection Agency (EPA)’s Office of Community Revitalization (OCR) is already experienced in supporting locally led, community-driven efforts to protect the environment, expand economic opportunity, and revitalize neighborhoods. These two offices should jointly assemble and chair a task force comprising representatives from relevant federal agencies (e.g., the Departments of Agriculture, Commerce, and Justice (USDA, DOC, and DOJ); the General Services Administration (GSA)) and external stakeholder groups (e.g., civic groups, environmental-justice organizations, fair-housing experts). The task force would lay the foundation for “Putting Redlines in the Green” by:

Activity 2. Conduct a review to identify E&U federal properties that could be repurposed as INM sites.

The portfolio of federally owned real property in the United States includes thousands of E&U properties. While the number of E&U properties catalogued in the Federal Real Property Profile (FRPP) fluctuates from year to year (due to changes in government operations, acquisition and disposal of various properties, and inconsistencies in data reporting, among other factors), the pandemic induced a notable spike, with approximately 15,000 E&U properties catalogued in FY 2020 (Figure 1). With virtual and hybrid work now firmly embedded across the federal government even as the acute phase of the pandemic has ended, it is likely that a significant fraction of these properties will not return to full utilization. And with maintenance of E&U federal properties costing taxpayers tens of millions of dollars annually, there is a timely opportunity to augment ongoing processes for federal property reallocation.

Figure 1. Changes in federal property utilization from 2019 (top) to 2020 (bottom). Source: Federal Real Property Profile Summary Data Set.

The task force should work with the GSA to review the federal government’s inventory of excess and underutilized properties to identify sites that could be repurposed as INMs. The goal of this review would be to generate a list of 10–15 sites for near-term repurposing and investment to pilot the INM concept, as well as a longer list of additional candidate sites that could be considered for INMs in the future. A first step for the review would be to crosswalk the E&U properties logged in the FRPP database with the map of priority areas developed in Activity 1. E&U properties located in priority areas should be downselected by building type. For instance, E&U hospital and lab buildings, as likely poor candidate INM sites, could be excluded while E&U housing, office, and warehouse space could be retained. Next, the remaining candidate sites should be screened against the criteria developed in Activity 1. This stage would also be an appropriate time to identify and eliminate highly problematic candidate sites: for instance, sites that are in badly deteriorated condition or that have already proven uniquely difficult to repurpose. Finally, the task force should prioritize the final list of candidate sites for investment. Prioritization should consider factors such as geographic location (striving to achieve an equitable distribution of INMs nationwide) and buy-in from funders and community groups engaged as part of Activity 1.
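The downselection described above amounts to a sequence of filters over property records. As a purely illustrative sketch, it could be expressed as follows; the field names and sample records are hypothetical stand-ins, not actual FRPP data:

```python
# Hypothetical sample of E&U property records; a real FRPP extract has many more fields.
properties = [
    {"id": "A1", "in_priority_area": True,  "building_type": "office",    "condition": "fair"},
    {"id": "B2", "in_priority_area": True,  "building_type": "hospital",  "condition": "good"},
    {"id": "C3", "in_priority_area": False, "building_type": "warehouse", "condition": "good"},
    {"id": "D4", "in_priority_area": True,  "building_type": "warehouse", "condition": "deteriorated"},
    {"id": "E5", "in_priority_area": True,  "building_type": "housing",   "condition": "good"},
]

# Step 1: crosswalk against the priority-area map developed in Activity 1.
in_priority = [p for p in properties if p["in_priority_area"]]

# Step 2: downselect by building type (exclude likely poor INM candidates such as hospitals and labs).
SUITABLE_TYPES = {"housing", "office", "warehouse"}
by_type = [p for p in in_priority if p["building_type"] in SUITABLE_TYPES]

# Step 3: screen out highly problematic sites (e.g., badly deteriorated buildings).
candidates = [p for p in by_type if p["condition"] != "deteriorated"]

print([p["id"] for p in candidates])  # → ['A1', 'E5']
```

The remaining candidates would then be ranked by the prioritization factors (geographic distribution, funder and community buy-in) rather than filtered, since those are judgment calls for the task force rather than binary criteria.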

Activity 3. Pilot the INM model in an initial 10–15 sites. 

HUD and EPA should lead on repurposing the 10–15 sites identified in Activity 2 into a network of INMs distributed across historically redlined communities nationwide. This process will involve (i) acquiring ownership of the sites; (ii) acquiring necessary permits; (iii) performing requisite site inspections and remediation; (iv) performing requisite construction and demolition needed to transform the sites into usable INM spaces; (v) establishing site-specific governance structures; and (vi) soliciting, selecting, and following through on locally led business proposals for the INMs. HUD and EPA should strive to have the initial suite of INMs operational within three years of program launch, and the federal government should allocate $1 million per site to achieve this goal. Funding could come from the bipartisan Infrastructure Investment and Jobs Act (specifically, through the Act’s $1.5 billion RAISE grant program), the Justice40 initiative, and/or from already-existing allocations at HUD, EPA, and partner federal agencies for activities related to economic development, community revitalization, and business/entrepreneurship. Funding could be leveraged with matching funds and/or in-kind support from philanthropies, nonprofits, local governments, and community organizations.

Activity 4. Ensure that E&U federal properties that become available in the future are systematically evaluated for suitability as INM sites.

Federal law governs the disposal process for properties no longer needed by federal agencies to carry out their program responsibilities. The first step in this process is for GSA to offer “excess property to other federal agencies that may have a program need for it.” The task force should work with GSA to ensure that the “Putting Redlines in the Green” program is incorporated into the federal agency stage of the process. The task force should also develop internal processes for efficiently evaluating E&U properties that become available as candidate sites for INMs. The steps of these internal processes would likely be broadly similar to the steps of the larger review conducted in Activity 2.

Activity 5. Launch an INM-centered “farm to neighborhood” model of grocery delivery.

To combat the specific issue of “food apartheid” in historically redlined communities, USDA’s Office of the Assistant Secretary for Civil Rights (OASCR) should spearhead creation of an INM-centered “farm to neighborhood” model (F2NM) of grocery delivery. In the F2NM, federal agencies would partner with local government and non-governmental organizations to support community gardens and nearby (within a defined radius) farms. Support, which could come in the form of subsidized crop insurance or equipment grants, would be provided to community gardeners and farmers in exchange for pledges to sell the resulting crops and other foods (e.g., eggs and meat) at INMs. USDA and EPA could also consider subsidizing distributors to sell key foodstuffs that cannot be produced locally (e.g., due to agricultural or logistical limitations) at affordable prices at INMs. Finally, USDA and EPA could consider working with local partners (e.g., the Detroit Black Community Food Security Network; the Center for Environmental Farming Systems [CEFS]’s Committee on Racial Equity in the Food System) to launch meal-kit services that provide community subscribers with INM-sourced ingredients and accompanying recipes. Such services will expand access to locally produced food while promoting healthier lifestyles.

Conclusion

The 11 million+ Americans who currently live in historically redlined areas deserve attention from policymakers. Historic redlining perpetuates food deserts, lead exposure, discriminatory practices, and other adversities, and encourages predatory markets.

Implementation of “Putting Redlines in the Green” will empower historically redlined areas through profit-driven, self-sustaining community enterprises (INMs). “Putting Redlines in the Green” would also reinforce the Combating Redlining Initiative in ensuring that historically redlined neighborhoods receive “fair and equal access” to the lending opportunities that are—and always have been—available to non-redlined, and majority-White, neighborhoods. Ultimately, transforming excess and underutilized federal properties into INMs will strengthen urban sustainability, reduce taxpayer burdens, and promote restorative, economic, and environmental justice. “Putting Redlines in the Green” will therefore not only provide restitution for historically redlined communities, but will enfranchise the people and revitalize the place. 

Frequently Asked Questions
What does the federal government do with its excess and underutilized (E&U) properties now?

The figure below, created by the GSA, diagrams the disposal process. Generally speaking, E&U federal properties are first assessed for possible public purposes, then made available to private individuals and companies by competitive bid. Note that not every E&U federal property goes through every step of the process illustrated below.

Has anything like “Putting Redlines in the Green” been tried before?

Community organizations such as the Oakland Community Land Trust (CLT) in California and the Dudley Street Neighborhood Initiative (DSNI) in Boston, MA have revitalized their once economically distressed communities from the bottom up. Even initiatives such as the Wynwood Business Improvement District (BID) in Miami, which became susceptible to extreme gentrification following the recent removal of its Arts & Entertainment district status, succeeded in economically revitalizing an area that was once heralded as the “Crime Center of Miami.” However, there has never been an urban policy that has attempted to recreate the success of these localized initiatives within distressed areas across the United States. Additionally, no governmental effort has attempted to achieve urban revitalization of distressed areas through the framework of financial empowerment, community autonomy, and community-owned enterprise. “Putting Redlines in the Green” is the first to amalgamate the best elements of community-driven initiatives like those cited above and convert them into implementable urban policy.

Could “Putting Redlines in the Green” spur gentrification? How would it ensure that the INMs it creates remain community-based and -oriented?

Gentrification occurs when new development in an area displaces current residents and businesses within that area through economic pressures (such as rising rents, mortgages, and property taxes). Gentrification requires urban revitalization, but urban revitalization does not inevitably lead to gentrification. “Putting Redlines in the Green” would promote “development without displacement.” To ensure that Innovative Neighborhood Markets (INMs) remain community-based and -oriented leading up to and after their launch, “Putting Redlines in the Green” would empower residents through a community-governance structure that controls development, creates economic opportunity, and vastly mitigates the likelihood of gentrification. The Dudley Street Neighborhood Initiative (DSNI) is one example of such a governance structure that has succeeded.

How will “Putting Redlines in the Green” establish relationships with and attract buy-in from funders?

History suggests that the creation of community enterprise within areas susceptible to gentrification (i.e., historically redlined neighborhoods) will systematically attract buy-in. As some economists, scholars, and historians have postulated since the 1900s, gentrification is a consumer cycle that is heavily driven by the movement of money (usually in the form of affluent individuals looking for the newest housing stock) into areas that are nearing the end of their economic life. Thus, the new development associated with INMs will likely attract funders and buy-in from external parties.

Is there competition within the disposal process that could make procuring sites for INMs difficult?

According to the U.S. General Services Administration (GSA)’s Office of Real Property Utilization and Disposal (ORPUD), most excess property does not get transferred between the 34 federal agencies due to “specificity” of the buildings. Thus, there is limited interagency competition for disposed government property. In fact, most E&U federal properties move on to the surplus-property stage, where they may be acquired by state and local governments (i.e., “public benefit conveyance”).


At the public benefit conveyance stage, there are currently 12 legislative actions that grant special consideration for transfer or conveyance of surplus real and related personal property to state government, local government, and certain nonprofits at up to a 100% discount for public benefit use. To avoid competing with these existing conveyance priorities, it is therefore preferable that E&U sites for INMs be acquired during the federal stage of the disposal process.

What are some examples of regional partners that would support “Putting Redlines in the Green”? What roles would regional partners play within INMs?

Regional partners could include nonprofits (e.g., the Center for Environmental Farming Systems [CEFS]’s Curriculum on Racial Equity [CORE], which could advise on best practices for expanding access to locally produced food while promoting healthier lifestyles) or private-sector entities (e.g., Community Development Financial Institutions [CDFIs], which could advise on how to help local entrepreneurs achieve their financial goals and on how INMs can support business development by leveraging legislation like the Community Reinvestment Act of 1977). Regardless of size or sector, the role of regional partners would be to empower the communities participating in “Putting Redlines in the Green” as they help shape, launch, and maintain INMs.

How does “Putting Redlines in the Green” differ from existing economic-development programs, such as EPA’s Smart Growth Program? What about economic-revitalization efforts launched under previous administrations, such as Opportunity Zones or Promise Zones?

“Putting Redlines in the Green” could be accurately described as a specialized smart-growth technical-assistance program that specifically addresses sustainable development in redlined communities. “Putting Redlines in the Green” could also be accurately described as an economic-revitalization effort. But while other federally sponsored economic-development and -revitalization programs have relied heavily on top-down grants and tax incentives, “Putting Redlines in the Green” will take a bottom-up approach based on community-led transformation of excess and underutilized federal properties into vibrant, locally grounded business enterprises.

Addressing the Mental Health Crisis Among Predoctoral and Postdoctoral Researchers in STEM

Summary

The growing mental-health crisis among science, technology, engineering, and math (STEM) doctoral and postdoctoral researchers threatens the future and competitiveness of science and technology in the United States. The federal government should tackle this crisis through a four-part approach to (i) improve data collection on the underlying drivers of mental-health struggles in STEM, (ii) discourage behaviors and cultures that perpetuate stress, (iii) require Principal Investigators (PIs) to submit a statement of their mentoring philosophy as part of applications for federally supported research grants, and (iv) increase access to mental-health care for predoctoral and postdoctoral researchers.

Challenge and Opportunity

The prevalence of mental-health problems is higher among Ph.D. students than in the highly educated general population: fully half of Ph.D. students experience psychological distress. In a survey of postdoctoral researchers conducted by Nature, 51% of respondents reported considering leaving science due to work-related mental-health concerns; 65% reported experiencing power imbalances or bullying during their postdoctoral appointments, and 74% reported observing the same. Stress accumulation not only leads to the development of neuropsychiatric disorders among the developing STEM workforce — it also contributes to burnout. At a time when advancing U.S. competitiveness in science and technology is of utmost importance, the mental-health crisis is depleting our nation’s STEM pipeline when we should be expanding and diversifying it. This is a crisis that the federal government is well-positioned to solve, and one it must.

Plan of Action

The federal government should counter the mental-health crisis for U.S. doctoral and postdoctoral researchers through a four-part approach to (i) improve data collection on the underlying drivers of mental-health struggles in STEM, (ii) discourage behaviors that perpetuate stress, (iii) require PIs to submit a statement of their mentoring philosophy as part of applications for federally supported research grants, and (iv) increase access to mental-health care for doctoral and postdoctoral researchers. Detailed recommendations associated with each of these steps are provided below.

Part 1. Improve data collection

Data drives public policy. Various organizations conduct surveys evaluating the mental health of doctoral and postdoctoral researchers in STEM, but survey designs, target audiences, and subsequent follow-up and monitoring are inconsistent. This fragmented information ecosystem makes it difficult to integrate and act on existing data on mental health in STEM. To provide a more comprehensive picture of the STEM mental-health landscape in the United States, the National Institutes of Health (NIH) and the National Science Foundation (NSF) should work together to conduct and publish biennial evaluations of the state of mental health of the STEM workforce. The survey format could be modeled on the NSF’s Survey of Doctorate Recipients or the Survey of Earned Doctorates — and, like those surveys, resultant data could be maintained at NSF under the National Center for Science and Engineering Statistics. Once established, survey data can be used to track the effectiveness of implemented programs and to direct the federal government to adjust existing initiatives or launch new ones to meet the needs of doctoral and postdoctoral researchers. Additionally, the NSF and NIH could partner with physicians within HHS to define what “healthy” means in terms of mental-health guidelines in order to establish new program guidelines and goals.

Part 2. Discourage problematic behaviors

The future of a doctoral or postdoctoral researcher depends considerably on the researcher’s professional relationship with their PI(s). Problems in the relationship — including bullying, harassment, and discrimination — can put a trainee in a difficult situation, as the trainee may worry that confronting the PI could compromise their career opportunities. The federal government can take three steps to discourage these problematic behaviors.

First, the White House Office of Science and Technology Policy (OSTP) should assemble a committee of professionals in psychology, social sciences, and human resources to define what behaviors constitute bullying and harassment in academic work environments. The committee’s findings should be publicized via a web portal (similar to NSF’s website on Sexual Harassment), and included in all requests for grant applications issued by federal STEM-funding agencies (in order to raise awareness among PIs).

Second, federal STEM-funding agencies should require universities to submit annual reports of bullying to federal grant-issuing agencies. NSF already requires institutions to report findings of sexual harassment and other forms of harassment, and can revoke grants if a grantee is found culpable. NSF and other STEM-funding agencies should clarify this definition and broaden the reporting requirement to include bullying and retaliation attempts by PIs, with similar consequences for repeated offenses. Reinstatement of privileges (e.g., reinstatement of eligibility for federal grant funding) would be considered on a case-by-case basis by the grant-issuing agency and could be made contingent on implementation of an adequate “re-entry” plan by the PI’s home institution. The NIH Office of Behavioral and Social Science Research should be consulted to help formulate such “re-entry” plans to benefit both researchers and PIs.

Third, STEM-funding agencies could work together to establish a mechanism whereby trainees can anonymously report problematic PI behaviors. NSF has a complaint form for those who wish to report incidents of sexual harassment or other harassment. NSF could expand this system to accept broader categories of incidents, such as bullying and retaliation attempts, and NIH could use the complaint form as a template for its own reporting mechanism. In conjunction with reporting misconduct, a “two-strike” accountability system should be imposed if a PI is found to have engaged in harassment, bullying, or other behaviors that could contribute to the development of a neuropsychiatric disorder. After receiving a first strike (a substantiated report of problematic behavior), the PI would be given a warning and be required to participate in relevant training workshops and counseling using a plan outlined by social-science professionals at NIH. If a second strike is received, the PI would lose privileges to apply for federal grant funding as well as opportunities to serve on committees that are often favored for tenure and promotion, such as grant-review committees. Again, reinstatement of privileges would be considered on a case-by-case basis by the grant-issuing agency and could be made contingent on implementation of an adequate “re-entry” plan.

Part 3. Require submission of mentoring philosophies

NIH F31 predoctoral and F32 postdoctoral award applications already require PIs to submit mentoring plans for their trainees to receive professional-development training. Federal STEM-funding agencies should build on this precedent by requiring PIs applying for federal grants to submit not just mentoring plans, but brief summaries of their mentoring philosophies. As the University of Colorado Boulder explains, a mentoring philosophy

“…defines [a mentor’s] approach to engaging with students as [they] guide their personal growth and professional development, often explaining [the mentor’s] motivation to mentor with personal narratives while highlighting their goals for successful relationships and broader social impact. These statements may also be considered ‘living documents’ that are updated as [the mentor] refine[s] [their] approach and the context and goals of [their] work changes.”

Mentoring philosophies help guide development of and updates to individualized mentoring plans. Mentoring philosophies also promote equity and inclusion among mentees by providing a common starting point for communication and expectations. Requiring PIs to create mentoring philosophies will elevate mental health among doctoral and postdoctoral researchers in STEM by promoting effective top-down mentorship and discouraging unintended marginalization. And since a growing number of university faculty are already creating mentoring philosophies, this new requirement shouldn’t be seen as just another administrative burden; rather, it would serve as a means to quickly perpetuate a best practice that is already spreading. The federal government can support PIs in adhering to this new requirement by working with external partners to collect and broadly share resources related to preparing mentoring philosophies. The Center for the Improvement of Mentored Experiences in Research, for instance, has already assembled a suite of such resources on its web platform. 

Part 4. Increase access to mental health care

Concurrent with reducing causes of mental-health burdens, the federal government should work to expand doctoral and postdoctoral researchers’ access to adequate mental-health care. Current access may vary considerably depending on the level of insurance coverage offered by a researcher’s home institution. Inspired by legislation introduced in the 117th Congress (S. 3048, the Stopping the Mental Health Pandemic Act, which would allow funds to be used to support and enhance mental-health services), the Department of Health and Human Services (HHS) should partner with federal STEM-funding agencies to design and implement new pathways, programs, and opportunities to strengthen mental-health care among early-career STEM professionals. In particular, the federal government could create a library of model policies that federally funded public and private institutions could adopt to strengthen mental-health care for employed early-career researchers. Examples include allowing trainees to take time off during the workday to receive mental-health treatment without expectations to make up hours outside of business hours, providing a supplemental stipend for trainees to pay for therapy costs that are not covered by insurance, and addressing other sources of strain that can exacerbate difficult situations, such as increasing stipends to reduce financial stress.

Conclusion

The U.S. science and technology enterprise is only as strong as the workforce behind it. Failing to address the mental-health crisis that plagues early-career researchers will lead the United States to fall behind in global research and development due to talent attrition. President Biden’s 2022 State of the Union address cited mental health as a priority area of concern. There is an especially clear need for a culture change around mental health in academia. The four actions detailed in this memo align with the President’s policy agenda. By improving data collection on the mental-health status of STEM doctoral and postdoctoral researchers, discouraging behaviors and cultures that produce stress among this population, improving training and mentorship at universities, and expanding access to mental-health care among STEM doctoral and postdoctoral researchers, the federal government can ensure that success for early-career STEM professionals does not demand mental-health sacrifice.

Frequently Asked Questions
Why does this proposal focus on early-career professionals in STEM and not on other fields?

STEM fields are closely tied to the U.S. economy, supporting two-thirds of U.S. jobs and 69% of the U.S. Gross Domestic Product (GDP). Attrition of U.S. researchers from STEM fields due to mental-health challenges has disproportionately adverse effects on American society and undermines U.S. competitiveness. Policymakers should prioritize actions designed to combat the mental-health crisis in STEM.

Bullying and harassment are subjective behaviors. How can the federal government prevent false allegations from being submitted by doctoral and postdoctoral researchers?

NSF already requires that universities that receive federal research funding conduct internal investigations to validate claims of harassment and sexual harassment. Similar policies could be implemented regarding reported bullying and/or workplace harassment. If an allegation is found to be false, it should be handled by university-specific policies.

If bullying and harassment are causing serious issues in STEM training, why should a PI be allowed “re-entry” to apply for federal funding to mentor students and postdocs after workshops and therapy are completed?

The goal of requiring PIs to attend mentorship workshops and therapy sessions is to help them better themselves and improve their ability to mentor the next generation of STEM professionals. Re-entry to mentoring trainees should be closely monitored by university leadership, who should survey both mentors and mentees to determine whether the PI understands (a) their previous misconduct and (b) the lasting mental-health effects that their previous actions inflicted on their trainees.

NIH and NSF aren’t the only federal agencies that fund the training of early-career researchers. What about the others?

NIH and NSF are arguably the two leading federal agencies when it comes to providing federal funding for graduate students. That said, recommendations presented in this memo could easily be extended to other STEM-funding agencies. For instance, there is a timely opportunity to extend these recommendations to the Department of Energy (DOE). DOE is currently working to manage the President’s major FY23 investment in clean energy and sustainability, including through significant research-grant funding. Coupling these new grants with policies designed to mitigate mental-health burdens among early-career researchers could help foster a more resilient and productive clean-energy workforce and serve as a pilot group for the NIH and NSF to follow.

Requiring the reporting of bullying or harassment by a PI is an administrative burden. Why should universities take on increased responsibilities in this area?

The administrative responsibilities for reporting are minimal. NSF’s Organizational Notification of Harassment Form can, at a minimum, serve as a template for NSF, NIH, and other agencies to be notified of university findings of misconduct. Alternatively, doctoral and postdoctoral researchers could report incidents directly to federal agencies through a mechanism similar to NSF’s existing complaint form, which would reduce the initial administrative burden on university employees but may create additional work once federal agencies conduct their investigations.

Some universities are offering free yoga and meditation classes for predoctoral and postdoctoral researchers. Others are offering training courses on developing resilience to stress. Aren’t these opportunities sufficient for alleviating mental health concerns?

While the strategies above teach researchers how to cope with stress, a longer-term, more supportive approach is to reduce stress at its source. Actions such as addressing harassment and bullying will benefit not only the researchers themselves but also others in the work environment by fostering a responsible, low-stress culture.

How are mentoring philosophies different from mentoring plans?

The submission of mentoring plans by PIs is currently required for NIH pre- and postdoctoral fellowship applications. These plans are meant to supplement the training of a researcher by focusing on the logistics of skill building. However, mentorship transcends knowledge and skill building: it also encompasses the holistic development of a researcher, supporting and respecting their interests, values, and individual circumstances. Submission of a mentoring philosophy is thus meant to stimulate thought and conversation about how a PI wants to communicate openly and honestly with their trainees and how they can adapt to the mentoring style that best fits each trainee.

Reduce, Repurpose, Recharge: Establishing a Collaborative Doctrine of Groundwater Management in the Ogallala Aquifer

Summary

Climate change has resulted in extreme and irregular rain events across the United States. Consequently, farmers in the High Plains region have become increasingly dependent on the Ogallala Aquifer for water supplies. With an estimated value of $35 billion, this aquifer supports one-fifth of the nation’s wheat, corn, cotton, and cattle. The Ogallala once held enough water to fill Chicago’s Sears Tower over 2,000 times. Today, the aquifer has lost 30% of its supply, and it is being recharged at half the rate it is being depleted. The consequence of inaction is 70% aquifer depletion by 2060, which would reduce crop output by 30–40%.

Figure 1. Changes in groundwater levels in the Ogallala Aquifer from predevelopment to 2015. Adapted in the Fourth National Climate Assessment from McGuire et al. (2017).
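The depletion figures above imply a simple status-quo trajectory. As a back-of-the-envelope sketch (assuming a constant net depletion rate and a 2022 baseline year, both simplifications rather than official estimates):

```python
# Back-of-the-envelope projection of Ogallala depletion under the status quo.
# Assumptions (simplifications, not official estimates): a constant net
# depletion rate and a 2022 baseline year.
BASE_YEAR = 2022
DEPLETED_NOW = 0.30    # fraction of original supply already lost
DEPLETED_2060 = 0.70   # projected loss by 2060 if nothing changes

# Net rate implied by the two data points (roughly one percentage point/year)
net_rate = (DEPLETED_2060 - DEPLETED_NOW) / (2060 - BASE_YEAR)

def depletion(year: int) -> float:
    """Fraction of the aquifer's original supply lost by `year`."""
    return DEPLETED_NOW + net_rate * (year - BASE_YEAR)

print(f"Implied net depletion rate: {net_rate:.2%} of original supply per year")
print(f"Projected depletion by 2040: {depletion(2040):.0%}")
```

Because the rate is assumed constant, the halfway point (50% depletion) lands near the middle of the projection window, in the early 2040s — a reminder that delay is costly even before 2060.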

This $14 billion loss to the High Plains agricultural production may be slowed and eventually reversed by (1) reducing Ogallala use, (2) repurposing existing supplies, and (3) recharging the aquifer. The U.S. Department of Agriculture (USDA), in collaboration with the Department of the Interior (DOI) and the Federal Emergency Management Agency (FEMA), should accordingly create the Reduce, Repurpose, Recharge Initiative (RRRI), a voluntary program designed to keep farmers engaged in groundwater conservation. This multi-state program will provide financial incentives to participating farmers in exchange for pledges to limit groundwater withdrawal and participate in training that will equip them with knowledge needed to fulfill those pledges. The RRRI will also make expert advisors available to consult with farmers on policies and funding opportunities related to groundwater conservation. Finally, this program will connect farmers across state lines, allowing them to learn from each other and work together on sustainable management of the Ogallala. The program should be funded through the various water-sustainability budgets of the DOI and USDA, as well as through FEMA’s Building Resilient Infrastructure and Communities grant program.

Challenge and Opportunity

Climate-change-induced droughts have increased the nation’s dependence on groundwater as a source for agriculture, industry, and domestic use. Excessive groundwater pumping has led to land subsidence and deterioration of water quality, increasing water-use cost and jeopardizing crop yield. The problem is especially acute in the Ogallala Aquifer of the High Plains region. The aquifer underlies eight states of the nation’s breadbasket — including Nebraska, Kansas, and Texas — and spans 175,000 square miles. Dependence on the Ogallala has depleted its supply by 30% to date, as shown in Figure 1. Ninety percent of the water withdrawn from the Ogallala is used for agricultural irrigation.

Strategic plans for the USDA and DOI make it clear that drought preparedness and water conservation/sustainability are national priorities. Multiple federal efforts exist to advance these priorities. Publicly accessible platforms hosting and providing groundwater data exist at the United States Geological Survey (USGS), the National Institute of Food and Agriculture (NIFA), and the cross-agency National Integrated Drought Information System (NIDIS) partnership. The 2018 Farm Bill strengthened technical- and financial-assistance programs to help individual farms implement water-conservation technology; the bill also created an incentive program for agriculture-to-wetland conversion. From 2011–2018, the USDA’s Natural Resources Conservation Service (NRCS) ran the Ogallala Aquifer Initiative (OAI) to “support targeted, local efforts to conserve the availability of water, both its quantity and quality, in each of the States” covering the Ogallala. The OAI was successful in meeting its water-conservation goals. Recent surveys found that 93% of agricultural producers in the High Plains region believe that water conservation is important.

These past and ongoing initiatives demonstrate that federal will and stakeholder buy-in for aquifer conservation and restoration are there. The key need is for a program that provides farmers the incentives and technical assistance needed to minimize groundwater reliance, ending the tragedy of the commons in the Ogallala once and for all.

Plan of Action

USDA, DOI, and FEMA should launch a joint program designed to embed the three pillars of groundwater conservation — Reduce, Repurpose, and Recharge — into the practices of farmers in Ogallala states. The RRRI will provide a financial incentive to farmers in exchange for farmer commitments to:

  1. Achieve specified water-conservation targets.
  2. Participate in training opportunities and workshops teaching best practices for water conservation and aquifer recharge.

To succeed, the RRRI will require enthusiastic, voluntary participation from farmers across the High Plains region. Participation should be voluntary because studies have shown that voluntary programs are significantly more effective than mandates in achieving water-conservation goals.

In a comparative case study of voluntary versus mandated water restrictions, farmers under the voluntary restriction conserved more water than those under the mandatory regulation. A survey of these farmers identified the group-education component of the voluntary program as the driving force behind their conservation. Another survey similarly found that farmers’ altruistic views of water conservation led to longer-lasting participation in water-conservation activities. A comprehensive review of the outcomes of different water policies found that educational programs about water conservation were more effective than mandatory water-use restrictions at reducing water use and improving attitudes towards water conservation.

To encourage voluntary participation, farmers who enroll in the RRRI would receive a financial incentive. The exact nature of the incentive would need to be determined by the implementing agencies, but could include preferential price setting, preferential market placement, or subsidies based on crop type. In exchange, farmers would agree to an initial water-use assessment performed by field experts (either employees or contractors of USDA or DOI). An appointed advisor (again, either employees or contractors of USDA or DOI) would then work with each farmer to establish long-term (5-year) water-conservation targets based on the assessment results. Each participating farmer would meet quarterly with their advisor to review their water-conservation plan, assess progress towards targets, make mutually agreeable target adjustments, and discuss challenges and solutions. Advisors would also be available in between quarterly meetings for interim questions and concerns.

Farmers who enroll in the RRRI would also commit to attending group trainings and workshops designed to help them identify and implement best water-conservation practices. These learning opportunities would be led by experts sourced from existing agricultural committees (e.g., NRCS Conservation Planners and Technical Service Providers, State Technical Committees, etc.) and water-conservation groups (e.g., Ogallala Water Coordinated Agriculture Project, Groundwater Protection Council, etc.). The group-education curriculum would cover the three tenets of groundwater conservation: reduce, repurpose, and recharge. Table 1 provides a brief description of each tenet, along with examples of aligned activities and potential sources of funding for those activities. The curriculum would teach farmers how each tenet contributes to groundwater conservation, existing and emerging technologies and practices that farmers can implement to achieve each tenet, and financial vehicles available to fund implementation. An added benefit of the group education will be the establishment of a community of farmers across the Ogallala states in which ideas and experiences can be shared.

| Tenet | Definition | Example activities | Potential funding source(s) |
| --- | --- | --- | --- |
| Reduce | Minimize water needs for existing systems | More efficient irrigation | NRCS’s Agricultural Management Assistance and Conservation Innovation Grants |
| Repurpose | Move away from water-intensive practices | Switch to less water-intensive crops | NRCS’s Regional Conservation Partnership Program and Conservation Stewardship Program |
| Recharge | Replenish the groundwater source (aquifer) | Capture excess stormwater; convert agricultural land to wetlands | FEMA’s Building Resilient Infrastructure and Communities Grant; NRCS’s Agricultural Conservation Easement Program |
Table 1. Definition, example activities, and potential funding sources for each groundwater-conservation tenet.

The RRRI should be established as a multi-agency collaboration. Each involved agency (USDA, DOI, and FEMA) can provide unique expertise. USDA can leverage its research arm, NIFA, to produce up-to-date technology recommendations and scientific assessments. USDA’s NRCS can provide the underlying technical and financial support for realizing the RRRI tenets. DOI can rely on USGS’s existing groundwater database and the NIDIS’s affiliated expert community of data scientists to support the granular, up-to-date groundwater measurements needed to assess water-conservation progress. DOI’s Bureau of Land Management (BLM) can ensure the RRRI tenets are enacted (in parallel with implementation on privately owned farmland) across public lands in the High Plains region. Finally, FEMA can collaborate with NIDIS and with USDA’s Risk Management Agency (RMA) to formally assess risks of drought and Ogallala depletion — assessments that can be used to make the case for the RRRI to farmers, funders, and policymakers. 

Early actions needed to launch the RRRI include:

Conclusion

Climate-change-induced droughts have increased farmer dependence on groundwater, resulting in a 30% depletion of the Ogallala Aquifer to date. Under current management practices, depletion of the Ogallala will reach 70% by 2060. We can solve the problem. The technology, technical expertise, programmatic and data infrastructure, and financial support for groundwater conservation exist. The key need is to directly connect farmers with — and motivate them to use — these resources. A joint USDA/DOI/FEMA program founded in the “Reduce, Repurpose, Recharge” tenets of water conservation can do just that for farmers across the High Plains region. By coupling financial incentives with tailored water-conservation targets, technical expertise, and group educational opportunities, the RRRI will meaningfully advance the long-term security of the critically important Ogallala—and the farmers whose livelihoods depend on it.

Frequently Asked Questions
What is the estimated cost of this program?

Based on the budget for the Ogallala Aquifer Initiative, the RRRI would require $25 million per year for 10-20 years to support the program’s staff and cover travel costs. This funding can be drawn from water-sustainability discretionary funds already allocated at USDA and DOI as well as FEMA’s Building Resilient Infrastructure and Communities grant program.

What existing technologies can promote sustainable groundwater management?

Publications from the Ogallala Water Coordinated Agriculture Project cite numerous examples of existing technologies and practices that can promote sustainable groundwater management, including irrigating with recycled water (i.e., direct non-potable reuse) and shifting to dryland farming.

How does the hydrology of the Ogallala region lend itself to aquifer recharge?

The sandy soils of the High Plains are ideal for managed aquifer recharge as they allow for fast infiltration.

Why focus on the Ogallala Aquifer when groundwater depletion is an issue across the US?

With no existing federal regulation on groundwater use, the country needs a pilot program that demonstrates the effectiveness of an interstate groundwater-use policy, creates precedent for future policymaking, and begins to optimize water-use policies at scale. The Ogallala is one of the largest and most productive aquifers in the world, and conserving the agriculture it supports is required for a sustainable future.

Why won’t the federal government just put a limit on groundwater pumping?

While the federal government has regulations in place dictating water quality through the Environmental Protection Agency’s Clean Water Act and Safe Drinking Water Act, water-allocation policy is left up to the states. Among the eight states above the Ogallala Aquifer, there are four distinct doctrines that define groundwater law, some in direct conflict with one another. State authority over water resources makes it difficult for the federal government to implement mandatory groundwater-conservation measures. Voluntary programs like the RRRI are an effective mechanism to reach groundwater-conservation goals without infringing on states’ water rights.

Establishing the AYA Research Institute: Increasing Data Capacity and Community Engagement for Environmental-Justice Tools

Summary

Environmental justice (EJ) is a priority issue for the Biden Administration, yet the federal government lacks the capacity to collect and maintain the data needed to adequately identify and respond to EJ issues. EJ tools meant to resolve these issues — especially the Environmental Protection Agency (EPA)’s EJSCREEN tool — are gaining national recognition. But knowledge gaps and a dearth of EJ-trained scientists are preventing EJSCREEN from reaching its full potential. To address these issues, the Administration should allocate a portion of the EPA’s Justice40 funding to create the “AYA Research Institute”, a think tank under EPA’s jurisdiction. Derived from the Adinkra symbol, AYA means “resourcefulness and defiance against oppression.” The AYA Research Institute will functionally address EJSCREEN’s limitations and increase federal capacity to identify and effectively resolve existing and future EJ issues.

Challenge and Opportunity

Approximately 200,000 people in the United States die every year of pollution-related causes. These deaths are concentrated in underresourced, vulnerable, and/or minority communities. The EPA created the Office of Environmental Justice (OEJ) in 1992 to address systematic disparities in environmental outcomes among different communities. The primary tool that OEJ relies on to consider and address EJ concerns is EJSCREEN. EJSCREEN integrates a variety of environmental and demographic data into a layered map that identifies communities disproportionately impacted by environmental harms. This tool is available for public use and is the primary screening mechanism for many initiatives at state and local levels. Unfortunately, EJSCREEN has three major limitations:

  1. Missing indicators. EJSCREEN omits crucial environmental indicators such as drinking-water quality and indoor air quality. OEJ states that these crucial indicators are not included due to a lack of resources available to collect underlying data at the appropriate quality, spatial range, and resolution. 
  2. Estimates for small areas are less accurate. There is considerable uncertainty in EJSCREEN’s environmental and demographic estimates at the census block group (CBG) level. This is because (i) EJSCREEN’s assessments of environmental indicators can rely on data collected at scales less granular than the CBG, and (ii) some of EJSCREEN’s demographic estimates are derived from surveys (as opposed to census data) and are therefore less consistent.
  3. Deficiencies in a single dataset can propagate across EJSCREEN analyses. Environmental indicators and health outcomes are inherently interconnected. This means that subpar data on certain indicators — such as emissions levels, ambient pollutant levels in air, individual exposure, and pollutant toxicity — can compromise the reliability of EJSCREEN results on multiple fronts. 
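To make the third limitation concrete, the sketch below shows how a single environmental dataset can feed the index of every census block group, so a systematic error in that one dataset shifts every downstream result. This is a hypothetical illustration only — the indicator, weighting, and values are invented and this is not EPA’s actual EJSCREEN formula:

```python
# Hypothetical illustration of a layered screening index (NOT EPA's official
# EJSCREEN formula; the indicator, weighting, and values are invented).
from dataclasses import dataclass

@dataclass
class BlockGroup:
    name: str
    pm25_percentile: float    # environmental indicator, 0-100
    demographic_index: float  # e.g., share low-income / minority, 0-1

def ej_index(bg: BlockGroup) -> float:
    """Toy index: environmental percentile weighted by demographics."""
    return bg.pm25_percentile * bg.demographic_index

cbgs = [
    BlockGroup("CBG-A", pm25_percentile=90.0, demographic_index=0.8),
    BlockGroup("CBG-B", pm25_percentile=60.0, demographic_index=0.8),
]

# A systematic error in the single PM2.5 dataset (here, a uniform -10 point
# bias) propagates to every block group's index at once:
biased = [ej_index(BlockGroup(bg.name, bg.pm25_percentile - 10,
                              bg.demographic_index))
          for bg in cbgs]
print([ej_index(bg) for bg in cbgs])
print(biased)
```

The point of the sketch is structural: because one dataset enters every index, improving (or corrupting) that dataset moves the results for every community simultaneously.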

These limitations must be addressed to unlock the full potential of EJSCREEN as a tool for informing research and policy. More robust, accurate, and comprehensive environmental and demographic data are needed to power EJSCREEN. Community-driven initiatives are a powerful but underutilized way to source such data. Yet limited time, funding, rapport, and knowledge tend to discourage scientists from engaging in community-based research collaborations. In addition, effectively operationalizing data-based EJ initiatives at a national scale requires the involvement of specialists trained at the intersection of EJ and science, technology, engineering, and math (STEM). Unfortunately, relatively poor compensation discourages scientists from pursuing EJ work — and scientists who work on other topics but have interest in EJ can rarely commit the time needed to sustain long-term collaborations with EJ organizations. It is time to augment the federal government’s past and existing EJ work with redoubled investment in community-based data and training.

Plan of Action

EPA should dedicate $20 million of its Justice40 funding to establish the AYA Research Institute: an in-house think tank designed to functionally address EJSCREEN’s limitations as well as increase federal capacity to identify and effectively resolve existing and future EJ issues. The word AYA is the formal name for the Adinkra symbol meaning “resourcefulness and defiance against oppression” — concepts that define the fight for environmental justice.

The Research Institute will comprise three arms. The first arm will increase federal EJ data capacity through an expert advisory group tasked with providing and updating recommendations to inform federal collection and use of EJ data. The advisory group will focus specifically on (i) reviewing and recommending updates to environmental and demographic indicators included in EJSCREEN, and (ii) identifying opportunities for community-based initiatives that could help close key gaps in the data upon which EJSCREEN relies.

The second arm will help grow the pipeline of EJ-focused scientists through a three-year fellowship program supporting doctoral students in applied research projects that exclusively address EJ issues in U.S. municipalities and counties identified as frontline communities. The program will be three years long so that participants are able to conduct much-needed longitudinal studies that are rare in the EJ space. To be eligible, doctoral students will need to (i) demonstrate how their projects will help strengthen EJSCREEN and/or leverage EJSCREEN insights, and (ii) present a clear plan for interacting with and considering recommendations from local EJ grassroots organization(s). Selected students will be matched with grassroots EJ organizations distributed across five U.S. geographic regions (Northeast, Southeast, Midwest, Southwest, and West) for mentorship and implementation support. The fellowship will support participants in achieving their academic goals while also providing them with experience working with community-based data, building community-engagement and science-communication skills, and learning how to scale science policymaking from local to federal systems. As such, the fellowship will help grow the pipeline of STEM talent knowledgeable about and committed to working on EJ issues in the United States.

The third arm will embed EJ expertise into federal decision making by sponsoring a permanent core of resident staff, supported by “visitors” (i.e., the doctoral fellows), to produce policy recommendations, studies, surveys, and qualitative and quantitative analyses centered on EJ. This model will rely on the resident staff to maintain strong relationships with federal government and extragovernmental partners and to ensure continuity across projects, while the fellows provide ancillary support as appropriate based on their skills/interest and Institute needs. The fellowship will act as a screening tool for hiring future members of the resident staff.

Taken together, these arms of the AYA Research Institute will help advance Justice40’s goal of improving training and workforce development, as well as the Biden Administration’s goal of better preparing the United States to adapt and respond to the impacts of climate change. The AYA Research Institute can be launched with $10 million: $4 million to establish the fellowship program with an initial cohort of 10 doctoral students (receiving stipends commensurate with typical doctoral stipends at U.S. universities), and $6 million to cover administrative expenses and staff expert salaries. Additional funding will be needed to maintain the Institute if it proves successful after launch. Funding for the Institute could come from Justice40 funds allocated to EPA. Alternatively, EPA’s fiscal year (FY) 2022 budget for science and technology clearly states a goal of prioritizing EJ — funds from this budget could hence be allocated towards the Institute using existing authority. Finally, EPA’s FY 2022 budget for environmental programs and management dedicates approximately $6 million to EJSCREEN — a portion of these funds could be reallocated to the Institute as well.

Conclusion

The Biden-Harris Administration is making unprecedented investments in environmental justice. The AYA Research Institute is designed to be a force multiplier for those investments. Federally sponsored EJ efforts involve multiple programs and management tools that directly rely on the usability and accuracy of EJSCREEN. The AYA Research Institute will increase federal data capacity and help resolve the largest gaps in the data upon which EJSCREEN depends in order to increase the tool’s effectiveness. The Institute will also advance data-driven environmental-justice efforts more broadly by (i) growing the pipeline of EJ-focused researchers experienced in working with data, and (ii) embedding EJ expertise into federal decision making. In sum, the AYA Research Institute will strengthen the federal government’s capacity to strategically and meaningfully advance EJ nationwide. 

Frequently Asked Questions
How does this proposal align with grassroots EJ efforts?

Many grassroots EJ efforts are focused on working with scientists to better collect and use data to understand the scope of environmental injustices. The AYA Research Institute would allocate in-kind support to advance such efforts and would help ensure that data collected through community-based initiatives is used as appropriate to strengthen federal decision-making tools like EJSCREEN.

How does this proposal align with the Climate and Economic Justice Screening Tool (CEJST) recently announced by the Biden administration?

EJSCREEN and CEJST are meant to be used in tandem. As the White House explains, “EJSCREEN and CEJST complement each other — the former provides a tool to screen for potential disproportionate environmental burdens and harms at the community level, while the latter defines and maps disadvantaged communities for the purpose of informing how Federal agencies guide the benefits of certain programs, including through the Justice40 Initiative.” As such, improvements to EJSCREEN will inevitably strengthen deployment of CEJST.

Has a think tank ever been embedded in a federal government agency before?

Yes. Examples include the U.S. Army War College Strategic Studies Institute and the Asia-Pacific Center for Security Studies. Both entities have been successful and serve as primary research facilities for their agencies.

What criteria would the AYA Research Institute use to evaluate doctoral students who apply to its fellowship program?

To be eligible for the fellowship program, applicants must have completed one year of their doctoral program and be current students in a STEM department. Fellows must propose a research project that would help strengthen EJSCREEN and/or leverage EJSCREEN insights to address a particular EJ issue. Fellows must also clearly demonstrate how they would work with community-based organizations on their proposed projects. Priority would be given to candidates proposing the types of longitudinal studies that are rare but badly needed in the EJ space. To ensure that fellows are well equipped to perform deep community engagement, additional selection criteria for the AYA Research Institute fellowship program could draw from the criteria presented in the rubric for the Harvard Climate Advocacy Fellowship.

What can be done to avoid politicizing the AYA Research Institute, and to ensure the Institute’s longevity across administrations?

A key step will be grounding the Institute in the expertise of salaried, career staff. This will offset potential politicization of research outputs.

What existing data does EJSCREEN use?

EJSCREEN 2.0 largely uses data from the U.S. Census Bureau’s 2020 American Community Survey, along with many other sources (e.g., the Department of Transportation (DOT) National Transportation Atlas Database and the Community Multiscale Air Quality (CMAQ) modeling system). The EJSCREEN Technical Document details the data sources that EJSCREEN relies on.

What are the demographic and environmental indicators of interest included in EJSCREEN?

The demographic indicators are: people of color, low income, unemployment rate, linguistic isolation, less than high school education, under age 5 and over age 64. The environmental indicators are: particulate matter 2.5, ozone, diesel particulate matter, air toxics cancer risk, air toxics respiratory hazard index, traffic proximity and volume, lead paint, Superfund proximity, risk management plan facility proximity, hazardous waste proximity, underground storage tanks and leaking UST, and wastewater discharge.

Creating the Make it in America Regional Challenge

Summary

In response to growing supply-chain challenges and rising inflation, the Biden Administration should create a national competition — The Make it in America Regional Challenge (MIARC) — that activates demand in underinvested regions through cluster-based techno-economic development. MIARC would be a $10 billion, two-phase competition awarding planning grants to 30–50 regions and then up to $1 billion each to 10–15 ultimate winners to strengthen regional capacity in economic clusters that align with critical U.S. supply-chain priorities.

Challenge and Opportunity

Roughly one in five Americans cite the high cost of living or fuel prices as the most important problem facing the United States. Meanwhile, the COVID-19 pandemic, global competition with China, and the Russian invasion of Ukraine have exposed significant, long-standing weaknesses in U.S. supply chains. For example, more than 40 percent of active pharmaceutical ingredients, 50 percent of global personal protective equipment supplies, and 90 percent of chemical ingredients for generic drugs are sourced or made in China.

This is just one small cross-section of a range of critical sectors with diffuse but at-risk global supply chains. Offshored production in critical sectors not only induces economic loss — like the recent chip shortage, which resulted in over $210 billion of foregone revenue — but also places a drag on America’s ability to innovate. Indeed, America’s innovation ecosystem has lost the art of “learning-by-building”: the substantial, value-add interactions that happen when manufacturers are seated at the table with designers. The past is full of examples, including solar panels, where China-based firms have captured nearly 80 percent of market share by betting early on manufacturing innovations that precipitated a nearly 100 percent drop in PV module costs over the last 30 years.
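As a rough illustration of what a “nearly 100 percent” decline means in compound terms — assuming, for simplicity, a constant annual rate and that roughly 1 percent of the original module cost remains:

```python
# Illustrative arithmetic only: convert a ~99% total cost decline over
# 30 years into an implied constant annual decline rate (both the
# remaining share and the constant-rate assumption are simplifications).
remaining_share = 0.01   # assume ~1% of the original module cost remains
years = 30
annual_decline = 1 - remaining_share ** (1 / years)
print(f"Implied average decline: {annual_decline:.1%} per year")
```

Even spread over three decades, a near-total cost collapse implies a double-digit percentage decline every single year, which is the kind of sustained manufacturing learning that offshoring forfeits.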

One reason for the breakdown in supply chains is the geographic gap between where innovation and where production take place in America. Currently, there are only a handful of cities with the “industries and a solid base of human capital [to] keep attracting good employers and offering high wages … ecosystems form in these hot cities, complete with innovation companies, funding sources, highly educated workers and a strong service economy.” Increasing the capability of non-“superstar” regions to host comprehensive supply-chain solutions that couple research, manufacturing, and distribution would improve these regions’ global competitiveness and drastically reduce the nation’s reliance on unstable global supply chains. Doing so would create new jobs in distressed communities and strengthen U.S. economic independence.

The $1 billion Build Back Better Regional Challenge (BBBRC), launched in 2021 by the Economic Development Administration, offers a recent example of how national competitions can spur both local and national economic competitiveness. The competition received 529 applications from all 50 states and will ultimately award between 20 and 30 regions up to $100 million. Representing tribal, coal, and next-generation hubs of global competitiveness, the 60 finalists each brought unique regional resources to bear, including leveraging a combined total of more than $30 billion in federal R&D investments at universities and national labs.

Final awards aside, new and extraordinary local collaborations and clusters have sprouted across these regions thanks to the convening power of the BBBRC. Congress and the Department of Commerce should take advantage of this nascent momentum by creating a new national competition: the Make it in America Regional Challenge (MIARC).

If modeled after the BBBRC, MIARC would restore America's full potential to innovate, with supply chains secured by onshoring innovation and production capacity in both the heartland and coastal regions. It would also bring demand to underinvested "stone cold" markets. In turn, total demand and multifactor economic growth would skyrocket from the bottom up and the middle out, while prices would stabilize.

This approach should not be attempted in every sector. Given serious supply chain needs, MIARC should focus on critical innovation industries where manufacturing can play a complementary role: semiconductors, high-capacity batteries, rare earth minerals, and pharmaceuticals. As described in the BBBRC Finalist Proposal Narratives, each region is uniquely positioned to support the growth of different sectors. Moreover, public R&D funding in one industry often generates spillover patents and citations in entirely different fields. For example, every patent generated from R&D grant funding for energy technologies yields three more patents in other sectors, suggesting that targeted cluster investments can support a more holistic economic development strategy.

In fact, academic research has described an unparalleled multiplier effect from investing in innovation sectors: "for each new high-tech job in a city, five additional jobs are ultimately created outside of the high-tech sector in that city, both in skilled occupations (lawyers, teachers, nurses) and in unskilled ones (waiters, hairdressers, carpenters)." Exemplifying this effect are America's top 25 "most dynamic" metros, which over-index on "technology hub" cities now spreading beyond Silicon Valley to previously underutilized regions, as "other metros are now more capable than ever of producing the next tech company with a trillion-dollar market value."

But successful regional innovation is a complex process, dependent on interregional spillovers of private and university knowledge, frequent face-to-face contact and knowledge-sharing among capable workforces, and sufficient resources for startups to commercialize research from the lab to the marketplace. To achieve these effects, MIARC should target investments that support a dual R&D and commercialization effort, similar to the BBBRC's cluster-building approach.

This research-commercialization funding approach would yield dividends, as the Department of Energy, the National Science Foundation, and additional Department of Commerce programs are already deploying a range of regional economic growth strategies. Stitching together ongoing federal resources, whether through research assets such as FFRDCs and national labs or through federal research funding at universities, would multiply the effects of these collective upfront investments. For example, empirical research found that research funding investments generated twice as many startups in the proximity of a national laboratory and three times as many successful startups (e.g., those reaching a $10+ million IPO).

In addition to the BBBRC, both the Senate and House have passed versions of legislation calling for up to $10 billion for regional "tech hubs," which align programmatically with the concept of the Make it in America Regional Challenge.

Plan of Action

The Make it in America Regional Challenge would be a $10 billion, two-phase competition: it would first award planning grants to 30–50 regions, then award 10–15 ultimate winners up to $1 billion each to strengthen regional capacity in economic clusters that align with critical U.S. supply chain priorities (e.g., semiconductors and lithium batteries). Drawing on lessons from the BBB Regional Challenge, these investments would be:

In addition, the application design should allow applicants to carry BBBRC application components forward into MIARC Phase I applications. The existing Phase I BBBRC applicants, regardless of final award, have embarked on a herculean undertaking in assembling unique regional coalitions.

In selecting additional regions, the Department of Commerce should identify the industry, the extent of the region's related intersectoral knowledge, its source (e.g., local, neighboring, or external regions), and its effect on patenting. For example, recent research describes significant differences in interregional spillovers: "innovation in the chemical and electrical and electronic industries is not affected by long-distance private R&D spillovers while it is in other industries."