Next-Generation Fire and Vegetation Modeling for a Hot and Dry Future
Summary
Wildfires are burning in ways that surprise even seasoned firefighters. Our current models cannot predict this extreme fire behavior—nor can they reproduce recent catastrophic wildfires, making them likely to fail at predicting future wildfires or determining when it is safe to light prescribed fires.
To better prepare the fire management community to operate in a new climate, Congress should establish and fund five regional centers of excellence (CoEs) to develop, maintain, and operate next-generation fire and vegetation models to support wildland fire planning and management. Developing five regional CoEs (Southeast, Southwest, California, Pacific Northwest, Northern/Central Rockies) will ensure that researchers pursue a range of approaches that will ultimately lead to better models for predicting future wildfire behavior, improving our ability to safeguard human lives, communities, and ecosystems.
Challenge and Opportunity
In the decade ending in 2021, total federal wildfire suppression expenditures surpassed $23 billion, which is a fraction of the total costs of damages from wildfire over that period. For example, the 2018 wildfires in California are estimated to have amounted to $148.5 billion in economic costs for the state. The costs of suppressing fire, and the societal and natural resources costs of extreme wildfire, will continue to increase with increasing temperatures.
Fewer than 2% of ignitions become large wildfires, but it is this 2% that causes most of the damage, because these fires burn under extreme conditions. The area of forests burned by wildfire annually in the western United States has been increasing exponentially since 1984. While the number of ignitions remains relatively constant from year to year, climate change is drying fuels and making forests more flammable. As a result, no matter how much money we spend on wildfire suppression, we will not be able to stop increasingly extreme wildfires. Thus, we need to better understand where the risks lie on our landscapes and work proactively to reduce them.
When vegetation—especially dead vegetation—is subjected to high temperatures, any moisture absorbed during the winter months quickly evaporates. As a result, increasingly hot summers are making our forests more flammable. The moisture content of live vegetation does not respond as quickly as that of dead vegetation, but sharp increases in air temperature when conditions are dry can make live plants more flammable as well. While this relationship between temperature and ecosystem flammability has remained consistent over time, until the past decade we had not reached a level of warming that dried ecosystems sufficiently to allow for consistent extreme fire behavior. This is in part because large dead fuels, such as dead trees and logs, did not dry sufficiently to become flammable for the majority of the fire season until recently.
Our current operational models for simulating wildfire and vegetation are incapable of reproducing the extreme fire behavior and rapid ecosystem change that we are now experiencing. Forest growth-and-yield models used by managers, such as the Forest Vegetation Simulator, have served them well for decades. However, because they are built on statistical relationships between past tree growth and climate, they cannot capture the effects of a changing climate, especially extreme events, on tree growth and mortality. Similarly, our operational fire models, such as FARSITE, which are used both for management planning and for simulating fire spread to plan suppression activities, are not designed to deal with the substantial ecosystem changes driven by climate change. These fire models have served us well in the past, but increasing temperature and a drying atmosphere are creating conditions that far exceed the data used to build them.
For example, our current operational fire models do not account for large dead trees and logs and how they contribute to fire spread, or for the way fire behaves in the wildland–urban interface. Yet wildfires are increasingly burning through communities, and the number of dead trees and logs is growing because of drought- and insect-induced tree mortality, with this fuel increasingly available to burn because of high temperatures. The 2020 Creek Fire in the Sierra Nevada, California, burned through an area of extensive tree mortality from prolonged drought and insect outbreaks. The operational fire spread model ELMFIRE, which is used to predict the spread of active wildfires, was unable to predict the mass fire behavior created by the massive number of dead trees.
Managing wildfire risk both prior to and during wildfires requires advanced models that are able to account for changing climatic conditions. We need new wildfire models that account for the increasing fuel dryness that facilitates extreme fire behavior, and we need new vegetation models that account for the effects of extreme drought and temperature on vegetation mortality. The research and development necessary to prepare us for our increasingly flammable world requires both fundamental and applied research, neither of which is sufficient on its own.
Further, we need to ensure that we commit to maintaining these models as the climate continues to change so that we do not create another tool that fails to serve us well within a decade or two. As the climate continues to change, these next-generation fire and vegetation models will be challenged with novel conditions that require continuous efforts to ensure they are capable of capturing the dynamics of the system. In addition, we must ensure that the mechanistic understanding of the system that develops is applied to supporting fire and vegetation management decision-making. This will require ongoing experimentation and observations of actual wildfire behavior, along with extensive data collection to characterize how quickly the flammability of the system changes as a function of vegetation type and weather conditions.
Developing these next-generation models is necessary for both fire suppression and management planning. Incident command teams rely on fire spread models to help plan suppression efforts for active wildfires, and thus having better predictions of fire spread is essential for effective operations and firefighter safety. Likewise, planning forest treatments that are effective for reducing the risk of high-severity wildfire under extreme weather conditions requires better vegetation and fire models that can capture the influence of changing climate on the probability that high-severity wildfire occurs.
Plan of Action
Developing and future-proofing next-generation fire and vegetation models will require new and sustained investment. Further, we must accept that these advanced models will require a level of expertise to operate that we cannot expect from a land manager trained in natural resource management, requiring that we fund expert model users to support management planning and suppression efforts.
As with all research and development, there are many possible pathways. Regional differences in weather, vegetation, and management history will alter climate effects on vegetation growth, mortality, and flammability. And as with the Manhattan Project, which simultaneously pursued two different bomb designs because more than one alternative appeared viable, we lack the understanding needed to pick a “winning” model at this point.
To account for regional differences in vegetation and the research momentum that is developing in different nascent modeling approaches, an effective and robust federal investment would entail the following actions.
Recommendation 1. Congress should establish and fund five centers of excellence housed at academic institutions in the Southeast, Southwest, California, Pacific Northwest, and Northern/Central Rockies to develop and maintain next-generation fire and vegetation models that are capable of modeling extreme fire behavior and can be operationalized to support planning for wildfire and vegetation management and to support wildfire suppression.
Establishing five centers with this geographic distribution will allow for investigation into the forest types where the majority of wildfire area occurs and will capture the range of climatic conditions under which wildfires are occurring. It will also take advantage of past and ongoing regional research efforts that will form the information foundation for each center. While these centers should have largely independent research programs, it will be necessary to coordinate some large-scale experimentation and to ensure that research findings and advances are shared rapidly. To achieve these objectives, one center should be selected to act as the coordinating center for the network.
Recommendation 2. Congress should require institutional partnerships between the host institutions and federal research institutions (e.g., U.S. Forest Service Research and Development, Department of Energy National Labs, U.S. Geological Survey, etc.).
We are currently in an all-hands-on-deck situation in the fire and fuels research community, and we need to operate in a collaborative and regionally coordinated manner. Requiring partnerships between the academic centers of excellence and federal research facilities within each region will ensure that effort is not duplicated and that a wider range of expertise is brought to bear. For example, efforts are under way at federal research facilities that could be integrated within the regional fire centers. This integration will ensure collaboration between academic and federal partners and allow the overall research effort to draw on the strengths of these different types of institutions.
Recommendation 3. Congress should mandate and fund the centers to operate these next-generation models and support wildfire and vegetation management planning and operations.
To date, we have relied on fire and vegetation models that the research community developed to use data collected by fire and forest managers and packaged so that natural resource professionals can operate them. Both of these constraints have contributed to the limitations of our current suite of models. We can no longer afford the limitations imposed by expecting the research community to develop models that a natural resource professional can run on a desktop computer. Accounting for factors such as how changing climatic conditions will directly change the amount of fuel on the landscape, and how short-term changes in weather will interact with longer-term changes in climate to influence fuel moisture, requires a more sophisticated approach to simulating the system than is accessible to a non-expert user. Expecting a natural resource professional to use an advanced coupled atmosphere-biosphere fire model would be like teaching someone how to balance their checkbook and then expecting them to calculate exactly how much they need to save every week for retirement. Further, important feedback for model improvement will come from repeated application by expert model users. To deploy next-generation fire and vegetation models in a manner that effectively supports fire and natural resource management decision-making, each center will employ experts who will work collaboratively with managers, running simulations at their request for pre-fire management and suppression operations planning.
Recommendation 4. Congress should mandate the creation of strategic plans to support implementation and coordination across centers.
Each center will develop a five-year strategic plan to guide its research and development efforts. Following strategic plan development, representatives from the five centers will convene to determine necessary joint experimentation and to develop implementation plans that facilitate coordinated efforts. The coordinating center will hold twice-yearly leadership meetings to ensure that data and information flow across the network and to identify additional opportunities for collaboration among individual centers.
Conclusion
Establishing five centers of excellence to develop, maintain, and operate next-generation models will cost approximately $26 million per year, which is less than 1% of the 2021 federal wildfire suppression expenditure. This level of funding would provide $5 million per year per center (plus an additional $1 million per year for the coordinating center). The annual budgets would fund staff scientist and research assistant positions, provide support for the experiments necessary to develop and parameterize new models, provide computing resources for computationally sophisticated models, and fund staff analysts to run the models in support of managers. Initially, the majority of the annual appropriation would be focused on model development, transitioning to maintaining and operating the models to support land management as the technology matures.
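As a check on the arithmetic, using only the figures stated above:

$$5 \times \$5\,\text{M} + \$1\,\text{M} = \$26\,\text{M per year}$$

For the stated "less than 1%" comparison to hold, 2021 federal wildfire suppression expenditure must have exceeded $2.6 billion.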
The centers could be supported through National Science Foundation (NSF) funding. NSF could provide financial support for five university host institutions (one in each region) selected through a competitive bidding process. In turn, these university host institutions can manage the required federal partnerships. Selection of university host institutions could be based in part on demonstrated capacity to manage successful partnerships with federal institutions.
It is imperative that we invest in new models that will support more effective mitigation to reduce wildfire severity; otherwise, spending on suppression will continue to balloon despite improved fire intelligence.
There are many such relationships already in place. Just a few examples include the colocation of the University of Georgia with fire researchers in the U.S. Forest Service (USFS) Southern Research Station; the University of New Mexico’s existing relationships with Los Alamos National Lab, Sandia National Lab, and the U.S. Geological Survey; and the University of Washington’s long-standing relationship with the USFS Pacific Northwest Fire and Environmental Applications research group.
NSF is already engaged in wildland fire research and, jointly with the National Institute of Standards and Technology, funds research on fire in the wildland–urban interface. While much of the research needed to develop next-generation fire and vegetation models is basic, all wildland fire research is inherently applicable. NSF hosted a five-day Wildfire and the Biosphere Innovation Lab, and its findings included the assertion that “support for applied research will be most effective by aiming at both short- and long-term applications and solutions,” acknowledging that the application of research findings is an important part of the research enterprise.
These centers will bring together and build from ongoing efforts rather than starting from scratch. There are already efforts under way to develop optimal treatment strategies that account for changing climatic conditions using advanced forest landscape models. This approach, with some refinement and validation, will be useful for informing treatment placement within the next two years.
A clearinghouse of models is functionally the system we have now. The Fire Research Management and Exchange System (FRAMES) provides a clearinghouse of models developed for fire and vegetation modeling to inform management. FRAMES may be a good interface to help increase manager awareness of the models the five centers will develop, but it is not a mechanism for facilitating the research and development needed to tackle the wildfire problem. We need five centers because there are already a number of efforts under way to develop new fire and vegetation models. None of the models will be perfect, because they all take different approaches and there are tradeoffs inherent in any given approach. With simultaneous investment, we will be able to capitalize on the aspects of each model that best simulate a part of the fire spread or vegetation growth process and then develop a system that incorporates the best of each model. Competition within the U.S. scientific enterprise has helped our country achieve high global standing. Funding five centers will shift that competition away from researchers spending much of their time competing for funding and focus it on competing with their best ideas in a way that prepares us for managing wildfire in the future.
Save Lives by Making Smoke Tracking a Core Part of Wildland Fire Management
Summary
Toxic smoke from wildland fire spreads far beyond fire-prone areas, killing many times more people than the flames themselves and disrupting the lives of tens of millions of people nationwide. Data infrastructure critical for identifying and minimizing these smoke-related hazards is largely absent from our wildland fire management toolbox.
Congress and executive branch agencies can and should act to better leverage existing smoke data in the context of wildland fire management and to fill crucial data infrastructure gaps. Such actions will enable smoke management to become a core part of wildland fire management strategy, thereby saving lives.
Challenge and Opportunity
The 2023 National Cohesive Wildland Fire Management Strategy Addendum describes a vision for the future: “To safely and effectively extinguish fire, when needed; use fire where allowable; manage our natural resources; and collectively, learn to live with wildland fire.” Significant research conducted since the publication of the original Strategy in 2014 indicates that wildfire smoke impacts people across the United States, causing thousands of deaths and billions of dollars of economic losses annually.
Smoke impacts exceed their corresponding flame impacts and span far greater areas, coast to coast. However, wildfire strategy and funding largely focus on flames and their impacts. Smoke mitigation and management should be a high priority for federal agencies, considering that smoke's economic impacts roughly equal those of the flames (a 1:1 ratio) and that roughly 30 people die from smoke for every death from fire itself (a 1:30 ratio).
Some smoke data is already collected, but these datasets can be made more actionable for health considerations and better integrated with other fire-impact data to mitigate risks and save more lives.
Smoke tracking
Several federal programs exist to track wildfire smoke nationwide, but there are gaps in their utility as actionable intelligence for health. For example, the recent “smoke wave” on the East Coast highlighted some of the difficulties with public warning systems.
Existing wildfire-smoke monitoring and forecast programs include:
- The Fire and Smoke Map, collaboratively managed by the Environmental Protection Agency (EPA) and the U.S. Forest Service (USFS), which displays real-time air-quality data but is limited to locations with sensors;
- The National Oceanic and Atmospheric Administration (NOAA) Hazard Mapping System Fire and Smoke Product, which tracks smoke across the total atmospheric column but lacks ground-level estimates of what people actually breathe; and
- The Interagency Wildland Fire Air Quality Response Program (IWFAQRP) and the experimental USFS BlueSky Daily Runs, which integrate external data to make forecasts but lack location-specific data for all potentially impacted locations.
The EPA also publishes retrospective smoke emissions totals in the National Emissions Inventory (NEI), but these lack the specificity about downwind impacted locations that would be needed for health applications.
Existing data are excellent, but scientists using the data combine them in non-standardized ways, making interoperability of results difficult. New nationwide authoritative smoke-data tools need to be created—likely by linking existing data and existing methods—and integrated into core wildland fire strategy to save lives.
Smoke health impacts
There is no single, authoritative accounting of wildfire smoke impacts on human health for the public or policymakers to use. Four key gaps in smoke and health infrastructure may explain why such an accounting doesn’t yet exist.
- The U.S. lacks a standardized method for quantifying the health impacts of wildfire smoke, especially mortality, despite recent research progress in this area.
- The lack of a national smoke concentration dataset hinders national studies of smoke-health impacts because different studies take different approaches.
- Access to mortality data through the National Vital Statistics System (NVSS), managed by the National Center for Health Statistics (NCHS), is slow and difficult for the scientists who seek to use mortality data in epidemiological studies of wildfire smoke.
- Gaps remain in understanding the relative harm of wildfire smoke, which can contain aerosolized hazardous compounds from burned infrastructure, compared to the general air pollution (e.g., from cars and factories) that is often used as an analog in health research.
Addressing these gaps together will enable official wildfire-smoke-attributable death tolls to be publicized and used by decision-makers.
Integration of wildfire smoke into wildland fire management strategy
Interagency collaborations currently set wildland fire management strategy. Three key groups with a mission to facilitate interagency collaboration are the National Interagency Fire Center (NIFC), the National Wildfire Coordinating Group (NWCG), and the Wildland Fire Leadership Council (WFLC). NIFC maintains datasets on wildfire impacts, including basic summary statistics like acres burned, but smoke data are not included in these datasets. Furthermore, while NWCG does have 1 of its 17 committees dedicated to smoke, and collaborates with NOAA (which oversees smoke tracking in the Hazard Mapping System), none of the major wildfire collaborations include agencies with expertise in measuring the impacts of smoke, such as the EPA or the Centers for Disease Control and Prevention (CDC). Finally, WFLC added calls for furthering community smoke-readiness in the recent 2023 National Cohesive Wildland Fire Management Strategy Addendum, but greater emphasis on smoke is still needed. Better integration of smoke data, smoke-health data, and smoke-expert agencies will enable better consideration of smoke as part of national wildland fire management strategy.
Plan of Action
To make smoke management a core and actionable part of wildland fire management strategy, thereby saving lives, several interrelated actions should be taken.
To enhance decision tools individuals and jurisdictions can use to protect public health, Congress should take action to:
- Issue smoke wave alerts nationwide. Fund the National Weather Service (NWS) to develop and issue smoke wave alerts to communities via the Wireless Emergency Alerts (WEA) system, which is designed for extreme weather alerting. The NWS currently distributes smoke messages defined by state agencies through lower-level alert pathways, but should use the WEA system to increase how many people receive the alerts. Furthermore, a national program, rather than current state-level decisions, would ensure continuity nationwide so all communities have timely warning of potentially deadly smoke disasters. Alerts should follow best practices for alerting to concisely deliver information to a maximum audience, while avoiding alert fatigue.
- Create a nationwide smoke concentration dataset. Fund NOAA and/or EPA to create a data inventory of ground-level smoke PM2.5 concentrations by integrating air-monitor data and satellite data, using existing methods as needed (a sketch of this integration follows this list). The proposed data stream would provide standardized estimates of smoke concentrations nationwide and would be a critical precursor for estimating smoke mortality as well as the extent to which smoke is contributing to poor air quality in communities. This action would be enhanced by data from recommendation 4 (below).
- Create a smoke mortality dataset. Fund the CDC and/or EPA to create a nationwide data inventory of excess morbidity and mortality attributed to smoke from wildland fires. An additional enhancement would be to track the smoke health impacts contributed by each source wildfire. Findings should be disseminated in NIFC wildfire impact summaries. This action would be enhanced by data from recommendations 4-5 and research from recommendations 6-8 (below).
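To make recommendation 2 concrete, here is a minimal sketch of one way the integration could work, assuming hypothetical input files and column names; the background-subtraction blending rule is one common approach in the research literature, not an agency specification:

```python
import pandas as pd

# Hypothetical inputs (file names and columns are illustrative assumptions):
# monitor_pm25.csv:    county, date, pm25 (ground-level PM2.5, ug/m3)
# hms_smoke_flags.csv: county, date, smoke_flag (1 if a satellite-detected plume is overhead)
monitors = pd.read_csv("monitor_pm25.csv")
plumes = pd.read_csv("hms_smoke_flags.csv")

merged = monitors.merge(plumes, on=["county", "date"], how="left")
merged["smoke_flag"] = merged["smoke_flag"].fillna(0)

# Estimate each county's non-smoke background PM2.5 from plume-free days;
# the smoke contribution on plume days is the exceedance above that background.
background = (
    merged.loc[merged["smoke_flag"] == 0]
    .groupby("county")["pm25"]
    .median()
    .rename("background_pm25")
)
merged = merged.join(background, on="county")
merged["smoke_pm25"] = (
    (merged["pm25"] - merged["background_pm25"]).clip(lower=0)
    * (merged["smoke_flag"] == 1)
)

merged.to_csv("smoke_pm25_estimates.csv", index=False)
```

A standardized product like this, published on a fixed schedule, is what would let the mortality work in recommendation 3 proceed from a common baseline.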
The decision-making tools in recommendations 1-3 can be created today based on existing data streams. They should be further enhanced through recommendations 4-10, as follows:
To better track locations and concentrations of wildfire smoke, Congress should take action to:
- Install more air-quality sensors. Fund the EPA, which currently monitors ground-level air pollutants and co-oversees the Fire and Smoke Map with the USFS, to establish smoke-monitoring stations in each census tract across the U.S. and in other locations as needed to provide all communities with real-time data on wildfire-smoke exposure.
- Create a smoke impact dashboard. The current EPA Fire and Smoke Map shows near-real-time data from regulatory-grade air monitors, commercial-grade air sensors, and satellite data of smoke plumes. An upgraded dashboard would combine that map with data from recommendations 1-3 to give current and historic information about ground-level air quality, the fraction of pollutants due to wildfire smoke, and the expected health impacts. It would also include short-term forecast data, which would be greatly improved with additional modeling capability to incorporate fire behavior and complex terrain.
To better track health impacts of wildfire smoke, Congress should take action to:
- Improve researcher access to mortality data. Specifically, direct the CDC to increase epidemiologist access to the National Vital Statistics System. This data system contains the best mortality data for the U.S., so enhancing access will enhance the scientific community’s ability to study the health impacts of wildfire smoke (recommendations 6-8).
- Establish wildfire-health research centers. Specifically, fund the National Institutes of Health (NIH) to establish flagship research centers to study the health effects of wildfire smoke. Results-dissemination pathways should include the NIFC to reach a broad wildfire policy audience.
- Enhance health-impact-analysis tools. Direct the EPA to evaluate the available epidemiological literature and adopt standardized wildfire-specific concentration-response functions for use in estimating health impacts in its BenMAP-CE tool. Non-wildfire functions are currently used even in the research literature, despite potentially underestimating the health impacts of wildfire smoke; a sketch of how such functions are typically applied follows this list.
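For readers unfamiliar with concentration-response functions, the sketch below shows the standard log-linear health impact form that tools like BenMAP-CE apply; the coefficient is an illustrative placeholder, not an EPA-adopted wildfire-specific value (adopting such values is exactly what this recommendation calls for):

```python
import math

def attributable_deaths(baseline_deaths: float, delta_pm25: float, beta: float) -> float:
    """Log-linear health impact function: y0 * (1 - exp(-beta * delta_C)),
    where delta_C is the smoke-attributable PM2.5 increase in ug/m3."""
    return baseline_deaths * (1.0 - math.exp(-beta * delta_pm25))

# Illustrative only: beta = ln(1.06) / 10 corresponds to a relative risk of
# 1.06 per 10 ug/m3 PM2.5, a commonly cited all-cause mortality estimate.
beta = math.log(1.06) / 10
print(attributable_deaths(baseline_deaths=1000, delta_pm25=20, beta=beta))  # ~110 deaths
```

If wildfire smoke proves more harmful per unit concentration than general air pollution (the open question noted above), a wildfire-specific beta would be larger and current estimates would rise accordingly.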
To enhance wildland fire strategy by including smoke impacts, Congress should take action to:
- Hire interagency staff. Specifically, fund the EPA and CDC to place staff at the main NIFC office and join the NIFC collaboration. This will facilitate collaboration between smoke-expert agencies and agencies focused on other aspects of wildfire.
- Support landscape management research. Specifically, direct the USFS, CDC, and EPA to continue researching the public health impacts of different landscape management strategies (e.g., prescribed burns of different frequencies compared to full suppression). Significant existing research, including from the EPA, has investigated these links, but more is needed to better inform policy. This research will continue to link different landscape management strategies to probable smoke outputs in different regions and link those smoke outputs to health impacts. Understanding the whole chain of linkages is crucial to landscape management decisions at the core of a resilient wildland fire management strategy.

[Figure: Proposed smoke data infrastructure, with each shape representing one recommendation. Data flows from data inputs (top) to actionable tools for decision-making (middle) and on to pathways for integrating smoke into wildland fire management strategy (bottom). Three of the recommendations can be implemented immediately.]
Cost estimates
This proposal is estimated to have a first-year cost of approximately $273 million and a future annual cost of $38 million once equipment is purchased. The first-year total represents less than 4% of current annual wildfire spending (subsequent years would be 0.5% of annual spending), and it would lay the foundation to potentially save thousands of lives each year. Assumptions behind this estimate can be found in the FAQ.
Conclusion
In the U.S., more and more people are being exposed to wildfire smoke—27 times more people are experiencing extreme smoke days than a decade ago. The suggested programs are needed to improve the national technical ability to increase smoke-related safety, thereby saving lives and reducing smoke-related public health costs.
Recommendations 1-3 can be completed within approximately 6-12 months because they rely on existing technology. Recommendation 4 requires building physical infrastructure, so it should take 6 months to initiate and several years to complete. Recommendation 5 requires building digital infrastructure from existing tools, so it can be initiated immediately but relies on data from recommendations 2-3 to finalize. Recommendation 6 will require one year of personnel time to complete the program review necessary for making changes, then ongoing support. Recommendation 7 establishes research centers, which will take 2 years to solicit and select proposals, followed by 5 years of funding. Recommendation 8 requires a literature review and can be completed in 1 year. Recommendations 9-10 are ongoing projects that can start within the first year but will require ongoing support to succeed.
The latest estimates indicate that thousands of people die across the United States each year due to wildfire smoke. However, there is no consistent ongoing tracking of smoke-attributable deaths and no centralized authoritative tallies.
Many deaths occur during the wildfire itself—wildfire smoke contains small particles (less than 2.5 microns, called PM2.5) that immediately increase the risk of stroke and heart attack. Additional deaths can occur after the fire, due to longer-term complications, much in the same way that smoking increases mortality.
Wildfires and wildfire smoke occur across the country, so deaths attributable to these causes do too. Recent research indicates that there are high numbers of deaths attributable to wildfire smoke on the West Coast, but also in Texas and New York, due to long-distance transportation of smoke and the high populations in those states.
One-time costs for recommendations 2, 3, and 8 were estimated in terms of person-years of effort and are additive with their annual costs in the first year. Recommendations 2-3 require a large team to create the initial datasets and then smaller teams to maintain them, while recommendation 8 requires only an initial literature review and no maintenance. One person-year is estimated at $150,000, including fringe benefits.
One-time costs for recommendation 4 were calculated in terms of air-quality monitor costs, with one commercial-grade sensor ($400) for each of the 84,414 census tracts in the U.S., one sensor comparable to regulatory grade (estimated at $40,000) for each of the 5% most smoke-impacted census tracts, and 15% overhead costs for siting and installation.
Annual costs for recommendations 1-3, 5-6, and 9-10 were estimated in terms of person-years of effort because salaries are the main expense for these projects. One person-year is estimated at $150,000, including fringe benefits.
Annual costs for recommendation 4 were estimated by assuming that 10% of sensors would need replacement per year. These funds can be passed on to jurisdictions, following current maintenance practice of air-quality monitors.
Annual costs for recommendation 7 cover four NIH Research Core Centers (P30 grant type) at their maximum amount of $2.5 million each per year.
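Putting the stated assumptions together, a short script reproduces the fully specified pieces of the estimate; the person-year counts for the staffing-based recommendations are not given above, so they appear only as a labeled placeholder:

```python
TRACTS = 84_414              # U.S. census tracts, as stated above
COMMERCIAL_SENSOR = 400      # $ per commercial-grade sensor
REGULATORY_SENSOR = 40_000   # $ per regulatory-comparable sensor
OVERHEAD = 0.15              # siting and installation overhead
PERSON_YEAR = 150_000        # $ per person-year, including fringe benefits

sensor_hardware = (
    TRACTS * COMMERCIAL_SENSOR
    + round(TRACTS * 0.05) * REGULATORY_SENSOR  # 5% most smoke-impacted tracts
)
rec4_first_year = sensor_hardware * (1 + OVERHEAD)  # ~ $233M one-time
rec4_annual = 0.10 * sensor_hardware                # 10% replacement, ~ $20M/yr
rec7_annual = 4 * 2_500_000                         # four P30 centers, $10M/yr

print(f"Rec. 4 build-out: ${rec4_first_year / 1e6:.0f}M")
print(f"Rec. 4 replacement: ${rec4_annual / 1e6:.1f}M per year")
print(f"Rec. 7 centers: ${rec7_annual / 1e6:.0f}M per year")

# Placeholder, not stated in the memo: the remaining portion of the $273M
# first-year total implies roughly 200 person-years of staffing in year one.
implied_person_years = (273e6 - rec4_first_year - rec7_annual) / PERSON_YEAR
print(f"Implied staffing: ~{implied_person_years:.0f} person-years in year one")
```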
Collaboration for the Future of Public and Active Transportation
Summary
Public and active transportation are not equally accessible to all Americans. Due to a lack of sufficient infrastructure and reliable service for public transportation and active modes like biking, walking, and rolling, Americans must often depend on personal vehicles for travel to work, school, and other activities. During the past two years, Congress has allocated billions of dollars to equitable infrastructure, public transportation upgrades, and decreasing greenhouse gas pollution from transportation across the United States. The Department of Transportation (DOT) and its agencies should embrace innovation and partnerships to continue to increase active and public transportation across the country. The DOT should require grant applications to discuss cross-agency collaborations, partner with the Department of Housing and Urban Development (HUD) to organize prize competitions, encourage public-private partnerships (P3s), and work with the Environmental Protection Agency (EPA) to direct Greenhouse Gas Reduction Fund money to transit programs.
Challenge and Opportunity
Historically, U.S. investment in transportation has focused on expanding and developing highways for personal vehicle travel. As a result, 45% of Americans do not have access to reliable and safe public transportation, perpetuating dependence on personal vehicles for almost half of the country. The EPA reports that transportation accounts for 29% of total U.S. greenhouse gas emissions, with 58% of those emissions coming from light-duty cars. This large share of nationwide emissions from personal vehicles has short- and long-term climate impacts.
Investments in green public and active transit should be a priority for the DOT in transitioning away from a personal-vehicle-dominated society and meeting the Biden Administration’s “goals of a 100% clean electrical grid by 2035 and net-zero carbon emissions by 2050.” Public and active transportation infrastructure includes bus systems, light rail, bus rapid transit, bike lanes, and safe sidewalks. Investments in public and active transportation should go towards a combination of electrifying existing public transportation, such as buses; improving and expanding public transit to be more reliable and accessible for more users; constructing bike lanes; developing community-owned bike share programs; and creating safe walking corridors.
In addition to reducing carbon emissions, improved public transportation that disincentivizes personal vehicle use has a variety of co-benefits. Prioritizing public and active transportation could limit congestion on roads and lower pollution. Fewer vehicles on the road mean fewer tailpipe emissions, which “can trigger health problems such as aggravated asthma, reduced lung capacity, and increased susceptibility to respiratory illnesses, including pneumonia and bronchitis.” This is especially important for the millions of people who live near freeways and heavily congested roads.
Congestion can also be financially costly for American households; the INRIX Global Traffic Scorecard reports that traffic congestion cost the United States $81 billion in 2022. Those costs include vehicle maintenance, fuel costs, and “lost time,” all of which can be reduced with reliable and accessible public and active transportation. Additionally, the American Public Transportation Association reports that every $1 invested in public transportation generates $5 in economic returns, measured by savings in time traveled, reduction in traffic congestion, and business productivity. Thus, by investing in public transportation, communities can see improvements in air quality, the economy, and health.
Public transportation is primarily managed at the local and state level; currently, over 6,000 local and state transportation agencies provide and oversee public transportation in their regions. Public transportation is funded through federal, state, and local sources, and transit agencies also receive funding from “passenger fares and other operating receipts.” The Federal Transit Administration (FTA) distributes funding for transit through grants and loans, accounting for 15% of transit agencies’ total income, including 31% of capital investments in transit infrastructure. Local and state entities often lack sufficient resources to improve public transportation systems because of the uncertainty of ridership and funding streams.
Public-private partnerships can help alleviate some of these resource constraints because contracts can allow the private partner to operate public transportation systems. Regional and national collaboration across multiple agencies from the federal to the municipal level can also help alleviate resource barriers to public transit development. Local and state agencies do not have to work alone to improve public and active transportation systems.
The following recommendations provide a pathway for transportation agencies at all levels of government to increase public and active transportation, resulting in social, economic, and environmental benefits for the communities they serve.
Plan of Action
Recommendation 1. The FTA should require grant applicants for programs such as Rebuilding American Infrastructure with Sustainability and Equity (RAISE) to define how they will work collaboratively with multiple federal agencies and conduct community engagement.
Per the National Blueprint for Transportation Decarbonization, FTA staff should prioritize funding for grant applicants who successfully demonstrate partnerships and collaboration. This can be demonstrated, for example, with letters of support from community members and organizations for transit infrastructure projects. Collaboration can also be demonstrated by having applicants report clear goals, roles, and responsibilities for each agency involved in proposed projects. The FTA should:
- Develop a rubric for evaluating partnerships’ efficiency and alignment with national transit decarbonization goals.
- Create a tiered metrics system within the rubric that prioritizes grants for projects based on collaboration and reduction of greenhouse gas emissions in the transit sector.
- Add a category to their Guidance Center on federal-state-local partnerships to provide insight on how they view successful collaboration.
Recommendation 2. The DOT and HUD should collaborate on a prize competition to design active and/or public transportation projects to reduce traffic congestion.
Housing and transportation costs are related and influence one another, which is why HUD is a natural partner. Funding can be sourced from the Highway Trust Fund, from which the DOT has the authority to allocate up to “1% of the funds for research and development to carry out . . . prize competition program[s].”
This competition should call on local agency partners to each submit a design challenge: a problem or missed opportunity that impedes their ability to adopt transit-oriented infrastructure that could reduce traffic congestion. Three design challenges should be selected and publicly posted on the Challenge.gov website so that any individual or organization can participate.
The goal of the prize competition is to identify challenges, collaborate, and share resources across agencies and communities to design transportation solutions. The competition would connect the DOT with local and regional planning and transportation agencies to solicit solutions from the public, whether from individuals, teams of individuals, or organizations. The DOT and HUD should work collaboratively to design the selection criteria for the challenge and select the winners. Each challenge winner would be provided with a financial prize of $250,000, and their idea would be housed on the DOT website as a case study that can be used for future planning decisions. The local agencies that provide the three design challenges would be welcome to implement the winning solutions.
Recommendation 3. Federal, state, and local government should increase opportunities for public-private partnerships (P3s).
The financial investment required to develop active and public transportation infrastructure is a hurdle for many agencies. To address this issue, we make the following recommendations:
- Currently, only 36 out of the 50 states have policies that allow the use of P3s. The remaining 14 states should pass legislation authorizing the use of P3s for public transportation projects so that they too can benefit from this financing model and access federal P3 funding opportunities.
- In 2016, the DOT launched the Build America Bureau to assist with financing transportation projects. The Bureau administers the Transportation Infrastructure Finance and Innovation Act (TIFIA) program, which provides financial assistance through low-interest loans for infrastructure projects and leverages public-private partnerships to access additional private-sector funding. Currently, only about 30% of all TIFIA loans are used for public transit projects, while 66% go to tolls and highways. Local and regional agencies should make greater use of TIFIA loans to fund public and active transit projects.
- EPA should specify in its Greenhouse Gas Reduction Fund guidelines that public and active transit projects are eligible for investment from the fund and can leverage public and private partnerships. EPA is set to distribute $27 billion through the Fund for carbon pollution reduction: $20 billion will go towards nonprofit entities, such as green banks, that will leverage public and private investment to fund emissions reduction projects, with $8 billion allocated to projects in low-income and disadvantaged communities; $7 billion will go to state and local agencies and nonprofits in the form of grants or technical assistance to low-income and disadvantaged communities. EPA should encourage applicants to include public and active transportation projects, which can play a significant role in reducing carbon emissions and air pollution, in their portfolios.
Conclusion
The road to decarbonizing the transportation sector requires public and active transportation. Federal agencies can allocate funding for public and active transit more effectively through the recommendations above. It’s time for the government to recognize public and active transportation as the key to equitable decarbonization of the transportation sector throughout the United States.
Most P3s in the United States are for highways, bridges, and roads, but there have been a few successful public transit P3s. In 2018 the City of Los Angeles joined LAX and LAX Integrated Express Solutions in a $4.9 billion P3 to develop a train system within the airport. This project aims to launch in 2024 to “enhance the traveler experience” and will “result in 117,000 fewer vehicle miles traveled per day” to the airport. This project is a prime example of how P3s can help reduce traffic congestion and enable and encourage the use of public transportation.
In 2021, the Congressional Research Service released a report about public-private partnerships (P3s) that highlights the role the federal government can play by making it easier for agencies to participate in P3s.
The state of Michigan has a long history with its Michigan Saves program, the nation’s first nonprofit green bank, which provides funding for projects like rooftop solar or energy efficiency programs.
In California, the California Alternative Energy and Advanced Transportation Financing Authority works “collaboratively with public and private partners to provide innovative and effective financing solutions” for renewable energy sources, energy efficiency, and advanced transportation and manufacturing technologies.
The Rhode Island Infrastructure Bank provides funding to municipalities, businesses, and homeowners for projects “including water and wastewater, roads and bridges, energy efficiency and renewable energy, and brownfield remediation.”
One Small Step: Anticipatory Diplomacy in Outer Space
Summary
The $350 billion space industry could grow to more than $1 trillion by 2040, spurring international interest in harnessing space resources. But this interest will bring with it a challenge: while existing international agreements like the Artemis Accords promote the peaceful and shared exploration of celestial bodies, they do little to address differences between existing scientific research activities and emerging opportunities like lunar mining, particularly for water ice at polar latitudes and in the perpetually shaded depths of certain craters. Lunar water ice will be a vital resource for outer space exploration and development efforts because it can be used to make hydrogen fuel cells, rocket fuel, and drinking water for astronauts. Sourcing water on the moon will also be cheaper than transporting it from Earth’s surface into outer space, given the moon’s lower surface gravity and its proximity to human space operations on the lunar surface and beyond. The moon also harbors other valuable long-term commodities like helium-3, the fuel needed for low-emissions nuclear fusion energy.
However, current multilateral agreements do not address whether nongovernmental operators can claim territory on celestial bodies for their use or own the resources they extract. Further, the space object registration process is currently used for satellites and other spacecraft while in orbit, but it does not include space objects intended for use on the surface of celestial bodies, such as mining equipment. These gaps leave few options for the United States or other Artemis Accords nations to resolve conflicts over territorial claims on a celestial body. In the worst-case scenario, this increasing competition for resources—especially with other major space powers like China and Russia—could escalate into military conflict.
Adopting new treaties or amendments to the existing Outer Space Treaty (OST) for modern space use is a slow process that may fail to meet the urgency of emerging space resource issues. However, the United States has another diplomatic avenue for faster action: revision of the existing United Nations’ Guidelines for the Long-term Sustainability of Outer Space under the auspices of the U.N. Committee on the Peaceful Uses of Outer Space (COPUOS). Such a process avoids the decade-long deliberations of a formal treaty amendment. The United States should thus lead the development of multilateral protocols for extracting resources from celestial bodies by proposing two updates to either the COPUOS Guidelines, the OST, or both. First, there should be an updated registration process for all space objects, which should specify the anticipated location, timeline, and type(s) of operation to establish usage rights on a particular part of a celestial body. Second, the United Nations should establish a dispute resolution process to allow for peaceful resolution of competing claims on celestial surfaces. These strategies will lay the necessary foundation for peacefully launching new mining operations in space.
Challenge and Opportunity
Right now, outer space is akin to the Wild West, in that the opportunities for scientific innovation and economic expansion are numerous, yet there is little to no political or legal infrastructure to facilitate orderly cooperation between interested terrestrial factions. For example, any nation claiming mining rights to lunar territory is on shaky legal ground, at best: the Outer Space Treaty and the subsequent Guidelines for the Long-term Sustainability of Outer Space, promulgated by the U.N. Committee on the Peaceful Uses of Outer Space, do not provide legally sound or internationally recognized development rights, enforcement structures, or deconfliction mechanisms. If one claimant allegedly violates the territorial rights of another, what legal systems could either party use to press their case? Moreover, what mechanisms would avert potential escalation toward militarized conflict? Right now, the answer is none.
This is an unfortunate obstacle to progress given the enormous economic potential of outer space development in the coming decades. To put the potential value in perspective, the emerging $350 billion space industry could grow to more than $1 trillion by 2040, motivating significant international interest. One potentially lucrative subset of operations is space mining, a sector valued at $1 billion today with a potential value of $3 billion by 2027. Once operational, space mining would be a valuable source of rare earth elements (e.g., neodymium, scandium, and others), 60% of which are currently produced in China. Rare earth elements are necessary for essential technologies such as electric vehicles, wind turbines, computers, and medical equipment. Additionally, in the event that nuclear fusion becomes commercially viable, space mining will be an essential industry for securing helium-3 (He-3), an abundant isotope found on the moon. Recent increases in fusion investment and a breakthrough in fusion research show the potential of fusion energy, though there is no guarantee of success. He-3 could serve as a critical fuel source for future nuclear fusion operations, an emerging form of carbon-free energy production that could provide humanity with the means to address global climate and energy crises without sacrificing energy abundance. The abundance of lunar He-3 could mean access to secure clean energy for the foreseeable human future.
Furthermore, human exploration and development of outer space will require water, both in the form of drinking water for crewed missions and in the form of rocket propellant and fuel cell components for spacecraft. As it costs over $1 million to transport a single cubic meter of water from Earth’s surface into low Earth orbit, extracting water from the lunar surface for use in outer space operations could be substantially more economical due to the moon’s lower escape velocity—in fact, lunar water ice is estimated to be worth $10 million per cubic meter.
The space mining sector and lunar development also offer promise far beyond Earth. Our moon is the perfect “first port of call” as humanity expands into outer space. Its lower surface gravity, polar ice deposits, abundant raw materials such as aluminum, and status as our closest celestial neighbor make it the ideal layover supply depot and launch point for spacecraft heading from Earth deeper into our solar system. Spacecraft could be launched from Earth with just enough fuel to escape Earth’s gravity, land and refuel on the moon, and then launch far more efficiently from the moon’s weaker gravity to destinations elsewhere in the system.
All in all, the vast untapped scientific and economic potential of our moon underscores the need for policy innovation to fill the gaps in existing international space law and allow the development of outer space within internationally recognized legal lines. The imperative for leading on these matters falls to the United States as a nation uniquely poised to lead the space mining industry. Not only is the United States one of the global leaders in space operations, but U.S. domestic law, including the Commercial Space Launch Competitiveness Act of 2015, provides the U.S. private sector some of the necessary authority to commercialize space operations like mining. However, the United States’ rapid innovation has also led the way to a growing space industry internationally, and the sector is now accessible to more foreign states than before. The internationalization of the space economy further highlights the gaps and failings of the existing space policy frameworks.
Two main challenges must be addressed to ensure current governance structures are sufficient for securing the future of lunar mining. First is clarifying the rights of OST State Parties and affiliated nongovernmental operators to establish space objects on celestial bodies and to own the resources extracted. The OST, the primary governing tool in space, establishes that no State that signed the treaty may declare ownership over all or part of a celestial body like the moon. And despite the domestic authority bestowed by the 2015 Commercial Space Launch Competitiveness Act, the multilateral OST does not address whether nongovernmental operators can claim territory and own resources they extract from celestial bodies. Thus, the OST promotes the peaceful and shared exploration of space and scientific research but does little to address differences between research operations and new commercial opportunities like lunar mining. This leaves few options to resolve conflicts that may arise between competing private sector entities or States.
Even if domestic authorization of mining operations were sufficient, a second challenge has emerged: ensuring transparency and recordkeeping of different operations to maintain peaceful shared operations in space. Through the OST and the Registration Convention, States have agreed to inform the U.N. Secretary-General of space activities and to maintain a record of registered space objects (including a unique identifier, the location and date of launch, and the orbital path). But this registration process covers space objects simply at a geospatial position in orbit; it leaves gaps for space objects intended for use on the surface of celestial bodies and for whether a spacecraft designed for one purpose (e.g., landing) can be repurposed for another (e.g., mining). This leaves little recourse for any group that seeks to peacefully pursue mining operations on the moon’s surface if another entity also seeks to use that land.
In spite of these gaps, the U.S. government has been able to move forward with scaling up moon-related space missions via NASA’s bipartisan Artemis Program and the corresponding Artemis Accords, a set of bilateral agreements with updated principles for space use. The Accords have 24 signatories who collectively seek to reap the benefits of emerging space opportunities like mining. In part, the Artemis Accords aim to remedy the policy gaps of previous multilateral agreements like the OST by explicitly supporting private sector efforts to secure valuable resources like He-3 and water ice.
But the Accords do not address the key underlying challenges that could stifle U.S. innovation and leadership in space mining. For instance, while the Accords reaffirm the need to register space objects and propose the creation of safety zones surrounding lunar mining operations, gaps remain in describing exactly how to register operations on celestial bodies. This can be seen in Section 7 of the Artemis Accords, which states that space objects need to be registered but does not specify what qualifies as a “space object” or whether an object registered for one purpose can be repurposed for other operations. Further, the Accords leave little room to address broader international tensions stemming from increased resource competition in space mining. While competition can have positive outcomes such as spurring rapid innovation, unchecked competition could escalate into military conflict, despite provisions in the original OST designed to avoid this.
In particular, preemptive measures must be taken to alleviate potential tensions with other OST signatories in direct competition with the Accords. China and Russia are not party to the Accords and therefore do not need to abide by the agreement. In fact, these nations have declared opposition to the Accords and instead formed their own partnership to establish a competing International Lunar Research Station. As these programs develop concrete lunar applications, designating methods to determine who can conduct what type of operations on specific timelines and in specific locations will be a crucial form of anticipatory diplomacy.
Plan of Action
The United States should propose that when any State registers a space object in advance of operations on a celestial body, it must specify the anticipated location of the operation; the timeline; and the type(s) of operation, described as “intent to” do one or more of the following: mine/extract resources for sale, conduct scientific research, or perform routine maintenance. This multilaterally developed process would clarify the means to register space objects for peaceful occupation of celestial object surfaces.
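To make the registration proposal concrete, the sketch below shows the kind of record such a process might require; the schema, field names, and example values are purely illustrative assumptions, not a proposed U.N. format:

```python
from dataclasses import dataclass, field
from enum import Enum

class OperationType(Enum):
    # The three "intent to" categories proposed above
    MINE_EXTRACT = "mine/extract resources for sale"
    SCIENTIFIC_RESEARCH = "conduct scientific research"
    ROUTINE_MAINTENANCE = "perform routine maintenance"

@dataclass
class SurfaceOperationRegistration:
    registering_state: str           # OST State Party filing the record
    space_object_id: str             # unique identifier, as under the Registration Convention
    celestial_body: str              # e.g., "Moon"
    anticipated_location_deg: tuple  # (latitude, longitude) on the body
    timeline: tuple                  # (start date, end date), ISO strings
    operation_types: list = field(default_factory=list)

# Hypothetical example: a water-ice extraction operation near the lunar south pole
example = SurfaceOperationRegistration(
    registering_state="United States of America",
    space_object_id="2030-001A",
    celestial_body="Moon",
    anticipated_location_deg=(-88.5, 42.0),
    timeline=("2030-01-01", "2035-01-01"),
    operation_types=[OperationType.MINE_EXTRACT],
)
```

Recording location, timeline, and operation type up front is what would let the dispute resolution process proposed next adjudicate overlapping claims against a shared factual record.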
Additionally, the United States should propose the implementation of a process for States to resolve disputes through either bilateral negotiation or arbitration by a mutually agreed-upon third party such as the International Court of Justice (ICJ) or the Permanent Court of Arbitration (PCA). Similar disputes related to maritime resource extraction under the United Nations Convention on the Law of the Sea have been resolved peacefully using bilateral negotiation or third-party arbitration. The new dispute resolution process would similarly allow for peaceful resolution of competing claims on celestial body surfaces and resources.
To guide the creation of a space object arbitration process, existing bodies like the ICJ, the PCA, and the International Tribunal for the Law of the Sea can serve as models. The PCA has had success in halting unfair proceedings and opening dialogue between participating parties, and it has helped smaller countries arbitrate with larger ones: in Ecuador v. United States, for example, the Republic of Ecuador instituted arbitral proceedings against the United States concerning the interpretation and application of an investment treaty between the two countries. In the short term, existing negotiation avenues will likely be sufficient to allow for dispute resolution. However, as the space industry continues to grow, it may eventually be necessary to establish an internationally recognized “Space Court” to arbitrate disputes. The International Tribunal for the Law of the Sea provides an example of the type of international body that could arbitrate space disputes.
These anticipatory diplomacy steps could be implemented in one of three ways:
- As a binding amendment to the OST: This would require the most time to implement, but it would also be enforceable and binding, a clear advantage. It would also provide an opportunity to bring all the important players to the table, specifically the parties who did not sign the Artemis Accords, and would help start a discussion on improving diplomatic relations for future space operations.
- As a nonbinding update to COPUOS Guidelines: This would be faster to implement, but would not be enforceable or binding.
- As an update to the COPUOS Guidelines followed by an amendment to the OST: This would allow for both quick action in the near term and a permanent, enforceable implementation over the longer term. Nonbinding updates to the COPUOS Guidelines could serve as a precursor, building support for a binding change; if the model proves successful, State Parties would be more likely to agree to a binding amendment to the OST. However they are implemented, these two proposed anticipatory diplomacy steps would improve the ability of spacefaring nations to peacefully use resources on celestial bodies.
Could this be done through bilateral agreements? After all, the United States has shown diplomatic initiative by entering into agreements with countries such as France, Germany, and India with the aim of using space for peaceful purposes and cooperation, though these agreements don’t explicitly mention mining. But a bilateral process does not offer good prospects for global solutions. For one, it would be slow and resource-intensive for the United States to enter into bilateral agreements with every major country with stakes in lunar mining. If space mining agreements were to occur on a similar timeline to bilateral trade agreements, each agreement could take from one to six years to take effect. A crucial obstacle is the Wolf Amendment, which prevents the United States from entering into bilateral agreements with China, one of its major competitors in the space industry. This restriction makes it hard to negotiate bilaterally with an important stakeholder concerning space mining.
Further, reaching these agreements would require addressing aspects of the Accords that have made many major stakeholder countries hesitant to sign on. Thus, an easier path would be to operate diplomatically through the COPUOS, which already represents 95 major countries and oversees the existing multilateral space treaties and potential amendments to them. This approach would ensure that the United States still has some power over potential amendment language but would bring other major players into some sort of dialogue regarding the usage of space for commercial purposes.
While the COPUOS Guidelines are not explicitly binding, they do provide a pathway for verification and arbitration, as well as a foundation for the adoption of a binding amendment or a new space treaty moving forward. Treaty negotiations are a slow, lengthy process; the OST required several years of work before it took full effect in 1967. With many Artemis Program goals reliant upon successful launches and milestones achieved by 2025, treaty amendments are not the timeliest approach. Delays could also arise because some parties to the OST may have reservations about adopting an amendment for private sector space use due to another space treaty, the Moon Agreement. This agreement, to which the United States is not party, asserts that “the Moon and its natural resources are the common heritage of mankind and that an international regime should be established to govern the exploitation of such resources when such exploitation is about to become feasible.” Countries that have signed the Moon Agreement thus likely expect the moon to be treated as a global commons, with all countries on Earth having access to the fruits of lunar mining or other resource extraction. Negotiations with these nations will take time.
The U.S. State Department’s Office of Space Affairs, under the Bureau of Oceans and International Environmental and Scientific Affairs (OES), is the lead office for space diplomacy, exploration, and commercialization and would be the ideal office to craft the required legislation for an OST amendment. Additionally, the Office of Treaty Affairs, which is often tasked with drafting the legal framework of treaties, could provide guidance on the legislation and help initiate the process within the State Department and the United Nations. Existing U.S. law like the Commercial Space Launch Competitiveness Act, and international treaties like the OST and the Registration Convention, provide authority for these proposals to be implemented in the short term. However, negotiation of updates to the COPUOS Guidelines and amendments to the OST and other relevant space treaties over the next 5 to 10 years will be essential to their long-term success.
Finally, the Federal Aviation Administration (FAA) at the Department of Transportation would be the logical federal agency to initially lead implementation of the updated registration process for U.S.-affiliated space objects and to verify the location and intended use of space objects from other nations. The FAA implements the current U.S. process for space object registration. In the long term, it could be appropriate to transfer responsibility for space object registration to the rapidly growing Office of Space Commerce (OSC) at the Department of Commerce. Moving responsibility for space object registration and verification to the OSC would give the office room to grow alongside the rapidly expanding space industry, and it would allow the FAA to focus on its primary responsibility of regulating the domestic aerospace industry.
Conclusion
Douglas Adams may have put it best: “Space is big. You just won’t believe how vastly, hugely, mind-bogglingly big it is.” While Adams was describing the sheer size of space, this description applies just as well to the scale of outer space’s scientific and economic prospects. After all, any new economic theater that will grow into a multi-trillion dollar market in just a few decades is not to be taken lightly. But without a plan to avert and resolve potential conflicts with other outer space actors, the United States’ future efforts in this emerging theater will be hamstrung. Improved collaboration on space mining provides an opportunity to promote international cooperation and economic development, while military conflict in space poses high risks to the economic potential of the current and future space industry. Transparent and widely agreed-upon frameworks would allow for peaceful competition on scientific research and resource extraction on celestial objects.
Lunar mining has shown promise for providing access to water ice, rare earth metals, He-3, and other raw materials crucial for the further exploration of space. Providing a peaceful and secure source of these materials would build on the bipartisan Commercial Space Launch Competitiveness Act’s guidelines for space resource extraction and, in the long run, further enable the modernization and decarbonization of the U.S. electric grid for public benefit.
In order to promote the peaceful exploration and development of space, we must update existing international law—either the COPUOS Guidelines, the OST, or both—to clarify the locations, timeline, and types of outer space operations conducted by state actors. We must also propose deconfliction mechanisms for OST parties to resolve disputes peacefully via bilateral negotiation or arbitration by a mutually acceptable third party like the ICJ or PCA. Just as the United States led the world into the “final frontier” in the 20th century, so too must we lead the next chapter in the 21st. If implemented successfully, the anticipatory space diplomacy we propose will allow for the shared peaceful use of celestial bodies for decades to come.
Acknowledgments
Dr. Sindhu Nathan provided valuable insights into the writing of this memo.
There would be no additional cost to this recommendation beyond existing costs for diplomatic and U.N. activities. The Artemis Program is expected to cost $93 billion through 2025, and Congressional appropriators are already questioning the billion-dollar price tag of each planned launch. Clarifying these legal frameworks may help incentivize private innovation and reduce launch costs, facilitating economic benefits at virtually no extra cost. The United States and Artemis Accords nations therefore have a vested interest in ensuring that these continuing investments result in successful missions with as few additional costs as possible. This proposal will likely also facilitate further private investment and innovation and protect investments against the risk of military conflict.
Another treaty, the Antarctic Treaty, in force since 1961, is a strong example of how different countries can unite and create a dialogue to effectively manage and share a common resource. Although the region is used for various scientific purposes, all countries can do so in a peaceful and cooperative manner. This is in part because the Antarctic Treaty has been systematically updated to reflect changing times, especially concerning the environment. The OST has not undergone any such changes. Thus, updating the COPUOS Guidelines would provide a means for the United States to take the lead in ensuring that space remains a common shared resource and that no country can unfairly claim a monopoly over it.
Nuclear fusion is not currently commercially viable. However, significant interest and investment are centered on this potential energy source, and leading researchers in the field have recently reported breakthroughs in the technology. Access to He-3 will be critical if and when this industry becomes commercially viable.
The OST currently allows State Parties to observe the space flights and access the equipment of any other State Party. One way States could use this power to ensure these guidelines are followed is for States and the COPUOS to track how many and what types of space object operations occur on celestial bodies. (The U.S. Department of Defense already tracks over 26,000 outer space objects, but cross-referencing with COPUOS records could help differentiate between debris and state objects of interest.) Interested or concerned parties could verify the accuracy of registered operations of space objects on celestial bodies led by other States, and any violations of the new guidelines could be referred to the new dispute resolution process.
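A minimal sketch of how such cross-referencing could work, assuming a tracking catalog of object identifiers and a COPUOS-style registry keyed by the same identifiers; both data sources and all names here are hypothetical.

```python
def classify_tracked_objects(tracked_ids, registry):
    """Split tracked object IDs into registered objects (with their
    declared operations) and unregistered ones, which may be debris
    or unreported activity warranting follow-up.

    tracked_ids: iterable of identifiers from a tracking catalog
    registry: dict mapping identifier -> registration record
    """
    registered, unregistered = {}, []
    for obj_id in tracked_ids:
        if obj_id in registry:
            registered[obj_id] = registry[obj_id]
        else:
            unregistered.append(obj_id)
    return registered, unregistered

# Hypothetical data: two tracked objects, one registered.
registry = {"2030-001A": {"state": "Exampleland", "intent": "mining"}}
reg, unreg = classify_tracked_objects(["2030-001A", "1998-067XYZ"], registry)
print(reg)    # registered object with its declared operation
print(unreg)  # ['1998-067XYZ'] -> candidate debris or unregistered object
```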
In the United States, the Guidelines would be adopted in the same way as other United Nations regulations and international agreements: as an executive agreement. Executive agreements are implemented directly by the president and, unlike treaties, do not require Senate approval, but they are still legally binding.
The purpose of a neutral organization like the United Nations is to enable meaningful dialogue between powerful countries. Since space is a common shared resource, it is best to ensure that all parties have a stage for talks that deal with the sharing of resources. Proposing guidelines under a widely adopted treaty is a good place to start, and the United States can show leadership by taking the first step while also advocating for terms that are beneficial to U.S. interests.
COPUOS member states meet every year to discuss the effectiveness of the treaty regime, and countries propose various statements to the chair of the committee. (The United States’ statements from the 65th meeting of the committee in 2022 can be found here.) Although there is no clear precedent of a statement being converted directly into guidelines, making a statement regarding a possible addition of guidelines would still be useful and could reasonably be expected to open doors for negotiations.
Arbitration processes such as those described in the U.N. Convention on the Law of the Sea ensure that powerful countries cannot dominate smaller countries or intimidate them with the possibility of war. Although the verdict of the arbitration process would have to be enforced by OST States, it provides a peaceful alternative to immediate military conflict. Arbitration would at least halt disputed proceedings and give the States involved time to gather resources and support. The existence of an arbitration process would reinforce the principle that all OST States, both small and large, are equally entitled to access space as a shared resource.
Increasing National Resilience through an Open Disaster Data Initiative
Summary
Federal, state, local, tribal, and territorial agencies collect and maintain a range of disaster resilience, vulnerability, and loss data. However, this valuable data lives on different platforms and in various formats across agency silos. Inconsistent data collection and lack of validation can result in gaps and inefficiencies and make it difficult to implement appropriate mitigation, preparedness, response, recovery, and adaptation measures for natural hazards, including wildfires, smoke, drought, extreme heat, flooding, and debris flow. Lack of complete data down to the granular level also makes it challenging to gauge the true cost of disasters.
The Biden-Harris Administration should launch an Open Disaster Data Initiative to mandate the development and implementation of national standards for disaster resilience, vulnerability, and loss data to enable federal, state, local, tribal, and territorial agencies to regularly collect, validate, share, and report on disaster data in consistent and interoperable formats.
Challenge and Opportunity
Disaster resilience, vulnerability, and loss data are used in many life-saving missions, including early detection and local response coordination, disaster risk assessments, local and state hazard mitigation plans, facilitating insurance and payouts, enabling rebuilding and recovery, and empowering diverse communities to adapt to climate impacts in inclusive, equitable, and just ways.
While a plethora of tools are being developed to enable better analytics and visualizations of disaster and climate data, including wildfire data, the quality and completeness of the data itself remain problematic, including in the recently released National Risk Index.
This is because there is a lack of agency mandates, funding, capacity, and infrastructure for data collection, validation, sharing, and reporting in consistent and interoperable formats. Currently, only a few federal agencies have the mandate and funds from Congress to collect disaster data relevant to their mission. Further, this data does not necessarily integrate state and local data for non-federally declared disasters.
Due to this lack of national disaster and climate data standards, federal and state agencies, universities, nonprofits, and insurers currently maintain disaster-related data in silos, making it difficult to link datasets in productive and efficient ways down to the granular level.
Also, only a few local, state, and federal agencies regularly budget for or track spending on disaster resilience, vulnerability, and response activities. As a result, local agencies, nonprofits, and households, particularly in underserved communities, often lack access to critical lifesaving data. Further, disaster loss data is often private and proprietary, leading to inequality in data access and usability. This leaves already disadvantaged communities unprepared and with only a limited understanding of the financial burden of disaster costs carried by taxpayers.
Since the 1990s, several bipartisan reviews, research, data, and policy documents, including the recent President’s Council of Advisors on Science and Technology (PCAST) report on modernizing wildland firefighting, have reiterated the need to develop national standards for the consistent collection and reporting of disaster vulnerability, damage, and loss data. Some efforts are under way to address the standardization and data gaps—such as the all-hazards dataset that created an open database by refining the Incident Command System data sets (ICS-209).
However, significant work remains to integrate secondary and cascading disasters and to monitor longitudinal climate impacts, especially on disadvantaged communities. For example, the National Interagency Fire Center consolidates major wildfire events but does not currently track secondary or cascading impacts, including smoke (see AirNow’s Fire and Smoke Map), nor does it monitor societal vulnerabilities and impacts such as effects on public health, displacement, poverty, and insurance. There are no standardized methods for accounting for and tracking damaged or lost structures; damage and loss data on structures, fatalities, community assets, and public infrastructure is not publicly available in a consolidated format.
The Open Disaster Data Initiative will enable longitudinal monitoring of pre- and post-event data for multiple hazards, resulting in a better understanding of cascading climate impacts. Guided by the Open Government Initiative (2016), the Fifth National Action Plan (2022), and in the context of the Year of Open Science (2023), the Open Disaster Data Initiative will lead to greater accountability in how federal, state, and local governments prioritize funding, especially to underserved and marginalized communities.
Finally, the Open Disaster Data Initiative will build on the Justice40 Initiative and be guided by the recommendations of the PCAST report on enhancing prediction and protecting communities. The Open Disaster Data Initiative should also reiterate the Government Accountability Office’s 2022 recommendation that Congress designate a federal entity to develop and update climate information and create a National Climate Information System.
Precedents
Recent disaster and wildfire research data platforms and standards provide some precedent and show how investing in data standards and interoperability can enable inclusive, equitable, and just disaster preparedness, response, and recovery outcomes.
The Open Disaster Data Initiative must build on lessons learned from past initiatives, including:
- the National Weather Service’s (NWS) Storm Events database, which collects meteorological data on when and where extreme events occur, along with occasional but unverified estimates of socioeconomic impacts.
- the Centers for Disease Control and Prevention’s (CDC) COVID-19 Data Modernization Initiative, which attempts to harmonize data collection and reporting across national, tribal, state, and local agencies.
- the National Oceanic and Atmospheric Administration’s (NOAA) National Integrated Drought Information System (NIDIS), a multiagency partnership that coordinates drought monitoring, forecasting, planning, and information at national, tribal, state, and local levels but is impacted by inconsistent data reporting.
- the Federal Emergency Management Agency (FEMA)’s OpenFEMA initiative, which shares vast amounts of data on multiple aspects of disaster outcomes, including disaster assistance, hazard mitigation investments, the National Flood Insurance Program, and grants, but requires technical expertise to access and utilize the data effectively.
- FEMA’s National Risk Index, which maps the nation’s resilience, vulnerability, and disaster losses at county and census tract levels but shows shortcomings in capturing the risk of geophysical events such as earthquakes and tsunamis. In late 2022, Congress passed the Community Disaster Resilience Zones Act (P.L. 117-255), which codifies the National Risk Index. The goal is to support the census tracts with the highest risk rating with financial, technical, and other forms of assistance.
There are also important lessons to learn from international efforts such as the United Nations’ ongoing work on monitoring implementation of the Sendai Framework for Disaster Risk Reduction (2015–2030) by creating the next generation of disaster loss and damage databases, and the Open Geospatial Consortium’s Disaster Pilot 2023 and Climate Resilience Pilot, which seek to use standards to enable open and interoperable sharing of critical geospatial data across missions and scales.
Plan of Action
President Biden should launch an Open Disaster Data Initiative by implementing the following four actions.
Recommendation 1. Issue an Executive Order to direct the development and adoption of national standards for disaster resilience, vulnerability, and loss data collection, validation, sharing, and reporting by all relevant federal, state, local, tribal, and territorial agencies, creating the enabling conditions for adoption by universities, nonprofits, and the private sector. The scope of this Executive Order should include data on local disasters that do not call for a Presidential Disaster Declaration and federal assistance.
Recommendation 2. Direct the Office of Management and Budget (OMB) to issue an Open Disaster Data Initiative Directive for all relevant federal agencies to collaboratively implement the following actions:
- Direct the National Science and Technology Council to appoint a subcommittee to work with the National Institute of Standards and Technology to develop national standards for disaster resilience, vulnerability, and loss data collection, sharing, and reporting by all relevant federal, state, local, tribal, and territorial agencies, as well as by universities, nonprofits, and the private sector.
- Direct all relevant federal agencies to adopt national standards for disaster resilience, vulnerability, and loss data collection, validation, sharing, and reporting to address ongoing issues concerning data quality, completeness, integration, interoperability, accessibility, and usability.
- Develop federal agency capacities to accurately collect, validate, and use disaster resilience, vulnerability, and loss data, especially as it relates to population estimates of mortality, morbidity, and displacements, including from extreme heat and wildfire smoke.
- Direct FEMA to coordinate and implement training for state, local, tribal, and territorial agencies on how to collect disaster resilience, vulnerability, and loss data in line with the proposed national standards. Further, building on the FEMA Data Strategy 2023-2027 and in line with OpenFEMA, FEMA should review its private and sensitive data sharing policy to ensure that disaster data is publicly available and useable. FEMA’s National Incident Management System will be well positioned to cut across hazard mission silos and offer wide-ranging operational support and training for disaster loss accounting to federal, state, local, tribal, and territorial agencies, as well as nonprofit stakeholders, in coordination with FEMA’s Emergency Management Institute.
Recommendation 3. Designate a lead coordinator for the Open Disaster Data Initiative within the Office of Science and Technology Policy (OSTP), such as the Tech Team, to work with the OMB on developing a road map for implementing the Open Disaster Data Initiative, including developing the appropriate capacities across all of government.
Recommendation 4. Direct FEMA to dedicate appropriate funding and capacity for coordination with the National Weather Service (NWS), the U.S. Department of Agriculture’s Risk Management Agency, and the National Centers for Environmental Information (NCEI) to maintain a federated, open, integrated, and interoperable disaster data system that can seamlessly roll up local data, including research, nonprofit, and private data such as insurance data.
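As a rough illustration of the roll-up this recommendation envisions, the sketch below aggregates local loss records that already share a common format into national totals. The record fields and figures are assumptions for illustration, not an existing federal schema.

```python
from collections import defaultdict

def roll_up_losses(local_records):
    """Aggregate local disaster-loss records into national totals by
    hazard type. Assumes each record already follows a shared standard
    with at least 'hazard', 'jurisdiction', and 'estimated_loss_usd'
    fields -- the interoperability that national standards would provide.
    """
    totals = defaultdict(float)
    for record in local_records:
        totals[record["hazard"]] += record["estimated_loss_usd"]
    return dict(totals)

# Hypothetical records from two jurisdictions, in one shared format:
records = [
    {"hazard": "wildfire", "jurisdiction": "Ashland County", "estimated_loss_usd": 2_500_000},
    {"hazard": "wildfire", "jurisdiction": "Pine Valley", "estimated_loss_usd": 900_000},
    {"hazard": "flood", "jurisdiction": "Pine Valley", "estimated_loss_usd": 1_200_000},
]
print(roll_up_losses(records))  # {'wildfire': 3400000.0, 'flood': 1200000.0}
```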
In addition, Congress should take the following three actions to ensure success of the Open Disaster Data Initiative:
Recommendation 5. Request the Government Accountability Office to undertake a Disaster Data Systems and Infrastructure Review to:
- Inform the development of national standards and identify barriers for accurate disaster data collection, validation, accounting, and sharing between federal, state, local, tribal, and territorial agencies, as well as the philanthropic and private sector.
- Review lessons learned from precedents (including NWS’s Storm Events database, CDC’s Data Modernization Initiative, NOAA’s NIDIS, and FEMA’s National Risk Index).
- Form the basis for the OMB and the OSTP to designate an appropriate budget and capacity commitment and suggest a national framework/architecture for implementing an Open Disaster Data Initiative.
Recommendation 6. Appropriate dedicated funding for the implementation of the Open Disaster Data Initiative to allow federal agencies, states, nonprofits, and the private sector to access regular trainings and develop the necessary infrastructure and capacities to adopt national disaster data standards and collect, validate, and share relevant data. This access to training should facilitate seamless roll-up of disaster vulnerability and loss data to the federal level, thereby enabling accurate monitoring and accounting of community resilience in inclusive and equitable ways.
Recommendation 7. Use the congressional tool of technical corrections to support and enhance the Initiative:
- Pass Technical Corrections and Improvements to the Community Disaster Resilience Zones Act to include provisions for the collection, sharing, and reporting of disaster resilience, vulnerability, and loss data by all relevant federal, state, local, tribal, and territorial agencies and academic, private, community-based, and nonprofit entities, in consistent and interoperable formats, and in line with the proposed national disaster data standards. Technical corrections language could point to the requirement to “review the underlying collection and aggregation methodologies as well as consider the scoping of additional data the agency(ies) may be collecting.” This language should also direct agencies to review dissemination procedures for propriety, availability, and access for public review. The scope of this technical correction should include local disasters that do not call for a Presidential Disaster Declaration and federal assistance.
- Pass a technical corrections and improvements (or suspension) bill for the Disaster Recovery Reform Act. Section 1223, which mandates a study of information collection, and section 1224, which requires that said data be published and accessible to the public, would complement the above by tasking FEMA to study, aggregate, and share information with the public in a way that is digestible and actionable.
Conclusion
The Open Disaster Data Initiative can help augment whole-of-nation disaster resilience in at least three ways:
- Enable enhanced data sharing and information coordination among federal, state, local, tribal, and territorial agencies, as well as with universities, nonprofits, philanthropies, and the private sector.
- Allow for longitudinal monitoring of compounding and cascading disaster impacts on community well-being and ecosystem health, including a better understanding of how disasters impact poverty rates, housing trends, local economic development, and displacement and migration trends, particularly among disadvantaged communities.
- Inform the prioritization of policy and program investments for inclusive, equitable, and just disaster risk reduction outcomes, especially in socially and historically marginalized communities, including rural communities.
Recent analysis by a federal interagency effort, Science for Disaster Reduction, shows that national-level databases significantly underreport disaster losses due to an overreliance on public sources and exclusion (or inaccessibility) of loss information from commercial as well as federal institutions that collect insured losses.
Also, past research has captured common weaknesses of national agency-led disaster loss databases, including:
- over- or underreporting of certain hazard types (hazard bias)
- gaps in historic records (temporal bias)
- overreliance on direct and/or monetized losses (accounting bias)
- focus on high impact and/or acute events while ignoring the extensive impacts of slow disasters or highly localized cascading disasters (threshold bias)
- overrepresentation of densely populated and/or easily accessible areas (geography bias)
The National Weather Service’s Storm Events Database, the USDA Risk Management Agency’s crop data, and the CDC’s COVID-19 Data Modernization Initiative provide good templates for how to roll up data from the local to the federal level. However, it is important to recognize that past initiatives, such as NOAA’s NIDIS, have found it challenging to go beyond collecting standard metrics of immediate loss and damage to also capture data on impacts and outcomes. Further, disaster loss and damage data are not currently integrated with other datasets that may capture secondary and cascading effects, such as the injuries, morbidities, and mortalities captured in CDC’s data.
Defining new standards that expand the range of attributes to be collected in consistent and interoperable formats would allow for moving beyond hazard and geographic silos, making data open, accessible, and usable. In turn, this will require new capacity and operational commitments, including an exploration of artificial intelligence, machine learning, and distributed ledger systems (DLS) such as blockchain, to undertake expanded data collection, sharing, and reporting across missions and scales.
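One way to picture such an expanded, interoperable record is sketched below: a single structure that carries direct losses alongside secondary and cascading impact attributes, geography at the census-tract level, and provenance for validation. All field names are illustrative assumptions, not a proposed federal standard.

```python
from dataclasses import dataclass, field

@dataclass
class DisasterImpactRecord:
    """Illustrative record expanding beyond direct monetized loss to the
    secondary and cascading impacts discussed above."""
    event_id: str              # links to the primary hazard event
    hazard: str                # e.g., "wildfire", "smoke", "debris flow"
    census_tract: str          # granular geography for equity analysis
    direct_loss_usd: float
    structures_damaged: int
    fatalities: int
    injuries: int
    households_displaced: int  # cascading social impact
    smoke_exposure_days: int   # secondary hazard attribute
    data_source: str           # provenance, e.g., "county assessor"
    validated: bool = False    # supports the validation step the memo mandates
    notes: list[str] = field(default_factory=list)
```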
Aligning with guidance provided in the OSTP’s recent Blueprint for an AI Bill of Rights and several research collective initiatives in recent years, the Open Disaster Data Initiative should seek to make disaster resilience, vulnerability, loss, and damage data FAIR (findable, accessible, interoperable, reusable) and usable in CARE-full ways (collective benefit, authority to control, responsibility, and ethics).
A technical corrections bill is a type of congressional legislation to correct or clarify errors, omissions, or inconsistencies in previously passed laws. Technical corrections bills are typically noncontroversial and receive bipartisan support, as their primary goal is to correct mistakes rather than to make substantive policy changes. Technical corrections bills can be introduced at any time during a congressional session and may be standalone bills or amendments to larger pieces of legislation. They are typically considered under expedited procedures, such as suspension of the rules in the House of Representatives, which allows for quick consideration and passage with a two-thirds majority vote. In the Senate, technical corrections bills may be considered under unanimous consent agreements or by unanimous consent request, which allows for passage without a formal vote if no senator objects. Sometimes more involved technical corrections or light policy adjustments happen during “vote-o-rama” in the Senate.
Technical corrections bills or reports play an important role in the legislative process, particularly during appropriations and budgeting, by helping to ensure the accuracy and consistency of proposed funding levels and programmatic changes. For example, during the appropriations process, technical corrections may be needed to correct funding levels or programmatic details that were inadvertently left out of the original bill. These technical changes can be made to ensure that funding is allocated to the intended programs or projects and that the language of the bill accurately reflects the intent of Congress.
Similarly, during the budgeting process, technical corrections may be needed to adjust estimates or projections based on new information or changes in circumstances that were not foreseen when the original budget was proposed. These technical changes can help to ensure that the budget accurately reflects the current economic and fiscal conditions and that funding priorities are aligned with the goals and priorities of Congress. For example, in 2021, Congress used a technical corrections bill to clarify budget allocations and program intent after Hurricane Ida to make recovery programs more efficient and help with overall disaster recovery program clarification. Similarly, in 2017, Congress relied on a technical corrections/suspension bill to clarify some confusing tax provisions related to previous legislation for relief from Hurricane Maria.
Strengthening the U.S. Biomanufacturing Sector Through Standardization
Summary
The advancement and commercialization of bioprocesses in the United States is hindered by a lack of suitable and available pilot-scale and manufacturing-scale facilities. This challenge stems in part from our inability to repurpose facilities that are no longer needed due to a lack of standardization and inadequate original design. Historically, most biomanufacturing facilities have been built with a single product in mind and with a focus on delivering a facility as cheaply and quickly as possible. While this might be the best approach for individual private companies, it is not the best approach for the bioeconomy as a whole. The Biden-Harris Administration should establish a program to standardize the construction of biomanufacturing facilities across the United States that also permits facilities to be repurposed for different products in the future.
Through government-incentivized standardization, better biomanufacturing facilities can be built that can be redeployed as needed to meet future market and governmental needs, ultimately addressing our nation’s lack of biomanufacturing capacity. This program will help protect U.S. investment in the bioeconomy and accelerate the commercialization of biotechnology. Enforcing existing construction standards, and establishing new standards whose adoption is driven by a series of incentive programs, will create a world-leading biomanufacturing footprint that increases supply resilience for key products (vaccines, vitamins, nutritional ingredients, enzymes, renewable plastics), reduces reliance on foreign countries, and increases the number of domestic biomanufacturing jobs. Furthermore, improved availability of pilot-scale and manufacturing-scale facilities will accelerate growth in biotechnology across the United States.
This memo details a framework for developing and deploying the standards necessary to enable repurposing of biomanufacturing facilities in the future. A team of 10–12 experts led by the National Institute of Standards and Technology (NIST) should develop these standards. A government-sponsored incentive program with an estimated cost of $50 million per year would then subsidize the building of new facilities and provide recognition for participating companies.
Challenge and Opportunity
Currently, the United States faces a shortage in both pilot-scale and manufacturing-scale biomanufacturing facilities that severely hinders product development and commercialization. This challenge is particularly large for the fermentation industry, where new facilities take years to build and require hundreds of millions of dollars in infrastructure investment. Many companies rely on costly foreign assets to advance their technology or delay their commercialization for years as they wait for access to one of the limited contract pilot or manufacturing facilities in the United States.
Why do we have such a shortage of these facilities? Numerous facilities have been shut down due to changing market conditions, failed product launches, or bankruptcy. When those facilities were ultimately abandoned and dismantled for scrap, the opportunity to repurpose expensive infrastructure was lost along with them.
Most U.S. biomanufacturing facilities are built to produce a specific product, making it difficult to repurpose them for alternative products. Due to strict financing and tight timelines for commercialization, companies often build the minimally viable facility, ultimately resulting in a facility with niche characteristics tailored to their particular process and a low likelihood of being repurposed. When the facility is no longer needed for its original purpose, whether due to changes in market demand or financial challenges, it is very unlikely to be purchased by another organization.
This challenge is not unique to the biomanufacturing industry; even the highly established automotive industry repurposes less than half of its manufacturing facilities. The rate of repurposing biomanufacturing facilities is much lower, given the lower level of standardization. Furthermore, nearly 30% of currently running biomanufacturing facilities have some idle capacity that could be repurposed. This is disappointing considering that many biomanufacturing facilities have similar upstream operations: a seed bioreactor (a small bioreactor used as inoculum for a larger vessel) to initiate fermentation, followed by a production reactor and then harvest tanks. Downstream processing operations are less similar across facilities and typically represent far less than half the capital required to build a new facility.
The United States has been a hot spot for biotech investment, with many startups and commercial successes. We also have a robust supply of corn dextrose (a critical input for most industrial fermentation), reasonable energy costs, and the engineering infrastructure to build world-class biomanufacturing facilities, providing advantages over many foreign locations. Our existing biomanufacturing footprint is already substantial, with hundreds of facilities across the country at a variety of scales, but the design of these facilities lacks the standardization needed to meet the current and future needs of our biomanufacturing industry. There have been some success stories of facilities being repurposed, such as the facility used by Gevo for the production of bio-butanol in Minnesota or the Freedom Pines facility in Georgia repurposed by LanzaTech.
However, there are numerous stories of facilities that could not be repurposed, such as the INEOS facility that was shuttered in Indian River County, Florida. Repurposing these facilities is challenging for two primary reasons:
- A lack of forethought that the facility could be repurposed in the future (i.e., no space for additional equipment, equipment difficult to modify, materials of construction that do not have broad range of process compatibility).
- A lack of standardization in the detailed design (materials of construction, valve arrangements, pipe sloping, etc.) that prevents processes with higher aseptic requirements (lower contamination rates) from being implemented.
In order to increase the rate at which our biomanufacturing facilities are repurposed, we need to establish policies and programs that make all new biomanufacturing facilities sustainable, more reliable, and capable of meeting the future needs of the industry. These policies and associated standards will establish a minimum set of guidelines for construction materials, sterilizability, cleanability, unit operation isolation, mixing, aeration, and process material handling that will enable broad compatibility across many bioprocesses. As a specific example, all fermentors, bioreactors, and harvest tanks should be constructed of at least 316L-grade stainless steel to ensure that the vast majority of fermentation and cell culture broths can be housed in these vessels without material compatibility concerns. Unfortunately, many U.S. biomanufacturing facilities in operation today were constructed with 304-grade stainless steel, which is incompatible with high-salt or high-chloride broths. Furthermore, all process equipment containing living microorganisms should be designed to aseptic standards, even if the current product is not required to be axenic (free of foreign microorganisms).
These standards should focus on upstream equipment (fermentors, media preparation tanks, sterilization systems), which are fairly universal across the food, pharma, and industrial biotech industries. While there are some opportunities to apply these standards to downstream process equipment, the downstream unit operations required to manufacture different biotech products vary significantly, making it more challenging to repurpose equipment.
Fortunately, guiding principles covering most factors that need to be addressed have already been developed by experts through the American Society of Mechanical Engineers’ (ASME) Bioprocessing Equipment (BPE) standard and the International Society for Pharmaceutical Engineering (ISPE). These standards cover the gamut of biomanufacturing specifications: piping, mixing, valves, construction materials, and, in some cases, the design of specific unit operations. Yet companies are often forced to choose between following these best practices in facility design and meeting tight timelines and budgets.
Following these standards increases the capital cost of the associated equipment by 20% to 30% and can extend construction timelines, which deters companies from adopting the standards even though adherence would ultimately improve their top and bottom lines through greater process reliability. Our biggest gap today is not the ability to standardize but the incentive to standardize. If the government provides incentives to adopt these standards, many companies will participate, as it is widely recognized that these standards result in facilities that are more reliable and more flexible for future products.
The National Institute of Standards and Technology (NIST) should initiate a program focused on biomanufacturing standards. The proposed program could be housed in or coordinated out of a new office at NIST, for example, the previously proposed “Bio for America Program Office (BAPO),” which should collaborate closely with the Office of the Secretary of Commerce and the Under Secretary of Commerce for Standards and Technology, as well as additional government and nongovernmental stakeholders as appropriate. NIST is the appropriate choice because it harbors cross-disciplinary expertise in engineering and the physical, information, chemical, and biological sciences; is a nonregulatory agency of the U.S. Department of Commerce, whose mission is “to drive U.S. economic competitiveness, strengthen domestic industry, and spur the growth of quality jobs in all communities across the country”; and is a neutral convener for industry consortia, standards development organizations, federal labs, universities, public workshops, and interlaboratory comparability testing.
Plan of Action
The Biden-Harris Administration should sponsor an initiative to incentivize the standardization that will enable the repurposing of biomanufacturing facilities, resulting in a more integrated and seamless bioeconomy. To do so, Congress should appropriate funds for a program focused on biomanufacturing standards at NIST. This program should:
- Develop a set of design and construction standards that enable facilities to be efficiently repurposed for different products in the future.
- Create, in collaboration with other government agencies, an incentivization program to encourage participation.
- Recognize participating companies with a certification.
- Track the program’s impact by measuring the rate of facility repurposing long-term.
First, the program will need to be funded by Congress and stood up within NIST. Award amounts will vary based on facility size, but it is estimated that each participating company will receive $6 million on average, leading to a total program cost in the range of $30 million to $50 million per year. While these costs might seem high, the investment carries reduced risk by design, since facilities that adopt the standards are better positioned to be repurposed should the original company abandon the facility.
Next, design and building standards would be defined that ensure the highest chance of redeployment along with reliable operation. While relevant standards exist (e.g., the ASME BPE standard), they should be refined and elaborated, with the goal of promoting repurposing, by an expert panel established by NIST. The adoption rate of the existing nonmandatory standards is low, particularly outside the pharma industry. This new NIST program should establish a panel of experts, including industry and government representatives, to fully develop and publish these standards. A panel of 10–12 members could develop the standards within one year. Thereafter, the panel could be reassembled regularly to review and update the standards as needed.
Once the standards are published, NIST should launch and manage a corresponding incentive program to attract participation. The program should be designed to offset an estimated 50% of the incremental cost of adhering to the standards; in other words, the improved infrastructure established by following the standards would not be fully subsidized but would be subsidized at a rate of 50%. The NIST program could oversee applicants’ adherence to the new standards and provide awards as appropriate. NIST should also work with other federal agencies that support development of biomanufacturing capacity (e.g., the Department of Energy [DOE], Department of Defense [DoD], and Department of Agriculture [USDA]) to explore financial incentives and funding requirements that support adherence to the standards.
In addition, the government should recognize facilities built to the new standards with a certification that companies could use to strengthen their business through customer confidence in supply reliability and overall performance. NIST would publish a list of certified facilities annually and seek opportunities to publicly acknowledge companies that participate broadly. Furthermore, this certification could become a prerequisite for biomanufacturing-related funding from other government organizations (e.g., DOE, DoD, USDA).
Last, to measure the program’s success, NIST should track the rate of redeployment of participating facilities. The success rate of redeployment of facilities not participating in the program should also be tracked as a baseline. After 10 years, at least a twofold improvement in redeployment rate would be expected. If this does not occur, the program should be reevaluated and an investigation should be conducted to understand why the participating facilities were not redeployed. If needed, the existing biomanufacturing standards should be adjusted.
Conclusion
Given the large gap in biomanufacturing assets needed to meet our future needs across the United States, it is of paramount importance for the federal government to act soon to standardize our biomanufacturing facilities. This standardization will enable repurposing and will build a stronger bioeconomy. By establishing a program that standardizes the design and construction of biomanufacturing facilities across the country, we can ensure that facilities are built to meet the industry’s long-term needs—securing the supply of critical products and reducing our reliance on foreign countries for biomanufacturing needs. In the long run, it will also spur biotech innovation, since startup companies will need to invest less in biomanufacturing due to the improved availability of manufacturing assets.
A committee will need to be established to create a detailed budget plan; however, rough estimates are as follows. A typical biomanufacturing facility costs between $100 million and $400 million to build, depending on scale and complexity. If the program supports five biomanufacturing facilities per year, and we assume an average construction cost of $200 million, with $40 million of that being equipment covered by the new standards, a 15% subsidy would result in roughly $6 million being awarded to each participating facility. If following the standards increases the cost of the associated equipment by 30%, costs would rise from $40 million to $52 million; the 15% subsidy is thus designed to offset the cost of applying the new standards at roughly 50 cents on the dollar. In addition, there will be some overhead costs to run the program at NIST, but these are expected to be small. Thus, the new program would cost in the range of $30 million to $50 million per year, depending on how many companies participate and are awarded each year.
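The arithmetic behind these estimates can be checked in a few lines; all figures are the memo’s own illustrative assumptions, not program rules.

```python
# Back-of-envelope check of the subsidy math above.
facilities_per_year = 5
standards_equipment = 40_000_000   # equipment covered by the new standards, per facility
cost_premium = 0.30                # following the standards adds up to 30%
subsidy_rate = 0.15                # subsidy as a share of covered equipment cost

incremental_cost = standards_equipment * cost_premium  # $12M added cost per facility
subsidy = subsidy_rate * standards_equipment           # $6M award per facility
share_offset = subsidy / incremental_cost              # 0.5, i.e., ~50 cents on the dollar
annual_awards = subsidy * facilities_per_year          # $30M/year, before NIST overhead

print(f"Incremental cost per facility: ${incremental_cost:,.0f}")
print(f"Subsidy per facility: ${subsidy:,.0f} ({share_offset:.0%} of the increment)")
print(f"Annual awards: ${annual_awards:,.0f}")
```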
When they apply for funding, companies will describe the facility to be built and how the funds will be used to make it more flexible for future use. A NIST panel of subject matter experts will evaluate and prioritize nominations, with an emphasis on selecting facilities across different manufacturing sectors: food, pharma, and industrial biotech.
Given that the life of biomanufacturing facilities is on the order of years, it is expected that this program will take several years before a true impact is observed. For this reason, the program evaluation is placed 10 years after launch, by which time it is expected that more than 20 facilities will have participated in the program, and at least a few will have been repurposed during that time.
Keeping the standards general enables repurposing of facilities across different industries. The fact that standards differ across industries, and exist in some industries but not others, is part of the current challenge in redeploying facilities.
The initial focus is on standardization within the United States. Eventually, standardization on a more global scale can be pursued, which will make it easier for the United States to leverage facilities internationally. However, international standardization presents a whole new set of challenges due to differences in equipment availability and materials of construction.
Increasing Access to Capital by Expanding SBA’s Secondary Market Capacity
Summary
Entrepreneurship and innovation are crucial for sustained community development, as new ventures create new jobs and wealth. As entrepreneurs start and grow their companies, access to capital is a significant barrier. Communities nationwide have responded by initiating programs, policies, and practices to help entrepreneurs creatively leverage philanthropic dollars, government grants and loans, and private capital. But these individually promising solutions collectively amount to a national patchwork of support. Those who seek to scale promising ideas face a funding continuum that is filled with gaps, replete with high transaction costs, and highly variable depending on each entrepreneur’s circumstances.
To help entrepreneurs better and more reliably access capital no matter where in the country they are, the Small Business Administration (SBA) should work with the other Interagency Community Investment Committee (ICIC) agencies to expand the SBA’s secondary market capacity. The SBA’s secondary market allows lenders to sell the guaranteed portion of a loan backed by the SBA. This provides additional liquidity to lenders, which in turn expands the availability of commercial credit for small businesses. However, there is no large standardized secondary market for debt serviced by other federal agencies, so the benefits of a secondary market are limited to only a portion of federal lending programs that support entrepreneurship. Expanding SBA’s secondary market authority would increase access to large pools of private capital for a larger proportion of entrepreneurs and innovative small businesses.
As a first step towards this goal, one or several agencies should enter into a pilot partnership with SBA to use SBA’s existing administrative authority and infrastructure to enable private lenders to sell other forms of federally securitized loans. Once proven, the secondary market could be expanded further and permanently established as a government-sponsored enterprise (GSE). This GSE would provide accessible capital for entrepreneurs and small businesses in much the same way that the GSEs Fannie Mae and Freddie Mac provide accessible capital, as mortgages, for prospective homeowners.
With the 118th Congress considering the reauthorization of the SBA for the first time in 22 years, there is an opportunity to modernize the agency. Piloting expanded secondary market capacity is a crucial piece of that modernization, increasing access to capital for entrepreneurs.
Challenge and Opportunity
Access to capital changes the economic trajectory of individuals and communities. Approved small business loan applicants, for instance, report average income increases of more than 10% five years after loan approval. Unfortunately, capital for budding entrepreneurs is scarce and inequitably allocated. Some 83% of budding entrepreneurs never access adequate capital to start or grow their business. Success rates are even lower for demographic minorities. And when entrepreneurs can’t access capital to start their business, the communities around them suffer, as evidenced by the fact that two out of every three new jobs over the past 25 years have been generated by small businesses.
The vast majority of new businesses in the United States are funded by personal or family savings, business loans from banks or financial institutions, or personal credit cards. Venture capital is used by only 0.5% of entrepreneurs, because most businesses are not candidates for it. Public and mission-driven lending efforts are valiant but can’t come close to matching the scale of this untapped potential. Outside of the COVID-19 emergency response, the SBA receives annual appropriations of $1–2 billion for its lending programs. The Urban Institute found that between 2011 and 2017, Chicago alone received $4 billion of mission-driven lending that predominantly went toward communities of color and high-poverty communities. But during the same period, Chicago also received over $67 billion of market investment, most of which flowed to white and affluent neighborhoods.
Communities across the country have sought to bridge this gap with innovative ideas to increase access to private capital, often by leveraging federal funding or federal programmatic infrastructure. For example:
- The Entrepreneur Backed Asset Fund is a philanthropically funded initiative that creates a secondary market for microloans made through Community Development Financial Institutions (CDFIs). The idea came from an experienced microlender who was frustrated with the illiquidity of microloans and the consequent need to constantly engage in time-consuming philanthropic fundraising.
- ESO Ventures is a nonprofit organization out of East Oakland, California, that combines a competency-based curriculum with access to capital. As entrepreneurs grow their skills through the program, they receive access to increasing lines of credit to apply those new skills to their businesses. ESO Ventures has grown rapidly by using both federal recovery grants funneled through the state of California and credit lines from private banks. The organization’s goal is to create 3,000 more businesses and generate $3 billion in economic activity by 2030.
- Network Kansas is an entrepreneurial support organization born out of the first version of the Treasury Department’s State Small Business Credit Initiative (SSBCI) program under the Obama Administration. The organization leverages a Kansas state tax credit to provide below-market debt to rural entrepreneurs in the most remote parts of Kansas. Since 2006, Network Kansas has deployed $500 million.
These programs are successful, replicable, and already supported by some of the agencies in the ICIC. They use traditional, well-understood financial mechanisms to provide capital to entrepreneurs: credit lines, insurance, shared-equity agreements, tax credits, and low-interest debt. The biggest obstacle to scaling these types of programs is financial: they must first raise money to support their core financial mechanism(s), and their dependence on ad hoc fundraising almost inevitably yields uneven results.
There is a clear rationale for federal intervention to improve capital access for entrepreneurship-support programs. Successful investment in marginalized communities serves the public interest by generating positive externalities such as increases in jobs, wealth, and ownership. Government can multiply these externalities by reducing risk for investors and lowering the cost of capital for entrepreneurs: first by expanding SBA’s secondary market authority, and ultimately by creating a GSE that provides permanence, stronger accountability, and more flexible access to capital. With SBA reauthorization on the legislative docket, this is a prime opportunity to address the core challenge of capital access for entrepreneurs.
Plan of Action
The federal government should create standardized, straightforward mechanisms for entrepreneurs and small businesses across the country to tap into vast pools of private capital at scale. A first step is launching an administrative pilot that extends the SBA’s current secondary market capacity to interested agencies in the ICIC. An initial pilot partner could be the Department of the Treasury, which could use the partnership to recapitalize its Community Development Financial Institutions (CDFI) Fund. If the pilot proves successful, the secondary market could be expanded further and permanently established as a government-sponsored enterprise.
Recommendation 1. Establish an administrative pilot.
The SBA’s secondary market can already handle small business debt and debt-like instruments for community development. The SBA currently underwrites, guarantees, securitizes, and sells pools of 7(a) and 504 loans, unsecured SBA loans in Development Company Participation Certificates, and Small Business Investment Company Debentures. Much like Federal Housing Administration and Veterans Affairs home loans offer guaranteed debt to homeowners, these programs offer guaranteed debt for entrepreneurs. However, there is no large, standardized secondary market for this debt that extends across agencies.
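The sketch below illustrates, in deliberately simplified form, how individual guaranteed loans can be pooled and sold as a pass-through security. All loan amounts, rates, and fee levels are hypothetical and do not represent SBA’s actual pooling rules; the point is only to show why standardized pooling creates liquidity for originators.

```python
# Illustrative mechanics of pooling guaranteed loans into a pass-through
# security. Loan sizes, rates, and fee levels are hypothetical and do not
# reflect SBA's actual pooling rules.
loans = [
    {"principal": 250_000, "rate": 0.0850},
    {"principal": 400_000, "rate": 0.0775},
    {"principal": 150_000, "rate": 0.0900},
]

pool_principal = sum(loan["principal"] for loan in loans)
# Weighted-average coupon (WAC) of the pool, before fees:
wac = sum(loan["principal"] * loan["rate"] for loan in loans) / pool_principal

SERVICING_FEE = 0.0040   # hypothetical strip retained by the originating lender
GUARANTEE_FEE = 0.0055   # hypothetical fee paid for the federal guarantee

# Investors in the security receive the WAC net of the fee strips:
pass_through_rate = wac - SERVICING_FEE - GUARANTEE_FEE

print(f"Pool principal: ${pool_principal:,}")
print(f"WAC: {wac:.3%}, investor pass-through rate: {pass_through_rate:.3%}")
```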
An interagency memorandum of understanding between interested ICIC agencies could quickly open up the SBA’s secondary market infrastructure to other forms of small business debt. This would allow the ICIC to explore, with limited risk, the extent to which an expanded secondary market for federally securitized debt products enables entrepreneurs and small businesses to more easily access low-cost capital. Examples of other forms of small business lending provided by ICIC agencies include Department of Agriculture Rural Business Development Grants, Department of Housing and Urban Development Community Development Block Grants, and the Treasury Small Business Lending Fund, among others.
An ideal initial pilot partner among ICIC agencies would be the Treasury, which could pilot a secondary market approach to recapitalizing its CDFI Fund. This fund allocates capital via debenture to CDFIs for them to make personal, mortgage, and commercial loans to low-income and underserved communities. The fund is recapitalized on an annual basis through the federal budget process. A partnership with SBA to create a secondary market for the CDFI Fund would effectively double the federal support available for CDFIs that leverage that fund.
It is important to note that while SBA can create pilot intergovernmental agreements to extend its secondary market infrastructure, broader or permanent extension of secondary market authority may require congressional approval.
Recommendation 2. Create a government-sponsored enterprise (GSE).
Upon successful completion of the administrative pilot, the ICIC should explore creating a GSE that decreases the cost of capital for entrepreneurs and small businesses and expands capital access for underserved communities. This separate entity would be more independent than an expanded secondary market run through SBA’s existing infrastructure. Creating a GSE would provide more flexibility and allow the entity to function more independently and with greater authority, while subjecting it to more rigorous reporting and oversight requirements to ensure accountability.
After the 2008 housing-market crash and subsequent recession, the concept of a GSE was criticized and reforms were proposed. There is no doubt that GSEs made mistakes in the housing market, but they also helped standardize and grow the mortgage market that now serves 65% of American households. The federal government will need to implement thoughtful, innovative governance structures to realize the benefits that a GSE could offer entrepreneurs and small businesses while avoiding repeating the mistakes that the mortgage-focused GSEs Fannie Mae and Freddie Mac made.
One potential ownership structure is the Perpetual Purpose Trust (PPT). PPTs work by separating the ownership right of governance from the ownership right of financial return and giving them to different parties. The best-known example of a PPT to date is likely the one established by Yvon Chouinard to take over his family’s ownership interest in Patagonia. In a PPT, trustees—organized in a Trust Steward Committee (TSC)—are bound by a fiduciary duty to maintain focus on the stated purpose of the trust. None of the interests within the TSC are entitled to financial return; rather, the rights to financial return are held in a separate entity (the Corporate Trustee) that does not possess governance rights. This structure, which is backed by a Trust Enforcer, ensures that the TSC cannot force the company to do something that is good for profits but bad for purpose.
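As a purely illustrative toy model of this separation of rights (all class names and checks below are hypothetical, and this is a sketch of the governance logic, not a legal or operational design):

```python
# Toy model of the separation of rights in a Perpetual Purpose Trust.
# All names and checks are hypothetical illustrations, not a legal design.
class TrustStewardCommittee:
    """Holds governance rights and a fiduciary duty to the trust's purpose."""
    def __init__(self, purpose: str):
        self.purpose = purpose

    def propose(self, action: str, serves_purpose: bool) -> dict:
        return {"action": action, "serves_purpose": serves_purpose}

class TrustEnforcer:
    """Independent check that proposals actually serve the stated purpose."""
    @staticmethod
    def approve(proposal: dict) -> bool:
        return proposal["serves_purpose"]

class CorporateTrustee:
    """Holds rights to financial returns, but no governance rights."""
    def __init__(self):
        self.distributions: list[float] = []

tsc = TrustStewardCommittee(purpose="expand low-cost capital access")
enforcer = TrustEnforcer()

# A profit-first proposal that conflicts with the purpose is blocked:
proposal = tsc.propose("raise borrower fees to boost returns", serves_purpose=False)
assert not enforcer.approve(proposal)
```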
Emulating this basic structure for a capital-focused GSE could circumvent the moral hazard that plagued the mortgage-focused GSEs. The roles of TSC, Trust Enforcer, and Corporate Trustee in a federal context could be filled as follows:
- Trust Steward Committee: The TSC holds a fiduciary duty to the PPT and typically makes strategic decisions to pursue the PPT’s purpose. For a capital-focused GSE, the TSC’s role would be limited to governance. The TSC would be populated by a mix of stakeholders, such as entrepreneurs, community developers, investors, and support organizations.
- Trust Enforcer: The Trust Enforcer is an independent entity that rules on whether the TSC is sufficiently pursuing the PPT’s stated purpose. For a capital-focused GSE, this role could be filled by federal agency staff. The trust that holds the governance rights serves as the vehicle for the regulatory body to oversee the GSE.
- Corporate Trustee: The entity that holds the financial interest of the GSE could be sold to investors, granted to employees, or given away to relevant nonprofit and other stakeholder groups.
Conclusion
The ICIC agencies create and support many inventive solutions that blend private and public dollars to increase entrepreneurship and community development. Yet the federal government stops short of providing the most important benefit: standardization and scale. The ICIC agencies should therefore create an entity that unlocks standardization and scale for the programs they help create, with the overall goals of:
- Unifying small-business and community-development underwriting
- Broadening private-actor participation in small business loan origination by creating the risk standards that allow for greater liquidity
- Opening the capital markets to lower the cost of capital for entrepreneurs
A first step towards accomplishing these goals is to establish an administrative pilot, by which interested ICIC agencies would use the SBA’s existing authority and infrastructure to create a secondary market for their securitized debt instruments.
If the pilot proves successful, the next step is to expand the secondary market and establish it for the long term through a GSE modeled on those that have effectively supported the mortgage industry—but with a creative structure that proactively addresses GSE weaknesses unveiled by the 2008 housing-market crash. The result is a stable, permanent institution that enables all communities to realize the benefits of robust entrepreneurship by ensuring that budding entrepreneurs and small-business owners across the country can easily tap into the capital they need to get started.
Precedents for this type of federal intervention can be found in the mortgage industry. Homeownership is a major driver of wealth creation. The federal government supports homeownership through mortgage guarantees by federal agencies like the Federal Housing Administration and Veterans Affairs. In addition, the federal government increases liquidity in the mortgage industry by enabling insured mortgages and market-rate mortgages to be securitized, sold, and purchased on secondary markets through government-sponsored enterprises (GSEs) like Fannie Mae and Freddie Mac, or wholly owned agencies like Ginnie Mae. These structures have created a reliable stream of capital to originate loans for homeownership and lower the cost of borrowing.
The mortgage GSEs are engaging in innovation to increase access to housing credit. Fannie Mae, for example, is taking a number of steps to extend credit and homeownership to historically disadvantaged communities, including by using documented rental payments to help individuals build their credit scores and using special-purpose credit programs to develop new solutions for down payment assistance, underwriting, and credit enhancement. These changes will have an outsize effect on the mortgage industry because of the central role a GSE like Fannie Mae plays in connecting private markets to potential homeowners.
COVID-19 relief efforts provide an application of this model specific to small businesses. The California Rebuilding Fund (CARF) was a private credit fund for small businesses capitalized with a mixture of state, federal, philanthropic, and private investment. The CARF used government debt guarantees to push down the cost of capital to Community Development Financial Institutions that were best positioned to originate and serve small businesses most negatively impacted by COVID-19.
The CARF proved that a coherent and routinized process for accessing private capital can lower interest rates, expand credit for small businesses, and create operational efficiencies for entrepreneurial support organizations. For instance, there is a single application site that matches potential borrowers to potential lenders. The keys to the CARF’s success were its guarantee from the state of California and the fact that it provided a relatively uniform offering to different investors along a spectrum of return profiles.
To begin, the new entity should securitize or purchase securities backed only by government-guaranteed loans. Even during the worst of the housing crash, government-guaranteed mortgage-backed securities were more stable than non-agency loans. Beginning with guaranteed loans also allows this new entity to provide explicit guarantees to guarantee-sensitive investors. However, a gradual push into new mechanisms, innovative underwriting, and perhaps non-agency debt should be a goal.
The guarantee of the loans should be explicit but should absorb losses only after the borrower’s equity and the originating agency’s guarantee are exhausted.
Any privileges extended to the new entity, such as exemption from securities registration or from state and local taxation, that result in a measurable decrease in the cost of lending should be passed on to the final borrower as much as possible.
Assuming that the regulatory body, acting as a fiduciary of the trust, can implement policies that take into account demographics like race, ethnicity, and country of origin, the GSE should use special purpose credit programs to address racial inequalities in access to capital.
The authorizing statute for the SBA secondary market requires the lender to remain obligated to the SBA if it securitizes and sells the underlying loan on a secondary market. To preserve that obligation, the SBA requires the lender to keep a percentage of the loan on its books for servicing. This is an operational hurdle to securitizing loans: either the market must become robust enough to justify the operational expense, or another mechanism is needed by which the lender remains obligated to the SBA.
The SBA recently announced a change in the interest rates that lenders can charge for 7(a) loans. While it is understandable that the SBA does not want the guarantee to run up lenders’ profit margins, the tradeoff is that some entrepreneurs will go without capital because lenders cannot justify the risk at the formulated interest rate. The governing regulation, 13 CFR § 120.213, merely requires that interest rates be reasonable. This should give the SBA room to experiment with how it can deliver low-cost capital to borrowers. For example, if the usury cap were removed for some loans, could the SBA require that the excess yield be used to push down the cost of borrowing for other loans?
The Interagency Community Investment Committee (ICIC) focuses on the operations and execution of federal programs that facilitate the flow of capital and the provision of financial resources into historically underserved communities, including communities of color, rural communities, and Tribal nations. The ICIC is composed of representatives from the Department of the Treasury, Small Business Administration, Department of Commerce, Department of Transportation, Department of Housing and Urban Development, and Department of Agriculture.
Building a National Network of Composite Pipes to Reduce Greenhouse Gas Emissions
Summary
65,000 miles of pipeline: that’s the distance that may be necessary to achieve economy-wide net-zero emissions by 2050, according to a Princeton University study. The United States is on the verge of constructing a vast network of pipelines to transport hydrogen and carbon dioxide, incentivized by the Infrastructure Investment and Jobs Act and the Inflation Reduction Act. Yet the lifecycle emissions generated by a typical steel pipeline are 27.35 kg carbon dioxide eq per foot. At that rate, 65,000 miles of pipeline would produce nearly 9.4 million metric tons of carbon dioxide eq (equal to the annual emissions of over 2 million passenger cars) from steel pipeline infrastructure alone.
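As a back-of-the-envelope check of these figures, assuming 5,280 feet per mile and EPA’s estimate of roughly 4.6 metric tons of carbon dioxide per passenger car per year:

```python
# Back-of-the-envelope check of the steel pipeline emissions estimate.
MILES = 65_000
FT_PER_MILE = 5_280
KG_CO2E_PER_FT = 27.35          # lifecycle emissions of typical steel pipe
CAR_TONS_PER_YEAR = 4.6         # EPA's typical passenger-vehicle estimate

total_kg = MILES * FT_PER_MILE * KG_CO2E_PER_FT
total_tons = total_kg / 1_000   # kg -> metric tons

print(f"{total_tons / 1e6:.1f} million metric tons CO2-eq")              # ~9.4
print(f"~{total_tons / CAR_TONS_PER_YEAR / 1e6:.1f} million car-years")  # ~2.0
```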
Pipelines made from composite materials offer one pathway to lowering emissions. Composite pipe is composed of multiple layers of different materials—typically a thermoplastic polymer as the primary structural layer with reinforcing materials such as fibers or particulate fillers to increase strength and stiffness. Some types have lifecycle emissions roughly a quarter of those of typical steel pipeline. Depending on the application, composite pipelines can be safer and less expensive. However, the process under the Pipeline and Hazardous Materials Safety Administration (PHMSA) to issue permits for composite pipe takes longer than for steel, and for hydrogen and supercritical carbon dioxide, the industry lacks regulatory standards altogether. Reauthorization of the Protecting Our Infrastructure of Pipelines and Enhancing Safety (PIPES) Act offers an excellent opportunity to review the policies concerning new, less emissive pipeline technologies.
Challenge and Opportunity
Challenge
The United States is on the verge of a clean energy construction boom, expanding far beyond wind and solar energy to include infrastructure that utilizes hydrogen and carbon capture. The pump has been primed with $21 billion for demonstration projects or “hubs” in the Infrastructure Investment and Jobs Act and reinforced with another $7 billion for demonstration projects and at least $369 billion in tax credits in the Inflation Reduction Act. Congress recognized that pipelines are a critical component and provided $2.1 billion in loans and grants under the Carbon Dioxide Transportation Infrastructure Finance and Innovation Act (CIFIA).
The United States is crisscrossed by pipelines. Approximately 3.3 million miles of predominantly steel pipelines convey trillions of cubic feet of natural gas and hundreds of billions of tons of liquid petroleum products each year. Far fewer miles serve the newer fuels: approximately 5,000 miles transport carbon dioxide, and only about 1,600 miles are dedicated to hydrogen. Research suggests the existing pipeline network is nowhere near what is needed. According to Net Zero America, approximately 65,000 miles of pipeline will be needed to transport captured carbon dioxide to achieve economy-wide net-zero emissions in the United States by 2050. The study also identifies a need for several thousand miles of pipelines to transport hydrogen within each region.
Making pipes out of steel is a carbon-intensive process, and steel manufacturing in general accounts for seven to nine percent of global greenhouse gas emissions. There are ongoing efforts to lower emissions generated from steel (i.e., “green steel”) by being more energy efficient, capturing and storing emitted carbon dioxide, recycling scrap steel combined with renewable energy, and using low-emissions hydrogen. However, cost is a significant challenge with many of these mitigation strategies. The estimated cost of transitioning global steel assets to net-zero compatible technologies by 2050 is $200 billion, in addition to a baseline average of $31 billion annually to simply meet growing demand.
Opportunity
Given the vast network of pipelines required to achieve a net-zero future, expanding use of composite pipe provides a significant opportunity for the United States to lower carbon emissions. Composite materials are highly resistant to corrosion, weigh less, are more flexible, and have improved flow capacity. This means that pipelines made from composite materials have a longer service life and require less maintenance than steel pipelines. Composite pipe can be installed up to four times faster with one-third the labor, and it has significantly lower operating costs. The use of composite pipe is expected to continue to grow as technological advancements make these materials more reliable and cost-effective.
Use of composite pipe is also expanding as industry seeks to improve its sustainability. We performed a lifecycle analysis on thermoplastic pipe, which is made by a process called extrusion that involves melting a thermoplastic material, such as high-density polyethylene or polyvinyl chloride, and then forcing it through a die to create a continuous tube. The tube can then be cut to the desired length, and fittings can be attached to the ends to create a complete pipeline. We found that the lifecycle emissions from thermoplastic pipe were 6.83 kg carbon dioxide eq/ft, approximately 75% lower than those of an equivalent length of steel pipe, which has lifecycle emissions of 27.35 kg carbon dioxide eq/ft.
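Taken together, these two per-foot figures imply substantial savings at the scale of the buildout discussed earlier; a simple calculation (assuming 5,280 feet per mile) illustrates:

```python
# Per-foot lifecycle figures from the analysis described above.
STEEL_KG_PER_FT = 27.35
THERMOPLASTIC_KG_PER_FT = 6.83
FT_PER_MILE = 5_280

reduction = 1 - THERMOPLASTIC_KG_PER_FT / STEEL_KG_PER_FT        # ~75%
savings_t_per_mile = (STEEL_KG_PER_FT - THERMOPLASTIC_KG_PER_FT) * FT_PER_MILE / 1_000

print(f"Per-foot reduction: {reduction:.0%}")
print(f"Savings: ~{savings_t_per_mile:.0f} metric tons CO2-eq per mile")
# Applied to a 65,000-mile buildout, roughly 7 million metric tons avoided:
print(f"65,000 miles: ~{savings_t_per_mile * 65_000 / 1e6:.1f} million metric tons avoided")
```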
These estimates do not include potential differences in leaks. Specifically, composite pipe has a continuous structure that allows for the production of longer pipe sections, resulting in fewer joints and welds. In contrast, metallic pipes are often manufactured in shorter sections due to limitations in the manufacturing process. This means that more joints and welds are required to connect the sections together, which can increase the risk of leaks or other issues. Further, approximately half of the steel pipelines in the United States are over 50 years old, increasing the potential for leaks and maintenance costs. Another advantage of composite pipe is that it can be pulled through steel pipelines, thereby repurposing aging steel pipelines to transport different materials while also reducing the need for new rights of way and associated permits.
Despite the advantages of using composite materials, standards have not yet been developed to allow for safe permitting of composite pipe to transport supercritical carbon dioxide and hydrogen. At the federal level, pipeline safety is administered by the Department of Transportation’s Pipeline and Hazardous Materials Safety Administration (PHMSA). To ensure safe transportation of energy and other hazardous materials, PHMSA establishes national policy, sets and enforces standards, educates, and conducts research to prevent incidents. Regulatory standards exist for transporting supercritical carbon dioxide in steel pipe. However, there are no standards for composite pipe to transport hydrogen, or to transport carbon dioxide in a supercritical, gaseous, or subcritical liquid state.
Repurposing existing infrastructure is critical because the siting of pipelines, regardless of type, is often challenging. Whereas natural gas pipelines and some oil pipelines can invoke eminent domain provisions under federal law such as the Natural Gas Act or Interstate Commerce Act, no such federal authorities exist for hydrogen and carbon dioxide pipelines. In some states, specific statutes address eminent domain for carbon dioxide pipelines. These laws typically establish the procedures for initiating eminent domain proceedings, determining the amount of compensation to be paid to property owners, and resolving disputes related to eminent domain. However, current efforts are under way in states such as Iowa to restrict use of state authorities to grant eminent domain to pending carbon dioxide pipelines. The challenges with eminent domain underscore the opportunity provided by technologies that allow for the repurposing of existing pipeline to transport carbon dioxide and hydrogen.
Plan of Action
How can we build a vast network of carbon dioxide and hydrogen pipelines while also using lower emissive materials?
Recommendation 1. Develop safety standards to transport hydrogen and supercritical carbon dioxide using composite pipe.
PHMSA, industry, and interested stakeholders should work together to develop safety standards to transport hydrogen and supercritical carbon dioxide using composite pipe. Without standards, there is no pathway to permit use of composite pipe. This collaboration could occur within the context of PHMSA’s recent announcement that it will update its standards for transporting carbon dioxide, which is being done in response to an incident in 2020 in Satartia, MS.
Ideally, the permits could be issued using PHMSA’s normal process rather than as special permits (e.g., 49 CFR § 195.8). It takes several years to develop standards, so it is critical to launch the standard-setting process so that composite pipe can be used in Department of Energy-funded hydrogen hubs and carbon capture demonstration projects.
Europe is ahead of the United States in this regard, as the classification company DNV is currently undertaking a joint industry project to review the cost and risk of using thermoplastic pipe to transport hydrogen. This work will inform regulators in the European Union, who are currently revising standards for hydrogen infrastructure. The European Clean Hydrogen Alliance recently adopted a “Roadmap on Hydrogen Standardization” that expressly recommends setting standards for non-metallic pipes. To the extent practicable, it would benefit export markets for U.S. products if the standards were similar.
Recommendation 2. Streamline the permitting process to retrofit steel pipelines.
Congress should streamline the retrofitting of steel pipes by enacting a legislative categorical exclusion under the National Environmental Policy Act (NEPA). NEPA requires federal agencies to evaluate actions that may have a significant effect on the environment. Categorical exclusions (CEs) are categories of actions that have been determined to have no significant environmental impact and therefore do not require an environmental assessment (EA) or an environmental impact statement (EIS) before they can proceed. CEs can be processed within a few days, thereby expediting the review of eligible actions.
The CE process allows federal agencies to avoid the time and expense of preparing an EA or EIS for actions that are unlikely to have significant environmental effects. CEs are often established through agency rulemaking but can also be created by Congress as a “legislative CE.” Examples include minor construction activities, routine maintenance and repair activities, land transfers, and research and data collection. However, even if an action falls within a CE category, the agency must still conduct a review to ensure that there are no extraordinary circumstances that would warrant further analysis.
Given the urgency to deploy clean technology infrastructure, Congress should authorize federal agencies to apply a categorical exclusion where steel pipe is retrofitted using composite pipe. In such situations, the project is using an existing pipeline right-of-way, and there should be few, if any, additional environmental impacts. Should there be any extraordinary circumstances, such as substantial changes in the risk of environmental effects, federal agencies would be able to evaluate the project under an EA or EIS. A CE does not obviate the review of safety standards and other applicable, substantive laws, but simply right-sizes the procedural analysis under NEPA.
Recommendation 3. Explore opportunities to improve the policy framework for composite pipe during reauthorization of the PIPES Act.
Both of the aforementioned ideas should be considered as Congress initiates its reauthorization of the Protecting Our Infrastructure of Pipelines and Enhancing Safety (PIPES) Act of 2020. Among other improvements to pipeline safety, the PIPES Act reauthorized PHMSA through FY2023. As Congress begins work on its next reauthorization bill for PHMSA, it is the perfect time to review the state of the industry, including the potential for composite pipe to accelerate the energy transition.
Recommendation 4. Consider the embedded emissions of construction materials when funding demonstration projects.
The Office of Clean Energy Demonstrations should consider the embedded emissions of construction materials when evaluating projects for funding. Applicants with a plan to account for those emissions could receive additional weight in the selection process.
Recommendation 5. Support research and development of composite materials.
Composite materials offer advantages in many other applications, not just pipelines. The Office of Energy Efficiency and Renewable Energy (EERE) should support research to further enhance the properties of composite pipe while improving lifecycle emissions. In addition to ongoing efforts to lower the emissions intensity of steel and concrete, EERE should support innovation in alternative, composite materials for pipelines and other applications.
Conclusion
Recent legislation will spark construction of the next generation of clean energy infrastructure, and the funding also creates an opportunity to deploy construction materials with lower lifecycle greenhouse gas emissions. This is important because constructing vast networks of pipelines using highly emissive processes undercuts the goals of the legislation. However, the regulatory code remains an impediment by failing to provide a pathway for using composite materials. PHMSA and industry should commence discussions to create the requisite safety standards, and Congress should work with both industry and regulators to streamline the NEPA process when retrofitting steel pipelines. As America commences construction of hydrogen and carbon capture, utilization, and storage networks, reauthorization of the PIPES Act provides an excellent opportunity to significantly lower emissions.
We compared two types of pipes: 4” API 5L X42 metallic pipe vs. 4” Baker Hughes non-metallic next-generation thermoplastic flexible pipe. The analysis was conducted using FastLCA, a proprietary web application developed by Baker Hughes and certified by an independent reviewer to quantify carbon emissions from its products and services. The emission factors for the various materials and processes are based on the ecoinvent 3.5 database for global averages.
- The data for flexible pipe production is from the 2020 production year and represents transport, machine, and energy usage at Baker Hughes’ manufacturing plant in Houston, TX.
- All raw material and energy inputs for flex pipes are taken directly from engineering and plant manufacturing data, as verified by engineering and manufacturing personnel, and represent actual usage to manufacture the flexible pipes.
- All of the data for metallic pipe production is from API 5L X42 schedule 80 pipe specifications and represents transport from Alabama and energy usage for production based on global averages.
- All raw material and energy inputs for hot rolling steel are computed from ecoinvent 3.5 database emission factors. All relevant production steps and processes are modeled.
- All secondary processes are from the ecoinvent 3 database (version 3.5 compiled as of November 2018) as applied in SimaPro 9.0.0.30.
- Results are calculated using IPCC 2013 GWP 100a (IPCC AR5).
Similar to steel pipe, transporting hydrogen and carbon dioxide using composite pipe poses certain safety risks that must be carefully managed and mitigated:
- Hydrogen gas can diffuse into the composite material and cause embrittlement, which can lead to cracking and failure of the pipe.
- The composite material used in the pipe must be compatible with hydrogen and carbon dioxide. Incompatibility can cause degradation of the pipe due to permeation, leading to leaks or ruptures.
- Both hydrogen and carbon dioxide are typically transported at high pressure, which can increase the risk of pipe failure due to stress or fatigue.
- Carbon dioxide can be corrosive to certain metals, which can lead to corrosion of the pipe and eventual failure.
- Hydrogen is highly flammable and can ignite in the presence of an ignition source, such as a spark or heat.
To mitigate these safety risks, appropriate testing, inspection, and maintenance procedures must be put in place. Additionally, proper handling and transportation protocols should be followed, including strict adherence to pressure and temperature limits and precautions to prevent ignition sources. Finally, emergency response plans should be developed and implemented to address any incidents that may occur during transportation.
API Specification 15S, Spoolable Reinforced Plastic Line Pipe, covers the use of flexible composite pipe in onshore applications. The standard does not address transport of carbon dioxide and has not been incorporated into PHMSA’s regulations.
API Specification 17J, Specification for Unbonded Flexible Pipe, covers the use of flexible composite pipe in offshore applications. Similar to 15S, it does not address transport of carbon dioxide and has not been incorporated into PHMSA’s regulations.
HDPE pipe, commonly used in applications such as water supply, drainage systems, gas pipelines, and industrial processes, has similar advantages to composite pipe in terms of flexibility, ease of installation, and low maintenance requirements. It can be assembled to create seamless joints, reducing the risk of leaks. It can also be used to retrofit steel pipes as a liner per API SPEC 15LE.
HDPE pipe has been approved by PHMSA to transport natural gas under 49 CFR Part 192. However, the typical operating pressures (e.g., 100 psi) are significantly lower than composite pipe. Similar to composite pipe, there are no standards for the transport of hydrogen and carbon dioxide, though HDPE pipe’s lower pressure limits make it less suited for use in carbon capture and storage.
Addressing Online Harassment and Abuse through a Collaborative Digital Hub
Efforts to monitor and combat online harassment have fallen short due to a lack of cooperation and information-sharing across stakeholders, disproportionately hurting women, people of color, and LGBTQ+ individuals. We propose that the White House Task Force to Address Online Harassment and Abuse convene government actors, civil society organizations, and industry representatives to create an Anti-Online Harassment (AOH) Hub to improve and standardize responses to online harassment and to provide evidence-based recommendations to the Task Force. This Hub will include a data-collection mechanism for research and analysis while also connecting survivors with social media companies, law enforcement, legal support, and other necessary resources. This approach will open pathways for survivors to better access the support and recourse they need and also create standardized record-keeping mechanisms that can provide evidence for and enable long-term policy change.
Challenge and Opportunity
The online world is rife with hate and harassment, disproportionately hurting women, people of color, and LGBTQ+ individuals. A research study by Pew indicated that 47% of women were harassed online because of their gender, compared to 18% of men, while 54% of Black or Hispanic internet users faced race-based harassment online compared to 17% of White users. Seven in 10 LGBTQ+ adults have experienced online harassment, and 51% have faced even more severe forms of abuse. Meanwhile, existing measures to combat online harassment continue to fall short, leaving victims with limited means for recourse or protection.
Numerous factors contribute to these shortcomings. Social media companies are opaque, and when survivors turn to platforms for assistance, they are often met with automated responses and few means to appeal or even contact a human representative who could provide more personalized assistance. Many survivors of harassment face threats that escalate from online to real life, leading them to seek help from law enforcement. While most states have laws against cyberbullying, law enforcement agencies are often ill-trained and ill-equipped to navigate the complex web of laws involved and the available processes through which they could provide assistance. And while there are nongovernmental organizations and companies that develop tools and provide services for survivors of online harassment, the onus continues to lie primarily on the survivor to reach out and navigate what is often both an overwhelming and a traumatic landscape of needs. Although resources exist, finding the correct organizations and reaching out can be difficult and time-consuming. Most often, the burden remains on the victims to manage and monitor their own online presence and safety.
On a larger, systemic scale, the lack of available data to quantitatively analyze the scope and extent of online harassment hinders the ability of researchers and interested stakeholders to develop effective, long-term solutions and to hold social media companies accountable. The lack of large-scale, cross-sector, and cross-platform data further hinders efforts to map out the exact scale of the issue and to provide evidence-based arguments for changes in policy. Moreover, because the landscape of online abuse is constantly evolving, the lexicons and phrases used in attacks change as well, requiring continually updated information to track them.
Forming the AOH Hub will improve the collection and monitoring of online harassment while preserving victims’ privacy; this data can also be used to develop future interventions and regulations. In addition, the Hub will streamline the process of receiving aid for those targeted by online harassment.
Plan of Action
Aim of proposal
The White House Task Force to Address Online Harassment and Abuse should form an Anti-Online Harassment Hub to monitor and combat online harassment. This Hub will center around a database that collects and indexes incidents of online harassment and abuse from technology companies’ self-reporting, from the connections civil society groups have with survivors of harassment, and from reporting by the general public and by targets of online abuse. Civil society actors that have conducted past work in providing resources and monitoring harassment incidents, ranging from academics to researchers to nonprofits, will run the AOH Hub in consortium as a steering committee. There are two aims for the creation of this Hub.
First, the AOH Hub can promote collaboration within and across sectors, forging bonds among government, the technology sector, civil society, and the general public. This collaboration enables the centralization of connections and resources and brings together diverse resources and expertise to address a multifaceted problem.
Second, the Hub will include a data collection mechanism that can be used to create a record for policy and other structural reform. At present, the lack of data limits the ability of external actors to evaluate whether social media companies have worked adequately to combat harmful behavior on their platforms. An external data collection mechanism enables further accountability and can build the record for Congress and the Federal Trade Commission to take action where social media companies fall short. The allocated federal funding will be used to (1) facilitate the initial convening of experts across government departments and nonprofit organizations; (2) provide support for the engineering structure required to launch the Hub and database; (3) support the steering committee of civil society actors that will maintain this service; and (4) create training units for law enforcement officials on supporting survivors of online harassment.
Recommendation 1. Create a committee for governmental departments.
Survivors of online harassment struggle to find recourse, failed by legal technicalities in patchworks of laws across states and untrained law enforcement. The root of the problem is an outdated understanding of the implications and scale of online harassment and a lack of coordination across branches of government on who should handle online harassment and how to properly address such occurrences. A crucial first step is to examine and address these existing gaps. The Task Force should form a long-term committee of members across governmental departments whose work pertains to online harassment. This would include one person from each of the following organizations, nominated by senior staff:
- Department of Homeland Security
- Department of Justice
- Federal Bureau of Investigation
- Department of Health and Human Services
- Office on Violence Against Women
- Federal Trade Commission
This committee will be responsible for outlining shortcomings in the existing system and detailing the kind of information needed to fill those gaps. Then, the committee will outline a framework clearly establishing the recourse options available to harassment victims and the kinds of data collection required to prove a case of harassment. The framework should be completed within the first 6 months after the committee has been convened. After that, the committee will convene twice a year to determine how well the framework is working and, in the long term, implement reforms and updates to current laws and processes to increase the success rates of victims seeking assistance from governmental agencies.
Recommendation 2: Establish a committee for civil society organizations.
The Task Force shall also convene civil society organizations to help form the AOH Hub steering committee and gather a centralized set of resources. Victims will be able to access a centralized hotline and information page, and Hub personnel will then triage reports and direct victims to the resources most helpful for their particular situation. By matching incidents to appropriate resources, the Hub should reduce the burden on targets of harassment campaigns to find the organizations that can help address their issues.
To create the AOH Hub, members of the Task Force can map out civil society stakeholders in the space and solicit applications to achieve comprehensive and equitable representation across sectors. Relevant organizations include those working on (but not limited to):
- Combating domestic violence and intimate partner violence
- Addressing technology-facilitated gender-based violence (TF-GBV)
- Developing online tools for survivors of harassment to protect themselves
- Conducting policy work to improve policies on harassment
- Providing mental health support for survivors of harassment
- Providing pro bono or other forms of legal assistance for survivors of harassment
- Connecting tech company representatives with survivors of harassment
- Researching methods to address online harassment and abuse
The Task Force will convene an initial meeting, during which core members will be selected to form an advisory board, act as liaisons across members, and conduct hiring for the personnel needed to redirect victims to needed services. Other secondary members will take part in collaboratively mapping out and sharing available resources in order to understand where efforts overlap and complement each other. These resources will be consolidated, reviewed, and published as a public database of resources within a year of the group’s formation.
Secondary members’ primary obligation will be to connect with victims who have been referred to their services. Core members, meanwhile, will meet quarterly to evaluate gaps in the services and assistance provided and examine what more needs to be done to strengthen the services and aid available.
Recommendation 3: Convene committee for industry.
After its formation, the AOH steering committee will be responsible for conducting outreach to industry partners to identify a designated team from each company best equipped to address issues pertaining to online abuse. After the first year, the industry committee will provide operational reporting on each company’s existing measures to address online harassment and examine gaps in current approaches. Committee dialogue should also aim to create standardized responses to harassment incidents across industry actors and shared understandings of how best to uphold community guidelines and terms of service. This reporting will also create a framework of standardized best practices for data collection, in terms of the information collected on flagged cases of online harassment.
On a day-to-day basis, industry teams will be available resources for the Hub, and cases can be redirected to these teams for person-to-person support in handling cases of harassment that require a personalized level of assistance. This committee will aim to increase transparency regarding the reporting process and improve equity in responses to online harassment.
Recommendation 4: Gather committees to provide long-term recommendations for policy change.
On a yearly basis, representatives across the three committees will convene to share insights on existing measures and takeaways. These recommendations will be given to the Task Force and other relevant stakeholders, as well as be accessible to the general public. Three years after the formation of these committees, the groups will publish a report centralizing feedback and takeaways from all committees and providing recommendations for improvement moving forward.
Recommendation 5: Create a data-collection mechanism and standard reporting procedures.
The database will be run and maintained by the steering committee with support from the U.S. Digital Service, with funding from the Task Force for its initial development. The data collection mechanism will be informed by the frameworks provided by the committees that compose the Hub to create a trauma-informed and victim-centered framework surrounding the collection, protection, and use of the contained data. The database will be periodically reviewed by the steering committee to ensure that the nature and scope of data collection is necessary and respects the privacy of those whose data it contains. Stakeholders can use this data to analyze and provide evidence of the scale and cross-cutting nature of online harassment and abuse. The database would be populated using a standardized reporting form containing (1) details of the incident; (2) basic demographic data of the victim; (3) platform/means through which the incident occurred; (4) whether it is part of a larger organized campaign; (5) current status of the incident (e.g., whether a message was taken down, an account was suspended, the report is still ongoing); (6) categorization within existing proposed taxonomies indicating the type of abuse. This standardization of data collection would allow advocates to build cases regarding structured campaigns of abuse with well-documented evidence, and the database will archive and collect data across incidents to ensure accountability even if the originals are lost or removed.
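As a rough sketch of how such a standardized record might be structured, the following schema captures the six fields above; all class and field names are hypothetical and illustrative, not a specification for the actual database.

```python
# Illustrative sketch of a standardized incident-report record covering the
# six fields described above. Names and types are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class IncidentStatus(Enum):
    REPORT_ONGOING = "report_ongoing"        # (5) report still being processed
    CONTENT_REMOVED = "content_removed"      #     message/post was taken down
    ACCOUNT_SUSPENDED = "account_suspended"  #     offending account suspended

@dataclass
class IncidentReport:
    details: str                  # (1) description and evidence of the incident
    victim_demographics: dict     # (2) basic, optional demographic data
    platform: str                 # (3) platform/means through which it occurred
    organized_campaign: bool      # (4) part of a larger coordinated campaign?
    status: IncidentStatus        # (5) current status of the incident
    abuse_categories: list[str] = field(default_factory=list)  # (6) taxonomy tags
    reported_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
```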
The reporting form will be available online through the AOH Hub. Anyone with evidence of online harassment will be able to contribute to the database, including but not limited to victims of abuse, bystanders, researchers, civil society organizations, and platforms. To protect the privacy and safety of targets of harassment, this data will not be publicly available. Access will be limited to: (1) members of the Hub and its committees; (2) affiliates of the aforementioned members; (3) researchers and other stakeholders, after submitting an application stating reasons to access the data, plans for data use, and plans for maintaining data privacy and security. Published reports using data from this database will be nonidentifiable, such as with statistics being published in aggregate, and not be able to be linked back to individuals without express consent.
This database is intended to provide data to inform the committees in and partners of the Hub of the existing landscape of technology-facilitated abuse and violence. The large-scale, cross-domain, and cross-platform nature of the data collected will allow for better understanding and analysis of trends that may not be clear when analyzing specific incidents, and provide evidence regarding disproportionate harms to particular communities (such as women, people of color, LGBTQ+ individuals). Resources permitting, the Hub could also survey those who have been impacted by online abuse and harassment to better understand the needs of victims and survivors. This data aims to provide evidence for and help inform the recommendations made from the committees to the Task Force for policy change and further interventions.
Recommendation 6: Improve law enforcement support.
Law enforcement is often ill-equipped to handle issues of technology-facilitated abuse and violence. To address this, Congress should allocate funding for the Hub to create training materials for law enforcement nationwide. The developed materials will be added to training manuals and modules to ensure that 911 operators and officers know how to handle cases of online harassment and how state and federal law can apply to a range of scenarios. As part of the training, operators will also be instructed to add records of 911 calls regarding online harassment to the Hub database, with the survivor’s consent.
Conclusion
As technology-facilitated violence and abuse proliferates, we call for funding to create a steering committee in which experts and stakeholders from civil society, academia, industry, and government can collaborate on monitoring and regulating online harassment across sectors and incidents. The resulting Anti-Online Harassment Hub would maintain a data-collection mechanism accessible to researchers to better understand online harassment as well as provide accountability for social media platforms to address the issue. Finally, the Hub would provide accessible resources for targets of harassment in a fashion that would reduce the burden on these individuals. Implementing these measures would create a safer online space where survivors are able to easily access the support they need and establish a basis for evidence-based, longer-term policy change.
Platform policies on hate and harassment differ in the redress and resolution they offer. Twitter’s proactive removal of racist abuse toward members of the England football team after the UEFA Euro 2020 Finals shows that it is technically feasible for abusive content to be proactively detected and removed by the platforms themselves. However, this appears to only be for high-profile situations or for well-known individuals. For the general public, the burden of dealing with abuse usually falls to the targets to report messages themselves, even as they are in the midst of receiving targeted harassment and threats. Indeed, the current processes for reporting incidents of harassment are often opaque and confusing. Once a report is made, targets of harassment have very little control over the resolution of the report or the speed at which it is addressed. Platforms also have different policies on whether and how a user is notified after a moderation decision is made. Many of these notifications are also conducted through automated systems with no way to appeal, leaving users with limited means for recourse.
Recent years have seen an increase in efforts to combat online harassment. Most notably, in June 2022, Vice President Kamala Harris launched a new White House Task Force to Address Online Harassment and Abuse, co-chaired by the Gender Policy Council and the National Security Council. The Task Force aims to develop policy solutions to enhance accountability of perpetrators of online harm while expanding data collection efforts and increasing access to survivor-centered services. In March 2022, the Biden-Harris Administration also launched the Global Partnership for Action on Gender-Based Online Harassment and Abuse, alongside Australia, Denmark, South Korea, Sweden, and the United Kingdom. The partnership works to advance shared principles and attitudes toward online harassment, improve prevention and response measures to gender-based online harassment, and expand data and access on gender-based online harassment.
Existing efforts focus on technical interventions, such as tools that increase individuals’ digital safety, automatically blur out slurs, or allow trusted individuals to moderate abusive messages directed towards victims’ accounts. There are also many guides that walk individuals through how to better manage their online presence or what to do in response to being targeted. Other organizations provide support for those who are victims and offer next steps, help with reporting, and information on better security practices. However, due to resource constraints, organizations may only be able to support specific types of targets, such as journalists, victims of intimate partner violence, or targets of gendered disinformation. This increases the burden on victims to find support for their specific needs. Academic institutions and researchers have also been developing tools and interventions that measure and address online abuse or improve content moderation. While there are increasing collaborations between academics and civil society, there are still gaps that prevent such interventions from being deployed to their full efficacy.
While complete privacy and security are extremely difficult to ensure in a technical sense, we envision a database design that preserves data privacy while maintaining its usability. First, the fields of information required for filing an incident report would minimize the amount of personally identifiable information collected. As some data can be crowdsourced from the public and external observers, this part of the dataset would consist of existing public data. Nonpublic data would be entered only by individuals who are sharing incidents that target them (e.g., direct messages), and individuals would be allowed to choose whether it is visible in the database or only reflected in summary statistics. Furthermore, the data collection methods and the database structure will be periodically reviewed by the steering committee of civil society organizations, who will make recommendations for improvement as needed.
Data collection and reporting can be conducted internationally, since limiting data collection to the U.S. would undermine our goals of intersectionality. The hotline, however, will likely offer more comprehensive support for U.S.-based issues at first. In the long run, these efforts can be expanded internationally as a collaborative effort across multinational governments.
Accelerating Biomanufacturing and Producing Cost-Effective Amino Acids through a Grand Challenge
Summary
A number of biomanufactured products require amino acids and growth factors as inputs, but these small molecules and proteins can be very expensive, driving up the costs of biomanufacturing, slowing the expansion of the U.S. bioeconomy, and limiting the use of novel biomedical and synthetically produced agricultural products. Manufacturing costs can be substantially limiting: officials from the National Institutes of Health and the Bill & Melinda Gates Foundation point to the manufacturing costs of antibody drugs as a major bottleneck in developing and distributing treatments for a variety of extant and emerging infectious diseases. To help bring down the costs of these biomanufacturing inputs, the Biden-Harris Administration should allocate federal funding for a Grand Challenge to research and develop reduced-cost manufacturing processes and demonstrate the scalability of these solutions.
Amino acids are essential but costly inputs for large-scale bioproduction. To reduce these costs, federal funding should be used to incentivize the development of scalable production methods that cut production costs to half of current levels. Specifically, the U.S. Department of Agriculture (USDA) and ARPA-H should jointly commit an initial $15 million for 10 research projects in the first year, and a total of $75 million over five years, in Grand Challenge funding for researchers or companies who can develop a scalable process for producing food-grade or pharmaceutical-grade amino acids or growth factors at a fraction of current costs. ARPA-H should also make funding available for test-bed facilities that researchers can use to demonstrate the scalability of their cost-saving production methods.
Scaling up the use of animal cell culture for biosynthetic production will only be economically effective if the costs of amino acids and growth factors are reduced. Reducing the cost of bioproduction of medical and pharmaceutical products like vaccines and antimicrobial peptides, or of animal tissue products like meat or cartilage, would improve the availability and affordability of these products, make innovation and new product development easier and more cost effective, and increase our ability to economically manufacture bioproducts in the United States, reducing our dependence on foreign supply chains.
For a better understanding of the use of amino acids and growth factors in the production of biologics and animal cell-based products, and to accurately forecast supply and demand to ensure a reliable and available supply chain for medical products, the Department of Defense (DoD) and USDA should jointly commission an economic analysis of synthetic manufacturing pathway costs for common bioproducts and include assessments of comparative costs of production for major international competitors.
Challenge and Opportunity
Amino acids are necessary inputs when synthesizing protein and peptide products, including pharmaceutical and healthcare products (e.g., antibodies, insulin) and agricultural products (e.g., synthetic plant and animal proteins for food, collagen, gelatin, insecticidal proteins), but they are very expensive. Amino acids as inputs to cell culture cost approximately $3 to $50 per kg, and growth factors cost $50,000 per gram, meaning that their costs can be half or more of the total production cost.
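To put these quoted prices on a common per-kilogram basis (a unit conversion only, using the figures above):

```python
# Unit conversion putting the quoted input prices on a common basis.
AA_COST_PER_KG_TOP = 50        # amino acids: roughly $3-$50 per kg (top of range)
GF_COST_PER_G = 50_000         # growth factors: ~$50,000 per gram

gf_cost_per_kg = GF_COST_PER_G * 1_000          # $50 million per kg
ratio = gf_cost_per_kg / AA_COST_PER_KG_TOP     # ~1,000,000x

print(f"Growth factors: ${gf_cost_per_kg:,} per kg")
print(f"Roughly {ratio:,.0f}x the top of the quoted amino acid price range")
```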
Biomanufacturing depends on the availability of reagents, small molecules, and bioproducts that are used as raw inputs to the manufacturing process. The production of synthetic bioproducts is limited by the cost and availability of certain reagents, including amino acids and small signaling proteins like hormones and growth factors. These production inputs are used in cell culture to increase yields and production efficiency in the biosynthesis of products such as monoclonal antibodies, synthetic meat, clotting factors, and interferon (proteins that inhibit tumor growth and support immune system function). While some bioproducts can be produced synthetically in plant cells or bacterial cells, some products benefit from production steps in animal cells. One example is glycosylation, a protein-modification process that helps proteins fold into stable structures, which is a much simpler process in animal cells than in bacteria or in cell-free systems. The viruses used in vaccine development are also usually grown in animal cells, though some recombinant vaccines can be made in yeast or insect cells. There are benefits and drawbacks to the use of plant, fungal, bacterial, insect, or animal cells in recombinant bioproduction; animal cells are generally more versatile because they mimic human processes closely and require less engineering than non-animal cells. All cells, whether animal, plant, or bacterial, require amino acids and various growth factors to survive and function efficiently. While growth factors may no longer be required in the future, amino acids will always be necessary. On a price-per-kilogram basis, amino acids are the most costly of the necessary additives, while growth factors are the most costly of the supporting additives.
Growth factors are proteins or steroids that act as signaling molecules that regulate cells’ internal processes, while amino acids are building blocks of proteins that are necessary both for cell function and for producing new proteins within a cell. Cells require supplementation with both growth factors and amino acids because most cells are not capable of producing their own growth factors. Biosynthetic production in animal cells frequently uses growth factors (e.g., TGF, IGF) to increase yield and increase production speed, signaling cells to work faster and make more of a particular compound.
Pharmaceuticals
Although pharmaceutical products are expensive, relatively small demand volumes prevent market forces from exerting sufficient cost pressure to spur innovation in their production. The biosynthetic production of pharmaceuticals involves engineering cells to produce large quantities of a molecule, such as a protein or peptide, which can then be isolated, purified, and used in medicine. Peptide therapeutics is a $39 billion global market that includes peptides sold as end products and others used as inputs to the synthesis of other biological compounds. Protein and peptide product precursors, including amino acids and growth factors, represent a substantial cost of production, which is a barrier to low-cost, high-volume biomanufacturing.
For example, the production of antimicrobial peptides, used as therapeutics against antibiotic-resistant bacteria and viruses, is strongly constrained by the cost of chemical inputs. One input alone, guanidine, accounts for more than 25% of the approximately $41,000 per gram production cost of antimicrobial peptides. Reducing the cost of these inputs will have substantial downstream effects on the economics of production. Antimicrobial peptides are currently very expensive to produce, limiting their development as alternatives to antibiotics, despite a growing need for new antibiotics. The U.S. National Action Plan for Combating Antibiotic-Resistant Bacteria (CARB) outlines a coordinated strategy to accelerate the development of new antibiotics and slow the spread of antibiotic resistance. Reducing the cost to produce antimicrobial peptides would support these goals.
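Because the downstream effect of cheaper inputs follows directly from an input's cost share, the arithmetic can be sketched in a few lines; this treats the cited 25% guanidine share as exact for illustration.

```python
# Effect of cutting one input's cost on total production cost.
amp_cost_per_g = 41_000.0   # USD/g antimicrobial peptide cost cited above
guanidine_share = 0.25      # guanidine's share of production cost (>25% above)

for input_cost_reduction in (0.25, 0.50, 0.75):
    new_cost = amp_cost_per_g * (1 - guanidine_share * input_cost_reduction)
    print(f"{input_cost_reduction:.0%} cheaper guanidine -> "
          f"${new_cost:,.0f}/g ({guanidine_share * input_cost_reduction:.1%} total savings)")
# A 50% reduction in guanidine cost alone lowers the total cost by 12.5%,
# about $5,125 per gram at the cited figure.
```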
The high costs of synthetic production limit the growth of the market for synthetic products. This creates a local equilibrium that is suboptimal for the development of the synthetic biology industry and creates barriers to market entry for synthetic products that could, at scale, address environmental and bioavailability concerns associated with natural sources. The federal government has already indicated an interest in supporting the development of a robust and innovative U.S.-based biomanufacturing sector, with the passage of the CHIPS and Science Act and Executive Order 14081 on Advancing Biotechnology and Biomanufacturing Innovation for a Sustainable, Safe, and Secure American Bioeconomy. Reducing the costs of basic inputs to the biomanufacturing process of a range of products addresses this desire to make U.S. biomanufacturing more sustainable. There are other examples of federal investment to reduce the cost of manufacturing inputs, from USDA support for new methods of producing fertilizer, to Food and Drug Administration investment to improve pharmaceutical manufacturing and establish manufacturing R&D centers at universities, to USDA National Institute of Food and Agriculture (NIFA) support for the development of bioplastics and bio-based construction materials. Federal R&D support increases subsequent private research funding and increases the number of new products that recipients develop, a positive measure of innovation.
The effort to reduce biomanufacturing costs is larger than any one company; therefore, it requires a coordinated effort across industry, academia, and government to develop and implement the best solution. The ability to cost-effectively manufacture precursors will directly and indirectly advance all aspects of biomanufacturing. Academia and industry are poised and ready to improve the efficiency and cost of bioproduction but require federal government coordination and support to achieve this essential milestone and to support the development of the newly emerging industry of large-scale synthetic bioproducts.
Synthetic meat
Developing cost-effective protein and peptide synthesis would remove a substantial barrier to the expansion of synthetic medical and agricultural products, which would address current supply bottlenecks (e.g., blood proteins, antibody drugs) and mounting environmental and political challenges to natural sourcing (e.g., beef, soy protein). Over the past decade, breakthroughs in manufacturing capability have made it possible to synthetically produce biological products like biofuels and the antimalarial drug artemisinin, yet these products have failed to reach cost-competitiveness with naturally sourced competitors despite the environmental and supply-chain-related benefits of a synthetic version. The Department of Energy (DoE) and others continue to invest in biofuel and bioproduct development, and additional research innovation may soon bring these products to a cost-competitive threshold. For bioproducts that depend on amino acids and growth factors as inputs, that threshold may be very close. Proof-of-concept research on growth factor and amino acid production, as well as techno-economic assessments of synthetic meat products, point to precursor amino acids and proteins as substantial barriers to cost competitiveness of bioproduction, but ones close to being overcome through technological development. Potential innovators lack support to invest in the development of potentially globally beneficial technologies with uncertain returns.
Reducing the costs of these inputs for the peptide drug and pharmaceutical market could also bring down the costs of synthetic meat, thereby opening a substantial additional market for low-cost amino acids and growth factors while alleviating the environmental burdens of a growing demand for meat. Israel has demonstrated that there is strong demand for such products and has invested substantially in its synthetic meat sector, which in turn has augmented its overall bioeconomy.
Bringing the cost of synthetic meat down from current estimates of $250 per kg to the high end of wholesale meat prices, roughly $10 per kg, is infeasible without reducing the cost of growth factors and amino acids as production inputs. Achieving that parity would matter: replacing conventionally produced meat with synthetic meat would reduce the water and land usage of meat production by 70% to 95%. Synthetic meat would also alleviate many of the ethical and environmental objections to animal agriculture, reduce food waste, and increase the amount of plant products available for human consumption (currently 77% of agricultural land is used for livestock, meat, and dairy production, and 45% of the world’s crop calories are eaten by livestock).
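A rough sensitivity check illustrates why cheaper inputs are a precondition for parity; the assumed share of synthetic meat cost attributable to media inputs is a placeholder, as published techno-economic assessments vary widely.

```python
# Why $10/kg synthetic meat is out of reach without cheaper media inputs.
current_cost = 250.0   # USD/kg, current estimate cited above
target_cost = 10.0     # USD/kg, high end of wholesale meat prices
media_share = 0.6      # assumed share of cost from amino acids/growth factors

non_media_cost = current_cost * (1 - media_share)  # $100/kg under this assumption
# Even if media inputs were free, cost would only fall to non_media_cost,
# still 10x the target; input costs dominate, so reducing them is necessary
# (though not alone sufficient) for parity.
print(f"Cost floor with free media inputs: ${non_media_cost:.0f}/kg "
      f"(target: ${target_cost:.0f}/kg)")
```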
Bioeconomy initiatives and opportunity
Maintaining U.S. competitiveness and leadership in biomanufacturing and the bioeconomy is a priority for the Biden-Harris Administration, which has led to a national bioeconomy strategy that aims to coordinate federal investment in R&D for biomanufacturing, improve and expand domestic biomanufacturing capacity, and expand market opportunities for biobased products. Reducing the cost and expanding the supply of amino acids and growth factors supports these three objectives by making bioproducts derived from animal cells cheaper and more efficient to produce.
Several directives within President Biden’s National Biotechnology and Biomanufacturing Initiative could apply to the goal of producing cost-effective amino acids and growth factors, but a particular stipulation for the Department of Health and Human Services stands out. The 2022 Executive Order 14081 on Advancing Biotechnology and Biomanufacturing Innovation for a Sustainable, Safe, and Secure American Bioeconomy includes a directive for the Department of Health and Human Services (HHS) to invest $40 million to “expand the role of biomanufacturing for active pharmaceutical ingredients (APIs), antibiotics, and the key starting materials needed to produce essential medications and respond to pandemics.” Protein and peptide product precursors are key starting materials for medical and pharmaceutical products, justifying HHS support for this research challenge.
Congress has also signaled its intent to advance U.S. biotech and biomanufacturing. The CHIPS and Science Act authorizes funding for projects that could scale up the U.S. bioeconomy. Title IV of the Act, on bioeconomy research and development, authorizes financial support for research, test beds for scaling up technologies, and tools to accelerate research. This support could take the form of grants, multi-agency collaborative funding, and Small Business Innovation Research (SBIR) or Small Business Technology Transfer (STTR) funding.
Biomanufacturing is important for national security and stability, yet much research and development is needed to realize that potential. The abovementioned funding opportunities should be leveraged to support foundational, cross-cutting capabilities, such as the production of essential precursor molecules, in order to achieve affordable, accessible biomanufactured products.
Plan of Action
To provide the catalyst for innovation that will drive down the price of components, federal funding should be made available to organizations developing cost-effective biosynthetic production pathways. Initial funding would be most helpful in the form of research grants as part of a Grand Challenge competition. University researchers have made some proof-of-concept progress in developing cost-effective methods of amino acid synthesis, but the investment required to demonstrate that these methods succeed at scale is currently not provided by the market. The main market for synthetic biomanufacturing inputs like amino acids is pharmaceutical production, where manufacturers can pass high production costs on to the consumer and are therefore not sufficiently incentivized to drive down input costs.
Recommendation 1. Provide Grand Challenge funding for reduced-cost scalable production methods for amino acids and growth factors.
The USDA (through the USDA-NIFA Agriculture and Food Research Initiative [AFRI] or through AgARDA if it is funded) and ARPA-H should jointly commit to $15 million for 10 projects in the first year, with a total of $75 million over five years, in Grand Challenge funding for researchers or companies who can develop a scalable process for producing food-grade or pharmaceutical-grade amino acids or growth factors at a fraction of current costs (e.g., $100,000 per kg for growth factors, and $1.50 per kg for amino acids), with escalating prizes for greater cost reductions. Applicants could also qualify by demonstrating scalably produced bioengineered growth factors with increased efficacy and efficiency. Grand Challenges offer funding to incentivize productive competition among researchers to achieve specific goals; they may also offer prizes for achieving interim steps toward a larger goal.
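One way the escalating prize structure could be parameterized is sketched below; the thresholds and award amounts are illustrative assumptions, not proposed program figures.

```python
# Illustrative escalating prize tiers keyed to demonstrated cost reduction.
# Thresholds and awards are assumptions, not proposed program figures.
def prize_for(cost_reduction: float) -> float:
    """Return a prize in USD for a demonstrated fractional cost reduction."""
    tiers = [           # (minimum cost reduction, prize in USD)
        (0.90, 3_000_000),
        (0.75, 1_500_000),
        (0.50,   750_000),
    ]
    for threshold, award in tiers:
        if cost_reduction >= threshold:
            return award
    return 0.0

print(prize_for(0.80))  # 1,500,000 under these illustrative tiers
```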
ARPA-H and USDA are well-positioned to spur innovation in cost-effective precursor production. Decreasing the costs of producing amino acids and growth factors would enable the transformative development of biologics and animal-cell-based products like synthetic meat, which aligns well with ARPA-H’s goal of supporting the development of breakthrough medical and biological products and technologies. ARPA-H aims to use its $6.5 billion in funding from the FY22 federal budget to invest in three-to-five-year projects that will support breakthrough technologies that are not yet economically compelling or sufficiently feasible for companies to invest internally in their development. An example technology cited by the ARPA-H concept paper is “new manufacturing processes to create patient-specific T-cells to search and destroy malignant cells, decreasing costs from $100,000s to $1000s to make these therapies widely available.” Analogously, new manufacturing processes for animal cell culture inputs will make biosynthetic products more cost-effective and widely available, but the potential market is still speculative, making investment risky.
AgARDA was designed to complement AFRI in its model for soliciting research proposals, and the opportunity to jointly support projects like a Grand Challenge to scale up amino acid and growth factor production is one more reason to fund AgARDA at its authorized level. Because producing cell-based meat at cost parity with animal meat would be an agricultural achievement, lowering the cost of necessary inputs to cell-based meat production could fall under the scope of AgARDA.
Recommendation 2. Reward Grand Challenge winners who demonstrate scalability and provide BioPreferred program purchasing preference.
Researchers developing novel low-cost, high-efficiency production methods for amino acids and growth factors will also need access to facilities and manufacturing test beds to ensure that their solutions can scale up to industrial levels of production. To support this, ARPA-H should make funding available to Grand Challenge winners to demonstrate scaling their solutions to hundreds of kilograms per year. This is aligned with the test-bed development mandated by the CHIPS and Science Act. This funding should include $15 million to establish five test-bed facilities (a similar facility at the University of Delaware was funded at $3 million) and an additional $3 million to provide vouchers of between $10,000 and $300,000 for use at test-bed facilities. (These amounts are similar to the vouchers provided by the California Energy Commission for its clean energy test-bed program.)
To support the establishment of a market for the novel production processes, USDA should add to its BioPreferred program a requirement that federal procurement give preference to winners of the Grand Challenge when purchasing amino acids or growth factors for the production of biologics and animal cell-derived products. The BioPreferred program requires that federal purchases favor bio-based products (e.g., biodegradable cutlery rather than plastic cutlery) where the bio-based product meets the requirements for the purchaser’s use of that product. This type of purchasing commitment would be especially valuable for Grand Challenge winners who identify novel production methods—such as molecular “farming” in plants or cell-free protein synthesis—whose startup costs make it difficult to bootstrap incremental growth in production. Requiring that federal purchasing give preference to Grand Challenge winners ensures a certain volume of demand for new suppliers to establish themselves without increasing costs for purchasers.
Stakeholder support for this Grand Challenge would include research universities; the alternative protein, peptide products, and synthetic protein industries; nonprofits supporting reduced peptide drug prices (such as the American Diabetes Association or the Boulder Peptide Foundation) and a reduction in animal agriculture (such as New Harvest or the Good Food Institute); and U.S. biomanufacturing supporters, including DoE and DoD. Companies and researchers working on novel methods for scalable amino acid and growth factor production will also support additional funding for technology-agnostic solutions (solutions that focus on characteristics of the end product rather than the method—such as precision fermentation, plant engineering, or cell-free synthesis—used to obtain the product).
As another incentive, ARPA-H should solicit additional philanthropic and private funding for Grand Challenge winners, which could take the form of additional prize money or advance purchase commitment for a specified volume of amino acids or growth factors at a given threshold price, providing further incentive for bringing costs below the level specified by the Challenge.
Recommendation 3. To project future demand, DoD should commission an economic analysis of synthetic manufacturing pathway costs for common bioproducts, and include assessments of comparative costs in major international competitors (e.g., China, the European Union, the United Kingdom, Singapore, South Korea, Japan).
This analysis could be funded in part via BioMADE’s project calls for technology and innovation research. BioMADE received $87 million in DoD funding in 2020 for a seven-year period, plus an additional $450 million announced in 2023. Cost sharing for this project could come from the NSF Directorate for Technology, Innovation, and Partnerships or from the DoE’s Office of Science’s Biological and Environmental Research Program, which has supported techno-economic analyses of similar technologies, such as biofuels.
EO 14081 also includes DoD as a major contributor to building the bioeconomy. The DoD’s Tri-Service Biotechnology for a Resilient Supply Chain program will invest $270 million over five years to speed the application of research to product manufacturing. Decreasing the costs of amino acids and growth factors as inputs to manufacturing biologics could be part of this new program, depending on the forthcoming details of its implementation. Advancing cost-effective biomanufacturing will transform defense capabilities needed to maintain U.S. competitiveness, secure critical supply chains, and enhance resiliency of our troops and defense needs, including medicines, alternative foods, fuels, commodity and specialty chemicals, sensors, materials, and more. China recently declared a focus on synthetic animal protein production in its January 2022 Five Year Plan for Agriculture. Our trade relationship with China, which includes many agricultural products, may shift if China is able to successfully produce these products synthetically.
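As a rough sketch of the kind of output the commissioned analysis could produce, the toy model below compares per-kilogram production costs across country profiles; every figure is a placeholder assumption, not real data.

```python
# Toy cross-country cost comparison for one synthetic bioproduct.
# All figures are placeholder assumptions for illustration only.
profiles = {
    # country: (input $/kg, labor $/kg, energy $/kg, capital+other $/kg)
    "United States": (120.0, 40.0, 15.0, 25.0),
    "Competitor A":  (110.0, 15.0, 10.0, 20.0),
    "Competitor B":  (130.0, 25.0,  8.0, 22.0),
}

for country, costs in sorted(profiles.items(), key=lambda kv: sum(kv[1])):
    inputs, labor, energy, other = costs
    total = inputs + labor + energy + other
    print(f"{country:>14}: ${total:6.2f}/kg "
          f"(inputs {inputs / total:.0%} of total)")
# Under these placeholders, inputs dominate cost everywhere, so input-cost
# R&D shifts relative competitiveness more than labor or energy differences.
```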
Conclusion
To support the development of an expansive and nimble biomanufacturing economy within the United States, federal agencies should ensure that the necessary inputs for creating biomanufactured products are as abundant and cost-effective as possible. Just as the cost to produce an almond is greatly dependent on the cost of water, the cost to manufacture a biological product in a cell-based manufacturing system depends on the cost of the inputs used to feed that system. Biomanufactured products that require amino acids and growth factors as inputs range from the medically necessary, like clotting factors and monoclonal antibodies, to the potentially monumental and industry-changing, like cell-based meat and dairy products. Federal actions to increase the feasibility and cost-effectiveness of manufacturing these products in the United States will beneficially affect the bioeconomy and biotechnology industry, the pharmaceutical and biomedical industries, and potentially the food and agriculture industries as well.
Similar grant funding through NINDS (CREATE Bio) and NIST (NIIMBL) for biomanufacturing initiatives devoted $10 million to $16 million in funding for 12-14 projects. The USDA recently awarded $10 million over five years to Tufts University to develop a National Institute for Cellular Agriculture, as part of a $146 million investment in 15 research projects announced in 2021 and distributed by the USDA-NIFA Agriculture and Food Research Initiative’s Sustainable Agricultural Systems (AFRI-SAS) program. AFRI-SAS supports workforce training and standardization of methods used in the production of cell-based meat, while Tufts’s broader research goals include evaluating the economics of production. Decreasing the cost of synthetic meat is key to developing a sustainable cellular agriculture program, and USDA could direct a portion of its AFRI-SAS funding to providing support for this initiative.
Yes. Current production costs for biological products, such as monoclonal antibody drugs, are sufficiently high that developing monoclonal antibodies for infectious diseases that primarily affect poor regions of the world is considered infeasible. Decreasing the costs of manufacturing these drugs by decreasing the costs of their inputs would make it economically possible to develop antibody drugs for diseases like malaria and Zika, and biomedical innovation for other infectious diseases could follow. Similarly, decreasing the costs of amino acid and growth factor inputs would allow synthetic meat companies greater flexibility in the types of products and manufacturing processes they are able to use, increasing their ability to innovate.
In fact, a few non-U.S. companies are pursuing the production of synthetic growth factors as well as bioengineered platforms for lower-cost growth factor production. Israeli company BioBetter, Icelandic company ORF Genetics, UK-based CellRX, and Canadian company Future Fields are all working to decrease growth factor cost, while Japanese company Ajinomoto and Chinese companies such as Meihua Bio and Fosun Pharma are developing processes to decrease amino acid costs. Many of these companies receive subsidies or are funded by national venture funding dedicated to synthetic biology and the alternative protein sector. Thus, U.S. federal funding of lower-cost amino acid and growth factor production would support the continued competitiveness of the national bioeconomy and demonstrate support for domestically manufactured bioengineered products.
Reducing the supply chain costs of manufacturing allows companies to increase manufacturing volumes, produce a wider range of products, and sell into more price-sensitive markets, all of which could result in job growth and the expansion of the biomanufacturing sector. Solar panels and photovoltaic cells offer an analogous example: substantial decreases in their production costs have been coupled with job growth, and jobs in photovoltaics are seeing the largest increases amid overall growth in renewable energy employment.
The techniques required to lower costs and scale production of amino acids and growth factors should translate to the production of other types of small molecules and proteins, and may even pave the way for more efficient and lower-cost production methods in chemical engineering, which shares some methods with bioengineering and biological manufacturing. For example, chemical engineering can involve the production of organic molecules and processing and filtration steps that are also used in the production of amino acids and growth factors.
Increased synthetic meat production will help address growing demand for meat and protein-rich foods, demand the livestock industry struggles to meet amid competing pressures on land, water, agricultural products, and skilled labor. As an example, the recent U.S. egg shortage demonstrated that the livestock industry is susceptible to external production shocks caused by disease and unexpected environmental effects. Many large-scale meat companies, including giants like Cargill and Tyson Foods, see themselves as in the business of supplying protein rather than the business of slaughtering animals, and have invested in plant-based-meat companies to broaden their portfolios. Expanding into synthetic meat is another way for animal agriculture to continue to serve meat to customers while incorporating new technological methods of production. If synthetic meat adoption expands rapidly enough to reduce the need for animal husbandry, farmers and ranchers will likely respond by shifting the types of products they produce, whether by growing more vegetables and plant crops or by raising animals for other industries.
Visa Interview Waivers after COVID
Summary
The COVID-19 pandemic severely impaired State Department (DOS) processing capacity by interrupting operations at U.S. consulates and foreign posts, slashing revenue for consular services through the resultant collapse in collected fees, and straining preexisting staffing challenges. To respond to diminished capacity, the State Department used its authority to waive in-person interviews in order to efficiently process visas with the resources it had available while protecting national security. Even after COVID-19 ends as an official public health emergency, its effects on visa processing capacity will linger. In 2022, 48 percent of nonimmigrant visas were issued with an interview waiver, a vital component in rejuvenating global talent mobility. Current visa interview waiver policies should remain in place until U.S. visa processing fully rebounds and should become a permanent feature of the State Department’s ongoing efforts to develop country-by-country consular policies that mitigate risk and avoid backlogs.
Expanded use of interview waivers because of diminished processing capacity
Congress authorized interview waivers to allow the State Department to focus its scarce resources on potential threats. The State Department originally had complete discretion about who must make a “personal appearance” and who may be waived under the Immigration and Nationality Act of 1952. Before the September 11th attacks, personal appearance waivers were relatively common, but post-9/11 policy guidance, initially codified in regulation in 2003, restricted the use of waivers to certain circumstances. Congress codified these restrictions in the Intelligence Reform and Terrorism Prevention Act of 2004, which added an in-person interview requirement for all applicants between 14 and 79 except under particular circumstances. Namely, DOS can offer waivers for the in-person interview requirement to applicants renewing visas (who have already had interviews) and for designated low-risk applicants.
The waiver authorities that the 2004 law left to the State Department contain three components. First, individual consular officers may waive in-person interviews in certain cases when the applicant “presents no national security concerns requiring an interview.” Second, the Secretary of State may waive interviews when it is in the national interest. Third, the Deputy Assistant Secretary for Visa Services has the authority to waive interviews when it is “necessary as a result of unusual or emergent circumstances.”
In the wake of COVID-19, the State Department has strategically used these waivers to address growing backlogs. After a temporary suspension of visa processing at the beginning of the pandemic, DOS resumed limited visa processing in July 2020. However, limited capacity led to significant backlogs and wait times. A number of factors have contributed to lengthy backlogs:
- Interrupted operations at consulates and embassies: Many consular offices shut down temporarily or scaled back their services during peak pandemic times due to lockdown measures and health risks. This led to delays in application processes that spiraled into massive backlogs when normal functionality resumed.
- Diminished revenue: As a fee-based agency, consular services lost the revenue associated with normal operations. Cuts to staff and resources left the agency with higher caseloads per officer.
- Limited resources before pandemic: Even before COVID-19, US consulates and embassies had inadequate resources to efficiently handle significant processing demands. This problem was exacerbated by pandemic-related disruptions.
- Increased application volumes: Global travel resumed as vaccines became widely available. Families reuniting after extended time apart was a primary contributor to rising visa application volumes.
The Department of State’s current policies focus on low-risk applicants, namely individuals who: have previously traveled to the United States; have biometrics on file for full screening and vetting; and either are the beneficiary of an approved petition from DHS confirming their eligibility for a visa classification or have already received a Certificate of Eligibility for a visa classification by an institution designated by DOS.
On March 26, 2020, Secretary of State Pompeo announced that DOS would expand the availability of waivers to certain H-2 applicants, marking the first expansion of visa waivers in response to reduced processing capacity. In August 2020, Pompeo announced that applicants seeking a visa in the same category they previously held would be allowed to get an interview waiver if their visa had expired within the previous 24 months; before this, the window for an interview waiver was only 12 months. In December 2020, just two days before this policy was set to expire, DOS extended it through the end of March 2021. In March 2021, the window was doubled again, from 24 to 48 months, and the policy was extended through December 31, 2021. In September 2021, DOS also approved waivers through the remainder of 2021 for applicants for F, M, and academic J visas from Visa Waiver Program countries who had previously been issued a visa.
In December 2021, DOS extended its then-existing policies (with some minor modifications) through December 2022. It also expanded its interview waiver policies by making first-time applicants for H-1, H-3, H-4, L, O, P, and Q visas (all classifications requiring petition adjudication by DHS) eligible for waivers if they are nationals of countries participating in the Visa Waiver Program and have previously traveled to the United States through the Electronic System for Travel Authorization (ESTA). Applicants for H-1, H-3, H-4, L, O, P, and Q visas are also eligible for waivers if they have previously been issued any type of visa (meaning their biometric data is on file with DOS), have never been refused a visa (unless the refusal was overcome or waived), and have no apparent or potential ineligibility. Applicants who have been issued a valid Certificate of Eligibility for classification as an F-1 student or an exchange visitor on an academic J-1 program may also be issued a visa without an interview. Moreover, DOS announced as standing policy that individuals renewing a visa in the same category as one that expired in the preceding 48 months may be eligible for issuance without an interview, and added the provision to the department’s Foreign Affairs Manual for consular officers. In December 2022, DOS announced another extension of these policies, which are set to expire at the end of 2023.
In April 2023, President Biden signed a resolution ending the state of national emergency initiated by the pandemic. The public health emergency expires on May 11, 2023.
As policymakers consider the future of interview waivers beyond the official COVID emergency, they should note that the new waiver policies were a response to a profound reduction in processing capacity rather than a direct public health measure. Even with expanded use of waivers, backlogs are still significant: the average wait time is estimated to be about 100 days, well above pre-pandemic waits. Even though the public health emergency has ended, current policies on interview waivers must be retained as long as processing delays persist.
Interview Waivers Have Been Highly Effective
Interview waivers have positively contributed to effective visa processing. Recent data show a decline in global wait times for various applicant types, including students, exchange visitors, temporary workers requiring DHS petition approval, and B-1/B-2 visitors. Moreover, interview waivers have had a minimal impact on overstay rates.
It should be noted that waivers are not granted at the expense of national security or public safety. Robust screening and vetting protocols persist even when interviews are waived. Preserving the waiver mechanism can help strike a balance between robust screening and vetting measures and the procedural workflows that are vital for efficiently managing backlog cases. Waived applicants typically have low-risk profiles or have previously been granted visas after comprehensive background checks, and they are subjected to the same screening and vetting checks and reviews as interviewed applicants based on the biometrics already on file. Allowing State’s consular posts to receive visa applications without an interview, without mandating that posts do so for all available categories, lets consular officials take country-specific conditions into account.
As the State Department recently noted: “These interview waiver authorities have reduced visa appointment wait times at many embassies and consulates by freeing up in-person interview appointments for other applicants who require an interview. Nearly half of the almost seven million nonimmigrant visas the Department issued in Fiscal Year 2022 were adjudicated without an in-person interview. We are successfully lowering visa wait times worldwide, following closures during the pandemic, and making every effort to further reduce those wait times as quickly as possible, including for first-time tourist visa applicants. Embassies and consulates may still require an in-person interview on a case-by-case basis and dependent upon local conditions.”
Given that about half of all nonimmigrant visas were issued last year without an interview, discontinuing interview waivers following the end of the public health emergency would create undue strain on an already understaffed consular workforce and hamper global mobility just as academic, industrial, and government travel is returning to pre-pandemic levels. The interview workload for the nearly half of successful visa applications previously handled through waivers would instead fall on a system that is poorly equipped to service the growing post-pandemic demand.
Interview waivers do not jeopardize security
As the State Department explained in 2015, “interview waiver options do not represent a reduced scrutiny of applicants; rather, they are intended to enhance the security of the visa process by allowing State to focus more of its resources on potential threats.”
First, expanded use of interview waivers as a result of the pandemic only applies to low-risk applicants. The waivers are subject to important guardrails to safeguard security. They are not available to any applicant who: has previously been denied a visa; is listed in the Consular Lookout and Support System (CLASS); requires a Security Advisory Opinion or State Department clearance; is applying from a country they are not a national or resident of; or who is applying from a country designated a state sponsor of terrorism. Furthermore, they cannot be a member of any group that poses a security threat, has historically had an above average rate of visa denials, or poses a substantial risk of visa fraud.
Second, applicants eligible for interview waivers remain subject to the background checks and all screening and vetting required for all nonimmigrants, including name checks and biometric screening.
Third, the waivers are discretionary. Consular officers always have the option to interview an applicant if they doubt their credibility or have any other questions about their eligibility following standard screening procedures.
Interview waivers maximize the security afforded by DOS for a given level of processing capacity by allowing the department to deploy its resources where they are most needed.
Recommendations
Current interview waivers should be extended until at least 80% of non-immigrant visa applicants in categories requiring USCIS petition approval or sponsor-issued Certificates of Eligibility can schedule an interview within three weeks. Existing waivers should not be lifted unless this benchmark for visa processing can be maintained. In 2012, the president established this benchmark as a target for DOS with regard to business and tourist visas. By 2015, the Department successfully brought wait times down with the help of numerous policy changes, including the use of interview waivers. This benchmark provides a reasonable criterion to define unusual or emergent circumstances related to visa processing justifying waivers.
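A minimal sketch of how this benchmark could be checked against appointment data follows; the per-applicant wait times are hypothetical.

```python
# Check the 80%-within-three-weeks interview benchmark for a consular post.
# wait_days is hypothetical data: days each applicant waited for an interview.
BENCHMARK_SHARE = 0.80
BENCHMARK_DAYS = 21  # three weeks

def meets_benchmark(wait_days: list[int]) -> bool:
    """True if at least 80% of applicants could schedule within 21 days."""
    within = sum(1 for d in wait_days if d <= BENCHMARK_DAYS)
    return within / len(wait_days) >= BENCHMARK_SHARE

sample = [5, 12, 18, 20, 21, 30, 45, 100, 14, 9]   # hypothetical post data
print(meets_benchmark(sample))  # 7 of 10 within 21 days -> False
```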
Consular management controls should include required annual reporting by consular posts to the State Department’s Bureau of Consular Affairs on the use of interview waivers. Consular posts typically conduct a handful of validation studies each year for Visa Services leadership in Consular Affairs. Each consular post should be tasked with reporting: whether the post utilized interview waiver authorities; the reasoning for when the authorities were or were not employed; what efficiencies or hurdles were encountered; and how the targeted use of interview waivers at the individual post can mitigate risks by allowing consular officials to focus attention on country-specific conditions.
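One way to standardize these annual reports is a fixed record structure; the sketch below uses hypothetical field names, not an existing State Department schema.

```python
# Illustrative structure for a consular post's annual waiver report.
# Field names are hypothetical, not an existing State Department schema.
from dataclasses import dataclass

@dataclass
class InterviewWaiverReport:
    post: str                      # e.g., "U.S. Embassy Example"
    fiscal_year: int
    used_waiver_authorities: bool
    rationale: str                 # why authorities were or were not employed
    efficiencies_observed: str     # e.g., reduced wait times, freed officer hours
    hurdles_encountered: str       # e.g., IT limitations, fraud indicators
    country_specific_risks: str    # conditions informing targeted waiver use

report = InterviewWaiverReport(
    post="U.S. Embassy Example",
    fiscal_year=2024,
    used_waiver_authorities=True,
    rationale="High renewal volume with biometrics already on file",
    efficiencies_observed="Median wait time fell below three weeks",
    hurdles_encountered="Scheduling software could not flag waiver eligibility",
    country_specific_risks="Low overstay and fraud rates in renewal categories",
)
```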
Congress should authorize expanded interview waivers beyond the emergent circumstances of reduced processing capacity and task DOS with piloting other policies that would institutionalize efficient visa processing. Waivers are justified under current authority by the unusual circumstance of reduced processing capacity but may be helpful even when processing capacity has rebounded. Congress can and should make clear that it intends the national interest authorities left to the Secretary of State to be used for the purpose of keeping processing times down. Congress can also help DOS pilot remote interviews for the lowest-risk applicants who remain ineligible for interview waivers, or in countries where interview waivers are not an appropriate response to country conditions. Combining interview waivers with remote interviewing authority would allow the State Department to better choose how to deploy its resources while maintaining thorough screening and vetting through biometrics. Institutionalizing more certain and predictable timing on visa applications would help ensure the United States remains attractive to the international talent that is key to keeping the country competitive.
Conclusion
Despite reported improvements in pandemic conditions, visa backlogs continue to pose significant challenges at U.S. diplomatic missions around the world. The State Department should be allowed to broadly and flexibly use consular resources to collect and review screening and vetting results and complete all processing requirements without scheduling interviews. This will allow the Department to offer more timely options for qualified individuals seeking entry into the country. Maintaining interview waivers after the official expiration of the COVID-19 health crisis allows experts to focus on essential cases requiring more in-depth scrutiny, thus bolstering the security of our immigration system.
Lifting COVID-related restrictions does not automatically imply that all embassies or consulates will be able to immediately manage pre-pandemic levels of visa applications. Adjusting staffing resources and infrastructure could take time, especially when considering additional constraints from dealing with limited operational capacities. In these cases, visa interview waivers can help alleviate undue stress on embassy operations while providing flexibility to consular officers.
Creating a Fair Work Ombudsman to Bolster Protections for Gig Workers
To increase protections for fair work, the U.S. Department of Labor (DOL) should create an Office of the Ombudsman for Fair Work. Gig workers are a category of non-employee contract workers who engage in on-demand work, often through online platforms, and they have historically been among the most vulnerable workers in the U.S. economy. A large portion of gig workers are people of color, and the nature of their temporary and largely unregulated work can leave them vulnerable to economic instability and workplace abuse. Currently, there is no federal mechanism to protect gig workers, and state-level initiatives have not offered thorough enough policy redress. Establishing an Office of the Ombudsman would provide the Department of Labor with a central entity to investigate worker complaints against gig employers, collect data and evidence about the current gig economy, and provide education to gig workers about their rights. There is strong precedent for this policy solution, since bureaus across the federal government have successfully implemented ombudsmen that are independent and support vulnerable constituents. To ensure its legal and long-lasting status, the Secretary of Labor should establish this Office in an act of internal agency reorganization.
Challenge and Opportunity
The proportion of the U.S. workforce engaging in gig work has risen steadily in the past few decades, from 10.1% in 2005 to 15.8% in 2015 to roughly 20% in 2018. Since the COVID-19 pandemic began, this trend has only accelerated, and a record number of Americans have now joined the gig economy and rely on its income. In a 2021 Pew Research study, over 16% of Americans reported having made money through online platform work alone, such as on apps like Uber and Doordash, which is merely a subset of gig work jobs. Gig workers in particular are more likely to be Black or Latino compared to the overall workforce.
Though millions of Americans rely on gig work, it does not provide critical employee benefits, such as minimum wage guarantees, parental leave, healthcare, overtime, unemployment insurance, or recourse for injuries incurred during work. According to an NPR survey, in 2018 more than half of contract workers received zero benefits through work. Further, the National Labor Relations Act, which protects employees’ rights to unionize and collectively bargain without retaliation, does not protect gig workers. This lack of benefits, rights, and voice leaves millions of workers more vulnerable than full-time employees to predatory employers, financial instability, and health crises, particularly during emergencies—such as the COVID-19 pandemic.
Additionally, in 2022, inflation reached its highest level in decades, and though the price of necessities has spiked, wages have not increased correspondingly. Extreme inflation hurts lower-income workers without savings the most and is especially dangerous to gig workers, some of whom make less than the federal minimum hourly wage and whose income and work are subject to constant flux.
State-level measures have as yet failed to create protections for all gig workers. California’s AB5, which took effect in 2020, legally reclassified many gig workers as employees instead of independent contractors, thus entitling them to more benefits and protections. But further bills and Proposition 22 reverted several groups of gig workers, including online platform gig workers like Uber and Doordash drivers, to independent contractor status. Ongoing litigation related to Proposition 22 leaves the future status of online platform gig workers in California unclear. In 2022, Washington State passed ESHB 2076, guaranteeing online platform workers (but not all gig workers) the benefits of full-time employees.
This sparse patchwork of state-level measures, which only supports subgroups of gig workers, could trigger a “race to the bottom” in which employers of gig workers relocate to less strict states. Additionally, inconsistencies between state laws make it harder for gig workers to understand their rights and gain redress for grievances, harder for businesses to determine with certainty their duties and liabilities, and harder for states to enforce penalties when an employer is headquartered in one state and the gig worker lives in another. The status quo is also difficult for businesses that strive to be better employers because it creates downward pressure on the entire landscape of labor market competition. Ultimately, only federal policy action can fully address these inconsistencies and broadly increase protections and benefits for all gig workers.
The federal ombudsman’s office outlined in this proposal can serve as a resource for gig workers to understand the scope of their current rights, provide a voice to amplify their grievances and harms, and collect data and evidence to inform policy proposals. It is the first step toward a sustainable and comprehensive national solution that expands the rights of gig workers.
Specifically, clarifying what rights, benefits, and means of recourse gig workers do and do not have would help gig workers better plan for healthcare and other emergent needs. It would also allow better tracking of trends in the labor market and systemic detection of employee misclassification. Hearing gig workers’ complaints in a centralized office can help the Department of Labor more expeditiously address gig workers’ concerns in situations where they legally do have recourse and can otherwise help the Department better understand the needs of and harms experienced by all workers. Collecting broad-ranging data on gig workers in particular could help inform federal policy change on their rights and protections. Currently, most datasets are survey-based and often leave out people who were not working a gig job at the time the survey was conducted but otherwise typically do. More broadly, because of its informal and dynamic nature, the gig economy is difficult to accurately count and characterize, and an entity specifically charged with coordinating and understanding this growing sector of the market is key.
Lastly, employees who are not gig workers are sometimes misclassified as such and thus lose out on benefits and protections they are legally entitled to. Having a centralized ombudsman office dedicated to gig work could expedite support of gig workers seeking to correct their classification status, which the Wage and Hour Division already generally deals with, as well as help the Department of Labor and other agencies collect data to clarify the scope of the problem.
Plan of Action
The Department of Labor should establish an Office of the Ombudsman for Fair Work. This office should be independent of Department of Labor agencies and officials, and it should report directly to the Secretary of Labor. The Office would operate on a federal level with authority over states.
The Secretary of Labor should establish the Office in an act of internal agency reorganization. By establishing the Office such that its powers do not contradict the Department of Labor’s statutory limitations, the Secretary can ensure the Office’s status as legal and long-lasting, due to the discretionary power of the Department to interpret its statutes.
The role of the Office of the Ombudsman for Fair Work would be threefold: to serve as a centralized point of contact for hearing complaints from gig workers; to act as a central resource and conduct outreach to gig workers about their rights and protections; and to collect data such as demographic, wage, and benefit trends on the labor practices of the gig economy. Together, these responsibilities ensure that this Office consolidates and augments the actions of the Department of Labor as they pertain to workers in the gig economy, regardless of their classification status.
The functions of the ombudsman should be as follows:
- Establish a clear and centralized mechanism for hearing, collating, and investigating complaints from workers in the gig economy, such as through a helpline or mobile app.
- Establish and administer an independent, neutral, and confidential process to receive, investigate, resolve, and provide redress for cases in which employers misrepresent to individuals that they are engaged as independent contractors when they are actually engaged as employees.
- Commence court proceedings to enforce fair work practices and entitlements, as they pertain to workers in the gig economy, in conjunction with other offices in the DOL.
- Represent employees or contractors who are or may become a party to proceedings in court over unfair contracting practices, including but not limited to misclassification as independent contractors. The office would refer matters to interagency partners within the Department of Labor and across other organizations engaged in these proceedings, augmenting existing work where possible.
- Provide education, assistance, and advice to employees, employers, and organizations, including best practice guides to workplace relations or workplace practices and information about rights and protections for workers in the gig economy.
- Conduct outreach in multiple languages to gig economy workers informing them of their rights and protections and of the Office’s role to hear and address their complaints and entitlements.
- Serve as the central data collection and publication office for all gig-work-related data. The Office will publish a yearly report detailing demographic, wage, and benefit trends faced by gig workers. Data could be collected through outreach to gig workers or their employers, or through a new data-sharing agreement with the Internal Revenue Service (IRS). This data report would also summarize anonymized trends based on the complaints collected (as per function 1), including aggregate statistics on wage theft, reports of harassment or discrimination, and misclassification. These trends would also be broken down by demographic group to proactively identify salient inequities. The office may also provide separate data on platform workers, which may be easier to collect and collate, since platform workers are a particular subject of focus in current state legislation and litigation.
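As a sketch of the aggregate reporting described in the final function above, the snippet below tallies hypothetical complaint records by category and demographic group; all field names and records are assumptions, not an existing DOL schema.

```python
# Aggregate anonymized complaint trends for the Office's yearly report.
# Record fields and values are hypothetical, not an existing DOL schema.
from collections import Counter

complaints = [  # hypothetical collected complaints
    {"category": "wage_theft", "demographic": "Black", "platform": True},
    {"category": "misclassification", "demographic": "Latino", "platform": True},
    {"category": "harassment", "demographic": "White", "platform": False},
    {"category": "wage_theft", "demographic": "Latino", "platform": True},
]

by_category = Counter(c["category"] for c in complaints)
by_group = Counter((c["category"], c["demographic"]) for c in complaints)
platform_share = sum(c["platform"] for c in complaints) / len(complaints)

print(dict(by_category))            # e.g., {'wage_theft': 2, ...}
print(f"platform workers: {platform_share:.0%} of complaints")
# Cross-tabulating category by demographic group (by_group) surfaces the
# salient inequities the yearly report is meant to identify proactively.
```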
Establishing an Office of the Ombudsman for Fair Work within the Department of Labor will require costs of compensation for the ombudsman and staff, other operational costs, and litigation expenses. To keep pace with rapid ongoing changes in gig economy platforms, a small portion of the Office’s budget should be set aside to support the appointment of a chief innovation officer charged with examining how technology can strengthen the Office’s operations. Examples of tasks for this role include investigating and strengthening complaint-sorting infrastructure, utilizing artificial intelligence to evaluate contracts for misclassification, and streamlining request-for-proposal processes.
Due to the continued growth of the gig economy and the precarious status of gig workers at the onset of an economic recession, this Office should be established in the nearest possible window. Establishing, appointing, and initiating this Office will take up to a year and will require budgeting within the DOL.
There are many precedents of ombudsmen in federal office, including the Office of the Ombudsman for the Energy Employees Occupational Illness Compensation Program within the Department of Labor. Additionally, the IRS established the Office of the Taxpayer Advocate, and the Department of Homeland Security has both a Citizenship and Immigration Services Ombudsman and an Immigration Detention Ombudsman. These offices have helped educate constituents about their rights, resolved issues that an individual might have with that federal agency, and served as independent oversight bodies. The Australian Government has a Fair Work Ombudsman that provides resources to differentiate between an independent contractor and employee and investigates employers who may be engaging in sham contracting or other illegal practices. Following these examples, the Office of the Ombudsman for Fair Work should work within the Department of Labor to educate, assist, and provide redress for workers engaged in the gig economy.
Conclusion
How to protect gig workers is a long-standing open question for labor policy and is likely to require more attention as post-pandemic conditions affect labor trends. The federal government needs a solution to the vulnerability and instability experienced by gig workers, and this solution needs to operate independently of legislation that may take longer to gain consensus on. Establishing an office of an ombudsman is the first step to increase federal oversight of gig work. The ombudsman will use data, reporting, and individual worker cases to build a clearer picture of how to create redress for laborers who have been harmed by gig work, providing greater visibility into the status and concerns of gig workers. It will additionally serve as a single point of entry for gig workers and businesses to learn about their rights and for gig workers to lodge complaints. If made a reality, this office will be an influential first step in changing the entire policy ecosystem regarding gig work.
There is a current definitional debate about whether gig workers and platform workers are employees or contractors. Until this issue of misclassification can be resolved, there will likely not be a comprehensive state or federal policy governing gig work. However, the office of an ombudsman would be able to serve as the central point within the Department of Labor to handle gig worker issues, and it would be the entity tasked with collecting and publishing data about this class of laborers. This would help elevate the problems gig workers face as well as paint a picture of the extent of the issue for future legislation.
Each ombudsman will be appointed for a six-year period, to ensure insulation from partisan politics.
States often do not have adequate solutions to handle the discrepancies between employees and contractors. There is also the “race to the bottom” issue, where if protections are increased in one state, gig employers will simply relocate to states where the policies are less stringent. Further, there is the issue of gig companies being headquartered in one state while employees work in another. It makes sense for the Department of Labor to house a central, federal mechanism to handle gig work.
The key challenge right now is for the federal government to collect data and solve issues regarding protections for gig work. The office of the ombudsman’s broadly defined mandate is actually an advantage in this still-developing conversation about gig work.
Establishing a new Department of Labor office is no small feat. It requires a clear definition of the ombudsman’s goal and permitted activities, as well as buy-in from key DOL bureaucrats. The office would also have to recruit, hire, and train staff. These tasks may slow the proposal’s launch. Since DOL plans its budget several years in advance, this proposal would likely be targeted for the 2026 cycle.