Training for Safety and Success: Research & National Minimum Training Standards for Law Enforcement
Summary
Law enforcement is a highly visible profession where, without effective training, the safety of both officers and community members is at risk. Officers regularly respond to calls for service with uncertain risk factors and must balance that work with proactive activities that improve community well-being. Nationally, mandated training hours for new law enforcement officers are consistently fewer than those required for cosmetology licensure, and training quality and requirements vary significantly by state. Nearly three-quarters of states allow officers to work in a law enforcement function before completing the basic academy. Public trust and safety are placed in the hands of law enforcement officers even when they lack the training, skills, and knowledge to be successful. Policing practices are regularly reshaped by failures covered in national media, yet those shifts are rarely institutionalized in basic training.
To make communities safer and law enforcement officers more successful, the Biden-Harris Administration should fund research on the effectiveness of law enforcement training and create a national minimum standard for entry-level academy training, further supporting the Safer America Plan. The 2022 Executive Order on Advancing Effective, Accountable Policing and Criminal Justice Practices to Enhance Public Trust and Public Safety focuses on strengthening trust between communities and law enforcement officers, including through training and equitable policing. The Department of Justice should oversee this research, and the Departments of Homeland Security, Labor, and Commerce can help create national standards and minimum training recommendations. Based on the research findings and on pedagogical approaches that produce the most effective learning, an interdisciplinary federal task force will recommend minimum national training standards. Training can drive change in law enforcement, improve community-police relations, and reduce liability while advancing community safety.
Challenge and Opportunity
Law enforcement actions have widespread implications due to the immense power and inherent risks associated with the position. The profession is marked by complexity and unpredictability, further challenged by broad discretionary authority and varied training requirements. Basic academy training is the foundational coursework for learning laws and ethics, technical skills for core law enforcement functions, soft skills, and critical thinking under stress. However, more emphasis is placed on didactic instruction and practical exercises than on the cognitive, emotional, and social skills officers can use to safely de-escalate situations. Even with these known insufficiencies, academy training topics and hours are rarely updated. Training requirements and pedagogical approaches administered by peace officer standards and training commissions or similar oversight bodies generally require legislative action to update curriculum standards, taking significant time and resources to enact change.
In 2015, President Obama's Task Force on 21st Century Policing highlighted the need for training and education, citing that law enforcement officers (LEOs) must be highly skilled in many operational areas to meet a wide variety of challenges and rising expectations. The Biden-Harris Administration has vowed to advance effective, accountable policing through the Safer America Plan, noting that change at the local and state level requires congressional action. The Safer America Plan would provide funding for 100,000 additional LEOs, all of whom will require training to be effective in their roles. Academy training requirements are not regularly collected or monitored at the federal level, and research is not routinely conducted to show the efficacy of the training provided. The lack of data on law enforcement activity further complicates the training process: time spent on patrol is not regularly cataloged and reviewed to determine where officers spend most of their time. Such data could guide training decisions and allow hours to be reallocated toward the most commonly used skill sets.
There is no national training standard for LEOs: state requirements for the basic academy vary from 1,345 hours in Connecticut to zero hours in Hawaii. The basic academy provides future LEOs foundational knowledge and skills in law, defensive tactics, report writing, first aid, communication, and other critical areas. The average length of basic training is 833 hours, with an average of 73 hours dedicated to firearm skills and 18 hours to de-escalation techniques. While firearm familiarization and skill are of utmost importance, given the fatal consequences of not understanding the weapon and one's own ability, officers discharge firearms far less often than they use de-escalation and other communication techniques. When not used regularly, skills become perishable, and the lack of regular training on topics like firearms and traffic stops can reduce an LEO's efficiency, response time, and safety. The 2022 Executive Order on Advancing Effective, Accountable Policing mandates that federal LEOs receive training with clear guidance on use-of-force standards and implicit bias, but these basic tenets of policing are not required of state and local law enforcement.
Thirty-seven states allow LEOs to work before completing a basic training academy. The time LEOs can work before receiving basic training ranges from 3 months in West Virginia to 24 months in Mississippi. Providing a uniform and a firearm to an untrained person who will interact with the community from a position of power poses obvious dangers to both LEOs and the public. Figure 1 shows the ranges of when the basic academy is required of new LEOs.
With the basic academy averaging 833 hours, or about 21 weeks of full-time instruction, the timeframe may seem sufficient for training new law enforcement officers. However, mastering a new skill commonly takes at least six months, and the academy requires many new skills to be developed simultaneously. The minimum basic academy requirement in California is 664 hours, though training commonly exceeds 1,000 hours. By contrast, earning a cosmetology license in California carries a higher minimum than the basic police academy, with cosmetology and barber training requiring 1,000 hours for state licensure. While injuries can occur in cosmetology, the profession is inherently safer for both practitioner and client.
FBI Director Wray noted a 60% increase in murders of law enforcement officers in 2021, explicitly observing that violence against law enforcement officers does not receive the attention it should. Of the 245 LEOs who died in the line of duty in 2022, 74 were feloniously killed, up from 48 in 2019. In 2022, 1,194 people were killed by LEOs, 101 of whom were unarmed. Black people are disproportionately killed by LEOs, at nearly triple the rate of their share of the population. Statistics on community members killed do not differentiate between legally justified and illegal uses of force, so a true picture of potential training deficiencies versus ethical violations cannot be drawn.
Recognizing the insufficiencies of current LEO training creates opportunities for data-driven improvements. Research is needed to determine the efficacy of basic academy training in each state, with cross-state comparisons informing an overall recommendation for minimum national standards. Innovation should be encouraged when developing future training standards, as basic academy training has not embraced technology or newer learning techniques that may aid practical decision-making and skill mastery.
Plan of Action
Training can be used to implement vital reforms in law enforcement, potentially saving lives. A multipronged, transparent approach is needed to determine the efficacy of current training before introducing innovation and minimum training standards. Multiple agencies will need to collaborate to complete the evaluation and craft recommendations, incorporating inclusive views through multifaceted lenses and coordinating future actions. Transparency of the research and its goals, including making findings available on public-facing websites, is needed for accountability and to foster trust in the process of improving law enforcement. Additional detail on the proposed agencies and their roles is provided below.
Recommendation 1. Fund research for current LEO training and efficacy
Before overhauling training, data are needed to establish a baseline of training in each state, including its perceived efficacy among stakeholders. The DOJ should create and administer competitive grants to evaluate current training in every state and territory and to conduct surveys, interviews, and focus groups with stakeholders to determine the impact of training. Use-of-force incidents, accidents, LEO decertifications, and other indicators of potential training deficiency should be examined for additional insight into effectiveness.
Research should also be conducted on fatal and accidental duty-related incidents to determine the human and other contributing factors. Data and trends gained from the research should be incorporated into minimum training standards to reduce future errors. Competitive grants can be provided to evaluate potential root causes of duty-related fatal and accidental deaths.
A key component of the research phase will be bringing the researchers together to discuss findings, regional and national trends, and recommendations. Creating a formal networking process will allow for best practices to be shared across all states/territories participating and made available to all LEO training commissions.
Recommendation 2. Spark innovation from adult learning experts and practitioners for LEO training
Through a competitive grant process, the DOJ's Office of Justice Programs can advertise funding opportunities and outline the application process. Grants focused on collaboration between practitioners and adult learning experts, potentially through practitioner-higher education partnerships, can bring together field experience and adult learning expertise. Curriculum designers should consider immersive or simulation-based training experiences and the use of technology in training. They should also consider redesigning the rigid paramilitary format to encourage LEOs to use critical thinking, improve adaptability, and hone communication skills. Using Challenge.gov can also provide additional insights from the community.
Recommendation 3. Create national minimum standards for LEO basic academy training
Using the recommendations from the state law enforcement training researchers, the fatality-factor researchers, practitioner and adult learning experts, the Federal Law Enforcement Training Centers (FLETC), and the Department of Labor (DOL), the National Institute of Standards and Technology (NIST), DOJ, DHS, DOC, and DOL should compile national minimum standards. Requirements for academy instructors will also need to be established, including training program requirements and regular reviews of instructor performance and impact. NIST will use the information gathered, including contemporary training topics and a focus on adult learning techniques, to create a draft standard. The research teams and the public will have an opportunity to comment on the draft standards; NIST will then adjudicate the comments before sending the standards to a standards development organization (SDO) for a quality peer review.
The DOJ's Office of Justice Programs will offer grants to all interested state LEO training bodies to adhere to the national minimum standard, with funding for planning, implementation, and evaluation of the project. Grants should require a three-year implementation timeline to ensure that trainees receive training before their first day on the streets and that the basic academy meets the minimum national requirements.
Recommendation 4. Evaluate curricular changes alongside environmental changes
Grant funding for the planning and implementation should extend an additional two years for the evaluation component. Evaluators chosen during the grant process can review how well training adheres to the national standards across all academies in the state, LEO feelings of preparedness upon graduation and quarterly thereafter for up to two years, and supervisor/administrator feedback on LEO performance after the academy. Deidentified records of unjustified use of force, decertification, and criminal actions can be reviewed for additional insight into the effectiveness of basic academy training.
An overall program evaluation will also be needed, including review of the state evaluations and of the overall administration of the project. The grant can be open to one or multiple organizations, with selection and funding provided by DOJ's Office of Justice Programs. Competitive grant funding of up to $5 million should be awarded for the six-to-eight-year evaluation.
Budget Proposal
A budget of $125 million is proposed to evaluate current LEO training, develop minimum requirements, and evaluate the implementation. The primary research determining current LEO basic academy training and its efficacy requires $500,000 per state/territory for one researcher or research group, totaling $28 million across the 56 states and territories.
For the adult learning and practitioner component, competitive grants of up to $300,000 each should fund up to 10 collaborations, totaling $3 million. FLETC and DOL can be funded for their participation in the minimum standard creation at $1 million each, totaling $2 million.
Each state LEO training commission should be eligible to receive up to $2 million to plan, implement, and evaluate the minimum training standards. If all states and territories participate, this funding will total $112 million.
An evaluation of the entire program will be conducted for $5 million over the six to eight years of expected evaluative work. The final report will be provided to the DOJ to determine whether performance metrics were met.
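The line items above can be tallied directly. Below is a minimal sketch, assuming all 56 states and territories participate and every grant is awarded at its ceiling; note that this full-participation ceiling ($150 million) exceeds the $125 million headline, which presumably nets out partial participation or sub-ceiling awards:

```python
# Tally of the proposed budget line items (figures taken from this proposal).
# Assumes all 56 states/territories participate and every award hits its ceiling.
line_items = {
    "State training efficacy research (56 x $500k)": 56 * 500_000,
    "Adult learning/practitioner grants (10 x $300k)": 10 * 300_000,
    "FLETC and DOL participation (2 x $1M)": 2 * 1_000_000,
    "State implementation grants (56 x $2M ceiling)": 56 * 2_000_000,
    "Overall program evaluation": 5_000_000,
}
for name, cost in line_items.items():
    print(f"{name}: ${cost / 1e6:.0f}M")
print(f"Full-participation ceiling: ${sum(line_items.values()) / 1e6:.0f}M")  # $150M
```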
Conclusion
The national LEO training standard is meant to be a floor for states and does not remove the oversight of state peace officer training commissions. Every LEO should complete a basic academy and field training before serving the community, to ensure they can be safe and effective in their roles. Developing innovative training techniques can help increase skills and understanding of vital topics while refining critical thinking in high-stress situations. Minimum training standards can improve safety for the public and first responders, reduce ethical and criminal violations by LEOs, and help repair community-police relationships.
Frequently Asked Questions
Can the federal government mandate minimum training standards for state and local law enforcement?
No. The 10th Amendment restricts the federal government from mandating standards, but federal grant funding can be withheld from states that do not meet the minimum training requirements. Precedent was set with DOJ's Community Oriented Policing Services grants, which restrict federal funding if an agency's use-of-force policy does not adhere to federal, state, and local laws.
Why would states adopt a national minimum standard?
States can update their training requirements at will and may be incentivized with federal grant funding rather than waiting for unfunded and under-resourced local attempts. Change involving many or all states can create pressure to conform to minimum requirements, where today there is little pressure and no financial incentive.
What related legislation has Congress considered?
In December 2022, the House passed S. 4003, the Law Enforcement De-Escalation Training Act of 2022. The bill provides $34 million to the Department of Justice to fund scenario-based de-escalation and use-of-force training for responding to individuals experiencing a mental health, suicidal, or behavioral crisis.
Stemming from the deaths of two unarmed Black men, H.R. 1280 and H.R. 1347 requested additional training and standards to reduce excessive force by LEOs. H.R. 1280 passed the House, while H.R. 1347 was introduced in the House and has seen no action since 2021.
How do U.S. training requirements compare internationally?
LEO training requirements in the United States are among the shortest in the world: France trains LEOs for 10 months (1,600 hours), Scotland's basic training lasts 92 weeks (3,680 hours), India's lasts 2.5 years (5,400 hours), and Finland's lasts three years (6,240 hours), with an additional year of field training.
What about training after the basic academy?
Most states require continuing education or professional development. Hawaii has no LEO training requirements, and New Jersey law states that agencies may provide in-service training without hourly requirements. Once minimum standards for basic training are implemented, national minimum mandatory annual continuing education or professional development requirements can be developed.
How will the efficacy of current training be determined?
The first recommendation requests funding to assess the current efficacy of law enforcement training in every state. The multistage research would include interviews, surveys, and focus groups with stakeholders to determine training perceptions and impact, with comparisons drawn from data on use-of-force incidents, officer decertifications, accidents, fatal incidents, and other areas of potential training deficiency.
Protecting Consumers by Reforming Food Labeling Regulations
Summary
The Biden-Harris Administration has consistently prioritized consumer protection, invigorating rural communities, and natural technologies that address climate change. These three priorities are embodied in this proposal, which presents an opportunity for a bipartisan win-win. Agriculture directly connects rural Americans with urban ones and is central to practical climate solutions. But as biotechnology advances, consumers face a myriad of new ingredients and labels to parse at the supermarket. These labels, including 'organic' and 'non-GMO,' can often be confusing. There are competing views about the proper regulatory framework to provide the best nutrition to the most citizens at the lowest possible cost while respecting the environment. Comprehensive food labeling reform can help consumers avoid deceptive marketing and allow farmers and grocers to compete fairly. It can also be a tool to leverage the marketplace toward climate-friendly solutions.
There are two possible approaches to implementing this reform. The best alternative would be to pass legislation that expands the bioengineered (BE) labeling program, enhances the labeling authority of USDA, strengthens Truth-in-Advertising laws, and provides a legal framework for addressing misleading claims across federal agencies. Alternatively, the Federal Trade Commission (FTC) is already empowered to enforce existing Truth-in-Advertising laws and can use this authority to reinforce USDA's existing labeling programs, ensuring that consumer information aligns with scientific evidence.
Challenge and Opportunity
In the past 50 years, the idea of "health foods" has gone mainstream. Despite the lack of hard scientific evidence, the term has morphed from denoting foods that help individuals avoid diet-related diseases into a marketing device for foods that claim to help every American live healthier. This shift in the market has also generated healthy profit margins for certain grocery retailers. But the distinction is more than marketing: most physicians now agree that there is a strong, evidence-based relationship between diet and disease. For example, scientific communities agree that specific ingredients like saturated fats can affect health. To ensure consumers can make informed choices about these ingredients, their presence is explicitly listed on the FDA's nutrition labels.
Unfortunately, zealous proponents of health foods have gone beyond advocating for ingredients the medical establishment deems "healthy." Foods whose heritage can be traced to intentional genetic modification in a modern laboratory are ominously labeled "genetically modified organisms" (GMOs). Although this label has taken on a negative connotation, it is simply a descriptor and, by itself, cannot convey whether a product is "healthy." Such labeling is like singling out children born through modern in vitro fertilization and treating them differently from children conceived "naturally"! Conflating the nutritional composition of food with its genetic heritage allows marketers to extract a premium for foods labeled "non-GMO" while failing to acknowledge the actual health benefits of some GMOs.
In 2016, Congress established the National Bioengineered Food Disclosure Standard (NBFDS), a federal law that mandates "BE" labeling for bioengineered foods: foods containing genetic material that is not obtainable through conventional breeding and was added using in vitro recombinant DNA techniques. The law empowers USDA to specify whether ingredients should be labeled BE depending on their supply chains and to define analytical tests that establish whether labeling is necessary. These analytical tests allow the agency to define bioengineered products precisely. While GMO and BE foods may overlap, the two labels are inconsistent, with different criteria specified by different organizations.
Science has weighed in on GMO/BE foods: numerous studies have shown no health risks associated with their consumption. Indeed, bioengineering improves the nutritional content of some foods. For example, low linoleic acid canola oil has less trans fat, a dietary component associated with increased rates of heart disease. In such cases, the nutritional differences are reflected on food labels following FDA guidelines. In addition, bioengineering can reduce the amount of agricultural chemicals needed to prevent spoilage, reducing potentially toxic residues and food waste. But marketers of "health(y) foods" have spent millions to support "non-GMO" labels that are unrelated to health while continuing to sow irrational fears to help maintain their margins.
To make matters worse, marketers have added to the confusion by labeling certain foods with another vague descriptor: "organic." Organic farming is a cultivation practice that avoids synthetic pesticides and artificial fertilizers; it describes how crops are grown, not what is grown. But even the USDA's National Organic Program (65 FR 80547, 12/21/2000) conflates the two, specifying that even animals fed GMO feed cannot be labeled USDA Organic! From a scientific perspective, it is inaccurate to consider any GMO an "ingredient," because the genes themselves are present in minuscule amounts and can be fully digested. The changes are in the code, not the composition; the genes are made of natural building blocks, as are the proteins they produce.
Further, because farm animals digest food into these components, any "pass-through" of GMO characteristics would require extraordinary proof. While it is impossible to prove a negative, there is no evidence of adverse consumer reactions (even among those with severe food allergies) to GMOs themselves. For this reason, USDA's BE designation expressly excludes animals fed bioengineered foods (NBFDS, Sec. 293(a)(2)(A)). The current regulatory regime around bioengineered foods, organic farming, and GMOs is inconsistent and requires reform. Consumers deserve objective and relevant information about the foods they consume, but current sources of information can be inaccurate or incomplete. As consumers have become more health- and origin-conscious, corporations have seized on this awareness to promote their products. Unfortunately, health(y) food marketers often use scientifically tenuous and potentially deceptive labels. Corporations fund academic researchers and non-governmental organizations to conduct nominally independent research that legitimizes these marketing messages, often as philanthropic, tax-advantaged donations.
While such funding is not necessarily nefarious, it can confuse consumers and undermine more trusted and objective sources of nutrition information – federal agencies. The Government’s responsibility is to provide accurate ratings that support fact-based competition. Free and fair competition in the marketplace has long been the objective of Federal regulations. While corporations should be allowed to differentiate their goods in the eye of the consumer, they shouldn’t be allowed to instill irrational fear of health hazards lacking robust scientific support. This is not unique to the agricultural industry – in fact, it is the core of the regulatory framework for pharmaceuticals.
Corporations currently exploit this hodgepodge rating system, but it can be improved through government regulation. As shown in the figure, surveys show that U.S. consumers trust government ratings more than any other source except "experts" and find such ratings more understandable, particularly compared with those expressed by experts.
There is an opportunity for regulatory improvement in the food labeling space, both legislatively and through executive action. Because USDA labeling covers agricultural food sources (including the Bioengineered and Organic labels), adding a non-bioengineered (non-BE) label would further enable consumers to make an informed choice. The dissonance between the BE and USDA Organic labels should also be resolved by removing the prohibition on BE/GMO sources as a condition of Organic labeling; however, this issue must be corrected legislatively.
Furthermore, because of the significant market advantages gained by advertising unsubstantiated health claims, market players have taken to the courts: dozens of lawsuits have been filed against USDA attempting to force the Department's labels to support spurious health claims, exploiting ambiguities in the legal definitions of both "organic" and "bioengineered." Affirming that USDA is empowered by statute to determine specific criteria for its own labels when legislative language is ambiguous will help negate any claims to the contrary.
Plan of Action
Food labeling is central to the flow of accurate and unbiased information from farm to table. Currently, two agencies are primarily responsible for food labeling, USDA and FDA (under HHS), and one independent agency, the FTC, is responsible for truth in advertising. These responsibilities are split: USDA covers farm products, FDA covers nutrition, and FTC prosecutes false advertising. The recommended actions below will improve coordination among these agencies, produce a more uniform response to labeling issues, and increase consumer confidence in and knowledge of the food they purchase.
Because food labels are often relied upon during purchase decisions in the grocery aisle, the Bioengineered Food Labeling Standard established in 2016 and mandated in 2022 should be strengthened. Specifically:
Congress should pass legislation removing redundancy in USDA’s Organic and BE labeling requirements.
Although this may be a longer-term solution, the current regulatory regime is confusing and conflates agricultural methods with content. Congress should take up this issue in future Farm Bills and appropriations cycles and develop clear, mutually exclusive legal definitions. This will create more transparent labels for consumers and lead to more explicit decisions by the judiciary in marketing lawsuits.
USDA’s Agricultural Marketing Service should certify a non-Bioengineered label.
AMS currently oversees the assignment of BE labels. Through independent laboratory analysis, the agency should also offer firms a service to certify a non-BE label using the NBFDS criteria. USDA already has analytical laboratories and staff conducting spot inspections of meat producers; these capabilities could be leveraged to confirm a non-BE label. In addition, producers who wish to label their goods "non-BE" would be willing to pay an evaluation fee comparable to the fees paid to non-governmental certification agencies, so the budgetary impact should be minimal. Alternatively, because the NBFDS establishes methods that can be performed in certified testing facilities, USDA's resources could be deployed to spot-check claims. Further, because non-BE labeling would not be mandatory, producers can choose to remain silent on the content of their goods if their bioengineered content is unknown.
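As an illustrative sketch only (the function, inputs, and categories below are hypothetical, not AMS policy), the voluntary certification decision reduces to simple logic over an NBFDS-style analytical test result plus supply chain documentation:

```python
# Hypothetical sketch of the voluntary non-BE certification decision. The
# function, inputs, and categories are illustrative only, not AMS policy.
def label_decision(rdna_detected: bool, supply_chain_documented: bool) -> str:
    """Classify a product sample submitted for voluntary non-BE certification."""
    if rdna_detected:
        return "BE disclosure required"  # detectable modified genetic material
    if supply_chain_documented:
        return "certified non-BE label"  # test negative, supply chain verified
    return "no claim"  # producer may remain silent if BE content is unknown

print(label_decision(rdna_detected=False, supply_chain_documented=True))
```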
Any ingredients with known health benefits should appear on the FDA nutrition label, and any marketer that uses Organic or non-GMO labeling without adhering to USDA's requirements should be prosecuted for false advertising.
For budgetary purposes, USDA’s Animal and Plant Health Inspection Service (APHIS) and its Food Safety and Inspection Service (FSIS) are allocated approximately $1.7B and $1B, respectively. Additional staffing needs would likely be minimal because spot inspections of manufacturing facilities are already part of their routine.
The Federal Trade Commission (FTC) should increase enforcement of ‘Truth-in-Advertising’ regulations to prosecute improperly labeled Organic or non-GMO foods.
Agencies could also support a more coordinated approach to consumer protection by prosecuting improperly labeled Organic or non-GMO foods. While USDA would maintain responsibility for conducting spot inspections, the FTC would be responsible for enforcing against transgressions under False Advertising Laws.
There is already precedent for this type of enforcement. Between 2003 and 2010, FTC successfully removed spurious health claims made by POM Wonderful, a marketer of pomegranate juice and related products, despite a vigorous appeal mounted by the company. While this example rejected false advertising based on specific health claims, it could also be extended to false advertising based on general health claims.
Conclusion
This proposal presents a more coordinated framework for food labeling regulations and would have wide-ranging effects. Among the stakeholders are farmers (both large and small), national grocery chains, food processing companies, agricultural biotechnology companies (particularly those that use laboratory-derived technologies that do not result in a “Bio-Engineered” label), and alternative protein companies that create consumer goods using processes developed in laboratories (e.g., Impossible Foods). In addition, various organizations, such as the Biotechnology Innovation Organization (BIO), have filed amicus briefs in lawsuits that target USDA labeling. There is significant interest in improving the current system.
In addition to providing the protection that consumers deserve, this proposal has health and climate impacts. Nutrition and health are tightly linked, and consumers know which foods are likely to aggravate their own health outcomes. Accurate labeling empowers consumers to make decisions that fit their individual needs rather than rely on the unsupported belief that non-BE foods are more nutritious. Constraining both seed and method to organic, non-GMO standards can have a demonstrable negative impact on the climate mitigation capabilities of agricultural practices.
As noted above, language specifying that the use of seeds descended from laboratory methods of genetic modification anywhere in the chain precludes organic production methods should be eliminated. The distinction can be communicated more accurately using two labeling permutations, "organic & BE" and "organic & non-BE."
Frequently Asked Questions
Is there a precise definition of a GMO?
No. The Non-GMO Project (the NGO responsible for certifying the labels) has extensive, published criteria that suggest there is a precise definition of a GMO. Unfortunately, there isn't one: it is a gray area whose definition is scientifically imprecise, to the extent that it is defined differently in the US than in the EU, for example. In particular, the Project's definition is so broad that any food the Project determines to be "unnatural", including processes and products traced to the use of a genetically modified organism, can be denied a label. In contrast, the USDA's BE label is scientifically precise, resting on an analytical criterion that can be objectively determined in the laboratory.
How many foods currently labeled non-GMO would be excluded from a non-BE label?
Probably none, though it is hard to tell because, as mentioned above, the Non-GMO Project's labeling criteria are subjective. Under their criteria, determining whether a new food is a GMO is intrusive and requires surveillance of its entire development path. In contrast, determining a BE label requires inspection (much like the USDA's meat grades), albeit in a laboratory setting.
Because the Non-GMO Project label includes processes and derivatives, foods such as the plant-based Impossible Burger, which the Project disqualifies from its labeling because the process involves a GMO, could nevertheless be labeled non-BE. (A GMO is used to create the meat flavor of the protein, which is purified before blending.)
Could the Non-GMO Project play a role under this proposal?
Of course! Because they already monitor new GMOs, this non-profit can help guide USDA inspectors to foods that should be labeled BE but are not. In addition, they can help develop analytical procedures to ascertain whether a given food product is, in fact, BE.
How does this proposal relate to climate change?
Agriculture is a globally significant enterprise that can both capture and release the greenhouse gases responsible for global warming. Under the current scheme, improving the efficiency of agricultural practices through GMO processes is discouraged because of the stigma. Innovations such as PivotBio's enhanced nitrogen-fixation organism (a GMO that reduces the amount of fertilizer needed) may be avoided by farmers because of a fully justified fear of how their products would then be labeled.
Health Care Coverage for the Incarcerated Population to Reduce Opioid-Related Relapse, Overdose, and Recidivism Rates
Summary
Untreated substance use disorders (SUDs) are common among those who pass through the criminal justice system. At both the state and federal levels, re-entry into the community is a critical period for these individuals. Preventing opioid relapse and potential overdose post-release can reduce recidivism and improve an individual's life after incarceration. Medication-assisted treatment (MAT) for opioid use disorders (OUDs) can help some individuals sustain recovery. However, many barriers interfere with the delivery of medication: cost, accessibility, and distribution are difficult to overcome, along with a shortage of professionals trained to prescribe medications for OUDs.
To address this facet of the growing opioid crisis, the United States Department of Justice (DOJ) and the Centers for Medicare & Medicaid Services (CMS) should expand access to medications for OUDs (MOUDs) and train professionals to prescribe them. Additionally, incarcerated individuals with an OUD should receive intensive case management that continues through reintegration into society. Finally, Medicaid coverage should be available upon release so individuals can continue treatment and successfully reenter their communities. Together, these measures will help reduce the risks of recidivism, opioid-related relapse, and overdose during reintegration.
Challenge and Opportunity
Approximately 65% of the United States prison population has a substance use disorder, and an estimated 17% of those detained in state and federal prisons who meet the criteria for a substance use disorder have an opioid use disorder specifically. Repeated drug use causes a person to grow physiologically reliant on the drug, requiring more to achieve the desired effect, a phenomenon known as increasing tolerance. Individuals with an OUD lose their tolerance while incarcerated, which puts them at greater risk of overdose mortality upon release: within two weeks of release from jail or prison, the risk of death from overdose is more than 12 times that of the general population. A meta-analysis determined that MOUDs provided during incarceration increased post-release treatment involvement and reduced post-release opioid use. Similarly, a randomized controlled trial in a Baltimore pre-release prison setting showed that those who began methadone therapy and counseling while in prison were more likely to continue treatment post-release and had lower rates of opioid use and reoffending over six months than those who received counseling only.
Methadone, buprenorphine, and naltrexone are the MOUDs authorized by the Food and Drug Administration for the treatment of OUDs, and research has demonstrated that MOUD, particularly methadone and buprenorphine, is an effective treatment. However, MOUD distribution in criminal justice settings is low: only 3.6% of incarcerated individuals with OUD across the United States were prescribed and administered buprenorphine. According to the Pew Charitable Trusts and the Substance Abuse and Mental Health Services Administration (SAMHSA), just 14 states administered at least one MOUD, 39 states provided naltrexone in jail or prison settings, and only one state (Rhode Island) provided all three MOUDs. Increasing MOUD administration in carceral settings and after release across the United States is critical to reducing opioid overdose deaths.
Rhode Island’s Approach to Opioid Use Disorder Treatment
The Rhode Island Department of Corrections (RIDC) is the first correctional system to launch an extensive program to screen individuals for an OUD upon entry, offer all three MOUDs to eligible incarcerated individuals, and continue treatment post-release. The RIDC MAT program provides incarcerated individuals with access to MOUDs and counseling during incarceration, and it provides linkage to care after release through a partner non-profit organization, Community Organization for Drug Abuse Control (CODAC) Behavioral Healthcare. Together, RIDC and CODAC have established a successful pipeline for continuing MAT post-incarceration: prior to an individual's release date, CODAC develops a re-entry strategy with the assistance of case management and care providers. As a result, Rhode Island's statewide overdose fatalities decreased by 12% in the first year of the program's adoption, while post-incarceration overdose deaths decreased by 61%. Reduced post-incarceration overdose mortality also yields approximately $7,300 in additional personal income per individual for each extended year of life. Other states have looked to Rhode Island's MAT program as a model for treating OUDs during and after incarceration and for reducing recidivism.
Challenges for Implementation
Despite these strong results, challenges remain.
Opioid use treatment and services are covered by health insurance under the Mental Health Parity and Addiction Equity Act (MHPAEA) of 2008. However, healthcare for incarcerated individuals is operated entirely by the state, which contributes to the above-mentioned disparities in access to drug therapies and counseling, but only while they are incarcerated. As individuals transition back into society, if they do not have health insurance to pay for MOUDs or other rehabilitation treatments, they lose treatment and face an increased likelihood of relapse.
The Medicaid Inmate Exclusion Policy under the Social Security Act prohibits Medicaid coverage during incarceration, making it difficult for formerly incarcerated individuals to acquire healthcare upon release and thus to access MOUDs. The majority of these individuals qualify for Medicaid upon release, since they are low-income and fall below the federal poverty line. In 2018, Congress provided waiver opportunities for CMS to connect individuals recently released from jail or prison to healthcare at the state level, but not federally.
Medicaid Section 1115 Waivers
To close this gap, states are waiving the Medicaid Inmate Exclusion Policy and providing Medicaid coverage for incarcerated individuals upon release by filing Section 1115 waivers. A Section 1115 waiver is a provision of the Social Security Act that grants the Secretary of Health and Human Services the authority to waive specific requirements of the Medicaid program. Section 1115 waivers offer states the flexibility to design and implement innovative approaches to enhance access to healthcare. To obtain approval, states must submit proposals outlining their proposed changes and demonstrate that the waiver will not increase federal government expenditures over the waiver period. Once approved, the waiver permits the state to operate its Medicaid program with modified, exempted, or alternative requirements. For instance, approved Section 1115 waivers from New Hampshire and Utah enable the expansion of healthcare coverage to incarcerated individuals: under these waivers, incarcerated individuals are granted full Medicaid coverage for care coordination and provider services beginning approximately one month prior to release.
Plan of Action
The Biden Administration has urged states to submit Section 1115 waivers proposing options for expanding coverage in order to reduce health disparities, remove barriers to MOUD treatment access, and find long-term solutions to OUDs. It is imperative that the federal government prioritize reducing relapse and opioid overdose mortality during incarceration and post-release in order to reduce recidivism. The DOJ, CMS, and SAMHSA should collaborate to develop a pipeline that expands training across professions, makes MOUDs more accessible to correctional facilities, and ensures healthcare coverage post-release.
Recommendation 1. Compare and contrast the Section 1115 waivers submitted by states to document their advantages and encourage remaining states to apply.
A root of the issue is the failure to provide pre-release healthcare coverage to incarcerated individuals so that their coverage continues post-release. Hence, it is important to increase access to healthcare post-release by having states apply for Section 1115 waivers that propose measures to assist incarcerated individuals in obtaining coverage. Currently, 35 states have approved Section 1115 waivers on file. Collecting data on these states would provide insight into how the waivers reduce recidivism and overdose rates. The Agency for Healthcare Research and Quality (AHRQ) should issue an open call for evidence synthesis on the impacts of Section 1115 waivers, aiming to produce a comprehensive analysis of the outcomes of their implementation. This initiative would contribute to evidence-based decision-making and further enhance understanding of the implications of Section 1115 waivers for healthcare. Examples of data that could be collected to assess the success of Medicaid resources include the following (an illustrative analysis sketch appears after the list):
- Overdose mortality rates between those who have Medicaid and those who do not
- Post-release opioid-related reoffending
- Economic impact such as quality-adjusted life years gained.
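As a minimal sketch of the first comparison above, and assuming researchers can assemble post-release follow-up data from corrections records linked to Medicaid claims, a simple rate-ratio calculation might look like the following (all field names and figures are hypothetical placeholders):

```python
# Illustrative sketch: compare post-release overdose mortality between states
# with and without approved Section 1115 waivers. All field names and figures
# below are hypothetical placeholders, not actual CMS or state data.

def mortality_rate_per_1000(deaths: int, person_years: float) -> float:
    """Overdose deaths per 1,000 person-years of post-release follow-up."""
    return 1000 * deaths / person_years

# Hypothetical aggregates a researcher might assemble from corrections
# records linked to Medicaid claims.
waiver_states = {"deaths": 120, "person_years": 40_000.0}
non_waiver_states = {"deaths": 310, "person_years": 55_000.0}

rate_w = mortality_rate_per_1000(**waiver_states)
rate_nw = mortality_rate_per_1000(**non_waiver_states)
print(f"Waiver states:     {rate_w:.2f} deaths per 1,000 person-years")
print(f"Non-waiver states: {rate_nw:.2f} deaths per 1,000 person-years")
print(f"Rate ratio (waiver / non-waiver): {rate_w / rate_nw:.2f}")
```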
Once the data have been gathered, it is essential that the dataset be made publicly accessible to researchers. The dataset can be published on the CMS data website, enabling widespread access and utilization. This accessibility will allow researchers to examine the significance of reducing overdose-related fatalities after incarceration and assess how expanding Section 1115 waivers could contribute to achieving this reduction.
Recommendation 2. Increase opioid treatment program accessibility during and after incarceration.
Rhode Island's MAT program has proven effective in reducing opioid overdose deaths, and a replication of the program in a Massachusetts correctional facility has improved OUD treatment and reduced opioid-related relapses and deaths. To provide intensive case management when individuals come into contact with the criminal justice system and to adequately rehabilitate them, correctional facilities should use a method similar to that of Rhode Island's Department of Corrections MAT program. Because correctional facilities and licensed professionals must be accredited by the DEA and SAMHSA to provide MOUDs, individuals can access MOUDs during and after incarceration only through certified opioid treatment programs (OTPs). Thus, the DOJ and SAMHSA should collaborate with CODAC and similar organizations to increase OTP accessibility in correctional facilities and after release. These organizations can help create a re-entry treatment plan during incarceration that continues after release, giving incarcerated individuals access to MOUDs at OTPs as well as counseling. The aim is to increase access to MOUDs, licensed therapists, and medical doctors.
Recommendation 3. Intensive case management during incarceration should continue as individuals reintegrate into the community.
The DOJ, CMS, and OTPs should further collaborate to establish a pipeline that helps individuals combat OUDs. Currently, formerly incarcerated individuals' MOUD treatment is terminated upon release, and they have no access to treatment unless they are referred to a rehabilitation center or seen by a licensed professional. The first two weeks after release are crucial because the risk of relapse is highest. It is therefore essential that correctional facilities assist incarcerated individuals in applying for Medicaid in the months before release so they can access MOUDs and therapy; Medicaid would cover MOUD costs and counseling services at OTPs or similar organizations. MOUD treatment should begin during incarceration, whether at a correctional facility or an OTP, to commence proper rehabilitation. Continuing pharmacological treatment in parallel with counseling post-release then reduces relapse, withdrawal symptoms, and overdose deaths. The aim is to expand access while in a correctional facility and continue treatment post-release to reduce opioid mortality rates.
Conclusion
Opioid relapses and overdoses following imprisonment have escalated significantly, increasing the likelihood of overdose mortality. Incarcerated individuals with an OUD should receive comprehensive case management while incarcerated that continues as they reintegrate into their communities. However, the Social Security Act currently prevents incarcerated individuals from receiving Medicaid coverage while incarcerated. Implementing the measures above will decrease overdose mortality rates, reduce the risk of relapse, and reduce recidivism.
Frequently Asked Questions
Is providing MOUDs during incarceration cost-effective?
In Massachusetts, researchers estimated the costs and benefits of administering MOUDs during incarceration using the Researching Effective Strategies to Prevent Opioid Death (RESPOND) simulation model. Offering all three MOUDs during incarceration was cost-effective at approximately $7,252 per quality-adjusted life year gained and reduced opioid-related overdose deaths by 1.8%.
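For reference, a cost-per-QALY figure of this kind is an incremental cost-effectiveness ratio (ICER), comparing a strategy offering all three MOUDs against the status quo (the subscripts below are shorthand, not the study's notation):

$$\text{ICER} = \frac{C_{\text{MOUD}} - C_{\text{status quo}}}{\text{QALY}_{\text{MOUD}} - \text{QALY}_{\text{status quo}}} \approx \$7{,}252 \text{ per QALY gained}$$

Interventions below the conventional U.S. benchmark of roughly $50,000 to $150,000 per QALY are generally considered cost-effective, so this estimate clears that bar by a wide margin.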
What federal funding is available to support this work?
The U.S. Department of Health and Human Services (HHS) has supported science- and community-based approaches to combating the opioid epidemic. In recent years, HHS has allocated $2 billion in grants to help reduce opioid mortality and relapse rates across the United States. Researchers and community-based organizations can apply for HHS grant funding to collect data on how Section 1115 waivers have helped reduce recidivism and overdose rates.
The DOJ has allocated approximately $340 million in grant funding to battle the opioid crisis, of which $7.2 million has been used to treat individuals with a substance use disorder and to provide support during incarceration and reentry services.
Why is federal action needed rather than state-by-state solutions?
The United States is in the midst of a life-threatening opioid epidemic that causes over 33,000 deaths per year from prescription and synthetic opioids. The epidemic is highly prevalent among the criminal justice population and affects individuals across the country, not just in specific states. The federal government should encourage individual states to apply for the federal funding that is available to combat the epidemic.
Who regulates the use of MOUDs in OTPs?
The use of MOUDs in OTPs in the United States is regulated by Title 42 of the Code of Federal Regulations (CFR), Part 8. This regulation established a system for accrediting and certifying OTPs, granting them the ability to dispense and administer FDA-approved MOUDs. More information on the process of accrediting and certifying OTPs can be found on SAMHSA's website.
Why haven't more facilities become OTPs, and why haven't all states applied for waivers?
Obtaining OTP accreditation is an intricate process involving several steps and requirements, including thorough assessments of program infrastructure, staff qualifications and training, and compliance with regulatory standards. These factors lengthen the accreditation process and can deter some facilities from pursuing OTP status. States' decisions about applying for Section 1115 waivers, meanwhile, hinge largely on funding: states often conduct extensive evaluations of the potential financial implications and cost-sharing arrangements before deciding to apply. Despite these challenges, OTP accreditation and Section 1115 waiver approvals play a crucial role in reducing relapse rates post-incarceration while creating a more comprehensive and effective healthcare system that saves lives by addressing the opioid crisis and minimizing recidivism.
Next-Generation Fire and Vegetation Modeling for a Hot and Dry Future
Summary
Wildfires are burning in ways that surprise even seasoned firefighters. Our current models cannot predict this extreme fire behavior—nor can they reproduce recent catastrophic wildfires, making them likely to fail at predicting future wildfires or determining when it is safe to light prescribed fires.
To better prepare the fire management community to operate in a new climate, Congress should establish and fund five regional centers of excellence (CoE) to develop, maintain, and operate next-generation fire and vegetation models to support wildland fire planning and management. Developing five regional CoEs (Southeast, Southwest, California, Pacific Northwest, Northern/Central Rockies) will ensure that researchers pursue a range of approaches that will ultimately lead to better models for predicting future wildfire behavior, improving our ability to safeguard human lives, communities, and ecosystems.
Challenge and Opportunity
In the decade ending in 2021, total federal wildfire suppression expenditures surpassed $23 billion, a fraction of the total cost of wildfire damages over that period. For example, the 2018 wildfires in California alone are estimated to have caused $148.5 billion in economic costs for the state. The costs of suppressing fire, and the societal and natural resource costs of extreme wildfire, will continue to rise with increasing temperatures.
Fewer than 2% of ignitions become large wildfires, but that 2% causes most of the damage because those fires burn under extreme conditions. The area of forest burned by wildfire annually in the western United States has been increasing exponentially since 1984. While the number of ignitions remains relatively constant from year to year, climate change is drying fuels and making forests more flammable. As a result, no matter how much money we spend on wildfire suppression, we will not be able to stop increasingly extreme wildfires. We therefore need to better understand where the risks lie on our landscapes and work proactively to reduce them.
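To make the exponential-trend claim concrete, the following is a minimal sketch of the log-linear regression typically used to test for exponential growth in annual burned area; the yearly values here are hypothetical placeholders, not the actual record:

```python
# Illustrative sketch: test for exponential growth in annual burned area by
# regressing log(area) on year. The yearly values are hypothetical placeholders.
import math

years = list(range(1984, 1992))
area_kha = [250, 310, 290, 420, 520, 480, 700, 810]  # hypothetical, thousand ha

n = len(years)
x_mean = sum(years) / n
log_area = [math.log(a) for a in area_kha]
y_mean = sum(log_area) / n
slope = (
    sum((x - x_mean) * (y - y_mean) for x, y in zip(years, log_area))
    / sum((x - x_mean) ** 2 for x in years)
)
# On a log scale, a linear trend implies exponential growth in the original units.
print(f"Implied annual growth rate: {100 * (math.exp(slope) - 1):.1f}% per year")
```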
When vegetation, especially dead vegetation, is subjected to high temperatures, any moisture absorbed during the winter months quickly evaporates. As a result, increasingly hot summers are making our forests more flammable. Live vegetation moisture content does not react as quickly as that of dead vegetation, but sharp increases in air temperature during dry conditions can make live plants more flammable as well. While this relationship between temperature and ecosystem flammability has remained consistent over time, until the past decade we had not reached a level of warming that dried ecosystems sufficiently to allow consistent extreme fire behavior. This is in part because large dead fuels, such as dead trees and logs, did not dry sufficiently to become flammable for the majority of the fire season until recently.
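One way to see why large fuels have only recently become reliably available to burn is the first-order "timelag" model of dead fuel drying that is standard in fire science, in which a fuel particle's moisture content decays exponentially toward an equilibrium set by temperature and humidity:

$$m(t) = m_{eq} + \left(m_0 - m_{eq}\right)e^{-t/\tau}$$

Here $m_0$ is the initial moisture content, $m_{eq}$ the equilibrium moisture content, and $\tau$ the fuel's timelag: the time to close roughly 63% of the gap to equilibrium, about 1 hour for fine fuels and 1,000 hours for large logs. Hotter, drier conditions lower $m_{eq}$, and longer hot spells give even 1,000-hour fuels time to approach it.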
Our current operational models for simulating wildfire and vegetation are incapable of reproducing the extreme fire behavior and rapid ecosystem change we are now experiencing. Forest growth-and-yield models used by managers, such as the Forest Vegetation Simulator, have served them well for decades. However, because they are built on statistical relationships between past tree growth and climate, they cannot capture the effects of a changing climate, especially extreme events, on tree growth and mortality. Similarly, operational fire models such as FARSITE, used both for management planning and for simulating fire spread to plan suppression activities, are not designed to handle the substantial ecosystem changes now occurring from climate change. These fire models have served us well in the past, but rising temperatures and a drying atmosphere are creating conditions that far exceed the data used to build them.
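For context, the surface spread rate in FARSITE and most U.S. operational tools derives from the semi-empirical Rothermel (1972) formulation:

$$R = \frac{I_R\,\xi\,\left(1 + \phi_w + \phi_s\right)}{\rho_b\,\varepsilon\,Q_{ig}}$$

where $R$ is the spread rate, $I_R$ the reaction intensity, $\xi$ the propagating flux ratio, $\phi_w$ and $\phi_s$ the wind and slope factors, $\rho_b$ the fuel bed bulk density, $\varepsilon$ the effective heating number, and $Q_{ig}$ the heat of preignition. Because its coefficients were fit to laboratory burns of uniform fine fuels, conditions far outside that calibration range, such as mass fire in heavy dead fuels, fall outside the model's demonstrated validity.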
For example, our current operational fire models do not account for large dead trees and logs and how they contribute to fire spread or for the way fire behaves in the wildland–urban interface. Yet wildfires are increasingly burning through communities, and the number of dead trees and logs is increasing because of drought- and insect-induced tree mortality and is increasingly available to burn because of high temperatures. The 2020 Creek Fire in the Sierra Nevada, California, burned through an area of extensive tree mortality from prolonged drought and insect outbreaks. The operational fire spread model ELMFIRE, which is used to predict fire spread of active wildfires, was unable to predict the mass fire behavior created by the massive number of dead trees.
Managing wildfire risk both prior to and during wildfires requires advanced models that are able to account for changing climatic conditions. We need new wildfire models that account for the increasing fuel dryness that facilitates extreme fire behavior, and we need new vegetation models that account for the effects of extreme drought and temperature on vegetation mortality. The research and development necessary to prepare us for our increasingly flammable world requires both fundamental and applied research, neither of which is sufficient on its own.
Further, we need to ensure that we commit to maintaining these models as the climate continues to change so that we do not create another tool that fails to serve us well within a decade or two. As the climate continues to change, these next-generation fire and vegetation models will be challenged with novel conditions that require continuous efforts to ensure they are capable of capturing the dynamics of the system. In addition, we must ensure that the mechanistic understanding of the system that develops is applied to supporting fire and vegetation management decision-making. This will require ongoing experimentation and observations of actual wildfire behavior, along with extensive data collection to characterize how quickly the flammability of the system changes as a function of vegetation type and weather conditions.
Developing these next-generation models is necessary for both fire suppression and management planning. Incident command teams rely on fire spread models to help plan suppression efforts for active wildfires, and thus having better predictions of fire spread is essential for effective operations and firefighter safety. Likewise, planning forest treatments that are effective for reducing the risk of high-severity wildfire under extreme weather conditions requires better vegetation and fire models that can capture the influence of changing climate on the probability that high-severity wildfire occurs.
Plan of Action
Developing and future-proofing next-generation fire and vegetation models will require new and sustained investment. Further, we must accept that these advanced models will require a level of expertise to operate that we cannot expect from a land manager trained in natural resource management, requiring that we fund expert model users to support management planning and suppression efforts.
As with all research and development, there are many possible pathways. Regional differences in weather, vegetation, and management history will alter how climate affects vegetation growth, mortality, and flammability. Much as the Manhattan Project simultaneously pursued two different bomb designs because more than one viable alternative existed, we lack the understanding needed to pick a single "winning" model at this point.
To account for regional differences in vegetation and the research momentum that is developing in different nascent modeling approaches, an effective and robust federal investment would entail the following actions.
Recommendation 1. Congress should establish and fund five centers of excellence housed at academic institutions in the Southeast, Southwest, California, Pacific Northwest, and Northern/Central Rockies. These centers would develop and maintain next-generation fire and vegetation models capable of simulating extreme fire behavior, operationalized to support planning for wildfire and vegetation management and to support wildfire suppression.
Establishing five centers with this geographic distribution will allow for investigation into the forest types where the majority of wildfire area occurs and will capture the range of climatic conditions under which wildfires are occurring. It will also take advantage of past and ongoing regional research efforts that will form the information foundation for each center. While these centers should have largely independent research programs, it will be necessary to coordinate some large-scale experimentation and to ensure that research findings and advances are shared rapidly. To achieve these objectives, one center should be selected to act as the coordinating center for the network.
Recommendation 2. Congress should require institutional partnerships between the host institutions and federal research institutions (e.g., U.S. Forest Service Research and Development, Department of Energy National Labs, U.S. Geological Survey, etc.).
We are currently in an all-hands-on-deck situation in the fire and fuels research community, and we need to operate in a collaborative and regionally coordinated manner. Requiring partnerships between the academic centers of excellence and federal research facilities within each region will ensure that effort is not duplicated and that a wider range of expertise is brought to bear. For example, efforts already under way at federal research facilities could be integrated within the regional fire centers. This integration will ensure collaboration between academic and federal partners and allow the overall research effort to draw on the strengths of both types of institutions.
Recommendation 3. Congress should mandate and fund the centers to operate these next-generation models and support wildfire and vegetation management planning and operations.
To date, we have relied on fire and vegetation models that were developed by the research community, built around data collected by fire and forest managers, and packaged so that natural resource professionals can operate them. Both of these constraints have contributed to the limitations of our current suite of models. We can no longer afford the limitations imposed by expecting the research community to develop models that a natural resource professional can run on a desktop computer. Accounting for factors such as how changing climatic conditions will directly change the amount of fuel on the landscape, and how short-term changes in weather will interact with longer-term changes in climate to influence fuel moisture, requires a more sophisticated approach to simulating the system than is accessible to a non-expert user. Expecting a natural resource professional to use an advanced coupled atmosphere-biosphere fire model would be like teaching someone how to balance their checkbook and then expecting them to calculate exactly how much they need to save every week for retirement. Further, important feedback for model improvement will come from repeated application by expert model users. To deploy next-generation fire and vegetation models in a manner that effectively supports fire and natural resource management decision-making, each center will employ experts who work collaboratively with managers, running simulations at their request for pre-fire management and suppression operations planning.
Recommendation 4. Congress should mandate the creation of strategic plans to support implementation and coordination across centers.
Each center will develop a five-year strategic plan to guide its research and development efforts. Following strategic plan development, representatives from the five centers will convene to determine which experimentation should be coordinated across the network and to develop shared implementation plans. The coordinating center will hold biannual leadership meetings to ensure the flow of data and information and to identify additional opportunities for collaboration among individual centers.
Conclusion
Establishing five centers of excellence to develop, maintain, and operate next-generation models will cost approximately $26 million per year, which is less than 1% of the 2021 federal wildfire suppression expenditure. This level of funding would provide $5 million per year per center (plus an additional $1 million per year for the coordinating center). The annual budgets would fund staff scientist and research assistant positions, provide support for the experiments necessary to develop and parameterize new models, provide computing resources for computationally sophisticated models, and fund staff analysts to run the models in support of managers. Initially, the majority of the annual appropriation would be focused on model development, transitioning to maintaining and operating the models to support land management as the technology matures.
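A minimal arithmetic check of this budget, using only the figures stated above (the 2021 federal suppression total is an approximation included for scale):

```python
# Budget check for the five-center proposal (figures from the paragraph above).
PER_CENTER = 5_000_000          # annual funding per center, USD
COORDINATION_BONUS = 1_000_000  # additional annual funding for the coordinating center
N_CENTERS = 5

total = N_CENTERS * PER_CENTER + COORDINATION_BONUS
print(f"Total annual cost: ${total:,}")  # -> $26,000,000

# For scale: 2021 federal suppression spending was roughly $4.4 billion
# (approximate figure; assumption for illustration), so $26M is under 1%.
SUPPRESSION_2021 = 4_400_000_000
print(f"Share of suppression spending: {total / SUPPRESSION_2021:.2%}")  # ~0.59%
```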
The centers could be supported through National Science Foundation (NSF) funding. NSF could provide financial support for five university host institutions (one in each region) selected through a competitive bidding process. In turn, these university host institutions can manage the required federal partnerships. Selection of university host institutions could be based in part on demonstrated capacity to manage successful partnerships with federal institutions.
It is imperative that we invest in new models that will support more effective mitigation to reduce wildfire severity; otherwise, spending on suppression will continue to balloon despite improved fire intelligence.
Yes. Just a few examples include the colocation of the University of Georgia with fire researchers in the U.S. Forest Service (USFS) Southern Research Station; the University of New Mexico’s existing relationships with Los Alamos National Lab, Sandia National Lab, and the U.S. Geological Survey; and the University of Washington’s long-standing relationship with the USFS Pacific Northwest Fire and Environmental Applications research group.
NSF already funds wildland fire research and, jointly with the National Institute of Standards and Technology, supports research on fire in the wildland–urban interface. While much of the research needed to develop next-generation fire and vegetation models is basic, wildland fire research is inherently applicable. NSF hosted a five-day Wildfire and the Biosphere Innovation Lab, whose findings included the assertion that “support for applied research will be most effective by aiming at both short- and long-term applications and solutions,” acknowledging that applying research findings is an important part of the research enterprise.
Yes. These centers will bring together and build from ongoing efforts. There are already efforts under way to develop optimal treatment strategies that account for changing climatic conditions using advanced forest landscape models. This approach, with some refinement and validation, will be useful for informing treatment placement within the next two years.
This is functionally the system we have now. The Fire Research Management and Exchange System (FRAMES) provides a clearinghouse of models developed for fire and vegetation modeling to inform management. FRAMES may be a good interface to help increase manager awareness of the models the five centers will develop, but it is not a mechanism for facilitating the research and development needed to tackle the wildfire problem. We need five centers because there are already a number of efforts under way to develop new fire and vegetation models. None of the models will be perfect because they all take different approaches and there are tradeoffs inherent in any given approach. With simultaneous investment, we will be able to capitalize on the aspects of each model that best simulate a part of the fire spread or vegetation growth process and then develop a system that incorporates the best of each model. Competition within the U.S. scientific enterprise has helped our country achieve high global standing. Funding five centers will shift that competition away from researchers spending much of their time competing for funding and focus it on competing with their best ideas in a way that prepares us for managing wildfire in the future.
Save Lives by Making Smoke Tracking a Core Part of Wildland Fire Management
Toxic smoke from wildland fire spreads far beyond fire-prone areas, killing many times more people than the flames themselves and disrupting the lives of tens of millions of people nationwide. Data infrastructure critical for identifying and minimizing these smoke-related hazards is largely absent from our wildland fire management toolbox.
Congress and executive branch agencies can and should act to better leverage existing smoke data in the context of wildland fire management and to fill crucial data infrastructure gaps. Such actions will enable smoke management to become a core part of wildland fire management strategy, thereby saving lives.
Challenge and Opportunity
The 2023 National Cohesive Wildland Fire Management Strategy Addendum describes a vision for the future: “To safely and effectively extinguish fire, when needed; use fire where allowable; manage our natural resources; and collectively, learn to live with wildland fire.” Significant research conducted since the publication of the original Strategy in 2014 indicates that wildfire smoke impacts people across the United States, causing thousands of deaths and billions of dollars of economic losses annually.
Smoke impacts exceed the corresponding impacts of the flames themselves and span far greater areas, coast to coast. However, wildfire strategy and funding largely focus on flames and their impacts. Given that smoke's economic impacts roughly equal those of the flames (about 1:1) and that smoke deaths outnumber direct fire deaths by roughly 30 to 1, smoke mitigation and management should be a high priority for federal agencies.
Some smoke data is already collected, but these datasets can be made more actionable for health considerations and better integrated with other fire-impact data to mitigate risks and save more lives.
Smoke tracking
Several federal programs exist to track wildfire smoke nationwide, but there are gaps in their utility as actionable intelligence for health. For example, the recent “smoke wave” on the East Coast highlighted some of the difficulties with public warning systems.
Existing wildfire-smoke monitoring and forecast programs include:
- The Fire and Smoke Map, collaboratively managed by the Environmental Protection Agency (EPA) and the US Forest Service, which displays real-time air-quality data but is limited to locations with sensors;
- The National Oceanic and Atmospheric Administration (NOAA) Hazard Mapping System Fire and Smoke Product, which evaluates total-atmosphere smoke, but lacks ground-level estimates of what people would breathe; and
- The Interagency Wildland Fire Air Quality Response Program (IWFAQRP) and the experimental U.S. Forest Service (USFS) BlueSky Daily Runs, which integrate external data to make forecasts, but lack location-specific data for all potentially impacted locations.
The EPA also publishes retrospective smoke emissions totals in the National Emissions Inventory (NEI), but these lack the specificity about downwind impacted locations that health applications require.
Existing data are excellent, but scientists combine them in non-standardized ways, making results difficult to compare. New, authoritative nationwide smoke-data tools need to be created, likely by linking existing data and methods, and integrated into core wildland fire strategy to save lives.
Smoke health impacts
There is no single, authoritative accounting of wildfire smoke impacts on human health for the public or policymakers to use. Four key gaps in smoke and health infrastructure may explain why such an accounting doesn’t yet exist.
- The U.S. lacks a standardized method for quantifying the health impacts of wildfire smoke, especially mortality, despite recent research progress in this area.
- The lack of a national smoke concentration dataset hinders national studies of smoke-health impacts because different studies take different approaches.
- Access to mortality data through the National Vital Statistics System (NVSS), managed by the National Center for Health Statistics (NCHS), is slow and difficult for the scientists who seek to use mortality data in epidemiological studies of wildfire smoke.
- Gaps remain in understanding the relative harm of wildfire smoke, which can contain aerosolized hazardous compounds from burned infrastructure, compared to the general air pollution (e.g., from cars and factories) that is often used as an analog in health research.
Addressing these gaps together will enable official wildfire-smoke-attributable death tolls to be publicized and used by decision-makers.
Integration of wildfire smoke into wildland fire management strategy
Interagency collaborations currently set wildland fire management strategy. Three key groups with a mission to facilitate interagency collaboration are the National Interagency Fire Center (NIFC), the National Wildfire Coordinating Group (NWCG), and the Wildland Fire Leadership Council (WFLC). NIFC maintains datasets on wildfire impacts, including basic summary statistics like acres burned, but smoke data are not included in these datasets. Furthermore, while NWCG dedicates one of its 17 committees to smoke and collaborates with NOAA (which oversees smoke tracking in the Hazard Mapping System), none of the major wildfire collaborations include agencies with expertise in measuring the impacts of smoke, such as the EPA or the Centers for Disease Control and Prevention (CDC). Finally, WFLC added calls for furthering community smoke-readiness in the 2023 National Cohesive Wildland Fire Management Strategy Addendum, but greater emphasis on smoke is still needed. Better integration of smoke data, smoke-health data, and smoke-expert agencies will enable smoke to be properly considered in national wildland fire management strategy.
Plan of Action
To make smoke management a core and actionable part of wildland fire management strategy, thereby saving lives, several interrelated actions should be taken.
To enhance decision tools individuals and jurisdictions can use to protect public health, Congress should take action to:
- Issue smoke wave alerts nationwide. Fund the National Weather Service (NWS) to develop and issue smoke wave alerts to communities via the Wireless Emergency Alerts (WEA) system, which is designed for extreme weather alerting. The NWS currently distributes smoke messages defined by state agencies through lower-level alert pathways, but should use the WEA system to increase how many people receive the alerts. Furthermore, a national program, rather than current state-level decisions, would ensure continuity nationwide so all communities have timely warning of potentially deadly smoke disasters. Alerts should follow best practices for alerting to concisely deliver information to a maximum audience, while avoiding alert fatigue.
- Create a nationwide smoke concentration dataset. Fund NOAA and/or EPA to create a data inventory of ground-level smoke PM2.5 concentrations by integrating air-monitor data and satellite data, using existing methods as needed. The proposed data stream would provide standardized estimates of smoke concentrations nationwide and would be a critical precursor for estimating smoke mortality, as well as for quantifying the extent to which smoke contributes to poor air quality in communities. This action would be enhanced by data from recommendation 4 (below); a toy illustration of such data blending follows this list.
- Create a smoke mortality dataset. Fund the CDC and/or EPA to create a nationwide data inventory of excess morbidity and mortality attributed to smoke from wildland fires. An additional enhancement would be to track the smoke health impacts contributed by each source wildfire. Findings should be disseminated in NIFC wildfire impact summaries. This action would be enhanced by data from recommendations 4-5 and research from recommendations 6-8 (below).
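As context for recommendation 2, the sketch below is a toy illustration, not any agency's actual method, of a blending rule that prefers regulatory monitors, then bias-corrected low-cost sensors, then satellite-derived estimates; all numbers are hypothetical:

```python
# Toy sketch (illustrative only): produce a standardized ground-level smoke-PM2.5
# estimate by (1) bias-correcting low-cost sensor readings against collocated
# regulatory monitors and (2) falling back to satellite-derived estimates where
# no ground data exist. Operational methods are far more sophisticated.
import numpy as np

def calibrate_low_cost(sensor_pm25, monitor_pm25):
    """Fit a simple linear bias correction from collocated sensor/monitor pairs."""
    slope, intercept = np.polyfit(sensor_pm25, monitor_pm25, deg=1)
    return lambda raw: slope * raw + intercept

def blended_estimate(monitor_val, sensor_val, satellite_val, correct):
    """Prefer regulatory monitors, then corrected sensors, then satellite data."""
    if monitor_val is not None:
        return monitor_val
    if sensor_val is not None:
        return correct(sensor_val)
    return satellite_val  # coarsest fallback

correct = calibrate_low_cost(
    np.array([12.0, 35.0, 80.0, 150.0]),  # hypothetical raw sensor readings (ug/m3)
    np.array([10.0, 28.0, 61.0, 110.0]),  # collocated regulatory values (ug/m3)
)
print(blended_estimate(None, 95.0, 70.0, correct))  # corrected sensor estimate, ~71
```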
The decision-making tools in recommendations 1-3 can be created today based on existing data streams. They should be further enhanced as follows in recommendations 4-10:
To better track locations and concentrations of wildfire smoke, Congress should take action to:
- Install more air-quality sensors. Fund the EPA, which currently monitors ground-level air pollutants and co-oversees the Fire and Smoke Map with the USFS, to establish smoke-monitoring stations in each census tract across the U.S. and in other locations as needed to provide all communities with real-time data on wildfire-smoke exposure.
- Create a smoke impact dashboard. The current EPA Fire and Smoke Map shows near-real-time data from regulatory-grade air monitors, commercial-grade air sensors, and satellite data of smoke plumes. An upgraded dashboard would combine that map with data from recommendations 1-3 to give current and historic information about ground-level air quality, the fraction of pollutants due to wildfire smoke, and the expected health impacts. It would also include short-term forecast data, which would be greatly improved with additional modeling capability to incorporate fire behavior and complex terrain.
To better track health impacts of wildfire smoke, Congress should take action to:
- Improve researcher access to mortality data. Specifically, direct the CDC to increase epidemiologist access to the National Vital Statistics System. This data system contains the best mortality data for the U.S., so enhancing access will enhance the scientific community’s ability to study the health impacts of wildfire smoke (recommendations 6-8).
- Establish wildfire-health research centers. Specifically, fund the National Institutes of Health (NIH) to establish flagship wildfire-smoke health-research centers to research the health effects of wildfire smoke. Dissemination pathways for results should include the NIFC, to reach a broad wildfire policy audience.
- Enhance health-impact-analysis tools. Direct the EPA to evaluate the available epidemiological literature and adopt standardized wildfire-specific concentration-response functions for estimating health impacts in its BenMAP-CE tool. Non-wildfire functions are currently used even in the research literature, despite potentially underestimating the health impacts of wildfire smoke. (A schematic of the underlying calculation follows this list.)
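For readers unfamiliar with how concentration-response functions translate into death tolls, the sketch below shows the standard log-linear health impact calculation of the kind BenMAP-CE applies; the coefficient used is a placeholder, not an EPA-adopted wildfire-specific value:

```python
# Minimal sketch of a log-linear concentration-response calculation, the form
# used in health impact assessment tools like BenMAP-CE. The beta below is a
# placeholder, NOT an adopted wildfire-specific coefficient; adopting one is
# exactly what this recommendation calls for.
import math

def excess_cases(baseline_rate, beta, delta_pm25, population):
    """Attributable cases = y0 * (1 - exp(-beta * dC)) * population."""
    return baseline_rate * (1 - math.exp(-beta * delta_pm25)) * population

# Hypothetical inputs: baseline mortality rate of 0.009 deaths/person/yr, a
# placeholder beta of 0.0006 per ug/m3, a 20 ug/m3 smoke increment, 1M people.
print(round(excess_cases(0.009, 0.0006, 20.0, 1_000_000), 1))  # ~107 per year
```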
To enhance wildland fire strategy by including smoke impacts, Congress should take action to:
- Hire interagency staff. Specifically, fund the EPA and CDC to place staff at the main NIFC office and join the NIFC collaboration. This will facilitate collaboration between smoke-expert agencies and agencies focused on other aspects of wildfire.
- Support landscape management research. Specifically, direct the USFS, CDC, and EPA to continue researching the public health impacts of different landscape management strategies (e.g., prescribed burns of different frequencies compared to full suppression). Significant existing research, including from the EPA, has investigated these links, but more is needed to better inform policy. This research should continue to link different landscape management strategies to probable smoke outputs in different regions, and link those smoke outputs to health impacts. Understanding the whole chain of linkages is crucial to the landscape management decisions at the core of a resilient wildland fire management strategy.

[Figure: Proposed smoke data infrastructure. Data flows from data inputs (top) through actionable decision-making tools (middle) to pathways for integrating smoke into wildland fire management strategy (bottom); each shape represents one recommendation, and the three shapes in blue can be implemented immediately.]
Cost estimates
This proposal is estimated to have a first-year cost of approximately $273 million and a future annual cost of $38 million once equipment is purchased. The first-year total represents less than 4% of current annual wildfire spending (subsequent years would be 0.5% of annual spending), and it would lay the foundation to potentially save thousands of lives each year. Assumptions behind this estimate can be found in the FAQ.
Conclusion
In the U.S., more and more people are being exposed to wildfire smoke—27 times more people are experiencing extreme smoke days than a decade ago. The suggested programs are needed to improve the national technical ability to increase smoke-related safety, thereby saving lives and reducing smoke-related public health costs.
Recommendations 1-3 can be completed within approximately 6-12 months because they rely on existing technology. The remaining recommendations vary:
- Recommendation 4 requires building physical infrastructure, so it should take 6 months to initiate and several years to complete.
- Recommendation 5 requires building digital infrastructure from existing tools, so it can be initiated immediately but depends on data from recommendations 2-3 to finalize.
- Recommendation 6 will require one year of personnel time to complete the program review necessary for making changes, then ongoing support.
- Recommendation 7 establishes research centers, which will take 2 years to solicit and select proposals, followed by 5 years of funding.
- Recommendation 8 requires a literature review and can be completed in 1 year.
- Recommendations 9-10 are ongoing projects that can start within the first year but will require ongoing support to succeed.
The latest estimates indicate that thousands of people die across the United States each year due to wildfire smoke. However, there is no consistent ongoing tracking of smoke-attributable deaths and no centralized authoritative tallies.
Many deaths occur during the wildfire itself—wildfire smoke contains small particles (less than 2.5 microns, called PM2.5) that immediately increase the risk of stroke and heart attack. Additional deaths can occur after the fire, due to longer-term complications, much in the same way that smoking increases mortality.
Wildfires and wildfire smoke occur across the country, so deaths attributable to these causes do too. Recent research indicates that there are high numbers of deaths attributable to wildfire smoke on the West Coast, but also in Texas and New York, due to long-distance transportation of smoke and the high populations in those states.
One-time costs for recommendations 2, 3, and 8 were estimated in terms of person-years of effort and are additive with their annual costs in the first year. Recommendations 2-3 require a large team to create the initial datasets and then smaller teams to maintain, while recommendation 8 requires only an initial literature review and no maintenance. One person-year is estimated at $150,000 per year, including fringe benefits.
One-time costs for recommendation 4 were calculated in terms of air-quality monitor costs, with one commercial grade sensor ($400) for each of the 84,414 census tracts in the U.S., one sensor comparable to regulatory grade (estimated at $40,000) for each of the 5% most smoke-impacted census tracts, and 15% overhead costs for siting and installation.
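Putting those assumptions together reproduces roughly $233 million of the first-year total (rounding the 5% tract count to a whole number is our assumption):

```python
# Reproducing the recommendation 4 one-time cost from the assumptions above.
TRACTS = 84_414
COMMERCIAL_SENSOR = 400       # USD each, one per census tract
REGULATORY_GRADE = 40_000     # USD each, for the 5% most smoke-impacted tracts
OVERHEAD = 0.15               # siting and installation

regulatory_count = round(TRACTS * 0.05)  # ~4,221 tracts (rounding assumed)
hardware = TRACTS * COMMERCIAL_SENSOR + regulatory_count * REGULATORY_GRADE
total = hardware * (1 + OVERHEAD)
print(f"${total:,.0f}")  # ~$232,996,440 -> roughly $233M of the ~$273M first year
```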
Annual costs for recommendations 1-3, 5-6, and 9-10 were estimated in terms of person-years of effort because salary is the main consumable for these projects. One person-year is estimated at $150,000 per year, including fringe benefits.
Annual costs for recommendation 4 were estimated by assuming that 10% of sensors would need replacement per year. These funds can be passed on to jurisdictions, following current maintenance practice of air-quality monitors.
Annual costs for recommendation 7 cover four NIH Research Core Centers (P30 grant type) at their maximum of $2.5 million each per year.
Collaboration for the Future of Public and Active Transportation
Summary
Public and active transportation are not equally accessible to all Americans. Due to a lack of sufficient infrastructure and reliable service for public transportation and active modes like biking, walking, and rolling, Americans must often depend on personal vehicles for travel to work, school, and other activities. During the past two years, Congress has allocated billions of dollars to equitable infrastructure, public transportation upgrades, and decreasing greenhouse gas pollution from transportation across the United States. The Department of Transportation (DOT) and its agencies should embrace innovation and partnerships to continue increasing active and public transportation across the country. The DOT should require grant applications to discuss cross-agency collaborations, partner with the Department of Housing and Urban Development (HUD) to organize prize competitions, encourage public-private partnerships (P3s), and work with the Environmental Protection Agency (EPA) to grant money for transit programs through the Greenhouse Gas Reduction Fund.
Challenge and Opportunity
Historically, U.S. investment in transportation has focused on expanding and developing highways for personal vehicle travel. As a result, 45% of Americans do not have access to reliable and safe public transportation, perpetuating dependence on personal vehicles for almost half of the country. The EPA reports that transportation accounts for 29% of total U.S. greenhouse gas emissions, with 58% of those emissions coming from light-duty vehicles; that is, light-duty vehicles alone account for roughly 17% (0.29 × 0.58) of total U.S. emissions. This large share of nationwide emissions from personal vehicles has short- and long-term climate impacts.
Investments in green public and active transit should be a priority for the DOT in transitioning away from a personal-vehicle-dominated society and meeting the Biden Administration’s “goals of a 100% clean electrical grid by 2035 and net-zero carbon emissions by 2050.” Public and active transportation infrastructure includes bus systems, light rail, bus rapid transit, bike lanes, and safe sidewalks. Investments in public and active transportation should go towards a combination of electrifying existing public transportation, such as buses; improving and expanding public transit to be more reliable and accessible for more users; constructing bike lanes; developing community-owned bike share programs; and creating safe walking corridors.
In addition to reducing carbon emissions, improved public transportation that disincentivizes personal vehicle use has a variety of co-benefits. Prioritizing public and active transportation could limit congestion on roads and lower pollution. Fewer vehicles on the road result in lower tailpipe emissions, which “can trigger health problems such as aggravated asthma, reduced lung capacity, and increased susceptibility to respiratory illnesses, including pneumonia and bronchitis.” This is especially important for the millions of people who live near freeways and heavily congested roads.
Congestion can also be financially costly for American households; the INRIX Global Traffic Scorecard reports that traffic congestion cost the United States $81 billion in 2022. Those costs include vehicle maintenance, fuel, and “lost time,” all of which can be reduced with reliable and accessible public and active transportation. Additionally, the American Public Transportation Association reports that every $1 invested in public transportation generates $5 in economic returns, measured in reduced travel time, reduced traffic congestion, and increased business productivity. Thus, by investing in public transportation, communities can see improvements in air quality, the economy, and health.
Public transportation is primarily managed at the local and state level; currently, over 6,000 local and state transportation agencies provide and oversee public transportation in their regions. Public transportation is funded through federal, state, and local sources, and transit agencies also receive funding from “passenger fares and other operating receipts.” The Federal Transit Administration (FTA) distributes funding for transit through grants and loans, accounting for 15% of transit agencies’ total income, including 31% of capital investments in transit infrastructure. Local and state entities often lack sufficient resources to improve public transportation systems because of uncertainty around ridership and funding streams.
Public-private partnerships can help alleviate some of these resource constraints because contracts can allow the private partner to operate public transportation systems. Regional and national collaboration across multiple agencies from the federal to the municipal level can also help alleviate resource barriers to public transit development. Local and state agencies do not have to work alone to improve public and active transportation systems.
The following recommendations provide a pathway for transportation agencies at all levels of government to increase public and active transportation, resulting in social, economic, and environmental benefits for the communities they serve.
Plan of Action
Recommendation 1. The FTA should require grant applicants for programs such as the Rebuilding American Infrastructure with Sustainability and Equity (RAISE) to define how they will work collaboratively with multiple federal agencies and conduct community engagement.
Per the National Blueprint for Transportation Decarbonization, FTA staff should prioritize funding for grant applicants who successfully demonstrate partnerships and collaboration. This can be demonstrated, for example, with letters of support from community members and organizations for transit infrastructure projects. Collaboration can also be demonstrated by having applicants report clear goals, roles, and responsibilities for each agency involved in proposed projects. The FTA should:
- Develop a rubric for evaluating partnerships’ efficiency and alignment with national transit decarbonization goals.
- Create a tiered metrics system within the rubric that prioritizes grants for projects based on collaboration and reduction of greenhouse gas emissions in the transit sector (a toy illustration follows this list).
- Add a category to their Guidance Center on federal-state-local partnerships to provide insight on how they view successful collaboration.
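To make the rubric idea concrete, the sketch below shows one possible tiered scoring scheme; the criteria, weights, and tier cutoffs are invented for illustration and are not FTA policy:

```python
# Hypothetical illustration of a tiered scoring rubric of the kind recommended
# above; criteria, weights, and tiers are invented, not FTA policy.
RUBRIC = {
    "multi_agency_collaboration": 0.35,  # documented roles across federal agencies
    "community_engagement":       0.25,  # e.g., letters of support
    "ghg_reduction_potential":    0.40,  # projected transit-sector emissions cuts
}

def score_application(ratings: dict[str, float]) -> str:
    """Weight 0-100 criterion ratings, then map the total to a priority tier."""
    total = sum(RUBRIC[c] * ratings[c] for c in RUBRIC)
    if total >= 80:
        return f"Tier 1 (priority funding), score {total:.0f}"
    if total >= 60:
        return f"Tier 2, score {total:.0f}"
    return f"Tier 3, score {total:.0f}"

print(score_application({
    "multi_agency_collaboration": 90,
    "community_engagement": 75,
    "ghg_reduction_potential": 85,
}))  # -> Tier 1 (priority funding), score 84
```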
Recommendation 2. The DOT and HUD should collaborate on a prize competition to design active and/or public transportation projects to reduce traffic congestion.
Housing and transportation costs are related and influence one another, which is why HUD is a natural partner. Funding can be sourced from the Highway Trust Fund, from which the DOT has the authority to allocate up to “1% of the funds for research and development to carry out . . . prize competition program[s].”
This competition should call on local agency partners to identify a design challenge or obstacle that impedes their ability to adopt transit-oriented infrastructure that could reduce traffic congestion. Three design challenges should be selected and publicly posted on the Challenge.gov website so that any individual or organization can participate.
The goal of the prize competition is to identify challenges, collaborate, and share resources across agencies and communities to design transportation solutions. The competition would connect the DOT with local and regional planning and transportation agencies to solicit solutions from the public, whether from individuals, teams of individuals, or organizations. The DOT and HUD should work collaboratively to design the selection criteria for the challenge and select the winners. Each challenge winner would be provided with a financial prize of $250,000, and their idea would be housed on the DOT website as a case study that can be used for future planning decisions. The local agencies that provide the three design challenges would be welcome to implement the winning solutions.
Recommendation 3. Federal, state, and local government should increase opportunities for public-private partnerships (P3s).
The financial investment required to develop active and public transportation infrastructure is a hurdle for many agencies. To address this issue, we make the following recommendations:
- Currently, only 36 out of the 50 states have policies that allow the use of P3s. The remaining 14 states should pass legislation authorizing the use of P3s for public transportation projects so that they too can benefit from this financing model and access federal P3 funding opportunities.
- In 2016, the DOT launched the Build America Bureau to assist with financing transportation projects. The Bureau administers the Transportation Infrastructure Finance and Innovation Act (TIFIA) program, which provides financial assistance through low-interest loans for infrastructure projects and leverages public-private partnerships to access additional private-sector funding. Currently, only about 30% of TIFIA loans fund public transit projects, while 66% go to tolls and highways. Local and regional agencies should make greater use of TIFIA loans to fund public and active transit projects.
- EPA should specify in its Greenhouse Gas Reduction Fund guidelines that public and active transit projects are eligible for investment from the fund and can leverage public and private partnerships. EPA is set to distribute $27 billion through the Fund for carbon pollution reduction: $20 billion will go towards nonprofit entities, such as green banks, that will leverage public and private investment to fund emissions reduction projects, with $8 billion allocated to projects in low-income and disadvantaged communities; $7 billion will go to state and local agencies and nonprofits in the form of grants or technical assistance to low-income and disadvantaged communities. EPA should encourage applicants to include public and active transportation projects, which can play a significant role in reducing carbon emissions and air pollution, in their portfolios.
Conclusion
The road to decarbonizing the transportation sector requires public and active transportation. Federal agencies can allocate funding for public and active transit more effectively through the recommendations above. It’s time for the government to recognize public and active transportation as the key to equitable decarbonization of the transportation sector throughout the United States.
Most P3s in the United States are for highways, bridges, and roads, but there have been a few successful public transit P3s. In 2018 the City of Los Angeles joined LAX and LAX Integrated Express Solutions in a $4.9 billion P3 to develop a train system within the airport. This project aims to launch in 2024 to “enhance the traveler experience” and will “result in 117,000 fewer vehicle miles traveled per day” to the airport. This project is a prime example of how P3s can help reduce traffic congestion and enable and encourage the use of public transportation.
In 2021, the Congressional Research Service released a report about public-private partnerships (P3s) that highlights the role the federal government can play in making it easier for agencies to participate in P3s.
The state of Michigan has a long history with its Michigan Saves program, the nation’s first nonprofit green bank, which provides funding for projects like rooftop solar or energy efficiency programs.
In California the California Alternative Energy and Advanced Transportation Financing Authority works “collaboratively with public and private partners to provide innovative and effective financing solutions” for renewable energy sources, energy efficiency, and advanced transportation and manufacturing technologies.
The Rhode Island Infrastructure Bank provides funding to municipalities, businesses, and homeowners for projects “including water and wastewater, roads and bridges, energy efficiency and renewable energy, and brownfield remediation.”
One Small Step: Anticipatory Diplomacy in Outer Space
Summary
The $350 billion space industry could grow to more than $1 trillion by 2040, spurring international interest in harnessing space resources. But this interest brings a challenge: while existing international agreements like the Artemis Accords promote the peaceful and shared exploration of celestial bodies, they do little to address the differences between existing scientific research activities and emerging opportunities like lunar mining, particularly for water ice at polar latitudes and in the perpetually shaded depths of certain craters. Lunar water ice will be a vital resource for outer space exploration and development because it can be used to make hydrogen fuel cells, rocket fuel, and drinking water for astronauts. Extracting it will also be cheaper than transporting water from Earth’s surface into outer space, given the moon’s lower surface gravity and its proximity to human space operations on and beyond its surface. The moon also harbors other valuable long-term commodities like helium-3, a candidate fuel for low-emissions nuclear fusion energy.
However, current multilateral agreements do not address whether nongovernmental operators can claim territory on celestial bodies for their use or own the resources they extract. Further, the space object registration process is currently used for satellites and other spacecraft while in orbit, but it does not include space objects intended for use on the surface of celestial bodies, such as mining equipment. These gaps leave few options for the United States or other Artemis Accords nations to resolve conflicts over territorial claims on a celestial body. In the worst-case scenario, this increasing competition for resources—especially with other major space powers like China and Russia—could escalate into military conflict.
Adopting new treaties or amendments to the existing Outer Space Treaty (OST) for modern space use is a slow process that may fail to meet the urgency of emerging space resource issues. However, the United States has another diplomatic avenue for faster action: revision of the existing United Nations’ Guidelines for the Long-term Sustainability of Outer Space under the auspices of the U.N. Committee on the Peaceful Uses of Outer Space (COPUOS). Such a process avoids the decade-long deliberations of a formal treaty amendment. The United States should thus lead the development of multilateral protocols for extracting resources from celestial bodies by proposing two updates to either the COPUOS Guidelines, the OST, or both. First, there should be an updated registration process for all space objects, which should specify the anticipated location, timeline, and type(s) of operation to establish usage rights on a particular part of a celestial body. Second, the United Nations should establish a dispute resolution process to allow for peaceful resolution of competing claims on celestial surfaces. These strategies will lay the necessary foundation for peacefully launching new mining operations in space.
Challenge and Opportunity
Right now, outer space is akin to the Wild West, in that the opportunities for scientific innovation and economic expansion are numerous, yet there is little to no political or legal infrastructure to facilitate orderly cooperation between interested terrestrial factions. For example, any nation claiming mining rights to lunar territory is on shaky legal ground, at best: the Outer Space Treaty and the subsequent Guidelines for the Long-term Sustainability of Outer Space, promulgated by the U.N. Committee on the Peaceful Uses of Outer Space, do not provide legally sound or internationally recognized development rights, enforcement structures, or deconfliction mechanisms. If one claimant allegedly violates the territorial rights of another, what legal systems could either party use to press their case? Moreover, what mechanisms would avert potential escalation toward militarized conflict? Right now, the answer is none.
This is an unfortunate obstacle to progress given the enormous economic potential of outer space development in the coming decades. To put the potential value in perspective, the emerging $350 billion space industry could grow to more than $1 trillion by 2040, motivating significant international interest. One potentially lucrative subset of operations is space mining, a sector valued at $1 billion today with a potential value of $3 billion by 2027. Once operational, space mining would be a valuable source of rare earth elements (e.g., neodymium, scandium, and others), 60% of which are currently produced in China. Rare earth elements are necessary for essential technologies such as electric vehicles, wind turbines, computers, and medical equipment. Additionally, in the event that nuclear fusion becomes commercially viable in the long-term future, space mining will be an essential industry for securing helium-3 (He-3), an abundant isotope found on the moon. Recent increases in fusion investment and a breakthrough in fusion research show the potential for fusion energy, but there is no guarantee of success. He-3 could serve as a critical fuel source for future nuclear fusion operations, an emerging form of energy production free of carbon emissions that could provide humanity with the means to address global climate and energy crises without losing energy abundance. The abundance of lunar He-3 could mean having access to secure clean energy for the foreseeable human future.
Furthermore, human exploration and development of outer space will require water, both in the form of drinking water for crewed missions and in the form of rocket propellant and fuel cell components for spacecraft. As it costs over $1 million to transport a single cubic meter of water from Earth’s surface into low Earth orbit, extracting water from the lunar surface for use in outer space operations could be substantially more economical due to the moon’s lower escape velocity—in fact, lunar water ice is estimated to be worth $10 million per cubic meter.
The space mining sector and lunar development also offer promise far beyond Earth. Our moon is the perfect “first port of call” as humanity expands into outer space. It has lower surface gravity, polar ice deposits, and abundant raw materials such as aluminum, and its status as our closest celestial neighbor makes it the ideal layover supply depot and launch point for spacecraft heading from Earth deeper into our solar system. Spacecraft could be launched from Earth with just enough fuel to escape Earth’s gravity, land and refuel on the moon, and then launch far more efficiently from the moon’s weaker gravity to destinations elsewhere in the system.
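The refueling advantage can be quantified with the Tsiolkovsky rocket equation. The sketch below uses standard escape velocities and an assumed hydrogen/oxygen exhaust velocity, and it ignores gravity and drag losses, so treat it as an order-of-magnitude illustration:

```python
# Tsiolkovsky rocket equation: m0/mf = exp(delta_v / v_e).
# Comparing the propellant mass ratio needed to escape Earth vs. the moon.
import math

V_EXHAUST = 4_400      # m/s, typical hydrogen/oxygen engine (assumed)
EARTH_ESCAPE = 11_200  # m/s, Earth escape velocity
MOON_ESCAPE = 2_380    # m/s, lunar escape velocity

for body, dv in [("Earth", EARTH_ESCAPE), ("Moon", MOON_ESCAPE)]:
    mass_ratio = math.exp(dv / V_EXHAUST)
    print(f"{body}: initial mass must be ~{mass_ratio:.1f}x the final mass")
# Earth: ~12.7x vs. Moon: ~1.7x, which is why refueling with lunar-sourced
# propellant dramatically cheapens onward travel.
```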
All in all, the vast untapped scientific and economic potential of our moon underscores the need for policy innovation to fill the gaps in existing international space law and allow the development of outer space within internationally recognized legal lines. The imperative for leading on these matters falls to the United States as a nation uniquely poised to lead the space mining industry. Not only is the United States one of the global leaders in space operations, but U.S. domestic law, including the Commercial Space Launch Competitiveness Act of 2015, provides the U.S. private sector some of the necessary authority to commercialize space operations like mining. However, the United States’ rapid innovation has also led the way to a growing space industry internationally, and the sector is now accessible to more foreign states than before. The internationalization of the space economy further highlights the gaps and failings of the existing space policy frameworks.
Two main challenges must be addressed to ensure current governance structures are sufficient for securing the future of lunar mining. The first is clarifying the rights of OST State Parties and affiliated nongovernmental operators to place space objects on celestial bodies and to own the resources extracted. The OST, the primary governing tool in space (Figure 2), establishes that no signatory State may declare ownership over all or part of a celestial body like the moon. And despite the domestic authority bestowed by the 2015 Commercial Space Launch Competitiveness Act, the multilateral OST does not address whether nongovernmental operators can claim territory and own resources they extract from celestial bodies. Thus, the OST promotes the peaceful and shared exploration of space and scientific research but does little to address the differences between research operations and new commercial opportunities like lunar mining. This leaves few options to resolve conflicts that may arise between competing private-sector entities or States.
Even if domestic authorization of mining operations were sufficient, a second challenge has emerged: ensuring transparency and recordkeeping across different operations to maintain peaceful shared use of space. Through the OST and the Registration Convention, States have agreed to inform the U.N. Secretary-General of space activities and to maintain a record of registered space objects (including a unique identifier, the location and date of launch, and the orbital path). But this registration process covers space objects only at a geospatial position in orbit; it has gaps for space objects intended for use on the surface of celestial bodies and does not address whether a spacecraft designed for one purpose (e.g., landing) can be repurposed for another (e.g., mining). This leaves little recourse for any group that seeks to peacefully pursue mining operations on the moon’s surface if another entity also seeks to use that land.
In spite of these gaps, the U.S. government has been able to move forward with scaling up moon-related space missions via NASA’s bipartisan Artemis Program and the corresponding Artemis Accords (Figure 1), a set of bilateral agreements with updated principles for space use. The Accords have 24 signatories who collectively seek to reap the benefits of emerging space opportunities like mining. In part, the Artemis Accords aim to remedy the policy gaps of previous multilateral agreements like the OST by explicitly supporting private sector efforts to secure valuable resources like He-3 and water ice.
But the Accords do not address the key underlying challenges that could stifle U.S. innovation and leadership in space mining. For instance, while the Accords reaffirm the need to register space objects and propose the creation of safety zones surrounding lunar mining operations, gaps still remain in describing exactly how to register operations on celestial objects. This can be seen in Section 7 of the Artemis Accords, which states that space objects need to be registered, but does not specify what would classify as a “space object” or if an object registered for one purpose can be repurposed for other operations. Further, the Accords leave little room to address broader international tensions stemming from increased resource competition in space mining. While competition can have positive outcomes such as spurring rapid innovation, unchecked competition could escalate into military conflict, despite provisions in the original OST to avoid this.
In particular, preemptive measures must be taken to alleviate potential tensions with other OST signatories in direct competition with the Accords. China and Russia are not party to the Accords and therefore do not need to abide by the agreement. In fact, these nations have declared opposition to the Accords and instead formed their own partnership to establish a competing International Lunar Research Station. As these programs develop concrete lunar applications, designating methods to determine who can conduct what type of operations on specific timelines and in specific locations will be a crucial form of anticipatory diplomacy.
Plan of Action
The United States should propose that when any State registers a space object in advance of operations on a celestial body, it must specify the anticipated location of the operation; the timeline; and the type(s) of operation, described as “intent to” do one or more of the following: mine/extract resources for sale, conduct scientific research, or perform routine maintenance. This multilaterally developed process would clarify the means to register space objects for peaceful occupation of celestial object surfaces.
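To illustrate what such a registration record might contain, here is a hypothetical sketch; the field names, identifier format, and site values are invented for illustration and are not proposed treaty text:

```python
# Hypothetical sketch of an expanded registration record for surface operations;
# field names and enum values are illustrative, not proposed treaty language.
from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class OperationType(Enum):
    RESOURCE_EXTRACTION = "mine/extract resources for sale"
    SCIENTIFIC_RESEARCH = "conduct scientific research"
    ROUTINE_MAINTENANCE = "perform routine maintenance"

@dataclass
class SurfaceOperationRegistration:
    registering_state: str   # OST State Party responsible for the object
    object_designator: str   # unique identifier, as in the Registration Convention
    celestial_body: str      # e.g., "Moon"
    site_lat_deg: float      # anticipated location of the operation
    site_lon_deg: float
    start: date              # anticipated operational timeline
    end: date
    operations: list[OperationType] = field(default_factory=list)

entry = SurfaceOperationRegistration(
    registering_state="United States",
    object_designator="2028-001A",                # invented identifier
    celestial_body="Moon",
    site_lat_deg=-89.5, site_lon_deg=45.0,        # a hypothetical polar site
    start=date(2028, 3, 1), end=date(2033, 3, 1),
    operations=[OperationType.RESOURCE_EXTRACTION],
)
print(entry.operations[0].value)
```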
Additionally, the United States should propose the implementation of a process for States to resolve disputes through either bilateral negotiation or arbitration through another mutually agreed-upon third party such as the International Court of Justice (ICJ) or the Permanent Court of Arbitration (PCA). Similar disputes related to maritime resource extraction under the United Nations Law of the Sea have been resolved peacefully using the aforementioned bilateral negotiations or third party arbitration. The new dispute resolution process would similarly allow for peaceful resolution of competing claims on celestial body surfaces and resources.
To guide the creation of a space object arbitration process, other such processes like the ICJ, PCA, and International Tribunal of the Law of the Sea can be used as models. The PCA has had success with halting unfair processes and setting up a dialogue between participating parties. It has helped smaller countries set up arbitration processes with bigger ones, such as Ecuador vs. the United States, in which the Republic of Ecuador instituted arbitral proceedings against the United States concerning the interpretation and application of an investment treaty between the two countries. In the short term, existing negotiation avenues will likely be sufficient to allow for dispute resolution. However, as the space industry continues to grow, it may eventually be necessary to establish an internationally recognized “Space Court” to arbitrate disputes. The International Tribunal for the Law of the Sea provides an example of the type of international body that could arbitrate space disputes.
These anticipatory diplomacy steps could be implemented in one of three ways:
- As a binding amendment to the OST: This would require the most time to implement, but it would also be enforceable and binding, an obvious advantage. It would also provide an opportunity to bring all the important players to the table, specifically the parties who did not sign the Artemis Accords, and would help start a discussion on improving diplomatic relations for future space operations.
- As a nonbinding update to COPUOS Guidelines: This would be faster to implement, but would not be enforceable or binding.
- As an update to the COPUOS Guidelines followed by an amendment to the OST: This would allow for both quick action in the near term and a permanent, enforceable implementation over the longer term. Nonbinding updates to the COPUOS Guidelines could serve as a precursor that builds support: if the model proves successful, State Parties would be more likely to agree to a binding amendment to the OST.

However they are implemented, these two proposed anticipatory diplomacy steps would improve the ability of spacefaring nations to peacefully use resources on celestial bodies.
Could this be done through bilateral agreements? After all, the United States has shown diplomatic initiative by entering into agreements with countries such as France, Germany, and India aimed at using space for peaceful purposes and cooperation, though these agreements do not explicitly mention mining. But a bilateral process does not offer good prospects for global solutions. For one, it would be very slow and time-consuming for the United States to enter into bilateral agreements with every major country with stakes in lunar mining. If space mining agreements were to occur on a timeline similar to bilateral trade agreements, each agreement could take from one to six years to take effect. A crucial obstacle is the Wolf Amendment, which prevents the United States from entering into bilateral agreements with China, one of its major competitors in the space industry. This restriction makes it hard to negotiate bilaterally with an important stakeholder in space mining.
Further, reaching these agreements would require addressing aspects of the Accords that have made many major stakeholder countries hesitant to sign on. Thus, an easier path would be to operate diplomatically through the COPUOS, which already represents 95 major countries and oversees the existing multilateral space treaties and potential amendments to them. This approach would ensure that the United States still has some power over potential amendment language but would bring other major players into some sort of dialogue regarding the usage of space for commercial purposes.
While the COPUOS guidelines are not explicitly binding, they do provide a pathway for verification and arbitration, as well as a foundation for the adoption of a binding amendment or a new space treaty moving forward. Treaty negotiations are a slow, lengthy process; the OST required several years of work before it took full effect in 1967. With many Artemis Program goals reliant upon successful launches and milestones achieved by 2025, treaty amendments are not the timeliest approach. Delays could also be caused by the fact that some parties to the OST may have reservations about adopting an amendment for private sector space use due to another space treaty, the Moon Agreement. This agreement, which the United States is not party to, asserts that “the Moon and its natural resources are the common heritage of mankind and that an international regime should be established to govern the exploitation of such resources when such exploitation is about to become feasible.” Thus, countries that have signed the Moon Agreement probably want the moon to operate like a global commons with all countries on Earth having access to the fruits of lunar mining or other resource extraction. Negotiations with these nations will require time to complete.
The U.S. State Department’s Office of Space Affairs, under the Bureau of Oceans and International Environmental and Scientific Affairs (OES), is the lead office for space diplomacy, exploration, and commercialization and would be the ideal office to craft the required legislation for an OST amendment. Additionally, the Office of Treaty Affairs, which is often tasked with drafting the legal framework of treaties, could provide guidance on the legislation and help initiate the process within the U.S. State Department and the United Nations. Existing U.S. law like the Commercial Space Launch Competitiveness Act, and international treaties like the OST and the Registration Convention, provide authority for these proposals to be implemented in the short term. However, negotiating updates to the COPUOS Guidelines and amendments to the OST and other relevant space treaties over the next 5 to 10 years will be essential to their long-term success.
Finally, the Federal Aviation Administration (FAA) at the Department of Transportation would be the logical federal agency to initially lead implementation of the updated registration process for U.S.-affiliated space objects and verification of the location and intended use of space objects from other nations. The FAA implements the current U.S. process for space object registration. In the long term, it could be appropriate to transfer responsibility for space object registration to the rapidly growing Office of Space Commerce (OSC) at the Department of Commerce. Moving responsibility for space object registration and verification to the OSC would let the office expand alongside the rapidly expanding space industry, and would also allow the FAA to focus on its primary responsibility of regulating the domestic aerospace industry.
Conclusion
Douglas Adams may have put it best: “Space is big. You just won’t believe how vastly, hugely, mind-bogglingly big it is.” While Adams was describing the sheer size of space, this description applies just as well to the scale of outer space’s scientific and economic prospects. After all, any new economic theater that will grow into a multi-trillion dollar market in just a few decades is not to be taken lightly. But without a plan to avert and resolve potential conflicts with other outer space actors, the United States’ future efforts in this emerging theater will be hamstrung. Improved collaboration on space mining provides an opportunity to promote international cooperation and economic development, while military conflict in space poses high risks to the economic potential of the current and future space industry. Transparent and widely agreed-upon frameworks would allow for peaceful competition on scientific research and resource extraction on celestial objects.
Lunar mining has shown promise for providing access to water ice, rare earth metals, He-3, and other raw materials crucial for the further exploration of space. Providing a peaceful and secure source of these materials would build on the bipartisan Commercial Space Launch Competitiveness Act’s guidelines for space resource extraction and, in the long run, further enable the modernization and decarbonization of the U.S. electric grid for public benefit.
In order to promote the peaceful exploration and development of space, we must update existing international law—either the COPUOS Guidelines, the OST, or both—to clarify the locations, timeline, and types of outer space operations conducted by state actors. We must also propose deconfliction mechanisms for OST parties to resolve disputes peacefully via bilateral negotiation or arbitration by a mutually acceptable third party like the ICJ or PCA. Just as the United States led the world into the “final frontier” in the 20th century, so too must we lead the next chapter in the 21st. If implemented successfully, the anticipatory space diplomacy we propose will allow for the shared peaceful use of celestial bodies for decades to come.
Acknowledgments
Dr. Sindhu Nathan provided valuable insights into the writing of this memo.
There would be no additional cost to the recommendation beyond existing costs for diplomatic and U.N. activities. The Artemis Program is expected to cost $93 billion through 2025, and congressional appropriators are already questioning the billion-dollar price tag of each planned launch. Clarifying these legal frameworks may help incentivize private innovation and reduce launch costs, facilitating economic benefits at virtually no extra cost. The United States and Artemis Accords nations therefore have a vested interest in ensuring that these continuing investments result in successful missions with as few additional costs as possible. This proposal would also likely facilitate further private investment and innovation and protect that investment against the risk of military conflict.
A similar treaty, the Antarctic Treaty of 1959 (in force since 1961), is a great example of how different countries can unite and create a dialogue to effectively manage and share a common resource. Although the region is used for various scientific purposes, all countries can do so in a peaceful and cooperative manner. This is in part because the Antarctic Treaty has been systematically updated to reflect changing times, especially concerning the environment. The OST has not undergone any such changes. Thus, updating the COPUOS Guidelines would provide a means for the United States to take the lead in ensuring that space remains a common shared resource and that no country can unfairly claim a monopoly over it.
Nuclear fusion is not yet commercially viable. However, significant interest and investment are currently centered on this potential energy source, and breakthroughs in the technology have recently been reported by leading researchers in the field. Access to He-3 will be critical if and when this industry becomes commercially viable.
The OST currently allows States Parties to observe the space flights of, and access the equipment of, any other OST State Party. One way States could use this power to ensure these guidelines are followed is for States and the COPUOS to track how many and what types of space object operations occur on celestial bodies. (The U.S. Department of Defense already tracks over 26,000 outer space objects, but cross-referencing with COPUOS could help differentiate between debris and state objects of interest.) Interested or concerned parties could verify the accuracy of registered operations of space objects on celestial bodies led by other States, and any violations of the new guidelines could be referred to the new dispute resolution process.
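A minimal sketch of how this cross-referencing might work, assuming hypothetical registry records and identifiers (the field names and data below are illustrative, not drawn from the actual COPUOS registry or the DoD catalog):

```python
from dataclasses import dataclass

@dataclass
class RegisteredObject:
    """An entry a State Party files with the registry (illustrative fields only)."""
    object_id: str           # hypothetical identifier
    state_party: str
    declared_location: str   # e.g., "lunar south pole"
    declared_use: str        # e.g., "water-ice extraction"

# Hypothetical registry filings under the proposed guidelines
registry = [
    RegisteredObject("US-2025-001", "United States", "lunar south pole", "water-ice extraction"),
    RegisteredObject("CN-2024-017", "China", "Mare Imbrium", "scientific survey"),
]

# Hypothetical objects observed independently (e.g., by national tracking assets)
tracked_ids = {"US-2025-001", "CN-2024-017", "DEBRIS-88341"}

registered_ids = {obj.object_id for obj in registry}

# Tracked but never registered: candidates for a verification inquiry
print("Flag for verification:", sorted(tracked_ids - registered_ids))
# Registered but not observed: candidates for a status request
print("Request status update:", sorted(registered_ids - tracked_ids))
```

The point of the sketch is simply that, once operations are registered in a consistent format, discrepancies between declared and observed activity can be surfaced mechanically and referred to the dispute resolution process.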
In the United States, the Guidelines would be adopted in the form of an executive agreement, as many United Nations instruments are. Executive agreements are implemented directly by the president and do not require Senate advice and consent, but they are still legally binding.
The purpose of a neutral organization like the United Nations is to enable meaningful dialogue among powerful countries. Since space is a common shared resource, it is best to ensure that all parties have a stage to be part of talks that deal with the sharing of resources. Proposing guidelines under a widely adopted treaty is a good place to start, and the United States can show leadership by taking the first step while also advocating for terms that are beneficial to U.S. interests.
COPUOS member states meet every year to review the committee’s work, and countries submit statements to the chair of the committee. (See, for example, the United States’ statements from the committee’s 65th session in 2022.) Although there is no obvious precedent for a statement being converted directly into guidelines, a statement proposing the addition of guidelines would still be useful and could reasonably be expected to open doors for negotiations.
Arbitration processes such as those described in the U.N. Convention on the Law of the Sea ensure that powerful countries are not able to dominate smaller countries or coerce them with the threat of war. Although the verdict of the arbitration process would have to be enforced by OST States, it provides a peaceful alternative to immediate military conflict. Arbitration would at least pause disputed proceedings and give the States involved time to gather resources and support. The existence of an arbitration process would reinforce the principle that all OST States, both small and large, are entitled to equal access to space as a shared resource.
Increasing National Resilience through an Open Disaster Data Initiative
Summary
Federal, state, local, tribal, and territorial agencies collect and maintain a range of disaster resilience, vulnerability, and loss data. However, this valuable data lives on different platforms and in various formats across agency silos. Inconsistent data collection and lack of validation can result in gaps and inefficiencies and make it difficult to implement appropriate mitigation, preparedness, response, recovery, and adaptation measures for natural hazards, including wildfires, smoke, drought, extreme heat, flooding, and debris flow. Lack of complete data down to the granular level also makes it challenging to gauge the true cost of disasters.
The Biden-Harris Administration should launch an Open Disaster Data Initiative to mandate the development and implementation of national standards for disaster resilience, vulnerability, and loss data to enable federal, state, local, tribal, and territorial agencies to regularly collect, validate, share, and report on disaster data in consistent and interoperable formats.
Challenge and Opportunity
Disaster resilience, vulnerability, and loss data are used in many life-saving missions, including early detection and local response coordination, disaster risk assessments, local and state hazard mitigation plans, facilitating insurance and payouts, enabling rebuilding and recovery, and empowering diverse communities to adapt to climate impacts in inclusive, equitable, and just ways.
While a plethora of tools are being developed to enable better analytics and visualizations of disaster and climate data, including wildfire data, the quality and completeness of the data itself remains problematic, including in the recently released National Risk Index.
This is because there is a lack of agency mandates, funding, capacity, and infrastructure for data collection, validation, sharing, and reporting in consistent and interoperable formats. Currently, only a few federal agencies have the mandate and funds from Congress to collect disaster data relevant to their mission. Further, this data does not necessarily integrate state and local data for non-federally declared disasters.
Due to this lack of national disaster and climate data standards, federal and state agencies, universities, nonprofits, and insurers currently maintain disaster-related data in silos, making it difficult to link in productive and efficient ways down to the granular level.
Also, only a few local, state, and federal agencies regularly budget for or track spending on disaster resilience, vulnerability, and response activities. As a result, local agencies, nonprofits, and households, particularly in underserved communities, often lack access to critical lifesaving data. Further, disaster loss data is often private and proprietary, leading to inequality in data access and usability. This leaves already disadvantaged communities unprepared and with only a limited understanding of the financial burden of disaster costs carried by taxpayers.
Since the 1990s, several bipartisan reviews, research, data, and policy documents, including the recent President’s Council of Advisors on Science and Technology (PCAST) report on modernizing wildland firefighting, have reiterated the need to develop national standards for the consistent collection and reporting of disaster vulnerability, damage, and loss data. Some efforts are under way to address the standardization and data gaps—such as the all-hazards dataset that created an open database by refining the Incident Command System data sets (ICS-209).
However, significant work remains to integrate secondary and cascading disasters and to monitor longitudinal climate impacts, especially on disadvantaged communities. For example, the National Interagency Fire Center consolidates major wildfire events but does not currently track secondary or cascading impacts, including smoke (see AirNow’s Fire and Smoke Map), nor does it monitor societal vulnerabilities and impacts on public health, displacement, poverty, and insurance. There are also no standardized methods for accounting for and tracking damaged or lost structures; damage and loss data on structures, fatalities, community assets, and public infrastructure is not publicly available in any consolidated format.
The Open Disaster Data Initiative will enable longitudinal monitoring of pre- and post-event data for multiple hazards, resulting in a better understanding of cascading climate impacts. Guided by the Open Government Initiative (2016), the Fifth National Action Plan (2022), and in the context of the Year of Open Science (2023), the Open Disaster Data Initiative will lead to greater accountability in how federal, state, and local governments prioritize funding, especially to underserved and marginalized communities.
Finally, the Open Disaster Data Initiative will build on the Justice40 Initiative and be guided by the recommendations of the PCAST report on enhancing prediction and protecting communities. The Open Disaster Data Initiative should also reiterate the Government Accountability Office’s 2022 recommendation that Congress designate a federal entity to develop and update climate information and create a National Climate Information System.
Precedents
Recent disaster and wildfire research data platforms and standards provide some precedent and show how investing in data standards and interoperability can enable inclusive, equitable, and just disaster preparedness, response, and recovery outcomes.
The Open Disaster Data Initiative must build on lessons learned from past initiatives, including:
- the National Weather Service’s (NWS) Storm Events database, which collects meteorological data on when and where extreme events occur, along with occasional but unverified estimates of socioeconomic impacts.
- the Centers for Disease Control’s (CDC) COVID-19 Data Modernization Initiative, which attempts to harmonize data collection and reporting across national, tribal, state, and local agencies.
- the National Oceanic and Atmospheric Administration’s (NOAA) National Integrated Drought Information System (NIDIS), a multiagency partnership that coordinates drought monitoring, forecasting, planning, and information at national, tribal, state, and local levels but is impacted by inconsistent data reporting.
- the Federal Emergency Management Agency (FEMA)’s OpenFEMA initiative, which shares vast amounts of data on multiple aspects of disaster outcomes, including disaster assistance, hazard mitigation investments, the National Flood Insurance Program, and grants, but requires technical expertise to access and utilize the data effectively.
- FEMA’s National Risk Index, which maps the nation’s resilience, vulnerability, and disaster losses at county and census tract levels but shows shortcomings in capturing the risk of geophysical events such as earthquakes and tsunamis. In late 2022, Congress passed the Community Disaster Resilience Zones Act (P.L. 117-255), which codifies the National Risk Index. The goal is to support the census tracts with the highest risk rating with financial, technical, and other forms of assistance.
There are also important lessons to learn from international efforts such as the United Nations’ ongoing work on monitoring implementation of the Sendai Framework for Disaster Risk Reduction (2015–2030) by creating the next generation of disaster loss and damage databases, and the Open Geospatial Consortium’s Disaster Pilot 2023 and Climate Resilience Pilot, which seek to use standards to enable open and interoperable sharing of critical geospatial data across missions and scales.
Plan of Action
President Biden should launch an Open Disaster Data Initiative by implementing the following four actions.
Recommendation 1. Issue an Executive Order directing the development and adoption of national standards for disaster resilience, vulnerability, and loss data collection, validation, sharing, and reporting by all relevant federal, state, local, tribal, and territorial agencies, creating the enabling conditions for adoption by universities, nonprofits, and the private sector. The scope of this Executive Order should include data on local disasters that do not call for a Presidential Disaster Declaration and federal assistance.
Recommendation 2. Direct the Office of Management and Budget (OMB) to issue an Open Disaster Data Initiative Directive for all relevant federal agencies to collaboratively implement the following actions:
- Direct the National Science and Technology Council to appoint a subcommittee to work with the National Institute of Standards and Technology to develop national standards for disaster resilience, vulnerability, and loss data collection, sharing, and reporting by all relevant federal, state, local, tribal, and territorial agencies, as well as by universities, nonprofits, and the private sector.
- Direct all relevant federal agencies to adopt national standards for disaster resilience, vulnerability, and loss data collection, validation, sharing, and reporting to address ongoing issues concerning data quality, completeness, integration, interoperability, accessibility, and usability.
- Develop federal agency capacities to accurately collect, validate, and use disaster resilience, vulnerability, and loss data, especially as it relates to population estimates of mortality, morbidity, and displacements, including from extreme heat and wildfire smoke.
- Direct FEMA to coordinate and implement training for state, local, tribal, and territorial agencies on how to collect disaster resilience, vulnerability, and loss data in line with the proposed national standards. Further, building on the FEMA Data Strategy 2023–2027 and in line with OpenFEMA, FEMA should review its private and sensitive data sharing policy to ensure that disaster data is publicly available and usable. FEMA’s National Incident Management System will be well positioned to cut across hazard mission silos and offer wide-ranging operational support and training for disaster loss accounting to federal, state, local, tribal, and territorial agencies, as well as nonprofit stakeholders, in coordination with FEMA’s Emergency Management Institute.
Recommendation 3. Designate a lead coordinator for the Open Disaster Data Initiative within the Office of Science and Technology Policy (OSTP), such as the Tech Team, to work with the OMB on developing a road map for implementing the Open Disaster Data Initiative, including developing the appropriate capacities across all of government.
Recommendation 4. Direct FEMA to allocate appropriate funding and capacity for coordination with the National Weather Service (NWS), the U.S. Department of Agriculture’s Risk Management Agency, and the National Centers for Environmental Information (NCEI) to maintain a federated, open, integrated, and interoperable disaster data system that can seamlessly roll up local data, including research, nonprofit, and private data such as insurance data.
In addition, Congress should take the following three actions to ensure success of the Open Disaster Data Initiative:
Recommendation 5. Request the Government Accountability Office to undertake a Disaster Data Systems and Infrastructure Review to:
- Inform the development of national standards and identify barriers for accurate disaster data collection, validation, accounting, and sharing between federal, state, local, tribal, and territorial agencies, as well as the philanthropic and private sector.
- Review lessons learned from precedents (including NWS’s Storm Events database, CDC’s Data Modernization Initiative, NOAA’s NIDIS, and FEMA’s National Risk Index).
- Form the basis for the OMB and the OSTP to designate an appropriate budget and capacity commitment and suggest a national framework/architecture for implementing an Open Disaster Data Initiative.
Recommendation 6. Appropriate dedicated funding for the implementation of the Open Disaster Data Initiative to allow federal agencies, states, nonprofits, and the private sector to access regular trainings and develop the necessary infrastructure and capacities to adopt national disaster data standards and collect, validate, and share relevant data. This access to training should facilitate seamless roll-up of disaster vulnerability and loss data to the federal level, thereby enabling accurate monitoring and accounting of community resilience in inclusive and equitable ways.
Recommendation 7. Use the congressional tool of technical corrections to support and enhance the Initiative:
- Pass Technical Corrections and Improvements to the Community Disaster Resilience Zones Act to include provisions for the collection, sharing, and reporting of disaster resilience, vulnerability, and loss data by all relevant federal, state, local, tribal, and territorial agencies and academic, private, community-based, and nonprofit entities, in consistent and interoperable formats, and in line with the proposed national disaster data standards. Technical corrections language could point to the requirement to “review the underlying collection and aggregation methodologies as well as consider the scoping of additional data the agency(ies) may be collecting.” This language should also direct agencies to review dissemination procedures for propriety, availability, and access for public review. The scope of this technical correction should include local disasters that do not call for a Presidential Disaster Declaration and federal assistance.
- Pass a Technical Corrections and Improvements or suspension bill for the Disaster Recovery Reform Act. Section 1223 of that act mandates a study of information collection, and Section 1224 requires that the resulting data be published and accessible to the public; together, these provisions would complement the above by tasking FEMA to study, aggregate, and share information with the public in a way that is digestible and actionable.
Conclusion
The Open Disaster Data Initiative can help augment whole-of-nation disaster resilience in at least three ways:
- Enable enhanced data sharing and information coordination among federal, state, local, tribal, and territorial agencies, as well as with universities, nonprofits, philanthropies, and the private sector.
- Allow for longitudinal monitoring of compounding and cascading disaster impacts on community well-being and ecosystem health, including a better understanding of how disasters impact poverty rates, housing trends, local economic development, and displacement and migration trends, particularly among disadvantaged communities.
- Inform the prioritization of policy and program investments for inclusive, equitable, and just disaster risk reduction outcomes, especially in socially and historically marginalized communities, including rural communities.
Recent analysis by a federal interagency effort, Science for Disaster Reduction, shows that national-level databases significantly underreport disaster losses due to an overreliance on public sources and exclusion (or inaccessibility) of loss information from commercial as well as federal institutions that collect insured losses.
Also, past research has captured common weaknesses of national agency-led disaster loss databases, including:
- over- or underreporting of certain hazard types (hazard bias)
- gaps in historic records (temporal bias)
- overreliance on direct and/or monetized losses (accounting bias)
- focus on high-impact and/or acute events while ignoring the extensive impacts of slow disasters or highly localized cascading disasters (threshold bias)
- overrepresentation of densely populated and/or easily accessible areas (geography bias)
The National Weather Service’s Storm Events Database, the USDA Risk Management Agency’s Crop Data, and the CDC’s COVID-19 Data Modernization Initiative provide good templates for how to roll up data from the local to the federal level. However, it is important to recognize that past initiatives, such as NOAA’s NIDIS, have found it challenging to go beyond collecting standard metrics of immediate loss and damage to also capture data on impacts and outcomes. Further, disaster loss and damage data are not currently integrated with other datasets that may capture secondary and cascading effects, such as the injuries, morbidities, and mortalities captured in CDC data.
Defining new standards that expand the range of attributes to be collected in consistent and interoperable formats would allow for moving beyond hazard and geographic silos, allowing data to be open, accessible, and usable. In turn, this will require new capacity and operational commitments, including an exploration of artificial intelligence, machine learning, and distributed ledger systems (DLS) such as blockchain, to undertake expanded data collection, sharing, and reporting across missions and scales.
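To make this concrete, a standardized record might pair the direct, monetized losses that most databases already capture with the secondary and cascading attributes discussed above. The schema below is a hypothetical sketch, not an existing federal standard; every field name is an assumption chosen for illustration:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DisasterLossRecord:
    """Hypothetical interoperable record aligned with proposed national standards."""
    event_id: str                    # shared identifier usable across agencies
    hazard_type: str                 # controlled vocabulary, e.g., "wildfire", "extreme heat"
    census_tract: str                # granular geography, enabling equity analysis
    start_date: str                  # ISO 8601
    end_date: Optional[str] = None   # left open for slow-onset disasters
    # Direct, monetized losses (attributes most databases already capture)
    structures_damaged: int = 0
    structures_destroyed: int = 0
    insured_loss_usd: float = 0.0
    uninsured_loss_usd: float = 0.0
    # Secondary and cascading impacts (the gap the initiative targets)
    fatalities: int = 0
    households_displaced: int = 0
    smoke_related_er_visits: int = 0
    cascading_event_ids: list[str] = field(default_factory=list)  # e.g., post-fire debris flow
    data_source: str = ""            # provenance, supporting FAIR reuse
```

Shared identifiers, controlled vocabularies, and tract-level geography are what would let local records roll up to the federal level without manual reconciliation.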
Aligning with guidance provided in the OSTP’s recent Blueprint for an AI Bill of Rights and with several research-collective initiatives in recent years, the Open Disaster Data Initiative should seek to make disaster resilience, vulnerability, loss, and damage data FAIR (findable, accessible, interoperable, reusable) and usable in CARE-full ways (collective benefit, authority to control, responsibility, and ethics).
A technical corrections bill is a type of congressional legislation to correct or clarify errors, omissions, or inconsistencies in previously passed laws. Technical corrections bills are typically noncontroversial and receive bipartisan support, as their primary goal is to correct mistakes rather than to make substantive policy changes. Technical corrections bills can be introduced at any time during a congressional session and may be standalone bills or amendments to larger pieces of legislation. They are typically considered under expedited procedures, such as suspension of the rules in the House of Representatives, which allows for quick consideration and passage with a two-thirds majority vote. In the Senate, technical corrections bills may be considered under unanimous consent agreements or by unanimous consent request, which allows for passage without a formal vote if no senator objects. Sometimes more involved technical corrections or light policy adjustments happen during “vote-o-rama” in the Senate.
Technical corrections bills or reports play an important role in the legislative process, particularly during appropriations and budgeting, by helping to ensure the accuracy and consistency of proposed funding levels and programmatic changes. For example, during the appropriations process, technical corrections may be needed to correct funding levels or programmatic details that were inadvertently left out of the original bill. These technical changes can be made to ensure that funding is allocated to the intended programs or projects and that the language of the bill accurately reflects the intent of Congress.
Similarly, during the budgeting process, technical corrections may be needed to adjust estimates or projections based on new information or changes in circumstances that were not foreseen when the original budget was proposed. These technical changes can help to ensure that the budget accurately reflects the current economic and fiscal conditions and that funding priorities are aligned with the goals and priorities of Congress. For example, in 2021, Congress used a technical corrections bill to clarify budget allocations and program intent after Hurricane Ida to make recovery programs more efficient and help with overall disaster recovery program clarification. Similarly, in 2017, Congress relied on a technical corrections/suspension bill to clarify some confusing tax provisions related to previous legislation for relief from Hurricane Maria.
Strengthening the U.S. Biomanufacturing Sector Through Standardization
Summary
The advancement and commercialization of bioprocesses in the United States is hindered by a lack of suitable and available pilot-scale and manufacturing-scale facilities. This challenge stems in part from an inability, rooted in a lack of standardization and inadequate original design, to repurpose facilities that are no longer needed. Historically, most biomanufacturing facilities have been built with a single product in mind and with a focus on delivering a facility as cheaply and quickly as possible. While this might be the best approach for individual private companies, it is not the best approach for the bioeconomy as a whole. The Biden-Harris Administration should establish a program to standardize the construction of biomanufacturing facilities across the United States so that facilities can be repurposed for different products in the future.
Through government-incentivized standardization, better biomanufacturing facilities can be built that can be redeployed as needed to meet future market and governmental needs and ultimately solve our nation’s lack of biomanufacturing capacity. This program will help protect U.S. investment in the bioeconomy and accelerate the commercialization of biotechnology. Enforcement of existing construction standards and the establishment of new standards that are strictly adhered to through a series of incentivization programs will establish a world-leading biomanufacturing footprint that increases supply resilience for key products (vaccines, vitamins, nutritional ingredients, enzymes, renewable plastics), reduces reliance on foreign countries, and increases the number of domestic biomanufacturing jobs. Furthermore, improved availability of pilot-scale and manufacturing-scale facilities will accelerate growth in biotechnology across the United States.
This memo details a framework for developing and deploying the necessary standards to enable repurposing of biomanufacturing facilities in the future. A team of 10–12 experts led by the National Institute of Standards and Technology (NIST) should develop these standards. A government-sponsored incentivization program with an estimated cost of $50 million per year would then subsidize the building of new facilities and recognize participating companies.
Challenge and Opportunity
Currently, the United States faces a shortage in both pilot-scale and manufacturing-scale biomanufacturing facilities that severely hinders product development and commercialization. This challenge is particularly large for the fermentation industry, where new facilities take years to build and require hundreds of millions of dollars in infrastructure investment. Many companies rely on costly foreign assets to advance their technology or delay their commercialization for years as they wait for access to one of the limited contract pilot or manufacturing facilities in the United States.
Why do we have such a shortage of these facilities? Because numerous facilities have been shut down due to changing market conditions, failed product launches, or bankruptcy. When those facilities were ultimately abandoned and dismantled for scrap, the opportunity to repurpose expensive infrastructure was lost along with them.
Most U.S. biomanufacturing facilities are built to produce a specific product, making it difficult to repurpose them for alternative products. Due to strict financing and tight timelines for commercialization, companies often build the minimally viable facility, ultimately resulting in a facility with niche characteristics specific to their process and a low likelihood of being repurposed. When the facility is no longer needed for its original purpose—due to changes in market demand or financial challenges—it is very unlikely to be purchased by another organization.
This challenge is not unique to the biomanufacturing industry. Even in the highly established automotive industry, fewer than half of manufacturing facilities are repurposed. The rate of repurposing biomanufacturing facilities is much lower, given the lower level of standardization. Furthermore, nearly 30% of currently running biomanufacturing facilities have some idle capacity that could be repurposed. This is disappointing considering that many of these biomanufacturing facilities have similar upstream operations: a seed bioreactor (a small bioreactor whose contents serve as inoculum for a larger vessel) initiates fermentation, followed by a production reactor and then harvest tanks. Downstream processing operations are less similar across facilities and typically represent far less than half the capital required to build a new facility.
The United States has been a hot spot for biotech investment, with many startups and many commercial successes. We also have a robust supply of corn dextrose (a critical input for most industrial fermentation), reasonable energy costs, and the engineering infrastructure to build world-class biomanufacturing facilities, providing advantages over many foreign locations. Our existing biomanufacturing footprint is already substantial, with hundreds of biomanufacturing facilities across the country at a variety of scales, but the design of these facilities lacks the standardization needed to meet the current and future needs of our biomanufacturing industry. There have been some success stories of facilities being repurposed, such as the one used by Gevo for the production of bio-butanol in Minnesota or the Freedom Pines facility in Georgia repurposed by LanzaTech.
However, there are numerous stories of facilities that could not be repurposed, such as the INEOS facility that was shuttered in Indian River County, Florida. Repurposing these facilities is challenging for two primary reasons:
- A lack of forethought that the facility could be repurposed in the future (i.e., no space for additional equipment, equipment that is difficult to modify, materials of construction without a broad range of process compatibility).
- A lack of standardization in the detailed design (materials of construction, valve arrangements, pipe sloping, etc.) that prevents processes with higher aseptic requirements (lower contamination rates) from being implemented.
In order to increase the rate at which our biomanufacturing facilities are repurposed, we need to establish the policies and programs to make all new biomanufacturing facilities sustainable, more reliable, and capable of meeting the future needs of the industry. These policies and associated standards will establish a minimum set of guidelines for construction materials, sterilizability, cleanability, unit operation isolation, mixing, aeration, and process material handling that will enable a broad range of compatibility across many bioprocesses. As a specific example, all fermentors, bioreactors, and harvest tanks should be constructed of at least 316L-grade stainless steel to ensure that the vast majority of fermentation and cell culture broths can be housed in these vessels without material compatibility concerns. Unfortunately, many of the U.S. biomanufacturing facilities in operation today were constructed with 304-grade stainless steel, which is incompatible with high-salt or high-chloride broths. Furthermore, all process equipment containing living microorganisms should be designed to aseptic standards, even if the current product is not required to be axenic (absent of foreign microorganisms).
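A schematic compliance check illustrates how such minimum guidelines could be applied in practice. The two rules below paraphrase the examples in this memo (316L vessels and aseptic design for equipment holding live cultures); the function and data structures are illustrative assumptions, not actual ASME BPE criteria:

```python
from dataclasses import dataclass

@dataclass
class Vessel:
    name: str
    material: str         # e.g., "316L" or "304" stainless steel
    aseptic_design: bool  # sterilizable-in-place, sanitary welds, etc.
    holds_live_culture: bool

def check_repurposing_standard(v: Vessel) -> list[str]:
    """Return a list of (illustrative) violations of the proposed standard."""
    issues = []
    if v.material != "316L":
        issues.append(f"{v.name}: {v.material} steel limits broth compatibility "
                      "(e.g., high-chloride broths); standard calls for 316L minimum.")
    if v.holds_live_culture and not v.aseptic_design:
        issues.append(f"{v.name}: equipment holding live microorganisms must meet "
                      "aseptic design even if the current product need not be axenic.")
    return issues

fermentor = Vessel("production fermentor", "304", aseptic_design=True, holds_live_culture=True)
harvest = Vessel("harvest tank", "316L", aseptic_design=False, holds_live_culture=True)

for vessel in (fermentor, harvest):
    for issue in check_repurposing_standard(vessel):
        print(issue)
```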
These standards should focus on upstream equipment (fermentors, media preparation tanks, sterilization systems), which are fairly universal across the food, pharma, and industrial biotech industries. While there are some opportunities to apply these standards to downstream process equipment, the downstream unit operations required to manufacture different biotech products vary significantly, making it more challenging to repurpose equipment.
Fortunately, guiding principles covering most factors that need to be addressed have already been developed by experts through the American Society of Mechanical Engineers’ (ASME) Bioprocess Equipment (BPE) standard and the International Society for Pharmaceutical Engineering (ISPE). These standards cover the gamut of biomanufacturing specifications: piping, mixing, valves, construction materials, and, in some cases, the design of specific unit operations. Companies are often forced to decide between following best practices in facility design and meeting tight timelines and budgets.
Following these standards increases the capital costs of the associated equipment by 20% to 30% and can extend construction timelines, which deters companies from adopting the standards even though adherence directly improves their top and bottom lines through better process reliability. Our biggest gap today is not the ability to standardize but rather the incentive to standardize. If the government provides incentives to adopt these standards, many companies will participate, as it is widely recognized that these standards result in facilities that are more reliable and more flexible for future products.
The National Institute of Standards and Technology (NIST) should initiate a program focused on biomanufacturing standards. The proposed program could be housed in or coordinated out of a new office at NIST—for example, as described in the previously proposed “Bio for America Program Office (BAPO)”—which should collaborate closely with the Office of the Secretary of Commerce and the Under Secretary of Commerce for Standards and Technology, as well as additional government and nongovernmental stakeholders as appropriate. NIST is the appropriate choice because it harbors cross-disciplinary expertise in engineering and the physical, information, chemical, and biological sciences; is a nonregulatory agency of the U.S. Department of Commerce, whose mission is “to drive U.S. economic competitiveness, strengthen domestic industry, and spur the growth of quality jobs in all communities across the country”; and is a neutral convener for industry consortia, standards development organizations, federal labs, universities, public workshops, and interlaboratory comparability testing.
Plan of Action
The Biden-Harris Administration should sponsor an initiative to incentivize the standardization that will enable the repurposing of biomanufacturing facilities, resulting in a more integrated and seamless bioeconomy. To do so, Congress should appropriate funds for a program focused on biomanufacturing standards at NIST. This program should:
- Develop a set of design and construction standards that enable facilities to be efficiently repurposed for different products in the future.
- Create, in collaboration with other government agencies, an incentivization program to encourage participation.
- Recognize participating companies with a certification.
- Track the program’s impact by measuring the rate of facility repurposing long-term.
First, the program will need to be funded by Congress and stood up within NIST. Award amounts will vary based on facility size, but it is estimated that each participating company will receive $6 million on average, leading to a total program cost in the range of $30 million to $50 million per year. While the costs might seem high, the investment carries reduced risk by design, since facilities built under the program are better equipped to be repurposed should the original company abandon them.
Next, design and building standards would be defined that ensure the highest chance of redeployment along with reliable operation. While relevant standards exist (e.g., the ASME BPE standards), they should be refined and elaborated, with the purpose of promoting repurposing, by an expert panel established by NIST. The adoption rate of the existing nonmandatory standards is low, particularly outside of the pharma industry. This new NIST program should establish a panel of experts, including industry and government representatives, to fully develop and publish these standards. A panel of 10–12 members could develop these standards in one year’s time. Thereafter, the panel could be reassembled regularly to review and update the standards as needed.
Once the standards are published, NIST should launch (and manage) a corresponding incentivization program to attract participation. The program should be designed so that roughly 50% of the incremental cost of adhering to the standards is offset: the improved infrastructure established by following the standards would not be fully subsidized, but it would be subsidized at a rate of 50%. The NIST program could oversee applicants’ adherence to the new standards and provide awards as appropriate. NIST should also work with other federal agencies that support development of biomanufacturing capacity (e.g., Department of Energy [DOE], Department of Defense [DoD], and Department of Agriculture [USDA]) to explore financial incentives and funding requirements that support adherence to the standards.
In addition, the government should recognize facilities built to the new standards with a certification that could strengthen business through customer confidence in supply reliability and overall performance. NIST would publish a list of certified facilities annually and seek opportunities to publicly recognize companies that participate broadly. Furthermore, this type of certification could become a prerequisite for receiving biomanufacturing-related funding from other government organizations (e.g., DOE, DoD, USDA).
Last, to measure the program’s success, NIST should track the rate of redeployment of participating facilities. The success rate of redeployment of facilities not participating in the program should also be tracked as a baseline. After 10 years, at least a twofold improvement in redeployment rate would be expected. If this does not occur, the program should be reevaluated and an investigation should be conducted to understand why the participating facilities were not redeployed. If needed, the existing biomanufacturing standards should be adjusted.
Conclusion
Given the large gap in biomanufacturing assets needed to meet our future needs across the United States, it is of paramount importance for the federal government to act soon to standardize our biomanufacturing facilities. This standardization will enable repurposing and will build a stronger bioeconomy. By establishing a program that standardizes the design and construction of biomanufacturing facilities across the country, we can ensure that facilities are built to meet the industry’s long-term needs—securing the supply of critical products and reducing our reliance on foreign countries for biomanufacturing needs. In the long run, it will also spur biotech innovation, since startup companies will need to invest less in biomanufacturing due to the improved availability of manufacturing assets.
A committee will need to be established to create a detailed budget plan; however, rough estimates are as follows. A typical biomanufacturing facility costs between $100 million and $400 million to build, depending on scale and complexity. If the program is designed to support five biomanufacturing facilities per year, and we further assume an average construction cost of $200 million with $40 million of that being equipment covered by the new standard, a 15% subsidy would result in ~$6 million being awarded to each participating facility. If we assume that following these standards increases the cost of the associated equipment by 30%, the net increase in cost would be from $40 million to $52 million. The 15% subsidy is thus designed to offset the cost of applying the new standards at roughly 50 cents on the dollar. In addition, there will be some overhead costs to run the program at NIST, but these are expected to be small. Thus, the new program would cost in the range of $30 million to $50 million per year, depending on how many companies participate and are awarded on an annual basis.
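The arithmetic behind these estimates can be made explicit. The short calculation below simply restates the assumptions of the paragraph above:

```python
# Rough program cost estimate, restating the memo's stated assumptions
facilities_per_year = 5
standard_equipment = 40e6   # equipment covered by the new standard, per facility
cost_uplift = 0.30          # standards raise covered equipment cost by ~30%
subsidy_rate = 0.15         # subsidy as a share of covered equipment cost

incremental_cost = standard_equipment * cost_uplift     # $12M extra per facility
award_per_facility = standard_equipment * subsidy_rate  # $6M per facility

print(f"Incremental cost per facility: ${incremental_cost / 1e6:.0f}M")
print(f"Award per facility: ${award_per_facility / 1e6:.0f}M "
      f"({award_per_facility / incremental_cost:.0%} of the increment)")
print(f"Annual awards at {facilities_per_year} facilities: "
      f"${facilities_per_year * award_per_facility / 1e6:.0f}M")
```

Five facilities per year at ~$6 million each yields the $30 million lower bound; higher participation and program overhead push the total toward $50 million.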
When they apply for funding, companies will describe the facility to be built and how the funds will be used to make it more flexible for future use. A NIST panel of subject matter experts will evaluate and prioritize nominations, with an emphasis on selecting facilities across different manufacturing sectors: food, pharma, and industrial biotech.
Given that the life of biomanufacturing facilities is on the order of years, it is expected that this program will take several years before a true impact is observed. For this reason, the program evaluation is placed 10 years after launch, by which time it is expected that more than 20 facilities will have participated in the program, and at least a few will have been repurposed during that time.
Keeping the standards general across industries enables repurposing of facilities across different industries. The fact that different standards exist across industries, and are present in some industries but not others, is part of the current challenge in redeploying facilities.
The initial focus is on standardization within the United States. Eventually, standardization on a more global scale can be pursued, which will make it easier for the United States to leverage facilities internationally. However, international standardization presents a whole new set of challenges due to differences in equipment availability and materials of construction.
Increasing Access to Capital by Expanding SBA’s Secondary Market Capacity
Summary
Entrepreneurship and innovation are crucial for sustained community development, as new ventures create new jobs and wealth. As entrepreneurs start and grow their companies, access to capital is a significant barrier. Communities nationwide have responded by initiating programs, policies, and practices to help entrepreneurs creatively leverage philanthropic dollars, government grants and loans, and private capital. But these individually promising solutions collectively amount to a national patchwork of support. Those who seek to scale promising ideas face a funding continuum that is filled with gaps, replete with high transaction costs, and highly variable depending on each entrepreneur’s circumstances.
To help entrepreneurs better and more reliably access capital no matter where in the country they are, the Small Business Administration (SBA) should work with the other Interagency Community Investment Committee (ICIC) agencies to expand the SBA’s secondary market capacity. The SBA’s secondary market allows lenders to sell the guaranteed portion of a loan backed by the SBA. This provides additional liquidity to lenders, which in turn expands the availability of commercial credit for small businesses. However, there is no large standardized secondary market for debt serviced by other federal agencies, so the benefits of a secondary market are limited to only a portion of federal lending programs that support entrepreneurship. Expanding SBA’s secondary market authority would increase access to large pools of private capital for a larger proportion of entrepreneurs and innovative small businesses.
As a first step towards this goal, one or several agencies should enter into a pilot partnership with SBA to use SBA’s existing administrative authority and infrastructure to enable private lenders to sell other forms of federally securitized loans. Once proven, the secondary market could be expanded further and permanently established as a government-sponsored enterprise (GSE). This GSE would provide accessible capital for entrepreneurs and small businesses in much the same way that the GSEs Fannie Mae and Freddie Mac provide accessible capital, as mortgages, for prospective homeowners.
With the 118th Congress considering the reauthorization of the SBA for the first time in 22 years, there is an opportunity to modernize the agency. Piloting expanded secondary market capacity is a crucial piece of that modernization, increasing access to capital for entrepreneurs.
Challenge and Opportunity
Access to capital changes the economic trajectory of individuals and communities. Approved small business loan applicants, for instance, report average income increases of more than 10% five years after loan approval. Unfortunately, capital for budding entrepreneurs is scarce and inequitably allocated. Some 83% of budding entrepreneurs never access adequate capital to start or grow their business. Success rates are even lower for demographic minorities. And when entrepreneurs can’t access capital to start their businesses, the communities around them suffer, as evidenced by the fact that two out of every three new jobs over the past 25 years have been generated by small businesses.
The vast majority of new businesses in the United States are funded by personal or family savings, business loans from banks or financial institutions, or personal credit cards. Venture capital is used by only 0.5% of entrepreneurs because most entrepreneurs’ businesses are not candidates for it. Public and mission-driven lending efforts are valiant but can’t come close to matching the scale of this untapped potential. Outside of the COVID-19 emergency response, the SBA receives annual appropriations of $1–2 billion for lending programs. The Urban Institute found that between 2011 and 2017, Chicago alone received $4 billion of mission-driven lending that predominantly went toward communities of color and high-poverty communities. But during the same period, Chicago also received over $67 billion of market investment—most of which flowed to white and affluent neighborhoods.
Communities across the country have sought to bridge this gap with innovative ideas to increase access to private capital, often by leveraging federal funding or federal programmatic infrastructure. For example:
- The Entrepreneur Backed Asset Fund is a philanthropically funded initiative that creates a secondary market for microloans made through Community Development Financial Institutions (CDFIs). The idea came from an experienced microlender who was frustrated with the illiquidity of microloans and the consequent need to constantly engage in time-consuming philanthropic fundraising.
- ESO Ventures is a nonprofit organization out of East Oakland, California, that combines a competency-based curriculum with access to capital. As entrepreneurs grow their skills through the program, they receive access to increasing lines of credit to apply those new skills to their businesses. ESO Ventures has grown rapidly by using both federal recovery grants funneled through the state of California and credit lines from private banks. The organization’s goal is to create 3,000 more businesses and generate $3 billion in economic activity by 2030.
- Network Kansas is an entrepreneurial support organization born out of the first version of the Treasury Department’s State Small Business Credit Initiative (SSBCI) program under the Obama Administration. The organization leverages a Kansas state tax credit to provide below-market debt to rural entrepreneurs in the most remote parts of Kansas. Since 2006, Network Kansas has deployed $500 million.
These example programs are successful, replicable, and already supported by some of the agencies in the ICIC. They use traditional, well-understood financial mechanisms to provide capital to entrepreneurs: credit lines, insurance, shared-equity agreements, tax credits, and low-interest debt. The biggest obstacle to scaling these types of programs is financial: they must first raise money to support their core financial mechanism(s), and their dependence on ad hoc fundraising almost inevitably yields uneven results.
There is a clear rationale for federal intervention to improve capital access for entrepreneurship-support programs. Successful investment in marginalized communities serves the public interest by generating positive externalities such as increases in jobs, wealth, and ownership. Government can grow these externalities manyfold by reducing risk for investors and reducing the cost of capital to entrepreneurs through the expansion of SBA’s secondary market authority and ultimate creation of a GSE to create permanence, increased accountability, and further flexibility of capital access. With SBA reauthorization on the legislative docket, this is a prime opportunity to address the core challenge of capital access for entrepreneurs.
Plan of Action
The federal government should create standardized, straightforward mechanisms for entrepreneurs and small businesses across the country to tap into vast pools of private capital at scale. A first step is launching an administrative pilot that extends the SBA’s current secondary market capacity to interested agencies in the ICIC. An initial pilot partner could be the Department of the Treasury, which could use the partnership to recapitalize its Community Development Financial Institutions (CDFI) Fund. If the pilot proves successful, the secondary market could be expanded further and permanently established as a government-sponsored enterprise.
Recommendation 1. Establish an administrative pilot.
The SBA’s secondary market can already serve debt and debt-like instruments for small businesses and community development. The SBA currently underwrites, guarantees, securitizes, and sells pools of 7(a) and 504 loans, unsecured SBA loans in Development Company Participation Certificates, and Small Business Investment Company Debentures. Much as Federal Housing Administration and Veterans Affairs home loans offer guaranteed debt to homeowners, these programs offer guaranteed debt for entrepreneurs. However, there is no large standardized secondary market for such debt that extends across agencies.
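To make the mechanism concrete: in the 7(a) program the SBA typically guarantees 75–85% of each loan, and a lender can sell that guaranteed portion on the secondary market, freeing capital to originate new loans. The sketch below is illustrative only; the loan amounts, the flat 75% guarantee, and the sale premium are assumptions rather than program terms:

```python
# Illustrative secondary-market mechanics for federally guaranteed small business loans
loans = [250_000, 400_000, 150_000]  # hypothetical 7(a) loan principals
guarantee_rate = 0.75                # assumed flat guarantee for simplicity
sale_premium = 0.05                  # assumed premium investors pay for guaranteed paper

guaranteed_pool = sum(principal * guarantee_rate for principal in loans)
sale_proceeds = guaranteed_pool * (1 + sale_premium)  # liquidity returned to the lender
retained_exposure = sum(loans) - guaranteed_pool      # risk the lender keeps

print(f"Pool of guaranteed portions: ${guaranteed_pool:,.0f}")
print(f"Proceeds available to re-lend: ${sale_proceeds:,.0f}")
print(f"Unguaranteed exposure retained: ${retained_exposure:,.0f}")
```

Extending this machinery to other agencies’ guaranteed debt is, mechanically, a matter of standardizing the underwriting and pooling rules on which such sales depend.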
An interagency memorandum of understanding between interested ICIC agencies could quickly open up the SBA’s secondary market infrastructure to other forms of small business debt. This would allow the ICIC to explore, with limited risk, the extent to which an expanded secondary market for federally securitized debt products enables entrepreneurs and small businesses to more easily access low-cost capital. Examples of other forms of small business lending provided by ICIC agencies include Department of Agriculture Rural Business Development Grants, Department of Housing and Urban Development Community Development Block Grants, and the Treasury Small Business Lending Fund, among others.
An ideal initial pilot partner among ICIC agencies would be the Treasury, which could pilot a secondary market approach to recapitalizing its CDFI Fund. This fund allocates capital via debenture to CDFIs so they can make personal, mortgage, and commercial loans in low-income and underserved communities. The fund is recapitalized annually through the federal budget process. A partnership with the SBA to create a secondary market for the CDFI Fund would effectively double the federal support available to CDFIs that leverage the fund.
It is important to note that while SBA can create pilot intergovernmental agreements to extend its secondary market infrastructure, broader or permanent extension of secondary market authority may require congressional approval.
Recommendation 2. Create a government-sponsored enterprise (GSE).
Upon successful completion of the administrative pilot, the ICIC should explore creating a GSE that decreases the cost of capital for entrepreneurs and small businesses and expands capital access for underserved communities. This separate entity would be a more independent body than an expanded secondary market created through SBA’s existing infrastructure. Benefits of creating a GSE include providing more flexibility and allowing the agency to function more independently and with greater authority while being subject to more rigorous reporting and oversight requirements to ensure accountability.
After the 2008 housing-market crash and subsequent recession, the concept of a GSE was criticized and reforms were proposed. There is no doubt that GSEs made mistakes in the housing market, but they also helped standardize and grow the mortgage market that now serves 65% of American households. The federal government will need to implement thoughtful, innovative governance structures to realize the benefits that a GSE could offer entrepreneurs and small businesses while avoiding repeating the mistakes that the mortgage-focused GSEs Fannie Mae and Freddie Mac made.
One potential ownership structure is the Perpetual Purpose Trust (PPT). PPTs work by separating the ownership right of governance from the ownership right of financial return and giving them to different parties. The best-known example of a PPT to date is likely the one established by Yvon Chouinard to take over his family’s ownership interest in Patagonia. In a PPT, trustees—organized in a Trust Steward Committee (TSC)—are bound by a fiduciary duty to maintain focus on the stated purpose of the trust. None of the interests within the TSC are entitled to financial return; rather, the rights to financial return are held in a separate entity (the Corporate Trustee) that does not possess governance rights. This structure, which is backed by a Trust Enforcer, ensures that the TSC cannot force the company to do something that is good for profits but bad for purpose.
Emulating this basic structure for a capital-focused GSE could circumvent the moral hazard that plagued the mortgage-focused GSEs. The roles of TSC, Trust Enforcer, and Corporate Trustee in a federal context could be filled as follows (a structural sketch in code follows the list):
- Trust Steward Committee: The TSC owes a fiduciary duty to the PPT and typically makes strategic decisions to pursue the PPT’s purpose. For a capital-focused GSE, the TSC’s role would be limited to governance. The TSC would be populated by a mix of stakeholders, such as entrepreneurs, community developers, investors, and support organizations.
- Trust Enforcer: The Trust Enforcer is an independent entity that rules on whether the TSC is sufficiently pursuing the PPT’s stated purpose. For a capital-focused GSE, this role could be filled by federal agency staff. The trust that holds the governance rights serves as the vehicle for the regulatory body to oversee the GSE.
- Corporate Trustee: The entity that holds the financial interest of the GSE could be sold to investors, granted to employees, or given away to relevant nonprofit and other stakeholder groups.
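The separation of rights described above can be made concrete with a short schematic sketch. The class and method names below are hypothetical illustrations of the governance logic, not a legal or operational specification.

```python
# Minimal sketch of the separation of rights in a Perpetual Purpose Trust
# as applied to a capital-focused GSE. Names are illustrative placeholders.

from dataclasses import dataclass

@dataclass
class Decision:
    description: str
    serves_purpose: bool    # judged against the trust's stated purpose
    expected_profit: float

class TrustEnforcer:
    """Independent check that governance stays aligned with the purpose."""
    def approve(self, decision: Decision) -> bool:
        return decision.serves_purpose

class TrustStewardCommittee:
    """Holds governance rights only -- no claim on financial returns."""
    def __init__(self, enforcer: TrustEnforcer):
        self.enforcer = enforcer

    def decide(self, decision: Decision) -> bool:
        # Even a highly profitable decision fails if it betrays the purpose.
        return self.enforcer.approve(decision)

class CorporateTrustee:
    """Holds financial rights only -- receives returns, casts no votes."""
    def __init__(self):
        self.distributions: list[float] = []

    def receive(self, amount: float) -> None:
        self.distributions.append(amount)

# A profit-maximizing but purpose-violating decision is blocked:
tsc = TrustStewardCommittee(TrustEnforcer())
print(tsc.decide(Decision("drop underserved-market lending",
                          serves_purpose=False, expected_profit=5e8)))  # False
```

The key property the sketch encodes is that no party holds both governance rights and financial rights, which is the structural answer to the moral hazard noted above.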
Conclusion
The ICIC agencies support many creative solutions that blend private and public dollars to increase entrepreneurship and community development. Yet the federal government stops short of providing the most important benefits: standardization and scale. The ICIC agencies should therefore create an entity that unlocks standardization and scale for the programs they help create, with the overall goals of:
- Unifying small-business and community-development underwriting
- Broadening private-actor participation in small-business loan origination by creating the risk standards that allow for greater liquidity
- Opening the capital markets to lower the cost of capital for entrepreneurs
A first step towards accomplishing these goals is to establish an administrative pilot, by which interested ICIC agencies would use the SBA’s existing authority and infrastructure to create a secondary market for their securitized debt instruments.
If the pilot proves successful, the next step is to expand the secondary market and establish it for the long term through a GSE modeled on those that have effectively supported the mortgage industry, but with a creative structure that proactively addresses the GSE weaknesses unveiled by the 2008 housing-market crash. The result would be a stable, permanent institution that enables all communities to realize the benefits of robust entrepreneurship by ensuring that budding entrepreneurs and small-business owners across the country can easily tap into the capital they need to get started.
Precedents for this type of federal intervention can be found in the mortgage industry. Homeownership is a major driver of wealth creation. The federal government supports homeownership through mortgage guarantees by federal agencies like the Federal Housing Administration and the Department of Veterans Affairs. In addition, the federal government increases liquidity in the mortgage industry by enabling insured mortgages and market-rate mortgages to be securitized, sold, and purchased on secondary markets through government-sponsored enterprises (GSEs) like Fannie Mae and Freddie Mac, or wholly owned agencies like Ginnie Mae. These structures have created a reliable stream of capital to originate loans for homeownership and have lowered the cost of borrowing.
The mortgage GSEs are engaging in innovation to increase access to housing credit. Fannie Mae, for example, is taking a number of steps to extend credit and homeownership to historically disadvantaged communities, including by using documented rental payments to help individuals build their credit scores and using special-purpose credit programs to develop new solutions for down payment assistance, underwriting, and credit enhancement. These changes will have an outsize effect on the mortgage industry because of the central role a GSE like Fannie Mae plays in connecting private markets to potential homeowners.
COVID-19 relief efforts provide an application of this model specific to small businesses. The California Rebuilding Fund (CARF) was a private credit fund for small businesses capitalized with a mixture of state, federal, philanthropic, and private investment. The CARF used government debt guarantees to push down the cost of capital to the Community Development Financial Institutions that were best positioned to originate and serve the small businesses most negatively impacted by COVID-19.
The CARF proved that a coherent and routinized process for accessing private capital can lower interest rates, expand credit for small businesses, and create operational efficiencies for entrepreneurial support organizations. For instance, a single application site matches potential borrowers to potential lenders. The keys to the CARF's success were its guarantee from the state of California and the fact that it provided a relatively uniform offering to different investors along a spectrum of return profiles.
To begin, the new entity should securitize or purchase securities backed only by government-guaranteed loans. Even during the worst of the housing crash, government-guaranteed mortgage-backed securities were more stable than non-agency securities. Starting with guaranteed loans would allow the new entity to provide explicit guarantees to guarantee-sensitive investors. However, a gradual expansion into new mechanisms, innovative underwriting, and perhaps non-agency debt should be a goal.
The guarantee of the loans should be explicit but should stand behind the borrower's equity and the agency guarantee in absorbing losses.
Any privileges extended to the new entity, such as exemption from securities registration or from state and local taxation, that result in a measurable decrease in the cost of lending should be passed on to the final borrower as much as possible.
Assuming that the regulatory body, acting as a fiduciary of the trust, can implement policies that take into account demographics like race, ethnicity, and country of origin, the GSE should use special-purpose credit programs to address racial inequalities in access to capital.
The authorizing statute for the SBA secondary market requires the lender to remain obligated to the SBA if it securitizes and sells the underlying loan on a secondary market. To preserve that obligation, the SBA requires the lender to keep a percentage of the loan on its books for servicing. This is an operational hurdle to securitizing loans: either the market must become robust enough to justify the operational expense, or there should be another mechanism by which the lender remains obligated to the SBA.
The SBA recently announced a change in the interest rates that lenders can charge for 7(a) loans. While it is understandable that the SBA does not want the guarantee to inflate lenders' profit margins, the tradeoff is that some entrepreneurs will go without capital because lenders cannot justify the risk at the formulated interest rate. The governing regulation, 13 CFR § 120.213, merely requires that interest rates be reasonable. This should give the SBA room to experiment with how it delivers low-cost capital to borrowers. For example, if the rate cap were removed for some loans, could the SBA require that the excess yield be used to push down the cost of borrowing for other loans?
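One way to reason about that question is a simple cross-subsidy model. The rates, volumes, and pooling rule below are hypothetical assumptions for illustration, not SBA policy or actual 7(a) terms.

```python
# Hypothetical cross-subsidy sketch: excess yield on loans priced above the
# current cap is pooled and used to buy down rates on other loans.
# All rates and volumes are illustrative assumptions.

cap = 0.115          # assumed current maximum allowable rate
loans = [
    # (principal, risk-based market rate a lender would need)
    (250_000, 0.14),  # riskier loan: unfundable under the cap today
    (250_000, 0.09),  # safer loan: fundable well below the cap
]

# If the cap is lifted, the riskier loan is made at 14%; its yield above
# the cap (14% - 11.5% on $250k = $6,250/yr) becomes the subsidy pool.
subsidy = sum(p * (r - cap) for p, r in loans if r > cap)

# The pool buys down the safer borrower's rate.
p_safe, r_safe = loans[1]
buydown = subsidy / p_safe
print(f"Safer borrower pays {r_safe - buydown:.2%} instead of {r_safe:.2%}")
# -> 6.50% instead of 9.00%, while the riskier borrower gains access at all.
```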
The Interagency Community Investment Committee (ICIC) focuses on the operations and execution of federal programs that facilitate the flow of capital and the provision of financial resources into historically underserved communities, including communities of color, rural communities, and Tribal nations. The ICIC is composed of representatives from the Department of the Treasury, the Small Business Administration, the Department of Commerce, the Department of Transportation, the Department of Housing and Urban Development, and the Department of Agriculture.
Building a National Network of Composite Pipes to Reduce Greenhouse Gas Emissions
Summary
65,000 miles of pipeline: that's the distance that may be necessary to achieve economy-wide net-zero emissions by 2050, according to a Princeton University study. The United States is on the verge of constructing a vast network of pipelines to transport hydrogen and carbon dioxide, incentivized by the Infrastructure Investment and Jobs Act and the Inflation Reduction Act. Yet the lifecycle emissions generated by a typical steel pipeline are 27.35 kg carbon dioxide eq per foot, meaning 65,000 miles of steel pipeline would produce nearly 9.4 million metric tons of carbon dioxide eq (equal to the annual emissions of over 2 million passenger cars) from the pipeline infrastructure alone.
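The arithmetic behind those figures is a straightforward back-of-the-envelope check. The 4.6-metric-ton-per-car conversion is the EPA's estimate for a typical passenger vehicle and is our assumption here.

```python
# Back-of-the-envelope check of the summary figures.
FT_PER_MILE = 5280
miles = 65_000
steel_kg_per_ft = 27.35      # lifecycle emissions of steel pipe, kg CO2e/ft

total_kg = miles * FT_PER_MILE * steel_kg_per_ft
total_mt = total_kg / 1_000  # kg -> metric tons

CAR_T_PER_YEAR = 4.6         # EPA estimate, t CO2e per passenger car per year
print(f"{total_mt/1e6:.1f} million metric tons CO2e")          # ~9.4
print(f"= {total_mt/CAR_T_PER_YEAR/1e6:.1f} million cars/yr")  # ~2.0
```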
Pipelines made from composite materials offer one pathway to lowering emissions. Composite pipe is composed of multiple layers of different materials, typically a thermoplastic polymer as the primary structural layer with reinforcing materials such as fibers or particulate fillers to increase strength and stiffness. Some types have lifecycle emissions roughly one-quarter those of typical steel pipeline (about 75% lower; see the analysis below). Depending on the application, composite pipelines can be safer and less expensive. However, the process under the Pipeline and Hazardous Materials Safety Administration (PHMSA) to issue permits for composite pipe takes longer than for steel, and for hydrogen and supercritical carbon dioxide, the industry lacks regulatory standards altogether. Reauthorization of the Protecting Our Infrastructure of Pipelines and Enhancing Safety (PIPES) Act offers an excellent opportunity to review the policies concerning new, less emissive pipeline technologies.
Challenge and Opportunity
Challenge
The United States is on the verge of a clean energy construction boom, expanding far beyond wind and solar energy to include infrastructure that utilizes hydrogen and carbon capture. The pump has been primed with $21 billion for demonstration projects or “hubs” in the Infrastructure Investment and Jobs Act and reinforced with another $7 billion for demonstration projects and at least $369 billion in tax credits in the Inflation Reduction Act. Congress recognized that pipelines are a critical component and provided $2.1 billion in loans and grants under the Carbon Dioxide Transportation Infrastructure Finance and Innovation Act (CIFIA).
The United States is crisscrossed by pipelines. Approximately 3.3 million miles of predominantly steel pipelines convey trillions of cubic feet of natural gas and hundreds of billions of ton-miles of liquid petroleum products each year. By comparison, only about 5,000 miles transport carbon dioxide, and just 1,600 miles are dedicated to hydrogen. Research suggests the existing pipeline network is nowhere near what is needed. According to Net Zero America, approximately 65,000 miles of pipeline will be needed to transport captured carbon dioxide to achieve economy-wide net-zero emissions in the United States by 2050. The study also identifies a need for several thousand miles of pipelines to transport hydrogen within each region.
Making pipes out of steel is a carbon-intensive process, and steel manufacturing in general accounts for seven to nine percent of global greenhouse gas emissions. There are ongoing efforts to lower emissions generated from steel (i.e., “green steel”) by being more energy efficient, capturing and storing emitted carbon dioxide, recycling scrap steel combined with renewable energy, and using low-emissions hydrogen. However, cost is a significant challenge with many of these mitigation strategies. The estimated cost of transitioning global steel assets to net-zero compatible technologies by 2050 is $200 billion, in addition to a baseline average of $31 billion annually to simply meet growing demand.
Opportunity
Given the vast network of pipelines required to achieve a net-zero future, expanding use of composite pipe provides a significant opportunity for the United States to lower carbon emissions. Composite materials are highly resistant to corrosion, weigh less, are more flexible, and have improved flow capacity. This means that pipelines made from composite materials have a longer service life and require less maintenance than steel pipelines. Composite pipe can be four times faster to install, requires one-third the labor, and has significantly lower operating costs. The use of composite pipe is expected to continue to grow as technological advancements make these materials more reliable and cost-effective.
Use of composite pipe is also expanding as industry seeks to improve its sustainability. We performed a lifecycle analysis on thermoplastic pipe, which is made by a process called extrusion that involves melting a thermoplastic material, such as high-density polyethylene or polyvinyl chloride, and then forcing it through a die to create a continuous tube. The tube can then be cut to the desired length and fittings can be attached to the ends to create a complete pipeline. We found that the lifecycle emissions from thermoplastic pipe were 6.83 kg carbon dioxide eq/ft and approximately 75% lower than an equivalent length of steel pipe, which has lifecycle emissions of 27.35 kg carbon dioxide eq/ft.
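Those per-foot figures imply both the roughly 75% reduction and a large absolute savings at the scale contemplated in the summary. A quick check follows; extrapolating to the full 65,000-mile network is our illustrative assumption.

```python
steel = 27.35          # kg CO2e per ft, from the lifecycle analysis above
thermoplastic = 6.83   # kg CO2e per ft

reduction = 1 - thermoplastic / steel
print(f"{reduction:.0%} lower per foot")   # ~75%

ft = 65_000 * 5280     # the 65,000-mile network from the Princeton study
saved_mt = (steel - thermoplastic) * ft / 1_000
print(f"~{saved_mt/1e6:.1f} million metric tons CO2e avoided")  # ~7.0
```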
These estimates do not include potential differences in leaks. Specifically, composite pipe has a continuous structure that allows for the production of longer pipe sections, resulting in fewer joints and welds. In contrast, metallic pipes are often manufactured in shorter sections due to limitations in the manufacturing process. This means that more joints and welds are required to connect the sections together, which can increase the risk of leaks or other issues. Further, approximately half of the steel pipelines in the United States are over 50 years old, increasing the potential for leaks and maintenance costs. Another advantage of composite pipe is that it can be pulled through steel pipelines, thereby repurposing aging steel pipelines to transport different materials while also reducing the need for new rights of way and associated permits.
Despite the advantages of using composite materials, standards have not yet been developed to allow for safe permitting of composite pipe to transport supercritical carbon dioxide and hydrogen. At the federal level, pipeline safety is administered by the Department of Transportation's Pipeline and Hazardous Materials Safety Administration (PHMSA). To ensure safe transportation of energy and other hazardous materials, PHMSA establishes national policy, sets and enforces standards, educates, and conducts research to prevent incidents. There are regulatory standards for transporting supercritical carbon dioxide in steel pipe. However, there are no standards for composite pipe to transport either hydrogen or carbon dioxide in a supercritical liquid, gas, or subcritical liquid state.
Repurposing existing infrastructure is critical because the siting of pipelines, regardless of type, is often challenging. Whereas natural gas pipelines and some oil pipelines can invoke eminent domain provisions under federal law such as the Natural Gas Act or Interstate Commerce Act, no such federal authorities exist for hydrogen and carbon dioxide pipelines. In some states, specific statutes address eminent domain for carbon dioxide pipelines. These laws typically establish the procedures for initiating eminent domain proceedings, determining the amount of compensation to be paid to property owners, and resolving disputes related to eminent domain. However, current efforts are under way in states such as Iowa to restrict use of state authorities to grant eminent domain to pending carbon dioxide pipelines. The challenges with eminent domain underscore the opportunity provided by technologies that allow for the repurposing of existing pipeline to transport carbon dioxide and hydrogen.
Plan of Action
How can we build a vast network of carbon dioxide and hydrogen pipelines while also using lower emissive materials?
Recommendation 1. Develop safety standards to transport hydrogen and supercritical carbon dioxide using composite pipe.
PHMSA, industry, and interested stakeholders should work together to develop safety standards for transporting hydrogen and supercritical carbon dioxide using composite pipe. Without standards, there is no pathway to permit use of composite pipe. This collaboration could occur within the context of PHMSA's recent announcement that it will update its standards for transporting carbon dioxide, in response to a 2020 pipeline failure in Satartia, MS.
Ideally, the permits could be issued using PHMSA's normal process rather than as special permits (e.g., 49 CFR § 195.8). It takes several years to develop standards, so it is critical to launch the standard-setting process now so that composite pipe can be used in Department of Energy-funded hydrogen hubs and carbon capture demonstration projects.
Europe is ahead of the United States in this regard, as the classification company DNV is currently undertaking a joint industry project to review the cost and risk of using thermoplastic pipe to transport hydrogen. This work will inform regulators in the European Union, who are currently revising standards for hydrogen infrastructure. The European Clean Hydrogen Alliance recently adopted a “Roadmap on Hydrogen Standardization” that expressly recommends setting standards for non-metallic pipes. To the extent practicable, it would benefit export markets for U.S. products if the standards were similar.
Recommendation 2. Streamline the permitting process to retrofit steel pipelines.
Congress should streamline the retrofitting of steel pipes by enacting a legislative categorical exclusion under the National Environmental Policy Act (NEPA). NEPA requires federal agencies to evaluate actions that may have a significant effect on the environment. Categorical exclusions (CEs) are categories of actions that have been determined to have no significant environmental impact and therefore do not require an environmental assessment (EA) or an environmental impact statement (EIS) before they can proceed. CEs can be processed within a few days, thereby expediting the review of eligible actions.
The CE process allows federal agencies to avoid the time and expense of preparing an EA or EIS for actions that are unlikely to have significant environmental effects. CEs are often established through agency rulemaking but can also be created by Congress as a “legislative CE.” Examples include minor construction activities, routine maintenance and repair activities, land transfers, and research and data collection. However, even if an action falls within a CE category, the agency must still conduct a review to ensure that there are no extraordinary circumstances that would warrant further analysis.
Given the urgency to deploy clean technology infrastructure, Congress should authorize federal agencies to apply a categorical exclusion where steel pipe is retrofitted using composite pipe. In such situations, the project is using an existing pipeline right-of-way, and there should be few, if any, additional environmental impacts. Should there be any extraordinary circumstances, such as substantial changes in the risk of environmental effects, federal agencies would be able to evaluate the project under an EA or EIS. A CE does not obviate the review of safety standards and other applicable, substantive laws, but simply right-sizes the procedural analysis under NEPA.
Recommendation 3. Explore opportunities to improve the policy framework for composite pipe during reauthorization of the PIPES Act.
Both of the aforementioned ideas should be considered as Congress initiates its reauthorization of the Protecting Our Infrastructure of Pipelines and Enhancing Safety (PIPES) Act of 2020. Among other improvements to pipeline safety, the PIPES Act reauthorized PHMSA through FY2023. As Congress begins work on its next reauthorization bill for PHMSA, it is the perfect time to review the state of the industry, including the potential for composite pipe to accelerate the energy transition.
Recommendation 4. Consider the embedded emissions of construction materials when funding demonstration projects.
The Office of Clean Energy Demonstrations should consider the embedded emissions of construction materials when evaluating projects for funding. Applicants that have a plan to consider embedded emissions of construction materials could receive additional weight in the selection process.
Recommendation 5. Support research and development of composite materials.
Composite materials offer advantages in many other applications, not just pipelines. The Office of Energy Efficiency and Renewable Energy (EERE) should support research to further enhance the properties of composite pipe while improving lifecycle emissions. In addition to ongoing efforts to lower the emissions intensity of steel and concrete, EERE should support innovation in alternative, composite materials for pipelines and other applications.
Conclusion
Recent legislation will spark construction of the next generation of clean energy infrastructure, and the funding also creates an opportunity to deploy construction materials with lower lifecycle greenhouse gas emissions. This is important because constructing vast networks of pipelines from emissions-intensive materials undercuts the goals of the legislation. However, the regulatory code remains an impediment by failing to provide a pathway for using composite materials. PHMSA and industry should commence discussions to create the requisite safety standards, and Congress should work with both industry and regulators to streamline the NEPA process for retrofitting steel pipelines. As America commences construction of hydrogen and carbon capture, utilization, and storage networks, reauthorization of the PIPES Act provides an excellent opportunity to significantly lower the associated emissions.
We compared two types of pipes: 4” API 5L X42 metallic pipe vs. 4” Baker Hughes non-metallic next generation thermoplastic flexible pipe. The analysis was conducted using FastLCA, a proprietary web application developed by Baker Hughes and certified by an independent reviewer to quantify carbon emissions from our products and services. The emission factors for the various materials and processes are based on the ecoinvent 3.5 database for global averages.
- The data for flexible pipe production are from the 2020 production year and represent transport, machine, and energy usage at Baker Hughes' manufacturing plant in Houston, TX.
- All raw material and energy inputs for flex pipes are taken directly from engineering and plant manufacturing data, as verified by engineering and manufacturing personnel, and represent actual usage to manufacture the flexible pipes.
- All of the data for metallic pipe production is from API 5L X42 schedule 80 pipe specifications and represent transport from Alabama and energy usage for production from global averages.
- All raw material and energy inputs for hot rolling steel are computed from ecoinvent 3.5 database emission factors. All relevant production steps and processes are modeled.
- All secondary processes are from the ecoinvent 3 database (version 3.5 compiled as of November 2018) as applied in SimaPro 9.0.0.30.
- Results are calculated using IPCC 2013 GWP 100a (IPCC AR5).
Similar to steel pipe, transporting hydrogen and carbon dioxide using composite pipe poses certain safety risks that must be carefully managed and mitigated:
- Hydrogen gas can diffuse into the composite material and cause embrittlement, which can lead to cracking and failure of the pipe.
- The composite material used in the pipe must be compatible with hydrogen and carbon dioxide. Incompatibility can cause degradation of the pipe due to permeation, leading to leaks or ruptures.
- Both hydrogen and carbon dioxide are typically transported at high pressure, which can increase the risk of pipe failure due to stress or fatigue.
- Carbon dioxide can be corrosive to certain metals, which can lead to corrosion of the pipe and eventual failure.
- Hydrogen is highly flammable and can ignite in the presence of an ignition source, such as a spark or heat.
To mitigate these safety risks, appropriate testing, inspection, and maintenance procedures must be put in place. Additionally, proper handling and transportation protocols should be followed, including strict adherence to pressure and temperature limits and precautions to prevent ignition sources. Finally, emergency response plans should be developed and implemented to address any incidents that may occur during transportation.
API Specification 15S, Spoolable Reinforced Plastic Line Pipe, covers the use of flexible composite pipe in onshore applications. The standard does not address transport of carbon dioxide and has not been incorporated into PHMSA’s regulations.
API Specification 17J, Specification for Unbonded Flexible Pipe, covers the use of flexible composite pipe in offshore applications. Similar to 15S, it does not address transport of carbon dioxide and has not been incorporated into PHMSA’s regulations.
HDPE pipe, commonly used in applications such as water supply, drainage systems, gas pipelines, and industrial processes, has similar advantages to composite pipe in terms of flexibility, ease of installation, and low maintenance requirements. It can be assembled to create seamless joints, reducing the risk of leaks. It can also be used to retrofit steel pipes as a liner per API SPEC 15LE.
HDPE pipe has been approved by PHMSA to transport natural gas under 49 CFR Part 192. However, its typical operating pressures (e.g., 100 psi) are significantly lower than those of composite pipe. As with composite pipe, there are no standards for the transport of hydrogen and carbon dioxide, and HDPE pipe's lower pressure limits make it less suited for use in carbon capture and storage.
Addressing Online Harassment and Abuse through a Collaborative Digital Hub
Summary
Efforts to monitor and combat online harassment have fallen short due to a lack of cooperation and information-sharing across stakeholders, disproportionately hurting women, people of color, and LGBTQ+ individuals. We propose that the White House Task Force to Address Online Harassment and Abuse convene government actors, civil society organizations, and industry representatives to create an Anti-Online Harassment (AOH) Hub to improve and standardize responses to online harassment and to provide evidence-based recommendations to the Task Force. This Hub will include a data-collection mechanism for research and analysis while also connecting survivors with social media companies, law enforcement, legal support, and other necessary resources. This approach will open pathways for survivors to better access the support and recourse they need and also create standardized record-keeping mechanisms that can provide evidence for and enable long-term policy change.
Challenge and Opportunity
The online world is rife with hate and harassment, disproportionately hurting women, people of color, and LGBTQ+ individuals. A research study by Pew indicated that 47% of women were harassed online for their gender compared to 18% of men, while 54% of Black or Hispanic internet users faced race-based harassment online compared to 17% of White users. Seven in 10 LGBTQ+ adults have experienced online harassment, and 51% faced even more severe forms of abuse. Meanwhile, existing measures to combat online harassment continue to fall short, leaving victims with limited means for recourse or protection.
Numerous factors contribute to these shortcomings. Social media companies are opaque, and when survivors turn to platforms for assistance, they are often met with automated responses and few means to appeal or even contact a human representative who could provide more personalized assistance. Many survivors of harassment face threats that escalate from online to real life, leading them to seek help from law enforcement. While most states have laws against cyberbullying, law enforcement agencies are often ill-trained and ill-equipped to navigate the complex web of laws involved and the available processes through which they could provide assistance. And while there are nongovernmental organizations and companies that develop tools and provide services for survivors of online harassment, the onus continues to lie primarily on the survivor to reach out and navigate what is often both an overwhelming and a traumatic landscape of needs. Although resources exist, finding the correct organizations and reaching out can be difficult and time-consuming. Most often, the burden remains on the victims to manage and monitor their own online presence and safety.
On a larger, systemic scale, the lack of available data to quantitatively analyze the scope and extent of online harassment hinders the ability of researchers and interested stakeholders to develop effective, long-term solutions and to hold social media companies accountable. The lack of large-scale, cross-sector, and cross-platform data further hinders efforts to map the exact scale of the issue and to provide evidence-based arguments for changes in policy. And because the landscape of online abuse is constantly evolving, the lexicons and phrases used in attacks change as well, so up-to-date information is essential.
Forming the AOH Hub will improve the collection and monitoring of online harassment while preserving victims’ privacy; this data can also be used to develop future interventions and regulations. In addition, the Hub will streamline the process of receiving aid for those targeted by online harassment.
Plan of Action
Aim of proposal
The White House Task Force to Address Online Harassment and Abuse should form an Anti-Online Harassment Hub to monitor and combat online harassment. This Hub will center around a database that collects and indexes incidents of online harassment and abuse from technology companies’ self-reporting, through connections civil society groups have with survivors of harassment, and from reporting conducted by the general public and by targets of online abuse. Civil society actors that have conducted past work in providing resources and monitoring harassment incidents, ranging from academics to researchers to nonprofits, will run the AOH Hub in consortium as a steering committee. There are two aims for the creation of this hub.
First, the AOH Hub can promote collaboration within and across sectors, forging bonds among government, the technology sector, civil society, and the general public. This collaboration enables the centralization of connections and resources and brings together diverse resources and expertise to address a multifaceted problem.
Second, the Hub will include a data collection mechanism that can be used to create a record for policy and other structural reform. At present, the lack of data limits the ability of external actors to evaluate whether social media companies have worked adequately to combat harmful behavior on their platforms. An external data collection mechanism enables further accountability and can build the record for Congress and the Federal Trade Commission to take action where social media companies fall short. The allocated federal funding will be used to (1) facilitate the initial convening of experts across government departments and nonprofit organizations; (2) provide support for the engineering structure required to launch the Hub and database; (3) support the steering committee of civil society actors that will maintain this service; and (4) create training units for law enforcement officials on supporting survivors of online harassment.
Recommendation 1. Create a committee for governmental departments.
Survivors of online harassment struggle to find recourse, failed by legal technicalities in a patchwork of laws across states and by untrained law enforcement. The root of the problem is an outdated understanding of the implications and scale of online harassment and a lack of coordination across branches of government on who should handle online harassment and how to properly address such occurrences. A crucial first step is to examine and address these existing gaps. The Task Force should form a long-term committee of members from governmental departments whose work pertains to online harassment. This would include one person from each of the following organizations, nominated by senior staff:
- Department of Homeland Security
- Department of Justice
- Federal Bureau of Investigation
- Department of Health and Human Services
- Office on Violence Against Women
- Federal Trade Commission
This committee will be responsible for outlining shortcomings in the existing system and detailing the kind of information needed to fill those gaps. The committee will then outline a framework clearly establishing the recourse options available to harassment victims and the kinds of data collection required to prove a case of harassment. The framework should be completed within the first six months after the committee has been convened. After that, the committee will convene twice a year to determine how well the framework is working and, in the long term, implement reforms and updates to current laws and processes to increase the success rates of victims seeking assistance from governmental agencies.
Recommendation 2: Establish a committee for civil society organizations.
The Task Force shall also convene civil society organizations to help form the AOH Hub steering committee and gather a centralized set of resources. Victims will be able to access a centralized hotline and information page, and Hub personnel will then triage reports and direct victims to resources most helpful for their particular situation. This should reduce the burden on those who are targets of harassment campaigns to find the appropriate organizations that can help address their issues by matching incidents to appropriate resources.
To create the AOH Hub, members of the Task Force can map out civil society stakeholders in the space and solicit applications to achieve comprehensive and equitable representation across sectors. Relevant organizations include (but are not limited to) those working on:
- Combating domestic violence and intimate partner violence
- Addressing technology-facilitated gender-based violence (TF-GBV)
- Developing online tools for survivors of harassment to protect themselves
- Conducting policy work to improve policies on harassment
- Providing mental health support for survivors of harassment
- Providing pro bono or other forms of legal assistance for survivors of harassment
- Connecting tech company representatives with survivors of harassment
- Researching methods to address online harassment and abuse
The Task Force will convene an initial meeting, during which core members will be selected to create an advisory board, act as a liaison across members, and conduct hiring for the personnel needed to redirect victims to needed services. Other secondary members will take part in collaboratively mapping out and sharing available resources, in order to understand where efforts overlap and complement each other. These resources will be consolidated, reviewed, and published as a public database of resources within a year of the group’s formation.
For secondary members, their primary obligation will be to connect with victims who have been recommended to their services. Core members, meanwhile, will meet quarterly to evaluate gaps in services and assistance provided and examine what more needs to be done to continue growing the robustness of services and aid provided.
Recommendation 3: Convene committee for industry.
After its formation, the AOH steering committee will be responsible for conducting outreach with industry partners to identify a designated team from each company best equipped to address issues pertaining to online abuse. After the first year of formation, the industry committee will provide operational reporting on existing measures within each company to address online harassment and examine gaps in existing approaches. Committee dialogue should also aim to create standardized responses to harassment incidents across industry actors and understandings of how to best uphold community guidelines and terms of service. This reporting will also create a framework for standardized best practices for data collection, in terms of the information collected on flagged cases of online harassment.
On a day-to-day basis, industry teams will be available resources for the hub, and cases can be redirected to these teams to provide person-to-person support for handling cases of harassment that require a personalized level of assistance and scale. This committee will aim to increase transparency regarding the reporting process and improve equity in responses to online harassment.
Recommendation 4: Gather committees to provide long-term recommendations for policy change.
On a yearly basis, representatives across the three committees will convene and share insights on existing measures and takeaways. These recommendations will be given to the Task Force and other relevant stakeholders, and will also be accessible to the general public. Three years after the formation of these committees, the groups will publish a report centralizing feedback and takeaways from all committees and providing recommendations for improvement moving forward.
Recommendation 5: Create a data-collection mechanism and standard reporting procedures.
The database will be run and maintained by the steering committee with support from the U.S. Digital Service, with funding from the Task Force for its initial development. The data collection mechanism will be informed by the frameworks provided by the committees that compose the Hub to create a trauma-informed and victim-centered framework surrounding the collection, protection, and use of the contained data. The database will be periodically reviewed by the steering committee to ensure that the nature and scope of data collection is necessary and respects the privacy of those whose data it contains. Stakeholders can use this data to analyze and provide evidence of the scale and cross-cutting nature of online harassment and abuse. The database would be populated using a standardized reporting form containing (1) details of the incident; (2) basic demographic data of the victim; (3) platform/means through which the incident occurred; (4) whether it is part of a larger organized campaign; (5) current status of the incident (e.g., whether a message was taken down, an account was suspended, the report is still ongoing); (6) categorization within existing proposed taxonomies indicating the type of abuse. This standardization of data collection would allow advocates to build cases regarding structured campaigns of abuse with well-documented evidence, and the database will archive and collect data across incidents to ensure accountability even if the originals are lost or removed.
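To make the structure of such a record concrete, here is a minimal sketch of what the standardized reporting form could map to in a database. Field names, enum values, and statuses are hypothetical placeholders, not an adopted standard.

```python
# Hypothetical sketch of a standardized incident record for the AOH database.
# Field names, enums, and statuses are illustrative placeholders only.

from dataclasses import dataclass
from enum import Enum
from typing import Optional

class AbuseType(Enum):           # (6) taxonomy category, illustrative values
    THREAT = "threat"
    DOXXING = "doxxing"
    HATE_SPEECH = "hate_speech"
    IMPERSONATION = "impersonation"

class IncidentStatus(Enum):      # (5) current status of the incident
    REPORT_PENDING = "report_pending"
    CONTENT_REMOVED = "content_removed"
    ACCOUNT_SUSPENDED = "account_suspended"

@dataclass
class IncidentReport:
    details: str                          # (1) what happened
    victim_demographics: dict[str, str]   # (2) basic, minimal demographic data
    platform: str                         # (3) where it occurred
    organized_campaign: bool              # (4) part of a coordinated campaign?
    status: IncidentStatus                # (5)
    abuse_type: AbuseType                 # (6)
    evidence_archive_id: Optional[str] = None  # archived copy, preserved even
                                               # if the original is deleted

report = IncidentReport(
    details="Repeated threatening replies over 48 hours",
    victim_demographics={"gender": "woman"},
    platform="ExampleSocial",
    organized_campaign=True,
    status=IncidentStatus.REPORT_PENDING,
    abuse_type=AbuseType.THREAT,
)
```

Standardizing the record at this level is what allows reports from platforms, survivors, and bystanders to be aggregated into evidence of structured campaigns.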
The reporting form will be available online through the AOH Hub. Anyone with evidence of online harassment will be able to contribute to the database, including but not limited to victims of abuse, bystanders, researchers, civil society organizations, and platforms. To protect the privacy and safety of targets of harassment, this data will not be publicly available. Access will be limited to: (1) members of the Hub and its committees; (2) affiliates of the aforementioned members; (3) researchers and other stakeholders, after submitting an application stating reasons to access the data, plans for data use, and plans for maintaining data privacy and security. Published reports using data from this database will be nonidentifiable, such as with statistics being published in aggregate, and not be able to be linked back to individuals without express consent.
This database is intended to provide data to inform the committees in and partners of the Hub of the existing landscape of technology-facilitated abuse and violence. The large-scale, cross-domain, and cross-platform nature of the data collected will allow for better understanding and analysis of trends that may not be clear when analyzing specific incidents, and provide evidence regarding disproportionate harms to particular communities (such as women, people of color, LGBTQ+ individuals). Resources permitting, the Hub could also survey those who have been impacted by online abuse and harassment to better understand the needs of victims and survivors. This data aims to provide evidence for and help inform the recommendations made from the committees to the Task Force for policy change and further interventions.
Recommendation 6: Improve law enforcement support.
Law enforcement is often ill-equipped to handle issues of technology-facilitated abuse and violence. To address this, Congress should allocate funding for the Hub to create training materials for law enforcement nationwide. These materials will be added to training manuals and modules to ensure that 911 operators and officers know how to handle cases of online harassment and how state and federal law can apply to a range of scenarios. As part of the training, operators will also be instructed to add records of 911 calls regarding online harassment to the Hub database, with the survivor's consent.
Conclusion
As technology-facilitated violence and abuse proliferate, we call for funding to create a steering committee in which experts and stakeholders from civil society, academia, industry, and government can collaborate on monitoring and regulating online harassment across sectors and incidents. The resulting Anti-Online Harassment Hub would maintain a data-collection mechanism accessible to researchers to better understand online harassment and provide accountability for social media platforms to address the issue. Finally, the Hub would provide accessible resources for targets of harassment in a fashion that reduces the burden on these individuals. Implementing these measures would create a safer online space where survivors can easily access the support they need, and would establish a basis for evidence-based, longer-term policy change.
Platform policies on hate and harassment differ in the redress and resolution they offer. Twitter's proactive removal of racist abuse directed at members of the England football team after the UEFA Euro 2020 Final shows that it is technically feasible for platforms to proactively detect and remove abusive content. However, this appears to happen only in high-profile situations or for well-known individuals. For the general public, the burden of dealing with abuse usually falls on the targets to report messages themselves, even as they are in the midst of receiving targeted harassment and threats. Indeed, the current processes for reporting incidents of harassment are often opaque and confusing. Once a report is made, targets of harassment have very little control over the resolution of the report or the speed at which it is addressed. Platforms also have different policies on whether and how a user is notified after a moderation decision is made. Many of these notifications are delivered through automated systems with no way to appeal, leaving users with limited means for recourse.
Recent years have seen an increase in efforts to combat online harassment. Most notably, in June 2022, Vice President Kamala Harris launched a new White House Task Force to Address Online Harassment and Abuse, co-chaired by the Gender Policy Council and the National Security Council. The Task Force aims to develop policy solutions to enhance accountability of perpetrators of online harm while expanding data collection efforts and increasing access to survivor-centered services. In March 2022, the Biden-Harris Administration also launched the Global Partnership for Action on Gender-Based Online Harassment and Abuse, alongside Australia, Denmark, South Korea, Sweden, and the United Kingdom. The partnership works to advance shared principles and attitudes toward online harassment, improve prevention and response measures to gender-based online harassment, and expand data and access on gender-based online harassment.
Many existing efforts focus on technical interventions, such as tools that increase individuals' digital safety, automatically blur out slurs, or allow trusted individuals to moderate abusive messages directed toward victims' accounts. There are also many guides that walk individuals through how to better manage their online presence or what to do in response to being targeted. Other organizations provide support for victims, including next steps, help with reporting, and information on better security practices. However, due to resource constraints, organizations may only be able to support specific types of targets, such as journalists, victims of intimate partner violence, or targets of gendered disinformation. This increases the burden on victims to find support for their specific needs. Academic institutions and researchers have also been developing tools and interventions that measure and address online abuse or improve content moderation. While collaborations between academics and civil society are increasing, gaps remain that prevent such interventions from being deployed to their full efficacy.
While complete privacy and security are extremely difficult to ensure in a technical sense, we envision a database design that preserves data privacy while maintaining usability. First, the fields of information required for filing an incident report would minimize the amount of personally identifiable information collected. As some data can be crowdsourced from the public and external observers, that part of the dataset would consist of existing public data. Nonpublic data would be entered only by individuals who are sharing incidents that target them (e.g., direct messages), and those individuals would be able to choose whether the data is visible in the database or only reflected in summary statistics. Furthermore, the data collection methods and the database structure will be periodically reviewed by the steering committee of civil society organizations, which will make recommendations for improvement as needed.
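As one illustration of the aggregate-only publication model just described, here is a sketch of a query layer that returns only summary statistics and suppresses small groups. The opt-in visibility flag and the minimum-group-size threshold are our hypothetical design choices, not specified requirements.

```python
# Hypothetical sketch: publishing only aggregate statistics from the database,
# with small-group suppression so counts cannot identify individuals.
# The visibility flag and k=5 threshold are illustrative design choices.

from collections import Counter

K_ANONYMITY_MIN = 5   # suppress any group smaller than this

records = [
    # (platform, reporter_opted_into_row_level_visibility)
    ("ExampleSocial", False),
    ("ExampleSocial", True),
    ("OtherNet", False),
    # ... thousands more in practice
]

def platform_counts(rows):
    """Aggregate incident counts by platform, suppressing small cells."""
    counts = Counter(platform for platform, _ in rows)
    return {p: n for p, n in counts.items() if n >= K_ANONYMITY_MIN}

def visible_rows(rows):
    """Row-level access only where the reporting individual opted in."""
    return [r for r in rows if r[1]]

print(platform_counts(records))  # {} here -- all cells fall below threshold
print(visible_rows(records))     # only opt-in records are shown row-level
```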
Data collection and reporting can be conducted internationally, since limiting data collection to the U.S. would undermine our goal of intersectionality. The hotline, however, will likely offer more comprehensive support for U.S.-based issues. In the long run, these efforts can be expanded internationally as a collaborative effort across governments.