Impacts of Extreme Heat on Federal Healthcare Spending
Public health insurance programs, especially Medicaid, Medicare, and the Children's Health Insurance Program (CHIP), are more likely to cover populations at increased risk from extreme heat, including low-income individuals, people with chronic illnesses, older adults, disabled adults, and children. When temperatures rise to extremes, these populations are more likely to need care for heat-related or heat-exacerbated illnesses. Congress must prioritize addressing the heat-related financial impacts on these programs. To boost the resilience of these programs to extreme heat, Congress should incentivize prevention by enabling states to facilitate health-related social needs (HRSN) pilots that can reduce heat-related illnesses, continue to support screenings for the social drivers of health, and implement preparedness and resilience requirements into the Conditions of Participation (CoPs) and Conditions for Coverage (CfCs) of relevant programs.
Extreme Heat Increases Fiscal Impacts on Public Insurance Programs
Healthcare costs are a function of utilization, which has been rapidly rising since 2010. Extreme heat is driving up utilization as more Americans seek medical care for heat-related illnesses. Extreme heat events are estimated to be annually responsible for nearly 235,000 emergency department visits and more than 56,000 hospital admissions, adding approximately $1 billion to national healthcare costs.
Heat-driven increases in healthcare utilization are especially notable for public insurance programs. One recent study found that, on heat wave days, low-income populations eligible for both Medicaid and Medicare experience a 10% increase in heat-related emergency department visits and a 7% increase in hospitalizations. Further demonstrating the relationship between extreme heat and increased spending, the Congressional Budget Office found that for every 100,000 Medicare beneficiaries, extreme temperatures cause an additional 156 emergency department visits and $388,000 in spending per day on average. These higher utilization rates also drive increases in Medicaid transfer payments from the federal government to help states cover rising costs. For every 10 additional days of extreme heat above 90°F, annual Medicaid transfer payments increase by nearly 1%, equivalent to an $11.78 increase per capita.
Additionally, Medicaid funds services for over 60% of nursing home residents. Yet Medicaid reimbursement rates often fail to cover the actual cost of care, leaving many facilities operating at a financial loss. This can make it difficult for both short-term and long-term care facilities to invest in and maintain the cooling infrastructure necessary to comply with existing requirements to maintain safe indoor temperatures. Further, many short-term and long-term care facilities do not have the emergency power back-ups that can keep the air conditioning on during extreme weather events and power outages, nor do they have emergency plans for occupant evacuation in case of dangerous indoor temperatures. This can and does subject residents to deadly indoor temperatures that can worsen their overall health outcomes.
The Impacts of the One Big Beautiful Bill Act
The One Big Beautiful Bill Act (H.R. 1) will have consequential impacts on federally supported health insurance programs. The Congressional Budget Office projects that an estimated 10 million people could lose their healthcare coverage by 2034. Researchers have estimated that this loss of coverage could result in 50,000 preventable deaths. Further, health care facilities and hospitals will likely see funding losses as a result of Medicaid funding reductions. This will be especially burdensome for low-resourced hospitals, such as those serving rural areas, and could lead to reduced services for patients or even facility closures. States will need support navigating this new funding landscape while also identifying cost-effective measures and strategies to address the health-related impacts of extreme heat.
Advancing Solutions that Safeguard America’s Health from Extreme Heat
Even in this more constrained funding landscape, there are common-sense strategies to help people avoid extreme heat exposure. For example, access to safely cool indoor environments is one of the best preventive strategies against heat-related illness. In particular, Congress should create a demonstration pilot that provides eligible Medicare beneficiaries with cooling assistance and direct the Centers for Medicare & Medicaid Services (CMS) to encourage Section 1115 demonstration waivers for HRSN related to extreme heat. Section 1115 waivers have enabled states to finance pilots for life-saving cooling devices and air filter distributions. These HRSN financing pilots have helped several states work around the challenges of U.S. underinvestment in health and social services by providing a flexible vehicle to test methods of delivering and paying for healthcare services in Medicaid and CHIP. As members of Congress explore these policies, they should consider the impact of H.R. 1's new requirement that Section 1115 waivers demonstrate cost neutrality.
To further support heat interventions, Congress should direct CMS to continue Social Drivers of Health (SDOH) screenings as part of Quality Reporting Programs and to integrate questions about extreme heat exposure risks into the screening process. These screenings are critical for identifying the most vulnerable patients and directing them to the preventive services they need. This information will also be critical for identifying facilities that are treating high proportions of heat-vulnerable patients, which could then be sites for testing interventions like energy and housing assistance.
Congress should also direct CMS to integrate heat preparedness and resilience requirements and metrics into the Conditions of Participation (CoPs) and Conditions for Coverage (CfCs), such as through the Emergency Preparedness Rule. This could include assessing a health care facility's cooling capacity under extreme heat conditions, requiring backup power sufficient to maintain safe indoor temperatures, and establishing policies for resident evacuation in the event of high indoor temperatures. For safety net facilities, such as rural hospitals and federally qualified health centers, Congress should consider allocating resources for technical assistance to assess these risks and plan the necessary infrastructure upgrades.
Impacts of Extreme Heat on Agriculture
Agriculture, food, and related industries produce nearly 90% of the food consumed in the United States and contribute approximately $1.54 trillion to the national GDP. Given the agricultural sector’s importance to the national economy, food security, and public health, Congress must pay attention to the impacts of extreme heat. To boost the resilience of this sector, Congress should design strategic insurance solutions, enhance research and data, and protect farmworkers through on-farm adaptation measures.
Extreme Heat Reduces Farm Productivity and Profitability
Extreme heat threatens agricultural productivity by increasing crop damage, causing livestock illness and mortality, and worsening water scarcity. Hotter conditions can damage crops through sunburn and heat stress, reducing annual yields for farms by as much as 40%. Animals raised for meat, milk, and eggs also experience increased risks of heat stress and heat-related mortality. For dairy production in particular, an estimated 1% of total annual yield is lost to heat stress alone. Further straining agricultural productivity, extreme heat accelerates water scarcity by increasing water evaporation rates. These higher evaporation rates force farmers to use even more water, often drawing from already stressed water sources. The compounding pressures posed by extreme heat can translate into significant economic losses: a study of Kansas commodity farms found that for every 1°C (1.8°F) increase in temperature, net farm incomes drop by 66%. Together, this means reduced revenue for farms and less food available for people.
Insurance solutions can help mitigate these financial impacts from extreme heat if employed responsibly. Multiple permanently authorized federal programs provide insurance or direct payments to help producers recover losses from extreme heat, including the Federal Crop Insurance Program, the Noninsured Crop Disaster Assistance Program, the Livestock Indemnity Program, and the Emergency Assistance for Livestock, Honey Bees, and Farm-Raised Fish Program. These programs need to ensure that producers are adequately covered against heat-related impacts and incentivize practices that reduce the risk of heat-related damage. This in turn will reduce the fiscal exposure of federal farm risk management programs. Congress should call on the United States Department of Agriculture (USDA) to research the feasibility of incentivizing heat resilience through federal crop insurance rates. Congress should also consider insurance premium subsidies for producers who adopt practices that enhance heat resilience for crops and livestock.
Given the increasing stress of extreme heat on the water systems necessary to sustain agricultural production, the National Oceanic and Atmospheric Administration (NOAA) should build on its Weather, Water, and Climate Strategy and collaborate with USDA on a national water security strategy that accounts for current and future hotter temperatures. To enhance system-wide drought resilience, Congress can also appropriate funds to leverage existing USDA programs to support on-farm adoption of shade systems, effective water management, cover crops, and soil regeneration practices.
Finally, there are still notable knowledge gaps around extreme heat and its impacts on agriculture. These gaps include the long-term effects of higher temperatures on yields, farm input costs, and federal program spending. To address these information gaps and guide future research, Congress can direct the USDA Secretary to submit a report to Congress on the impacts of extreme heat on agriculture, farm input costs and losses, consumer prices, and the federal government’s spending (e.g., federal insurance and direct payment programs for losses of agricultural products and the provision of Supplemental Nutrition Assistance Program (SNAP) benefits).
Extreme Heat Lowers Agricultural Workers’ Productivity and Exposes Them to Health Risks
Higher temperatures and resulting heat stress are endangering farmer and farmworker safety and reducing their overall productivity, impacting bottom lines. Farmworkers are essential to the American food system, yet they are among the most vulnerable to extreme heat, facing a 35 times greater risk of dying from heat-related illnesses than workers in other sectors. This risk is intensifying as the sector increasingly relies on H‑2A farmworkers, who are hired to fill persistent domestic farm labor shortages. In many regions, over 25% of certified H‑2A farmworkers are required to work when local average temperatures exceed 90°F, and counties with the highest concentrations of H‑2A workers often coincide with the hottest parts of the country. After the work day, many of these workers return to substandard employer-provided housing that lacks essential cooling or ventilation, preventing effective recovery from daily heat exposure and exacerbating heat-related health risks. On top of the health risks, these conditions make people less effective on the job, which translates to economy-wide impacts: heat-related labor productivity losses across the U.S. economy currently exceed $100 billion annually.
To address these risks, Congress should pass legislation requiring the Occupational Safety and Health Administration (OSHA) to finalize a federal heat standard that provides sufficient coverage for farming operations. In tandem with OSHA finalizing the standard, USDA should be funded to provide technical assistance to agricultural employers for tailoring heat illness prevention plans and implementing cost-effective interventions that improve working conditions while maintaining productivity. This should include support for agricultural employers to integrate heat awareness into workforce training, resources for safety equipment and education, and support for the addition of shade structures. Doing so would ensure that agricultural workers across both large and small-scale farming operations have access to essential protections, like shade, clean water, and breaks, as well as sufficient capacity to comply. Current funding streams that could have an extreme heat infrastructure "plus-up" include the Environmental Quality Incentives Program and the Farm Service Agency's microloans program. Lastly, Congress should also direct OSHA to continue implementing its National Emphasis Program on Heat, which enforces employers' obligation to protect workers against heat illness or injury. OSHA should additionally review employers' practices to ensure that H-2A and other agricultural workers are protected from job or wage loss when extreme heat renders working conditions unsafe.
Clean Water: Protecting New York State Private Wells from PFAS
This memo responds to a policy need at the state level that arises from a lack of relevant federal data. The Environmental Protection Agency (EPA) has a learning agenda question that asks, "To what extent does EPA have ready access to data to measure drinking water compliance reliably and accurately?" This memo helps fill that gap because EPA does not monitor private wells.
Per- and polyfluoroalkyl substances (PFAS) are widely distributed in the environment, including, in many cases, in private water wells. Given their links to numerous serious health consequences, initiatives to mitigate PFAS exposure among New York State (NYS) residents reliant on private wells were included among the priorities outlined in the annual State of the State address and have been proposed in state legislation. We therefore performed a scenario analysis exploring the impacts and costs of a statewide program testing private wells for PFAS and reimbursing the installation of point-of-entry treatment (POET) filtration systems where exceedances occur.
Challenge and Opportunity
Why care about PFAS?
Per- and polyfluoroalkyl substances (PFAS), a class of chemicals containing millions of individual compounds, are of grave concern due to their association with numerous serious health consequences. A 2022 consensus study report by the National Academies of Sciences, Engineering, and Medicine categorized various PFAS-related health outcomes based on critical appraisal of existing evidence from prior studies; this committee of experts concluded that there is high confidence of an association between PFAS exposure and (1) decreased antibody response (a key aspect of immune function, including response to vaccines), (2) dyslipidemia (abnormal fat levels in one's blood), (3) decreased fetal and infant growth, and (4) kidney cancer, and moderate confidence of an association between PFAS exposure and (1) breast cancer, (2) liver enzyme alterations, (3) pregnancy-induced high blood pressure, (4) thyroid disease, and (5) ulcerative colitis (an autoimmune inflammatory bowel disease).
Extensive industrial use has rendered these contaminants virtually ubiquitous in both the environment and humans, with greater than 95% of the U.S. general population having detectable PFAS in their blood. PFAS take years to be eliminated from the human body once exposure has occurred, earning their nickname as “forever chemicals.”
Why focus on private drinking water?
Drinking water is a common source of exposure.
Drinking water is a primary pathway of human exposure. Combining both public and private systems, it is estimated that approximately 45% of U.S. drinking water sources contain at least one PFAS. Rates specific to private water supplies have varied depending on location and the thresholds used. Sampling in Wisconsin revealed that 71% of private wells contained at least one PFAS and 4% contained levels of perfluorooctanoic acid (PFOA) or perfluorooctanesulfonic acid (PFOS), two common PFAS compounds, exceeding the EPA's Maximum Contaminant Levels (MCLs) of 4 ng/L. Sampling in New Hampshire, meanwhile, found that 39% of private wells exceeded the state's Ambient Groundwater Quality Standards (AGQS), which were established in 2019 and range from 11 to 18 ng/L depending on the specific PFAS compound. Notably, while the EPA MCLs represent legally enforceable levels accounting for the feasibility of remediation, the agency has also released health-based, non-enforceable Maximum Contaminant Level Goals (MCLGs) of zero for PFOA and PFOS.
PFAS in private water are unregulated and expensive to remediate.
In New York State, nearly one million households rely on private wells for drinking water; despite this, there are currently no standardized well testing procedures, and effective well water treatment is unaffordable for many New Yorkers. As of April 2024, the EPA has established federal MCLs for several specific PFAS compounds and mixtures of compounds, and its National Primary Drinking Water Regulations (NPDWR) require public water systems to begin monitoring and publicly reporting levels of these PFAS by 2027; if monitoring reveals exceedances of the MCLs, public water systems must also implement solutions to reduce PFAS by 2029. In contrast, there are no standardized testing procedures or enforceable limits for PFAS in private water. Additionally, testing and remediating private wells are both associated with high costs that are unaffordable for many well owners; prices run to several hundred dollars for PFAS testing and several thousand dollars for the installation and maintenance of effective filtration systems.

How are states responding to the problem of PFAS in private drinking water?
Several states, including Colorado, New Hampshire, and North Carolina, have already initiated programs offering well testing and financial assistance for filters to protect against PFAS.
- After piloting its PFAS Testing and Assistance (TAP) program in one county in 2024, Colorado will expand it to three additional counties in 2025. The program covers the expenses of testing and a $79 nano pitcher (point-of-use) filter. Residents are eligible if PFOA and/or PFOS in their wells exceed the EPA MCLs of 4 ng/L; filters are free if household income is ≤80% of the area median income and offered at a 30% discount if this income criterion is not met.
- The New Hampshire (NH) PFAS Removal Rebate Program for Private Wells offers greater flexibility and higher cost coverage than Colorado's PFAS TAP, with reimbursements of up to $5,000 offered for either point-of-entry or point-of-use treatment system installation and up to $10,000 offered for connection to a public water system. Though other residents may also participate in the program and receive delayed reimbursement, households earning ≤80% of the area median family income are offered the additional assistance of payment made directly to a treatment installer or contractor (prior to installation) so as to relieve the applicant of fronting the cost. Eligibility is based on testing showing exceedances of the EPA MCLs of 4 ng/L for PFOA or PFOS or 10 ng/L for PFHxS, PFNA, or HFPO-DA (trademarked as "GenX").
- The North Carolina PFAS Treatment System Assistance Program offers flexibility similar to New Hampshire in terms of the types of water treatment reimbursed, including multiple point-of-entry and point-of-use filter options as well as connection to public water systems. It is additionally notable for its tiered funding system, with reimbursement amounts ranging from $375 to $10,000 based on both the household’s income and the type of water treatment chosen. The tiered system categorizes program participants based on whether their household income is (1) <200%, (2) 200-400%, or (3) >400% the Federal Poverty Level (FPL). Also similar to New Hampshire, payments may be made directly to contractors prior to installation for the lowest income bracket, who qualify for full installation costs; others are reimbursed after the fact. This program uses the aforementioned EPA MCLs for PFOA, PFOS, PFHxS, PFNA, or HFPO-DA (“GenX”) and also recognizes the additional EPA MCL of a hazard index of 1.0 for mixtures containing two or more of PFHxS, PFNA, HFPO-DA, or PFBS.
An opportunity exists to protect New Yorkers.
Launching a program in New York similar to those initiated in Colorado, New Hampshire, and North Carolina was among the priority initiatives described by New York Governor Kathy Hochul in the annual State of the State she delivered in January 2025. In particular, Hochul’s plans to improve water infrastructure included “a pilot program providing financial assistance for private well owners to replace or treat contaminated wells.” This was announced along with a $500 million additional investment beyond New York’s existing $5.5 billion dedicated to water infrastructure, which will also be used to “reduce water bills, combat flooding, restore waterways, and replace lead service lines to protect vulnerable populations, particularly children in underserved communities.” In early 2025, the New York Legislature introduced Senate Bill S3972, which intended to establish an installation grant program and a maintenance rebate program for PFAS removal treatment. Bipartisan interest in protecting the public from PFAS-contaminated drinking water is further evidenced by a hearing focused on the topic held by the NYS Assembly in November 2024.
Though these efforts would likely initially be confined to a smaller pilot program with limited geographic scope, such a pilot would aim to inform a broader, statewide intervention. Challenges to planning an intervention of this scope include uncertainty surrounding both the total funding that would be allotted to such a program and its total costs. These costs will depend on factors such as the eligibility criteria employed by the state, the proportion of well owners who opt into sampling, and the proportion of tested wells found to have PFAS exceedances (which will further vary based on whether the state adopts the EPA MCLs or the NYS Department of Health MCLs, which are 10 ng/L for PFOA and PFOS). We address this uncertainty by estimating the numbers of wells serviced and the associated costs under various combinations of 10 potential eligibility criteria, 5 possible rates (5, 25, 50, 75, and 100%) of PFAS testing among eligible wells, and 5 possible rates (5, 25, 50, 75, and 100%) of PFAS > MCL and subsequent POET installation among wells tested.
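To make the scenario arithmetic concrete, a minimal sketch of the cost calculation follows. The per-well testing cost and POET rebate amount below are hypothetical placeholders for illustration only, not the cost inputs used in our analysis.

```python
# Minimal sketch of the scenario cost model (illustrative only).
# The per-well costs below are hypothetical placeholders, not the
# figures used in the analysis reported in this memo.

TEST_COST_PER_WELL = 400        # assumed cost of one PFAS test (USD)
POET_REBATE_PER_WELL = 4_500    # assumed POET installation rebate (USD)

TESTING_RATES = [0.05, 0.25, 0.50, 0.75, 1.00]      # share of eligible wells tested
EXCEEDANCE_RATES = [0.05, 0.25, 0.50, 0.75, 1.00]   # share of tested wells with PFAS > MCL

def scenario_cost(eligible_wells: int, testing_rate: float, exceedance_rate: float) -> float:
    """Estimate program cost for one eligibility criterion and one pair of rates."""
    wells_tested = eligible_wells * testing_rate
    wells_remediated = wells_tested * exceedance_rate
    return wells_tested * TEST_COST_PER_WELL + wells_remediated * POET_REBATE_PER_WELL

# Example: statewide eligibility (about 901,441 wells) across the scenario grid,
# including the base case of 75% of eligible wells tested and 25% of tested
# wells exceeding the MCL and installing a POET system.
for t in TESTING_RATES:
    for e in EXCEEDANCE_RATES:
        print(f"testing={t:.0%}, exceedance={e:.0%}: ${scenario_cost(901_441, t, e):,.0f}")
```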
Scenario Analysis
Key findings
- Over 900,000 residences across NYS are supplied by private drinking wells (Figure 1).
- The three most costly scenarios were offering testing and installation rebates for (Table 1):
- Every private well owner (901,441 wells; $1,034,403,547)
- Every well located within a census tract designated as disadvantaged (based on NYS Disadvantaged Community (DAC) criteria) AND/OR belonging to a household with annual income <$150,000 (725,923 wells; $832,996,643)
- Every well belonging to a household with annual income <$150,000 (705,959 wells; $810,087,953)
- The three least costly scenarios were offering testing and installation rebates for (Table 1):
- Every well located within a census tract in which at least 51% of households earn below 80% of the area median income (22,835 wells; $26,191,688)
- Every well belonging to a household earning <100% of the Federal Poverty Level (92,661 wells; $106,328,398)
- Every well located within a census tract designated as disadvantaged (based on NYS Disadvantaged Community (DAC) criteria) (93,840 wells; $107,681,400)
- Of six income-based eligibility criteria, household income <$150,000 included the greatest number of wells, whereas location within a census tract in which at least 51% of households earn below 80% of the area median income (a definition of low-to-moderate income used for programs coordinated by the U.S. Department of Housing and Urban Development) included the fewest wells. This amounts to a cost difference of $783,896,265 between these two eligibility scenarios.
- Six income-based criteria varied dramatically in terms of their inclusion of wells across NYS which fall within either disadvantaged or small communities (Table 2):
- For disadvantaged communities, this ranged from 12% (household income <100% federal poverty level) to 79% (income <$150,000) of all wells within disadvantaged communities being eligible.
- For small communities, this ranged from 2% (census tracts in which at least 51% of households earn below 80% area median income) to 83% (income <$150,000) of all wells within small communities being eligible.
Plan of Action
New York State is already considering a PFAS remediation program (e.g., Senate Bill S3972). The 2025 draft of the bill directed the New York Department of Environmental Conservation to establish an installation grant program and a maintenance rebate program for PFAS removal treatment, and it established general eligibility criteria and per-household funding amounts. To our knowledge, S3972 did not pass in 2025, but its program provides a strong foundation for potential future action. Our suggestions below resolve some gaps in S3972, including additional detail that could be followed by the implementing agency and overall cost estimates that could be used by the Legislature when weighing financial impacts.
Recommendation 1. Remediate all disadvantaged wells statewide
We recommend including every well located within a census tract designated as disadvantaged (based on NYS Disadvantaged Community (DAC) criteria) and/or belonging to a household with annual income <$150,000 as the eligibility criteria that protect the widest range of vulnerable New Yorkers. Using these criteria, we estimate a total program cost of approximately $833 million, or $167 million per year if the program were implemented over a 5-year period. Even accounting for the other projects the state will be undertaking at the same time, this annual cost falls well within the additional $500 million that the 2025 State of the State reports will be added in 2025 to an existing $5.5 billion state investment in water infrastructure.
Recommendation 2. Target disadvantaged census tracts and household incomes
Wells in DAC census tracts account for a variety of disadvantages. Including NYS DAC criteria helps to account for the heterogeneity of challenges experienced by New Yorkers by weighing statistically meaningful thresholds for 45 different indicators across several domains. These include factors relevant to the risk of PFAS exposure, such as land use for industrial purposes and proximity to active landfills.
Wells in low-income households account for cross-sectoral disadvantage. The DAC criteria alone are imperfect:
- Major criticisms include its underrepresentation of rural communities (only 13% of rural census tracts, compared to 26% of suburban and 48% of urban tracts, have been DAC-designated) and failure to account for some key stressors relevant to rural communities (e.g., distance to food stores and in-migration/gentrification).
- Another important note is that wells within DAC communities account for only 10% of all wells within NYS (Table 2). While wells within DAC-designated communities are important to consider, including only DAC wells in an intervention would therefore be very limiting.
- Whereas DAC designation is a binary consideration for an entire census tract, place-based criteria such as this are limited in that any real community comprises a spectrum of socioeconomic status and (dis)advantage.
The inclusion of income-based criteria is useful in that financial strain is a universal indicator of resource constraint which can help to identify the most-in-need across every community. Further, including income-based criteria can widen the program’s eligibility criteria to reach a much greater proportion of well owners (Table 2). Finally, in contrast to the DAC criteria’s binary nature, income thresholds can be adjusted to include greater or fewer wells depending on final budget availability.
- Of the income thresholds evaluated, income <$150,000 is recommended due to its inclusion not only of the greatest number of well owners overall, but also the greatest percentages of wells within disadvantaged and small communities (Table 2). These two considerations are both used by the EPA in awarding grants to states for water infrastructure improvement projects.
- As an alternative to selecting one single income threshold, the state may also consider maximizing cost effectiveness by adopting a tiered rebate system similar to that used by the North Carolina PFAS Treatment System Assistance Program.
Recommendation 3. Alternatives to POETs might be more cost-effective and accessible
A final recommendation is for the state to maximize the breadth of its well remediation program by also offering reimbursements for point-of-use treatment (POUT) systems and for connecting to public water systems, not just for POET installations. While POETs are effective in PFAS removal, they require invasive changes to household plumbing and prohibitively expensive ongoing maintenance, two factors which may give well owners pause even if they are eligible for an initial installation rebate. Colorado’s PFAS TAP program models a less invasive and extremely cost-effective POUT alternative to POETs. We estimate that if NYS were to provide the same POUT filters as Colorado, the total cost of the program (using the recommended eligibility criteria of location within a DAC-designated census tract and/or belonging to a household with annual income <$150,000) would be $163 million, or $33 million per year across 5 years. This amounts to a total decrease in cost of nearly $670 million if POUTs were to be provided in place of POETs. Connection to public water systems, on the other hand, though a significant initial investment, provides an opportunity to streamline drinking water monitoring and remediation moving forward and eliminates the need for ongoing and costly individual interventions and maintenance.
Conclusion
Well testing and rebate programs provide an opportunity to take preventative action against the serious health threats associated with PFAS exposure through private drinking water. Individuals reliant on PFAS-contaminated private wells for drinking water are likely to ingest the chemicals on a daily basis. There is therefore no time to waste in taking action to break this chain of exposure. New York State policymakers are already engaged in developing this policy solution; our recommendations can help both those making the policy and those tasked with implementing it to best serve New Yorkers. Our analysis shows that a program to mitigate PFAS in private drinking water is well within scope of current action and that fair implementation of such a program can help those who need it most and do so in a cost-effective manner.
While the Safe Drinking Water Act regulates the United States' public drinking water supplies, there is currently no federal regulation of private wells. Most states also lack regulation of private wells. Introducing new legislation to change this would require significant time and political will. Such political will is unlikely given resource limitations, concerns around well owners' privacy, and the EPA's current prioritization of deregulation.
Decreasing blood serum PFAS levels is likely to decrease negative health impacts. Exposure via drinking water is particularly associated with elevated serum PFAS levels, while appropriate water filtration has demonstrated efficacy in reducing serum PFAS levels.
We estimated total costs assuming that 75% of eligible wells are tested for PFAS and that of these tested wells, 25% are both found to have PFAS exceedances and proceed to have filter systems installed. This PFAS exceedance/POET installation rate was selected because it falls between the rates of exceedances observed when private well sampling was conducted in Wisconsin and New Hampshire in recent years.
For states which do not have their own tools for identifying disadvantaged communities, the Social Vulnerability Index developed by the Centers for Disease Control and Prevention (CDC) and Agency for Toxic Substances and Disease Registry (ATSDR) may provide an alternative option to help identify those most in need.
Too Hot Not to Handle
Every region in the U.S. is experiencing year after year of record-breaking heat. More households now require home cooling solutions to maintain safe and livable indoor temperatures. Over the last two decades, U.S. consumers and the private sector have leaned heavily into purchasing and marketing conventional air conditioning (AC) systems, such as central air conditioning, window units, and portable ACs, to cool down overheating homes.
While AC can offer immediate relief, its rapid scaling has created dangerous vulnerabilities: rising energy bills are straining people's wallets and increasing utility debt, while surging electricity demand increases reliance on high-polluting power infrastructure and adds pressure to an aging power grid increasingly prone to blackouts. Elevated electricity demand during a heat wave can also overload the grid and trigger prolonged blackouts, causing whole regions to lose their sole cooling strategy. Such a disruption could escalate into a public health emergency as homes and people overheat, leading to hundreds of deaths.
What Americans need to be prepared for more extreme temperatures is a resilient cooling strategy. Resilient cooling is an approach that works across three interdependent systems — buildings, communities, and the electric grid — to affordably maintain safe indoor temperatures during extreme heat events and reduce power outage risks.
This toolkit introduces a set of Policy Principles for Resilient Cooling and outlines a set of actionable policy options and levers for state and local governments to foster broader access to resilient cooling technologies and strategies.
For example, states are the primary regulators of public utility commissions, architects of energy and building codes, and distributors of federal and state taxpayer dollars. Local governments are responsible for implementing building standards and zoning codes, enforcing housing and health codes, and operating public housing and retrofit programs that directly shape access to cooling.
The Policy Principles for Resilient Cooling are:
- Expand Cooling Access and Affordability. Ensuring that everyone can affordably access cooling will reduce the population-wide risk of heat-related illness and death in communities and the resulting strain on healthcare systems. Targeted financial support tools — such as subsidies, rebates, and incentives — can reduce both upfront and ongoing costs of cooling technologies, thereby lowering barriers and enabling broader adoption.
- Incorporate Public Health Outcomes as a Driver of Resilience. Indoor heat exposure and heat-driven factors that reduce indoor air quality — such as pollutant accumulation and mold-promoting humidity — are health risks. Policymakers should embed heat-related health risks into building codes, energy standards, and guidelines for energy system planning, including establishing minimum indoor temperature and air quality requirements, integrating health considerations into energy system planning standards, and investing in multi-solving community system interventions like green infrastructure.
- Advance Sustainability Across the Cooling Lifecycle. Rising demand for air conditioning is intensifying the problem it aims to solve by increasing electricity consumption, prolonging reliance on high-polluting power plants, and leaking refrigerants that release powerful greenhouse gases. Policymakers can adopt codes and standards that reduce reliance on high-emission energy sources and promote low-global warming potential (GWP) refrigerants and passive cooling strategies.
- Promote Solutions for Grid Resilience. The U.S. electric grid is struggling to keep up with rising demand for electricity, creating potential risks to communities’ cooling systems. Policymakers can proactively identify potential vulnerabilities in energy systems’ ability to sustain safe indoor temperatures. Demand-side management strategies, distributed energy resources, and grid-enhancing technologies can prepare the electric grid for increased energy demand and ensure its reliability during extreme heat events.
- Build a Skilled Workforce for Resilient Cooling. Resilient cooling provides an opportunity to create pathways to good-paying jobs, reduce critical workforce gaps, and bolster the broader economy. Investing in a workforce that can design, install, and maintain resilient cooling systems can strengthen local economies, ensure preparedness for all kinds of risks to the system, and bolster American innovation.
By adopting a resilient cooling strategy, state and local policymakers can address today’s overlapping energy, health, and affordability crises, advance American-made innovation, and ensure their communities are prepared for the hotter decades ahead.
A Holistic Framework for Measuring and Reporting AI’s Impacts to Build Public Trust and Advance AI
As AI becomes more capable and integrated throughout the United States economy, its growing demand for energy, water, land, and raw materials is driving significant economic and environmental costs, from increased air pollution to higher costs for ratepayers. A recent report projects that data centers could consume up to 12% of U.S. electricity by 2028, underscoring the urgent need to assess the tradeoffs of continued expansion. To craft effective, sustainable resource policies, we need clear standards for estimating data centers' true energy needs and for measuring and reporting the specific AI applications driving their resource consumption. Local and state-level bills calling for more oversight of utility rates and impacts to ratepayers have received bipartisan support, and this proposal builds on that momentum.
In this memo, we draw on research proposing a holistic evaluation framework for characterizing AI's environmental impacts, which establishes three categories of impacts arising from AI: (1) computing-related impacts; (2) immediate application impacts; and (3) system-level impacts. Concerns around AI's computing-related impacts, e.g., energy and water use due to AI data centers and hardware manufacturing, have become widely known, with corresponding policies starting to be put in place. However, AI's immediate application and system-level impacts, which arise from the specific use cases to which AI is applied and the broader socio-economic shifts resulting from its use, remain poorly understood, despite their greater potential for societal benefit or harm.
To ensure that policymakers have visibility into the full range of AI's environmental impacts, we recommend that the National Institute of Standards and Technology (NIST) oversee the creation of frameworks to measure these impacts. The frameworks should rely on quantitative measurements of AI's computing- and application-related impacts and on qualitative data from engagements with the stakeholders most affected by the construction of data centers. NIST should produce these frameworks through convenings that include academic researchers, corporate governance personnel, developers, utility companies, vendors, and data center owners, in addition to civil society organizations. Participatory workshops will yield new guidelines, tools, methods, protocols, and best practices to facilitate the evolution of industry standards for measuring the social costs of AI's energy infrastructures.
Challenge and Opportunity
Resource consumption associated with AI infrastructure is expanding quickly, and this has negative impacts, including asthma from air pollution associated with diesel backup generators, noise pollution, light pollution, excessive water and land use, and financial impacts on ratepayers. A lack of transparency regarding these outcomes, and of public participation to minimize them, risks losing the public's trust, which in turn will inhibit the beneficial uses of AI. While there is a huge amount of capital expenditure and massive forecasted growth in power consumption, there remains a lack of transparency and scientific consensus around the measurement of AI's environmental impacts with respect to data centers and their related negative externalities.
A holistic evaluation framework for assessing AI’s broader impacts requires empirical evidence, both qualitative and quantitative, to influence future policy decisions and establish more responsible, strategic technology development. Focusing narrowly on carbon emissions or energy consumption arising from AI’s computing related impacts is not sufficient. Measuring AI’s application and system-level impacts will help policymakers consider multiple data streams, including electricity transmission, water systems and land use in tandem with downstream economic and health impacts.
Regulatory and technical attempts so far to develop scientific consensus and international standards around the measurement of AI's environmental impacts have focused on documenting AI's computing-related impacts, such as the energy use, water consumption, and carbon emissions required to build and use AI. Measuring and mitigating AI's computing-related impacts is necessary and has received attention from policymakers (e.g., the introduction of the AI Environmental Impacts Act of 2024 in the U.S., provisions for environmental impacts of general-purpose AI in the EU AI Act, and data center sustainability targets in the German Energy Efficiency Act). However, research by Kaack et al. (2022) highlights that impacts extend beyond computing. AI's application impacts, which arise from the specific use cases for which AI is deployed (e.g., AI-enabled emissions from applying AI to oil and gas drilling), have much greater potential scope for positive or negative impacts than AI's computing impacts alone, depending on how AI is used in practice. Finally, AI's system-level impacts, which include even broader, cascading social and economic impacts associated with AI energy infrastructures, such as increased pressure on local utility infrastructure leading to higher costs for ratepayers, or health impacts on local communities due to increased air pollution, have the greatest potential for positive or negative impacts while being the most challenging to measure and predict. See Figure 1 for an overview.

Figure 1, from Kaack et al. (2022). Effectively understanding and shaping AI's impacts requires going beyond impacts arising from computing alone; it also requires consideration and measurement of impacts arising from AI's uses (e.g., in optimizing power systems or agriculture) and of how AI's deployment throughout the economy leads to broader systemic shifts, such as changes in consumer behavior.
Effective policy recommendations require more standardized measurement practices, a point raised by the Government Accountability Office's recent report on AI's human and environmental effects, which explicitly calls for increasing corporate transparency and innovation around technical methods for improved data collection and reporting. But data collection should also include multi-stakeholder engagement to ensure that evaluation frameworks are holistic and meet the needs of specific localities, including state and local government officials, businesses, utilities, and ratepayers. Furthermore, while states and municipalities are introducing bills calling for more data transparency and responsibility, including in California, Indiana, Oregon, and Virginia, the lack of federal policy means that data center owners may move their operations to states that have fewer protections in place and similar levels of existing energy and data transmission infrastructure.
States are also grappling with the potential economic costs of data center expansion. Policy Matters Ohio found that tax breaks for data center owners are hurting tax revenue streams that should be used to fund public services. In Michigan, tax breaks for data centers are increasing the cost of water and power for the public while undermining the state's climate goals. Some Georgia Republicans have stated that data center companies should "pay their way." While there are arguments that data centers can provide useful infrastructure, connectivity, and even revenue for localities, a recent report shows that at least ten states each lost over $100 million a year in revenue to data centers because of tax breaks. The federal government can help create standards that allow stakeholders to balance the potential costs and benefits of data centers and related energy infrastructures. There is now an urgent need to increase transparency and accountability through multi-stakeholder engagement, maximizing economic benefits while reducing waste.
Despite the high economic and policy stakes, critical data needed to assess the full impacts—both costs and benefits—of AI and data center expansion remains fragmented, inconsistent, or entirely unavailable. For example, researchers have found that state-level subsidies for data center expansion may have negative impacts on state and local budgets, but these data have not been collected and analyzed across states because not all states publicly release information about data center subsidies. Other impacts, such as the use of agricultural land or public parks for transmission lines and data center siting, must be studied at the local and state level, and the various social repercussions require engagement with the communities likely to be affected. Similarly, estimates of the economic upside of AI vary widely: the estimated increase in U.S. labor productivity due to AI adoption ranges from 0.9% to 15%, due in large part to a lack of relevant data on AI uses and their economic outcomes that could inform modeling assumptions.
Data centers are highly geographically clustered in the United States, more so than other industrial facilities such as steel plants, coal mines, factories, and power plants (Fig. 4.12, IEA World Energy Outlook 2024). This means that certain states and counties are experiencing disproportionate burdens associated with data center expansion. These burdens have led to calls for data center moratoriums or for the cessation of other energy development, including in states like Indiana. Improved measurement and transparency can help planners avoid overly burdensome concentrations of data center infrastructure, reducing local opposition.
With a rush to build new data center infrastructure, states and localities must also face another concern: overbuilding. For example, Microsoft recently put a hold on parts of its data center contract in Wisconsin and paused another in central Ohio, along with contracts in several other locations across the United States and internationally. These situations often stem from inaccurate demand forecasting, prompting utilities to undertake costly planning and infrastructure development that ultimately goes unused. With better measurement and transparency, policymakers will have more tools to prepare for future demands, avoiding the negative social and economic impacts of infrastructure projects that are started but never completed.
While there have been significant developments in measuring the direct, computing-related impacts of AI data centers, public participation is needed to fully capture many of their indirect impacts. Data centers can be constructed so they are more beneficial to communities while mitigating their negative impacts, e.g. by recycling data center heat, and they can also be constructed to be more flexible by not using grid power during peak times. However, this requires collaborative innovation and cross-sector translation, informed by relevant data.
Plan of Action
Recommendation 1. Develop a database of AI uses and framework for reporting AI’s immediate applications in order to understand the drivers of environmental impacts.
The first step towards informed decision-making around AI’s social and environmental impacts is understanding what AI applications are actually driving data center resource consumption. This will allow specific deployments of AI systems to be linked upstream to compute-related impacts arising from their resource intensity, and downstream to impacts arising from their application, enabling estimation of immediate application impacts.
The AI company Anthropic demonstrated a proof-of-concept categorizing queries to its Claude language model under the O*NET database of occupations. However, O*NET was developed to categorize job types and tasks with respect to human workers, which does not exactly align with current and potential uses of AI. To address this, we recommend that NIST work with relevant collaborators, such as the U.S. Department of Labor (responsible for developing and maintaining the O*NET database), to develop a database of AI uses and applications, similar to and building off of O*NET, along with guidelines and infrastructure for reporting data center resource consumption corresponding to those uses. These data could then be used to understand which AI tasks are key drivers of resource consumption.
Any entity deploying a public-facing AI model (that is, one that can produce outputs and/or receive inputs from outside its local network) should be able to easily document and report its use case(s) within the NIST framework. A centralized database will allow for collation of relevant data across multiple stakeholders including government entities, private firms, and nonprofit organizations.
Gathering data of this nature may require the reporting entity to perform analyses of sensitive user data, such as categorizing individual user queries to an AI model. However, data is to be reported in aggregate percentages with respect to use categories without attribution to or listing of individual users or queries. This type of analysis and data reporting is well within the scope of existing, commonplace data analysis practices. As with existing AI products that rely on such analyses, reporting entities are responsible for performing that analysis in a way that appropriately safeguards user privacy and data protection in accordance with existing regulations and norms.
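As a purely illustrative sketch of what such aggregate reporting could look like, the record below shows one possible shape for the data. The field names and use categories are assumptions introduced for illustration, not an adopted NIST schema.

```python
# Hypothetical sketch of an aggregate use-category report of the kind described
# above; field names and categories are illustrative assumptions only.
from dataclasses import dataclass, field

@dataclass
class AIUseReport:
    reporting_entity: str                  # organization operating the public-facing model
    data_center_region: str                # grid region or locality served
    period: str                            # reporting period, e.g. "2025-Q3"
    total_energy_mwh: float                # electricity attributed to the reported workloads
    use_category_shares: dict[str, float] = field(default_factory=dict)  # aggregate shares only

report = AIUseReport(
    reporting_entity="Example AI Provider",
    data_center_region="PJM - Northern Virginia",
    period="2025-Q3",
    total_energy_mwh=12_500.0,
    use_category_shares={
        "software development": 0.31,
        "customer support": 0.22,
        "document drafting": 0.18,
        "other / uncategorized": 0.29,
    },
)

# Shares are reported only in aggregate, with no user-level attribution,
# consistent with the privacy considerations noted above.
assert abs(sum(report.use_category_shares.values()) - 1.0) < 1e-6
```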
Recommendation 2. NIST should create an independent consortium to develop a system-level evaluation framework for AI’s environmental impacts, while embedding robust public participation in every stage of the work.
Currently, the social costs of AI’s system-level impacts—the broader social and economic implications arising from AI’s development and deployment—are not being measured or reported in any systematic way. These impacts fall heaviest on the local communities that host the data centers powering AI: the financial burden on ratepayers who share utility infrastructure, the health effects of pollutants from backup generators, the water and land consumed by new facilities, and the wider economic costs or benefits of data-center siting. Without transparent metrics and genuine community input, policymakers cannot balance the benefits of AI innovation against its local and regional burdens. Building public trust through public participation is key when it comes to ensuring United States energy dominance and national security interests in AI innovation, themes emphasized in policy documents produced by the first and second Trump administrations.
To develop evaluation frameworks in a way that is both scientifically rigorous and broadly trusted, NIST should stand up an independent consortium via a Cooperative Research and Development Agreement (CRADA). A CRADA allows NIST to collaborate rapidly with non-federal partners while remaining outside the scope of the Federal Advisory Committee Act (FACA), and has been used, for example, to convene the NIST AI Safety Institute Consortium. Membership will include academic researchers, utility companies and grid operators, data-center owners and vendors, state, local, Tribal, and territorial officials, technologists, civil-society organizations, and frontline community groups.
To ensure robust public engagement, the consortium should consult closely with FERC’s Office of Public Participation (OPP)—drawing on OPP’s expertise in plain-language outreach and community listening sessions—and with other federal entities that have deep experience in community engagement on energy and environmental issues. Drawing on these partners’ methods, the consortium will convene participatory workshops and listening sessions in regions with high data-center concentration—Northern Virginia, Silicon Valley, Eastern Oregon, and the Dallas–Fort Worth metroplex—while also making use of online comment portals to gather nationwide feedback.
Guided by the insights from these engagements, the consortium will produce a comprehensive evaluation framework that captures metrics falling outside the scope of direct emissions alone. These system-level metrics could encompass (1) the number, type, and duration of jobs created; (2) the effects of tax subsidies on local economies and public services; (3) the placement of transmission lines and associated repercussions for housing, public parks, and agriculture; (4) the use of eminent domain for data-center construction; (5) water-use intensity and competing local demands; and (6) public-health impacts from air, light, and noise pollution. NIST will integrate these metrics into standardized benchmarks and guidance.
Consortium members will attend public meetings, engage directly with community organizations, deliver accessible presentations, and create plain-language explainers so that non-experts can meaningfully influence the framework’s design and application. The group will also develop new guidelines, tools, methods, protocols, and best practices to facilitate industry uptake and to evolve measurement standards as technology and infrastructure grow.
We estimate a cost of approximately $5 million over two years to complete the work outlined in recommendations 1 and 2, covering staff time, travel to at least twelve data-center or energy-infrastructure sites across the United States, participant honoraria, and research materials.
Recommendation 3. Mandate regular measurement and reporting on relevant metrics by data center operators.
Voluntary reporting, e.g., via corporate Environmental, Social, and Governance (ESG) reports, is the status quo, but it has so far been insufficient for gathering the necessary data. For example, while the technology firm OpenAI, best known for its highly popular ChatGPT generative AI model, holds a significant share of the search market and likely a corresponding share of the environmental and social impacts arising from the data centers powering its products, OpenAI chooses not to publish ESG reports or data in any other format regarding its energy consumption or greenhouse gas (GHG) emissions. In order to collect sufficient data at the appropriate level of detail, reporting must be mandated at the local, state, or federal level. At the state level, California's Climate Corporate Data Accountability Act (SB 253, SB 219) requires that large companies operating within the state report their GHG emissions in accordance with the GHG Protocol, administered by the California Air Resources Board (CARB).
At the federal level, the EU's Corporate Sustainability Reporting Directive (CSRD), which requires firms operating within the EU to report a wide variety of data related to environmental sustainability and social governance, could serve as a model for regulating companies operating within the U.S. The Environmental Protection Agency's (EPA) GHG Reporting Program already requires emissions reporting by operators and suppliers associated with large GHG emissions sources, and the Energy Information Administration (EIA) collects detailed data on electricity generation and fuel consumption through Forms EIA-860 and EIA-923. With respect to data centers specifically, the Department of Energy (DOE) could require that developers who are granted rights to build AI data center infrastructure on public lands perform the relevant measurement and reporting, and more broadly, reporting could be made a requirement to qualify for any local, state, or federal funding or assistance provided to support the buildout of U.S. AI infrastructure.
Recommendation 4. Incorporate measurements of social cost into AI energy and infrastructure forecasting and planning.
There is a huge range in estimates of future data center energy use, largely driven by uncertainty around the nature of demands from AI. This uncertainty stems in part from a lack of historical and current data on which AI use cases are most energy intensive and how those workloads are evolving over time. It also remains unclear to what extent challenges in bringing new resources online, such as hardware production limits or bottlenecks in permitting, will influence growth rates. These uncertainties are even more significant when it comes to the holistic impacts described above (i.e., those beyond direct energy consumption), making it challenging to balance costs and benefits when planning for future demands from AI.
To address these issues, accurate forecasting of demand for energy, water, and other limited resources must incorporate data gathered through the holistic measurement frameworks described above. Further, forecasts of broader system-level impacts must be incorporated into decision-making around investment in AI infrastructure. Forecasting needs to go beyond energy use alone: models should also predict related transmission infrastructure needs, the social cost of carbon and other pollution, the effects on ratepayers, and the energy demands of chip production.
We recommend that agencies already responsible for energy-demand forecasting, such as the Energy Information Administration (EIA) at the Department of Energy, integrate data on the AI workloads driving data-center electricity use into their forecasting models, in line with the NIST frameworks developed above. Agencies specializing in social impacts, such as the Department of Health and Human Services in the case of health impacts, should model those impacts and communicate them to EIA and DOE for planning purposes. In parallel, the Federal Energy Regulatory Commission (FERC) should update its new rule on long-term regional transmission planning to explicitly consider the social costs of energy supply, demand, and infrastructure retirement and buildout across different scenarios.
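As a rough illustration of what "forecasting beyond energy use" could look like, the sketch below layers social-cost terms onto a simple compound-growth demand projection. Every parameter value (base demand, growth rate, emissions factor, social cost of carbon, health damages per MWh) is a placeholder assumption, not an EIA, DOE, or FERC figure.

```python
# Minimal sketch: extend an electricity-demand forecast with social-cost terms.
# All parameter values below are illustrative assumptions only.

BASE_DEMAND_TWH = 176        # assumed data-center demand in the base year
ANNUAL_GROWTH = 0.15         # assumed compound annual growth rate
EMISSIONS_FACTOR = 0.37      # assumed tCO2 per MWh of marginal generation
SOCIAL_COST_CO2 = 190        # assumed dollars per tCO2
HEALTH_COST_PER_MWH = 4.0    # assumed dollars per MWh of air-pollution health damages

def forecast(years: int) -> list[dict]:
    """Project demand and attach monetized social costs for each year."""
    rows = []
    demand_twh = BASE_DEMAND_TWH
    for year in range(1, years + 1):
        demand_twh *= 1 + ANNUAL_GROWTH
        mwh = demand_twh * 1_000_000
        co2_cost = mwh * EMISSIONS_FACTOR * SOCIAL_COST_CO2
        health_cost = mwh * HEALTH_COST_PER_MWH
        rows.append({
            "year": year,
            "demand_twh": round(demand_twh, 1),
            "social_cost_co2_usd_bn": round(co2_cost / 1e9, 2),
            "health_cost_usd_bn": round(health_cost / 1e9, 2),
        })
    return rows

for row in forecast(5):
    print(row)
```

A real agency model would replace the single growth rate with workload-level scenarios and regional grid mixes; the point of the sketch is only that social-cost columns can sit alongside the demand column in the same forecast table.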
Recommendation 5. Transparently use federal, state, and local incentive programs to reward data-center projects that deliver concrete community benefits.
Incentive programs should be tied to holistic estimates of costs and benefits collected under the frameworks above, not to promises alone. When considering incentive programs, policymakers should ask questions such as: How many jobs do data centers create, how long do those jobs last, and do they go to local residents? How much tax revenue do data centers generate for municipalities or states relative to the subsidies data center owners receive? What are the social impacts of using agricultural land or public parks for data center construction or transmission lines? What are the impacts on air quality and other public health concerns? Do data centers deliver benefits like load flexibility and sharing of waste heat?
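These questions can be operationalized as a simple net-benefit tally. The sketch below is one hedged way to frame that arithmetic; every input is a placeholder that would be replaced with data gathered under the measurement framework and consortium work described above.

```python
# Illustrative local cost-benefit tally for a proposed data-center incentive.
# All inputs are placeholders, not measured values.

def net_local_benefit(
    tax_revenue_usd: float,        # annual tax revenue to the locality
    subsidies_usd: float,          # annual value of abatements and incentives
    permanent_payroll_usd: float,  # wages paid to local residents
    health_costs_usd: float,       # monetized air/noise/light pollution burden
    land_use_costs_usd: float,     # lost agricultural/park value, eminent domain
    grid_benefits_usd: float,      # e.g., load flexibility, waste-heat reuse
) -> float:
    """Return annual benefits minus costs to the host community."""
    benefits = tax_revenue_usd + permanent_payroll_usd + grid_benefits_usd
    costs = subsidies_usd + health_costs_usd + land_use_costs_usd
    return benefits - costs

# Hypothetical example; a negative result would indicate a net local loss.
print(net_local_benefit(
    tax_revenue_usd=12_000_000,
    subsidies_usd=55_000_000,
    permanent_payroll_usd=8_000_000,
    health_costs_usd=3_000_000,
    land_use_costs_usd=1_500_000,
    grid_benefits_usd=2_000_000,
))
```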
Grid operators (Regional Transmission Organizations [RTOs] and Independent System Operators [ISOs]) can leverage interconnection queues to push data center operators to demonstrate that they have sufficiently considered the impacts on local communities when proposing a new site. FERC recently approved reforms to the processing of the interconnection request queue, allowing RTOs to implement a "first-ready, first-served" approach rather than a first-come, first-served approach, wherein proposed projects can be fast-tracked based on their readiness. A similar approach could be used by RTOs to fast-track proposals that include a clear plan for how they will benefit local communities (e.g., through load flexibility, heat reuse, and clean energy commitments), grounded in careful impact assessment.
States with significant existing infrastructure could also introduce state-level incentives. Such incentives could be developed in collaboration with the National Governors Association, which has been working to balance AI-driven energy needs with state climate goals.
Conclusion
Data centers have an undeniable impact on energy infrastructure and the communities located near them. This impact will continue to grow alongside AI infrastructure investment, which is expected to skyrocket. It is possible to shape a future where AI infrastructure is developed sustainably and in a way that responds to the needs of local communities. But more work is needed to collect the data necessary to inform government decision-making. We have described a framework for holistically evaluating the potential costs and benefits of AI data centers and for shaping AI infrastructure buildout based on those tradeoffs. This framework includes establishing standards for measuring and reporting AI's impacts, eliciting public participation from impacted communities, and putting the gathered data into action to enable sustainable AI development.
This memo is part of our AI & Energy Policy Sprint, a policy project to shape U.S. policy at the critical intersection of AI and energy. Read more about the Policy Sprint and check out the other memos here.
Data centers are highly spatially concentrated, largely due to reliance on existing energy and data transmission infrastructure; it is more cost-effective to continue building where infrastructure already exists than to start fresh in a new region. As long as the cost of performing the proposed impact assessment and reporting in established regions is less than the overhead of moving to a new region, data center operators are likely to comply with regulations in order to stay in regions where the sector is established.
Spatial concentration of data centers also arises because workloads with high data transmission requirements, such as media streaming and online gaming, need close physical proximity to users to reduce transmission latency. For AI to be integrated into these real-time services, data center operators will continue to need a presence in existing geographic regions, barring significant advances in data transmission efficiency and infrastructure.
bad for national security and economic growth. So is infrastructure growth that harms the local communities in which it occurs.
Researchers from Good Jobs First have found that many states are in fact losing tax revenue to data center expansion: “At least 10 states already lose more than $100 million per year in tax revenue to data centers…” More data is needed to determine whether data center construction projects coupled with tax incentives are economically advantageous investments on the part of local and state governments.
The DOE is opening up federal lands at 16 locations to data center construction projects in the name of strengthening America’s energy dominance and ensuring America’s role in AI innovation. But national security arguments for data center expansion should also account for the impacts on communities living close to data centers and related infrastructure.
Data centers themselves do not automatically ensure greater national security, especially because the critical minerals and hardware components of data centers depend on international trade and manufacturing. At present, the United States is not equipped to domestically supply the critical minerals and other materials needed to produce data center hardware, including GPUs and other components.
Federal policy can ensure that states or counties do not become overburdened by data center growth and can help different regions benefit from the potential economic and social rewards of data center construction.
Developing federal standards around transparency helps individual states plan for data center construction, allowing for a high-level, comparative look at the energy demand associated with specific AI use cases. Federal intervention is also important because data centers in one state may have transmission lines running through a neighboring state, producing impacts that cross jurisdictions. There is a need for a national-level standard.
Producing accurate cost-benefit estimates is often extremely challenging. For example, while municipalities often expect economic benefits from data centers and more local jobs from their construction, subsidies and short-term construction jobs do not necessarily translate into economic gains.
To improve the ability of decision makers to conduct quality cost-benefit analysis, the independent consortium described in Recommendation 2 will examine both qualitative and quantitative data, including permitting histories, transmission plans, land use and eminent domain cases, subsidies, job numbers, and health and quality-of-life impacts at various sites over time. NIST will help develop standards informed by this data collection, which can then be used in future planning processes.
Further, there is customer interest in knowing that their AI is sourced from firms implementing sustainable and socially responsible practices. Such efforts can be highlighted in marketing communications and reported as socially and environmentally responsible practices in ESG reports. This serves as an additional incentive for some data center operators to participate in voluntary reporting and maintain operations in locations with increased regulation.
Advance AI with Cleaner Air and Healthier Outcomes
Artificial intelligence (AI) is transforming industries, driving innovation, and tackling some of the world’s most pressing challenges. Yet while AI has tremendous potential to advance public health, such as supporting epidemiological research and optimizing healthcare resource allocation, the public health burden of AI due to its contribution to air pollutant emissions has been under-examined. Energy-intensive data centers, often paired with diesel backup generators, are rapidly expanding and degrading air quality through emissions of air pollutants. These emissions exacerbate or cause various adverse health outcomes, from asthma to heart attacks and lung cancer, especially among young children and the elderly. Without sufficient clean and stable energy sources, the annual public health burden from data centers in the United States is projected to reach up to $20 billion by 2030, with households in some communities located near power plants supplying data centers, such as those in Mason County, WV, facing over 200 times greater burdens than others.
Federal, state, and local policymakers should act to accelerate the adoption of cleaner and more stable energy sources and to manage AI’s expansion in a way that aligns innovation with human well-being, advancing the United States’ leadership in AI while ensuring clean air and healthy communities.
Challenge and Opportunity
Forty-six percent of people in the United States breathe unhealthy levels of air pollution. Ambient air pollution, especially fine particulate matter (PM2.5), is linked to 200,000 deaths each year in the United States. Poor air quality remains the nation’s fifth highest mortality risk factor, resulting in a wide range of immediate and severe health issues that include respiratory diseases, cardiovascular conditions, and premature deaths.
Data centers consume vast amounts of electricity to power and cool the servers running AI models and other computing workloads. According to the Lawrence Berkeley National Laboratory, growing demand for AI is projected to increase data centers’ share of the nation’s total electricity consumption to as much as 12% by 2028, up from 4.4% in 2023. Without enough sustainable energy sources like nuclear power, the rapid growth of energy-intensive data centers is likely to exacerbate ambient air pollution and its associated public health impacts.
Data centers typically rely on diesel backup generators for uninterrupted operation during power outages. While the total operating time for routine maintenance of backup generators is limited, these generators can create short-term spikes in PM2.5, NOx, and SO2 that go beyond the baseline environmental and health impacts associated with data center electricity consumption. For example, diesel generators emit 200–600 times more NOx than natural gas-fired power plants per unit of electricity produced. Even brief exposure to high levels of NOx can aggravate respiratory symptoms and increase hospitalizations. A recent report to the Governor and General Assembly of Virginia found that backup generators at data centers emitted approximately 7% of their total permitted pollution levels in 2023. Based on the Environmental Protection Agency’s COBRA modeling tool, the public health cost of these emissions in Virginia is estimated at approximately $200 million, with health impacts extending to neighboring states and reaching as far as Florida. In Memphis, Tennessee, a set of temporary gas turbines powering a large AI data center, which has not undergone a complete permitting process, is estimated to emit up to 2,000 tons of NOx annually. This has raised significant health concerns among local residents and could result in a total public health burden of $160 million annually. These public health concerns coincide with a policy shift that favors dirtier energy sources and potentially delays sustainability goals.
In 2023 alone, air pollution attributed to data centers in the United States resulted in an estimated $5 billion in health-related damages, a figure projected to rise up to $20 billion annually by 2030. This projected cost reflects an estimated 1,300 premature deaths in the United States per year by the end of the decade. While communities near data centers and power plants bear the greatest burden, with some households facing over 200 times greater impacts than others, the health impacts of these facilities extend to communities across the nation. The widespread health impacts of data centers further compound the already uneven distribution of environmental costs and water resource stresses imposed by AI data centers across the country.
While essential for mitigating air pollution and public health risks, transitioning AI data centers to cleaner backup fuels and stable energy sources such as nuclear power presents significant implementation hurdles, including lengthy permitting processes. Clean backup generators that match the reliability of diesel remain limited in real-world applications, and multiple key issues must be addressed to fully transition to cleaner and more stable energy.
Although it is clear that data centers pose public health risks, comprehensive evaluations of data center air pollution and related public health impacts, which are essential to grasp the full extent of the harms these facilities pose, are often absent from current practice. Washington State conducted a health risk assessment of diesel particulate pollution from multiple data centers in the Quincy area in 2020, but most states lack similar evaluations for either existing or newly proposed data centers. To safeguard public health, it is essential to establish transparency frameworks, reporting standards, and compliance requirements for data centers, enabling the assessment of PM2.5, NOₓ, SO₂, and other harmful air pollutants, as well as their short- and long-term health impacts. These mechanisms would also equip state and local governments to make informed decisions about where to site AI data center facilities, balancing technological progress with the protection of community health nationwide.
Finally, limited public awareness, insufficient educational outreach, and a lack of comprehensive decision-making processes further obscure the risks data centers pose to public health. Without robust transparency and community engagement mechanisms, communities housing data center facilities are left with little influence or recourse over developments that may significantly affect their health and environment.
Plan of Action
The United States can build AI systems that not only drive innovation but also promote human well-being, delivering lasting health benefits for generations to come. Federal, state, and local policymakers should adopt a multi-pronged approach, outlined below, to ensure that data center expansion proceeds with minimal air pollution and public health impacts.
Federal-level Action
Federal agencies play a crucial role in establishing national standards, coordinating cross-state efforts, and leveraging federal resources to model responsible public health stewardship.
Recommendation 1. Incorporate Public Health Benefits to Accelerate Clean and Stable Energy Adoption for AI Data Centers
Congress should direct relevant federal agencies, including the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and the Environmental Protection Agency (EPA), to integrate air pollution reduction and the associated public health benefits into efforts to streamline the permitting process for more sustainable energy sources, such as nuclear power, for AI data centers. Simultaneously, federal resources should be expanded to support research, development, and pilot deployment of alternative low-emission fuels for backup generators while ensuring high reliability.
- Public Health Benefit Quantification. Direct the EPA, in coordination with DOE and public health agencies, to develop standardized methods for estimating the public health benefits (e.g., avoided premature deaths, hospital visits, and economic burden) of using cleaner and more stable energy sources for AI data centers. Require lifecycle emissions modeling of energy sources and translate avoided emissions into quantitative health benefits using established tools such as the EPA’s BenMAP (an illustrative calculation appears after this list). This should:
- Include modeling of air pollution exposure and health outcomes (e.g., using tools like EPA’s COBRA)
- Incorporate cumulative risks from regional electricity generation and local backup generator emissions
- Account for spatial disparities and vulnerable populations (e.g., children, the elderly, and disadvantaged communities)
- Evaluate both short-term (e.g., generator spikes) and long-term (e.g., chronic exposure) health impacts
- Preferential Permitting. Instruct the DOE to prioritize and streamline permitting for cleaner energy projects (e.g., small modular reactors, advanced geothermal) that demonstrate significant air pollution reduction and health benefits in supporting AI data center infrastructures. Develop a Clean AI Permitting Framework that allows project applicants to submit health benefit assessments as part of the permitting package to justify accelerated review timelines.
- Support for Cleaner Backup Systems. Expand DOE and EPA R&D programs to support pilot projects and commercialization pathways for alternative backup generator technologies, including hydrogen combustion systems and long-duration battery storage. Provide tax credits or grants for early adopters of non-diesel backup technologies in AI-related data center facilities.
- Federal Guidance & Training. Provide technical assistance to state and local agencies to implement these standardized methods, and fund capacity-building efforts in environmental health departments.
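To illustrate the kind of calculation such standardized methods would formalize, the sketch below applies a log-linear concentration-response function of the sort used in BenMAP- and COBRA-style tools to estimate avoided premature deaths from a reduction in PM2.5 exposure and then monetizes them. The coefficient, baseline mortality rate, population, and valuation figure are illustrative placeholders, not EPA parameters.

```python
import math

def avoided_deaths(
    population: int,            # exposed population
    baseline_mortality: float,  # annual all-cause deaths per person (e.g., 0.008)
    beta: float,                # concentration-response coefficient per ug/m3
    delta_pm25: float,          # reduction in annual-average PM2.5 (ug/m3)
) -> float:
    """Log-linear concentration-response function used in BenMAP-style tools."""
    return population * baseline_mortality * (1 - math.exp(-beta * delta_pm25))

# Hypothetical example: a 1.5 ug/m3 PM2.5 reduction across 500,000 people.
# beta = 0.006 (roughly a 6% mortality change per 10 ug/m3) is illustrative only.
deaths = avoided_deaths(population=500_000, baseline_mortality=0.008,
                        beta=0.006, delta_pm25=1.5)
VSL_USD = 11_000_000  # placeholder value of a statistical life
print(f"Avoided deaths: {deaths:.1f}; monetized benefit: ${deaths * VSL_USD:,.0f}")
```

A standardized federal method would pin down the exposure modeling, coefficients, and valuation assumptions so that permit applicants and reviewers compute these benefits the same way.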
Recommendation 2. Establish a Standardized Emissions Reporting Framework for AI Data Centers
Congress should direct the EPA, in coordination with the National Institute of Standards and Technology (NIST), to develop and implement a standardized reporting framework requiring data centers to publicly disclose their emissions of air pollutants, including PM₂.₅, NOₓ, SO₂, and other hazardous air pollutants associated with backup generators and electricity use.
- Multi-Stakeholder Working Group. Task EPA with convening a multi-stakeholder working group, including representatives from NIST, DOE, state regulators, industry, and public health experts, to define the scope, metrics, and methodologies for emissions reporting.
- Standardization. Develop a federal technical standard (an illustrative record format follows this list) that specifies:
- Types of air pollutants that should be reported
- Frequency of reporting (e.g., quarterly or annually)
- Facility-specific disclosures (including generator use and power source profiles)
- Geographic resolution of emissions data
- Public access and data transparency protocols
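One hedged way to picture the technical standard outlined above is as a machine-readable disclosure record. The sketch below is purely illustrative; the field names, units, and quarterly cadence are our assumptions, not an adopted EPA/NIST schema.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class EmissionsDisclosure:
    """Illustrative facility-level air-pollutant disclosure record.

    Fields mirror the elements listed above (pollutant types, reporting
    frequency, facility-specific disclosures, geographic resolution); the
    exact schema would be set by the EPA/NIST working group.
    """
    facility_id: str
    reporting_period: str               # e.g., "2026-Q1" (quarterly cadence assumed)
    latitude: float                     # geographic resolution of the facility
    longitude: float
    generator_runtime_hours: float      # backup generator use during the period
    power_source_mix: Dict[str, float]  # share of supply by source, sums to 1.0
    emissions_tons: Dict[str, float]    # pollutant -> tons emitted

# Hypothetical example disclosure
example = EmissionsDisclosure(
    facility_id="example-dc-042",
    reporting_period="2026-Q1",
    latitude=38.95,
    longitude=-77.45,
    generator_runtime_hours=36.5,
    power_source_mix={"grid": 0.97, "diesel_backup": 0.03},
    emissions_tons={"PM2.5": 0.4, "NOx": 6.2, "SO2": 0.1},
)
```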
State-level Action
Recommendation 1. State environmental and public health departments should conduct a health impact assessment (HIA) before and after data center construction to evaluate discrepancies between anticipated and actual health impacts for existing and planned data center operations. To maintain and build trust, HIA findings, methodologies, and limitations should be publicly available and accessible to non-technical audiences (including policymakers, local health departments, and community leaders representing impacted residents), thereby enhancing community-informed action and participation. Reports should focus on the disparate impacts between rural and urban communities, with particular attention to overburdened communities that have under-resourced health infrastructure. In addition, states should coordinate HIAs and share findings to address cross-boundary pollution risks. This includes accounting for nearby communities across state lines, since public health impacts do not stop at jurisdictional borders and neither should the analysis.
Recommendation 2. State public health departments should establish a state-funded program that offers community education forums for affected residents to express their concerns about how data centers impact them. These programs should emphasize leading outreach, engaging communities, and contributing to qualitative analysis for HIAs. Health impact assessments should be used as a basis for informed community engagement.
Recommendation 3. States should incorporate air pollutant emissions related to data centers into their implementation of the National Ambient Air Quality Standards (NAAQS) and the development of State Implementation Plans (SIPs). This ensures that affected areas can meet standards and maintain their attainment statuses. To support this, states should evaluate the adequacy of existing regulatory monitors in capturing emissions related to data centers and determine whether additional monitoring infrastructure is required.
Local-level Action
Recommendation 1. Local governments should revise zoning regulations to include stricter and more explicit health-based protections to prevent data center clustering in already overburdened communities. Additionally, zoning ordinances should address colocation factors and evaluate potential cumulative health impacts. A prominent example is Fairfax County, Virginia, which updated its zoning ordinance in September 2024 to regulate the proximity of data centers to residential areas, require noise pollution studies prior to construction, and establish size thresholds. These updates were shaped through community engagement and input.
Recommendation 2. Local governments should appoint public health experts to zoning boards to ensure that data center placement decisions reflect community health priorities.
Conclusion
While AI can revolutionize industries and improve lives, its energy-intensive nature is also degrading air quality through emissions of air pollutants. To mitigate AI’s growing air pollution and public health risks, a comprehensive assessment of AI’s health impacts and a transition of AI data centers to cleaner backup fuels and stable energy sources, such as nuclear power, are essential. By adopting more informed and cleaner AI strategies at the federal, state, and local levels, policymakers can mitigate these harms, promote healthier communities, and ensure AI’s expansion aligns with clean air priorities.
This memo is part of our AI & Energy Policy Sprint, a policy project to shape U.S. policy at the critical intersection of AI and energy. Read more about the Policy Sprint and check out the other memos here.
It’s Summer, America’s Heating Up, and We’re Even More Unprepared
Summer officially kicked off this past weekend with the onset of a sweltering heat wave. As we hit publish on this piece, tens of millions of Americans across the central and eastern United States are experiencing dangerously high temperatures that make it unsafe to work, play, or just hang out outdoors.
The good news is that even when the mercury climbs, heat illness, injury, and death are preventable. The bad news is that over the past five months, the Trump administration has dismantled essential preventative capabilities.
At the beginning of this year, more than 70 organizations rallied around a common-sense Heat Policy Agenda to tackle this growing whole-of-nation crisis. Since then, we’ve seen some encouraging progress. The new Congressional Extreme Heat Caucus presents an avenue for bipartisan progress on securing resources and legislative wins. Recommendations from the Heat Policy Agenda have already been echoed in multiple introduced bills. Four states – California, Arizona, New Jersey, and New York – now have whole-of-government heat action plans, and several more states have plans in development. More locally, mayors are banding together to identify heat preparedness, management, and resilience solutions. FAS highlighted examples of how leaders and communities across the country are beating the heat in a Congressional briefing just last week.
But these steps in the right direction are being forestalled by the Trump Administration’s leap backwards on heat. The Heat Policy Agenda emphasized the importance of a clear, sustained federal governance structure for heat, named authorities and dedicated resourcing for federal agencies responsible for extreme heat management, and funding and technical assistance to subnational governments to build their heat readiness. The Trump Administration has not only failed to advance these goals – it has taken actions that clearly work against them.
The result? It’s summer, America’s heating up, and we’re deeply unprepared.
The heat wave making headlines today is just the latest example of how extreme heat is a growing problem for all 50 states. In just the past month, the Pacific Northwest smashed early-summer temperature records, there were days when parts of Texas were the hottest places on Earth, and Alaska – yes, Alaska – issued its first-ever heat advisory. Extreme heat is deadlier than hurricanes, floods, and tornadoes combined, and is exacerbating a mental-health crisis as well. By FAS’ estimates, extreme heat costs the nation more than $162 billion annually, costs that have made extreme heat a growing concern to private markets.
To build a common understanding of the state of federal heat infrastructure, we analyzed the status of heat-critical programs and agencies through public media, government reports, and conversations with stakeholders. All known impacts are confirmed via publicly available sources. We highlight five areas where federal capacity has been impacted:
- Leadership and governance infrastructure
- Key personnel and their expertise
- Data, forecasts, and information availability
- Funding sources and programs for preparedness, risk mitigation and resilience
- Progress towards heat policy goals
This work provides answers to many of the questions our team has been asked over the last few months about what heat work continues at the federal level. With this grounding, we close with some options and opportunities for subnational governments to consider heading into Summer 2025.
What is the Current State of Federal Capacity on Extreme Heat?
Loss of leadership and governance infrastructure
At the time of publication, all but one of the co-chairs for the National Integrated Heat Health Information System’s (NIHHIS) Interagency Working Group (IWG) have either taken an early retirement offer or have been impacted by reductions in force. The co-chairs represented NIHHIS, the National Weather Service (NWS), Health and Human Services (HHS), and the Federal Emergency Management Agency (FEMA). The National Heat Strategy, a whole-of-government vision for heat governance crafted by 28 agencies through the NIHHIS IWG, was also taken offline. A set of agency-by-agency tasks for Strategy implementation (to build short-term readiness for upcoming heat seasons, as well as to strengthen long-term preparedness) was in development as of early 2025, but this work has stalled. There was also a goal to formalize NIHHIS via legislation, given that its existence is not mandated by law – relevant legislation has been introduced but its path forward is unclear. Staff remain at NIHHIS and are continuing the work to manage the heat.gov website, craft heat resources and information, and disseminate public communications like Heat Beat Newsletter and Heat Safety Week. Their positions could be eliminated if proposed budget cuts to the National Oceanic and Atmospheric Administration (NOAA) are approved by Congress.
Staffing reductions and actual or proposed changes at FEMA and HHS, the federal disaster management agencies implicated in addressing extreme heat, are likely to be consequential for extreme heat response this summer. Internal reports have found that FEMA is not ready to respond to even well-recognized disasters like hurricanes, increasing the risk of a mismanaged response to an unprecedented heat disaster. The loss of key leaders at FEMA has also paused efforts to integrate extreme heat into agency functions, such as efforts to make extreme heat an eligible disaster. FEMA is also proposing changes that will make it more difficult to receive federal disaster assistance. The Administration for Strategic Preparedness and Response (ASPR), HHS’ response arm, has been folded into the Centers for Disease Control and Prevention (CDC), which has been directed to focus solely on infectious diseases. There is still little public information about what this merger means for HHS’ implementation of the Public Health Service Act, which requires an all-hazards approach to public health emergency management. Prior to January 2025, HHS was determining how it could use the Public Health Emergency authority to respond to extreme heat.
Loss of key personnel and their expertise
Many key agencies involved in NIHHIS, and extreme heat management more broadly, have been impacted by reductions in force and early retirements, including NOAA, FEMA, HHS, the Department of Housing and Urban Development (HUD), the Environmental Protection Agency (EPA), the U.S. Forest Service (USFS), and the Department of Energy (DOE). Some key agencies, like FEMA, have lost or will lose almost 2,000 staff. As more statutory responsibilities are put on fewer workers, efforts to advance “beyond scope” activities, like taking action on extreme heat, will likely be on the back burner.
Downsizing at HHS has been acutely devastating to extreme heat work. In January, the Office of Climate Change and Health Equity (OCCHE) was eliminated, putting a pause on HHS-wide coordination on extreme heat and the new Extreme Heat Working Group. In April, the entire staff of the Climate and Health program at CDC, the staff of the Low Income Home Energy Assistance Program (LIHEAP), and all of the staff at the National Institute for Occupational Safety and Health (NIOSH) working on extreme heat received reduction in force notices. While it appears that staff are returning to the CDC’s National Center for Environmental Health, they have lost months of time that could have been spent on preparedness, tool development, and technical assistance to local and state public health departments. Sustained funding for extreme heat programs at HHS is under threat: the FY2026 budget for HHS formally eliminates the CDC’s Climate and Health Program, all NIOSH efforts on extreme heat, and LIHEAP.
Risks to data, forecasts, and information availability, though some key tools remain online
Staff reductions at NWS have compromised local forecasts and warnings, and some offices can no longer staff around-the-clock operations. Staff reductions have also compromised weather balloon launches, which collect key temperature data for making heat forecasts. Remaining staff at the NWS are handling an increased workload at one of the busiest times of the year for weather forecasting. Reductions in force, while now reversed, have impacted real-time heat-health surveillance at the CDC, where daily heat-related illness counts have been on pause since May 21, 2025, and the site is not being maintained as of the date of this publication.
Some tools remain online and available to use this summer, including NWS/CDC’s HeatRisk (a 7-day forecast of health-informed heat risks) and the National Highway Traffic Safety Administration’s Heat-Related EMS Activation Surveillance Dashboard (which shows the number of heat-related EMS activations, time to patient, percent transported to medical facilities, and deaths). Most of the staff that built HeatRisk have been impacted by reductions in force. The return of staff to the CDC’s Climate and Health program is a bright spot, and could bode well for the tool’s ongoing operations and maintenance for Summer 2025.
Proposed cuts in the FY26 budget will further compromise heat forecasting and data. The budget proposes cutting funds for the upkeep of NOAA satellites crucial to tracking extreme weather events like extreme heat; cutting funds for the National Aeronautics and Space Administration’s Landsat program, which is used widely by researchers and private sector companies to analyze surface temperatures and understand heat’s risks; and fully defunding the National Environmental Public Health Tracking Network, which funds local and state public health departments to collect heat-health illness and death data and federal staff to analyze it.
Rollbacks in key funding sources and programs for preparedness, risk mitigation and resilience
As of May 2025, both NIHHIS Centers of Excellence – the Center for Heat Resilient Communities and the Center for Collaborative Heat Monitoring – have received stop work orders and total pauses in federal funding. These Centers were set to work with 26 communities across the country to either collect vital data on local heat patterns and potential risks or shape local governance to comprehensively address the threat of extreme heat. These communities represented a cross-section of the United States, from urban to coastal to rural to agricultural to tribal. Both Centers’ leadership teams plan to continue the work with the selected communities in a reduced capacity and to keep working towards aspirational goals like a universal heat action plan. Future research, coordination, and technical assistance on extreme heat at NOAA are at risk given the proposed total elimination of NOAA Research in the FY26 budget.
At FEMA, the Building Resilient Infrastructure and Communities (BRIC) program, a key source of funding for local heat resilience projects, has been cancelled. BRIC was the only FEMA Resilience grant program that explicitly called out extreme heat in its Notice of Funding Opportunity, and it funded $13 million in projects to mitigate the impacts of extreme heat. Many states have also faced difficulties getting paid by FEMA for grants that support their emergency management divisions, and the FY26 budget proposes cuts to these grant programs. The cancellation of AmeriCorps further reduces capacity for disaster response. FEMA is also dropping its support for improving building codes that mitigate disaster risk, as well as removing requirements for subnational governments to plan for climate change.
At HHS, a lack of staff at CDC has stalled payments from key programs that prepare communities for extreme heat: the Building Resilience Against Climate Effects (BRACE) grant program and the Public Health Preparedness and Response program. BRACE is critical federal funding for state and local climate and health offices. In states like North Carolina, the BRACE program funds life-saving efforts like heat-health alerts. Both of these programs are proposed for total elimination in the FY26 budget. The Hospital Preparedness Program (HPP) is also slated for elimination, despite being the sole source of federal funding for health care system readiness. HPP funds coalitions of health systems and public health departments, which have quickly responded to heat disasters like the 2021 Pacific Northwest Heat Dome and established comprehensive plans for future emergencies. The National Institutes of Health’s Climate and Health Initiative was eliminated and multiple grants were paused in March 2025. Research on extreme weather and health may proceed under new agency guidelines, yet overall cuts to the NIH will impact capacity to fund new studies and new research avenues. The National Institute of Environmental Health Sciences, which funds research on environmental health, faces a 36% reduction in its budget, from $994 million to $646 million.
Access to cool spaces is key to preventing heat illness and death. Yet cuts, regulatory rollbacks, and program eliminations across the federal government are preventing progress towards ensuring every American can afford their energy bills. At DOE, rollbacks in energy efficiency standards for cooling equipment and the ending of the ENERGY STAR program will raise the costs of cooling for consumers. Thankfully, DOE’s Home Energy Rebates survived the initial funding freezes, and the funding has been deployed to states to support home upgrades like heat pumps, insulation, air sealing, and mechanical ventilation. At HUD, the Green and Resilient Retrofit Program, which was set to fund important upgrades to affordable housing units that would have decreased the costs of cooling for vulnerable residents, has been paused as of March 2025. At EPA, widespread pauses and cancellations of Inflation Reduction Act programs have put projects to provide more affordable cooling solutions on hold. At the U.S. Department of Agriculture, all grantees of the Rural Energy for America Program, which funds projects that provide reliable and affordable energy in rural communities, have been asked to resubmit their grants to receive allocated funding. These delays put rural community members, who face particular risks due to their unique health and sociodemographic vulnerabilities, at greater risk from extreme heat this summer. Finally, while the remaining $400 million in LIHEAP funding was released for this year, the program faces elimination in FY26 appropriations. If this money is lost, people will very likely die, and utilities will be unable to cover the costs of unpaid bills and will delay improvements to grid infrastructure that would increase reliability.
Uncertain progress towards heat policy goals
Momentum towards establishing a federal heat stress rule as quickly as possible has stalled. The regulatory process for the Heat Injury and Illness Prevention in Outdoor and Indoor Work Settings rule is proceeding, with hearings that began June 16 and are scheduled to continue until July 3. It remains to be seen how the Occupational Safety and Health Administration (OSHA) will proceed with the existing rule as written. OSHA’s National Emphasis Program (NEP) for Heat will continue until April 6, 2026. This program focuses on identifying and addressing heat-related injuries and illnesses in workplaces and educating employers on how they can reduce these impacts on the job. To date, the NEP has conducted nearly 7,000 inspections connected to heat risks, which have led to 60 heat citations and nearly 1,400 “hazard alert” letters being sent to employers.
How Can Subnational Governments Prepare for this Upcoming Heat Season?
Downscaled federal capacity comes at a time when many states are facing budget shortfalls compounded by federal funding cuts and rescissions. The American Rescue Plan Act, the COVID-19 stimulus package, has been a crucial source of revenue for many local and state governments, enabling expansions in services like extreme heat response. That funding must be spent by December 2026, and many subnational governments are facing funding cliffs of millions of dollars that could result in the elimination of these programs. While there is growing attention to heat, it is still often deprioritized in favor of work on hazards that damage property.
Even in this environment, local and state governments can still make progress on addressing extreme heat’s impacts and saving lives. Subnational governments can:
- Conduct a data audit to ensure they are tracking the impacts of extreme heat, like emergency medical services activations, emergency room visits, hospitalizations, and deaths, and tracking expenditures dedicated to any heat-related activity.
- Develop a heat preparedness and response plan, to better understand how to leverage existing resources, capacities, and partnerships to address extreme heat. This includes understanding emergency authorities available at the local and state level that could be leveraged in a crisis.
- Use their platforms to educate the public about extreme heat and share common-sense strategies that reduce the risk of heat illness; public health departments can also target communications to the most vulnerable.
- Ensure existing capital planning and planned infrastructure build-outs prioritize resilience to extreme heat and set up cooling standards for new and existing housing and for renters. Subnational governments can also leverage strategies that reduce their fiscal risk, such as implementing heat safety practices for their own workforces and encouraging or requiring employers to deploy these practices as a way to reduce workers compensation claims.
FAS stands ready to support leaders and communities in implementing smart, evidence-based strategies to build heat readiness – and to help interested parties understand more about the impacts of the Trump administration’s actions on federal heat capabilities. Contact Grace Wickerson (gwickerson@fas.org) with inquiries.
Position On H.Res.446 – Recognizing “National Extreme Heat Awareness Week”
The Federation of American Scientists supports H.Res. 446, which would recognize July 3rd through July 10th as “National Extreme Heat Awareness Week”.
The resolution is timely, as the majority of heat-related illness and death in the United States occurs from May to September. If enacted, H.Res. 446 would raise awareness about the dangers of extreme heat, enabling individuals and communities to take action to better protect themselves this year and for years to come.
“Extreme heat is one of the leading causes of weather-related mortality and a growing economic risk,” said Grace Wickerson, Senior Manager for Climate and Health at the Federation of American Scientists. “We applaud Rep. Lawler and Rep. Stanton’s efforts to raise awareness of the threat of extreme heat with this resolution and the launch of the new Extreme Heat Caucus.”
Position On H.R.3738 – Heat Management Assistance Grant Act of 2025
The Federation of American Scientists supports H.R. 3738 of the 119th Congress, titled the “Heat Management Assistance Grant Act of 2025.”
The Heat Management Assistance Grant Act of 2025 creates the Heat Management Assistance Grant (HMAG) Program, a quick release of Federal Emergency Management Agency grants to state, local, tribal, and territorial governments for managing heat events that could become major disasters. This resourcing can be used for responses to extreme heat events, including supplies, personnel, and public assistance. HMAG is modeled after the Fire Management Assistance Grant program, which similarly deploys quick funding to activities that prevent wildfires from becoming major disaster events. The bill also creates a definition for an extreme heat event, which informs subnational leaders on when they can ask for assistance.
“Heat emergencies, such as the 2021 Pacific Northwest Heat Dome and 2024 power outage following Hurricane Beryl in Texas, demonstrate a critical need for government assistance for heat-affected communities. Yet to date, there has been no federal pathway for rapidly resourcing heat response,” said Grace Wickerson, Senior Manager for Climate and Health at the Federation of American Scientists. “The Heat Management Assistance Grant Act of 2025 is a critical step in the right direction to unlock the resources needed to save lives, and aligns with key recommendations from our 2025 Heat Policy Agenda.”
Federal Climate Policy Is Being Gutted. What Does That Say About How Well It Was Working?
On the left is the Bankside Power Station in 1953. That vast relic of the fossil era once towered over London, oily smoke pouring from its massive chimney. These days, Bankside looks like the right:
The old power plant’s vast turbine hall is now at the heart of the airy Tate Modern Art Museum; sculptures rest where the boilers once churned.
Bankside’s evolution into the Tate illustrates that transformations, both literal and figurative, are possible for our energy and economic systems. Some degree of demolition – if paired with a plan – can open up space for something innovative and durable.
Today, the entire energy sector is undergoing a massive transformation. After years of flat energy demand served by aging fossil power plants, solar energy and battery storage increasingly dominate capacity additions to meet rising load. Global investment in clean energy will be twice as large as investment in fossil fuels this year. But in the United States, the energy sector is also undergoing substantial regulatory demolition, courtesy of a wave of executive and Congressional attacks and sweeping potential cuts to tax credits for clean energy.
What’s missing is a compelling plan for the future. The plan certainly shouldn’t be to cede leadership on modern energy technologies to China, as President Trump seems to be suggesting; that approach is geopolitically unwise and, frankly, economically idiotic. But neither should the plan be to just re-erect the systems that are being torn down. Those systems, in many ways, weren’t working. We need a new plan – a new paradigm – for the next era of climate and clean energy progress in the United States.
Asking Good Questions About Climate Policy Designs
How do we turn demolition into a superior remodel? First, we have to agree on what we’re trying to build. Let’s start with what should be three unobjectionable principles.
Principle 1. Climate change is a problem worth fixing – fast. Climate change is staggeringly expensive. Climate change also wrecks entire cities, takes lives, and generally makes people more miserable. Climate change, in short, is a problem we must fix. Ignoring and defunding climate science is not going to make it go away.
Principle 2. What we do should work. Tackling the climate crisis isn’t just about cleaning up smokestacks or sewer outflows; it’s about shifting a national economic system and physical infrastructure that have been rooted in fossil fuels for more than a century. Our responses must reflect this reality. To the extent possible, we will be much better served by developing fit-for-purpose solutions rather than just press-ganging old institutions, statutes, and technologies into climate service.
Principle 3. What we do should last. The half-life of many climate strategies in the United States has been woefully short. The Clean Power Plan, much touted by President Obama, never went into force. The Trump administration has now turned off California’s clean vehicle programs multiple times. Much of this hyperpolarized back-and-forth is driven by a combination of far-right opposition to regulation as a matter of principle and the fossil fuel industry pushing mass de-regulation for self-enrichment – a frustrating reality, but one that can only be altered by new strategies that are potent enough to displace vocal political constituencies and entrenched legacy corporate interests.
With these principles in mind, the path forward becomes clearer. We can agree that ambitious climate policy is necessary; protecting Americans from climate threats and destabilization (Principle 1) directly aligns with the founding Constitutional objectives of ensuring domestic tranquility, providing for the common defense, and promoting general welfare. We can also agree that the problem in front of us is figuring out which tools we need, not how to retain the tools we had, regardless of their demonstrated efficacy (Principle 2). And we can recognize that achieving progress in the long run requires solutions that are both politically and economically durable (Principle 3).
Below, we consider how these principles might guide our responses to this summer’s crop of regulatory reversals and proposed shifts in federal investment.
Honing Regulatory Approaches
The Trump Administration recently announced that it plans to dismantle the “endangerment finding” – the legal predicate for the Environmental Protection Agency (EPA) to regulate greenhouse gas emissions from power plants and transportation; meanwhile, the Senate revoked permission for California to enforce key car and truck emission standards. The administration has also proposed rolling back key power plant toxics and greenhouse gas standards. We agree with those who think these actions are scientifically baseless and likely illegal, and we therefore support efforts to counter them. But we should also reckon honestly with how the regulatory tools we are defending have played out so far.
Federal and state pollution rules have indisputably been a giant public-health victory. EPA standards under the Clean Air Act led directly to dramatic reductions in harmful particulate matter and other air pollutants, saving hundreds of thousands of lives and avoiding millions of cases of asthma and other respiratory diseases. Federal regulations similarly caused mercury pollution from coal-fired power plants to drop by 90% in just over a decade. Pending federal rollbacks of mercury rules thus warrant vocal opposition. In the transportation sector, tailpipe emissions standards for traditional combustion vehicles have been impressively effective. These and other rules have indeed delivered some climate benefits by forcing the fossil fuel industry to face pollution clean-up costs and driving development of clean technologies.
But if our primary goal is motivating a broad energy transition (i.e., what needs to happen per Principle 1), then we should think beyond pollution rules as our only tools – and allocate resources beyond immediate defensive fights. Why? The first reason is that, as we have previously written, these rules are poorly equipped to drive that transition. Federal and state environmental agencies can do many things well, but running national economic strategy and industrial policy primarily through pollution statutes is hardly the obvious choice (Principle 2).
Consider the power sector. The most promising path to decarbonizing the grid is speeding up the replacement of old coal and gas plants with renewables – easing unduly complex interconnection processes so that clean energy can be added to meet rising demand and old plants can retire and be replaced – not bolting pollution-control devices onto ancient smokestacks. That is, at heart, an economic and grid policy puzzle, not a pollution regulatory challenge. Most new power plants are renewable- or battery-powered anyway. Some new gas plants might be built in response to growing demand, but the gas turbine pipeline is backed up, limiting the scope of new fossil power, and cheaper clean power is coming online much more quickly wherever grid regulators have their act together. Regulations could certainly help accelerate this shift, but the evidence suggests that they may be complementary, not primary, tools.
The upshot is that economics and subnational policies, not federal greenhouse gas regulation, have largely driven power plant decarbonization to date and therefore warrant our central focus. Indeed, states that have made adding renewable infrastructure easy, like Texas, have often been ahead of states, like California, where regulatory targets are stronger but infrastructure is harder to build. (It’s also worth noting that these same economics mean that the Trump Administration’s efforts to revert back to a wholly fossil fuel economy by repealing federal pollution standards will largely fail – again, wrong tool to substantially change energy trajectories.)
The second reason is that applying pollution rules to climate challenges has hardly been a lasting strategy (Principle 3). Despite nearly two decades of trying, no regulations for carbon emissions from existing power plants have ever been implemented. It turns out to be very hard, especially with the rise of conservative judiciaries, to write regulations for power plants under the Clean Air Act that both stand up in court and actually yield substantial emissions reductions.
In transportation, pioneering electric vehicle (EV) standards from California – helped along by top-down economic leverage applied by the Obama administration – did indeed begin a significant shift and start winning market share for new electric car and truck companies; under the Biden administration, California doubled down with a new set of standards intended to ultimately phase out all sales of gas-powered cars while the EPA issued tailpipe emissions standards that put the industry on course to achieve at least 50% EV sales by 2030. But California’s EV standards have now been rolled back by the Trump administration and a GOP-controlled Congress multiple times; the same is true for the EPA rules. Lest we think that the Republican party is the sole obstacle to a climate-focused regulatory regime that lasts in the auto sector, it is worth noting that Democratic states led the way on rollbacks. Maryland, Massachusetts, Oregon, and Vermont all paused, delayed, or otherwise fuzzed up their plans to deploy some of their EV rules before Congress acted against California. The upshot is that environmental standards, on their own, cannot politically sustain an economic transition at this scale without significant complementary policies.
Now, we certainly shouldn’t abandon pollution rules – they deliver massive health and environmental benefits while forcing the market to more accurately account for the costs of polluting technologies. But environmental statutes built primarily to reduce smokestack and tailpipe emissions, important as they remain, are simply not designed to be the primary driver of wholesale economic and industrial change. Unsurprisingly, efforts to make them do that anyway have not gone particularly well – so much so that, today, greenhouse gas pollution standards for most economic sectors either do not exist or have run into implementation barriers. These observations should guide us to double down on the policies that improve the economics of clean energy and clean technology – from financial incentives to reforms that make it easier to build – while developing new regulatory frameworks that avoid the pitfalls of the existing Clean Air Act playbook. For example, we might learn from state regulations like clean electricity standards that have driven deployment and largely withstood political swings.
To mildly belabor the point – pollution standards form part of the scaffolding needed to make climate progress, but they don’t look like the load-bearing center of it.
Refocusing Industrial Policy
Our plan for the future demands fresh thinking on industrial policy as well as regulatory design. Years ago, Nobel laureate Dr. Elinor Ostrom pointed out that economic systems shift not as a result of centralized fiat, from the White House or elsewhere, but from a “polycentric” set of decisions rippling out from every level of government and firm. That proposition has been amply borne out in the clean energy space by waves of technology innovation, often anchored by state and local procurement, regional technology clusters, and pioneering financial institutions like green banks.
The Biden Administration responded to these emerging understandings with the CHIPS and Science Act, Bipartisan Infrastructure Law (BIL), and Inflation Reduction Act (IRA) – a package of legislation intended to shore up U.S. leadership in clean technology through investments that cut across sectors and geographies. These bills included many provisions and programs with top-down designs, but the package as a whole did engage with, and encourage, polycentric and deep change.
Here again, taking a serious look at how this package played out can help us understand what industrial policies are most likely to work (Principle 2) and to last (Principle 3) moving forward.
We might begin by asking which domestic clean-technology industries need long-term support and which do not in light of (i) the multi-layered and polycentric structure of our economy, and (ii) the state of play in individual economic sectors and firms at the subnational level. IRA revisions that appropriately phase down support for mature technologies in a given sector or region where deployment is sufficient to cut emissions at an adequate pace could be worth exploring in this light – but only if market-distorting supports for fossil-fuel incumbents are also removed. We appreciate thoughtful reform proposals that have been put forward by those on the left and right.
More directly: if the United States wants to phase down, say, clean power tax credits, such changes should be paired with the removal of support for fossil power plants and of interconnection barriers, shifting the entire energy market towards fair competition to meet increasing load, as well as with new, durable regulatory structures that ensure a transition to a low-carbon economy at a sufficient pace. Subsidies and other incentives could appropriately be retained for technologies (e.g., advanced battery storage and nuclear) that are still in relatively early stages and/or for which there is a particularly compelling argument for strengthening U.S. leadership. One could similarly imagine a gradual shift away from EV tax credits – if other transportation system spending were also reallocated to properly balance support among highways, EV charging stations, transit, and other types of transportation infrastructure. In short, economic tools have tremendous power to drive climate progress, but they must be paired with the systemic reforms needed to ensure that clean energy technologies have a fair pathway to long-term economic durability.
Our analysis can also touch on geopolitical strategy. It is true that U.S. competitors are ahead in many clean technology fields; it is simultaneously true that the United States has a massive industrial and research base that can pivot ably with support. A pure on-shoring approach is likely to be unwise – and we have just seen courts enjoin the administration’s fiat tariff policy that sought that result. That’s a good opportunity to have a more thoughtful conversation (in which many are already engaging) on areas where tariffs, public subsidies, and other on-shoring planning can actually position our nation for long-term economic competition on clean technology. Opportunities that rise to the top include advanced manufacturing, such as for batteries, and critical industries, like the auto sector. There is also a surprising but potent national security imperative to center clean energy infrastructure in U.S. industrial policy, given the growing threat of foreign cyberattacks that are exploiting “seams” in fragile legacy energy systems.
Finally, our analysis suggests that states, which are primarily responsible for economic policy in their jurisdictions, have a role to play in this polycentric strategy that extends beyond simply replicating repealed federal regulations. States have a real opportunity in this moment to wed regulatory initiatives with creative whole-of-the-economy approaches that can actually deliver change and clean economic diversification, positioning them well to outlast this period of churn and prosper in a global clean energy transition.
A successful and “sticky” modern industrial policy must weave together all of the above considerations – it must be intentionally engineered to achieve economic and political durability through polycentric change, rather than relying solely or predominantly on large public subsidies.
Conclusion
The Trump Administration has moved with alarming speed to demolish programs, regulations, and institutions that were intended to make our communities and planet more livable. Such wholesale demolition is unwarranted, unwise, and should not proceed unchecked. At the same time, it is, as ever, crucial to plan for the future. There is broad agreement that achieving an effective, equitable, and ethical energy transition requires us to do something different. Yet there are few transpartisan efforts to boldly reimagine regulatory and economic paradigms. Of course, we are not naive: political gridlock, entrenched special interests, and institutional inertia are formidable obstacles to overcome. But there is still room, and need, to try – and effort bears better fruit when aimed at the right problems. We can begin by seriously debating which past approaches work, which need to be improved, and which ultimately need imaginative recasting to succeed in our ever-more complex world. Answers may be unexpected. After all, who would have thought that the best future for the vast oil-fired power station south of the Thames with which we began this essay would turn out to be, a few decades later, a serene and silent hall full of light and reflection?
AI, Energy, and Climate: What’s at Stake? Hint: A lot.
DC’s first-ever Climate Week brought with it many chances to discuss the hottest-button topics in climate innovation and policy. FAS took the opportunity to do just that, by hosting a panel to explore the intersection of artificial intelligence (AI), energy, and climate issues with leading experts. Dr. Oliver Stephenson, FAS’ Associate Director of Artificial Intelligence and Emerging Technology Policy, sat down with Dr. Tanya Das, Dr. Costa Samaras, and Charles Hua to discuss what’s at stake at this critical crossroads moment.
Missed the panel? Don’t fret. Read on to learn the need-to-knows. Here’s how these experts think we can maximize the “good” and minimize the “bad” of AI and data centers, leverage research and development (R&D) to make AI tools more successful and efficient, and better align incentives for AI growth with the public good.
First, Some Level Setting
The panelists took their time to make sure the audience understood two key facts regarding this space. First, not all data centers are utilized for AI. The Electric Power Research Institute (EPRI) estimates that AI applications are only used in about 10-20% of data centers. The rest? Data storage, web hosting capabilities, other cloud computing, and more.
Second, load growth due to the energy demand of data centers is happening, but its exact magnitude remains unknown. Lawrence Berkeley National Lab (LBNL) models project that data centers in the US will consume anywhere between 6.7% and 12% of US electricity generation by 2028. For a country that consumes roughly 4 trillion kilowatt hours (kWh) of electricity each year, that range spans a couple hundred billion kWh/year from the low end to the high. These projections are also built on different assumptions that factor in AI energy efficiency improvements, hardware availability, regulatory decisions, modeling advancements, and just how much demand there will be for AI. When each of these conditions is evolving daily, even the most credible projections come with a good amount of uncertainty.
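To make the scale of that range concrete, here is a minimal back-of-the-envelope sketch using only the round figures cited above (roughly 4 trillion kWh of annual US generation and LBNL’s 6.7%–12% projected share). The numbers are illustrative inputs taken from this article, not precise LBNL model outputs.

```python
# Back-of-the-envelope: how wide is the LBNL 2028 data center projection range?
# Inputs are the approximate figures cited in the text above, for illustration only.
US_ANNUAL_GENERATION_KWH = 4e12        # ~4 trillion kWh of US electricity per year
LOW_SHARE, HIGH_SHARE = 0.067, 0.12    # LBNL's projected 6.7%-12% data center share by 2028

low_kwh = US_ANNUAL_GENERATION_KWH * LOW_SHARE
high_kwh = US_ANNUAL_GENERATION_KWH * HIGH_SHARE
spread_kwh = high_kwh - low_kwh

print(f"Low end:  {low_kwh / 1e9:,.0f} billion kWh/year")    # ~268 billion kWh
print(f"High end: {high_kwh / 1e9:,.0f} billion kWh/year")   # ~480 billion kWh
print(f"Spread:   {spread_kwh / 1e9:,.0f} billion kWh/year") # ~212 billion kWh
```

In other words, the gap between the low and high projections alone is on the order of 200 billion kWh per year, which is why small changes in the underlying assumptions matter so much.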
There is also ambiguity in the numbers and in the projections at the local and state levels, as many data center companies shop around to multiple utilities to get the best deal. This can sometimes lead to projects getting counted twice in local projections. Researchers at LBNL have recently said they can confidently make data center energy projections out to 2028. Beyond that, they can’t make reasonable assumptions about data center load growth amid growing load from other sectors working to electrify—like decarbonizing buildings and electric vehicle (EV) adoption.
Maximizing the Good, Minimizing the Bad
As data center clusters continue to proliferate across the United States, their impacts—on energy systems and load growth, water resources, housing markets, and electricity rates—will be most acutely felt at the state and local levels. DC’s nearby neighbor Northern Virginia has become a “data center alley” with more than 200 data centers in Loudoun County alone, and another 117 in the planning stages.
States ultimately hold the power to shape the future of the industry through utility regulation, zoning laws, tax incentives, and grid planning – with specific emphasis on state Public Utility Commissions (PUCs). PUCs have a large influence on where data centers can connect to the grid and on the rate structures that determine how each data center pays for its power—whether through tariffs, increases in consumer rates, or other cost agreements. It is imperative that vulnerable ratepayers are not left to shoulder the costs and risks associated with the rapid expansion of data centers, including higher electricity bills, increased grid strain, and environmental degradation.
Panelists emphasized that despite the potential negative impacts of AI and data center expansion, leaders have a real opportunity to leverage AI to maximize positive outcomes—like improving grid efficiency, accelerating clean energy deployment, and optimizing public services—while minimizing harms like overconsumption of energy and water, or reinforcing environmental injustice. Doing so, however, will require new economic and political incentives that align private investment with public benefit.
Research & Development at the Department of Energy
The U.S. Department of Energy (DOE) is uniquely positioned to help solve the challenges AI and data centers pose, as the agency sits at the critical intersection of AI development, high-performance computing, and energy systems. DOE’s national laboratories have been central to advancing AI capabilities: Oak Ridge National Laboratory (ORNL) was among the first to integrate graphics processing units (GPUs) into supercomputers, pioneering a new era of AI training and modeling capacity. DOE also runs two of the world’s most powerful supercomputers – Aurora at Argonne National Lab and Frontier at ORNL – cementing U.S. leadership in high-performance computing.
Beyond computing, DOE plays a key role in modernizing grid infrastructure, advancing clean energy technologies, and setting efficiency standards for energy-intensive operations like data centers. The agency has also launched programs like the Frontiers in Artificial Intelligence for Science, Security and Technology (FASST), overseen by the Office of Critical and Emerging Tech (CET), to coordinate AI-related activities across its programs.
As the intersection of AI and energy deepens—with AI driving data center expansion and offering tools to manage its impact—DOE must remain at the center of this conversation, and it must continue to deliver. The stakes are high: how we manage this convergence will influence not only the pace of technological innovation but also the equity and sustainability of our energy future.
Incentivizing Incentives: Aligning AI Growth with the Public Good
The U.S. is poised to spend a massive amount of carbon to power the next wave of artificial intelligence. From training large language models (LLMs) to supporting real-time AI applications, the energy intensity of this sector is undeniable—and growing. That means we’re not just investing financially in AI; we’re investing environmentally. To ensure that this investment delivers public value, we must align political and economic incentives with societal outcomes like grid stability, decarbonization, and real benefits for American communities.
One of the clearest opportunities lies in making data centers more responsive to the needs of the electric grid. While these facilities consume enormous amounts of power, they also hold untapped potential to act as flexible loads—adjusting their demand based on grid conditions to support reliability and integrate clean energy. The challenge? There’s currently little economic incentive for them to do so. One panelist noted skepticism that market structures alone will drive this shift without targeted policy support or regulatory nudges.
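To illustrate what operating as a flexible load could look like in practice, here is a minimal, purely hypothetical sketch of a scheduling rule that shifts deferrable compute jobs (such as batch AI training) toward hours with lower grid carbon intensity and lower prices. The signal values, thresholds, and function names are assumptions for illustration; they are not drawn from any actual utility tariff, grid operator program, or the panelists’ remarks.

```python
from dataclasses import dataclass

@dataclass
class HourlySignal:
    hour: int
    carbon_intensity: float  # grams CO2 per kWh (hypothetical grid signal)
    price: float             # dollars per kWh (hypothetical time-of-use rate)

def schedule_deferrable_load(signals: list[HourlySignal], hours_needed: int) -> list[int]:
    """Pick the cleanest, cheapest hours in which to run deferrable jobs
    (e.g., batch AI training), leaving time-critical workloads untouched."""
    # Rank hours by carbon intensity, then price; a real program would weight these
    # according to tariff design and reliability constraints.
    ranked = sorted(signals, key=lambda s: (s.carbon_intensity, s.price))
    return sorted(s.hour for s in ranked[:hours_needed])

# Example: a hypothetical day where midday solar lowers both emissions and prices.
day = [
    HourlySignal(h,
                 carbon_intensity=600 - 30 * min(h, 23 - h),
                 price=0.12 if 10 <= h <= 16 else 0.18)
    for h in range(24)
]
print(schedule_deferrable_load(day, hours_needed=6))  # -> midday hours, e.g. [9, 10, 11, 12, 13, 14]
```

The point of the sketch is simply that the technical logic is easy; what is missing, as the panelist noted, is a price or policy signal strong enough to make running it worthwhile.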
Instead, many data centers continue to benefit from “sweetheart deals”—generous tax abatements and economic development incentives offered by states and municipalities eager to attract investment. These agreements often lack transparency and rarely require companies to contribute to local energy resilience or emissions goals. For example, in several states, local governments have offered multi-decade property tax exemptions or reduced electricity rates without any accountability for climate impact or grid contributions.
New AI x Energy Policy Ideas Underway
If we’re going to spend gigatons of carbon in pursuit of AI-driven innovation, we must be strategic about where and how we direct incentives. That means:
- Conditioning public subsidies on data center flexibility and efficiency performance.
- Requiring visibility into private energy agreements and emissions footprints.
- Designing market signals—like time-of-use pricing or demand response incentives—that reward facilities for operating in sync with clean energy resources.
We don’t just need more incentives—we need better ones. And we need to ensure they serve public priorities, not just private profit. Through our AI x Energy Policy Sprint, FAS is working with leading experts to develop promising policy solutions for the Trump administration, Congress, and state and local governments. These policy memos will address how to: mitigate the energy and environmental impacts of AI systems and data centers, enhance the reliability and efficiency of energy systems using AI applications, and unlock transformative technological solutions with AI and energy R&D.
Right now, we have a rare opportunity to shape U.S. policy at the critical intersection of AI and energy. Acting decisively today ensures we can harness AI to drive innovation, revolutionize energy solutions, and sustainably integrate transformative technologies into our infrastructure.
Building an Environmental Regulatory System that Delivers for America
The Clean Air Act. The Clean Water Act. The National Environmental Policy Act. These and most of our nation’s other foundational environmental laws were passed decades ago – and they have started to show their age. The Clean Air Act, for instance, was written to cut air pollution, not to drive the whole-of-economy response that the climate crisis now warrants. The Energy Policy and Conservation Act of 1975 was designed to make cars more efficient in a pre-electric vehicle era, and now puts the Department of Transportation in the awkward position of setting fuel economy standards in an era when more and more cars don’t burn gas.
Trying to manage today’s problems with yesterday’s laws results in government by kludge. Legacy regulatory architecture has foundered under a patchwork of legislative amendments and administrative procedures designed to bridge the gap between past needs and present realities. Meanwhile, Congressional dysfunction has made purpose-built updates exceptionally difficult to land. The Inflation Reduction Act, for example, was mostly designed to move money rather than rethink foundational statutes or regulatory processes – because those rethinks couldn’t make it past the filibuster.
As the efficacy of environmental laws has waned, so has their durability. What was once a broadly shared goal – protecting Americans from environmental harm – is now a political football, with rules that whipsaw back and forth depending on who’s in charge.
Against this backdrop, the second Trump Administration launched the biggest environmental deregulatory campaign in history. But that campaign, coupled with massive reductions in the federal civil service and a suite of landmark court decisions (including Loper Bright) about how federal agencies regulate, risks pushing U.S. regulatory architecture past the point of sensible and much-needed reform and into a state of complete disrepair.
Dismantling old systems has proven surprisingly easy. Building what comes next will be harder. And the work must begin now.
It is time to articulate a long-term vision for a government that can actually deliver in an ever-more complex society. The Federation of American Scientists (FAS) is meeting this moment by launching an ambitious new project to reimagine the U.S. environmental regulatory state, drawing ideas from across ideological lines.
The Beginning of a New Era
Fear of the risks of systemic change often prevents people from entertaining change in earnest. Think of the years of U.S. squabbles over how or whether to reform permitting and environmental review, while other countries simply raced ahead to build clean energy projects and establish dominance in the new world economy. Systemic stagnation, however, comes with its own consequences.
The Inflation Reduction Act (IRA) and the Infrastructure Investment and Jobs Act (IIJA) are a case in point when it comes to climate and the environment. Together, these two pieces of legislation represented the largest global investment in the promise of a healthier, more sustainable, and, yes, cheaper future. Unfortunately, as proponents of the “abundance” paradigm and others have observed, rollout was hampered by inefficient processes and outdated laws. Implementing the IRA and the IIJA via old systems, in short, was like trying to funnel an ocean through a garden hose – and as a result, most Americans experienced only a trickle of real-world impact.
Similar barriers are constraining state progress. For example, the way we govern and pay for electricity has not kept pace with a rapidly changing energy landscape – meaning that the United States risks ceding leadership on energy technologies critical to national security, economic competitiveness, and combating climate change.
But we are perhaps now entering a new era. The United States appears to be on the edge of real political realignments, with transpartisan stakes around the core role of government in economic development that do not map neatly onto current coalitions. This realignment presents a crucial opportunity to catalyze a new era of climate, environmental, and democratic progress.
FAS will leverage this opportunity by providing a forum for debate and engagement on different facets of climate and environmental governance, a platform to amplify insights, and the capacity to drive forward solutions. Examples of topics ripe for exploration include:
- Balancing agility and accountability. As observed, regulatory approaches of the past have struggled to address the interconnected, quickly evolving nature of climate and environmental challenges. At the same time, mechanisms for ensuring accountability have been disrupted by an evolving legal landscape and increasingly muscular executive. There is a need to imagine and test new systems designed to move quickly but responsibly on climate and environmental issues.
- Complementing traditional regulation through novel strategies. There is growing interest in using novel financial, contractual, and other strategies as a complement to regulation for driving climate and environmental progress. There is considerable room to go deeper in this space, identifying both the power of these strategies and their limits.
- Rethinking stakeholder engagement. The effectiveness of regulation depends on its ability to serve diverse stakeholder needs while advancing environmental goals. Public comment and other pipelines for engaging stakeholders and integrating external perspectives and expertise into regulations have been disrupted by technologies such as AI, while the relationship between regulated entities and their regulators has become increasingly adversarial. There is a need to examine synergies and tradeoffs between centering stakeholders and centering outcomes in regulatory processes, as well as examine how stakeholder engagement could be improved to better ensure regulations that are informed, feasible, durable, and distributively fair.
In working through topics like these, FAS seeks to lay out a positive vision of regulatory reconstruction that is substantively superior to either haphazard destruction or incremental change. Our vision is nothing less than to usher in a new paradigm of climate and environmental governance: one that secures a livable world while reinforcing democratic stability, through systems that truly deliver for America.
We will center our focus on the federal government given its important role in climate and environmental issues. However, states and localities do a lot of the work of a federated government day-to-day. We recognize that federal cures are unlikely to fully alleviate the symptoms that Americans are experiencing every day, from decaying infrastructure to housing shortages. We are committed to ensuring that solutions are appropriately matched to the root cause of state capacity problems and that federal climate and environmental regulatory regimes are designed to support successful cooperation with local governments and implementation partners.
FAS is no stranger to ambitious endeavors like these. Since our founding in 1945, we have been committed to tackling the major science policy issues that reverberate through American life. This new FAS workstream will be embedded across our Climate and Environment, Clean Energy, and Government Capacity portfolios. We have already begun engaging and activating the diverse community of scholars, experts, and leaders laying the intellectual groundwork to develop compelling answers to urgent questions surrounding the climate regulatory state, against the backdrop of a broader state capacity movement. True to our nonpartisan commitment, we will build this work on a foundation of cross-ideological curiosity and play on the tension points in existing coalitions that strike us all as most productive.
We invite you to join us in conversation and collaboration. If you want to get involved, contact Zoë Brouns (zbrouns@fas.org).