Rebuilding Environmental Governance: Understanding the Foundations

Today we face persistent, complex, and accelerating environmental challenges that require adding new approaches to existing environmental governance frameworks. The scale of some of them, such as climate change, requires rethinking our regulatory tools, while diffuse sources of pollutants present additional difficulties. At the same time, effective governance systems must accommodate new infrastructure, housing, and energy delivery to support communities. Our legal framework must be sufficiently stable to enable regulation, investment, and innovation to proceed without the discontinuities and gridlock of the past few decades.

In an increasingly divided atmosphere, it will take candid, multiperspective dialogue to identify paths toward such a framework. This discussion paper explores the baseline that we’re building on and some key dynamics to consider as we think about the durable systems, approaches, and capacity needed to achieve today’s multiple societal goals.


Our environmental system was built for 1970s-era pollution control. Today it needs stable, integrated, multi-level governance that can make tradeoffs, share and use evidence, and deliver infrastructure, and that recognizes improved trust and participation as essential to future progress.


Modernize today’s system of cooperative federalism to address the lack of clear and intentional interconnections, adaptive feedback loops, and aligned objectives.


The first half of the 20th century saw the emergence of our first national laws regulating public resources: the Federal Power Act in the 1930s, the precursor to the Clean Water Act in the 1940s, and the first version of the Clean Air Act in the 1950s. Then, in a concentrated decade of new laws and massive amendments to existing ones, the 1970s saw a focus on assessing, controlling, and reducing pollution, while setting ambitious goals for human and ecosystem health. These statutes generally were constructed around specific resources—airsheds, watersheds, public lands, and wildlife habitat—and articulated specific roles for federal agencies and other levels of government. State efforts were incorporated into a nationwide system of cooperative federalism, while many states undertook their own initiatives to address environmental problems.

For half a century these laws—enacted with overwhelming, bipartisan congressional support—produced a great deal of success, with conventional pollution decreasing across many resources and regions and some species and habitats recovering. But we have plateaued in terms of broad improvements, and meanwhile novel pollutants and more diffuse, global threats have emerged. Political shifts, legacy economic interests, and a changing information landscape have played an important role, as amply recounted elsewhere.

The bipartisan legislation of the 1970s arose from both idealism and necessity, during an Earth Day moment that embraced ecological thinking in response to tangible harms to humans and the environment. The laws enjoyed massive public support and got many things right. Some were aspirational and holistic, such as the Clean Water Act’s “zero-discharge” target or NEPA’s vision “to create and maintain conditions under which man and nature can exist in productive harmony, and fulfill the social, economic, and other requirements of present and future generations of Americans.” The latter Act established the Council on Environmental Quality to coordinate this policy across the entire federal government.

Other advances came piecemeal, focused on specific resources. The U.S. Environmental Protection Agency (EPA) was cobbled together by an executive plan to reorganize several existing agencies and offices, then granted authority in a series of media-specific statutes that began with the Clean Air Act, Clean Water Act, and Safe Drinking Water Act, and later the Toxic Substances Control Act and Federal Insecticide, Fungicide, and Rodenticide Act. The Resource Conservation and Recovery Act, Superfund, and Oil Pollution Act addressed hazardous substances affecting the nation’s health and ecosystems. Implementation of all these laws required the Agency to develop in-house scientific expertise and detailed regulations that fleshed out statutory standards and applied them to specific sectors—an approach upheld for decades by the Supreme Court.

These laws made unquestionable progress on conventional pollution and waste, the visible, toxic byproducts of industrial production and consumer culture that had spurred the environmental movement and drawn a generation of lawyers to the new profession. But with specialization came fragmentation of environmental law into a plethora of subtopics, and a managerial, permit-centric legal culture that risked losing sight of ecological goals. Nor were the benefits distributed equally by race or class, as demonstrated by pioneering studies in the field of environmental justice.

As the field matured, it slowed, with congressional interventions becoming less frequent and more technical. Some of the last major amendments to a bedrock environmental statute were the Clean Air Act Amendments of 1990, enacted by a bipartisan Congress and signed by President George H.W. Bush. (The other prominent example is the Frank R. Lautenberg Chemical Safety for the 21st Century Act (Lautenberg Chemical Safety Act), a major amendment to TSCA in 2016.) Absent updated legislation, EPA regulations became paramount, but these had to run a gauntlet of shifting policy priorities, complex rulemaking procedures, litigation, and a transformed and often skeptical Supreme Court. 

Critiques of this system date back almost as far as the statutes themselves. One ELI study listed 34 major “rethinking” efforts emanating from academia, blue-ribbon commissions, and NGOs between 1985 and 2014, across the political spectrum and ranging from incremental reforms to radical reinvention. One highly touted initiative, led by sitting Vice President Al Gore, resulted in some modest administrative streamlining. Most remained paper exercises, appealing to good-government advocates but lacking political support.

The stakes grew higher with increasing awareness of climate change. In June 1988, NASA scientist James Hansen delivered landmark congressional testimony that global warming had begun; extensive press coverage and book-length treatments followed, sparking broad discussion of what was then a fully bipartisan issue. Vice President Bush campaigned on addressing it, and as President in 1992, he traveled to Rio de Janeiro to sign the U.N. Framework Convention on Climate Change. With successes like the 1987 Montreal Protocol on the ozone layer or EPA’s 1990 Acid Rain Program doubtless in mind, the Senate ratified the Framework Convention 92-0.

But climate change implicates much larger portions of the U.S. economy—energy, transportation, agriculture—at individual as well as industrial scales. While NEPA embodied the 1960s slogan that “everything is connected,” the lesson of climate change is that many things emit greenhouse gases, and all things will be affected by global warming. The need for systemic change proved to be an uneasy fit with existing site-specific, media-specific environmental laws.

Growing awareness of climate change and the scale of action needed to address it also generated a backlash from entrenched economic interests. By the mid-2000s, the Bush/Cheney administration had reversed course on federal climate commitments. It contested and lost Massachusetts v. EPA, a landmark ruling in which a narrowly divided Supreme Court held that the Clean Air Act applies to greenhouse gas emissions that affect the climate. 

The Administration’s argument was captured by Justice Antonin Scalia’s flippant remark in dissent that “everything airborne, from Frisbees to flatulence, [would] qualif[y] as an ‘air pollutant.’” In Scalia’s view, real pollution was visible, earthbound, toxic, and inhaled, not a matter of colorless molecules interacting in the stratosphere. Even in dissent, this view set the stage for subsequent legal battles, right up to the present effort to revoke EPA’s 2009 “endangerment finding” that now underpins federal greenhouse gas regulation.

Climate change likewise laid bare the long-standing divide between environmental law, which historically regulated the power sector in terms of its fuel inputs and combustion byproducts, and energy and utility law, which focused more on transmission and distribution of the resulting power. (Both fields are further divided among federal, state, and local authorities, as discussed below.) Vehicle emissions similarly are regulated via both EPA tailpipe standards and National Highway Traffic Safety Administration mileage standards, with California authorized to adopt more stringent ones. When coordinated, this multi-headed structure produces steady advances, but in deregulatory moments it has become fertile ground for opportunism, retrenchment, and delay.

At the federal level, these questions have been exacerbated by massive shifts in administrative law, long the building block of environmental law and climate action, and in federal court rulings on the separation of powers, implicating the authority of federal agencies to issue and enforce rules. Successive administrations have run afoul of the current Supreme Court majority, whose “major questions doctrine” casts a shadow both on attempts to fit new problems into once-expansive environmental statutes, and on “whole of government” approaches that attempt to address climate change’s sources and impacts across the entire economy. 

Tentative attempts by presidents to leverage executive power and emergency authority have been curtailed when invoked for regulatory purposes, but are running strong in deregulatory efforts and executive actions in the service of “energy dominance.” Whether the Supreme Court will articulate some principled limits, and whether those will be even-handedly applied to future administrations, remains to be seen. Meanwhile, the past year has seen a large-scale push to reduce environmental regulation, in parallel with abrupt reorganizations and steep reductions in the federal workforce and agency budgets. These actions were joined by sharp declines in environmental enforcement and U.S. withdrawal from environmental and climate-related international instruments and bodies.

In this uncertain atmosphere, attention has turned to new technologies and building the necessary infrastructure to support growth in low- and zero-carbon energy. As clean energy alternatives have matured and become economically competitive, the climate imperative is pushing against long-standing environmental review and permitting procedures. That may well include NEPA, which is now attracting attention from all three branches of government and prompting a robust debate about whether, or how much, its procedures might be slowing energy deployment.

Environmental issues were federalized for a reason: to counter pollution that crosses state borders and to prevent a race to the bottom. But decades of implementation have seen the blunting of some tools, expansion of others, and identification of gaps. Moving forward requires reaffirming that the environment is inseparable from societal health and well-being, economic stability, and energy systems. Any serious response must orient governance toward decarbonization, while embedding accountability, equity, and justice from the outset rather than inconsistently and often inadequately after the fact. Doing all this without sacrificing hard-won environmental gains will not be easy.

To meet the challenge of the worldwide crises of biodiversity loss, pollution overload, and climate change, creation of any new structure must be rooted in understanding the existing baseline for environmental governance. 

Cross-Cutting Objectives

Inseparable: Environment, Energy, Economy, and Society

The past half-century has demonstrated the impossibility of severing the environment from the economy, energy production, and social well-being. The false dichotomy between environmental protection and economic development, the oversimplified idea that the two are locked in zero-sum competition, must also fade. The decades-old concept of sustainability (or the triple bottom line) has not yet made its way into many of our foundational laws and governance structures.

Ignoring the complex relationships among environment, energy, the economy, and society favors short-term decisions that externalize impacts. This underlies the longstanding debate over the accuracy and efficacy of cost-benefit analyses, throughout their 40-plus year federal history, including questions about scope and how they handle uncertainty. For any project or program, system designers who consider an integrated suite of factors beyond basic environmental parameters or economic indicators (from public health to workforce development, from the supply chain to community well-being) have a greater chance of cross-sector success.

These governance challenges are also inseparable from shifts in how finance flows. Public and private financial tools—from subsidies and tax credits to loans, grants, and community-based financing—are increasingly shaping market behavior and determining whether policy objectives translate into real-world outcomes. Who controls these tools, how they are deployed, and when capital is made available all play a central role in driving or constraining environmental progress.

Bridging these gaps is, of course, easier said than done. But widening the aperture of considerations can connect decisionmaking to holistic industrial policies that account for a wider range of economic, social, and environmental factors. Accounting for this wider range isn’t just a nice-to-have, but essential to shared prosperity. 

Foundational: Trust and Participation 

A process, project, or program will move at the speed of trust—no faster and no slower. This refers to trust in institutions, in science, and in process. 

Trust is earned through consistent transparency, clear accountability, and demonstrated responsiveness. For governance systems to function at the scale and pace required today, these principles must be embedded in decisionmaking in ways that are coherent and durable, rather than fragmented across a series of disparate steps and entities. Our traditional frameworks contain mechanisms to solicit and incorporate public input. But those mechanisms have limitations for all involved, both those trying to make their voice heard and those proposing the action and receiving input. (These range from when and how often participation occurs in the decisionmaking process to how the input is incorporated and decisions communicated.) Participation is foundational to our regulatory democracy and must occur early enough and in meaningful ways to improve decisions.

Effective participation also depends on clarity. People must be able to understand how decisions are made, what tradeoffs are being weighed, and where and how engagement can influence outcomes. But our frameworks still reflect reliance on elite and professional representation rather than widespread engagement. Trust—and the durability of outcomes—will increase when our processes have clearly articulated principles, transparently and rapidly weigh tradeoffs, and come to decisions through open and informed consideration. 

The Concurrent Risk and Promise of Technology 

Mechanization and industrialization created both unprecedented wealth and the pollutants that were the target of the 1970s wave of environmental laws. Emerging technologies likewise offer great promise, but also place familiar stresses—greenhouse gas emissions, water consumption, land use, waste—on the ecosystem and on human health and well-being. Our existing laws will need to respond and adapt to these problems as data centers and other novel demands reach greater scale, even as we evolve new ways of balancing those technologies’ potential against their up-front impacts and opportunity costs. 

Technology also offers a potential path through the climate crisis, as solar and wind energy have become scalable and cost-competitive with traditional fossil fuels. Other clean technologies on the horizon, such as geothermal or fusion energy, retain bipartisan support and will require legal and regulatory guardrails if they mature and are integrated into the system. Battery storage and energy efficiency advances will help manage and reduce energy demand, and carbon removal and sequestration technologies may also play a role in curbing emissions. And at the outer limits of our knowledge, various geoengineering concepts are raising difficult questions about feasibility, decisionmaking procedures, unintended consequences, and accountability. 

New technologies are also helping shape the implementation of environmental law in important ways. Existing tools such as satellite imaging, GPS location and geographic information systems, remote monitoring and sensing, and drones have fundamentally altered the way we view and record data from the physical world, in close to real time. Computer modeling and simulations have been a mainstay of climate science and policy, and other software innovations may improve environmental governance, including addressing long-standing issues of government transparency and public participation.

Sample Topics for Multi-Perspective Discussions
Communicating environmental challenges, conditions, and risks

Effective messaging is essential to enhancing public understanding of interconnected issues and support for responses. It should be tailored to specific jurisdictions and informed by advances in research (e.g., behavioral science), learn from those thriving in today’s information ecosystem, and embrace strategies for reducing polarization.

Advancing the beneficial use of technologies while establishing reasonable guardrails

How can we identify and address barriers to the development and equitable deployment of technologies that advance environmental protection, while limiting their negative impacts?

Democracy, Expertise, and Regulatory Certainty

In a healthy democracy, public policy is guided by evidence, and truth is the shared foundation for collective decisionmaking, whatever the chosen outcome. When facts and scientific expertise are dismissed or minimized in favor of ideology, however, it becomes harder for citizens to deliberate, solve problems, and hold leaders accountable. The diminution and marginalization of science contribute to the erosion of democracy itself.

In the United States, our ability to build necessary infrastructure and take action has been slowed by the long timelines and sometimes overlapping requirements of our regulatory processes. This is exacerbated by the increasingly extreme policy swings we have been experiencing between administrations. The result is the twin challenge of how to increase the pace of our processes without lessening their protections, while also making our decisions more stable and durable.

Aligning Regulatory Certainty and Timelines 

Regulatory certainty is not the same thing as rigidity. Done correctly, it is the backdrop against which communities can plan for the future and companies can make informed decisions about where and how to invest. Regulation anchored to clear, stable objectives leaves less room for policy swings.

Long horizons with clear milestones matter: think of a national clean electricity standard, or the emissions-based equivalent, set on a 15- to 20-year glidepath. Confidence in long-term decisions, however, stems from effective inclusion, holistic analysis, and transparent decisions. The perspectives of subject-matter experts (in-house and external), and of those who manage and care about the resources or land in question, should be considered essential and actively pursued by policymakers. 

Program-level thinking can help inform decisions at the project level. The energy transition will be remembered for feats of engineering—the thousands of miles of transmission lines, the buildout of battery storage—but its success will be determined by whether our framework listens, incorporates needed expertise, and produces rules that last long enough for people to plan their lives.

Evidence-Based Decisionmaking

For decades, the principle that good decisions require a good evidence base has been axiomatic. Since 1945, the federal government has invested in science as both a discipline and an idea, funding research conducted by public institutions and delivered as socially useful goods by the private sector.

Incorporating meaningful, often complex, evidence—including scientific data, traditional knowledge, and the needs, concerns, and priorities of potentially affected individuals—into decisionmaking is increasingly fraught. Climate change illustrates these challenges: despite decades of understanding by government officials and private sector decisionmakers about its causes and the need to act, economic and social interests have prevented effective policy and legislative response. Decisions are only as good as the information they are based on. Emissions reductions ultimately depend not just on technical knowledge, but on institutions and governments capable of acting on that knowledge independently, transparently, and free from corruption and clientelism.

In a study assessing the effectiveness of the federal government’s efforts to improve evidence-based decisionmaking, the U.S. Government Accountability Office found mixed progress in: (1) developing relevant and high-quality evidence; (2) employing it in decisionmaking; and (3) ensuring adequate capacity to undertake those activities. These are foundational problems.

Compounding our challenges in making legislative and policy decisions based on accurate and pertinent evidence is the siren song of AI. Artificial intelligence promises many tools, ranging in complexity and autonomy from performing clerical tasks to generating substantive recommendations. (AI clerical assistive systems automate certain administrative and procedural tasks, such as document classification and automatic transcription, while AI recommendation systems can contribute to judicial decisionmaking, for example by analyzing legal codes and case precedents. Paul Grimm et al.)

AI is already being used across jurisdictions and agencies for environmental regulation, including planning, reviewing proposals, drafting environmental reviews, supporting public participation and engagement, monitoring compliance, and enforcement. Recent federal policy has fueled the AI flame, with a 2025 AI action plan and multiple Executive Orders promoting its use to expedite permitting processes.

Enormous governance questions around AI have yet to be resolved. Technologies built by people reflect the values and assumptions of those who built them, and their use shifts power in decisionmaking processes. If a judge were called upon to review a decision made by such a tool, how could she determine the finding was reasonable under existing standards of administrative law? Can machine-generated analysis satisfy NEPA’s “hard look” review? These types of governance concerns dog AI tools wherever they are deployed but become particularly critical when they have the potential to become the decisionmaker in our legal and regulatory system.

The importance of having rigorous systems for identifying and considering trusted information to ground collective and democratic decisionmaking cannot be overstated. Until recently, dozens of scientific advisory committees routinely advised federal agencies to help bridge information gaps. Staggering recent losses of federal research funding and government programs, and the scrubbing of essential data sets, mean any path forward will likely require significant investments of both financial and human capital. When we rebuild, priority should be placed on ensuring all participants in decisionmaking have access to the same evidence, supported by the same systems.

Frontloading Regulatory Decisionmaking 

Even as we work to improve how evidence informs decisionmaking, we face growing risks, uncertainties, and tradeoffs. The challenge is not simply to generate more information, but to make better use of what we already know through regulatory systems that reflect the integrated nature of the problems we face—without mistaking uncertainty for an absence of evidence.

Many conflicts arise because decisions are fragmented across regulatory silos and institutions.  Consider a proposed electrical transmission line crossing a wetland. Decisionmakers must balance the imperatives of the energy transition, the conservation of biodiversity, the protection of water resources, and local economic opportunities. Yet these factors may be evaluated at different times, at different scales, and by different agencies. As a result, environmental permitting decisions can be made in isolation, long after foundational choices about the project’s purpose and design have already been locked in.

By the time site-specific questions arise, such as whether a particular wetland falls within the narrowed jurisdiction of the Clean Water Act, many broader tradeoffs have already been foreclosed. 

A holistic approach would entail identifying the priority of certain projects and a system for weighing their impacts. For example, infrastructure decisions could happen at a systemic scale such as nationwide grid needs, providing context for decisions about individual projects and resources. Our decisionmaking processes need systems for weighing tradeoffs, and making them transparent, to enable systems-level planning and prioritization and effective engagement. 

Hard decisions will have to be made regarding prioritized (and thus deprioritized) objectives. But frontloading data gathering, assessment, and decisionmaking on a national scale—through meaningful scenario planning, for example—could reduce the number of decisions made much further down the line in a project lifecycle and temper the uncertainty that can stem from permitting officials’ discretion. 

We will be facing these types of tradeoffs with increasing frequency as needs mount to build infrastructure and housing, retreat from our coasts, manage and conserve species and ecosystems, and respond to and prepare for increasingly frequent and severe emergencies. In addition to an integrated approach for assessing impacts and making tradeoffs transparent, the system will need certain decisions to be made earlier in the decisionmaking processes and with a broader scope. 

Acting (and Adapting) Amidst Uncertainty 

Core tenets of administrative law structure decisionmaking with up-front analysis and assume that we have full—or at least sufficient—information about circumstances and potential impacts to support a decision. But this is not always the case. When there are substantial uncertainties about conditions or the possible impacts of an action or rulemaking, adaptive management can improve outcomes by taking an iterative, systematic approach.

The uncertainties brought on by changing conditions due to climate impacts and unknowns about the consequences of proposed actions may call for an adaptive approach. And there are other situations where establishing sufficient evidence before taking irreversible action is appropriate. For example, we currently have limited understanding of the potential local and global impacts of geoengineering proposals to release aerosols into the atmosphere to block the sun’s rays, and no governing mechanisms are in place to address them.

There are also situations where it is important to ensure that we do not indefinitely postpone action due to a desire to have all the answers before acting, such as infrastructure for transitioning away from fossil fuel combustion. When appropriate, effective adaptive management plans include procedural and substantive safeguards: clear goals to set an agenda and provide transparency; an accurate assessment of baseline conditions against which to compare future monitoring data; an outline of the thresholds at which management actions should be taken, to promote certainty and assist with judicial enforcement; and a link to concrete response actions.

Learning as we go and making appropriate adjustments may be justified in some contexts, and even essential when we do not have the luxury of time and must move ahead without critical information. Adaptive management can increase an agency’s ability to make decisions and allow managers to experiment, learn, and adjust based on data. But adaptive management’s flexibility comes at the cost of more resources and less certainty, which may also invite controversy. The sweet spot for adaptive management may be when managing a dynamic system for which uncertainty and controllability are high and risk is low. While uncertainties are proliferating, situations that meet those conditions are not the norm. 

It would be beneficial for our environmental governance systems to explicitly identify conditions under which adaptive management may and may not be used, and to provide clear accountability mechanisms. The approach must fit with the practical realities of the working environment. For example, even if uncertainty and controllability are high and risk is relatively low, tinkering with large-scale energy infrastructure is not practical. Adaptive management may not be suited to regulatory contexts (1) in which long-term stability of decisions is important; (2) where decisions simply can’t easily be adjusted once implemented; or (3) where it is essential that an agency retain firm authority to say “yes” or “no” and leave it at that.  It is a valuable tool to be invoked when truly necessary.

Sample Topics for Multi-Perspective Discussions
Realigning to reflect today’s challenges

The interconnectedness of today’s global environmental challenges is in tension with the accreted framework of media-specific, site-specific laws and siloed agencies. Adjustments that help to align objectives, processes, and structures could scale impact. 

Evidence-based decisionmaking is foundational to U.S. governance and essential to progress towards today’s environmental imperatives

Our framework should reflect commitment to and investment in gathering and analyzing information, from intricate science to the concerns of impacted communities; and be designed to incorporate and respond to changing information, such as through judicial review or other checks. 

Designing effective certainty

In part because of impacts already set in motion, we must consider when we cannot wait for more information before taking action on environmental and climate challenges. By their nature, some of those actions can be adapted on an ongoing basis, while others cannot. Clear parameters for differentiating will help ensure clear timelines and appropriate, effective processes.

Building a Structure Fit for Purpose

The “triple planetary crisis,” a term coined by the UN Environment Programme, refers to the linked challenges of biodiversity loss, pollution overload, and climate change. These crises require large-scale mobilization and societal-level adjustments. This magnitude of action requires a multifaceted system that can support and move myriad levers in a coordinated and balanced manner. The year she received the Nobel Prize in Economics, Elinor Ostrom published a paper capturing both the tension and the necessity of this layered system, calling for a “polycentric approach” to addressing climate change.

The following discussion focuses largely on federal and state government action. In addition, Tribal Nations are vital sovereign authorities, partners, and voices in governance, including natural resource management, and their needs and knowledge are critical to effective, sustainable, just results. And as Ostrom recognized, private entities will also be instrumental in addressing climate change and other complex challenges; this includes not only corporations, as discussed below, but philanthropic organizations and a variety of other nongovernmental actors.

The Scale Challenge 

Environmental regulation occurs at multiple levels: local ordinances, state laws and policies, interstate agreements, tribal laws, federal regulations, and international laws and norms. It also works at different resource scales, from managing a subspecies to protecting regional drinking water to setting nationwide air standards.

Jurisdictional nesting can provide comparative benefits at various levels for specific resources or pollutants. For example, working at the local level may allow for tailoring to specific circumstances to maximize benefits and build trust, while working at the state level can capture the cumulative benefits of collective local action and permit the testing of different approaches to federal implementation. Meanwhile, working at the federal and larger scales allows, among other things, the balancing of voices and the establishment of shared objectives, standards, or requirements. 

However, tiered systems can also be subject to gaps in implementation, such as when there is no mechanism to trigger enforcement of an international mandate at the national level. They may inadvertently impede interoperability and shared learning, such as by using different data standards, tools, or systems, and slow action due to competing or otherwise unaligned priorities. In addition, jurisdictional boundaries rarely align with resource definitions, whether a hydrogeographic basin, the extent of an air pollutant, or a natural hazard vulnerability zone. Further complexity is added by questions of preemption, as longstanding understandings of federal versus state authorities under key statutes and regulatory structures shift. 

Federal, tribal, state, and local governments must navigate these challenging dynamics as they work to effectively implement existing environmental laws and creatively address new environmental problems. 

Cooperative Federalism

Federalism—whereby the federal government and states share power and responsibilities—is a central tenet of the U.S. governance system. A particular form, cooperative federalism, is embodied in most of the major U.S. environmental laws, including the Clean Air Act and the Clean Water Act. These laws establish a legal framework in which minimum standards are established at the federal level and individual states implement the programs. Today, over 90 percent of the delegable federal environmental programs are run by states. As a general matter, states are responsible for ensuring that federal standards are met but have the flexibility to impose standards that are more stringent than the federal standards. 

In practice, the Congressional Research Service observes that the “precise relationship and balance of power between federal and state authorities in cooperative federalism systems is the subject of debate.” This debate has manifested in a variety of ways over the decades, including differences over the appropriate scope of federal oversight and levels of federal funding for state-delegated programs. 

Environmental protection has advanced in many respects over time with cooperative federalism as its foundation, but few would argue there is no room for improvement. For example, a 2018 memorandum by the Environmental Council of the States (ECOS) captured a consensus among states that the “current relationship between U.S. EPA and state environmental agencies doesn’t consistently and effectively engage nor fully leverage the capacity and expertise of the implementing state environmental agencies or the U.S. EPA.”

In addition to the leeway that cooperative federalism provides to the states in implementing federal environmental laws, states are free to regulate or otherwise address environmental problems that are not covered by federal laws. As a result, states are often referred to as (in Justice Brandeis’ phrase) “laboratories of democracy” for testing innovative policies. Historically, states have served as testing grounds for environmental policies later adopted by the federal government. Given the current federal governance landscape, discussed below, what happens in the states may stay in the states (at least for quite some time)—making state laboratories one of the few promising options for advancing environmental protection. 

Barriers to Optimal Functioning of Cooperative Federalism 

In addition to the inherent systemic challenges outlined above with respect to multi-tiered jurisdiction and resource scale, there are broad societal barriers to maximizing the efficacy of cooperative federalism. The numerous overarching problems contributing to democratic dysfunction (e.g., channelized communication, primaries that yield extreme candidates who foster dramatic pendulum swings, lack of public trust) will impede the optimal functioning of cooperative federalism for the foreseeable future. 

The multitude of environmental governance-specific challenges identified earlier also significantly affect the functioning of cooperative federalism. These include, for example, long-standing congressional gridlock; new and emerging environmental harms that cannot be easily addressed within the existing, siloed framework; a Supreme Court changing how it reviews regulation; and regulatory pendulum swings that make consistency and stability difficult and hinder continuous improvement.

Several additional barriers arguably weaken the foundations of cooperative federalism. These include: ineffective federal oversight of state programs (possibly both too stringent and too lenient in some respects); insufficient collection and dissemination of data (e.g., on environmental conditions, performance, and pollution impacts), as well as inconsistent tracking of key environmental indicators; a lack of effective, state-specific risk communication and messaging; limited state resources for filling federal regulatory gaps or experimenting with innovative ways of implementing federal and state regulations; and insufficient federal funding for state programs. Recent critiques also point to the need to build out state administrative law to improve the functioning of cooperative federalism.

Opportunities for Renewing Cooperative Federalism

Recent developments in federal programs are disrupting many aspects of the country’s environmental protection efforts. These developments include drastic regulatory rollbacks, multiplied industry influence, curtailed input from scientists and other experts, rollback of federal grant funds to states and local governments, and sweeping staffing cuts resulting in loss of critical expertise. 

Cooperative federalism has been particularly undermined by federal funding cuts (e.g., withdrawal of federal grants, reductions in revolving loan funds) and cuts to the federal programs that collect and analyze environmental data. Moreover, federal interference with independent or “more stringent than” state initiatives is taking a toll (e.g., the response to California’s electric vehicle requirements).

Given the barriers outlined above that make major statutory change infeasible, building an entirely new structure to replace cooperative federalism will be a nonstarter for the foreseeable future. However, ample opportunities exist to strengthen the existing structure in a manner that yields more effective and innovative approaches to environmental protection. 

Front and center is building state and local governmental capacity to fill the gaps created by federal inaction and rollbacks as well as to lead on regulatory innovation. In so doing, states and local governments can serve as more effective laboratories of democracy and foster innovative federal action. And because states and local governments are on the frontlines of managing environmental and climate impacts such as floods and wildfires, as well as aging water infrastructure and other environment-related challenges, they are motivated to address the causes and effects of these harms, despite the intensely politicized nature of environmental issues such as climate change. 

To be sure, renewing the existing structure is complicated by an uneven political landscape. For example, the level of political and popular support for environmental protection measures in the 26 states led by Republican governors differs from the levels of support in the 24 states led by Democratic governors, and the relative dominance of a particular party (e.g., trifectas or triplexes) is also a factor. These dynamics likewise influence environmental action by local governments when, for example, the potential for state preemption of local authority is a factor. 

Nevertheless, the practical reality of increased extreme weather events, aging water infrastructure, and other environment-related challenges provides a strong incentive for all states and local governments to act. State and local efforts, however, are hindered by limited capacity in the form of staffing, funding, expertise, data, and other factors. For example, virtually all states could benefit in their decisionmaking from more robust data on local environmental conditions, and many states lack adequate funding, staff, and other resources.

Private Sector Synergies and Opportunities

Private environmental governance (PEG)—which can take a range of forms including collective standard-setting, certification and labeling systems, corporate carbon commitments, investor and lender initiatives, and supply chain requirements—is already making its mark across industries as diverse as electronics, forestry, apparel, and AI. For example, roughly 20 percent of the fish caught for human consumption worldwide and 15 percent of all temperate forests are subject to private certification standards. In addition, 80 percent of the largest companies in key sectors impose environmental supply chain contract requirements on their suppliers. And investors are increasingly taking environmental, social, and governance (ESG) into account, including risks related to climate change. A 2022 study estimated, for example, that assets invested in U.S. ESG products could double from 2021 to 2026 and reach $10.5 trillion. 

As professors Vandenbergh, Light, and Salzman explain in their book Private Environmental Governance: “If you want to understand the future of environmental policy in the 21st century, you need to understand the actors, strategies, and challenges central to private environmental governance.” 

Given the scope of PEG activities, it is not surprising that a range of regulatory regimes are implicated, including corporate governance, contract, antitrust, and consumer protection laws. In some cases, these legal regimes place constraints on the forms and scope of PEG initiatives. Many contend, however, that these constraints are inadequate, as reflected in recent efforts to severely curtail ESG initiatives. 

Further, some scholars and advocates have criticized PEG from an entirely different perspective, citing concerns that PEG measures constitute greenwashing—that is, that they do not actually change corporate behavior and environmental conditions. Another concern is that PEG may undermine support for public governance measures in certain contexts. 

Yet federal legislative gridlock, a dramatically swinging environmental regulatory pendulum, unregulated new technologies, and other factors point to the need for a better understanding of how PEG can be leveraged to advance environmental protection efforts—including the improved functioning of cooperative federalism.

Sample Topics for Multi-Perspective Discussions
Building a robust and widely disseminated information base

How can we use innovative approaches for preserving existing data and collecting new data on environmental conditions, regulated entity performance, and pollution impacts to enhance interoperability of local, state, and federal systems, foster consistency among assessments of risk, and help align priorities and approaches?

Leveraging traditional state and local powers

Problems such as climate change require a whole-of-government approach and could benefit from leveraging adjacent state and local regulatory authorities in areas such as land use (e.g., zoning), infrastructure, and public health.

Enhancing connectivity within jurisdictional nesting and fostering networks of state-level and local-level regulators to align priorities

Bolstering state and local officials’ networks for sharing data, best practices, and regulatory innovations may help align priorities and produce further progress on cross-jurisdictional problems as well as new challenges such as permitting reform.

Examining how PEG can be leveraged to advance environmental protection

For example: What are the effects of PEG (e.g., emissions reductions)? What are its drivers (e.g., brand reputation, shareholder actions, employees, and corporate customers)? Are there ways to reduce greenwashing and greenhushing? And how can we ensure that PEG complements public governance?

Leveraging new technologies for capacity-building

For example, AI and advanced monitoring technologies—if thoughtfully leveraged—could lessen the burden on state and local governments, particularly those that are under-resourced, in their efforts to assess climate risk, develop resilience plans, and monitor regulatory compliance.

Conclusion

The environmental gains of the last half-century demonstrate that governance choices matter. The United States built a system capable of addressing the urgent environmental crises of its time by combining scientific expertise, democratic accountability, and enforceable legal standards. 

Today’s urgent challenges—climate change, biodiversity loss, and pervasive pollution—demand a similar alignment under far more complex conditions. The challenge is not merely to regulate more, faster, or differently, but to recommit to decisionmaking that is credible and durable: by restoring confidence that evidence matters, that participation is meaningful, that tradeoffs get confronted honestly, and that rules will persist long enough to justify investment and collective effort.

The path forward lies neither in abandoning the foundations of environmental law, nor in relying solely on technological or private solutions. It will be found by strengthening and adapting existing governance structures—integrating cross-cutting objectives across domains, clarifying roles across jurisdictions, and rebuilding the shared evidentiary base and institutional capacity needed to act amid uncertainty, rather than deferring action in pursuit of unattainable certainty. And it requires clear communication about today’s complex, dispersed challenges that enhances understanding and reduces polarization. 

At its core, the triple planetary crisis is a democratic and governance challenge: how societies decide, together, to protect people and places while sharing costs and benefits fairly. Meeting that challenge will require systems capable of carrying both technical complexity and public trust, as well as a sustained commitment to invest in institutions that can decide, act, and endure. 

Costs Come First in a Reset Climate Agenda

Durable and legitimate climate action requires a government capable of clearly weighing, explaining, and managing cost tradeoffs for the widest array of audiences, which in turn requires strong technocratic competence. 

Democratic governance needs

State Capacity needs


Key Takeaways

Introduction

Public policy involves tradeoffs. The primary tradeoff for climate change mitigation is economic cost. Secondary tradeoffs include commercial freedom, consumer choice, and the quality or reliability of goods and services. Political movements seeking to address a collective action problem, such as climate change, are prone to overlook the consequences of tradeoffs for other parties, like consumers and taxpayers. This paper posits that the cost tradeoffs of climate change mitigation have been underappreciated in the formation of public policy. This has resulted in an overselection of high-cost policies that are not politically durable and may erode social welfare. It also results in overlooking low- or negative-cost policies that are durable and hold deep abatement potential. These policies can have broad political appeal because they align with the self-interest of the United States; however, they typically require dispersed beneficiaries to overcome the concentrated lobby of entrenched interests. 

A core, normative objective of public policy is to improve social welfare, which “encourages broadminded attentiveness to all positive and negative effects of policy choices.” Environmental economics determines the welfare effects of climate change mitigation policy as the net of its abatement benefits less its costs. The conventional technique for valuing abatement benefits is the social cost of carbon (SCC). The barometer for whether climate policy benefits society is whether abatement benefits exceed costs. Accounting for full social welfare effects requires consideration of co-benefits as well, though these tend to involve conventional air emissions already subject to mitigation mechanisms under the Clean Air Act. Nevertheless, accounting for costs is essential to ensure climate policy benefits society. 
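The welfare test described above reduces to simple arithmetic: a policy improves social welfare when tons abated valued at the SCC, plus any co-benefits, exceed abatement costs. A minimal sketch follows; the policy figures are invented for illustration, and only the SCC benchmarks ($190/ton and $8/ton) come from the discussion below.

```python
def net_welfare(tons_abated, scc_per_ton, abatement_cost, co_benefits=0.0):
    """Net social welfare of an abatement policy: benefits minus costs.

    Benefits are tons abated valued at the social cost of carbon (SCC),
    plus any co-benefits (e.g., reduced conventional air emissions).
    All figures used with this function here are hypothetical.
    """
    return tons_abated * scc_per_ton + co_benefits - abatement_cost

# A hypothetical policy abating 1 million tons at a cost of $60 million:
tons = 1_000_000
cost = 60_000_000
# Valued at a $190/ton global SCC, the policy passes the welfare test...
assert net_welfare(tons, 190, cost) > 0
# ...but valued at an $8/ton domestic SCC, it fails.
assert net_welfare(tons, 8, cost) < 0
```

The same arithmetic underlies the taxonomy developed later: whether a policy's cost per ton sits below or above a given SCC benchmark determines whether it passes or fails this test.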

Abatement costs also have a discernible bearing on the likelihood and durability of policy reforms. Climate policies exhibit patterns of passage, mid-course adjustment, and political resilience across election cycles based on constituency support levels linked to benefit allocation and cost imposition. This paper develops four policy classifications as a function of their abatement benefit-cost profiles and uses this framework to examine the political economy, abatement effectiveness, and economic performance of select past and potential policy instruments. 

Political Economy and Policy Taxonomy 

The translation of climate policy concepts into legitimate policy options in the eyes of policymakers can be viewed through the Overton Window. That is, politicians tend to support policies when they do not unduly risk their electoral support. The Overton Window for climate policy is constantly shifting within and across political movements with the foremost factor being cost. 

In a 2024 survey of voters, the most valued characteristic of energy consumption was energy cost (37%), followed by power availability (36%), climate effect (19%), U.S. energy security (6%), and something else (1%). Democrats valued energy cost and power availability slightly more than climate effect, while Independents and Republicans valued energy cost and power availability far more than climate effect. 

Figure 1. Voters’ Energy Values

Progressives have long exhibited greater prioritization of climate change policy, but cost concerns are driving an overhaul of the progressive Overton Window on climate change. In California, which contains perhaps the most climate-concerned electorate in the U.S., progressives have begun a “climate retreat” to recalibrate policy as “[e]lected officials are warning that ambitious laws and mandates are driving up the state’s onerous cost of living.” Nationally, a new progressive think tank is encouraging Democrats to downplay climate change for electoral benefit. Importantly, it finds that 61% of battleground voters acknowledge that “climate change is at least a very serious problem,” but that “it is far less important than issues like affordability.” 

Similarly, veteran progressive thought leaders, such as the Progressive Policy Institute, now stress that “energy costs come first” in a new approach to environmental justice. While emphasizing the continued importance of GHG emissions reductions, those policy leaders are making energy affordability the top priority, amid a broader Democratic messaging pivot from climate to the “cheap energy” agenda. The rise of cost-conscious progressives is particularly notable because the progressive electorate has expressed a higher willingness to pay to mitigate climate change than moderate and conservative electoral segments. 

Economic tradeoffs, namely costs and expanded government control, have long been the central concerns about climate policy for the conservative movement. The conventional climate movement messaged on fear and the need for economic sacrifice, the antithesis of the conservative electoral mantra of economic opportunity. Yet a conservative climate Overton Window emerged through a series of state and federal policy reforms when climate change mitigation aligned with expanded economic opportunity. However, pro-climate conservative thought leaders remain opposed to high-cost policies, for example calling to phase out Inflation Reduction Act (IRA) subsidies for mature technologies. 

Many leading conservative thought leaders continue to challenge the climate agenda writ large because of its association with high-cost policies. For example, President Trump’s 2025 Climate Working Group report was expressly motivated by concerns over “access to reliable, affordable energy” while acknowledging that climate change is a real challenge. Similarly, a 2025 American Enterprise Institute report finds that the public is most interested in energy cost and reliability and unwilling to sacrifice much financially to address climate change. Meanwhile, climate-conscious conservative thought leaders like the Conservative Coalition for Climate Solutions and the R Street Institute continue to emphasize a market-driven, innovation-focused policy agenda that prioritizes American economic interests and drives a cleaner, more prosperous future. Altogether, this indicates a conservative Overton Window limited to negative- and low-cost climate change mitigation. 

While cost is driving the Overton Window within each political movement, it also buoys the potential for alignment across political movements. Political movements are not monoliths; rather, each exhibits major subsets. The progressive movement has seen gains in popularity among its populist left flank, often identified as the “democratic socialist” wing, which contributes to ongoing debate about Democrats’ ideological direction. Climate policy initiated by this wing, however, is associated with high economic tradeoffs (e.g., degrowth) and has prompted a backlash within the progressive movement. By contrast, a subset of the progressive movement, sometimes labeled “abundance progressives,” has emerged to support a more pro-market, pro-development posture. This subset is especially responsive to energy cost concerns and is an emerging substitute for the anti-development traditions of the progressive environmental movement. Overall, variances within the progressive movement are fairly straightforward to categorize linearly on the economic policy spectrum. 

The Republican electorate views capitalism far more favorably than Democrats do, though that favorability has modestly declined in recent years. Republicans have trended away from consistently conservative positions associated with limited government, which historically emphasized the rule of law and a strict cost-benefit justification for government intervention in the market economy. They have migrated towards right-wing populism associated with the Make America Great Again (MAGA) movement. Right-wing populism is hard to operationalize for economic policy because it is not a standalone ideology but a movement vaguely attached to conservative ideology. Generally, the “America First” orientation of MAGA implies positions based on the self-interest of the U.S., with the Trump administration prioritizing cost reductions in energy policy. 

MAGA sits further to the right of conventional conservatives on environmental regulation and general government reform. For example, conservatives have noted the contrast between conservative “limited, effective government” and the Department of Government Efficiency’s “gutted, ineffective government” reform approach. On the other hand, MAGA will occasionally back leftist policy instruments, such as coal subsidies, wind restrictions, executive orders to override state policies, and emergency authorities for fossil power plants. These are often justified as counteracting the leftist policies passed by progressives (e.g., renewables subsidies, fossil restrictions, emergency authorities for renewables), resulting in dueling versions of industrial policy. In other words, ostensible overlap between MAGA and progressives on policy instrument choice actually reflects the use of similar tools for conflicting purposes (e.g., restrictive permitting or subsidies for opposing resources; i.e., picking different “winners and losers”). Nevertheless, the disciplinary agent for right-wing energy populism has been cost concerns, which have influenced the Trump administration to pursue more traditionally conservative energy policies like permitting reform and lowering electric transmission costs. 

This political economy identifies the broadest cross-movement Overton Window between moderate or “abundance progressives” and traditional conservatives. Both broad movements exhibit cost sensitivity and a growing prioritization of U.S. self-interest. Distinguishing the domestic SCC from the global SCC is essential to determine which policies are consistent with the self-interest of the U.S. versus the world as a whole. Traditionally, the U.S. government considers only domestic effects in cost-benefit analysis, yet the vast majority of domestic climate change abatement benefits accrue globally. 

The first SCC, developed under the Obama administration, relied solely on a global SCC. Leading conservative scholars, including the former regulatory leads for President George W. Bush, criticized the exclusive use of a global SCC to set federal regulations. They argued for a “domestic duty” to refocus regulatory analysis on domestic costs and benefits. Similarly, the first Trump administration used a domestic SCC. Although the second Trump administration moved to discard the SCC outright, this appears to be part of a regulatory containment strategy, not a reflection of the conservative movement’s dismissal of the negative effects of climate change. In other words, even if the SCC is not the explicit basis for policymaking, it remains a useful heuristic for policymakers.

The proper value of the SCC is the subject of intense scholarly and political debate. It has fluctuated between $42/ton under President Obama, $1-$8/ton under President Trump, and $190/ton under the Biden administration (all values for 2020). The main methodological disagreement has been over whether to use a domestic or global SCC, with the Trump administration position guided by “domestic self-interest.” This suggests the original domestic and global SCC values may best approximate the parameters of the Overton Window, underscoring the following policy taxonomy, which characterizes climate abatement policies by cost relative to domestic and global SCC levels:
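As an illustration of how such a taxonomy might operate, the sketch below classifies a policy by its abatement cost per ton against the domestic and global SCC benchmarks quoted above. The four class boundaries used here (negative cost; at or below the domestic SCC; between the two; above the global SCC) are an assumed, illustrative mapping, not the paper's own definitions.

```python
def classify_policy(abatement_cost_per_ton, domestic_scc=8.0, global_scc=190.0):
    """Classify an abatement policy by cost per ton relative to SCC benchmarks.

    The class boundaries are an illustrative assumption; the default SCC
    benchmarks use the Trump-era domestic ($8/ton) and Biden-era global
    ($190/ton) values mentioned in the text.
    """
    if abatement_cost_per_ton < 0:
        return "Class I: negative cost"
    if abatement_cost_per_ton <= domestic_scc:
        return "Class II: below the domestic SCC"
    if abatement_cost_per_ton <= global_scc:
        return "Class III: between the domestic and global SCC"
    return "Class IV: above the global SCC"

# Offshore wind mandates (over $100/ton per the text) land between the
# benchmarks under these assumed boundaries; natural gas bans, with
# per-household costs far above any SCC estimate, land above the global SCC.
print(classify_policy(100))
print(classify_policy(1000))
```

Under this assumed mapping, only classes I and II fall inside the cross-movement Overton Window identified above, while class IV policies (such as the gas bans discussed below) sit outside it.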

Policy Applications

There are myriad policies across the abatement cost spectrum. This analysis focuses on particularly popular domestic policies already pursued or readily under consideration. These include policies targeting the environmental market failure via direct abatement (GHG regulation) and indirect abatement (public spending, clean technology mandates, and fuel bans). They also include policies targeting non-climate market failures that nonetheless hold deep climate co-benefits (innovation policy). The analysis also examines policies that correct government failure and have major climate co-benefits (permitting, siting, and electric regulation reform). 

Fuel Mandates and Bans

For the last two decades, the most prevalent type of climate policy in the U.S. has been state-level fuel mandates and bans. Over the last decade, the environmental movement came to prefer policies that explicitly promote or remove fuels or technologies rather than targeting emissions directly. This is despite ample evidence in the economics literature that market-based policies are more effective and carry far lower abatement costs. Nevertheless, the most common domestic climate policy instrument this century has been state renewable portfolio standards (RPS). The literature notes several key findings from RPS:

Micro-mandates have also sprung up, primarily in progressive states. These have often targeted the promotion of nascent or symbolic energy sources that the market would not otherwise provide, with the costs obscured from public view (e.g., rolled into non-bypassable electric customer charges). A good example is offshore wind requirements in the Northeast, which carry a high abatement cost (over $100/ton). 

Fuel bans have become increasingly popular climate policy in progressive states and municipalities. Beginning in 2016, a handful of progressive states began banning coal. However, this does not appear to have created much cost or abatement benefit, as evidenced by a lack of commercial interest in coal expansion in areas without such restrictions. In fact, neither federal nor state regulation was responsible for steep emissions declines from coal retirements. Coal retirements were mostly driven by market forces, especially breakthroughs in low-cost natural gas production and high efficiency power plants. Policy factors, like the Mercury and Air Toxics Rule, were secondary drivers of coal plant retirement. 

Around 2020, California, New York, and most New England states began adopting partial natural gas bans or de facto bans on new gas infrastructure through highly restrictive permitting and siting practices. Unlike coal restrictions, these laws have markedly decreased commercial activity, namely gas pipeline and power plant development, and in some cases caused economically premature retirements. This has caused “pronounced economic costs and reliability risk.” The resulting pipeline constraints drive steep gas price premiums in these states, which translate into a core driver of elevated electricity prices.

Insufficient pipeline service in the Northeast is especially problematic, as demonstrated by a December 2022 winter storm event that nearly led to an unprecedented loss of the Con Edison gas system in New York City that would have taken weeks or months to restore. Further, preventing gas infrastructure development does not provide a clear abatement benefit, because more infrastructure is needed to meet peak conditions even if gas burn declines. A prominent study found a 130 gigawatt increase in gas generation capacity by 2050 was compatible with a 95% decarbonization scenario. 

Progressive states and municipalities have also pursued natural gas consumption bans. This policy may carry exceptional cost, especially for existing buildings, with potentially well over $1 trillion in investment needed to replace gas with electric infrastructure. One estimate put the cost of natural gas bans at over $25,600 per New York City household. A Stanford study projected a 56% increase in residential electric rates in California from a natural gas appliance ban. Generally, conservative thought leaders and elected officials have opposed natural gas bans for cost as well as non-pecuniary reasons, including security concerns and the erosion of consumer choice. This applies even to prominent members of the Conservative Climate Caucus. Altogether, gas bans are a class IV policy with virtually no Overton Window alignment. 

GHG Transparency 

GHG regulation takes various forms. The least stringent is GHG transparency, which addresses an information deficiency and lowers transaction costs in voluntary markets. This begins with reporting and accounting requirements on emitters (Scope 1 emissions). Public policy can help resolve the measurement and verification problems that have eroded confidence in voluntary carbon markets. GHG transparency policy can also standardize terminology and provide platforms for indirect emissions data. For example, making locational marginal emissions rates on power systems publicly available lets market participants identify the indirect emissions of their power consumption (Scope 2 emissions). Progressives have consistently favored GHG transparency policy, while conservatives have typically supported light-touch versions like the Growing Climate Solutions Act.
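To make the locational marginal emissions mechanism concrete, the sketch below estimates a consumer's indirect (Scope 2) emissions by pairing hourly consumption with published hourly marginal emissions rates for the consumer's location. All loads and rates here are hypothetical.

```python
def scope2_emissions(hourly_load_mwh, hourly_lme_tons_per_mwh):
    """Estimate indirect (Scope 2) emissions from metered consumption.

    Multiplies each hour's load by the locational marginal emissions rate
    published for that hour and location, then sums. Values are hypothetical.
    """
    if len(hourly_load_mwh) != len(hourly_lme_tons_per_mwh):
        raise ValueError("load and emissions-rate series must align hour by hour")
    return sum(load * rate
               for load, rate in zip(hourly_load_mwh, hourly_lme_tons_per_mwh))

# Three hypothetical hours: a gas unit on the margin (high rate), then wind.
load = [10.0, 12.0, 8.0]   # MWh consumed each hour
lme = [0.45, 0.40, 0.02]   # tons CO2 per marginal MWh at this location
print(round(scope2_emissions(load, lme), 2))  # tons attributed to this consumer
```

A consumer with this information can shift load toward low-rate hours, which is how published marginal rates help market participants reduce the Scope 2 emissions of their consumption.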

The second Trump administration recently pursued removal of basic GHG reporting requirements on ideological grounds, specifically repeal of the GHG Reporting Program (GHGRP). This appears to reflect an optical deregulatory agenda rather than an effective one. Conservative groups have warned of the downsides of GHGRP repeal. Pressure to course correct may prove fruitful, given that the industry the Trump administration aims to assist – oil and natural gas – maintains that the U.S. Environmental Protection Agency (EPA) should retain the GHGRP. A recent analysis found that if states replace the GHGRP, the new programs will be more expensive (Figure 2). 

Figure 2. Cost Comparison of Federal and California Reporting Programs

Many regulated industries and conservative groups instead support a low-compliance-cost GHG reporting regime with durability across future administrations. This applies not only to direct emissions reporting but also to indirect emissions reporting, as in the absence of federal policy industry faces a patchwork of compliance requirements across states and foreign governments. The same economic self-interest rationale justifies a role for limited government in emissions accounting, with an emphasis on the capital market appeal of showcasing the “carbon advantage” of the U.S. in emissions-intensive industries. An example is liquified natural gas, whose export market is enhanced by showcasing its lifecycle emissions advantage over foreign gas and coal. 

The abatement effectiveness of GHG transparency has grown appreciably in the 2020s, as voluntary industry initiatives have sharply increased. This policy set enables an efficient “greening of the invisible hand” with staying power, as corporate environmental sustainability efforts appear resilient regardless of political sentiment, unlike corporate social endeavors. In fact, the aggregate willingness to pay for voluntary abatement from producers, consumers, and investors suggests that well-informed domestic markets go a long way toward self-correcting the externality of GHGs (i.e., convergence of the private and social cost curves). Certain voluntary corporate behaviors may even exceed the global social cost of carbon (SCC), especially commitments to nuclear, carbon capture, and other higher-cost abatement generation financed by the largest sources of power demand growth. Well-functioning voluntary carbon markets could yield roughly one billion metric tons of domestic carbon dioxide abatement by 2030. Providing locational marginal emissions data can slash abatement costs from $19-$47/ton down to $8-$9/ton while doubling abatement levels from some power generation sources. 

Overall, the efficient GHG transparency policy described above is a low-cost mitigation strategy consistent with a class II designation. Basic federal GHG transparency policy may even constitute class I policy, because it avoids the higher-compliance-cost alternative of a patchwork of state and international standards that would manifest in the absence of federal policy. However, stringent GHG transparency policy may constitute class III or IV policy. Prominent examples include a recent California climate disclosure law and a former Securities and Exchange Commission proposed rule to require emissions disclosure related to assets a firm does not own or control (Scope 3). Such efforts may obfuscate material information on climate-related risk and worsen private sector-led emission mitigation efforts.

Direct GHG Regulation 

Classic environmental regulation takes the form of a command-and-control approach. These instruments include emissions performance standards and technology-forcing mechanisms, typically applied to power plants or mobile sources. These policies vary widely in stringency and cost. Overall, the economics literature widely considers command-and-control an unnecessarily costly approach to reducing GHGs relative to market-based alternatives. It can also freeze innovation by discouraging adoption of new technologies. 

Federal command-and-control GHG programs have not been particularly environmentally effective or cost-effective, nor have they demonstrated legal or political durability. The first power plant program was the Clean Power Plan, which was struck down in court, yet its emissions target was achieved a decade early thanks to favorable market forces and subnational climate policy. The most recent federal command-and-control approaches to GHG regulation were 2024 EPA rules for vehicles and power plants. A 2025 review of these and other federal climate regulations over the last two decades found:

The 2025 review study implies that past federal command-and-control carried very high costs – well into the class IV range. Undercutting it has also been a top priority of conservatives. However, modest command-and-control policy with class II or III costs is possible. 

Some conservatives, noting EPA’s legal obligation to regulate GHGs and the cost of regulatory uncertainty from decades of EPA policy oscillations between administrations, have suggested modest requirements as a better option than high-cost rules, both to mitigate legal risk and to provide industry a predictable, low-cost compliance pathway. For example, conservatives argued that replacing high-cost requirements for power plants to adopt carbon capture and storage (CCS) with low-cost requirements for heat rate improvements may lower compliance costs more than attempting to repeal the Biden-era CCS rule outright. Similarly, the oil and gas industry opposed stringent GHG regulations on power plants and mobile sources, but often validated alternative low-cost compliance requirements. 

The first Trump administration pursued modest repeal-and-replace GHG regulation. The second Trump administration has opted for outright repeal and elimination of the endangerment finding via executive rulemaking. However, regulated industry and many conservative thought leaders believe this is a strategic blunder, given the low odds of legal success, resulting in the perpetuation of “regulatory ping-pong that has plagued Washington, D.C., for decades.” If the courts uphold Massachusetts v. EPA and the associated endangerment finding, modest command-and-control policy may have durable political alignment potential. Yet this does not hold much abatement potential. In the absence of a legal requirement to regulate GHGs, there is unlikely to be broad political alignment for even modest command-and-control policy. Conservatives tend to view it as a gateway to more costly policies that will probably not meaningfully affect global GHG trajectories. 

The 2025 review study understates the full cost of U.S. climate regulations because it excludes the state and local levels. Although no comprehensive study of state climate regulation is known, command-and-control state regulations often raise major cost concerns as well. The cost and environmental performance of such state programs varies immensely, often owing to differences in the accuracy of the abatement technology cost estimates that regulatory decisions are based upon (e.g., the failure of California’s zero-emission vehicle program compared to the success of its low-emission vehicle program). A recent example is California’s rail locomotive mandate, which was projected to impose tens of billions of dollars in costs before being withdrawn. State command-and-control regulation is commonplace in progressive states, but not beyond, implying meager Overton Window alignment. 

A more economical version of GHG regulation is a system of marketable allowances, or cap-and-trade (C&T). Over three decades of experience with C&T programs reveals two things. First, C&T is environmentally effective and cost-effective relative to command-and-control policy. Second, C&T performance depends on its design quality and interaction with other policies. Abatement costs depend on stringency and other design features, but C&T in a backstop role is generally close to the domestic SCC, rendering it class II policy. Robust C&T generally falls in the class III range. C&T is an example of abatement policy that can be cost-effective on a per-unit basis, but given the breadth of its coverage its total costs can be substantial. Recent developments in Pennsylvania indicate a possible preference for policies with higher per-unit abatement costs than C&T, which may reflect a political preference for policies with less cost transparency and lower aggregate costs. 

Some environmental complaints about C&T are valid, such as emissions leakage, but C&T effectiveness concerns generally reflect readily fixable design flaws. Effectiveness complaints are often the result of interference from other government interventions like fuel mandates, which relegate C&T to a backstop role and suppress allowance prices. Such state interventions triggered anti-competitive concerns in wholesale power markets overseen by the Federal Energy Regulatory Commission (FERC). This prompted conservative state electric regulators to call for a conference to validate mechanisms like C&T as a market-compatible alternative to high-cost interventions. Conservative expert testimony at that conference, invited by conservative FERC leadership, explained that interventions layered on top of C&T merely reallocate emissions reductions under a binding cap, which raises costs, creates no additional abatement, and undermines innovation. This implies that such states might increase abatement and lower aggregate costs by upgrading the role of C&T and downgrading the role of costlier interventions. 

In the 2000s, bipartisan interest in federal C&T policy arose, but it failed and has not resurfaced. In its absence, states have filled the void with subnational C&T programs. However, the durability of C&T beyond progressive states is unclear. Moderate states have sometimes joined a regional C&T program under Democratic leadership, but sometimes departed under Republican leadership. Conservative state groups typically challenge C&T adoption and seek repeal of C&T programs like the Regional Greenhouse Gas Initiative. This suggests that C&T sits at the fringe of, and typically outside, the Overton Window across political movements. 

Permitting and Siting 

Permitting policy can base decisions explicitly on GHG criteria, or decisions can rest on non-GHG factors that carry indirect GHG consequences. Generally, only progressive states and presidents have pursued the former. Federally, these include the Obama administration’s “coal study” and the Biden administration’s “pause” on liquified natural gas (LNG). The LNG pause did not provide any apparent emissions benefit, yet carried substantial foregone economic opportunity and strategic value to U.S. allies. Pragmatic progressive thought leaders expressed concern with the pause, noting the creation of economic and security risks, and suggested lifting the pause in exchange for companies committing to strict, third-party-verified methane emissions standards. Relatedly, some conservative thought leaders have supported policy that enables voluntary participation in certified programs that provide market clarity and confidence to harness private willingness to pay for lower-GHG products. This has been buttressed by support from an industry-led effort to advance a market for environmentally differentiated natural gas based on a standard, secure certification process. 

Permitting constraints on clean technology supply chains can have perverse economic and emissions effects. A prime example is critical minerals, which are essential components of clean energy technologies. A net-zero emission energy transition, relative to current consumption, would increase U.S. annual mineral demand by 121% for copper, 504% for nickel, 2,007% for cobalt, and 13,267% for lithium. Unsubsidized market forces are poised to produce enough domestic copper and lithium to satisfy a large share of domestic demand, but face undue barriers to entry that restrict production far below its potential. To meet net-zero objectives, permitting reform allowing all currently proposed projects to enter the market would lower U.S. import reliance for copper from 74% to 41%, while dropping lithium import reliance from 100% to 51%. 

Expanding domestic mining no doubt carries local environmental tradeoffs. However, the U.S. has some of the most stringent and comprehensive mining safeguards in the world. Thus, foregoing development domestically is likely to push mining toward foreign countries with inferior environmental, safety, and child labor protections. It is therefore critical that domestic permitting decisions account for the unintended effects of denying permits, not merely the direct consequences of approving a project. 

Permitting and siting constraints on energy infrastructure also impose major costs and foregone abatement. These entry barriers largely exist as environmental safeguards, yet almost always inhibit projects with a superior emissions profile to the legacy resources they replace. In fact, 90% of planned and in-progress energy projects on the federal dashboard were clean energy-related as of July 2023. In 2023, the ratio of clean energy to fossil projects requiring an environmental impact statement to comply with the National Environmental Policy Act (NEPA) was 2:1 for the Department of Energy and nearly 4:1 for the Bureau of Land Management. A 2025 study estimated that bringing permitting timelines down from 60 months to 24 months would reduce U.S. electric power emissions by 13%. 

Permitting has proven to be a litmus test for the progressive environmental movement, which bifurcates between anti-development symbolists and pragmatic pro-abundance progressives. While a minority of mainstream environmental groups have become amenable to permitting reform, such as The Nature Conservancy and the Audubon Society, the core of progressive environmental groups has not. Instead, new progressive groups like Clean Tomorrow and the Institute for Progress have filled the pro-abundance void alongside traditional market-friendly progressive groups like the Progressive Policy Institute. This progressive subset has helped influence moderate Democrats to collaborate with conservatives on permitting reform. 

Permitting reform has long been championed by conservatives for its economic benefits, with climate considerations typically a secondary-at-best rationale. Yet permitting reform has become a priority for the newer climate-minded conservative movement. However, permitting has also proven to be a differentiator between conservatives and right-wing populists, as the latter engage in forms of government intervention that sometimes contradict conservative principles. For example, the Trump administration enacted an offshore wind energy pause that followed the same problematic blueprint as the Biden administration’s LNG pause. This elevates the importance of technology-neutral permitting reforms with an emphasis on permitting permanence safeguards. 

In recent years, a coalition of Republicans, centrist Democrats, and clean energy and abundance advocates has pressed for reform to NEPA. A broad suite of federal permitting reforms with bipartisan appeal was identified in a 2024 report by the Bipartisan Policy Center. Bipartisan alignment led to the passage of the Fiscal Responsibility Act of 2023 into law and the Senate passage of the Energy Permitting Reform Act of 2024 (EPRA). Although a 2025 Supreme Court decision suggests executive actions alone may substantially reduce NEPA obstacles, plenty of NEPA and other federal statutory reforms remain of high value and hold considerable bipartisan potential.

The positions of leading progressive, conservative, and centrist thought leadership organizations highlight alignment on various federal permitting and siting reforms. These include statutory changes to NEPA, the Endangered Species Act, the Clean Water Act, the Clean Air Act, and the National Historic Preservation Act. Substantive alignment includes reforms that reduce litigation risk (e.g., judicial review reform), limit executive power to stop project approvals and undermine permitting permanence, maintain technology neutrality, strengthen federal backstop siting authority for interstate infrastructure, codify the Seven County decision, and streamline agency practices while ensuring sufficient state capacity. 

Despite considerable positive momentum at the federal level, the greatest permitting and siting barriers generally reside at the state and local levels and are trending sharply in a more restrictive direction. Wind and solar ordinances have grown by over 1,500% since the late 2000s. Oil and gas pipelines and power plants face mounting permitting and siting restrictions in progressive states, which not only raise costs but do not necessarily reduce emissions. In fact, the New England Independent System Operator said that a lack of natural gas infrastructure in the region has raised prices and pollution by forcing reliance on higher-cost resources like oil-fired power plants. The only major power generation resource with a less restrictive trend is nuclear, as six states recently modified or repealed nuclear moratoria to ease siting. 

Motivation for opposing energy infrastructure permitting has included the well-known “not in my backyard” concerns, such as noise, construction disruptions, or land use conflicts. Interestingly, much opposition appears to stem from perception as much as from substantiated negative effects. Relatedly, permitting resistance rationales increasingly appear to result from ideological opposition to particular energy sources. Finally, much opposition and most litigation of energy projects comes from non-governmental organizations, not the landowners directly affected. Altogether, this underscores the importance of permitting and siting reform that improves the quality of information available to agencies and parties, ties decisionmaking to specific harms rather than speculative claims, limits standing to affected parties, and creates appeals processes for landowners to challenge obstructive local government laws and decisions. A key tension to overcome is that technology-agnostic legislation has been more likely to advance in states with one or more Republican chambers, yet environmental advocates resist “all-of-the-above” reforms.

Policies that reduce permitting and siting burdens are class I: they boost economic output and are increasingly key to emissions reductions. Permitting and siting policies that restrict fossil development are not particularly effective at reducing emissions and often add considerable cost, though costs vary widely depending on the nature of the policies and their implementation. Effective fossil restrictions can range from class II to class IV policy, while ineffective ones actually increase emissions. The political economy of permitting and siting must overcome the lobby of entrenched suppliers, who seek to maintain competitive moats. An ironic example was incumbent asset owners funding environmental groups to oppose transmission infrastructure in the Northeast that would import emissions-free hydropower. 

Electric Regulation

The power industry is at the forefront of energy cost concerns and decarbonization objectives. In the early 2020s, electric rates rose most in Democratic states. These concerns reoriented progressives toward cost containment, even at the expense of climate objectives. In the 2024 election, cost-of-living concerns propelled Republicans to widespread victories as President Trump vowed to halve electricity prices. A year later, voter concerns over rising electricity rates in Georgia, New Jersey, and Virginia boosted Democrats in gubernatorial and public service commission (PSC) elections. 

At the same time, electricity is arguably the most important sector for climate abatement given its emissions share and the indirect effects of electrifying other sectors, namely transportation and manufacturing. Ample pathways exist to reduce electric costs and emissions simultaneously, primarily by fixing profound government failure embedded in legacy regulation. Electric industrial organization shapes economic and climate outcomes, with market liberalization an advantage for both. 

Electric regulation falls into two basic formats. The first is cost-of-service (CoS) regulation, where the role of government is to substitute for the role of competition in overseeing a monopoly utility. The alternative is for regulation to facilitate competition by using the “visible hand” of market rules to enable the “invisible hand” to go to work. 

CoS regulation historically applied to power generation, though about a third of states enacted restructuring to introduce competition into power generation and retail services, in response to rising rates and the recognition that these are not natural-monopoly services. Nearly all transmission and distribution (T&D), historically and today, remains under CoS regulation. Importantly, CoS regulation motivates a utility to expand the regulated rate base upon which it earns a state-approved return. Generally, the main cost-discipline problems in the power industry stem from its CoS-regulated segments: transmission, distribution, and the portion of generation that remains on CoS rates. 

Generally, restructured jurisdictions see greater innovation and downward pressure on the supply portion of customer bills. The economic performance of restructuring is highly sensitive to the quality of implementation, including the quality of wholesale energy price formation and capacity market design, as well as various elements of retail choice implementation. Restructured jurisdictions have also seen improved governance, whereas CoS utilities are prone to cronyism and corruption given the inherent incentives of their business model. Competitive wholesale and retail power markets hold cost and emissions advantages through several mechanisms:

Electric cost increases are multifaceted, prompting many misdiagnoses that blame markets for non-market problems. Utilities have begun pushing campaigns in restructured states to revert to CoS regulation, whereas the growing consumer segment – namely data centers and industrials – is organizing campaigns to expand consumer choice. Independent economic assessments warn against a return to CoS regulation and instead encourage state regulators to implement restructuring better. This includes better market design, consumer exposure to wholesale prices, and effective coordination with transmission investment. 

T&D costs, generally, are the core driver of electricity cost pressures nationwide. Over the last two decades, utility capital spending on distribution has increased 2.5-fold, while transmission spending has nearly tripled. This reflects profound flaws in CoS regulation of T&D, resulting in overinvestment in inefficient infrastructure and underinvestment in cost-effective infrastructure. This is projected to worsen, given the T&D expansion needed to meet grid reliability criteria amid aging infrastructure, turnover in the generation fleet, and load growth. 

T&D expansion is also central to abatement. Even partial transmission reforms can reduce carbon dioxide emissions by hundreds of millions of tons per year. This explains why progressives have made reforms that expand transmission a top priority. To result in durable policy, this needs to be reconciled with the cost concerns of consumers and conservatives. Consumers and conservatives have a budding transmission agenda rooted in upgrading the existing system, removing barriers to voluntary transmission development, using sound economic practices for mandatorily planned transmission, streamlined permitting and siting, and improved governance. A particularly promising frontier is reforms that enhance the existing system, given the expedience of their cost relief and their consistency with a Trump administration directive.

Recent federal regulatory actions have demonstrated bipartisan willingness to improve transmission policy and the related issue of interconnection, which has emerged as a major cost and emissions issue. In 2023, FERC passed Order 2023 on a bipartisan basis to reduce barriers to new power plants trying to interconnect to regional transmission systems. Subsequent reforms were motivated by a coalition of consumer groups and the center-right R Street Institute. In 2024, FERC passed Order 1920-A on a bipartisan basis to improve economic practices in regional transmission development. EPRA, a gamechanger for interregional transmission development, passed the Senate with bipartisan support in 2024. 

Demand growth has sparked reliability concerns over tight supply margins and recently put upward pressure on wholesale market prices. However, the states with the greatest price decreases typically had increasing demand from 2019 to 2024 (Figure 3). This shows the importance of infrastructure utilization for electric rate pressures, as many areas previously had supply slack. The past may not be prologue: emerging conditions show supply-constrained scenarios in which marginal generation and T&D costs increase steeply to meet new load. The Energy Information Administration observes steady retail price increases and projects further rises exceeding inflation. 

Figure 3. Relationship Between Load Growth and Changes in Retail Electricity Prices (2019-2024)

Source: Wiser et al., 2025.

In an era of resurgent power demand growth, the states poised to keep rates and emissions down have wholesale competition, retail competition, efficient generator interconnection processes, economical T&D practices, and low permitting and siting barriers. The only state that reasonably accomplishes all of these is Texas, which is experiencing the most commercial interest among competitive suppliers and growing power consumers. Texas has experienced industry-leading clean energy investment and earned the distinction of Newsweek’s “greenest state” in 2024. 

All of the aforementioned electric reforms are considered class I policy. Despite their cost-reduction appeal, power industry reforms have proven challenging for two reasons. First, reforms are highly technical in nature and face limited state capacity among legislative advisors and technocratic agencies, namely PSCs and FERC. For example, recent FERC and PSC activities reveal that these entities lack the bandwidth and expertise to properly implement existing transmission policy, much less reform it. Second, reforms face strong resistance from incumbent utilities, whose concentrated interests in the status quo create a strong lobbying incentive. By contrast, the beneficiaries of reform, especially consumers, are dispersed interests that do not organize as effectively as a lobbying force. 

Although the Texas electricity experiment and the associated federal power market reforms under President George W. Bush are a conservative legacy, most restructured states are progressive. This reflects significant historic bipartisan appeal. However, traditional conservatives have sometimes conflated pro-utility positions with the “pro-business” position, and it is unclear whether right-wing populist influences will catalyze pro-market reforms by challenging the status quo or retrench monopoly utility interests based on technocratic market skepticism (e.g., Project 2025). CoS utilities also commonly oppose cost-effective T&D reform, especially vertically integrated utilities, consistent with their financial incentives to expand rate base and deter lower-cost imports from third parties. Nonetheless, the political economy of bipartisan electric regulatory reform remains promising, given voters’ prioritization of reducing electricity costs. 

Public Spending 

Government spending occurs through direct spending outlays or indirect spending through tax expenditures. Spending takes the form of industrial policy or innovation policy. The economics literature is historically critical of industrial policy, while positive literature on industrial policy usually conflates it with innovation policy. A distinguishing element is that innovation policy selects policy instruments suited to specific market failures, namely the positive externalities of knowledge spillovers and learning-by-doing. These generally apply to research and development (R&D) and early stage technologies, including those in demonstration stage and infant industries that have not achieved economies of scale. 

Predictably, progressives have been consistent backers of robust innovation policy, while conservatives typically scrutinize such expenses closely. Although differences of opinion exist on optimal funding levels, historically conservatives and progressives have agreed on a role for the government in supporting R&D. There is also a history of good governance agreement, such as a joint project between the Center for American Progress and the Heritage Foundation in 2013 on improving the performance of the national lab system. Improving outcomes-based Department of Energy program performance may have broad appeal, including better performance metrics, stronger linkages to private sector needs, and program reevaluation to determine government investment phase-out. Improvements to state capacity are paramount in this regard. 

Conservatives are often critical of public spending on infant industries, where government failure can outweigh market failure. For example, policymakers often struggle to identify when to end industry support, while industry engages in rent-maintenance behavior even after achieving maturity. Historic evidence indicates that direct subsidies and tax exemptions for infant energy industries continue well after the targeted technologies mature. Conservative and progressive scholars have historically framed the debate over subsidies for infant industry as one of government failure versus market failure. 

Since innovation policy targets non-climate market failures (e.g., knowledge spillovers), it may have a high static abatement cost. However, it is an inexpensive abatement policy when accounting for dynamic effects, because of induced innovation and learning-by-doing. Importantly, innovation policy holds massive climate benefits, because achieving abatement cost parity between clean and emitting resources is central to clean technology market adoption. Efficient R&D policy can be classified as class I policy, because its upfront cost is outweighed by long-term savings. Demonstration and infant industry support falls into the class II-III range, depending on implementation, and often exhibits substantial durability. 

In recent years, climate-minded conservatives have shown stronger inclinations toward public spending on innovation policy. However, there is a stark difference between conservatives and right-wing populists on innovation policy. Conservatives note that the adverse consequences of the Department of Government Efficiency’s “gutted, ineffective government” approach to the Department of Energy are inconsistent with limited, effective government practice. The economic self-interest benefits of innovation policy may induce a course correction within MAGA, which has not deliberately targeted innovation policy so much as sacrificed it amid a rash government-downsizing exercise. 

In contrast to innovation policy, industrial policy aims to directly promote a given industry, typically using mature technology, with interventions untethered to any underlying market failure (e.g., negative emissions externality). This generally takes the form of public spending on mature industries. For decades, traditional conservatives and climate-minded conservative scholars have been critical of green industrial policy for carrying high costs with modest emissions reductions. 

The most relevant case study in climate industrial policy versus innovation policy is the Inflation Reduction Act (IRA) of 2022. IRA represented the “largest federal response to climate change to date.” It consisted mostly of subsidies for mature technologies, especially wind, solar, and electric vehicles (EVs). It also contained subsidies for infant industries. IRA was passed exclusively by Democrats, with Republicans voicing concerns over its cost. Republicans then passed the One Big Beautiful Bill Act (OBBBA) in 2025, which phased out subsidies for mature technologies but generally retained those for infant industries. This underscores the political durability of innovation policy and the fragility of industrial policy.

A broader debrief on IRA and OBBBA reveals:

The takeaway from IRA and OBBBA is that subsidies for mature technologies are high cost, likely to erode social welfare, and not politically durable. Efficient public spending for RD&D, however, enhances social welfare and falls in the Overton Window due to its value for economic self-interest. Late-stage infant industry is at the fringe of the Overton Window. It is the area where conservative and progressive scholars have historically had contrasting views on whether market failure outweighs government failure, yet political outcomes have largely supported infant industry. 

Generally, the literature finds strong evidence of opportunity cost neglect in public policy, which “creates artificially high demand for public spending.” The IRA was a case in point. Meanwhile, the opportunity cost of public spending is rising rapidly given the dire fiscal trajectory of the United States. In 2025, moderate experts emphasized a pivot away from unsustainable and ineffective “Green New Deal thinking” on clean technology subsidies in favor of an innovation-driven strategy.

Takeaways 

This analysis finds chronic flaws in how costs are considered in ex ante policy analysis. Many medium- and high-cost policies have passed without any robust accounting of costs at all (e.g., IRA, fuel bans). Interventions subjected to cost-benefit analysis have tended to underestimate costs (e.g., regulation). These flaws contribute to public misconception and play into political economy dynamics that favor policies with hidden costs over those with transparent ones.

High-cost policies have typically been enacted only by progressive governments and have come under greater scrutiny as energy costs escalate. This calls their social welfare effects and durability into question, and it has cast climate action in the public eye as requiring deep economic sacrifice.

Conservatives have been hesitant to engage on climate policy at all, largely because of perceptions of dire economic tradeoffs. Such concerns have instigated a conservative backlash to climate policy, including to policies that are compatible with U.S. economic interests. This has been exacerbated by right-wing populism, which often strays from limited-government conservatism in pursuit of cultural identity objectives. For example, in a 2024 piece promoting energy affordability, the Heritage Foundation correctly attributed cost increases to renewable energy mandates, but incorrectly presumed that a broad shift toward renewable energy and away from fossil fuels would always increase costs.

High-abatement-cost policies not only risk reducing aggregate social welfare; they also create distributional concerns. Policies that raise energy costs tend to be regressive. This has challenged the social justice narrative of progressives, prompting progressive leaders to rethink and take a “cost-first approach to [the] clean energy transition.” Although subsidies are a common tool for lowering burdens on low-income households, the most popular green subsidies have exacerbated distributional concerns. Renewables subsidies favored by progressives have been challenged by conservatives as “green corporate welfare,” and progressives have also faced criticism over EV tax credits that disproportionately benefit wealthy households.

Encouragingly, negative- and low-cost policies comprise a rising share of the abatement curve. The Overton Window for pursuing such policies has grown remarkably among “abundance progressives” and conventional conservatives. However, populist subsets within both movements challenge the potential for political alignment. Enacting negative-cost policies also faces the collective action problem of dispersed beneficiaries versus a concentrated incumbent supplier lobby favoring the status quo. Mobilizing consumer and taxpayer groups is an underappreciated strategy for enacting these policies.

| Abatement cost category | Overton Window strength | Social welfare effect | Policy examples |
|---|---|---|---|
| Class I. Negative | Strong | Very positive | Liberalized permitting and siting; liberalized power markets; streamlined generator interconnection; economical transmission expansion; efficient R&D policy |
| Class II. Low | Substantial | Positive | Efficient GHG transparency; efficient demonstration policy; modest RPS; backstop cap-and-trade; modest command-and-control regulation |
| Class III. Medium | Inconsistent | Globally positive, often domestically negative | Moderate RPS; robust cap-and-trade; moderate command-and-control regulation; infant industry support |
| Class IV. High | Poor | Often negative | Stringent RPS; stringent command-and-control regulation; onerous GHG transparency; mature technology subsidies |

This analysis is far from comprehensive. A notable omission from this paper is transportation policy, the largest GHG-emitting sector in the U.S. A scan of the transportation literature underscores major abatement potential for negative- and low-cost policies, including reducing government barriers to efficient heavy-duty transportation such as railways, shipping, and heavier trucking. Further, the electrification of transportation requires extensive fixes to government failure, such as liberalizing markets to enable competitive charging infrastructure, which lowers costs. The merits of innovation and GHG transparency policy, discussed previously, also appear to hold promise for transportation applications such as aviation fuel. The transportation sector has also been the target of GHG regulation, mostly in progressive states, which warrants close assessment of costs. For example, one study identified a vast abatement cost range for fuel standards ($60–$2,272/tonne).

A shortcoming of this analysis is that it only characterizes costs by their efficiency (i.e., $/ton). Political decisions are highly sensitive to aggregate cost and its visibility to the public, which our taxonomy does not characterize. It is possible that efficient, transparent, and higher aggregate cost policies (e.g., C&T) fare less favorably in some political settings than inefficient, opaque, and sometimes lower aggregate cost policies (e.g., RPS solar carveouts). 

Despite the limitations of this analysis, the sample of policies evaluated is sufficient to support the thesis. That is, a retooled climate policy agenda that prioritizes cost considerations should elevate social welfare and achieve greater abatement by selecting more durable policies. 

Conclusion 

Abatement costs have huge bearing on whether climate policies benefit society, their likelihood of passage, and whether they prove politically durable. Most abatement need not come from dedicated climate policy, per se, but rather sound economic policy that carries deep climate co-benefits. Chronic disregard for cost considerations has led to an overselection of high-cost policies and underpursuit of low- and negative-cost policies. This has undermined policy durability and exacerbated political polarization over climate change abatement. 

This paper finds extensive abatement opportunities within negative-cost policies. These largely constitute fixes to government failure and include permitting, siting, and power regulation reforms. This analysis also finds considerable low-cost policies that are compatible with U.S. economic self-interests. These policies primarily spur voluntary private sector abatement through efficient innovation policy and GHG transparency. 

We offer three sets of recommendations moving forward for influencers of the climate policy agenda:

  1. Focus on results. Climate change abatement is a function of global GHG concentrations. Too much attention goes to symbolic objectives, like blocking fossil fuel infrastructure. This tends to undermine abatement goals and impose high costs.
  2. Emphasize cost considerations in policy agenda setting, formulation, and maintenance. Negative abatement cost policies should take top priority, with an emphasis on mobilizing beneficiaries. Robust cost-benefit analyses should precede all cost-additive policies and be reconducted periodically to guide policy adjustments.
  3. Prioritize quality state capacity. The net benefits of abatement policies are sensitive to government capacity and performance. Public management is in great jeopardy in an era of institutional decay. Negative-cost policies are often highly technocratic and require sufficient staffing expertise and accountable management at public institutions like DOE, FERC, PSCs, and permitting and siting agencies. 
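Recommendation 2's call for robust, periodically reconducted cost-benefit analysis reduces, at its core, to comparing discounted streams of costs and benefits. A minimal sketch, with wholly hypothetical numbers:

```python
def npv(net_benefits, rate=0.03):
    """Net present value of an annual net-benefit stream (benefits minus
    costs), discounted at `rate`; year 0 is undiscounted."""
    return sum(nb / (1 + rate) ** t for t, nb in enumerate(net_benefits))

# Hypothetical policy: $50M upfront cost, then $10M/yr in net benefits
# for 10 years. A positive NPV indicates the policy is cost-additive but
# welfare-enhancing; rerunning the analysis as conditions change can flag
# when benefits have eroded and the policy warrants adjustment.
stream = [-50] + [10] * 10
print(round(npv(stream), 1))
```

Real analyses are far richer (uncertainty, distributional weights, discount-rate sensitivity), but even this bare structure, applied ex ante, would have surfaced the hidden costs this paper criticizes.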

In an era when energy affordability takes precedence, a reset climate agenda should anchor itself in good policy basics: a sober-minded return to results-driven, net-benefits-prioritized policy. This should improve the durability of climate policy and ensure it enhances social welfare. Executing reforms well requires a recommitment to improving the quality of institutions as much as the policies themselves.

FAS Launches New “Center for Regulatory Ingenuity” to Modernize American Governance, Drive Durable Climate Progress

WASHINGTON, D.C. — February 12, 2026 — The Federation of American Scientists (FAS) today announced the launch of the Center for Regulatory Ingenuity, a new hub designed to reimagine how the government tackles “wicked” modern problems while delivering everyday benefits for Americans.

“We can’t manage today’s problems with yesterday’s laws,” said Dr. Jedidah Isler, FAS Chief Science Officer. “The Center for Regulatory Ingenuity will bridge the gap between high-level policy design and on-the-ground implementation, ensuring that government promises translate into real-world results that Americans experience.”

FAS is launching the Center for Regulatory Ingenuity (CRI) to build a new, transpartisan vision of government that works – that has the capacity to achieve ambitious goals while adeptly responding to people’s basic needs. CRI does this by (1) creating high-trust environments to brainstorm and refine the big ideas that will breathe new life into government, and (2) building a “network of networks” that supports policymakers and practitioners in implementing those ideas at scale.

“The administrative state has delivered extraordinary achievements in the past, but today’s operating model is a complete mismatch for the complexity we face. As a result, trust in government has been in the basement for decades,” said Loren DeJonge Schulman, Director of Government Capacity at FAS. “Strengthening government capacity is an investment in democracy and deeply intertwined with climate progress. It requires thinking creatively about how to build the government we need, not endlessly pointing fingers at the government we have–CRI aims to do just that.”

CRI is launching with a focus on climate: a space where there’s an increasingly evident mismatch between the functions the government needs to provide and the tools it has to deliver. FAS is pleased to welcome Climate Group North America, ICLEI USA, and the Environmental Law Institute as core partners in this initial work.

“Today’s rollback of the endangerment finding underscores that we are in a new era for U.S. climate policy,” said Dr. Hannah Safford, Associate Director of Climate and Environment at FAS. “To be clear: there’s no credible scientific basis for that rollback, which FAS strongly opposes. At the same time, it’s worth recognizing that while foundational environmental laws like the Clean Air Act worked to curb industrial pollution, they weren’t designed to guide the economy-wide transition to clean technologies that’s currently underway. There’s tremendous opportunity for innovation on how we design and deliver climate policies that are equitable, efficient, effective, and durable. With EPA stepping back on this front, it’s time for others to step forward.”

With the support of contributors from across the ideological spectrum, CRI is already charting paths for a renewed administrative state, a more responsive government, and ambitious climate policy that lasts. These paths are explored in CRI’s inaugural essay collection, Bureaucracy as Social Hope: An Argument for Renewing the Administrative State. The first two of these essays, “Rebuilding Environmental Governance: Understanding the Foundations”, by Jordan Diamond and collaborators at the Environmental Law Institute, and “Costs Come First in a Reset Climate Agenda”, by Devin Hartman (R Street Institute) and Neel Brown (Progressive Policy Institute), are available now; the remainder will be released in coming weeks. Other authors featured in the collection include:

In addition, CRI is today releasing “From Ambition to Action: Shovel-Ready Policy Solutions for Climate Leaders”. This policy primer, crowd-sourced from dozens of experts and policy entrepreneurs, outlines how motivated public leaders – especially at the state and local level – can turn big ideas into reality, cutting emissions while delivering cheaper electricity, ensuring affordable housing, and improving transportation for all of America.

Moving forward, CRI intends to deliver more detailed playbooks illustrating how an approach grounded in regulatory ingenuity can improve outcomes and achieve goals in these key sectors, which collectively account for two-thirds of U.S. emissions and contribute at least 25% of U.S GDP. 

More information about CRI is available here. For updates and to stay connected, click here.



ABOUT FAS

The Federation of American Scientists (FAS) works to advance progress on a broad suite of contemporary issues where science, technology, and innovation policy can deliver transformative impact, and seeks to ensure that scientific and technical expertise have a seat at the policymaking table. Established in 1945 by scientists in response to the atomic bomb, FAS continues to bring scientific rigor and analysis to address national challenges. More information about FAS’s work is available at fas.org.

Media Contact: Katie McCaskey, kmccaskey@fas.org, (202) 933-8857

From Ambition to Action: A Policy Primer

How public leaders can boost climate progress, restore trust in government, and make lives better…starting today.

People across the nation are clamoring for solutions that make their lives better. And they’re frustrated by the responses they’re getting. Confronting massive inequality, Americans watch leaders finger-point on the price of eggs; yearning for security and stability, Americans watch politics lurch between radically different agendas. No wonder, then, that public trust in the U.S. government has been in the basement for decades. Americans are facing both everyday challenges and a deep, growing sense of discontent. But they’ve lost faith in government to resolve either.

That sense of stuckness doesn’t need to last. But change means focusing on outcomes, eliminating bottlenecks, and prioritizing delivery. It means embracing tools and talent that better connect big ideas to real-world results. It means resisting the temptation to chase buzzwords – from “abundance” to “dominance” to “affordability” – and focusing on the method over the message.

One place to start is with the shift to clean technologies, a place where there is powerful momentum. One in five cars globally are already electric, while heat pumps have outsold gas furnaces in the United States for four consecutive years. The vast bulk of new energy generation is renewable: globally, clean energy investment is now double the amount spent on all fossil fuels combined.

While the transition to clean technologies is unstoppably underway, it is also in its messy middle. Rival technologies and energy systems (and the economic and political systems on which they depend) are now colliding. Many counties and cities depend heavily on fossil fuel revenues; meanwhile, job quality and union density in the renewable energy industry leave much to be desired. And core parts of our infrastructure – from the power grid to gas stations – are complex and expensive to convert to serve renewable and clean industries, even if those industries will ultimately boost affordability.

Put simply, remaining globally competitive on critical clean technologies requires far more than pointing out that individual electric cars and rooftop solar panels might produce consumer savings. But we also can’t afford to cede the space. Internationally, clean energy spending is booming. China’s clean energy industry by itself would be the world’s eighth largest economy if it were a country, and Europe’s investments have almost doubled over the last decade. Even if current estimates hold, fossil fuel demand will peak mid-century. If the U.S. continues to hold fast to existing policies until then, we’ll be 30 years behind the rest of the world’s energy economy, and it will be impossible to catch up. The bottom line? Good climate policy is good economic policy, and vice versa.

Good climate policy is also good politics. Climate-induced disasters are increasing by the day, and are impacting both safety and affordability. Americans generally see climate and energy policy as just as important as immigration. Most Americans, on both sides of the political aisle, support environmental regulations and clean energy development. Many say electricity costs are just as stressful as grocery bills, and they worry about higher insurance rates and local market problems. And they’re tired of entrenched corporate interests calling the shots.

What’s needed are creative, clever strategies that boost climate progress while delivering everyday benefits. The Federation of American Scientists (FAS), as part of our new Center for Regulatory Ingenuity (CRI), developed this primer to put a bunch of those strategies in one place. Our goal is for this primer to serve as a resource for public-sector leaders at the federal, state, and local levels who believe that government can do great things for our communities and our planet.

The strategies herein are open-sourced from a diverse network of contributors and collaborators, and are shovel-ready. Many of these strategies are already being deployed across the country. They’re designed to make energy, housing, and transportation better this year.

Indeed, we hope that readers see the actionability of these solutions not just as a benefit, but as an imperative. Americans aren’t looking for the magic message or the magic moment. They’re looking to government for leadership. Every day that government is paralyzed by gridlock, indecisiveness, or fear of failure is another day that it fails to realize the good that it can achieve, and that public trust in government further erodes. That’s a downward spiral that we’ve got to stop.

Finally, we emphasize that this primer is a starting place. We’re at the precipice of a new era for climate and energy policy in the United States, and the strategies that will form the backbone of this new era – by adeptly fitting together government capacity, private innovation, and democratic decision-making – are just starting to come into view. As they do, CRI and its partners are committed to working hand-in-glove with bold doers and thinkers, sharpening our collective focus, and realizing the vision of a more responsive government, more optimistic society, and more resilient nation.


Getting to Work: Opportunities in Energy, Transportation, and Housing

Solving problems requires framing them accurately. As observed above, the truth is that clean technologies are increasingly dominant, and that the United States is rapidly falling behind. A response predicated on propping up the 20th-century fossil economy is doomed to fail. So too, we’ve learned, is a response that relies on the U.S. federal government to muscle the clean-technology transition forward single-handedly.

Fortunately, because so many clean technologies are now commercial, the opportunity for leadership on multiple levels, and multiple fronts, has never been more available – or more crucial. For example, simple economics will do much to propel wind, solar, and battery technologies if needed supporting infrastructure is in place and clean technologies are given the chance to compete on fair terms. Policymakers can worry less about expending political capital on expensive public subsidies for clean power, and focus instead on transpartisan policies enabling broad market access, streamlined interconnection processes, and swift power grid build-out. In the transportation sector, policies that ensure transparent vehicle pricing or increase market competition for legacy car companies may matter more than traditional regulatory standards.

This new reality also makes thoughtful economic, industrial, and social policy indispensable. The advent of new technology often comes with the promise of broad societal benefits, but making good on that promise is hardly a guarantee (witness the emergent effects of AI). It’s incumbent on government to ensure that the clean-technology transition reduces inequality and improves quality of life at scale, and that the transition doesn’t abandon workers in fossil-dependent regions and industries to the vagaries of the market. And it’s government, working across multiple scales, that can assess regional comparative advantages and figure out where the United States can still compete – as well as where it must innovate and diversify.

Government leaders, in short, have the unique ability to see all the way from the kitchen table to the commanding heights of the global economy, and to mediate between them.

We illustrate below the types of approaches that entrepreneurial policymakers can adopt to secure U.S. leadership on critical clean technologies, in ways that benefit all Americans. We focus on energy, transportation, and housing, which are collectively the largest sources of climate pollution and key elements of household and regional economies nationwide. The list below is not exhaustive but exemplary – a demonstration that there are real opportunities for change.

Unleashing Modern Energy

There’s massive untapped potential for clean energy in the United States. To realize it, we’ve got to make room for new energy to move.

This isn’t primarily a project of continued renewable energy subsidies: there’s good evidence that renewable energy can compete on a level playing field when it’s given the chance. Rather, the project is one of clearing away barriers to financing and building projects, fixing broken market incentives that favor existing players over new entrants and distort energy pricing, and accelerating construction of major grid infrastructure. 

This project looks a lot like the successful national push towards rural electrification that the United States led a century ago: a serious effort that aligns private and public investments to rethink how and where we deliver energy. In executing this effort, we must grapple with the full set of barriers to building – not just cost and permitting, but also thorny local siting processes, misaligned incentives for electric utilities, and lengthy wait times to connect projects to the grid. 

Today, of course, we’ve also got to reckon with the growing threats of cyberattacks and extreme weather to energy infrastructure, as well as the unprecedented, unpredictable energy demands of hyperscalers. Such challenges can only be managed by a mix of climate stabilization policies, economic risk-sharing strategies, and investments in infrastructure modernization. That’s not a cheap or easy proposition, but it is one with major lasting benefits.

At the consumer level, building more clean energy can help stabilize residential electricity prices (though many other factors also contribute to electricity prices and price volatility). More broadly, clean energy could unlock billions of dollars in potential efficiencies, such as by reducing costs associated with redundant natural gas transmission infrastructure. Expanding clean energy, especially distributed energy resources and virtual power plants, can also upgrade outdated grid infrastructure and secure it against cyber threats. But getting to these benefits requires government leadership.

Energy ingenuity could look like:

Making Transportation Cleaner and Cheaper

People just want to get to where they’re going safely, efficiently, and affordably. Yet despite record levels of federal transportation spending, traffic, emissions, and pedestrian deaths keep rising. And as the Cato Institute observes, “U.S. policy contributes to an inefficient and costly transportation system that reduces workers’ time and incomes.”

We can do better. This starts by recognizing that in much of the United States, cars are both essential and increasingly unaffordable. There’s opportunity for a suite of policies that break market strangleholds while expanding consumer choice, moving us away from involuntary dependence on expensive cars and towards a future with transit that people actually want to ride – as well as affordable yet excellent, and often zero-emission, personal transportation. Core federal clean transportation programs have supported $4.6 billion in domestic investments and created at least 14,000 jobs in manufacturing, demonstrating the large-scale benefits of such programs and the economic case for continued federal support. Because the tools involved are nearly all within the authorities of state and local governments, and independent of ongoing federal regulatory disputes, they also can go into effect quickly.

On the vehicle side, this agenda includes governmental efforts to address legacy companies’ market power. Incentives and protections for domestic manufacturing are sensible so long as they boost local economies, support American workers, and drive American innovation – but they’ve got to be coupled with price transparency requirements and other oversight mechanisms, so that benefits flow to consumers rather than pad company profits. Unlocking a more affordable, competitive, zero-emission vehicle (ZEV) market – with more options for buyers at lower prices – is also a key political foundation for the next round of vehicle regulatory mandates, because it creates a larger constituency for further progress.

On the system side, states and cities can significantly bolster regional budgets with savvy transportation investments. The data are clear that transit and walkability investments bring more valuable housing into cities and connect people with jobs, raising economic activity and property values. Investments in electric-vehicle charging similarly boost local business revenue and spur economic vitality. Communities thrive when their members have transportation options that all work well, instead of being steered toward legacy vehicle technology and wrestling with creaky 20th-century infrastructure.

On the vehicle side, transportation ingenuity could look like:

On the system side, transportation ingenuity could look like:

Building Affordable, Abundant Housing

Housing shouldn’t be a luxury: it’s a prerequisite for a stable, healthy life. Yet Americans – facing prohibitively high (and increasing) rental costs as well as unrealistic down payments and pathways to ownership – are struggling to meet this basic need. And with extreme weather on the rise, renters and owners alike are facing concerns about physical safety and skyrocketing insurance as well as price hurdles. The emissions that the housing sector produces only worsen these problems.

Delivering more affordable, resilient, and climate-friendly housing means making it easier to build housing of all shapes and sizes; tailoring solutions to rural communities, urban communities, and different geographies generally; and striking a better balance between development for housing and development for other purposes. These strategies need to be paired with deep investments in government capacity to facilitate permitting and approval of new housing construction, as well as to facilitate more complex projects – like retrofits, infill development, and office-to-residential conversion – at scale. Also critical is reimagining community and stakeholder engagement on housing questions, aiming to maintain trust, democratic process, and local buy-in without overvaluing the perspectives of existing homeowners, developers, or any other particular constituency at the expense of the rest of the community.

Housing ingenuity could look like:


Making Solutions Stick: The Cross-Cutting Benefits of Government Capacity, Pro-Democracy Design, and Innovative Financing

Each of the policy solutions above offers a way to boost climate progress while delivering everyday benefits across energy, transportation, and/or housing. But how do we make those solutions stick? With trust in government at historic lows, public-sector leaders must quickly follow ambition with action, investing in both ideas and the building blocks that turn ideas into reality. Below, we outline how public leaders can use three of these core building blocks – government capacity, financing, and pro-democracy design – to get on the scoreboard early…and stay there for the long term.

Government Capacity

Government capacity refers to the ability of government to get things done, whether through efficient processes, effective talent, or fit-for-purpose tools. Americans are frustrated by the slow pace of government, but they don’t want the functions that keep them safe and supported dismantled: they want them improved. Accomplishing this requires more than new programs or new funding streams or new inventions. It requires leaders to seriously (and systematically – not via a “wrecking ball” approach) consider which government functions are working, which need to be overhauled, and which should be retired.

Rebuilding government capacity is inseparable from strengthening democracy itself. Both of these goals are wholly intertwined with climate progress. When government acts competently, transparently, and in partnership across levels, it restores public faith that collective action is possible and worthwhile. When it can’t, even well-designed policies stall under the weight of fragmented authority, procedural burden, risk aversion, and institutional inertia. Treating government capacity as a core investment is therefore much more than administrative housekeeping. It’s a prerequisite for durable climate progress.

To boost government capacity, public leaders can:

Finance

Capital is a powerful tool for policymakers and others working in the public interest to shape the forward course of the economy in a fair and effective way. Very often, the capital needed to achieve major societal goals comes from a blend of sources; this is certainly true with respect to climate action and facilitating the transition to clean technologies.

States, cities, banks, community development financial institutions (CDFIs), impact investors, and philanthropies have long worked in partnership with the federal government on clean-technology projects – and are stepping up in a new way now that federal support for such projects has been scaled back. These entities are developing bond-backed financing, joint procurement schemes, and revolving loan funds – not just to fill gaps, but to reimagine what the clean technology economy can look like.

In the near term, opportunities for subnational investment are ripe: the boom in potential firms and projects spurred by recent U.S. industrial policy – though now partially paused – has left a rich set of already underwritten, due-diligenced projects available for re-investment. In the longer term, the success of redesigned regulatory approaches will almost certainly depend on creating profitable firms that can carry forward the clean-technology transition. Public sector leaders can assume an entrepreneurial role in ensuring these new entities, to the degree they benefit from public support, advance the public interest: connecting economic growth to shared prosperity.

To be sure, subnational actors generally cannot fund at the scale of the federal government. But they can have a truly catalytic impact on financing availability and capital flows nevertheless. 

To boost finance, public leaders can:

Public Participation

Public participation in climate action is often treated as a procedural requirement to be satisfied late in the process, rather than as a core function of governing well. The result is familiar: performative town halls, notice-and-comment processes that invite frustration rather than insight, and transparency tools that are easily weaponized by organized interests. This dynamic erodes trust, slows projects, and fuels the perception that government is both unresponsive and incapable. Yet participation, when designed well and tailored to the moment, is not an obstacle to effective governance: it is how government discovers what will work, where friction will arise, and how to build solutions that communities will defend rather than resist. Treating participation as a functional component of state capacity means seeing it as an input to smarter design, faster implementation, and more durable outcomes.

Upgrading how government listens and engages is vital to upgrading how government delivers. When residents see clearly how their input shapes decisions, participation builds legitimacy and reduces the incentives for obstruction and litigation later in the process. When agencies invest in the infrastructure, tools, roles, and expectations that make participation meaningful, they create a feedback loop that improves policy design and strengthens democratic trust at the same time. And when climate leaders meet the public where they are in terms of how they experience and make consumer choices in the climate transition, we can strengthen the connective tissue between government action and public trust. The recommendations below are aimed at helping public leaders move beyond compliance-driven engagement toward participation models that are relational, deliberative, and integrated into the machinery of governance and delivery. This approach ensures that climate solutions are not only technically sound, but socially resilient and democratically grounded. These approaches take time, but they can save enormous amounts of time, risk, and failure down the line.

To boost public participation, public leaders can:


About The Primer

Ambition to Action was authored by Angela Barranco, Zoë Brouns, Megan Husted, Kristi Kimball, Arjun Krishnaswami, Hannah Safford, Loren Schulman, Craig Segall, and Addy Smith.

Many individuals contributed ideas and input to this primer. The authors are grateful to the following individuals and organizations for their time, expertise, and constructive feedback: Patrick Bigger, Laurel Blatchford, Heather Clark, Ted Fertik, Danielle Gagne, Kate Gordon, Betony Jones, Nuin-Tara Key, Alex McDonough, Sara Meyers, Shara Mohtadi, Saharnaz Mirzazad, Beth Osborne, Alexis Pelosi, Sam Ricketts, Bridget Sanderson, Lotte Schlegel, Igor Tregub, Louise White, and Clinton Britt. The content of this primer does not necessarily reflect the views of individuals or organizations acknowledged. Any errors are the sole fault of the authors.

Bureaucracy as Social Hope: An Argument for Renewing the Administrative State

I. Why Isn’t Government Working?

The “administrative state” is an unlovely bureaucratic term for a bureaucracy that has grown increasingly unloved: the network of government agencies that implements and enforces laws. In the United States, critiques of the administrative state abound. The nativist right pushes back against a purportedly dangerously powerful “deep state” while the left sees a meek state beholden to big corporations and incumbent interests. Libertarians bemoan bureaucratic inefficiency and hubris, while the newer “Abundance” movement describes a state choking on its own procedures. Though different narrators are telling different stories, they are arriving at the same moral: the core mechanics of the world’s greatest democracy just don’t work. From there, it is not too big a jump towards casting a nihilistic eye on democracy itself, and towards reckless deconstruction.

Erosion of faith in government is manifesting acutely in the climate movement. The Inflation Reduction Act (IRA) was by far the largest climate investment the world has ever seen. Biden-era regulations were intended to further spur rapid decarbonization of the world’s largest economy. And yet. If we had a dollar for every word written about the administrative state’s failure to effectively implement the IRA, we’d be shaving truffles on our eggs. Meanwhile, the current administration’s regulatory rollbacks are the latest play in what seems to be a never-ending game of political football around federal climate policy. If the administrative state can’t effectively address challenges it deems an “existential threat”, one might ask, what good is it?

Our answer: the American administrative state, since its modern creation out of the New Deal and the post-WWII order, has proven that it can do great things. Vast bureaucracies now successfully care for the elderly, the sick, the poor. Many communicable diseases are close to elimination. The administrative state, by directing tremendous amounts of public and private effort, built the power grid, the internet, the interstates. Nor are our glory days behind us: The American administrative state played the primary role in ending the Covid pandemic, saving millions of lives.

Even when it comes to climate change, the record simply isn’t one of failure. American bureaucratic regulation – from the Environmental Protection Agency (EPA) and from the states, spanning air pollution standards for cars to carbon trading systems for entire economies – combined with significant incentive investments, has brought us technological transformation. Renewable energy is the dominant source of new energy globally. Electric cars now comprise 20% of sales globally and will replace internal combustion by mid-century. Whole industries are decarbonizing and emissions will soon begin to fall. For all the many dubiously legal rollbacks of the second Trump administration, the United States continues to decarbonize.

And so, we argue, it’s hardly time to abandon the administrative state. But it is time to reinvent it. Our core supposition is that the sense of malaise and stasis characterizing current views of the bureaucracy has a substantial amount to do with mismatches between tools that produced current successes and the next set of tools that will be required to sustain and grow them. In the same way that nations might have a first or a second Republic, with constitutional reforms intervening, it is likely time for the next American administrative state.

Again, grounding in climate illustrates the point. Significant administrative pushes have commercialized the technologies needed to address the climate crisis and substantially pushed them into use. The Inflation Reduction Act supercharged this process in the United States, while China – which has sought to dominate clean energy supply chains via its own administrative state and invested accordingly – did so globally. As we enter 2026, there is no real doubt that many clean technologies are available, profitable, and better than fossil technologies. Every nation, including those that do not substantially produce these clean technologies, benefits from their adoption.

But we are now running into a “mid-transition” moment, in which rival technologies, energy systems, and the economic and political systems on which they depend, are in collision. Consider electric vehicles (EVs). It is one thing to call EVs into being by imposing traditional “supply-side” regulations on manufacturers. It is quite another, as gasoline demand begins to sharply decline, to manage knock-on consequences for the entirety of the fossil economy, from refineries to pipelines to gas stations – much less the local and state budgets and jobs that the fossil economy underpins. Though regulatory strategies can be designed to address these economy-wide consequences, we won’t get there by running the same plays harder and faster. We’ve got to seriously interrogate where the most significant bottlenecks are, who is equipped to address them, and what tools they have or will need to deploy.

Now add two further wrinkles. 

First, procedural tangles that were created for all the right reasons, but that now hamper problem solving. In the environmental space, laws and processes were put in place decades ago to carefully scrutinize the impacts of potentially polluting infrastructure and factories. These measures have, in many instances, succeeded in preventing harm and protecting communities. But they are also indisputably making it harder to rapidly, massively scale up green technologies. This “Greens’ Dilemma”, playing itself out in debates over the national environmental regulatory regime, is, in fact, a specific manifestation of broader dynamics. Incumbent systems, and those invested in them, do not particularly like to change. Indeed, the American administrative state generally was designed to move deliberately and deliberatively, including multiple veto points to avoid capture by industry or any particular interests. A worthy goal, but distinct from moving with speed towards the public good. When system inertia makes it too easy to grind the gears, the result, unsurprisingly, is painfully slow progress on building new public infrastructure and harnessing new innovations. If we zoom back in on the environmental space with these broader dynamics in mind, the particular obstacle inhibiting climate progress emerges with startling clarity: a system that was designed to produce cleaner technologies within the fossil economy is simply not set up to replace the fossil economy.

Second, the fact that the capacity of government to navigate these challenging dynamics has been sapped. There are multiple drivers of eroding government capacity. At the state and local level, years of corrosive narrative attacks translated into unwise revenue restrictions that in turn made forward-looking capacity investments all but impossible. At the federal level, a variety of policies and misaligned incentives have led to stasis and overreliance on contractors as opposed to internal expertise. At all levels, well-intentioned good-government and environmental reforms have imposed layers of analytic requirements that, while initially successful, ultimately contributed to “kludgeocracy”, while a highly litigious American society has, unsurprisingly, produced a highly risk-averse American government. Make no mistake: U.S. government at all levels has, and has always had, countless dedicated and talented civil servants who find ways to accomplish great things. But generally, this government is riddled with systems and structures that make it ever-more difficult for even the most effective individual to quickly and creatively deliver, especially when armed with aging legal and regulatory tools.

The upshot? We need not lose faith in the administrative state itself; we would do better to view it as having functioned with its hands tied tighter and tighter. But we are now starting, particularly in the climate and energy space, to hit real limits.

These aren’t issues we can resolve with one-off budget bills or Band-Aid workarounds. The vision, and the fixes, will have to run much deeper. The second Trump administration’s massive federal shake-ups, if nothing else, have opened the field for reconstruction. There is an opening – and, we believe, transpartisan appetite – for a bold, positive vision of a government that is attuned and responsive to the needs of American people and communities, that people can trust to deliver things like cheap, reliable energy; affordable, abundant housing; and fast, safe transportation even as it adeptly manages complex, higher-order challenges like climate change.

To launch its new Center for Regulatory Ingenuity, the Federation of American Scientists (FAS) engaged an ideologically diverse cohort of experts on government capacity and climate to describe how we might realize that vision. This cohort was asked to consider how to advance a paradigm of “regulatory ingenuity” – that is, creativity and cleverness in service of societal objectives alongside basic democratic values – in one or both of the following ways:

  1. Ingenuity in regulatory design. Looking across the entire regulatory lifecycle – from underlying statutory construction, to rule development, to implementation and (ideally) iterative improvement – to seriously examine how existing regulatory systems in the United States can be improved, and identify where fresh thinking is needed.
  2. Ingenuity in regulatory application. Considering how regulations can be coupled with other tools (e.g., innovative market designs, financial instruments, contracting mechanisms, etc.) to achieve societal goals quickly, equitably, and durably.

“Bureaucracy as Social Hope: An Argument for Renewing the Administrative State” is a collection of essays capturing the cohort’s insights. Essay authors envision new alignments of regulatory and financial power, new tools to enable multiple levels of government to move fast, to address distributional impacts, to channel capital at scale, to finally build infrastructure, and to, most fundamentally, break free from stasis. They are emphatically not cynics. Though clear-eyed about the failings they seek to remedy, they understand that these failings are largely the shadows cast by past success.

While these essays are grounded in climate policy, they address cross-cutting themes. They use climate as a lens to evaluate where government is and isn’t working. Indeed, the authors’ commentary with respect to government performance on climate challenges is easily extrapolated to other domains.

In writing, the authors revive an older American tradition of a vital administrative state in service of an equally vital and egalitarian democracy. Our nation used to regularly reorganize its government, and the Congress used to legislate regularly on hard problems. The recent reality of agencies working within aging statutes and confined by outdated structures was not the dominant face of government during the creative ferment of the New Deal or the Great Society or, indeed, of Reconstruction itself. It is, in fact, deeply odd that we still largely live with the same administrative agencies and processes that we had in the 1970s.

So what should – what could – a modernized administrative state look like? The authors together imagine: 

A government that can deliver. It doesn’t need to take a generation to build a railroad, a power grid, or new housing. We can trade a veto-ocracy for the older progressive tradition of governance that rapidly responds to public needs – and secures us the service and infrastructure we need.

A government that can make decisions. The rules of the economy need to stop changing with every election and every major lawsuit. Re-empowering Congress to make big choices, and administrative agencies to deliver without constant swerves, will allow us to stop re-reading the manual and actually play the game.

A government for a modern economy. The future should be innovative and egalitarian. Realizing this future requires the de-risking and direction-setting powers of government to invite bold bets and spur investment, and the distributive powers of government to ensure that benefits are appropriately shared.

A government that listens and responds. We can replace the prevailing procedural labyrinth with a government that asks focused questions on the key issues, acknowledges and addresses real disagreements, and then moves forward thoughtfully yet confidently. That would involve, in part, staffing government fully and organizing it well – reversing decades of attacks on public servants and putting people to work on the right problems.

A government that works on all levels. Federal, state, and local governments each have unique levers and comparative strengths when it comes to serving our communities and society. A modern administrative state should recognize these, and emphasize frameworks that enable them to work well together.

Americans have spent too long living within a slowly failing version of last century’s government. The resulting civic frustration has largely fueled further attacks on government, spiraling us downwards. But an upwards spiral is possible too, in which structural reforms yield a government better equipped to chip away at tough problems in ways that improve daily life and rebuild civic satisfaction. Because while the “administrative state” as a term is about as wonky as you can get, a renewed administrative state in practice is just common sense.

II. New Approaches for Climate and Democracy

As you will discover as you read, the authors do not all agree on every particular; our goal in inviting this collection was good-faith debate, not artificial consensus. Yet a survey of the collection’s component essays reveals common themes.

For instance, the authors generally agree that economic and industrial policy will be central to the next chapter of climate action. Incumbents still heavily invested in mature fossil-linked technologies and supply chains, as well as non-transparent pricing and other barriers to market entry, badly constrain the transition to competitive clean technologies in many sectors. And where promising technologies are still earlier-stage (e.g., as is the case for nuclear, geothermal, or green hydrogen), there are compelling arguments for government involvement to help establish U.S. dominance. Pollution regulators do not typically, though, control fiscal and monetary tools that can (i) correct market distortions, (ii) manage the very considerable distributive impacts of a shift away from fossil fuels that profoundly impacts industries and jobs across regions, and (iii) support a comprehensive strategy for incubating high-potential domestic industries. Nor are these regulators, with little ability to affect trade policy, well positioned to act within the complex geopolitical context of a partial energy transition. To put it frankly, it doesn’t make a lot of sense to run a massive societal transition with substantial global implications through the EPA. But in the absence of purpose-built institutions and statutes, that’s pretty much what we’ve been doing – with politically and legally unstable results.

This problem is compounded by the fact that the Supreme Court’s skepticism of sweeping regulatory mandates based on old statutes has left the administrative state with ever fewer tools to respond to economic transition needs. Regulations are regularly reversed, and the ongoing duel between litigators and executive branch agencies increasingly looks like an unproductive stalemate. The authors generally chart a path towards a reinvigorated role for Congress to settle disputes, for agencies to act more inventively, and for disputes to move away from the courts and back into democratic processes. 

The authors further point out that regulatory efforts alone are not sufficient to drive the infrastructure shifts needed to make those efforts last, or to buffer their up-front costs. Big infrastructure projects – including vastly growing the clean power grid, electrifying freight, expanding and upgrading transit systems, building new housing, and dismantling legacy, non-economic fuel systems – are central to regulatory success and stability, as well as to addressing an ongoing cost-of-living crisis and boosting national economic competitiveness. Infrastructure, the authors emphasize, isn’t an afterthought – it’s a core enabler of regulatory policy. Unfortunately, the now decades-long trench warfare over climate and other regulations has been accompanied by attacks on the state itself, stripping away administrative and delivery capacity along with the ability of many subnational governments to collect sufficient revenue to fund even basic services, let alone flagship infrastructure projects. The authors vehemently agree that there is much room to trim bureaucratic bloat, streamline process, and sensibly reorganize agencies. At the same time, they observe that a government that is smaller doesn’t always work better; not infrequently, the opposite is true. The authors therefore favor approaches that fit government agencies with the staffing, structures, and revenue they need to deliver on outcomes. Sometimes, those approaches are tweaks. Other times, they’re radical reforms.

II.A Towards a Shared Affirmative Vision

So how do we tackle these challenges – how do we start the upwards spiral in which effective delivery reinforces faith in democratic governance that in turn unlocks more delivery capacity? The authors develop a shared affirmative vision, one that broadly looks like this:

The collective vision is one in which the administrative state starts moving again, returning to the ethic of ongoing systematic revision that once characterized it. Rather than relying on the best ideas and institutions of a half-century ago, we would work towards structures more aligned with current needs – and do so in a way that reaffirms the creativity and vigor that has long powered America’s economy.

II.B Laying Out The Pieces

Each of the essays in this collection lays out particular pieces of the shared vision. Broadly: the collection starts by proposing fundamentally different ways to think about environmental and administrative law, seeing its task as delivering a clean economy at scale, rather than simply cutting pollution, and doing so with stable rules derived in democratically legitimate and procedurally stable ways. It then explores how these legal and regulatory structures could help guide the far larger private sector into alignment with public goals, removing barriers to competition that have insulated stubborn fossil incumbents and creating opportunities to move capital at scale into communities in ways that build a fairer and cleaner economy. From there, wrestling with the dislocations that nonetheless will accompany these changes, the collection describes ways to link participatory democracy with economic change, sharpening the focus of the regulatory state and its engagement with the public. The collection concludes by bringing these issues home, describing how state and local governments can deliver today – and presenting a “policy primer” of innovative ideas that can start moving from ambition to action this year. Below, we discuss each of these pieces in turn.

Jordan Diamond and co-authors at the Environmental Law Institute lay the foundation for this collection with a careful look at what environmental law can do, what it can’t, and how we might rebuild its powerful tools for modern challenges. They argue that the pollution statutes of the Nixon era, crucial though they are to addressing environmental contamination from fossil fuels, are at best limited tools for a whole-of-economy shift away from fossil fuels entirely. Viewing that new challenge as fundamentally one about driving economic innovation and infrastructure growth, they chart out areas ripe for legal development. At the same time, they explain why the next round of environmental progress is more likely to be led by infrastructure and economic agencies than pollution regulators – emphasizing that while pollution regulation will remain critical, we should stop asking pollution regulators to drive a national economic transition with aging environmental statutes alone. Their vision is of treating the energy transition like the economic problem it is, with tools to match. They would expand state capacity, bringing to bear a much wider set of agencies and approaches, and therefore also expand what we think of as “environmental law” to respond to the modern era.

Still working within legal reforms, Kirti Datla takes a close look at the profound challenges modern administrative law poses to the regulatory state. The Supreme Court’s new doctrines, she writes, are making it very difficult for environmental agencies, and regulators generally, to address new problems (and often even old problems) through existing statutes. And they suggest that the Court will impose its deregulatory views on even new statutes. These ever-changing rules strain government capacity, make it difficult for subnational governments and investors to plan a path forward, and prevent progress on policy goals. After acknowledging the need for new regulatory approaches, judicial system reforms, and new statutes, Datla focuses on how Congress can and should engage in the constitutional politics of asserting its role within our federal system, both to constrain the Court and to build its own capacity to address pressing problems like climate.

These two foundational essays, then, help us see the challenge before us. They explain why a kludged-together administrative state running off old statutes and aging structures keeps sputtering to a halt – and start to focus us on an expanded field of play, well beyond re-litigating the environmental policy disputes that have seesawed between the Obama, Biden, and Trump administrations. It is not that the regulatory state is inevitably a “hollow hope” for the shared challenges of climate, democracy, and fair economic growth – but that it has been asked to tackle enormous challenges without a shared theory of action or structures to match. Shifting the economy from its incumbent fossil foundations to a new electrified base, while managing the many linked distributive impacts of that shift under growing climate pressure, simply requires more than pollution regulations or one-time tax policy. If politics is the “slow boring of hard boards,” it helps to have the right tools to drill deep.

But, as Devin Hartman and Neel Brown posit, new tools need not – for durability’s sake, must not – be expensive tools. Nor will another round of mandates succeed without thinking seriously about how to address accompanying costs. Hartman and Brown argue that traditionally conservative lenses that look skeptically at giant fiscal policies and regulatory mandates do, in fact, bring to bear a canny understanding of the interests of incumbent economic system actors. The authors point out that the stuttering progress of the transition to clean technologies comes from the ways in which fossil fuels are deeply intertwined with the interests of powerful economic incumbents, and of existing government. And, having traced the root of the challenge, they conclude that opening these incumbents up to competitive disruption through appropriate reforms will be a potent strategy. For instance, Hartman and Brown contend that the repeal of the IRA may appropriately shift the focus of subsidies from mature energy technologies (including clean technologies like solar as well as most fossil technologies) towards earlier-stage technologies (e.g., geothermal). From permitting reform to addressing market problems that deny Americans access to affordable EVs, Hartman and Brown set out a creative array of solutions that, with government backing, can push forward a modern economy at low, or even negative, cost.

Sometimes aligning with these arguments, sometimes complicating them, and always making them concrete, Beth Bafford describes how a focused set of government investments can further shift the economy onto new foundations by using public capital to leverage far greater private investments in the fundamental infrastructure America needs. She outlines how to wed together Hartman and Brown’s pro-competitive policies with the expanded and stable regulatory state described by Diamond and Datla. Regulators have often operated on a model in which government grants help underwrite regulatory mandates. Bafford instead starts to outline a structure in which government investments – including simple and accessible loan products – instead help shift the economy towards profitable and self-reinforcing clean new industries. Her model is one in which capital access builds entire businesses that can electrify and modernize core sectors of the economy, from the freight sector to the power grid. Regulations can and should still set the direction of travel in this model – but its engine is broadly shared profitability. Rather than forcing innovation into new channels with politically exposed regulatory mandates, agencies in Bafford’s model would help convene and channel the economy towards new system states entirely, with regulations conceived as tools operating in concert with economic investments and planning to help crowd in capital to communities across the country.

Nicole Steele explores the role of capital in renewing the administrative state through a different lens. Steele observes that mission-aligned financial institutions (including values-based banks, green banks, CDFIs, and other purpose-driven funds) are increasingly functioning as essential partners in the administrative state’s delivery capacity. Sitting at the intersection of public policy and private markets, these institutions translate legislative and regulatory goals into bankable, scalable projects by absorbing early risk, standardizing structures, and aggregating demand. In practice, this has included mission-aligned banks working alongside state and local governments to deploy catalytic capital – whether as first-loss reserves, flexible operating support, balance-sheet backstops, or credit enhancement – in support of simple, repeatable lending platforms (such as residential and commercial PACE financing) that allow households, small businesses, and local governments to access clean energy, resilience, and efficiency upgrades without relying on bespoke grants or one-off subsidies.

By deploying catalytic capital, Steele continues, these intermediaries unlock funding that would not otherwise reach underserved markets or emerging project types. Critically, investment into mission-driven institutions does not substitute for private capital; it enables it. Strengthening the balance sheets and operating capacity of green banks and CDFIs allows them to originate, warehouse, and scale lending products that meet market standards, crowding in institutional capital while maintaining public purpose. In a period of federal uncertainty and shifting incentive regimes, expanding the availability of catalytic capital will require a diversified approach: drawing on state and local public balance sheets, philanthropy and quasi-philanthropic capital, and mission-aligned institutional investors willing to deploy flexible funds through intermediaries rather than relying on centralized federal programs alone.

Nana Ayensu builds on Bafford and Steele’s insights. As Ayensu points out, we have a transformational economic opportunity to deploy modern, clean energy infrastructure at scale. 

Federal and subnational governments have a real chance to catalyze significant capital deployment of mature and emerging clean energy technologies that are primed for growth – both directly and via investment into infrastructure. Widespread social benefits are available if governments use their authorities to assemble the puzzle pieces needed to create more actionable investment environments. Ayensu describes the state’s ability to do so: it can synchronize intra- and intergovernmental policy execution, build high-value foundational infrastructure to provide project stakeholders with the information they need, develop deeper risk- and reward-sharing partnerships with the private sector, and create market forces that closely align with economic and societal benefits. This type of consistent, efficient, multi-pronged effort will be critical to garnering the scale of investment needed to expand and update critical energy infrastructure systems and deliver lasting value to communities and industries across the nation.

Ali Zaidi makes the case for bringing this ingenuity to the arena of critical minerals and materials, what he calls "the atomic foundation for reindustrialization and any shot at lasting prosperity and security." Zaidi draws inspiration from America's post-oil-shock response, a crisis moment that produced a broad policy playbook with a spine for experimentation. New laws and regulatory authorities, institutions and infrastructure, moonshot moves on research: that moment, he writes, gave life to policy built to solve a problem. It was "policy with helmets and pads": playing offense, not defense. Zaidi urges bringing that same positioning to minerals and materials security policy today. In his conception, that policy should entail three pillars – production, partnership, and a drive for increasing productivity – that together support the shared goal of strengthening American competitiveness.

The third pillar is where Zaidi spends the most time. The oil shock of the 1970s propelled domestic standards designed to achieve greater fuel economy and appliance efficiency. Such standards have been weighed down over time by clunky test procedures, multi-year rulemakings, and the heavy hand of government auditors. Zaidi proposes a framework for materials productivity that adopts the same solutions-oriented spirit as 1970s energy policy, but is characterized by standards that bind instead of burden. To unlock minerals and materials security, Zaidi writes, "we should replace red tape with rubber bands, just enough structure to allow us to slingshot forward new production, processing, and partnerships — and increased productivity." Zaidi details a framework that is digital, dynamic, and data-driven: where enforcement is algorithmic, not bureaucratic; and the work is easily federated and easily staffed. This light, flexible scaffolding, he argues, would accelerate capital formation and technological innovation.

Indeed, Jennifer DeCesaro, Jennifer Pahlka and Hannah Safford add, we'd do well to apply a similar mindset to planning: a standard feature, and all-too-common bug, of climate policy. Environmental statutes are rife with planning mandates, from Clean Air Act implementation plans to natural hazard mitigation plans required by the Stafford Act to all things NEPA. Look beyond pure statute and you quickly become overwhelmed: climate-related plans are mandated by public utilities commissions, developed by task forces, produced as a precondition for grant eligibility, and on and on. Though plans are easy to ask for, they're often expensive and time-consuming to develop; moreover, lack of coordination among overlapping plans can lead to duplication or even contradictions. DeCesaro, Pahlka, and Safford therefore ask a simple question: "What are all these plans getting us?" They argue that climate policy too often falls into the trap of "planning primacy", where planning becomes the end goal instead of an intermediate step towards progress. Put another way, it's rarely the case that a plan is developed and its directions are then followed to the letter. Rather, the process of thinking through scenarios, understanding constraints, building mental models, and developing relationships with other plan stakeholders is what really matters. DeCesaro, Pahlka, and Safford draw from both the climate space and other domains to illustrate how treating plans as compasses, not maps, can improve efficiency and outcomes. As Eisenhower put it: "In preparing for battle I have always found that plans are useless, but planning is indispensable."

Shifting incumbent systems requires not just low-cost solutions, access to capital, and competent, efficient regulatory capacity. It also requires ways to reconcile or resolve competing interests. Our current regulatory system has gotten bogged down with ineffective procedural approaches to dispute resolution, yielding a litigation-driven collection of process fouls and veto points that no one really likes. Our next set of authors observes that improving this system requires more than a simplistic call for deregulation. Moreover, they argue, the solution can't be to ignore stakeholder input altogether – that runs the risk of policies that are poorly informed, technically infeasible, and brittle given lack of buy-in by the businesses, communities, and people they serve. Rather, our authors propose a range of reforms to help administrative bodies effectively collect input from stakeholders, weigh hard trade-offs and disputes, and move forward fairly, but expeditiously: thereby using democratically legitimate decisionmaking to strengthen industrial policy.

The first of these authors is James Goodwin, who argues for an "agonistic" view of the regulatory state in which regulators must actively surface and invite input on genuine disputes. Goodwin proposes replacing today's box-checking engagement exercises and voluminous stacks of public comments with a focused participation process. In this process, administrators would, at each stage of a project or regulation, identify the core disputes and disagreements that need resolving, and draw in input specifically on these issues. By targeting engagement – and avoiding consensus – in this way, administrators could advance dialogues with the public that are both quicker and inherently more resistant to status quo bias.

Loren DeJonge Schulman and Shaibya Dalal pick up on this theme. They argue that treating public engagement as a strategic asset, not a box-checking exercise, leads to smarter, more durable policies that reflect real community needs and build trust in government. Participation is not a distraction from governing – it is how government governs well. They argue that the failure of many engagement processes is not that agencies invite too much input, but that they do so too late, too perfunctorily, and in ways that exclude the communities most affected by public decisions. When participation is treated as compliance rather than governance, it fuels distrust, invites procedural obstruction, and produces policies that are fragile and contested. By drawing on the full range of options, from transactional public participation to relational community engagement, and by applying clear principles (purposeful design, mutual respect, transparency, accessibility, and iteration), agencies can use engagement to surface lived experience, anticipate conflict, improve policy design, and strengthen the legitimacy and durability of their actions. Done well, participation becomes a form of ingenuity that reduces conflict, eases implementation, and reinforces democratic accountability.

Of course, inviting public participation only works when people are interested in participating. Angela Barranco and Kristi Kimball argue that the American climate movement faces a critical public engagement crisis that threatens to undermine decades of progress on clean energy adoption – and explore how advocates can speak to the public to build interest and support for the shifts that government seeks to deliver and legitimize. Despite nearly 70% of Americans expressing concern about climate change, Barranco and Kimball contend that current advocacy strategies fail to tee up paths for politically durable dispute resolution (and eventual support) because those strategies are unduly rooted in fear-based messaging and technical data. Barranco and Kimball make the case for a shift towards a public conversation that approaches Americans as consumers making lifestyle choices (people who must adopt new technologies and cannot be persuaded through regulatory mandates alone) rather than as political constituents to be mobilized. Drawing on proven strategies from consumer marketing, behavioral psychology, and community-based social marketing research, Barranco and Kimball observe tremendous opportunities for (i) reframing climate engagement around consumer choice, and (ii) leveraging the unprecedented infrastructure investments necessitated by extreme weather impacts to build lasting climate coalitions while simultaneously strengthening democratic institutions and community trust.

Ultimately, these changes and debates occur neither in the abstract nor only in Washington, DC. State and local governments are the theaters in which economic and democratic change play out, mediating federal policy and global geopolitical shifts in the lives of real people. Thus both the climate crisis and the economic transition are inherently "polycentric". Subnational governments have therefore always been at the core of climate and regulatory policy. It is these governments that are most able to set democratically responsive visions for clean economic growth, climate resilience, and infrastructural change that will concretely change lives. If our future is to be shaped more by ordinary people than by technocrats, it is these governments that must have the capacity and creativity to act.

Louise Bedsworth provides a prospectus for local action. As she argues, a rebuilt regulatory state has to position state and local governments for creative action and response. These governments, she writes, are more than subsidiary partners, and more than replacements for federal regulators during deregulatory periods (important though those roles can be). State and local governments are innovators and leaders in their own right. The task is not just to provide ancillary community benefits from federal grants, or to mandate particular state plans, but for state and local democracies to be engines of national and even global change. By expanding their own capacity, aligning capital and economic plans to build regional prosperity and resilience, and engaging in and leveraging networks across geographies, nationally and globally, subnational governments can reshape climate action and the regulatory state.  

Indeed, because of the enormous creativity of subnational governments and the huge opportunities the private sector has created in response to past regulatory guidance and government investments, we do not need to wait for a new federal administration to start putting solutions into place. We have already identified a broad network of ideas and actors that can begin building these ideas in reality this year – in a policy primer for that foundational work. The primer, crowd-sourced from leaders across the field, highlights a starting list of policies well within the reach of subnational actors, focused strongly on economic and industrial policy interventions that can durably advance clean economic systems while managing real trade-offs through savvy deployment of government capacity. It is a practical point of engagement, allowing the ideas articulated in these papers to be tested now, not after further electoral cycles.

III. Conclusion

We do not need more stories of American decline. Critics on the left, center and right have already told us that our government doesn’t work. Americans feel underserved, underrepresented, and ripped off. But Americans also know how to do better. We are always rebuilding our democracy; it is time to do it again.

Collectively, our authors have sketched out the beginnings of an administrative state for this era – grounded in the pressing challenge of climate change and its increasingly evident impacts on American lives. This state would enable governments across scales, and stakeholders across sectors, to realize the vision of a nation where:

This sort of “mission state” – a government that sets a clear vision and brings together public and private sectors to execute it – is actually an old American tradition. What else were the New Deal, the Apollo Program, Operation Warp Speed, and the creation of the internet than missions of this sort? Indeed, when it comes to newer challenges like climate change, we have started, a bit haphazardly, to reach for a mission again. The Inflation Reduction Act’s billions in investments, and the Biden administration’s complementary regulations, were an attempt to bring together the public and private sectors around the vision of a clean and prosperous economy, with good-paying jobs and dominance in the technologies increasingly certain to underpin the 21st-century global order. Yet because of obstacles identified above, that mission was…while not entirely a failure, hardly a resounding success.

But the mission remains necessary. America must not remain mired halfway between the old economy and the new, exposed to climate shocks, with a government unable to satisfyingly respond. Clean technologies are advanced enough that retrenchment and retreat to fossil fuels is a doomed strategy; similarly, we've seen that taking a chainsaw to government leaves our whole nation bleeding.

The only logical approach is to tap into the creative, determined spirit that is the essence of American identity. Think of the millions of Americans who, in the midst of the Great Depression, spread out to every part of this country to rebuild it. We still live among the lovely parks, trails, and civic architecture called into being by the Civilian Conservation Corps; our power grid was brought to us by rural electrification, the Federal Power Act, and the Tennessee Valley Authority. We know what it looks like when Americans believe in government and the government is worthy of that belief. 

It looks, to start, like a conversation. As CRI launches alongside a broad network of partners and contributors, we invite debate, dissent, and experimentation. One of our goals is to bring together people and perspectives that are often in tension to identify where there are threads of common sentiment – and how we can productively move forward despite the tension that remains. We will be gathering thinkers, exchanging ideas, and mapping out pilot projects with growing momentum across the months and years to come, working not just to theorize about solutions but to bring them to life. To adapt the truism about trees: the best time to renew our administrative state was ten years ago. The second-best time is today.

Impacts of Extreme Heat on Children’s Health and Future Success

Extreme heat poses serious and growing risks to children’s health, safety, and education. Yet, schools and childcare facilities are unprepared to handle rising temperatures. To protect the health and well-being of American children, Congress should (1) set policies that guide childcare facilities and schools in preparing for and responding to extreme heat, (2) collect the data required to inform extreme heat readiness and adaptation, and (3) strategically invest in necessary infrastructure upgrades to build heat resilience.

Children are Uniquely Vulnerable to Extreme Heat Exposure and Acute and Chronic Health Impacts

At least five factors drive children’s vulnerability to negative health outcomes from extreme heat, like heat-related illnesses and chronic complications. First, children’s bodies take a longer time to increase sweat production and acclimatize to higher temperatures. Second, young children are more prone to dehydration than adults because a larger percentage of their body weight is water. Third, infants and young children have challenges regulating their body temperatures and often do not recognize when they should act to cool down. Fourth, compared with adults, children spend more time active outdoors, which results in increased exposure to high ambient heat. Fifth, children usually depend on others to provide them with water and protect them from unsafe outdoor environments, but children’s caretakers often underestimate the seriousness of the symptoms of heat stress. Research shows that extreme heat days are linked to increased emergency room (ER) visits for children, especially the 16% of children living at or below the federal poverty line. Extreme heat also exacerbates children’s chronic diseases, like asthma and eczema, increasing health care costs and decreasing children’s overall quality of life.

The Consequences of Chronic Extreme Heat Exposure on Children’s Learning and Well-Being 

Studies show that excessive temperatures reduce cognitive functioning. Hot weather also impacts children's behavior, making them more prone to restlessness, irritability, aggression, and mental distress. Finally, nighttime extreme heat exposure can disrupt sleep patterns, making it harder to fall asleep and stay asleep. These factors can all reduce children's ability to focus, learn, and succeed in school. For each 1°F rise in average annual temperature in school districts without air conditioning or proper heat protections, there is a 1% drop in learning. The Environmental Protection Agency found that these learning losses could translate into nearly $7 billion in annual future income losses if warming trends continue.

Extreme Heat’s Threat to Schools and Childcare Facilities

Rising temperatures force school districts and childcare facilities into a dilemma: choosing between staying open in unsafe heat or closing and disrupting learning and care. 

Staying open can expose students and young children to extreme indoor and outdoor temperatures. The Government Accountability Office found that 41% of U.S. schools need to upgrade their heating, ventilation, and air conditioning (HVAC) systems: upgrades that will cost billions of dollars that schools in low-income areas do not have. Similar infrastructure challenges extend to childcare facilities. Extreme heat also makes outdoor recess more dangerous, as unshaded playgrounds and asphalt surfaces can heat up far above ambient temperatures and pose burn risks. 

Yet when schools close for heat, children still suffer. Even five days of closures for inclement weather in a school year can cause measurable learning loss. Additionally, students may lose access to school meals; while food service continuation plans exist, overheated facilities can complicate implementation. Many children, especially in low-income families, also don't have access to reliable cooling at home, meaning that when schools close for heat, these children receive little respite. Finally, parents are directly impacted as well: school closures mean lost access to childcare, forcing many to miss work or pay for alternative arrangements and straining vulnerable households.

Advancing Solutions that Safeguard American Children from the Impacts of Extreme Heat

To support the capacity of child-serving facilities to adapt to extreme heat, Congress should direct the Department of Education to develop extreme heat guidance, technical assistance programs, and temperature standards, following existing state-level policies as a model for action. Congress should also direct the Administration for Children and Families to develop analogous policies for early childhood facilities and daycare centers receiving federal funding. Finally, Congress should direct the U.S. Department of Agriculture to develop a waiver process for continuing school food service when extreme heat disrupts schedules during the school year. 

To support improved federal data collection efforts on extreme heat's impacts, Congress should direct the Department of Education and Administration for Children and Families to collect data on how schools and childcare facilities are experiencing and responding to extreme heat. There should be a particular focus on the infrastructure upgrades that these facilities need to make to be more prepared for extreme temperatures — especially in low-income and rural communities.

Lastly, to foster much-needed infrastructure improvements in schools and childcare facilities, Congress should consider amending Title I of the Elementary and Secondary Education Act or directing the Department of Education to clarify that funds for Title I schools may be used for school infrastructure upgrades needed to avoid learning losses. These upgrades can include the replacement of HVAC systems or installation of cool roofs, walls, and pavement, solar and other shade canopies, and green roofs, trees, and other green infrastructure, which can keep school buildings at safe temperatures during heat waves. Congress should also direct the Administration for Children and Families to identify funding resources that can be used to upgrade federally-supported childcare facilities.

Impacts of Extreme Heat on Federal Healthcare Spending

Public health insurance programs, especially Medicaid, Medicare, and the Children's Health Insurance Program (CHIP), are more likely to cover populations at increased risk from extreme heat, including low-income individuals, people with chronic illnesses, older adults, disabled adults, and children. When temperatures rise to extremes, these populations are more likely to need care for their heat-related or heat-exacerbated illnesses. Congress must prioritize addressing the heat-related financial impacts on these programs. To boost the resilience of these programs to extreme heat, Congress should incentivize prevention by enabling states to facilitate health-related social needs (HRSN) pilots that can reduce heat-related illnesses, continue to support screenings for the social drivers of health, and integrate preparedness and resilience requirements into the Conditions of Participation (CoPs) and Conditions for Coverage (CfCs) of relevant programs.

Extreme Heat Increases Fiscal Impacts on Public Insurance Programs

Healthcare costs are a function of utilization, which has been rapidly rising since 2010. Extreme heat is driving up utilization as more Americans seek medical care for heat-related illnesses. Extreme heat events are estimated to be annually responsible for nearly 235,000 emergency department visits and more than 56,000 hospital admissions, adding approximately $1 billion to national healthcare costs.

Heat-driven increases in healthcare utilization are especially notable for public insurance programs. One recent study found that there is a 10% increase in heat-related emergency department visits and a 7% increase in hospitalizations during heat wave days for low-income populations eligible for both Medicaid and Medicare. Further demonstrating the relationship between increased spending and extreme heat, the Congressional Budget Office found that for every 100,000 Medicare beneficiaries, extreme temperatures cause an additional 156 emergency department visits and $388,000 in spending per day on average. These higher utilization rates also drive increases in Medicaid transfer payments from the federal government to help states cover rising costs. For every 10 additional days of extreme heat above 90°F, annual Medicaid transfer payments increase by nearly 1%, equivalent to an $11.78 increase per capita.

Additionally, Medicaid funds services for over 60% of nursing home residents. Yet Medicaid reimbursement rates often fail to cover the actual cost of care, leaving many facilities operating at a financial loss. This can make it difficult for both short-term and long-term care facilities to invest in and maintain the cooling infrastructure necessary to comply with existing requirements to maintain safe indoor temperatures. Further, many short-term and long-term care facilities do not have the emergency power back-ups that can keep the air conditioning on during extreme weather events and power outages, nor do they have emergency plans for occupant evacuation in case of dangerous indoor temperatures. This can and does subject residents to deadly indoor temperatures that can worsen their overall health outcomes.

The Impacts of the One Big Beautiful Bill Act

The One Big Beautiful Bill Act (H.R. 1) will have consequential impacts on federally-supported health insurance programs. The Congressional Budget Office projects that an estimated 10 million people could lose their healthcare coverage by 2034. Researchers have estimated that a loss of coverage could result in 50,000 preventable deaths. Further, health care facilities and hospitals will likely see funding losses as a result of Medicaid funding reductions. This will be especially burdensome to low-resourced hospitals, such as those serving rural areas, and may result in reduced services for patients and even facility closures. States will need support navigating this new funding landscape while also identifying cost-effective measures and strategies to address the health-related impacts of extreme heat.

Advancing Solutions that Safeguard America’s Health from Extreme Heat

Even in this newly constrained context, there are common-sense strategies to help people avoid extreme heat exposure. For example, access to safely cool indoor environments is one of the best preventative strategies for heat-related illness. In particular, Congress should create a demonstration pilot that provides eligible Medicare beneficiaries with cooling assistance and direct CMS to encourage Section 1115 demonstration waivers for HRSN related to extreme heat. Section 1115 waivers have enabled states to finance pilots for life-saving cooling devices and air filter distributions. These HRSN financing pilots have helped several states to work around the challenges of U.S. underinvestment in health and social services by providing a flexible vehicle to test methods of delivering and paying for healthcare services in Medicaid and CHIP. As Congress members explore these policies, they should consider the impact of H.R. 1's new requirements for proof of cost-neutrality in Section 1115 waivers.

To further support these efforts for heat interventions, Congress should direct CMS to continue Social Drivers of Health (SDOH) screenings as a part of Quality Reporting Programs and integrate questions about extreme heat exposure risks into the screening process. These screenings are critical for identifying the most vulnerable patients and directing them to the preventative services they need. This information will also be critical for identifying facilities that are treating high proportions of heat-vulnerable patients, which could then be sites for testing interventions like energy and housing assistance.

Congress should also direct CMS to integrate heat preparedness and resilience requirements and metrics into the Conditions of Participation (CoPs) and Conditions for Coverage (CfCs), such as through the Emergency Preparedness Rule. This could include assessing the cooling capacity of a health care facility under extreme heat conditions, back-up power that is sufficient to maintain safe indoor temperatures, and policies for resident evacuation in the event of high indoor temperatures. For safety net facilities, such as rural hospitals and federally qualified health centers, Congress should consider allocating resources for technical assistance to assess these risks and identify the needed infrastructure upgrades.

Impacts of Extreme Heat on Agriculture

Agriculture, food, and related industries produce nearly 90% of the food consumed in the United States and contribute approximately $1.54 trillion to the national GDP. Given the agricultural sector’s importance to the national economy, food security, and public health, Congress must pay attention to the impacts of extreme heat. To boost the resilience of this sector, Congress should design strategic insurance solutions, enhance research and data, and protect farmworkers through on-farm adaptation measures.  

Extreme Heat Reduces Farm Productivity and Profitability

Extreme heat threatens agricultural productivity by increasing crop damage, causing livestock illness and mortality, and worsening water scarcity. Hotter conditions can damage crops through sunburn and heat stress, reducing annual yields for farms by as much as 40%. Animals raised for meat, milk, and eggs also experience increased risks of heat stress and heat-related mortality. For dairy production in particular, an estimated 1% of total annual yield is lost to heat stress alone. Further straining agricultural productivity, extreme heat accelerates water scarcity by increasing water evaporation rates. These higher evaporation rates force farmers to use even more water, drawing often from already stressed water sources. The compounding pressures posed by extreme heat can translate into significant economic losses: a study of Kansas commodity farms found that for every 1°C (1.8°F) increase in temperature, net farm incomes drop by 66%. Together, these pressures mean reduced revenue for farms and less food available for people.

Insurance solutions can help mitigate these financial impacts from extreme heat if employed responsibly. Multiple permanently authorized federal programs provide insurance or direct payments to help producers recover losses from extreme heat, including the Federal Crop Insurance Program, the Noninsured Crop Disaster Assistance Program, the Livestock Indemnity Program, and the Emergency Assistance for Livestock, Honey Bees, and Farm-Raised Fish Program. These programs need to ensure that producers are adequately covered against heat-related impacts and incentivize practices that reduce the risk of heat-related damage. This in turn will reduce the fiscal exposure of federal farm risk management programs. Congress should call on the United States Department of Agriculture (USDA) to research the feasibility of incentivizing heat resilience through federal crop insurance rates. Congress should also consider insurance premium subsidies for producers who adopt practices that enhance heat resilience for crops and livestock.

Given the increasing stress of extreme heat on the water systems necessary to sustain agricultural production, the National Oceanic and Atmospheric Administration (NOAA) should build on its Weather, Water, and Climate Strategy and collaborate with USDA on a national water security strategy that accounts for current and future hotter temperatures. To enhance system-wide drought resilience, Congress can also appropriate funds to leverage existing USDA programs to support on-farm adoption of shade systems, effective water management, cover crops, and soil regeneration practices.

Finally, there are still notable knowledge gaps around extreme heat and its impacts on agriculture. These gaps include the long-term effects of higher temperatures on yields, farm input costs, and federal program spending. To address these information gaps and guide future research, Congress can direct the USDA Secretary to submit a report to Congress on the impacts of extreme heat on agriculture, farm input costs and losses, consumer prices, and the federal government's spending (e.g., federal insurance and direct payment programs for losses of agricultural products and the provision of Supplemental Nutrition Assistance Program (SNAP) benefits).

Extreme Heat Lowers Agricultural Workers’ Productivity and Exposes Them to Health Risks

Higher temperatures and resulting heat stress are endangering farmer and farmworker safety and reducing their overall productivity, impacting bottom lines. Farmworkers are essential to the American food system, yet they are among the most vulnerable to extreme heat, facing a 35 times greater risk of dying from heat-related illnesses than workers in other sectors. This risk is intensifying as the sector increasingly relies on H‑2A farmworkers, who are hired to fill persistent domestic farm labor shortages. In many regions, over 25% of certified H‑2A farmworkers are required to work when local average temperatures exceed 90°F, and counties with the highest concentrations of H‑2A workers often coincide with the hottest parts of the country. After the work day, many of these workers return to substandard employer-provided housing that lacks essential cooling or ventilation, preventing effective recovery from daily heat exposure and exacerbating heat-related health risks. On top of the health risks, these conditions make people less effective on the job, which translates to economy-wide impacts: heat-related labor productivity losses across the U.S. economy currently exceed $100 billion annually.

To address these risks, Congress should pass legislation requiring the Occupational Safety and Health Administration (OSHA) to finalize a federal heat standard that provides sufficient coverage for farming operations. In tandem with OSHA finalizing the standard, USDA should be funded to provide technical assistance to agricultural employers for tailoring heat illness prevention plans and implementing cost-effective interventions that improve working conditions while maintaining productivity. This should include support for agricultural employers to integrate heat awareness into workforce training, resources for safety equipment and education, and support for the addition of shade structures. Doing so would ensure that agricultural workers across both large and small-scale farming operations have access to essential protections, like shade, clean water, and breaks, as well as sufficient capacity to comply. Current funding streams that could receive an extreme heat infrastructure “plus-up” include the Environmental Quality Incentives Program and the Farm Service Agency’s microloans program. Lastly, Congress should also direct OSHA to continue implementing its National Emphasis Program on Heat, which enforces employers’ obligation to protect workers against heat illness or injury. OSHA should additionally review employers’ practices to ensure that H‑2A and other agricultural workers are protected from job or wage loss when extreme heat renders working conditions unsafe.

Clean Water: Protecting New York State Private Wells from PFAS

This memo responds to a policy need at the state level that originates in a lack of relevant federal data. The Environmental Protection Agency (EPA) has a learning agenda question that asks, “To what extent does EPA have ready access to data to measure drinking water compliance reliably and accurately?” This memo helps fill that gap because EPA does not measure drinking water from private wells.

Per- and polyfluoroalkyl substances (PFAS) are widely distributed in the environment, including, in many cases, contamination of private water wells. Given their links to numerous serious health consequences, initiatives to mitigate PFAS exposure among New York State (NYS) residents reliant on private wells were included among the priorities outlined in the annual State of the State address and have been proposed in state legislation. We therefore performed a scenario analysis exploring the impacts and costs of a statewide program testing private wells for PFAS and reimbursing the installation of point of entry treatment (POET) filtration systems where exceedances occur.

Challenge and Opportunity

Why care about PFAS? 

Per- and polyfluoroalkyl substances (PFAS), a class of chemicals containing millions of individual compounds, are of grave concern due to their association with numerous serious health consequences. A 2022 consensus study report by the National Academies of Sciences, Engineering, and Medicine categorized various PFAS-related health outcomes based on critical appraisal of existing evidence from prior studies. This committee of experts concluded that there is high confidence of an association between PFAS exposure and (1) decreased antibody response (a key aspect of immune function, including response to vaccines), (2) dyslipidemia (abnormal fat levels in one’s blood), (3) decreased fetal and infant growth, and (4) kidney cancer, and moderate confidence of an association between PFAS exposure and (1) breast cancer, (2) liver enzyme alterations, (3) pregnancy-induced high blood pressure, (4) thyroid disease, and (5) ulcerative colitis (an autoimmune inflammatory bowel disease).

Extensive industrial use has rendered these contaminants virtually ubiquitous in both the environment and humans, with greater than 95% of the U.S. general population having detectable PFAS in their blood. PFAS take years to be eliminated from the human body once exposure has occurred, earning their nickname as “forever chemicals.” 

Why focus on private drinking water? 

Drinking water is a common source of exposure. 

Drinking water is a primary pathway of human exposure. Combining both public and private systems, it is estimated that approximately 45% of U.S. drinking water sources contain at least one PFAS. Rates specific to private water supplies have varied depending on location and thresholds used. Sampling in Wisconsin revealed that 71% of private wells contained at least one PFAS and 4% contained levels of perfluorooctanoic acid (PFOA) or perfluorooctanesulfonic acid (PFOS), two common PFAS compounds, exceeding the EPA’s Maximum Contaminant Levels (MCLs) of 4 ng/L. Sampling in New Hampshire, meanwhile, found that 39% of private wells exceeded the state’s Ambient Groundwater Quality Standards (AGQS), which were established in 2019 and range from 11-18 ng/L depending on the specific PFAS compound. Notably, while the EPA MCLs represent legally enforceable levels accounting for the feasibility of remediation, the agency has also released health-based, non-enforceable Maximum Contaminant Level Goals (MCLGs) of zero for PFOA and PFOS.

PFAS in private water are unregulated and expensive to remediate. 

In New York State (NYS), nearly one million households rely on private wells for drinking water; despite this, there are currently no standardized well testing procedures, and effective well water treatment is unaffordable to many New Yorkers. As of April 2024, the EPA has established federal MCLs for several specific PFAS compounds and mixtures of compounds, and its National Primary Drinking Water Regulations (NPDWR) require public water systems to begin monitoring and publicly reporting levels of these PFAS by 2027; if monitoring reveals exceedances of the MCLs, public water systems must also implement solutions to reduce PFAS by 2029. In contrast, there are no standardized testing procedures or enforceable limits for PFAS in private water. Additionally, testing and remediating private wells both carry high costs that many well owners cannot afford: prices range in the hundreds of dollars for PFAS testing and can reach several thousand dollars for the installation and maintenance of effective filtration systems.

Figure 1. Distribution of 901,441 private wells across New York State

How are states responding to the problem of PFAS in private drinking water? 

Several states, including Colorado, New Hampshire, and North Carolina, have already initiated programs offering well testing and financial assistance for filters to protect against PFAS.

An opportunity exists to protect New Yorkers. 

Launching a program in New York similar to those initiated in Colorado, New Hampshire, and North Carolina was among the priority initiatives described by New York Governor Kathy Hochul in the annual State of the State she delivered in January 2025. In particular, Hochul’s plans to improve water infrastructure included “a pilot program providing financial assistance for private well owners to replace or treat contaminated wells.” This was announced along with a $500 million additional investment beyond New York’s existing $5.5 billion dedicated to water infrastructure, which will also be used to “reduce water bills, combat flooding, restore waterways, and replace lead service lines to protect vulnerable populations, particularly children in underserved communities.” In early 2025, the New York Legislature introduced Senate Bill S3972, which would establish an installation grant program and a maintenance rebate program for PFAS removal treatment. Bipartisan interest in protecting the public from PFAS-contaminated drinking water is further evidenced by a hearing focused on the topic held by the NYS Assembly in November 2024.

Though these efforts would likely initially be confined to a smaller pilot program with limited geographic scope, such a pilot would aim to inform a broader, statewide intervention. Challenges to planning an intervention of this scope include uncertainty surrounding both the total funding that would be allotted to such a program and its total costs. These costs will depend on factors such as the eligibility criteria employed by the state, the proportion of well owners who opt into sampling, and the proportion of tested wells found to have PFAS exceedances (which will further vary based on whether the state adopts the EPA MCLs or the NYS Department of Health MCLs, which are 10 ng/L for PFOA and PFOS). We address the uncertainty associated with these numerous possibilities by estimating the numbers of wells serviced and associated costs under various combinations of 10 potential eligibility criteria, 5 possible rates (5, 25, 50, 75, and 100%) of PFAS testing among eligible wells, and 5 possible rates (5, 25, 50, 75, and 100%) of PFAS exceedance and subsequent POET installation among wells tested.
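The scenario grid described above reduces to a simple enumeration. The sketch below illustrates the structure of the calculation only; the well count and unit costs are hypothetical placeholders, not the figures used in our analysis.

```python
from itertools import product

# Hypothetical unit costs (placeholders, not the memo's actual figures)
TEST_COST = 400    # per-well PFAS test, dollars
POET_COST = 3000   # per-well POET installation, dollars

def scenario_cost(eligible_wells, test_rate, exceed_rate):
    """Total program cost for one scenario.

    eligible_wells: wells meeting a given eligibility criterion
    test_rate:      fraction of eligible wells that opt into testing
    exceed_rate:    fraction of tested wells with PFAS exceedances
                    that proceed to POET installation
    """
    tested = eligible_wells * test_rate
    installed = tested * exceed_rate
    return tested * TEST_COST + installed * POET_COST

rates = [0.05, 0.25, 0.50, 0.75, 1.00]
eligible = 250_000  # hypothetical well count for one criterion

# 5 testing rates x 5 exceedance/installation rates = 25 scenarios
# per eligibility criterion (x 10 criteria = 250 in the full analysis)
grid = {(t, e): scenario_cost(eligible, t, e) for t, e in product(rates, rates)}
```

Each cell of the grid corresponds to one entry in the scenario analysis; varying the eligible well count across the 10 criteria fills out the full set of estimates.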

Table 1. Comparison of total cost and cost/year over a 5-year implementation period for 10 eligibility scenarios
Table 2. Overlap of disadvantaged and small communities with other eligibility criteria

Scenario Analysis 

Key findings

Plan of Action

New York State is already considering a PFAS remediation program (e.g., Senate Bill S3972). The 2025 draft of the bill directed the New York Department of Environmental Conservation to establish an installation grant program and a maintenance rebate program for PFAS removal treatment, and established general eligibility criteria and per-household funding amounts. To our knowledge, S3972 did not pass in 2025, but its program provides a strong foundation for potential future action. Our suggestions below resolve some gaps in S3972, including additional detail that could be followed by the implementing agency and overall cost estimates that could be used by the Legislature when considering overall financial impacts.

Recommendation 1. Remediate all disadvantaged wells statewide

We recommend including every well located within a census tract designated as disadvantaged (based on NYS Disadvantaged Community (DAC) criteria) and/or belonging to a household with annual income <$150,000, the eligibility criteria that protect the widest range of vulnerable New Yorkers. Using these criteria, we estimate a total program cost of approximately $833 million, or $167 million per year if the program were implemented over a 5-year period. Even accounting for the other projects the state will be undertaking at the same time, this annual cost falls well within the additional $500 million that the 2025 State of the State reports will be added in 2025 to the existing $5.5 billion state investment in water infrastructure.

Recommendation 2. Target disadvantaged census tracts and household incomes

Wells in DAC census tracts account for a variety of disadvantages. Including NYS DAC criteria helps to account for the heterogeneity of challenges experienced by New Yorkers by weighing statistically meaningful thresholds for 45 different indicators across several domains. These include factors relevant to the risk of PFAS exposure, such as land use for industrial purposes and proximity to active landfills.

Wells in low-income households account for cross-sectoral disadvantage. The DAC criteria alone are imperfect:

The inclusion of income-based criteria is useful in that financial strain is a universal indicator of resource constraint which can help to identify the most-in-need across every community. Further, including income-based criteria can widen the program’s eligibility criteria to reach a much greater proportion of well owners (Table 2). Finally, in contrast to the DAC criteria’s binary nature, income thresholds can be adjusted to include greater or fewer wells depending on final budget availability.

Recommendation 3. Alternatives to POETs might be more cost-effective and accessible

A final recommendation is for the state to maximize the breadth of its well remediation program by also offering reimbursements for point-of-use treatment (POUT) systems and for connecting to public water systems, not just for POET installations. While POETs are effective in PFAS removal, they require invasive changes to household plumbing and prohibitively expensive ongoing maintenance, two factors which may give well owners pause even if they are eligible for an initial installation rebate. Colorado’s PFAS TAP program models a less invasive and extremely cost-effective POUT alternative to POETs. We estimate that if NYS were to provide the same POUT filters as Colorado, the total cost of the program (using the recommended eligibility criteria of location within a DAC-designated census tract and/or belonging to a household with annual income <$150,000) would be $163 million, or $33 million per year across 5 years. This amounts to a total decrease in cost of nearly $670 million if POUTs were to be provided in place of POETs. Connection to public water systems, on the other hand, though a significant initial investment, provides an opportunity to streamline drinking water monitoring and remediation moving forward and eliminates the need for ongoing and costly individual interventions and maintenance. 
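The per-year and savings figures above follow directly from the scenario totals:

```python
# Scenario totals from our analysis (millions of dollars)
poet_total = 833   # POET-based program, recommended eligibility criteria
pout_total = 163   # POUT-based alternative modeled on Colorado's PFAS TAP
years = 5

poet_per_year = poet_total / years   # ~167 ($167 million per year)
pout_per_year = pout_total / years   # ~33 ($33 million per year)
savings = poet_total - pout_total    # 670 (nearly $670 million saved)
```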

Conclusion 

Well testing and rebate programs provide an opportunity to take preventative action against the serious health threats associated with PFAS exposure through private drinking water. Individuals reliant on PFAS-contaminated private wells for drinking water are likely to ingest the chemicals on a daily basis, so there is no time to waste in taking action to break this chain of exposure. New York State policymakers are already engaged in developing this policy solution; our recommendations can help both those making the policy and those tasked with implementing it to best serve New Yorkers. Our analysis shows that a program to mitigate PFAS in private drinking water is well within the scope of current action and that fair implementation of such a program can help those who need it most in a cost-effective manner.

Frequently Asked Questions
Why not regulate private wells at the federal level in the same manner as public water systems?

While the Safe Drinking Water Act regulates the United States’ public drinking water supplies, there is currently no federal authority to regulate private wells, and most states also lack regulation of private wells. Introducing new legislation to change this would require significant time and political will. Such political will is unlikely to materialize given resource limitations, concerns around well owners’ privacy, and the EPA’s current prioritization of deregulation.

If greater than 95% of the U.S. general population already has detectable PFAS in their blood, what’s the point in addressing this one source of exposure?

Decreasing blood serum levels is likely to decrease negative health impacts. Exposure via drinking water is particularly associated with elevated serum PFAS levels, while appropriate water filtration has demonstrated efficacy in reducing serum PFAS levels.

How were the total costs associated with each scenario determined?

We estimated total costs assuming that 75% of eligible wells are tested for PFAS and that of these tested wells, 25% are both found to have PFAS exceedances and proceed to have filter systems installed. This PFAS exceedance/POET installation rate was selected because it falls between the rates of exceedances observed when private well sampling was conducted in Wisconsin and New Hampshire in recent years.

The disadvantaged community criteria are specific to New York State. How might other states without similar criteria maximize the fairness of their programs using factors beyond income?

For states which do not have their own tools for identifying disadvantaged communities, the Social Vulnerability Index developed by the Centers for Disease Control and Prevention (CDC) and Agency for Toxic Substances and Disease Registry (ATSDR) may provide an alternative option to help identify those most in need.

Too Hot Not to Handle

Every region of the U.S. is experiencing year after year of record-breaking heat, and more households now require home cooling solutions to maintain safe and livable indoor temperatures. Over the last two decades, U.S. consumers and the private sector have leaned heavily into purchasing and marketing conventional air conditioning (AC) systems, such as central air conditioning, window units, and portable ACs, to cool down overheating homes.

While AC can offer immediate relief, its rapid scaling has created dangerous vulnerabilities: rising energy bills are straining people’s wallets and increasing utility debt, while surging electricity demand increases reliance on high-polluting power infrastructure and puts pressure on an aging power grid increasingly prone to blackouts. Elevated demand for electricity during a heat wave can overload the grid and trigger prolonged blackouts, causing whole regions to lose their sole cooling strategy. Such a disruption could escalate into a public health emergency as homes and people overheat, leading to hundreds of deaths.

To be prepared for more extreme temperatures, Americans need a resilient cooling strategy. Resilient cooling is an approach that works across three interdependent systems — buildings, communities, and the electric grid — to affordably maintain safe indoor temperatures during extreme heat events and reduce power outage risks.

Read the full report: Too Hot Not to Handle
Resilient Cooling Policy and Strategy Toolkit


This toolkit introduces a set of Policy Principles for Resilient Cooling and outlines a set of actionable policy options and levers for state and local governments to foster broader access to resilient cooling technologies and strategies. For example, states are the primary regulators of public utility commissions, architects of energy and building codes, and distributors of federal and state taxpayer dollars. Local governments are responsible for implementing building standards and zoning codes, enforcing housing and health codes, and operating public housing and retrofit programs that directly shape access to cooling. 

The Policy Principles for Resilient Cooling are:

By adopting a resilient cooling strategy, state and local policymakers can address today’s overlapping energy, health, and affordability crises, advance American-made innovation, and ensure their communities are prepared for the hotter decades ahead.

A Holistic Framework for Measuring and Reporting AI’s Impacts to Build Public Trust and Advance AI 

As AI becomes more capable and integrated throughout the United States economy, its growing demand for energy, water, land, and raw materials is driving significant economic and environmental costs, from increased air pollution to higher costs for ratepayers. A recent report projects that data centers could consume up to 12% of U.S. electricity by 2028, underscoring the urgent need to assess the tradeoffs of continued expansion. To craft effective, sustainable resource policies, we need clear standards for estimating data centers’ true energy needs and for measuring and reporting the specific AI applications driving their resource consumption. Local and state-level bills calling for more oversight of utility rates and impacts to ratepayers have received bipartisan support, and this proposal builds on that momentum.

In this memo, we draw on research proposing a holistic evaluation framework for characterizing AI’s environmental impacts, which establishes three categories of impacts arising from AI: (1) computing-related impacts; (2) immediate application impacts; and (3) system-level impacts. Concerns around AI’s computing-related impacts, e.g., energy and water use due to AI data centers and hardware manufacturing, have become widely known, and corresponding policy is starting to be put in place. However, AI’s immediate application and system-level impacts, which arise from the specific use cases to which AI is applied and the broader socio-economic shifts resulting from its use, remain poorly understood, despite their greater potential for societal benefit or harm.

To ensure that policymakers have visibility into the full range of AI’s environmental impacts, we recommend that the National Institute of Standards and Technology (NIST) oversee the creation of frameworks to measure them. These frameworks should rely on quantitative measurements of the computing- and application-related impacts of AI, along with qualitative data based on engagements with the stakeholders most affected by the construction of data centers. NIST should produce these frameworks through convenings that include academic researchers, corporate governance personnel, developers, utility companies, vendors, and data center owners, in addition to civil society organizations. Participatory workshops will yield new guidelines, tools, methods, protocols, and best practices to facilitate the evolution of industry standards for measuring the social costs of AI’s energy infrastructures.

Challenge and Opportunity 

Resource consumption associated with AI infrastructures is expanding quickly, and this has negative impacts, including asthma from air pollution associated with diesel backup generators, noise pollution, light pollution, excessive water and land use, and financial impacts to ratepayers. A lack of transparency regarding these outcomes, and of public participation to minimize them, risks losing the public’s trust, which in turn will inhibit the beneficial uses of AI. While there is a huge amount of capital expenditure and massive forecasted growth in power consumption, there remains a lack of transparency and scientific consensus around the measurement of AI’s environmental impacts with respect to data centers and their related negative externalities.

A holistic evaluation framework for assessing AI’s broader impacts requires empirical evidence, both qualitative and quantitative, to influence future policy decisions and establish more responsible, strategic technology development. Focusing narrowly on carbon emissions or energy consumption arising from AI’s computing related impacts is not sufficient. Measuring AI’s application and system-level impacts will help policymakers consider multiple data streams, including electricity transmission, water systems and land use in tandem with downstream economic and health impacts.

Regulatory and technical attempts so far to develop scientific consensus and international standards around the measurement of AI’s environmental impacts have focused on documenting AI’s computing-related impacts, such as the energy use, water consumption, and carbon emissions required to build and use AI. Measuring and mitigating AI’s computing-related impacts is necessary, and has received attention from policymakers (e.g., the introduction of the AI Environmental Impacts Act of 2024 in the U.S., provisions for environmental impacts of general-purpose AI in the EU AI Act, and data center sustainability targets in the German Energy Efficiency Act). However, research by Kaack et al. (2022) highlights that impacts extend beyond computing. AI’s application impacts, which arise from the specific use cases for which AI is deployed (e.g., emissions enabled by applying AI to oil and gas drilling), have much greater potential scope for positive or negative effects than AI’s computing impacts alone, depending on how AI is used in practice. Finally, AI’s system-level impacts, the even broader, cascading social and economic impacts associated with AI energy infrastructures, such as increased pressure on local utility infrastructure raising costs to ratepayers, or health impacts to local communities due to increased air pollution, have the greatest potential for positive or negative effects while being the most challenging to measure and predict. See Figure 1 for an overview.

Figure 1. Framework for assessing the impacts of AI

From Kaack et al. (2022). Effectively understanding and shaping AI’s impacts will require going beyond impacts arising from computing alone, and requires consideration and measurement of impacts arising from AI’s uses (e.g., in optimizing power systems or agriculture) and how AI’s deployment throughout the economy leads to broader systemic shifts, such as changes in consumer behavior.

Effective policy recommendations require more standardized measurement practices, a point raised by the Government Accountability Office’s recent report on AI’s human and environmental effects, which explicitly calls for increasing corporate transparency and innovation around technical methods for improved data collection and reporting. But data collection should also include multi-stakeholder engagement to ensure that evaluation frameworks meet the needs of specific localities, including state and local government officials, businesses, utilities, and ratepayers. Furthermore, while states and municipalities, including California, Indiana, Oregon, and Virginia, are introducing bills calling for more data transparency and responsibility, the lack of federal policy means that data center owners may move their operations to states that have fewer protections in place and similar levels of existing energy and data transmission infrastructure.

States are also grappling with the potential economic costs of data center expansion. Policy Matters Ohio found that tax breaks for data center owners are eroding tax revenue streams that should be used to fund public services. In Michigan, tax breaks for data centers are increasing the cost of water and power for the public while undermining the state’s climate goals. Some Georgia Republicans have stated that data center companies should “pay their way.” While there are arguments that data centers can provide useful infrastructure, connectivity, and even revenue for localities, a recent report shows that at least ten states each lost over $100 million a year in revenue to data centers because of tax breaks. The federal government can help create standards that allow stakeholders to balance the potential costs and benefits of data centers and related energy infrastructures. There is now an urgent need to increase transparency and accountability through multi-stakeholder engagement, maximizing economic benefits while reducing waste.

Despite the high economic and policy stakes, critical data needed to assess the full impacts—both costs and benefits—of AI and data center expansion remains fragmented, inconsistent, or entirely unavailable. For example, researchers have found that state-level subsidies for data center expansion may have negative impacts on state and local budgets, but this data has not been collected and analyzed across states because not all states publicly release data about data center subsidies. Other impacts, such as the use of agricultural land or public parks for transmission lines and data center siting, must be studied at a local and state level, and the various social repercussions require engagement with the communities who are likely to be affected. Similarly, estimates on the economic upsides of AI vary widely, e.g. the estimated increase in U.S. labor productivity due to AI adoption ranges from 0.9% to 15% due in large part to lack of relevant data on AI uses and their economic outcomes, which can be used to inform modeling assumptions.

Data centers are highly geographically clustered in the United States, more so than other industrial facilities such as steel plants, coal mines, factories, and power plants (Fig. 4.12, IEA World Energy Outlook 2024). This means that certain states and counties are experiencing disproportionate burdens associated with data center expansion. These burdens have led to calls for data center moratoriums or for the cessation of other energy development, including in states like Indiana. Improved measurement and transparency can help planners avoid overly burdensome concentrations of data center infrastructure, reducing local opposition. 

With a rush to build new data center infrastructure, states and localities must also face another concern: overbuilding. For example, Microsoft recently put a hold on parts of its data center contract in Wisconsin and paused another in central Ohio, along with contracts in several other locations across the United States and internationally. These situations often stem from inaccurate demand forecasting, prompting utilities to undertake costly planning and infrastructure development that ultimately goes unused. With better measurement and transparency, policymakers will have more tools to prepare for future demands, avoiding the negative social and economic impacts of infrastructure projects that are started but never completed. 

While there have been significant developments in measuring the direct, computing-related impacts of AI data centers, public participation is needed to fully capture many of their indirect impacts. Data centers can be constructed so they are more beneficial to communities while mitigating their negative impacts, e.g. by recycling data center heat, and they can also be constructed to be more flexible by not using grid power during peak times. However, this requires collaborative innovation and cross-sector translation, informed by relevant data. 

Plan of Action

Recommendation 1. Develop a database of AI uses and framework for reporting AI’s immediate applications in order to understand the drivers of environmental impacts. 

The first step towards informed decision-making around AI’s social and environmental impacts is understanding what AI applications are actually driving data center resource consumption. This will allow specific deployments of AI systems to be linked upstream to compute-related impacts arising from their resource intensity, and downstream to impacts arising from their application, enabling estimation of immediate application impacts. 

The AI company Anthropic demonstrated a proof-of-concept categorizing queries to their Claude language model under the O*NET database of occupations. However, O*NET was developed to categorize job types and tasks with respect to human workers, which does not exactly align with current and potential uses of AI. To address this, we recommend that NIST work with relevant collaborators, such as the U.S. Department of Labor (responsible for developing and maintaining the O*NET database), to develop a database of AI uses and applications, similar to and building off of O*NET, along with guidelines and infrastructure for reporting data center resource consumption corresponding to those uses. This data could then be used to identify the particular AI tasks that are key drivers of resource consumption.

Any entity deploying a public-facing AI model (that is, one that can produce outputs and/or receive inputs from outside its local network) should be able to easily document and report its use case(s) within the NIST framework. A centralized database will allow for collation of relevant data across multiple stakeholders including government entities, private firms, and nonprofit organizations. 

Gathering data of this nature may require the reporting entity to perform analyses of sensitive user data, such as categorizing individual user queries to an AI model. However, data is to be reported in aggregate percentages with respect to use categories without attribution to or listing of individual users or queries. This type of analysis and data reporting is well within the scope of existing, commonplace data analysis practices. As with existing AI products that rely on such analyses, reporting entities are responsible for performing that analysis in a way that appropriately safeguards user privacy and data protection in accordance with existing regulations and norms. 
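As a minimal sketch of the aggregate reporting described above: the use categories here are hypothetical stand-ins, not a NIST-specified taxonomy, and the classifier that labels each query is assumed to exist upstream.

```python
from collections import Counter

# Hypothetical use categories; a real taxonomy would come from the
# NIST / Department of Labor database proposed in Recommendation 1.
CATEGORIES = [
    "software_development",
    "customer_support",
    "content_generation",
    "data_analysis",
    "other",
]

def aggregate_report(query_labels):
    """Convert per-query category labels into the aggregate percentages
    a deploying entity would report. Only category shares leave this
    function -- no individual queries or user identifiers."""
    counts = Counter(query_labels)
    total = sum(counts.values())
    return {c: round(100 * counts.get(c, 0) / total, 1) for c in CATEGORIES}

# Example: labels produced by an upstream classifier over one reporting period
labels = (["software_development"] * 45
          + ["customer_support"] * 30
          + ["other"] * 25)
report = aggregate_report(labels)
```

Because only category-level percentages are emitted, the report satisfies the aggregate-only requirement while still letting a central database link resource consumption to categories of use.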

Recommendation 2. NIST should create an independent consortium to develop a system-level evaluation framework for AI’s environmental impacts, while embedding robust public participation in every stage of the work. 

Currently, the social costs of AI’s system-level impacts—the broader social and economic implications arising from AI’s development and deployment—are not being measured or reported in any systematic way. These impacts fall heaviest on the local communities that host the data centers powering AI: the financial burden on ratepayers who share utility infrastructure, the health effects of pollutants from backup generators, the water and land consumed by new facilities, and the wider economic costs or benefits of data-center siting. Without transparent metrics and genuine community input, policymakers cannot balance the benefits of AI innovation against its local and regional burdens. Building public trust through public participation is also key to securing United States energy dominance and national security interests in AI innovation, themes emphasized in policy documents produced by the first and second Trump administrations.

To develop evaluation frameworks in a way that is both scientifically rigorous and broadly trusted, NIST should stand up an independent consortium via a Cooperative Research and Development Agreement (CRADA). A CRADA allows NIST to collaborate rapidly with non-federal partners while remaining outside the scope of the Federal Advisory Committee Act (FACA), and has been used, for example, to convene the NIST AI Safety Institute Consortium. Membership will include academic researchers, utility companies and grid operators, data-center owners and vendors, state, local, Tribal, and territorial officials, technologists, civil-society organizations, and frontline community groups.

To ensure robust public engagement, the consortium should consult closely with FERC’s Office of Public Participation (OPP)—drawing on OPP’s expertise in plain-language outreach and community listening sessions—and with other federal entities that have deep experience in community engagement on energy and environmental issues. Drawing on these partners’ methods, the consortium will convene participatory workshops and listening sessions in regions with high data-center concentration—Northern Virginia, Silicon Valley, Eastern Oregon, and the Dallas–Fort Worth metroplex—while also making use of online comment portals to gather nationwide feedback.

Guided by the insights from these engagements, the consortium will produce a comprehensive evaluation framework that captures metrics falling outside the scope of direct emissions alone. These system-level metrics could encompass (1) the number, type, and duration of jobs created; (2) the effects of tax subsidies on local economies and public services; (3) the placement of transmission lines and associated repercussions for housing, public parks, and agriculture; (4) the use of eminent domain for data-center construction; (5) water-use intensity and competing local demands; and (6) public-health impacts from air, light, and noise pollution. NIST will integrate these metrics into standardized benchmarks and guidance.
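To make the six metric areas above concrete, a standardized reporting record might look something like the following sketch. All field names and units here are illustrative assumptions, not part of any existing NIST framework; the point is that each metric area maps to a small set of comparable, machine-readable fields per facility.

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class SystemLevelMetrics:
    """One hypothetical per-facility reporting record covering the six
    system-level metric areas. Field names and units are illustrative."""
    facility_id: str
    jobs_created: int                # (1) number of jobs
    jobs_duration_years: float       # (1) typical job duration
    tax_subsidy_usd: float           # (2) subsidies received
    transmission_miles_new: float    # (3) new transmission line mileage
    eminent_domain_parcels: int      # (4) parcels acquired via eminent domain
    water_use_megaliters: float      # (5) annual water consumption
    pm25_tons: Optional[float]       # (6) fine-particulate emissions, if measured

record = SystemLevelMetrics(
    facility_id="site-001",
    jobs_created=120,
    jobs_duration_years=2.5,
    tax_subsidy_usd=3.0e7,
    transmission_miles_new=12.0,
    eminent_domain_parcels=0,
    water_use_megaliters=450.0,
    pm25_tons=None,  # not yet measured at this hypothetical site
)
print(asdict(record)["jobs_created"])
```

A shared schema of this kind is what would let the centralized database described in Recommendation 1 collate data across government, private, and nonprofit reporters.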

Consortium members will attend public meetings, engage directly with community organizations, deliver accessible presentations, and create plain-language explainers so that non-experts can meaningfully influence the framework’s design and application. The group will also develop new guidelines, tools, methods, protocols, and best practices to facilitate industry uptake and to evolve measurement standards as technology and infrastructure grow.

We estimate a cost of approximately $5 million over two years to complete the work outlined in Recommendations 1 and 2, covering staff time, travel to at least twelve data-center or energy-infrastructure sites across the United States, participant honoraria, and research materials.

Recommendation 3. Mandate regular measurement and reporting on relevant metrics by data center operators. 

Voluntary reporting, via corporate Environmental, Social, and Governance (ESG) reports for example, is the status quo, but it has so far been insufficient for gathering the necessary data. For instance, although the technology firm OpenAI, best known for its highly popular ChatGPT generative AI model, holds a significant share of the search market, and likely a corresponding share of the environmental and social impacts arising from the data centers powering its products, it chooses not to publish ESG reports or any other data on its energy consumption or greenhouse gas (GHG) emissions. To collect sufficient data at the appropriate level of detail, reporting must be mandated at the local, state, or federal level. At the state level, California’s Climate Corporate Data Accountability Act (SB 253, as amended by SB 219) requires large companies operating in the state to report their GHG emissions in accordance with the GHG Protocol, administered by the California Air Resources Board (CARB). 

At the federal level, the EU’s Corporate Sustainability Reporting Directive (CSRD), which requires firms operating within the EU to report a wide variety of data related to environmental sustainability and social governance, could serve as a model for regulating companies operating within the U.S. The Environmental Protection Agency’s (EPA) GHG Reporting Program already requires emissions reporting by operators and suppliers associated with large GHG emission sources, and the Energy Information Administration (EIA) collects detailed data on electricity generation and fuel consumption through Forms EIA-860 and EIA-923. With respect to data centers specifically, the Department of Energy (DOE) could require that developers granted rights to build AI data center infrastructure on public lands perform the relevant measurement and reporting; more broadly, reporting could be made a qualification requirement for any local, state, or federal funding or assistance provided to support the buildout of U.S. AI infrastructure.

Recommendation 4. Incorporate measurements of social cost into AI energy and infrastructure forecasting and planning. 

Estimates of future data center energy use vary enormously, driven largely by uncertainty about the nature of demand from AI. This uncertainty stems in part from a lack of historical and current data on which AI use cases are most energy intensive and how those workloads are evolving over time. The extent to which challenges in bringing new resources online, such as hardware production limits or permitting bottlenecks, will influence growth rates also remains unclear. These uncertainties are even more significant for the holistic impacts (i.e., those beyond direct energy consumption) described above, making it challenging to balance costs and benefits when planning for future demand from AI. 

To address these issues, accurate forecasting of demand for energy, water, and other limited resources must incorporate data gathered through the holistic measurement frameworks described above. Further, forecasts of broader system-level impacts must be incorporated into decision-making around investment in AI infrastructure. Forecasting needs to go beyond energy use alone: models should also predict related infrastructure needs such as transmission, the social cost of carbon and other pollution, the effects on ratepayers, and the energy demands of chip production. 
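The shape of such a broadened forecast can be sketched with a toy scenario comparison. All numbers below are illustrative placeholders (not estimates from this memo or any agency): the sketch only shows how carbon damages and direct ratepayer costs could be rolled into a single social-cost figure per demand scenario, so scenarios can be compared on more than raw energy use.

```python
def scenario_social_cost(energy_twh, tco2_per_gwh, scc_usd_per_tco2, ratepayer_cost_usd):
    """Total social cost of one demand scenario: carbon damages from
    projected generation plus direct ratepayer costs. All inputs are
    illustrative placeholders, not real forecasts."""
    emissions_tco2 = energy_twh * 1_000 * tco2_per_gwh  # TWh -> GWh, then tCO2
    return emissions_tco2 * scc_usd_per_tco2 + ratepayer_cost_usd

# Hypothetical low- and high-demand AI scenarios with the same grid mix
low = scenario_social_cost(energy_twh=300, tco2_per_gwh=350,
                           scc_usd_per_tco2=190, ratepayer_cost_usd=2e9)
high = scenario_social_cost(energy_twh=580, tco2_per_gwh=350,
                            scc_usd_per_tco2=190, ratepayer_cost_usd=5e9)
print(f"low: ${low:,.0f}  high: ${high:,.0f}")
```

A real model would disaggregate by region and add the other terms the text names (transmission buildout, health impacts, chip-production demand), but the additive structure is the same.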

We recommend that agencies already responsible for energy-demand forecasting, such as the Energy Information Administration at the Department of Energy, integrate data on the AI workloads driving data-center electricity use into their forecasting models, in line with the NIST frameworks developed above. Agencies specializing in social impacts, such as the Department of Health and Human Services in the case of health impacts, should model those impacts and communicate them to EIA and DOE for planning purposes. In parallel, the Federal Energy Regulatory Commission (FERC) should update its new rule on long-term regional transmission planning to explicitly consider the social costs of energy supply, demand, and infrastructure retirement and buildout across different scenarios.

Recommendation 5. Transparently use federal, state, and local incentive programs to reward data-center projects that deliver concrete community benefits.  

Incentive programs should be tied to holistic estimates of costs and benefits collected under the frameworks above, not to promises alone. When considering incentive programs, policymakers should ask questions such as: How many jobs do data centers create, how long do those jobs last, and do they go to local residents? What tax revenue do data centers generate for municipalities or states, versus what subsidies do data center owners receive? What are the social impacts of using agricultural land or public parks for data center construction or transmission lines? What are the impacts on air quality and other public health concerns? Do data centers deliver benefits like load flexibility and sharing of waste heat? 

Grid operators (Regional Transmission Organizations [RTOs] and Independent System Operators [ISOs]) can leverage interconnection queues to push data center operators to demonstrate that they have sufficiently considered the impacts on local communities when proposing a new site. FERC recently approved reforms to the processing of the interconnection request queue, allowing RTOs to implement a “first-ready, first-served” approach rather than a first-come, first-served approach, wherein proposed projects can be fast-tracked based on their readiness. RTOs could use a similar approach to fast-track proposals that include a clear plan for how they will benefit local communities (e.g., through load flexibility, heat reuse, and clean energy commitments), grounded in careful impact assessment.
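The queue-ordering idea above is simple enough to sketch directly. This is a hypothetical illustration of the prioritization logic only; field names and scoring are invented, and a real RTO process would involve formal readiness milestones rather than a single score.

```python
def prioritize_queue(proposals):
    """Order interconnection requests 'first-ready, first-served':
    higher readiness first, ties broken in favor of proposals with a
    community-benefit plan, then by original submission order.
    Each proposal is a dict with illustrative fields."""
    return sorted(
        proposals,
        key=lambda p: (
            -p["readiness_score"],               # readier projects first
            not p["has_community_benefit_plan"], # benefit plans break ties
            p["submitted_order"],                # then original queue order
        ),
    )

queue = [
    {"name": "A", "readiness_score": 2, "has_community_benefit_plan": False, "submitted_order": 1},
    {"name": "B", "readiness_score": 3, "has_community_benefit_plan": True,  "submitted_order": 2},
    {"name": "C", "readiness_score": 3, "has_community_benefit_plan": False, "submitted_order": 3},
]
print([p["name"] for p in prioritize_queue(queue)])  # ['B', 'C', 'A']
```

Note that project A, though submitted first, sorts last: under first-ready ordering, submission date matters only among equally ready projects.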

State-level incentives could also be introduced in states with significant existing infrastructure. Such incentives could be determined in collaboration with the National Governors Association, which has been balancing AI-driven energy needs with state climate goals.

Conclusion 

Data centers have an undeniable impact on energy infrastructures and the communities living close to them. This impact will continue to grow alongside AI infrastructure investment, which is expected to skyrocket. It is possible to shape a future where AI infrastructure can be developed sustainably, and in a way that responds to the needs of local communities. But more work is needed to collect the necessary data to inform government decision-making. We have described a framework for holistically evaluating the potential costs and benefits of AI data centers, and shaping AI infrastructure buildout based on those tradeoffs. This framework includes: establishing standards for measuring and reporting AI’s impacts, eliciting public participation from impacted communities, and putting gathered data into action to enable sustainable AI development.

This memo is part of our AI & Energy Policy Sprint, a policy project to shape U.S. policy at the critical intersection of AI and energy. Read more about the Policy Sprint and check out the other memos here.

Frequently Asked Questions
If regulations happen at the state level, will investment just move to less regulated states?

Data centers are highly spatially concentrated largely due to reliance on existing energy and data transmission infrastructure; it is more cost-effective to continue building where infrastructure already exists, rather than starting fresh in a new region. As long as the cost of performing the proposed impact assessment and reporting in established regions is less than that of the additional overhead of moving to a new region, data center operators are likely to comply with regulations in order to stay in regions where the sector is established.


Spatial concentration of data centers also arises because workloads with high data transmission requirements, such as media streaming and online gaming, need close physical proximity to users to reduce transmission latency. For AI to be integrated into these real-time services, data center operators will need to maintain a presence in existing geographic regions, barring significant advances in data transmission efficiency and infrastructure.

Are these policies bad for economic growth? What about national security?

Unsustainable infrastructure growth is bad for national security and economic growth. So is infrastructure growth that harms the local communities in which it occurs.


Researchers from Good Jobs First have found that many states are in fact losing tax revenue to data center expansion: “At least 10 states already lose more than $100 million per year in tax revenue to data centers…” More data is needed to determine if data center construction projects coupled with tax incentives are economically advantageous investments on the parts of local and state governments.

What about recent efforts to build AI data centers on public lands?

The DOE is opening up federal lands in 16 locations to data center construction projects in the name of strengthening America’s energy dominance and ensuring America’s role in AI innovation. But national security concerns around data center expansion should also consider the impacts to communities who live close to data centers and related infrastructures.


Data centers themselves do not automatically ensure greater national security, especially because the critical minerals and hardware components of data centers depend on international trade and manufacturing. At present, the United States is not equipped to supply the critical minerals and other materials needed to produce data center hardware, including GPUs and other components.

If every state or locality has unique infrastructures and energy needs, then what is the point of a federal policy?

Federal policy ensures that states or counties do not become overburdened by data center growth and will help different regions benefit from the potential economic and social rewards of data center construction.


Developing federal transparency standards helps individual states plan for data center construction, allowing a high-level, comparative look at the energy demand associated with specific AI use cases. Federal intervention is also important because data centers in one state may have transmission lines running through a neighboring state, with outcomes that cross jurisdictions. A national-level standard is needed.

How will you weigh costs and benefits? What forms of data will you be collecting?

Producing reliable cost-benefit estimates is often extremely challenging. For example, while municipalities often expect economic benefits from data centers, and expect data center construction to yield more jobs in the area, subsidies and short-term construction jobs do not necessarily translate into economic gains.


To improve the ability of decision makers to do quality cost-benefit analysis, the independent consortium described in Recommendation 2 will examine both qualitative and quantitative data, including permitting histories, transmission plans, land use and eminent domain cases, subsidies, jobs numbers, and health or quality of life impacts in various sites over time. NIST will help develop standards in accordance with this data collection, which can then be used in future planning processes.

How could these changes benefit AI data centers?

There is also customer interest in knowing that their AI is sourced from firms implementing sustainable and socially responsible practices. These efforts can be used in marketing communications and reported as socially and environmentally responsible practice in ESG reports. This serves as an additional incentive for some data center operators to participate in voluntary reporting and to maintain operations in locations with increased regulation.

Advance AI with Cleaner Air and Healthier Outcomes

Artificial intelligence (AI) is transforming industries, driving innovation, and tackling some of the world’s most pressing challenges. Yet while AI has tremendous potential to advance public health, such as supporting epidemiological research and optimizing healthcare resource allocation, the public health burden of AI due to its contribution to air pollutant emissions has been under-examined. Energy-intensive data centers, often paired with diesel backup generators, are rapidly expanding and degrading air quality through emissions of air pollutants. These emissions exacerbate or cause various adverse health outcomes, from asthma to heart attacks and lung cancer, especially among young children and the elderly. Without sufficient clean and stable energy sources, the annual public health burden from data centers in the United States is projected to reach up to $20 billion by 2030, with households in some communities located near power plants supplying data centers, such as those in Mason County, WV, facing over 200 times greater burdens than others.

Federal, state, and local policymakers should act to accelerate the adoption of cleaner and more stable energy sources and to guide AI’s expansion in a way that aligns innovation with human well-being, advancing United States leadership in AI while ensuring clean air and healthy communities.

Challenge and Opportunity

Forty-six percent of people in the United States breathe unhealthy levels of air pollution. Ambient air pollution, especially fine particulate matter (PM2.5), is linked to 200,000 deaths each year in the United States. Poor air quality remains the nation’s fifth highest mortality risk factor, resulting in a wide range of immediate and severe health issues that include respiratory diseases, cardiovascular conditions, and premature deaths.

Data centers consume vast amounts of electricity to power and cool the servers running AI models and other computing workloads. According to the Lawrence Berkeley National Laboratory, the growing demand for AI is projected to increase data centers’ share of the nation’s total electricity consumption to as much as 12% by 2028, up from 4.4% in 2023. Without enough sustainable energy sources like nuclear power, the rapid growth of energy-intensive data centers is likely to exacerbate ambient air pollution and its associated public health impacts.

Data centers typically rely on diesel backup generators for uninterrupted operation during power outages. While the total operation time for routine maintenance of backup generators is limited, these generators can create short-term spikes in PM2.5, NOx, and SO2 that go beyond the baseline environmental and health impacts associated with data center electricity consumption. For example, diesel generators emit 200–600 times more NOx than natural gas-fired power plants per unit of electricity produced. Even brief exposure to high-level NOx can aggravate respiratory symptoms and hospitalizations. A recent report to the Governor and General Assembly of Virginia found that backup generators at data centers emitted approximately 7% of the total permitted pollution levels for these generators in 2023. Based on the Environmental Protection Agency’s COBRA modeling tool, the public health cost of these emissions in Virginia is estimated at approximately $200 million, with health impacts extending to neighboring states and reaching as far as Florida. In Memphis, Tennessee, a set of temporary gas turbines powering a large AI data center, which has not undergone a complete permitting process, is estimated to emit up to 2,000 tons of NOx annually. This has raised significant health concerns among local residents and could result in a total public health burden of $160 million annually. These public health concerns coincide with a paradigm shift that favors dirty energy and potentially delays sustainability goals.
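The Memphis figures above admit a rough back-of-envelope check. This is simple arithmetic on the cited numbers, not a substitute for exposure-based modeling like EPA's COBRA tool: dividing the estimated annual burden by the estimated annual emissions implies an average damage rate per ton of NOx, which in reality would vary widely with local population density and meteorology.

```python
# Back-of-envelope damage rate from the figures cited above:
# up to ~2,000 tons of NOx per year against an estimated ~$160M
# annual public health burden.
nox_tons_per_year = 2_000
annual_burden_usd = 160_000_000

damage_per_ton = annual_burden_usd / nox_tons_per_year
print(f"${damage_per_ton:,.0f} per ton of NOx")  # $80,000 per ton of NOx
```

That implied rate is an average over one heavily affected area; it should not be extrapolated to other sites without site-specific modeling.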

In 2023 alone, air pollution attributed to data centers in the United States resulted in an estimated $5 billion in health-related damages, a figure projected to rise up to $20 billion annually by 2030. This projected cost reflects an estimated 1,300 premature deaths in the United States per year by the end of the decade. While communities near data centers and power plants bear the greatest burden, with some households facing over 200 times greater impacts than others, the health impacts of these facilities extend to communities across the nation. The widespread health impacts of data centers further compound the already uneven distribution of environmental costs and water resource stresses imposed by AI data centers across the country.

While essential for mitigating air pollution and public health risks, transitioning AI data centers to cleaner backup fuels and stable energy sources such as nuclear power presents significant implementation hurdles, including lengthy permitting processes. Clean backup generators that match the reliability of diesel remain limited in real-world applications, and multiple key issues must be addressed to fully transition to cleaner and more stable energy.

Although it is clear that data centers pose public health risks, comprehensive evaluations of data center air pollution and its health impacts, essential to grasping the full extent of those harms, remain largely absent from current practice. Washington State conducted a health risk assessment of diesel particulate pollution from multiple data centers in the Quincy area in 2020, but most states lack similar evaluations for either existing or newly proposed data centers. To safeguard public health, it is essential to establish transparency frameworks, reporting standards, and compliance requirements for data centers, enabling the assessment of PM2.5, NOₓ, SO₂, and other harmful air pollutants, as well as their short- and long-term health impacts. These mechanisms would also equip state and local governments to make informed decisions about where to site AI data center facilities, balancing technological progress with the protection of community health nationwide.

Finally, limited public awareness, insufficient educational outreach, and a lack of comprehensive decision-making processes further obscure the potential health risks data centers pose to public health. Without robust transparency and community engagement mechanisms, communities housing data center facilities are left with little influence or recourse over developments that may significantly affect their health and environment. 

Plan of Action

The United States can build AI systems that not only drive innovation but also promote human well-being, delivering lasting health benefits for generations to come. Federal, state, and local policymakers should adopt a multi-pronged approach to address data center expansion with minimal air pollution and public health impacts, as outlined below. 

Federal-level Action

Federal agencies play a crucial role in establishing national standards, coordinating cross-state efforts, and leveraging federal resources to model responsible public health stewardship. 

Recommendation 1. Incorporate Public Health Benefits to Accelerate Clean and Stable Energy Adoption for AI Data Centers

Congress should direct relevant federal agencies, including the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and the Environmental Protection Agency (EPA), to integrate air pollution reduction and the associated public health benefits into efforts to streamline the permitting process for more sustainable energy sources, such as nuclear power, for AI data centers. Simultaneously, federal resources should be expanded to support research, development, and pilot deployment of alternative low-emission fuels for backup generators while ensuring high reliability.

Recommendation 2. Establish a Standardized Emissions Reporting Framework for AI Data Centers

Congress should direct the EPA, in coordination with the National Institute of Standards and Technology (NIST), to develop and implement a standardized reporting framework requiring data centers to publicly disclose their emissions of air pollutants, including PM₂.₅, NOₓ, SO₂, and other hazardous air pollutants associated with backup generators and electricity use.

State-level Action 

Recommendation 1. State environmental and public health departments should conduct a health impact assessment (HIA) before and after data center construction to evaluate discrepancies between anticipated and actual health impacts for existing and planned data center operations. To maintain and build trust, HIA findings, methodologies, and limitations should be publicly available and accessible to non-technical audiences (including policymakers, local health departments, and community leaders representing impacted residents), thereby enhancing community-informed action and participation. Reports should focus on the disparate impact between rural and urban communities, with particular attention to overburdened communities that have under-resourced health infrastructure. In addition, states should coordinate HIAs and share findings to address cross-boundary pollution risks. This includes accounting for nearby communities across state lines, since public health impacts do not stop at jurisdictional borders, and neither should analysis of them.

Recommendation 2. State public health departments should establish a state-funded program that offers community education forums for affected residents to express their concerns about how data centers impact them. These programs should emphasize leading outreach, engaging communities, and contributing to qualitative analysis for HIAs. Health impact assessments should be used as a basis for informed community engagement.

Recommendation 3. States should incorporate air pollutant emissions related to data centers into their implementation of the National Ambient Air Quality Standards (NAAQS) and the development of State Implementation Plans (SIPs). This ensures that affected areas can meet standards and maintain their attainment statuses. To support this, states should evaluate the adequacy of existing regulatory monitors in capturing emissions related to data centers and determine whether additional monitoring infrastructure is required.

Local-level Action

Recommendation 1. Local governments should revise zoning regulations to include stricter and more explicit health-based protections to prevent data center clustering in already overburdened communities. Additionally, zoning ordinances should address colocation factors and evaluate potential cumulative health impacts. A prominent example is Fairfax County, Virginia, which updated its zoning ordinance in September 2024 to regulate the proximity of data centers to residential areas, require noise pollution studies prior to construction, and establish size thresholds. These updates were shaped through community engagement and input.

Recommendation 2. Local governments should appoint public health experts to zoning boards so that data center placement decisions reflect community health priorities. 

Conclusion

While AI can revolutionize industries and improve lives, its energy-intensive nature is also degrading air quality through emissions of air pollutants. To mitigate AI’s growing air pollution and public health risks, a comprehensive assessment of AI’s health impact and transitioning AI data centers to cleaner backup fuels and stable energy sources, such as nuclear power, are essential. By adopting more informed and cleaner AI strategies at the federal and state levels, policymakers can mitigate these harms, promote healthier communities, and ensure AI’s expansion aligns with clean air priorities.

This memo is part of our AI & Energy Policy Sprint, a policy project to shape U.S. policy at the critical intersection of AI and energy. Read more about the Policy Sprint and check out the other memos here.