
Speed Grid Connection Using ‘Smart AI Fast Lanes’ and Competitive Prizes
Innovation in artificial intelligence (AI) and computing capacity is essential for U.S. competitiveness and national security. However, AI data center electricity use is growing rapidly. Data centers already consume more than 4% of U.S. electricity annually and could account for 6% to 12% of U.S. electricity by 2028. At the same time, electricity rates are rising for consumers across the country, with transmission and distribution infrastructure costs a major driver of these increases. For the first time in fifteen years, the U.S. is experiencing a meaningful increase in electricity demand. Data centers already consume more than 25% of the electricity in Virginia, which leads the world in data center installations. Data center load growth has real economic and environmental impacts for local communities. It is also a national policy test of how the U.S. responds to rising power demand from the electrification of homes, transportation, and manufacturing, technology transitions that are important for cutting carbon emissions and air pollution.
Federal and state governments need to ensure that the development of new AI and data center infrastructure does not increase costs for consumers, harm the environment, or exacerbate existing inequalities. “Smart AI Fast Lanes” is a policy and infrastructure investment framework that ensures the U.S. leads the world in AI while building an electricity system that is clean, affordable, reliable, and equitable. Leveraging innovation prizes that pay for performance, coupled with public-private partnerships, data center providers can work with the Department of Energy, the Foundation for Energy Security and Innovation (FESI), the Department of Commerce, National Laboratories, state energy offices, utilities, and the Department of Defense to drive innovation that increases energy security while lowering costs.
Challenge and Opportunity
Targeted policies can ensure that the development of new AI and data center infrastructure does not increase costs for consumers, harm the environment, or exacerbate existing energy burdens. Allowing new clean power sources co-located or contracted with AI computing facilities to connect to the grid quickly, and then managing any infrastructure costs associated with that new interconnection, would accelerate the addition of new clean generation for AI while lowering electricity costs for homes and businesses.
One of the biggest bottlenecks in many regions of the U.S. to adding much-needed capacity to the electricity grid is the so-called “interconnection queue”. Regions impose different requirements that power plants must complete (often a series of studies on how a project affects grid infrastructure) before they are allowed to connect. Solar, wind, and battery projects represented 95% of the capacity waiting in interconnection queues in 2023. The operator of Texas’s power grid, the Electric Reliability Council of Texas (ERCOT), uses a “connect and manage” interconnection process that results in faster interconnections of new energy supplies than in the rest of the country. Instead of requiring each power plant to complete lengthy studies of needed system-wide infrastructure investments before connecting to the grid, the “connect and manage” approach gets power plants online more quickly than a “studies first” approach; Texas manages any resulting risks through its power markets and system-wide planning efforts. The results are clear: the median time from an interconnection request to commercial operations in Texas was four years, compared to five years in New York and more than six and a half years in California.
“Smart AI Fast Lanes” expands the spirit of the Texas “connect and manage” approach nationwide for data centers and clean energy, and adds to it investment and innovation prizes to speed up the process, ensure grid reliability, and lower costs.
Data center providers would work with the Department of Energy, the Foundation for Energy Security and Innovation (FESI), the Department of Commerce, National Laboratories, state energy offices, utilities, and the Department of Defense to speed up interconnection queues, spur innovation in efficiency, and reinvest in infrastructure to increase energy security and lower costs.
Why FESI Should Lead ‘Smart AI Fast Lanes’
With FESI managing this effort, the process can move faster than the government acting alone. FESI is an independent, non-profit, agency-related foundation that was created by Congress in the CHIPS and Science Act of 2022 to help the Department of Energy achieve its mission and accelerate “the development and commercialization of critical energy technologies, foster public-private partnerships, and provide additional resources to partners and communities across the country supporting solutions-driven research and innovation that strengthens America’s energy and national security goals”. Congress has created many other agency-related foundations, such as the Foundation for NIH, the National Fish and Wildlife Foundation, and the National Park Foundation, which Congress chartered in 1967. These agency-related foundations have a demonstrated record of raising external funding to leverage federal resources and enabling efficient public-private partnerships. As a foundation supporting the mission of the Department of Energy, FESI has a unique opportunity to quickly respond to emergent priorities and create partnerships to help solve energy challenges.
As an independent organization, FESI can leverage the capabilities of the private sector, academia, philanthropies, and other organizations to enable collaboration with federal and state governments. FESI can also serve as an access point for additional external investment; shared-risk structures and clear rules of engagement make emerging energy technologies more attractive to institutional capital. For example, the National Fish and Wildlife Foundation awards grants that are matched with non-federal private, philanthropic, or local funding sources that multiply the impact of any federal investments. In addition, the National Fish and Wildlife Foundation has partnered with the Department of Defense and external funding sources to enhance coastal resilience near military installations. Both AI compute capabilities and energy resilience are of strategic importance to the Department of Defense, Department of Energy, and other agencies, and leveraging public-private partnerships is a key pathway to enhance capabilities and security. FESI leading a Smart AI Fast Lanes initiative could be a force multiplier, enabling rapid deployment of clean AI compute capabilities that are good for communities, companies, and national security.
Use Prizes to Lessen Cost and Maximize Return
The Department of Energy has long used prize competitions to spur innovation and accelerate access to funding and resources. Prize competitions with focused objectives but unstructured pathways to success enable the private sector to compete and advance innovation without requiring significant federal capacity and involvement. Federal prize programs pay for performance and results, while also providing a mechanism to crowd in additional philanthropic and private sector investment. In the Smart AI Fast Lanes framework, FESI could use prizes to support energy innovation from AI data centers while working with the Department of Energy’s Office of Cybersecurity, Energy Security, and Emergency Response (CESER) to enable a repeatable and scalable public-private partnership program. These prizes would be structured to require little administrative and operational effort from FESI itself, with other groups, such as the American-Made program, National Laboratories, or organizations like FAS, providing technical expertise to review and administer prize applications. This can ensure quality while enabling scalable growth.
Plan of Action
Here’s how “Smart AI Fast Lanes” would work. For any proposed data center investment of more than 250 MW, companies could apply to work with FESI. Successful applicants would leverage public, private, and philanthropic funds and technical assistance. Projects would be required to increase clean energy supplies, achieve world-leading data center energy efficiency, invest in transmission and distribution infrastructure, and/or deploy virtual power plants for grid flexibility.
Recommendation 1. Use a “Smart AI Fast Lane” Connection Fee to Quickly Connect to the Grid, Further Incentivized by a “Bring Your Own Power” Prize
New large AI data center loads choosing the “Smart AI Fast Lane” would pay a fee to connect to the grid without first completing lengthy pre-connection cost studies. Those payments would go into a fund, managed and overseen by FESI, that would be used to cover any infrastructure costs incurred by regional grids for the first three years after project completion. The fee could be flat, based on data center size, or structured as an auction, so that the data centers bidding highest in a region move to the front of the line. This lets the market prioritize the highest-value additions. Alternatively, large load projects could choose to do the studies first and remain in the regular, and likely slower, interconnection queue to avoid the fee.
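The auction variant of the connection fee can be sketched in a few lines. This is a hypothetical illustration of the mechanism described above, not a specified program design: projects bid for queue position, the highest bids connect first, and all bid revenue flows into the FESI-managed infrastructure fund. Project names and dollar figures are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class FastLaneBid:
    """One data center project's bid for fast-lane interconnection priority."""
    project: str
    size_mw: float
    bid_usd: float

def order_queue(bids: list[FastLaneBid]) -> tuple[list[FastLaneBid], float]:
    """Sort bids highest-first and total the revenue for the infrastructure fund."""
    queue = sorted(bids, key=lambda b: b.bid_usd, reverse=True)
    fund_total = sum(b.bid_usd for b in queue)
    return queue, fund_total

# Illustrative bids from three hypothetical projects in one region.
bids = [
    FastLaneBid("Project A", 300, 12_000_000),
    FastLaneBid("Project B", 500, 20_000_000),
    FastLaneBid("Project C", 250, 8_000_000),
]
queue, fund = order_queue(bids)
print([b.project for b in queue])  # ['Project B', 'Project A', 'Project C']
print(fund)                        # 40000000
```

A real auction would need rules the memo leaves open, such as regional capacity limits and refund terms for withdrawn projects, but the core ordering-plus-fund mechanic is this simple.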
In addition, FESI could facilitate a “Bring Your Own Power” prize award, a combination of public, private, and philanthropic funds that data center developers can match to contract for new, additional zero-emission electricity generated locally, covering twice the data center’s annual electricity use. For data centers committing to this “Smart AI Fast Lane” process, both the data center and the clean energy supply would receive accelerated priority in the interconnection queue and technical assistance from National Laboratories. This leverages economies of scale for projects, lowers the cost of locally-generated clean electricity, and gets clean energy connected to the grid faster. Prize resources would support a “connect and manage” interconnection approach by covering 75% of the costs of any infrastructure required to connect the local clean power projects. FESI prize resources could further supplement these payments to upgrade electrical infrastructure in areas of national need for new electricity supplies to maintain electricity reliability. These include areas assessed by the North American Electric Reliability Corporation to have a high risk of an electricity shortfall in the coming years, such as the Upper Midwest or Gulf Coast, or areas with an elevated risk, such as California, the Great Plains, Texas, the Mid-Atlantic, and the Northeast.
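The arithmetic behind the prize terms above can be made concrete. This is a minimal sketch assuming the two figures stated in the text (contracted clean generation equal to twice annual consumption, and a 75% prize cost share for connection infrastructure); the function name, load factor, and dollar amounts are illustrative, not program parameters.

```python
HOURS_PER_YEAR = 8760

def byop_requirements(data_center_mw: float, avg_load_factor: float,
                      infra_cost_usd: float) -> dict:
    """Estimate the clean-energy contract size and cost split under the
    sketched 'Bring Your Own Power' terms (2x annual use, 75% cost share)."""
    annual_use_mwh = data_center_mw * avg_load_factor * HOURS_PER_YEAR
    contracted_clean_mwh = 2 * annual_use_mwh       # twice annual consumption
    prize_share_usd = 0.75 * infra_cost_usd         # covered by prize resources
    developer_share_usd = infra_cost_usd - prize_share_usd
    return {
        "annual_use_mwh": annual_use_mwh,
        "contracted_clean_mwh": contracted_clean_mwh,
        "prize_share_usd": prize_share_usd,
        "developer_share_usd": developer_share_usd,
    }

# Example: a hypothetical 300 MW data center at 80% average load, with
# $40M of required local interconnection infrastructure.
result = byop_requirements(300, 0.80, 40_000_000)
print(result["contracted_clean_mwh"])  # 4204800.0 MWh of contracted clean energy
print(result["prize_share_usd"])       # 30000000.0 covered by the prize
```

Even at this rough level, the sizing shows why the 2× contracting requirement adds meaningful new clean supply to the regional grid beyond the data center’s own use.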
Recommendation 2. Create an Efficiency Prize To Establish World-Leading Energy and Water Efficiency at AI Data Centers
Data centers have different design configurations that affect how much energy and water are needed to operate. Data centers use electricity for computing, but also for the cooling systems needed for computing equipment, and there are innovation opportunities to increase the efficiency of both. One historical measure of AI data center energy efficiency is Power Usage Effectiveness (PUE), which is the total facility annual energy use divided by the computing equipment annual energy use, with values closer to 1.0 being more efficient. Similarly, Water Usage Effectiveness (WUE) is measured as total annual water use divided by the computing equipment annual energy use, with values closer to zero being more efficient. We should continue to push for improvement in PUE and WUE, but these are incomplete metrics for driving deep innovation because they do not reflect how much computing work is provided and do not assess impacts on the broader energy system. While multiple metrics for data center energy efficiency have been proposed over the past several years, what matters for innovation is improving how much AI computing work we get for the amount of energy and water used. Just as a car’s efficiency is measured in miles per gallon (MPG), we need to measure the “MPG” of how AI data centers perform work and then create incentives and competition for continuous improvement. There could be different metrics for different types of AI training and inference workloads, but a starting point could be tokens per kilowatt-hour of electricity used. A token is a word or portion of a word that AI foundation models use for analysis. Another approach would be to measure computing performance, or FLOPS, per kilowatt-hour. The more analysis an AI model or data center can perform using the same amount of energy, the more energy efficient it is.
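The three metrics discussed above are simple ratios, sketched below using the definitions in the text: PUE is total facility energy over computing equipment energy (closer to 1.0 is better), WUE is water use over computing equipment energy (closer to zero is better), and an “AI MPG” candidate is tokens per kilowatt-hour. All input numbers in the example are illustrative, not measured values from any real facility.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

def wue(water_liters: float, it_equipment_kwh: float) -> float:
    """Water Usage Effectiveness: liters of water per kWh of IT equipment energy."""
    return water_liters / it_equipment_kwh

def tokens_per_kwh(tokens_processed: float, total_facility_kwh: float) -> float:
    """A candidate 'AI MPG': useful work (tokens) per kWh of facility energy."""
    return tokens_processed / total_facility_kwh

# Example facility-year: 120 GWh total energy, 100 GWh of it for IT equipment,
# 180 million liters of water, 5 trillion tokens served.
print(round(pue(120e6, 100e6), 2))   # 1.2
print(round(wue(180e6, 100e6), 2))   # 1.8 (L/kWh)
print(tokens_per_kwh(5e12, 120e6))   # ~41,667 tokens per kWh
```

Note the asymmetry: PUE and WUE say nothing about how much computing work the facility delivered, which is exactly why a work-per-energy metric like tokens per kWh is needed to reward genuine efficiency gains rather than just cooling overhead reductions.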
FESI could deploy sliding-scale innovation prizes, based on data center size, for new facilities that demonstrate leading-edge AI data center MPG. These could be based on efficiency targets for tokens per kilowatt-hour, FLOPS per kilowatt-hour, top-performing PUE, or other metrics of energy efficiency. Similar prizes could be offered for water use efficiency, within different classes of cooling technologies, for facilities that exceed best-in-class performance. These prizes could be modeled after the Egg-Tech Prize run by FFAR, USDA’s agency-related foundation, a program that was easy to administer and highly successful. A secondary benefit of an efficiency innovation prize is continuous competition for improvement and open information about best-in-class data center facilities.

Fig. 1. Power Usage Effectiveness (PUE) and Water Usage Effectiveness (WUE) values for data centers. Source: LBNL 2024
Recommendation 3. Create Prizes to Maximize Transmission Throughput and Upgrade Grid Infrastructure
FESI could award prizes for rapid deployment of reconductoring, new transmission, or grid-enhancing technologies to increase the transmission capacity for any project in DOE’s Coordinated Interagency Authorizations and Permits Program. Similarly, FESI could award prizes for utilities to upgrade local distribution infrastructure beyond the direct needs of the project, reducing future electricity rate cases and keeping electricity costs affordable for residential customers. The Department of Energy already has authority to finance up to $2.5 billion through the Transmission Facilitation Program, a revolving fund administered by the Grid Deployment Office (GDO) that helps support transmission infrastructure. These funds could be used for public-private partnerships in national interest electric transmission corridors, or where necessary to accommodate an increase in electricity demand across more than one state or transmission planning region.
Recommendation 4. Develop Prizes That Reward Flexibility and End-Use Efficiency Investments
Flexibility in how and when data centers use electricity can meaningfully reduce the stress on the grid. FESI should award prizes to data centers that demonstrate best-in-class flexibility through smart controls and operational improvements. Prizes could also be awarded to utilities hosting data centers that reduce summer and winter peak loads in the local service territory. Prizes for utilities that meet home weatherization targets and deploy virtual power plants could help reduce costs and grid stress in local communities hosting AI data centers.
Conclusion
The U.S. is facing the risk of electricity demand outstripping supplies in many parts of the country, which would be severely detrimental to people’s lives, to the economy, to the environment, and to national security. “Smart AI Fast Lanes” is a policy and investment framework that can rapidly increase clean energy supply, infrastructure, and demand management capabilities.
It is imperative that the U.S. addresses the growing demand from AI and data centers so that it remains on the cutting edge of innovation in this important sector. How the U.S. approaches and solves the challenge of new demand from AI is a broader test of how the country prepares its infrastructure for increased electrification of vehicles, buildings, and manufacturing, and of how it addresses both carbon pollution and the impacts of climate change. The “Smart AI Fast Lanes” framework and FESI-run prizes will enable U.S. competitiveness in AI, keep energy costs affordable, reduce pollution, and prepare the country for new opportunities.
This memo is part of our AI & Energy Policy Sprint, a policy project to shape U.S. policy at the critical intersection of AI and energy. Read more about the Policy Sprint and check out the other memos here.