Using Home Energy Rebates to Support Market Transformation
Without market-shaping interventions, federal and state subsidies for energy-efficient products like heat pumps often lead to higher prices, leaving the overall market worse off when rebates end. This is a key challenge that must be addressed as the Department of Energy (DOE) and states implement the Inflation Reduction Act’s Home Electrification and Appliance Rebates (HEAR) program.
DOE should prioritize the development of evidence-based market-transformation strategies that states can implement with their HEAR funding. The DOE should use its existing allocation of administrative funds to create a central capability to (1) develop market-shaping toolkits and an evidence base on how state programs can improve value for money and achieve market transformation and (2) provide market-shaping program implementation assistance to states.
There are proven market-transformation strategies that can reduce costs and save consumers billions of dollars. DOE can look to the global public health sector for an example of what market-shaping interventions could do for heat pumps and other energy-efficient technologies. In that arena, the Clinton Health Access Initiative (CHAI) has shown how public funding can support market-based transformation, leading to sustainably lower drug and vaccine prices, new types of “all-inclusive” contracts, and improved product quality. Agreements negotiated by CHAI and the Bill and Melinda Gates Foundation have generated over $4 billion in savings for publicly financed health systems and improved healthcare for hundreds of millions of people.
Similar impact can be achieved in the market for heat pumps if DOE and states can supply information to empower consumers to purchase the most cost-effective products, offer higher rebates for those cost-effective products, and seek supplier discounts for heat pumps eligible for rebates.
Challenge and Opportunity
HEAR received $4.5 billion in appropriations from the Inflation Reduction Act and provides consumers with rebates to purchase and install high-efficiency electric appliances. Heat pumps, the primary eligible appliance, present a huge opportunity for lowering overall greenhouse gas emissions from heating and cooling, which make up over 10% of global emissions. In the continental United States, studies have shown that heat pumps can reduce carbon emissions by up to 93% compared to gas furnaces over their lifetime.
However, direct-to-consumer rebate programs have been shown to enable suppliers to increase prices unless the subsidies are used to reward innovation and reduce cost. If subsidies are disbursed without a program design aligned with a market-transformation strategy, the result will be a short-term boost in demand followed by a fall-off in consumer interest as prices increase and the rebates are no longer available. This is a problem because program funding for heat pump rebates will support only ~500,000 projects over the life of the program—but more than 50 million households will need to convert to heat pumps in order to decarbonize the sector.
HEAR aims to address this through Market Transformation Plans, which states are required to submit to DOE within a year after receiving the award. States will then need to obtain DOE approval before implementing them. We see several challenges with the current implementation of HEAR:
- Need for evidence: There is a lack of evidence and policy agreement on the best approaches for market transformation. The DOE provides a potpourri of areas for action, but no evidence of cost-effectiveness. Thus, there is no rational basis for states to allocate funding across the 10 recommended areas for action. There are no measurable goals for market transformation.
- Redundancy: It is wasteful and redundant to have every state program allocate administrative expenses to design market-transformation strategies incorporating some or all of the 10 recommended areas for action. There is nothing unique to Georgia, Iowa, or Vermont in creating a tool to better estimate energy savings. A best-in-class software tool developed by DOE or one of the states could be adapted for use in each state. Similarly, if a state develops insights into lower-cost ways to install heat pumps, these insights will be valuable in many other state programs. The best tools should be a public good made known to every state program.
Despite these challenges, DOE has a clear opportunity to increase the impact of HEAR rebates by providing program design support to states for market-transformation goals. To ensure a competitive market and better value for money, state programs need guidance on how to overcome barriers created by information asymmetry: HVAC contractors have a much better understanding of the technical and cost/benefit aspects of heat pumps than consumers do. Consumers cannot work with contractors to select a heat pump solution that represents the best value for money if they do not understand the technical performance of products and how operating costs are affected by the Seasonal Energy Efficiency Ratio (SEER), coefficient of performance, and utility rates. If consumers are not well informed, market outcomes will not be efficient. Currently, consumers do not have easy access to critical information such as the tradeoff between the higher upfront cost of a higher-SEER unit and the savings on monthly utility bills.
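To make the stakes of this information gap concrete, the sketch below works through the kind of comparison a consumer currently has to assemble on their own: the annual cooling cost implied by two different SEER ratings. Every number in it (cooling load, electricity rate, equipment premium, and the SEER values themselves) is a placeholder assumption for illustration, not program data.

```python
# Minimal illustration of the SEER / operating-cost tradeoff.
# All values are placeholder assumptions, not DOE or state program data.
ANNUAL_COOLING_LOAD_BTU = 36_000_000   # assumed annual cooling load for a mid-size home
ELECTRICITY_RATE = 0.18                # assumed utility rate, $/kWh

def annual_cooling_cost(seer: float) -> float:
    """Annual cooling cost for a unit rated at `seer` BTU of cooling per Wh of electricity."""
    kwh = ANNUAL_COOLING_LOAD_BTU / (seer * 1000)
    return kwh * ELECTRICITY_RATE

baseline_seer, upgrade_seer = 15.0, 18.0   # hypothetical options quoted by a contractor
savings = annual_cooling_cost(baseline_seer) - annual_cooling_cost(upgrade_seer)
equipment_premium = 600                    # assumed extra upfront cost of the higher-SEER unit
print(f"Annual savings: ${savings:,.0f}; simple payback: {equipment_premium / savings:.1f} years")
```

A benchmarking tool published by DOE or a state program could run this same arithmetic with real local loads and utility rates, which is exactly the information consumers lack today.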
Overcoming information asymmetry will also help lower soft costs, which is critical to lowering the overall cost of heat pumps. Studies conducted by New York State, the Solar Energy Industries Association, and DOE show that soft costs exceed 60% of project costs in some cases and have increased over the past 10 years.
There is still time to act, as thus far only a few states have received approval to begin issuing rebates and state market-transformation plans are still in the early stages of development.
Plan of Action
Recommendation 1. Establish a central market transformation team to provide resources and technical assistance to states.
To limit cost and complexity at the state level for designing and staffing market-transformation initiatives, the DOE should set up central resources and capabilities. This could either be done by a dedicated team within the Office of State and Community Energy Programs or through a national lab. Funding would come from the 3% of program funds that DOE is allowed to use for administration and technical assistance.
This team would:
- Collect, centralize, and publish heat pump equipment and installation cost data to increase transparency and consumer awareness of available options.
- Develop practical toolkits and an evidence base on how to achieve market transformation most cost-effectively.
- Provide market-shaping program design assistance to states to create and implement market transformation programs.
Data collection, analysis, and consistent reporting are at the heart of what this central team could provide states. The DOE data and tools requirements guide already asks states to provide information on the invoice, equipment and materials, and installation costs for each rebate transaction. It is critical that the DOE and state programs coordinate on how to collect and structure this data in order to benefit consumers across all state programs.
A central team could provide resources and technical assistance to State Energy Offices (SEOs) on how to implement market-shaping strategies in a phased approach.
Phase 1. Create greater price transparency and set benchmarks for pricing on the most common products supported by rebates.
The central market-transformation team should provide technical support to states on how to develop benchmarking data on prices available to consumers for the most common product offerings. Consumers should be able to evaluate pricing for heat pumps like they do for major purchases such as cars, travel, or higher education. State programs could facilitate these comparisons by having rebate-eligible contractors and suppliers provide illustrative bids for a set of 5–10 common heat pump installation scenarios, for example, installing a ductless mini-split in a three-bedroom home.
States should also require contractors to provide hourly rates for different types of labor, since installation costs are often ~70% of total project costs. Contractors should only be designated as recommended or preferred service providers (with access to HEAR rebates) if they are willing to share cost data.
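One lightweight way a state program might structure the illustrative bids and hourly-rate disclosures described above is as standardized records that can be compared across contractors and rolled up into benchmarks. The sketch below is a hypothetical schema; the field names, scenario label, and figures are assumptions, not a DOE or state data standard.

```python
from dataclasses import dataclass

@dataclass
class BenchmarkBid:
    """Hypothetical standardized record for an illustrative contractor bid."""
    scenario_id: str          # e.g., "ductless-mini-split-3br" (hypothetical scenario label)
    contractor_id: str
    equipment_cost: float     # USD
    materials_cost: float     # USD
    labor_hours: float
    labor_rate: float         # USD per hour
    seer2: float              # rated cooling efficiency of the quoted unit
    hspf2: float              # rated heating efficiency of the quoted unit

    @property
    def installed_cost(self) -> float:
        return self.equipment_cost + self.materials_cost + self.labor_hours * self.labor_rate

bid = BenchmarkBid("ductless-mini-split-3br", "contractor-042",
                   equipment_cost=4200, materials_cost=650,
                   labor_hours=16, labor_rate=95, seer2=17.0, hspf2=9.0)
print(f"Installed cost for {bid.scenario_id}: ${bid.installed_cost:,.0f}")
```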
In addition, the central market-transformation team could facilitate information-sharing and data aggregation across states to limit confusion and duplication of data. This will increase price transparency and limit the work required at the state level to find price information and integrate with product technical performance data.
Phase 2. Encourage price and service-level competition among suppliers by providing consumers with information on how to judge value for money.
A second way to improve market outcomes is by promoting competition. Price transparency supports this goal, but to achieve market transformation, programs need to go further and help consumers understand which products, specific to their circumstances, offer the best value for money.
In the case of a heat pump installation, this means taking account of fuel source, energy prices, house condition, and other factors that drive the overall value-for-money equation when pursuing improved energy efficiency. Again, information asymmetry is at play. Many energy-efficiency consultants and HVAC contractors offer to advise on these topics but have an inherent bias toward promoting their own products and services. There are no easily available public sources of reliable benchmark price/performance data for ducted and ductless heat pumps for homes ranging from 1,500 to 2,700 square feet, which would cover 75% of the single-family homes in the United States.
In contrast, the commercial building sector benefits from very detailed cost information published on virtually every type of building material and specialty trade procedure. Data from sources such as RSMeans provides pricing and unit-cost information for ductwork and electrical wiring, as well as mean hourly wage rates for HVAC technicians by region. Builders of newly constructed single-family homes use similar systems to estimate and manage the costs of every aspect of the new-construction process. But a homeowner seeking to retrofit a heat pump into an existing structure has none of this information. Since virtually all rebates are likely to go to retrofit installations, states and the DOE have a unique interest in making this market more competitive by developing and publishing cost/performance benchmarking data.
State programs have considerable leverage that can be used to obtain the information needed from suppliers and installers. The central market-transformation team should use that information to create a tool that provides states and consumers with estimates of potential bill savings from installation of heat pumps in different regions and under different utility rates. This information would be very valuable to low- and middle-income (LMI) households, who are to receive most of the funding under HEAR.
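A bill-savings estimator of the kind described above could start from a very simple seasonal model and later be refined with regional utility rates and climate data. The sketch below shows only the core arithmetic under stated assumptions; the furnace efficiency, seasonal heat pump COP, heating load, and prices are all placeholders, and this is not a validated savings calculator.

```python
BTU_PER_THERM = 100_000
BTU_PER_KWH = 3_412

def annual_heating_bill_savings(heating_load_btu: float,
                                gas_price_per_therm: float,
                                electricity_price_per_kwh: float,
                                furnace_efficiency: float = 0.92,
                                heat_pump_cop: float = 3.0) -> float:
    """Estimated annual savings from replacing a gas furnace with a heat pump.

    All inputs are assumptions supplied by the caller; seasonal COP in particular
    varies widely by climate and equipment."""
    furnace_cost = heating_load_btu / furnace_efficiency / BTU_PER_THERM * gas_price_per_therm
    heat_pump_cost = heating_load_btu / (heat_pump_cop * BTU_PER_KWH) * electricity_price_per_kwh
    return furnace_cost - heat_pump_cost

# Hypothetical home with a 60 MMBtu annual heating load and illustrative energy prices.
savings = annual_heating_bill_savings(60_000_000,
                                      gas_price_per_therm=1.50,
                                      electricity_price_per_kwh=0.14)
print(f"Estimated annual heating bill savings: ${savings:,.0f}")
```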
Phase 3. Use the rebate program to lower costs and promote best-value products by negotiating product and service-level agreements with suppliers and contractors and awarding a higher level of rebate to installations that represent best value for money.
By subsidizing and consolidating demand, SEOs will have significant bargaining power to achieve fair prices for consumers.
First, by leveraging relationships with public and private sector stakeholders, SEOs can negotiate agreements with best-value contractors, offering guaranteed minimum volumes in return for discounted pricing and/or longer warranty periods for participating consumers. This is especially important for LMI households, which have limited home-improvement budgets and experience disproportionately high energy burdens, factors that have limited heat pump uptake among these households to date. In return, contractors gain access to a guaranteed number of additional projects that can offset the seasonal nature of the business.
Second, as states design the formulas used to distribute rebates, they should be encouraged to allocate a higher proportion of the rebate to projects quoted at or below the benchmark cost and a smaller proportion, or no rebate at all, to projects quoted above it. This will incentivize contractors to offer better value for money, since most projects will not proceed unless they receive a substantial rebate. States should also adopt a process similar to New York’s and Wisconsin’s of creating a list of approved contractors that adhere to “reasonable price” thresholds.
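There are many ways to tie rebate levels to benchmarks; the sketch below shows one possible tiered schedule simply to make the mechanism concrete. The tiers, percentages, and dollar amounts are illustrative assumptions, not HEAR program rules.

```python
def rebate_amount(quoted_cost: float, benchmark_cost: float,
                  base_rebate: float = 8_000) -> float:
    """One possible tiered rebate schedule keyed to a benchmark cost.

    Quotes at or below the benchmark receive the full rebate; quotes up to 20%
    above it receive a reduced rebate; anything higher receives none.
    The tiers and amounts are illustrative, not HEAR program rules."""
    ratio = quoted_cost / benchmark_cost
    if ratio <= 1.0:
        return base_rebate
    if ratio <= 1.2:
        return base_rebate * 0.5
    return 0.0

benchmark = 14_000  # hypothetical benchmark installed cost for a common scenario
for quote in (13_500, 15_500, 18_000):
    print(f"Quote ${quote:,}: rebate ${rebate_amount(quote, benchmark):,.0f}")
```

Publishing the schedule alongside the benchmark data would let contractors see, before quoting, how pricing above the benchmark shrinks the rebate their customer can receive.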
Recommendation 2. For future energy rebate programs, Congress and DOE can make market transformation more central to program design.
In future clean energy legislation, Congress should direct DOE to include the principles recommended above into the design of energy rebate programs, whether implemented by DOE or states. Ideally, that would come with either greater funding for administration and technical assistance or dedicated funding for market-transformation activities in addition to the rebate program funding.
For future rebate programs, DOE could take market transformation a step further by establishing benchmarking data for “fair and reasonable” prices from the beginning and requiring that, as part of their applications, states must have service-level agreements in place to ensure that only contractors that are at or below ceiling prices are awarded rebates. Establishing this at the federal level will ensure consistency and adoption at the state level.
Conclusion
The DOE should prioritize funding evidence-based market transformation strategies to increase the return on investment for rebate programs. Learning from U.S.-funded programs for global public health, a similar approach can be applied to the markets for energy-efficient appliances that are supported under the HEAR program. Market shaping can tip the balance towards more cost-effective and better-value products and prevent rebates from driving up prices. Successful market shaping will lead to sustained uptake of energy-efficient appliances by households across the country.
There is compelling evidence that federal and state subsidies for energy-efficient products can lead to price inflation, particularly in the clean energy space. The federal government has offered tax credits in the residential solar space for many years. While there has been a 64% reduction in the ex-factory photovoltaic module price for residential panels, the total residential installed cost per kWh has increased. The soft costs, including installation, have increased over the same period and are now ~65% or more of total project costs.
In 2021, a National Bureau of Economic Research study linked consumer subsidies to firms charging higher prices in the market for Chinese cell phones. The researchers found that introducing competition for eligibility, through techniques such as commitments to price ceilings, mitigated price increases and in some cases even lowered prices, creating more consumer surplus. This research, along with the price increases observed after solar tax credits, shows the risks of government subsidies without market-shaping interventions and their likely detrimental long-term impacts.
CHAI has negotiated over 140 agreements for health commodities supplied to low- and middle-income countries (LMICs) with over 50 different companies. These market-shaping agreements have generated $4 billion in savings for health systems and touched millions of lives.
For example, CHAI collaborated with Duke University and Bristol Myers Squibb to combat hepatitis C, which affects 71 million people, 80% of whom live in LMICs, mostly in Southeast Asia and Africa [see footnote]. The approval in 2013 of two new antiviral drugs transformed treatment in high-income countries, but the drugs were not marketed or affordable in LMICs. Through its partnerships and programming, CHAI was able to achieve initial pricing of $500 per treatment course for LMICs. Prices fell over the next six years to under $60 per treatment course while the cost in the West remained over $50,000 per treatment course. This was accomplished through ceiling price agreements and access programs with guaranteed volume considerations.
CHAI has also worked closely with the Bill and Melinda Gates Foundation to develop the novel market-shaping intervention called a volume guarantee (VG), where a drug or diagnostic test supplier agrees to a price discount in exchange for guaranteed volume (which will be backstopped by the guarantor if not achieved). Together, they negotiated a six-year fixed price VG with Bayer and Merck for contraceptive implants that reduced the price by 53% for 40 million units, making family planning more accessible for millions and generating $500 million in procurement savings.
Footnote: Hanafiah et al., “Global epidemiology of hepatitis C virus infection: New estimates of age-specific antibody to HCV seroprevalence,” J Hepatol (2013), 57(4):1333–1342; Gower E, Estes C, Blach S, et al., “Global epidemiology and genotype distribution of the hepatitis C virus infection,” J Hepatol (2014), 61(1 Suppl):S45–57; World Health Organization, Global Hepatitis Report 2017 (work conducted by the London School of Hygiene and Tropical Medicine).
Many states are in the early stages of setting up the program, so they have not yet released their implementation plans. However, New York and Wisconsin indicate which contractors are eligible to receive rebates through approved contractor networks on their websites. Once a household applies for the program, they are put in touch with a contractor from the approved state network, which they are required to use if they want access to the rebate. Those contractors are approved based on completion of training and other basic requirements such as affirming that pricing will be “fair and reasonable.” Currently, there is no detail about specific price thresholds that suppliers need to meet (as an indication of value for money) to qualify.
DOE’s Data and Tools Requirements document lays out the guidelines for states to receive federal funding for rebates. This includes transaction-level data that must be reported to the DOE monthly, including the specs of the home, the installation costs, and the equipment costs. Given that states already have to collect this data from contractors for reporting, this proposal recommends that SEOs streamline data collection and standardize it across all participating states, and then publish summary data so consumers can get an accurate sense of the range of prices.
There will be natural variation between homes, but by collecting a sufficient sample size and overlaying efficiency metrics such as the Seasonal Energy Efficiency Ratio, Heating Seasonal Performance Factor, and coefficient of performance, states will be able to gauge value for money. Rewiring America and other nonprofits have software that can quickly make these calculations to help consumers understand the return on investment for higher-efficiency (and higher-cost) heat pumps given their location and current heating/cooling costs.
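As an illustration of how standardized transaction records could be rolled up into benchmarks, the sketch below computes median installed cost per ton and median efficiency by state from a handful of made-up records. The column names and values are hypothetical, not the actual Data and Tools schema.

```python
import pandas as pd

# Hypothetical transaction-level records of the kind states already report to DOE;
# column names and figures are illustrative, not the actual reporting schema.
df = pd.DataFrame([
    {"state": "NY", "capacity_tons": 3.0, "seer2": 16.0, "installed_cost": 15_800},
    {"state": "NY", "capacity_tons": 2.0, "seer2": 18.0, "installed_cost": 12_400},
    {"state": "WI", "capacity_tons": 3.0, "seer2": 15.2, "installed_cost": 13_900},
    {"state": "WI", "capacity_tons": 4.0, "seer2": 17.5, "installed_cost": 19_300},
])

# Normalize installed cost by system capacity so homes of different sizes are comparable.
df["cost_per_ton"] = df["installed_cost"] / df["capacity_tons"]

# Benchmark view: median installed cost per ton and median efficiency by state.
benchmark = df.groupby("state").agg(
    median_cost_per_ton=("cost_per_ton", "median"),
    median_seer2=("seer2", "median"),
    projects=("installed_cost", "size"),
)
print(benchmark)
```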
In the global public health markets, CHAI has promoted price transparency for drugs and diagnostic tests by publishing market surveys that include product technical specifications and links to product performance studies. These surveys show the actual prices paid for similar products in different countries and by different procurement agencies. All of this information has helped public health programs migrate to best-in-class products and improve value for money. States could do the same to empower consumers to choose best-in-class, best-value products and contractors.
Driving Product Model Development with the Technology Modernization Fund
The Technology Modernization Fund (TMF) currently funds multiyear technology projects to help agencies improve their service delivery. However, many agencies abdicate responsibility for project outcomes to vendors, lacking the internal leadership and project development teams necessary to apply a product model approach focused on user needs, starting small, learning what works, and making adjustments as needed.
To promote better outcomes, TMF could make three key changes to help agencies shift from simply purchasing static software to acquiring ongoing capabilities that can meet their long-term mission needs: (1) provide education and training to help agencies adopt the product model; (2) evaluate investments based on their use of effective product management and development practices; and (3) fund the staff necessary to deliver true modernization capacity.
Challenge and Opportunity
Technology modernization is a continual process of addressing unmet needs, not a one-time effort with a defined start and end. Too often, when agencies attempt to modernize, they purchase “static” software, treating it like any other commodity, such as computers or cars. But software is fundamentally different. It must continuously evolve to keep up with changing policies, security demands, and customer needs.
Presently, agencies tend to rely on available procurement, contracting, and project management staff to lead technology projects. However, it is not enough to focus on the art of getting things done (project management); it is also critically important to understand the art of deciding what to do (product management). A product manager is empowered to make real-time decisions on priorities and features, including deciding what not to do, to ensure the final product effectively meets user needs. Without this role, development teams typically march through a vast, undifferentiated, unprioritized list of requirements, which is how information technology (IT) projects become unwieldy failures.
By contrast, the product model fosters a continuous cycle of improvement, essential for effective technology modernization. It empowers a small initial team with the right skills to conduct discovery sprints, engage users from the outset and throughout the process, and continuously develop, improve, and deliver value. This approach is ultimately more cost effective, results in continuously updated and effective software, and better meets user needs.
However, transitioning to the product model is challenging. Agencies need more than just infrastructure and tools to support seamless deployment and continuous software updates – they also need the right people and training. A lean team of product managers, user researchers, and service designers who will shape the effort from the outset can have an enormous impact on reducing costs and improving the effectiveness of eventual vendor contracts. Program and agency leaders, who truly understand the policy and operational context, may also require training to serve effectively as “product owners.” In this role, they work closely with experienced product managers to craft and bring to life a compelling product vision.
These internal capacity investments are not expensive relative to the cost of traditional IT projects in government, but they are currently hard to make. Placing greater emphasis on building internal product management capacity will enable the government to more effectively tackle the root causes that lead to legacy systems becoming problematic in the first place. By developing this capacity, agencies can avoid future costly and ineffective “modernization” efforts.
Plan of Action
The General Services Administration’s Technology Modernization Fund plays a crucial role in helping government agencies transition from outdated legacy systems to modern, secure, and efficient technologies, strengthening the government’s ability to serve the public. However, changes to TMF’s strategy, policy, and practice could incentivize the broader adoption of product model approaches and make its investments more impactful.
The TMF should shift from investments in high-cost, static technologies that will not evolve to meet future needs towards supporting the development of product model capabilities within agencies. This requires a combination of skilled personnel, technology, and user-centered approaches. Success should be measured not just by direct savings in technology but by broader efficiencies, such as improvements in operational effectiveness, reductions in administrative burdens, and enhanced service delivery to users.
While successful investments may result in lower costs, the primary goal should be to deliver greater value by helping agencies better fulfill their missions. Ultimately, these changes will strengthen agency resilience, enabling them to adapt, scale, and respond more effectively to new challenges and conditions.
Recommendation 1. The Technology Modernization Board, responsible for evaluating proposals, should:
- Assess future investments based on the applicant’s demonstrated competencies and capacities in product ownership and management, as well as their commitment to developing these capabilities. This includes assessing proposed staffing models to ensure the right teams are in place.
- Expand assessment criteria for active and completed projects beyond cost savings, to include measurements of improved mission delivery, operational efficiencies, resilience, and adaptability.
Recommendation 2. The TMF Program Management Office, responsible for stewarding investments from start to finish, should:
- Educate and train agencies applying for funds on how to adopt and sustain the product model.
- Work with the General Services Administration’s 18F to incorporate TMF project successes and lessons learned into a continuously updated product model playbook for government agencies that includes guidance on the key roles and responsibilities needed to successfully own and manage products in government.
- Collaborate with the Office of Personnel Management (OPM) to ensure that agencies have efficient and expedited pathways for acquiring the necessary talent, utilizing appropriate assessments to identify and onboard skilled individuals.
Recommendation 3. Congress should:
- Encourage agencies to set up their own working capital funds under the authorities outlined in the TMF legislation.
- Explore the barriers to product model funding in the current budgeting and appropriations processes for the federal government as a whole and develop proposals for making them fit for purpose.
- Direct OPM to reduce procedural barriers that hinder swift and effective hiring.
Conclusion
The TMF should leverage its mandate to shift agencies towards a capabilities-first mindset. Changing how the program educates, funds, and assesses agencies will build internal capacity and deliver continuous improvement. This approach will lead to better outcomes, both in the near and long terms, by empowering agencies to adapt and evolve their capabilities to meet future challenges effectively.
Congress established TMF in 2018 “to improve information technology, and to enhance cybersecurity across the federal government” through multiyear technology projects. Since then, more than $1 billion has been invested through the fund across dozens of federal agencies in four priority areas.
How to Build Effective Digital Permitting Products in Government
The success of historic federal investments in climate resilience, clean energy, and new infrastructure hinges on the government’s ability to efficiently permit, site, and build projects. Many of these projects are subject to the National Environmental Policy Act (NEPA), which dictates the procedures agencies must use to anticipate environmental, social, and economic impacts of potential actions.
Agencies use digital tools throughout the permitting process for a variety of tasks, including permit data collection and application development, analysis, surveys, impact assessments, public comment processing, and post-permit monitoring. However, many of the technology tools presently used in NEPA processes are fragmented and opaque, and they lack user-friendly features. Investments in permitting technology (such as software, decision support tools, data standards, and automation) could reduce the long timelines that plague environmental review. In fact, the Council on Environmental Quality’s (CEQ) recent report to Congress highlights the “tremendous potential” for technology to improve the efficiency and responsiveness of environmental review.
The Permitting Council, a federal agency focused on improving the “transparency, predictability, and outcomes” of federal permitting processes, recently invested $30 million in technology projects at various agencies to “strengthen the efficiency and predictability of environmental review.” Agencies are also investing in their own technology tools aimed at improving various parts of the environmental review process. As just one example, the Department of Energy’s Coordinated Interagency Transmission Authorizations and Permits (CITAP) Program recently released a new web portal designed to create more seamless communication between agencies and applicants.
Yet permitting innovation is still moving slowly, and not all agencies have dedicated funding to develop the technology tools they need for permitting. We recently wrote a case study about the Department of Transportation’s Freight Logistics Optimization Works (FLOW) project to illustrate how agency staff can make progress in developing technology without large upfront investments of funding or staff time. FLOW is a public-private partnership that supports transportation industry users in anticipating and responding to supply chain disruptions. Andrew Petritin, whom we interviewed for our case study, was a member of the team that co-created this digital product with users.
In a prior case study, Jennifer Pahlka and Allie Harris identified strategies that contributed to DOT FLOW’s success in building a great digital product in government. Here, we expand on a subset of these strategies and how they can be applied to build great digital products in the permitting sector. We also point to several permitting technology efforts that have benefited from independently applying similar strategies to demonstrate how agencies with permitting responsibilities can approach building digital products. These case studies and insights serve as inspiration for how agencies can make positive change even when substantive barriers exist.
Make data function as a compass, not a grade.
Here is an illustrative example of how data can be used as a compass to inform decisions and provide situational awareness to customers.
The National Telecommunications and Information Administration (NTIA) recently launched a Permitting Mapping Tool to support grantees and others in deploying infrastructure by identifying permitting requirements and potential environmental impacts. This is a tool that both industry and the public can use to see the permitting requirements for a geographic location. The data gathered and shared through this tool is not intended to assess performance; rather, it is used to provide an understanding of the landscape to support decision making.
NTIA staff recognized the potential value of the Federal Communications Commission’s (FCC) existing map of broadband serviceable locations to users in the permitting process and worked to combine it with other available information to support decision making. According to NTIA staff, NTIA’s in-house data analysts started prototyping mapping tools to see how they could better support their customers by using the FCC’s information about broadband serviceable locations. They first overlaid federal land management agency boundaries, showing other agencies where deployments will be required on federal lands in remote and unserved areas, where agencies might not have many staff to process permits. The team then pulled in hundreds of publicly available data sources to illustrate where deployments will occur on state and Tribal lands and in or near protected environmental resources, including wetlands, floodplains, and critical habitats, before releasing the application on NTIA’s website with an instructional video. Notably, NTIA staff were able to make substantial progress before receiving Permitting Council funds to support grant applicants in using the environmental screening to improve the efficiency of categorical exclusion processing.
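For readers interested in the mechanics, the sketch below shows the general pattern of overlay analysis described here (tagging point locations with the land-management or environmental layers they fall within) using geopandas. The file names, layer schemas, and the "AGENCY" field are placeholders, not NTIA's actual data sources or pipeline.

```python
import geopandas as gpd

# Hypothetical inputs; file names and schemas are placeholders, not NTIA's actual sources.
locations = gpd.read_file("broadband_serviceable_locations.geojson")
federal_lands = gpd.read_file("federal_land_boundaries.geojson")[["AGENCY", "geometry"]]
wetlands = gpd.read_file("wetlands.geojson")[["geometry"]]

# Put every layer in the same coordinate reference system before overlaying.
federal_lands = federal_lands.to_crs(locations.crs)
wetlands = wetlands.to_crs(locations.crs)

# Tag each location with the managing federal agency, if any, whose boundary contains it.
tagged = gpd.sjoin(locations, federal_lands, how="left", predicate="within")

# Separately flag locations that intersect mapped wetlands.
in_wetlands = gpd.sjoin(locations, wetlands, how="inner", predicate="intersects")

print(tagged["AGENCY"].value_counts(dropna=False))
print(f"{len(in_wetlands)} locations intersect mapped wetlands")
```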
Build trust. Trust allows you to co-create with your users. Understand your users’ needs, don’t solicit advice.
Recent recipients of Permitting Council grants for technology development have the opportunity to define their customers and work with them from day one to understand their needs. Rather than assuming their customers’ pain points, grant recipients can gather input from their customers and build the new technology to meet their needs. Recipients can learn from FLOW’s example by building trust early through direct collaboration. Strategies agencies can use to engage customers include defining user personas for their technology; facilitating user interviews to understand their needs; visiting field offices to meet their customers and learn how technology integrates into their work processes and environment; observing existing technologies to assess opportunities for improvement; and rapidly prototyping, testing, and iterating solutions with user feedback.
In the longer term, the Permitting Council and other funding entities can drive the adoption of a user-centered approach to technology development through their future grant requirements. By incorporating user research, user testing, and agile methodologies in their requests for proposals, the Permitting Council and others can set clear expectations for user involvement throughout the technology development process.
In comparison to DOT FLOW, where the customers are largely external to the federal government, the customers and stakeholders for permitting technology include internal federal employees with responsibilities for preparing, reviewing, and publishing NEPA documentation. But even if your end-users are within your organization (or even on your same team!), the principles of building trust, co-creating, and understanding user needs still apply.
Fight trade-off denial.
When approaching the complex process of permitting and developing technological tools to support customers, it is critical for teams to focus on a specific problem and prioritize user needs to develop a minimum viable product (MVP). A great example of this is the Department of Energy (DOE)’s Coordinated Interagency Transmission Authorizations and Permits Program (CITAP).
DOE collaborated with a development team at the National Renewable Energy Laboratory to create a new portal for interstate transmission applications requiring environmental review and compliance. The team applied a “user-centered, agile approach” to develop and deploy the new tool by the effective date for new CITAP regulations. The tool streamlines communication by allowing the project proponent to track the status of the permit, submit documentation, and communicate with DOE through the platform. Through iterative development, DOE plans to expand the system to include additional users, including cooperating agencies, and provide the ability for cooperating agencies to receive applicant-submitted materials. Deprioritizing these desired functions in the initial release required tradeoffs and a prioritization of user needs, but enabled the team to ultimately meet its deadline and provide near-term value to the public.
Prioritizing functionality and activities for permitting improvements can be challenging, but it is critical that agencies decide where to focus and start small with any technology development. More accessible data can help inform these trade-off decisions by providing an assessment of problem criticality and impact.
Don’t just reduce burden – provide value.
Our partners at EPIC recently wrote about the opportunity to operationalize rules and regulations in permitting technology. They discussed how AI could be applied to (1) help answer permitting questions using a database of rulings, guidelines, and past projects; (2) verify that permits and analyses comply with up-to-date legal requirements; and (3) automatically apply legal updates impacting permitting procedures to analyses. These examples illustrate how improving permitting technology can not only reduce burdens on the permitting workforce but also provide value by offering decision support tools.
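To make the first of those ideas concrete, the sketch below shows the retrieval step such a question-answering aid might be built on: ranking stored passages by similarity to a question. It uses simple TF-IDF retrieval as a stand-in (a production system would likely pair stronger language models with review by permitting staff), and the passages are invented examples, not an actual rulings database.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical snippets standing in for a database of rulings, guidance, and past projects.
documents = [
    "Categorical exclusions may apply to routine maintenance of existing transmission lines.",
    "An environmental assessment is required when a project crosses designated critical habitat.",
    "Public comment periods for draft environmental impact statements must run at least 45 days.",
]

question = "How long must the public comment period be for a draft EIS?"

# Simple lexical retrieval: rank stored passages by cosine similarity to the question.
vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(documents)
question_vector = vectorizer.transform([question])
scores = cosine_similarity(question_vector, doc_matrix).ravel()

best = scores.argmax()
print(f"Most relevant passage (score {scores[best]:.2f}): {documents[best]}")
```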
Fund products, not projects.
The federal government often uses a project funding model for developing and modernizing technology. This approach provides different levels of funding based on a specific waterfall process step (e.g., requirements gathering, development, and operations and maintenance). While straightforward, this model provides little flexibility for iteration and little support for modernization and maintenance. Jen Pahlka, former U.S. Deputy Chief Technology Officer, recommends that the government move toward a product funding model, which acknowledges that software development never ends; rather, there is ongoing work to improve technology over time. This requires steady levels of funding and has implications for talent.
Permitting teams should consider these different models when developing new technology and tools. Whether procuring technology or developing it in-house, teams should think about how they can support long-term technology development and hire employees with the knowledge, skills, and abilities to effectively manage the technology. Where relevant, agencies should seek to fund products. While product funding models may seem onerous at first, they are likely to have lower costs and enable teams to respond more effectively to user needs over time.
Several existing resources support product development in government. The 18F unit, part of the General Services Administration (GSA)’s Technology Transformation Services (TTS), helps federal agencies build, share, and buy technology products. 18F offers a number of tools to support agencies with product management. GSA’s IT Modernization Centers of Excellence can support agency staff in using a product-focused approach. The Centers focused on Contact Center, Customer Experience, and Data and Analytics may be most relevant for agencies building permitting technology. Finally, the U.S. Digital Service (USDS) “collaborates with public servants throughout the government”; their staff can assist with product, strategy, and operations as well as procurement and user experience. Agencies can also look to the private sector and NGOs for compelling examples of product development.
Looking forward
Agency staff can deploy tactics like those outlined above to quickly improve permitting technology using existing authorities and resources. But these tactics should complement, not substitute for, a longer-term systemic strategy for improving the U.S. permitting ecosystem. Center-of-government entities and individual agencies need to think holistically about shared user needs across processes and technologies. As CEQ stated in its report, where there are shared needs, there should be shared services. Government leadership must equip successful small-scale projects with the resources and guidance needed to scale effectively.
Additionally, the government needs to invest in developing effective permitting technology, hiring technical talent (product managers, developers, user researchers, data scientists) to support these efforts.
As the government continues to modernize to meet emerging challenges, it will need to adopt best practices from industry and compete for the talent to bring their visions to life. Sustained investment in interagency collaboration, talent, and training can shift the status quo from pockets of innovation (such as DOT FLOW and other examples highlighted here) to an innovation ecosystem guided by a robust, shared product strategy.
Scaling Proven IT Modernization Strategies Across the Federal Government
Ten years after the creation of the U.S. Digital Service (USDS) and 18F (an organization within the General Services Administration that helps other government agencies build, buy, and share technology products), the federal government still struggles to buy, build, and operate technology in a speedy, modern, scalable way. Cybersecurity remains a continuous challenge, in part due to the lack of modernization of legacy technology systems. As data fuels the next transformative modernization phase, the federal government has an opportunity to leverage modern practices to leap forward in scaling IT modernization.
While there have been success stories, like the IRS’s Direct File tool and electronic passport renewal, most government technology and delivery practices remain antiquated, and the replacement process remains too slow. Many obstacles to modernization have been removed in theory, yet in practice Chief Information Officers (CIOs) still struggle to exercise their authority to achieve meaningful results. Additionally, procurement and hiring processes, as well as insufficient modernization budgets, remain barriers.
The DoD failed to modernize its 25-year-old Defense Travel System (DTS) after spending $374 million, while the IRS relies on hundreds of outdated systems, including a key taxpayer data processing system built in the 1960s, with full replacement not expected until 2030. The GAO identified 10 critical systems across various agencies, ranging from 8 to 51 years old, that provide essential services like emergency management, health care, and defense and cost $337 million annually to operate and maintain; many of them use outdated code and unsupported hardware, posing major security and reliability risks. Despite the establishment of the Technology Modernization Fund (TMF) with a $1.23 billion appropriation, most TMF funds have been expended on a small number of programs, many of which did not solve legacy modernization problems. Meanwhile, the urgency of modernizing antiquated legacy systems to prevent service breakdowns continues to increase.
This memo proposes a new effort to rapidly scale proven IT modernization strategies across the federal government. The result will be a federal government with the structure and culture in place to buy, build, and deliver technology that meets the needs of Americans today and into the future.
Challenge and Opportunity
Government administrations typically arrive with a significant policy agenda and a limited management agenda. The management agenda often receives minimal focus until the policy agenda is firmly underway. As a result, the management agenda is rarely well implemented, if it is implemented at all. There are signs of progress in this area: the Biden-Harris administration published its management agenda in its first year, while the Trump administration did not publish its management agenda until its second year. However, even when the management agenda is published earlier, alignment, accountability, and senior White House and departmental leadership focus on the management agenda are far weaker than for the policy agenda.
Even when a President’s Management Agenda (PMA) has been published and alignment is achieved among all the stakeholders within the Executive Office of the President, the PMA is simply not a priority for departmental and agency leadership, and there is little focus on the PMA among Secretaries and Administrators. Each department and agency is responsible for a policy agenda, and unless IT or other management agenda items are core to the delivery of that policy agenda, as at the VA, departmental political leadership pays little attention to the PMA or related activities such as IT and procurement.
An administration’s failure to implement a management agenda and improve government operations jeopardizes the success of that administration’s policy agenda, as poor government technology inhibits successful implementation of many policies. This has been clear during the Biden-Harris administration, as departments have struggled to rapidly deliver IT systems to support loan, grant, and tax programs, sometimes delaying or slowing the implementation of those programs.
The federal government as a whole spends about 80% of its IT budget on maintenance of outdated systems—a percentage that is increasing, not declining. Successful innovations in federal technology and service delivery have not scaled, leaving pockets of success throughout the government that are constantly at risk of disappearing with changes in staff or leadership.
The Obama administration created USDS and 18F/Technology Transformation Services (TTS) to begin addressing the federal government’s technology problems through improved adoption of modern Digital Services. The Trump administration created the Office of American Innovation (OAI) to further advance government technology management. As adoption of AI accelerates, it becomes even more imperative for the federal government to close the technology gap between where we are and where we need to be to provide the government services that the American people deserve.
The Biden administration has adapted IT modernization efforts to address the pivot to AI by having groups like USDS, 18F/TTS, and the DoD Software Factories increasingly focus on data adoption and AI. With the Executive Order on AI and the consortium dedicated to AI safety, the Biden-Harris administration is establishing guidelines to adopt and properly govern the growing use of data and AI. These are positive signs for IT modernization, but these efforts need to deliver real productivity gains. Citizens’ expectations continue to increase: services that take months should take weeks, weeks should take days, and days should take hours. This level of improvement cannot be reached across the majority of government services until modernization occurs at scale. While multiple laws designed to enhance CIO authorities and accelerate digital transformation have been passed in recent years, departmental CIOs still do not have the tools to drive change, especially in large, federated departments where CIOs do not have substantial budget authority.
As the government’s digital transformation pivots to data, modernized agencies and departments can leap forward, while others remain stuck with antiquated systems, unable to derive value from their data. For more digitally mature agencies and departments, the pivot to data-driven decisions, automation, and AI offers the best chance for a leap in productivity and quality. AI will fuel the next opportunity to leap forward by shifting focus away from the process of delivering digital services (as those become the norm) and toward the data-based insights those services ingest and create. For the agencies and departments left behind, data-driven decisions, automation, and AI could drive rapid transformation and provide new tools for legacy system modernization.
The Department of Energy’s “Scaling IT Modernization Playbook” offers key approaches to scaling IT modernization: prioritizing mission outcomes, driving data adoption, coordinating at scale across government, and valuing speed and agility because “we underrate speed as value.” Government operations have become too complacent with slow processes and slow modernization; we are increasingly outpaced by faster-developing innovations. Essentially, Moore’s Law (Gordon Moore’s observation that the number of transistors in an integrated circuit doubles roughly every 18 months with minimal cost increase, since generalized to a variety of advanced technologies) is outpacing successful policy implementation.
As a result, the government and the public continue to struggle with dysfunctional legacy systems that make government services difficult to use under normal circumstances and can be crippling in a crisis. The solution is to boldly and rapidly scale emerging modernization efforts across the federal government enterprise, embracing the leap forward offered by data- and AI-fueled transformation.
Some departments have delivered notably successful modern systems, such as DHS’s Global Entry site and the State Department’s online passport renewal service. While these solutions are clearly less complex than the IRS’s tax processing system, which the IRS has struggled to modernize, they demonstrate that the government can deliver modern digital services under the right conditions.
Failed policy implementation due to failed technology implementation and modernization will continue until management and leadership practices associated with modern delivery are rapidly adopted at scale across government and efforts and programs are retained between administrations.
Plan of Action
Recommendation 1. Prioritize Policy Delivery through the Office of Management and Budget (OMB) and the General Services Administration (GSA)
First, the Administration should elevate the position of Federal CIO to be a peer to the Deputy Directors at OMB and move the Federal CIO outside of OMB, while remaining within the Executive Office of the President, to ensure that the Federal CIO, and therefore the IT and cybersecurity priorities and needs of the departments and agencies, has a true seat at the table. The Federal CIO represents positions that are as important as, but different from, those of the OMB Deputy Directors and the National Security Advisor, and therefore should be a peer to those individuals, just as CIOs are within departments and agencies, where they are required to report to the Secretary or Administrator. Second, the Administration should elevate the role of the GSA Administrator to a Cabinet-level position and formally recognize GSA as the federal government’s “Operations & Implementation” agency. These actions will effectively make the GSA Administrator the federal government’s Chief Operating Officer (COO). Policy, financial oversight, and governance will remain the purview of OMB; operations and implementation will become the responsibility of GSA, aligning existing GSA authorities over TTS, quality and proven shared services, acquisitions, and asset management with a renewed focus on mission-centric government service delivery. The GSA Administrator will collaborate with the President’s Management Council (PMC), OMB, and agency-level CIOs to coordinate policy delivery strategy with delivery responsibility, thereby integrating existing modernization and transformation efforts from the GSA Project Management Office (PMO) toward a common mission prioritizing rapid transformation.
For the government to improve government services, it needs high-level leaders charged with prioritizing operations and implementation—as a COO does for a commercial organization. Elevating the Federal CIO to an OMB Deputy Director and the GSA Administrator to a Cabinet-level position tasked with overseeing “Operations & Implementation” would ensure that management and implementation best practices go hand in hand with policy development, dramatically reducing the delivery failures that put even strong policy agendas at risk.
Recommendation 2. Guide Government Leaders with the Rapid Agency Transformation Playbook
Building on the success of the Digital Services Playbook, and influenced by the DOE’s “Scaling IT Modernization Playbook,” the Federal CIO should develop a set of “plays” for rapidly scaling technology and service delivery improvements across an entire agency. The Rapid Agency Transformation Playbook will act both as a guide to help agency leaders scale best practices and as a standard against which modernization efforts can be assessed. The government-wide “plays” will be based on practices that have proven successful in the private and public sectors and will address concepts such as fostering innovation, rapid transformation, data adoption, modernizing or sunsetting legacy systems, and continually improving work processes infused with AI. Where the Digital Services Playbook has helped successfully innovate practices in pockets of government, the Rapid Agency Transformation Playbook will help scale those successful practices across government as a whole.
A Rapid Agency Transformation Playbook will provide a living document to guide leadership and management, helping align policy implementation with policy content. The Playbook will also clearly lay out expected practices for Federal employees and contractors who collaborate on policy delivery.
Recommendation 3. Fuel Rapid Transformation by Creating Rapid Transformation Funds
Congress should create Rapid Transformation Funds (RTF) under the control of each Cabinet-level CIO, as well as the most senior-IT leader in smaller departments and independent agencies. These funds would be placed in a Working Capital Fund (WCF) that is controlled by the cabinet level CIO or the most senior IT leader in smaller departments and independent agencies. These funds must be established through legislation. For those departments that do not currently have a working capital fund under the control of the CIO, the legislation should create that fund, rather than depending on each department or agency to make a legislative request for an IT WCF.
This structure will give the CIO of each department or agency direct control of the funds. All RTFs must be under the control of the most senior IT leader in each organization, and the authority to manage these funds must not be delegable. By contrast, the TMF puts the funds under the control of GSA’s Office of the Federal Chief Information Officer (OFCIO) and a board that has to juggle priorities among GSA OCIO and the individual departments and agencies. Direct control will streamline decision making and fund disbursement. It will also create a carrot to complement the existing “stick” of Federal Information Technology Acquisition Reform Act (FITARA) authorities. In addition, Congress should evaluate how CIO authorities are measured under FITARA to ensure that CIOs have a true seat at the table.
The legislation will provide the CIO the authority to sweep both expiring and canceling funds into the new WCF. Seed funds in the amount of 10% of department/agency budgets will be provided to each department/agency. CIOs will have the discretion to distribute the funds for modernization projects throughout their department or agency and to determine payback model(s) that best suit their organization, including the option to reduce or waive payback for projects, while the overarching model will be cost reimbursement.
The RTF will enhance the CIO’s ability to drive change within their own organization. While Congress has expanded CIO authorities through legislation three times in recent years, no legislation has redirected funding to CIOs. Most cabinet-level CIOs control a single-digit percentage of their department’s IT budget; for example, the Department of Energy CIO directly controls about 5% of DOE’s IT spending. Direct control of a meaningfully sized pool of money that can be allocated to component IT teams enables cabinet-level CIOs to drive critical priorities, including modernization and security. Without funding, CIO authorities amount to unfunded mandates. The RTF will allow the CIO to enhance their authority by directly funding new initiatives. A reevaluation of the metrics associated with CIO authorities would ensure that CIOs have a true seat at the table.
Recommendation 4. Ensure transformation speed through continuity by establishing a Transformation Advisory Board and department/agency management councils.
First, OMB should establish a Transformation Advisory Board (TAB) within the Executive Office of the President (EOP), composed of senior and well-respected individuals who will be appointed to serve fixed terms not tied to the presidential administration and sponsored by the Federal CIO. The TAB will be chartered to impact management and technology policy across the government and make recommendations to change governance that impedes rapid modernization and transformation of government. Modeled after the Defense Innovation Board, the TAB will focus on entrenching rapid modernization efforts across administrations and on supporting, protecting, and enhancing existing digital-transformation capabilities. Second, each department and agency should be directed to establish a management council composed of leaders of the department/agency’s administrative functions to include at least IT, finance, human resources, and acquisition, under the leadership of the deputy secretary/deputy administrator. In large departments this may require creating a new deputy secretary or undersecretary position to ensure meaningful focus on the priorities, rather than simply holding meaningless council meetings. This council will ensure that collaborative management attention is given to departmental/agency administration and that leadership other than the CIO understand IT challenges and opportunities.
A Transformation Advisory Board will ensure continuity across administrations and changes in agency leadership, preventing the loss of good practices and enabling successful transformative innovations to take root and grow without the breaks and gaps that accompany transitions. The management council will ensure that modernization is a priority of departmental/agency leadership beyond the CIO.
Ann Dunkin contributed to an earlier version of this memo.
This idea was originally published on November 13, 2020; we’ve re-published this updated version on October 22, 2024.
While things have not changed as much as we would like, departments and agencies have made progress in modernizing their technology products and processes. At the same time, CIOs, who are responsible for technology delivery, are often siloed rather than part of a broad, holistic approach to operations and implementation. Elevating the GSA Administrator to the cabinet level, elevating the Federal CIO, adding a Transformation Advisory Board, reevaluating how CIO authorities are measured, creating departmental/agency management councils, and providing modernization funds directly to CIOs through working capital funds will give agencies and departments the management attention, expertise, support, and resources needed to scale and sustain that progress over time, and will provide coordinated focus on the government’s need to modernize IT.
Elevating the role of the Federal CIO and the GSA Administrator will provide more authority and attention for the President’s Management Agenda, thereby aligning policy content with policy implementation. Providing CIOs with a direct source of modernization funding will allow them to direct funds to the most critical projects throughout their organizations, as well as require adherence to standards and best practices. A new focus on successful policy delivery aided by experienced leaders will drive modernization of government systems that rely on dangerously outdated technology.
We believe that an administration that embraces the proposal outlined here will see scaling innovation as critical. Establishing a government COO, elevating the Federal CIO, creating an appointed board that spans administrations, standing up departmental management councils, measuring CIO authorities more effectively, and providing direct funding to CIOs will dramatically increase the likelihood that improved technology and service delivery remain a priority for future administrations.
The federal government has many pockets of innovation that have proven modern methodologies can and do work in government. These pockets of innovation—including USDS, GSA TTS, 18F, the U.S. Air Force Software Factories, fellowships, the Air Force Works Program (AFWERX), Defense Advanced Research Projects Agency (DARPA), and others—are inspiring. It is time to build on these innovations, coordinate their efforts under a U.S. government COO and empowered Federal CIO, and scale solutions to modernize the government as a whole.
Yes. A cabinet-level chief operating officer with top-level executive authority over policy operations and implementation is needed to carry out policy agendas effectively. It is hard to imagine a high-performing organization without a COO and a focus on operations and implementation at the highest level of leadership.
The legacy of any administration is based on its ability to enact its policy agenda and its ability to respond to national emergencies. Scaling modernization across the government is imperative if policy implementation and emergency response is important to the president.
Mobilizing Innovative Financial Mechanisms for Extreme Heat Adaptation Solutions in Developing Nations
Global heat deaths are projected to increase by 370% if direct action is not taken to limit the effects of climate change. The dire implications of rising global temperatures extend across a spectrum of risks, from health crises exacerbated by heat stress, malnutrition, and disease, to economic disparities that disproportionately affect vulnerable communities in the U.S. and in low- and middle-income countries. In light of these challenges, it is imperative to prioritize a coordinated effort at both national and international levels to enhance resilience to extreme heat. This effort must focus on developing and implementing comprehensive strategies to ensure the vulnerable developing countries facing the worst and disproportionate effects of climate change have the proper capacity for adaptation, as wealthier, developed nations mitigate their contributions to climate change.
To address these challenges, the U.S. Agency for International Development (USAID) should mobilize finance through environmental impact bonds focused on scaling extreme heat adaptation solutions. USAID should build upon the success of the SERVIR joint initiative and expand it to include a partnership with the National Integrated Heat Health Information System (NIHHIS) to co-develop decision support tools for extreme heat. Additionally, the Bureau for Resilience, Environment, and Food Security (REFS) within USAID should take the lead in tracking and reporting on climate adaptation funding data. This effort will enhance transparency and ensure that adaptation and mitigation efforts are effectively prioritized. By addressing the urgent need for comprehensive adaptation strategies, we can mitigate the impacts of climate change, increase resilience through adaptation, and protect the most vulnerable communities from the increasing threats posed by extreme heat.
Challenge
Over the past 13 months, temperatures have hit record highs, with much of the world having just experienced its warmest June on record. Berkeley Earth predicts a 95% chance that 2024 will rank as the warmest year in history. Extreme heat drives interconnected impacts across multiple risk areas, including public health, food insecurity, health care system costs, climate migration, and the growing transmission of life-threatening diseases.
Thus, as global temperatures continue to rise, resilience to extreme heat becomes a crucial element of climate change adaptation, necessitating a strategic federal response on both domestic and international scales.
Inequitable Economic and Health Impacts
Despite contributing least to global greenhouse gas emissions, low- and middle-income countries experience economic losses from excess heat that are four times higher than those of their wealthier counterparts. The countries likely to suffer the most are those with the most humidity, i.e., tropical nations in the Global South. Two-thirds of global exposure to extreme heat occurs in urban areas in the Global South, where there are fewer resources to mitigate and adapt.
The health impacts associated with increased global extreme heat events are severe, with projections of up to 250,000 additional deaths annually between 2030 and 2050 due to heat stress, alongside malnutrition, malaria, and diarrheal diseases. The direct cost to the health sector could reach $4 billion per year, with 80% of the cost shouldered by Sub-Saharan Africa. On the whole, low- and middle-income countries (LMICs) in the Global South experience a higher portion of adverse health effects from increasing climate variability despite their minimal contributions to global greenhouse gas emissions, underscoring a clear global inequity challenge.
This imbalance points to a crucial need for a focus on extreme heat in climate change adaptation efforts and the overall importance of international solidarity in bolstering adaptation capabilities in developing nations. It is more cost-effective to prepare localities for extreme heat now than to deal with the impacts later. However, most communities do not have comprehensive heat resilience strategies or effective early warning systems due to the lack of resources and the necessary data for risk assessment and management — reflected by the fact that only around 16% of global climate financing needs are being met, with far less still flowing to the Global South. Recent analysis from Climate Policy Initiative, an international climate policy research organization, shows that the global adaptation funding gap is widening, as developing countries are projected to require $212 billion per year for climate adaptation through 2030. The needs will only increase without direct policy action.
Opportunity: The Role of USAID in Climate Adaptation and Resilience
As the primary federal agency responsible for helping partner countries adapt to and build resilience against climate change, USAID announced multiple commitments at COP28 to advance climate adaptation efforts in developing nations. In December 2023, following COP28, Special Presidential Envoy for Climate John Kerry and USAID Administrator Power announced that 31 companies and partners have responded to the President’s Emergency Plan for Adaptation and Resilience (PREPARE) Call to Action and committed $2.3 billion in additional adaptation finance. Per the State Department’s December 2023 Progress Report on President Biden’s Climate Finance Pledge, this funding level puts agencies on track to reach President Biden’s pledge of working with Congress to raise adaptation finance to $3 billion per year by 2024 as part of PREPARE.
USAID’s Bureau for Resilience, Environment, and Food Security (REFS) leads the implementation of PREPARE. USAID’s entire adaptation portfolio was designed to contribute to PREPARE and align with the Action Plan released in September 2022 by the Biden Administration. USAID has further committed to better integrating adaptation in its Climate Strategy for 2022 to 2030 and established a target to support 500 million people’s adaptation efforts.
This strategy is complemented by USAID’s efforts to spearhead international action on extreme heat at the federal level, with the launch of its Global Sprint of Action on Extreme Heat in March 2024. This program started with the inaugural Global Heat Summit and ran through June 2024, calling on national and local governments, organizations, companies, universities, and youth leaders to take action to help prepare the world for extreme heat, alongside USAID Missions, the International Federation of Red Cross and Red Crescent Societies (IFRC), and its 191 member National Societies. The executive branch was also advised to utilize the Guidance on Extreme Heat for Federal Agencies Operating Overseas and United States Government Implementing Partners.
On the whole, the USAID approach to climate change adaptation is aimed at predicting, preparing for, and mitigating the impacts of climate change in partner countries. The two main components of USAID’s approach to adaptation include climate risk management and climate information services. Climate risk management involves a “light-touch, staff-led process” for assessing, addressing, and adaptively managing climate risks in non-emergency development funding. The climate information services translate data, statistical analyses, and quantitative outputs into information and knowledge to support decision-making processes. Some climate information services include early warning systems, which are designed to enable governments’ early and effective action. A primary example of a tool for USAID’s climate information services efforts is the SERVIR program, a joint development initiative in partnership with the National Aeronautics and Space Administration (NASA) to provide satellite meteorology information and science to partner countries.
Additionally, as the flagship finance initiative under PREPARE, the State Department and USAID, in collaboration with the U.S. Development Finance Corporation (DFC), have opened an Adaptation Finance Window under the Climate Finance for Development Accelerator (CFDA), which aims to de-risk the development and scaling of companies and investment vehicles that mobilize private finance for climate adaptation.
Plan of Action
Recommendation 1: Mobilize private capital through results-based financing such as environmental impact bonds
Results-based financing (RBF) has long been a key component of USAID’s development aid strategy, offering innovative ways to mobilize finance by linking payments to specific outcomes. In recent years, Environmental Impact Bonds (EIBs) have emerged as a promising addition to the RBF toolkit and could serve as a mechanism for USAID to mobilize and scale novel climate adaptation solutions. Thus, in alignment with the PREPARE plan, USAID should launch an EIB pilot focused on extreme heat through the Climate Finance for Development Accelerator (CFDA), a $250 million initiative designed to mobilize $2.5 billion in public and private climate investments by 2030. An EIB piloted through the CFDA can help unlock public and private financing for the extreme heat adaptation solutions that are sorely needed.
With this EIB pilot, private-sector, government, and philanthropic investors raise the upfront capital, and repayment is contingent on the project’s success in meeting predefined goals. By distributing financial risk among these stakeholders, EIBs encourage investment in pioneering projects that might struggle to attract traditional funding due to their novel or unproven nature. This approach can effectively mobilize the resources needed to drive climate adaptation solutions.
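To make the repayment mechanics concrete, the sketch below shows one possible outcome-contingent payout schedule for an EIB of this kind; the principal, return rates, and achievement thresholds are purely illustrative assumptions, not terms of any actual or proposed bond.

```python
# Minimal sketch (all terms hypothetical): an outcome-contingent repayment
# schedule of the kind an environmental impact bond might use. Investors supply
# upfront capital; the outcome payer repays principal plus a return only to the
# degree that independently verified targets are met.

def eib_payout(principal, base_rate, bonus_rate, achievement):
    """achievement: share of verified KPI targets met, from 0.0 to 1.0."""
    if achievement < 0.5:
        # Below the floor, part of the principal is at risk (illustrative rule).
        return principal * (achievement / 0.5)
    # At or above the floor, investors recover principal plus a base coupon,
    # with a bonus that scales with performance above the floor.
    bonus = principal * bonus_rate * (achievement - 0.5) / 0.5
    return principal * (1 + base_rate) + bonus

for achievement in (0.3, 0.6, 1.0):
    payout = eib_payout(principal=10_000_000, base_rate=0.02,
                        bonus_rate=0.04, achievement=achievement)
    print(f"targets met: {achievement:.0%} -> payout: ${payout:,.0f}")
```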
The USAID EIB pilot should focus on scaling projects that facilitate uptake and adoption of affordable and sustainable cooling systems such as solar-reflective roofing and other passive cooling strategies. In Southeast Asia alone, annual heat-related mortality is projected to increase by 295% by 2030. Lack of access to affordable and sustainable cooling in the wake of record-shattering heat waves affects public health, food systems and supply chains, and local economies. An EIB that aims to fund and scale solar-reflective roofing (cool roofs) has the potential to generate high impact for the local population by lowering indoor temperatures, reducing energy use for air conditioning, and mitigating the heat island effect in surrounding areas. Indonesia, which is home to 46.5 million people at high risk from a lack of access to cooling, has seen notable success in deploying solar-reflective roofing through the Million Cool Roofs Challenge, an initiative of the Clean Cooling Collaborative. The country is now planning to scale production capacity of cool roofs and set up its first testing facility for solar-reflective materials to ensure quality and performance. Given Indonesia’s capacity and readiness, an EIB to scale cool roofs in Indonesia could be a force multiplier, helping this cooling mechanism reach millions and spurring new manufacturing and installation jobs for the local economy.
To mainstream EIBs and other innovative financial instruments, it is essential to pilot and explore more EIB projects. Cool roofs are an ideal candidate for scaling through an EIB due to their proven effectiveness as a climate adaptation solution, their numerous co-benefits, and the relative ease with which their environmental impacts can be measured (such as indoor temperature reductions, energy savings, and heat island index improvements). Establishing an EIB can be complex and time-consuming, but the potential rewards make the effort worthwhile if executed effectively. Though not exhaustive, the following steps are crucial to setting up an environmental impact bond:
Analyze ecosystem readiness
Before launching an environmental impact bond, it is crucial to analyze the capacity that already exists in a given country’s public and private sectors to implement an instrument like an EIB. Working with local civil society organizations is also important to ensure that climate adaptation projects and solutions are centered on the local community.
Determine the financial arrangement, scope, and risk sharing structure
Determine the financial structure of the bond, including the bond amount, interest rate, and maturity date. Establish a mechanism to manage the funds raised through the bond issuance.
Co-develop standardized, scientifically verified impact metrics and reporting mechanism
Develop a robust system for measuring and reporting the environmental impact of projects. With key stakeholders and partner countries, define key performance indicators (KPIs) to track and report progress.
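As an illustration of what this step could look like for a cool-roofs project, the sketch below computes two plausible KPIs, indoor temperature reduction and household energy savings, and checks them against targets; the measurements and thresholds are hypothetical.

```python
# Minimal sketch (hypothetical data and thresholds): verifying cool-roof KPIs
# before results are reported to bond stakeholders.

baseline = {"indoor_temp_c": 33.5, "monthly_kwh": 210.0}   # pre-installation survey
post     = {"indoor_temp_c": 29.8, "monthly_kwh": 174.0}   # post-installation survey

kpis = {
    "indoor_temp_reduction_c": baseline["indoor_temp_c"] - post["indoor_temp_c"],
    "energy_savings_pct": 100 * (baseline["monthly_kwh"] - post["monthly_kwh"])
                          / baseline["monthly_kwh"],
}
targets = {"indoor_temp_reduction_c": 2.0, "energy_savings_pct": 10.0}  # illustrative

report = {name: {"measured": round(value, 1),
                 "target": targets[name],
                 "met": value >= targets[name]}
          for name, value in kpis.items()}
print(report)
```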
USAID has already begun to incubate and pilot innovative financing mechanisms in the global health space through development impact bonds. The Utkrisht Impact Bond, for example, is the world’s first maternal and newborn health impact bond, which aims to reach up to 600,000 pregnant women and newborns in Rajasthan, India. Expanding the use case of this financing mechanism in the climate adaptation sector can further leverage private capital to address critical environmental challenges, drive scalable solutions, and enhance the resilience of vulnerable communities to climate impacts.
Recommendation 2: USAID should expand the SERVIR joint initiative to include a partnership with NIHHIS and co-develop decision support tools such as an intersectional vulnerability map.
Building on the momentum of Administrator Power’s recent announcement at COP28, USAID should expand the SERVIR joint initiative to include a partnership with NOAA, specifically with NIHHIS, the National Integrated Heat Health Information System. NIHHIS is an integrated information system supporting equitable heat resilience, which is an important area that SERVIR should begin to explore. Expanded partnerships could begin with a pilot to map regional extreme heat vulnerability in select Southeast Asian countries. This kind of tool can aid in informing local decision makers about the risks of extreme heat that have many cascading effects on food systems, health, and infrastructure.
Intersectional vulnerabilities related to extreme heat refer to the compounding impacts of various social, economic, and environmental factors on specific groups or individuals, including age, income and socioeconomic status, race/ethnicity, gender, and occupation. Understanding these intersecting vulnerabilities is crucial for developing effective strategies to address the disproportionate impacts of extreme heat. USAID should therefore partner with NIHHIS to develop an intersectional vulnerability map that improves decision-making related to extreme heat and helps tailor interventions and policies to the communities that need them most. The intersection between extreme heat and health, for example, is under-analyzed, and work in this area will contribute to expanding the evidence base.
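One simple way such a map could be constructed is as a weighted composite of normalized indicators; in the sketch below, the indicators, weights, and district values are illustrative assumptions rather than a validated index.

```python
# Minimal sketch (hypothetical indicators and weights): combining normalized
# heat-exposure and social-vulnerability indicators into a single composite
# score, one way a NIHHIS-SERVIR pilot could rank districts for intervention.

WEIGHTS = {            # illustrative weights, not a validated index
    "heat_index": 0.35,
    "pct_over_65": 0.15,
    "pct_low_income": 0.20,
    "pct_outdoor_workers": 0.15,
    "pct_without_cooling": 0.15,
}

def vulnerability_score(indicators):
    """Indicators are assumed to be normalized to a 0-1 scale."""
    return sum(WEIGHTS[k] * indicators[k] for k in WEIGHTS)

districts = {
    "District A": {"heat_index": 0.9, "pct_over_65": 0.3, "pct_low_income": 0.7,
                   "pct_outdoor_workers": 0.6, "pct_without_cooling": 0.8},
    "District B": {"heat_index": 0.7, "pct_over_65": 0.2, "pct_low_income": 0.4,
                   "pct_outdoor_workers": 0.3, "pct_without_cooling": 0.5},
}
for name, ind in sorted(districts.items(),
                        key=lambda kv: vulnerability_score(kv[1]), reverse=True):
    print(f"{name}: {vulnerability_score(ind):.2f}")
```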
The pilot can be modeled after the SERVIR-Mekong program, which produced 21 decision support tools throughout the span of the program from 2014-2022. The SERVIR-Mekong program led to the training of more than 1,500 people, the mobilization of $500,000 of investment in climate resilience activities, and the adoption of policies to improve climate resilience in the region. In developing these tools, engaging and co-producing with the local community will be essential.
Recommendation 3: USAID REFS and the State Department Office of Foreign Assistance should work together to develop a mechanism to consistently track and report climate funding flow. This also requires USAID and the State Department to develop clear guidelines on the U.S. approach to adaptation tracking and determination of adaptation components.
Enhancing analytical and data collection capabilities is vital for crafting effective and informed responses to the challenges posed by extreme heat. To this end, USAID REFS, along with the State Department Office of Foreign Assistance, should co-develop a mechanism to consistently track and report climate funding flows. Currently, neither USAID nor the State Department consistently reports funding data on direct and indirect climate adaptation foreign assistance. As the Department of State is required to report on its climate finance contributions annually for the Organisation for Economic Co-operation and Development (OECD) and biennially for the United Nations Framework Convention on Climate Change (UNFCCC), the two agencies should report on adaptation funding at similarly regular intervals and make this information accessible to the executive branch and the general public. A robust tracking mechanism can better inform and aid agency officials in prioritizing adaptation assistance and ensuring the U.S. fulfills its commitments and pledges to support global adaptation to climate change.
The State Department Office of Foreign Assistance (State F) is responsible for establishing standard program structures, definitions, and performance indicators, along with collecting and reporting allocation data on State and USAID programs. Even within this framework, there are no clear definitions of which foreign assistance projects qualify as climate projects, which qualify as development projects, and which qualify as both; many adaptation projects are better understood on a continuum of adaptation and development activities. As such, this tracking mechanism should be standardized via a taxonomy of definitions for adaptation solutions.
Therefore, State F should create standardized mechanisms for climate-related foreign assistance programs to differentiate and determine the interlinkages between adaptation and mitigation action from the outset in planning, finance, and implementation — and thereby enhance co-benefits. State F relies on the technical expertise of bureaus, such as REFS, and the technical offices within them, to evaluate whether or not operating units have appropriately attributed funding that supports key issues, including indirect climate adaptation.
Further, PREPARE, announced at COP26, is considered the largest U.S. commitment in history to support adaptation to climate change in developing nations. The Biden Administration has committed to using PREPARE to “respond to partner countries’ priorities, strengthen cooperation with other donors, integrate climate risk considerations into multilateral efforts, and strive to mobilize significant private sector capital for adaptation.” Co-led by USAID and the U.S. Department of State (State Department), the implementation of PREPARE also involves the Treasury, NOAA, and the U.S. International Development Finance Corporation (DFC). Other U.S. agencies, such as USDA, DOE, HHS, DOI, the Department of Homeland Security, EPA, FEMA, the U.S. Forest Service, the Millennium Challenge Corporation, NASA, and the U.S. Trade and Development Agency, will respond to the adaptation priorities identified by countries in National Adaptation Plans (NAPs) and nationally determined contributions (NDCs), among others.
As USAID’s REFS leads the implementation of PREPARE and hosts USAID’s Chief Climate Officer, this bureau should be responsible for ensuring that the agency effectively tracks and consistently reports climate funding data. The two REFS Centers that should lead the implementation of these efforts are the Center for Climate-Positive Development, which advises USAID leadership and supports the implementation of USAID’s Climate Strategy, and the Center for Resilience, which supports efforts to help reduce recurrent crises, such as climate change-induced extreme weather events, through the promotion of risk management and resilience in USAID’s strategies and programming.
By standardizing processes to prioritize and track the flow of adaptation funds, USAID will be able to more effectively determine its progress toward addressing global climate hazards like extreme heat, while enhancing its ability to deliver innovative finance and private capital mechanisms in alignment with PREPARE. Additionally, standardization will enable both the public and private sectors to understand possible areas of investment and direct their flows to relevant projects.
USAID uses the Standardized Program Structure and Definitions (SPSD) system, established by State F, which provides a common language for describing foreign assistance programs, including climate change adaptation and resilience programs, so that budget and performance data can be aggregated, compared, and analyzed within a country, regionally, or globally. The SPSD uses the following categories: (1) democracy, human rights, and governance; (2) economic growth; (3) education and social services; (4) health; (5) humanitarian assistance; (6) peace and security; and (7) program development and oversight. Since 2016, climate change has sat within the economic growth category, and each climate change pillar has separate Program Areas and Elements.
Using the SPSD program areas and key issues, USAID categorizes and tracks its climate adaptation allocations as either directly or indirectly addressing adaptation. Funding that directly addresses climate adaptation is allocated to “Climate Change—Adaptation” (SPSD Program Area EG.11), which covers activities that enhance resilience and reduce the vulnerability of people, places, and livelihoods to climate change. Under this definition, adaptation programs may include the following elements: improving access to science and analysis for decision-making in climate-sensitive areas or sectors; establishing effective governance systems to address climate-related risks; and identifying and disseminating actions that increase resilience to climate change by decreasing exposure or sensitivity or by increasing adaptive capacity. Funding that indirectly addresses climate adaptation is allocated to another SPSD program area and is additionally attributed to the “Adaptation Indirect” key issue: the program area for these activities is not Climate Change—Adaptation, but components of the activities also have climate adaptation effects.
In addition to the SPSD, the State Department and USAID have also identified “key issues” to help describe how foreign assistance funds are used. Key issues are topics of special interest that are not specific to one operating unit or bureau and are not identified, or only partially identified, within the SPSD. As specified in the State Department’s foreign assistance guidance for key issues, “operating units with programs that enhance climate resilience, and/or reduce vulnerability to climate variability and change of people, places, and/or livelihoods are expected to attribute funding to the Adaptation Indirect key issue.”
Operating units use the SPSD and relevant key issues to categorize funding in their operational plans. State guidance requires that any USAID operating unit receiving foreign assistance funding must complete an operational plan each year. The purpose of the operational plan is to provide a comprehensive picture of how the operating unit will use this funding to achieve foreign assistance goals and to establish how the proposed funding plan and programming supports the operating unit, agency, and U.S. government policy priorities. According to the operational plan guidance, State F does an initial screening of these plans.
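To illustrate how a standardized tracking mechanism could roll up these attributions, the sketch below tags hypothetical funding records as direct (Program Area EG.11) or indirect (the “Adaptation Indirect” key issue) and aggregates the totals; the activities, amounts, and all program-area codes other than EG.11 are illustrative.

```python
# Minimal sketch (hypothetical records, amounts in $ millions): tagging and
# aggregating adaptation funding using the SPSD conventions described above.
# EG.11 is the direct adaptation program area; other codes here are placeholders.

records = [
    {"activity": "Early warning system",  "program_area": "EG.11", "key_issues": [],                      "amount": 4.0},
    {"activity": "Resilient agriculture", "program_area": "EG.X",  "key_issues": ["Adaptation Indirect"], "amount": 2.5},
    {"activity": "Basic education",       "program_area": "ES.X",  "key_issues": [],                      "amount": 3.0},
]

def classify(record):
    """Apply the direct/indirect attribution logic described in the text."""
    if record["program_area"] == "EG.11":
        return "direct adaptation"
    if "Adaptation Indirect" in record["key_issues"]:
        return "indirect adaptation"
    return "not adaptation"

totals = {}
for r in records:
    label = classify(r)
    totals[label] = totals.get(label, 0) + r["amount"]
print(totals)  # e.g., {'direct adaptation': 4.0, 'indirect adaptation': 2.5, 'not adaptation': 3.0}
```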
Multilateral development banks (MDBs) play a critical role in bridging the significant funding gap faced by vulnerable developing countries that bear a disproportionate burden of climate adaptation costs, estimated to reach up to 20 percent of GDP for small island nations exposed to tropical cyclones and rising seas. MDBs offer a range of financing options, including direct adaptation investments, green financing instruments, and support for fiscal adjustments to reallocate spending toward climate resilience. To be most sustainably impactful, adaptation support from MDBs should supplement existing aid with conditionality that matches the institutional capacities of recipient countries.
In January 2021, President Biden issued an Executive Order (EO 14008) calling upon federal agencies and others to help domestic and global communities adapt and build resilience to climate change. In September 2022, the White House announced the launch of the PREPARE Action Plan, which lays out America’s contribution to the global effort to build resilience to the impacts of the climate crisis in developing countries. Nineteen U.S. departments and agencies are working together to implement the PREPARE Action Plan: State, USAID, Commerce/NOAA, Millennium Challenge Corporation (MCC), U.S. Trade and Development Agency (USTDA), U.S. Department of Agriculture (USDA), Treasury, DFC, Department of Defense (DOD) and U.S. Army Corps of Engineers (USACE), International Trade Administration (ITA), Peace Corps, Environmental Protection Agency (EPA), Department of Energy (DOE), Federal Emergency Management Agency (FEMA), Department of Transportation (DOT), Health and Human Services (HHS), NASA, Export–Import Bank of the United States (EXIM), and Department of the Interior (DOI).
Congress oversees federal climate financial assistance to lower-income countries, especially through the following actions: (1) authorizing and appropriating for federal programs and multilateral fund contributions, (2) guiding federal agencies on authorized programs and appropriations, and (3) overseeing U.S. interests in the programs. Congressional committees of jurisdiction include the House Committees on Foreign Affairs, Financial Services, Appropriations, and the Senate Committees on Foreign Relations and Appropriations, among others.
Scaling Effective Methods across Federal Agencies: Looking Back at the Expanded Use of Incentive Prizes between 2010-2020
Policy entrepreneurs inside and outside of government, as well as other stakeholders and advocates, are often interested in expanding the use of effective methods across many or all federal agencies, because how the government accomplishes its mission is integral to what the government is able to produce in terms of outcomes for the public it serves. Adoption and use of promising new methods by federal agencies can be slowed by a number of factors that discourage risk-taking and experimentation, and instead encourage compliance and standardization, too often as a false proxy for accountability. As a result, many agency-specific and government-wide authorities for promising methods go under-considered and under-utilized.
Policy entrepreneurs within center-of-government agencies (e.g., Executive Office of the President) are well-positioned to use a variety of policy levers and actions to encourage and accelerate federal agency adoption of promising and effective methods. Some interventions by center-of-government agencies are better suited to driving initial adoption, others to accelerating or maintaining momentum, and yet others to codifying and making adoption durable once widespread. Therefore, a policy entrepreneur interested in expanding adoption of a given method should first seek to understand the “adoption maturity” of that method and then undertake interventions appropriate for that stage of adoption. The arc of agency adoption of new methods can be long—measured in years and decades, not weeks and months. Policy entrepreneurs should be prepared to support adoption over similar timescales. In considering adoption maturity of a method of interest, policy entrepreneurs can also reference the ideas of Tom Kalil in a July 2024 Federation of American Scientists blog post on “Increasing the ‘Policy Readiness’ of Ideas,” which offers sample questions to ask about “the policy landscape surrounding a particular idea.”
As a case study for driving federal adoption of a new method, this paper looks back at actions that supported the widespread adoption of incentive prizes by most federal agencies over the course of fiscal years 2010 through 2020. Federal agency use of prizes increased from several incentive prize competitions being offered by a handful of agencies in the early 2000s to more than 2,000 prize competitions offered by over 100 federal agencies by the end of fiscal year 2022. These incentive prize competitions have helped federal agencies identify novel solutions and technologies, establish new industry benchmarks, pay only for results, and engage new talent and organizations.
A summary framework below includes types of actions that can be taken by policy entrepreneurs within center-of-government agencies to support awareness, piloting, and ongoing use of new methods by federal agencies in the years ahead. (Federal agency program and project managers who seek to scale up innovative methods within their agencies are encouraged to reference related resources such as this article by Jenn Gustetic in the Winter 2018 Issues in Science and Technology: “Scaling Up Policy Innovations in the Federal Government: Lessons from the Trenches.”)
Efforts to expand federal capacity through new and promising methods are worthwhile to ensure the federal government can use a full and robust toolbox of tactics to meet its varied goals and missions.
OPPORTUNITIES AND CHALLENGES IN FEDERAL ADOPTION OF NEW METHODS
Opportunities for federal adoption and use of promising and effective methods
To address national priorities, solve tough challenges, or better meet federal missions to serve the public, a policy entrepreneur may aim to pilot, scale, and make lasting federal use of a specific method.
A policy entrepreneur’s goals might include new ways for federal agencies to, for example:
- Catalyze the development, demonstration, and deployment of technology and novel solutions;
- Acquire or commercialize products and services that meet government or national needs;
- Engage and seek input from communities and the public;
- Deliver more effective, efficient, and equitable services and benefits;
- Provide technical assistance to state, local, Tribal, and territorial governments;
- Retain and recruit talent for mission critical occupations or to fill federal skills gaps;
- Assess and evaluate organizational health and performance or program-level outcomes; or
- Translate evidence to practice.
To support these and other goals, an array of promising methods has already been demonstrated: in other sectors such as philanthropy, industry, and civil society; in state, local, Tribal, and territorial governments and communities; and in one or several federal agencies. These methods could have beneficial impact if more federal agencies adopted them, and many are either specifically supported or generally allowable under existing government-wide or agency-specific authorities.
Center-of-government agencies include components of the Executive Office of the President (EOP) like the Office of Management and Budget (OMB) and the Office of Science and Technology Policy (OSTP), as well as the Office of Personnel Management (OPM) and the General Services Administration (GSA). These agencies direct, guide, convene, support, and influence the implementation of law, regulation, and the President’s policies across all Federal agencies, especially the executive departments. An August 2016 report by the Partnership for Public Service and the IBM Center for the Business of Government noted that, “The Office of Management and Budget and other ‘center of government’ agencies are often viewed as adding processes that inhibit positive change—however, they can also drive innovation forward across the government.”
A policy entrepreneur interested in expanding adoption of a given method through actions driven or coordinated by one or more center-of-government agencies should first seek to understand the “adoption maturity” of a given method of interest by assessing: (1) the extent that adoption of the method has already occurred across the federal interagency; (2) any real or perceived barriers to adoption and use; and (3) the robustness of existing policy frameworks and agency-specific and government-wide infrastructure and resources that support agency use of the method.
Challenges in federal adoption and use of new methods
Policy entrepreneurs are usually interested in expanding federal adoption of new methods for good reason: a focus on supporting and expanding beneficial outcomes. Effective leaders and managers across sectors understand the importance of matching appropriate and creative tactics with well-defined problems and opportunities. Ideally, leaders are picking which tactic or tool to use based on their expert understanding of the target problem or opportunity, not using a method solely because it is novel or because it is the way work has always been done in the past. Design of effective program strategies is supported by access to a robust and well-stocked toolbox of tactics.
However, many currently authorized and allowable methods for achieving federal goals are generally underutilized in the implementation strategies and day-to-day tactics of federal agencies. Looking at the wide variety of existing authorities in law and the various flexibilities allowed for in regulation and guidance, one might expect agency tactics for common activities like acquisition or public comment to be varied, diverse, iterative, and even experimental in nature, where appropriate. In practice, however, agency methods are often remarkably homogeneous, repeated, and standardized.
This underutilization of existing authorities and allowable flexibilities is due to factors such as:
- Comfort with existing methods among program and legal staff (“but that’s how we have always done it!”);
- In turn, limited internal expertise on how to deploy specific new methods (“who in our agency knows how to do this well?”);
- Unclear legal authorities or lack of established agency policies or processes (“are we allowed to do that and, if so, where is that authority written?”);
- Unclear permission authorities and approval roles (“whose review and sign-off do I need to do that?”);
- Difficulties clearly defining the opportunity or problem to be addressed at an actionable level of specificity (“what does success look like?”);
- Concerns about perceived risks, such as the risk of funds going unawarded or effective solutions not being identified (“what if no one/nothing meets our target?”);
- Oversight processes that seek out failure, flaws, and non-compliance and overreliance on strict procedures to ensure accountability (“who is responsible for this failure?” instead of “what can we learn from this for the future?”, and “have all the boxes been checked?” instead of “where might we start and how will we learn along the way?”); and
- Reluctance to define and implement meaningful performance indicators and assessment methods at the start of program design (“how will we know if this new method improves our outcomes compared to the status quo?”).
Strategies for addressing challenges in federal adoption and use of new methods
Attention and action by center-of-government agencies often is needed to address the factors cited above that slow the adoption and use of new methods across federal agencies and to build momentum. The following strategies are further explored in the case study on federal use of incentive prizes that follows:
- clarifying government-wide and agency-specific policies and processes;
- building awareness and fostering leadership and staff buy-in;
- offering case studies, examples, and “how-to” playbooks;
- creating connections among a federal community of practice;
- engaging external experts and practitioners;
- removing identified barriers;
- increasing ambition through iterative experimentation; and
- fostering an enterprise-wide learning culture that encourages experimentation, invests in evaluation, and manages risk.
Additional strategies can be deployed within federal agencies to address agency-level barriers and scale promising methods—see, for example, this article by Jenn Gustetic in the Winter 2018 Issues in Science and Technology: “Scaling Up Policy Innovations in the Federal Government: Lessons from the Trenches.”
LOOKING BACK: A DECADE OF POLICY ACTIONS SUPPORTING EXPANDED FEDERAL USE OF INCENTIVE PRIZES
The use of incentive prizes is one method for open innovation that has been adopted broadly by most federal agencies, with extensive bipartisan support in Congress and with White House engagement across multiple administrations. In contrast to recognition prizes, such as the Nobel Prize or various presidential medals, which reward past accomplishments, incentive prizes specify a target, establish a judging process (ideally as objective as possible), and use a monetary prize purse and/or non-monetary incentives (such as media and online recognition, access to development and commercialization facilities, resources, or experts, or even qualification for certain regulatory flexibility) to induce new efforts by solvers competing for the prize.
The use of incentive prizes by governments (and by high-net-worth individuals) to catalyze novel solutions is certainly not new. In 1795, the French government offered a 12,000-franc prize to improve upon the prevailing food preservation methods of the time, with the goal of better feeding its armies; the prize is often attributed to Napoleon. Fifteen years later, confectioner Nicolas François Appert claimed the prize for his method of heating, boiling, and sealing food in airtight glass jars, the same basic technology still used to can foods. Dava Sobel’s book Longitude details how the rulers of Spain, the Netherlands, and Britain all offered separate prizes, starting in 1567, for methods of determining longitude at sea; John Harrison was finally awarded Britain’s top longitude prize in 1773. In 1919, Raymond Orteig, a French-American hotelier, aviation enthusiast, and philanthropist, offered a $25,000 prize for the first person to perform a nonstop flight between New York and Paris. The prize initially expired in 1924 without anyone claiming it. Given technological advances and the number of pilots engaged in trying to win the prize, Orteig extended the deadline by five years. By 1926, nine teams had come forward to formally compete, and in 1927 the prize went to a little-known aviator named Charles Lindbergh, who made the flight in a custom-built plane known as the “Spirit of St. Louis.”
The U.S. Government did not begin to adopt the use of incentive prizes until the early 21st century, following a 1999 National Academy of Engineering workshop about the use of prizes as an innovation tool. In the first decade of the 2000s, the Defense Advanced Research Projects Agency (DARPA), the National Aeronautics and Space Administration (NASA), and the Department of Energy conducted a small number of pilot prize competitions. These early agency-led prizes focused on autonomous vehicles, space exploration, and energy efficiency, demonstrating a range of benefits to federal agency missions.
Federal use of incentive prizes did not accelerate until, in the America COMPETES Reauthorization Act of 2010, Congress granted all federal agencies the authority to conduct prize competitions (15 USC § 3719). With that new authority in place, and with the support of a variety of other policy actions, federal use of incentive prizes reached scale, with over 2,000 prize competitions offered on Challenge.gov by over 100 federal agencies between the fiscal years 2010 and 2022.
There certainly remains extensive opportunity to improve the design, rigor, ambition, and effectiveness of federal prize competitions. That said, there are informative lessons to be drawn from how incentive prizes evolved in the United States from a method used primarily outside of government, with limited pilots among a handful of early-adopter federal agencies, to a method being tried by many civil servants across an active interagency community of practice and lauded by administration leaders, bipartisan members of Congress, and external stakeholders alike.
A summary follows of the strategies and tactics used by policy entrepreneurs within the EOP—with support and engagement from Congress as well as program managers and legal staff across federal agencies—that led to increased adoption and use of incentive prizes in the federal government.
Summary of strategies and policy levers supporting expanded use of incentive prizes
In considering how best to expand awareness, adoption, and use among federal agencies of promising methods, policy entrepreneurs might consider utilizing some or all of the strategies and policy levers described below in the incentive prizes example. Those strategies and levers are summarized generally in the table that follows. Some of the listed levers can advance multiple strategies and goals. This framework is intended to be flexible and to spark brainstorming among policy entrepreneurs, as they build momentum in the use of particular innovation methods.
Policy entrepreneurs are advised to consider and monitor the maturity level of federal awareness, adoption, and use, and to adjust their strategies and tactics accordingly. They are encouraged to return to earlier strategies and policy levers as needed, should adoption and momentum lag, should agency ambition in design and implementation of initiatives be insufficient, or should concerns regarding risk management be raised by agencies, Congress, or stakeholders.
Stage of Federal Adoption | Strategy | Types of Center-of-Government Policy Levers |
---|---|---|
Early – No or few Federal agencies using method | Understand federal opportunities to use method, and identify barriers and challenges | * Connect with early adopters across federal agencies to understand use of agency-specific authorities, identify pain points and lessons learned, and capture case studies (e.g., 2000-2009) * Engage stakeholder community of contractors, experts, researchers, and philanthropy * Look to and learn from use of method in other sectors (such as by philanthropy, industry, or academia) and document (or encourage third-party documentation of) that use and its known benefits and attributes (e.g., April 1999, July 2009) * Encourage research, analysis, reports, and evidence-building by National Academies, academia, think tanks, and other stakeholders (e.g., April 1999, July 2009, June 2014) * Discuss method with OMB Office of General Counsel and other relevant agency counsel * Discuss method with relevant Congressional authorizing committee staff * Host convenings that connect interested federal agency representatives with experts * Support and connect nascent federal “community of interest” |
Early – No or few Federal agencies using method | Build interest among federal agencies | * Designate primary policy point of contact/dedicated staff member in the EOP (e.g., 2009-2017, 2017-2021) * Designate a primary implementation point of contact/dedicated staff at GSA and/or OPM * Identify leads in all or certain federal agencies * Connect topic to other administration policy agendas and strategies * Highlight early adopters within agencies in communications from center-of-government agencies to other federal agencies (and to external audiences) * Offer congressional briefings and foster bipartisan collaboration (e.g., 2015) |
Early – No or few Federal agencies using method | Establish legal authorities and general administration policy | * Engage the OMB Office of General Counsel and the OMB Legislative Review Division, as well as other relevant OMB offices and EOP policy councils * Identify existing general authorities and regulations that could support federal agency use of method (e.g., March 2010) * Establish general policy guidelines, including by leveraging Presidential authorities through executive orders or memoranda (e.g., January 2009) * Issue OMB directives on specific follow-on agency actions or guidance to support agency implementation (“M-Memos” or similar) (e.g., December 2009, March 2010, August 2011, March 2012) * Provide technical assistance to Congress regarding government-wide or agency-specific authority (or authorities) (e.g., June-July 2010, January 2011) * Delegate existing authorities within agencies (e.g., October 2011) * Encourage issuance of agency-specific guidance (e.g., October 2011, February 2014) * Include direction to agencies as part of broader Administration policy agendas (e.g., September 2009, 2011-2016) |
Early – No or few Federal agencies using method | Remove barriers and “make it easier” | * Create a central government website with information for federal agency practitioners (such as toolkits, case studies, and trainings) and for the public (e.g., September 2010) * Create dedicated GSA schedule of vendors (e.g., July 2011) * Establish an interagency center of excellence (e.g., September 2011) * Encourage use of interagency agreements on design or implementation of pilot initiatives (e.g., September 2011) * Request agency budget submissions to OMB to support pilot use in President’s budget (e.g., December 2013) |
Adoption well underway – Many federal agencies have begun to use method | Connect practitioners | * Launch a federal “community of practice” with support from GSA for meetings, listserv, and collaborative projects (e.g., April 2010, 2016, June 2019) * Host regular events, workshops, and conferences with federal agencies and, where appropriate and allowable, philanthropic or nonprofit co-hosts (e.g., April 2010, June 2012, April 2015, March 2018, May 2022) |
Adoption well underway – Many federal agencies have begun to use method | Strengthen agency infrastructure | * Foster leadership buy-in through briefings from White House/EOP to agency leadership, including members of the career senior executive service * Encourage agencies to dedicate agency staff and invest in prize design support within agencies * Encourage agencies to create contract vehicles as needed to support collaboration with vendors/ experts * Encourage agencies to develop intra-agency networks of practitioners and to provide external communications support and platforms for outreach * Request agency budget submissions to OMB for investments in agency infrastructure and expansion of use, to include in the President's budget where needed (e.g., 2012-2013), and request agencies otherwise accommodate lower-dollar support (such as allocation of FTEs) where possible within their budget toplines |
Adoption well underway – Many federal agencies have begun to use method | Clarify existing policies and authorities | * Issue updated OMB, OSTP, or agency-specific policy guidance and memoranda as needed based on engagement with agencies and stakeholders (e.g., August 2011, March 2012) * Provide technical assistance to Congress on any needed updates to government-wide or agency-specific authorities (e.g., January 2017) |
Adoption prevalent – Most if not all federal agencies have adopted, with a need to maintain use and momentum over time | Highlight progress and capture lessons learned | * Require regular reporting from agencies to EOP (OSTP, OMB, or similar) (e.g., April 2012, May 2022) * Require and take full advantage of regular reports to Congress (e.g., April 2012, December 2013, May 2014, May 2015, August 2016, June 2019, May 2022, April 2024) * Continue to capture and publish federal-use case studies in multiple formats online (e.g., June 2012) * Undertake research, evaluation, and evidence-building * Co-develop practitioner toolkit with federal agency experts (e.g., December 2016) * Continue to feature promising examples on White House/EOP blogs and communication channels (e.g., October 2015, August 2020) * Engage media and seek both general interest and targeted press coverage, including through external awards/honorifics (e.g., December 2013) |
Adoption prevalent – Most if not all federal agencies have adopted, with a need to maintain use and momentum over time | Prepare for presidential transitions and document opportunities for future administrations | * Integrate go-forward proposals and lessons learned into presidential transition planning and transition briefings (e.g., June 2016-January 2017) * Brief external stakeholders and Congressional supporters on progress and future opportunities * Connect use of method to other, broader policy objectives and national priorities (e.g., August 2020, May 2022, April 2024) |
Phases and timeline of policy actions advancing the adoption of incentive prizes by federal agencies
- Growing number of incentive prizes offered outside government (early 2000s)
At the close of the 20th century, federal use of incentive prizes to induce activity toward targeted solutions was limited, though the federal government regularly utilized recognition prizes to reward past accomplishment. In October 2004, the $10 million Ansari XPRIZE (first announced in May 1996) was awarded by the XPRIZE Foundation for the successful flights of SpaceShipOne by Scaled Composites. Following the awarding of the Ansari XPRIZE and the extensive resulting news coverage, philanthropists and high-net-worth individuals began to offer prize purses to incentivize action on a wide variety of technology and social challenges. A variety of new online challenge platforms sprang up, and new vendors began offering consulting services for designing and hosting challenges, trends that lowered the cost of prize competition administration and broadened participation in prize competitions among thousands of diverse solvers around the world. This growth in the use of prizes by philanthropists and the private sector increased the interest of the federal government in trying out incentive prizes to help meet agency missions and solve national challenges. Actions during this period to support federal use of incentive prizes include:
- EXTERNAL REPORT/ANALYSIS (April 1999): In response to a request from the Clinton-Gore Administration’s National Economic Council, the National Academy of Engineering (NAE), with funding from the National Science Foundation, convened a workshop in April 1999 to “assess the potential value of federally sponsored prizes and contests in advancing science and technology in the public interest.” The resulting summary report recommended “limited experiments” in the use of federally sponsored incentive prizes, encouraged both Congress and federal agencies “to take a flexible approach to the design and administration” of such prizes, and recommended that incentive prizes be “evaluated at specified intervals by the agencies involved to determine their effectiveness and impact.”
- AGENCY-SPECIFIC AUTHORITIES and PILOTS (early 2000s): During this period, only a few federal agencies had, and still have, flexible agency-specific prize authorities that allowed them to pilot the use of incentive prizes to advance their missions, scan markets for new solutions, engage new solvers, and solve long-standing problems. Early examples include:
- Using prize authority provided by Congress under 10 U.S.C. § 2374a, enacted as part of the National Defense Authorization Act for Fiscal Year 2000 in October 1999, DARPA was an early adopter, offering the DARPA Grand Challenges, a series of competitions demonstrating the capabilities of autonomous vehicles, in 2004 and 2005.
- With authority from Congress, NASA began offering its ongoing series of Centennial Challenges starting in 2005, to directly engage the public in the process of advanced technology development related to problems of interest to NASA and the nation.
- Congress also saw opportunity for the Department of Energy (DOE) to make progress on energy challenges through the use of incentive prizes, giving DOE authority through the Energy Independence and Security Act of 2007 to run a prize focused on efficient lighting, called the “L-Prize,” and a series of “H-Prizes” to encourage research into the use of hydrogen as an energy carrier in a hydrogen economy.
- Obama-Biden Administration Seeks to Expand Federal Prizes Through Administrative Action (2009-2010)
From the start of the Obama-Biden Administration, OSTP and OMB took a series of policy steps to expand the use of incentive prizes across federal agencies and build federal capacity to support those open-innovation efforts. Bipartisan support in Congress for these actions soon led to new legislation to further advance agency adoption of incentive prizes. Actions during this period to support federal use of incentive prizes include:
- PRESIDENTIAL DIRECTIVE (January 2009): On January 21, 2009, his first full day in office, President Barack Obama signed a Memorandum on Transparency and Open Government, committing the Administration to creating a more transparent, participatory, and collaborative government. The memorandum directed that federal agencies “should offer Americans increased opportunities to participate in policymaking and to provide their Government with the benefits of their collective expertise and information.”
- EXTERNAL REPORT/ANALYSIS (July 2009): In July 2009, with funding from the Templeton Foundation, McKinsey issued a report called And the Winner Is… that documented the recent resurgence of incentive prizes and noted that over the past decade total prize purses across the large incentive prizes being offered had tripled to surpass $375 million. The report synthesized lessons from recent prizes. For example, the report found that, “As Ken Davidian, formerly of the NASA Challenges, puts it, there are at least four core rewards that drive participants to compete for prizes: ‘goal, glory, guts, and gold—and gold is usually last.’ Or to be more precise (if less memorable), competitors are motivated by the intrinsic interest of a challenge, the recognition or prestige accompanying a winner, the challenge of the problem-solving process itself, and any material incentive. Which motives matter most, and in what mix, will vary depending on the problem—and the problem solver.”
- INCLUSION IN ADMINISTRATION POLICY AGENDA (September 2009): In addition, in September 2009, President Obama released his Strategy for American Innovation, developed by the National Economic Council (NEC) and OSTP. In that strategy, the President called for federal agencies to “take advantage of the expertise and insight of people both inside and outside the federal government, use high-risk, high-reward policy tools such as prizes and challenges to solve tough problems.”
- OMB DIRECTIVE (December 2009): Responding to President Obama’s open government memorandum, on December 8, 2009, the OMB Director issued an Open Government Directive, which required executive departments and agencies to take specific actions to further the principles established by the open government memorandum. The directive charged the OMB Deputy Director for Management to “issue, through separate guidance or as part of any planned comprehensive management guidance, a framework for how agencies can use challenges, prizes, and other incentive-backed strategies to find innovative or cost-effective solutions to improving open government.” The directive also charged federal agencies to include in agency Open Government Plans “innovative methods, such as prizes and competitions, to obtain ideas from and to increase collaboration with those in the private sector, non-profit, and academic communities.”
- EOP LEADERSHIP ROLES (January 2009-January 2017): To support the development and implementation of these and other open-innovation policies, a member of the OSTP policy staff served as policy lead for open innovation, reporting to OSTP Deputy Director Tom Kalil, with additional leadership backing from OSTP Director John Holdren, the inaugural U.S. Chief Technology Officer (CTO) Aneesh Chopra, and then Deputy U.S. CTO Beth Noveck. During the Obama Administration, this OSTP open-innovation policy role was filled by Robynn Sturm Steffen from 2009-2011, the author (Cristin Dorgelo) from 2011-2014 until she became OSTP Chief of Staff, Jenn Gustetic on detail from NASA from 2014-2016, and Christofer Nelson from 2016 through the end of the Administration in January 2017. Sonal Shah, as inaugural director of the White House Office of Social Innovation and Civic Participation in the Domestic Policy Council (DPC), and her successor Jonathan Greenblatt also provided helpful leadership and led key stakeholder engagement efforts.
- OMB GUIDANCE (March 2010): Consistent with the Open Government Directive and the Strategy for American Innovation, OSTP worked closely with the OMB Office of the Deputy Director for Management and the OMB Office of General Counsel on developing guidance on incentive prizes for federal agencies. In March 2010, OMB issued OMB Memorandum M-10-11, Guidance on the Use of Challenges and Prizes to Promote Open Government. This memorandum included clarifications for federal agencies regarding what authorities they could use to offer prize purses, host and sponsor prize competitions, and engage third parties to operate such competitions. OMB Memorandum M-10-11 also established as Administration policy that agencies should:
- Utilize prizes and challenges as tools for advancing open government, innovation, and the agency’s mission;
- Identify and proactively address legal, regulatory, technical, and other barriers to the use of prizes and challenges;
- Select one or more individuals to identify and implement prizes and challenges, potentially in partnership with outside organizations, and to participate in a governmentwide “community of practice” led by OMB and OSTP; and
- Increase their capacity to support, design, and manage prizes, potentially in collaboration with external partners.
- CONVENING and COMMUNITY OF PRACTICE (April 2010): In April 2010, the White House (OSTP and the DPC Office of Social Innovation and Civic Participation), together with the Case Foundation, convened experts in incentive prize design and administration to share private-sector success stories with nearly 200 representatives from more than 35 federal agencies. This Summit on Promoting Innovation: Prizes, Challenges and Open Grantmaking served as a formal kickoff to a federal prizes and challenges community of practice, which GSA administered, and for which OSTP and OMB provided strategic direction, substantive agenda setting, and resourcing. With GSA’s support, this community of practice remains more than a decade later a valuable network for federal prize practitioners to connect and exchange promising practices and lessons learned.
- TECHNICAL ASSISTANCE TO CONGRESS TO INFORM NEW LEGAL AUTHORITY (June-July 2010): Throughout this period, OSTP and OMB were collaborating with Congress to advance government-wide prize authority. On June 24, 2010, Senators Mark Pryor and Mark Warner introduced in the 111th Congress S.3530, the Reward Innovation in America Act of 2010. Drawing from this bill, in July 2010, the Senate Commerce Committee approved the America COMPETES Reauthorization Act of 2010 with a provision providing government-wide prize authority; the provision, ultimately enacted as P.L. 111-358, added Section 24 to the Stevenson-Wydler Technology Innovation Act of 1980 (15 U.S.C. § 3719). On July 27, 2010, the Director of OSTP thanked Senators Pryor and Warner for their leadership in a letter that OSTP also published on its blog, highlighting the steps the Administration was taking to set the stage for agencies to take full advantage of prize authority should Congress move the authority forward.
- SHARED GOVERNMENT WEBSITE (September 2010 to Present): Responding to directives in M-10-11, with support from OSTP and OMB, and with leadership from internal champions, GSA in September 2010 launched a government-wide website called Challenge.gov to provide one place for citizen solvers to find the challenges being offered by federal agencies. Over time, Challenge.gov developed back-end capabilities to help federal program managers administer certain types of prize competitions and expanded to serve as a knowledge repository for federal program managers looking for more information about designing and administering incentive prizes. Challenge.gov remains today the primary online hub for federally hosted prize competitions.
- Implementing New Government-Wide Prizes Authority Provided by the America COMPETES Act (2011-2016)
During this period of expansion in the federal use of incentive prizes, supported by new government-wide prize authority provided by Congress, the Obama-Biden Administration continued to emphasize its commitment to the model as a key method for accomplishing Administration priorities, including open government and evidence-based decision making. Actions during this period to support federal use of incentive prizes include:
- NEW AUTHORITIES THROUGH LEGISLATION (January 2011): On January 4, 2011, President Obama signed into law the America COMPETES Reauthorization Act of 2010 (COMPETES Act), granting all agencies broad authority to conduct prize competitions to spur innovation, solve tough problems, and advance their core missions (Public Law 111-358).
- GSA CONTRACT VEHICLE (July 2011 to Present): In July 2011, as called for by the new law, GSA established a contract vehicle—originally, Sub-Schedule 541 4G, now maintained as the Multiple Award Schedule, 541613, Professional Services – Marketing and Public Relations—to help federal agencies access private-sector technical assistance and consulting support for incentive prizes. Because prize competitions were still an emerging practice both inside and outside of the federal government, this contract vehicle allowed federal agencies to access experts and consultants who were building capacity for prizes across sectors and identifying what works in prize design and operations.
- OMB GUIDANCE (August 2011): In August 2011, OMB’s General Counsel and Chief Information Officer issued a memorandum on the new COMPETES prize authority to agency general counsels and CIOs, which OMB developed hand-in-hand with OSTP. The memorandum included a concise summary of the COMPETES Act’s new prize authorities and requirements and provided guidance to agencies on implementing the prize authority found in the legislation. It also addressed an array of frequently asked questions, including questions about the new authority to conduct prizes up to $50 million with existing appropriations, as well as the new authorities to accept private-sector funds for the design, administration, or prize purse of a competition; to partner with nonprofits and tap the expertise of for-profits for successful implementation; and to co-sponsor competitions with other agencies.
- AGENCY-SPECIFIC POLICIES and DELEGATION OF AUTHORITY (October 2011): Following issuance of this government-wide guidance, with the support of OSTP and OMB, agencies began to establish strategies and policies to further accelerate widespread use of the new prize authority granted to them under COMPETES. For example, the Department of Health and Human Services (HHS) was at the forefront of agency implementation efforts. On October 12, 2011, HHS Secretary Kathleen Sebelius issued a memorandum notifying the Department of the new prize authority provided under the America COMPETES Reauthorization Act, outlining the Department’s strategy to optimize the use of prize competitions, and calling on the heads of HHS operating and staff divisions to forecast their future use of prize competitions to stimulate innovation in advancing the agency’s mission. Secretary Sebelius also issued a formal delegation of the new prize authority in the Federal Register.
- CENTER OF EXCELLENCE and INTERAGENCY AGREEMENTS (November 2011): In November 2011, OSTP worked with NASA to launch a Center of Excellence for Collaborative Innovation (CoECI), which was co-founded by Jason Crusan and Jeff Davis. As NASA continued to mature the use of challenges and crowdsourcing methods as a new tool in its toolkit, OSTP encouraged NASA to assist other federal agencies in the use of crowdsourced challenges to solve tough, mission-critical problems. This included the new Center working with federal agencies on challenge design through interagency agreements, and supporting those agencies with accessing NASA’s contracts with prize platforms, including for algorithm and apps challenges and for ideation challenges.
- OMB GUIDANCE (March 2012): Throughout 2011, OSTP met regularly with agencies, holding regular phone calls and in-person meetings—often at agency offices—with agency leadership, counsel, and program managers who were considering the use of prizes or undertaking prize design. In these interactions, OSTP listened to agencies, asked clarifying questions, and tracked common issues and challenges. For example, OSTP heard in these agency conversations a variety of agency questions regarding the Paperwork Reduction Act and its intersection with prize authorities. OSTP then collaborated with OMB to assess these issues and determine how clarifying guidance could support agency implementation of prizes. On March 1, 2012, OMB Office of Information and Regulatory Affairs (OIRA) issued a Frequently Asked Questions summary to address common agency questions.
- AGENCY REPORTING TO EOP and REPORT TO CONGRESS (April 2012): In April 2012, the Obama-Biden Administration released a first report to Congress on federal use of incentive prizes in fiscal year 2011, as required by the America COMPETES Act. These reports to Congress and related reporting from federal agencies to OSTP—initially on an annual cycle and now biennial—have been an essential mechanism for tracking federal agency prize activity and capturing case studies and outcomes. The work led by OSTP with GSA to standardize how federal agencies tracked indicators and metrics regarding federal incentive prizes supported both progress tracking over time as well as storytelling by the Administration and prize supporters in Congress. The reports have also recorded steps taken by OSTP, GSA, and other agencies like NASA to build government-wide capacity and infrastructure related to prizes, and the steps taken within agencies to establish policies and processes to ease the use of prizes and remove barriers.
- CONVENING and ONLINE CASE STUDIES AND RESOURCES (June 2012): In June 2012, OSTP, the Case Foundation, and the Joyce Foundation hosted a day-long conference called Collaborative Innovation: Public Sector Prizes, which brought together hundreds of public- and private-sector practitioners to share case studies, research, and lessons learned. The event was partially livestreamed and produced a large collection of case studies and video resources that were made available on the Case Foundation website, now archived.
- INCLUSION IN ADMINISTRATION POLICY AGENDA (2011-2016): The use of incentive prizes was included and encouraged throughout a variety of Administration policy agendas, year over year. For example, commitments related to prizes and challenges were included in each of the United States Open Government biennial National Action Plans issued during the Obama-Biden Administration, to maintain momentum and highlight this body of work to the international open government community.
- GUIDANCE ON AGENCY BUDGET PROPOSALS (2012-2013): OSTP also collaborated with OMB to ensure that agencies considered and submitted to OMB budget proposals to expand their use of incentive prizes. On May 18, 2012, OMB issued Memorandum M-12-14 on the Use of Evidence and Evaluation in the 2014 Budget, which directed, “Agencies should also consider using the new authority under the America COMPETES legislation to support incentive prizes of up to $50 million. Like Pay for Success, well designed prizes and challenges can yield a very high return on the taxpayer dollar.” For agencies seeking to learn more about prizes, the memo noted, “The Office of Science and Technology Policy has created a ‘community of practice’ for agency personnel involved in designing and managing incentive prizes.” On July 26, 2013, OMB issued Memorandum M-13-17 on Next Steps in the Evidence and Innovation Agenda, which encouraged federal agencies to develop budget proposals “that focus Federal dollars on effective practices while also encouraging innovation in service delivery” and specifically mentioned incentive prizes as an encouraged pay-for-performance strategy.
- REPORT TO CONGRESS (December 2013): On December 27, 2013, OSTP released a second report to Congress on federal use of incentive prizes under the COMPETES Act authority and other authorities in fiscal year 2012.
- LIFTING UP EFFORT FOR EXTERNAL RECOGNITION (January 2014): On January 24, 2014, Harvard University’s Ash Center for Democratic Governance and Innovation announced Challenge.gov as winner of the prestigious 2013 “Innovations in American Government Award” in honor of exemplary service and creativity in the public interest. The Obama Administration, through GSA, had nominated Challenge.gov for this external honor to raise awareness and increase attention for federal prizes.
- AGENCY-SPECIFIC POLICIES (February 2014): Agencies continued to state and clarify their internal policies and processes related to incentive prizes. For example, on February 12, 2014, the NASA Administrator issued an agency-wide policy directive—still in place today and last updated in June 2023—to encourage “the use of challenges, prize competitions, and crowdsourcing activities at all levels of the Agency to further its mission.”
- REPORT TO CONGRESS (May 2014): On May 7, 2014, OSTP released a third report to Congress on federal use of incentive prizes, focused on fiscal year 2013. This report found an 85 percent annual increase in prizes run under all legal authorities; the number of prizes conducted under the authority provided by COMPETES increased by over 50 percent compared to fiscal year 2012 (and nearly six-fold compared to fiscal year 2011); and the size of agency-sponsored prize purses grew as well, with 11 prizes offering purses of $100,000 or greater in fiscal year 2013.
- EXTERNAL REPORT/ANALYSIS (June 2014): On June 19, 2014, Deloitte University Press released a report—informed by research involving prize practitioners across government—covering in depth the lessons learned and best practices identified from over 350 prizes conducted by the federal government and over 50 prizes conducted by state, local, and philanthropic leaders. The report, titled The Craft of Prize Design: Lessons from the Public Sector, was produced by Doblin (Deloitte’s innovation practice), in collaboration with Bloomberg Philanthropies, the Case Foundation, the Joyce Foundation, the Knight Foundation, the Kresge Foundation, and the Rockefeller Foundation.
- REPORT TO CONGRESS (May 2015): On May 8, 2015, OSTP released a fourth report to Congress on federal use of incentive prizes, focused on fiscal year 2014. This report highlighted steps agencies were taking to support the use of incentive prizes across their components and divisions, such as streamlining access to vendors through contract vehicles to support the design and implementation of prize competitions, creating internal working groups, designating points of contact, and creating internal and external communications tools.
- CONVENING and WHITE HOUSE MICROSITE (October 2015): On October 7, 2015, five years after the launch of Challenge.gov, the White House, in conjunction with the Case Foundation, the Joyce Foundation, and Georgetown University, hosted a conference called “All Hands on Deck: Solving Complex Problems through Prizes and Challenges” to convene federal prize practitioners and catalyze the next generation of ambitious federal prizes. The following day, GSA brought together the federal community to recognize progress with an awards ceremony. By that point, more than 440 federal prizes had been offered, engaging more than 200,000 citizen solvers. Also in October 2015, OSTP and DPC collaborated on the launch of a WhiteHouse.gov microsite with information on federal use of incentive prizes.
- FOSTER BIPARTISAN CONGRESSIONAL SUPPORT (2015-2016): During this period, bipartisan support for the use of incentive prizes by federal agencies continued. In 2015, a Congressional Prize Caucus with bipartisan sponsorship was formed to increase awareness and encourage the use of prize competitions. Legislation supporting prize competitions to fuel medical research was also passed; for example, the 21st Century Cures Act (Public Law 114-255) included a provision on EUREKA Prize Competitions (42 U.S.C. 284 et seq.) that authorized the National Institutes of Health in the Department of Health and Human Services to conduct such competitions.
- REPORT TO CONGRESS and TRAINING (August 2016): In August 2016, OSTP released a fifth report to Congress on the federal use of prizes in fiscal year 2015. OSTP noted that Challenge.gov had, by August 2016, “featured more than 700 prize competitions and challenges—conducted under the authority provided by COMPETES and other authorities—from more than 100 Federal agencies, departments, and bureaus.” OSTP and GSA together had engaged more than 1,500 federal professionals in training on prize design and operations.
- PRACTITIONER TOOLKIT (December 2016): In December 2016, building on the growing body of federal knowledge on prizes, OSTP and GSA, with the federal Community of Practice on Prizes and Challenges, issued a robust practitioner’s toolkit on Challenge.gov with extensive how-to guidance and practical case studies. The toolkit was developed by an interagency team using insights drawn from experts across federal agencies.
- Maintaining Momentum in New Presidential Administrations
Support for federal use of incentive prizes continued beyond the Obama-Biden Administration’s foundational efforts. Leadership by federal agency prize leads was particularly important in sustaining this momentum from administration to administration. Actions during the Trump-Pence and Biden-Harris Administrations to support federal use of incentive prizes include:
- INTEGRATION INTO PRESIDENTIAL TRANSITION PLANNING (June 2016 – January 2017): As the end of the Obama-Biden Administration neared, OSTP worked with GSA and federal agency prize leads to prepare for the upcoming presidential transition and ensure the agency leads felt prepared, empowered, and supported with agency-level policies and processes so they could continue to design and launch prize competitions as part of their regular course of business. OSTP also integrated incentive prizes into its transition communications as a recommendation for the Trump-Pence Administration to continue. For example, in its list of 100 examples of the Obama-Biden Administration putting science in its rightful place, issued in June 2016, OSTP included the following:
Harnessed American ingenuity through increased use of incentive prizes. Since 2010, more than 80 Federal agencies have engaged 250,000 Americans through more than 700 challenges on Challenge.gov to address tough problems ranging from fighting Ebola, to decreasing the cost of solar energy, to blocking illegal robocalls. These competitions have made more than $220 million available to entrepreneurs and innovators and have led to the formation of over 275 startup companies with over $70 million in follow-on funding, creating over 1,000 new jobs.
In addition, in January 2017, the Obama-Biden Administration OSTP mentioned the use of incentive prizes in its public “exit memo” as a key “pay-for-performance” method in agency science and technology strategies that “can deliver better results at lower cost for the American people,” and also noted:
Harnessing the ingenuity of citizen solvers and citizen scientists. The Obama Administration has harnessed American ingenuity, driven local innovation, and engaged citizen solvers in communities across the Nation by increasing the use of open-innovation approaches including crowdsourcing, citizen science, and incentive prizes. Following guidance and legislation in 2010, over 700 incentive prize competitions have been featured on Challenge.gov from over 100 Federal agencies, with steady growth every year.
- TECHNICAL ASSISTANCE TO CONGRESS ON UPDATES TO AUTHORITIES (January 2017): On January 6, 2017, the American Innovation and Competitiveness Act (AICA) was signed into law by President Obama [Public Law 114-329]. Reflecting extensive, multi-year staff-level engagement between Congressional staff and experts at the Obama-Biden Administration’s OSTP and OMB, the AICA updated the government-wide authority previously granted to federal agencies by the COMPETES Act. The updates aimed to encourage more ambitious interagency and cross-sector partnerships (and co-funding, with explicit authority to solicit funds for federal prizes from beyond the federal government) in the design and administration of prize competitions, and to eliminate unnecessary administrative burden, among other changes.
- EOP LEADERSHIP ROLES (2017-2021): During the Trump-Pence Administration, support for federal agency use of prizes and challenges continued. In the EOP, Matt Lira, then Special Assistant to the President for Innovation Policy in the White House Office of American Innovation, Michael Kratsios, then Deputy Assistant to the President and Deputy U.S. Chief Technology Officer, and others in OSTP engaged with agencies to maintain momentum and identify new opportunities for the effective application of the COMPETES Act prize authority.
- CONVENING (March 2018): In March 2018, the White House hosted a “Fostering Innovation with Prizes and Challenges” roundtable with then Secretary of Energy Rick Perry and other leaders. The White House confirmed to participants that, “the Trump Administration strongly supports efforts by Federal agencies to host prizes and challenges, particularly those that leverage COMPETES Act authority, to address some of the Nation’s most pressing issues.”
- REPORT TO CONGRESS AND COMMUNITY OF PRACTICE (June 2019): During the Trump-Pence Administration, the federal prizes and challenges community of practice supported by GSA continued as an active network, with an email list of more than 730 current and prospective challenge managers in the federal space. These agency prize practitioners were, and continue to be, essential to forward progress in the use of incentive prizes, in the federal government and beyond. OSTP’s fiscal year 2017-2018 biennial report to Congress on agency use of prizes, issued in June 2019 and the sixth such report, noted that, “monitoring the proliferation of State and local crowdsourcing initiatives, Challenge.gov expanded the email list to State and local government prize practitioners in 2018, inviting exchange and opening avenues for partnership.”
- CONNECTIONS TO NATIONAL PRIORITIES AND LEVERAGING WHITE HOUSE COMMUNICATION CHANNELS (August 2020): As the nation faced the COVID-19 pandemic, and as federal agencies responded to emerging challenges and sought to meet urgent needs during the ongoing public health crisis, they turned to incentive prizes as one tool for connecting with solvers across the country and identifying promising solutions. On August 12, 2020, then Director of OSTP Kelvin Droegemeier issued a memorandum to federal agencies highlighting nine prize competitions launched by agencies related to COVID-19 and calling on agencies to “double-down” on their deployment of prizes to meet the challenges of COVID-19. The memo also noted that OSTP was convening open innovation working groups to support these efforts and planning to host a series of webinars with Challenge.gov. The White House also issued a Fact Sheet communicating agency prize competition and open innovation activities related to COVID-19, with incentive prizes being used to catalyze advances in testing technologies, computational models, mental health services, ventilators, and needs of frontline health care workers.
- CONVENING (May 2022): OSTP and GSA collaborated on hosting an Open Innovation Forum to bring together practitioners of incentive prizes, citizen science, and crowdsourcing from across government and other sectors.
- AGENCY REPORTING TO EOP, REPORT TO CONGRESS, AND CONNECTIONS TO POLICY AGENDAS (May 2022 and April 2024): The Biden-Harris Administration has supported the continued use of incentive prizes by federal agencies as part of its commitment to expanding and improving public engagement in the work of the federal government.
- On May 4, 2022, OSTP released its seventh report to Congress on federal incentive prize competitions (and the second that also included a focus on citizen science and crowdsourcing activities alongside incentive prizes). In releasing the report, OSTP noted in a blog post, “This new report details recent Federal efforts to stimulate innovation and partnership and expand the American public’s participation in science. These developments are aligned with the Biden-Harris Administration’s commitment to advancing equity in the science and technology ecosystem, including OSTP’s Time is Now Initiative, and recently released Equity Action Plan.” This report also reflected a new and more robust survey approach used by OSTP and GSA to collect information from agencies about federal incentive prizes, as well as continued efforts among federal agencies to streamline the use of incentive prizes and reduce or remove barriers.
- On April 16, 2024, OSTP released its eighth report to Congress on Federal incentive prize competitions (and the third that also included a focus on citizen science and crowdsourcing). The report connected the continued growth in the use of incentive prizes by federal agencies to a broader Administration-wide “movement towards improving and expanding participation and engagement in not only government research and development, but also government processes more broadly.”
By the end of fiscal year 2022, federal agencies had hosted over 2,000 prize competitions on Challenge.gov since its launch in 2010. OSTP, GSA, and NASA CoECI had provided training to well over 2,000 federal practitioners during that same period.
Figure: Number of Federal Prize Competitions by Authority, FY14-FY22. Source: Office of Science and Technology Policy, Implementation of Federal Prize and Citizen Science Authority: Fiscal Years 2021-22 (biennial report), April 2024.
Figure: Federal Agency Practices to Support the Use of Prize Competitions. Source: Office of Science and Technology Policy, Implementation of Federal Prize and Citizen Science Authority: Fiscal Years 2019-20 (biennial report), March 2022.
Conclusion
Over the span of a decade, incentive prizes moved from a tool used primarily outside of the federal government to one used commonly across federal agencies, thanks to a concerted, multi-pronged effort led by policy entrepreneurs and incentive prize practitioners in the EOP and across federal agencies, with bipartisan congressional support, spanning several presidential administrations. And yet the work to support the use of prizes by federal agencies is not complete. There remains extensive opportunity to further improve the design, rigor, ambition, and effectiveness of federal prize competitions; to move beyond “ideas challenges” and increase the use of incentive prizes to demonstrate technologies and solutions in testbeds and real-world deployment scenarios; to train additional federal personnel on the use of incentive prizes; to learn from the results of federal incentive prize competitions; and to apply this method to pressing and emerging challenges facing the nation.
In applying these lessons to efforts to expand the use of other promising methods in federal agencies, policy entrepreneurs in center-of-government federal agencies should be strategic in the policy actions they take to encourage and scale method adoption: first seek to understand the adoption maturity of the method (as well as the relevant policy readiness), then undertake interventions appropriate for that stage of adoption. If policy entrepreneurs also address the factors that discourage risk-taking, experimentation, and piloting of new methods, federal agencies will be able to draw on a further-expanded strategic portfolio of methods to catalyze the development, demonstration, and deployment of technology and innovative solutions to meet agency missions, solve long-standing problems, and address grand challenges facing our nation.
Don’t Fight Paper With Paper: How To Build a Great Digital Product With the Change in the Couch Cushions
Barriers abound. If there were a tagline for most people’s experience building tech systems in government, that would be a contender. At FAS, we constantly hear about barriers agencies face in building systems that can help speed permitting review, a challenge that’s more critical than ever as the country builds new infrastructure to move away from a carbon economy. But breaking down barriers isn’t just a challenge in the permitting arena. So today we’re bringing you an instructive and hopefully inspiring story from Andrew Petrisin, Deputy Assistant Secretary for Multimodal Freight at the U.S. Department of Transportation. We hope his success in building a new system to help manage the supply chain crisis provides the insight – and motivation – you need to overcome the barriers you face.
To understand Andrew’s journey, we need to go back to the start of the pandemic. Shelter-in-place orders around the world disrupted global supply chains. Increased demand for many goods could not be met, creating a negative feedback loop that drove up costs and propelled inflation. In June 2021, the Biden Administration announced it would establish a Supply Chain Disruption Task Force to address near-term supply and demand misalignments. Andrew joined the team alongside Port Envoy and former Deputy Secretary of Transportation John Porcari.
Porcari regularly pulled together all the supply chain stakeholders at the Port of LA to build situational awareness. That included the ports, terminal operators, railroads, ocean carriers, trucking associations, and labor. These meetings, happening three times each week during the height of the crisis, allowed stakeholders to share data and talk through challenges from different perspectives. Before the supply-chain crisis, meetings with all of the key players – in what Petrisin calls a “wildly interdependent system” – were rare. Now, railroads and truckers had better awareness of dwell times at the port (i.e., how long a container sits at the terminal). Ocean carriers and ports now had greater understanding of what might be causing delays inland.
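For readers who want the dwell-time idea made concrete, here is a minimal sketch in Python of how such a metric might be computed from container timestamps. Everything in it is hypothetical: the container IDs, timestamps, and function name are invented for illustration and are not FLOW's data or code.

```python
# Purely illustrative: dwell time is how long a container sits at the terminal,
# here measured from vessel discharge to gate-out. All values are made up.
from datetime import datetime

container_events = [
    # (container_id, discharged_from_vessel, gated_out_of_terminal)
    ("HYPO1234567", datetime(2021, 10, 1, 8, 0), datetime(2021, 10, 7, 14, 30)),
    ("HYPO7654321", datetime(2021, 10, 2, 6, 15), datetime(2021, 10, 4, 9, 45)),
]

def avg_dwell_days(events):
    """Average number of days between discharge and gate-out."""
    dwell = [(out - discharged).total_seconds() / 86400
             for _, discharged, out in events]
    return sum(dwell) / len(dwell)

print(f"Average dwell time: {avg_dwell_days(container_events):.1f} days")
```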
Going with the FLOW
These meetings were helpful, but to better see around corners, the effort needed to evolve into something more sophisticated. “The irony is that the problem was staring us right in the face,” Andrew told us, “but at the time we really had limited options to proactively fix it.” The meetings were building new relationships and strengthening existing ones, but there was a clear need for more than what had, thus far, consisted mostly of exchanges of slide decks. This prompted Petrisin to start asking some new questions: “How could we provide more value? What would make the data more actionable for each of you?” And critically, “Who would each of you trust with the data needed to make something valuable for everyone?” This was the genesis of FLOW (Freight Logistics Optimization Works).
Looking back, it might be easy to see a path from static data in slide decks shared during big conference calls to a functional data system empowering all the actors to move more quickly. But that outcome was far from certain. To start, the DOT is rarely a direct service provider. There was little precedent for the agency taking on such a role. The stakeholders Andrew was dealing with saw the Department as either a regulator or a grantmaker, both roles with inherent power dynamics. Under normal circumstances, if the Department asked a company for data, the purpose was to evaluate them to inform either a regulatory or grantmaking decision. That makes handing over data to the Department something private companies do with great caution and often trepidation. In fact, one company told Andrew “we’ve never shared data with the federal government that didn’t come back to bite us.” Yet to provide the service Andrew was envisioning, the stakeholders would need to willingly share their data on an ongoing, near real-time basis. They would need to see DOT in a whole new light, and a whole new role. DOT would need to see itself in a new light as well.
Oh, This is Different: Value to the Ecosystem
Companies had no obligation to give DOT this data, and until now, had no real reason to do so. In fact, other parts of government had asked for it before, and been turned down. But companies did share the data with Andrew’s team, at least enough of them to get started. Part of what Andrew thinks was different this time was that DOT wasn’t collecting this data primarily for its own use. “Oh, this is very different,” one of his colleagues said. “You are collecting data for other people to use.” The goal in this case was not a decision about funding or rules, but rather the creation of value back to the ecosystem of companies whose data Andrew’s system was ingesting.
To create that value, Andrew could not rely on a process that presumed to know up front what would work. Instead, he, his team, and the companies would need to learn along the way. And they would need to learn and adjust together. Instead of passive customers, Andrew’s team needed active participants who would engage in tight ‘build-measure-learn’ cycles – partners who would immediately try out new functionality and not only provide candid, quick feedback, but also share how they were using the data provided to improve their operations. “I was very clear with the companies: I need you guys to be candid. I need to know if this is working for you or not; if it’s valuable for you or not. I don’t need advice, I need active participation,” Petrisin says.
This is an important point. Too often, leaders of tech projects misunderstand the principle of listening to users as soliciting advice and opinions from a range of stakeholders and trying to find an average or mid-point from among them. What product managers should be trying to surface are needs, not opinions. Opinions get you what people think they want. “If I’d asked them what they wanted, they would have said faster horses,” Henry Ford is wrongly credited with saying. It’s the job of the digital team to uncover and prioritize needs, find ways to meet those needs, serve them back to the stakeholders, learn what works, adjust as necessary, and continue that cycle. The FLOW team did this again and again.
Building Trust through Partnership
That said, many of the features of FLOW exist because of ideas from users/companies that the team realized would create value for a larger set of stakeholders. “People sometimes ask, ‘How’d you get the shippers to give you purchase order data?’ The truth is, it was their idea,” Petrisin says. But this brings us back to the importance of an iterative process that doesn’t presume to know what will work up front. If the FLOW team had asked shippers to give them purchase order data in a planning stage, the answer most certainly would have been no, in part because the team hadn’t built the necessary trust yet, and in part because the shippers could not yet imagine how they would use this tool and how valuable it would be to them.
Co-creation with users relies on a foundation of trust and demonstrable value, which is built over time. It’s very hard to build that foundation through a traditional requirements-heavy, up-front planning process, which is assumed to be the norm in government. An iterative – and more nimble – process matters. One industry partner told Petrisin, “Usually the government comes in and tells us how to do our jobs, but that isn’t what you did. It was a partnership.”
One way that iterative, collaborative process manifested for the FLOW team was regular document review with the participating companies. “Each week we’d send them a written proposal on something like how we’re defining demand side data elements, for example,” Petrisin told us. “It would essentially say, ‘This is what we heard, this is what we think makes sense based on what we’re hearing from you. Do you agree?’ And people would review it and comment every week. Over time, you build that culture, you show progress, and you build trust.”
Petrisin’s team knew you can’t cultivate this kind of rapid, collaborative learning environment at scale from day one. So he started small, with a group of companies that were representative of the larger ecosystem. “So we got five shippers, two ocean carriers, three ports, two terminals, two chassis companies, three third-party logistics firms, a trucking company, and a warehouse company,” he told us, trying to keep the total number under 20. “Because when you get above 20, it becomes hard to have a real conversation.” In these early stages, quality mattered more than quantity, and the quality of the learning was directly tied to the ability to be in constant, frank communication about what was working, what wasn’t, and where value was emerging.
Byrne’s Law states that you can get 85% of the features of most software for 10% of the price. You just need to choose the right priorities. This is true not only for features, but for data, too. The FLOW team could have specified that a system like this could only succeed if it had access to all of the relevant data, and very often thorough requirements-gathering processes reinforce this thinking. But even with only 20 companies participating early on, FLOW ensured those companies were representative of the industry. This enabled real insights from very early in the process. Completeness and thoroughness, so often prized in government software practices, are often neither practical nor desirable.
Small Wins Yield Large Returns
Starting small can be hard for government tech programs. It’s uncomfortable internally because it feels counter to the principle that government serves everyone equally; external stakeholders can complain if a competitor or partner is in the program and they’re not. (Such complaints can be a good problem to have; it can mean that the ecosystem sees value in what’s being built.) But too often, technology projects are built with the intent of serving everyone from day one, only to find that they meet a large, pre-defined set of requirements but don’t actually serve the needs of real users, and adoption is weak. Petrisin didn’t enjoy having to explain to companies why they couldn’t be in the initial cohort, but he stuck to his guns. The discipline paid off. “Some of my favorite calls were to go back to those companies a few months later and say, ‘We’re ready! We’re ready to expand, and we can onboard you.’” He knew he was onboarding them to a better project because his team had made the hard choices they needed to make earlier.
Starting small can ironically position products to grow fast, and when they do, strategies must change. Petrisin says his team really felt that. “I’ve gone from zero to one on a bunch of different things before this, but never really past the teens [in terms of team size], so to speak,” he says. “And now we’re approaching something like 100. So a lot of the last fiscal year for me was learning how to scale.” Learning how to scale a model of collaborative and shared governance was challenging.
Petrisin had to maintain FLOW’s commitment to serving the needs of the broader public, while also being pragmatic about who DOT could serve with given resources and continuing to run tight build-measure-learn cycles. Achieving consensus or even directional agreement during a live conversation with 20 stakeholders is one thing, but it’s much harder, and possibly counterproductive, with 60 or 100. So instead of changing the group of 20, which provided crucial feedback and served as a decision-making body, Andrew developed a second point of engagement: a biweekly meeting open to everyone where the FLOW team shared progress against the product roadmap, providing transparency and another opportunity to build trust by communicating feature delivery.
Fighting Trade-off Denial
One thing that didn’t change as the project scaled up was the team’s commitment to realistic and transparent prioritization. “We have to be very honest with ourselves that we can’t do everything,” Petrisin tells us. “But we can figure out what we want to do and transparently communicate that to the industry. They [the industry members] run teams. They manage P&Ls [profit and loss statements]. They understand what it is to make trade-offs with a given budget.” There was a lot of concern about not serving all potential supply chain partners, but Petrisin fought that “trade-off denial.” At that point, his team could either serve a smaller group well or serve everyone poorly. Establishing the need for prioritization early allowed for an incremental and iterative approach to development.
What drove that prioritization was not the number of features, lines of code, or fidelity to a predetermined plan, but demonstrable value to the ecosystem, and to the public. “Importers are working to better forecast congestion, which improves their ability to smooth out their kind of warehouse deliveries. Ocean carriers are working to forecast their bookings using the purchase order data. The chassis providers have correlated the FLOW demand data to their chassis utilization.” These are all qualitative outcomes, highly valuable but ones that could not necessarily have been predicted. There are quantitative measures too. FLOW aims to reduce operational variance, smoothing out the spikes in the supply chain because the actors can better manage changing demands. That means a healthier economy, and it means Americans are more likely to have access to the goods they need and want.
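The article does not spell out how FLOW measures "operational variance," so, as a rough illustration only, here is one simple way such a metric could be framed: the variance of weekly container volumes, where a lower number means a smoother, more predictable flow of goods. The numbers and names below are hypothetical, not FLOW data or methodology.

```python
# Illustrative only: treat "operational variance" as the sample variance of
# weekly container volumes (thousands of TEUs). Lower variance = smoother flow.
import statistics

weekly_volumes_spiky  = [92, 140, 75, 160, 88, 150, 70, 155]      # boom-and-bust demand
weekly_volumes_smooth = [110, 118, 104, 121, 109, 117, 102, 119]  # better-managed flow

def operational_variance(volumes):
    """Sample variance of weekly volumes."""
    return statistics.variance(volumes)

print(f"Spiky weeks:  variance = {operational_variance(weekly_volumes_spiky):.0f}")
print(f"Smooth weeks: variance = {operational_variance(weekly_volumes_smooth):.0f}")
```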
Major Successes Don’t Have to Start With Major Funding
What did the FLOW team have that put them in a position to succeed that other government software products don’t have? Given that FLOW was born out of the pandemic crisis, you might guess that it had a big budget, a big team, and a brand-name vendor. It had none of those. The initial funding was “what we could find in the couch cushions, so to speak,” says Andrew. More funding came as FLOW grew, but at that point there was already a working product to point to, and its real-world value was already established, both inside DOT and with industry. What did the procurement look like and how did they choose a vendor? They didn’t. So far, FLOW has been built entirely by developers led by DOT’s Bureau of Transportation Statistics, and the FLOW team continues to hire in-house. Having the team in-house and not having to negotiate change orders has made those build-measure-learn cycles a lot tighter.
FLOW did have great executive support, all the way up to the office of Transportation Secretary Pete Buttigieg, who understood the critical need for better digital infrastructure for the global supply chain. It’s unfortunately not as common as it should be for leadership to be involved and back development the way DOT’s top brass showed up for Petrisin and his team. But the big difference was just the team’s approach. “The problem already existed, the data already existed, the data privacy authorities already existed, the people already existed,” he told us. “What we did was put those pieces together in a different way. We changed processes and culture, but otherwise, the tools were already there.”
FLOW into the Future
FLOW is still an early stage product. There’s a lot ahead for it, including new companies to onboard that bring more and more kinds of data, more features, and more insights that will allow for a more resilient supply chain in the US. When Andrew thinks about where FLOW is going, he thinks about his role in its sustainability. “My job is to get the people, processes, purpose, and culture in place. So I’ve spent a lot of time on making sure we have a really great team who are ready to continue to move this forward, who have the relationships with industry. It’s not just my vision. It’s our vision.” He also thinks about inevitability, or at least the perception of it. “Five years from now we should look back and think, why did we not do this before? Or why would we ever have not done it? It’s digital public infrastructure we need. This is a role government should play, and I hope in the future people think it’s crazy that anyone would have thought government can’t or shouldn’t do things like this.”
Just this spring, when Baltimore’s Francis Scott Key Bridge collapsed, FLOW allowed stakeholders to monitor volume changes and better understand the impact of cargo rerouting to other ports. Following the collapse, Ports America Chesapeake, the container terminal operator at the Port of Baltimore, committed to joining FLOW for the supply chain resiliency benefits to the Baltimore region. But FLOW’s success was not inevitable. Anyone who’s worked on government tech projects can rattle off a dozen ways a project like this could have failed. And outside of government, there’s a lot of skepticism, too. Petrisin remembers talking to the staff of one of the companies recently, who admitted that when they’d heard about the project, they just assumed it wouldn’t go anywhere and ignored it. He admits that’s a fair response. Another one, though, told him that he’d realized early on that the FLOW team wasn’t going to try to “press release” their way to success, but rather prove their value. The first company has since come back and told him, “Okay, now that everyone’s in it, we’re at a disadvantage to not be in it. When can we onboard?”
“You can’t fight paper with paper.” Ultimately, this sentiment sums up the approach Petrisin and his team took, and why FLOW has been such a success with such limited resources. It reminds us of the giant poster that Mike Bracken, who founded the Government Digital Service in the UK and inspired the creation of such offices as the US Digital Service, used to have on the wall behind his desk. In huge letters, it said “SHOW THE THING.” There’ll never be a shortage of demands for paperwork, for justification, for risk mitigation, for compliance in the game of building technology in government. These “government needs” (as Mike would call them) can eat up all your team has to give, leaving little room for meeting user needs. But momentum comes from tangible progress: working software that provides your stakeholders immediate value and that they can imagine getting more and more from over time. That’s what the FLOW team delivered, and continues to. They fought paper with value.
If Andrew and the FLOW team at DOT can do it, you can too. Where do you see successes like this in your work? What’s holding you back and what help do you need to overcome these barriers? Share on LinkedIn.
Lessons
Use a precipitating event to change practices. The supply chain crisis gave the team both the excuse to build this infrastructure that now seems indispensable, and the room to operate a little differently in how they built it. Now the struggle will be to sustain these different practices when the crisis is perceived to be over.
Make data function as a compass, not a grade. Government typically uses data for purposes of after-the-fact evaluation, which can make internal and external actors wary of sharing it. When data can instead inform actions closer to real time, its value to all parties becomes evident. As Jennifer says in her book Recoding America, make data a compass, not a grade.
Build trust. Don’t try to “press release your way to success.” Actively listen to your users and be responsive to their concerns. Use that insight to show your users real value and progress frequently, and they’ll give you more of their time and attention.
Trust allows you to co-create with your users. Many of FLOW’s features came from the companies who use it, who offered up the relevant data to enable valuable functionality.
But understand your users’ needs, don’t solicit advice. The FLOW roadmap was shaped by understanding what was working for its stakeholders and what wasn’t, by what features were actually used and how. The behavior and actions of the user community are better signals than people’s opinions.
Start small. A traditional requirements-heavy up front planning process asks you to know everything the final product will do when you start. The FLOW team started with the most basic needs, built that, observed how the companies used it, and added from there. This enables them to learn along the way and build a product with greater value at lower cost.
Be prepared to change practices as you scale. How you handle stakeholders early on won’t work as you scale from a handful to hundreds. Adapt processes to the needs of the moment.
Fight trade-off denial. There will always be stakeholders, internal and external, who want more than the team can provide. Failing to prioritize and make clear decisions about trade-offs benefits no one in the end.
Don’t just reduce burden – provide value. There’s been a huge focus on reduction of burden on outside actors (like the companies involved in FLOW) over the past years. While it’s important to respect their time, FLOW’s success shows that companies will willingly engage deeply, offering more and more of their time, when there’s a real benefit to them. Watch your ratio of burden to value.
Measure success by use and value to users. Too many software projects define success as on-time and on-budget. Products like FLOW define success by the value they create for their users, as measured by quantitative measures of use and qualitative use cases.
Fund products not projects. FLOW started with a small internal team trying to build an early prototype, not a lengthy requirements-building stage. It was funded with “what we could find in the couch cushions” followed by modest dedicated allocations, and continues to grow modestly. This matches Jennifer’s model of product funding, as distinct from project funding.
Improving Government Capacity: Unleashing the capacity, creativity, energy, and determination of the public sector workforce
Peter Bonner is a Senior Fellow at FAS.
Katie: Peter, first, can you explain what government capacity means to you?
Peter: What government capacity means to me is ensuring that the people in the public sector, the federal government primarily, have the skills, the tools, the technologies, the relationships, and the talent they need to do their jobs and meet their agency missions.
Those agency missions are really quite profound. I think we lose sight of this: if you’re working at the EPA, your job is to protect human health and the environment. If you’re working at the Department of the Interior, it’s to conserve and protect our natural resources and cultural heritage for the benefit of the public. If you’re working for HHS, you’re enhancing the health and well-being of all Americans. If you’re working for the Department of Transportation, you’re ensuring a safe and efficient transportation system. And you can get into the national security agencies, which protect us from our enemies, foreign and domestic. These missions are amazing. Building that capacity so that people can do their jobs better and more effectively is a critical and noble undertaking. Government employees are stewards of what we hold in common as a people. To me, that’s what government capacity is about.
Mr. Bonner’s Experience and Ambitions at FAS
You’ve had a long career in government – but how is it that you’ve come to focus on this particular issue as something that could make a big difference?
I’ve spent a couple of decades building government capacity in different organizations and roles, most recently as a government executive and political appointee as an associate director at the Office of Personnel Management. Years ago I worked as a contractor with a number of different companies, building human capital and strategic consulting practices. In all of those roles, in one way or another, it’s been about building government capacity.
One of my first assignments when I worked as a contractor was on the Energy Star program, helping to bridge the gap between public sector interests – wanting to create greater energy efficiency and reduce energy usage to address climate change – and private sector interests – making sure their products were competitive and using market forces to demonstrate the effectiveness of federal policy. This work promoted energy efficiency across energy production, computers, refrigerators, HVAC equipment, even commercial buildings and residential housing. Part of the capacity building piece was working with the federal staff and the federal clients who ran those programs, but also making sure they had the right collaboration skills to work effectively with the private sector around these programs and with other federal agencies. Agencies not only needed to work collaboratively with the private sector, but across agencies as well. Those collaboration skills – the skills to make sure they’re working jointly, inter-agency – don’t always come naturally, because people feel protective about their own agency, their own budgets, and their own missions. So that’s an example of building capacity.
Another project I was involved in early on was helping to develop a training program for inspectors of underground storage tanks. That’s pretty obscure, but underground storage tanks have been a real challenge for the nation as a source of groundwater pollution. We developed an online course using simulations on how to detect leaks in underground storage tanks. The capacity building piece was getting the agencies and tank inspectors at the state and local level to use this new learning technology to make their jobs easier and more effective.
Capacity building examples abound – helping OPM build human capital frameworks and improve operating processes, improving agency performance management systems, enhancing the skills of Air Force medical personnel to deal with battlefield injuries, and so on. I’ve been doing capacity building through HR transformation, learning, leadership development, strategy and facilitation, and human-centered design, and by looking at how you develop HR and human capital systems that support capacity building in the agencies. So across my career, those are the kinds of things that I’ve been involved in around government capacity.
What brought you to FAS, and what are you doing now?
I left my job as the associate director for HR Solutions at the Office of Personnel Management last May with the intent of finding ways to continue to contribute to the effective functioning of the federal government. This opportunity came about from a number of folks I’d worked with while at OPM and elsewhere.
FAS is in a unique position to change the game in federal capacity building through thought leadership, policy development, strategic placement of temporary talent, and initiatives to bring more science and technical professionals to lead federal programs.
I’m really trying to help change the game in talent acquisition and talent management and how they contribute to government capacity. That ranges from upfront hiring in the HR arena through to onboarding and performance management and into program performance.
I think what I’m driven by at FAS is to really unleash the capacity, the creativity, the energy, the determination of the public sector workforce to do their jobs as efficiently and effectively as they know how. There are so many people I know in the federal government who have great ideas for improving their programs sitting in the bottom left-hand drawer of their desk or on their computer desktop – ideas they can never get around to because of everything else that gets in the way.
There are ways to cut through the clutter to help make hiring and talent management effective. Just in hiring: creative recruiting and sourcing for science and technical talent, using the hiring flexibilities and hiring authorities already on hand, equipping HR staffing specialists and hiring managers with the tools they need, working across agencies on common positions, and accelerating background checks are all ways to speed up the hiring process and improve hiring quality.
It’s the stuff that gets in the way that inhibits their ability to do these things. So that unleashing piece is the real reason I’m here. On the talent management side, the goal is to move the needle a little bit on the perception of public sector work and federal government work, because the negative perception of what it’s like to work in the federal government, and the distrust in the federal government, is just enormous. The barriers there are profound. But if we can move the needle on that just a little bit, and if we can change the candidate experience so that applying for a federal job, while it may be arduous, results in a positive experience for the candidate, the hiring manager, and the HR staffing specialist, that then becomes the seed bed for a positive employee experience in the federal job. That then becomes the seed bed for an effective customer experience, because the linkage between employee experience and customer experience is direct. So if we can shift the needle on those things just a little bit, we start to change the perception of what public sector work is like and tap into the energy that brought people to a public sector job in the first place, which by and large is the mission of the agency.
Using Emerging Technologies to Improve Government Capacity
How do you see emerging technologies assisting or helping that mission?
The emerging technologies in talent management are things that other sectors of the economy are working with and that the federal government is quickly catching up on. Everybody thinks the private sector has this figured out. Well, not necessarily. Private sector organizations also struggle with HR systems that effectively map to the employee journey and that provide analytics that can guide HR decision-making along the way.
A bright spot for progress in government capacity is in recruiting and sourcing talent. The Army Corps of Engineers and the Department of Energy are using front-end recruiting software to attract people into their organizations – the Climate Corps, for example, or the Clean Energy Corps at the Department of Energy. They’re using those front-end recruiting systems to attract people to submit their resumes and applications, which can create that positive initial candidate experience, and then take them through the rest of the process. There’s also work being done to automate and develop more effective online assessments through USA Hire, for example, so that if you’re in a particular occupation, you can take an online test when you apply, and that test qualifies you for the certification list for that job.
Those are not emerging technologies, but they are being deployed effectively in government. Mobile platforms for quickly and easily communicating with applicants and candidates at different stages of the process are also coming online, or are already online, in many of the agencies.
In addition to some experimentation with AI tools, I think one of the more profound pieces around technology is what’s happening at the program level: it is changing the nature of the jobs government workers do, which in turn changes the kind of person an HR manager is looking for.
For example, while there are specific occupations focused on machine learning, AI, and data analytics, data literacy and acumen and the use of these tools are going to be part of everyone’s job in the future. So facility with the analytic tools and data visualization tools that are out there is going to have a profound impact on the jobs themselves. Then you back that up to: okay, what kind of person am I looking for here? I need somebody with that skill set coming in, or who can be easily upskilled into it. That’s true for data literacy, data analytics, and some of the AI skill sets that are coming online. It’s not just the technologies within the talent management arena; it’s the technologies on the front lines and in the programs that determine what kind of person I’m looking for and that impact those jobs.
The Significance of Permitting Reform for Hiring
You recently put on a webinar for the Permitting Council. Do you mind explaining what that is and what the goal of the webinar was?
The Permitting Council was created under what’s called FAST-41 – legislation to improve the capacity and the speed at which environmental permits are approved so that we can continue with federal projects. Permitting has become a real hot-button issue right now because the Inflation Reduction Act, the CHIPS and Science Act, and the Bipartisan Infrastructure Law created all of these projects in the field – some on federal lands, some on state and local lands, and some on tribal or private sector lands – that then create the need for an environmental permit of some kind in order to get approval to build.
So under the Bipartisan Infrastructure Law, we’re trying to create internet for all, for example, and particularly provide internet access in rural communities where they haven’t had it before, and to people who perhaps couldn’t afford it. That requires building cell towers and transmission lines on federal lands, and that then requires permits, which in turn require permitting staff or a set of permitting contractors to actually go in and do that work.
Permitting has been, from a talent perspective, under-resourced. Agencies have not had the capacity, they have not had the staff, even to keep up with the permits necessitated by these new pieces of legislation. So getting the right people hired and in place – the environmental scientists, community planners, marine biologists, landscape folks, and the fish and wildlife people who can advise on how best to do those environmental impact statements or categorical exclusions under the National Environmental Policy Act – has been a challenge. Building that capacity in the agencies responsible for permitting is really a high-leverage point for these pieces of legislation, because if I can’t build the cell tower, I then don’t realize the positive results from the Bipartisan Infrastructure Law. And you can think of the range of things that those pieces of legislation have fostered around the country, from clean water systems in underserved communities, to highways, to bridges, to roads, to airports.
Another example is offshore wind. You need marine biologists to help do the environmental impact statements around building wind turbines offshore and to examine the effect on marine habitats. It’s those people that the Department of the Interior, the Department of Energy, and the Department of Commerce need to hire to come in, run those programs, and do those permits effectively. That’s what the Permitting Council does.
One of the things we worked on with OPM and the Permitting Council was creating a webinar that brought the hiring managers and the HR staffing specialists into the room at the same time to talk about the common bottlenecks they face in the hiring process. After doing outreach and research, we created journey maps and a set of personas to identify a couple of the most salient, common, and high-leverage challenges they face.
We looked at the ecosystem within hiring: what gets in the way in recruiting and sourcing, all the way through to onboarding; the position descriptions, and what you do if you don’t have an adequate position description upfront when you’re trying to hire that environmental scientist; and the background check and suitability processes. What do you do when things get caught in that suitability process? If you can’t bring those folks on board in a timely fashion, you risk losing them.
We focused on a couple of those key challenges in the webinar, and we had, I don’t know, 60 or 70 hiring managers and HR staffing specialists there, who took away a set of tools they can use to accelerate and improve the hiring process and get high-quality hires on board quickly to assist with permitting.
The Permitting Council has representatives from each of the agencies that do permitting and works with them on cross-agency activities. The council also has funding from some of these pieces of legislation to foster the permitting process, whether through new software or through people and processes, so that permits get done as quickly as possible. So that’s what the webinar was about. We’re talking about doing a second one to look at the more systemic and policy-related challenges in permitting hiring.
The Day One Project 2025
FAS has launched its Day One Project 2025, a massive call for nonpartisan, science-based policy ideas that a next presidential administration can utilize on “day one” – whether the new administration is Democratic or Republican. One of the areas we’ve chosen to focus on is Government Capacity. Will you be helping evaluate the ideas that are submitted?
I’ve had input into the Day One Project, particularly around the talent pieces in the government capacity initiative, and also around procurement and innovation in that area. I think it has the potential to help set the stage for talent reform more broadly, whether the barriers are legislative, policy, regulatory, or the risk-averse culture we have in the federal government. The impact of the Day One Project could be pretty profound if we get the right energy behind it. One of the things that I’ve known for a while, but that has become clear to me over the past five months working with FAS, is that there are black boxes in the talent management environment in the federal government. What I mean by that is that work goes into a specialized area of expertise, and nobody knows what happens in that specialized area until something pops out the other end.
How do you shed light on the inside of those black boxes so that what happens is more transparent? For instance: position descriptions when agencies are trying to hire someone. Sometimes the job needs to be reclassified because it has changed dramatically from the previous position description. Well, I know a little about classification and what happens in the classification process, but to most people looking in from the outside, including hiring managers, that’s a black box. They don’t know what goes on within that classification process, or whether it’s going to be worthwhile for them once they have the position description at the other end and are able to do an effective job announcement. Shedding light on that, I think, has the potential to increase transparency and trust between the hiring manager and the HR folks, or between the program people and the human capital people.
If we’re able to create that greater transparency; if we’re able to tell candidates when they apply for a job where they are in the hiring process and whether or not they made the cert list; if they are on the cert list, what’s next in terms of their assessment and the process; if they’ve gone through the interview, where we are in the deliberations about offering them the job – and the same thing in suitability. There are black boxes all the way across, and creating transparency and communication around them will go a long way, again, to moving that needle on the perception of what federal work is and what it’s like to work in the system. So it’s a long answer to a question that I guess I can summarize by saying: I think we are in a target-rich environment here. There’s lots of opportunity to help change the game.
Revitalizing Federal Jobs Data: Unleashing the Potential of Emerging Roles
Emerging technologies and creative innovation are pivotal economic pillars for the future of the United States. These sectors not only promise economic growth but also offer avenues for social inclusion and environmental sustainability. However, the federal government lacks reliable and comprehensive data on these sectors, which hampers its ability to design and implement effective policies and programs. A key reason for this data gap is the outdated and inadequate job categories and classifications used by the Bureau of Labor Statistics (BLS).
The BLS is the main source of official statistics on employment, wages, and occupations in the U.S. Part of the agency’s role is to categorize different industries, which helps states, researchers and other outside parties measure and understand the size of certain industries or segments of the economy. Another BLS purpose is to use the Standard Occupational Classification (SOC) system to categorize and define jobs based on their duties, skills, and education requirements. This is how all federal workers and contracted federal workers are classified. For an agency to create and fill a role, it needs a classification or SOC. State and private employers also use the classifications and data to allocate funding and determine benefits related to different kinds of positions.
Where no classification (SOC) or job exists, it is unclear whether hiring and contracting happen according to programmatic intent and in a timely manner. This is particularly concerning to some employers and federal agencies that need to align numerous jobs with the provisions of Justice40, the Inflation Reduction Act, and the newly created American Climate Corps. Many of the roles imagined by the American Climate Corps do not have classifications. This poses a significant barrier to effective program and policy design related to green and tech jobs.
The SOC system is updated roughly once every 10 years; there is no set comprehensive review schedule for it or for the industry categories. Updates are topical, with the last broad revision taking place in 2018. Unemployment reports and wage data are updated annually, and other topics less predictably. Work on the SOC system and categories for what are broadly defined as “green jobs” stopped in 2013 due to sequestration. This means that BLS data may not capture the current and future trends and dynamics of the green and innovation economies, which are constantly evolving and growing.
Because the BLS does not have a separate category for green jobs, it identifies them based on a variety of industry and occupation codes, ranging from restaurant-industry SOCs to construction. Classifying positions this way cannot reflect the cross-cutting and interdisciplinary nature of green jobs. Moreover, the process may not account for their variations and nuances, such as environmental impact, social value, and skill level. For example, if you want to work with solar panels, there is a construction classification, but nothing for community design, specialized finance, or any of the complementary job typologies needed for projects at scale.
Similarly, the BLS does not have a separate category for tech jobs. It identifies them based on the “Information and Communication Technologies” occupational groups of the SOC system. Again, this approach may not adequately reflect the diversity and complexity of tech jobs, which may involve new and emerging skills and technologies that are not yet recognized by the BLS. There are no classifications for roles associated with machine learning or artificial intelligence. Where the private sector has a much-discussed large language model trainer role, the federal system has no such classification. Without such classifications, appropriate skills matching and resource allocation become difficult, and the numbers and economic impacts of these jobs will be difficult, if not impossible, to measure. Classifying tech jobs in this manner also fails to account for the interplay and integration of tech jobs with other sectors, such as health care, education, and manufacturing.
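To make the gap concrete, the sketch below shows, in Python, how an SOC-style lookup fails for emerging roles. The codes and groupings are illustrative placeholders rather than official BLS entries; the point is the failure mode described above, in which new green and tech roles have nothing to map to.

```python
# Illustrative sketch only: these codes and groupings are placeholders,
# not official BLS SOC entries.
SOC_LOOKUP = {
    "solar photovoltaic installer": "47-XXXX (construction and extraction occupations)",
    "software developer": "15-XXXX (computer and mathematical occupations)",
}

def classify(job_title: str) -> str:
    """Return the SOC-style code for a job title, or flag it as unclassified."""
    return SOC_LOOKUP.get(job_title.lower(), "UNCLASSIFIED: no occupational code exists")

for title in [
    "Solar Photovoltaic Installer",       # maps to an existing construction code
    "Large Language Model Trainer",       # emerging tech role: no code to map to
    "Community Solar Finance Specialist", # emerging green role: no code to map to
]:
    print(f"{title}: {classify(title)}")
```

Because the unclassified roles fall through the lookup, any statistics built on top of such a system simply never see them, which is the measurement gap at issue.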
These data limitations have serious implications for policy design and evaluation. Without accurate and timely data on green and tech jobs, the federal government may not be able to assess the demand and supply of these jobs, identify skill gaps and training needs, allocate resources, and measure the outcomes and impacts of its policies and programs. This will result in missed opportunities, wasted resources, and suboptimal outcomes.
There is a need to update the BLS job categories and classifications to better reflect the realities and potentials of the green and innovation economies. This can be achieved by implementing the following strategic policy measures:
- Inter-Agency Collaboration: Establish an inter-agency task force, including representatives from the BLS, Department of Energy (DOE), Environmental Protection Agency (EPA), Department of Education (ED), and the Department of Commerce (DOC), to review and update the current job categories and classifications. This task force would be responsible for ensuring that the classifications accurately reflect the evolving nature of jobs in the green and innovation economies.
- Public-Private Partnerships: Engage in public-private partnerships with industry leaders, academic institutions, and non-profit organizations. These partnerships can provide valuable insights into the changing job landscape and help inform the update of job categories and classifications. They can also facilitate the dissemination and adoption of the updated classifications among employers and workers, as well as the development and delivery of training and education programs related to green and tech jobs.
- Stakeholder Engagement: Conduct regular consultations with stakeholders, including educational institutions, employers, workers, and unions in the green and innovation economies. Their input can ensure that the updated classifications accurately represent the realities and challenges of the job market. They can also provide feedback and suggestions on how to improve the quality and accessibility of the BLS data.
- Regular Updates: Implement a policy for regular reviews and updates of job categories and classifications. The policy should also specify the frequency and criteria for the updates, as well as the roles and responsibilities of the involved agencies and partners.
By updating the BLS job categories and classifications, the federal government can ensure that its data and statistics accurately reflect the current and future job market, laying the foundation for effective policy design and evaluation related to green and tech jobs. This commitment can contribute to the development of a workforce that not only meets economic needs but also aligns with the nation’s environmental aspirations.
FAS Senior Fellow Jen Pahlka testifies on Using AI to Improve Government Services
Jennifer Pahlka (@pahlkadot) is a FAS Senior Fellow and the author of Recoding America: Why Government is Failing in the Digital Age and How We Can Do Better. Here is Pahlka’s testimony about artificial intelligence, presented on January 10, 2024, to the full Senate Committee on Homeland Security and Governmental Affairs hearing on “Harnessing AI to Improve Government Services and Customer Experience”.
How the U.S. government chooses to respond to the changes AI brings is indeed critical, especially in its use to improve government services and customer experience. If the change is going to be for the better (and we can’t afford otherwise) it will not be primarily because of how much or how little we constrain AI’s use. Constraints are an important conversation, and AI safety experts are better suited to discuss these than me. But we could constrain agencies significantly and still get exactly the bad outcomes that those arguing for risk mitigation want to avoid. We could instead direct agencies to dive headlong into AI solutions, and still fail to get the benefit that the optimists expect. The difference will come down to how much or how little capacity and competency we have to deploy these technologies thoughtfully.
There are really two ways to build capacity: having more of the right people doing the right things (including but not limited to leveraging technology like AI) and safely reducing the burdens we place on those people. AI, of course, could help reduce those burdens, but not without the workforce we need – one that understands the systems we have today, the policy goals we have set, and the technology we are bringing to bear to achieve those goals. Our biggest priority as a government should be building that capacity, working both sides of that equation (more people, less burden.)
Building that capacity will require bodies like the US Senate to use a wide range of the tools at its disposal to shape our future, and use them in a specific way. Those tools can be used to create mandates and controls on the institutions that deliver for the American people, adding more rules and processes for administrative agencies and others to comply with. Or they can be used to enable these institutions to develop the capacity they so desperately need and to use their judgment in the service of agreed-upon goals, often by asking what mandates and controls might be removed, rather than added. This critical AI moment calls for enablement.
The recent executive order on AI already provides some new controls and safeguards. The order strikes a reasonable balance between encouragement and caution, but I worry that some of its guidance will be applied inappropriately. For example, some government agencies have long been using AI for day to day functions like handwriting recognition on envelopes or improved search to retrieve evidence more easily, and agencies may now subject these benign, low-risk uses to red tape based on the order. Caution is merited in some places, and dangerous in others, where we risk moving backwards, not forward. What we need to navigate these frameworks of safeguard and control are people in agencies who can tell the difference, and who have the authority to act accordingly.
Moreover, in many areas of government service delivery, the status quo is frankly not worth protecting. We understandably want to make sure, for instance, that applicants for government benefits aren’t unfairly denied because of bias in algorithms. The reality is that, to take just one benefit, one in six determinations of eligibility for SNAP is substantively incorrect today. If you count procedural errors, the rate is 44%. Worse are the applications and adjudications that haven’t been decided at all, the ones sitting in backlogs, causing enormous distress to the public and wasting taxpayer dollars. Poor application of AI in these contexts could indeed make a bad situation worse, but for people who are fed up and just want someone to get back to them about their tax return, their unemployment insurance check, or even their company’s permit to build infrastructure, something has to change. We may be able to make progress by applying AI, but not if we double down on the remedies that failed in the Internet Age and hope they somehow work in the age of AI. We must finally commit to the hard work of building digital capacity.
Applying ARPA-I: A Proven Model for Transportation Infrastructure
Executive Summary
In November 2021, Congress passed the Infrastructure Investment and Jobs Act (IIJA), which included $550 billion in new funding for dozens of new programs across the U.S. Department of Transportation (USDOT). Alongside historic investments in America’s roads and bridges, the bill created the Advanced Research Projects Agency-Infrastructure (ARPA-I). Building on successful models like the Defense Advanced Research Projects Agency (DARPA) and the Advanced Research Projects Agency-Energy (ARPA-E), ARPA-I’s mission is to bring the nation’s most innovative technology solutions to bear on our most significant transportation infrastructure challenges.
ARPA-I must navigate America’s uniquely complex infrastructure landscape, characterized by limited federal research and development funding compared to other sectors, public sector ownership and stewardship, and highly fragmented and often overlapping ownership structures that include cities, counties, states, federal agencies, the private sector, and quasi-public agencies. Moreover, the new agency needs to integrate the strong culture, structures, and rigorous ideation process that ARPAs across government have honed since the 1950s. This report is a primer on how ARPA-I, and its stakeholders, can leverage this unique opportunity to drive real, sustainable, and lasting change in America’s transportation infrastructure.
How to Use This Report
This report highlights the opportunity ARPA-I presents; orients those unfamiliar with the transportation infrastructure sector to the unique challenges it faces; provides a foundational understanding of the ARPA model and its early-stage program design; and empowers experts and stakeholders to get involved in program ideation. However, individual sections can be used as standalone tools depending on the reader’s prior knowledge of and intended involvement with ARPA-I.
- If you are unfamiliar with the background, authorization, and mission of ARPA-I, refer to the section “An Opportunity for Transportation Infrastructure Innovation.”
- If you are relatively new to the transportation infrastructure sector, refer to the section “Unique Challenges of the Transportation Infrastructure Landscape.”
- If you have prior transportation infrastructure experience or expertise but are new to the ARPA model, you can move directly to the sections beginning with “Core Tenets of ARPA Success.”
An Opportunity for Transportation Infrastructure Innovation
In November 2021, Congress passed the Infrastructure Investment and Jobs Act (IIJA) authorizing the U.S. Department of Transportation (USDOT) to create the Advanced Research Projects Agency-Infrastructure (ARPA-I), among other new programs. ARPA-I’s mission is to advance U.S. transportation infrastructure by developing innovative science and technology solutions that:
- lower the long-term cost of infrastructure development, including costs of planning, construction, and maintenance;
- reduce the life cycle impacts of transportation infrastructure on the environment, including through the reduction of greenhouse gas emissions;
- contribute significantly to improving the safe, secure, and efficient movement of goods and people; and
- promote the resilience of infrastructure from physical and cyber threats.
ARPA-I will achieve this goal by supporting research projects that:
- advance novel, early-stage research with practicable application to transportation infrastructure;
- translate techniques, processes, and technologies from the conceptual phase to prototype, testing, or demonstration;
- develop advanced manufacturing processes and technologies for the domestic manufacturing of novel transportation-related technologies; and
- accelerate transformational technological advances in areas in which industry entities are unlikely to carry out projects due to technical and financial uncertainty.
ARPA-I is the newest addition to a long line of successful ARPAs that continue to deliver breakthrough innovations across the defense, intelligence, energy, and health sectors. The U.S. Department of Defense established the pioneering Defense Advanced Research Projects Agency (DARPA) in 1958 in response to the Soviet launch of the Sputnik satellite to develop and demonstrate high-risk, high-reward technologies and capabilities to ensure U.S. military technological superiority and confront national security challenges. Throughout the years, DARPA programs have been responsible for significant technological advances with implications beyond defense and national security, such as the early stages of the internet, the creation of the global positioning system (GPS), and the development of mRNA vaccines critical to combating COVID-19.
In light of the many successful advancements seeded through DARPA programs, the government replicated the ARPA model for other critical sectors, resulting in the Intelligence Advanced Research Projects Activity (IARPA) within the Office of the Director of National Intelligence, the Advanced Research Projects Agency-Energy within the Department of Energy, and, most recently, the Advanced Research Projects Agency-Health (ARPA-H) within the Department of Health and Human Services.
Now, there is the opportunity to bring that same spirit of untethered innovation to solve the most pressing transportation infrastructure challenges of our time. The United States has long faced a variety of transportation infrastructure-related challenges, due in part to low levels of federal research and development (R&D) spending in this area; the fragmentation of roles across federal, state, and local government; risk-averse procurement practices; and sluggish commercial markets. These challenges include:
- Roadway safety. According to the National Highway Traffic Safety Administration, an estimated 42,915 people died in motor vehicle crashes in 2021, up 10.5% from 2020.
- Transportation emissions. According to the U.S. Environmental Protection Agency, the transportation sector accounted for 27% of U.S. greenhouse gas (GHG) emissions in 2020, more than any other sector.
- Aging infrastructure and maintenance. According to the 2021 Report Card for America’s Infrastructure produced by the American Society of Civil Engineers, 42% of the nation’s bridges are at least 50 years old and 7.5% are “structurally deficient.”
The Fiscal Year 2023 Omnibus Appropriations Bill awarded ARPA-I its initial appropriation in early 2023. Yet even before that, the Biden-Harris Administration saw the potential for ARPA-I-driven innovations to help meet its goal of net-zero GHG emissions by 2050, as articulated in its Net-Zero Game Changers Initiative. In particular, the Administration identified smart mobility, clean and efficient transportation systems, next-generation infrastructure construction, advanced electricity infrastructure, and clean fuel infrastructure as “net-zero game changers” that ARPA-I could play an outsize role in helping develop.
For ARPA-I programs to reach their full potential, agency stakeholders and partners need to understand not only how to effectively apply the ARPA model but also how the unique circumstances and challenges within transportation infrastructure need to be considered in program design.
Unique Challenges of the Transportation Infrastructure Landscape
Using ARPA-I to advance transportation infrastructure breakthroughs requires an awareness of the most persistent challenges to prioritize and the unique set of circumstances within the sector that can hinder progress if ignored. Below are summaries of key challenges and considerations for ARPA-I to account for, followed by a deeper analysis of each challenge.
- Federal R&D spending on transportation infrastructure is considerably lower than other sectors, such as defense, healthcare, and energy, as evidenced by federal spending as a percentage of that sector’s contribution to gross domestic product (GDP).
- The transportation sector sees significant private R&D investment in vehicle and aircraft equipment but minimal investment in transportation infrastructure because the benefits from those investments are largely public rather than private.
- Market fragmentation within the transportation system is a persistent obstacle to progress, resulting in reliance on commercially available technologies and transportation agencies playing a more passive role in innovative technology development.
- The fragmented market and multimodal nature of the sector pose challenges for allocating R&D investments and identifying customers.
Lower Federal R&D Spending in Transportation Infrastructure
Federal R&D expenditures in transportation infrastructure lag behind those in other sectors. This gap is particularly acute because, unlike in some other sectors, federal transportation R&D expenditures often fund studies and systems used to make regulatory decisions rather than technological innovation. The table below compares actual federal R&D spending and sector expenditures for 2019 across defense, healthcare, energy, and transportation as a percentage of each sector’s GDP. Relative to sector size, the federal government spends far less on transportation than on other sectors: energy R&D spending as a percentage of sector GDP is nearly 15 times higher than transportation’s, health is 13 times higher, and defense is nearly 38 times higher.
Public Sector Dominance Limits Innovation Investment
Since 1990, total investment in U.S. R&D has increased roughly ninefold. Looking at the source of R&D investment over a longer horizon, the private and public sectors invested approximately the same amount of R&D funding in 1982, but today private industry invests nearly four times as much in R&D as the government does.
While there are problems with the bulk of R&D coming from the private sector, such as innovations to promote long-term public goods being overlooked because of more lucrative market incentives, industries that receive considerable private R&D funding still see significant innovation breakthroughs. For example, the medical industry saw $161.8 billion in private R&D funding in 2020 compared to only $61.5 billion from federal funding. More than 75% of this private industry R&D occurred within the biopharmaceutical sector where corporations have profit incentives to be at the cutting edge of advancements in medicine.
The transportation sector has one robust domain for private R&D investment: vehicle and aircraft equipment manufacturing. In 2018, private R&D in this domain totaled $52.6 billion. Private sector transportation R&D focuses on individual customers and end users, creating better vehicles, products, and efficiencies. The vast majority of that private sector R&D does not go toward infrastructure because the benefits are largely public rather than private. Put another way, the United States invests more than 50 times as much R&D in vehicles as in the infrastructure systems on which those vehicles operate.
Market Fragmentation across Levels of Government
Despite opportunities within the public-dominated transportation infrastructure system, market fragmentation is a persistent obstacle to rapid progress. Each level of government has different actors with different objectives and responsibilities. For instance, at the federal level, USDOT provides national-level guidance, policy, and funding for transportation across aviation, highway, rail, transit, ports, and maritime modes. Meanwhile, the states set goals, develop transportation plans and projects, and manage transportation networks like the interstate highway system. Metropolitan planning organizations take on some of the planning functions at the regional level, and local governments often maintain much of their infrastructure. There are also local individual agencies that operate facilities like airports, ports, or tollways organized at the state, regional, or local level. Programs that can use partnerships to cut across this tapestry of systems are essential to driving impact at scale.
Local agencies have limited capabilities and resources to develop cross-sector technologies. They have access to only limited pools of USDOT funding to pilot technologies and thus generally rely on commercially available technologies to increase the likelihood of pilot success. One shortcoming of this current process is that both USDOT and infrastructure owner-operators (IOOs) play a passive role in developing innovative technologies, merely deploying market-ready technologies instead.
Multiple Modes, Customers, and Jurisdictions Create Difficulties in Efficiently Allocating R&D Resources
The transportation infrastructure sector is a multimodal environment split across many modes, including aviation, maritime, pipelines, railroads, roadways (which includes biking and walking), and transit. Each mode includes various customers and stakeholders to be considered. In addition, in the fragmented market landscape federal, state, and local departments of transportation have different—and sometimes competing—priorities and mandates. This dynamic creates difficulties in allocating R&D resources and considering access to innovation across these different modes.
Customer identification is not “one size fits all” across existing ARPAs. For example, DARPA has a laser focus on delivering efficient innovations for one customer: the Department of Defense. For ARPA-E, it is less clear; their customers range from utility companies to homeowners looking to benefit from lower energy costs. ARPA-I would occupy a space in between these two cases, understanding that its end users are IOOs—entities responsible for deploying infrastructure in many cases at the local or regional level.
However, even with this more direct understanding of its customers, a shortcoming of a system focused on multiple modes is that transportation infrastructure is very broad, encompassing everything from self-healing concrete to intersection safety to the deployment of electrified mobility and more. Further complicating matters is the rapid evolution of technologies and expectations across all modes, along with the rollout of entirely new modes of transportation. These developments raise questions about where new technologies and capabilities fit in existing modal frameworks, what actors in the transportation infrastructure market should lead their development, and who the ultimate “customers” or end users of innovation are.
Having a matrixed understanding of the rapid technological evolution across transportation modes and their potential customers is critical to investing in and building infrastructure for the future, given that transportation infrastructure investments not only alter a region’s movement of people and goods but also fundamentally impact its development. ARPA-I is poised to shape learnings across and in partnership with USDOT’s modes and various offices to ensure the development and refinement of underlying technologies and approaches that serve the needs of the entire transportation system and users across all modes.
Core Tenets of ARPA Success
Success using the ARPA model comes from demonstrating new innovative capabilities, building a community of people (an “ecosystem”) to carry the progress forward, and having the support of key decision-makers. Yet the ARPA model can only be successful if its program directors (PDs), fellows, stakeholders, and other partners understand the unique structure and inherent flexibility required when working to create a culture conducive to spurring breakthrough innovations. From a structural and cultural standpoint, the ARPA model is unlike any other agency model within the federal government, including all existing R&D agencies. Partners and other stakeholders should embrace the unique characteristics of an ARPA.
Cultural Components
ARPAs should take risks.
An ARPA portfolio may be the closest thing to a venture capital portfolio in the federal government. ARPAs have a mandate to take big swings, so they should not be limited to projects that seem like safe bets. They will take on many projects throughout their existence, so they should balance quick wins with longer-term bets while embracing failure as a natural part of the process.
ARPAs should constantly evaluate and pivot when necessary.
An ARPA needs to be ruthless in its decision-making process because it has the ability to maneuver and shift without the restriction of initial plans or roadmaps. For example, projects around more nascent technology may require more patience, but if assessments indicate they are not achieving intended outcomes or milestones, PDs should not be afraid to terminate those projects and focus on other new ideas.
ARPAs should stay above the political fray.
ARPAs can consider new and nontraditional ways to fund innovation, and thus should not be caught up in trends within their broader agency. As different administrations onboard, new offices get built and partisan priorities may shift, but ARPAs should limit external influence on their day-to-day operations.
ARPA team members should embrace an entrepreneurial mindset.
PDs, partners, and other team members need to embrace the creative freedom required for success and operate much like entrepreneurs for their programs. Valued traits include a propensity toward action, flexibility, visionary leadership, self-motivation, and tenacity.
ARPA team members must move quickly and nimbly.
Trying to plan out the agency’s path for the next two years, five years, 10 years, or beyond is a futile effort and can be detrimental to progress. ARPAs require ultimate flexibility from day to day and year to year. Compared to other federal initiatives, ARPAs are far less bureaucratic by design, and forcing unnecessary planning and bureaucracy on the agency will slow progress.
Collegiality must be woven into the agency’s fabric.
With the rapidly shifting and entrepreneurial nature of ARPA work, the federal staff, contractors, and other agency partners need to rely on one another for support and assistance to seize opportunities and continue progressing as programs mature and shift.
Outcomes matter more than following a process.
ARPA PDs must be free to explore potential program and project ideas without any predetermination. The agency should support them in pursuing big and unconventional ideas unrestricted by a particular process. While there is a process to turn their most unconventional and groundbreaking ideas into funded and functional projects, transformational ideas are more important than the process itself during idea generation.
ARPA team members welcome feedback.
Things move quickly in an ARPA, and decisions must match that pace, so individuals such as fellows and PDs must work together to offer as much feedback as possible. Constructive pushback helps avoid blind alleys and thus makes programs stronger.
Structural Components
The ARPA Director sets the vision.
The Director’s vision helps attract the right talent and appropriate levels of ambition and focus areas while garnering support from key decision-makers and luminaries. This vision will dictate the types and qualities of PDs an ARPA will attract to execute within that vision.
PDs can make or break an ARPA and set the technical direction.
Because the power of the agency lies within its people, ARPAs are typically flat organizations. An ARPA should seek to hire the best and most visionary thinkers and builders as PDs, enable them to determine and design good programs, and execute with limited hierarchical disruption. During this process, PDs should engage with decision-makers in the early stages of the program design to understand the needs and realities of implementers.
Contracting helps achieve goals.
The ARPA model allows PDs to connect with universities, companies, nonprofits, organizations, and other areas of government to contract necessary R&D. This allows the program to build relationships with individuals without needing to hire or provide facilities or research laboratories.
Interactions improve outcomes.
Past ARPA experiments with remote and hybrid work environments made it evident that organic collisions across an ARPA’s various roles and programs are important to achieving better outcomes. For example, ongoing in-person interactions between and among PDs and technical advisors are critical to idea generation and technical project and program management.
Staff transitions must be well facilitated to retain institutional knowledge.
One of the ARPA model’s most distinctive structural characteristics is frequent turnover. PDs and fellows are term-limited, and ARPAs are designed to turn over those key positions every few years as markets and industries evolve. Having thoughtful transition processes in place is therefore vital, including considering the role of systems engineering and technical assistance (SETA) contractors in filling knowledge gaps, cultivating an active alumni network, and staggering hiring cycles so that large numbers of PDs and fellows do not all exit their service at once.
Scaling should be built into the structure.
It cannot be assumed that if a project is successful, the private sector will pick that technology up and help it scale. Instead, an ARPA should create its own bridge to scaling in the form of programs dedicated to funding projects proven in a test environment to scale their technology for real-world application.
Technology-to-market advisors play a pivotal role.
Similarly to the dedicated funding for scaling described above, technology-to-market advisors are responsible for thinking about how projects make it to the real world. They should work hand in hand with PDs even in the early stages of program development to provide perspectives on how projects might commercialize and become market-ready. Without this focus, technologies run the risk of dying on the vine—succeeding technically, but failing commercially.
A Primer on ARPA Ideation
Tackling grand challenges in transportation infrastructure through ARPA-I requires understanding what is unique about its program design. This process begins with considering the problem worth solving, the opportunity that makes it a ripe problem to solve, a high-level idea of an ARPA program’s fit in solving it, and a visualization of the future once this problem has been solved. This process of early-stage program ideation requires a shift in one’s thinking to find ideas for innovative programs that fit the ARPA model in terms of appropriate ambition level and suitability for ARPA structure and objectives. It is also an inherently iterative process, so while creating a “wireframe” outlining the problem, opportunity, program objectives, and future vision may seem straightforward, it can take months of refining.
Common Challenges
No clear diagnosis of the problem
Many challenges facing our transportation infrastructure system are not defined by a single problem; rather, they are a conglomeration of issues that simultaneously need addressing. An effective program will not only isolate a single problem to tackle, but it will approach it at a level where something can be done to solve it through root cause analysis.
Thinking small and narrow
On the other hand, problems being considered for ARPA programs can be isolated down to the point that solving them will not drive transformational change. In this situation, narrow problems would not cater to a series of progressive and complementary projects that would fit an ARPA.
Incorrect framing of opportunities
When doing early-stage program design, opportunities are sometimes framed as “an opportunity to tackle a problem.” Rather, an opportunity should reflect a promising method, technology, or approach already in existence but which would benefit from funding and resources through an advanced research agency program.
Approaching solutions solely from a regulatory or policy angle
While regulations and policy changes are a necessary and important component of tackling challenges in transportation infrastructure, approaching issues through this lens is not the mandate of an ARPA. ARPAs focus on supporting breakthrough innovations in developing new methods, technologies, capabilities, and approaches. Additionally, regulatory approaches to problem-solving can often be subject to lengthy policy processes.
No explicit ARPA role
An ARPA should pursue opportunities to solve problems where, without its intervention, breakthroughs may not happen within a reasonable timeframe. If the public or private sector already has significant interest in solving a problem, and they are well on their way to developing a transformational solution in a few years or less, then ARPA funding and support might provide a higher value-add elsewhere.
Lack of throughline
The problems identified for ARPA program consideration should be present as themes throughout the opportunities chosen to solve them as well as how programs are ultimately structured. Otherwise, a program may lack a targeted approach to solving a particular challenge.
Forgetting about end users
Human-centered design should be at the heart of how ARPA programs are scoped, especially when considering the scale at which designers need to think about how solving a problem will provide transformational change for everyday users.
Being solutions-oriented
Research programs should not be built with predetermined solutions in mind; they should be oriented around a specific problem to ensure that any solutions put forward are targeted and effective.
Not being realistic about direct outcomes of the program
Program objectives should not simply restate the opportunity, nor should they jump to where the world will be many years after the program has run its course. They should separate the tactical elements of a program and what impact they will ultimately drive. Designers should consider their program as one key step in a long arc of commercialization and adoption, with a firm sense of who needs to act and what needs to happen to make a program objective a reality.
Keeping these common mistakes in mind throughout the design process ensures that programs are properly scoped, appropriately ambitious, and in line with the agency’s goals. With these guideposts in mind, idea generators should begin their program design in the form of a wireframe.
Wireframe Development
The first phase in ARPA program development is creating a program wireframe, which is an outline of a potential program that captures key components for consideration to assess the program’s fit and potential impact. The template below shows the components characteristic of a program wireframe.
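The original template graphic is not reproduced here. As a stand-in, the following is a minimal illustrative sketch in Python (not the official ARPA-I template) of the components a wireframe captures, based on the description in this section; the example values are hypothetical.

```python
# Illustrative sketch only: a simple container for the wireframe components
# described in this report (problem, opportunity, program objective, future
# vision, and the explicit ARPA role). Not the official ARPA-I template.
from dataclasses import dataclass

@dataclass
class ProgramWireframe:
    problem: str            # clearly articulated problem hindering progress
    opportunity: str        # existing method, technology, or approach ripe for investment
    program_objective: str  # what the program itself would demonstrate or deliver
    future_state: str       # transformational outcome if the problem is solved
    arpa_role: str          # why breakthroughs will not happen without ARPA intervention

# Hypothetical example for illustration only; not a topic under consideration by USDOT.
example = ProgramWireframe(
    problem="Bridge condition assessment is slow, costly, and largely reactive.",
    opportunity="Low-cost sensing and analytics could enable continuous structural monitoring.",
    program_objective="Demonstrate scalable, real-time condition monitoring on in-service bridges.",
    future_state="Maintenance becomes predictive, extending asset life and improving safety.",
    arpa_role="Benefits accrue largely to the public, so private actors underinvest without federal support.",
)
print(example.problem)
```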
To create a fully fleshed-out wireframe, program directors work backward by first envisioning a future state that would be truly transformational for society and across sectors if it were to be realized. Then, they identify a clearly-articulated problem that needs solving and is hindering progress toward this transformational future state. During this process, PDs need to conduct extensive root cause analysis to consider whether the problem they’ve identified is exacerbated by policy, regulatory, or environmental complications—as opposed to those that technology can already solve. This will inform whether a problem is something that ARPA-I has the opportunity to impact fundamentally.
Next, program directors identify a promising opportunity—such as a method, approach, or technology—that, if developed, scaled, and implemented, would solve the problem they articulated and help achieve their proposed future state. When considering a promising opportunity, PDs must assess whether it front-runs other potential technologies that would also need developing to support it and whether it is feasible to achieve concrete results within three to five years and with an average program budget. Additionally, it is useful to think about whether an opportunity considered for program development is part of a larger cohort of potential programs that lie within an ARPA-I focus area that could all be run in parallel.
Most importantly, before diving into how to solve the problem, PDs need to articulate what has prevented this opportunity from already being solved, scaled, and implemented, and what explicit role or need there is for a federal R&D agency to step in and lead the development of technologies, methods, or approaches to incentivize private sector deployment and scaling. For example, if the private sector is already incentivized to, and capable of, taking the lead on developing a particular technology and it will achieve market readiness within a few years, then there is less justification for an ARPA intervention in that particular case. On the other hand, the prescribed solution to the identified problem may be so nascent that what is needed is more early-stage foundational R&D, in which case an ARPA program would not be a good fit. This area should be reserved as the domain of more fundamental science-based federal R&D agencies and offices.
One example to illustrate this maturity fit is DARPA’s investment in mRNA. While the National Institutes of Health contributed significantly to initial basic research, DARPA recognized the technological gap in being able to quickly scale and manufacture therapeutics, prompting the agency to launch the Autonomous Diagnostics to Enable Prevention and Therapeutics (ADEPT) program to develop technologies to respond to infectious disease threats. Through ADEPT, in 2011 DARPA awarded a fledgling Moderna Therapeutics $25 million to research and develop its messenger RNA therapeutics platform. Nine years later, Moderna became the second company, after Pfizer-BioNTech, to receive an Emergency Use Authorization for its COVID-19 vaccine.
Another example is DARPA’s role in developing the internet as we know it, which was not originally about realizing the unprecedented concept of a ubiquitous, global communications network. What began as research into technologies for interlinking packet networks led to the development of ARPANET, a pioneering network for sharing information among geographically separated computers. DARPA contracted BBN Technologies to build the first routers, and the network became operational in 1969. This research laid the foundation for the internet. The commercial sector has since adopted ARPANET’s groundbreaking results and used them to revolutionize communication and information sharing across the globe.
Wireframe Refinement and Iteration
To guide program directors through successful program development, George H. Heilmeier, who served as the director of DARPA from 1975 to 1977, used to require that all PDs answer the following questions, known as the Heilmeier Catechism, as part of their pitch for a new program. These questions should be used to refine the wireframe and envision what the program could look like. In particular, wireframe refinement should examine the first three questions before expanding to the remaining questions.
- What are you trying to do? Articulate your objectives using absolutely no jargon.
- How is it done today, and what are the limits of current practice?
- What is new in your approach, and why do you think it will be successful?
- Who cares? If you are successful, what difference will it make?
- What are the risks?
- How much will it cost?
- How long will it take?
- What are the midterm and final “exams” to check for success?
Alongside the Heilmeier Catechism, a series of assessments and lines of questioning should be completed to pressure test and iterate on the wireframe once it has been drafted. This refinement process is not one-size-fits-all, but it is consistently grounded in research, discussions with experts, and constant questioning to ensure program fit. The objective is to thoroughly analyze whether the problem we are seeking to solve is the right one and whether the full space of opportunities around that problem is ripe for ARPA intervention.
One way to think about determining whether a wireframe could be a program is by asking, “Is this wireframe science or is this science fiction?” In other words, is the proposed technology solution at the right maturity level for an ARPA to make it a reality? There is a relatively broad range in the middle of the technological maturity spectrum that could be an ARPA program fit, but the extreme ends of that spectrum would not be a good fit, and thus those wireframes would need further iteration or rejection. On the far left end of the spectrum would be basic research that only yields published papers or possibly a prototype. On the other extreme would be a technology that is already developed to the point that only full-scale implementation is needed. Everything that falls between could be suitable for an ARPA program topic area.
Once a high-impact program has been designed, the next step is to rigorously pressure test and develop it until it resembles an executable ARPA program.
Applying ARPA Frameworks to Transportation Infrastructure Challenges
This framework allows any problem or opportunity within transportation infrastructure to be evaluated for its fit as an ARPA-level idea. Expert and stakeholder idea generation is essential to creating an effective portfolio of ARPA-I programs, so idea generators must be armed with this framework and a defined set of focus areas to develop promising program wireframes. An initial set of focus areas for ARPA-I includes safety, climate and resilience, and digitalization, with equity and accessibility as underlying considerations within each focus area.
There are hundreds of potential topic areas that ARPA-I could tackle; the two wireframes below represent examples of early-stage program ideas that would benefit from further pressure testing through the program design iteration cycle.
Note: The following wireframes are samples intended to illustrate ARPA ideation and the wireframing process, and do not represent potential research programs or topics under consideration by the U.S. Department of Transportation.
Next-Generation Resilient Infrastructure Management
A Digital Inventory of Physical Infrastructure and Its Uses
Wireframe Development Next Steps
After initial wireframe development, further exploration is needed to pressure test an idea and ensure that it can be developed into a viable program to achieve “moonshot” ambitions. Wireframe authors should consider the following factors when iterating:
- The Heilmeier Catechism questions (see page 14) and whether the wireframe needs to be updated or revised as the authors work to answer each question
- Common challenges wireframes face (see page 11) and whether any of them might be reflected in the wireframe
- The federal, state, and local regulatory landscape and any regulatory factors that will impact the direction of a potential research program
- Whether the problem or technology already receives significant investment from other sources (if there is significant investment from venture capital, private equity, or elsewhere, then it would not be an area of interest for ARPA-I)
- Adjacent areas of work that might inform or affect a potential research program
- The transportation infrastructure sector’s unique challenges and landscape
- Existing grant programs and opportunities that might achieve similar goals
Wireframes are intended to be a summary that communicates a larger plan to follow. After further iteration and exploration of the factors outlined above, what began as a raw program wireframe should develop into more detailed documents, including an incisive diagnosis of the problem along with evidence and citations validating opportunities to solve it. Together, these components should lead to a plausible program objective as an outcome.
Conclusion
The newly authorized and appropriated ARPA-I presents a once-in-a-generation opportunity to apply a model that has been proven successful in developing breakthrough innovations in other sectors to the persistent challenges facing transportation infrastructure.
Individuals and organizations that would work within the ARPA-I network need a clear understanding of the unique circumstances, challenges, and opportunities of this sector, as well as how to apply this context and the ARPA program ideation model to build high-impact future innovation programs. This community’s engagement is critical to ARPA-I’s success, and FAS is looking for big thinkers who are willing to take on this challenge by developing bold, innovative ideas.
To sign up for future updates on events, convenings, and other opportunities for you to work in support of ARPA-I programs and partners, click here.
To submit an advanced research program idea, click here.
Advanced Research Priorities in Transportation
The Federation of American Scientists (FAS) has identified several domains in the transportation and infrastructure space that retain a plethora of unaddressed opportunities ripe for breakthrough innovation.
Transportation is not traditionally viewed as a research- and development-led field, with less than 0.7% of the U.S. Department of Transportation’s (DOT) annual budget dedicated to R&D activities. The majority of DOT’s R&D funds are disbursed by modal operating administrations mandated to execute on distinct funding priorities rather than a collective, integrated vision of transforming the nation’s infrastructure across 50 states and localities.
Historically, a small percentage of these R&D funds have supported and developed promising, cross-cutting initiatives, such as the Federal Highway Administration’s Exploratory Advanced Research programs deploying artificial intelligence to better understand driver behavior and applying novel data integration techniques to enhance freight logistics. Yet, the scope of these programs has not been designed to scale discoveries into broad deployment, limiting the impact of innovation and technology in transforming transportation and infrastructure in the United States.
As a result, transportation and infrastructure retain a plethora of unaddressed opportunities – from reducing the 40,000 annual vehicle-related fatalities, to improving freight logistics through ports, highways, and rail, to achieving a net zero carbon transportation system, to building infrastructure resilient to the impacts of climate change and severe weather. The reasons for these persistent challenges are numerous: low levels of federal R&D spending, fragmentation across state and local government, risk-averse procurement practices, sluggish commercial markets, and more. When innovations do emerge in this field, they suffer from two valleys of death: one to bring new ideas out of the lab into commercialization, and the second to bring successful deployments of those technologies to scale.
The United States needs a concerted national innovation pipeline designed to fill this gap, exploring early-stage, moonshot research while nurturing breakthroughs from concept to deployment. An Advanced Research Projects Agency-Infrastructure (ARPA-I) would deliver on this mission. Modeled after the Defense Advanced Research Projects Agency (DARPA) and the Advanced Research Projects Agency-Energy (ARPA-E), ARPA-I will operate nimbly, with rigorous program management and deep technical expertise, to tackle the biggest infrastructure challenges and overcome entrenched market failures. Solutions would cut across traditional transportation modes (e.g., highways, rail, aviation, maritime, and pipelines) and would include innovative new infrastructure technologies, materials, systems, capabilities, or processes.
The list of domain areas below reflects priorities for DOT as well as areas where there is significant opportunity for breakthrough innovation:
Key Domain Areas
Metropolitan Safety
Despite progress made since 1975, achieving dramatic reductions in roadway fatalities remains a core, persistent challenge. In 2021, an estimated 42,915 people were killed in motor vehicle crashes, and an estimated 31,785 people were killed in the first nine months of 2022. The magnitude of this challenge is articulated in DOT’s most recent National Roadway Safety Strategy, a document that begins with a statement from Secretary Buttigieg: “The status quo is unacceptable, and it is preventable… Zero is the only acceptable number of deaths and serious injuries on our roadways.”
Example topical areas include but are not limited to: urban roadway safety; advanced vehicle driver assistance systems; driver alcohol detection systems; vehicle design; street design; speeding and speed limits; and V2X (vehicle-to-everything) communications and networking technology.
Key Questions for Consideration:
- What steps can be taken to create safer urban mobility spaces for everyone, and what role can technology play in helping create the future we envision?
- What capabilities, systems, and datasets are we missing right now that would unlock more targeted safety interventions?
Rural Safety
Rural communities face their own unique safety challenges stemming from road design and signage, speed limits, and other factors. Data from the Federal Highway Administration shows that “while only 19% of the U.S. population lives in rural areas, 43% of all roadway fatalities occur on rural roads, and the fatality rate on rural roads is almost 2 times higher than on urban roads.”
Example topical areas include but are not limited to: improved information collection and management systems; design and evaluation tools for two-lane highways and other geometric design decisions; augmented visibility; mitigating or anti-rollover crash solutions; and enhanced emergency response.
Key Questions for Consideration:
- How can rural-based safety solutions address the resource and implementation issues that are faced by local transportation agencies?
- How can existing innovations be leveraged to support the advancement of road safety in rural settings?
Resilient & Climate-Prepared Infrastructure
Modern roads, bridges, and transportation systems are designed to withstand storms that, at the time of their construction, had a probability of occurring once in 100 years; today, climate change has made extreme weather events commonplace. In 2020 alone, the U.S. suffered 22 high-impact weather disasters that each caused over $1 billion in damages. When Hurricane Sandy hit the New York City and New Jersey transit systems with a 14-foot storm surge, millions were left without their primary mode of transportation for a week. Meanwhile, rising sea levels are likely to impact both marine and air transportation, as 13 of the 47 largest U.S. airports have at least one runway within 12 feet of the current sea level. Additionally, the persistent presence of wildfires, which burn an average of 7 million acres annually across the United States (more than double the average in the 1990s), dramatically reshapes the transportation network in acute ways and causes downstream damage through landslides, flooding, and other natural events.
These trends are likely to continue as climate change exacerbates the intensity and scope of these events. The Department of Transportation is well-positioned to introduce systems-level improvements to the resilience of our nation’s infrastructure.
Example topical areas include but are not limited to: high-performance, long-life advanced materials that increase resiliency and reduce maintenance and reconstruction needs, especially materials for roads, rail, and ports; nature-based protective strategies such as constructed marshes; novel designs for multi-modal hubs or other logistics/supply chain redundancy; efficient and dynamic mechanisms to optimize the relocation of transportation assets; intensive maintenance, preservation, prediction, and degradation analysis methods; and intelligent disaster-resilient infrastructure countermeasures.
Key Questions for Consideration:
- How can we ensure that innovations in this domain yield processes and technologies that are flexible and adaptive enough to ward against future uncertainties related to climate-related disasters?
- How can we factor in the different climate resilience needs of both urban and rural communities?
Digital Infrastructure
Advancing the systems, tools, and capabilities for digital infrastructure to reflect and manage the built environment has the power to enable improved asset maintenance and operations across all levels of government, at scale. Advancements in this field would make using our infrastructure more seamless for transit, freight, pedestrians, and more. Increased data collection from or about vehicle movements, for example, enables user-friendly and demand-responsive traffic management; dynamic curb management for personal, transit, and delivery vehicles; congestion pricing; safety mapping and targeted interventions; and rail and port logistics. When data is accessible to local departments of transportation and municipalities, it can be harnessed to improve transportation operations and public safety through crash detection, as well as to develop Smart Cities and Communities that utilize user-focused mobility services; connected and automated vehicles; electrification across transportation modes; and intelligent, sensor-based infrastructure to measure and manage age-old problems like potholes, air pollution, traffic, parking, and safety.
Example topical areas include but are not limited to: traffic management; curb management; congestion pricing; accessibility; mapping for safety; rail management; port logistics; and transportation system/electric grid coordination.
Key Questions for Consideration:
- How might we leverage data and data systems to radically improve mobility and our transportation system across all modes?
Expediting and Upgrading Construction Methods
Infrastructure projects are fraught with expensive delays and budget overruns. In the United States, fewer than 1 in 3 contractors report finishing projects on time and within budget, with 70% citing coordination at the construction site as the primary reason. In the words of one industry executive, “all [of the nation’s] major projects have cost and schedule issues … the truth is these are very high-risk and difficult projects. Conditions change. It is impossible to estimate it accurately.” But can process improvements and other innovations make construction cheaper, better, faster, and easier?
Example topical areas include but are not limited to: augmented forecasting and modeling techniques; prefabricated or advanced robotic fabrication, modular, and adaptable structures and systems such as bridge sub- and superstructures; real-time quality control and assurance technologies for accelerated construction; materials innovation; new pavement technologies; bioretention; tunneling; underground infrastructure mapping; novel methods for bridge engineering, building information modeling (BIM), and coastal, wind, and offshore engineering; stormwater systems; and computational methods in structural engineering, structural sensing, control, and asset management.
Key Questions for Consideration:
- What innovations are most critical to the accelerated construction requirements of the future?
Logistics
Our national economic strength and quality of life depend on the safe and efficient movement of goods within our nation’s borders and beyond. Logistics systems (the interconnected webs of businesses, workers, infrastructure, processes, and practices that underlie the sorting, transportation, and distribution of goods) must operate with efficiency and resilience. When logistics systems are disrupted by events such as public health crises, extreme weather, workforce challenges, or cyberattacks, goods are delayed, costs increase, and Americans’ daily lives are affected. The Biden Administration issued Executive Order 14017 calling for a review of the transportation and logistics industrial base, and DOT released the Freight and Logistics Supply Chain Assessment in February 2022, spotlighting a range of actions that DOT envisions to support a resilient 21st-century freight and logistics supply chain for America.
Example topical areas include but are not limited to: freight infrastructure, including ports, roads, airports, and railroads; data and research; rules and regulations; coordination across public and private sectors; and supply chain electrification and intersections with resilient infrastructure.
Key Questions for Consideration:
- How might we design and develop freight infrastructure to maximize efficiency and use of emerging technologies?
- What existing innovations and technologies could be introduced and scaled up at ports to increase the processing of goods and dramatically lower the transaction costs of U.S. freight?
- How can we design systems that optimize for both efficiency and resilience?
- How can we reduce the negative externalities associated with our logistics systems, including congestion, air pollution, noise, GHG emissions, and infrastructure degradation?