Carbon Capture in the Industrial Sector: Addressing Training, Startups, and Risk

This memo is part of the Day One Project Early Career Science Policy Accelerator, a joint initiative between the Federation of American Scientists & the National Science Policy Network.

Summary

Decarbonizing our energy system is a major priority for slowing and eventually reversing climate change. Federal policies supporting industrial-scale solutions for carbon capture, utilization, and sequestration (CCUS) have significantly decreased costs for large-scale technologies, yet these costs are still high enough to create considerable investment risks. Multiple companies and laboratories have developed smaller-scale, modular technologies to decrease the risk and cost of point-source carbon capture and storage (CCS). Additional federal support is needed to help these flexible, broadly implementable technologies meet the scope of necessary decarbonization in the highly complex industrial sector. Accordingly, the Department of Energy (DOE) should launch an innovation initiative comprising the following three pillars:

  1. Launch a vocational CCS training program to grow the pool of workers equipped with the skills to install, operate, and maintain CCS infrastructure.
  2. Develop an accelerator to develop and commercialize modular CCS for the industrial sector.
  3. Create a private-facing CCS Innovation Connector (CIC) to increase stability and investment. 

These activities will target underfunded areas and complement existing DOE policies for CCS technologies.

Challenge and Opportunity

Carbon dioxide (CO2) is the largest driver of human-induced climate change. Tackling the climate crisis requires the United States to significantly decarbonize; however, CCS and CCUS remain too costly. CCUS costs must drop to $100 per ton of CO2 captured to incentivize industry uptake. U.S. policymakers have paved the way for CCUS by funding breakthrough research, increasing demand for captured CO2 through market-shaping, improving technologies for point-source CCS, and building large-scale plants for direct-air capture (DAC). DAC holds great promise for remediating CO2 already in the atmosphere despite its higher cost (up to $600/ton of CO2 sequestered), and the Biden Administration and DOE have recently focused on DAC via policies such as the Carbon Negative Shot (CNS) and the 2021 Infrastructure Investment and Jobs Act (IIJA). By comparison, point-source CCS has been described as an “orphan technology” due to a recent lack of innovation.

Part of the problem is that few long-term mechanisms exist to make CCS economical. Industrial CO2 demand is rising, but without a set carbon price, emissions standard, or national carbon market, the cost of carbon capture technology outweighs the incentive to adopt it. The Biden Administration is increasing demand for captured carbon through government purchasing and market-shaping, but this process is slow and does not address the root problems of high technology and infrastructure costs. Therefore, targeting the issue from the innovation side holds the most promise for improving industry uptake. DOE grants for technology research and demonstration are common, and public opinion and the 45Q tax credit have led to increased funding for CCS from companies like ExxonMobil. These efforts have allowed large-scale projects like the $1 billion Petra Nova plant to be developed; however, concerns about carbon capture pipelines, the high cost and risk of the technology, and the years needed for permitting mean that large-scale projects are few and far between. Right now, there are only 26 operating CCUS plants globally. A promising solution is therefore to pursue smaller-scale technologies that fill this gap and provide lower-cost and smaller-scale — but much more widespread — CCS installations.

Modular CCS technologies, like those created by the startups Carbon Clean and Carbon Capture, have shown promise for industrial plants. Carbon Clean has serviced 44 facilities that have collectively captured over 1.4 million metric tons of carbon. Mitsubishi is also trialing smaller CCS plants based on successful larger facilities like Orca or Petra Nova. Increasing federal support for modular innovation with lower risks and installation costs could attract additional entrants to the CCS market. Most research focuses on breakthrough innovation to significantly decrease carbon capture costs. However, there are many existing CCS technologies — like amine-based solvents or porous membranes — that can be improved and specialized to cut costs as well. In particular, modular CCS systems could effectively target the U.S. industrial sector, given that industrial subsectors such as steel or plastics manufacturing receive less pressure and have fewer decarbonization options than oil and gas enterprises. The industrial sector accounts for 30% of U.S. greenhouse gas emissions through a variety of small point sources, which makes it a prime area for smaller-scale CCS technologies.

Plan of Action

DOE should launch an initiative designed to dramatically advance technological options for and use of small-scale, modular CCS in the United States. The program would comprise three major pillars, detailed in Table 1.

Table 1.
Three complementary efforts to increase industrial uptake of CCS technologies.
| Pillar | Purpose | Champion | Cost | Funding | Time Frame |
| --- | --- | --- | --- | --- | --- |
| Vocational Training | Increase CCS workforce | DOE OCED | $5 million | IIJA | 2–4 years |
| Modular CCS Innovation Program | Develop modular CCS technology for industry subsectors | DOE OCED or FECM | $10 million | IIJA, DOE grants | 1 year |
| CCS Innovation Connector | Encourage private CCS investment | DOE OCED | $750,000/year | IIJA | 2 years |

DOE should leverage the IIJA and the new DOE Office of Clean Energy Demonstrations (OCED) to create a vocational CCS training program. DOE has in the past supported — and is currently supporting — a suite of regional carbon capture training programs. However, DOE’s 2012 program was geared toward scientists and workers already in the CCS field, and its 2022 program is specialized for 20–30 specific scientists and projects. DOE should build on this work with a new vocational CCS training program that will:

This educational program would be cost-effective: the online course would require little upkeep, and the vocational training programs could be largely developed with financial and technical support from external partners. Initial funding of $5 million would cover course development and organization of the vocational training programs.

Pillar 2. Create an accelerator for the development and commercialization of modular CCS technologies.

The DOE Office of Fossil Energy and Carbon Management (FECM) or OCED should continue to lead global innovation by creating the Modular CCS Innovation Program (MCIP). This accelerator would provide financial and technical support for U.S. research and development (R&D) startups working on smaller-scale, flexible CCS for industrial plants (e.g., bulk chemical, cement, and steel manufacturing plants). The MCIP should prioritize technology that can be implemented widely with lower costs for installation and upkeep. For example, MCIP projects could focus on improving the resistance of amine-based systems to specialty chemicals, or on developing a modular system like Carbon Clean’s that can be adopted by different industrial plants. Projects like these have been proposed by various U.S. companies and laboratories, yet to date they have received comparatively little support from government loans or tax credits.

Figure 1. 

Proposed timeline of the MCIP accelerator for U.S. startups.

As illustrated in Figure 1, the MCIP would be launched with a Request for Proposals (RFP), awarding an initial $1 million each to the top 10 proposals received. In the first 100 days after receiving funding, each participating startup would be required to submit a finalized design and market analysis for its proposed product. The startup would then have an additional 200 days to produce a working prototype of the product. Then, the startup would move into the implementation and commercialization stages, with the goal of having its product market-ready within the next year. Launching the MCIP could therefore be achieved with approximately $10 million in grant funding plus additional funding to cover administrative costs and overhead — amounts commensurate with recent DOE funding for large-scale CCUS projects. This funding could come from the $96 million recently allocated by DOE to advance carbon capture technology and/or from the IIJA allocation. Implementation funding could be secured in part or in whole from private investors or other external industry partners.
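For concreteness, the milestone cadence described above can be sketched as a simple schedule calculation. This is purely illustrative: the start date is a hypothetical placeholder, and only the day offsets (100 days to design, 200 more to prototype, roughly a year to market readiness) come from the proposed timeline.

```python
from datetime import date, timedelta

# Hypothetical program start; only the offsets reflect the proposed MCIP timeline.
start = date(2024, 1, 1)
milestones = {
    "Design and market analysis due": 100,        # 100 days after funding
    "Working prototype due": 100 + 200,           # an additional 200 days
    "Target market readiness": 100 + 200 + 365,   # within the following year
}

for name, offset in milestones.items():
    print(f"{name}: {start + timedelta(days=offset)}")
```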

Pillar 3. Create a private-facing CCS Innovation Connector (CIC) to increase stability and investment. 

The uncertainty and risk that discourage private investment in CCS must be addressed. Many oil and gas companies such as ExxonMobil have called for a more predictable policy landscape and increased funding for CCS projects. Creating a framework for a CCS Innovation Connector (CIC) within the DOE OCED, based on a similar fund in the European Union, would decrease the perceived risks of CCS technologies emerging from the MCIP. The CIC would work as follows: first, a company would submit a proposal relating to point-source carbon capture. DOE technical experts would perform an initial quality-check screening and share proposals that pass with relevant corporate investors. Once funding from investors is secured, the project would begin. CIC staff (likely two to three full-time employees) would monitor projects to ensure they are meeting sponsor goals and offer technical assistance as necessary. The CIC would serve as a liaison between CCS project developers and industrial sponsors or investors to increase investment in and implementation of nascent CCS technologies. While stability in the CCS sector will require policies such as increasing carbon tax credits or creating a global carbon price, the CIC will help advance such policies by funding important American CCS projects.

Conclusion

CO2 emissions will continue to rise as U.S. energy demand grows. Many existing federal policies target these emissions through clean energy or DAC projects, but more can and should be done to incentivize U.S. innovation in point-source CCS. In particular, increased federal support is needed for small-scale and modular carbon capture technologies that target complex areas of U.S. industry and avoid the high costs and risks of large-scale infrastructure installations. This federal support should involve improving CCS education and training, accelerating the development and commercialization of modular CCS technologies for the industrial sector, and connecting startup CCS projects to private funding. Biden Administration policies — coupled with growing public and industrial support for climate action — make this the ideal time to expand the reach of our climate strategy into an “all of the above” solution that includes CCS as a core component.

An Earthshot for Clean Steel and Aluminum

Summary

The scale of mobilization and technological advancement required to avoid the worst effects of climate change has recently led U.S. politicians to invoke the need for a new, 21st century “moonshot.” The Obama Administration launched the SunShot Initiative to dramatically reduce the cost of solar energy and, more recently, the Department of Energy (DOE) announced a series of “Earthshots” to drive down the cost of emerging climate solutions, such as long-duration energy storage.

While DOE’s Earthshots to date have been technology-specific and sector-agnostic, certain heavy industrial processes, such as steel and concrete production, are so emissions-intensive and fundamental to modern economies as to demand an Earthshot unto themselves. These products are ubiquitous in modern life and will be subject to increasing demand as we seek to deploy the clean energy infrastructure necessary to meet climate goals. In other words, there is no reasonable pathway to preserving a livable planet without developing clean steel and concrete production at mass scale. Yet the sociotechnical pathways to green industry – including the mix of technological solutions to replace high-temperature heat and process emissions, approaches to address local air pollutants, and economic development strategies – remain complex and untested. We urgently need to orient our climate innovation programs to the task.

Therefore, this memo proposes that DOE launch a Steel Shot to drive zero-emissions iron, steel, and aluminum production to cost-parity with traditional production within a decade. In other words, zero dollar difference for zero-emissions steel in ten years, or Zero for Zero in Ten.

Challenge and Opportunity

As part of the Biden-Harris Administration’s historic effort to quadruple federal funding for clean energy innovation, DOE has launched a series of “Earthshots” to dramatically slash the cost of emerging technologies and galvanize entrepreneurs and industry to home in on ambitious but achievable goals. DOE has announced Earthshots for carbon dioxide removal, long-duration storage, and clean hydrogen. New programs authorized by the Infrastructure Investment and Jobs Act, such as hydrogen demonstration hubs, provide tools to help DOE meet the ambitious cost and performance targets set in the Earthshots. The Earthshot technologies have promising applications for achieving net-zero emissions economy-wide, including in sectors that are challenging to decarbonize through clean electricity alone.

One such sector is heavy industry, a notoriously challenging and emissions-intensive sector that, despite contributing nearly one-third of U.S. emissions, has received relatively little focus from federal policymakers. Within the industrial sector, iron and steel, concrete, and chemicals production are the biggest sources of CO2 emissions, producing climate pollution not only from their heavy energy demands, but also from their inherent processes (e.g., clinker production for cement).

Meanwhile, global demand for cleaner versions of these products – the basic building blocks of modern society – is on the rise. The International Energy Agency (IEA) estimates that CO2 emissions from iron and steel production alone will need to fall from 2.4 Gt to 0.2 Gt over the next three decades to meet an economy-wide net-zero emissions target, even as overall steel consumption increases to meet our needs for clean energy buildout. Accordingly, by 2050, global investment in clean energy and sustainable infrastructure materials will grow to $5 trillion per year. The United States is well-positioned to seize these economic opportunities, particularly in the metals industry, given its long history of metals production, its skilled workforce, the newly initiated talks toward a carbon emissions-based steel and aluminum trade agreement, and strong labor and political coalitions in favor of restoring U.S. manufacturing leadership.

“The metals industry is foundational to economic prosperity, energy infrastructure, and national security. It has a presence in all 50 states and directly employs more than a half million people. The metals industry also contributes 10% of national climate emissions.”

Department of Energy request for information on a new Clean Energy Manufacturing Institute, 2021

However, the exact solutions that will be deployed to decarbonize heavy industry remain to be seen. According to the aforementioned IEA Net-Zero Energy (NZE) scenario, steel decarbonization could require a mix of carbon capture, hydrogen-based, and other innovative approaches, as well as material efficiency gains. It is likely that electrification – and in the case of steel, increased global use of electric arc furnaces – will also play a significant role. While technology research funding should be increased, traditional “technology-push” efforts alone are unlikely to spur rapid and widespread adoption of a diverse array of solutions, particularly at low-margin, capital-intensive manufacturing facilities. This points to the potential for creative technology-neutral policies, such as clean procurement programs, which create early markets for low-emissions production practices without prescribing a particular technological pathway.

Therefore, as a complement to its Earthshots that “push” promising clean energy technologies down the cost curve, DOE should also consider adopting technology-neutral Earthshots for the industrial sector, even if some of the same solutions may be found in other Earthshots (e.g., hydrogen). It is important for DOE to be very disciplined in identifying one or two essential sectors, where the opportunity is large and strategic, to avoid creating overly balkanized sectoral strategies. In particular, DOE should start with the launch of a Steel Shot to buy down the cost of zero-emissions iron, steel, and aluminum production to parity with traditional production within a decade, while increasing overall production in the sector. In other words, zero dollar difference for zero-emissions steel in ten years, or Zero for Zero in Ten.

The Steel Shot can bring together applied research and demonstration programs, public-private partnerships, prizes, and government procurement, galvanizing public energy around a target that enables a wide variety of approaches to compete. These efforts will be synergistic with technology-specific Earthshots seeking dramatic cost declines on a similar timeline.

Plan of Action

Develop and launch a metals-focused Earthshot: 

Invest in domestic clean steelmaking capacity:

Create demand for “green steel” through market pull mechanisms:

Frequently Asked Questions
Is a sector-focused Energy Earthshot really necessary?

The lower technology prices targeted by the Hydrogen Earthshot and the Carbon Negative Shot are necessary but not sufficient to guarantee that these technologies are deployed in the highest-emitting sectors, such as steel, cement, and chemicals. The right combination of approaches to achieve price reduction remains uncertain and can vary by plant, location, process, and product, as noted in a recent McKinsey study on decarbonization challenges across the industrial sector. Additionally, there is a high upfront cost to deploying novel solutions, and private financiers are reluctant to take a risk on untested technologies. Nonetheless, to avoid creating overly balkanized sectoral strategies, it will be important for DOE to be very disciplined in identifying one or two essential sectors, such as metals, where the opportunity is large and strategic.

Why are metals the best opportunity for a sector-focused Earthshot?

These products are ubiquitous and increasingly crucial for deploying the clean energy infrastructure necessary to reach net-zero. The United States of America has a long history of metals production, a skilled workforce, and strong labor and political coalitions in favor of restoring U.S. manufacturing leadership. Additionally, carbon-intensive steel from China has become a growing concern for U.S. manufacturers and policymakers; China produces 56% of global crude steel, followed by India (6%), Japan (5%), and then the U.S. (4%). The U.S. already maintains a strong competitive advantage in clean steel, and the technologies needed to double-down and fully decarbonize steel are close to commercialization, but still require government support to achieve cost parity.

Will this Earthshot reduce U.S. metals manufacturing competitiveness?

U.S. steel production is already less polluting than many foreign sources, but that typically comes with additional costs. Reducing the “green premium” will help keep U.S. metal producers competitive while preparing them for the needs of buyers, who are increasingly seeking out green steel products. End users such as Volkswagen are aiming for zero emissions across their entire value chain by 2050, while Mercedes-Benz and Volvo have already begun sourcing low-emissions steel for new autos. Meanwhile, the EU is preparing to implement a carbon border adjustment mechanism that could result in higher prices for steel and aluminum products from the United States. The ramifications of the carbon border tax are already being seen in steel agreements, such as the recent U.S.-EU announcement to drop punitive tariffs on each other’s steel and aluminum exports and to begin talks on a carbon-based trade agreement.

What is the right baseline to use for calculating the “green premium” of metals?

Breakthrough Energy estimated that the “green premium” for steel produced using carbon capture is approximately 16%–29% over conventionally produced steel. Because a variety of processes could be used to reduce emissions, and thus contribute to the “green premium,” there may not be a single number that captures current costs. However, wherever possible, we advocate for using real-world data on “green” steel production to estimate how close DOE is to achieving its benchmark targets in comparison to “traditional” steel.
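As a purely illustrative sketch of the arithmetic behind such estimates, the calculation below applies the cited 16%–29% range to a hypothetical baseline cost; the $500/ton figure is a placeholder, not a real market price.

```python
def green_premium(green_cost_per_ton: float, baseline_cost_per_ton: float) -> float:
    """Return the green premium as a fraction of the conventional baseline cost."""
    return (green_cost_per_ton - baseline_cost_per_ton) / baseline_cost_per_ton

baseline = 500.0  # hypothetical $/ton for conventionally produced steel
for premium in (0.16, 0.29):  # low and high ends of the cited range
    green_cost = baseline * (1 + premium)
    print(f"{premium:.0%} premium -> ${green_cost:.0f}/ton "
          f"(extra ${green_cost - baseline:.0f}/ton)")
```

With real-world cost data in place of the placeholder baseline, the same calculation could track progress toward the Zero for Zero in Ten parity target.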

Leveraging Department of Energy Authorities and Assets to Strengthen the U.S. Clean Energy Manufacturing Base

Summary

The Biden-Harris Administration has made revitalization of U.S. manufacturing a key pillar of its economic and climate strategies. On the campaign trail, President Biden pledged to do away with “invent it here, make it there,” alluding to the long-standing trend of outsourcing manufacturing capacity for critical technologies — ranging from semiconductors to solar panels — that emerged from U.S. government labs and funding. As China and other countries make major bets on the clean energy industries of the future, it has become clear that climate action and U.S. manufacturing competitiveness are deeply intertwined and require a coordinated strategy.

Additional legislative action, such as proposals in the Build Back Better Act that passed the House in 2021, will be necessary to fully execute a comprehensive manufacturing agenda that includes clean energy and industrial products, like low-carbon cement and steel. However, the Department of Energy (DOE) can leverage existing authorities and assets to make substantial progress today to strengthen the clean energy manufacturing base. 

This memo recommends two sets of DOE actions to secure domestic manufacturing of clean technologies:

  1. Foundational steps to successfully implement the new Determination of Exceptional Circumstances (DEC) issued in 2021 under the Bayh-Dole Act to promote domestic manufacturing of clean energy technologies.
  2. Complementary U.S.-based manufacturing investments to maximize the DEC’s impact and to maximize the overall domestic benefits of DOE’s clean energy innovation programs.

Challenge and Opportunity

Recent years have been marked by growing societal inequality, a pandemic, and climate change-driven extreme weather. These factors have exposed the weaknesses of essential supply chains and our nation’s legacy energy system. 

Meanwhile, U.S. manufacturing, once a reliable source of supply chain security and economic mobility, is at a crossroads. Since the early 2000s, U.S. manufacturing productivity has stagnated and five million jobs have been lost. While countries like Germany and South Korea have been doubling down on industrial innovation — in ways that have yielded a strong manufacturing job recovery since the Great Recession — the United States has only recently begun to recognize domestic manufacturing as a crucial part of a holistic innovation ecosystem. Our nation’s longstanding, myopic focus on basic technological research and development (R&D) has contributed to a 10-percentage-point decline in the American share of global manufacturing and left U.S. manufacturers unprepared to scale up new innovations and compete in critical sectors long-term.

The Biden-Harris administration has sought to reverse these trends with a new industrial strategy for the 21st century, one that includes a focus on the industries that will enable us to tackle our most pressing global challenge and opportunity: climate change. This strategy recognizes that the United States has yet to foster a robust manufacturing base for many of the key products — ranging from solar modules to lithium-ion batteries to low-carbon steel — that will dominate a clean energy economy, despite having funded a large share of the early and applied research into underlying technologies. The strategy also recognizes that as clean energy technologies become increasingly foreign-produced, risks increase for U.S. climate action, national security, and our ability to capture the economic benefits of the clean energy transition.

The U.S. Department of Energy (DOE) has a central role to play in executing the administration’s strategy. The Obama administration dramatically ramped up funding for DOE’s Advanced Manufacturing Office (AMO) and launched the Manufacturing USA network, which now includes seven DOE-sponsored institutes that focus on cross-cutting research priorities in collaboration with manufacturers. In 2021, DOE issued a Determination of Exceptional Circumstances (DEC) under the Bayh-Dole Act of 1980 to ensure that federally funded technologies reach the market and deliver benefits to American taxpayers through substantial domestic manufacturing. The DEC cites global competition and supply chain security issues around clean energy manufacturing as justification for raising manufacturing requirements from typical Bayh-Dole “U.S. Preference” rules to stronger “U.S. Competitiveness” rules across DOE’s entire science and energy portfolio (i.e., programs overseen by the Under Secretary for Science and Innovation (S4)). This change requires DOE-funded subject inventions to be substantially manufactured in the United States for all global use and sales (not just U.S. sales) and expands applicability of the manufacturing requirement to the patent recipient as well as to all assignees and licensees. Notably, the DEC does allow recipients or licensees to apply for waivers or modifications if they can demonstrate that it is too challenging to develop a U.S. supply chain for a particular product or technology.

The DEC is designed to maximize return on investment for taxpayer-funded innovation: the same goal that drives all technology transfer and commercialization efforts. However, to successfully strengthen U.S. manufacturing, create quality jobs, and promote global competitiveness and national security, DOE will need to pilot new evaluation processes and data reporting frameworks to better assess downstream impacts of the 2021 DEC and similar policies, and to ensure they are implemented in a manner that strengthens manufacturing without slowing technology transfer. It is essential that DOE develop an evidence base to assess a common critique of the DEC: that it reduces appetite for companies and investors to engage in funding agreements. Continuous evaluation can enable DOE to understand how well-founded these concerns are.

Yet, the new DEC rules and requirements alone cannot overcome the structural barriers to domestic commercialization that clean energy companies face today. DOE will also need to systematically build domestic manufacturing efforts into basic and applied R&D, demonstration projects, and cross-cutting initiatives. DOE should also pursue complementary investments to ensure that licensees of federally funded clean energy technologies are able and eager to manufacture in the United States. Under existing authorities, such efforts can include: 

These complementary efforts will enable DOE to generate more productive outcomes from its 2021 DEC, reduce the need for waivers, and strengthen the U.S. clean manufacturing base. In other words, rather than just slow the flow of innovation overseas without presenting an alternative, they provide a domestic outlet for that flow. Figure 1 provides an illustration of the federal ecosystem of programs, DOE and otherwise, that complement the mission of the DEC.

Figure 1

Programs are arranged in rough accordance with their role in the innovation cycle. TRL and MRL refer to technology and manufacturing readiness level, respectively. Proposed programs, highlighted with a dotted yellow border, are found in either the Build Back Better Act passed by the House in 2021 or the Bipartisan Innovation Bill (USICA/America COMPETES).


Plan of Action

While further Congressional action will be necessary to fully execute a long-term national clean manufacturing strategy and ramp up domestic capacity in critical sectors, DOE can meaningfully advance such a strategy now through both long-standing authorities and recently authorized programs. The following plan of action consists of (1) foundational steps to successfully implement the DEC, and (2) complementary efforts to ensure that licensees of federally funded clean energy technologies are able and eager to manufacture in the United States. In tandem, these recommendations can maximize impact and benefits of the DEC for American companies, workers, and citizens.

Part 1: DEC Implementation

The following action items, many of which are already underway, are focused on basic DEC implementation.

Part 2: Complementary Investments

Investments to support the domestic manufacturing sector and regional innovation infrastructure must be pursued in tandem with the DEC to translate into enhanced clean manufacturing competitiveness. The following actions are intended to reduce the need for waivers, shore up supply chains, and expand opportunities for domestic manufacturing:

Deploy a National Network of Air-Pollution and CO2 Sensors in 300 American Cities by 2030

Summary

The Biden-Harris Administration should deploy a national network of low-cost, co-located, real-time greenhouse gas (GHG) and air-pollution emission sensors in 300 American cities by 2030 to help communities address environmental inequities, combat global warming, and improve public health. Urban areas contribute more than 70% of total GHG emissions. Aerosols and other byproducts of fossil-fuel combustion — the major drivers of poor air quality — are emitted in huge quantities alongside those GHGs. A “300 by ‘30” initiative establishing a national network of local, ground-level sensors will provide precise and customized information to drive critical climate and air-quality decisions and benefit neighborhoods, schools, and businesses in communities across the nation. Ground-level dense sensor networks located in community neighborhoods also provide a resource that educators can leverage to engage students on co-created “real-time and actionable science”, helping the next generation see how science and technology can contribute to solving our country’s most challenging issues.

Challenge and Opportunity

U.S. cities contribute 70% of our nation’s GHG emissions and have more concentrated air pollutants that harm neighborhoods and communities unequally. Climate change profoundly impacts human health and wellbeing through drought, wildfire, and extreme-weather events, among numerous other impacts. Microscopic air pollutants, which penetrate the body’s respiratory and circulatory systems, play a significant role in heart disease, stroke, lung cancer, and asthma. These diseases collectively cost Americans $800 billion annually in medical bills and result in more than 100,000 Americans dying prematurely each year. These health impacts are also felt unevenly: some racial groups and poorer households, especially those located near highways and industry, face higher exposure to harmful air pollutants than others, deepening health inequities across American society.

GHG emissions and ground-level air pollution are both negative products of fossil-fuel combustion and are inextricably linked. But our nation lacks a comprehensive approach to measure, monitor, and mitigate these drivers of climate change and air pollution. Furthermore, key indicators of air quality — such as ground-level pollutant measurements — are not typically considered alongside GHG measurements in governmental attempts to regulate emissions. A coordinated and data-driven approach across government is needed to drive policies that are ambitious enough to simultaneously and equitably tackle both the climate crisis and worsening air-quality inequities in the United States.

Technologies that are coming down in cost enable ground-level, real-time, and neighborhood-scale observations of GHG and air-pollutant levels. These data support cost-effective mapping of carbon dioxide (CO2) and air-quality related emissions (such as PM2.5, ozone, CO, and nitrogen oxides) to aid in forecasting local air quality, conducting GHG inventories, detecting pollution hotspots, and assessing the effectiveness of policies designed to reduce air pollution and GHG emissions. The result can be more successful, targeted strategies to reduce climate impacts, improve human health, and ensure environmental equity.
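As a purely illustrative sketch of one analysis these data enable, the snippet below flags pollution hotspots by comparing each sensor's 24-hour average PM2.5 reading against the EPA's 24-hour PM2.5 standard of 35 micrograms per cubic meter. All sensor IDs, readings, and function names are hypothetical.

```python
# Hypothetical sketch: flag PM2.5 "hotspots" in a neighborhood sensor
# network by comparing each sensor's 24-hour average to the EPA 24-hour
# PM2.5 standard (35 micrograms per cubic meter).
from statistics import mean

EPA_PM25_24H = 35.0  # ug/m3, EPA 24-hour PM2.5 standard

def find_hotspots(hourly_readings, threshold=EPA_PM25_24H):
    """hourly_readings: {sensor_id: [24 hourly PM2.5 values in ug/m3]}.
    Returns {sensor_id: 24-hour average} for sensors above the threshold."""
    return {
        sensor_id: round(mean(values), 1)
        for sensor_id, values in hourly_readings.items()
        if mean(values) > threshold
    }

# Hypothetical readings from two neighborhood sensors:
readings = {
    "sensor-nw-12": [40.0] * 24,  # persistently elevated near a highway
    "sensor-se-03": [8.0] * 24,   # typical background levels
}
print(find_hotspots(readings))  # only sensor-nw-12 is flagged
```

In a real deployment this comparison would be one small step in a larger pipeline that also fuses satellite and weather data, but it shows how neighborhood-scale readings translate directly into actionable signals.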

Pilot projects are proving the value of hyper-local GHG and air-quality sensor networks. Multiple universities, philanthropies, and nongovernmental organizations (NGOs) have launched pilot projects deploying local, real-time GHG and air-pollutant sensors in cities including Los Angeles; New York City; Houston, Texas; Providence, Rhode Island; and the San Francisco Bay Area. In the San Francisco Bay Area, for instance, a dense network of 70 sensors enabled researchers to closely investigate how movement patterns changed as a result of the COVID-19 pandemic. Observations from local air-quality sensors could be used to evaluate policies aimed at increasing electric-vehicle deployment, to demonstrate how CO and NOx emissions from vehicles change day to day, and to show that emissions from heavy-duty trucks disproportionately impact lower-income neighborhoods and neighborhoods of color. The federal government can and should incorporate lessons learned from these pilot projects in designing a national network of air-quality sensors in cities across the country. 

Components of a national air-quality sensor network are in place. On-the-ground sensor measurements provide essential ground-level, high-spatial-density measurements that can be combined with data from satellites and other observing systems to create more accurate climate and air-quality maps and models for regions, states, and the country. Through sophisticated computational models, for instance, weather data from the National Oceanic and Atmospheric Administration (NOAA) are already being combined with existing satellite data and data from ground-level dense sensor networks to help locate sources of GHG emissions and air pollution in cities throughout the day and across seasons. The Environmental Protection Agency (EPA) is working on improving these measurements and models by encouraging development of standards for low-cost sensor data. Finally, data from the pilot projects referenced above are being used on an ad hoc basis to inform policy. Data showing that CO2 emissions from the vehicle fleet are decreasing faster than expected in cities with granular emissions monitoring are evidence that policies designed to reduce GHG emissions are working as intended or better. Federal leadership is needed to bring the impacts of such insights to a larger and even more impactful scale.

A national network of hyper-local GHG and air-quality sensors will contribute to K–12 science curricula. The University of California, Berkeley partnered with the National Aeronautics and Space Administration (NASA) on the GLOBE educational program. The program provides ideas and materials for K–12 activities related to climate education and data literacy that leverage data from dense local air-quality sensor networks. Data from a national air-quality sensor network would expand opportunities for this type of place-based learning, motivating students with projects that incorporate observations made on the roofs of their schools or nearby in their neighborhoods to investigate the atmosphere, climate, and use of data in scientific analyses.

Scaling a national network of local GHG and air-quality sensors to include hundreds of cities will yield major economies of scale. A national air-quality sensor network that includes 300 American cities — essentially, all U.S. cities with populations greater than 100,000 — will drive down sensor costs and drive up sensor quality by growing the relevant market. Scaling up the network will also lower operational costs of merging large datasets, interpreting those data, and communicating insights to the public. This city-federal collaboration would provide validated data needed to prove which national and local policies to improve air quality and reduce emissions work, and to weed out those that don’t. 

Plan of Action

The National Oceanic and Atmospheric Administration (NOAA), in partnership with the Bureau of Economic Analysis (BEA), the Centers for Disease Control and Prevention (CDC), the Environmental Protection Agency (EPA), the National Aeronautics and Space Administration (NASA), the National Institute of Standards and Technology (NIST), and the National Science Foundation (NSF), should lead a $100 million “300 by ’30: The American City Urban Air Challenge” to deploy low-cost, real-time, ground-based sensors by the year 2030 in all 300 U.S. cities with populations greater than 100,000 residents.

The initiative could be organized and managed by region through an expanded NOAA Regional Collaboration Network, under the auspices of NOAA’s Office of Oceanic and Atmospheric Research. NOAA is responsible for weather and air-quality forecasting and already manages a large suite of global CO2 and air-quality-related observations along with local weather observations. In a complementary manner, the “300 by ‘30” sensor network would measure CO2, CO (carbon monoxide), NO (nitric oxide), NO2 (nitrogen dioxide), O3 (ozone), and PM2.5 (particulate matter 2.5 microns or smaller in diameter) at the neighborhood scale. “300 by ‘30” network operators would coordinate data integration and management within and across localities and report findings to the public through a uniform portal maintained by the federal government. Overall, NOAA would coordinate sensor deployment, network integration, and data management, and would manage the transition from research to operations. NOAA would also work with NIST and EPA to provide uniform formats for collecting and sharing data.
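To make the need for uniform data formats concrete, the sketch below shows one hypothetical shape such a shared sensor record could take (none of these field names are an adopted standard): each observation is timestamped, geolocated, and serialized as JSON so that any city or the federal portal could ingest it.

```python
# Hypothetical sketch of a uniform "300 by '30" sensor record.
# Field names and values are illustrative assumptions, not a real standard.
import json
from dataclasses import dataclass, asdict

@dataclass
class SensorReading:
    sensor_id: str      # network-unique identifier
    city: str
    lat: float
    lon: float
    timestamp_utc: str  # ISO 8601 timestamp
    pollutant: str      # one of: CO2, CO, NO, NO2, O3, PM2.5
    value: float
    unit: str           # e.g. "ppm" for gases, "ug/m3" for PM2.5

reading = SensorReading(
    sensor_id="oak-017", city="Oakland", lat=37.8044, lon=-122.2712,
    timestamp_utc="2030-06-01T14:00:00Z", pollutant="PM2.5",
    value=12.4, unit="ug/m3",
)
record = json.dumps(asdict(reading))  # uniform wire format for the portal
print(record)
```

A common record shape like this is what lets NOAA, NIST, and EPA merge observations from hundreds of independently operated city networks into one national dataset.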

Though NOAA is the natural agency to lead the “300 by ‘30” initiative, other federal agencies can and should play key supporting roles. NSF can support new approaches to instrument design and major innovations in data and computational science methods for analysis of observations that would transition rapidly to practical deployment. NIST can provide technical expertise and leadership in much-needed standards-setting for GHG measurements. NASA can advance the STEM-education portion of this initiative (see below), showing educators and students how to observe GHGs and air quality in their neighborhoods and how to link ground-level observations to observations made from space. NASA can also work with NOAA to merge high-density ground-level and wide-area space-based datasets. BEA can develop local models to provide the nonpartisan, nonpolitical economic information cities will need to inform urban air-policy decisions triggered by insights from the sensor network. Similarly, the EPA can help guide cities in using climate and air-quality information from the sensor network. The CDC can use network data to better characterize public-health threats related to climate change and air pollution, as well as to coordinate responses with state and local health officials. 

The “300 by ‘30” challenge should be deployed in a phased approach that (i) leverages lessons learned from pilot projects referenced above, and (ii) optimizes cost savings and efficiencies from increasing the number of networked cities. Leveraging its Regional Collaboration Network, NOAA would launch the Challenge in 2023 with an initial cohort of nine cities (one in each of NOAA’s nine regions). The Challenge would expand to 25 cities by 2024, 100 cities by 2027, and all 300 cities by 2030. The Challenge would also be open to participation by states and territories whose largest cities have populations less than 100,000.

The challenge should also build on NASA’s GLOBE program to develop and share K–12 curricula, activities, and learning materials that use data from the sensor network to advance climate education and data literacy and to inspire students to pursue higher education and careers in STEM. NOAA and NSF could provide additional support in promoting observation-based science education in classrooms and museums, illustrating how basic scientific observations of the atmosphere vary by neighborhood and collectively contribute to weather, air-quality, and climate models.

Frequently Asked Questions
Has something like the “300 by ‘30” initiative been tried before?

Recent improvements in sensor technologies are only now enabling the use of dense mesh networks of sensors to precisely pinpoint levels and sources of GHGs and air pollutants in real time and at the neighborhood scale. Pilot projects in the San Francisco Bay Area, Los Angeles, Houston, Providence, and New York City have proven the value of localized networks of air-quality sensors, and have demonstrated how data from these sensors can inform emissions-reductions policies. While individual localities, states, and the EPA are continuing to support pilot projects, there has never been a national effort to deploy networked GHG and air-quality sensors in all of the nation’s largest cities, nor has there been a concerted effort to link data collected from such sensors at scale.

If the proposed sensor networks will be inherently local, then why does the federal government need to get involved?

Although urban areas are responsible for over 70% of national GHG emissions and concentrate the nation’s worst air pollution, even cities with existing policy approaches to GHGs and air quality lack the information to rapidly evaluate whether their emissions-reduction policies are effective. Further, COVID-19 has impacted local revenue, strained municipal budgets, and understandably diverted attention from environmental issues in many localities. Federal involvement is needed to (i) give cities the equipment, data, and support they need to make meaningful progress on emissions of GHGs and air pollutants, (ii) coordinate efforts and facilitate exchange of information and lessons learned across cities, and (iii) provide common standards for data collection and sharing.

Is there precedent for this initiative in other countries?

A pilot project including a 20-device sensor network was led by U.S. scientists and developed for the City of Glasgow, Scotland as a demonstration for the COP26 climate conference. The City of Glasgow is an active partner in efforts to expand sensor networks, and is one model for how scientists and municipalities can work together to develop needed information presented in a useful format.

Where will the sensors come from?

Sensors appropriate for this initiative can be manufactured in the United States. A design for a localized network of air-quality sensors, each the size of a shoe box, has been described in freely available literature by researchers at the University of California, Berkeley. Domestic manufacture, installation, and maintenance of sensors needed for a national monitoring network will create stable, well-paying jobs in cities nationwide.

Which organizations are already working in this space?

Leading scientific societies Optica (formerly OSA) and the American Geophysical Union (AGU) are spearheading the effort to provide “actionable science” to local and regional policymakers as part of their Global Environmental Measurement & Monitoring (GEMM) Initiative. Optica and AGU are also exploring opportunities with the United Nations Human Settlements Program (UN-Habitat) and the World Meteorological Organization (WMO) to expand these efforts. GHG- and air-quality-measurement pilot projects referenced above are based on the BEACO2N Network of sensors developed by University of California, Berkeley Professor Ronald Cohen.

What about methane?

Broadening the Knowledge Economy through Independent Scholarship

Summary

Scientists and scholars in the United States are faced with a relatively narrow set of traditional career pathways. Our lack of creativity in defining the scholarly landscape is limiting our nation’s capacity for innovation by stifling exploration, out-of-the-box thinking, and new perspectives.

This does not have to be the case. The rise of the gig economy has positioned independent scholarship as an effective model for people who want to continue doing research outside of traditional academic structures, in ways that best fit their life priorities. New research institutes are emerging to support independent scholars and expand access to the knowledge economy.

The Biden-Harris Administration should further strengthen independent scholarship by (1) facilitating partnerships between independent scholarship institutions and conventional research entities; (2) creating professional-development opportunities for independent scholars; and (3) allocating more federal funding for independent scholarship.

Challenge and Opportunity

The academic sector is often seen as a rich source of new and groundbreaking ideas in the United States. But it has become increasingly evident that pinning all our nation’s hopes for innovation and scientific advancement on the academic sector is a mistake. Existing models of academic scholarship are limited, leaving little space for exploration, out-of-the-box thinking, and new perspectives. Our nation’s universities, which are shedding full-time faculty positions at an alarming rate, no longer offer young thinkers career opportunities as reliable and attractive as they once did. Conventional scholarly career pathways, which were initially created with male breadwinners in mind, are strewn with barriers to broad participation. And outside of academia, there is a distinct lack of market incentive structures that support geographically diverse development and implementation of new ideas. 

These problems are compounded by the fact that conventional scholarly training pathways are long, expensive, and unforgiving. A doctoral program takes an average of 5.8 years and $115,000 to complete. The federal government spends $75 billion per year on financial assistance for students in higher education. Yet inflexible academic structures prevent our society from maximizing returns on these investments in human capital. Individuals who pursue and complete advanced scholarly training but then opt to take a break from the traditional academic pipeline — whether to raise a family, explore another career path, or deal with a personal crisis — can find it nearly impossible to return. This problem is especially pronounced among first-generation students, women of color, and low-income groups. A 2020 study found that of the 67% of Ph.D. students who wanted to stay in academia after completing their degrees, only 30% did. Outside of academia, though, there are few obvious ways for even highly trained individuals to contribute to the knowledge economy. The upshot is that every year, innumerable great ideas and scholarly contributions are lost because ideators and scholars lack suitable venues in which to share them.

Fortunately, an alternative model exists. The rise of the gig economy has positioned independent scholarship as a viable approach to work and research. Independent scholarship recognizes that research doesn’t have to be a full-time occupation, be conducted via academic employment, or require attainment of a certain degree. By being relatively free of productivity incentives (e.g., publish or perish), independent scholarship provides a flexible work model and career fluidity that allows people to pursue research interests alongside other life and career goals. 

Online independent-scholarship institutes (ISIs) like the Ronin Institute, IGDORE, and others have recently emerged to support independent scholars. By providing an affiliation, a community, and a boost of confidence, such institutes empower independent scholars to do meaningful research. Indeed, the original perspectives and diverse life experiences that independent scholars bring to the table increase the likelihood that such scholars will engage in high-risk research that can deliver tremendous benefits to society. 

But it is currently difficult for ISIs to help independent scholars reach their full potential. ISIs generally cannot provide affiliated individuals with access to resources like research ethics review boards, software licenses, laboratory space, scientific equipment, computing services, and libraries. There is also concern that without intentionally structuring ISIs around equity goals, ISIs will develop in ways that marginalize underrepresented groups. ISIs (and individuals affiliated with them) are often deemed ineligible for research grants, and/or are outcompeted for grants by well-recognized names and affiliations in academia. Finally, though independent scholarship is growing, there is still relatively little concrete data on who is engaging in independent scholarship, and how and why they are doing so. 


Strengthening support for ISIs and their affiliates is a promising way to fast-track our nation towards needed innovation and technological advancements. Augmenting the U.S. knowledge-economy infrastructure with agile ISIs will pave the way for new and more flexible scholarly work models; spur greater diversity in scholarship; lift up those who might otherwise be lost Einsteins; and increase access to the knowledge economy as a whole.

Plan of Action

The Biden-Harris Administration should consider taking the following steps to strengthen independent scholarship in the United States: 

  1. Facilitate partnerships between independent scholarship institutions and conventional research entities.
  2. Create professional-development opportunities for independent scholars.
  3. Allocate more federal funding for independent scholarship.

More detail on each of these recommendations is provided below.

1. Facilitate partnerships between ISIs and conventional research entities.

The National Science Foundation (NSF) could provide $200,000 to fund a Research Coordination Network or INCLUDES alliance of ISIs. This body would provide a forum for ISIs to articulate their main challenges and identify solutions specific to the conduct of independent research (see FAQ for a list) — solutions may include exploring Cooperative Research & Development Agreements (CRADAs) as mechanisms for accessing physical infrastructure needed for research. The body would help establish ISIs as recognized complements to traditional research facilities such as universities, national laboratories, and private-sector labs. 

NSF could also include ISIs in its proposed National Networks of Research Institutes (NNRIs). ISIs meet many of the criteria laid out for NNRI affiliates, including access to cross-sectoral partnerships (many independent scholars work in non-academic domains), untapped potential among diverse scholars who have been marginalized by — or who have chosen to work outside of — conventional research environments, novel approaches to institutional management (such as community-based approaches), and a model that truly supports the “braided river” or “ecosystem” career-pathway model. 

The overall goal of this recommendation is to build ISI capacity to be effective players in the broader knowledge-economy landscape. 

2. Create professional-development opportunities for independent scholars. 

To support professional development among ISIs, the U.S. Small Business Administration and/or the NSF America’s Seed Fund program could provide funding to help ISI staff develop their business models, including funding for training and coaching on leadership, institutional administration, financial management, communications, marketing, and institutional policymaking. To support professional development among independent scholars directly, the Office of Postsecondary Education at the Department of Education — in partnership with professional-development programs like Activate, the Department of Labor’s WANTO grant program, and the Minority Business Development Agency — can help ISIs create professional-development programs customized to the unique needs of independent scholars. Such programs would provide mentorship and apprenticeship opportunities for independent scholars (particularly those underrepresented in the knowledge economy), led by scholars experienced with working outside of conventional academia.

The overall goal of this recommendation is to help ISIs and individuals create and pursue viable work models for independent scholarship. 

3.  Allocate more federal funding for independent scholarship.

Federal funding agencies like NSF struggle to diversify the types of projects they support, despite offering funding for exploratory high-risk work and for early-career faculty. A mere 4% of NSF funding is provided to “other” entities outside of private industry, federally supported research centers, and universities. But outside of the United States, independent scholarship is recognized and funded. NSF and other federal funding agencies should consider allocating more funding for independent scholarship. Funding opportunities should support individuals over institutions, have low barriers to entry, and prioritize provision of part-time funding over longer periods of time (rather than full funding for shorter periods of time).

Funding opportunities could include: 

Conclusion

Our nation urgently needs more innovative, broadly sourced ideas. But limited traditional career options are discouraging participation in the knowledge economy. By strengthening independent scholarship institutes and independent scholarship generally, the Biden-Harris Administration can help quickly diversify and grow the pool of people participating in scholarship. This will in turn fast-track our nation towards much-needed scientific and technological advancements.

Frequently Asked Questions
What comprises the traditional academic pathway?

The traditional academic pathway consists of 4–5 years of undergraduate training (usually unfunded), 1–3 years for a master’s degree (sometimes funded; not always a precondition for enrollment in a doctoral program), 3–6+ years for a doctoral degree (often at least partly funded through paid assistantships), 2+ years of a postdoctoral position (fully funded at internship salary levels), and 5–7 years to complete the tenure-track process culminating in appointment to an Associate Professor position (fully funded at professional salary levels).

What is independent scholarship?

Independent scholarship in any academic field is, as defined by the Effective Altruism Forum, scholarship “conducted by an individual who is not employed by any organization or institution, or who is employed but is conducting this research separately from that”.

What benefits can independent scholars offer academia and the knowledge economy?

Independent scholars can draw on their varied backgrounds and professional experience to bring fresh and diverse worldviews and networks to research projects. Independent scholars often bring a community-oriented and collaborative approach to their work, which is helpful for tackling pressing transdisciplinary social issues. For students and mentees, independent scholars can provide connections to valuable field experiences, practicums, research apprenticeships, and career-development opportunities. In comparison to their academic colleagues, many independent scholars have more time flexibility, and are less prone to being influenced by typical academic incentives (e.g., publish or perish). As such, independent scholars often demonstrate long-term thinking in their research, and may be more motivated to work on research that they feel personally inspired by.

What is an independent scholarship institute (ISI)?

An ISI is a legal entity or organization (e.g., a nonprofit) that offers an affiliation for people conducting independent scholarship. ISIs can take the form of research institutes, scholarly communities, cooperatives, and other structures. Different ISIs can have different goals, such as emphasizing work within a specific domain or developing different ways of doing scholarship. Many ISIs exist solely online, which allows them to function in very low-cost ways while retaining a broad diversity of members. Independent scholarship institutes differ from professional societies, which do not provide an affiliation for individual researchers.

Why does a purportedly independent scholar need to be affiliated with an institute?

As the Ronin Institute explains, federal grant agencies and many foundations in the United States restrict their support to individuals affiliated with legally recognized classes of institutions, such as nonprofits. For individual donors, donations made to independent scholars via nonprofits are tax-deductible. Being affiliated with a nonprofit dedicated to supporting independent scholars enables those scholars to access the funding needed for research. In addition, many independent scholars find value in being part of a community of like-minded individuals with whom they can collaborate and share experiences and expertise.

How do ISIs differ from universities?

Universities are designed to support large, complex grants requiring considerable infrastructure and full-time support staff; their incentive structures for faculty and students mirror these needs. In contrast, research conducted through an independent-scholarship model is often part-time, inexpensive, and conducted by already-trained researchers with little more than a personal computer. With their mostly online structures, ISIs can be very cost-effective. They have agile and flexible frameworks, with limited bureaucracy and fewer competing priorities. ISIs are best positioned to manage grants that are stand-alone, can be administered with lower indirect rates, require little physical research infrastructure, and fund individuals partnering with collaborators at universities. While toxic academic environments often push women and minority groups out of universities and academia, agile ISIs can take swift and decisive action to construct healthier work environments that are more welcoming of non-traditional career trajectories. These qualities make ISIs great places for testing high-risk, novel ideas.

What types of collaboration agreements could traditional knowledge-economy institutions enter into with ISIs?

Options include:


Curing Alzheimer’s by Investing in Aging Research

Summary

Congress allocates billions of dollars annually to Alzheimer’s research in hopes of finding an effective prophylactic, treatment, or cure. But these massive investments have little likelihood of paying off absent a game-changing improvement in our present knowledge of biology. Funds currently earmarked for Alzheimer’s research would be more productive if they were instead invested into deepening understanding of aging biology at the cell, tissue, and organ levels. Fundamental research advances in aging biology would directly support better outcomes for patients with Alzheimer’s as well as a plethora of other chronic diseases associated with aging — diseases that are the leading cause of mortality and disability, responsible for 71% of annual deaths worldwide and 79% of years lived with disability. Congress should allow the National Institute on Aging to spend funds currently earmarked specifically for Alzheimer’s on research into aging biology more broadly. The result would be a society better prepared for the imminent health challenges of an aging population.

Challenge and Opportunity

The National Institutes of Health (NIH) estimates that 6.25 million Americans now have Alzheimer’s disease, and that due to an aging population, that number will more than double to 13.85 million by the year 2060. The Economist similarly estimates that some 50 million people worldwide suffer from dementia, and that this number will increase to 150 million by the year 2050. These dire statistics, along with astute political maneuvering by Alzheimer’s advocates, have led Congress to earmark billions of dollars of federal health-research funds for Alzheimer’s disease.

President Obama’s FY2014 and FY2015 budget requests explicitly cited the need for additional Alzheimer’s research at NIH. In FY2014, Congress responded by giving NIH’s National Institute on Aging (NIA) a small but disproportionate increase in funding relative to other national institutes, “in recognition of the Alzheimer’s disease research initiative throughout NIH.” Congress’s explanatory statement for its FY2015 appropriations laid out good reasons not to earmark a specific portion of NIH funds for Alzheimer’s research, stating:

“In keeping with longstanding practice, the agreement does not recommend a specific amount of NIH funding for this purpose or for any other individual disease. Doing so would establish a dangerous precedent that could politicize the NIH peer review system. Nevertheless, in recognition that Alzheimer’s disease poses a serious threat to the Nation’s long-term health and economic stability, the agreement expects that a significant portion of the recommended increase for NIA should be directed to research on Alzheimer’s. The exact amount should be determined by scientific opportunity of additional research on this disease and the quality of grant applications that are submitted for Alzheimer’s relative to those submitted for other diseases.”

But this position changed suddenly in FY2016, when Congress earmarked $936 million for Alzheimer’s research. The amount earmarked by Congress for Alzheimer’s research has risen almost linearly every year since then, reaching $3.1 billion in FY2021 (Figure 1).

This tsunami of funding has been unprecedented for the NIA. The seemingly limitless availability of money for Alzheimer’s research has created a perverse incentive for the NIH and NIA to solicit additional Alzheimer’s funding, even as agencies struggle to deploy existing funding efficiently. The NIH Director’s latest report to Congress on Alzheimer’s funding suggests that with an additional $226 million per year in funding, the NIH and NIA could effectively treat or prevent Alzheimer’s disease and related dementias by 2025. 

This is a laughable untruth. No cure for Alzheimer’s is in the offing. Progress on Alzheimer’s research is stalling and commercial interest is declining. Of the 413 Alzheimer’s clinical trials performed in the United States between 2002 and 2012, 99.6% failed. Recent federal investments seemed to be paying off when in 2021 the Food and Drug Administration (FDA) approved Aduhelm, the first new treatment for Alzheimer’s since 2003. But the approval was based on the surrogate endpoint of amyloid plaques in the brain as observed by PET scans, not on patient outcomes. In its first months on the market, Aduhelm visibly flopped. Scientists subsequently called on the FDA to withdraw marketing approval for the drug. If an effective treatment were likely by 2025, Big Pharma would be doubling down. But Pfizer announced it was abandoning Alzheimer’s research in 2018.

The upshot is clear: lavish funding on treatments and cures for a disease can only do so much absent knowledge of that disease’s underlying biological mechanisms. We as a society must resist the temptation to waste money on expensive shots in the dark, and instead invest strategically into understanding the basic biochemical and genetic mechanisms underlying aging processes at the cell, tissue, and organ levels.

Plan of Action

Aging is the number-one risk factor for Alzheimer’s disease, as it is for many other diseases. All projections of an increasing burden of Alzheimer’s are based on the fact that our society is getting older. Indeed, even if a miraculous cure for Alzheimer’s were to emerge, we would still have to contend with an onslaught of other impending medical and social costs. 

Economists and scientists have estimated that extending average life expectancy in the United States by one year would be worth $38 trillion. But funding for basic research on aging remains tight. Outside of the NIA, several foundations in the United States are actively funding aging research: the American Federation for Aging Research (AFAR), the Glenn Foundation for Medical Research, and the SENS Foundation each contribute a few million dollars per year to aging research. Privately funded fast grants have backed bold aging projects with an additional $26 million. 

This relatively small investment in basic research has generated billions in private funding to commercialize findings. Startups raised $850 million in 2018 to target aging and age-related diseases. Google’s private research arm Calico is armed with billions and a pharmaceutical partner in AbbVie, and the Buck Institute’s Unity Biotechnology launched an initial public offering (IPO) in 2018. In 2021, Altos Labs raised hundreds of millions to commercialize cellular reprogramming technology. Such dynamism and progress in aging research contrasts markedly with the stagnation in Alzheimer’s research and indicates that the former is a more promising target for federal research dollars.

Now is the time for the NIA to drive science-first funding for the field of aging. Congress should maintain existing high funding levels at NIA, but this funding should no longer be earmarked solely for Alzheimer’s research. In every annual appropriation since FY2016, the House and Senate appropriations committees have issued a joint explanatory statement that has the force of law and includes the Alzheimer’s earmark. These committees should revert to their FY2015 position against politically directing NIH funds towards particular ends. The past six years have shown such political direction to be a failed experiment.

Removing the Alzheimer’s earmark would allow the NIA to use its professional judgment to fund the most promising research into aging based on scientific opportunity and the quality of the grant applications it receives. We expect that this in turn would cause agency-funded research to flourish and stimulate further research and commercialization from industry, as privately funded aging research already has. Promising areas that the NIA could invest in include building tools for understanding molecular mechanisms of aging, establishing and validating aging biomarkers, and funding more early-stage clinical trials for promising drugs. By building a better understanding of aging biology, the NIA could do much to render even Alzheimer’s disease treatable.

Frequently Asked Questions
How did Congress get so interested in Alzheimer’s disease? What recent actions has Congress taken on funding for Alzheimer’s research?

In 2009, a private task force calling itself the Alzheimer’s Study Group released a report entitled “A National Alzheimer’s Strategic Plan.” The group, co-chaired by former Speaker of the House Newt Gingrich and former Nebraska Senator Bob Kerrey, called on Congress to immediately increase funding for Alzheimer’s and dementia research at the NIH by $1 billion per year.


In response to the report, Senators Susan Collins and Evan Bayh introduced the National Alzheimer’s Project Act (NAPA), which President Barack Obama signed into law in 2011. NAPA requires the Department of Health and Human Services to produce an annual assessment of the nation’s progress in preparing for an escalating burden of Alzheimer’s disease. This annual assessment is called the National Plan to Address Alzheimer’s Disease. The first National Plan, released in 2012, established a goal of effectively preventing or treating Alzheimer’s disease by 2025. In addition, the Alzheimer’s Accountability Act, which passed in the 2015 omnibus, gives the NIH director the right and the obligation to report directly to Congress on the amount of additional funds needed to meet the goals of the National Plan, including the self-imposed 2025 goal.

Why is treating Alzheimer’s so hard?

Understanding diseases such as Alzheimer’s that progress over a long period of time requires complex clinical studies. Lessons learned from past research indicate that, for such diseases, findings in animal models don’t necessarily translate to humans. Heterogeneity in disease presentation, imprecise clinical measures, uncertain relevance of target biomarkers, and difficulty in understanding underlying causes exacerbate the problem for Alzheimer’s specifically.


Alzheimer’s is also a whole-system, multifactorial disease. Dementia is associated with a decreased variety of gut microbiota. Getting cataract surgery seemingly reduces Alzheimer’s risk. Inflammatory responses from the immune system can aggravate neurodegenerative diseases. The blood-brain barrier uptakes less plasma protein with age. The list goes on. Understanding Alzheimer’s hence requires understanding of many other biological systems.

What is the amyloid hypothesis?

Alzheimer’s is named after Alois Alzheimer, a German scientist credited with publishing the first case of the disease in 1906. In the post-mortem brain sample of his patient, he identified extracellular deposits of clumped amyloid-beta (Aβ) protein, now known as amyloid plaques. In 1991, David Allsop and John Hardy proposed the amyloid hypothesis after discovering a pathogenic mutation in the APP (Aβ precursor protein) gene on chromosome 21. The mutation led to increased Aβ deposits, which presented as familial early-onset Alzheimer’s disease.


The hypothesis suggested that Alzheimer’s follows the pathological cascade of Aβ aggregation → tau phosphorylation → neurofibrillary tangles → neuronal death. These results indicated that Aβ could be a drug target for Alzheimer’s disease.


In the 1990s, Elan Pharmaceuticals proposed a vaccine against Alzheimer’s that would stop or slow the formation of Aβ aggregates. It was a compelling idea. In the following decades, drug development centered on this hypothesis, leading to the current approaches to Alzheimer’s treatment: Aβ inhibition (β- and γ-secretase inhibitors), anti-aggregation (metal chelators), Aβ clearing (protease-activity-regulating drugs), and immunotherapy.


In the last decade, the growing arsenal of Aβ therapies fueled excitement that we were close to an Alzheimer’s treatment. The 2009 report, the 2012 national plan, and Obama’s funding requests seemed to confirm that this was the case.


However, the strength of the amyloid hypothesis has declined since then. Since the shutdown of the first Alzheimer’s vaccine in 2002, numerous other pharmaceutical companies have tried and failed to create their own vaccines, despite many promising candidates shown to clear Aβ plaques in animal models. Monoclonal antibody treatments (of which aducanumab is an example) have reduced free plasma concentrations of Aβ by 90%, binding to all forms of Aβ, from monomeric and soluble to fibrillar and oligomeric. Yet these treatments have suffered high-profile late-stage clinical trial failures in the last five years. Similar failures surround other approaches to Alzheimer’s drug development.


There is no doubt that these therapies succeed in reducing Aβ concentrations in preclinical studies. But given their continued failure in late-stage clinical trials, Aβ may not play as major a role in the disease mechanism as hypothesized.

Creating a Public System of National Laboratory Schools

Summary

The computational revolution enables and requires an ambitious reimagining of public high-school and community-college designs, curricula, and educator-training programs. In light of a much-changed — and much-changing — society, we as a nation must revisit basic assumptions about what constitutes a “good” education. That means reconsidering whether traditional school schedules still make sense, updating outdated curricula to emphasize in-demand skills (like computer programming), bringing current perspectives to old subjects (like computational biology), and piloting new pedagogies (like project-based approaches) better aligned to modern workplaces. To do this, the Federal Government should establish a system of National Laboratory Schools in parallel to its existing system of Federally Funded Research & Development Centers (FFRDCs).

The National Science Foundation (NSF) should lead this work, partnering with the Department of Education (ED) to create a Division for School Invention (DSI) within its Technology, Innovation, and Partnerships (TIP) Directorate. The DSI would act as a platform analogous to the Small Business Innovation Research (SBIR) program, catalyzing Laboratory Schools by providing funding and technical guidance to federal, state, and local entities pursuing educational or cluster-based workforce-development initiatives.

The new Laboratory Schools would take inspiration from successful, vertically integrated research and design institutes like Xerox PARC and the Mayo Clinic in how they organize research, as well as from educational systems like Governor’s Schools and Early College High Schools in how they organize their governance. Each Laboratory School would work with a small, demographically and academically representative cohort and be financially sustainable on local per-capita education budgets.
Collectively, National Laboratory Schools would offer much-needed “public sandboxes” to develop and demonstrate novel school designs, curricula, and educator-training programs rethinking both what and how people learn in a computational future.

Challenge and Opportunity

Education is fundamental to individual liberty and national competitiveness. But the United States’ investment in advancing the state of the art is falling behind. 

Innovation in educational practice has been incremental. Neither the standards-based movement nor the charter-school movement departed significantly from traditional models. Accountability and outcomes-based incentives like No Child Left Behind suffer from the same issue.

The situation in research is not much better: NSF and ED’s combined spending on education research is barely twice the research and development budget of Nintendo. And most of that research focuses on refining traditional school models (e.g. presuming 50-minute classes and traditional course sequences).

Despite all these efforts, we are still seeing unprecedented declines in students’ math and reading scores.

Meanwhile, the computational revolution is widening the gap between what school teaches and the skills needed in a world where work is increasingly creative, collaborative, and computational. Computation’s role in culture, commerce, and national security is rapidly expanding; computational approaches are transforming disciplines from math and physics to history and art. School can’t keep up.

For years, research has told us that individualized, competency- and project-based approaches can reverse academic declines while meeting the demands of industry and academia for critical thinking, collaboration, and creative problem-solving skills. But schools lack the capacity to follow suit.

Clearly, we need a different approach to research and development in education: We need prototypes, not publications. While studies evaluating and improving existing schools and approaches have their place, there is a real need now for “living laboratories” that develop and demonstrate wholly transformative educational approaches.

Schools cannot do this on their own. Constitutionally and financially, education is federated to states and districts. No single public actor has the incentives, expertise, and resources to tackle ambitious research and design — much less to translate research into practice on a meaningful scale. Private actors like curriculum developers or educational technologists sell to public actors, meaning private-sector innovation is constrained by public school models. Graduate schools of education won’t take the brand risk of running their own schools, and researchers won’t pursue unfunded or unpublishable questions. We commend the Biden-Harris administration’s Multi-Agency Research and Development Priorities for centering inclusive innovation and science, technology, engineering, and math (STEM) education in the nation’s policy agenda. But reinventing school requires a new kind of research institution, one which actually operates a school, developing educators and new approaches firsthand.

Luckily, the United States largely invented the modern research institution. It is time we do so again. Much as our nation’s leadership in science and technology was propelled by the establishment of land-grant universities in the late 19th century, we can trigger a new era of U.S. leadership in education by establishing a system of National Laboratory Schools. The Laboratory Schools will serve as vertically integrated “sandboxes” built atop fully functioning high schools and community colleges, reinventing both how students learn and how we develop educators for a computational future.

Plan of Action

To catalyze a system of National Laboratory Schools, the NSF should establish a Division for School Invention (DSI) within its Technology, Innovation, and Partnerships (TIP) directorate. With an annually escalating investment over five years (starting at $25 million in FY22 and increasing to $400 million by FY26), the DSI could support development of 100 Laboratory Schools nationwide.

The DSI would support federal, state, and local entities — and their partners — in pursuing education or cluster-based workforce-development initiatives that (i) center computational capacities, (ii) emphasize economic inclusion or racial diversity, and (iii) could benefit from a high-school or community-college component.

DSI support would entail:

  1. Competitive matching grants modeled on SBIR grants. These grants would go towards launching Laboratory Schools and sustaining those that demonstrate success.
  2. Technical guidance to help Laboratory Schools (i) innovate while maintaining regulatory compliance, and (ii) develop financial models workable on local education budgets.
  3. Accreditation support, working with partner executives (e.g., Chairs of Boards of Higher Education) where appropriate, to help Laboratory Schools establish relationships with accreditors, explain their educational models, and document teacher and student work for evaluation purposes.
  4. Responsible-research support, including providing Laboratory Schools assistance with obtaining Federalwide Assurance (FWA) and access to partners’ Institutional Review Boards (IRBs).
  5. Convening and storytelling, raising awareness of and interest in Laboratory Schools’ mission and operations.

Launching at least ten National Laboratory Schools by FY23 would involve three primary steps. First, the White House Office of Science and Technology Policy (OSTP) should convene an expert group composed of (i) funders with a track record of attempting radical change in education and (ii) computational domain experts to design an evaluation process for the DSI’s competitive grants, secure industry and academic partners to help generate interest in the National Laboratory School System, and recruit the DSI’s first Director.

In parallel, Congress should issue one appropriations report asking NSF to establish a $25 million per year pilot Laboratory School program aligned with the Areas of Investment of the TIP Directorate’s Regional Innovation Accelerators (RIAs). Congress should issue a second appropriations report asking the Office of Elementary and Secondary Education (OESE) to release a Dear Colleague letter encouraging states that have spent less than 75% of their Elementary and Secondary School Emergency Relief (ESSER) or American Rescue Plan funding to propose a Laboratory School.

Finally, the White House should work closely with the DSI’s first Director to convene the Department of Defense Education Activity (DoDEA) and National Governors Association (NGA) to recruit partners for the National Laboratory Schools program. These partners would later be responsible for operational details like:

Focus will be key for this initiative. The DSI should exclusively support efforts that center:

  1. New public schools, not programs within (or reinventions of) existing schools.
  2. Radically different designs, not incremental evolutions.
  3. Computationally rich models that integrate computation and other modern skills into all subjects.
  4. Inclusive innovation focused on transforming outcomes for the poor and historically marginalized.

Conclusion

Imagine the pencil had just been invented, and we treated it the way we’ve treated computers in education. “Pencil class” and “pencil labs” would prepare people for a written future. We would debate the costs and benefits of one pencil per child. We would study how oral test performance changed after introducing one pencil per classroom, or after an after-school creative-writing program.

This all sounds absurd because the pencil and writing are integrated throughout our educational systems rather than considered in isolation. The pencil transforms both what and how we learn, but only when embraced as a foundational piece of the educational experience.

Yet this siloed approach is precisely the approach our educational system takes to computers and the computational revolution. In some ways, this is no great surprise. The federated U.S. school system isn’t designed to support invention, and research incentives favor studying and suggesting incremental improvements to existing school systems rather than reimagining education from the ground up. If we as a nation want to lead on education in the same way that we lead on science and technology, we must create laboratories to support school experimentation in the same way that we establish laboratories to support experimentation across STEM fields. Certainly, the federal government shouldn’t run our schools. But just as the National Institutes of Health (NIH) support cutting-edge research that informs evolving healthcare practices, so too should the federal government support cutting-edge research that informs evolving educational practices. By establishing a National Laboratory School system, the federal government will take the risk and make the investments our communities can’t on their own to realize a vision of an equitable, computationally rich future for our schools and students.

Frequently Asked Questions

Who

1. Why is the federal government the right entity to lead on a National Laboratory School system?

Transformative education research is slow (human development takes a long time, as does assessing how a given intervention changes outcomes), laborious (securing permissions to test an intervention in a real-world setting is often difficult), and resource-intensive (many ambitious ideas require running a redesigned school to explore properly). When other fields confront such obstacles, the public and philanthropic sectors step in to subsidize research (e.g., by funding large research facilities). But tangible education-research infrastructure does not exist in the United States.

Without R&D demonstrating new models (and solving the myriad problems of actual implementation), other public- and private-sector actors will continue to invest solely in supporting existing school models. No private sector actor will create a product for schools that don’t exist, no district has the bandwidth and resources to do it themselves, no state is incentivized to tackle the problem, and no philanthropic actor will fund an effort with a long, unclear path to adoption and prominence.

National Laboratory Schools are intended primarily as research, development, and demonstration efforts, meaning that they will be staffed largely by researchers and will pursue research agendas that go beyond the traditional responsibilities and expertise of local school districts. State and local actors are the right entities to design and operate these schools so that they reflect the particular priorities and strengths of local communities, and so that each school is well positioned to influence local practice. But funding and overseeing the National Laboratory School system as a whole is an appropriate role for the federal government.

2. Why is NSF the right agency to lead this work?

Over many years, NSF has developed substantial expertise in funding innovation through the SBIR/STTR programs, which award staged grants to support innovation and technology transfer. NSF also has experience researching education through its Directorate for Education and Human Resources (EHR). Finally, NSF’s new Directorate for Technology, Innovation, and Partnerships (TIP) has a mandate to “[create] education pathways for every American to pursue new, high-wage, good-quality jobs, supporting a diverse workforce of researchers, practitioners, and entrepreneurs.” NSF is the right agency to lead the National Laboratory Schools program because of its unique combination of experience, in-house expertise, mission relevance, and relationships with agencies, industry, and academia.

3. What role will OSTP play in establishing the National Laboratory School program? Why should they help lead the program instead of ED?

ED focuses on the concerns and priorities of existing schools. Ensuring that National Laboratory Schools emphasize invention and reimagining of educational models requires fresh strategic thinking and partnerships grounded in computational domain expertise.

OSTP has access to bodies like the President’s Council of Advisors on Science and Technology (PCAST) and the National Science and Technology Council (NSTC). Working with these bodies, OSTP can easily convene high-profile leaders in computation from industry and academia to publicize and support the National Laboratory Schools program. OSTP can also enlist domain experts who can act as advisors evaluating and critiquing the depth of computational work developed in the Laboratory Schools. And annually, in the spirit of the White House Science Fair, OSTP could host a festival showcasing the design, practices, and outputs of various Laboratory Schools.

Though OSTP and NSF will have primary leadership responsibilities for the National Laboratory Schools program, we expect that ED will still be involved as a key partner on topics aligned with ED’s core competencies (e.g., regulatory compliance, traditional best practices, responsible research practices, etc.).

4. What makes the Department of Defense Education Activity (DoDEA) an especially good partner for this work?

The DoDEA is an especially good partner because it is the only federal agency that already operates schools; reaches a student base that is large (more than 70,000 students, of whom more than 12,000 are high-school aged) as well as academically, socioeconomically, and demographically diverse; is more nimble than a traditional district; is in a position to appreciate and understand the full ramifications of the computational revolution; and is highly motivated to improve school quality and reduce turnover.

5. Why should the Division for School Invention (DSI) be situated within NSF’s TIP Directorate rather than EHR Directorate?

EHR has historically focused on the important work of researching (and to some extent, improving) existing schools. The DSI’s focus on invention, secondary/postsecondary education, and opportunities for alignment between cluster-based workforce-development strategies and Laboratory Schools’ computational emphasis make the DSI a much better fit for the TIP, which is not only focused on innovation and invention overall, but is also explicitly tasked with “[creating] education pathways for every American to pursue new, high-wage, good-quality jobs, supporting a diverse workforce of researchers, practitioners, and entrepreneurs.” Situating the DSI within TIP will not preclude DSI from drawing on EHR’s considerable expertise when needed, especially for evaluating, contextualizing, and supporting the research agendas of Laboratory Schools.

6. Why shouldn’t existing public schools be eligible to serve as Laboratory Schools?

Most attempts at organizational change fail. Invention requires starting fresh. Allowing existing public schools or districts to launch Laboratory Schools will distract from the ongoing educational missions of those schools and is unlikely to lead to effective invention. 

7. Who are some appropriate partners for the National Laboratory School program?

Possible partners include:

8. What should the profile of a team or organization starting a Laboratory School look like? Where and how will partners find these people?

At a minimum, the team should have experience working with youth, possess domain expertise in computation, be comfortable supporting both technical and expressive applications of computation, and have a clear vision for the practical operation of their proposed educational model across both the humanities and technical fields.

Ideally, the team should also have piloted versions of their proposed educational model approach in some form, such as through after-school programs or at a summer camp. Piloting novel educational models can be hard, so the DSI and/or its partners may want to consider providing tiered grants to support this kind of prototyping and develop a pipeline of candidates for running a Laboratory School.

To identify candidates to launch and operate a Laboratory School, the DSI and/or its partners can:

What

1. What is computational thinking, and how is it different from programming or computer science?

A good way to answer this question is to consider writing as an analogy. Writing is a tool for thought that can be used to think critically, persuade, illustrate, and so on. Becoming a skilled writer starts with learning the alphabet and basic grammar, and can include craft elements like penmanship. But the practice of writing is distinct from the thinking one does with those skills. Similarly, programming is analogous to mechanical writing skills, while computer science is analogous to the broader field of linguistics. These are valuable skills, but are a very particular slice of what the computational revolution entails.

Both programming and computer science are distinct from computational thinking. Computational thinking refers to thinking with computers, rather than thinking about how to communicate problems and questions and models to computers. Examples in other fields include:

These transitions each involve programming, but are no more “about” computer science than a philosophy class is “about” writing. Programming is the tool, not the topic.

2. What are some examples of the research questions that National Laboratory Schools would investigate?

There are countless research agendas that could be pursued through this new infrastructure. Select examples include:

  1. Seymour Papert’s work on LOGO (captured in books like Mindstorms) presented a radically different vision for the potential and role of technology in learning. In Mindstorms, Papert sketches out that vision vis-à-vis geometry as an existence proof. Papert’s work demonstrates that research into making things more learnable differs from research into teaching more effectively. Abelson and diSessa’s Turtle Geometry takes Papert’s work further, conceiving of ways that computational tools can be used to introduce differential geometry and topology to middle- and high-schoolers. The National Laboratory Schools could investigate how we might design integrated curricula combining geometry, physics, and mathematics by leveraging the fact that the vast majority of mathematical ideas tackled in secondary contexts appear in computational treatments of shape and motion.
  2. The Picturing to Learn program demonstrated remarkable results in helping staff to identify and students to articulate conceptions and misconceptions. The National Laboratory Schools could investigate how to take advantage of the explosion of interactive and dynamic media now available for visually thinking and animating mental models across disciplines.
  3. Bond graphs, developed in the 1960s as a representation of physical dynamic systems, enabled the identification of “effort” and “flow” variables as new ways of defining power. This in turn allowed us to formalize analogies across electricity and magnetism, mechanics, fluid dynamics, and so on. Decades later, category theory has brought additional mathematical tools to bear on further formalizing these analogies. Given the role of analogy in learning, how could we reconceive people’s introduction to the natural sciences in a cross-disciplinary language emphasizing these formal parallels?
  4. Understanding what it means for one thing to cause (or not cause) another, and how to establish empirically whether it does, is an urgent and omnipresent need. Computational approaches have transformed economics and the social sciences, yet whether the question is COVID vaccine reliability, claims of election fraud, or the replication crisis in medicine and social science, our world is full of increasingly opaque systems and phenomena that our media environment is decreasingly equipped to tackle for and with us. An important tool in this work is the ability to reason about and evaluate empirical research effectively, which in turn depends on fundamental ideas about causality and how to evaluate the strength and likelihood of various claims. Graphical methods in statistics offer a tool complementing traditional, easily misused ideas like p-values, which dominate current introductions to statistics without leaving youth in a better position to meaningfully evaluate and understand statistical inference.
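
The first agenda above can be made concrete with a toy example (a sketch of the classic turtle-geometry result, not material from the memo): a turtle tracing any simple closed polygon counterclockwise turns through a total of exactly 360 degrees, a discrete cousin of theorems about total curvature that Abelson and diSessa develop.

```python
import math

def total_turning(vertices):
    """Sum of the turtle's turns (exterior angles, in degrees) while
    tracing a closed polygon given as a list of (x, y) vertices,
    traversed counterclockwise."""
    n = len(vertices)
    # Heading of each edge, in degrees.
    headings = []
    for i in range(n):
        x0, y0 = vertices[i]
        x1, y1 = vertices[(i + 1) % n]
        headings.append(math.degrees(math.atan2(y1 - y0, x1 - x0)))
    total = 0.0
    for i in range(n):
        turn = headings[(i + 1) % n] - headings[i]
        # Normalize each turn to the interval (-180, 180].
        while turn <= -180:
            turn += 360
        while turn > 180:
            turn -= 360
        total += turn
    return total

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
triangle = [(0, 0), (2, 0), (1, 1)]
print(round(total_turning(square)))    # 360
print(round(total_turning(triangle)))  # 360
```

The same total-turning invariant holds for any simple closed path, which is one reason the turtle framing makes curvature ideas accessible well before calculus.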

The specifics of these are less important than the fact that there are many, many such agendas that go largely unexplored because we lack the tangible infrastructure to set ambitious, computationally sophisticated educational research agendas.

3. How will the National Laboratory Schools differ from magnet schools for those interested in computer science?

The premise of the National Laboratory Schools is that computation, like writing, can transform many subjects. These schools won’t place disproportionate emphasis on the field of computer science, but rather will emphasize integration of computational thinking into all disciplines—and into educational practice as a whole. Moreover, magnet schools often rely on selective admissions. National Laboratory Schools are public schools interested in the core issues of the median public school, so it is important that they tackle the full range of challenges and opportunities public schools face. This involves enrolling a socioeconomically, demographically, and academically diverse group of youth.

4. How will the National Laboratory Schools differ from the Institute of Education Sciences’ Regional Educational Laboratories?

The Institute of Education Sciences’ (IES’s) Regional Educational Laboratories (RELs) do not operate schools. Instead, they convene and partner with local policymakers to lead applied research and development, often focused on actionable best practices for today’s schools (as exemplified by the What Works Clearinghouse). This is a valuable service for educators and policymakers. However, it is by definition limited to existing school models and assumptions about education. It does not attempt to pioneer new school models or curricula.

5. How will the National Laboratory Schools program differ from tech-focused workforce-development initiatives, coding bootcamps, and similar programs?

These types of programs focus on the training and placement of software engineers, data scientists, user-experience designers, and similar tech professionals. But just as computational thinking is broader than just programming, the National Laboratory Schools program is broader than vocational training (important as that may be). The National Laboratory Schools program is about rethinking school in light of the computational revolution’s effect on all subjects, as well as its effects on how school could or should operate. An increased sensitivity to vocational opportunities in software is only a small piece of that.

6. Can computation really change classes other than math and science?

Yes. The easiest way to prove this is to consider how professional practice of non-STEM fields has been transformed by computation. In economics, the role of data has become increasingly prominent in both research and decision making. Data-driven approaches have similarly transformed social science, while also expanding the field’s remit to include specifically online, computational phenomena (like social networks). Politics is increasingly dominated by technological questions, such as hacking and election interference. 3D modeling, animation, computational art, and electronic music are just a few examples of the computational revolution in the arts. In English and language arts, multimedia forms of narrative and commentary (e.g., podcasts, audiobooks, YouTube channels, social media, etc.) are augmenting traditional books, essays, and poems. 

7. Why and how should National Laboratory Schools commit to financial and legal parity with public schools?

The challenges facing public schools are not purely pedagogical. Public schools face challenges in serving diverse populations in resource-constrained and highly regulated environments. Solutions and innovation in education need to be prototyped in realistic model systems. Hence the National Laboratory Schools must commit to financial and legal parity with public schools. At a minimum, this should include a commitment to (i) a per-capita student cost that is no more than twice the average of the relevant catchment area for a given National Laboratory School (the 2x buffer is provided to accommodate the inevitably higher cost of prototyping educational practices at a small scale), and (ii) enrollment that is demographically and academically representative (including special-education and English Language Learner participation) of a similarly aged population within thirty minutes’ commute, and that is enrolled through a weighted lottery or similarly non-selective admissions process.

8. Why are Xerox PARC and the Mayo Clinic good models for this initiative?

Both Xerox PARC and the Mayo Clinic are prototypical examples of hyper-creative, highly functioning research and development laboratories. Key to their success in inventing the future was that they lived it themselves.

PARC researchers insisted on not only building but using their creations as their main computing systems. In doing so, they were able to invent everything from Ethernet and the laser printer to the whole paradigm of personal computing (including peripherals like the modern mouse and features like windowed applications that we take for granted today).

The Mayo Clinic runs an actual hospital. This allows the clinic to innovate freely in everything from management to medicine. As a result, the clinic created the first multi-specialty group practice and integrated medical record system, invented the oxygen mask and G-suit, discovered cortisone, and performed the first hip replacement.

One characteristic these two institutions share is that they are focused on applied design research rather than basic science. PARC combined basic innovations in microelectronics and user interface to realize a vision of personal computing. Mayo rethinks how to organize and capitalize on medical expertise to invent new workflows, devices, and more.

These kinds of living laboratories are informed by what happens outside their walls but are focused on inventing new things within. National Laboratory Schools should similarly strive to demonstrate the future in real-world operation.

Why?

1. Don’t laboratory schools already exist? Like at the University of Chicago?

Yes. But there are very few of them, and almost all of those that do exist suffer from one or more issues relative to the vision proposed herein for National Laboratory Schools. First, most existing laboratory schools are not public. In fact, most university-affiliated laboratory schools have, over time, evolved to mainly serve faculty’s children. This means that their enrollment is not socioeconomically, demographically, or academically representative. It also means that families’ risk aversion may constrain those schools’ capacity to truly innovate. Most laboratory schools not affiliated with a university use their “laboratory” status as a brand differentiator in the progressive independent-school sector.

Second, the research functions of many laboratory schools have been hollowed out given the absence of robust funding. These schools may engage in shallow renditions of participatory action research by faculty in lieu of meaningful, ambitious research efforts. 

Third, most educational-design questions investigated by laboratory schools are investigated at the classroom or curriculum (rather than school design) level. This creates tension between those seeking to test innovative practices (e.g., a lesson plan that involves an extended project) and the constraints of traditional classrooms.

Finally, insofar as bona fide research does happen, it is constrained by what is funded, publishable, and tenurable within traditional graduate schools of education. Hence most research reflects the concerns of existing schools instead of seeking to reimagine school design and educational practice.

2. Why will National Laboratory Schools succeed where past efforts at educational reform (e.g., charter schools) have failed?

Most past educational-reform initiatives have focused on either supporting and improving existing schools (e.g., through improved curricula for standard classes), or on subsidizing and supporting new schools (e.g., charter schools) that represent only minor departures from traditional models.

The National Laboratory Schools program will provide a new research, design, and development infrastructure for inventing new school models, curricula, and educator training. These schools will have resources, in-house expertise, and research priorities that traditional public schools—whether district or charter or pilot—do not and should not. If the National Laboratory Schools are successful, their output will help inform educational practice across the U.S. school ecosystem. 

3. Don’t charter schools and pilot schools already support experimentation? Wasn’t that the original idea for charter and pilot schools—that they’d be a laboratory to funnel innovation back into public schools?

Yes, but this transfer hasn’t happened for at least two reasons. First, the vast majority of charter and pilot schools are not pursuing fundamentally new models because doing so is too costly and risky. Charter schools can often perform more effectively than traditional public schools, but this is just as often because of problematic selection bias in enrollment as it is because the autonomy they’re given allows for more effective leadership and organizational management. Second, the politics around charter and pilot schools have become increasingly toxic in many places, which prevents new ideas from being considered by public schools or advocated for effectively by public leaders.

4. Why do we need invention at the school rather than at the classroom level? Wouldn’t it be better to figure out how to improve schools that exist rather than end up with some unworkable model that most districts can’t adopt?

The solutions we need might not exist at the classroom level. We invest a great deal of time, money, and effort into improving existing schools, but we underinvest in inventing fundamentally different ones. There are many design choices that we need to explore but that cannot be adequately developed through marginal improvements to existing models. One example is project-based learning, wherein students undertake significant, often multidisciplinary projects to develop their skills. Project-based learning at any serious level requires significant blocks of time that don’t fit in traditional school schedules and calendars. A second example is the role of computational thinking, as centered in this proposal. Meaningfully incorporating computational approaches into a school design requires developing new pedagogies, novel tools and curricula, and re-trained staff. As a result, vanishingly few organizations do this kind of work.

If and when National Laboratory Schools develop substantially innovative models that demonstrate significant value, there will surely need to be a translation process to enable districts to adopt these innovations, much as translational medicine brings biomedical innovations from the lab to the hospital. That process will likely need to involve helping districts start and grow new schools gradually, rather than district-wide overhauls.

5. What kinds of “traditional assumptions” need to be revisited at the school level?

The basic model of school assumes subject-based classes with traditionally licensed teachers lecturing in each class for 40–90 minutes a day. Students do homework, take quizzes and tests, and occasionally do labs or projects. The courses taught are largely fixed, with some flexibility around the edges (e.g., through electives and during students’ junior and senior high-school years).

Traditional school represents a compromise among curriculum developers, standardized-testing outfits, teacher-licensure programs, regulations, local stakeholder politics, and teachers’ unions. Attempts to change traditional schools almost always fail because of pressures from one or more of these groups. The only way to achieve meaningful educational reform is to demonstrate success in a school environment rethought from the ground up. Consider a typical course sequence of Algebra I, Geometry, Algebra II, and Calculus. There are both pedagogical and vocational reasons to rethink this sequence and instead center types of mathematics that are more useful in computational contexts (like discrete mathematics and linear algebra). But a typical school will not be able to simultaneously develop the new tools, materials, and teachers needed to do so.

6. Has anything like the National Laboratory School program been tried before?

No. There have been various attempts to promote research in education without starting new schools. There have been interesting attempts by states to start new schools (like Governor’s Schools), there have been some ambitious charter schools, and there have been attempts to create STEM-focused and computationally focused magnet schools. But there has never been a concerted attempt in the United States to establish a new kind of research infrastructure built atop the foundation of functioning schools as educational “sandboxes”.

How?

1. How will we pay for all this? What existing funding streams will support this work? Where will the rest of the money for this program come from?

For budgeting purposes, assume that each Laboratory School enrolls a small group of forty high school or community college students full-time at an average per-capita rate of $40,000 per year. Half of that budget will support the functioning of the schools themselves. The remaining half will support a small research and development team responsible for curating and developing the computational tools, materials, and curricula needed to support the School’s educators. This would put the direct-service budget of the school solidly at the 80th percentile of current per-capita spending on K–12 education in the United States. With these assumptions, running 100 National Laboratory Schools would cost ~$160 million. Investing $25 million per year would be sufficient to establish an initial 15 sites. This initial federal funding should be awarded through a 1:1 matching competitive-grant program funded by (i) the 10% of American Competitiveness and Workforce Improvement Act (ACWIA) fees associated with H-1B visas (which the NSF is statutorily required to devote to public-private partnerships advancing STEM education), and (ii) the NSF TIP Directorate’s budget, alongside budgets from partner agency programs (for instance, the Department of Education’s Education Innovation and Research and Investing in Innovation programs). For many states, these funds should also be layered atop their existing Elementary and Secondary School Emergency Relief (ESSER) and American Rescue Plan (ARP) awards.
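As a sanity check, the budget arithmetic above can be sketched in a few lines; every input is one of the memo's own stated assumptions:

```python
# Back-of-the-envelope check of the memo's budget figures.
STUDENTS_PER_SCHOOL = 40
COST_PER_STUDENT = 40_000               # dollars per student per year
ANNUAL_FEDERAL_INVESTMENT = 25_000_000  # federal share, matched 1:1

per_school = STUDENTS_PER_SCHOOL * COST_PER_STUDENT  # $1.6M per school per year
hundred_schools = 100 * per_school                   # ~$160M for 100 schools
initial_sites = ANNUAL_FEDERAL_INVESTMENT // per_school  # sites fundable per year

print(per_school, hundred_schools, initial_sites)  # 1600000 160000000 15
```

Note that the $25 million figure covers the 15 initial sites only if it is counted as the full per-school cost; under 1:1 matching, the same federal outlay stretches to roughly twice as many sites.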

2. Why is vertical integration important? Do we really need to run schools to figure things out?

Vertical integration (of research, design, and operation of a school) is essential because schools and teacher education programs cannot be redesigned incrementally. Even when compelling curricular alternatives have been developed under the auspices of an organization like the NSF, practical challenges in bringing those innovations to practice have proven insurmountable. In healthcare, the entire field of translational medicine exists to help translate research into practice. Education has no equivalent.

The vertically integrated National Laboratory School system will address this gap by allowing experimenters to control all relevant aspects of the learning environment, curricula, staffing, schedules, evaluation mechanisms, and so on. This means the Laboratory Schools can demonstrate a fundamentally different approach, learning from great research labs like Xerox PARC and the Mayo Clinic, much of whose success depended on tightly-knit, cross-disciplinary teams working closely together in an integrated environment.

3. What would the responsibilities of a participating agency look like in a typical National Laboratory School partnership?

A participating agency will have some sort of educational or workforce-development initiative that would benefit from the addition of a National Laboratory School as a component. This agency would minimally be responsible for:

4. How should success for individual Laboratory Schools be defined?

Working with the Institute of Education Sciences’ (IES’s) National Center for Education Research (NCER), the DSI should develop frameworks for collecting necessary qualitative and quantitative data to document, understand, and evaluate the design of any given Laboratory School. Evaluation would include evaluation of compliance with financial and legal parity requirements as well as evaluation of student growth and work products.

Evaluation processes should include:

Success should be judged by a panel of experts that includes domain experts, youth workers and/or school leaders, and DSI leadership. Dimensions of performance these panels should address should minimally include depth and quality of students’ work, degree of traditional academic coverage, ambition and coherence of the research agenda (and progress on that agenda), retention of an equitably composed student cohort, and growth (not absolute performance) on the diagnostic/formative assessments. In designing evaluation mechanisms, it will be essential to learn from failed accountability systems in public schools. Specifically, it will be essential to avoid pushing National Laboratory Schools to optimize for the particular metrics and measurements used in the evaluation process. This means that the evaluation process should be largely based on holistic evaluations made by expert panels rather than fixed rubrics or similar inflexible mechanisms. Evaluation timescales should also be selected appropriately: e.g., performance on diagnostic/formative assessments should be measured by examining trends over several years rather than year-to-year changes.

5. What makes the Small Business Innovation Research (SBIR) program a good model for the National Laboratory School program?

The SBIR program is a multiphase competitive grant program to which small businesses submit proposals. SBIR awards smaller grants (~$150,000) to businesses at early stages of development, and makes larger grants (~$1 million) available to awardees who achieve certain progress milestones. SBIR and similar federal tiered-grant programs (e.g., the Small Business Technology Transfer, or STTR, program) have proven remarkably productive and cost-effective, with many studies finding them as efficient as or more efficient than the private sector on a per-dollar basis by common measures of innovation such as patents and papers.

The SBIR program is a good model for the National Laboratory School program; it is an example of the federal government promoting innovation by patching a hole in the funding landscape. Traditional financing options for businesses are often limited to debt or equity, and most providers of debt (like retail banks) for small businesses are rarely able or incentivized to subsidize research and development. Venture capitalists typically only subsidize research and development for businesses and technologies with reasonable expectations of delivering 10x or greater returns. SBIR provides funding for the innumerable businesses that need research and development support in order to become viable, but aren’t likely to deliver venture-scale returns.

In education, the funding landscape for research and development is even worse. There are virtually no sources of capital that support people to start schools, in part because the political climate around new schools can be so fraught. The funding that does exist for this purpose tends to demand school launch within 12–18 months: a timescale on which it is not feasible to design, evaluate, and refine an entirely new school model. Education is a slow, expensive public good: one that the federal government shouldn’t provision, but should certainly subsidize. That includes subsidizing the research and development needed to make education better.

States and local school districts lack the resources and incentives to fund such deep educational research. That is why the federal government should step in. By running a tiered educational research-grant program, the federal government will establish a clear pathway for prototyping and launching ambitious and innovative schools.

6. What protections will be in place for students enrolled in Laboratory Schools?

The state organizations established or selected to oversee Laboratory Schools will be responsible for approving proposed educational practices. That said, unlike in STEM fields, there is no “lab bench” for educational research: the only way we can advance the field as a whole is by carefully prototyping informed innovations with real students in real classrooms.

7. Considering the challenges and relatively low uptake of educational practices documented in the What Works Clearinghouse, how do we know that practices proven in National Laboratory Schools will become widely adopted?

National Laboratory Schools will yield at least three kinds of outputs, each of which is associated with different opportunities and challenges with respect to widespread adoption.

The first output is people. Faculty trained at National Laboratory Schools (and at possible educator-development programs run within the Schools) will be well positioned to take the practices and perspectives of National Laboratory Schools elsewhere (e.g., as school founders or department heads). The DSI should consider establishing programs to incentivize and support alumni personnel of National Laboratory Schools in disseminating their knowledge broadly, especially by founding schools.

The second output is tools and materials. New educational models that are responsive to the computational revolution will inevitably require new tools and materials—including subject-specific curricula, cross-disciplinary software tools for analysis and visualization, and organizational and administrative tools—to implement in practice. Many of these tools and materials will likely be adaptations and extensions of existing tools and materials to the needs of education.

The final output is new educational practices and models. This will be the hardest, but probably most important, output to disseminate broadly. The history of education reform is littered with failed attempts to scale or replicate new educational models. An educational model is best understood as the operating habits of a highly functioning school. Institutionalizing those habits is largely about developing the skills and culture of a school’s staff (especially its leadership). This is best tackled not as a problem of organizational transformation (e.g., attempting to retrofit existing schools), but rather one of organizational creation—that is, it is better to use models as inspirations to emulate as new schools (and new programs within schools) are planned. Over time, such new and inspired schools and programs will supplant older models.

8. How could the National Laboratory School program fail?

Examples of potential pitfalls that the DSI must strive to avoid include:

Reforming Nuclear Research Practices in the Marshall Islands

Summary

In the mid-20th century, the United States test-detonated dozens of nuclear weapons in the Republic of the Marshall Islands (RMI). Using the RMI as a test site for nuclear-weapons research allowed the U.S. to better understand the effects of such weapons and their destructive capacities — but at significant cost. Conducting nuclear tests in the vulnerable RMI harmed human health, fomented distrust in research sponsored by the U.S. government, and fueled tensions with the Marshallese. Fallout from the tests undermined U.S. influence in the Pacific, cooperation over ecological restoration, and the reputation of the U.S. research enterprise. Building back relations with the RMI (and other allies that have long supported the United States) is crucial for enabling the Biden Administration to undo the adverse effects of Trump-era policies on international relations and the environment, especially amid rising threats from China and Russia.

To that end, the Department of Energy (DOE) and Department of Interior (DOI) should adopt provisions for conducting nuclear research with and in the Marshall Islands that will: (i) increase transparency and trust in American research concerning the Marshall Islands, and (ii) elevate Marshallese voices in the fight for preservation of their lands. These provisions are as follows:

  1. All collected data should be translated into Marshallese and shared with RMI officials and relevant stakeholders.
  2. When appropriate (e.g., when security and privacy considerations permit), collected data should be published in an easy-to-access online format for public consumption.
  3. All research should be clearly laid out and submitted to the RMI National Nuclear Commission (NNC) in accordance with the NNC’s Nuclear Research Protocol.
  4. The United States should coordinate with the NNC, the College of the Marshall Islands (CMI) Nuclear Institute, regional agencies, and other relevant nongovernmental organizations and local stakeholders to ensure that local knowledge is considered in the design of nuclear-related research and data projects.
  5. All possible steps should be taken to include the participation of Marshallese residents in research ventures and operations.

Pathways to Net-zero Soil Loss by 2050

The current administration should announce its intention to achieve net-zero soil loss by 2050. This target aligns with President Biden’s plan to “mount a historic, whole-of-Government-approach to combating climate change,” would help fulfill the administration’s commitment to achieving a net-zero-emissions economy by 2050, and is key to protecting our nation’s agricultural productivity.

Healthy soil is essential to food production. Less well recognized is the vital role that soil plays in climate modulation. Soil is the largest terrestrial carbon repository on the planet, containing three times the amount of carbon in Earth’s atmosphere. Soil represents a potential sink for 133 billion tons of carbon (equal to 25 years of U.S. fossil-fuel emissions). Using soil to offset emissions generates significant co-benefits. Carbon sequestration in soil nourishes soil ecosystems by improving soil architecture and increasing water-holding capacity. Deeper and more fertile soil also supports biodiversity and enriches natural habitats adjacent to agricultural land.

Over two-thirds of the United States is grassland, forestland, and cropland. Land practices that increase the amount of carbon stored underground present a relatively low-cost means for President Biden’s administration to pursue its goal of net-zero carbon emissions by 2050. But lost soil can no longer serve as a carbon repository. And once lost, soil takes centuries to rebuild. Increasingly extreme climate events and soil-degrading industrial farming practices are combining to rapidly deplete our nation’s strategic soil resources. The United States is losing 10.8 tons of fertile soil per hectare per year: a rate that is at least ten times greater than the rate of soil production. At this rate, many parts of the United States will run out of soil in the next 50 years; some regions already have. For example, in the Piedmont region of the eastern United States, farming practices that were inappropriate for the topography caused topsoil erosion and led to the abandonment of agriculture. The northwestern Palouse region has lost 40–50% of its topsoil, and one-third of the Midwest corn belt has lost all of its topsoil.
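To make the erosion rate above concrete, it can be converted into an annual depth of topsoil lost. The bulk-density figure below is an illustrative assumption (a typical value for agricultural topsoil), not a number from this memo:

```python
# Convert the memo's erosion rate (10.8 t of soil per hectare per year)
# into an approximate depth of topsoil lost per year.
EROSION_RATE_T_PER_HA_YR = 10.8
BULK_DENSITY_T_PER_M3 = 1.3   # assumed typical topsoil bulk density
HA_IN_M2 = 10_000

depth_m_per_yr = EROSION_RATE_T_PER_HA_YR / (BULK_DENSITY_T_PER_M3 * HA_IN_M2)
depth_mm_per_yr = depth_m_per_yr * 1000          # ≈ 0.83 mm of depth per year
loss_over_50_yr_cm = depth_m_per_yr * 50 * 100   # ≈ 4 cm over 50 years

print(round(depth_mm_per_yr, 2), round(loss_over_50_yr_cm, 1))
```

A loss of roughly four centimeters in fifty years may sound small, but in regions where only a thin layer of topsoil remains — as in the Palouse and parts of the corn belt cited above — it is enough to exhaust the remaining stock.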

Soil loss reduces crop yields, destroys species’ habitats that are critical to food production, and causes high financial losses. Once roughly half of the soil is lost from a field, crop yields and nutrient density suffer. Maintaining a desired level of agricultural output then requires synthetic fertilizers that further compromise soil health, unleashing a feedback loop with widespread impacts on air, land, and water quality — impacts that are often disproportionately concentrated in underserved populations.

Climate change and soil erosion pose a dual threat to food production. As climate change progresses, more extreme weather events like intense flooding in the northeastern United States and prolonged drought in the Southwest make farmland less hospitable to production. Concurrently, soil erosion and degradation release soil carbon as greenhouse gases and make crops more vulnerable to extreme weather by weakening the capabilities of plants to fix carbon and deposit it in the soil. Halting soil erosion could reduce emissions, and building stable stores of soil carbon will reduce atmospheric carbon.

Prioritizing soil health and carbon sequestration as a domestic response to the climate and food-security crises is backed by centuries of pre-industrial agricultural practices. Before European occupation of tribal lands and the introduction of “modern agricultural practices,” Indigenous peoples across North America used soil protective practices to produce food while enhancing the health of larger ecosystems. Some U.S. farmers adhere to principles that guide all good soil stewardship — prevent soil movement and improve soil structure. Practices like no-till farming, cover cropping, application of organic soil amendments, and intercropping with deep-rooted prairie plants are proven to anchor soil and can increase its carbon content. In livestock production, regenerative grazing involves moving animals frequently to naturally fertilize the soil while allowing plants to recover and regrow. If all farms implemented these practices, most soil erosion would halt. The challenge is to equip farmers with the knowledge, financial incentives, and flexibility to use soil-protective techniques.

This document recommends a set of actions that the federal government — working with state and local governments, corporations, research institutions, landowners, and farmers — can take towards achieving net-zero soil loss by 2050. These recommendations are supported by policy priorities outlined in President Biden’s Discretionary Budget Request for Fiscal Year 2022 and the bipartisan infrastructure deal currently under negotiation in Congress. Throughout, we emphasize the importance of (1) prioritizing storage of stable carbon (i.e., carbon that remains in soils for the long term) and (2) addressing environmental injustices associated with soil erosion by engaging a broad group of stakeholders.

Firm commitments to restore degraded land will establish the United States as an international leader in soil health, help avoid the worst impacts of climate change, strengthen food security, advance environmental justice, and inspire other countries to set similar net-zero targets. The health of our planet and its people depend on soil preservation. Our nation can, and should, lead the way.

Plan of Action

Action 1. Become a signatory of “4 per mille,” the international initiative encouraging countries to collectively increase global soil carbon by 0.4 percent per year.

The United States should officially join the international effort, “4 per mille” (4p1000), and commit to increasing stable soil carbon by at least 0.4 percent per year. By signing onto this effort, President Biden would send a powerful message of appreciation for U.S. conservation farmers and signal to the rest of the world that soil and forest management are important strategies for mitigating and adapting to climate change.

Detractors of 4p1000 have raised concerns about its feasibility, measurement, and accountability. These arguments obscure the target’s intent: to motivate a global effort to sequester carbon in soil and avert the worst of anthropogenic climate change. The target gives countries a tangible and common goal to work towards as they identify and implement the soil-carbon sequestration strategies that will work best in their respective domestic environments.
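The target's name encodes its arithmetic. As a rough sketch using figures commonly cited by the initiative itself — the global soil-carbon stock and annual emissions numbers below are assumptions, not from this memo:

```python
# Arithmetic behind the 0.4% ("4 per 1000") target: annual anthropogenic
# carbon emissions divided by the global soil carbon stock (top ~2 m).
SOIL_C_STOCK_GT = 2400     # global soil carbon in gigatons (assumed)
ANNUAL_FOSSIL_C_GT = 8.9   # annual anthropogenic carbon emissions (assumed)

required_growth = ANNUAL_FOSSIL_C_GT / SOIL_C_STOCK_GT  # ≈ 0.0037
print(round(required_growth * 1000, 1))  # ≈ 3.7 per 1000, i.e. roughly "4 per mille"
```

In other words, if global soil carbon stocks grew by about 0.4 percent per year, the increase would be on the same order as annual anthropogenic emissions — which is the aspirational logic behind the target, not a claim that such uptake is uniformly achievable.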

Before COP26, the White House Office of Science and Technology Policy, in partnership with the Secretary of Agriculture and the Biden administration’s climate change leaders (John Kerry and Gina McCarthy), should develop a strategy to accompany the United States’ endorsement of 4p1000 and garner endorsements of the agreement from other nations. A central pillar of this strategy should focus on developing and deploying inexpensive methods to estimate soil carbon. These new tools would help farmers track their net carbon increases and ensure that carbon emissions from soil are not negating their efforts.

This action could be supported by funds allocated to the Department of State for multilateral climate initiatives, Department of Interior funding for ecosystem resilience among all land-management agencies, and USDA’s renewed investment to engage landowners to combat climate change and increase participation in voluntary conservation.

Action 2. Invest in a data repository for agriculture and soil carbon. 

Advances in soil health of agricultural systems, like advances in human health, will depend on the sector’s capacity to aggregate and refine big data. This capacity is needed to develop comprehensive decision-support tools underpinned by hyperlocal data in a publicly accessible and well-maintained database.

USDA’s Agricultural Research Service currently supports a data repository through its National Agricultural Library (NAL). The NAL repository houses datasets generated by USDA researchers and other USDA-funded research. Unfortunately, the NAL repository is poorly equipped to handle data originating from additional sources. Nor does the NAL repository support the industry-wide annotation system needed to make data searchable and comparable.

A new repository is needed. The National Library of Medicine (NLM) offers an excellent model in GenBank. By helping researchers compare genes, this open-access bioinformatics tool deepens our understanding of health and accelerates development of medical treatments. GenBank connects to similar databases worldwide, and researchers contribute to and search the databases with associated software. The National Weather Service (NWS) similarly compiles a massive set of weather data that supports research and generates income from business services. Both GenBank and the National Weather Service’s databases have supported an explosion of resources, products, and services, from diagnostic medical tests, precision medicine, and genetic testing to weather apps for phones. These databases also feature budgets an order of magnitude larger than the budget for USDA’s NAL.

A right-sized investment in a broad agricultural research database at the NAL, including data generated with proprietary smart-farm technologies and other public-private collaborations, is the future of modern agriculture and agriculture research. Nationally available, high-quality, and curated agricultural data would seed a wealth of new services and companies in the sector. The database would also support the implementation of reliable, locally tailored, and situationally relevant soil-management practices and decision tools that provide precision health practices for soil.

Specifically, we recommend that USDA take the following steps to establish a broad agricultural data repository:

These steps could be carried out using discretionary funding at USDA earmarked for investments in research and development capacity of farmers. These steps collectively align with the administration’s goal to “support a multi-agency initiative aimed at integrating science-based tools into conservation planning and verifying stable carbon sequestration, greenhouse-gas reduction, wildlife stewardship, and other environmental services at the farm level and on federal lands.”

Action 3. Invest in targeted research to reduce soil erosion and increase carbon sequestration.

General factors contributing to soil loss and mitigation principles are universal. Still, the most effective combination of specific practices for reducing soil erosion and increasing carbon sequestration depends on local soil type, slope, soil condition, land use, and weather. In many farming settings, regenerative practices can increase soil carbon and eliminate soil erosion in as little as one or two growing seasons. But matching best practices to a given location can be complex.

For example, intensive tillage is the most soil-erosive practice in agriculture. Reducing the use of this practice has been an important goal for soil-preservation efforts over the last four decades. Organic farms frequently use intensive tillage because organic certification prohibits the use of genetically engineered plants or herbicides—even though herbicide treatment provides excellent weed control and genetic engineering has made it possible to suppress weeds using herbicides without damage to the engineered crop plant. Reducing soil erosion on organic farms hence requires research into new methods of weed control.

The USDA National Institute of Food and Agriculture (NIFA) and the National Science Foundation (NSF) should jointly fund competitive grants for research into practices that reduce soil erosion, increase the nutrient density of food, and sequester carbon stably. Priority projects of these grants might include:

As with Action 2, these steps could be carried out using discretionary funding at USDA earmarked for investments in farmers’ research and development capacity. These steps collectively align with the administration’s goal to support a multi-agency initiative to integrate science-based tools into conservation planning and verify stable carbon sequestration, greenhouse-gas reduction, wildlife stewardship, and other environmental services at the farm level and on federal lands.

Action 4. Develop financial and educational programs that help farmers transition to soil-protective practices.

Soil-protective practices have agronomic and economic benefits. Farmers using continuous no-till methods save several thousand dollars each year due to reduced fuel and labor investments. But economic returns on soil-saving practices can take several years to accrue. Growers are rightly concerned about their financial solvency in the short term should they implement such practices, as well as about yield reductions associated with no-till agriculture in some cases. USDA should (i) provide financial assistance to help producers transition to soil-saving practices and (ii) offer training to help producers realize maximal benefits of soil-protective practices at each phase of the transition.

For instance, USDA’s Farm Service Agency (FSA) could offer loans based on cost-saving projections from reduced need for synthetic inputs and increased potential yield once the transition to soil-protective practices is complete. Loans could, for example, cover the first five years of projected lost income per acre. At the end of this term, USDA’s Risk Management Agency (RMA) could offer discounted crop insurance rates because the now-healthier soil would engender a more resilient system less likely to experience catastrophic losses during floods and droughts. Farmers could use the insurance savings to repay their loans, keeping their total annual outlay roughly constant once repayment begins.

Participation in the loan program could be contingent on farmers’ capacity to maintain soil-protective practices for at least ten years. During the initial five-year loan period, soil-health specialists affiliated with USDA could provide farmers with training on measuring progress, collecting data, and uploading that data to a centralized database. Outcomes across participating farms could be tracked and iteratively inform best practices during the transition period. After the initial five-year period, farmers could qualify for a five-year loan-forbearance period if they demonstrate continued participation in the program.
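The loan-and-insurance mechanism described above can be sketched with purely hypothetical numbers. Every figure below (farm size, per-acre income loss, premium, and discount) is an illustrative assumption, not a USDA rate:

```python
# Illustrative sketch of the proposed FSA loan / RMA insurance mechanism.
# All dollar figures below are hypothetical assumptions, not actual USDA rates.

ACRES = 500                       # hypothetical farm size
LOST_INCOME_PER_ACRE_YR = 20.0    # assumed lost income during transition ($/acre/yr)
TRANSITION_YEARS = 5              # loan covers the first five years, per the proposal

loan_principal = ACRES * LOST_INCOME_PER_ACRE_YR * TRANSITION_YEARS

# After the transition, RMA discounts crop insurance for the now-healthier soil;
# the savings service the loan, so the farmer's total outlay stays roughly constant.
PREMIUM_PER_ACRE_YR = 50.0        # assumed pre-transition premium ($/acre/yr)
PREMIUM_DISCOUNT = 0.40           # assumed discount for more resilient soil

annual_savings = ACRES * PREMIUM_PER_ACRE_YR * PREMIUM_DISCOUNT
years_to_repay = loan_principal / annual_savings

print(f"Loan principal: ${loan_principal:,.0f}")                      # $50,000
print(f"Annual insurance savings: ${annual_savings:,.0f}")            # $10,000
print(f"Interest-free repayment period: {years_to_repay:.1f} years")  # 5.0 years
```

The design choice worth noting is that repayment is sized to the insurance discount, so a farmer's combined premium-plus-repayment outlay need not rise once repayment begins.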

USDA could also offer direct payments to farmers participating in soil revitalization. Another Day One Project policy proposal recommends that the USDA offer incentive payments for climate-smart practices that produce ecosystem services if the producer cannot find a buyer through an ecosystem-services market. 

Specifically, we recommend that USDA take the following steps to develop financial and educational programs that help farmers transition to soil-protective practices:

These steps could be supported by discretionary funding at the Department of Treasury earmarked for investments in American communities and small businesses and USDA funds dedicated to growing rural economies. These steps align with President Biden’s commitment to expanding the role of Community Development Financial Institutions (CDFIs), which offer loans to start-ups and small businesses in rural communities and create new markets for reaching a net-zero carbon economy by 2050.

Action 5. Develop circular economy practices for young entrepreneurs supporting soil conservation.

Small businesses have a significant role to play in post-pandemic recovery by providing jobs and combating the climate crisis through innovation. The path to a net-zero carbon economy by 2050 must include circular economy principles that design waste out of economic cycles, keep products and materials in use, and regenerate natural systems. Additionally, closing education gaps and creating new paths to secure jobs for young people who did not complete high school have transformational effects on economic opportunities, health, and life expectancy.

USDA, the Small Business Administration (SBA), and the Minority Business Development Administration (MBDA) should jointly develop a “Ground Up” program that (i) engages the agriculture industry in identifying circular-economy business opportunities and (ii) engages young people without a high-school education in starting small businesses that conserve, restore, and protect soil and other natural resources. Ground Up would fill gaps created by the uneven and insufficient USDA Extension workforce in underserved and under-resourced communities. Ground Up would also provide more extensive business and entrepreneurship training than is typically possible through Extension programs. By leveraging relationships with industry partners, program participants could be connected to the byproducts—or “wasted resources”—they need to start a circular business, as well as to the mentoring and markets required to sell their products and services profitably. For example, a Ground Up enterprise might incorporate grounds from commercial or residential coffee-making operations or municipal waste into commercial compost production. Participants who complete the Ground Up program would be eligible for no-interest federal business loans, with repayment required once the business was profitable. The federal government could partner with Community Development Financial Institutions (CDFIs) to share the cost of loans and build connections among young entrepreneurs, Extension professionals, and potential partner businesses.

Specifically, we recommend that USDA and the White House take the following steps to develop circular economy practices for young entrepreneurs supporting soil conservation:

These steps could be implemented using discretionary funds within USDA, SBA, and MBDA earmarked to support innovative multi-agency business opportunities for rural and minority entrepreneurs. These steps align with the SBA’s commitments to help small businesses combat climate change and invest in underserved entrepreneurs; the USDA’s mandate to grow rural economies and foster innovation in the agricultural sector, as well as its dedication to increasing and protecting biodiversity through good farm stewardship; and the MBDA’s economic-development grants aimed at addressing long-standing racial inequity for minority-owned firms.

Action 6. Support diversity in the agricultural workforce pipeline.

People of color, including Black, LatinX, and Indigenous people, are underrepresented in agriculture and agricultural sciences. To begin addressing this underrepresentation, the Biden administration should ensure diversity in its proposed Civilian Climate Corps (CCC). The CCC is envisioned as a modern-day equivalent of the Depression-era Civilian Conservation Corps work-relief program. The new iteration focuses on enhancing conservation and climate-smart practices across the country. The new CCC represents a terrific way for the Department of the Interior (DOI) to train a diverse workforce in climate- and soil-smart land-management practices with clear pathways to careers in technical assistance, agribusiness, and academic agricultural research, among others.

The administration can boost diversity in agricultural research by directing the USDA’s Office of Civil Rights and the National Institute of Food and Agriculture (NIFA) to conduct an in-depth assessment of challenges faced by researchers of color in agricultural science and develop discipline-wide plans to address them. The administration can also increase research funding and funding for research infrastructure targeted at underrepresented populations. Students from disadvantaged backgrounds are more likely to choose fields with reliable funding. The relative lack of funding for agricultural sciences, as evidenced by outdated educational infrastructure and shrinking training programs, puts agriculture departments at a stark disadvantage compared to the modern facilities (and reliable post-graduate incomes) of other scientific departments (e.g., biomedicine). The National Science Foundation (NSF) should support research and facilities at Historically Black Colleges and Universities (HBCUs) to demonstrate and communicate programmatic stability and cutting-edge innovation in agriculture.

Specifically, we recommend that the Biden administration take the following steps to support diversity in the agricultural-workforce pipeline: 

These steps could be supported by funding allocated at USDA, NSF, and DOI to increase racial equity, specifically the participation of historically underrepresented people in the Civilian Climate Corps and farming, science, and engineering more broadly.

Action 7. Fund existing and proposed advanced research projects agencies (ARPAs) to invest in soil-saving research.

USDA’s research agencies tend to fund low-risk research that delivers incremental changes in agricultural practices. This essential research provides many strategies for stemming soil loss, but remarkably few farms employ these strategies. The nation needs paradigm-shifting advances that farmers will use. The Advanced Research Projects Agency (ARPA) model can help realize such advances by investing deeply in bold ideas outside of mainstream thinking. Several existing and proposed ARPA programs are well-positioned to invest in soil-saving research.

ARPA-Energy (ARPA-E) in the Department of Energy (DOE) is already funding high-impact agricultural research that protects soil. ARPA-E has invested in one soil-centered project, ROOTS, to develop “root-focused” plant cultivars that could dramatically reduce atmospheric carbon. The agency is also gearing up for a new project on carbon farming. These projects match ARPA-E’s energy-focused mission, which includes reducing greenhouse gases in the atmosphere. However, ARPA-E does not have the mandate to invest in specific agricultural projects that build and protect soil. Two additional ARPA-style entities have been proposed that could do so instead: ARPA-I (infrastructure), included as part of the bipartisan Infrastructure Investment and Jobs Act, and AgARDA, a USDA-based ARPA-style agency authorized by the 2018 Agriculture Improvement Act (Farm Bill). If funded, ARPA-I, AgARDA, or both could invest in groundbreaking research to drive soil protection.

To leverage the ARPA model for transformative advances in soil-saving research, we recommend that the Biden administration:

These steps could be supported by discretionary funds allocated to the DOE and USDA. Cumulatively, the President’s most recent budget request directs $1.1 billion to DOE to support breakthroughs in climate and clean-energy research and solutions. Specifically, mitigating and adapting to the climate crisis involves more than inventing cleaner energy; new technologies that help farmers protect soil and fix carbon into the land will also be essential for correcting extreme imbalances in the global carbon budget.

Action 8. Develop criteria and funding for “Earth Cities.”

People feel helpless and fatigued about climate change at the local level partly because they lack the agency to take concrete steps to remove greenhouse gases from the atmosphere. The White House should deepen its relationships with mayors and nonprofit coalitions of cities—such as C40, the U.S. Conference of Mayors, and the National League of Cities—to engage urban communities in combating hazards related to climate change.

Like the Arbor Day Foundation’s “Tree Cities” program that encourages communities to steward their tree resources, a national “Earth Cities” program would recognize cities leading the way on urban soil stewardship and management. Criteria for receiving the “Earth City” designation could include implementation of a centralized municipal composting program, large-scale replanting of public parks and rights-of-way with native grasses and perennials that have soil-health benefits, creative management of excavated soil and rock generated by urban construction, becoming a signatory to the 4p1000 initiative, and observance of World Soil Day on December 5. Taking steps to become an “Earth City” and prioritizing soil management at the municipal level offers communities a way to make a positive difference and experience benefits locally while addressing global climate challenges.

Recent research demonstrates that temperatures can vary as much as 20 degrees across different neighborhoods within the same city. Urban heat islands often overlap with communities of color and low-income households in areas with few trees and large amounts of heat-trapping pavement. In these historically redlined communities, rates of heat-related illness and death are also higher than in wealthier, whiter, and cooler parts of town. Additionally, meeting green building codes and keeping federally supported housing projects affordable has become increasingly difficult in urban centers. Tending to soil health by reusing excavated soil, planting trees and tall grasses on site, and creating more green spaces can inexpensively mitigate the urban heat-island effect while increasing access to nature in historically under-resourced communities. A partnership between soil experts at USDA, pollution and environmental-hazard experts at EPA, and affordable housing programs at the Department of Housing and Urban Development (HUD) would support cities with funding and implementation and further strengthen program viability by tying federal support to local soil stewardship practices.

Specifically, we recommend that the Biden administration take the following steps to recognize and support cities striving to preserve soil and enhance soil-carbon sequestration:

These steps could be supported through earmarked funds at EPA for the Accelerating Environmental and Economic Justice Initiative, HUD funds to modernize and rehabilitate public housing, infrastructure, and facilities in historically underfunded and marginalized communities; and USDA funds that encourage conservation and increased biodiversity on private land.

Action 9. Plant deep-rooted perennials on median strips to foster carbon-rich soils for multi-benefit surface transportation.

As a part of President Biden’s plan to invest in multi-benefit transportation infrastructures, a policy to populate median strips with deep-rooted prairie perennials presents a means to restore soil carbon and simultaneously sustain essential pollinators in agricultural and other ecosystems. Highway medians are supposed to be at least 50 feet wide for safety, creating a minimum of 6 acres of median per mile of highway. The 47,000 miles of U.S. Interstate and 160,000 miles of other highways amount to nearly 300,000 and 1 million acres, respectively, of median strips in the United States. Each acre could sequester 1.7 tons of carbon per year until the soil’s carrying capacity is reached.
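The acreage figures above follow directly from the stated median width and highway mileage. As a back-of-the-envelope check (the 1.7-ton rate is the memo's cited figure; the annual total is simply that rate times the derived acreage):

```python
# Back-of-the-envelope check of the median-strip acreage figures cited above.
SQ_FT_PER_ACRE = 43_560
MEDIAN_WIDTH_FT = 50        # minimum median width cited for safety
FT_PER_MILE = 5_280

acres_per_mile = MEDIAN_WIDTH_FT * FT_PER_MILE / SQ_FT_PER_ACRE   # ~6.1 acres/mile

interstate_acres = 47_000 * acres_per_mile       # ~285,000 ("nearly 300,000")
other_highway_acres = 160_000 * acres_per_mile   # ~970,000 ("nearly 1 million")

TONS_C_PER_ACRE_YR = 1.7    # the memo's cited sequestration rate
total_tons_c_per_year = (interstate_acres + other_highway_acres) * TONS_C_PER_ACRE_YR

print(f"{acres_per_mile:.1f} acres of median per highway mile")
print(f"~{total_tons_c_per_year / 1e6:.1f} million tons of carbon per year until saturation")
```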

Deep carbon stores in Midwestern soils resulted from centuries of growth of perennial plants that store most of their carbon in their roots. The crops that replaced the prairies shunt most of their carbon to the harvested aboveground tissues, leaving little in the soil. Corn roots, for example, represent only 1% of the plant’s biomass by the end of the growing season, whereas the roots of perennials—which can grow as deep as 15 feet underground—can account for as much as 70% of the plant’s biomass. Between 2009 and 2015, 53 million acres of U.S. land were converted from native vegetation to cropland, leading to a loss of 2% of the soil carbon stored in that land per year. This loss translates to 3.2 gigatons of carbon dioxide released into the atmosphere—equivalent to almost one-half of annual U.S. fossil-fuel emissions.

One way to mitigate these losses is to plant highway median strips with native, deep-rooted perennials that simultaneously nourish pollinators, enrich soil, and sequester copious amounts of carbon. The Department of Transportation (DOT) could coordinate a large-scale highway-replanting initiative through the effort proposed in the bipartisan infrastructure bill to rebuild the interstate system. In parallel, federal and local “Adopt-a-Highway” programs could enlist citizens, businesses, and municipalities in seeding median strips with native plants.

Specifically, we recommend that:

The administration could pursue these steps using discretionary funds allocated to the Department of Transportation to support competitive-grant programs for infrastructure. The administration could also leverage part of the $110 billion allocated in the bipartisan Infrastructure Investment and Jobs Act towards infrastructure upgrades, including upgrades focused on climate-change mitigation, resilience, and equity.

Ensuring Good Governance of Carbon Dioxide Removal

Climate change is an enormous environmental, social, and economic threat to the United States. Carbon dioxide (CO2) emissions from burning fossil fuels and other industrial processes are a major driver of this threat. Even if the world stopped emitting CO2 today, the huge quantities of CO2 generated by human activity to date would continue to sit in the atmosphere and cause dangerous climate effects for at least another 1,000 years. The Intergovernmental Panel on Climate Change (IPCC) has reported that keeping average global warming below 1.5°C is not possible without the use of carbon dioxide removal (CDR). While funding and legislative support for CDR have greatly increased in recent years, the United States does not yet have a coordinated plan for implementing CDR technologies. The Department of Energy’s CDR task force should recommend a governance strategy for CDR implementation to responsibly, equitably, and effectively combat climate change by achieving net-negative CO2 emissions.

Challenge and Opportunity 

There is overwhelming scientific consensus that climate change is a dangerous global threat. Climate change, driven in large part by human-generated CO2 emissions, is already causing severe flooding, drought, melting ice sheets, and extreme heat. These phenomena are in turn compromising human health, food and water security, and economic growth.

Figure 1. Data collected from observation stations show how noticeably atmospheric CO2 concentrations have risen over the past several decades. (Data compiled by the National Oceanic and Atmospheric Administration; figure by Klaus S. Lackner.)

Morton, E.V. (2020). Reframing the Climate Change Problem: Evaluating the Political, Technological, and Ethical Management of Carbon Dioxide Emissions in the United States. Ph.D. thesis, Arizona State University.

CO2 concentrations are higher today than they have been at any point in the last 3 million years. Human activity is causing CO2 emissions to rise at an unprecedented rate of approximately 2% per year over the past several decades (Figure 1), far outpacing the rate at which the natural world can adapt and adjust. A monumental global effort is needed to reduce CO2 emissions from human activity. But even this is not enough. Because CO2 can persist in the atmosphere for hundreds or thousands of years, CO2 already emitted will continue to have climate impacts for at least the next 1,000 years. Keeping the impacts of climate change to tolerable levels requires not only a suite of actions to reduce future CO2 emissions, but also implementation of carbon dioxide removal (CDR) strategies to mitigate the damage we have already done.

The IPCC defines CDR as “anthropogenic activities removing CO2 from the atmosphere and durably storing it in geological, terrestrial, or ocean reservoirs, or in products.” While becoming more energy efficient can reduce emissions and switching to renewable energy can eliminate them, only CDR can achieve the “net negative” emissions needed to help restore climate stability.

Five companies around the world — two of which are based in the United States — have already begun commercializing a particular CDR technology called direct air capture. Climeworks, the most advanced of these companies, can already remove 900 tons of atmospheric CO2 per year at its plant in Switzerland. Though these companies have demonstrated that CDR technologies like direct air capture work, costs need to come down and capacity needs to expand for CDR to remove meaningful levels of past emissions from the atmosphere.

Thankfully, the Energy Act of 2020, a subsection of the 2021 Consolidated Appropriations Act, was passed into law in December 2020. This act creates a carbon removal research, development, and demonstration program within the Department of Energy. It also establishes a prize competition for pre-commercial and commercial applications of direct air capture technologies, provides grants for direct air capture and storage test centers, and creates a CDR task force.

The CDR task force will be led by the Secretary of Energy and include the heads of any other relevant federal agencies chosen by the Secretary. The task force is mandated to write a report that includes an estimate of how much excess CO2 needs to be removed from the atmosphere by 2050 to achieve net zero emissions, an inventory and evaluation of CDR approaches, and recommendations for policy tools that the U.S. government can use to meet the removal estimation and advance CDR deployment. This report will be used to advise the Secretary of Energy on next steps for CDR development and will be submitted to the Senate Committee on Energy and Natural Resources and the House of Representatives Committees on Energy and Commerce and Science, Space, and Technology.

The Biden administration has clearly shown its commitment to combating climate change by rejoining the Paris Agreement and signing several Executive Orders that take a whole-of-government approach to the climate crisis. The Energy Act complements these actions by advancing development and demonstration of CDR. However, the Energy Act does not address CDR governance, i.e., the policy tools necessary to efficiently and ethically steward CDR implementation. A proactive governance strategy is needed to ensure that CDR is used to repair past damage and support communities that have been disproportionately harmed by climate change — not as an excuse for the fossil-fuel industry and other major contributors to the climate crisis to continue dumping harmful greenhouse gases into the atmosphere. The CDR task force should therefore leverage the crucial opportunity it has been given to shape future use of CDR by incorporating governance recommendations into its report.

Plan of Action

The Department of Energy’s CDR task force should consider recommending the following options in its final report. Taken together, these recommendations form the basis of a governance framework to ensure that CDR technologies are implemented in a way that most responsibly, equitably, and effectively addresses climate change.

Establish net-zero and net-negative carbon removal targets.

The Energy Act commendably directs the CDR task force to estimate the amount of CO2 that the United States must remove to become net zero by 2050. But the task force should not stop there. The task force should also estimate the amount of CO2 that the United States must remove to limit average global warming to 1.5°C (a target that will require net-negative emissions) and estimate what year this goal could feasibly be achieved. Much like the National Ambient Air Quality Standards enforced by the Environmental Protection Agency, a specific removal target would give the United States a concrete environmental-quality benchmark to work toward. This target could be based on how much CO2 the United States has put into the atmosphere to date and how much of that amount the United States should be responsible for removing. Both net-zero and net-negative removal targets should be codified through legislation to continue progress beyond the Biden administration.

Design a public carbon removal service.

If carbon removal targets become law, the federal government will need to develop an organized way of removing and storing CO2 in order to reach those targets. Therefore, the CDR task force should also consider what it would take to develop a public carbon removal service. Just as waste disposal and sewage infrastructure are public services paid for by those that generate waste, industries would pay for the service of having their past and current CO2 emissions removed and stored securely. Revenue generated from a public carbon removal service could be reinvested into CDR technology, carbon storage facilities, maintenance of CDR infrastructure, environmental justice initiatives, and job creation. As the Biden administration ramps up its American Jobs Plan to modernize the country’s infrastructure, it should consider including carbon removal infrastructure. A public carbon removal service could materially contribute to the goals of expanding clean energy infrastructure and creating jobs in the green economy that the American Jobs Plan aims to achieve. 

Planning the design and implementation of a public carbon removal service should be conducted in parallel with CDR technology development. Knowing which CDR technologies will be used may change how prize competitions and grant programs funded by the Energy Act are evaluated and how the CDR task force prioritizes its policy recommendations. The CDR task force should assess the CDR technology landscape and determine which technologies — including mechanical, agricultural, and ocean-based processes — are best suited for inclusion in a public carbon removal service. The assessment should be based on factors such as affordability, availability, and storage permanence, and could draw on results from the research, development, and demonstration (RD&D) program and the prize competitions mandated by the Energy Act. The task force should also recommend concrete steps towards getting a public carbon removal service up and running. Steps could include, for instance, establishing public-private partnerships with prize competition winners and other commercialized CDR companies.

Create a national carbon accounting standard.

The Energy Act directs the RD&D program to collaborate with the Environmental Protection Agency to develop an accounting framework to certify how much carbon different techniques can remove and how long that carbon can be stored. This may involve investigating the storage permanence of various carbon storage and utilization options, creating a database of storage lifetimes for CDR products and processes, and identifying the CDR techniques best suited for attaining carbon removal targets. The task force could recommend to the Secretary of Energy that this framework become a national standard. A national carbon accounting standard will be integral to achieving carbon removal targets and to verifying removals through the public carbon removal service described above.
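One hypothetical shape such an accounting framework could take is a registry of removal records, with gross removals discounted by expected storage permanence. The field names and permanence factors below are illustrative assumptions, not anything specified by the Energy Act:

```python
# Hypothetical sketch of a carbon-accounting record for a national CDR registry.
# Field names and permanence factors are illustrative assumptions only.
from dataclasses import dataclass

# Assumed fraction of removed CO2 still credited after discounting for
# expected storage permanence (geological storage assumed most durable).
PERMANENCE_FACTOR = {
    "geological": 1.00,
    "mineralized_product": 0.95,
    "soil": 0.60,
    "forest_biomass": 0.50,
}

@dataclass
class RemovalRecord:
    technique: str              # e.g., "direct_air_capture"
    tonnes_co2_removed: float   # gross tonnes certified as removed
    storage_type: str           # key into PERMANENCE_FACTOR

    def credited_tonnes(self) -> float:
        """Gross removal discounted by expected storage permanence."""
        return self.tonnes_co2_removed * PERMANENCE_FACTOR[self.storage_type]

registry = [
    RemovalRecord("direct_air_capture", 900.0, "geological"),
    RemovalRecord("cover_cropping", 500.0, "soil"),
]
total_credited = sum(r.credited_tonnes() for r in registry)
print(f"Credited removals: {total_credited:,.0f} t CO2")  # 1,200 t CO2
```

A permanence-weighted ledger of this kind would let a single standard compare geological, agricultural, and ocean-based removals on a common footing.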

Ensure equity in CDR.

While much of the technical and economic aspects of carbon removal have been (or are being) investigated, questions related to equity remain largely unaddressed. The CDR task force should investigate and recommend policies and actions to ensure that carbon removal does not impose or exacerbate societal inequities, especially for vulnerable communities of color and low-income communities. Recommendations that the task force could explore include:

Include CDR in international climate discussions.

Because CDR is a necessary part of any realistic strategy to keep average global warming to tolerable levels, CDR is a necessary part of future international discussions on climate change. The United States can take the lead by including CDR in its nationally determined contribution (NDC) to the Paris Agreement. The U.S. NDC most recently submitted in April 2021 does discuss increasing carbon sequestration through agriculture and oceans but could be even more aggressive by including a broader suite of CDR technologies (e.g., engineered direct air capture) and prioritizing pursuit of carbon-negative solutions. The CDR task force could recommend that the Department of Energy work with the Special Presidential Envoy for Climate and the Department of State Office of Global Change on (1) enhancing the NDC through CDR, and (2) developing climate-negotiation strategies intended to increase the use of CDR globally.

Conclusion

Global climate change has worsened to the point where simply reducing emissions is not enough. Even if all global emissions were to cease today, the climate impacts of the carbon we have dumped into the atmosphere would continue to be felt for centuries to come. The only solution to this problem is to achieve net-negative emissions by dramatically accelerating development and deployment of carbon dioxide removal (CDR). As one of the world’s biggest emitters, the United States has a responsibility to do all it can to tackle the climate crisis. And as one of the world’s technological and geopolitical leaders, the United States is well positioned to rise to the occasion, investing in CDR governance alongside the technical and economic aspects of CDR. The CDR task force can lead in this endeavor by advising the Secretary of Energy on an overall governance strategy and specific policy recommendations to ensure that CDR is used in an aggressive, responsible, and equitable manner.

Reforming Federal Rules on Corporate-Sponsored Research at Tax-Exempt University Facilities

Improving university/corporate research partnerships is key to advancing U.S. competitiveness. Reform of the IRS rules surrounding corporate-sponsored research taking place in university facilities funded by tax-exempt bonds has long been sought by the higher education community and will stimulate more public-private partnerships. With Congress considering new ways to fund research through a new NSF Technology Directorate and the possibility of a large infrastructure package, an opportunity is now open for Congress to address these long-standing reforms in IRS rules.

Challenge and Opportunity

Research partnerships between private companies and universities are critical to U.S. technology competitiveness. China and other countries are creating massive, government-funded research centers in artificial intelligence, robotics, quantum computing, biotechnology, and other critical sectors, threatening our nation’s international technology advantage. The United States has responded with initiatives such as the corporate research and development (R&D) tax credit, the SBIR and STTR programs, Manufacturing USA institutes, and numerous other programs and policies to assist tech development and encourage public-private collaborations. States and cities have mirrored these efforts, helping to build a network of innovation hubs in communities across the nation.

The U.S. Innovation and Competition Act recently passed by the Senate is designed to build on this progress. A key provision of the Act is the establishment of a National Science Foundation (NSF) Directorate for Science and Technology that would “identify and develop opportunities to reduce barriers for technology transfer, including intellectual property frameworks between academia and industry, nonprofit entities, and the venture capital communities.”

One such barrier is the suite of “private use” rules surrounding corporate research taking place in university facilities financed with tax-exempt bonds. Tax-exempt bonds are a preferred financing option for university research facilities as they carry lower interest rates and more favorable terms. But the Internal Revenue Service (IRS) prohibition on “private business use“ of facilities financed using tax-exempt bonds has the unfortunate consequence of hamstringing the U.S. research enterprise. Current IRS rules effectively require universities wishing to avoid private-use concerns about sponsored research from outside organizations to hold the rights to almost all intellectual property (IP) generated within their research facilities, even when the research is sponsored by private corporations. This can lead U.S. corporations wishing to retain IP rights to partner with overseas universities instead of U.S. institutions. Small technology companies whose business plans depend on their claim to IP rights may similarly avoid partnerships with universities.

Though the IRS has issued policies that aim to address these problems (e.g., Revenue Procedure 2007-47), such policies are so narrow in scope that most research partnerships between companies and universities are still considered private uses. As a result, universities engaged in cutting-edge, industry-relevant research face an unenviable choice: they must either (i) forego promising partnerships with the many companies unwilling to completely cede claims to IP rights, (ii) dedicate substantial time and administrative resources to track and report all specific instances of corporate-sponsored research occurring in facilities financed by tax-exempt bonds, or (iii) use funding that would otherwise be available for research to finance facilities through taxable bonds.

Forcing this choice upon universities further exacerbates a system of “haves” and “have-nots”. Large and/or well-endowed universities may have the financial resources to avoid relying on tax-exempt financing for research facilities, or to hire sophisticated and expensive legal expertise for help structuring financing in a way that complies with IRS rules. But for many — perhaps most — universities, the more viable solution is to avoid corporate-sponsored research altogether.

Complex federal rules governing intellectual property and private business use are widely acknowledged as an issue. A memo from the Association of American Universities (AAU), which represents the leading research universities in North America, notes that “[m]any universities believe that the remaining [IRS] private use regulations are overly restrictive” and “[limit] their ability to conduct certain cooperative research.” Similarly, the website of the Carnegie Mellon University Office of Sponsored Programs warns:

“While colleges and universities have lobbied the Internal Revenue Service to reconsider its position with respect to sponsored research in bond financed facilities, they have not, as yet, been successful. Consequently, if the University does not receive fair market royalties from the sponsors of sponsored research, it risks having its tax-exempt bonds become taxable, with all of the concomitant consequences.”

At a 2013 hearing on “Improving Technology Transfer at Universities, Research Institute and National Laboratories” before the U.S. House of Representatives Committee on Science, Space and Technology, several university witnesses and members of Congress commented on the complications that federal rules present for cooperative research conducted by universities working in partnership with corporations.

In 2014, Congress introduced H.R. 5829 to amend the Internal Revenue Code to provide an exception from the “business use” test for certain public-private research arrangements, but it did not pass as a stand-alone bill.

In June 2021, the American Council on Education and the Association of American Universities released a letter to Congress on behalf of over 20 higher education organizations asking Congress to modernize rules on tax-exempt bond financing.

Overly restrictive federal rules may hamstring bipartisan efforts by the new administration and Congress to accelerate tech commercialization and enhance U.S. competitiveness in science and technology (S&T). The recent U.S. Innovation and Competition Act, passed by the Senate, for instance, aims to support public-private partnerships, cross-sectoral innovation hubs, and other multistakeholder initiatives in priority S&T areas. But such initiatives may run afoul of rules on facilities financed by tax-exempt bonds unless reforms are adopted.

Plan of Action

The administration should implement the following two reforms to clarify and update rules governing use of facilities financed by tax-exempt bonds:

  1. Eliminate the requirement that universities must retain ownership of all IP generated in university-owned facilities financed by tax-exempt bonds. Instead, universities and corporations should be allowed to negotiate their own terms of IP ownership before entering a research partnership.
  2. Broaden applicability of IRS safe-harbor provisions. IRS revenue procedures include safe-harbor provisions that exempt “basic research agreements” from restrictions on private business use. The IRS defines basic research as “any original investigation for the advancement of scientific knowledge not having a specific commercial objective.” This definition is too narrow. Today, the lines between “basic” and applied research are blurry, and virtually nonexistent in cutting-edge fields such as digitalization, biosciences, and quantum computing. The IRS should broaden the applicability of its safe-harbor provisions to include all research activities, not just “basic research.”

Together, these reforms would support new public-private initiatives by the federal government (such as the research hubs funded under the U.S. Innovation and Competition Act); help emerging research universities (including minority-serving institutions such as historically black colleges and universities (HBCUs) and Hispanic-serving institutions (HSIs)) grow their profiles and better compete for talent and resources; and repatriate corporate research to the United States. Moreover — since other countries do not have similarly onerous restrictions on research activities conducted in facilities financed with tax-exempt bonds — these reforms are needed for the U.S. tech economy to remain competitive on an international scale.

These reforms require changes to tax laws, but do not require a direct outlay of federal appropriations. Reforms could be implemented as part of several tech-commercialization legislative packages expected to be considered by this Congress, including the U.S. Innovation and Competition Act or the proposed U.S. infrastructure bill.

Conclusion

As Congress and the Administration explore ways to make the United States more technologically competitive, ensuring robust university-industry partnerships should be a key factor in any strategy. Reforming the current rules concerning corporate research performed in university facilities needs to be considered, given that the IRS rules have not been updated in over 30 years. The debate over the infrastructure bill or other competitiveness initiatives provides an opportunity to make these reforms. Now is the time.

Expanding the Corporation for Public Broadcasting to Fund Local News

The Biden administration can respond to the rise in disinformation campaigns and ensure the long-term viability of American democracy by championing an expanded Corporation for Public Broadcasting to transform, revive, and create local public service newsrooms around the country.

Local newsrooms play key roles in a democracy: informing communities, helping set the agenda for local governments, grounding national debates in local context, and contributing to local economies. It is therefore deeply troubling that the market for local journalism is failing in the United States. Lack of access to credible, localized information makes it harder for communities to make decisions, hampers emergency response, and creates fertile ground for disinformation and conspiracy theories. There is significant, necessary activity in the academic, philanthropic, and journalism sectors to study and document the hollowing out of local news and to sustain, revive, and transform the landscape for local journalism. But the scope of the problem is too big to be solely addressed privately. Maintaining local journalism requires treating it as a public good, with the scale of investment that the federal government is best positioned to provide.

The United States has shown that it can meet the opportunities and challenges of a changing information landscape. In the 1960s, Congress established the Corporation for Public Broadcasting (CPB), creating a successful and valuable public media system in response to the growth of widely available corporate radio and TV. But CPB’s purview hasn’t changed much since then. Given the challenges of today’s information landscape, it is time to reimagine and grow CPB.

The Biden administration should work with Congress to revitalize local journalism in the United States by:

  1. Passing legislation to evolve the CPB into an expanded Corporation for Public Media. CPB’s current mandate is to fund and support broadcast public media. An expanded Corporation for Public Media would continue to support broadcast public media while additionally supporting local, nonprofit, public-service-oriented outlets delivering informational content to meet community needs through digital, print, or other mediums.
  2. Doubling CPB’s annual federal budget allocation, from $445 to $890 million, to fund greater investment in local news and digital innovation. The local news landscape is the subject of significant interest from philanthropies, industry, communities, and individuals. Increased federal funding for local, nonprofit, and public-service-oriented news outlets could stimulate follow-on investment from outside of government.

Challenge and Opportunity

Information systems are fracturing and shifting in the United States and globally. Over the past 20 years, the Internet disrupted news business models and the national news industry consolidated; both shifts have contributed to the reduction of local news coverage. American newsrooms have lost almost a quarter of their staff since 2008. Half of all U.S. counties have only one newspaper or information equivalent, and many counties have no local information source. The media advocacy group Free Press estimates that the national “reporting gap”, which they define as the wages of the roughly 15,000 reporting roles lost since the peak of journalism in 2005, stands at roughly $1 billion a year.

The depletion of reputable local newsrooms creates an information landscape ripe for manipulation. In particular, the shrinking of local newsrooms can exacerbate the risk of “data voids”, when good information is not available via online search and instead users find “falsehoods, misinformation, or disinformation”. In 2019, the Tow Center at the Columbia Journalism School documented 450 websites masquerading as local news outlets, with titles like the East Michigan News, Hickory Sun, and Grand Canyon Times. But instead of publishing genuine journalism, these sites were distributing algorithmically generated, hyperpartisan, locally tailored disinformation. The growing popularity of social media as news sources or conduits to news sources — more than half of American adults today get at least some news from social media — compounds the problem by making it easier for disinformation to spread.

Studies show that the erosion of local journalism has negative impacts on local governance and democracy, including increased voter polarization, decreased accountability for elected officials to their communities, and decreased competition in local elections. Disinformation narratives that take root in the information vacuum left behind when local news outlets fold often disproportionately target and impact marginalized people and people of color. Erosion of local journalism also threatens public safety. Without access to credible and timely local reporting, communities are at greater risk during emergencies and natural disasters. In the fall of 2020, for instance, emergency response to wildfires threatening communities in Oregon was hampered by the spread of inaccurate information on social media. These problems will only become more pronounced if the market for local journalism continues to shrink.

These challenges are urgent, and their enormity can make them feel intractable. Fortunately, history presents a path forward. By the 1960s, with the rise of widely available corporate TV and radio, the national information landscape had changed dramatically. At the time, the federal government recognized the need to reduce the potential harms and meet the opportunities that these information systems presented. In particular, then-Chair of the Federal Communications Commission (FCC) Newton Minow called for more educational and public-interest programming, which the private TV market wasn’t producing. Congress responded by passing the Public Broadcasting Act in 1967, creating the Corporation for Public Broadcasting (CPB). CPB is a private nonprofit responsible for funding public radio and television stations and public-interest programming. (Critically, CPB itself does not produce content.) CPB has a mandate to champion diversity and excellence in programming, serve all American communities, ensure local ownership and independent operation of stations, and shield stations from the possibility of political interference. Soon after its creation, CPB established the independent entities of the Public Broadcasting Service (PBS) and National Public Radio (NPR).

CPB, PBS, NPR, and local affiliate stations collectively developed into the national public media system we know today. Public media is explicitly designed to fill gaps not addressed by the private market in educational, youth, arts, current events, and local news programming, and to serve all communities, including populations frequently overlooked by the private sector. While not perfect, CPB has largely succeeded in both objectives when it comes to broadcast media. CPB supports about 1,500 public television and radio broadcasters, many of which produce and distribute local news in addition to national news. CPB funds much-needed regional collaborations that have allowed public broadcasters to combine resources, increase reporting capacity, and cover relevant regional and localized topics, like the opioid crisis in Appalachia and across the Midwest. CPB also provides critical — though currently too little — funding to broadcast public media for historically underserved communities, including Black, Hispanic, and Native communities.

Public media, backed by CPB’s funding and support, is a consistent bright spot for the struggling journalism industry. More than 2,000 American newspapers closed in the past 15 years, but NPR affiliates added 1,000 local part- and full-time journalist positions from 2011–2018 (though these data are pre-pandemic). Trust in public media like PBS remains high compared to other institutions, and local news is more trusted than national news. There is clear potential for CPB to revive and transform the local news landscape: it can help to develop community trust, strengthen democratic values, and defend against disinformation at a time when all three outcomes are critical.

Unfortunately, the rapid decline of local journalism nationwide has created information gaps that CPB — an institution that has remained largely unchanged since its founding more than 50 years ago — does not have the capacity to fill. Because its public service mandate still applies only to broadcasters, CPB is unable to fund stand-alone digital news sites. The result is a dearth of public-interest newsrooms with the skills and infrastructure to connect with their audiences online and provide good journalism without a paywall. CPB also simply lacks sufficient funding to rebuild local journalism at the pace and scope needed. Far too many people in the United States — especially members of rural and historically underserved communities — live in a “news desert”, without access to any credible local journalism at all.

The time is ripe to reimagine and expand CPB. Congress has recently demonstrated bipartisan willingness to invest in local journalism and public media. Both major COVID-19 relief packages included supplemental funding for CPB. The American Rescue Plan and the CARES Act provided $250 million and $75 million, respectively, in emergency stabilization funds for public media. Funds were prioritized for small, rural, and minority-oriented public-media stations. As part of the American Rescue Plan, smaller news outlets were newly and specifically made eligible for relief funds—a measure that built on the Local News and Emergency Information Act of 2020 previously introduced by Senator Maria Cantwell (D-WA) and Representative David Cicilline (D-RI) and supported by a bipartisan group. Numerous additional bills have been proposed to address specific pieces of the local news crisis. Most recently, Senators Michael Bennet (D-CO), Amy Klobuchar (D-MN), and Brian Schatz (D-HI), along with Representative Marc Veasey (D-TX), reintroduced the Future of Local News Commission Act, calling for a commission “to study the state of local journalism and offer recommendations to Congress on the actions it can take to support local news organizations”.

These legislative efforts recognize that while inspirational work to revitalize local news is being done across sectors, only the federal government can bring the level of scale, ambition, and funding needed to adequately address the challenges laid out above. Reimagining and expanding CPB into an institution capable of bolstering the local news ecosystem is necessary and possible; yet it will not be easy and would bring risks. The United States is in a politically uncertain, difficult, and even violent period, with a rapid, continued fracturing of shared understanding. Given how the public media system has come under attack in the “culture wars” over the decades, many champions of public media are understandably wary of considering any changes to the CPB and public media institutions and initiatives. But we cannot afford to continue avoiding the issue. Our country needs a robust network of local public media outlets as part of a comprehensive strategy to decrease blind partisanship and focus national dialogues on facts. Policymakers should be willing to go to bat to expand and modernize the vision of a national public media system first laid out more than fifty years ago.

Plan of Action

The Biden administration should work with Congress to reimagine CPB as the Corporation for Public Media: an institution with the expanded funding and purview needed to meet the information challenges of today, combat the rise of disinformation, and strengthen community ties and democratic values at the local level. Recommended steps towards this vision are detailed below.

Recommendation 1. Expand CPB’s Purview to Support and Fund Local, Nonprofit, Public Service Newsrooms.

Congress should pass legislation to reestablish the Corporation for Public Broadcasting (CPB) as the Corporation for Public Media (CPM), expanding the Corporation’s purview from solely supporting broadcast outlets to additionally supporting local, nonprofit newsrooms of all types (digital, print, broadcast).

The expanded CPM would retain all elements of the CPB’s mandate, including “ensuring universal access to non-commercial, high-quality content and telecommunications services that are commercial free and free of charge,” with a particular emphasis on ensuring access in rural, small town, and urban communities across the country. Additionally, CPB “strives to support diverse programs and services that inform, educate, enlighten and enrich the public”, especially addressing the needs of underserved audiences, children, and minorities.

Legislation expanding the CPB into the CPM must include criteria that local, nonprofit, public-service newsrooms would need to meet to be considered “public media” and become eligible for federal funding and support. Broadly, “local, nonprofit newsrooms” should mean nonprofit or noncommercial newsrooms that cover a region, state, county, city, neighborhood, or specific community; the term would not include individual reporters, bloggers, documentarians, etc. Currently, CPB relies on FCC broadcasting licenses to determine which stations might be eligible to be considered public broadcasters. The Public Broadcasting Act lays out additional eligibility requirements, including an active community advisory board and regularly filed reports on station activities and spending. Congress should build on these existing requirements when developing eligibility criteria for CPM funding and support.

In designing eligibility criteria, Congress could also draw inspiration from the nonprofit Civic Information Consortium (CIC) being piloted in New Jersey. The CIC is a partnership between higher education institutions and newsrooms in the state to strengthen local news coverage and civic engagement, seeded with an initial $500,000 in funding. The CPM could require that to be eligible for CPM funding, nonprofit public-service newsrooms must partner with local, accredited universities. Requiring that an established external institution be involved in local news endeavors selected for funding would (i) decrease risks of investing in new and untested newsrooms, (ii) leverage federal investment by bringing external capabilities to bear, and (iii) increase the impact of federal funding by creating job training and development opportunities for local students and workers. Though, of course, this model also brings its own risks, potentially putting unwelcome pressure on universities to act as public media and journalism gatekeepers.

An expanded CPM would financially support broadcast, digital, and print outlets at the local and national levels. Once eligibility criteria are established, clear procedures would need to be developed for funding allocation and prioritization (see next section for recommended prioritizations). For instance, procedures should explain how factors such as demonstrated need and community referrals will factor into funding decisions. Procedures should also ensure that funding is prioritized towards local newsrooms that serve communities in “news deserts”, or places at risk of becoming news deserts, and historically underserved communities, including Black, Hispanic, Native, and rural communities.

Congress could consider tasking the commission proposed in the Future of Local News Commission Act with digging deeper into how CPB could be evolved into the CPM in a way that best positions it to address the disinformation and local news crises. The commission could also advise on how to avoid two key risks associated with such an evolution. The first is the perpetual risk of government interference in public media, which CPB’s institutional design and norms have historically guarded against. (For example, there are no content or viewpoint restrictions on public media today—and there should not be in the future.) Second, expanding CPB’s mandate to include broadly funding local, nonprofit newsrooms would create a new risk that disinformation or propaganda sites, operating without journalistic practices, could masquerade as genuine news sites to apply for CPM funding. It will be critical to design against both of these risks. One of the most important factors in CPB’s success is its ethos of public service and the resulting positive norms; these norms and its institutional design are a large part of what makes CPB a good institutional candidate to expand. In designing the new CPM, these norms should be intentionally drawn on and renewed for the digital age.

Numerous digital outlets would likely meet reasonable eligibility criteria. One recent highlight in the journalism landscape is the emergence of many nonprofit digital outlets, including almost 300 affiliated with the Institute for Nonprofit News. (These sites certainly have not made up for the massive numbers of journalism jobs and outlets lost over the past two decades.) There has also been an increase in public media mergers, where previously for-profit digital sites have merged with public media broadcasters to create mixed-media nonprofit journalistic entities. As part of legislation expanding the CPB into the CPM, Congress should make it easier for public media mergers to take place and easier for for-profit newspapers and sites to transition to nonprofit status, as the Rebuild Local News Coalition has proposed.

Recommendation 2. Double CPB’s Yearly Appropriation from $445 to $890 million.

Whether or not CPB is evolved into CPM, Congress should (i) double (at minimum) CPB’s next yearly appropriation from $445 to $890 million, and (ii) appropriate CPB’s funding for the next ten years up front. The first action is needed to give CPB the resources it needs to respond to the local news and disinformation crises at the necessary scale, and the second is needed to ensure that local newsrooms are funded over a time horizon long enough to establish themselves, develop relationships with their communities, and attract external revenue streams (e.g., through philanthropy, pledge drives, or other models). The CPB’s current appropriation is sufficient to fund some percentage of station operational and programming costs at roughly 1,500 stations nationwide. This is not enough. Given that Free Press estimates the national “reporting gap” as roughly $1 billion a year, CPB’s annual budget appropriation needs to be at least doubled. Such increased funding would dramatically improve the local news and public media landscape, allowing newsrooms to increase local coverage and pay for the digital infrastructure needed to better meet audiences where they are—online. The budget increase could be made part of the infrastructure package under Congressional negotiation, funded by the Biden administration’s proposed corporate tax increases, or separately funded through corporate tax increases on the technology sector.

The additional appropriation should be allocated in two streams. The first funding stream (75% of the doubled appropriation, or $667.5 million) should specifically support local public-service journalism. If Free Press’s estimates of the reporting gap are accurate, this stream might be able to recover 75% of the journalism jobs (somewhere in the range of 10,000 to 11,000 jobs) lost since the industry’s decline began in earnest—a huge and necessary increase in coverage. Initial priority for this funding should go to local journalism outlets in communities that have already become “news deserts”. Secondary priority should go to outlets in communities that are at risk of becoming news deserts and in communities that have been historically underserved by media, including Black, Hispanic, Native, and rural communities. Larger, well-funded public media stations and outlets should still receive some funding from this stream (particularly given their role in surfacing local issues to national audiences), but with less of a priority focus. This funding stream should be distributed through a structured formula — similar to CPB’s funding formulas for public broadcasting stations — that ensures these priorities, protects news coverage from government interference, and ensures high-quality news.

The second funding stream (25% of the doubled appropriation, or $222.5 million) should support digital innovation for newsrooms. This funding stream would help local, nonprofit newsrooms build the infrastructure needed to thrive in the digital age. Public media aims to be accessible and meet audiences where they are. Today, that is often online. Though the digital news space is dominated by social media and large tech companies, public media broadcasters are figuring out how to deliver credible, locally relevant reporting in digital formats. NPR, for instance, has innovated successfully with digital platforms like NPR One. But more needs to be done to expand the digital presence of public media. CPB should structure this funding stream as a prize challenge or other competitive funding model to encourage further digital innovation.

Finally, the overall additional appropriation should be used as an opportunity to encourage follow-on investment in local news, by philanthropies, individuals, and the private sector. There is significant interest across the board in revitalizing American local news. The attention that comes with a centralized, federally sponsored effort to reimagine and expand local public media can help drive additional resources and innovation to places where they are most needed. For instance, philanthropies might offer private matches for government investment in local news. Such an initiative could draw inspiration from the existing and successful NewsMatch program, a funding campaign where individuals donate to a nonprofit newsroom and their donation is matched by funders and companies.

Conclusion

Local news is foundational to democracy and the flourishing of communities. Yet with the rise of the Internet and social media companies, the market for local news is failing. There is significant activity in the journalistic, philanthropic, and academic sectors to sustain, revive, and transform the local news landscape. But these efforts can’t succeed alone. Maintaining local journalism as a public good requires the scale of investment that only the federal government can bring.

In 1967, our nation rose to meet a different moment of disruption in the information environment, creating public media broadcasting through the Public Broadcasting Act. Today, there is a similar need for the Biden administration, Congress, and CPB to reimagine public media for the digital age. They should champion an expanded Corporation for Public Media to better meet communities’ information needs; counter the rising tide of disinformation; and transform, revive, and create local, public-service newsrooms around the country.