Improve Extreme Heat Monitoring by Launching a Cross-Agency Temperature Network

Year after year, record-breaking air temperatures and heat waves are reported nationwide. In 2023, Death Valley, California, experienced temperatures as high as 129°F, the highest temperature recorded on Earth for the month of June, and in July, states across the Southwest experienced prolonged heat waves during which temperatures did not drop below 90°F. This is especially worrisome because the frequency, intensity, and duration of extreme heat events are projected to increase, and heat is already the leading weather-related cause of death in the United States. To address this growing threat, the Environmental Protection Agency (EPA) and the National Oceanic and Atmospheric Administration (NOAA) should combine and leverage their existing resources to develop extreme-heat monitoring networks that can capture spatiotemporal trends of heat and protect communities from heat-related hazards.

Urban areas are particularly vulnerable to the effects of extreme heat due to the urban heat island (UHI) effect. However, UHIs are not uniform throughout a city: some neighborhoods experience higher air temperatures than others. Further, communities with larger populations of People of Color and lower socioeconomic status disproportionately experience higher temperatures and are reported to have the highest increase in heat-related mortality. It is imperative for local government officials and city planners to understand who is most vulnerable to the impacts of extreme heat and how temperatures vary throughout a city in order to develop effective heat mitigation and response strategies. While NOAA's National Weather Service (NWS) stations provide hourly, standardized air temperature measurements, their data do not capture intraurban variability.

Challenge and Opportunity

Heat has killed more than 11,000 Americans since 1979, yet the country still lacks a dedicated extreme heat monitoring network. While NOAA NWS stations capture air temperatures at a central location within a city, they do not reveal how temperatures vary across that city. This missing information is necessary to create targeted, location-specific heat mitigation and response efforts.

Synergistic Environmental Hazards and Health Impacts

UHIs are metropolitan areas that experience higher temperatures than surrounding rural regions. The temperature differences can be attributed to many factors, including high impervious surface coverage, lack of vegetation and tree canopy, tall buildings, air pollution, and anthropogenic heat. UHIs are of significant concern as they contribute to higher daytime temperatures and reduce nighttime cooling, which in turn exacerbates heat-related deaths and illnesses in densely populated areas. Heat-related illnesses include heat exhaustion, cramps, edema, syncope, and stroke, among others. However, heat is not uniform throughout a city, and some neighborhoods experience warmer temperatures than others in part due to structural inequalities. Further, it has been found that, on average, People of Color and those living below the poverty line are disproportionately exposed to higher air temperatures and experience the highest increase in heat-related mortality. As temperatures continue to rise, it becomes more imperative for the federal government to protect vulnerable populations and communities from the impacts of extreme heat. This requires tools that can help guide heat mitigation strategies, such as the proposed interagency monitoring network. 

High air temperatures and extreme heat are also associated with poor air quality. Common pavement surfacing materials, like asphalt and concrete, absorb heat and energy from the sun during the day, and the warm air at the surface rises, carrying air pollutants with it. High air temperatures and sunlight also help catalyze the production of air pollutants such as ozone in the atmosphere and alter the movement of air and, therefore, the movement of air pollution. As a result, during extreme heat events, individuals are exposed to increased levels of harmful pollutants. Because poor air quality and extreme heat are directly related, the EPA should expand its air quality networks, which currently only detect pollutants and their sources, to include air temperature. Projections indicate that extreme heat events and poor air quality days will increase due to climate change, with compounding detriments to human health.

Furthermore, extreme heat is linked not only to poor air quality but also to wildfire smoke, and the two are becoming increasingly concomitant. Projections report with very high confidence that warmer temperatures will lengthen the wildfire season and thus increase the area burned. As with poor air quality, extreme heat and wildfire smoke have a synergistic effect in negatively impacting human health: both can lead to cardiovascular and respiratory complications as well as dehydration and death, and these climatic hazards have an even larger impact on environmental and human health when they occur together.

Because the UHI effect is localized and its causes are well understood, cities are ideal locations to implement heat mitigation and adaptation strategies. To execute these plans equitably, it is critical to use an extreme heat monitoring network to identify the areas and communities that are most vulnerable to and most impacted by extreme heat events. The information collected from this network will also be valuable when planning strategies targeting poor air quality and wildfire smoke. Launching an extreme heat monitoring network would have a considerable impact on protecting lives.

Urban Heat Mapping Efforts

Both NOAA and EPA have existing programs that aim to map, reduce, or monitor UHIs throughout the country. These efforts may also have the capacity to implement the proposed heat monitoring network.

Since 2017, NOAA has worked with the National Integrated Heat Health Information System (NIHHIS) and CAPA Strategies LLC to fund yearly UHI mapping campaigns, which have been instrumental in highlighting the uneven distribution of heat throughout U.S. cities. These campaigns rely on community science volunteers who attach NOAA-funded sensors to their cars to collect air temperature, humidity, and time data. The campaigns, however, are currently run only during summer months, and not all major cities are mapped each year. NOAA's NIHHIS has also created a Heat Vulnerability Mapping Tool, which impressively illustrates the relationship between social vulnerability and heat exposure. These maps, however, are not updated in real time and do not display air temperature data. Another critical tool in mapping UHIs is NWS's recently created HeatRisk prototype, which identifies risks of heat-related impacts in numerous parts of the country. This prototype also forecasts levels of heat concern up to seven days into the future. However, HeatRisk does not yet provide forecasts for the entire country, and it uses NWS air temperature products, which do not capture intraurban variability. The EPA has a Heat Island Reduction program dedicated to working with community groups and local officials to find opportunities to mitigate UHIs and adopt projects to build heat-resilient communities. While this program aims to reduce and monitor UHIs, there are no explicit monitoring or mapping strategies in place.

While the products and services of each agency have been instrumental in mapping UHIs throughout the country and in heat communication and mitigation efforts, consistent and real-time monitoring is required to execute extreme heat response plans in a timely fashion. Merging the resources of both agencies would provide the necessary foundation to design and implement a nationwide extreme heat monitoring network.

Plan of Action

Heat mitigation strategies are often citywide, yet there are significant differences in heat exposure between neighborhoods. To create effective heat adaptation and mitigation strategies, it is critical to understand how and where temperatures vary throughout a city. Achieving this requires an extreme heat monitoring network built jointly across federal agencies.

The EPA and NOAA should sign a memorandum of agreement to improve air temperature monitoring nationwide. Following this, the agencies should collaborate to create an extreme heat monitoring network that can capture the intraurban variability of air temperatures in major cities throughout the country.

Implementation and continued success require a number of actions from the EPA and NOAA. 

  1. EPA should expand its Heat Island Reduction program to include monitoring urban heat. The Inflation Reduction Act (IRA) provided the agency with $41.5 billion to fund new and existing programs, with $11 billion going toward clean air efforts. Currently, its noncompetitive and competitive air grants do not address extreme heat. These funds could be used to place air temperature sensors in each census tract within cities to map real-time air temperatures at high spatial resolution.
  2. EPA should add air temperature monitoring to its existing deployments. Because of air quality tracking mandated by the Clean Air Act, EPA already operates air quality monitoring sites in cities throughout the country. Heat monitoring efforts could be tested by placing temperature sensors at the same locations.
  3. EPA and NOAA should help identify the vulnerable communities most impacted by extreme heat. Using EPA's Environmental Justice Screening and Mapping (EJScreen) Tool and NIHHIS's Heat Vulnerability Mapping Tool, the two agencies could determine where to place air temperature monitors, as the largest burden from extreme heat tends to fall on neighborhoods with the lowest economic status.
  4. NOAA should develop additional air temperature sensors. NOAA’s summer UHI campaign programs highlight the agency’s ability to create sensors that capture temperature data. Given their expertise in capturing meteorological conditions, NOAA should develop national air temperature sensors that can withstand various weather conditions.
  5. NOAA should build data infrastructure capable of supporting real-time monitoring. Through NIHHIS, the data obtained from the monitoring network could be updated in real time and made publicly available. These data could also be merged with the current vulnerability mapping tool and HeatRisk to examine extreme heat impacts at finer spatial scales (see the illustrative sketch after this list).
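As a rough illustration of the kind of data processing steps 1 and 5 imply, the sketch below aggregates street-level temperature readings by census tract and flags tracts exceeding a heat threshold. The tract identifiers, sample readings, and 95°F alert threshold are invented for illustration; this is not an existing EPA or NOAA pipeline.

    # Hypothetical sketch: summarize street-level temperature readings by census tract.
    # Tract codes, readings, and the alert threshold are illustrative assumptions only.
    from collections import defaultdict
    from statistics import mean

    HEAT_ALERT_THRESHOLD_F = 95.0  # assumed alert threshold for this example

    def summarize_by_tract(readings):
        """readings: iterable of (census_tract_id, temperature_f) pairs."""
        by_tract = defaultdict(list)
        for tract_id, temp_f in readings:
            by_tract[tract_id].append(temp_f)
        return {
            tract_id: {
                "mean_f": round(mean(temps), 1),
                "max_f": max(temps),
                "n_sensors": len(temps),
                "heat_alert": mean(temps) >= HEAT_ALERT_THRESHOLD_F,
            }
            for tract_id, temps in by_tract.items()
        }

    if __name__ == "__main__":
        sample = [("48453001100", 97.4), ("48453001100", 96.1), ("48453001702", 91.3)]
        for tract, stats in summarize_by_tract(sample).items():
            print(tract, stats)

In a production network, the same aggregation would presumably run continuously against live sensor feeds and publish results to tools such as the NIHHIS mapping products named above.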

Successful implementation of these recommendations would result in a wealth of air temperature data, making it possible to monitor extreme heat at the neighborhood level in cities throughout the United States. These data can serve as a foundation for developing extreme heat forecasting models, which would enable governing bodies to develop and execute response plans in a timely fashion. In addition, the publicly available data from these monitoring networks will allow local, state, and tribal officials, as well as academic and non-academic researchers, to better understand the disproportionate impacts of extreme heat. This insight can support the development of targeted, location-specific mitigation and response efforts.

Conclusion

As temperatures continue to rise in the United States, so do the risks of heat-related hazards, morbidity, and mortality. This is especially true in cities, where the effects of extreme heat are most prevalent. A cross-agency extreme-heat monitoring network can support the development of equitable heat mitigation and disaster preparedness efforts in major cities throughout the country.

This idea of merit originated from our Extreme Heat Ideas Challenge. Scientific and technical experts across disciplines worked with FAS to develop potential solutions in various realms: infrastructure and the built environment, workforce safety and development, public health, food security and resilience, emergency planning and response, and data indices. Review ideas to combat extreme heat here.

Frequently Asked Questions
How are urban heat islands formed?
Cities often have less vegetation and tree canopy cover than surrounding rural areas, which reduces evaporative cooling. Tall, closely spaced buildings reduce wind speed, trapping heat within a city. Buildings, as well as roads, streets, and sidewalks, are very good at absorbing and storing heat from the sun. Additionally, air pollution traps heat that would otherwise escape the city, and waste heat from cars, buildings, and space heating adds to the load.
What is the difference between urban heat islands and heat waves?

Urban heat islands are urbanized regions experiencing higher temperatures compared to nearby rural areas. Heat waves—also known as extreme heat events—are persistent periods of unusually hot weather lasting more than two days. Research has found, however, that urban heat islands and heat waves have a synergistic relationship.

How many people die due to heat in the United States?

Nationwide, more than 1,300 annual deaths are estimated to be attributable to extreme heat. This number is likely an undercount, as medical records do not regularly include the impact of heat when describing the cause of death.

What can communities do to combat rising temperatures?
Cities can create more green spaces and plant more trees to increase evapotranspiration rates and provide shade. Installing cool or green roofs can reduce the amount of heat buildings store throughout the day. Resurfacing roads, streets, or sidewalks with cool pavements may also reduce the amount of stored heat and lessen heat stress for pedestrians. Walking and biking instead of driving, when possible, would reduce the amount of pollution introduced into the air, which could not only combat rising temperatures but also improve air quality.

The Missing Data for Systemic Improvements to U.S. Public School Facilities

Peter Drucker famously said, “You can’t improve what you don’t measure.” Data on facilities helps public schools to make equitable decisions, prevent environmental health risks, ensure regular maintenance, and conduct long-term planning. Publicly available data increases transparency and accountability, resulting in more informed decision making and quality analysis. Across the U.S., public schools lack the resources to track their facilities and operations, resulting in missed opportunities to ensure equitable access to high quality learning environments. As public schools face increasing challenges to infrastructure, such as climate change, this data gap becomes more pronounced.

Why do we need data on school facilities?

School facilities affect student health and learning. The conditions of a school building directly impact the health and learning outcomes of students. The COVID-19 pandemic brought the importance of indoor air quality into the public consciousness. Many other chronic diseases are exacerbated by inadequate facilities, causing absenteeism and learning loss. From asthma to obesity to lead poisoning, the condition of the places where children spend their time impacts their health, wellbeing, and ability to learn. Better data on the physical environment helps us understand the conditions that hinder student learning.

School buildings are a source of emissions and environmental impacts. The U.S. Energy Information Administration reports that schools annually spend $8 billion on energy and emit an estimated 72 million metric tons of carbon dioxide. While the energy use intensity of school buildings is not especially high compared with other sectors, there are notable trends, such as education being the largest consumer of natural gas. The public school bus fleet is the largest mass transit system in the U.S., yet as of 2023, only 1–2% of the country's estimated nearly 500,000 buses are electric.

Data provides accountability for public investment. After highways, elementary and secondary education infrastructure is the leading public capital outlay expenditure nationwide (2021 Census). Most funds to maintain school facilities come from local and state tax sources. Considering the sizable taxpayer investment, relatively little is known about the condition of these facilities. Some state governments have no school facilities staff or funding to help manage or improve school facilities. The 2021 State of Our Schools Report, the leading resource on school facilities data, uses fiscal data to highlight the issues in school facilities. This report found an $85 billion annual school facilities infrastructure funding gap, meaning that, according to industry standards for both capital investment and maintenance, schools are funded $85 billion less than what is required for upkeep. Consistent with these findings, the U.S. Government Accountability Office researched common school facilities issues and found that, in 2020, 50% of districts needed to replace or update multiple essential building systems such as HVAC or plumbing.

What data do we need?

Despite the clear connections between students’ health, learning, and the condition of school buildings, there are no standardized national data sets that assist school leaders and policy makers in making informed and strategic decisions to systematically improve facilities to support health and learning. 

Some examples of data points school facilities advocates want more of include:

Getting strategic and accessible with facilities data

Gathering this type of data represents a significant challenge for schools that are already overburdened and lack administrative support for facilities maintenance and operations. By supporting the best available facilities research methods and facilities condition standards, and by dedicating resources to long-term planning, we can ensure that data collection is undertaken equitably. Some strategies that bear these challenges in mind are:

Incorporate facilities into existing data collection and increase data linkages in integrated, high-quality data centers like the National Center for Education Statistics. School leaders should provide key facilities metrics through the same mechanisms by which they report other education statistics. Creating data linkages allows users to make connections using existing data.

Build capacity to ensure that there are staff and support systems in place to effectively gather and process school facilities data. More federal funds than ever before are available to build schools' capacity to improve facilities conditions. For instance, the U.S. Department of Education recently launched the Supporting America's School Infrastructure grant program, aimed at developing the ability of state departments of education to address facilities matters.

Research how school facilities are connected to environmental justice to better understand how resources could be most equitably distributed. Ten Strands and UndauntedK12 are piloting a framework that looks at pollution burden indicators and school adoption of environmental and climate action. We can support policies and fund research that examines this intersection and makes these connections more transparent.

The connection between school facilities and student health and learning outcomes is clear. What we need now are the resources to effectively collect more data on school facilities that can be used by policy makers and school leaders to plan, improve learning conditions, and provide accountability to the public. 



It’s Time to Move Towards Movement as Medicine

For over 10 years, physical inactivity has been recognized as a global pandemic with widespread health, economic, and social impacts. Despite the wealth of research support for movement as medicine, financial and environmental barriers limit the implementation of physical activity intervention and prevention efforts. The need to translate research findings into policies that promote physical activity has never been higher, as the aging population in the U.S. and worldwide is expected to increase the prevalence of chronic medical conditions, many of which can be prevented or treated with physical activity. Action at the federal and local level is needed to promote health across the lifespan through movement.

Research Clearly Shows the Benefits of Movement for Health

Movement is one of the most important keys to health. Exercise benefits heart health and physical functioning, such as muscle strength, flexibility, and balance. But many people are unaware that physical activity is closely tied to the health conditions they fear most. Of the top five health conditions that people reported being afraid of in a recent survey conducted by the Centers for Disease Control and Prevention (CDC), the risk for four—cancer, Alzheimer’s disease, heart disease, and stroke—is increased by physical inactivity. It’s not only physical health that is impacted by movement, but also mental health and other aspects of brain health. Research shows exercise is effective in treating and preventing mental health conditions such as depression and anxiety, rates of which have skyrocketed in recent years, now impacting nearly one-third of adults in the U.S. Physical fitness also directly impacts the brain itself, for example, by boosting its ability to regenerate after injury and improving memory and cognitive functioning. The scientific evidence is clear: Movement, whether through structured exercise or general physical activity in everyday life, has a major impact on the health of individuals and as a result, on the health of societies.

Movement Is Not Just about Weight, It’s about Overall Lifelong Health

There is increasing recognition that movement is important for more than weight loss, which was the primary focus in the past. Overall health and stress relief are often cited as motivations for exercise, in addition to weight loss and physical appearance. This shift in perspective reflects the growing scientific evidence that physical activity is essential for overall physical and mental health. Research also shows that physical activity is not only an important component of physical and mental health treatment, but it can also help prevent disease, injury, and disability and lower the risk for premature death. The focus on prevention is particularly important for conditions such as Alzheimer’s disease and other types of dementia that have no known cure. A prevention mindset requires a lifespan perspective, as physical activity and other healthy lifestyle behaviors such as good nutrition earlier in life impact health later in life.

Despite the Research, Americans Are Not Moving Enough

Even with so much data linking movement to better health outcomes, the U.S. is part of what has been described as a global pandemic of physical inactivity. Results of a national survey by the CDC published in 2022 found that 25.3% of Americans reported that outside of their regular job, they had not participated in any physical activity in the previous month, such as walking, golfing, or gardening. Rates of physical inactivity were even higher in Black and Hispanic adults, at 30% and 32%, respectively. Another survey highlighted rural-urban differences in the number of Americans who meet CDC physical activity guidelines that recommend ≥ 150 minutes per week of moderate-intensity aerobic exercise and ≥ 2 days per week of muscle-strengthening exercise. Respondents in large metropolitan areas were most active, yet only 27.8% met both aerobic and muscle strengthening guidelines. Even fewer people (16.1%) in non-metropolitan areas met the guidelines.
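For concreteness, the guideline criterion used in these surveys can be expressed as a simple check. This is a minimal sketch: the function and argument names are invented, while the 150-minute aerobic and 2-day muscle-strengthening thresholds come from the CDC guidelines cited above.

    # Minimal sketch of the CDC guideline check described above; names are illustrative.
    def meets_cdc_guidelines(aerobic_minutes_per_week, strength_days_per_week):
        return aerobic_minutes_per_week >= 150 and strength_days_per_week >= 2

    print(meets_cdc_guidelines(180, 2))  # True: meets both guidelines
    print(meets_cdc_guidelines(120, 3))  # False: falls short on aerobic minutes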

Why are so many Americans sedentary? The COVID-19 pandemic certainly exacerbated the problem; however, data from 2010 showed similar rates of physical inactivity, suggesting long-standing patterns of sedentary behavior in the country. Some of the barriers to physical activity are internal to the individual, such as lack of time, motivation, or energy. But other barriers are societal, at both the community and federal level. At the community level, barriers include transportation, affordability, lack of available programs, and limited access to high-quality facilities. Many of these barriers disproportionately impact communities of color and people with low income, who are more likely to live in environments that limit physical activity due to factors such as accessibility of parks, sidewalks, and recreation facilities; traffic; crime; and pollution. Action at the state and federal government level could address many of these environmental barriers, as well as financial barriers that limit access to exercise facilities and programs.

Physical Inactivity Takes a Toll on the Healthcare System and the Economy

Aside from a moral responsibility to promote the health of its citizens, the government has a financial stake in promoting movement in American society. According to recent analyses, inactive lifestyles cost the U.S. economy an estimated $28 billion each year due to medical expenses and lost productivity. Physical inactivity is directly related to the non-communicable diseases that place the highest burden on the economy, such as hypertension, heart disease, and obesity. In 2016, these types of modifiable risk factors accounted for 27% of total healthcare spending. These costs are mostly driven by older adults, which highlights the increasing urgency to address physical inactivity as the population ages. Physical activity is also related to healthcare costs at an individual level, with savings ranging from 9% to 26.6% for physically active people, even after accounting for increased costs due to longevity and injuries related to physical activity. Analysis of 2012 data from the Agency for Healthcare Research and Quality's Medical Expenditure Panel Survey (MEPS) found that each year, people who met World Health Organization aerobic exercise guidelines, which correspond with CDC guidelines, paid on average $2,500 less in healthcare expenses related to heart disease alone compared to people who did not meet the recommended activity levels. Changes are needed at the federal, state, and local level to promote movement as medicine. If physical activity patterns do not change by 2030, an estimated additional $301.8 billion in direct healthcare costs will be incurred.

Government Agencies Can Play a Role in Better Promoting Physical Activity Programs

Promoting physical activity in the community requires education, resources, and removal of barriers so that programs reach all citizens, including communities that are disproportionately impacted by the pandemic of physical inactivity. Integrated efforts from multiple agencies within the federal government are essential.

Past initiatives have met with varying levels of success. For example, Let’s Move!, a campaign initiated by First Lady Michelle Obama in 2010, sought to address the problem of childhood obesity by increasing physical activity and healthy eating, among other strategies. The Food and Drug Administration, Department of Agriculture, Department of Health and Human Services (including the Centers for Disease Control and Prevention), and Department of the Interior were among the federal agencies that collaborated with state and local government, schools, advocacy groups, community-based organizations, and private sector companies. The program helped improve the healthy food landscape, increased opportunities for children to be more physically active, and supported healthier lifestyles at the community level. However, overall rates of childhood obesity have remained constant or even increased in some age brackets since the program started, and there is no evidence of an overall increase in physical activity levels in children and adolescents since that time.

More recently, the U.S. Office of Disease Prevention and Health Promotion’s Healthy People 2030 campaign established data-driven national objectives to improve the health and well-being of Americans. The campaign was led by the Federal Interagency Workgroup, which includes representatives from several federal agencies, including the U.S. Department of Health and Human Services, the U.S. Department of Agriculture, and the U.S. Department of Education. One of the campaign’s leading health indicators (a small subset of high-priority objectives) is increasing the proportion of adults who meet current minimum guidelines for aerobic physical activity and muscle-strengthening activity from 25.2% in 2020 to 29.7% by 2030. There are also movement-related objectives focused on children and adolescents as well as older adults, for example:

Unfortunately, there is currently no evidence of improvement in any of these objectives. All of the objectives related to physical activity with available follow-up data either show little or no detectable change, or they are getting worse.

To make progress towards the physical activity goals established by the Healthy People 2030 campaign, it will be important to identify where breakdowns in communication and implementation may have occurred, whether it be between federal agencies, between federal and local organizations, or between local organizations and citizens. Challenges brought on by the COVID-19 pandemic (e.g., less movement outside of the house for people who now work from home) will also need to be addressed, with the recognition that many of these challenges will likely persist for years to come. Critically, financial barriers should be reduced in a variety of ways, including more expansive coverage by the Centers for Medicare & Medicaid Services for exercise interventions as well as exercise for prevention. Policies that reflect a recognition of movement as medicine have the potential to improve the physical and mental health of Americans and address health inequities, all while boosting the health of the economy.

Opening Up Scientific Enterprise to Public Participation

This article was written as part of the Future of Open Science Policy project, a partnership between the Federation of American Scientists, the Center for Open Science, and the Wilson Center. This project aims to crowdsource innovative policy proposals that chart a course for the next decade of federal open science. To read the other articles in the series, and to submit a policy idea of your own, please visit the project page.

For decades, communities have had little access to scientific information despite paying for it with their tax dollars. The August 2022 Office of Science and Technology Policy (OSTP) memorandum thus catalyzed transformative change by requiring all federally funded research to be made publicly available by the end of 2025. Implementation of the memo has been supported by OSTP’s “Year of Open Science”, which is coordinating actions across the federal government to advance open access research. Access, though, is the first step to building a more responsive, equitable research ecosystem. A more recent memorandum from the Office of Management and Budget (OMB) and OSTP outlining research and development (R&D) policy priorities for fiscal year (FY) 2025 called on federal agencies to address long-standing inequities by broadening public participation in R&D. This is a critical demand signal for solutions that ensure that federally funded research delivers for the American people.

Public engagement researchers have long documented the importance of partnerships with key local stakeholders, such as local government and community-based organizations, in realizing the full breadth of participation within a given community. The lived experience of community members can be an invaluable asset to the scientific process, informing and even shaping research questions, data collection, and interpretation of results. Public participation can also benefit the scientific enterprise by enabling active translation and implementation of research findings, helping to return essential public benefits from the $170 billion invested in R&D each year.

The current reality is that many local governments and community-based organizations do not have the opportunities, incentives, or capacity to engage effectively in federally-funded scientific research. For example, Headwaters Economics found that a significant proportion of communities in the United States do not have the staffing, resources, or expertise to apply to receive and manage federal funding. Additionally, community-based organizations (CBOs) — the groups that are most connected to people facing problems that science could be activated to solve, such as health inequities and environmental injustices — face similar capacity barriers, especially around compliance with federal grants regulations and reporting obligations. Few research funds exist to facilitate the building and maintenance of strong relationships with CBOs and communities, or to provide capacity-building financing to ensure their full participation. Thus, relationships between communities and academia, companies, and the federal government often consume those communities’ time and resources without much return on their investment.

Great participatory science exists, if we know where to look

Place-based investments in regional innovation and research and development (R&D) unlocked by the CHIPS and Science Act (e.g., the Economic Development Administration’s (EDA) Tech Hubs and the National Science Foundation’s (NSF) Regional Innovation Engines and Convergence Accelerator) are starting to provide transformative opportunities to build local research capacity in an equitable manner. What they will need are the incentives, standards, requirements, and programmatic ideas to institutionalize equitable research partnerships.

Models of partnership between community organizations, academic institutions, and/or the federal government have been established that focus on equitable relationships to generate evidence and innovations that advance community needs.

An example of an academic-community partnership is the Healthy Flint Research Coordinating Center (HFRCC). The HFRCC evaluates, and must approve, all research conducted in Flint, Michigan. The HFRCC helps shape proposed studies so they align better with community concerns and context, and it ensures that benefits flow directly back to the community. Health equity is assessed holistically, considering the economic, environmental, behavioral, and physical health of residents. Finally, all work done in Flint is made open access through this organization. From these efforts we learn that communities can play a vital role in defining problems to solve and ensuring the research will be done with equity in mind.

An example of a federal agency-community partnership is the Environmental Protection Agency’s (EPA) Participatory Science Initiative. Through citizen science processes, the EPA has enabled data collection in under-monitored areas to identify climate-related and environmental issues that require both technical and policy solutions. The EPA facilitates these citizen-science initiatives by providing resources on air monitoring equipment and on how to visualize field data. These initiatives specifically empower low-income and minority communities, which face greater environmental hazards but often lack the power and agency to voice concerns.

Finally, communities themselves can be the generators of research projects, initially without a partner organization. In response to the lack of innovation in diabetes care management, patients with Type 1 diabetes founded openAPS. This open source effort spurred the creation of an overnight, closed-loop artificial pancreas system to reduce disease burden and save lives. Through decentralized deployment to over 2,700 individuals, the project has accumulated 63 million hours of real-world “closed-loop” data, with prospective trials and randomized controlled trials (RCTs) showing fewer highs and less severe lows, i.e., greater quality of life. This innovation is now ripe for federal investment and partnership to reach a further critical scale.

Scaling participatory science requires infrastructure

Participatory science and innovation is still an emerging field. Yet effective models for infrastructuring participation within scientific research enterprises have emerged over the past 20 years, building the community engagement capacity of research institutions. Participatory research infrastructure (PRI) could take the form of the following:

  1. Offices that develop tools for interfacing with communities, like citizens’ juries, online platforms, deliberative forums, and future-thinking workshops.
  2. Ongoing technology assessment projects to holistically evaluate innovation and research along dimensions of equity, trust, access, etc.
  3. Infrastructure (physical and digital) for research, design experimentation, and open innovation led by community members.
  4. Organized stakeholder networks for co-creation and community-driven citizen science.
  5. Funding resources to build CBO capacity to meaningfully engage (examples include the RADx-UP program from NIH and the Civic Innovation Challenge from NSF).
  6. Governance structures with community members in decision-making roles and requirements that CBOs help to shape the direction of the research proposals.
  7. Peer-review committees staffed by members of the public, as demonstrated recently by NSF’s Regional Innovation Engines.
  8. Coalitions that utilize research as an input for collective action and making policy and governance decisions to advance communities’ goals.

Call to action

The responsibility of federally funded scientific research is to serve the public good. And yet, because so few interventions have been scaled, participatory science remains a “nice to have” rather than an imperative for the scientific enterprise. To bring participatory science into the mainstream, we will need creative policy solutions that establish incentive mechanisms, standards, funding streams, training ecosystems, assessment mechanisms, and organizational capacity for participatory science. To meet this moment, we need a broader set of voices contributing ideas on this aspect of open science and countless others. That is why we recently launched an Open Science Policy Sprint, in partnership with the Center for Open Science and the Wilson Center. If you have ideas for federal actions that can help the U.S. meet and exceed its open science goals, we encourage you to submit your proposals here.

Towards a Well-Being Economy: Establishing an American Mental Wealth Observatory

Summary

Countries are facing dynamic, multidimensional, and interconnected crises. The pandemic, climate change, rising economic inequalities, food and energy insecurity, political polarization, increasing prevalence of youth mental and substance use disorders, and misinformation are converging, with enormous sociopolitical and economic consequences that are weakening democracies, corroding the social fabric of communities, and threatening social stability and national security. Globalization and digitalization are synchronizing, amplifying, and accelerating these crises globally by facilitating the rapid spread of disinformation through social media platforms, enabling the swift transmission of infectious diseases across borders, exacerbating environmental degradation through increased consumption and production, and intensifying economic inequalities as digital advancements reshape job markets and access to opportunities.

Systemic action is needed to address these interconnected threats to American well-being.

A pathway to addressing these issues lies in transitioning to a Well-Being Economy, one that better aligns and balances the interests of collective well-being and social prosperity with traditional economic and commercial interests. This paradigm shift encompasses a ‘Mental Wealth’ approach to national progress, recognizing that sustainable national prosperity requires more than economic growth alone and instead elevating and integrating social prosperity and inclusivity alongside economic prosperity. To embark on this transformative journey, we propose establishing an American Mental Wealth Observatory, a translational research entity that will provide the capacity to quantify and track the nation’s Mental Wealth and generate the transdisciplinary science needed to empower decision makers to achieve multisystem resilience, social and economic stability, and sustainable, inclusive national prosperity.

Challenge and Opportunity

America is facing challenges that pose significant threats to economic security and social stability. Income and wealth inequalities have risen significantly over the last 40 years, with the top 10% of the population capturing 45.5% of the total income and 70.7% of the total wealth of the nation in 2020. Loneliness, isolation, and lack of connection constitute a public health crisis affecting nearly half of adults in the U.S. In addition to increasing the risk of premature mortality, loneliness is associated with a three-fold greater risk of dementia.

Gun-related suicides and homicides have risen sharply over the last decade. Mental disorders are highly prevalent. Currently, more than 32% of adults and 47% of young people (18–29 years) report experiencing symptoms of anxiety and depression. The COVID-19 pandemic compounded the burden, with a 25–30% upsurge in the prevalence of depressive and anxiety disorders. America is experiencing a social deterioration that threatens its continued prosperity, as evidenced by escalating hate crimes, racial tensions, conflicts, and deepening political polarization. 

To reverse these alarming trends in America and globally, policymakers must first acknowledge that these problems are interconnected and cannot effectively be tackled in isolation. For example, despite the tireless efforts of prominent stakeholder groups and policymakers, the burden of mental disorders persists, with no substantial reduction in global burden since the 1990s. This lack of progress is evident even in high-income countries where investments in and access to mental health care have increased. 

Strengthening or reforming mental health systems, developing more effective models of care, addressing workforce capacity challenges, leveraging technology for scalability, and advancing pharmaceuticals are all vital for enhancing recovery rates among individuals grappling with mental health and substance use issues. However, policymakers must also better understand the root causes of these challenges so we can reshape the economic and social environments that give rise to common mental disorders.

Understanding and Addressing the Root Causes 

Prevention research and action often focus on understanding and addressing the social determinants of health and well-being. However, this approach lacks focus. For example, traditional analytic approaches have delivered an extensive array of social determinants of mental health and well-being, which are presented to policymakers as imperatives for investment. These include (but are not limited to):

This practice is replicated across other public health and social challenges, such as obesity, child health and development, and specific infectious and chronic diseases. Long lists of social determinants, each lobbied for as an investment priority, lead policymakers to conclude that nations simply cannot afford to invest enough to solve these health and social challenges.

However, it is likely that many of these determinants and challenges are merely symptoms of a more systemic problem. Therefore, treating the ongoing symptoms only perpetuates a cycle of temporary relief, diverts resources away from nurturing innovation, and impedes genuine progress.

To create environments that foster mental health and well-being, where children can thrive and fulfill their potential, where people can pursue meaningful vocations and feel connected and supported in giving back to their communities, and where Americans can live healthy, active, and purposeful lives, policymakers must recognize that human flourishing and the prosperity of nations depend on a delicate balance of interconnected systems.

The Rise of Gross Domestic Product: An Imperfect Measure for Assessing the Success and Wealth of Nations

To understand the roots of our current challenges, we need to look at the history of the foundational economic metric, gross domestic product (GDP). While the concept of GDP had been established decades earlier, it was during a 1960 meeting of the Organization for Economic Co-operation and Development that economic growth became a primary ambition of nations. In the shadow of two world wars and the Great Depression, member countries pledged to achieve the highest sustainable economic growth, employment, efficiency, and development of the world economy as their top priority (Articles 1 & 2). 

GDP growth became the definitive measure of a government’s economic management and its people’s welfare. Over subsequent decades, economists and governments worldwide designed policies and implemented reforms aimed at maximizing economic efficiency and optimizing macroeconomic structures to ensure consistent GDP growth. The belief was that by optimizing the economic system, prosperity could be achieved for all, allowing governments to afford investments in other crucial areas. However, prioritizing the optimization of one system above all others can have unintended consequences, destabilizing interconnected systems and leading to a host of symptoms we currently recognize as the social determinants of health. 

As a result of the relentless focus on optimizing processes, streamlining resources, and maximizing worker productivity and output, our health, social, political, and environmental systems are fragile and deteriorating. By neglecting the necessary buffers, redundancies, and adaptive capacities that foster resilience, organizations and nations have unwittingly left themselves exposed to shocks and disruptions. Americans face a multitude of interconnected crises, which will profoundly impact life expectancy, healthy development and aging, social stability, individual and collective well-being, and our very ability to respond resiliently to global threats. Prioritizing economic growth has led to neglect and destabilization of other vital systems critical to human flourishing.

Shifting Paradigms: Building the Nation’s Mental Wealth 

The system of national accounts that underpins the calculation of GDP is a significant human achievement, providing a global standard for measuring economic activity. It has evolved over time to encompass a wider range of activities based on what is considered productive to an economy. As recently as 1993, finance was deemed “explicitly productive” and included in GDP. More recently, Biden-Harris Administration leaders have advanced guidance for accounting for ecosystem services in benefit-cost analyses for regulatory decision-making and a roadmap for natural capital inclusion in the nation’s economic accounting services. This shows the potential to expand what counts as beneficial to the American economy—and what should be measured as a part of economic growth.

While many alternative indices and indicators of well-being and national prosperity have been proposed, such as the genuine progress indicator, the vast majority of policy decisions and priorities remain focused on growing GDP. Further, these metrics often fail to recognize the inherent value of the system of national accounts on which GDP is based. To address this, Mental Wealth is a measure that expands the inputs of GDP to include well-being indicators. In addition to economic production metrics, Mental Wealth includes both unpaid activities that contribute to the social fabric of nations and social investments that build community resilience. These unpaid activities (Figure 1, social contributions, Cs) include volunteering, caregiving, civic participation, environmental restoration, and stewardship, and are collectively called social production. Mental Wealth also includes the sum of investment in community infrastructure that enables engagement in socially productive activities (Figure 1, social investment, Is). This more holistic indicator of national prosperity provides an opportunity to shift policy priorities toward greater balance between the economy and broader societal goals and is a measure of the strength of a Well-Being Economy.

Figure 1. Mental Wealth is a more comprehensive measure of national prosperity that monetizes the value generated by a nation’s economic and social productivity.
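Read literally, the composition described above can be written as a simple sum. This is a sketch of the stated relationship, not the authors’ formal national-accounting specification:

    Mental Wealth = GDP + Cs + Is

where Cs is the monetized value of social production (volunteering, caregiving, civic participation, environmental restoration, and stewardship) and Is is investment in the community infrastructure that enables those activities.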

Valuing social production also promotes a more inclusive narrative of a contributing life, and it helps to rebalance societal focus from individual self-interest to collective responsibilities. A recent report suggests that, in 2021, Americans contributed more than $2.293 trillion in social production, equating to 9.8% of GDP that year. Yet social production is significantly underestimated due to data gaps. More data collection is needed to analyze the extent and trends of social production, estimate the nation’s Mental Wealth, and assess the impact of policies on the balance between social and economic production.
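As a rough consistency check on that figure, using a 2021 U.S. nominal GDP of roughly $23.3 trillion (an outside figure, not stated in this piece):

    $2.293 trillion / $23.3 trillion ≈ 0.098, i.e., about 9.8% of GDP.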

Unlocking Policy Insights through Systems Modeling and Simulation

Systems modeling plays a vital role in the transition to a Well-Being Economy by providing an understanding of the complex interdependencies between economic, social, environmental, and health systems and by guiding policy actions. Systems modeling brings together expertise in mathematics, biostatistics, social science, psychology, economics, and more, along with disparate datasets and the best available evidence across multiple disciplines, to better understand which policies across which sectors will deliver the greatest benefits to the economy and society in balance. Simulation allows policymakers to anticipate the impacts of different policies, identify strategic leverage points, assess trade-offs and synergies, and make more informed decisions in pursuit of a Well-Being Economy. Forecasting and future projections are a long-standing staple of infectious disease epidemiologists, business and economic strategists, and government agencies such as the National Oceanic and Atmospheric Administration, whose projections help prepare the nation for the economic realities of climate change.
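To make the idea of policy simulation concrete, the toy sketch below steps a minimal stock-and-flow model forward in time and compares two social-investment settings. Every variable, coefficient, and the policy lever itself is invented for illustration; this is not one of the Observatory’s proposed models.

    # Toy stock-and-flow sketch: compare two social-investment settings over 20 years.
    # All parameters are invented for illustration; this is not a validated policy model.
    def simulate(social_investment_rate, years=20):
        wellbeing, output = 100.0, 100.0  # arbitrary index values at year 0
        for _ in range(years):
            # well-being rises with social investment and decays slowly without it
            wellbeing += social_investment_rate * output - 0.02 * wellbeing
            # output grows at a baseline rate, gains a small well-being feedback,
            # and loses the share of output diverted to social investment
            output += output * (0.02 + 0.0005 * (wellbeing - 100.0) - social_investment_rate)
        return round(wellbeing, 1), round(output, 1)

    for rate in (0.01, 0.03):
        wb, out = simulate(rate)
        print(f"investment rate {rate:.0%}: wellbeing index {wb}, output index {out}")

A real systems model would be far richer, but the same logic of testing a lever in a safe virtual environment before implementing it in the real world underlies the simulation work described for Stream 2 of the Observatory.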

Plan of Action 

An American Mental Wealth Observatory to Support Transition to a Well-Being Economy

Given the social deterioration that is threatening America’s resilience, stability, and sustainable economic prosperity, the federal government must systemically redress the imbalance by establishing a framework that privileges an inclusive, holistic, and balanced approach to development. The government should invest in an American Mental Wealth Observatory (Table 1) as critical infrastructure to guide this transition. The Observatory will report regularly on the strength of the Well-Being Economy as a part of economic reporting (see Table 1, Stream 1); generate the transdisciplinary science needed to inform systemic reforms and coordinated policies that optimize the economic, environmental, health, and social sectors in balance, such as adding Mental Wealth to the system of national accounts (Streams 2–4); and engage in the communication and diplomacy needed to achieve national and international cooperation in transitioning to a Well-Being Economy (Streams 5–6).

This transformative endeavor demands the combined instruments of science, policy, politics, public resolve, social legislation, and international cooperation. It recognizes the interconnectedness of systems and the importance of a systemic and balanced approach to social and economic development in order to build equitable long-term resilience, a current federal interagency priority. The Observatory will make better use of available data from across multiple sectors to provide evidence-based analysis, guidance, and advice. The Observatory will bring together leading scientists (across disciplines of economics, social science, implementation science, psychology, mathematics, biostatistics, business, and complex systems science), policy experts, and industry partners through public-private partnerships to rapidly develop tools, technologies, and insights to inform policy and planning at national, state, and local levels. Importantly, the Observatory will also build coalitions between key cross-sectoral stakeholders and seek mandates for change at national and international levels. 

The American Mental Wealth Observatory should be chartered by the National Science and Technology Council, building off the work of the White House Report on Mental Health Research Priorities. Federal partners should include, at a minimum, the Department of Health and Human Services (HHS) Office of the Assistant Secretary for Health (OASH), specifically the Office of the Surgeon General (OSG) and Office of Disease Prevention and Health Promotion (ODPHP); the Substance Abuse and Mental Health Services Administration (SAMHSA); the Office of Management and Budget; the Council of Economic Advisors (CEA); and the Department of Commerce (DOC), alongside strong research capacity provided by the National Science Foundation (NSF) and the National Institutes of Health (NIH).

Table 1. Blueprint for an American Mental Wealth Observatory
The aim of the American Mental Wealth Observatory is to provide the data and science needed to act systemically to transition to a Well-Being Economy and to build multi-system resilience, human flourishing, and national prosperity. The Observatory will have six overlapping streams of activity.
Stream 1: Measuring and monitoring the nation’s mental wealth (CEA, OSTP, OMB, DOC)
While a number of communities and nations are embracing Well-Being Economy frameworks and tracking progress against a broad range of indicators of individual and societal well-being, an overarching measure of progress is needed. Without it, GDP will remain a privileged indicator that policy levers are trained on. This stream is focused on the further evolution of GDP to be a more holistic topline indicator of the strength of a Well-Being Economy: Mental Wealth. National Mental Wealth will be estimated and reported annually in the establishment phase, followed by quarterly intervals to mirror reporting of GDP. This effort can build on existing frameworks developed by DOC to include natural capital accounting within the system of national accounts, including linking Mental Wealth accounts with national economic accounts, interagency coordination and data standardization and interoperability policy, and organizing the development of a U.S. system of statistics for Mental Wealth decision-making.
Stream 2: Complex systems modeling and simulation (NSF, NIH, OASH, SAMHSA, OSTP, DOC)
Advancing from rudimentary analytic and decision support tools to harnessing complex systems modeling and simulation will inform greater alignment of policies across economic, social, and health systems to enhance Mental Wealth (economic and social prosperity). Systems models are platforms for Living Evidence. Developing systems models requires the integration of scientific theory with best available research evidence and diverse data sources in a way that allows decision makers to test alternative policies and initiatives or ask resource allocation questions in a safe virtual environment before implementing them in the real world. As new evidence and data become available, models are updated/refined, becoming more robust over time, and offering significant value as long-term decision support assets.
Stream 3: Strengthening transdisciplinary data ecosystems (SAMHSA, OASH, DOC, OMB, OSTP, CEA, NIH, NSF)
Strengthening transdisciplinary data ecosystems by harnessing advances in technology and passive and/or sentinel surveillance is essential, and will provide intelligence to inform coordinated cross-sectoral policy and planning.
This stream will also support early detection and rapid response to system stress and inform both the Stream 2 modeling and the Stream 4 Brain Capital research program. This program will include the establishment of a U.S. Brain Capital Dashboard and ongoing monitoring of brain capital indicators across three pillars: brain capital drivers (social, digital, economic), brain health (including mental health, well-being, and neurological disorders), and brain skills (cognitive and emotional skills and education metrics).
In addition, innovative protocols are being developed. For example, a protocol for scalable wastewater monitoring of stress hormones such as cortisol and cortisone is under development to provide near-real-time insight into community stress and to inform the rapid deployment of resources and infrastructure that can support communities through difficult times and prevent social decline before it becomes entrenched.
Stream 4: Brain Capital research program (NSF, NIH, OSTP)
Investing in research that prioritizes brain capital enhancement opens doors to understanding and harnessing the economic value of human cognitive abilities (coupled with augmented intelligence offered by generative AI), mental health, and overall brain functioning. Recognizing and nurturing the economic value of brain capital can pave the way for a more prosperous and sustainable future, where individuals and societies thrive both intellectually and economically.
This research program will harness advanced research technologies to answer priority questions such as:


  • What are the likely impacts of AI on the diffusion of productivity gains, wealth, and well-being?

  • What are the projected impacts of early childhood education and care (ECEC) on school readiness, workforce participation, and family income?

  • What is the relationship between social capital infrastructure investment, social connectedness, and mental health in young people?

  • How is AI changing the nature of work, well-being, and productivity?

  • What is the optimal balance of digital technologies and human workforces needed to scale mental health and social care to meet demand?

  • How can employers and educators work together to create workforces and workplaces that are adaptable to changing circumstances by mastering quality, transferable vocational skills?
Stream 5: Knowledge translation / Policy Lab (CEA, OMB, OSTP, external nonprofits and academic research institutions)
Shifting entrenched economic narratives and frameworks requires transdisciplinary policy advocacy, knowledge translation, and public communications alongside private stakeholders. A stable transition to a Well-Being Economy will require broad scientific, policy, and public support, as well as closer cooperation between the public and private sectors.
Stream 6: Brain Health / Science diplomacy (OSTP, State Department)
Nothing less than an ambitious, innovative, transdisciplinary, and coordinated transnational research agenda is needed to enable the transition to a Well-Being Economy. The open sharing of insights, tools, and metrics across global agencies is needed to elevate mental health as a policy focus and to inform policy and advocacy efforts and momentum for change. This stream will therefore focus on building bridges between countries through a shared appreciation that the integrity of a nation’s social fabric underpins its stability and resilience. Science diplomacy will also be important in facilitating the sharing of knowledge and innovations across borders and in fostering international cooperation.

Operationalizing the American Mental Wealth Observatory will require an annual investment of roughly $12 million from diverse sources, including government appropriations, private foundations, and philanthropy. This funding would be used to implement a comprehensive range of priority initiatives spanning the six streams of activity (Table 2) coordinated by the American Mental Wealth Observatory leadership. Acknowledging the critical role of brain capital in upholding America’s prosperity and security, this investment offers considerable returns for the American people.

Table 2. Investment needed to actualize an American Mental Wealth Observatory
Budget (US$M)

Stream                                                               2024   2025   2026   2027   2028
Stream 1: Measuring and monitoring the Mental Wealth of the nation   1.5    1.7    1.7    1.7    1.7
Stream 2: Complex systems modeling and simulation                    2.3    2.8    2.8    2.8    2.8
Stream 3: Strengthening transdisciplinary data ecosystems            2.8    3.0    3.7    3.1    3.1
Stream 4: Brain Capital research program                             2.5    3.0    3.0    3.0    2.5
Stream 5: Knowledge translation / Policy Lab                         1.5    1.5    1.5    1.5    1.5
Stream 6: Brain Health / Science diplomacy                           0.7    0.7    0.7    0.7    0.7
Total                                                               11.3   12.7   13.4   12.8   12.3

Conclusion

America stands at a pivotal moment, facing the aftermath of a pandemic, a pressing crisis in youth mental health and substance use, and a deepening sense of disconnection and loneliness. The fragility of our health, social, environmental, and political systems has come into sharp focus, and the global threats of climate change and generative AI loom large. There is a growing sense that the current path is unsustainable.

After six decades of optimizing the economic system for growth in GDP, Americans are reaching a tipping point where losses due to systemic fragility, disruption, instability, and civil unrest will outweigh the benefits. The United States government and private sector leaders must forge a new path. The models and approaches that guided us through the 20th century are ill-equipped to guide us through the challenges and threats of the 21st century.

This realization presents an extraordinary opportunity to transition to a Well-Being Economy and rebuild the Mental Wealth of the nation. An American Mental Wealth Observatory will provide the data and science capacity to help shape a new generation grounded in enlightened global citizenship, civic-mindedness, and human understanding, and equipped with the cognitive, emotional, and social resources to address global challenges with unity, creativity, and resilience.

The University of Sydney’s Mental Wealth Initiative thanks the following organizations for their support in drafting this memo: FAS, the OECD, Rice University’s Baker Institute for Public Policy, Boston University School of Public Health, the Brain Capital Alliance, and CSART.

Frequently Asked Questions
What is brain capital?

Brain capital is a collective term for brain skills and brain health, which are fundamental drivers of economic and social prosperity. Brain capital comprises (1) brain skills, which include the ability to think, feel, work together, be creative, and solve complex problems, and (2) brain health, which includes mental health, well-being, and neurological disorders that critically affect people’s ability to use those skills effectively, to build and maintain positive relationships with others, and to remain resilient against challenges and uncertainties.

What is the social benefit of valuing unpaid forms of labor (social production)?

Social production is the glue that holds society together. These unpaid social contributions foster community well-being, support our economic productivity, improve environmental well-being, and help make us more prosperous and resilient as a nation.


Social production includes volunteering and charity work, educating and caring for children, participating in community groups, and environmental restoration—basically any activity that contributes to the social fabric and community well-being.


Making the value of social production visible helps us track how economic policies are affecting social prosperity and allows governments to act to prevent an erosion of our social fabric. So instead of measuring national welfare through GDP alone, measuring and reporting social production as well gives us a more holistic picture. The two combined (GDP plus social production) constitute what we call the overall Mental Wealth of the nation, a measure of the strength of a Well-Being Economy.
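To make the arithmetic concrete, here is a minimal sketch of how the two components might be combined. The figures and the replacement-wage valuation approach are illustrative assumptions only, not the Observatory’s actual accounting methodology.

```python
# Illustrative sketch: combining GDP with a valuation of unpaid social production.
# All figures and the replacement-wage approach are hypothetical assumptions,
# not the Observatory's actual accounting methodology.

gdp_trillions = 27.0            # assumed GDP, in trillions of dollars
unpaid_hours_billions = 100.0   # assumed annual hours of unpaid social production, in billions
replacement_wage = 20.0         # assumed dollars per hour to replace that labor in the market

social_production_trillions = unpaid_hours_billions * 1e9 * replacement_wage / 1e12
mental_wealth = gdp_trillions + social_production_trillions

print(f"GDP:               ${gdp_trillions:.1f}T")
print(f"Social production: ${social_production_trillions:.1f}T")
print(f"Mental Wealth:     ${mental_wealth:.1f}T")
```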

As a society, what do we stand to lose by not measuring the Mental Wealth of the nation?

The Mental Wealth metric extends GDP to include not only the value generated by our economic productivity but also the value of this social productivity. In essence, it is a single measure of the strength of a Well-Being Economy. Without a Mental Wealth assessment, we won’t know how we are tracking overall in transitioning to such an economy.


Furthermore, GDP only includes the value created by those in the labor market. The exclusion of socially productive activities sends a signal that society does not value the contributions made by those not in the formal labor market. Privileging employment as a legitimate social role and indicator of societal integration leads to the structural and social marginalization of the unemployed, older adults, and the disabled, which in turn leads to lower social participation, intergenerational dependence, and the erosion of mental health and well-being.

How do well-being frameworks compare to Mental Wealth, and why are you proposing something different?

Well-being frameworks are an important evolution in our journey to understand national prosperity and progress in more holistic terms. Dashboards of 50 to 80 indicators, like those proposed in Australia, Scotland, New Zealand, Iceland, Wales, and Finland, cover areas such as health, education, housing, income and wealth distribution, and life satisfaction, and they help track some important contributors to social well-being.


However, these sorts of dashboards are unlikely to compete with topline economic measures like GDP as a policy focus. Some indicators will go up, some will go down, some will remain steady, so dashboards lack the ability to provide a clear statement of overall progress to drive policy change.


We need an overarching measure. Measurement of the value of social production can be integrated into the system of national accounts so that we can regularly report on the nation’s overall economic and social well-being (or Mental Wealth). Mental Wealth provides a dynamic measure of the strength (and good management) of a Well-Being Economy. By adopting Mental Wealth as an overarching indicator, we also gain an improved understanding of the interdependence of a healthy economy and a healthy society.

Tilling the Federal SOIL for Transformative R&D: The Solution-Oriented Innovation Liaison

Summary 

The federal government is increasingly embracing Advanced Research Projects Agencies (ARPAs) and other transformative research and engagement enterprises (TREEs) to connect innovators and create the breakthroughs needed to solve complex problems. Our innovation ecosystem needs more of these TREEs, especially for societal challenges that have not historically benefited from solution-oriented research and development. And because the challenges we face are so interwoven, we want these TREEs to work and grow together in a solution-oriented mode.

The National Science Foundation (NSF)’s new Directorate for Technology, Innovation and Partnerships should establish a new Office of the Solution-Oriented Innovation Liaison (SOIL) to help TREEs share knowledge about complementary initiatives, establish a community of practice among breakthrough innovators, and seed a culture for exploring new models of research and development within the federal government. The SOIL would have two primary goals: (1) provide data, information, and knowledge-sharing services across existing TREEs; and (2) explore opportunities to pilot R&D models of the future and embed breakthrough innovation models in underleveraged agencies.

Challenge and Opportunity

Climate change. Food security. Social justice. There is no shortage of complex challenges before us—all intersecting, all demanding civil action, and all waiting for us to share knowledge. Such challenges remain intractable because they are broader than the particular mental models that any one individual or organization holds. To develop solutions, we need science that is more connected to social needs and to other ways of knowing. Our problem is not a deficit of scientific capital. It is a deficit of connection.

Connectivity is what defines a growing number of approaches to the public administration of science and technology, alternatively labeled as transformative innovation, mission-oriented innovation, or solutions R&D. Connectivity is what makes DARPA, IARPA, and ARPA-E work, and it is why new ARPAs are being created for health and proposed for infrastructure, labor, and education. Connectivity is also a common element among an explosion of emerging R&D models, including Focused Research Organizations (FROs) and Distributed Autonomous Organizations (DAOs). And connectivity is the purpose of NSF’s new Directorate for Technology, Innovation and Partnerships (TIP), which includes “fostering innovation ecosystems” in its mission. New transformative research and engagement enterprises (TREEs) could be especially valuable in research domains at the margins, where “the benefits of innovation do not simply trickle down.”

The history of ARPAs and other TREEs shows that solutions R&D is successfully conducted by entities that combine both research and engagement. If grown carefully, such organisms bear fruit. So why just plant one here or there when we could grow an entire forest? The metaphor is apt. To grow an innovation ecosystem, we must intentionally sow the seeds of TREEs, nurture their growth, and cultivate symbiotic relationships—all while giving each the space to thrive.

Plan of Action

NSF’s TIP Directorate should create a new Office of the Solution-Oriented Innovation Liaison (SOIL) to foster a thriving community of TREEs. SOIL would have two primary goals: (1) nurture more TREEs of more varieties in more mission spaces; and (2) facilitate more symbiosis among TREEs of increasing number and variety.

Goal 1: More TREEs of more varieties in more mission spaces

SOIL would shepherd the creation of TREEs wherever they are needed, whether in a federal department, a state or local agency, or in the private, nonprofit, or academic sectors. Key to this is codifying the lessons of successful TREEs and translating them to new contexts. Not all such knowledge is codifiable; much is tacit. As such, SOIL would draw upon a cadre of research-management specialists who have deep familiarity with different organizational forms (e.g., ARPAs, FROs, DAOs) and could work with the leaders of departments, businesses, universities, consortia, and other entities to determine which form best suits their needs, and then provide technical assistance in establishing it.

An essential part of this work would be helping institutions create mission-appropriate governance models and cultures. Administering TREEs is neither easy nor typical. Indeed, the very fact that they are managed differently from normal R&D programs makes them special. Former DARPA Director Arati Prabhakar has emphasized the importance of such tailored structures to the success of TREEs. To this end, SOIL would also create a Community of Cultivators comprising former TREE leaders, principal investigators (PIs), and staff. Members of this community would provide those seeking to establish new TREEs with guidance during the scoping, launch, and management phases.

SOIL would also provide opportunities for staff at different TREEs to connect with each other and with collective resources. It could, for example, host dedicated liaison officers at agencies (as DARPA has with its service lines) to coordinate access to SOIL resources and other TREEs and support the documentation of lessons learned for broader use. SOIL could also organize periodic TREE conventions for affiliates to discuss strategic directions and possibly set cross-cutting goals. Similar to the SBIR office at the Small Business Administration, SOIL would also report annually to Congress on the state of the TREE system, as well as make policy recommendations.

Goal 2: More symbiosis among TREEs of increasing number and variety

Success for SOIL would be a community of TREEs that is more than the sum of its parts. It is already clear how the defense and intelligence missions of DARPA and IARPA intersect. There are also energy programs at DARPA that might benefit from deeper engagement with programs at ARPA-E. In the future, transportation-infrastructure programs at ARPA-E could work alongside similar programs at an ARPA for infrastructure. Fostering stronger connections between entities with overlapping missions would minimize redundant efforts and yield shared platform technologies that enable sector-specific advances.

Indeed, symbiotic relationships could spawn untold possibilities. What if researchers across different TREEs could build knowledge together? Exchange findings, data, algorithms, and ideas? Co-create shared models of complex phenomena and put competing models to the test against evidence? Collaborate across projects, and with stakeholders, to develop and apply digital technologies as well as practices to govern their use? A common digital infrastructure and virtual research commons would enable faster, more reliable production (and reproduction) of research across domains. This is the logic underlying the Center for Open Science and the National Secure Data Service.

To this end, SOIL should build a digital Mycelial Network (MyNet), a common virtual space that would harness the cognitive diversity across TREEs for more robust knowledge and tools. MyNet would offer a set of digital services and resources that could be accessed by TREE managers, staff, and PIs. Its most basic function could be to depict the ecosystem of challenges and solutions, search for partners, and deconflict programs. Once partnerships are made, higher-level functions would include secure data sharing, co-creation of solutions, and semantic interconnection. MyNet could replace the current multitude of ad hoc, sector-specific systems for sharing research resources, giving more researchers access to more knowledge about complex systems and fewer obstacles from paywalls. And the larger the network, the bigger the network effects. If the MyNet infrastructure proves successful for TREEs, it could ultimately be expanded more broadly to all research institutions—just as ARPAnet expanded into the public internet. 

For users, MyNet would have three layers: a data layer, an information layer, and a knowledge layer.

These functions would collectively require:

How might MyNet be applied? Consider three hypothetical programs, all focused on microplastics: a medical program that maps how microplastics are metabolized and impact health; a food-security program that maps how microplastics flow through food webs and supply chains; and a social justice program that maps which communities produce and consume microplastics. In the data layer, researchers at the three programs could combine data on health records, supply logistics, food inspections, municipal records, and demographics. In the information layer, they might collaborate on coding and evaluating quantitative models. Finally, in the knowledge layer, they could work together to validate claims regarding who is impacted, how much, and by what means.
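A minimal sketch of what the data layer of such a collaboration might look like in practice is shown below; the datasets, column names, and county-level join key are hypothetical, chosen only to illustrate how three programs’ records could be combined on a shared key before information- and knowledge-layer work begins.

```python
# Hypothetical sketch of a MyNet "data layer" join across three microplastics programs.
# All dataset contents, column names, and the county-level join key are illustrative assumptions.
import pandas as pd

health = pd.DataFrame({
    "county": ["A", "B"],
    "metabolite_ng_per_ml": [12.4, 7.9],       # hypothetical biomonitoring measure
})
food = pd.DataFrame({
    "county": ["A", "B"],
    "microplastic_ppm_in_food": [0.8, 0.3],    # hypothetical food-inspection measure
})
justice = pd.DataFrame({
    "county": ["A", "B"],
    "pct_low_income": [34.0, 12.0],            # hypothetical demographic measure
})

# Combine the three programs' data on a shared geographic key so that
# models (information layer) and validated claims (knowledge layer) can build on it.
combined = health.merge(food, on="county").merge(justice, on="county")
print(combined)
```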

Initial Steps

First, Congress should authorize a new Office of the Solution-Oriented Innovation Liaison within the NSF TIP Directorate and appropriate $500 million for it over four years. Congress should view SOIL as an opportunity to create a shared service among emergent, transformative federal R&D efforts that will empower—rather than bureaucratically stifle—the science and technological advances we need most. This mission fits squarely under the NSF TIP Directorate’s mandate to “mobilize the collective power of the nation” by serving as “a crosscutting platform that collaboratively integrates with NSF’s existing directorates and fosters partnerships—with government, industry, nonprofits, civil society and communities of practice—to leverage, energize and rapidly bring to society use-inspired research and innovation.”

Once appropriated and authorized to begin intentionally growing a network of TREEs, NSF’s TIP Directorate should focus on a four-year plan for SOIL. TIP should begin by choosing an appropriate leader for SOIL, such as a former director or directorate manager of an ARPA (or other TREE). SOIL would be tasked with first engaging the management of existing ARPAs in the federal government, such as those at the Departments of Defense and Energy, to form an advisory board. The advisory board would in turn guide the creation of experience-informed operating procedures for SOIL to use to establish and aid new TREEs. These might include discussions geared toward arriving at best practices and mechanisms to operate rapid solutions-focused R&D programs for the following functions:

Beyond these structural aspects, the board must also incorporate important cultural aspects of TREEs into best practices. In my own research into the managerial heuristics that guide TREEs, I found that managers must be encouraged to “drive change” (critique the status quo, dream big, take action), “be better” (embrace difference, attract excellence, stand out from the crowd), “herd nerds” (focus the creative talent of scientists and engineers), “gather support” (forge relationships with research conductors and potential adversaries), “try and err” (take diverse approaches, expect to fail, learn from failure), and “make it matter” (direct activities to realize outcomes for society, not for science).

The board would also recommend a governance structure and implementation strategy for MyNet. In its first year, SOIL could also start to grow the Community of Cultivators, potentially starting with members of the advisory board. The board chair, in partnership with the White House Office of Science and Technology Policy, would also convene an initial series of interagency working groups (IWGs) focused on establishing a community of practice around TREEs, including but not limited to representatives from the following R&D agencies, offices, and programs: 

In years two and three, SOIL would focus on growing three to five new TREEs at organizations that have not had solutions-oriented innovation programs before but need them. 

SOIL would also start to build a pilot version of MyNet as a resource for these new TREEs, with a goal of including existing ARPAs and other TREEs as quickly as possible. In establishing MyNet, SOIL should focus on implementing the most appropriate system of data governance by first understanding the nature of the collaborative activities intended. Digital research collaborations can apply and mix a range of different governance patterns, with different amounts of availability and freedoms with respect to digital resources. MyNet should be flexible enough to meet a range of needs for openness and security. To this end, SOIL should coordinate with the recently created National Secure Data Service and apply lessons forward in creating an accessible, secure, and ethical information-sharing environment. 

Year four and beyond would be characterized by scaling up. Building on the lessons learned in the prior two years of pilot programs, SOIL would coordinate with new and legacy TREEs to refresh operating procedures and governance structures. It would then work with an even broader set of organizations to increase the number of TREEs beyond the three to five pilots and continue to build out MyNet as well as the Community of Cultivators. Periodic evaluations of SOIL’s programmatic success would shape its evolution after this point. These should be framed in terms of its capacity to create and support programs that yield meaningful technological and socioeconomic outcomes, not just produce traditional research metrics. As such, in its creation of new TREEs, SOIL should apply a major lesson of the National Academies’ evaluation of ARPA-E: explicitly align the (necessarily) robust performance management systems at the project level with strategy and evaluation systems at the program, portfolio, and agency levels. The long-term viability of SOIL and TREEs will depend on their ability to demonstrate value to the public.

Frequently Asked Questions
What is the transformative research model? What makes it different from a typical R&D model?

The transformative research model typically works like this:



  • Engage with stakeholders to understand their needs and set audacious goals for addressing them.

  • Establish lean projects run by teams of diverse experts assembled just long enough to succeed or fail in one approach.

  • Continuously evaluate projects, build on what works, kill what doesn’t, and repeat as necessary.


In a nutshell, transformative research enterprises exist solely to solve a particular problem, rather than to grow a program or amass a stock of scientific capital.


To get more specific, Bonvillian and Van Atta (2011) identify the unique factors that contribute to the innovative nature of ARPAs. On the personnel front, ARPA program managers are talented managers, experienced in business, and appointed for limited terms. They are “translators,” as opposed to subject-matter experts, who actively engage with allies, rivals, and others. They have great power to choose projects, hire, fire, and contract. On the structure front, projects are driven by specific challenges or visions—co-developed with stakeholders—designed around plausible implementation pathways. Projects are executed extramurally, and managed as portfolios, with clear metrics to assess risk and reward. Success for ARPAs means developing products and services that achieve broad uptake and cost-efficacy, so finding first adopters and creating markets is part of the work.

What kinds of TREEs could SOIL help to create?

Some examples come from other Day One proposals. SOIL could work with the Department of Labor to establish a Labor ARPA. It could work with the Department of Education on an Education ARPA. We could imagine a Justice Department ARPA with a program for criminal justice reform, one at Housing and Urban Development aimed at solving homelessness, or one at the State Department for innovations in diplomacy. And there are myriad opportunities beyond the federal government.

What kind of authority over TREEs should SOIL have? Since TREEs are designed to be nimble and independent, wouldn’t SOIL oversight inhibit their operations with an extra layer of bureaucracy?

TREEs thrive on their independence and flexibility, so SOIL’s functions must be designed to impose minimal interference. Other than ensuring that the TREEs it supports are effectively administered as transformative, mission-oriented organizations, SOIL would be very hands-off. SOIL would help establish TREEs and set them up so they do not operate as typical R&D units. SOIL would give TREE projects and staff the means to connect cross-organizationally with other projects and staff in areas of mutual interest (e.g., via MyNet, the Community of Cultivators, and periodic convenings). And, like the SBIR office at the Small Business Administration, SOIL would report annually to Congress on its operations and progress toward goals.

What is the estimated cost of SOIL and its component initiatives? How would it be funded?

An excellent model for SOIL is the Small Business Innovation Research (SBIR) program. SBIR is funded by redirecting a small percentage of the budgets of agencies that spend $100 million or more on extramural R&D. Given that SOIL is intended to be relevant to all federal mission spaces, we recommend that SOIL be funded by a small fraction (between 0.1% and 1.0%) of the budgets of all agencies with $1 billion or more in total discretionary spending. At the upper end of that range, this would yield roughly $15 billion to support SOIL in growing and connecting new TREEs in a vastly widened set of mission spaces.


The risk is the opportunity cost of this budget reallocation to each funding agency. It is worth noting, though, that changes of 0.1–1.0% are less than the amount that the average agency sees as annual perturbations in its budget. Moreover, redirecting these funds may well be worth the opportunity cost, especially as an investment in solving the compounding problems that federal agencies face. By redirecting this small fraction of funds, we can keep agency operations 99–99.9% as effective while simultaneously creating a robust, interconnected, solutions-oriented R&D system.
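To make the arithmetic concrete, the sketch below applies the proposed set-aside to hypothetical agency budgets; the agencies, dollar figures, and the 0.5% rate chosen from the proposed range are placeholders, not estimates.

```python
# Back-of-the-envelope sketch of the proposed set-aside. The agency budgets below are
# hypothetical placeholders; only agencies with >= $1B in discretionary spending contribute.
agency_budgets_billions = {"Agency A": 50.0, "Agency B": 8.0, "Agency C": 0.6}  # assumed figures
set_aside_rate = 0.005  # somewhere in the proposed 0.1%-1.0% range

eligible = {name: b for name, b in agency_budgets_billions.items() if b >= 1.0}
soil_funding = sum(b * set_aside_rate for b in eligible.values())

print(f"Funds redirected to SOIL: ${soil_funding:.2f}B")
print(f"Each contributing agency retains {1 - set_aside_rate:.1%} of its budget")
```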

Unlocking Federal Grant Data To Inform Evidence-Based Science Funding

Summary

Federal science-funding agencies spend tens of billions of dollars each year on extramural research. There is growing concern that this funding may be inefficiently awarded (e.g., by under-allocating grants to early-career researchers or to high-risk, high-reward projects). But because there is a dearth of empirical evidence on best practices for funding research, much of this concern is anecdotal or speculative at best.

The National Institutes of Health (NIH) and the National Science Foundation (NSF), as the two largest funders of basic science in the United States, should therefore develop a platform to provide researchers with structured access to historical federal data on grant review, scoring, and funding. This action would build on momentum from both the legislative and executive branches surrounding evidence-based policymaking, as well as on ample support from the research community. And though grantmaking data are often sensitive, there are numerous successful models from other sectors for sharing sensitive data responsibly. Applying these models to grantmaking data would strengthen the incorporation of evidence into grantmaking policy while also guiding future research (such as larger-scale randomized controlled trials) on efficient science funding.

Challenge and Opportunity

The NIH and NSF together disburse tens of billions of dollars each year in the form of competitive research grants. At a high level, the funding process typically works like this: researchers submit detailed proposals for scientific studies, often to particular program areas or topics that have designated funding. Then, expert panels assembled by the funding agency read and score the proposals. These scores are used to decide which proposals will or will not receive funding. (The FAQ provides more details on how the NIH and NSF review competitive research grants.) 

A growing number of scholars have advocated for reforming this process to address perceived inefficiencies and biases. Citing evidence that the NIH has become increasingly incremental in its funding decisions, for instance, commentators have called on federal funding agencies to explicitly fund riskier science. These calls grew louder following the success of mRNA vaccines against COVID-19, a technology that struggled for years to receive federal funding due to its high-risk profile.

Others are concerned that the average NIH grant-winner has become too old, especially in light of research suggesting that some scientists do their best work before turning 40. Still others lament the “crippling demands” that grant applications exert on scientists’ time, and argue that a better approach could be to replace or supplement conventional peer-review evaluations with lottery-based mechanisms.

These hypotheses are all reasonable and thought-provoking. Yet there exists surprisingly little empirical evidence to support these theories. If we want to effectively reimagine—or even just tweak—the way the United States funds science, we need better data on how well various funding policies work.

Academics and policymakers interested in the science of science have rightly called for increased experimentation with grantmaking policies in order to build this evidence base. But, realistically, such experiments would likely need to be conducted hand-in-hand with the institutions that fund and support science, investigating how changes in policies and practices shape outcomes. While there is progress in such experimentation becoming a reality, the knowledge gap about how best to support science would ideally be filled sooner rather than later.

Fortunately, we need not wait that long for new insights. The NIH and NSF have a powerful resource at their disposal: decades of historical data on grant proposals, scores, funding status, and eventual research outcomes. These data hold immense value for those investigating the comparative benefits of various science-funding strategies. Indeed, these data have already supported excellent and policy-relevant research. Examples include Ginther et al. (2011), which studies how race and ethnicity affect the probability of receiving an NIH award, and Myers (2020), which studies whether scientists are willing to change the direction of their research in response to increased resources. And there is potential for more. While randomized controlled trials (RCTs) remain the gold standard for causal inference, economists have for decades been developing methods for drawing causal conclusions from observational data. Applying these methods to federal grantmaking data could quickly and cheaply yield evidence-based recommendations for optimizing federal science funding.
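As one illustration of what such observational analysis could look like, the sketch below compares outcomes for proposals just above and just below a funding cutoff. The file name, column names, sharp-cutoff assumption, and outcome measure are all hypothetical; they do not describe actual NIH or NSF award rules or data.

```python
# Illustrative regression-discontinuity-style comparison around a hypothetical funding cutoff.
# Column names, the sharp-cutoff assumption, and the outcome measure are assumptions for
# illustration only; they do not describe actual NIH/NSF award rules or data.
import pandas as pd

# df would come from the proposed secure repository: one row per proposal, with its
# review score, whether it was funded, and a later research outcome (e.g., publications).
df = pd.read_csv("grant_records.csv")  # hypothetical file

cutoff = df.loc[df["funded"] == 1, "review_score"].min()  # approximate payline under a sharp cutoff
bandwidth = 2.0                                           # assumed window around the cutoff

near_cutoff = df[(df["review_score"] - cutoff).abs() <= bandwidth]
funded_mean = near_cutoff.loc[near_cutoff["funded"] == 1, "pubs_5yr"].mean()
unfunded_mean = near_cutoff.loc[near_cutoff["funded"] == 0, "pubs_5yr"].mean()

# Proposals just above and just below the cutoff should be similar in quality,
# so this difference approximates the causal effect of receiving funding.
print(f"Estimated effect of funding near the cutoff: {funded_mean - unfunded_mean:.2f} publications")
```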

Opening up federal grantmaking data by providing a structured and streamlined access protocol would increase the supply of valuable studies such as those cited above. It would also build on growing governmental interest in evidence-based policymaking. Since its first week in office, the Biden-Harris administration has emphasized the importance of ensuring that “policy and program decisions are informed by the best-available facts, data and research-backed information.” Landmark guidance issued in August 2022 by the White House Office of Science and Technology Policy directs agencies to ensure that federally funded research—and underlying research data—are freely available to the public (i.e., not paywalled) at the time of publication.

On the legislative side, the 2018 Foundations for Evidence-based Policymaking Act (popularly known as the Evidence Act) calls on federal agencies to develop a “systematic plan for identifying and addressing policy questions” relevant to their missions. The Evidence Act specifies that the general public and researchers should be included in developing these plans. The Evidence Act also calls on agencies to “engage the public in using public data assets [and] providing the public with the opportunity to request specific data assets to be prioritized for disclosure.” The recently proposed Secure Research Data Network Act calls for building exactly the type of infrastructure that would be necessary to share federal grantmaking data in a secure and structured way.

Plan of Action

There is clearly appetite to expand access to and use of federally held evidence assets. Below, we recommend four actions for unlocking the insights contained in NIH- and NSF-held grantmaking data—and applying those insights to improve how federal agencies fund science.

Recommendation 1. Review legal and regulatory frameworks applicable to federally held grantmaking data.

The White House Office of Management and Budget (OMB)’s Evidence Team, working with the NIH’s Office of Data Science Strategy and the NSF’s Evaluation and Assessment Capability, should review existing statutory and regulatory frameworks to see whether there are any legal obstacles to sharing federal grantmaking data. If the review team finds that the NIH and NSF face significant legal constraints when it comes to sharing these data, then the White House should work with Congress to amend prevailing law. Otherwise, OMB—in a possible joint capacity with the White House Office of Science and Technology Policy (OSTP)—should issue a memo clarifying that agencies are generally permitted to share federal grantmaking data in a secure, structured way, and stating any categorical exceptions.

Recommendation 2. Build the infrastructure to provide external stakeholders with secure, structured access to federally held grantmaking data for research. 

Federal grantmaking data are inherently sensitive, containing information that could jeopardize personal privacy or compromise the integrity of review processes. But even sensitive data can be responsibly shared. The NIH has previously shared historical grantmaking data with some researchers, but the next step is for the NIH and NSF to develop a system that enables broader and easier researcher access. Other federal agencies have developed strategies for handling highly sensitive data in a systematic fashion, which can provide helpful precedent and lessons. Examples include:

  1. The U.S. Census Bureau (USCB)’s Longitudinal Employer-Household Dynamics (LEHD) data. These data link individual workers to their respective firms and provide information on salary, job characteristics, and worker and firm location. Approved researchers have relied on these data to better understand labor-market trends.
  2. The Department of Transportation (DOT)’s Secure Data Commons. The Secure Data Commons allows third-party firms (such as Uber, Lyft, and Waze) to provide individual-level mobility data on trips taken. Approved researchers have used these data to understand mobility patterns in cities.

In both cases, the data in question are available to external researchers contingent on agency approval of a research request that clearly explains the purpose of a proposed study, why the requested data are needed, and how those data will be managed. Federal agencies managing access to sensitive data have also implemented additional security and privacy-preserving measures, such as:

Building on these precedents, the NIH and NSF should (ideally jointly) develop secure repositories to house grantmaking data. This action aligns closely with recommendations from the U.S. Commission on Evidence-Based Policymaking, as well as with the above-referenced Secure Research Data Network Act (SRDNA). Both the Commission recommendations and the SRDNA advocate for secure ways to share data between agencies. Creating one or more repositories for federal grantmaking data would be an action that is simultaneously narrower and broader in scope (narrower in terms of the types of data included, broader in terms of the parties eligible for access). As such, this action could be considered either a precursor to or an expansion of the SRDNA, and could be logically pursued alongside SRDNA passage.

Once a secure repository is created, the NIH and NSF should (again, ideally jointly) develop protocols for researchers seeking access. These protocols should clearly specify who is eligible to submit a data-access request, the types of requests that are likely to be granted, and technical capabilities that the requester will need in order to access and use the data. Data requests should be evaluated by a small committee at the NIH and/or NSF (depending on the precise data being requested). In reviewing the requests, the committee should consider questions such as:

  1. How important and policy-relevant is the question that the researcher is seeking to answer? If policymakers knew the answer, what would they do with that information? Would it inform policy in a meaningful way? 
  2. How well can the researcher answer the question using the data they are requesting? Can they establish a clear causal relationship? Would we be comfortable relying on their conclusions to inform policy?

Finally, NIH and NSF should consider including right-to-review clauses in agreements governing sharing of grantmaking data. Such clauses are typical when using personally identifiable data, as they give the data provider (here, the NIH and NSF) the chance to ensure that all data presented in the final research product has been properly aggregated and no individuals are identifiable. The Census Bureau’s Disclosure Review Board can provide some helpful guidance for NIH and NSF to follow on this front.
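Part of such a review could be supported by simple automated checks. The sketch below illustrates one common form of disclosure avoidance, suppressing any reported statistic based on too few underlying individuals; the threshold, column names, and table structure are assumptions for illustration, not Census Bureau or agency policy.

```python
# Minimal sketch of a disclosure-avoidance check: suppress any aggregate statistic
# computed from fewer than a minimum number of individuals. The threshold of 10 and
# the table structure are illustrative assumptions, not agency policy.
import pandas as pd

MIN_CELL_SIZE = 10  # assumed suppression threshold

def suppress_small_cells(results: pd.DataFrame, count_col: str = "n",
                         stat_cols: tuple = ("mean_award_usd",)) -> pd.DataFrame:
    """Blank out statistics in rows whose underlying sample is too small to release."""
    cleared = results.copy()
    too_small = cleared[count_col] < MIN_CELL_SIZE
    cleared.loc[too_small, list(stat_cols)] = float("nan")
    return cleared

# Example: mean award size by applicant subgroup, checked before release to a researcher.
summary = pd.DataFrame({
    "subgroup": ["early career", "mid career", "senior"],   # hypothetical grouping
    "mean_award_usd": [412_000.0, 455_000.0, 530_000.0],    # hypothetical statistics
    "n": [8, 154, 212],                                     # underlying sample sizes
})
print(suppress_small_cells(summary))
```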

Recommendation 3. Encourage researchers to utilize these newly available data, and draw on the resulting research to inform possible improvements to grant funding.

The NIH and NSF frequently face questions and trade-offs when deciding if and how to change existing grantmaking processes. Examples include:

Typically, these agencies have very little academic or empirical evidence to draw on for answers. A large part of the problem has been researchers’ lack of access to the data needed to conduct relevant studies. Expanding access, per Recommendations 1 and 2 above, is necessary but not sufficient. Agencies must also invest in attracting researchers to use the data in socially useful ways.

Broadly advertising the new data will be critical. Announcing a new request for proposals (RFP) through the NIH and/or the NSF for projects explicitly using the data could also help. These RFPs could guide researchers toward the highest-impact and most policy-relevant questions, such as those above. The NSF’s “Science of Science: Discovery, Communication and Impact” program would be a natural fit to take the lead on encouraging researchers to use these data.

The goal is to create funding opportunities and programs that give academics clarity on the key issues and questions on which federal grantmaking agencies need guidance; in turn, the evidence academics build should help inform grantmaking policy.

Conclusion

Basic science is a critical input into innovation, which in turn fuels economic growth, health, prosperity, and national security. The NIH and NSF were founded with these critical missions in mind. To fully realize their missions, the NIH and NSF must understand how to maximize scientific return on federal research spending. And to help, researchers need to be able to analyze federal grantmaking data. Thoughtfully expanding access to this key evidence resource is a straightforward, low-cost way to grow the efficiency—and hence impact—of our federally backed national scientific enterprise.

Frequently Asked Questions
How does the NIH currently select research proposals for funding?

For an excellent discussion of this question, see Li (2017). Briefly, the NIH is organized around 27 Institutes and Centers (ICs), which typically correspond to disease areas or body systems. Each IC has an annual budget set by Congress. Research proposals are first evaluated by roughly 180 different “study sections,” which are committees organized by scientific area or method. After being evaluated by the study sections, proposals are returned to their respective ICs. The highest-scoring proposals in each IC are funded, up to budget limits.
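In stylized form, that final selection step works roughly as sketched below. The scores, costs, and budgets are made up, a lower score is treated as better, and real NIH funding decisions also involve percentiling, advisory council review, and IC-specific policies.

```python
# Stylized sketch of score-based selection up to an IC budget limit.
# Scores, costs, and budgets are made-up; real NIH funding also involves
# percentiling, advisory council review, and IC-specific policies.
proposals = [
    # (IC, proposal id, priority score [lower = better], requested cost in $M)
    ("NCI",   "P1", 21, 2.0),
    ("NCI",   "P2", 34, 1.5),
    ("NCI",   "P3", 28, 3.0),
    ("NIAID", "P4", 25, 1.0),
    ("NIAID", "P5", 40, 2.5),
]
ic_budgets = {"NCI": 4.0, "NIAID": 1.5}  # assumed budgets in $M

funded = []
for ic, budget in ic_budgets.items():
    ranked = sorted((p for p in proposals if p[0] == ic), key=lambda p: p[2])
    spent = 0.0
    for _, pid, score, cost in ranked:
        if spent + cost > budget:   # stop once the budget is exhausted
            break
        funded.append(pid)
        spent += cost

print("Funded:", funded)
```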

How does the NSF currently select research proposals for funding?

Research proposals are typically submitted in response to announced funding opportunities, which are organized around different programs (topics). Each proposal is sent by the Program Officer to at least three independent reviewers who do not work at the NSF. These reviewers judge the proposal on its Intellectual Merit and Broader Impacts. The Program Officer then uses the independent reviews to make a funding recommendation to the Division Director, who makes the final award/decline decision. More details can be found on the NSF’s webpage.

What data on grant funding at the NIH and NSF is currently (publicly) available?

The NIH and NSF both provide data on approved proposals. These data can be found on the RePORTER site for the NIH and award search site for the NSF. However, these data do not provide any information on the rejected applications, nor do they provide information on the underlying scores of approved proposals.

Environmental Data in the Inflation Reduction Act

“It is a capital mistake,” Sherlock Holmes once observed, “to theorize before one has data.” In the Inflation Reduction Act, fortunately, Congress avoided making that capital mistake a Capitol one.

Tax credits and other incentives for clean energy, clean manufacturing, and clean transportation dominate the IRA’s environmental spending. But the bill also makes key investments in environmental data. This is important because data directly informs how efficiently dollars are spent. (You could have avoided wasting money on that extra jug of olive oil if you’d just had better data at hand on the contents of your pantry.)

The IRA’s environmental-data investments can be broken down into three categories: investments in specific datasets, investments in specific data infrastructure, and general support for data-related activities. Let’s take a closer look at each of these and why they matter.

Investments in specific datasets

The IRA appropriates $850 million (over six years) for the Environmental Protection Agency (EPA) to create incentives for methane mitigation and monitoring. The IRA directs EPA to use some of the funds to “prepare inventories, gather empirical data, and track emissions” related to the incentive program. This information will allow EPA (and third parties) to evaluate the program’s success, which could be very powerful indeed. Because methane is such a potent and short-lived greenhouse gas (with a 20-year global warming potential that is more than 70 times greater than that of carbon dioxide), scientists agree that cutting methane emissions quickly is one of the best opportunities for reducing near-term global warming. Understanding whether and which incentives spur significant methane mitigation would therefore help policymakers decide if and where to double down on mitigation incentives moving forward.
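For a rough sense of scale, the sketch below converts a hypothetical methane reduction into 20-year CO2-equivalent terms, using 70 as a conservative multiplier consistent with the “more than 70 times” figure above; the tonnage is a made-up example.

```python
# Rough conversion of avoided methane to 20-year CO2-equivalent terms.
# The tonnage is a made-up example; 70 is a conservative 20-year GWP multiplier,
# consistent with the "more than 70 times" figure cited above.
methane_avoided_tonnes = 100_000   # hypothetical annual reduction
gwp_20yr = 70                      # conservative 20-year global warming potential

co2e_tonnes = methane_avoided_tonnes * gwp_20yr
print(f"{methane_avoided_tonnes:,} t CH4 avoided is roughly {co2e_tonnes:,} t CO2e over 20 years")
```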

The IRA appropriates $1.3 billion (over nine years) for the U.S. Department of Agriculture’s Natural Resources Conservation Service (NRCS) to provide conservation technical assistance to farmers and ranchers—and to quantify the climate benefits. NRCS was established in 1935 to help farmers and ranchers conserve land, soil, water, and other key agricultural resources. The IRA boosts NRCS’s funding by an additional $1 billion over nine years. But it also kicks in an additional $300 million for NRCS to collect and use field-based data to quantify how much carbon NRCS-supported efforts sequester and how much they cut greenhouse-gas emissions. Insights could boost national support for practices like regenerative agriculture, incorporation of ecosystem services into agricultural cost-benefit analyses, and good soil stewardship.

The IRA appropriates $42.5 million (over six years) for the Department of Housing and Urban Development (HUD) to conduct energy and water benchmarking studies. Utility benchmarking helps property managers understand how efficient a given building is relative to other, similar buildings. Benchmarking results guide investments into upgrades. For instance, a property manager with $100,000 to spend may wisely decide to spend that money on “low-hanging fruit” fixes (such as replacing old lightbulbs, or installing weatherstripping around doors and windows) at their least-efficient properties instead of investing in upgrades at more-efficient properties that will yield only marginal portfolio improvements. The IRA funds collection of data to expand utility benchmarking across HUD-supported housing.
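Mechanically, benchmarking amounts to ranking a building’s energy-use intensity against a peer group, as in the sketch below; the peer values, the target building’s figure, and the use of site EUI are illustrative assumptions.

```python
# Minimal sketch of utility benchmarking: rank a building's energy-use intensity (EUI)
# against similar buildings. The peer EUIs and the target value are made-up numbers.
peer_euis = [38.2, 45.0, 51.7, 60.3, 72.9, 88.1]   # kBtu per square foot per year, assumed
building_eui = 66.4                                 # assumed EUI of the building being benchmarked

share_better = sum(e < building_eui for e in peer_euis) / len(peer_euis)
print(f"This building uses more energy per square foot than {share_better:.0%} of its peers,")
print("suggesting it is a strong candidate for low-cost efficiency upgrades.")
```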

The IRA appropriates $32.5 million (over four years) to the White House Council on Environmental Quality (CEQ) to collect data on which communities are disproportionately harmed by negative environmental impacts, and to develop related decision-support tools. This component of the IRA directly supports the Biden administration’s Justice40 Initiative. Justice40 establishes a national goal of ensuring that so-called “environmental justice communities” realize at least 40% of the benefits of certain federal investments. But as an executive-led initiative, Justice40 can only direct existing federal funds—it can’t bring in additional money. While advocates have argued that the IRA does not go far enough in bolstering environmental justice, designating new funding for the White House to realize Justice40 objectives is undoubtedly a step in the right direction.

The IRA appropriates $25.5 million for the U.S. Geological Survey to “produce, collect, disseminate, and use 3D elevation data.” There’s no other way to say it: 3D elevation data are cool. These data, collected by aircraft-mounted sensors, can be stitched together to produce models of our world underneath surface features like trees and buildings. These models support everything from landslide prediction (see box) to flood-risk assessment. IRA funds USGS in continuing to fill gaps in the 3D elevation data available for the United States.

Example of a model constructed using 3D elevation data

Clouds of data points (left) can be stitched into 3D elevation models (right) that, for instance, reveal past landslides and steep slopes at risk of failure. These features could be impossible to identify through aerial images that also capture surface features. (Source: USGS).
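Conceptually, producing a bare-earth elevation model from such point clouds amounts to gridding the returns and keeping the lowest one in each cell, as in the simplified sketch below; real lidar processing uses far more sophisticated ground-point classification, and the synthetic points here are purely illustrative.

```python
# Simplified sketch of turning a cloud of (x, y, z) lidar returns into a gridded
# elevation model by keeping the lowest return in each cell. Real bare-earth DEM
# production uses far more sophisticated ground classification; this is conceptual only.
import numpy as np

rng = np.random.default_rng(0)
points = rng.uniform([0, 0, 100], [1000, 1000, 150], size=(10_000, 3))  # synthetic x, y, z in meters

cell_size = 50.0
nx = ny = int(1000 / cell_size)
dem = np.full((ny, nx), np.nan)

cols = (points[:, 0] // cell_size).astype(int).clip(0, nx - 1)
rows = (points[:, 1] // cell_size).astype(int).clip(0, ny - 1)
for r, c, z in zip(rows, cols, points[:, 2]):
    if np.isnan(dem[r, c]) or z < dem[r, c]:
        dem[r, c] = z   # keep the lowest return as a crude "ground" estimate

print("Gridded elevation model shape:", dem.shape)
```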

The IRA appropriates $5 million (over four years) for EPA to collect and analyze lifecycle fuels data. The diversifying U.S. energy system is triggering heated debates over the pros and cons of different fuels. Hydrogen-powered cars produce zero emissions at the tailpipe, yes. But given the carbon and energy footprints of generating fuel-grade hydrogen on the front end, are hydrogen cars really cleaner than their gas/electric hybrid counterparts? Biofuels are all renewable by definition, but certainly not all created equal. The IRA enables the EPA to empirically contribute to these debates.

Investments in specific data infrastructure

The IRA appropriates $190 million (over four years) for the National Oceanic and Atmospheric Administration (NOAA) to invest in high-performance computing and data management. This funding responds to concerns raised by NOAA’s Science Advisory Board that NOAA lacks the technical capacity to continue to advance U.S. weather research. The Board argued that this need is especially acute with regard to understanding and predicting high-impact weather amid rapidly changing climate, population, and development trends.

The IRA appropriates $18 million (over nine years) for EPA to update its Integrated Compliance Information System (ICIS). ICIS is EPA’s principal compliance and enforcement data system, including for regulatory pillars such as the Clean Air Act and Clean Water Act. While an outdated data-management system is hardly the primary reason why violations of U.S. environmental laws are rampant (a near 30% erosion of funding for EPA’s compliance office over the past decade is a bigger problem), it certainly doesn’t help. The IRA will enhance EPA’s efforts to operationalize an existing plan for modernizing the ICIS.

The IRA directs the Secretary of Energy to “develop and publish guidelines for States relating to residential electric and natural gas energy data sharing.” While not an investment per se, this brief provision nevertheless merits mention. The IRA channels funds through the Department of Energy (DOE) to state energy offices for new rebate programs that reward homeowners making energy-efficiency house retrofits. The IRA directs the Secretary of Energy to establish guidelines for sharing data related to these programs. Proactively developing such guidelines will be useful both for facilitating productive data exchange (e.g., among those trying to understand how widespread efficiency upgrades affect energy demand) as well as for forestalling adverse effects (e.g., cyberattacks from bad actors exploiting grid vulnerabilities). 

General support for data-related activities

In addition to the specific investments outlined above, the IRA appropriates (over the next nine years) $150 million, $115 million, $100 million, and $40 million, respectively, to the Department of the Interior, the Department of Energy, the Federal Energy Regulatory Commission, and the Environmental Protection Agency for activities including “the development of environmental data or information systems.”

This broad language gives agencies latitude to allocate resources as needs arise. It also underscores the fact that multiple agencies have pressing environmental-data and -technology needs, many of which overlap. The federal government should therefore consider creating a centralized entity—a Digital Service for the Planet—“with the expertise and mission to coordinate environmental data and technology across agencies.”

The hundreds of millions of dollars that the IRA invests in environmental-data collection and analysis will serve as critical scaffolding to efficiently guide federal spending on environmental initiatives in the coming years—spending that is poised to massively increase in years to come due to the IRA as well as other key recent and pending legislative packages (including the Infrastructure Investment and Jobs Act, the CHIPS and Science Act if authorized funds are appropriated, and the Recovering America’s Wildlife Act that has a strong chance of passing this Congress). The foundation for data-driven change has been laid. The game is officially afoot.

Creating a Digital Service for the Planet

Summary

Challenge and Opportunity

The Biden administration—through directives such as Executive Order 14008 on Tackling the Climate Crisis at Home and Abroad and President Biden’s Memorandum on Restoring Trust in Government Through Scientific Integrity and Evidence-Based Policymaking, as well as through initiatives such as Justice40 and America the Beautiful (30×30)—has laid the blueprint for a data-driven environmental agenda.

However, the data to advance this agenda are held and managed by multiple agencies, making them difficult to standardize, share, and use to their full potential. For example, water data are collected by 25 federal entities across 57 data platforms and 462 different data types. Permitting for wetlands, forest fuel treatments, and other important natural-resource management tasks still involves a significant amount of manual data entry, and protocols for handling relevant data vary by region or district. Staff at environmental agencies have privately noted that it can take weeks or months to receive necessary data from colleagues in other agencies, and that they have trouble knowing what data exist at other agencies. Accelerating the success and breadth of environmental initiatives requires digitized, timely, and accessible information for planning and execution of agency strategies.

The state of federal environmental data today echoes the state of public-health data in 2014, when President Obama recognized that the Department of Health and Human Services lacked the technical skill sets and capacity needed to stand up Healthcare.gov. The Obama administration responded by creating the U.S. Digital Service (USDS), which provides federal agencies with on-demand access to the technical expertise they need to design, procure, and deploy technology for the public good. Over the past eight years, USDS has developed a scalable and replicable model of working across government agencies. Projects that USDS has been involved in—like improving federal procurement and hiring processes, deploying healthcare.gov, and modernizing administrative tasks for veterans and immigrants—have saved agencies such as the Department of Veterans Affairs millions of dollars.

But USDS lacks the specialized capacity, skills, experience, and specific directives needed to fully meet the shared digital-infrastructure needs of environmental agencies. The Climate and Economic Justice Screening Tool (CEJST) is an example of how crucial digital-service capacity is for tackling the nation’s environmental priorities, and of the need for a Digital Service for the Planet (DSP). While USDS was instrumental in getting the tool off the ground, several issues with the launch point to a lack of specialized environmental capabilities and expertise within USDS. Many known environmental-justice issues—including wildfire, drought, and flooding—were not reflected in the tool’s first iteration. In addition, the CEJST was slated for publication in July 2021, but the beta version was not released until February 2022. A DSP familiar with environmental data would have started with a stronger foundation, better anticipating and incorporating such key environmental concerns, and might have been able to deliver the tool on a tighter timeline.

There is hope in this challenge. The fact that many environmental programs across multiple federal agencies have overlapping data and technology needs means that a centralized and dedicated team focused on addressing these needs could significantly and cost-effectively advance the capacities of environmental agencies to:

Plan of Action

To best position federal agencies to meet environmental goals, the Biden administration should establish a “Digital Service for the Planet (DSP).” The DSP would build off the successes of USDS to provide support across three key areas for environmental agencies:

  1. Strategic planning and procurement. Scoping, designing, and procuring technology solutions for programmatic goals. For example, a DSP could help the Fish and Wildlife Service (FWS) accelerate updates to the National Wetlands Inventory, which are currently estimated to take 10 years and cost $20 million.
  2. Technical development. Implementing targeted technical-development activities to achieve mission-related goals in collaboration with agency staff. For example, a DSP could help improve the accessibility and utility of heavily used government tools, such as the Army Corps of Engineers' Regulatory In-lieu Fee and Bank Information Tracking System (RIBITS), which tracks mitigation banks.
  3. Cross-agency coordination on digital infrastructure. Facilitating data inventory and sharing, and development of the databases, tools, and technological processes that make cross-agency efforts possible. A DSP could be a helpful partner for facilitating information sharing among agencies that monitor interrelated events, environments, or problems, including droughts, wildfires, and algal blooms. 

The DSP could be established either as a new branch of USDS, or as a new and separate but parallel entity housed within the White House Office of Management and Budget. The former option would enable DSP to leverage the accumulated knowledge and existing structures of USDS. The latter option would enable DSP to be established with a more focused mandate, and would also provide a clear entry point for federal agencies seeking data and technology support specific to environmental issues.

Regardless of the organizational structure selected, DSP should include the essential elements that have helped USDS succeed—per the following recommendations.

Recommendation 1. The DSP should emulate the USDS’s staffing model and position within the Executive Office of the President (EOP).

The USDS hires employees on short-term contracts, with each contract term lasting between six months and four years. This contract-based model enables USDS to attract high-level technologists, product designers, and programmers who are interested in public service, but not necessarily committed to careers in government. USDS’s staffing model also ensures that the Service does not take over core agency capacities, but rather is deployed to design and procure tech solutions that agencies will ultimately operate in-house (i.e., without USDS involvement). USDS’s position within the EOP makes USDS an attractive place for top-level talent to work, gives staff access to high-level government officials, and enables the Service to work flexibly across agencies.

Recommendation 2. Staff the DSP with specialists who have prior experience working on environmental projects.

Working on data and technology issues within environmental contexts requires specialized skill sets and experience. For example, geospatial data and analysis are fundamental to environmental protection and conservation, but this has not been a focal point of USDS hiring. In addition, DSP staff fluent in the vast and specific terminologies used in environmental fields (such as water management) will be better able to communicate with the many subject-matter experts and data stewards working in environmental agencies.

Recommendation 3. Place interagency collaboration at the core of the DSP mission.

Most USDS projects focus on a single federal agency, but environmental initiatives—and the data and tech needs they present—almost always involve multiple agencies. Major national challenges, including flood-risk management, harmful algal blooms, and environmental justice, all demand an integrated approach to realize cross-agency benefits. For example, EPA-funded green stormwater infrastructure could reduce flood risk for housing units subsidized by the Department of Housing and Urban Development. DSP should be explicitly tasked with devising approaches for tackling complex data and technology issues that cut across agencies. Fulfilling this mandate may require DSP to bring on additional expertise in core competencies such as data sharing and integration.

Recommendation 4. Actively promote the DSP to relevant federal agencies.

Despite USDS’s eight-year existence, many staff members at agencies involved in environmental initiatives know little about the Service and what it can do for them. To avoid underutilization due to lack of awareness, the DSP’s launch should include an outreach campaign targeted at key agencies, including but not limited to the U.S. Army Corps of Engineers (USACE), the Department of Energy (DOE), the Department of the Interior (DOI), the Environmental Protection Agency (EPA), the National Aeronautics and Space Administration (NASA), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Department of Agriculture, and the U.S. Global Change Research Program (USGCRP).

Conclusion

A new Digital Service for the Planet could accelerate progress on environmental and natural-resource challenges through better use of data and technology. USDS has shown that a relatively small and flexible team can have a profound and lasting effect on how agencies operate, save taxpayer money, and encourage new ways of thinking about longstanding problems. However, current capacity at USDS is limited and not specifically tailored to the needs of environmental agencies. On issues ranging from water management to environmental justice, ensuring better use of technology and data will yield benefits for generations to come. Establishing a DSP is an important step toward making the federal government a better buyer, partner, and consumer of the data, technology, and innovations necessary to support the country's conservation, water, and stewardship priorities.

Frequently Asked Questions
How would the DSP differ from the U.S. Digital Service?

The DSP would build on the successful USDS model, but would have two distinguishing characteristics. First, the DSP would employ staff experienced in using or managing environmental data and possessing special expertise in geospatial technologies, remote sensing, and other environmentally relevant tech capabilities. Second, DSP would have an explicit mandate to develop processes for tackling data and technology issues that frequently cut across agencies. For example, the Internet of Water found that at least 25 different federal entities collect water data, while the USGCRP has identified at least 217 examples of earth observation efforts spanning many agencies. USDS is not designed to work with so many agencies at once on a single project—but DSP would be.

Would establishing the DSP prohibit agencies from independently improving their data and tech practices? 

Not in most cases. The DSP would focus on meeting data and technology needs shared by multiple agencies. Agencies would still be free—and encouraged!—to pursue agency-specific data- and tech-improvement projects independently.


Indeed, one hope is that by showcasing the value of digital services for environmental projects on a cross-agency basis, the DSP would inspire individual agencies to establish their own digital-service teams. Precedent for this evolution exists: the USDS provided initial resources to solve digital challenges for Healthcare.gov and the Department of Veterans Affairs, and the Department of Veterans Affairs and the Department of Defense have since stood up their own internal digital-service teams. However, even with agency-based digital-service teams, there will always be a need for a team with a cross-agency view, especially given that so many environmental problems and solutions extend well beyond the borders of a single agency. Digital-service teams at multiple levels can be complementary, focusing on different project scopes and groups of users. For example, agency-specific digital-service teams would be much better positioned to sustain agency-specific components of an effort established by the DSP.

How much would this proposal cost?

We propose the DSP start with a mid-sized team of twenty to thirty full-time equivalent employees (FTEs) and a budget of around $8 million. These personnel and financial allocations are in line with those of USDS. DSP could be scaled up over time if needed, just as USDS grew from approximately 12 FTEs in fiscal year (FY) 2014 to over 200 FTEs in FY 2022. The long-term target size of the DSP team should be informed by the uptake and success of DSP-led work.

Why would agencies want a DSP? Why would they see it as beneficial?

From our conversations with agency staff, we (the authors) have heard time and again that agencies see immense value in a DSP, and find that two scenarios often inhibit improved adoption of environmental data and technology. The first scenario is that environmental-agency staff see the value in pursuing a technology solution to make their program more effective, but do not have the authority or resources to implement the idea, or are not aware of the avenues available to do so. DSP can help agency staff design and implement modern solutions to realize their vision and coordinate with important stakeholders to facilitate the process.


The second scenario is that environmental-agency staff are trained experts in environmental science, but not in evaluating technology solutions. As such, they are poorly equipped to evaluate the integrity of proposed solutions from external vendors. If they end up trialing a solution that is a poor fit, they may become risk-averse to technology at large. In this scenario, there is tremendous value in having a dedicated team of experts within the government available to help agencies source the appropriate technology or technologies for their programmatic goals.

Establishing the AYA Research Institute: Increasing Data Capacity and Community Engagement for Environmental-Justice Tools

Summary

Environmental justice (EJ) is a priority issue for the Biden Administration, yet the federal government lacks the capacity to collect and maintain the data needed to adequately identify and respond to EJ issues. EJ tools meant to resolve EJ issues — especially the Environmental Protection Agency (EPA)'s EJSCREEN tool — are gaining national recognition. But knowledge gaps and a dearth of EJ-trained scientists are preventing EJSCREEN from reaching its full potential. To address these issues, the Administration should allocate a portion of the EPA's Justice40 funding to create the "AYA Research Institute", a think tank under EPA's jurisdiction. Derived from the Adinkra symbol, AYA means "resourcefulness and defiance against oppression." The AYA Research Institute will functionally address EJSCREEN's limitations and increase federal capacity to identify and effectively resolve existing and future EJ issues.

Challenge and Opportunity

Approximately 200,000 people in the United States die every year of pollution-related causes. These deaths are concentrated in underresourced, vulnerable, and/or minority communities. The EPA created the Office of Environmental Justice (OEJ) in 1992 to address systematic disparities in environmental outcomes among different communities. The primary tool that OEJ relies on to consider and address EJ concerns is EJSCREEN. EJSCREEN integrates a variety of environmental and demographic data into a layered map that identifies communities disproportionately impacted by environmental harms. This tool is available for public use and is the primary screening mechanism for many initiatives at state and local levels. Unfortunately, EJSCREEN has three major limitations:

  1. Missing indicators. EJSCREEN omits crucial environmental indicators such as drinking-water quality and indoor air quality. OEJ states that these crucial indicators are not included due to a lack of resources available to collect underlying data at the appropriate quality, spatial range, and resolution. 
  2. Small areas are less accurate. There is considerable uncertainty in EJSCREEN environmental and demographic estimates at the census block group (CBG) level. This is because (i) EJSCREEN’s assessments of environmental indicators can rely on data collected at scales less granular than CBG, and (ii) some of EJSCREEN’s demographic estimates are derived from surveys (as opposed to census data) and are therefore less consistent.
  3. Deficiencies in a single dataset can propagate across EJSCREEN analyses. Environmental indicators and health outcomes are inherently interconnected. This means that subpar data on certain indicators — such as emissions levels, ambient pollutant levels in air, individual exposure, and pollutant toxicity — can compromise the reliability of EJSCREEN results on multiple fronts. 

These limitations must be addressed to unlock the full potential of EJSCREEN as a tool for informing research and policy. More robust, accurate, and comprehensive environmental and demographic data are needed to power EJSCREEN. Community-driven initiatives are a powerful but underutilized way to source such data. Yet limited time, funding, rapport, and knowledge tend to discourage scientists from engaging in community-based research collaborations. In addition, effectively operationalizing data-based EJ initiatives at a national scale requires the involvement of specialists trained at the intersection of EJ and science, technology, engineering, and math (STEM). Unfortunately, relatively poor compensation discourages scientists from pursuing EJ work — and scientists who work on other topics but have interest in EJ can rarely commit the time needed to sustain long-term collaborations with EJ organizations. It is time to augment the federal government’s past and existing EJ work with redoubled investment in community-based data and training.

Plan of Action

EPA should dedicate $20 million of its Justice40 funding to establish the AYA Research Institute: an in-house think tank designed to functionally address EJSCREEN’s limitations as well as increase federal capacity to identify and effectively resolve existing and future EJ issues. The word AYA is the formal name for the Adinkra symbol meaning “resourcefulness and defiance against oppression” — concepts that define the fight for environmental justice.

The Research Institute will comprise three arms. The first arm will increase federal EJ data capacity through an expert advisory group tasked with providing and updating recommendations to inform federal collection and use of EJ data. The advisory group will focus specifically on (i) reviewing and recommending updates to environmental and demographic indicators included in EJSCREEN, and (ii) identifying opportunities for community-based initiatives that could help close key gaps in the data upon which EJSCREEN relies.

The second arm will help grow the pipeline of EJ-focused scientists through a three-year fellowship program supporting doctoral students in applied research projects that exclusively address EJ issues in U.S. municipalities and counties identified as frontline communities. The program will be three years long so that participants are able to conduct much-needed longitudinal studies that are rare in the EJ space. To be eligible, doctoral students will need to (i) demonstrate how their projects will help strengthen EJSCREEN and/or leverage EJSCREEN insights, and (ii) present a clear plan for interacting with and considering recommendations from local EJ grassroots organization(s). Selected students will be matched with grassroots EJ organizations distributed across five U.S. geographic regions (Northeast, Southeast, Midwest, Southwest, and West) for mentorship and implementation support. The fellowship will support participants in achieving their academic goals while also providing them with experience working with community-based data, building community-engagement and science-communication skills, and learning how to scale science policymaking from local to federal systems. As such, the fellowship will help grow the pipeline of STEM talent knowledgeable about and committed to working on EJ issues in the United States.

The third arm will embed EJ expertise into federal decision making by sponsoring a permanent cadre of resident staff, supported by "visitors" (i.e., the doctoral fellows), to produce policy recommendations, studies, surveys, qualitative analyses, and quantitative analyses centered on EJ. This model will rely on the resident staff to maintain strong relationships with federal government and extragovernmental partners and to ensure continuity across projects, while the fellows provide ancillary support as appropriate based on their skills, interests, and Institute needs. The fellowship will act as a screening tool for hiring future members of the resident staff.

Taken together, these arms of the AYA Research Institute will help advance Justice40’s goal of improving training and workforce development, as well as the Biden Administration’s goal of better preparing the United States to adapt and respond to the impacts of climate change. The AYA Research Institute can be launched with $10 million: $4 million to establish the fellowship program with an initial cohort of 10 doctoral students (receiving stipends commensurate with typical doctoral stipends at U.S. universities), and $6 million to cover administrative expenses and staff expert salaries. Additional funding will be needed to maintain the Institute if it proves successful after launch. Funding for the Institute could come from Justice40 funds allocated to EPA. Alternatively, EPA’s fiscal year (FY) 2022 budget for science and technology clearly states a goal of prioritizing EJ — funds from this budget could hence be allocated towards the Institute using existing authority. Finally, EPA’s FY 2022 budget for environmental programs and management dedicates approximately $6 million to EJSCREEN — a portion of these funds could be reallocated to the Institute as well.

Conclusion

The Biden-Harris Administration is making unprecedented investments in environmental justice. The AYA Research Institute is designed to be a force multiplier for those investments. Federally sponsored EJ efforts involve multiple programs and management tools that directly rely on the usability and accuracy of EJSCREEN. The AYA Research Institute will increase federal data capacity and help resolve the largest gaps in the data upon which EJSCREEN depends in order to increase the tool’s effectiveness. The Institute will also advance data-driven environmental-justice efforts more broadly by (i) growing the pipeline of EJ-focused researchers experienced in working with data, and (ii) embedding EJ expertise into federal decision making. In sum, the AYA Research Institute will strengthen the federal government’s capacity to strategically and meaningfully advance EJ nationwide. 

Frequently Asked Questions
How does this proposal align with grassroots EJ efforts?

Many grassroots EJ efforts are focused on working with scientists to better collect and use data to understand the scope of environmental injustices. The AYA Research Institute would allocate in-kind support to advance such efforts and would help ensure that data collected through community-based initiatives are used, as appropriate, to strengthen federal decision-making tools like EJSCREEN.

How does this proposal align with the Climate and Economic Justice Screening Tool (CEJST) recently announced by the Biden administration?

EJSCREEN and CEJST are meant to be used in tandem. As the White House explains, “EJSCREEN and CEJST complement each other — the former provides a tool to screen for potential disproportionate environmental burdens and harms at the community level, while the latter defines and maps disadvantaged communities for the purpose of informing how Federal agencies guide the benefits of certain programs, including through the Justice40 Initiative.” As such, improvements to EJSCREEN will inevitably strengthen deployment of CEJST.

Has a think tank ever been embedded in a federal government agency before?

Yes. Examples include the U.S. Army War College Strategic Studies Institute and the Asian-Pacific Center for Security Studies. Both entities have been successful and serve as primary research facilities.

What criteria would the AYA Research Institute use to evaluate doctoral students who apply to its fellowship program?

To be eligible for the fellowship program, applicants must have completed one year of their doctoral program and be current students in a STEM department. Fellows must propose a research project that would help strengthen EJSCREEN and/or leverage EJSCREEN insights to address a particular EJ issue. Fellows must also clearly demonstrate how they would work with community-based organizations on their proposed projects. Priority would be given to candidates proposing the types of longitudinal studies that are rare but badly needed in the EJ space. To ensure that fellows are well equipped to perform deep community engagement, additional selection criteria for the AYA Research Institute fellowship program could draw from the criteria presented in the rubric for the Harvard Climate Advocacy Fellowship.

What can be done to avoid politicizing the AYA Research Institute, and to ensure the Institute’s longevity across administrations?

A key step will be grounding the Institute in the expertise of salaried, career staff. This will offset potential politicization of research outputs.

What existing data does EJSCREEN use?

EJSCREEN 2.0 largely uses data from the U.S. Census Bureau's 2020 American Community Survey, as well as many other sources (e.g., the Department of Transportation (DOT) National Transportation Atlas Database and the Community Multiscale Air Quality (CMAQ) modeling system). The EJSCREEN Technical Document details the data sources that EJSCREEN relies on.

What are the demographic and environmental indicators of interest included in EJSCREEN?

The demographic indicators are: people of color, low income, unemployment rate, linguistic isolation, less than high school education, under age 5, and over age 64. The environmental indicators are: particulate matter 2.5, ozone, diesel particulate matter, air toxics cancer risk, air toxics respiratory hazard index, traffic proximity and volume, lead paint, Superfund proximity, risk management plan facility proximity, hazardous waste proximity, underground storage tanks and leaking USTs, and wastewater discharge.
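For readers who work with these indicators programmatically, a rough sketch of how they might be represented and screened is below. It is illustrative only: the indicator names mirror the list above, but the example values, the 80th-percentile threshold, and the function and variable names are hypothetical and are not part of EJSCREEN's actual data model or API.

```python
# Illustrative only: a hypothetical, simplified representation of EJSCREEN's
# indicator sets, not EJSCREEN's actual data model or API.

DEMOGRAPHIC_INDICATORS = [
    "people_of_color", "low_income", "unemployment_rate", "linguistic_isolation",
    "less_than_high_school_education", "under_age_5", "over_age_64",
]

ENVIRONMENTAL_INDICATORS = [
    "pm2_5", "ozone", "diesel_pm", "air_toxics_cancer_risk",
    "air_toxics_respiratory_hazard_index", "traffic_proximity_and_volume",
    "lead_paint", "superfund_proximity", "rmp_facility_proximity",
    "hazardous_waste_proximity", "underground_storage_tanks", "wastewater_discharge",
]

def flag_high_percentiles(block_groups, indicator, threshold=80):
    """Return IDs of census block groups whose percentile rank for the given
    indicator meets or exceeds the threshold (the threshold is an assumption here)."""
    return [
        bg_id
        for bg_id, percentiles in block_groups.items()
        if percentiles.get(indicator, 0) >= threshold
    ]

# Hypothetical example data: percentile ranks for two block groups.
example = {
    "480291101001": {"pm2_5": 92, "low_income": 85},
    "480291101002": {"pm2_5": 40, "low_income": 35},
}
print(flag_high_percentiles(example, "pm2_5"))  # -> ['480291101001']
```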

Creating a Public System of National Laboratory Schools

Summary

The computational revolution enables and requires an ambitious reimagining of public high-school and community-college designs, curricula, and educator-training programs. In light of a much-changed — and much-changing — society, we as a nation must revisit basic assumptions about what constitutes a "good" education. That means reconsidering whether traditional school schedules still make sense, updating outdated curricula to emphasize in-demand skills (like computer programming), bringing current perspectives to old subjects (like computational biology), and piloting new pedagogies (like project-based approaches) better aligned to modern workplaces. To do this, the federal government should establish a system of National Laboratory Schools in parallel to its existing system of Federally Funded Research & Development Centers (FFRDCs).

The National Science Foundation (NSF) should lead this work, partnering with the Department of Education (ED) to create a Division for School Invention (DSI) within its Technology, Innovation, and Partnerships (TIP) Directorate. The DSI would act as a platform analogous to the Small Business Innovation Research (SBIR) program, catalyzing Laboratory Schools by providing funding and technical guidance to federal, state, and local entities pursuing educational or cluster-based workforce-development initiatives.

The new Laboratory Schools would take inspiration from successful, vertically integrated research and design institutes like Xerox PARC and the Mayo Clinic in how they organize research, and from educational systems like Governor's Schools and Early College High Schools in how they organize their governance. Each Laboratory School would serve a small, demographically and academically representative cohort and operate at a cost sustainable on local per-capita education budgets.
Collectively, National Laboratory Schools would offer much-needed “public sandboxes” to develop and demonstrate novel school designs, curricula, and educator-training programs rethinking both what and how people learn in a computational future.

Challenge and Opportunity

Education is fundamental to individual liberty and national competitiveness. But the United States’ investment in advancing the state of the art is falling behind. 

Innovation in educational practice has been incremental. Neither the standards-based nor charter-school movements departed significantly from traditional models. Accountability and outcomes-based incentives like No Child Left Behind suffer from the same issue.

The situation in research is not much better: NSF and ED's combined spending on education research is barely twice the research and development budget of Nintendo. And most of that research focuses on refining traditional school models (e.g., presuming 50-minute classes and traditional course sequences).

Despite all these efforts, we are still seeing unprecedented declines in students’ math and reading scores.

Meanwhile, the computational revolution is widening the gap between what school teaches and the skills needed in a world where work is increasingly creative, collaborative, and computational. Computation’s role in culture, commerce, and national security is rapidly expanding; computational approaches are transforming disciplines from math and physics to history and art. School can’t keep up.

For years, research has told us that individualized, competency- and project-based approaches can reverse academic declines while aligning with the demands of industry and academia for critical thinking, collaboration, and creative problem-solving skills. But schools lack the capacity to follow suit.

Clearly, we need a different approach to research and development in education: We need prototypes, not publications. While studies evaluating and improving existing schools and approaches have their place, there is a real need now for “living laboratories” that develop and demonstrate wholly transformative educational approaches.

Schools cannot do this on their own. Constitutionally and financially, education is federated to states and districts. No single public actor has the incentives, expertise, and resources to tackle ambitious research and design — much less to translate research into practice on a meaningful scale. Private actors like curriculum developers and educational technologists sell to public actors, meaning private-sector innovation is constrained by public school models. Graduate schools of education won't take the brand risk of running their own schools, and researchers won't pursue unfunded or unpublishable questions. We commend the Biden-Harris administration's Multi-Agency Research and Development Priorities for centering inclusive innovation and science, technology, engineering, and math (STEM) education in the nation's policy agenda. But reinventing school requires a new kind of research institution, one which actually operates a school, developing educators and new approaches firsthand.

Luckily, the United States largely invented the modern research institution. It is time we do so again. Much as our nation's leadership in science and technology was propelled by the establishment of land-grant universities in the late 19th century, we can trigger a new era of U.S. leadership in education by establishing a system of National Laboratory Schools. The Laboratory Schools will serve as vertically integrated "sandboxes" built atop fully functioning high schools and community colleges, reinventing both how students learn and how we develop educators for a computational future.

Plan of Action

To catalyze a system of National Laboratory Schools, the NSF should establish a Division for School Invention (DSI) within its Technology, Innovation, and Partnerships (TIP) directorate. With an annually escalating investment over five years (starting at $25 million in FY22 and increasing to $400 million by FY26), the DSI could support development of 100 Laboratory Schools nationwide.

The DSI would support federal, state, and local entities — and their partners — in pursuing education or cluster-based workforce-development initiatives that (i) center computational capacities, (ii) emphasize economic inclusion or racial diversity, and (iii) could benefit from a high-school or community-college component.

DSI support would entail:

  1. Competitive matching grants modeled on SBIR grants. These grants would go towards launching Laboratory Schools and sustaining those that demonstrate success.
  2. Technical guidance to help Laboratory Schools (i) innovate while maintaining regulatory compliance, and (ii) develop financial models workable on local education budgets.
  3. Accreditation support, working with partner executives (e.g., Chairs of Boards of Higher Education) where appropriate, to help Laboratory Schools establish relationships with accreditors, explain their educational models, and document teacher and student work for evaluation purposes.
  4. Responsible-research support, including providing Laboratory Schools assistance with obtaining Federalwide Assurance (FWA) and access to partners' Institutional Review Boards (IRBs).
  5. Convening and storytelling, raising awareness of and interest in Laboratory Schools’ mission and operations.

Launching at least ten National Laboratory Schools by FY23 would involve three primary steps. First, the White House Office of Science and Technology Policy (OSTP) should convene an expert group composed of (i) funders with a track record of attempting radical change in education and (ii) computational domain experts. This group would design an evaluation process for the DSI's competitive grants, secure industry and academic partners to help generate interest in the National Laboratory School system, and recruit the DSI's first Director.

In parallel, Congress should issue one appropriations report asking NSF to establish a $25 million per year pilot Laboratory School program aligned with the TIP Directorate's Regional Innovation Accelerators (RIA) Areas of Investment. Congress should issue a second appropriations report asking the Office of Elementary and Secondary Education (OESE) to release a Dear Colleague letter encouraging states that have spent less than 75% of their Elementary and Secondary School Emergency Relief (ESSER) or American Rescue Plan funding to propose a Laboratory School.

Finally, the White House should work closely with the DSI's first Director to convene the Department of Defense Education Activity (DoDEA) and the National Governors Association (NGA) to recruit partners for the National Laboratory Schools program. These partners would later be responsible for key operational details.

Focus will be key for this initiative. The DSI should exclusively support efforts that center:

  1. New public schools, not programs within (or reinventions of) existing schools.
  2. Radically different designs, not incremental evolutions.
  3. Computationally rich models that integrate computation and other modern skills into all subjects.
  4. Inclusive innovation focused on transforming outcomes for the poor and historically marginalized.

Conclusion

Imagine the pencil had just been invented, and we treated it the way we've treated computers in education. "Pencil class" and "pencil labs" would prepare people for a written future. We would debate the costs and benefits of one pencil per child. We would study how oral test performance changed when introducing one pencil per classroom, or after an after-school creative-writing program.

This all sounds absurd because the pencil and writing are integrated throughout our educational systems rather than treated as standalone subjects. The pencil transforms both what and how we learn, but only when embraced as a foundational piece of the educational experience.

Yet this siloed approach is precisely the approach our educational system takes to computers and the computational revolution. In some ways, this is no great surprise. The federated U.S. school system isn’t designed to support invention, and research incentives favor studying and suggesting incremental improvements to existing school systems rather than reimagining education from the ground up. If we as a nation want to lead on education in the same way that we lead on science and technology, we must create laboratories to support school experimentation in the same way that we establish laboratories to support experimentation across STEM fields. Certainly, the federal government shouldn’t run our schools. But just as the National Institutes of Health (NIH) support cutting-edge research that informs evolving healthcare practices, so too should the federal government support cutting-edge research that informs evolving educational practices. By establishing a National Laboratory School system, the federal government will take the risk and make the investments our communities can’t on their own to realize a vision of an equitable, computationally rich future for our schools and students.

Frequently Asked Questions

Who

1. Why is the federal government the right entity to lead on a National Laboratory School system?

Transformative education research is slow (human development takes a long time, as does assessing how a given intervention changes outcomes), laborious (securing permissions to test an intervention in a real-world setting is often difficult), and resource-intensive (many ambitious ideas require running a redesigned school to explore properly). When other fields confront such obstacles, the public and philanthropic sectors step in to subsidize research (e.g., by funding large research facilities). But tangible education-research infrastructure does not exist in the United States.

Without R&D demonstrating new models (and solving the myriad problems of actual implementation), other public- and private-sector actors will continue to invest solely in supporting existing school models. No private sector actor will create a product for schools that don’t exist, no district has the bandwidth and resources to do it themselves, no state is incentivized to tackle the problem, and no philanthropic actor will fund an effort with a long, unclear path to adoption and prominence.

National Laboratory Schools are intended primarily as research, development, and demonstration efforts, meaning that they will be staffed largely by researchers and will pursue research agendas that go beyond the traditional responsibilities and expertise of local school districts. State and local actors are the right entities to design and operate these schools so that they reflect the particular priorities and strengths of local communities, and so that each school is well positioned to influence local practice. But funding and overseeing the National Laboratory School system as a whole is an appropriate role for the federal government.

2. Why is NSF the right agency to lead this work?

For many years, NSF has developed substantial expertise in funding innovation through the SBIR/STTR programs, which award staged grants to support innovation and technology transfer. NSF also has experience researching education through its Directorate for Education and Human Resources (EHR). Finally, NSF's new Directorate for Technology, Innovation, and Partnerships (TIP) has a mandate to "[create] education pathways for every American to pursue new, high-wage, good-quality jobs, supporting a diverse workforce of researchers, practitioners, and entrepreneurs." NSF is the right agency to lead the National Laboratory Schools program because of its unique combination of experience, in-house expertise, mission relevance, and relationships with agencies, industry, and academia.

3. What role will OSTP play in establishing the National Laboratory School program? Why should they help lead the program instead of ED?

ED focuses on the concerns and priorities of existing schools. Ensuring that National Laboratory Schools emphasize invention and reimagining of educational models requires fresh strategic thinking and partnerships grounded in computational domain expertise.

OSTP has access to bodies like the President's Council of Advisors on Science and Technology (PCAST) and the National Science and Technology Council (NSTC). Working with these bodies, OSTP can easily convene high-profile leaders in computation from industry and academia to publicize and support the National Laboratory Schools program. OSTP can also enlist domain experts to act as advisors evaluating and critiquing the depth of computational work developed in the Laboratory Schools. And annually, in the spirit of the White House Science Fair, OSTP could host a festival showcasing the design, practices, and outputs of various Laboratory Schools.

Though OSTP and NSF will have primary leadership responsibilities for the National Laboratory Schools program, we expect that ED will still be involved as a key partner on topics aligned with ED’s core competencies (e.g., regulatory compliance, traditional best practices, responsible research practices, etc.).

4. What makes the Department of Defense Education Activity (DoDEA) an especially good partner for this work?

The DoDEA is an especially good partner because it is the only federal agency that already operates schools. It reaches a student base that is large (more than 70,000 students, of whom more than 12,000 are high-school aged) as well as academically, socioeconomically, and demographically diverse; it is more nimble than a traditional district; it is well positioned to appreciate and understand the full ramifications of the computational revolution; and it is highly motivated to improve school quality and reduce turnover.

5. Why should the Division for School Invention (DSI) be situated within NSF’s TIP Directorate rather than EHR Directorate?

EHR has historically focused on the important work of researching (and to some extent, improving) existing schools. The DSI's focus on invention, on secondary and postsecondary education, and on opportunities for alignment between cluster-based workforce-development strategies and Laboratory Schools' computational emphasis makes the DSI a much better fit for the TIP Directorate, which is not only focused on innovation and invention overall, but is also explicitly tasked with "[creating] education pathways for every American to pursue new, high-wage, good-quality jobs, supporting a diverse workforce of researchers, practitioners, and entrepreneurs." Situating the DSI within TIP will not preclude it from drawing on EHR's considerable expertise when needed, especially for evaluating, contextualizing, and supporting the research agendas of Laboratory Schools.

6. Why shouldn’t existing public schools be eligible to serve as Laboratory Schools?

Most attempts at organizational change fail. Invention requires starting fresh. Allowing existing public schools or districts to launch Laboratory Schools will distract from the ongoing educational missions of those schools and is unlikely to lead to effective invention. 

7. Who are some appropriate partners for the National Laboratory School program?

Possible partners include:

8. What should the profile of a team or organization starting a Laboratory School look like? Where and how will partners find these people?

At a minimum, the team should have experience working with youth, possess domain expertise in computation, be comfortable supporting both technical and expressive applications of computation, and have a clear vision for the practical operation of their proposed educational model across both the humanities and technical fields.

Ideally, the team should also have piloted versions of their proposed educational model in some form, such as through after-school programs or at a summer camp. Piloting novel educational models can be hard, so the DSI and/or its partners may want to consider providing tiered grants to support this kind of prototyping and to develop a pipeline of candidates for running a Laboratory School.

To identify candidates to launch and operate a Laboratory School, the DSI and/or its partners can:

What

1. What is computational thinking, and how is it different from programming or computer science?

A good way to answer this question is to consider writing as an analogy. Writing is a tool for thought that can be used to think critically, persuade, illustrate, and so on. Becoming a skilled writer starts with learning the alphabet and basic grammar, and can include craft elements like penmanship. But the practice of writing is distinct from the thinking one does with those skills. Similarly, programming is analogous to mechanical writing skills, while computer science is analogous to the broader field of linguistics. These are valuable skills, but are a very particular slice of what the computational revolution entails.

Both programming and computer science are distinct from computational thinking. Computational thinking refers to thinking with computers, rather than thinking about how to communicate problems, questions, and models to computers. Examples in other fields include:

These transitions each involve programming, but are no more “about” computer science than a philosophy class is “about” writing. Programming is the tool, not the topic.

2. What are some examples of the research questions that National Laboratory Schools would investigate?

There are countless research agendas that could be pursued through this new infrastructure. Select examples include:

  1. Seymour Papert’s work on LOGO (captured in books like Mindstorms) presented a radically different vision for the potential and role of technology in learning. In Mindstorms, Papert sketches out that vision vis-à-vis geometry as an existence proof. Papert’s work demonstrates that research into making things more learnable differs from research into teaching more effectively. Abelson and diSessa’s Turtle Geometry takes Papert’s work further, conceiving of ways that computational tools can be used to introduce differential geometry and topology to middle- and high-schoolers. The National Laboratory Schools could investigate how we might design integrated curricula combining geometry, physics, and mathematics by leveraging the fact that the vast majority of mathematical ideas tackled in secondary contexts appear in computational treatments of shape and motion.
  2. The Picturing to Learn program demonstrated remarkable results in helping staff to identify and students to articulate conceptions and misconceptions. The National Laboratory Schools could investigate how to take advantage of the explosion of interactive and dynamic media now available for visually thinking and animating mental models across disciplines.
  3. Bond graphs, a representation of physical dynamic systems developed in the 1960s, enabled identification of “effort” and “flow” variables as new ways of defining power. This in turn allowed us to formalize analogies across electricity and magnetism, mechanics, fluid dynamics, and so on. Decades later, category theory has brought additional mathematical tools to bear on further formalizing these analogies. Given the role of analogy in learning, how could we reconceive people’s introduction to the natural sciences using cross-disciplinary language that emphasizes these formal parallels?
  4. Understanding what it means for one thing to cause (or not cause) another, and how to establish empirically whether it does, is an urgent and omnipresent need. Computational approaches have transformed economics and the social sciences. Whether the issue is COVID vaccine reliability, claims of election fraud, or the replication crisis in medicine and social science, our world is full of increasingly opaque systems and phenomena that our media environment is decreasingly equipped to tackle for and with us. An important tool in this work is the ability to reason about and evaluate empirical research effectively, which in turn depends on fundamental ideas about causality and how to evaluate the strength and likelihood of various claims. Graphical methods in statistics offer a new tool complementing traditional, easily misused ideas like p-values, which dominate current introductions to statistics without leaving youth in a better position to meaningfully evaluate and understand statistical inference.

The specifics of these are less important than the fact that there are many, many such agendas that go largely unexplored because we lack the tangible infrastructure to set ambitious, computationally sophisticated educational research agendas.

3. How will the National Laboratory Schools differ from magnet schools for those interested in computer science?

The premise of the National Laboratory Schools is that computation, like writing, can transform many subjects. These schools won’t place disproportionate emphasis on the field of computer science, but rather will emphasize integration of computational thinking into all disciplines—and into educational practice as a whole. Moreover, magnet schools often use selective admissions. National Laboratory Schools are public schools interested in the core issues of the median public school, so it is important that they tackle the full range of challenges and opportunities that public schools face. This involves enrolling a socioeconomically, demographically, and academically diverse group of youth.

4. How will the National Laboratory Schools differ from the Institute of Education Sciences’ Regional Education Laboratories?

The Institute of Education Sciences’ (IES’s) Regional Education Laboratories (RELs) do not operate schools. Instead, they convene and partner with local policymakers to lead applied research and development, often focused on actionable best practices for today’s schools (as exemplified by the What Works Clearinghouse). This is a valuable service for educators and policymakers. However, this service is by definition limited to existing school models and assumptions about education. It does not attempt to pioneer new school models or curricula.

5. How will the National Laboratory Schools program differ from tech-focused workforce-development initiatives, coding bootcamps, and similar programs?

These types of programs focus on the training and placement of software engineers, data scientists, user-experience designers, and similar tech professionals. But just as computational thinking is broader than just programming, the National Laboratory Schools program is broader than vocational training (important as that may be). The National Laboratory Schools program is about rethinking school in light of the computational revolution’s effect on all subjects, as well as its effects on how school could or should operate. An increased sensitivity to vocational opportunities in software is only a small piece of that.

6. Can computation really change classes other than math and science?

Yes. The easiest way to prove this is to consider how professional practice of non-STEM fields has been transformed by computation. In economics, the role of data has become increasingly prominent in both research and decision making. Data-driven approaches have similarly transformed social science, while also expanding the field’s remit to include specifically online, computational phenomena (like social networks). Politics is increasingly dominated by technological questions, such as hacking and election interference. 3D modeling, animation, computational art, and electronic music are just a few examples of the computational revolution in the arts. In English and language arts, multimedia forms of narrative and commentary (e.g., podcasts, audiobooks, YouTube channels, social media, etc.) are augmenting traditional books, essays, and poems. 

7. Why and how should National Laboratory Schools commit to financial and legal parity with public schools?

The challenges facing public schools are not purely pedagogical. Public schools face challenges in serving diverse populations in resource-constrained and highly regulated environments. Solutions and innovation in education need to be prototyped in realistic model systems. Hence the National Laboratory Schools must commit to financial and legal parity with public schools. At a minimum, this should include a commitment to (i) a per-capita student cost that is no more than twice the average of the relevant catchment area for a given National Laboratory School (the 2x buffer is provided to accommodate the inevitably higher cost of prototyping educational practices at a small scale), and (ii) enrollment that is demographically and academically representative (including special-education and English Language Learner participation) of a similarly aged population within thirty minutes’ commute, and that is enrolled through a weighted lottery or similarly non-selective admissions process.
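To make these commitments concrete, the sketch below shows how the two constraints might be checked in practice: a per-student cost ceiling of twice the catchment-area average, and enrollment through a weighted, non-selective lottery. This is a minimal illustration under those stated assumptions; the function names, the example figures, and the particular weighting scheme are hypothetical rather than a prescribed admissions method.

```python
import random

def within_cost_parity(per_student_cost, catchment_average, buffer=2.0):
    """Check the proposed ceiling: per-student cost no more than `buffer`
    times the catchment-area average (the 2x buffer comes from the text above)."""
    return per_student_cost <= buffer * catchment_average

def weighted_lottery(applicants, weights, seats, seed=None):
    """Draw `seats` applicants without replacement, where weights encode
    enrollment priorities. A simple sequential draw; the weighting scheme
    itself is hypothetical, not a prescribed method."""
    rng = random.Random(seed)
    pool = list(applicants)
    pool_weights = list(weights)
    selected = []
    while pool and len(selected) < seats:
        idx = rng.choices(range(len(pool)), weights=pool_weights, k=1)[0]
        selected.append(pool.pop(idx))
        pool_weights.pop(idx)
    return selected

# Hypothetical figures: $36,000 per student against a $20,000 catchment average.
print(within_cost_parity(36_000, 20_000))                      # True (36k <= 40k)
print(weighted_lottery(["a", "b", "c", "d"], [3, 1, 1, 1], 2, seed=0))
```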

8. Why are Xerox PARC and the Mayo Clinic good models for this initiative?

Both Xerox PARC and the Mayo Clinic are prototypical examples of hyper-creative, highly functioning research and development laboratories. Key to their success in inventing the future was living it themselves.

PARC researchers insisted on not only building but also using their creations as their main computing systems. In doing so, they were able to invent everything from Ethernet and the laser printer to the whole paradigm of personal computing (including peripherals like the modern mouse and features like windowed applications that we take for granted today).

The Mayo Clinic runs an actual hospital. This allows the clinic to innovate freely in everything from management to medicine. As a result, the clinic created the first multi-specialty group practice and integrated medical record system, invented the oxygen mask and G-suit, discovered cortisone, and performed the first hip replacement.

One characteristic these two institutions share is that they are focused on applied design research rather than basic science. PARC combined basic innovations in microelectronics and user interface to realize a vision of personal computing. Mayo rethinks how to organize and capitalize on medical expertise to invent new workflows, devices, and more.

These kinds of living laboratories are informed by what happens outside their walls but are focused on inventing new things within. National Laboratory Schools should similarly strive to demonstrate the future in real-world operation.

Why

1. Don’t laboratory schools already exist? Like at the University of Chicago?

Yes, but there are very few of them, and almost all that do exist suffer from one or more issues relative to the vision proposed here for National Laboratory Schools. First, most existing laboratory schools are not public. In fact, most university-affiliated laboratory schools have, over time, evolved to mainly serve the children of faculty. This means that their enrollment is not socioeconomically, demographically, or academically representative. It also means that families’ risk aversion may constrain those schools’ capacity to truly innovate. Most laboratory schools not affiliated with a university use their “laboratory” status as a brand differentiator in the progressive independent-school sector.

Second, the research functions of many laboratory schools have been hollowed out given the absence of robust funding. These schools may engage in shallow renditions of participatory action research by faculty in lieu of meaningful, ambitious research efforts. 

Third, most educational-design questions investigated by laboratory schools are investigated at the classroom or curriculum (rather than school design) level. This creates tension between those seeking to test innovative practices (e.g., a lesson plan that involves an extended project) and the constraints of traditional classrooms.

Finally, insofar as bona fide research does happen, it is constrained by what is funded, publishable, and tenurable within traditional graduate schools of education. Hence most research reflects the concerns of existing schools instead of seeking to reimagine school design and educational practice.

2. Why will National Laboratory Schools succeed where past efforts at educational reform (e.g., charter schools) have failed?

Most past educational-reform initiatives have focused on either supporting and improving existing schools (e.g., through improved curricula for standard classes), or on subsidizing and supporting new schools (e.g., charter schools) that represent only minor departures from traditional models.

The National Laboratory Schools program will provide a new research, design, and development infrastructure for inventing new school models, curricula, and educator training. These schools will have resources, in-house expertise, and research priorities that traditional public schools—whether district or charter or pilot—do not and should not. If the National Laboratory Schools are successful, their output will help inform educational practice across the U.S. school ecosystem. 

3. Don’t charter schools and pilot schools already support experimentation? Wasn’t that the original idea for charter and pilot schools—that they’d be a laboratory to funnel innovation back into public schools?

Yes, but this transfer hasn’t happened, for at least two reasons. First, the vast majority of charter and pilot schools are not pursuing fundamentally new models because doing so is too costly and risky. Charter schools can often perform more effectively than traditional public schools, but this is just as often due to problematic selection bias in enrollment as to the autonomy that allows for more effective leadership and organizational management. Second, the politics around charter and pilot schools have become increasingly toxic in many places, which prevents new ideas from being considered by public schools or advocated for effectively by public leaders.

4. Why do we need invention at the school rather than at the classroom level? Wouldn’t it be better to figure out how to improve schools that exist rather than end up with some unworkable model that most districts can’t adopt?

The solutions we need might not exist at the classroom level. We invest a great deal of time, money, and effort into improving existing schools, but we underinvest in inventing fundamentally different ones. There are many design choices that need to be explored and that cannot be adequately developed through marginal improvements to existing models. One example is project-based learning, wherein students undertake significant, often multidisciplinary projects to develop their skills. Project-based learning at any serious level requires significant blocks of time that don’t fit in traditional school schedules and calendars. A second example is the role of computational thinking, as centered in this proposal. Meaningfully incorporating computational approaches into a school design requires new pedagogies, novel tools and curricula, and re-trained staff. Vanishingly few organizations do this kind of work as a result.

If and when National Laboratory Schools develop substantially innovative models that demonstrate significant value, there will surely need to be a translation process to enable districts to adopt those innovations, much as translational medicine brings biomedical innovations from the lab to the hospital. That process will likely need to involve helping districts start and grow new schools gradually, rather than attempting district-wide overhauls.

5. What kinds of “traditional assumptions” need to be revisited at the school level?

The basic model of school assumes subject-based classes with traditionally licensed teachers lecturing in each class for 40–90 minutes a day. Students do homework, take quizzes and tests, and occasionally do labs or projects. The courses taught are largely fixed, with some flexibility around the edges (e.g., through electives and during students’ junior and senior high-school years).

Traditional school represents a compromise among curriculum developers, standardized-testing outfits, teacher-licensure programs, regulations, local stakeholder politics, and teachers’ unions. Attempts to change traditional schools almost always fail because of pressures from one or more of these groups. The only way to achieve meaningful educational reform is to demonstrate success in a school environment rethought from the ground up. Consider a typical course sequence of Algebra I, Geometry, Algebra II, and Calculus. There are both pedagogical and vocational reasons to rethink this sequence and instead center types of mathematics that are more useful in computational contexts (like discrete mathematics and linear algebra). But a typical school will not be able to simultaneously develop the new tools, materials, and teachers needed to do so.

6. Has anything like the National Laboratory School program been tried before?

No. There have been various attempts to promote research in education without starting new schools. There have been interesting attempts by states to start new schools (like Governor’s Schools), some ambitious charter schools, and efforts to create STEM-focused and computationally focused magnet schools. But there has never been a concerted attempt in the United States to establish a new kind of research infrastructure built atop the foundation of functioning schools as educational “sandboxes”.

How?

1. How will we pay for all this? What existing funding streams will support this work? Where will the rest of the money for this program come from?

For budgeting purposes, assume that each Laboratory School enrolls a small group of forty high school or community college students full-time at an average per capita rate of $40,000 per person per year. Half of that budget will support the functioning of the school itself. The remaining half will support a small research and development team responsible for curating and developing the computational tools, materials, and curricula needed to support the School’s educators. This would put the direct service budget of the school solidly at the 80th percentile of current per capita spending on K–12 education in the United States. With these assumptions, running 100 National Laboratory Schools would cost ~$160 million. Investing $25 million per year would be sufficient to establish an initial 15 sites. This initial federal funding should be awarded through a 1:1 matching competitive-grant program funded by (i) the 10% of American Competitiveness and Workforce Improvement Act (ACWIA) fees associated with H-1B visas (which the NSF is statutorily required to devote to public-private partnerships advancing STEM education), and (ii) the NSF TIP Directorate’s budget, alongside budgets from partner agency programs (for instance, the Department of Education’s Education Innovation and Research and Investing in Innovation programs). For many states, these funds should also be layered atop their existing Elementary and Secondary School Emergency Relief (ESSER) and American Rescue Plan (ARP) awards.
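To make the arithmetic above explicit, here is a minimal budget sketch using only the assumptions stated in this answer (the figures are illustrative planning numbers, not appropriated amounts):

```python
# Minimal budget sketch for the National Laboratory School program,
# using only the assumptions stated above (illustrative figures).

students_per_school = 40
cost_per_student = 40_000  # dollars per person per year

annual_cost_per_school = students_per_school * cost_per_student  # $1.6 million per school per year
direct_service_share = annual_cost_per_school // 2               # $800,000 for running the school itself
rd_team_share = annual_cost_per_school // 2                      # $800,000 for the R&D team

full_program_cost = 100 * annual_cost_per_school                 # ~$160 million for 100 schools
initial_sites = 25_000_000 // annual_cost_per_school             # ~15 sites from $25 million per year

print(f"Per school: ${annual_cost_per_school:,}")
print(f"100 schools: ${full_program_cost:,}")
print(f"Sites supported by $25 million/year: {initial_sites}")
```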

2. Why is vertical integration important? Do we really need to run schools to figure things out?

Vertical integration (of research, design, and operation of a school) is essential because schools and teacher education programs cannot be redesigned incrementally. Even when compelling curricular alternatives have been developed under the auspices of an organization like the NSF, practical challenges in bringing those innovations to practice have proven insurmountable. In healthcare, the entire field of translational medicine exists to help translate research into practice. Education has no equivalent.

The vertically integrated National Laboratory School system will address this gap by allowing experimenters to control all relevant aspects of the learning environment: curricula, staffing, schedules, evaluation mechanisms, and so on. This means the Laboratory Schools can demonstrate a fundamentally different approach, learning from great research labs like Xerox PARC and the Mayo Clinic, much of whose success depended on tightly knit, cross-disciplinary teams working closely together in an integrated environment.

3. What would the responsibilities of a participating agency look like in a typical National Laboratory School partnership?

A participating agency will have some sort of educational or workforce-development initiative that would benefit from the addition of a National Laboratory School as a component. This agency would minimally be responsible for:

4. How should success for individual Laboratory Schools be defined?

Working with the Institute of Education Sciences (IES) National Center for Education Research (NCER), the DSI should develop frameworks for collecting the qualitative and quantitative data necessary to document, understand, and evaluate the design of any given Laboratory School. Evaluation would cover compliance with financial and legal parity requirements as well as student growth and work products.

Evaluation processes should include:

Success should be judged by a panel of experts that includes domain experts, youth workers and/or school leaders, and DSI leadership. The dimensions of performance these panels address should minimally include the depth and quality of students’ work, the degree of traditional academic coverage, the ambition and coherence of the research agenda (and progress on that agenda), retention of an equitably composed student cohort, and growth (not absolute performance) on the diagnostic/formative assessments.

In designing evaluation mechanisms, it will be essential to learn from failed accountability systems in public schools. Specifically, it will be essential to avoid pushing National Laboratory Schools to optimize for the particular metrics and measurements used in the evaluation process. This means that the evaluation process should be based largely on holistic judgments made by expert panels rather than fixed rubrics or similarly inflexible mechanisms. Evaluation timescales should also be selected appropriately: for example, performance on diagnostic/formative assessments should be measured by examining trends over several years rather than year-to-year changes.

5. What makes the Small Business Innovation Research (SBIR) program a good model for the National Laboratory School program?

The SBIR program is a multiphase competitive grant program to which small businesses submit proposals. SBIR awards smaller grants (~$150,000) to businesses at early stages of development and makes larger grants (~$1 million) available to awardees who achieve certain progress milestones. SBIR and similar federal tiered-grant programs (e.g., the Small Business Technology Transfer, or STTR, program) have proven remarkably productive and cost-effective, with many studies finding that they are as efficient as or more efficient than the private sector on a per-dollar basis by common measures of innovation such as patents and papers.

The SBIR program is a good model for the National Laboratory School program because it is an example of the federal government promoting innovation by patching a hole in the funding landscape. Traditional financing options for businesses are often limited to debt or equity, and providers of debt to small businesses (like retail banks) are rarely able or incentivized to subsidize research and development. Venture capitalists typically subsidize research and development only for businesses and technologies with reasonable expectations of delivering 10x or greater returns. SBIR provides funding for the innumerable businesses that need research and development support in order to become viable but aren’t likely to deliver venture-scale returns.

In education, the funding landscape for research and development is even worse. There are virtually no sources of capital that support people to start schools, in part because the political climate around new schools can be so fraught. The funding that does exist for this purpose tends to demand school launch within 12–18 months: a timescale on which it is not feasible to design, evaluate, and refine an entirely new school model. Education is a slow, expensive public good: one that the federal government shouldn’t provision, but should certainly subsidize. That includes subsidizing the research and development needed to make education better.

States and local school districts lack the resources and incentives to fund such deep educational research. That is why the federal government should step in. By running a tiered educational research-grant program, the federal government will establish a clear pathway for prototyping and launching ambitious and innovative schools.

6. What protections will be in place for students enrolled in Laboratory Schools?

The state organizations established or selected to oversee Laboratory Schools will be responsible for approving proposed educational practices. That said, unlike in STEM fields, there is no “lab bench” for educational research: the only way we can advance the field as a whole is by carefully prototyping informed innovations with real students in real classrooms.

7. Considering the challenges and relatively low uptake of educational practices documented in the What Works Clearinghouse, how do we know that practices proven in National Laboratory Schools will become widely adopted?

National Laboratory Schools will yield at least three kinds of outputs, each of which is associated with different opportunities and challenges with respect to widespread adoption.

The first output is people. Faculty trained at National Laboratory Schools (and at possible educator-development programs run within the Schools) will be well positioned to take the practices and perspectives of National Laboratory Schools elsewhere (e.g., as school founders or department heads). The DSI should consider establishing programs to incentivize and support alumni personnel of National Laboratory Schools in disseminating their knowledge broadly, especially by founding schools.

The second output is tools and materials. New educational models that are responsive to the computational revolution will inevitably require new tools and materials (including subject-specific curricula, cross-disciplinary software tools for analysis and visualization, and organizational and administrative tools) in order to be implemented in practice. Many of these tools and materials will likely be adaptations and extensions of existing ones to the needs of education.

The final output is new educational practices and models. This will be the hardest, but probably most important, output to disseminate broadly. The history of education reform is littered with failed attempts to scale or replicate new educational models. An educational model is best understood as the operating habits of a highly functioning school. Institutionalizing those habits is largely about developing the skills and culture of a school’s staff (especially its leadership). This is best tackled not as a problem of organizational transformation (e.g., attempting to retrofit existing schools), but rather as one of organizational creation: that is, it is better to use models as inspirations to emulate as new schools (and new programs within schools) are planned. Over time, such new and inspired schools and programs will supplant older models.

8. How could the National Laboratory School program fail?

Examples of potential pitfalls that the DSI must strive to avoid include:

FAS Organ Procurement Organization Innovation Cohort Shares Data to Advance Organ Recovery Research

WASHINGTON, D.C. – Today the Federation of American Scientists (FAS) announced that the Organ Procurement Organization (OPO) Innovation Cohort is opening up ten years of data to support research aimed at increasing the number of lifesaving organ donations every year.

Data from the U.S. Department of Health and Human Services (HHS) indicate that improvements in organ recovery practices will lead to at least 7,000 additional lifesaving transplants every year.

Bipartisan Congressional leaders have highlighted the need for accelerated data-driven reforms given COVID-19’s ravaging effects on organs. According to a July 19, 2021 letter led by the Senate Finance Committee and the House Committee on Oversight and Reform:

“The COVID-19 pandemic is exacerbating the need for organs now and creating an urgent health equity issue, as communities of color are disproportionately impacted by the failures of the current organ donation system and the effects of COVID-19.” 

Historically, OPO accountability and data-driven improvement have been hindered by opacity coupled with self-interpreted and self-reported performance data. The OPO Innovation Cohort will make public a trove of data regarding OPO performance, operations, finances, and governance with the goal of informing ongoing federal policymaking toward improving OPO performance and addressing health inequities. The first tranche of data released will be shared with MIT’s Healthy ML Lab and Wilson Lab and will include case-level performance data, including all unstructured case notes, allowing for never-before-possible analysis of the organ donation process. Detailed data to be shared as part of the OPO Innovation Cohort can be found here. A visualization of OPO performance can be found here.

MIT’s Healthy ML Lab, led by Dr. Marzyeh Ghassemi, will review the de-identified case notes of seven OPOs, representing one-sixth of the country, to better understand where and how potential donors are lost, including by race and ethnicity. Dr. Ghassemi’s groundbreaking work will include using natural language processing and sentiment analysis of case notes to better understand the ways that variation in care, communication, and context might affect organ procurement and utilization. The Wilson Lab, led by Dr. Ashia Wilson, will estimate the risks and opportunities in organ placement by OPOs to improve the utilization and fairness of the organ transplant system. Dr. Wilson specializes in using optimization to improve the efficiency and fairness of machine learning systems and will bring this expertise to the search for opportunities to increase the availability of organs for all demographic groups.

“Working with this data is a first step towards making better decisions about how to save more lives through organ procurement and transplantation. We have an opportunity to use machine learning to understand potential issues and lead improvements in transparency and equity,” said Dr. Marzyeh Ghassemi.

“Patients deserve transparency, and this research is even more important given what we are learning about COVID-19’s effects on organs,” said Jennifer Erickson, Senior Fellow, Federation of American Scientists.

The seven organ procurement organizations leading the way in opening up their data are: Donor Network West, Life Connection of Ohio, LiveOn New York, Louisiana Organ Procurement Agency, Mid-America Transplant, OurLegacy, and Southwest Transplant Alliance. They have publicly committed to:

###