Increasing National Resilience through an Open Disaster Data Initiative
Summary
Federal, state, local, tribal, and territorial agencies collect and maintain a range of disaster resilience, vulnerability, and loss data. However, this valuable data lives on different platforms and in various formats across agency silos. Inconsistent data collection and lack of validation can result in gaps and inefficiencies and make it difficult to implement appropriate mitigation, preparedness, response, recovery, and adaptation measures for natural hazards, including wildfires, smoke, drought, extreme heat, flooding, and debris flows. The lack of complete data down to the granular level also makes it challenging to gauge the true cost of disasters.
The Biden-Harris Administration should launch an Open Disaster Data Initiative to mandate the development and implementation of national standards for disaster resilience, vulnerability, and loss data to enable federal, state, local, tribal, and territorial agencies to regularly collect, validate, share, and report on disaster data in consistent and interoperable formats.
Challenge and Opportunity
Disaster resilience, vulnerability, and loss data are used in many life-saving missions, including early detection and local response coordination, disaster risk assessments, local and state hazard mitigation plans, facilitating insurance and payouts, enabling rebuilding and recovery, and empowering diverse communities to adapt to climate impacts in inclusive, equitable, and just ways.
While a plethora of tools are being developed to enable better analytics and visualizations of disaster and climate data, including wildfire data, the quality and completeness of the underlying data remain problematic, including in the recently released National Risk Index.
This is because agencies lack the mandates, funding, capacity, and infrastructure to collect, validate, share, and report data in consistent and interoperable formats. Currently, only a few federal agencies have the mandate and funds from Congress to collect disaster data relevant to their missions. Further, these federal data sets do not necessarily integrate state and local data for disasters that are not federally declared.
Due to this lack of national disaster and climate data standards, federal and state agencies, universities, nonprofits, and insurers currently maintain disaster-related data in silos, making it difficult to link data sets in productive and efficient ways down to the granular level.
Also, only a few local, state, and federal agencies regularly budget for or track spending on disaster resilience, vulnerability, and response activities. As a result, local agencies, nonprofits, and households, particularly in underserved communities, often lack access to critical lifesaving data. Further, disaster loss data is often private and proprietary, leading to inequality in data access and usability. This leaves already disadvantaged communities unprepared and with only a limited understanding of the financial burden of disaster costs carried by taxpayers.
Since the 1990s, several bipartisan reviews, research studies, and policy documents, including the recent President’s Council of Advisors on Science and Technology (PCAST) report on modernizing wildland firefighting, have reiterated the need to develop national standards for the consistent collection and reporting of disaster vulnerability, damage, and loss data. Some efforts are under way to address these standardization and data gaps, such as the all-hazards dataset that created an open database by refining Incident Command System (ICS-209) data sets.
However, significant work remains to integrate data on secondary and cascading disasters and to monitor longitudinal climate impacts, especially on disadvantaged communities. For example, the National Interagency Fire Center consolidates major wildfire events but does not currently track secondary or cascading impacts, including smoke (see AirNow’s Fire and Smoke Map), nor does it monitor societal vulnerabilities and impacts such as those on public health, displacement, poverty, and insurance. There are also no standardized methods for accounting for and tracking damaged or destroyed structures: damage and loss data on structures, fatalities, community assets, and public infrastructure are not publicly available in a consolidated format.
The Open Disaster Data Initiative will enable longitudinal monitoring of pre- and post-event data for multiple hazards, resulting in a better understanding of cascading climate impacts. Guided by the Open Government Initiative (2016) and the Fifth National Action Plan (2022), and launched in the context of the Year of Open Science (2023), the Open Disaster Data Initiative will lead to greater accountability in how federal, state, and local governments prioritize funding, especially to underserved and marginalized communities.
Finally, the Open Disaster Data Initiative will build on the Justice40 Initiative and be guided by the recommendations of the PCAST report on enhancing prediction and protecting communities. The Initiative should also reiterate the Government Accountability Office’s 2022 recommendation that Congress designate a federal entity to develop and update climate information and create a National Climate Information System.
Precedents
Recent disaster and wildfire research data platforms and standards provide some precedent and show how investing in data standards and interoperability can enable inclusive, equitable, and just disaster preparedness, response, and recovery outcomes.
The Open Disaster Data Initiative must build on lessons learned from past initiatives, including:
- the National Weather Service’s (NWS) Storm Events database, which collects meteorological data on when and where extreme events occur, along with occasional but unverified estimates of socioeconomic impacts.
- the Centers for Disease Control and Prevention’s (CDC) COVID-19 Data Modernization Initiative, which attempts to harmonize data collection and reporting across national, tribal, state, and local agencies.
- the National Oceanic and Atmospheric Administration’s (NOAA) National Integrated Drought Information System (NIDIS), a multiagency partnership that coordinates drought monitoring, forecasting, planning, and information at national, tribal, state, and local levels but is impacted by inconsistent data reporting.
- the Federal Emergency Management Agency’s (FEMA) OpenFEMA initiative, which shares vast amounts of data on multiple aspects of disaster outcomes, including disaster assistance, hazard mitigation investments, the National Flood Insurance Program, and grants, but requires technical expertise to access and utilize the data effectively (see the illustrative query sketch after this list).
- FEMA’s National Risk Index, which maps the nation’s resilience, vulnerability, and disaster losses at county and census tract levels but shows shortcomings in capturing the risk of geophysical events such as earthquakes and tsunamis. In late 2022, Congress passed the Community Disaster Resilience Zones Act (P.L. 117-255), which codifies the National Risk Index. The goal is to support the census tracts with the highest risk rating with financial, technical, and other forms of assistance.
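To illustrate the kind of technical expertise currently required to use OpenFEMA data, the sketch below queries the public Disaster Declarations Summaries endpoint. It is a minimal example under stated assumptions, not an official workflow: the endpoint URL and OData-style parameters follow OpenFEMA's published conventions, but the specific field names and filter values shown are illustrative and should be verified against the current API documentation.

```python
# Minimal sketch: pulling recent disaster declarations from OpenFEMA.
# The endpoint and parameter names follow OpenFEMA's published conventions
# (https://www.fema.gov/about/openfema/api); the field names used in the
# filter are illustrative and should be checked against current documentation.
import requests

BASE_URL = "https://www.fema.gov/api/open/v2/DisasterDeclarationsSummaries"

params = {
    "$filter": "state eq 'CA' and declarationDate ge '2020-01-01'",
    "$orderby": "declarationDate desc",
    "$top": 10,  # limit the response to ten records
}

response = requests.get(BASE_URL, params=params, timeout=30)
response.raise_for_status()

# The response nests records under the entity name; .get() is used defensively.
for record in response.json().get("DisasterDeclarationsSummaries", []):
    print(record.get("disasterNumber"),
          record.get("incidentType"),
          record.get("declarationDate"))
```

Even this simple pull assumes familiarity with API filters, pagination, and field naming, which is precisely the barrier that a more accessible, standardized open data system would lower for local agencies and nonprofits.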
There are also important lessons to learn from international efforts such as the United Nations’ ongoing work on monitoring implementation of the Sendai Framework for Disaster Risk Reduction (2015–2030) by creating the next generation of disaster loss and damage databases, and the Open Geospatial Consortium’s Disaster Pilot 2023 and Climate Resilience Pilot, which seek to use standards to enable open and interoperable sharing of critical geospatial data across missions and scales.
Plan of Action
President Biden should launch an Open Disaster Data Initiative by implementing the following four actions.
Recommendation 1. Issue an Executive Order directing the development and adoption of national standards for disaster resilience, vulnerability, and loss data collection, validation, sharing, and reporting by all relevant federal, state, local, tribal, and territorial agencies, creating the enabling conditions for adoption by universities, nonprofits, and the private sector. The scope of this Executive Order should include data on local disasters that do not call for a Presidential Disaster Declaration or federal assistance.
Recommendation 2. Direct the Office of Management and Budget (OMB) to issue an Open Disaster Data Initiative Directive for all relevant federal agencies to collaboratively implement the following actions:
- Direct the National Science and Technology Council to appoint a subcommittee to work with the National Institute of Standards and Technology to develop national standards for disaster resilience, vulnerability, and loss data collection, sharing, and reporting by all relevant federal, state, local, tribal, and territorial agencies, as well as by universities, nonprofits, and the private sector.
- Direct all relevant federal agencies to adopt national standards for disaster resilience, vulnerability, and loss data collection, validation, sharing, and reporting to address ongoing issues concerning data quality, completeness, integration, interoperability, accessibility, and usability.
- Develop federal agency capacities to accurately collect, validate, and use disaster resilience, vulnerability, and loss data, especially as it relates to population estimates of mortality, morbidity, and displacements, including from extreme heat and wildfire smoke.
- Direct FEMA to coordinate and implement training for state, local, tribal, and territorial agencies on how to collect disaster resilience, vulnerability, and loss data in line with the proposed national standards. Further, building on the FEMA Data Strategy 2023-2027 and in line with OpenFEMA, FEMA should review its private and sensitive data sharing policy to ensure that disaster data is publicly available and useable. FEMA’s National Incident Management System will be well positioned to cut across hazard mission silos and offer wide-ranging operational support and training for disaster loss accounting to federal, state, local, tribal, and territorial agencies, as well as nonprofit stakeholders, in coordination with FEMA’s Emergency Management Institute.
Recommendation 3. Designate a lead coordinator for the Open Disaster Data Initiative within the Office of Science and Technology Policy (OSTP), such as the Tech Team, to work with OMB on developing a road map for implementing the Initiative, including building the appropriate capacities across all of government.
Recommendation 4. Direct FEMA to dedicate appropriate funding and capacities for coordination with the National Weather Service (NWS), the U.S. Department of Agriculture’s Risk Management Agency, and the National Centers for Environmental Information (NCEI) to maintain a federated, open, integrated, and interoperable disaster data system that can seamlessly roll up local data, including research, nonprofit, and private-sector data such as insurance data.
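Because no national schema for such a federated system exists yet, the sketch below is a purely hypothetical example of what a standardized, machine-readable loss record might look like if it were designed for roll-up across jurisdictions. Every field name here is an assumption made for illustration, not an existing federal standard or format.

```python
# Hypothetical sketch of a standardized disaster loss record that local,
# state, tribal, and territorial agencies could report in a consistent,
# machine-readable format for federal roll-up. All field names are
# illustrative assumptions; no such national standard currently exists.
from dataclasses import dataclass, asdict
from datetime import date
from typing import Optional
import json


@dataclass
class DisasterLossRecord:
    event_id: str                  # shared identifier linking records across agencies
    hazard_type: str               # e.g., "wildfire", "flood", "extreme_heat"
    jurisdiction_fips: str         # county or tract FIPS code for geographic roll-up
    reporting_agency: str          # agency submitting the record
    incident_start: date
    incident_end: Optional[date]
    structures_damaged: int
    structures_destroyed: int
    fatalities: int
    injuries: int
    direct_economic_loss_usd: float
    federally_declared: bool       # also captures local disasters without declarations


# Example record from a hypothetical county emergency services office.
record = DisasterLossRecord(
    event_id="2023-CA-0147",
    hazard_type="wildfire",
    jurisdiction_fips="06019",
    reporting_agency="Example County OES",
    incident_start=date(2023, 8, 2),
    incident_end=date(2023, 8, 19),
    structures_damaged=42,
    structures_destroyed=7,
    fatalities=0,
    injuries=3,
    direct_economic_loss_usd=12_500_000.0,
    federally_declared=False,
)

# Serialize to JSON, the kind of interoperable format a federated system could ingest.
print(json.dumps(asdict(record), default=str, indent=2))
```

The design point is not the particular fields but the principle: if every reporting entity emits records with shared identifiers, geographic codes, and attribute definitions, a federal system can aggregate them without bespoke, hazard-by-hazard integration work.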
In addition, Congress should take the following three actions to ensure success of the Open Disaster Data Initiative:
Recommendation 5. Request the Government Accountability Office to undertake a Disaster Data Systems and Infrastructure Review to:
- Inform the development of national standards and identify barriers to accurate disaster data collection, validation, accounting, and sharing between federal, state, local, tribal, and territorial agencies, as well as the philanthropic and private sectors.
- Review lessons learned from precedents (including NWS’s Storm Events database, CDC’s Data Modernization Initiative, NOAA’s NIDIS, and FEMA’s National Risk Index).
- Form the basis for the OMB and the OSTP to designate an appropriate budget and capacity commitment and suggest a national framework/architecture for implementing an Open Disaster Data Initiative.
Recommendation 6. Appropriate dedicated funding for the implementation of the Open Disaster Data Initiative to allow federal agencies, states, nonprofits, and the private sector to access regular trainings and develop the necessary infrastructure and capacities to adopt national disaster data standards and collect, validate, and share relevant data. This funding and training should facilitate the seamless roll-up of disaster vulnerability and loss data to the federal level, thereby enabling accurate monitoring and accounting of community resilience in inclusive and equitable ways.
Recommendation 7. Use the congressional tool of technical corrections to support and enhance the Initiative:
- Pass Technical Corrections and Improvements to the Community Disaster Resilience Zones Act to include provisions for the collection, sharing, and reporting of disaster resilience, vulnerability, and loss data by all relevant federal, state, local, tribal, and territorial agencies and academic, private, community-based, and nonprofit entities, in consistent and interoperable formats and in line with the proposed national disaster data standards. Technical corrections language could point to the requirement to “review the underlying collection and aggregation methodologies as well as consider the scoping of additional data the agency(ies) may be collecting.” This language should also direct agencies to review dissemination procedures for propriety, availability, and access for public review. The scope of this technical correction should include local disasters that do not call for a Presidential Disaster Declaration or federal assistance.
- Pass a Technical Corrections and Improvements or suspension bill amending the Disaster Recovery Reform Act, specifically section 1223, which mandates a study of information collection, and section 1224, which requires that the collected data be published and made accessible to the public. This would complement the above by tasking FEMA to study, aggregate, and share information with the public in a way that is digestible and actionable.
Conclusion
The Open Disaster Data Initiative can help augment whole-of-nation disaster resilience in at least three ways:
- Enable enhanced data sharing and information coordination among federal, state, local, tribal, and territorial agencies, as well as with universities, nonprofits, philanthropies, and the private sector.
- Allow for longitudinal monitoring of compounding and cascading disaster impacts on community well-being and ecosystem health, including a better understanding of how disasters impact poverty rates, housing trends, local economic development, and displacement and migration trends, particularly among disadvantaged communities.
- Inform the prioritization of policy and program investments for inclusive, equitable, and just disaster risk reduction outcomes, especially in socially and historically marginalized communities, including rural communities.
Recent analysis by a federal interagency effort, Science for Disaster Reduction, shows that national-level databases significantly underreport disaster losses due to an overreliance on public sources and exclusion (or inaccessibility) of loss information from commercial as well as federal institutions that collect insured losses.
Also, past research has captured common weaknesses of national agency-led disaster loss databases, including:
- over- or underreporting of certain hazard types (hazard bias)
- gaps in historic records (temporal bias)
- overreliance on direct and/or monetized losses (accounting bias)
- focus on high impact and/or acute events while ignoring the extensive impacts of slow disasters or highly localized cascading disasters (threshold bias)
- overrepresentation of densely populated and/or easily accessible areas (geography bias)
The National Weather Service’s Storm Events Database, the USDA Risk Management Agency’s crop data, and the CDC’s COVID-19 Data Modernization Initiative provide good templates for how to roll up data from the local to the federal level. However, it is important to recognize that past initiatives, such as NOAA’s NIDIS, have found it challenging to go beyond collecting standard metrics of immediate loss and damage to also capture data on impacts and outcomes. Further, disaster loss and damage data are not currently integrated with other datasets that may capture secondary and cascading effects, such as the injuries, morbidities, and mortalities captured in CDC’s data.
Defining new standards that expand the range of attributes to be collected in consistent and interoperable formats would allow data to move beyond hazard and geographic silos and become open, accessible, and usable. In turn, this will require new capacity and operational commitments, including an exploration of artificial intelligence, machine learning, and distributed ledger systems (DLS) and blockchain technology, to undertake expanded data collection, sharing, and reporting across missions and scales.
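As a simplified illustration of what machine-checkable standards could enable, the sketch below validates a locally reported record against a handful of required attributes before it would be accepted into a federated roll-up. The required fields, hazard categories, and checks are hypothetical placeholders consistent with the illustrative record format sketched earlier, not proposed standards.

```python
# Hypothetical sketch of automated validation against a shared standard,
# run before locally reported records are rolled up into a national database.
# The required fields, hazard list, and rules are illustrative placeholders.
REQUIRED_FIELDS = {
    "event_id", "hazard_type", "jurisdiction_fips",
    "incident_start", "structures_destroyed", "fatalities",
}

KNOWN_HAZARDS = {"wildfire", "flood", "extreme_heat", "drought", "smoke"}


def validate_record(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    for field in REQUIRED_FIELDS - record.keys():
        errors.append(f"missing required field: {field}")
    if record.get("hazard_type") not in KNOWN_HAZARDS:
        errors.append(f"unrecognized hazard_type: {record.get('hazard_type')!r}")
    for count_field in ("structures_destroyed", "fatalities"):
        value = record.get(count_field)
        if isinstance(value, int) and value < 0:
            errors.append(f"{count_field} cannot be negative")
    fips = record.get("jurisdiction_fips", "")
    if not (fips.isdigit() and len(fips) in (5, 11)):  # county (5) or tract (11) FIPS
        errors.append("jurisdiction_fips must be a 5- or 11-digit FIPS code")
    return errors


# Example: a well-formed record passes with no errors.
print(validate_record({
    "event_id": "2023-CA-0147", "hazard_type": "wildfire",
    "jurisdiction_fips": "06019", "incident_start": "2023-08-02",
    "structures_destroyed": 7, "fatalities": 0,
}))
```

Simple, shared validation rules of this kind are what turn a data standard from a document into an operational check that can be applied consistently by every reporting agency before data are shared upward.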
Aligning with guidance provided in OSTP’s recent Blueprint for an AI Bill of Rights and several research collective initiatives in recent years, the Open Disaster Data Initiative should seek to make disaster resilience, vulnerability, loss, and damage data FAIR (findable, accessible, interoperable, reusable) and usable in CARE-full ways (collective benefit, authority to control, responsibility, and ethics).
A technical corrections bill is a type of congressional legislation to correct or clarify errors, omissions, or inconsistencies in previously passed laws. Technical corrections bills are typically noncontroversial and receive bipartisan support, as their primary goal is to correct mistakes rather than to make substantive policy changes. Technical corrections bills can be introduced at any time during a congressional session and may be standalone bills or amendments to larger pieces of legislation. They are typically considered under expedited procedures, such as suspension of the rules in the House of Representatives, which allows for quick consideration and passage with a two-thirds majority vote. In the Senate, technical corrections bills may be considered under unanimous consent agreements or by unanimous consent request, which allows for passage without a formal vote if no senator objects. Sometimes more involved technical corrections or light policy adjustments happen during “vote-o-rama” in the Senate.
Technical corrections bills or reports play an important role in the legislative process, particularly during appropriations and budgeting, by helping to ensure the accuracy and consistency of proposed funding levels and programmatic changes. For example, during the appropriations process, technical corrections may be needed to correct funding levels or programmatic details that were inadvertently left out of the original bill. These technical changes can be made to ensure that funding is allocated to the intended programs or projects and that the language of the bill accurately reflects the intent of Congress.
Similarly, during the budgeting process, technical corrections may be needed to adjust estimates or projections based on new information or changes in circumstances that were not foreseen when the original budget was proposed. These technical changes can help to ensure that the budget accurately reflects current economic and fiscal conditions and that funding priorities are aligned with the goals and priorities of Congress. For example, in 2021, Congress used a technical corrections bill to clarify budget allocations and program intent after Hurricane Ida, making recovery programs more efficient and clarifying disaster recovery programs overall. Similarly, in 2017, Congress relied on a technical corrections/suspension bill to clarify confusing tax provisions related to previous legislation for relief from Hurricane Maria.