A National Housing Policy Simulator: A Plan for Modeling Policy Changes to Spur New Housing Supply
By several measures, the United States faces the greatest shortage of housing since World War II. Increasingly stringent local regulatory barriers are often to blame, but we have little way of knowing what specific policies are constraining new housing supply in any given community. Is it overly onerous height limits? Outsized permitting fees? Uncertain approvals? Furthermore, for the federal government to spur local governments to encourage new housing development—e.g., by tying federal transit dollars to local pro-housing actions—it ideally needs to first understand the potential for new housing supply in those communities. In addition, because of the multi-year lag between enacting policies and seeing housing built as a consequence of those changes, modeling which policies will move the dial on production—and by how much—increases the likelihood that policy change will have the intended result. The good news is that the tools, data, and proof of concept all exist today; what is needed now is for the federal government to build, or fund the creation of, a National Housing Policy Simulator. Done right, such a tool would allow users to toggle policy and economic inputs to compare the relative impact of new policies and guide the next generation of land use reform.
Simulators, such as the one built by UC Berkeley’s Terner Center & Labs for the City of Los Angeles, demonstrate the value of connecting zoning data with economic feasibility pro formas for modeling supply impacts. However, successful, validated simulators have been rare and limited to a handful of geographies. With recent efforts to digitize land use data, a growing supply of real-time datasets on rents, home prices, financing, and construction costs, and remarkable advances in computing power, the federal government could accelerate the scaling of this modeling and bring about national adoption within two years.
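The core mechanic of such a simulator can be illustrated with a toy pro forma. The sketch below is purely illustrative (it is not the Terner Center model): every name, dollar figure, and rate is a hypothetical assumption, and a real simulator would draw these inputs from the parcel-level datasets this memo proposes acquiring.

```python
# Toy development pro forma: does a prototype project "pencil" under a
# given zoning and fee regime? All figures are hypothetical round numbers.
from dataclasses import dataclass

@dataclass
class PolicyInputs:
    max_stories: int            # zoning height limit (policy lever)
    permit_fee_per_unit: float  # local permitting fees (policy lever)
    cap_rate: float             # market capitalization rate (economic input)

def feasible_units(parcel_sqft: float, p: PolicyInputs,
                   rent_per_unit_yr: float = 24_000,
                   hard_cost_per_unit: float = 250_000,
                   land_cost: float = 1_500_000,
                   units_per_floor_per_10k_sqft: float = 8.0) -> int:
    """Units delivered if capitalized value exceeds total cost, else 0."""
    units = int(p.max_stories * units_per_floor_per_10k_sqft * parcel_sqft / 10_000)
    value = units * rent_per_unit_yr * 0.6 / p.cap_rate  # NOI at a 60% margin, capitalized
    cost = land_cost + units * (hard_cost_per_unit + p.permit_fee_per_unit)
    return units if value > cost else 0

# Toggle one policy lever (the height limit) and compare supply outcomes.
base = PolicyInputs(max_stories=3, permit_fee_per_unit=20_000, cap_rate=0.05)
upzoned = PolicyInputs(max_stories=6, permit_fee_per_unit=20_000, cap_rate=0.05)
print(feasible_units(20_000, base), feasible_units(20_000, upzoned))  # → 0 96
```

In this toy example, upzoning flips the prototype from infeasible to feasible because the additional units amortize the fixed land cost; a national tool would run comparisons like this across millions of parcels with real market data.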
While this proposal might be seen to compete for federal funding with other housing subsidies, the idea of a national simulator is complementary: first, because it allows for a cost-effective targeting of existing federal pro-housing efforts, and second, because it spotlights low-cost zoning and land use reforms that can result in new unsubsidized housing.
To make this happen, Congress should
- Appropriate $10 million to the Department of Housing and Urban Development’s (HUD) Office of Policy Development & Research (PD&R) to build, or fund the creation of, a National Housing Policy Simulator and commission related research.
- Appropriate $500,000 to HUD’s Office of Community Planning & Development (CPD) to update its existing data collection tools and provide technical assistance.
Once funded, HUD should
- Build and/or contract with one or more academic/research organizations to scale the necessary mapping and economic modeling tools and acquire necessary financial datasets (e.g., county assessor data, home price information, construction cost estimates, vacancy rates, census data, capitalization rates, and costs of financing) (via PD&R).
- Amend its Consolidated Annual Performance and Evaluation Report, which is required of all jurisdictions that receive federal housing block grant funding, to report annually on land use and zoning changes in order to complete the build-out and ensure the ongoing maintenance of the existing National Zoning Atlas or a similar federally maintained resource (via CPD).
Once built, Congress should
- Direct all federal agencies to look for opportunities to integrate this modeling data and tool into federal pro-housing policies and program development as well as to enable its use by local governments, researchers, advocates, and policymakers.
- Prioritize and commit to funding ongoing reporting and research opportunities.
A National Housing Policy Simulator can unlock an entirely new field of research and drive the next generation of policy innovation. Equally important, it will deepen understanding of the regulatory levers and financial programs that can precisely target incentives and penalties to stimulate housing supply, while also empowering local actors—such as civic leaders, elected officials, and local planners—to effect local policy changes that meaningfully improve housing supply.
This idea of merit originated from our Housing Ideas Challenge, in partnership with Learning Collider, National Zoning Atlas, and Cornell’s Legal Constructs Lab. Find additional ideas to address the housing shortage here.
Building internal staff capacity would help HUD support pro-housing policies
The United States is experiencing a persistent and widespread housing shortage. Over the past several decades, housing supply has become less responsive to changes in demand: growth in population and jobs does not lead to proportional growth in the number of homes, while prices and rents have increased faster than household incomes. While state and local governments have primary responsibility for regulating housing production, the federal government could more effectively support state and local pro-housing policy innovations that are currently underway.
The Department of Housing and Urban Development (HUD) should designate or hire at least one career staff member to work on housing supply and land use as their primary responsibility.
Because housing supply and land use have not been part of HUD’s historic portfolio of funded programs, the agency has not invested in building consistent staff capacity on these topics. Designated staff should have substantial expertise on the topic, either through direct work experience or research on land use policy, and enough seniority within HUD to be listened to. The Biden Administration has made a good start by appointing a Special Policy Advisor working on housing supply. Integrating this position into a career staff role would help ensure continuity across administrations. The most appropriate division of HUD would be either Policy Development and Research, which is research-focused, or Community Planning and Development, which provides technical assistance to communities.
HUD’s housing supply staff should oversee two primary efforts within the agency: supporting the efforts of state and local policymakers and other stakeholders that are experimenting with pro-housing policies, and disseminating clear, accessible, evidence-based information on the types of policies that support housing production. These roles fall well within HUD’s mission, do not require congressional authorization, and would require relatively modest financial investments (primarily staff time and direct costs of convenings).
First, to support more effective federal engagement, HUD’s housing supply lead should develop and maintain relationships with the extensive network of stakeholders across the country who are already working to understand and increase housing production. Examples of stakeholders include staff in state, regional, and local housing/planning agencies; universities and research organizations; and nonprofit and for-profit housing developers. Because of the decentralized nature of land use regulation, there is not an established venue or network for policymakers to connect with their peers. HUD could organize periodic convenings among policymakers and researchers to share their experiences on how policy changes are working in real time and identify knowledge gaps that are most important for policy design and implementation.
Second, HUD should assemble and disseminate clear, accessible guidelines on the types of policies that support housing production. Many local and state policymakers are seeking information and advice on how to design policies that are effective in their local or regional housing markets and how to achieve specific policy goals. Developing and sharing information on best practices as well as “poison pills”—based on research and evaluation—would reduce knowledge gaps, especially for smaller communities with limited staff capacity. Local governments and regional planning agencies would also benefit from federally funded technical assistance when they choose to rewrite their regulations.
Across the country, an increasing number of cities and states are experimenting with changes to zoning and related regulations intended to increase housing supply and create more diverse housing options, especially in high-opportunity communities. Through targeted investment in HUD’s staff capacity, the federal government can better support those efforts by facilitating conversations between stakeholders and sharing information about what policy changes are most effective.
Unblock Mass Timber by Incentivizing Up-to-date Building Codes
Mass timber can help solve the housing shortage, yet the building material is not widely adopted because old building codes treat it like traditional lumber. The 2021 International Building Code (IBC) addressed this issue, significantly updating mass timber allowances such as increasing height limits. But mass timber use is still broadly limited because state and local building codes usually don’t update automatically. The U.S. Department of Agriculture (USDA) could speed the adoption of mass timber through grants that incentivize state and local governments to adopt the latest IBC codes.
Mass timber can help with housing abundance and the climate transition.
Compared to concrete and steel, mass timber buildings are faster to build (and therefore often cheaper), just as safe in fires, and responsible for fewer CO2 emissions. Single- and multi-family housing using mass timber components could help close the 7.3-million-home gap in affordable housing.
Broader adoption could meaningfully increase productivity and thereby reduce construction costs. Constructing the superstructure of a 25-story mass timber building in Milwaukee, completed in 2022, took about half as long as an equivalent concrete structure would have. Developers have reported cost savings of up to 35% through lower time and labor costs. Mass timber isn’t only for small projects: Walmart is building a new 2.4-million-square-foot office campus from mass timber.
Most states are on older building codes that inhibit use of mass timber.
Use of mass timber is growing. But building codes, often slow to catch up with the latest research, have limited the impact so far. Only in 2021 did amendments to the IBC enable the construction of mass timber buildings taller than six stories. Building taller increases the cost savings from building faster.
State and local government adoption of building codes lags further. By 2023, only 20 states had adopted IBC 2021. Eventually builders might lobby governments to catch up, but for now there’s little reason for many builders to consider mass timber when it’s so restricted.
USDA could incentivize the adoption of the latest IBC.
There should be a federal grantmaking program that effectively requires the latest IBC codes for participation, incentivizing state and local governments to adopt them.
The USDA could house this program due to policy interest in both the timber industry (Forest Service, FS) and housing (Rural Development, RD).
In fact, USDA is already making grants toward mass timber housing, just not at a scale that directly incentivizes code changes. Since 2015, the Wood Innovations Grant Program has invested more than $93 million in projects that support the wood products economy, including multifamily buildings. USDA also recently partnered with the Softwood Lumber Board to competitively award more than $4 million to 11 mass timber projects. Most of these buildings are in states or cities that have adopted IBC 2021. For example, one winner is a 12-story multifamily building in Denver, which would be impossible without IBC 2021.
To unlock the adoption of innovative mass timber construction, Congress should take the following steps:
- Appropriate additional discretionary budget to USDA with direction to invest in mass timber innovation. For example, increasing the Wood Innovations Grant Program by 10 times the FY 2023 amount would be ~$430 million, less than 1.5% of USDA’s discretionary budget.
USDA should then take the following steps:
- Allocate those funds to the FS Wood Innovations Grant Program or a similar program within RD.
- Prioritize grants to multi- and single-family housing projects.
- Write a funding priority that requires the latest IBC mass timber amendments. For example, prioritizing building designs over eight stories would limit awards to locations where IBC 2021 (or comparable amendments) is in effect.
Such a funding opportunity would incentivize state and local governments to adopt mass timber amendments.
It’s uncertain how much funding would create a strong incentive. But even if most projects were awarded in states already using IBC 2021, there may still be positive downstream impacts from meaningful investment in the industry. While there are far fewer mass timber projects relative to total construction, there are far more than a grant program of this scale could directly support, so there shouldn’t be a shortage of projects. The goal is not to directly build millions of homes but to bring state building codes up-to-date. Updating building codes is necessary but not sufficient for construction at scale.
A simple mechanism to unlock the potential of mass timber.
A federal USDA grant program incentivizing adoption of the latest IBC amendments related to mass timber requires no new funding mechanisms and no new legislation. The structure is already available with the FS Wood Innovations Grant Program as a clear example. That program had ~$43 million in grants for FY 2023; perhaps an order of magnitude more funding would move more states to the updated IBC. This program would not drive mass timber adoption at scale on its own, but updating building codes is a necessary first step. Because mass timber is faster to build with and results in fewer emissions, it is a crucial building material that could contribute to both housing abundance and the climate transition.
Incentivizing Developers To Reuse Low Income Housing Tax Credits
The Low-Income Housing Tax Credit (LIHTC) program has been the backbone of new affordable housing construction nationwide for the last 37 years. Developers who receive LIHTC financing are paid twice: they collect a developer fee, and they own the building. They can raise rents to market rate after affordability periods expire. States are unable to leverage any capital gain in the project to develop more housing in the future because those gains have disappeared into the developer’s pockets.
Existing LIHTC incentives for nonprofits do not ensure that profits are recycled to build more housing, because many nonprofits have other, nonhousing missions. For example, the proceeds of the 2007 sale of one large, nonprofit-owned housing project built in the 1960s in Hawaii were donated to schools and hospitals. Those funds were generated from housing subsidies and could have created hundreds of new affordable homes, but they left the housing sector permanently.
Incentivizing organizations to use their profits to build more housing will enable LIHTC to create much more housing in the long term. My proposal would amend 26 U.S. Code §42(m)(1)(B)(ii) to ensure each state’s Qualified Allocation Plan gives preference to applicants that are required to use the profits from their development to construct more below-market housing. States and local governments will also receive preference, as they are mission-driven institutions with no incentive to raise rents to market in the future.
This proposal is based on the Vienna, Austria, housing model. Vienna spends no new taxpayer dollars on housing construction, yet houses 60 percent of its population—all who want it—in well-designed, mixed-income social housing. To produce new social housing, Vienna extends low-interest loans to Limited Profit Housing Associations (LPHAs), corporations that make profits but are required to use them to develop more housing in the future. LPHAs charge tenants an approximately $56,000 buy-in upfront, plus rent. Together, these revenue streams cover the cost of servicing the low-interest loans, enabling each building to be revenue positive, especially after the loan is repaid, and thus allowing the LPHA to build more housing in the future, creating a virtuous, self-sustaining cycle of housing creation.
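The arithmetic of that self-sustaining cycle can be sketched in a few lines. Only the approximately $56,000 buy-in comes from the description above; the loan size, interest rate, term, and rent below are hypothetical round numbers chosen to illustrate how a low-interest loan plus a tenant buy-in can leave each unit revenue positive.

```python
# Illustrative LPHA unit economics. Only the $56,000 buy-in comes from the
# Vienna description above; all other figures are hypothetical.
buy_in = 56_000          # upfront tenant contribution
monthly_rent = 700       # hypothetical cost-based rent
loan_per_unit = 250_000  # hypothetical low-interest public loan
annual_rate, years = 0.01, 35

# Standard amortizing-loan payment on the balance remaining after the buy-in.
m = annual_rate / 12
n = years * 12
payment = (loan_per_unit - buy_in) * m / (1 - (1 + m) ** -n)

surplus_per_unit_yr = 12 * (monthly_rent - payment)
print(f"monthly debt service: {payment:.0f}, annual surplus: {surplus_per_unit_yr:.0f}")
```

Under these assumptions, rent covers each unit’s debt service with a modest surplus, which grows to the full rent once the loan is repaid—revenue the association is obligated to plow into the next project.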
In lieu of creating a separate, regulated category of business association, the LIHTC program can prioritize entities required to use their profits to construct more housing, such as through restrictions in their organizational documents.
Recommendation
Congress should
- Amend 26 U.S. Code §42(m)(1)(B)(ii) to include in its preferences “(iv) entities obligated to use the profits from their development to construct more below-market housing”; and
- Amend 26 U.S. Code §42(m)(1)(C) to include in its selection criteria “(xi) projects that are state- or county-owned, in which the state or county is an equity partner, or in which ownership is conveyed to the state or county at a definite time.”
Because states and counties have no incentive to raise rents to market or to pocket the profits from selling such housing, they would also be better recipients of taxpayer financing than for-profit developers.
Political resistance to this concept has come from two main sources. First, state housing finance agencies (HFAs) that administer LIHTC are reluctant to change processes that have been in place for decades. LIHTC currently allows states wide latitude in how to select developers, and HFAs will resist federal restrictions on that flexibility. Second, current LIHTC developers are reluctant to give up any compensation source, even those many years in the future. These arguments have become less persuasive as LIHTC applications have become much more competitive in recent years. If applicants are unwilling to build LIHTC projects without ownership, they will simply forgo those points in the application, and the current system will continue. But if there are applicants willing to use the new structure—and we have anecdotally heard here in Hawaii that such applicants would be numerous—they will prove the counterarguments wrong.
Persuading Congress to adopt these changes may be challenging. Indeed, private developers successfully lobbied Congress to eliminate support for nonprofit and limited profit cooperatives as early as the Housing Act of 1937. Despite many criticisms over the years, LIHTC is one of the few affordable housing programs with bipartisan support, because it both rewards private sector developers and produces housing for low-income households. Yet despite the billions that Congress appropriates year after year, America’s housing shortage has continued to worsen. If LIHTC funds created projects that recycled their profits into building more housing, the program would generate a virtuous, self-reinforcing cycle of production, moving the needle without additional expenditure of taxpayer funds.
A potential source of support would be the mission-driven nonprofit organizations that would be the beneficiaries of this policy change. As part of their LIHTC applications, they would be very willing to create entities legally required to recycle their profits. They would also likely partner with existing LIHTC developers, who could be paid a fee, to deliver the projects. Existing developers would still be able to profit from producing LIHTC housing, even though they would forgo ownership of the building.
Expand the Fair Housing Initiatives Program to Enforce Federal and State Housing Supply and Affordability Laws
The U.S. Department of Housing and Urban Development’s (HUD) forthcoming Affirmatively Furthering Fair Housing (AFFH) final rule and a recent wave of state housing affordability legislation create mechanisms to substantially increase the nation’s affordable housing supply. However, because local noncompliance with these laws poses a crucial obstacle, successful implementation will require robust enforcement. To ensure local governments’ full compliance with AFFH and state housing affordability legislation, Congress and HUD should expand the Fair Housing Initiatives Program (FHIP) to fund external enforcement organizations.
FHIP is a model for enforcement of complementary federal and state housing laws. A “necessary component” of fair housing enforcement, FHIP funds local nonprofit organizations to investigate and raise legal complaints of discrimination in their communities under the Fair Housing Act. FHIP grantees play a “vital role” in Fair Housing Act enforcement because FHIP-initiated civil actions and complaints to enforcement agencies are more likely to be properly filed and successfully resolved.
FHIP should be expanded to enforce a new generation of federal and state laws that promote housing supply and affordability but are at risk of insufficient enforcement. AFFH will require localities to implement equity plans that reduce residential segregation and increase access to affordable housing in high opportunity areas. HUD is empowered to withhold substantial streams of federal funding from noncompliant jurisdictions. Nonetheless, HUD poorly enforced AFFH’s previous iterations, leading to calls for “external, relatively independent” enforcement mechanisms. At the state level, a raft of recent legislation overrides local exclusionary zoning and streamlines local housing development permitting processes. However, many localities have demonstrated fierce resistance to these state laws with efforts designed to avoid compliance, including constitutional challenges, declarations of entire towns as “mountain lion sanctuar[ies],” and proposals to give up public infrastructure. Understaffed state agencies may struggle to strictly enforce such laws across hundreds of localities statewide.
As independent community institutions with extensive legal expertise in the housing field, FHIP grantees are well situated to tailor innovative enforcement of the emerging housing supply and affordability regime to their communities. Grantees could build on their administrative expertise by filing complaints to HUD under §5.170 of the proposed AFFH rule. In addition, grantees could initiate civil actions under state laws and the federal False Claims and Fair Housing Acts against jurisdictions that shirk their housing obligations. Grantees might also use their expertise to educate local policymakers and stakeholders on their responsibilities under emerging laws.
Congress should:
- Amend the Fair Housing Initiatives Program’s authorizing statute (42 U.S.C. 3616a) to permit the allocation of grants to fair housing enforcement organizations for enforcement of relevant state and local housing supply and affordability laws, as determined by HUD.
HUD should:
- Amend the Fair Housing Initiatives Program regulations (24 C.F.R. 125) to add a new Affirmatively Furthering Fair Housing Enforcement Initiative (AFFH-EI). The initiative would fund fair housing enforcement organizations to enforce localities’ obligations under AFFH.
- Promulgate administrative guidance and administer technical assistance to AFFH-EI grantees to specify how funds should be used. Grantees should investigate local compliance with AFFH, raise administrative complaints of noncompliance, and pursue affirmative litigation against noncompliant jurisdictions under the Fair Housing Act, False Claims Act, and other relevant federal and state statutes.
If successfully implemented, an expanded FHIP would support the full enforcement of the forthcoming AFFH final rule and recent state housing supply and affordability legislation by bringing administrative complaints to HUD and state agencies, initiating civil enforcement actions, and educating local stakeholders. Indeed, these FHIP grantees would hold local governments accountable to their duties to equitably plan for, and remove legal barriers to, the development of affordable housing in high opportunity areas for all.
Incorporate open science standards into the identification of evidence-based social programs
Evidence-based policy uses peer-reviewed research to identify programs that effectively address important societal issues. For example, several agencies in the federal government run clearinghouses that review and assess the quality of peer-reviewed research to identify programs with evidence of effectiveness. However, the replication crisis in the social and behavioral sciences raises concerns that research publications may contain an alarming rate of false positives (rather than true effects), in part due to selective reporting of positive results. The use of open and rigorous practices — like study registration and availability of replication code and data — can ensure that studies provide valid information to decision-makers, but these characteristics are not currently collected or incorporated into assessments of research evidence.
To rectify this issue, federal clearinghouses should incorporate open science practices into their standards and procedures used to identify evidence-based social programs eligible for federal funding.
Details
The federal government is increasingly prioritizing the curation and use of research evidence in making policy and supporting social programs. In this effort, federal evidence clearinghouses—influential repositories of evidence on the effectiveness of programs—are widely relied upon to assess whether policies and programs across various policy sectors are truly “evidence-based.” As one example, the Every Student Succeeds Act (ESSA) directs states, districts, and schools to implement programs with research evidence of effectiveness when using federal funds for K-12 public education; the What Works Clearinghouse—an initiative of the U.S. Department of Education—identifies programs that meet the evidence-based funding requirements of the ESSA. Similar mechanisms exist in the Departments of Health and Human Services (the Prevention Services Clearinghouse and the Pathways to Work Evidence Clearinghouse), Justice (CrimeSolutions), and Labor (the Clearinghouse for Labor and Evaluation Research). Consequently, clearinghouse ratings have the potential to influence the allocation of billions of dollars appropriated by the federal government for social programs.
Clearinghouses generally follow explicit standards and procedures to assess whether published studies used rigorous methods and reported positive results on outcomes of interest. Yet this approach rests on assumptions that peer-reviewed research is credible enough to inform important decisions about resource allocation and is reported accurately enough for clearinghouses to distinguish which reported results represent true effects likely to replicate at scale. Unfortunately, published research often contains results that are wrong, exaggerated, or not replicable. The social and behavioral sciences are experiencing a replication crisis, revealed by numerous large-scale collaborative efforts that had difficulty replicating novel findings in published peer-reviewed research. This issue is partly attributed to closed scientific workflows, which hinder reviewers’ and evaluators’ attempts to detect issues that negatively impact the validity of reported research findings—such as undisclosed multiple hypothesis testing and the selective reporting of results.
Research transparency and openness can mitigate the risk of informing policy decisions on false positives. Open science practices like prospectively sharing protocols and analysis plans, or releasing code and data required to replicate key results, would allow independent third parties such as journals and clearinghouses to fully assess the credibility and replicability of research evidence. Such openness in the design, execution, and analysis of studies on program effectiveness is paramount to increasing public trust in the translation of peer-reviewed research into evidence-based policy.
Currently, standards and procedures to measure and encourage open workflows—and facilitate detection of detrimental practices in the research evidence—are not implemented by either clearinghouses or the peer-reviewed journals publishing the research on program effectiveness that clearinghouses review. When these practices are left unchecked, incomplete, misleading, or invalid research evidence may threaten the ability of evidence-based policy to live up to its promise of producing population-level impacts on important societal issues.
Recommendations
Policymakers should enable clearinghouses to incorporate open science into their standards and procedures used to identify evidence-based social programs eligible for federal funding, and increase the funds appropriated to clearinghouse budgets to allow them to take on this extra work. There are several barriers to clearinghouses incorporating open science into their standards and procedures. To address these barriers and facilitate implementation, we recommend that:
- Dedicated funding should be appropriated by Congress and allocated by federal agencies to clearinghouse budgets so they can better incorporate the assessment of open science practices into research evaluation.
- Funding should facilitate the hiring of additional personnel dedicated to collecting data on whether open science practices were used—and if so, whether they were used well enough to assess the comprehensiveness of reporting (e.g., checking published results against prospective protocols) and the reproducibility of results (e.g., rerunning analyses using study data and code).
- The Office of Management and Budget should establish a formal mechanism for federal agencies that run clearinghouses to collaborate on shared standards and procedures for reviewing open science practices in program evaluations. For example, an interagency working group can develop and implement updated standards of evidence that include assessment of open science practices, in alignment with the Transparency and Openness Promotion (TOP) Guidelines for Clearinghouses.
- Once funding, standards, and procedures are in place, federal agencies sponsoring clearinghouses should create a roadmap for eventual requirements on open science practices in studies on program effectiveness.
- Other open science initiatives targeting researchers, research funders, and journals are increasing the prevalence of open science practices in newly published research. As open science practices become more common, agencies can introduce requirements on open science practices for evidence-based social programs, similar to research transparency requirements implemented by the Department of Health and Human Services for the marketing and reimbursement of medical interventions.
- For example, evidence-based funding mechanisms often have several tiers of evidence to distinguish the level of certainty that a study produced true results. Agencies with tiered-evidence funding mechanisms can begin by requiring open science practices in the highest tier, with the long-term goal of requiring a program meeting any tier to be based on open evidence.
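To make the “rerunning analyses” step above concrete, here is a minimal sketch of a reproducibility check a clearinghouse reviewer might run. The function name, file format, and one-percentage-point tolerance are all hypothetical assumptions, and real program evaluations involve far richer models than a simple difference in means.

```python
# Hypothetical reproducibility check: recompute a study's reported
# treatment effect from its open data and compare it to the published value.
import csv
import statistics

def effect_reproduces(csv_path: str, reported_effect: float,
                      tol: float = 0.01) -> bool:
    """Difference in mean outcomes (treatment - control) vs. the paper's number."""
    treat, control = [], []
    with open(csv_path) as f:
        for row in csv.DictReader(f):
            outcome = float(row["outcome"])
            (treat if row["group"] == "treatment" else control).append(outcome)
    effect = statistics.mean(treat) - statistics.mean(control)
    return abs(effect - reported_effect) <= tol
```

A reviewer would run a check like this against the study’s released data and code, flagging discrepancies between the recomputed and published estimates for closer scrutiny.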
Conclusion
The momentum from the White House’s 2022 Year of Evidence for Action and 2023 Year of Open Science provides an unmatched opportunity for connecting federal efforts to bolster the infrastructure for evidence-based decision-making with federal efforts to advance open research. Evidence of program effectiveness would be even more trustworthy if favorable results were found in multiple studies that were registered prospectively, reported comprehensively, and computationally reproducible using open data and code. With policymaker support, incorporating these open science practices into clearinghouse standards for identifying evidence-based social programs would connect these federal initiatives and increase the trustworthiness of evidence used for policymaking.
Develop a Digital Technology Fund to secure and sustain open source software
Open source software (OSS) is a key part of essential digital infrastructure. Recent estimates indicate that 95% of all software relies upon open source, with about 75% of the code being directly open source. Additionally, as our science and technology ecosystem becomes more networked, computational, and interdisciplinary, open source software will increasingly be the foundation on which our discoveries and innovations rest.
However, there remain important security and sustainability issues with open source software, as evidenced by recent incidents such as the Log4j vulnerability that affected millions of systems worldwide.
To better address security and sustainability of open source software, the United States should establish a Digital Technology Fund through multi-stakeholder participation.
Details
Open source software — software whose source code is publicly available and can be modified, distributed, and reused by anyone — has become ubiquitous. OSS offers myriad benefits, including fostering collaboration, reducing costs, increasing efficiency, and enhancing interoperability. It also plays a key role in U.S. government priorities: federal agencies increasingly create and procure open source software by default, an acknowledgement of its technical benefits as well as its value to the public interest, national security, and global competitiveness.
Open source software’s centrality in the technology produced and consumed by the federal government, the university sector, and the private sector highlights the pressing need for these actors to coordinate on ensuring its sustainability and security. In addition to fostering more robust software development practices, raising capacity, and developing educational programs, there is an urgent need to invest in individuals who create and maintain critical open source software components, often without financial support.
The German Sovereign Tech Fund — launched in 2021 to support the development and maintenance of open digital infrastructure — recently announced such support for the maintainers of Log4j, thereby bolstering its prospects for timely, secure production and sustainability. Importantly, this is just one of many projects that require similar support. Cybersecurity and Infrastructure Security Agency (CISA) Director Jen Easterly has affirmed the importance of OSS while flagging its security vulnerabilities as a national security concern. Easterly rightly called for moving the responsibility and support for critical OSS components away from individuals and to the organizations that benefit from those individuals’ efforts.
Recommendations
To address these challenges, the United States should establish a Digital Technology Fund to provide direct and indirect support to OSS projects and communities that are essential for the public interest, national security, and global competitiveness. The Digital Technology Fund would be funded by a coalition of federal, private, academic, and philanthropic stakeholders and would be administered by an independent nonprofit organization.
To better understand the risks and opportunities:
- The Office of the National Cyber Director should publish a synopsis of the feedback to its recent RFI regarding OSS security; it should then commission a comparative analysis of this synopsis and the German Sovereign Tech Fund to identify the gaps and needs within the U.S. context.
To encourage multi-stakeholder participation and support:
- The White House should task the Open-Source Software Security Initiative (OS3I) working group with developing a strategy, draft legislation, and funding proposal for the Digital Technology Fund. The fund should be established as a public-private partnership with a focus on the security and sustainability of OSS; it could be designed to augment the existing Open Technology Fund, which supports internet freedom and digital rights. The strategy should include approaches for encouraging contribution from the private sector, universities, and philanthropy, along with the federal government, to the fund’s resources and organization.
To launch the Digital Tech Fund:
- Congress should appropriate funding in alignment with the proposal developed by the OS3I working group. Legislation could provide relevant agencies — many of which have identified secure OSS as a priority — with initial implementation and oversight responsibility for the fund, after which point a permanent board could be selected.
The realized and potential impact of open source software is transformative in terms of next-generation infrastructure, innovation, workforce development, and artificial intelligence safety. The Digital Tech Fund can play an essential and powerful role in raising our collective capacity to address important security and sustainability challenges by acknowledging and supporting the pioneering individuals who are advancing open source software.
Advance open science through robust data privacy measures
In an era of accelerating advancements in data collection and analysis, realizing the full potential of open science hinges on balancing data accessibility and privacy. As we move towards a more open scientific environment, the volume of sensitive data being shared is swiftly increasing. While open science presents an opportunity to fast-track scientific discovery, it also poses a risk to privacy if not managed correctly.
Building on existing data and privacy efforts, the White House and federal science agencies should collaborate to develop and implement clear standards for research data privacy across the data management and sharing life cycle.
Details
Federal agencies’ open data initiatives are a milestone in the move towards open science. They have the potential to foster greater collaboration, transparency, and innovation in the U.S. scientific ecosystem and lead to a new era of discovery. However, a shift towards open data also poses challenges for privacy, as sharing research data openly can expose personal or sensitive information when done without the appropriate care, methods, and tools. Addressing this challenge requires new policies and technologies that allow for open data sharing while also protecting individual privacy.
The U.S. government has shown a strong commitment to addressing data privacy challenges in various scientific and technological contexts. This commitment is underpinned by laws and regulations such as the Health Insurance Portability and Accountability Act and the regulations for human subjects research (e.g., Code of Federal Regulations Title 45, Part 46). These regulations provide a legal framework for protecting sensitive and identifiable information, which is crucial in the context of open science.
The White House Office of Science and Technology Policy (OSTP) has spearheaded the “National Strategy to Advance Privacy-Preserving Data Sharing and Analytics,” aiming to further the development of these technologies to maximize their benefits equitably, promote trust, and mitigate risks. The National Institutes of Health (NIH) operates an internal Privacy Program, responsible for protecting sensitive and identifiable information within NIH work. The National Science Foundation (NSF) complements these efforts with a multidisciplinary approach through programs like the Secure and Trustworthy Cyberspace program, aiming to develop new ways to design, build, and operate cyber systems, protect existing infrastructure, and motivate and educate individuals about cybersecurity.
Given the unique challenges within the open science context and the wide reach of open data initiatives across the scientific ecosystem, there remains a need for further development of clear policies and frameworks that protect privacy while also facilitating the efficient sharing of scientific data. Coordinated efforts across the federal government could ensure these policies are adaptable, comprehensive, and aligned with the rapidly evolving landscape of scientific research and data technologies.
Recommendations
To clarify standards and best practices for research data privacy:
- The National Institute of Standards and Technology (NIST) should build on its existing Research Data Framework to develop a new framework that is specific to research data privacy and addresses the unique needs of open science communities and practices. This would provide researchers with a clear roadmap for implementing privacy-preserving data sharing in their work.
- This framework should incorporate the principles of Privacy by Design, ensuring that privacy is an integral part of the research life cycle, rather than an afterthought.
- The framework should be regularly updated to stay current with the changes in state, federal, and international data privacy laws, as well as new privacy-preserving methodologies. This will ensure that it remains relevant and effective in the evolving data privacy landscape.
To ensure best practices are used in federally funded research:
- Funding agencies like the NIH and NSF should work with NIST to develop and implement training for Data Management and Sharing Plan applicants and reviewers. This training would equip both parties with knowledge of best practices in privacy-preserving data sharing in open science, thereby ensuring that data privacy measures are effectively integrated into research workflows.
- Agencies should additionally establish programs to foster privacy education, as recommended in the OSTP national strategy.
- Training on open data privacy could additionally be incorporated into agencies’ existing Responsible Conduct of Research requirements.
To catalyze continued improvements in data privacy technologies:
- Science funding agencies should increase funding for domain-specific research and development of privacy-preserving methods for research data sharing. Such initiatives would spur innovation in fields like cryptography and secure computation, leading to the development of new technologies that can broaden the scope of open and secure data sharing.
- To further stimulate innovation, these agencies could also host privacy/security innovation competitions, encouraging researchers and developers to create and implement cutting-edge solutions.
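The privacy-preserving methods referenced above can be made concrete with a small example. The sketch below is illustrative only (the epsilon value, record layout, and function names are our own assumptions, not part of any agency program): it releases a simple count under differential privacy using the Laplace mechanism, one of the basic building blocks such R&D would extend to richer research-data-sharing settings.

```python
# Minimal sketch: release a count with epsilon-differential privacy via
# the Laplace mechanism. Illustrative only; real deployments would use a
# vetted library and careful privacy accounting.
import math
import random


def laplace_noise(scale, rng):
    # Inverse-CDF sampling of a Laplace(0, scale) random variable.
    u = rng.random() - 0.5
    while u == -0.5:  # avoid the log(0) edge case; probability ~0
        u = rng.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)


def dp_count(records, predicate, epsilon, rng=None):
    """Release the number of records matching `predicate` with
    epsilon-differential privacy (the sensitivity of a count is 1,
    so the noise scale is 1/epsilon)."""
    rng = rng or random.Random()
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon, rng)
```

For example, a data steward could publish `dp_count(records, lambda r: r["age"] >= 50, epsilon=1.0)` instead of the exact count, trading a small amount of statistical noise for a formal privacy guarantee.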
To facilitate inter-agency coordination:
- OSTP should launch a National Science and Technology Council subcommittee on research data privacy within the Committee on Science. This subcommittee should work closely with the Office of Management and Budget, leveraging its expertise in overseeing federal information resources and implementing data management policies. This collaboration would ensure a coordinated and consistent approach to addressing data privacy issues in open science across different federal agencies.
Incorporate open source hardware into Patent and Trademark Office search locations for prior art
Increasingly, scientific innovations reside outside the realm of papers and patents. This is particularly true for open source hardware — hardware designs made freely and publicly available for study, modification, distribution, production, and sale. The shift toward open source aligns well with the White House’s 2023 Year of Open Science and can advance the accessibility and impact of federally funded hardware. Yet as the U.S. government expands its support for open science and open source, it will be increasingly vital that our intellectual property (IP) system is designed to properly identify and protect open innovations. Without consideration of open source hardware in prior art and attribution, these public goods are at risk of being patented over and having their accessibility lost.
Organizations like the Open Source Hardware Association (OSHWA) — a standards body for open hardware — provide verified databases of open source innovations. Over the past six years, for example, OSHWA’s certification program has grown to over 2600 certifications, and the organization has offered educational seminars and training. Despite the availability of such resources, open source certifications and resources have yet to be effectively incorporated into the IP system.
We recommend that the United States Patent and Trademark Office (USPTO) incorporate open source hardware certification databases into the library of resources to search for prior art, and create guidelines and training to build agency capacity for evaluating open source prior art.
Details
Innovative and important hardware products are increasingly being developed as open source, particularly in the sciences, as academic and government research moves toward greater transparency. This trend holds great promise for science and technology, as more people from more backgrounds are able to replicate, improve, and share hardware. A prime example is the 3D printing industry. Once foundational patents in 3D printing were released, there was an explosion of invention in the field that led to desktop and consumer 3D printers, open source filaments, and even 3D printing in space.
For these benefits to be more broadly realized across science and technology, open source hardware must be acknowledged in a way that ensures scientists’ contributions are found and respected by the IP system’s prior art process. Scientists building open source hardware are rightfully concerned that their inventions will be patented over by someone else. Recently, a legal battle ensued after open hardware was wrongly patented over. While the patent was eventually overturned, the fight took time and money, and it revealed important holes in the United States’ prior art system. As another example, the Electronic Frontier Foundation identified more than 30 pieces of prior art that anticipated the ArrivalStar patent’s claims.
Erroneous patents can harm the validity of open source and limit the creation and use of new open source tools, especially in the case of hardware, which relies on prior art as its main protection. The USPTO — the administrator of intellectual property protection and a key actor in the U.S. science and technology enterprise — has an opportunity to ensure that open source tools are reliably identified and considered. Standardized and robust incorporation of open source innovations into the U.S. IP ecosystem would make science more reproducible and ensure that open science stays open, for the benefits of rapid improvement, testing, citizen science, and general education.
Recommendations
We recommend that the USPTO incorporate open source hardware into prior art searches and take steps to develop education and training to support the protection of open innovation in the patenting process.
- USPTO should add OSHWA’s certification – a known, compliant open source hardware certification program – to its non-patent search library.
- USPTO should put out a request for information (RFI) seeking input on (a) optimal approaches for incorporating open source innovations into searches for prior art, and (b) existing databases, standards, or certification programs that can/should be added to the agency’s non-patent search library.
- Based on the results of the RFI, USPTO’s Scientific and Technical Information Center should create guidelines and educational training programs to build examiners’ knowledge and capacity for evaluating open source prior art.
- USPTO should create clear public guidelines for the submission of new databases into the agency’s prior art library, and the requirements for their consideration and inclusion.
Incorporation of open hardware into prior art searches will signify the importance and consideration of open source within the IP system. These actions have the potential to improve the efficiency of prior art identification, advance open source hardware by assuring institutional actors that open innovations will be reliably identified and protected, and ensure open science stays open.
Improve research through better data management and sharing plans
The United States government spends billions of dollars every year to support the best scientific research in the world. The novel and multidisciplinary data produced by these investments have historically remained unavailable to the broader scientific community and the public. This limits researchers’ ability to synthesize knowledge, make new discoveries, and ensure the credibility of research. But recent guidance from the Office of Science and Technology Policy (OSTP) represents a major step forward for making scientific data more available, transparent, and reusable.
Federal agencies should take coordinated action to ensure that data sharing policies created in response to the 2022 Nelson memo incentivize high-quality data management and sharing plans (DMSPs), include robust enforcement mechanisms, and implement best practices in supporting a more innovative and credible research culture.
Details
The 2022 OSTP memorandum “Ensuring Free, Immediate, and Equitable Access to Federally Funded Research” (the Nelson memo) represents a significant step toward opening up not only the findings of science but its materials and processes as well. By including data and related research outputs as items that should be publicly accessible, defining “scientific data” to include “material… of sufficient quality to validate and replicate research findings” (emphasis added), and specifying that agency plans should cover “scientific data that are not associated with peer-reviewed scholarly publications,” this guidance has the potential to greatly improve the transparency, equity, rigor, and reusability of scientific research.
Yet while the 2022 Nelson memo provides a crucial foundation for open, transparent, and reusable scientific data, a preliminary review of agency responses reveals considerable variation in how access to data and research outputs will be handled. Agencies differ in the degree to which policies will be reviewed and enforced, and in how specifically they define data as the materials needed to “validate and replicate” research findings. Finally, agencies could and should go further by including plans to enable the accessibility, discoverability, and citation of researchers’ data sharing plans themselves, fully supporting a research ecosystem built on cumulative scientific evidence.
Recommendations
To better incentivize quality and reusability in data sharing, agencies should:
- Make DMSPs publicly available in an easy-to-use interface on their websites where individual grants are listed to increase accountability for stated plans and discoverability of research outputs.
- Additionally, give DMSPs persistent, unique identifiers (e.g., digital object identifiers, or DOIs) so that they can be cited, read, and used.
- Make DMSPs subject to peer review as part of the same process by which other aspects of a proposed research project’s intellectual merit are evaluated. This will directly incentivize high standards of planned data sharing practices and enable the diffusion of best practices across the research community.
To better ensure compliance and comprehensive availability, agencies should:
- Coordinate across agencies to create a consistent mechanism for DMSP enforcement to reduce applicant uncertainty about agencies’ expectations and processes.
- Approaches to enforcement should include evaluation of past adherence to DMSPs in future grant applications and should ensure that early career researchers and researchers from lower-resourced institutions are not penalized for a lack of a data-sharing record.
- Assert that data includes all digital materials needed for external researchers to replicate and validate findings.
- Work with domain-specific stakeholders to develop guidance for the specific components that should be included as research outputs (e.g., data, codebooks, metadata, protocols, analytic code, preregistrations).
Updates to the Center for Open Science’s efforts to track, curate, and recommend best practices in implementing the Nelson memo will be disseminated through publication and through posting on our website at https://www.cos.io/policy-reform.
Support scientific software infrastructure by requiring SBOMs for federally funded research
Federally funded research relies heavily on software. Despite considerable evidence demonstrating software’s crucial role in research, there is no systematic process for researchers to acknowledge its use, and those building software lack recognition for their work. While researchers want to give appropriate acknowledgment for the software they use, many are unsure how to do so effectively. With greater knowledge of what software underlies research publications, federal research funding agencies and researchers themselves will be better able to make efficient funding decisions, enhance the sustainability of software infrastructure, identify vital yet often overlooked digital infrastructure, and inform workforce development.
All agencies that fund research should require that resulting publications include a Software Bill of Materials (SBOM) listing the software used in the research.
Details
Software is a cornerstone in research. Evidence from numerous surveys consistently shows that a majority of researchers rely heavily on software. Without it, their work would likely come to a standstill. However, there is a striking contrast between the crucial role that software plays in modern research and our knowledge of what software is used, as well as the level of recognition it receives. To bridge this gap, we propose policies to properly acknowledge and support the essential software that powers research across disciplines.
Software citation is one way to address these issues, but citation alone is insufficient as a mechanism to generate software infrastructure insights. In recent years, there has been a push for the recognition of software as a crucial component of scholarly publications, leading to the creation of guidelines and specialized journals for software citation. However, software remains under-cited due to several challenges, including friction with journals’ reference list standards, confusion about which software should be cited and when, and the opacity of roles and dependencies among cited software. Therefore, we need a new approach to this problem.
A Software Bill of Materials (SBOM) is a list of the software components that were used in an effort, such as building application software. Executive Order 14028 requires that all federal agencies obtain SBOMs when they purchase software. For this reason, many high-quality open-source SBOM tools already exist and can be straightforwardly used to generate descriptions of software used in research.
SBOM tools can identify and list the stack of software underlying each publication, even when the code itself is not openly shared. If we were able to combine software manifests from many publications together, we would have the insights needed to better advance research. SBOM data can help federal agencies find the right mechanism (funding, in-kind contribution of time) to sustain software critical to their missions. Better knowledge about patterns of software use in research can facilitate better coordination among developers and reduce friction in their development roadmaps. Understanding the software used in research will also promote public trust in government-funded research through improved reproducibility.
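To illustrate what such a software manifest contains, the sketch below emits a minimal CycloneDX-style SBOM (JSON) for the Python packages visible in the current environment. It is a simplified illustration rather than a spec-complete implementation; in practice, researchers would rely on maintained generators such as cyclonedx-bom or syft, which also capture hashes, licenses, and transitive dependencies.

```python
# Minimal sketch: emit a CycloneDX-style SBOM (JSON) listing the Python
# packages in the current environment. Field names follow the CycloneDX
# JSON layout; production SBOMs would come from a maintained tool.
import json
from importlib import metadata


def build_sbom():
    components = []
    for dist in metadata.distributions():
        name = dist.metadata.get("Name")
        if not name:  # skip distributions with unreadable metadata
            continue
        components.append({
            "type": "library",
            "name": name,
            "version": dist.version,
        })
    return {
        "bomFormat": "CycloneDX",
        "specVersion": "1.5",
        "version": 1,
        "components": sorted(components, key=lambda c: c["name"].lower()),
    }


if __name__ == "__main__":
    print(json.dumps(build_sbom(), indent=2))
```

A manifest like this, attached to a publication, is machine-readable: an agency could aggregate the `components` lists across thousands of papers to see which packages its funded research depends on most.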
Recommendation
We recommend the adoption of Software Bills of Materials (SBOMs) — which are already used by federal agencies for security reasons — to understand the software infrastructure underlying scientific research. Given their mandatory use by software suppliers to the federal government, SBOMs are ideal for highlighting software dependencies and potential security vulnerabilities. The same tools and practices can be used to generate SBOMs for publications. We therefore recommend that all agencies that fund research require resulting publications to include an SBOM listing the software used in the research. Additionally, for research that has already been published with supplementary code materials, SBOMs should be generated retrospectively. This will not only address the issue of software infrastructure sustainability but also enhance the verification of research by clearly documenting the specific software versions used and by directing limited funds to the software maintenance that most needs them.
- The Office of Science and Technology Policy (OSTP) should coordinate with agencies to undertake feasibility studies of this policy, building confidence that it would work as intended.
- Coordination should include funding agencies, federal actors currently applying SBOMs in software procurement, organizations developing SBOM tools and standards, and scientific stakeholders.
- Based on the results of the study, OSTP should direct funding agencies to design and implement policies requiring that publications resulting from federal funding include an openly accessible, machine-readable SBOM for the software used in the research.
- OSTP and the Office of Management and Budget should additionally use the Multi-Agency Research and Development Budget Priorities to encourage agencies’ collection, integration, and analysis of SBOM data to inform funding and workforce priorities and to catalyze additional agency resource allocations for software infrastructure assessment in follow-on budget processes.
Create an Office of Co-Production at the National Institutes of Health
The National Institutes of Health (NIH) spent $49 billion in fiscal year 2023 on research and development, a significant annual investment in medical treatment discovery and development. Despite NIH’s research investments producing paradigm-shifting therapies, such as CAR-T cancer treatments, CRISPR-enabled gene therapy for sickle cell, and the mRNA vaccine for COVID-19, the agency and medical scientists more broadly are grappling with declining trust. This further compounds decades-long mistrust in medical research among marginalized populations, whom researchers struggle to recruit as participants in medical research. If things do not improve, a lack of representation may lead to a lack of access to effective medical interventions, worsen health disparities, and cost hundreds of billions of dollars.
A new paradigm for research is needed to ensure meaningful public engagement and rebuild trust. Co-production — in which researchers, patients, and practitioners work together as collaborators — offers a framework for embedding collaboration and trust into the biomedical enterprise.
The National Institutes of Health should form an Office of Co-Production in the Office of the Director, Division of Program Coordination, Planning, and Strategic Initiatives.
Details
In accordance with Executive Order 13985 and ongoing public access initiatives, science funding and R&D agencies have been seeking ways to embed equity, accessibility, and public participation into their processes. The NIH has been increasingly working to advance publicly engaged and led research, illustrated by trainings and workshops around patient-engaged research, funding resources for community partnerships like RADx Underserved Populations, community-led research programs like Community Partnerships to Advance Science for Society (ComPASS), and support from the new NIH director.
To ensure that public engagement efforts are sustainable, it is critical to invest in lasting infrastructure capable of building and maintaining these ties. Indeed, in their Recommendation on Open Science, the United Nations Educational, Scientific, and Cultural Organization outlined infrastructure that must be built for scientific funding to include those beyond STEMM practitioners in research decision-making. One key approach involves explicitly supporting the co-production of research, a process by which “researchers, practitioners and the public work together, sharing power and responsibility from the start to the end of the project, including the generation of knowledge.”
Co-production provides a framework with which the NIH can advance patient involvement in research, health equity, uptake and promotion of new technologies, diverse participation in clinical trials, scientific literacy, and public health. Doing so effectively would require new models for including and empowering patient voices in the agency’s work.
Recommendations
The NIH should create an Office of Co-Production within the Office of the Director, Division of Program Coordination, Planning, and Strategic Initiatives (DPCPSI). The Office of Co-Production would institutionalize best practices for co-producing research, train NIH and NIH-funded researchers in co-production principles, build patient-engaged research infrastructure, and fund pilot projects to build the research field.
The NIH Office of Co-Production, co-led by patient advocates (PA) and NIH personnel, should be established with the following key programs:
- A Resources and Training Program that trains patient advocates and researchers, both separately and together, so they can understand each other and work as collaborators. This work would include helping researchers understand the communities affected by the diseases they study, build relationships, and address power differentials; it would also help patient advocates understand research processes, including disease pathogenesis, different mechanisms of action and targets for research, and clinical research processes such as regulatory requirements and ethical considerations. PAs could also be trained to qualify to serve on Data and Safety Monitoring Boards (DSMBs).
- A Patient Advocate Advisors Management Program that would manage the placement of PAs into community advisory bodies, into advisory roles to NIH institutes’ major initiatives, onto ethical advisory bodies, onto DSMBs, onto peer review committees and study sections, and onto key long-range planning bodies, including those determining research prioritization.
- A Co-Production Principles and Practice Program led by a senior team of PAs and advisors that coordinates, organizes, and facilitates cross-disease understanding and solidarity and establishes basic principles for patient advocate engagement, grant requirements, and ongoing assessment of the quality of co-production and relational infrastructure. This program will focus on key principles such as:
- Sharing of power – the research is jointly owned and people work together to achieve a joint understanding
- Including all perspectives and skills – make sure the research team includes all those who can make a contribution
- Respecting and valuing the knowledge of all those working together on the research – everyone is of equal importance
- Reciprocity – everybody benefits from working together
- Building and maintaining relationships – an emphasis on relationships is key to sharing power. There must be joint understanding, consensus, and clarity over roles and responsibilities. It is also important to value people and unlock their potential.
- A Communications, Planning, and Policy Program that works with the NIH director and institute directors to advocate for mutual goals to advance the public engagement mission of the NIH and its institutes.
- A Grantmaking Program that can pilot the expansion and scaling of NIH-sponsored Co-Production Cores and support the involvement of patient advocates in NIH-funded research across the country through equitable participation and standard compensation policies.
Creating an Office of Co-Production would achieve the following goals:
- It would address the growing gulf between the public, who ultimately fund biomedical research with their tax dollars, and researchers by directly and meaningfully engaging patient advocates in biomedical and clinical science. Co-production builds relationships and trust as it requires that relationships are valued and nurtured and that efforts are made to redress power differentials.
- By working early and often with patient populations around treatments, co-production helps medical scientists better anticipate and address risk early in the research process.
- It would institutionalize a known model for collaborative research that efficiently uses research dollars. During the HIV/AIDS crisis, rapid advances in biomedical and clinical research were made possible by patient advocate involvement in trial design, recruitment, and analysis.
- The Office of Co-Production would create a replicable model of institutional support for co-production that can be scaled across federal R&D agencies. The NIH should regularly report on the office’s progress to encourage replication in other agencies that can benefit from increased public participation.