Piloting and Evaluating NSF Science Lottery Grants: A Roadmap to Improving Research Funding Efficiencies and Proposal Diversity

This memo was jointly produced by the Federation of American Scientists & the Institute for Progress

Summary

The United States no longer leads the world in basic science. There is growing recognition of a gap in translational activities — the fruits of American research do not convert to economic benefits. As policymakers consider a slew of proposals that aim to restore American competitiveness with once-in-a-generation investments into the National Science Foundation (NSF), less discussion has been devoted to improving our research productivity — which has been declining for generations. Cross-agency data indicates that this is not the result of a decline in proposal merit, nor of a shift in proposer demographics, nor of an increase (beyond inflation) in the average requested funding per proposal, nor of an increase in the number of proposals per investigator in any one year. As the Senate’s U.S. Innovation and Competition Act (USICA) and House’s America COMPETES Act propose billions of dollars to the NSF for R&D activities, there is an opportunity to bolster research productivity, but it will require exploring new, more efficient ways of funding research.

The NSF’s rigorous merit review process has long been regarded as the gold standard for vetting and funding research. However, since its inception in the 1950s, emergent circumstances — such as the significant growth in the overall population of principal investigators (PIs) — have introduced a slew of challenges and inefficiencies to the traditional peer-review grantmaking process: the tax on research productivity as PIs submit about 2.3 proposals for every award they receive and spend an average of 116 hours grant-writing per NSF proposal (i.e., “grantsmanship”), corresponding to a staggering loss of nearly 45% of researcher time; the orientation of grantsmanship towards incremental research with the highest likelihood of surviving highly competitive, consensus-driven, points-based review (versus riskier, novel, or investigator-driven research); and rating bias against interdisciplinary research and previously unfunded researchers, as well as reviewer fatigue. The result of such inefficiencies is unsettling: as fewer applicants are funded as a percentage of the increasing pool, some economic analysis suggests that the value of the science that researchers forgo for grantsmanship may exceed the value of the science that the funding program supports.

Our nation’s methods of supporting new ideas should evolve alongside our knowledge base. Science lotteries — when deployed as a complement to the traditional peer-review grant process — could improve the system’s overall efficiency-cost ratio by randomly selecting for funding a small percentage of high-quality proposals that were already prepared and reviewed yet went unfunded. Tested with majority-positive feedback from participants in New Zealand, Germany, and Switzerland, science lotteries would introduce an element of randomness that could unlock innovative, disruptive scholarship across underrepresented demographics and geographies.

This paper proposes an experimental NSF pilot of science lotteries, and the Appendix provides illustrative draft legislative text. In particular, the House and Senate Science Committees should consider the addition of tight language in the U.S. Innovation and Competition Act (Senate) and the America COMPETES Act (House) that authorizes the use of “grant lotteries” across all NSF directorates, including the new Directorate for Technology and Innovation. This language should carry the spirit of expanding the geography of innovation and of evidence-based reviews that test what works.

Challenge and Opportunity

A recent NSF report pegged the United States as behind China in key scientific metrics, including the overall number of papers published and patents awarded. The numbers are sobering but reflect the growing understanding that America must pick which frontiers of knowledge it seeks to lead. One of these fields should be the science of science — in other words, not just deciding which science and technology innovations we hope to pursue, but discovering new, more efficient ways to pursue them.

Since its inception in 1950, NSF has played a critical role in advancing the United States’ academic research enterprise and has strengthened our leadership in scientific research across the world. In particular, the NSF’s rigorous merit review process has been described as the gold standard for vetting and funding research. However, growing evidence indicates that, while praiseworthy, the peer review process has been stretched to its limits. In particular, the growing overall population of researchers has introduced a series of burdens on the system.

One NSF report rated nearly 70% of proposals as equally meritorious, while only one-third received funding. With a surplus of competitive proposals, reviewing committees often face tough close calls. In fact, empirical evidence has found that award decisions change nearly a quarter of the time when re-reviewed by a new set of peer experts. In response, PIs spend upwards of 116 hours on each NSF proposal to conform to grant expectations and must submit an average of 2.3 proposals to receive an award — a process known as “grantsmanship” that survey data suggests occupies nearly 45% of top researchers’ time. Even worse, this grantsmanship is oriented towards writing proposals on incremental research topics (versus riskier, novel, or investigator-driven research), which have a higher likelihood of surviving a consensus-driven, points-based review. On the reviewer side, data supports a clear rating bias against interdisciplinary research and previously unfunded PIs, while experts increasingly decline invitations to review proposals in the interest of protecting their dwindling time (i.e., reviewer fatigue).

These tradeoffs in the current system appear quite troubling and merit further investigation of alternative and complementary funding models. At least one economic analysis suggests that as fewer applicants are funded as a percentage of the increasing pool, the value of the science that researchers forgo because of grantsmanship often exceeds the value of the science that the funding program supports. In fact, despite dramatic increases in research effort, America has for generations been facing dramatic declines in research productivity. And empirical analysis suggests this is not necessarily the result of a decline in proposal merit, nor of a shift in proposer demographics, nor of an increase (beyond inflation) in the average requested funding per proposal, nor of an increase in the number of proposals per investigator in any one year.

As the Senate’s U.S. Innovation and Competition Act (USICA) and House’s America COMPETES Act propose billions of dollars to the NSF for R&D activities, about 96% of which will be distributed via the peer-reviewed, meritocratic grant awards process, now is the time to apply the scientific method to ourselves by experimenting with alternative and complementary mechanisms for funding scientific research.

Science lotteries, an effort tested in New Zealand, Switzerland, and Germany, represent one innovation particularly suited to reduce the overall taxes on research productivity while uncovering new, worthwhile initiatives for funding that might otherwise slip through the cracks. In particular, modified science lotteries, such as the one proposed here, select a small percentage of well-qualified grant applications at random for funding. By selecting only from a pool of high-value projects, the lottery supports additional, quality research with minimal comparative costs to the researchers or reviewers. In a lottery, the value to the investigator of being admitted to the lottery scales directly with the number of awards available.

These benefits translate to favorable survey data from PIs who have gone through science lottery processes. In New Zealand, for example, the majority of scientists supported a random allocation of 2% of total research expenditures. Sunny Collings, chief executive of New Zealand’s Health Research Council, recounted:

“Applications often have statistically indistinguishable scores, and there is a degree of randomness in peer review selection anyway. So why not formalize that and try to get the best of both approaches?”

By establishing conditions for entrance into the lottery — such as selecting for certain less funded or represented regions — NSF could also over-index for those applicants less prepared for “grantsmanship”.

What we propose, specifically, is a modified “second chance” lottery, whereby proposals that are deemed meritorious by the traditional peer-review process, yet are not selected for funding, are entered into a lottery as a second stage in the funding process. This modified format ensures a high level of quality in the projects selected by the lottery to receive funding while still creating a randomized baseline to which the current system can be compared.
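The two-stage mechanism described above can be sketched in a few lines of code. This is an illustrative model only, not NSF procedure: the function name, the use of a single merit score per proposal, and the threshold parameter are all simplifying assumptions.

```python
import random

def second_chance_lottery(proposals, budget_awards, lottery_awards,
                          merit_threshold, seed=None):
    """Illustrative sketch of a 'second chance' grant lottery.

    proposals: list of (proposal_id, merit_score) pairs from peer review
               (a single numeric score per proposal is a simplification).
    budget_awards: number of awards funded by merit rank, as today.
    lottery_awards: additional awards drawn at random from the
                    meritorious-but-unfunded pool.
    merit_threshold: minimum score to qualify for the lottery stage.
    """
    # Stage 1: traditional merit review funds the top-ranked proposals.
    ranked = sorted(proposals, key=lambda p: p[1], reverse=True)
    funded = ranked[:budget_awards]

    # Stage 2: proposals deemed meritorious yet unfunded enter the lottery.
    pool = [p for p in ranked[budget_awards:] if p[1] >= merit_threshold]
    rng = random.Random(seed)
    lottery_winners = rng.sample(pool, min(lottery_awards, len(pool)))
    return funded, lottery_winners
```

Because the lottery draws only from proposals that already cleared the merit bar, it adds awards without adding review burden; the `seed` parameter simply makes the sketch reproducible.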

The use of science lotteries in the United States as a complement to the traditional peer-review process is likely to improve the overall system. However, it is possible that selecting among well-qualified grants at random could introduce unexpected outcomes. Unfortunately, direct, empirical comparisons between the NSF’s peer review process and partial lotteries do not exist. Through a pilot, the NSF has the opportunity to evaluate to what extent the mechanism could supplement the NSF’s traditional merit review process.

By formalizing a randomized selection process to use as a baseline for comparison, we may discover surprising things about the makeup of, and processes that lead to, successful or high-leverage research, at reduced cost to researchers and reviewers. For instance, it may be the case that younger scholars from non-traditional backgrounds are selected at higher rates through the lottery than through the traditional NSF grantmaking process, yet achieve research outcomes as good as or better than those of the typical NSF grantee. If this is the case, then there will be some evidence that something in the selection process is unfairly penalizing non-traditional candidates.

Alternatively, we may discover that the average grant selected through the lottery is mostly indistinguishable from the average grant selected through traditional merit-based selection, which would provide some evidence that existing administrative burdens to select candidates are too stringent. Or perhaps we will discover that randomly selected winners in fact produce fewer noteworthy results than candidates selected through traditional means, which would be evidence that the existing process is providing tangible value in filtering funding proposals. By providing a baseline for comparison, a lottery would offer an evidence-based means of assessing the efficacy of the current peer-review system. Any pilot program should therefore make full use of a menu of selection criteria to toggle outcomes, while also undergoing evaluations from internal and external scientific communities.
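The comparison described above is, at bottom, a two-sample test: do lottery-funded and traditionally funded awards differ in some outcome metric? A minimal sketch of such an evaluation, using a permutation test, is below; the outcome metric (e.g., citation counts per award) and all data are hypothetical, and a real evaluation would use richer outcome measures and controls.

```python
import random

def permutation_test(lottery_outcomes, traditional_outcomes,
                     n_permutations=10_000, seed=0):
    """Two-sided permutation test on the difference in mean outcomes.

    Returns an approximate p-value for the null hypothesis that
    lottery-funded and traditionally funded awards have the same
    average outcome (outcome metric is hypothetical).
    """
    mean = lambda xs: sum(xs) / len(xs)
    observed = mean(lottery_outcomes) - mean(traditional_outcomes)

    combined = list(lottery_outcomes) + list(traditional_outcomes)
    n = len(lottery_outcomes)
    rng = random.Random(seed)

    # Count how often a random relabeling of awards produces a mean
    # difference at least as extreme as the one actually observed.
    extreme = 0
    for _ in range(n_permutations):
        rng.shuffle(combined)
        diff = mean(combined[:n]) - mean(combined[n:])
        if abs(diff) >= abs(observed):
            extreme += 1
    return extreme / n_permutations
```

A large p-value would be consistent with the "indistinguishable outcomes" scenario; a small one, with the traditional process adding real filtering value.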

Plan of Action

Recommendation 1: Congress should direct the NSF to pilot experimental lotteries through America COMPETES and the U.S. Innovation and Competition Act, among other vehicles. 

In reconciling the differing House America COMPETES and Senate USICA, Congress should add language that authorizes a pilot program for “lotteries.” 

We recommend opting for signaling language and follow-on legislation that adds textual specificity. For example, in the latest text of the COMPETES Act, the responsibilities of the Assistant Director of the Directorate for Science and Engineering Solutions could be amended to include “lotteries”:

Sec. 1308(d)(4)(E). developing and testing diverse merit-review models and mechanisms, including lotteries, for selecting and providing awards for use-inspired and translational research and development at different scales, from individual investigator awards to large multi-institution collaborations;

Specifying language should then require the NSF to employ evidence-based evaluation criteria and grant it the flexibility to determine the timeline of the lottery intake and award mechanisms, with the broader goals of timeliness and of supporting equitable distribution among regional innovation contenders.

The appendix contains one example structure of a science lottery in bill text (incorporated into the new NSF directorate established by the Senate-passed United States Innovation and Competition Act), which reflects key policy choices that Congress should consider.

Recommendation 2: Create a “Translational Science of Science” Program within the new NSF Technology, Innovation and Partnerships Directorate that pilots the use of lotteries with evidence-based testing: 

First, the NSF Office of Integrative Activities (OIA) should convene a workshop with relevant stakeholders — including representatives from each directorate, the research community (both NSF grant recipients and non-recipients), and subject-matter experts on programmatic implementation from New Zealand, Germany, and Switzerland — in order to temperature- and pressure-test key criteria for implementing piloted science lotteries across directorates.

Appendix: Bill Text

Note: Please view attached PDF for the formatted bill text

H. ______

To establish a pilot program for National Science Foundation grant lotteries.

In the House of Representatives of the United States

February 2, 2022

______________________________

A BILL

Title: To establish a pilot program for National Science Foundation grant lotteries.

Be it enacted by the Senate and House of Representatives of the United States of America in Congress assembled, 

SEC. _____. Pilot Program to Establish National Science Foundation Grant Lotteries

Right to Review.—Nothing in this section shall affect an applicant’s right to review, appeal, or contest an award decision.

Supporting Market Accountability, Workplace Equity, and Fair Competition by Reining in Non-Disclosure Agreements

Summary

Overuse of non-disclosure agreements (NDAs) is a pervasive problem in the United States. Companies apply these silencing tools to prevent their workers from sharing critical information with one another and the public. This in turn threatens economic growth, limits competition, and inhibits workplace equity. Workers need reliable information about corporate practices to assess job quality, ensure personal safety, and obtain pay commensurate with their worth. The public needs information about corporate practices to decide how to use their investment and purchasing power. Yet existing laws give companies enormous latitude to designate information as confidential, allowing them to impose NDAs and other contract clauses and internal policies that prevent workers from sharing information with those who need to know.

It is time for government to rein in corporate secrecy. The #MeToo movement revealed how NDAs enable and perpetuate misconduct at work, prompting public outrage and support for legislative action. New empirical evidence has exposed just how widely NDAs are being used in the corporate world: researchers estimate that between 33% and 57% of U.S. workers are constrained by an NDA or similar mechanism.1 2 At recent hearings and public events, regulators have signaled their concern about the anti-competitive effects of restrictive employment agreements.3 Policymakers should seize this moment of support to pursue a comprehensive legislative and multi-agency agenda limiting inappropriate use of NDAs. A strong action plan should include proactive enforcement of existing laws governing NDAs; new legislation prohibiting the most harmful uses of NDAs; and interagency collaboration to educate the public, collect data, and support research on impacts of corporate secrecy practices. Together, these efforts to limit NDA abuse will promote market accountability, workplace equity, and fair competition.

Challenge and Opportunity

NDAs are contracts in which parties agree not to disclose any information designated confidential by the agreement. In some cases, NDAs may be used appropriately to protect valuable trade secrets or other intellectual property. But employers often draft these agreements broadly to conceal many other types of information, sometimes in ways that overstep existing legal bounds. For instance, the Weinstein Company required employees to sign NDAs that prohibit disclosure of “any confidential, private, and/or non-public information obtained by Employee during Employee’s employment with the Company concerning the personal, social, or business activities of the Company, the Co-Chairmen, or the executives, principals, officers, directors, agents, employees of, or contracting parties (including, but not limited to artists) with, the Company.”4 Some companies require employees to sign non-disparagement agreements. These particularly broad NDAs prohibit employees from disclosing any information that might, as a non-disparagement agreement for employees of Task Rabbit reads, “disparage the Company, and the Company’s officers, directors, employees, investors and agents, in any manner likely to be harmful to them or their business, business reputation or personal reputation.”5  NDAs and non-disparagement agreements often purport to apply indefinitely, preventing workers from sharing information long after they have left employment.

NDAs are imposed on workers at various points during the employment relationship. They are regularly included as part of a bundle of mandatory HR forms that new hires must sign as a condition of employment. They can also be imposed and enforced through confidentiality policies contained6 in personnel manuals or codes of conduct that prevent employees from sharing information about the company with outsiders and sometimes even with co-workers. They are also routinely included in standardized as well as negotiated severance agreements that workers sign when ending their employment. Lastly, they are also often included in settlement agreements that resolve workplace disputes and in agreements that force employees to arbitrate disputes in secret. By preventing workers from disclosing information on everything from workplace harassment and abuse to compensation practices and safety conditions, NDAs stifle competition, limit the free flow of ideas,7 and allow toxic workplace conditions to fester.8 9 10

Prevalence of NDAs

Researchers estimate that between 33% and 57% of U.S. workers are constrained by an NDA or similar mechanism.11 12 Yet it is difficult to precisely determine how many employees are silenced by NDAs because NDAs are designed to conceal information. In fact, NDAs often provide that the mere existence of the agreement is itself a secret. Lawyers regularly encourage firms13 to use broad NDAs as a condition of employment—not only to protect trade secrets, but also to discourage employees from revealing bad employment experiences.14 NDA prevalence also varies by sector: for instance, 73% of workers in “computer or mathematical jobs” report having an NDA with their employer.15 16

How NDAs Hurt Workers, the Public, and the Economy

The overuse of broad NDAs can have harmful economic and social effects. Depending on how they are drafted and enforced, NDAs may undermine law enforcement and regulatory compliance, distort labor and investment markets, constrain fair competition, allow toxic workplace conditions such as harassment and discrimination to persist, and undercut efforts to make workplaces more diverse and equitable.

Interference with Law Enforcement and Regulatory Compliance

Social, psychological, and economic disincentives already discourage employees from blowing the whistle on harmful and illegal corporate behavior.17 NDAs add another barrier preventing this critical information from reaching regulators and the public. NDAs have been used by companies to cover up illegal behavior. They have been used to silence whistleblowers who disclose information about products that threaten public health and safety.18 They have even been used to prevent employees from disclosing illegal conduct to government regulators despite countervailing law. A complaint filed by the California Department of Fair Employment and Housing (DFEH) against gaming company Activision Blizzard alleges that, contrary to law, the company pressured employees to sign contracts waiving their right to speak to investigators and requiring them to notify the company before disclosing information to DFEH.19 Some companies have required employees to agree to secrecy about corporate pay practices and diversity statistics, thereby depriving regulators of vital information about companies’ compliance with pay equity and anti-discrimination laws.20 The dangers of overly aggressive NDAs have become especially clear during the COVID-19 pandemic, when it is vital for the public to know if companies are disregarding essential health and safety guidelines designed to reduce virus spread.

Market Distortion

NDAs deprive individuals of information they need to assess competing job offers and make informed decisions about where to work. They also degrade the reliability of employer reviews that workers post to online job platforms. This is because workers subject to broad NDAs are more likely to censor themselves and withhold negative information. New research shows that on Glassdoor, workers in states with more stringent limits on NDAs are 16% more likely to give a one-star review, write 8% more about the “cons” of working at the firm, and discuss harassment at work 22% more often.21 That same research also shows that states with more stringent limits on NDAs increase reporting of sexual harassment and safety violations to federal agencies. NDAs hence remove an important check on corporate behavior, since companies have been shown to improve their practices in response to negative job reviews and investigations into their practices.22 NDAs thus enable bad employers to hide their flaws and make it difficult for good employers to distinguish themselves in the market.

Accurate information about workplace conditions is also valuable to investors, who have increasingly come to recognize that the ways companies treat their workers impact corporate financial performance.23 A nonprofit investment group recently called on the Securities and Exchange Commission to develop a standardized set of workplace-practice metrics as part of a comprehensive framework for evaluating socially responsible corporate governance.24 NDAs can hide information about workplace conditions that investors value.

Constraints on Fair Competition

Broad NDAs can impede fair competition. Research has demonstrated that non-compete agreements—which prohibit departing workers from joining competitors—impede worker mobility, economic growth, and new firm entry. Broad NDAs pose some of the same competitive risks as non-competes because they limit workers’ ability to share and apply knowledge gained through on-the-job experience. This in turn diminishes workers’ human capital and makes them less competitive in the labor market.25 Indeed, employers in states that ban non-competes have illegally attempted to use broad NDAs as an alternative mechanism to impede employee mobility.26

Harassment and Discrimination

NDAs conceal harassment, discrimination, and abuse in the workplace. As the #MeToo movement showed, perpetrators of harassment and discrimination are often repeat offenders.27 28 NDAs may prevent victims of harassment and discrimination from warning co-workers and prospective employees about a company’s toxic workplace environment, leaving others at risk. NDAs may also prohibit or inhibit employees from disclosing information to government agencies, shielding offenders from outside investigation. By limiting what employees can share, NDAs allow harmful and abusive behavior to persist.

Diversity, Equity, and Inclusion

Restrictions on employee disclosure of harassment and discrimination undermine the goal of achieving diverse and equitable workplaces. Workers of color, women, and LGBTQ+ workers are disproportionately likely to suffer harassment and discrimination in the workplace. Such adverse experiences can have significant psychological and professional consequences, including driving workers out of certain jobs and even out of certain industries.29 30 NDAs exacerbate these harms by suppressing information about systemic workplace inequities and by denying workers a forum to expose and discuss harassment and discrimination.

Corporate-secrecy practices shrouding employee compensation similarly undermine efforts of diverse employees to achieve pay equity. Contrary to law, some NDAs and confidentiality policies prohibit employees from discussing their compensation, which makes it challenging for those employees to negotiate fair salary terms commensurate with their value.31 Studies have found that states that adopted anti-secrecy pay laws increased gender wage equality relative to states that did not.32 33

National Leadership Is Needed

As described above, overly broad NDAs and the organizational secrecy practices they support pose serious risks to our economy and our society. Yet absent government intervention, these challenges will persist. Individual firms have incentives to maintain their reputations using corporate-secrecy tactics despite the social costs of such behavior. Many of those who value the information concealed by NDAs lack the capacity and power to pressure companies to change. Policymakers have an imperative to use the levers of government to curb NDA abuse.

A minority of states, including California, Illinois, New Jersey, New York and Washington, passed legislation in the wake of #MeToo regulating some uses of NDAs. But these laws comprise an inconsistent and incomplete regulatory patchwork. State laws differ in scope of coverage and impose different compliance standards, making it difficult for employees and companies to determine what employee disclosures are legally protected where. Moreover, the harms caused by NDAs do not stop at state lines. In fact, uneven regulation of NDAs further distorts markets by making it easier for companies to conceal information and restrict competition in some states than in others. Multi-state firms can use choice of law and choice of forum provisions to exploit inter-state legislative discrepancies, i.e., to apply the most lenient state-level secrecy laws to the entirety of a multi-state workforce.

The upshot is clear. National leadership is the only way to support market accountability, workplace equity, and fair competition by reining in non-disclosure agreements.

Plan of Action

Multiple policy interventions could curtail NDA misuse. Select options are presented below.

Better Enforce Existing Laws

Existing laws restrict some of the harmful uses of NDAs. But laws must be enforced to be effective. Research shows that some employers include unlawful non-compete clauses in their employment contracts, capitalizing on workers’ ignorance of the law and fears of being sued. Employers may similarly use NDAs in ways that violate existing law.34 35 Ensuring that employers are following laws that protect certain disclosures and forms of communication is a common-sense place to start when it comes to curbing NDA abuse.36

The Federal Trade Commission (FTC) has an important role to play in enforcement. The FTC has broad authority to punish companies engaging in unfair or deceptive practices that harm consumers or competitors, as NDA misuse often does.37 Unfair practices include practices that offend public policy as established by state statutes and common law,38 which already restrict use of overly broad NDAs as well as NDA misuse to silence disclosures of employer wrongdoing. Stronger enforcement by the FTC would give these laws some needed teeth and would help establish norms governing responsible NDA use. The FTC could also work with companies to develop standards and best practices around NDA use and to encourage companies to engage in robust self-regulation and police one another.39

Stronger enforcement of existing laws could also come from the various federal agencies that help oversee labor and employment in the United States. For example, the National Labor Relations Act protects workers who make common cause in seeking to discuss the terms and conditions of their employment. Regional offices of the National Labor Relations Board (NLRB) have the authority to investigate employers’ use of policies that discourage this type of communication, and to file unfair labor practice charges against employers acting unlawfully. The NLRB can and should exercise this authority more forcefully. Similarly, the Occupational Safety and Health Administration (OSHA) could better use its power to enforce whistleblower laws protecting employees who report unlawful behavior. The Equal Employment Opportunity Commission (EEOC) and the Department of Labor (DOL) also receive complaints about unlawful employment practices, including retaliation against employees who report or object to discriminatory behavior, wage theft, and other violations. The EEOC and DOL should actively seek information about NDAs and related practices in the course of their investigations and should make pursuit of retaliation claims a top priority.

In addition to redoubling enforcement efforts in their respective spheres of jurisdiction, the aforementioned agencies should collaboratively develop and implement strategies for amplifying the collective impact of their oversight.

Prohibit the Most Pernicious Uses of NDAs

New federal laws should be enacted to ban employer-imposed secrecy regarding key categories of essential information, including firm diversity, harassment and discrimination, compensation practices, and workplace health and safety. The recently proposed Ending the Monopoly of Power Over Workplace Harassment through Education and Reporting Act (EMPOWER Act) would make it illegal for an employer to require or enforce an NDA or nondisparagement clause related to workplace harassment based on a range of protected characteristics, including sex, race, national origin, disability, age, or religion. The proposed law, which enjoys bipartisan support, would also establish a confidential tip line for reporting systematic workplace harassment.

The EMPOWER Act is a step in the right direction, but federal legislation should go even further. New laws are needed to protect a wider range of disclosures and to ensure that employees know their rights. A section of California’s Silenced No More Act provides one example. It prohibits companies from using NDAs to silence employees not only about harassment, but also about discrimination and other illegal conduct. To ensure that employees know their rights, the act requires employers who use NDAs for lawful purposes to include in these contracts language clarifying that “[n]othing in this agreement prevents you from discussing or disclosing information about unlawful acts in the workplace, such as harassment or discrimination or any other conduct that you have reason to believe is unlawful.”

The federal Defend Trade Secrets Act (DTSA) provides another example of a provision that could be incorporated into new legislation aimed at reining in NDAs.40 Signed into law by President Obama in 2016, the DTSA requires employers to include language in all employment contracts notifying employees that they are immune from liability when blowing the whistle on unlawful employer behavior, even if doing so involves revealing trade secrets. This notice requirement could be expanded to cover any discussions about workplace conditions. It could also clarify that NDAs may cover only technical information that is truly secret and not general skills, know-how, and job-related experience. This would give workers more freedom to leverage their knowledge in competing for quality jobs and market-based terms of employment.

Enacting a reform agenda that impacts the widest possible swath of employers requires Congressional action. However, President Biden could act immediately to limit NDA use by federal contractors. The president has the power to issue an executive order restricting or prohibiting the federal government from entering into contracts with companies that fail to adhere to certain rules. President Biden could issue an executive order requiring that federal contractors adhere to new rules prohibiting use of NDAs to conceal essential information, including information on firm diversity, harassment and discrimination, compensation practices, workplace health and safety, and other areas of regulatory compliance. In addition to the benefits discussed above, such rules could help prevent concealment of fraud by government contractors.

Collect Data and Require Disclosure

Research tells us that NDAs are common in American workplaces. Recent events have shown that some employers use NDAs to cover up unlawful behavior. Yet information on the prevalence and content of NDAs is still relatively scarce. Employers are not currently required to disclose their NDAs to any outside party or government regulator. Employers are also free to prohibit employees who sign NDAs from even revealing that the agreement exists. Without adequate information on the scope and nature of the NDA problem, it is difficult for lawmakers to craft well-tailored policy solutions that account for a variety of stakeholder concerns. Any law limiting NDAs must balance the damage that concealing information from the public imposes against the value of NDAs for employers when used appropriately. Legislation must also consider the personal interests of victims of misconduct who may prefer to keep their experiences secret.

Policymakers should therefore require organizations to disclose their NDAs and related clauses in employment agreements. The FTC should use its investigative authority under Section 6(b) of the FTC Act to gather and study these documents. The SEC should also consider requiring disclosure of companies’ use of NDAs as part of its broader response to investor demand for credible information about human-capital management and environmental, social, and governance performance.

In addition, the various agencies that investigate violations of employment laws should collaborate to conduct more research on the scope and effects of NDAs (as well as other corporate-secrecy practices) across states and industries. For instance, the EEOC already receives annual reports from employers about worker demographics, salary breakdowns by gender and race, and other employment information. A coordinated agency effort could provide insight into how NDAs affect diversity and equity in employment. Developing this type of data will help lawmakers assess the anti-competitive effects of corporate secrecy, balance competing policy interests, and draft effective legislation.

Fund Organizations, Not Projects: Diversifying America’s Innovation Ecosystem with a Portfolio of Independent Research Organizations

Summary

Dominant research-funding paradigms constrain the outputs of America’s innovation systems. Federal research-funding agencies like the National Institutes of Health (NIH) and the National Science Foundation (NSF) operate largely through milestone-scoped grants that fail to incentivize high-risk research, impose highly burdensome reporting requirements, and are closely managed by the government. Philanthropically-funded research organizations are an excellent mechanism to experiment with different research management approaches. However, they are perennially underfunded and rarely have a path to long-term sustainability.

A single program with two pieces can address this issue: 

First, the NSF’s new Technology, Innovation, and Partnership (TIP) Directorate should pilot an “organizations, not projects” program in which philanthropically matched grants fund a portfolio of independent research organizations instead of funding specific research initiatives. Partnering with philanthropies will leverage the diversity of American donors to identify a portfolio of research organizations with diverse constraints (and therefore the potential to create outlier outcomes). To have a significant impact, this pilot funding opportunity should be funded at $100 million per year for 10 years.

Second, drawing on the ideas of Kanjun Qiu and Michael Nielsen, the NSF should set aside an additional $100 million per year to sponsor independent research organizations with impressive track records for extended periods of time. This commitment to “acquire” successful organizations will complement Part One’s research-funding opportunity in two ways. First, it will encourage philanthropic participation by making philanthropies feel like their money is going towards something that won’t die the moment they stop funding it. Additionally, it will enable the federal government to leverage the institutional knowledge created by successful experiments in research funding and management.

If successful, this two-part program can later be replicated by other federal agencies. The Administration and Congress should prioritize funding this program in recognition of three converging facts: one, that federal spending on research and development (R&D) is increasing; two, that the American innovation ecosystem is not working as well as it once did; and three, that new institutional structures for managing research (e.g., Focused Research Organizations, private Advanced Research Projects Agencies (ARPAs), and “science angels”) are proliferating. Swift action could use the increased budgets to empower new organizations to experiment with new ways of organizing R&D in order to address the current system’s sclerosis.

Challenge and Opportunity

There is a growing consensus that there is a gap between the speed and efficiency of R&D projects closely managed by the government and R&D projects managed by the private sector. 

Federal funding is a major part of the American R&D ecosystem. However, most federal research funding comes with a litany of constraints: earmarks that prevent researchers from spending grant money on the things they think are most important (like equipment or lab automation), onerous reporting requirements, the need to get every proposal through a committee, and dozens of hours of grant writing for a shockingly small amount of money. Moreover, studies have found that, despite a mandate to fund innovative research, federal funding decisions tend to be risk-averse.

As a result, in situations where there’s a head-to-head comparison between government-managed research and technology development and privately-managed counterparts, there’s little question which is more efficient.

This efficiency gap exists largely because privately-managed organizations often push control over research funds to the organization or level where the “research design” occurs. This yields powerful results. Former Defense Advanced Research Projects Agency director Arati Prabhakar argues that this mechanism, in the form of empowering program managers, is a big part of why the ARPA model works. In the business world, coupling power (money) and responsibility (research design) is simply common sense. In the research world, the benefits of “embedded autonomy” are straightforward. Autonomy enables an organization or individual to react quickly to unexpected circumstances. Research is highly uncertain by nature. Coupling embedded autonomy with research design means that funding will be spent in the most useful way possible at a given moment based on knowledge gained as experimentation progresses — not in the way that a researcher thought would be most useful at the time they submitted their grant proposal.

Recognizing the power of embedded autonomy to enable powerful, diverse research, there is currently an explosion of experiments in non-academic research organizations. Many are too new to have clear results, but non-academic research organizations — including HHMI Janelia, Dynamicland, Willow Garage, and early SpaceX — have created new fields, won Nobel prizes, and changed the paradigms of entire industries. But even the most successful research organizations struggle to raise money unless there is a clear business case, which leaves public-goods-oriented research in the lurch. Philanthropists are strongly motivated by legacy, so they want to fund things that will last, and an organization with no path to sustainability may not. As a result, private funders often hesitate to fund research organizations that produce public-good R&D.

Understanding this problem suggests a potent new way of deploying the federal government’s R&D budget: partnering with philanthropists to build a diverse portfolio of research organizations with autonomy over their own budgets, and then providing long-term support to the most effective of those organizations.

In other words, the federal government should experiment with funding organizations rather than projects.

Such an approach would position the federal government to act like a limited partner (LP) in multiple venture capital funds. In this capacity, the federal government would avoid setting overly specific requirements around how a particular grant is spent. The government would instead set very high-level priorities (e.g., “create new manufacturing paradigms” or simply “do impactful research”), give funded organizations the autonomy to figure out how to best achieve this goal, and then evaluate success after the fact.

The time is right to invest in creative federal research-funding approaches. There is bipartisan support for large increases to federally funded R&D. But pushing huge amounts of money through outdated R&D funding structures is like slamming on the accelerator of a car that needs an engine repair: incredibly inefficient and with the potential to backfire. By contrast, embedding autonomy in a diverse portfolio of organizations could unlock the sort of unexpected, game-changing inventions and discoveries that have driven the American economy: electricity, airplanes, the internet, the transistor, cryptography, and more.

Plan of Action

The current Administration should launch a two-part program at NSF to test a research-funding system that prioritizes organizations over projects.

As Part One of this program, the NSF’s TIP Directorate should pilot a research-funding opportunity in which philanthropically matched grants fund a portfolio of independent research organizations instead of funding specific research initiatives. This pilot funding opportunity should be funded at $100 million per year for 10 years. The Directorate should target funding between 5 and 15 organizations this way, quadratically matching philanthropic funds at values between 100% and 1000% depending on the number of participating philanthropic donors.

As Part Two of this program, NSF should set aside an additional $100 million per year to sponsor independent research organizations with impressive track records for extended periods of time. The Directorate should set a goal of identifying two organizations during the ten-year pilot that would be good candidates for this long-term funding, funding each at $50 million per year.

More detail on each of these program components is provided below.

Part One: Philanthropically matched grants

Partnering with private donors is key to the success of the proposed organization-focused funding opportunity. By funding only organizations that have already raised philanthropic dollars, the federal government will leverage philanthropists’ due diligence on screening applicants to ensure high-potential awardees. Similarly, the funding opportunity should employ quadratic matching funding to use donors’ confidence as an indicator of how much money to give each organization and to reduce bias favoring organizations that are able to raise a large amount of money from a small number of donors. 
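To make the quadratic-matching mechanism concrete, here is a minimal sketch in the spirit of quadratic funding (as proposed by Buterin, Hitzig, and Weyl). The formula and all numbers are our own illustration, not a prescribed NSF design: the raw match grows with the square of the sum of the square roots of individual donations, so a broad donor base attracts a larger match than a few large gifts, and the result is clamped to the memo’s 100%–1000% range.

```python
import math

def quadratic_match(donations, match_floor=1.0, match_cap=10.0):
    """Quadratic-funding match for one organization (illustrative).

    donations: list of individual philanthropic contributions ($).
    The raw quadratic match is (sum of sqrt(d))^2 - sum(d), which
    rewards breadth of support over a few large gifts. The result is
    clamped between match_floor and match_cap times the amount raised
    (the memo's 100%-1000% matching range).
    """
    raised = sum(donations)
    raw_match = sum(math.sqrt(d) for d in donations) ** 2 - raised
    return max(match_floor * raised, min(raw_match, match_cap * raised))

# Two hypothetical organizations, each raising the same $1M total:
broad = [10_000] * 100   # 100 donors giving $10k each
narrow = [500_000] * 2   # 2 donors giving $500k each
```

Under this sketch, the 100-donor organization hits the 1000% cap (a $10 million match), while the 2-donor organization receives only the 100% floor (about $1 million), illustrating how the mechanism reduces bias toward organizations funded by a small number of large donors.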

Leveraging philanthropic opinion in this way does come with the risk of biasing awards towards organizations working on particularly popular areas or that are particularly good at sales or marketing. The organization-focused funding opportunity could address this risk by establishing a parallel funding pathway whereby a large number of researchers can file a petition for an organization to be selected for funding.

The TIP Directorate must, of course, impose additional criteria beyond the endorsements of the philanthropic and research communities. It will be tempting for the Directorate to prioritize funding organizations working on specific, high-interest technology areas or themes. But the goal of this program is to advance the long-term health of the American innovation ecosystem. Often, tomorrow’s high-priority area is one that doesn’t even exist today. To that end, the Directorate should evaluate potential grantee organizations on their “counterfactual impact”: i.e., their capacity to do work that is disincentivized in other institutional structures.

The question of how best to evaluate the success of the funding opportunity is a challenging one. It is notoriously hard to evaluate long-term research output. The whole point of this proposal is to move away from short-term metrics and rigid plans, but at the same time the government needs to be accountable to taxpayers. Metrics are the most straightforward way to evaluate outcomes. However, metrics can be counterproductive for evaluating new and experimental processes, because existing metrics presume a specific way of organizing research. We therefore recommend that the TIP Directorate create a Notice of Funding Opportunity to hire an independent, nonpartisan, nonprofit board whose job is to holistically evaluate funded organizations. The board should include people working in academia, industrial research, government research, and independent research organizations, as well as some “wildcards”. The board should collectively have deep experience performing and guiding high-uncertainty, long-term research and development.

The board would regularly (but not too frequently) solicit opinions on the output and impacts of funded organizations from the program’s philanthropic partners, members of the government, people working with the organizations, unaffiliated researchers, and members of the organizations themselves. At the end of each year, the board should give each organization an evaluation “report card” containing a holistic letter grade and an explanation for that grade. Organizations that receive an F should immediately be expelled from the funding program, as should organizations that receive a D for three years in a row.

Part Two: Invest deeply in demonstrated success

Kanjun Qiu and Michael Nielsen have proposed an important piece of the puzzle: In the same way that governments took over funding libraries once they were started by Gilded Age philanthropists, the government should take over funding immensely successful research organizations today.

At the five-year midpoint and ten-year endpoint of the pilot funding program, the evaluation board should identify any funded organizations that have produced outstanding output. The TIP Directorate should then select up to two of these candidates to receive indefinite government support, at a funding level of $50 million per organization per year. These indefinitely funded organizations would become a line item in the TIP’s budget, to be renewed every year except in extreme circumstances. The possibility of indefinite federal support as an “exit strategy” for philanthropic funders will encourage participation of additional philanthropic partners by providing (i) philanthropically funded organizations a pathway for becoming self-sustaining, and (ii) philanthropies with a clear opportunity to establish a legacy.  

What qualifies as “outstanding output”? Like evaluating success, it’s a challenging question. We recommend using the same board-based grading scheme outlined above. Any organization that receives an A grade in two of the past five years or an A+ in any one of the past five years should be eligible for indefinite support. This approach will require grading to be very strict: for instance, an A+ should only be given to an organization that enables Nobel-prize-quality work.
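The grading rules above — expulsion on any F or three consecutive Ds, and eligibility for indefinite support via two As or a single A+ within five years — can be sketched as simple predicates. This is one illustrative reading of the rules; in particular, we assume an A+ also counts toward the two-of-five A test.

```python
def eligible_for_indefinite_support(grades):
    """grades: annual letter grades, oldest first, most recent last.
    Eligible with an A in two of the past five years, or an A+ in any
    one of them (assumption: an A+ also counts as an A)."""
    window = grades[-5:]
    return "A+" in window or sum(g in ("A", "A+") for g in window) >= 2

def expelled(grades):
    """Expelled on any F, or on a D in three consecutive years."""
    streak = "".join("D" if g == "D" else "." for g in grades)
    return "F" in grades or "DDD" in streak
```

For example, an organization graded B, A, C, A, B over five years would qualify, while one with a single A would not; an organization graded D, C, D, D stays in the program because its Ds are not consecutive.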

Conclusion

Building portfolios of independent research organizations is an incredibly effective way of spending government research money. The total federal research budget is almost $160 billion per year. Less than 1% of that could make a massive difference for independent research organizations, most of which have budgets in the $10 million range. Funding especially promising independent research organizations with an additional $10 million or more per year would have a huge effect, empowering organizations that are already doing outstanding work to take their contributions to the next level. 

Even the highest-performing private research organizations in the world — like Google DeepMind and HHMI Janelia — have budgets in the range of $200 million per year. Sponsoring a select number of especially high-performing research organizations with an additional $100 million per year would hence have similarly transformative impacts. These large indefinite grants would also provide the major incentives needed to bring the world’s leading philanthropies to the table and to encourage the most cutting-edge independent research organizations to dedicate their talents to the public sector. The sum total of achieving these outcomes would still account for only a tiny fraction of the overall federal R&D budget.

Finally, we emphasize that the goal of this pilot program is not solely to establish an independent research organization portfolio in the TIP Directorate. It is also an opportunity to test a novel research-funding mechanism that could be replicated at numerous other federal agencies.

Creating Advanced Market Commitments and Prizes for Pandemic Preparedness

As part of its American Pandemic Preparedness plan, the Biden Administration should establish an interagency working group (IWG) focused exclusively on the design, funding, and implementation of advance market commitments (AMCs) and prizes for vaccine development. Under an AMC, pharmaceutical companies commit to providing many vaccine doses at a fixed price in return for a per-dose federal subsidy. Prizes can support AMCs by rewarding companies for meeting intermediate technical goals.
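The AMC payment mechanics described above can be illustrated with a small sketch. All terms here (the tail price, the subsidy size, and the committed subsidy pool) are hypothetical numbers for illustration, not figures from any actual AMC.

```python
def amc_supplier_revenue(doses, tail_price, subsidy, subsidy_pool):
    """Supplier revenue under a simple AMC (hypothetical terms).

    The supplier commits to selling doses at a low fixed 'tail' price;
    in exchange, the AMC pays an extra per-dose subsidy until the
    committed subsidy pool is exhausted. subsidy and subsidy_pool are
    integer dollar amounts so the dose cap divides evenly.
    """
    subsidized = min(doses, subsidy_pool // subsidy)  # doses the pool covers
    return doses * tail_price + subsidized * subsidy

# Hypothetical terms: $3.50 tail price, $6 per-dose subsidy, $150M pool.
revenue = amc_supplier_revenue(doses=30_000_000, tail_price=3.50,
                               subsidy=6, subsidy_pool=150_000_000)
```

The point of the structure is that the subsidy pool guarantees attractive early revenue (spurring development and capacity build-out), while the fixed tail price keeps doses affordable once the pool runs out.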

The IWG would immediately convene experts to identify suitable targets for ambitious vaccine-development and deployment efforts. The group would then work with stakeholders to implement AMCs and prizes crafted around these targets, offering a concrete and durable demonstration of the Administration’s commitment to proactive pandemic preparedness. As the American Pandemic Preparedness plan argues, an important part of rapid vaccine deployment is maintaining “hot manufacturing capacity”. Clear federal AMCs would create the market incentive needed to sustain such capacity, while simultaneously advancing procurement expertise within the federal government, in line with recent recommendations from a government review of the U.S. supply chain.

Challenge and Opportunity

Vaccines are among the most cost-effective medical interventions and have played a large role in reducing pathogen-induced deaths over the last 200 years. But vaccines do not yet exist for many diseases, including diseases concentrated in the developing world. Vaccines are undersupplied relative to their social benefit because their target populations are often poor and because strong political pressure for lower prices leads to low expected profits. When new vaccines are approved, scaling up production to fully supply low- and middle-income countries (LMICs) can take up to 15 years. AMCs address these issues by incentivizing vaccine development and hastening production scale-up. Prizes play a complementary role by offering rewards for meeting intermediate technical goals along the way.


Vaccine AMCs have a track record of success. In 2007, GAVI, a public-private global health partnership based in Geneva, launched an AMC for a pneumococcal conjugate vaccine (PCV) covering the pneumococcal strains more common in the developing world. The partnership received its first supply offers in 2009, a fairly rapid response enabled by the fact that some PCV candidates were already in late-stage clinical trials. Compared to the rotavirus vaccine — which was developed around the same time but did not receive an AMC — PCVs achieved 3–4x greater coverage (defined as the fully vaccinated fraction of the target population). Moreover, while new vaccines typically take about 10–15 years to become widely available in LMICs, PCV became available in those countries within a year. This example demonstrates the capacity of AMCs to incentivize rapid scaling. More recently, the United States (through Operation Warp Speed) and several other countries and organizations purchased substantial COVID-19 vaccine doses far in advance of approval, albeit using a more flexible AMC model that prioritized scaling production before data from clinical trials were available.

Plan of Action

To build on the progress and demonstrated success outlined above, the Biden Administration should invest in AMCs and prizes for vaccine development and deployment as part of its American Pandemic Preparedness plan. Below, we detail three specific recommendations for moving forward.

Recommendation 1. Form an Interagency Working Group (IWG) on Rapid Vaccine Innovation

Roles and responsibilities

Vaccine development and manufacturing is a multi-stage process too complicated for any single federal agency to manage alone. The Biden Administration should therefore issue an Executive Order establishing an IWG on Rapid Vaccine Innovation.

Under emergency circumstances, the IWG would be the government hub for time-sensitive vaccine-procurement efforts. Under normal (non-pandemic) circumstances the IWG would focus on extant communicable diseases with a high disease burden and on potential future threats. This latter function would be carried out as follows.

1. Vaccine targeting. A “horizon scanning” IWG subgroup would identify priority targets for rapid vaccine development and broad deployment. The subgroup would consider factors such as pandemic potential, current disease burden, and vaccine tractability. The IWG would also consult with scientists at the VRC (whose work was essential to the rapid development of COVID-19 vaccines, and who already focus on viruses with pandemic potential) and at the CDC (which already performs pathogen surveillance) in making its determinations. Options for initial vaccine targets could include those discussed later in this memo, such as a universal influenza vaccine, a Group A Streptococcus (GAS) vaccine, or a malaria vaccine.

2. Incentive design. Once one or more vaccine targets are identified, an IWG subgroup comprising health economists and budget officers would design the AMC(s) and intermediate prizes intended to spur development and deployment of the target(s). Incentive design would (i) be carried out with substantial input from BARDA, which is familiar with the vaccine-manufacturing landscape, and (ii) consider both the technological distance of the target and market competitiveness. An output from this step would be a Vaccine Incentive Roadmap describing the different prizes and incentives that federal agencies will offer to ensure fast, consistent progress towards development and deployment of the target(s) in question. In other words, the linked prizes included in the roadmap will produce sustained incentives for continued forward progress on vaccine development. More information on this roadmap is provided below.

Structure and participation

The IWG should be structured as an integrated body, with each participating agency providing specific expertise on each aspect of the IWG’s charge. Participants should include senior leaders from the Biomedical Advanced Research and Development Authority (BARDA), the Centers for Disease Control and Prevention (CDC), the Department of Defense (DOD), the Food and Drug Administration (FDA), the U.S. Agency for International Development (USAID), the U.S. International Development Finance Corporation (DFC), and the Vaccine Research Center (VRC). BARDA has a track record of successful vaccine procurement and expertise in negotiating with manufacturers. The VRC’s founding mission is vaccine development, and it has collaborated with manufacturers on large-scale production of multiple vaccines; VRC scientists would provide expertise on vaccine tractability. Through upfront guidance on minimum efficacy requirements, the FDA would ensure vaccine standards. The FDA would also work with global regulators on the possibility of regulatory reciprocity, akin to its work under PEPFAR, which assists under-resourced regulators in low- and middle-income countries with decision-making.

The IWG should be chaired by a biosecurity expert housed at the White House Office of Science and Technology Policy (OSTP). 

Congressional notification

The IWG’s recommendations (regarding both targets and AMC/prize design), once finalized, would be submitted to the Senate Health and House Ways and Means Committees to request funding. Because federal agencies must notify Congress if they plan to disburse large prize sums (with agency-specific thresholds), this submission would also serve as the required formal notification to Congress of prize amounts.

Recommendation 2. Carry out the IWG’s Vaccine Incentive Roadmap

After the IWG has issued its recommendations on vaccine target(s) and incentive (AMC and prizes) design, implementation must follow. Where implementation support comes from will depend on the “technological distance” of the target(s) in question. 

Early-stage development focused on in-vitro or animal research should be supported with prizes from BARDA, the Department of Health and Human Services (HHS), and the NIH. All federal agencies already have the authority to award prizes under the America COMPETES Act. Initial prizes could be awarded to vaccine candidates that successfully protect an animal model against disease. Later prizes could be awarded to candidates that hit clinical milestones, such as completion of a successful Phase 1 trial in humans. We note that while agencies can theoretically pool funds for a multi-stage prize, cumbersome interagency processes mean that it will likely be easier to have separate agencies fund and oversee the separate prizes included in the roadmap.

Later-stage development should be supported with larger prizes or purchases from USAID and DOD. Once a vaccine candidate has reached early-stage human clinical testing, larger prizes and/or different funding mechanisms will likely be required to advance that candidate to later-stage human testing. This is because the cost of moving a vaccine candidate from the preclinical stage through the end of Phase 2A (early-stage human clinical testing) ranges from $14 million to $159 million.

It is unlikely that a single federal agency would have the discretionary funds or willingness to sponsor a prize large enough to incentivize participation in this process. Federal partnerships with private-sector entities and/or philanthropies could supplement federal prize funding. The promise of becoming a government-approved vendor of a vaccine or a DOD-supported prototype would serve as an incentive for external entities to enter into such partnerships. USAID could also leverage its relationships with global health stakeholders and funders to provide incentive funding. Of course, external funding partnerships would be unnecessary if Congress appropriated sufficient designated funding for large vaccine-incentive prizes to the relevant agencies.

An alternative to prize funding that would be appropriate for incentivizing later-stage R&D is the DOD’s Commercial Solutions Opening (CSO) purchasing authority. DOD could use its CSO authority to pre-purchase vaccine doses in large quantities, effectively creating an AMC. Purchases of up to $100 million can be made through CSO authority. Early negotiations would use the leverage provided by becoming a government-approved vendor of vaccines (part of the CSO process) to secure fair prices. A second DOD purchase authority that could be used as an AMC-like incentive is the Other Transaction Authority (OTA), which exempts the DOD from some federal procurement regulations. OTA could likely be used to support vaccine research, purchase vaccine prototypes, and pay for some manufacturing of a successful prototype. OTA has also been used to fund research consortia, a possible alternative to a multi-stage prize roadmap. Purchases of up to $20 million can be made through OTA. In the context of diseases that affect low- and middle-income countries, a loan from the U.S. International Development Finance Corporation (DFC) may be an option for supplementing an AMC.

Recommendation 3. Permanently expand BARDA’s mandate to include all communicable diseases, expand BARDA’s funding, and make BARDA the IWG’s permanent home

An IWG is a powerful tool for bringing federal agencies together. With existing prize authority and an administration that prioritizes vaccine development and deployment, much could be accomplished through only the steps outlined above. However, achieving truly transformative results requires a permanent and sustainably funded federal agency to be working consistently on advancing vaccines. Otherwise, future administrations may cancel ongoing IWG projects and/or fail to follow through. As the part of the federal government with the most expertise in therapeutics procurement, BARDA is an ideal permanent home for the IWG’s functions. 

BARDA’s mandate is currently limited to biological, chemical, or radiological threats to the health of Americans. This mandate should be expanded to include all important communicable diseases. The newly empowered BARDA would manage public-private partnerships for vaccine procurement, while the NIH would remain the fundamental health-research arm of the U.S. government. Expanding BARDA’s mandate would require Congressional action. Congress would need to amend the Pandemic and All-Hazards Preparedness and Advancing Innovation Act appropriately, and would also need to appropriate specific funding for BARDA to carry out the roles and responsibilities of the IWG over the long term.

Frequently Asked Questions
What if we give pharmaceutical companies a bunch of taxpayer money to develop vaccine targets and they fail?

Prizes and AMCs only pay out when a product that meets pre-specified requirements is approved, so taxpayers won’t pay for any failures.

Tell me more about AMC design. What changes if a vaccine candidate is in the early-stage as opposed to the later-stage?

For technologically “close” vaccine targets with a high chance of imminent Phase 3 trial success, an AMC incentivizes rapid scale-up of manufacturing and ensures that more doses reach more people sooner. The AMC does this by circumventing a type of “hold-up” problem wherein purchasers negotiate vaccine prices down to per-unit costs. The 2007 GAVI Pneumococcus AMC was of this type. A GAS or malaria vaccine would similarly be “close” targets.


For more technologically distant targets, AMCs should incorporate “kill switches” that give future customers of the vaccine an effective veto over the AMC by way of not paying co-payments. This feature is designed to be a final check on the utility of a vaccine and avoids the difficulty of specifying standards for a vaccine many years ahead of time. An AMC structured in this way works well if a company manufactures a vaccine that meets pre-specified technical details but for hard-to-predict reasons is not useful.


For an especially distant target, a series of prize competitions could substitute for a traditional AMC. In this scenario, an initial prize could be awarded for any vaccine candidates that successfully protect an animal model against disease. A later prize could be awarded to candidates that hit clinical milestones such as completion of a Phase 1 trial in humans.


Other details of AMC and/or prize implementation depend on the market structure and cannot be determined ahead of time. For instance, the optimal AMC design is very different in monopoly versus competitive markets.

Why does this memo propose a complicated multi-stage prize process instead of something simple like Operation Warp Speed?

Operation Warp Speed spent about $12 billion on COVID-19 vaccine development and purchased hundreds of millions of vaccine doses far in advance of approval or clinical trials. While this was very effective, it is unlikely that Congress would be willing to appropriate such a large sum of money — or see that money disbursed so freely — in non-pandemic situations. A multi-stage prize process still incentivizes vaccine development and deployment but does so at a lower cost.

How can the federal government carry out these recommendations without provoking anti-vax sentiment?

The government could fund research into market segmentation for vaccines, since many who are vaccine-hesitant are avid consumers of alternative health products/supplements. There may be marketing and promotional strategies inspired by “natural” supplements that can increase vaccine uptake.

Doesn’t the federal government already fund influenza vaccine preparation? Why do we need a universal flu vaccine?

The federal government does fund influenza vaccine preparation, but that funding is only for a seasonal flu vaccine that works with 40–60% efficacy: a rate well below what other vaccines, such as the measles (97%) and mumps (88%) vaccines, achieve. A pandemic influenza with an unexpected genetic background could still catch us by surprise. Investing in a universal influenza vaccine is essential to preparing for that eventuality.

What are the most likely points of failure for the steps outlined in this memo?

One issue is staffing. Drafting a high-quality AMC contract may require legal and economic expertise that isn’t available in-house at federal agencies, so the administration may need to engage external AMC experts. Another issue may be ensuring that activities outlined herein do not fall between interagency “cracks”. Assigning dedicated staff to oversee each activity will be important. A third issue is the potential for interagency friction. The more agencies that are involved with prize design, the longer it may take to design and authorize a given prize. One possible solution is to have only one agency administer each prize, with informal input from staff in other agencies when required.

Broadening the Knowledge Economy through Independent Scholarship

Summary

Scientists and scholars in the United States are faced with a relatively narrow set of traditional career pathways. Our lack of creativity in defining the scholarly landscape is limiting our nation’s capacity for innovation by stifling exploration, out-of-the-box thinking, and new perspectives.

This does not have to be the case. The rise of the gig economy has positioned independent scholarship as an effective model for people who want to continue doing research outside of traditional academic structures, in ways that best fit their life priorities. New research institutes are emerging to support independent scholars and expand access to the knowledge economy.

The Biden-Harris Administration should further strengthen independent scholarship by (1) facilitating partnerships between independent scholarship institutions and conventional research entities; (2) creating professional-development opportunities for independent scholars; and (3) allocating more federal funding for independent scholarship.

Challenge and Opportunity

The academic sector is often seen as a rich source of new and groundbreaking ideas in the United States. But it has become increasingly evident that pinning all our nation’s hopes for innovation and scientific advancement on the academic sector is a mistake. Existing models of academic scholarship are limited, leaving little space for any exploration, out-of-the-box thinking, and new perspectives. Our nation’s universities, which are shedding full-time faculty positions at an alarming rate, no longer offer as reliable and attractive career opportunities for young thinkers as they once did. Conventional scholarly career pathways, which were initially created with male breadwinners in mind, are strewn with barriers to broad participation. But outside of academia, there is a distinct lack of market incentive structures that support geographically diverse development and implementation of new ideas. 

These problems are compounded by the fact that conventional scholarly training pathways are long, expensive, and unforgiving. A doctoral program takes an average of 5.8 years and $115,000 to complete. The federal government spends $75 billion per year on financial assistance for students in higher education. Yet inflexible academic structures prevent our society from maximizing returns on these investments in human capital. Individuals who pursue and complete advanced scholarly training but then opt to take a break from the traditional academic pipeline — whether to raise a family, explore another career path, or deal with a personal crisis — can find it nearly impossible to return. This problem is especially pronounced among first-generation students, women of color, and low-income groups. A 2020 study found that of the 67% of Ph.D. students who wanted to stay in academia after completing their degree, only 30% did. Outside of academia, though, there are few obvious ways for even highly trained individuals to contribute to the knowledge economy. The upshot is that every year, innumerable great ideas and scholarly contributions are lost because ideators and scholars lack suitable venues in which to share them.

Fortunately, an alternative model exists. The rise of the gig economy has positioned independent scholarship as a viable approach to work and research. Independent scholarship recognizes that research doesn’t have to be a full-time occupation, be conducted via academic employment, or require attainment of a certain degree. By being relatively free of productivity incentives (e.g., publish or perish), independent scholarship provides a flexible work model and career fluidity that allows people to pursue research interests alongside other life and career goals. 

Online independent-scholarship institutes (ISIs) like the Ronin Institute, IGDORE, and others have recently emerged to support independent scholars. By providing an affiliation, a community, and a boost of confidence, such institutes empower independent scholars to do meaningful research. Indeed, the original perspectives and diverse life experiences that independent scholars bring to the table increase the likelihood that such scholars will engage in high-risk research that can deliver tremendous benefits to society. 

But it is currently difficult for ISIs to help independent scholars reach their full potential. ISIs generally cannot provide affiliated individuals with access to resources like research ethics review boards, software licenses, laboratory space, scientific equipment, computing services, and libraries. There is also concern that without intentionally structuring ISIs around equity goals, ISIs will develop in ways that marginalize underrepresented groups. ISIs (and individuals affiliated with them) are often deemed ineligible for research grants, and/or are outcompeted for grants by well-recognized names and affiliations in academia. Finally, though independent scholarship is growing, there is still relatively little concrete data on who is engaging in independent scholarship, and how and why they are doing so. 


Strengthening support for ISIs and their affiliates is a promising way to fast-track our nation towards needed innovation and technological advancements. Augmenting the U.S. knowledge-economy infrastructure with agile ISIs will pave the way for new and more flexible scholarly work models; spur greater diversity in scholarship; lift up those who might otherwise be lost Einsteins; and increase access to the knowledge economy as a whole.

Plan of Action

The Biden-Harris Administration should consider taking the following steps to strengthen independent scholarship in the United States: 

  1. Facilitate partnerships between independent scholarship institutions and conventional research entities.
  2. Create professional-development opportunities for independent scholars.
  3. Allocate more federal funding for independent scholarship.

More detail on each of these recommendations is provided below.

1. Facilitate partnerships between ISIs and conventional research entities.

The National Science Foundation (NSF) could provide $200,000 to fund a Research Coordination Network or INCLUDES alliance of ISIs. This body would provide a forum for ISIs to articulate their main challenges and identify solutions specific to the conduct of independent research (see FAQ for a list) — solutions may include exploring Cooperative Research & Development Agreements (CRADAs) as mechanisms for accessing physical infrastructure needed for research. The body would help establish ISIs as recognized complements to traditional research facilities such as universities, national laboratories, and private-sector labs. 

NSF could also include ISIs in its proposed National Networks of Research Institutes (NNRIs). ISIs meet many of the criteria laid out for NNRI affiliates, including access to cross-sectoral partnerships (many independent scholars work in non-academic domains), untapped potential among diverse scholars who have been marginalized by — or who have made a choice to work outside of — conventional research environments, novel approaches to institutional management (such as community-based approaches), and a model that truly supports the “braided river” or “ecosystem” career pathway model. 

The overall goal of this recommendation is to build ISI capacity to be effective players in the broader knowledge-economy landscape. 

2. Create professional-development opportunities for independent scholars. 

To support professional development among ISIs, the U.S. Small Business Administration and/or the NSF America’s Seed Fund program could provide funding to help ISI staff develop their business models, including funding for training and coaching on leadership, institutional administration, financial management, communications, marketing, and institutional policymaking. To support professional development among independent scholars directly, the Office of Postsecondary Education at the Department of Education — in partnership with professional-development programs like Activate, the Department of Labor’s Wanto, and the Minority Business Development Agency — can help ISIs create professional-development programs customized to the unique needs of independent scholars. Such programs would provide mentorship and apprenticeship opportunities for independent scholars (particularly for those underrepresented in the knowledge economy), led by scholars experienced with working outside of conventional academia.

The overall goal of this recommendation is to help ISIs and individuals create and pursue viable work models for independent scholarship. 

3.  Allocate more federal funding for independent scholarship.

Federal funding agencies like NSF struggle to diversify the types of projects they support, despite offering funding for exploratory high-risk work and for early-career faculty. A mere 4% of NSF funding is provided to “other” entities outside of private industry, federally supported research centers, and universities. But outside of the United States, independent scholarship is recognized and funded. NSF and other federal funding agencies should consider allocating more funding for independent scholarship. Funding opportunities should support individuals over institutions, have low barriers to entry, and prioritize provision of part-time funding over longer periods of time (rather than full funding for shorter periods of time).

Funding opportunities could include: 

Conclusion

Our nation urgently needs more innovative, broadly sourced ideas. But limited traditional career options are discouraging participation in the knowledge economy. By strengthening independent scholarship institutes and independent scholarship generally, the Biden-Harris Administration can help quickly diversify and grow the pool of people participating in scholarship. This will in turn fast-track our nation towards much-needed scientific and technological advancements.

Frequently Asked Questions
What comprises the traditional academic pathway?

The traditional academic pathway consists of 4–5 years of undergraduate training (usually unfunded), 1–3 years for a master’s degree (sometimes funded; not always a precondition for enrollment in a doctoral program), 3–6+ years for a doctoral degree (often at least partly funded through paid assistantships), 2+ years of a postdoctoral position (fully funded at internship salary levels), and 5–7 years to complete the tenure-track process culminating in appointment to an Associate Professor position (fully funded at professional salary levels).

What is independent scholarship?

Independent scholarship in any academic field is, as defined by the Effective Altruism Forum, scholarship “conducted by an individual who is not employed by any organization or institution, or who is employed but is conducting this research separately from that”.

What benefits can independent scholars offer academia and the knowledge economy?

Independent scholars can draw on their varied backgrounds and professional experience to bring fresh and diverse worldviews and networks to research projects. Independent scholars often bring a community-oriented and collaborative approach to their work, which is helpful for tackling pressing transdisciplinary social issues. For students and mentees, independent scholars can provide connections to valuable field experiences, practicums, research apprenticeships, and career-development opportunities. In comparison to their academic colleagues, many independent scholars have more time flexibility, and are less prone to being influenced by typical academic incentives (e.g., publish or perish). As such, independent scholars often demonstrate long-term thinking in their research, and may be more motivated to work on research that they feel personally inspired by.

What is an independent scholarship institute (ISI)?

An ISI is a legal entity or organization (e.g., a nonprofit) that offers an affiliation for people conducting independent scholarship. ISIs can take the form of research institutes, scholarly communities, cooperatives, and others. Different ISIs can have different goals, such as emphasizing work within a specific domain or developing different ways of doing scholarship. Many ISIs exist solely online, which allows them to function in very low-cost ways while retaining a broad diversity of members. Independent scholarship institutes differ from professional societies, which do not provide an affiliation for individual researchers.

Why does a purportedly independent scholar need to be affiliated with an institute?

As the Ronin Institute explains, federal grant agencies and many foundations in the United States restrict their support to individuals affiliated with legally recognized classes of institutions, such as nonprofits. For individual donors, donations made to independent scholars via nonprofits are tax-deductible. Being affiliated with a nonprofit dedicated to supporting independent scholars enables those scholars to access the funding needed for research. In addition, many independent scholars find value in being part of a community of like-minded individuals with whom they can collaborate and share experiences and expertise.

How do ISIs differ from universities?

Universities are designed to support large, complex grants requiring considerable infrastructure and full-time support staff; their incentive structures for faculty and students mirror these needs. In contrast, research conducted through an independent-scholarship model is often part-time, inexpensive, and conducted by already trained researchers with little more than a personal computer. With their mostly online structures, ISIs can be very cost effective. They have agile and flexible frameworks, with limited bureaucracy and fewer competing priorities. ISIs are best positioned to manage grants that are stand-alone, can be administered with lower indirect rates, require little physical research infrastructure, and fund individuals partnering with collaborators at universities. While toxic academic environments often push women and minority groups out of universities and academia, agile ISIs can take swift and decisive action to construct healthier work environments that are more welcoming of non-traditional career trajectories. These qualities make ISIs great places for testing high-risk, novel ideas.

What types of collaboration agreements could traditional knowledge-economy institutions enter into with ISIs?

Options include:


Curing Alzheimer’s by Investing in Aging Research

Summary

Congress allocates billions of dollars annually to Alzheimer’s research in hopes of finding an effective prophylactic, treatment, or cure. But these massive investments have little likelihood of paying off absent a game-changing improvement in our present knowledge of biology. Funds currently earmarked for Alzheimer’s research would be more productive if they were instead invested into deepening understanding of aging biology at the cell, tissue, and organ levels. Fundamental research advances in aging biology would directly support better outcomes for patients with Alzheimer’s as well as a plethora of other chronic diseases associated with aging — diseases that are the leading cause of mortality and disability, responsible for 71% of annual deaths worldwide and 79% of years lived with disability. Congress should allow the National Institute on Aging to spend funds currently restricted for research into Alzheimer’s specifically on research into aging biology more broadly. The result would be a society better prepared for the imminent health challenges of an aging population.

Challenge and Opportunity

The NIH estimates that 6.25 million Americans now have Alzheimer’s disease, and that due to an aging population, that number will more than double to 13.85 million by the year 2060. The Economist similarly estimates that 50 million people worldwide suffer from dementia, and that that number will increase to 150 million by the year 2050. These dire statistics, along with astute political maneuvering by Alzheimer’s advocates, have led Congress to earmark billions of dollars of federal health-research funds for Alzheimer’s disease.

President Obama’s FY2014 and FY2015 budget requests explicitly cited the need for additional Alzheimer’s research at the National Institutes of Health (NIH). In FY2014, Congress responded by giving the NIH’s National Institute on Aging (NIA) a small but disproportionate increase in funding relative to other national institutes, “in recognition of the Alzheimer’s disease research initiative throughout NIH.” Congress’s explanatory statement for its FY2015 appropriations laid out good reasons not to earmark a specific portion of NIH funds for Alzheimer’s research, stating:

“In keeping with longstanding practice, the agreement does not recommend a specific amount of NIH funding for this purpose or for any other individual disease. Doing so would establish a dangerous precedent that could politicize the NIH peer review system. Nevertheless, in recognition that Alzheimer’s disease poses a serious threat to the Nation’s long-term health and economic stability, the agreement expects that a significant portion of the recommended increase for NIA should be directed to research on Alzheimer’s. The exact amount should be determined by scientific opportunity of additional research on this disease and the quality of grant applications that are submitted for Alzheimer’s relative to those submitted for other diseases.”

But this position changed suddenly in FY2016, when Congress earmarked $936 million for Alzheimer’s research. The amount earmarked by Congress for Alzheimer’s research has risen almost linearly every year since then, reaching $3.1 billion in FY2021 (Figure 1).

This tsunami of funding has been unprecedented for the NIA. The seemingly limitless availability of money for Alzheimer’s research has created a perverse incentive for the NIH and NIA to solicit additional Alzheimer’s funding, even as agencies struggle to deploy existing funding efficiently. The NIH Director’s latest report to Congress on Alzheimer’s funding suggests that with an additional $226 million per year in funding, the NIH and NIA could effectively treat or prevent Alzheimer’s disease and related dementias by 2025. 

This is a laughable untruth. No cure for Alzheimer’s is in the offing. Progress on Alzheimer’s research is stalling and commercial interest is declining. Of the 413 Alzheimer’s clinical trials performed in the United States between 2002 and 2012, 99.6% failed. Recent federal investments seemed to be paying off when in 2021 the Food and Drug Administration (FDA) approved Aduhelm, the first new treatment for Alzheimer’s since 2003. But the approval was based on the surrogate endpoint of amyloid plaques in the brain as observed by PET scans, not on patient outcomes. In its first months on the market, Aduhelm visibly flopped. Scientists subsequently called on the FDA to withdraw marketing approval for the drug. If an effective treatment were likely by 2025, Big Pharma would be doubling down. But Pfizer announced it was abandoning Alzheimer’s research in 2018.

The upshot is clear: lavish funding on treatments and cures for a disease can only do so much absent knowledge of that disease’s underlying biological mechanisms. We as a society must resist the temptation to waste money on expensive shots in the dark, and instead invest strategically into understanding the basic biochemical and genetic mechanisms underlying aging processes at the cell, tissue, and organ levels.

Plan of Action

Aging is the number-one risk factor for Alzheimer’s disease, as it is for many other diseases. All projections of an increasing burden of Alzheimer’s are based on the fact that our society is getting older. And indeed, even if a miraculous cure for Alzheimer’s were to emerge, we would still have to contend with an onslaught of other impending medical and social costs. 

Economists and scientists have estimated that extending average life expectancy in the United States by one year is worth $38 trillion. But funding for basic research on aging remains tight. Outside of the NIA, several foundations in the United States are actively funding aging research: the American Federation for Aging Research (AFAR), The Glenn Foundation for Medical Research, and the SENS Foundation each contribute a few million dollars per year for aging research. Privately funded fast grants have backed bold aging projects with an additional $26 million. 

This relatively small investment in basic research has generated billions in private funding to commercialize findings. Startups raised $850 million in 2018 to target aging and age-related diseases. Google’s private research arm Calico is armed with billions and a pharmaceutical partner in AbbVie, and the Buck Institute’s Unity Biotechnology launched an initial public offering (IPO) in 2018. In 2021, Altos Labs raised hundreds of millions to commercialize cellular reprogramming technology. Such dynamism and progress in aging research contrasts markedly with the stagnation in Alzheimer’s research and indicates that the former is a more promising target for federal research dollars.

Now is the time for the NIA to drive science-first funding for the field of aging. Congress should maintain existing high funding levels at NIA, but this funding should no longer be earmarked solely for Alzheimer’s research. In every annual appropriation since FY2016, the House and Senate appropriations committees have issued a joint explanatory statement that has force of law and includes the Alzheimer’s earmark. These committees should revert to their FY2015 position against politically directing NIH funds towards particular ends. The past six years have shown such political direction to be a failed experiment.

Removing the Alzheimer’s earmark would allow the NIA to use its professional judgment to fund the most promising research into aging based on scientific opportunity and the quality of the grant applications it receives. We expect that this in turn would cause agency-funded research to flourish and stimulate further research and commercialization from industry, as privately funded aging research already has. Promising areas that the NIA could invest in include building tools for understanding molecular mechanisms of aging, establishing and validating aging biomarkers, and funding more early-stage clinical trials for promising drugs. By building a better understanding of aging biology, the NIA could do much to render even Alzheimer’s disease treatable.

Frequently Asked Questions
How did Congress get so interested in Alzheimer’s disease? What recent actions has Congress taken on funding for Alzheimer’s research?

In 2009, a private task force calling itself the Alzheimer’s Study Group released a report entitled “A National Alzheimer’s Strategic Plan.” The group, co-chaired by former Speaker of the House Newt Gingrich and former Nebraska Senator Bob Kerrey, called on Congress to immediately increase funding for Alzheimer’s and dementia research at the NIH by $1 billion per year.


In response to the report, Senators Susan Collins and Evan Bayh introduced the National Alzheimer’s Project Act (NAPA), which was signed into law in 2011 by Barack Obama. NAPA requires the Department of Health and Human Services to produce an annual assessment of the nation’s progress in preparing for an escalating burden of Alzheimer’s disease. This annual assessment is called the National Plan to Address Alzheimer’s Disease. The first National Plan, released in 2012, established a goal of effectively preventing or treating Alzheimer’s disease by 2025. In addition, the Alzheimer’s Accountability Act, which passed in the 2015 omnibus, gives the NIH director the right and the obligation to report directly to Congress on the amount of additional funds needed to meet the goals of the national plan, including the self-imposed 2025 goal.

Why is treating Alzheimer’s so hard?

Understanding diseases that progress over a long period of time, such as Alzheimer’s, requires complex clinical studies. Lessons learned from past research indicate that findings in animal models don’t necessarily translate to humans for such diseases. Heterogeneity in disease presentation, imprecise clinical measures, uncertain relevance of target biomarkers, and difficulty in understanding underlying causes exacerbate the problem for Alzheimer’s specifically.


Alzheimer’s is also a whole-system, multifactorial disease. Dementia is associated with a decreased variety of gut microbiota. Getting cataract surgery seemingly reduces Alzheimer’s risk. Inflammatory responses from the immune system can aggravate neurodegenerative diseases. The blood-brain barrier uptakes less plasma protein with age. The list goes on. Understanding Alzheimer’s hence requires understanding of many other biological systems.

What is the amyloid hypothesis?

Alzheimer’s is named after Alois Alzheimer, a German scientist credited with publishing the first case of the disease in 1906. In the post-mortem brain sample of his patient, he identified extracellular deposits — now known as amyloid plaques — made up of clumps of amyloid-beta (Aβ) protein. In 1991, David Allsop and John Hardy proposed the amyloid hypothesis after discovering a pathogenic mutation in the APP (Aβ precursor protein) gene on chromosome 21. The mutation led to increased Aβ deposits, which present as early-onset Alzheimer’s disease in affected families.


The hypothesis suggested that Alzheimer’s follows the pathological cascade of Aβ aggregation → tau phosphorylation → neurofibrillary tangles → neuronal death. These results indicated that Aβ could be a drug target for Alzheimer’s disease.


In the 1990s, Elan Pharmaceuticals proposed a vaccine against Alzheimer’s that would stop or slow the formation of Aβ aggregates. It was a compelling idea. In the following decades, drug development centered around this hypothesis, leading to the current approaches to Alzheimer’s treatment: Aβ inhibition (β- and γ-secretase inhibitors), anti-aggregation (metal chelators), Aβ clearing (protease-activity regulating drugs), and immunotherapy.


In the last decade, the growing arsenal of Aβ therapies fueled excitement that we were close to an Alzheimer’s treatment. The 2009 report, the 2012 national plan, and Obama’s funding requests seemed to confirm that this was the case.


However, the strength of the amyloid hypothesis has declined since then. Since the shutdown of the first Alzheimer’s vaccine trial in 2002, numerous other pharmaceutical companies have tried and failed to create their own vaccines, despite many promising assets shown to clear Aβ plaques in animal models. Monoclonal antibody treatments (of which aducanumab is an example) have reduced free plasma concentrations of Aβ by 90%, binding to all sorts of Aβ from monomeric and soluble Aβ to fibrillar and oligomeric Aβ. These treatments have suffered high-profile late-stage clinical trial failures in the last five years. Similar failures surround other approaches to Alzheimer’s drug development.


There is no doubt that these therapies succeed at reducing Aβ concentrations in pre-clinical trials. But given their continual failure in late-stage clinical trials, Aβ may not play as major a role in the disease mechanism as hypothesized.

Smarter Zoning for Fair Housing

Summary

Exclusionary zoning is damaging equity and inhibiting growth and opportunity in many parts of America. Though the Supreme Court struck down expressly racial zoning in 1917, many local governments persist with zoning that discriminates against low-wage families — including many families of color.1 Research has connected such zoning to racial segregation and to greater disparities in measurable outcomes.2

By contrast, real-world examples show that flexible zoning rules — rules that, for instance, allow small groups to opt into higher housing density while bypassing veto players, or that permit some small areas to opt out of proposed zoning reforms — can promote housing fairness, supply, and sustainability. Yet bureaucratic and knowledge barriers inhibit broad implementation of such practices. To facilitate zoning reform, the Department of Housing and Urban Development should (i) draft model smarter zoning codes, (ii) fund efforts to evaluate the impact of smarter zoning practices, (iii) support smarter zoning pilot programs at the state and local levels, and (iv) coordinate with other federal programs and agencies on a whole-of-government approach to promote smarter zoning.

Challenge and Opportunity

Economists across the political spectrum agree that restrictive zoning laws banning inclusive, climate-friendly, multi-family housing have made housing less affordable, increased racial segregation, and damaged the environment. Better zoning would enable fairer housing outcomes and boost growth across America.

The Biden-Harris administration is actively working to eliminate exclusionary zoning in order to advance the administration’s priorities of racial justice, respect for working-class people, and national unity. But in many states with unaffordable housing, local politics have made zoning reform painfully slow and/or precarious. In California, for instance, zoning-reform activists have garnered significant victories. But a recently launched petition to limit state power over zoning might undo some of the progress made so far. There is an urgent need for strategies to overcome political gridlock limiting or inhibiting zoning reform at the state and local levels.

Fortunately, a suite of new smarter zoning techniques can achieve needed reforms while alleviating political concerns. Consider Houston, TX, which faced resistance in reducing suburban minimum lot sizes to allow more housing. To overcome political obstacles, the city gave individual streets and blocks the option to opt out of the proposed reform. That simple technique reduced resistance and allowed the zoning measure to pass. The powerful incentives from increased land value meant that although opt-outs reached nearly 50% in one neighborhood, they were rare in many others.3 The American Planning Association similarly published a proposal to allow opt-ins for upzoning at a street-by-street level — a practice that would let small groups bypass those who currently block reform and capture the huge incentives of upzoning.

In fact, opt-ins and opt-outs are proven methods of overcoming political obstacles in other policy fields, including parking reform and “play streets” in urban policy. Opt-ins and opt-outs reduce officials’ and politicians’ concerns that a vocal and unrepresentative group will blame them for reforms. While reformers may fear that allowing exemptions may weaken zoning reforms, the enormous increase in land value created by upzoning in unaffordable areas provides powerful incentives for small groups of homeowners to choose upzoning of their own lots. And by offering a pathway to circumvent opposition, flexible smarter zoning reforms can expedite construction of abundant new affordable housing that substantially improves equity, opportunity, and quality of life for working-class Americans. 

Absent action by HUD to encourage trials of innovative techniques, the pace of reform will continue to be much slower than it needs to be. Campaigners at state and local government level will continue to face opposition and setbacks. The pace of growth and innovation will be damaged, as bad zoning continues to block the benefits of mobility and opportunity. And disadvantaged minorities will continue to suffer the most from unjust and exclusionary zoning rules.

Plan of Action

The Department of Housing and Urban Development (HUD) should take the following steps to facilitate zoning reform in the United States: 

1. Create a model Smarter Zoning Code

HUD’s Office of Policy Development and Research, working with the Environmental Protection Agency (EPA)’s Office of Community Revitalization, should produce a model Smarter Zoning Code that state and local governments can adopt and adapt. The Smarter Zoning Code would provide a variety of options for state and local governments to minimize backlash against zoning reforms by reducing effects on other streets or blocks. Options could include:4

A draft of a model Smarter Zoning Code could be developed for $1 million and could be tested by seeking views from a range of stakeholders for $5 million. The model code should be highlighted in HUD’s Regulatory Barriers Clearinghouse.

2. Collect and showcase evidence on effectiveness and impacts of smarter zoning practices

As part of the list of policy-relevant questions in its systematic plan under the Foundations for Evidence-Based Policymaking Act of 20187, HUD should include the question of which types of zoning approaches, including smarter zoning, can best (i) help to address or overcome political and other barriers to meeting fair-housing standards, and (ii) support plentiful supplies of affordable housing to address equity and other issues.

HUD should also provide research grants under the Unlocking Possibilities Program8, once passed, to evaluate the impact of Smarter Zoning techniques, suggest improvements to the model Smarter Zoning Code, and prepare and showcase successful case studies of flexible zoning.

Finally, demonstrated thought leadership by the Biden-Harris Administration could kickstart a new wave of innovation in smarter zoning that helps address historic equity issues. HUD should work with the White House and key stakeholder groups (e.g., the American Planning Association, the National League of Cities, the National Governors’ Association) to host a widely publicized event on Planning for Opportunity and Growth. The event would showcase proven, innovative zoning practices that can help state and local government representatives meet housing and growth objectives.

3. Launch smarter-zoning pilot projects

Subject to funding through the Unlocking Possibilities Program, the HUD Secretary should direct HUD’s Office of Technical Assistance and Management to launch a collection of pilot projects for the implementation of the model Smarter Zoning Code. Specifically, HUD would provide planning grants to help states, local governments, and potentially other groups improve skills and technical capacity needed to implement or promote Smarter Zoning reforms. The technical assistance to help a local government adopt smarter zoning, where possible under existing state law, should cost less than $100,000; technical assistance for a state to enable smarter zoning on a state-wide basis should cost less than $500,000.

4. Promote federal incentives and coordination around smarter zoning

Model codes, evidence-based practices, and planning grants can help advance upzoning in areas that are already interested. The federal government could also provide stronger incentives to encourage more reluctant areas to adopt smarter zoning. It is lawful to condition a portion of federal funds upon criteria that are “directly related to one of the main purposes for which [such funds] are expended”, so long as the financial inducement is not “so coercive as to pass the point at which ‘pressure turns into compulsion’”.9 For instance, one of the purposes of highway funds is to reduce congestion in interstate traffic. Failure to allow walkable urban densification limits the opportunities for travel other than by car, which in turn increases congestion on federal highways. It would therefore be constitutional for the federal government to withhold 5% of federal highway funds from states that do not enact smarter zoning provisions. Similarly, funding for affordable home care proposed under the Build Back Better Act will be less effective in areas where exclusionary zoning makes it less affordable for carers to live. A portion of such funding could be withheld from states that do not pass smarter zoning laws. Similar action could be taken on federal funds for education, where unaffordable housing affects the supply of teachers, and on federal funds to fight climate change, because sprawl driven by single-family zoning increases carbon emissions.

HUD’s Office of Fair Housing and Equal Opportunity should consult with other federal bodies on what federal funding can be made conditional upon participation by state and local governments in smarter zoning programs, as well as on when implementing such conditions would require Congressional approval. HUD should similarly consult with other federal bodies on creative opportunities to incentivize smarter zoning through existing programs. If Congress does not wish to amend the law, it may be possible for other agencies to condition funding upon implementation of smarter zoning provisions at state or local level. Although smarter zoning will also benefit existing residents, billions of dollars of incentives may be needed for the most reluctant states and local governments to overcome existing veto players to get more equitable zoning.

Conclusion

Urgent reform is needed to address historic damage caused to equity by zoning rules, originally explicitly racist in language, that remain economically exclusionary in intent and racially discriminatory in impact. By modeling smarter zoning practices, demonstrating their benefits, providing financial and technical assistance for implementation, and conditioning federal funding upon adoption, HUD can accelerate and expand adoption of beneficial flexible zoning reforms nationwide.

Frequently Asked Questions
1. Why expend effort on flexible smarter zoning as opposed to more traditional, sweeping zoning reforms?

Many of the proposed zoning reforms that, if implemented, would go furthest to improve equity and the provision of fair housing have encountered considerable political challenges in the areas where exclusionary zoning is most prevalent and damaging. Flexible zoning reforms may appear less sweeping than traditional reforms, but they are far more feasible in practice. Providing additional ideas to help overcome those political barriers may be a powerful way to unlock improvements in equity.

2. Would giving small groups the power to opt into upzoning really produce additional housing? Would giving small groups the power to opt out considerably weaken zoning reforms?

To be clear, there is no suggestion to give small groups the power to opt into zoning that is more restrictive than current rules. Flexible zoning reform can often be more powerful than traditional zoning reform. Members of the Squamish Nation recently demonstrated the enormous power of economic incentives to upzone when 87% voted to approve the construction of 6,000 new homes on their territory. Similarly, a large fraction of the residents of Houston — recognizing that upzoning could make their properties more valuable — did not choose to opt their blocks out of recent zoning reform. Incentives for apartment owners to vote for redevelopment under the TAMA 38 scheme in Israel accounted for 35% of the new homes built in Tel Aviv in 2020.


If no individual landowners wanted to gain the economic benefits of being permitted to develop their lots, there would be no demand from others for zoning rules to stop development from proceeding. Most existing processes governing upzoning give disproportionate weight to the opinions of vocal but unrepresentative groups who want no change, even in areas where a large majority would otherwise support reform. Direct democracy at very small scales can let small groups of residents bypass those veto players and capture the economic benefits of allowing more housing.

3. Why would any state or local government implement flexible smarter zoning?

Many state and local leaders are aware of the enormous equity and growth benefits that better, more inclusionary zoning can deliver. However, such leaders are often frustrated by political and public resistance to simple upzoning attempted via traditional zoning processes. Smarter zoning techniques can allow upzoning to proceed in the many blocks and streets where it is popular, without being frustrated by the resistance from the few residents among whom it is not.

4. Would smarter zoning practices crowd out more sweeping zoning reforms?

Smarter zoning proposals are designed to supplement and assist traditional zoning reforms, not replace them. “Opt-in” zoning mechanisms are designed to allow opt-ins only to more equitable upzoning, not to more exclusionary zoning, so they cannot make matters worse. Similarly, “opt-out” mechanisms only apply where the promoters of an ambitious new pro-equity reform want a way to overcome strong political resistance to that specific reform.

Another objection is that smarter zoning might be seen to perpetuate local zoning control. But existing local zoning processes are structured to block change and empower local veto players. By contrast, smarter zoning techniques are designed so that groups who wish to capture the economic benefits of upzoning can use direct democracy to bypass existing veto players, in a way that has proven successful in other fields. Where smarter zoning is imposed by state law, it can hardly be said to be entrenching local control. And in any case, existing state powers to override local zoning will remain, as will the potential for future federal action on zoning.

5. Could smarter zoning policies harm renters?

Not if designed correctly. As explained above, smarter zoning codes can and should include strong provisions to protect renters.

6. How quickly could HUD and EPA develop a Smarter Zoning Code?

An initial draft of a model Smarter Zoning Code could likely be produced within three months. Testing with stakeholders should take no more than six months, meaning that a final code could be published by HUD within one year of the effort beginning.

7. Who is likely to object to smarter zoning?

  • Officials wedded to traditional zoning processes may not wish to try innovative methods to improve equity, but smarter zoning proposals have been published by the American Planning Association and have little risk of harm.

  • Resistance will arise from some residents of areas with exclusionary zoning. However, such resistance will be less than the resistance to universal upzoning mandates. And this resistance will be counterbalanced and often outweighed by the support of the many residents drawn by the economic benefits of upzoning for them and their families.

  • Advocates of aggressive zoning reform may complain that smarter zoning is not sufficiently assertive. One response to this objection is that federal powers to impose such upzoning are highly constrained by political gridlock and partisanship. Smarter zoning is a politically feasible way to advance equitable zoning in the near term, while the campaign for broader national zoning reform continues in the long term.

Creating an AI Testbed for Government

Summary

The United States should establish a testbed for government-procured artificial intelligence (AI) models used to provide services to U.S. citizens. At present, the United States lacks a uniform method or infrastructure to ensure that AI systems are secure and robust. Creating a standardized testing and evaluation scheme for every type of model and all its use cases is an extremely challenging goal. Consequently, unanticipated ill effects of AI models deployed in real-world applications have proliferated, from radicalization on social media platforms to discrimination in the criminal justice system. Increased interest in integrating emerging technologies into U.S. government processes raises additional concerns about the robustness and security of AI systems.

Establishing a designated federal AI testbed is an important part of alleviating these concerns. Such a testbed will help AI researchers and developers better understand how to construct testing methods and ultimately build safer, more reliable AI models. Without this capacity, U.S. agencies risk perpetuating existing structural inequities as well as creating new government systems based on insecure AI systems — both outcomes that could harm millions of Americans while undermining the missions that federal agencies are entrusted to pursue.

Improving Graduate-Student Mentorship by Investing in Traineeship Grants

Summary

Graduate students are more likely to persist in their academic decisions if engaged in positive mentoring experiences. Graduate students also cite positive mentoring experiences as the most important factor in completing a Science, Technology, Engineering, Math, or Medicine (STEMM) degree. In the United States, though, these benefits are often undermined by a research ecosystem that ties mentorship and training of graduate students by Principal Investigators (PIs) to funding in the form of research assistantships. Such arrangements often lead to unreasonable work expectations, toxic work environments, and poor mentor-mentee relationships.

To improve research productivity, empower predoctoral researchers to achieve their career goals, and increase the intellectual freedom that young scientists need to pursue productively disruptive scholarship, we recommend that federal science funding agencies:

1. Establish traineeship grant programs at all federal science funding agencies.

2. Require every PI receiving a federal research grant to implement an Individual Development Plan (IDP) for each student funded by that grant.

3. Require every university receiving federal training grants to create a plan for how it will provide mentorship training to faculty, and to actively consider student mentorship as part of faculty promotion, reappointment, and tenure processes.

4. Direct and fund federal science agencies to build professional development networks and create other training opportunities to help more PIs learn best practices for mentorship.

Creating a Public System of National Laboratory Schools

Summary

The computational revolution enables and requires an ambitious reimagining of public high-school and community-college designs, curricula, and educator-training programs. In light of a much-changed — and much-changing — society, we as a nation must revisit basic assumptions about what constitutes a “good” education. That means reconsidering whether traditional school schedules still make sense, updating outdated curricula to emphasize in-demand skills (like computer programming), bringing current perspectives to old subjects (like computational biology), and piloting new pedagogies (like project-based approaches) better aligned to modern workplaces. To do this, the Federal Government should establish a system of National Laboratory Schools in parallel to its existing system of Federally Funded Research & Development Centers (FFRDCs).

The National Science Foundation (NSF) should lead this work, partnering with the Department of Education (ED) to create a Division for School Invention (DSI) within its Technology, Innovation, and Partnerships (TIP) Directorate. The DSI would act as a platform analogous to the Small Business Innovation Research (SBIR) program, catalyzing Laboratory Schools by providing funding and technical guidance to federal, state, and local entities pursuing educational or cluster-based workforce-development initiatives.

The new Laboratory Schools would take inspiration from successful, vertically integrated research and design institutes like Xerox PARC and the Mayo Clinic in how they organize research, and from educational systems like Governor’s Schools and Early College High Schools in how they organize their governance. Each Laboratory School would serve a small, demographically and academically representative cohort while remaining financially sustainable on local per-capita education budgets.

Collectively, National Laboratory Schools would offer much-needed “public sandboxes” to develop and demonstrate novel school designs, curricula, and educator-training programs rethinking both what and how people learn in a computational future.

Challenge and Opportunity

Education is fundamental to individual liberty and national competitiveness. But the United States’ investment in advancing the state of the art is falling behind. 

Innovation in educational practice has been incremental. Neither the standards-based movement nor the charter-school movement departed significantly from traditional models. Accountability and outcomes-based incentives like No Child Left Behind suffer from the same issue.

The situation in research is not much better: NSF and ED’s combined spending on education research is barely twice the research and development budget of Nintendo. And most of that research focuses on refining traditional school models (e.g., presuming 50-minute classes and traditional course sequences).

Despite all these efforts, we are still seeing unprecedented declines in students’ math and reading scores.

Meanwhile, the computational revolution is widening the gap between what school teaches and the skills needed in a world where work is increasingly creative, collaborative, and computational. Computation’s role in culture, commerce, and national security is rapidly expanding; computational approaches are transforming disciplines from math and physics to history and art. School can’t keep up.

For years, research has told us that individualized, competency- and project-based approaches can reverse academic declines while aligning with the demands of industry and academia for critical thinking, collaboration, and creative problem-solving skills. But schools lack the capacity to follow suit.

Clearly, we need a different approach to research and development in education: We need prototypes, not publications. While studies evaluating and improving existing schools and approaches have their place, there is a real need now for “living laboratories” that develop and demonstrate wholly transformative educational approaches.

Schools cannot do this on their own. Constitutionally and financially, education is federated to states and districts. No single public actor has the incentives, expertise, and resources to tackle ambitious research and design — much less to translate research into practice on a meaningful scale. Private actors like curriculum developers or educational technologists sell to public actors, meaning private-sector innovation is constrained by public school models. Graduate schools of education won’t take the brand risk of running their own schools, and researchers won’t pursue unfunded or unpublishable questions. We commend the Biden-Harris administration’s Multi-Agency Research and Development Priorities for centering inclusive innovation and science, technology, engineering, and math (STEM) education in the nation’s policy agenda. But reinventing school requires a new kind of research institution, one which actually operates a school, developing educators and new approaches firsthand.

Luckily, the United States largely invented the modern research institution. It is time we do so again. Much as our nation’s leadership in science and technology was propelled by the establishment of land-grant universities in the late 19th century, we can trigger a new era of U.S. leadership in education by establishing a system of National Laboratory Schools. The Laboratory Schools will serve as vertically integrated “sandboxes” built atop fully functioning high schools and community colleges, reinventing how students learn and how we develop educators in a computational future.

Plan of Action

To catalyze a system of National Laboratory Schools, the NSF should establish a Division for School Invention (DSI) within its Technology, Innovation, and Partnerships (TIP) directorate. With an annually escalating investment over five years (starting at $25 million in FY22 and increasing to $400 million by FY26), the DSI could support development of 100 Laboratory Schools nationwide.

The DSI would support federal, state, and local entities — and their partners — in pursuing education or cluster-based workforce-development initiatives that (i) center computational capacities, (ii) emphasize economic inclusion or racial diversity, and (iii) could benefit from a high-school or community-college component.

DSI support would entail:

  1. Competitive matching grants modeled on SBIR grants. These grants would go towards launching Laboratory Schools and sustaining those that demonstrate success.
  2. Technical guidance to help Laboratory Schools (i) innovate while maintaining regulatory compliance, and (ii) develop financial models workable on local education budgets.
  3. Accreditation support, working with partner executives (e.g., Chairs of Boards of Higher Education) where appropriate, to help Laboratory Schools establish relationships with accreditors, explain their educational models, and document teacher and student work for evaluation purposes.
  4. Responsible-research support, including providing Laboratory Schools assistance with obtaining Federalwide Assurance (FWA) and access to partners’ Institutional Review Boards (IRBs).
  5. Convening and storytelling, raising awareness of and interest in Laboratory Schools’ mission and operations.

Launching at least ten National Laboratory Schools by FY23 would involve three primary steps. First, the White House Office of Science and Technology Policy (OSTP) should convene an expert group composed of (i) funders with a track record of attempting radical change in education and (ii) computational domain experts to design an evaluation process for the DSI’s competitive grants, secure industry and academic partners to help generate interest in the National Laboratory School System, and recruit the DSI’s first Director.

In parallel, Congress should issue one appropriations report asking NSF to establish a $25 million per year pilot Laboratory School program aligned with the Areas of Investment of the NSF Directorate for Technology, Innovation, and Partnerships (TIP)’s Regional Innovation Accelerators (RIA). Congress should issue a second appropriations report asking the Office of Elementary and Secondary Education (OESE) to release a Dear Colleague letter encouraging states that have spent less than 75% of their Elementary and Secondary School Emergency Relief (ESSER) or American Recovery Plan funding to propose a Laboratory School.

Finally, the White House should work closely with the DSI’s first Director to convene the Department of Defense Education Activity (DoDEA) and National Governors Association (NGA) to recruit partners for the National Laboratory Schools program. These partners would later be responsible for operational details like:

Focus will be key for this initiative. The DSI should exclusively support efforts that center:

  1. New public schools, not programs within (or reinventions of) existing schools.
  2. Radically different designs, not incremental evolutions.
  3. Computationally rich models that integrate computation and other modern skills into all subjects.
  4. Inclusive innovation focused on transforming outcomes for the poor and historically marginalized.

Conclusion

Imagine that the pencil had just been invented, and that we treated it the way we’ve treated computers in education. “Pencil class” and “pencil labs” would prepare people for a written future. We would debate the cost and benefit of one pencil per child. We would study how oral test performance changed when introducing one pencil per classroom, or after an after-school creative-writing program.

This all sounds absurd because the pencil and writing are integrated throughout our educational systems rather than being considered in isolation. The pencil transforms both what and how we learn, but only when embraced as a foundational piece of the educational experience.

Yet this siloed approach is precisely the approach our educational system takes to computers and the computational revolution. In some ways, this is no great surprise. The federated U.S. school system isn’t designed to support invention, and research incentives favor studying and suggesting incremental improvements to existing school systems rather than reimagining education from the ground up. If we as a nation want to lead on education in the same way that we lead on science and technology, we must create laboratories to support school experimentation in the same way that we establish laboratories to support experimentation across STEM fields. Certainly, the federal government shouldn’t run our schools. But just as the National Institutes of Health (NIH) support cutting-edge research that informs evolving healthcare practices, so too should the federal government support cutting-edge research that informs evolving educational practices. By establishing a National Laboratory School system, the federal government will take the risk and make the investments our communities can’t on their own to realize a vision of an equitable, computationally rich future for our schools and students.

Frequently Asked Questions

Who

1. Why is the federal government the right entity to lead on a National Laboratory School system?

Transformative education research is slow (human development takes a long time, as does assessing how a given intervention changes outcomes), laborious (securing permissions to test an intervention in a real-world setting is often difficult), and resource-intensive (many ambitious ideas require running a redesigned school to explore properly). When other fields confront such obstacles, the public and philanthropic sectors step in to subsidize research (e.g., by funding large research facilities). But tangible education-research infrastructure does not exist in the United States.

Without R&D demonstrating new models (and solving the myriad problems of actual implementation), other public- and private-sector actors will continue to invest solely in supporting existing school models. No private-sector actor will create a product for schools that don’t exist, no district has the bandwidth and resources to do it on its own, no state is incentivized to tackle the problem, and no philanthropic actor will fund an effort with a long, unclear path to adoption and prominence.

National Laboratory Schools are intended primarily as research, development, and demonstration efforts, meaning that they will be staffed largely by researchers and will pursue research agendas that go beyond the traditional responsibilities and expertise of local school districts. State and local actors are the right entities to design and operate these schools so that they reflect the particular priorities and strengths of local communities, and so that each school is well positioned to influence local practice. But funding and overseeing the National Laboratory School system as a whole is an appropriate role for the federal government.

2. Why is NSF the right agency to lead this work?

For many years, NSF has developed substantial expertise in funding innovation through the SBIR/STTR programs, which award staged grants to support innovation and technology transfer. NSF also has experience researching education through its Directorate for Education and Human Resources (EHR). Finally, NSF’s new Directorate for Technology, Innovation, and Partnerships (TIP) has a mandate to “[create] education pathways for every American to pursue new, high-wage, good-quality jobs, supporting a diverse workforce of researchers, practitioners, and entrepreneurs.” NSF is the right agency to lead the National Laboratory Schools program because of its unique combination of experience, in-house expertise, mission relevance, and relationships with agencies, industry, and academia.

3. What role will OSTP play in establishing the National Laboratory School program? Why should they help lead the program instead of ED?

ED focuses on the concerns and priorities of existing schools. Ensuring that National Laboratory Schools emphasize invention and reimagining of educational models requires fresh strategic thinking and partnerships grounded in computational domain expertise.

OSTP has access to bodies like the President’s Council of Advisors on Science and Technology (PCAST) and the National Science and Technology Council (NSTC). Working with these bodies, OSTP can easily convene high-profile leaders in computation from industry and academia to publicize and support the National Laboratory Schools program. OSTP can also enlist domain experts who can act as advisors evaluating and critiquing the depth of computational work developed in the Laboratory Schools. And annually, in the spirit of the White House Science Fair, OSTP could host a festival showcasing the design, practices, and outputs of various Laboratory Schools.

Though OSTP and NSF will have primary leadership responsibilities for the National Laboratory Schools program, we expect that ED will still be involved as a key partner on topics aligned with ED’s core competencies (e.g., regulatory compliance, traditional best practices, responsible research practices, etc.).

4. What makes the Department of Defense Education Activity (DoDEA) an especially good partner for this work?

The DoDEA is an especially good partner because it is the only federal agency that already operates schools. It reaches a student base that is large (more than 70,000 students, of whom more than 12,000 are high-school aged) as well as academically, socioeconomically, and demographically diverse. It is also more nimble than a traditional district, well positioned to appreciate and understand the full ramifications of the computational revolution, and strongly motivated to improve school quality and reduce turnover.

5. Why should the Division for School Invention (DSI) be situated within NSF’s TIP Directorate rather than its EHR Directorate?

EHR has historically focused on the important work of researching (and to some extent, improving) existing schools. The DSI’s focus on invention and secondary/postsecondary education, along with opportunities for alignment between cluster-based workforce-development strategies and Laboratory Schools’ computational emphasis, makes the DSI a much better fit for TIP, which is not only focused on innovation and invention overall, but is also explicitly tasked with “[creating] education pathways for every American to pursue new, high-wage, good-quality jobs, supporting a diverse workforce of researchers, practitioners, and entrepreneurs.” Situating the DSI within TIP will not preclude DSI from drawing on EHR’s considerable expertise when needed, especially for evaluating, contextualizing, and supporting the research agendas of Laboratory Schools.

6. Why shouldn’t existing public schools be eligible to serve as Laboratory Schools?

Most attempts at organizational change fail. Invention requires starting fresh. Allowing existing public schools or districts to launch Laboratory Schools will distract from the ongoing educational missions of those schools and is unlikely to lead to effective invention. 

7. Who are some appropriate partners for the National Laboratory School program?

Possible partners include:

8. What should the profile of a team or organization starting a Laboratory School look like? Where and how will partners find these people?

At a minimum, the team should have experience working with youth, possess domain expertise in computation, be comfortable supporting both technical and expressive applications of computation, and have a clear vision for the practical operation of their proposed educational model across both the humanities and technical fields.

Ideally, the team should also have piloted versions of their proposed educational model approach in some form, such as through after-school programs or at a summer camp. Piloting novel educational models can be hard, so the DSI and/or its partners may want to consider providing tiered grants to support this kind of prototyping and develop a pipeline of candidates for running a Laboratory School.

To identify candidates to launch and operate a Laboratory School, the DSI and/or its partners can:

What

1. What is computational thinking, and how is it different from programming or computer science?

A good way to answer this question is to consider writing as an analogy. Writing is a tool for thought that can be used to think critically, persuade, illustrate, and so on. Becoming a skilled writer starts with learning the alphabet and basic grammar, and can include craft elements like penmanship. But the practice of writing is distinct from the thinking one does with those skills. Similarly, programming is analogous to mechanical writing skills, while computer science is analogous to the broader field of linguistics. These are valuable skills, but are a very particular slice of what the computational revolution entails.

Both programming and computer science are distinct from computational thinking. Computational thinking refers to thinking with computers, rather than thinking about how to communicate problems and questions and models to computers. Examples in other fields include:

These transitions each involve programming, but are no more “about” computer science than a philosophy class is “about” writing. Programming is the tool, not the topic.

2. What are some examples of the research questions that National Laboratory Schools would investigate?

There are countless research agendas that could be pursued through this new infrastructure. Select examples include:

  1. Seymour Papert’s work on LOGO (captured in books like Mindstorms) presented a radically different vision for the potential and role of technology in learning. In Mindstorms, Papert sketches out that vision vis-à-vis geometry as an existence proof. Papert’s work demonstrates that research into making things more learnable differs from researching how to teach more effectively. Abelson and diSessa’s Turtle Geometry takes Papert’s work further, conceiving of ways that computational tools can be used to introduce differential geometry and topology to middle- and high-schoolers. The National Laboratory Schools could investigate how we might design integrated curricula combining geometry, physics, and mathematics by leveraging the fact that the vast majority of mathematical ideas tackled in secondary contexts appear in computational treatments of shape and motion.
  2. The Picturing to Learn program demonstrated remarkable results in helping staff to identify and students to articulate conceptions and misconceptions. The National Laboratory Schools could investigate how to take advantage of the explosion of interactive and dynamic media now available for visually thinking and animating mental models across disciplines.
  3. Bond graphs as a representation of physical dynamic systems were developed in the 1960s. These graphs enabled identification of “effort” and “flow” variables as new ways of defining power. This in turn allowed us to formalize analogies across electricity and magnetism, mechanics, fluid dynamics, and so on. Decades later, category theory has brought additional mathematical tools to bear on further formalizing these analogies. Given the role of analogy in learning, how could we reconceive people’s introduction to the natural sciences in cross-disciplinary language emphasizing these formal parallels?
  4. Understanding what it means for one thing to cause (or not cause) another, and how we attempt to establish whether this is empirically true, is an urgent and omnipresent need. Computational approaches have transformed economics and the social sciences. Whether the issue is COVID vaccine reliability, claims of election fraud, or the replication crisis in medicine and social science, our world is full of increasingly opaque systems and phenomena that our media environment is decreasingly equipped to tackle for and with us. An important tool in this work is the ability to reason about and evaluate empirical research effectively, which in turn depends on fundamental ideas about causality and about how to evaluate the strength and likelihood of various claims. Graphical methods in statistics offer a new tool complementing traditional, easily misused ideas like p-values, which dominate current introductions to statistics without leaving youth in a better position to meaningfully evaluate and understand statistical inference.

The specifics of these are less important than the fact that there are many, many such agendas that go largely unexplored because we lack the tangible infrastructure to set ambitious, computationally sophisticated educational research agendas.

3. How will the National Laboratory Schools differ from magnet schools for those interested in computer science?

The premise of the National Laboratory Schools is that computation, like writing, can transform many subjects. These schools won’t place disproportionate emphasis on the field of computer science, but rather will emphasize integration of computational thinking into all disciplines—and educational practice as a whole. Moreover, magnet schools often use selective enrollment in their admissions. National Laboratory Schools are public schools interested in the core issues of the median public school, and therefore it is important they tackle the full range of challenges and opportunities that public schools face. This involves enrolling a socioeconomically, demographically, and academically diverse group of youth.

4. How will the National Laboratory Schools differ from the Institute of Education Sciences’ Regional Education Laboratories?

The Regional Education Laboratories (RELs) of the Institute of Education Sciences (IES) do not operate schools. Instead, they convene and partner with local policymakers to lead applied research and development, often focused on actionable best practices for today’s schools (as exemplified by the What Works Clearinghouse). This is a valuable service for educators and policymakers. However, this service is by definition limited to existing school models and assumptions about education. It does not attempt to pioneer new school models or curricula.

5. How will the National Laboratory Schools program differ from tech-focused workforce-development initiatives, coding bootcamps, and similar programs?

These types of programs focus on the training and placement of software engineers, data scientists, user-experience designers, and similar tech professionals. But just as computational thinking is broader than just programming, the National Laboratory Schools program is broader than vocational training (important as that may be). The National Laboratory Schools program is about rethinking school in light of the computational revolution’s effect on all subjects, as well as its effects on how school could or should operate. An increased sensitivity to vocational opportunities in software is only a small piece of that.

6. Can computation really change classes other than math and science?

Yes. The easiest way to prove this is to consider how professional practice of non-STEM fields has been transformed by computation. In economics, the role of data has become increasingly prominent in both research and decision making. Data-driven approaches have similarly transformed social science, while also expanding the field’s remit to include specifically online, computational phenomena (like social networks). Politics is increasingly dominated by technological questions, such as hacking and election interference. 3D modeling, animation, computational art, and electronic music are just a few examples of the computational revolution in the arts. In English and language arts, multimedia forms of narrative and commentary (e.g., podcasts, audiobooks, YouTube channels, social media, etc.) are augmenting traditional books, essays, and poems. 

7. Why and how should National Laboratory Schools commit to financial and legal parity with public schools?

The challenges facing public schools are not purely pedagogical. Public schools face challenges in serving diverse populations in resource-constrained and highly regulated environments. Solutions and innovation in education need to be prototyped in realistic model systems. Hence the National Laboratory Schools must commit to financial and legal parity with public schools. At a minimum, this should include a commitment to (i) a per-capita student cost that is no more than twice the average of the relevant catchment area for a given National Laboratory School (the 2x buffer is provided to accommodate the inevitably higher cost of prototyping educational practices at a small scale), and (ii) enrollment that is demographically and academically representative (including special-education and English Language Learner participation) of a similarly aged population within thirty minutes’ commute, and that is enrolled through a weighted lottery or similarly non-selective admissions process.

8. Why are Xerox PARC and the Mayo Clinic good models for this initiative?

Both Xerox PARC and the Mayo Clinic are prototypical examples of hyper-creative, highly-functioning research and development laboratories. Key to their success inventing the future was living it themselves.

PARC researchers insisted on not only building but using their creations as their main computing systems. In doing so, they were able to invent everything from ethernet and the laser printer to the whole paradigm of personal computing (including peripherals like the modern mouse and features like windowed applications that we take for granted today).

The Mayo Clinic runs an actual hospital. This allows the clinic to innovate freely in everything from management to medicine. As a result, the clinic created the first multi-specialty group practice and integrated medical record system, invented the oxygen mask and G-suit, discovered cortisone, and performed the first hip replacement.

One characteristic these two institutions share is that they are focused on applied design research rather than basic science. PARC combined basic innovations in microelectronics and user interface to realize a vision of personal computing. Mayo rethinks how to organize and capitalize on medical expertise to invent new workflows, devices, and more.

These kinds of living laboratories are informed by what happens outside their walls but are focused on inventing new things within. National Laboratory Schools should similarly strive to demonstrate the future in real-world operation.

Why?

1. Don’t laboratory schools already exist? Like at the University of Chicago?

Yes. But there are very few of them, and almost all of those that do exist suffer from one or more issues relative to the vision proposed herein for National Laboratory Schools. First, most existing laboratory schools are not public. In fact, most university-affiliated laboratory schools have, over time, evolved to mainly serve faculty’s children. This means that their enrollment is not socioeconomically, demographically, or academically representative. It also means that families’ risk aversion may constrain those schools’ capacity to truly innovate. Most laboratory schools not affiliated with a university use their “laboratory” status as a brand differentiator in the progressive independent-school sector.

Second, the research functions of many laboratory schools have been hollowed out given the absence of robust funding. These schools may engage in shallow renditions of participatory action research by faculty in lieu of meaningful, ambitious research efforts. 

Third, most educational-design questions investigated by laboratory schools are investigated at the classroom or curriculum (rather than school design) level. This creates tension between those seeking to test innovative practices (e.g., a lesson plan that involves an extended project) and the constraints of traditional classrooms.

Finally, insofar as bona fide research does happen, it is constrained by what is funded, publishable, and tenurable within traditional graduate schools of education. Hence most research reflects the concerns of existing schools instead of seeking to reimagine school design and educational practice.

2. Why will National Laboratory Schools succeed where past efforts at educational reform (e.g., charter schools) have failed?

Most past educational-reform initiatives have focused on either supporting and improving existing schools (e.g., through improved curricula for standard classes), or on subsidizing and supporting new schools (e.g., charter schools) that represent only minor departures from traditional models.

The National Laboratory Schools program will provide a new research, design, and development infrastructure for inventing new school models, curricula, and educator training. These schools will have resources, in-house expertise, and research priorities that traditional public schools—whether district or charter or pilot—do not and should not. If the National Laboratory Schools are successful, their output will help inform educational practice across the U.S. school ecosystem. 

3. Don’t charter schools and pilot schools already support experimentation? Wasn’t that the original idea for charter and pilot schools—that they’d be a laboratory to funnel innovation back into public schools?

Yes, but this transfer hasn’t happened, for at least two reasons. First, the vast majority of charter and pilot schools are not pursuing fundamentally new models because doing so is too costly and risky. Charter schools can often perform more effectively than traditional public schools, but this is just as often because of problematic selection bias in enrollment as because their autonomy allows for more effective leadership and organizational management. Second, the politics around charter and pilot schools have become increasingly toxic in many places, which prevents new ideas from being considered by public schools or advocated for effectively by public leaders.

4. Why do we need invention at the school rather than at the classroom level? Wouldn’t it be better to figure out how to improve schools that exist rather than end up with some unworkable model that most districts can’t adopt?

The solutions we need might not exist at the classroom level. We invest a great deal of time, money, and effort into improving existing schools, but we underinvest in inventing fundamentally different schools. There are many design choices that we need to explore but that cannot be adequately developed through marginal improvements to existing models. One example is project-based learning, wherein students undertake significant, often multidisciplinary projects to develop their skills. Project-based learning at any serious level requires significant blocks of time that don’t fit in traditional school schedules and calendars. A second example is the role of computational thinking, as centered in this proposal. Meaningfully incorporating computational approaches into a school design requires new pedagogies, novel tools and curricula, and re-training of staff. Vanishingly few organizations do this kind of work as a result.

If and when National Laboratory Schools develop substantially innovative models that demonstrate significant value, there will surely need to be a translation process to enable districts to adopt these innovations, much as translational medicine brings biomedical innovations from the lab to the hospital. That process will likely need to involve helping districts start and grow new schools gradually, rather than district-wide overhauls.

5. What kinds of “traditional assumptions” need to be revisited at the school level?

The basic model of school assumes subject-based classes with traditionally licensed teachers lecturing in each class for 40–90 minutes a day. Students do homework, take quizzes and tests, and occasionally do labs or projects. The courses taught are largely fixed, with some flexibility around the edges (e.g., through electives and during students’ junior and senior high-school years).

Traditional school represents a compromise among curriculum developers, standardized-testing outfits, teacher-licensure programs, regulations, local stakeholder politics, and teachers’ unions. Attempts to change traditional schools almost always fail because of pressures from one or more of these groups. The only way to achieve meaningful educational reform is to demonstrate success in a school environment rethought from the ground up. Consider a typical course sequence of Algebra I, Geometry, Algebra II, and Calculus. There are both pedagogical and vocational reasons to rethink this sequence and instead center types of mathematics that are more useful in computational contexts (like discrete mathematics and linear algebra). But a typical school will not be able to simultaneously develop the new tools, materials, and teachers needed to do so.

6. Has anything like the National Laboratory School program been tried before?

No. There have been various attempts to promote research in education without starting new schools. There have been interesting attempts by states to start new schools (like Governor’s Schools), there have been some ambitious charter schools, and there have been attempts to create STEM-focused and computationally focused magnet schools. But there has never been a concerted attempt in the United States to establish a new kind of research infrastructure built atop the foundation of functioning schools as educational “sandboxes”.

How?

1. How will we pay for all this? What existing funding streams will support this work? Where will the rest of the money for this program come from?

For budgeting purposes, assume that each Laboratory School enrolls a small group of forty high school or community college students full-time at an average per capita rate of $40,000 per person per year. Half of that budget will support the functioning of the school itself. The remaining half will support a small research and development team responsible for curating and developing the computational tools, materials, and curricula needed to support the School’s educators. This would put the direct-service budget of the school solidly at the 80th percentile of current per capita spending on K–12 education in the United States. With these assumptions, running 100 National Laboratory Schools would cost ~$160 million, and investing $25 million per year would be sufficient to establish an initial 15 sites. This initial federal funding should be awarded through a 1:1 matching competitive-grant program funded by (i) the 10% of American Competitiveness and Workforce Improvement Act (ACWIA) fees associated with H-1B visas (which NSF is statutorily required to devote to public-private partnerships advancing STEM education), and (ii) the NSF TIP Directorate’s budget, alongside budgets from partner-agency programs (for instance, the Department of Education’s Education Innovation and Research and Investing in Innovation programs). For many states, these funds should also be layered atop their existing Elementary and Secondary School Emergency Relief (ESSER) and American Rescue Plan (ARP) awards.
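The budget arithmetic above can be checked with a quick back-of-the-envelope sketch. This is a minimal model of the memo’s stated assumptions; the variable names are illustrative.

```python
# Back-of-the-envelope model of the Laboratory School budget assumptions.
# Figures come from the memo; variable names are illustrative.

STUDENTS_PER_SCHOOL = 40
COST_PER_STUDENT = 40_000                     # dollars per student per year

school_budget = STUDENTS_PER_SCHOOL * COST_PER_STUDENT   # annual budget per school
direct_service = school_budget // 2           # half supports the school itself
rd_team = school_budget - direct_service      # half supports the R&D team

cost_100_schools = 100 * school_budget        # cost of running 100 schools
initial_sites = 25_000_000 // school_budget   # sites fundable at $25M per year

print(f"Per-school budget:        ${school_budget:,}")
print(f"Direct-service share:     ${direct_service:,} "
      f"(${direct_service // STUDENTS_PER_SCHOOL:,} per student)")
print(f"Cost of 100 schools:      ${cost_100_schools:,}")
print(f"Initial sites at $25M/yr: {initial_sites}")
```

As the sketch confirms, each school costs $1.6 million per year, so 100 schools cost $160 million and $25 million per year funds roughly 15 initial sites.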

2. Why is vertical integration important? Do we really need to run schools to figure things out?

Vertical integration (of research, design, and operation of a school) is essential because schools and teacher education programs cannot be redesigned incrementally. Even when compelling curricular alternatives have been developed under the auspices of an organization like the NSF, practical challenges in bringing those innovations to practice have proven insurmountable. In healthcare, the entire field of translational medicine exists to help translate research into practice. Education has no equivalent.

The vertically integrated National Laboratory School system will address this gap by allowing experimenters to control all relevant aspects of the learning environment, curricula, staffing, schedules, evaluation mechanisms, and so on. This means the Laboratory Schools can demonstrate a fundamentally different approach, learning from great research labs like Xerox PARC and the Mayo Clinic, much of whose success depended on tightly-knit, cross-disciplinary teams working closely together in an integrated environment.

3. What would the responsibilities of a participating agency look like in a typical National Laboratory School partnership?

A participating agency will have some sort of educational or workforce-development initiative that would benefit from the addition of a National Laboratory School as a component. This agency would minimally be responsible for:

4. How should success for individual Laboratory Schools be defined?

Working with the Institute of Education Sciences’ (IES) National Center for Education Research (NCER), the DSI should develop frameworks for collecting the qualitative and quantitative data necessary to document, understand, and evaluate the design of any given Laboratory School. Evaluation would include compliance with financial and legal parity requirements as well as assessment of student growth and work products.

Evaluation processes should include:

Success should be judged by a panel of experts that includes domain experts, youthworkers and/or school leaders, and DSI leadership. Dimensions of performance these panels address should minimally include depth and quality of students’ work, degree of traditional academic coverage, ambition and coherence of the research agenda (and progress on that agenda), retention of an equitably composed student cohort, and growth (not absolute performance) on diagnostic/formative assessments.

In designing evaluation mechanisms, it will be essential to learn from failed accountability systems in public schools. Specifically, it will be essential to avoid pushing National Laboratory Schools to optimize for the particular metrics and measurements used in the evaluation process. This means that the evaluation process should rest largely on holistic judgments made by expert panels rather than fixed rubrics or similarly inflexible mechanisms. Evaluation timescales should also be selected appropriately: for example, performance on diagnostic/formative assessments should be measured by examining trends over several years rather than year-to-year changes.

5. What makes the Small Business Innovation Research (SBIR) program a good model for the National Laboratory School program?

The SBIR program is a competitive, multiphase grant program to which small businesses submit proposals. SBIR awards smaller grants (~$150,000) to businesses at early stages of development, and makes larger grants (~$1 million) available to awardees who achieve certain progress milestones. SBIR and similar federal tiered-grant programs (e.g., the Small Business Technology Transfer, or STTR, program) have proven remarkably productive and cost-effective, with many studies finding that they are as efficient as or more efficient than the private sector on a per-dollar basis, judged by common measures of innovation like numbers of patents, papers, and so on.

The SBIR program is a good model for the National Laboratory School program; it is an example of the federal government promoting innovation by patching a hole in the funding landscape. Traditional financing options for businesses are often limited to debt or equity, and most providers of debt (like retail banks) for small businesses are rarely able or incentivized to subsidize research and development. Venture capitalists typically only subsidize research and development for businesses and technologies with reasonable expectations of delivering 10x or greater returns. SBIR provides funding for the innumerable businesses that need research and development support in order to become viable, but aren’t likely to deliver venture-scale returns.

In education, the funding landscape for research and development is even worse. There are virtually no sources of capital that support people to start schools, in part because the political climate around new schools can be so fraught. The funding that does exist for this purpose tends to demand school launch within 12–18 months: a timescale on which it is not feasible to design, evaluate, and refine an entirely new school model. Education is a slow, expensive public good: one that the federal government shouldn’t provision, but should certainly subsidize. That includes subsidizing the research and development needed to make education better.

States and local school districts lack the resources and incentives to fund such deep educational research. That is why the federal government should step in. By running a tiered educational research-grant program, the federal government will establish a clear pathway for prototyping and launching ambitious and innovative schools.

6. What protections will be in place for students enrolled in Laboratory Schools?

The state organizations established or selected to oversee Laboratory Schools will be responsible for approving proposed educational practices. That said, unlike in STEM fields, there is no “lab bench” for educational research: the only way we can advance the field as a whole is by carefully prototyping informed innovations with real students in real classrooms.

7. Considering the challenges and relatively low uptake of educational practices documented in the What Works Clearinghouse, how do we know that practices proven in National Laboratory Schools will become widely adopted?

National Laboratory Schools will yield at least three kinds of outputs, each of which is associated with different opportunities and challenges with respect to widespread adoption.

The first output is people. Faculty trained at National Laboratory Schools (and at possible educator-development programs run within the Schools) will be well positioned to take the practices and perspectives of National Laboratory Schools elsewhere (e.g., as school founders or department heads). The DSI should consider establishing programs to incentivize and support alumni personnel of National Laboratory Schools in disseminating their knowledge broadly, especially by founding schools.

The second output is tools and materials. New educational models that are responsive to the computational revolution will inevitably require new tools and materials—including subject-specific curricula, cross-disciplinary software tools for analysis and visualization, and organizational and administrative tools—to implement in practice. Many of these tools and materials will likely be adaptations and extensions of existing tools and materials to the needs of education.

The final output is new educational practices and models. This will be the hardest, but probably most important, output to disseminate broadly. The history of education reform is littered with failed attempts to scale or replicate new educational models. An educational model is best understood as the operating habits of a highly functioning school. Institutionalizing those habits is largely about developing the skills and culture of a school’s staff (especially its leadership). This is best tackled not as a problem of organizational transformation (e.g., attempting to retrofit existing schools), but rather one of organizational creation—that is, it is better to use models as inspirations to emulate as new schools (and new programs within schools) are planned. Over time, such new and inspired schools and programs will supplant older models.

8. How could the National Laboratory School program fail?

Examples of potential pitfalls that the DSI must strive to avoid include:

Improving Outcomes for Incarcerated People by Reducing Unjust Communication Costs

Summary

Providing incarcerated people opportunities to communicate with support networks on the outside improves reentry outcomes. As the COVID-19 pandemic continues to limit in-person interaction and use of electronic communication grows, it is critical that services such as video calling and email be available to people in prisons. Yet incarcerated people — and their support networks on the outside — pay egregious prices for electronic-communication services that are provided free to the general public. Video chatting with a person in prison regularly costs more than $1 a minute, and email costs are between $0.20 and $0.60 per message. A major reason rates are so high is that facilities are paid site commissions as a percentage of the amount spent on calls (ranging from 20% to 88%).
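To make the commission incentive concrete, here is a minimal sketch of how a call charge splits between the provider and the facility. The $1-per-minute rate and the 20%–88% commission range come from the memo; the 50% commission used in the example is a hypothetical mid-range value.

```python
# Hypothetical split of a prison video-call charge between the service
# provider and the facility's site commission. The $1/minute rate and the
# 20%-88% commission range are cited in the memo; 50% is illustrative.

def call_cost(minutes: float, rate_per_min: float = 1.00,
              commission: float = 0.50) -> tuple[float, float]:
    """Return (total charge, portion paid to the facility as commission)."""
    total = minutes * rate_per_min
    return total, total * commission

total, facility_cut = call_cost(30)   # a 30-minute video call
print(f"Total charge: ${total:.2f}; facility commission: ${facility_cut:.2f}")
```

Because the facility’s revenue scales directly with the price charged, facilities have little incentive to negotiate lower rates; capping prices at “just and reasonable” levels would remove that distortion.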

The Federal Communications Commission (FCC) has explicit authority to regulate interstate prison phone calls (called Inmate Calling Services, or ICS). However, the DC Circuit Court ruled in 2015 that video calls and emails are not covered under the definition of ICS and hence that the FCC does not have authority under the 1996 Telecommunications Act (47 U.S. Code) to regulate video calls or emails. They separately ruled that the FCC does not have authority under §276 of the Telecommunications Act to regulate site commissions. The DC Circuit Court ruling creates an imperative for Congressional action. Congress should revise the Telecommunications Act to clearly cover email and video calls in prisons and jails, capping costs of these communications at “just and reasonable” levels. In the interim, the FCC should try again to eliminate site commissions for telephone calls by relying on §201 of the Telecommunications Act.

Bringing Opportunities to Incarcerated Persons and Prison-Tech Startups

Summary

The Biden-Harris Administration should create a program that incentivizes unique prison-tech innovations by providing resources to help startups working in this space, specifically those that create solutions for individuals during and after their period of incarceration. The program would be structured as a partnership among several key government agencies, federal and state prison systems, and the private sector. For participating startups, the program would foster technical innovation, provide de-risking measures, connect viable product-market solutions, and establish equity-free funding opportunities. For individuals serving state and federal sentences, the program would improve rehabilitative efforts while in the corrections system, create potential job opportunities, and reduce recidivism rates. For the broader social good, the program would spur economic growth, create stronger communities, and contribute to more equitable outcomes.

Challenge and Opportunity

The United States has the highest number of incarcerated individuals worldwide: the U.S. prison population numbers nearly 1.9 million. Recidivism rates are equally alarming. Of the over 600,000 individuals released from state and federal prisons each year, more than two-thirds are rearrested within three years of release. Half of those rearrested are subsequently reincarcerated.

The cost of recidivism is extraordinarily high. Recidivism costs taxpayers at least $366 million per year, with a single recidivism incident estimated to impose as much as $150,000 in taxpayer burden. Recidivism also has massive social costs. Continuous reincarceration harms communities, breaks families, and contributes to generational systemic poverty. To break this cycle, we as a nation need to rethink how we approach incarceration and assign more importance to reintegration efforts.

A major contributor to the recidivism cycle is the prioritization of punitive measures over rehabilitative ones in U.S. prison systems. Punitive measures can isolate inmates from friends, family, and even children for years or decades. Moreover, instead of providing access to educational tools that could prepare them for meaningful work once released, prisons often shunt incarcerated individuals into low-level menial tasks that pay mere pennies per day. As a result, incarcerated individuals are often released without the skills needed to navigate life on the outside. They are left without financial means or dependable job prospects, and saddled with broken relationships and a lack of coping mechanisms. Coupled with the stigma of being labeled ex-offenders, many are pushed back into familiar but unproductive and societally unacceptable behaviors. Inevitably, many fall into the same patterns and reoffend.

It is also worth considering the economic benefits our nation is failing to capture from formerly incarcerated individuals. According to the U.S. Chamber of Commerce, the exclusion of formerly incarcerated job seekers from the workforce costs an estimated $78–87 billion in GDP annually. These individuals are excluded simply because of their ex-offender status and because poor training and job opportunities while in prison leave them “unskilled and unemployed.”

We can do better. Research shows that when incarcerated individuals are given access to tools that connect them to people and resources that can help them, those individuals are better equipped to reenter society. Such tools include regular video and voice calls plus texts and emails with friends and loved ones. They include the ability to participate in physical, mental, and spiritual programs and community-led activities, including programs and activities offered through digital services. And importantly, they include access to online educational programs and learning platforms administered through hardware designed to make learning easier, more robust, and better aligned with modern approaches to digital upskilling.

Indeed, there is a growing market for hardware, software, and other digital innovations designed to work within U.S. carceral systems. Startups focused on the prison-tech space have the knowledge and will to replace archaic, ineffective approaches to rehabilitation with more meaningful products and services. Unfortunately, prison-tech startups also face challenges not encountered by startups in other tech subsectors. 

First, many prison-tech startups are creating products and solutions narrowly targeted at small markets. For these players, finding customers means aligning with state and federal prison systems, a process unfamiliar to most budding tech companies.

Second, prison-tech startups, like all startups, often struggle to find funding. But while other startups can woo private funders with promises of equity, board seats, and concrete financial returns, success for prison-tech startups often means bettering lives and fostering meaningful experiences, outcomes that cannot be quantified through revenue alone. Many investment firms have little interest in funding such “tech for social good” enterprises.

Third, prison-tech startups invest substantial time and money to build equality, accessibility, and safety into their offerings. Access to this type of beneficial technology should not be limited to carceral institutions with budgets large enough to purchase it. At the same time, too many existing goods and services purport to serve incarcerated individuals equally and justly but are actually designed to maximize revenue generation. For example, systems like TRULINCS and JPay are pay-per-use services (for communication, money transfer, and other purposes) offered at extraordinarily high costs to incarcerated individuals and their networks on the outside — often at costs so high that the critical opportunities for connection they provide are simply unaffordable for those who need them most. Prison-tech products and services must be designed and used in ways that do not exploit, harm, or otherwise jeopardize the health and safety of incarcerated individuals and their families, nor unduly burden individuals and their families with exorbitant costs per use.

Plan of Action

The Biden-Harris Administration should launch a cross-agency initiative to support prison-tech startups. The initiative would offer federal grants to fund private companies and nongovernmental organizations (NGOs) providing beneficial prison-tech goods and services: e.g., carceral learning platforms and tools that can prepare incarcerated individuals to reenter society. The initiative should also provide incentives for prison-tech startups to hire formerly incarcerated individuals. Such incentives will create self-sustaining ecosystems that provide meaningful, long-term employment to former inmates, drive bottom-line success for prison-tech startups, and better communities in which startups are based.

Relevant agencies

Key agencies for initiative design, management, and administration include the following:

Program structure

As explained above, the proposed initiative comprises two pillars. The first pillar focuses on federal grant funding to help prison-tech startups launch. The second pillar focuses on later-stage financial incentives and market support that help prison-tech startups scale and achieve long-term financial sustainability, and that encourage prison-tech startups to provide good jobs to previously incarcerated individuals. 

Pillar 1: Federal grant funding

Making federal grant funding (i.e., non-equity funding) available will encourage innovative startups to explore needed prison-tech solutions while minimizing risk associated with investing in such a specific market segment. The best option for funding the grant portion of the initiative is a combined approach that makes use of multiple existing federal funding vehicles.

The primary vehicle would be the Small Business Innovation Research (SBIR) program. Under SBIR, companies are generally awarded up to $150,000 for a Phase I (P1) grant that runs for up to six months. Companies that successfully complete P1 and show favorable outcomes and market opportunities can become eligible for Phase II (P2) funding, which has a cap of $1 million for a two-year period of performance. This staggered approach requires companies to measure and demonstrate positive outcomes in order to be eligible for follow-on investments. We propose creating a specific prison-tech topic code for SBIR, which would allow NSF and ED to use this program to allocate prison-tech startup grants. Though SBIR funding generally does not go beyond P2, the federal government could consider adding Phase III (P3) funding opportunities for particularly promising prison-tech startups. In P3, companies would be eligible for awards of $5–10 million to scale up products and services to meet the needs of prisons nationwide. A summary of proposed SBIR award numbers and funding levels for this initiative is provided below.

Award numbers

Phase   Period of performance   Max. awards disbursed per cycle
P1      6 months – 1 year       10
P2      2 years                 5
P3      Est. 3 years            2

(Based on per-annum investment at 100% for P1, 50% for P2, and 33% for P3, with an average award of $7.5 million per P3 recipient.)

Estimated funding levels (first five years)

Year   Funding per phase                                 Total funding
1      P1: $1,500,000                                    $1,500,000
2      P1: $1,500,000; P2: $2,500,000                    $4,000,000
3      P1: $1,500,000; P2: $2,500,000                    $4,000,000
4      P1: $1,500,000; P2: $2,500,000; P3: $5,000,000    $9,000,000
5      P1: $1,500,000; P2: $2,500,000; P3: $5,000,000    $9,000,000
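As a quick arithmetic check, the per-year outlays above can be reproduced from the award counts, award sizes, and per-annum fractions in the table note. This is a minimal sketch, not part of the proposal itself; the phase start years (P1 from year 1, P2 from year 2, P3 from year 4) are inferred from the funding table, and all names are illustrative.

```python
# Sketch reproducing the proposed SBIR funding schedule from the memo's table note:
# P1: 10 awards x $150k (100% per annum), P2: 5 x $1M over 2 years (50%/yr),
# P3: 2 x $7.5M average over ~3 years (33%/yr).

PHASES = {
    "P1": {"awards": 10, "size": 150_000, "annual_fraction": 1.0},
    "P2": {"awards": 5, "size": 1_000_000, "annual_fraction": 0.5},
    "P3": {"awards": 2, "size": 7_500_000, "annual_fraction": 1 / 3},
}

# Phases come online in staggered years, per the five-year funding table.
START_YEAR = {"P1": 1, "P2": 2, "P3": 4}

def annual_funding(year: int) -> dict:
    """Per-phase outlay (in dollars) for a given program year."""
    return {
        phase: round(p["awards"] * p["size"] * p["annual_fraction"])
        for phase, p in PHASES.items()
        if year >= START_YEAR[phase]
    }

for year in range(1, 6):
    by_phase = annual_funding(year)
    print(f"Year {year}: {by_phase} -> total ${sum(by_phase.values()):,}")
```

Running the loop recovers the table's totals: $1.5 million in year 1, $4 million in years 2–3, and $9 million in years 4–5 once P3 awards begin.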

Additionally, the Digital Equity Act, part of the recently passed bipartisan infrastructure bill, includes a total of $2.75 billion over five years to provide digital training and skill-development opportunities to low-income and disadvantaged populations, including formerly incarcerated people. Through this act (specifically, its “Spurring Targeted Action through Competitive Grants” arm), the National Telecommunications and Information Administration (NTIA) will create an annual $125 million competitive grant program to support digital-inclusion projects undertaken by individual groups, coalitions, and/or communities of interest. The Biden-Harris administration should explore options for including the NTIA grants in the prison-tech startup initiative.

Pillar 2: Later-stage financial incentives and market support

The goal of this pillar is to support prison-tech startups through the crucial period between business launch and long-term fiscal sustainability, the period when many startups fail. Providing funding, market access, and overall business support during this period helps ensure continuity of offerings for correctional institutions and helps participating small businesses thrive.

The SBA and DOL should partner to provide continued financial incentives, such as extended tax credits and bonding programs, for prison-tech startups, particularly startups that hire previously incarcerated individuals. As part of this pillar, the DOL’s Work Opportunity Tax Credit (WOTC) should be doubled to $19,600 per individual per year for employees making at least $65,000 per year.1 DOL’s Federal Bonding Program should also be extended to cover the first 12 months or more of employment. Finally, the Biden-Harris administration should explore opportunities for retention bonuses or additional tax credits that encourage prison-tech startups to retain formerly incarcerated individuals beyond the first 12 months of employment.

The SBA and DOL should also help craft a business-to-prison product-matching service. This service will (1) allow prison-tech startups to focus on building the right solutions without worrying about customer acquisition, and (2) give prison management confidence that the prison-tech products and services they are purchasing are credible and tested. As part of this service, the SBA and DOL should help businesses understand institutional needs and navigate federal and state contracting processes. The SBA and DOL could also help prison-tech startups identify supplementary customer bases among institutions such as city and county jails, juvenile-detention facilities, and state-sponsored healthcare facilities and hospitals, providing market opportunities beyond the prison system. Expanded customer bases mean continued financial support and broader product reach, something every startup needs.