Fortifying America’s Future: Pathways for Competitiveness

The Federation of American Scientists (FAS) and Alliance for Learning Innovation (ALI) Coalition, in collaboration with the Aspen Strategy Group and Walton Family Foundation, released a new paper “Fortifying America’s Future: Pathways for Competitiveness,” co-authored and edited by Brienne Bellavita, Dan Correa, Emily Lawrence, Alix Liss, Anja Manuel, and Sara Schapiro. The report delves into the intersection of education, workforce, and national security preparedness in the United States, summarizing key findings from roundtable discussions in early 2024. These roundtable discussions gathered field experts from a variety of organizations to enrich the discourse and provide comprehensive recommendations for addressing this challenge. Additionally, a panel of topical experts discussed the subject matter of this report at the Aspen Security Forum on July 18th, 2024.

Read the full report here

The United States faces a critical human talent shortage in industries essential for maintaining technological leadership, including workforce sectors related to artificial intelligence, quantum computing, semiconductors, 5G/6G technologies, fintech, and biotechnology. Without a robust education system that prepares our youth for future careers in these sectors, our national security and competitiveness are at risk. Quoting the report, Dr. Katie Jenner, Secretary of Education for the State of Indiana, reiterated the idea that “we must start treating a strong educational system as a national security issue” during the panel discussion. Addressing these challenges requires a comprehensive approach that bridges the gaps between national security, industry, higher education, and K-12 education while leveraging local innovation. The paper outlines strategies for creating and promoting career pathways from K-12 into high-demand industries to maintain the U.S.’s competitive edge in an increasingly global landscape, including:

National security has historically driven educational investment (think Sputnik) and remains a bipartisan priority, providing a strong foundation for new legislation addressing emerging technologies like AI. For example, the CHIPS and Science Act, driven by competition with China, has spurred states to innovate, form public-private partnerships, and establish Tech Hubs. 

Mapping out workforce opportunities in other critical sectors such as aviation, AI, computer science, and biosecurity can ensure that the future workforce is gaining necessary skills to be successful in high-need careers in national security. For example, Ohio created a roadmap for advanced manufacturing with the Governor’s Office of Workforce Transformation and the Ohio Manufacturers’ Association outlining sector-specific competencies.

Innovative funding streams, employer incentives, and specialized intermediaries promoting career-connected learning can bridge gaps by encouraging stronger cross-sector ties in education and the workforce. For example, Texas allocated incentive funding to Pathways in Technology Early College High Schools (P-TECH) encouraging explicit career-connected learning opportunities that engage young people in relevant career paths. 

A Technical Assistance (TA) Center would offer tailored support based on each state’s emerging industries, guided by broader economic and national security needs. The center could bring together stakeholders such as community colleges, education leaders, and industry contacts to build partnerships and cross-sector opportunities. 

Virginia streamlined all workforce initiatives under a central state department, enhancing coordination and collaboration. The state also convenes representatives and cabinet members with backgrounds in workforce issues regularly to ensure alignment of education from K-12 through postsecondary.

Education R&D lacks sufficient investment and the infrastructure to support innovative solutions addressing defining challenges in education in the U.S. The New Essential Education Discoveries (NEED) Act would establish an agency called the National Center for Advanced Development in Education (NCADE) that would function as an ARPA-ED, developing and disseminating evidence-based practices supporting workforce pathways and skills acquisition for critical industries.

Giving young students opportunities to learn about different careers in these sectors will inspire interest and early experiences with diverse options in higher education, manufacturing, and jobs in critical industries, ensuring American competitiveness.

Implementing these recommendations will require action from a diverse group of stakeholders, including the federal government and leadership at the state and local levels. Check out the report to see how these steps will empower our workforce and uphold the United States’ leadership in technology and national security.

United States Discloses Nuclear Warhead Numbers; Restores Nuclear Transparency

Note: The initial NNSA release showed an incorrect graph that did not accurately depict the size of the stockpile for the period 2012-2023. The corrected graph is shown above.

[UPDATED VERSION] The Federation of American Scientists applauds the United States for declassifying the number of nuclear warheads in its military stockpile and the number of retired and dismantled warheads. The decision is consistent with America’s stated commitment to nuclear transparency, and FAS calls on all other nuclear states to follow this important precedent.

The information published on the National Nuclear Security Administration (NNSA) website today shows that the U.S. nuclear weapons stockpile as of September 2023 included 3,748 nuclear warheads, only 40 warheads off FAS’ estimate of 3,708 warheads.

The information also shows that the United States last year dismantled only 69 retired nuclear warheads, the lowest number since 1994.

FAS has previously requested that the United States release the size of the U.S. nuclear arsenal for FY2021, FY2022, and FY2023, but those requests were denied. FAS believes the information was wrongly withheld and that today’s declassification decision vindicates our belief that stockpile disclosures do not negatively affect U.S. security but should be provided to the public.

With today’s announcement, the Biden administration has restored the nuclear stockpile transparency that was created by the Obama administration, halted by the Trump administration, revived by the Biden administration in its first year, and then halted again for the past three years.

While applauding the U.S. disclosure, FAS also urged other nuclear-armed states to disclose their stockpile numbers and warheads dismantled. Excessive nuclear secrecy creates mistrust, fuels worst-case planning, and enables hardliners and misinformers to exaggerate nuclear threats.

What The Nuclear Numbers Show

The declassified stockpile numbers show that the United States maintained a total of 3,748 warheads in its military stockpile as of September 2023. The military stockpile includes both active and inactive warheads in the custody of the Department of Defense. The information also discloses weapons numbers for the previous two years, numbers that the U.S. government had previously declined to release.

Although there have been minor fluctuations, the numbers show that the U.S. nuclear weapons stockpile has remained relatively stable for the past seven years. The fluctuations during that period do not reflect decisions to increase or decrease the stockpile but are the result of warhead movements in and out of the stockpile as part of warhead life-extension and maintenance work.

Although the warhead numbers today are much lower than during the Cold War and there have been reductions since 2000, the reduction since 2007 has been relatively modest. Although the New START treaty has had some indirect effect on the stockpile size due to reduced requirements, the biggest reductions since 2007 have been caused by changes in presidential guidance, strategy, and modernization programs. The initial chart released by NNSA did not accurately show the 1,133-warhead drop during the period 2012-2023. NNSA later corrected the chart (see top of article). The chart below shows the number of warheads in the stockpile compared with the number of warheads deployed on strategic launchers over the years.

This graph shows the size of the U.S. nuclear stockpile over the years plus the portion of those warheads deployed on strategic launchers. The stockpile number for 2024 and the strategic launcher warheads are FAS estimates.

The information also shows that the United States last year dismantled only 69 retired nuclear warheads. That is the lowest number of warheads dismantled in a year since 1994. The total number of retired nuclear warheads dismantled 1994-2023 is 12,088. Retired warheads awaiting dismantlement are not in the DOD stockpile but in the DOE stockpile.

The information disclosed also reveals that there are currently another approximately 2,000 retired warheads in storage awaiting dismantlement. This number is higher than our most recent estimate (1,336) because of the surprisingly low number of warheads dismantled in recent years. Because dismantlement appears to be a lower priority, the number of retired weapons awaiting dismantlement today (~2,000) is only 500 weapons lower than the inventory was in 2015 (~2,500).

FAS’ Work For Nuclear Transparency

The Federation of American Scientists has worked since its creation to increase responsible transparency on nuclear weapons issues; in fact, the nuclear scientists who built the bomb created the “Federation of Atomic Scientists” to enable some degree of transparency in discussing the implications of nuclear weapons (see FAS newsletter from 1949). There are of course legitimate nuclear secrets that must remain so, but nuclear stockpile numbers are not among them.

This FAS newsletter from 1949 describes the debate and FAS effort in support of transparency of the US weapons stockpile.

One part of FAS’ efforts, spearheaded by Steve Aftergood, who for many years directed the FAS Project on Government Secrecy, has been to report on the government’s discussions about what needs to be classified and to repeatedly request declassification of unnecessary secrets such as the stockpile numbers. This work yielded stockpile declassifications in some years (2012-2018, 2021), while in other years (2019-2020 and 2022-2024) FAS’ declassification requests were initially denied. Most recently, in February 2024, an FAS declassification request for the stockpile numbers as of 2023 was denied, although the letter added: “If a different decision is made in the future, we will notify you of this decision” (see rejection letters). Given these denials, FAS in March 2024 sent a letter to President Biden outlining the important reasons for declassifying the numbers.

The new disclosure of the stockpile numbers suggests that denial of earlier FAS declassification requests in 2023 and 2024 may not have been justified and that future years’ numbers should not be classified.

The other part of FAS’ efforts has been the Nuclear Information Project, which works to analyze, estimate, and publish information about the U.S. nuclear arsenal. In 2011, when the Obama administration first declassified the history of the stockpile, the FAS estimate was only 13 warheads off from the official number of 5,112 warheads. The Project also works to increase the transparency of the other nuclear-armed states by providing the public with estimates of their nuclear arsenals. That work enabled Matt Korda on our team and others to discover the large missile silo fields China was building, and NATO officials say our data is “the best for open source information that doesn’t come from any one nation.”

Why Nuclear Transparency Is Important

FAS has since its founding years worked for maximum responsible disclosure of nuclear weapons information in the interest of international security and democratic values. In a letter to President Biden in March 2024 we outlined those interests.

After denials in 2023 and February 2024 of FAS declassification requests, FAS in March sent President Biden a letter outlining why the denials were wrong. Click here to download the full version of the letter.

First, responsible transparency of the nuclear arsenal serves U.S. security interests by supporting deterrence and reassurance missions with factual information about U.S. capabilities. Equally important, transparency of stockpile and dismantlement numbers demonstrates that the United States is not secretly building up its arsenal but is dismantling retired warheads instead of keeping them in reserve. This can help limit mistrust and worst-case scenario planning that can fuel arms races.

Second, the United States has for years advocated and promoted nuclear transparency internationally. Part of its criticism of Russia and China is their lack of disclosure of basic information about their nuclear arsenals, such as stockpile numbers. U.S. diplomats have correctly advocated for the importance of nuclear transparency for years, but their efforts are undermined if stockpile and dismantlement numbers are kept secret, because secrecy enables other nuclear-armed states to dismiss the United States as hypocritical.

Third, nuclear transparency is important for the debate in the United States (and other Allied democracies) about the status and future of the nuclear arsenal and strategy and how the government is performing. Opponents of declassifying nuclear stockpile numbers tend to misunderstand the issue by claiming that disclosure gives adversaries a military advantage or that the United States should not disclose stockpile numbers unless the adversaries do so as well. But nuclear transparency is not a zero-sum issue but central to the democratic process by enabling informed citizens to monitor, debate, and influence government policies. Although the U.S. disclosure is not dependent on other nuclear-armed states releasing their stockpile numbers, Allied countries such as France and the United Kingdom should follow the U.S. example, as should Russia and China and the other nuclear-armed states.

Acknowledgement: Mackenzie Knight, Jon Wolfsthal, and Matt Korda provided invaluable edits.

More information on the FAS Nuclear Information Project page.


This research was carried out with generous contributions from the Carnegie Corporation of New York, the New-Land Foundation, the Ploughshares Fund, the Prospect Hill Foundation, Longview Philanthropy, and individual donors.

The Blackouts During the Texas Heatwave Were Preventable. Here’s Why.

On Monday, July 9, nearly 3 million homes and businesses in Texas were suddenly without power in the aftermath of Hurricane Beryl. Today, four days later, over 1 million Texans are entering a fourth day powerless. The acting governor, Dan Patrick, said in a statement that restoring power will be a “multi-day restoration event.” As people wait for this catastrophic grid failure to be remedied, much of southeast Texas, which includes Houston, is enduring dangerous, extreme heat with no air conditioning amid an ongoing heatwave. 

Extreme Heat is the “Top Weather Killer”

As our team at FAS has explained, prolonged exposure to extreme heat increases the risk of developing potentially fatal heat-related illnesses, such as heat stroke, in which the human body reaches dangerously high internal temperatures. If a person cannot cool down, especially when the nights bring no relief from the heat, this high core temperature can result in organ failure, cognitive damage, and death. Extreme heat is often termed the “top weather killer,” as it is responsible for 2,300 official deaths a year and 10,000 attributed via excess-deaths analysis. With at least 10 lives already lost in Texas amid this catastrophic tragedy, excess heat and power losses are further compounding vulnerabilities, making the situation more dire.

Policy Changes Can Save Lives

These losses of life and power outages are preventable, and it is the job of the federal government to prevent them. Our team at FAS has previously called attention to the soaring energy demands and unprecedented heat waves that have placed the U.S. on the brink of widespread grid failure across multiple states, potentially jeopardizing millions of lives. In the face of widespread blackouts, restoring power across America is a complex, intricate process requiring seamless collaboration among various agencies, levels of government, and power providers, amid constraints extending beyond just the loss of electricity. There is also a need for transparent protocols for safeguarding critical medical services, and for frameworks to prioritize regions for power restoration, ensuring equitable treatment for low-income and socially vulnerable communities affected by grid failure events.

As a proactive federal measure, an Executive Order or an interagency Memorandum of Understanding (MOU) should mandate the expansion of public health and emergency response planning for widespread grid failure under extreme heat. This urgently needed action would help mitigate the worst impacts of future grid failures under extreme heat, safeguarding lives, the economy, and national security as the U.S. moves toward a more sustainable, stable, and reliable electric grid system. Given the gravity of these high-risk, increasingly probable scenarios facing the United States, it is imperative for the federal government to take a leadership role in assessing and directing planning and readiness capabilities to respond to this evolving disaster.

Image via NWS/Donald Sparks

Increasing the “Policy Readiness” of Ideas

NASA and the Defense Department have developed an analytical framework called the “technology readiness level” for assessing the maturity of a technology – from basic research to a technology that is ready to be deployed.  

A policy entrepreneur (anyone with an idea for a policy solution that will drive positive change) needs to realize that it is also possible to increase the “policy readiness” level of an idea by taking steps to increase the chances that a policy idea is successful, if adopted and implemented.  Given that policy-makers are often time constrained, they are more likely to consider ideas where more thought has been given to the core questions that they may need to answer as part of the policy process.

A good first step is to ask questions about the policy landscape surrounding a particular idea:

1. What is a clear description of the problem or opportunity?  What is the case for policymakers to devote time, energy, and political capital to the problem?

2. Is there a credible rationale for government involvement or policy change?  

Economists have developed frameworks for both market failure (such as public goods, positive and negative externalities, information asymmetries, and monopolies) and government failure (such as regulatory capture, the role of interest groups in supporting policies that have concentrated benefits and diffuse costs, limited state capacity, and the inherent difficulty of aggregating timely, relevant information to make and implement policy decisions).

3. Is there a root cause analysis of the problem? 

One approach that Toyota has used to answer this question is the “five whys,” which can prevent an analyst from settling for a superficial or incomplete explanation of a given problem.

4. What can we learn from past efforts to address the problem?  If this is a problem U.S. policymakers have been working on for decades without much success, is there a new idea worth trying, or an important change in circumstances?

5. What can we learn from a comparative perspective, such as the experiences of other countries or different states and regions within the United States?

6. What metrics should be used to evaluate progress? What strategy should policy-makers have for dealing with Goodhart’s Law?

Goodhart’s Law states that when a measure becomes a target, it ceases to be a good measure.  A police chief under pressure to reduce the rate of violent crime might reclassify certain crimes to improve the statistics.

7. What are the potential policy options, and an assessment of those options?  Who would need to act to approve and implement these policies?

This question – as is often the case – leads to more questions:

8. What are the documents that are needed to both facilitate a decision on the idea, and implement the idea?  

In the U.S. context, examples of these documents or processes include:

9. Has the idea been reviewed and critiqued by experts, practitioners, and stakeholders?  Is there a coalition that is prepared to support the idea?  How can the coalition be expanded?

10. How might tools such as discovery sprints, human-centered design, agile governance, and pilots be used to get feedback from citizens and other key stakeholders, and generate early evidence of effectiveness?

11. What steps can be taken to increase the probability that the idea, if approved, will be successfully implemented? 

For example, this might involve analyzing the capacity of the relevant government agencies to implement the recommended policy.

12. How can the idea be communicated to the public?  

For example, if you were a speechwriter, what stories, examples, quotes, facts and endorsements would you use to describe the problem, the proposed solution, and the goal?  What are the questions that reporters are likely to ask, and how would you respond to them?

Perhaps you have some experience with policy entrepreneurship and have suggestions on the right questions to ask about a policy idea to increase its “readiness level”. Comment on Tom’s LinkedIn post, where you can add wisdom that could be helpful to others learning about how to make positive change through policy.

Improving Government Capacity: Unleashing the capacity, creativity, energy, and determination of the public sector workforce

Peter Bonner is a Senior Fellow at FAS.

Katie: Peter, first, can you explain what government capacity means to you?

Peter: What government capacity means to me is ensuring that the people in the public sector, the federal government primarily, have the skills, the tools, the technologies, the relationships, and the talent they need to meet their agency missions and do their jobs.

Those agency missions are really quite profound. I think we lose sight of this: if you’re working at the EPA, your job is to protect human health and the environment. If you’re working at the Department of the Interior, it’s to conserve and protect our natural resources and cultural heritage for the benefit of the public. If you’re working for HHS, you’re enhancing the health and well-being of all Americans. If you’re working for the Department of Transportation, you’re ensuring a safe and efficient transportation system. And you can get into the national security agencies about protecting us from our enemies, foreign and domestic. These missions are amazing. Building that capacity so that people can do their jobs better and more effectively is a critical and noble undertaking. Government employees are stewards of what we hold in common as a people. To me, that’s what government capacity is about.

Mr. Bonner’s Experience and Ambitions at FAS

You’ve had a long career in government – but how is it that you’ve come to focus on this particular issue as something that could make a big difference?

I’ve spent a couple of decades building government capacity in different organizations and roles, most recently as a government executive and political appointee serving as an associate director at the Office of Personnel Management. Years ago I worked as a contractor with a number of different companies, building human capital and strategic consulting practices. In all of those roles, in one way or another, it’s been about building government capacity.

One of my first assignments when I worked as a contractor was on the Energy Star program, helping to bridge the gap between public sector interests (creating greater energy efficiency and reducing energy usage to address climate change) and private sector interests (making sure products were competitive and using market forces to demonstrate the effectiveness of federal policy). This work promoted energy efficiency across energy production, computers, refrigerators, HVAC equipment, even commercial buildings and residential housing. Part of the capacity-building piece was working with the federal staff and the federal clients who ran those programs, but also making sure they had the right collaboration skills to work effectively with the private sector around these programs and with other federal agencies. Agencies not only needed to work collaboratively with the private sector, but across agencies as well. Those collaboration skills, the skills to make sure they’re working jointly across agencies, don’t always come naturally because people feel protective of their own agency, their own budgets, and their own missions. So that’s an example of building capacity.

Another project I was involved in early on was helping to develop a training program for inspectors of underground storage tanks. That’s pretty obscure, but underground storage tanks have been a real challenge for the nation in creating groundwater pollution. We developed an online course using simulations on how to detect leaks in underground storage tanks. The capacity-building piece was getting the agencies and tank inspectors at the state and local level to use this new learning technology to make their jobs easier and more effective.

Capacity-building examples abound – helping OPM build human capital frameworks and improve operating processes, improving agency performance management systems, enhancing the skills of Air Force medical personnel to deal with battlefield injuries, and so on. I’ve been doing capacity building through HR transformation, learning, leadership development, strategy and facilitation, and human-centered design, and looking at how to develop HR and human capital systems that support that capacity building in the agencies. So across my career, those are the kinds of things I’ve been involved in around government capacity.

What brought you to FAS and what you’re doing now? 

I left my job as the associate director for HR Solutions at the Office of Personnel Management last May with the intent of finding ways to continue to contribute to the effective functioning of the federal government. This opportunity came about from a number of folks I’d worked with while at OPM and elsewhere.

FAS is in a unique position to change the game in federal capacity building through thought leadership, policy development, strategic placement of temporary talent, and initiatives to bring more science and technical professionals to lead federal programs. 

I’m really trying to help change the game in talent acquisition and talent management and how they contribute to government capacity. That ranges from upfront hiring in the HR arena through to onboarding and performance management and into program performance.

I think what I’m driven by at FAS is to really unleash the capacity, the creativity, the energy, the determination of the public sector workforce to be able to do their jobs as efficiently and effectively as they know how. So many people I know in the federal government have great ideas on how to improve their programs sitting in the bottom left-hand drawer of their desk or on their computer desktop, ideas they can never get around to because of everything else that gets in the way.

There are ways to cut through the clutter to help make hiring and talent management effective. Just in hiring: creative recruiting and sourcing for science and technical talent, using hiring flexibilities and hiring authorities on hand, equipping HR staffing specialists and hiring managers with the tools they need, working across agencies on common positions, accelerating background checks are all ways to speed up the hiring process and improve hiring quality.

It’s the stuff that gets in the way that inhibits their ability to do these things. So that unleashing piece is the real reason I’m here. On the talent management side, the goal is to move the needle a little bit on the perception of public sector work and federal government work, because the negative perception of what it’s like to work in the federal government, and the distrust in the federal government, is just enormous. The barriers there are profound. But if we can move the needle on that just a little bit, and if we can change the candidate experience of the person applying for a federal job so that, while it may be arduous, it results in a positive experience for them and for the hiring manager and HR staffing specialist, that then becomes the seed bed for a positive employee experience in the federal job. That then becomes the seed bed for an effective customer experience, because the linkage between employee experience and customer experience is direct. So if we can shift the needle on those things just a little bit, we start to change the perception of what public sector work is like, and tap into the energy that brought people to the public sector job in the first place, which by and large is the mission of the agency.

Using Emerging Technologies to Improve Government Capacity

How do you see emerging technologies assisting or helping that mission?

The emerging technologies in talent management are things that other sectors of the economy are working with and that the federal government is quickly catching up on. Everybody thinks the private sector has this figured out. Well, not necessarily. Private sector organizations also struggle with HR systems that effectively map to the employee journey and that provide analytics to guide HR decision-making along the way.

A bright spot for progress in government capacity is in recruiting and sourcing talent. The Army Corps of Engineers and the Department of Energy are using front-end recruiting software to attract people into their organizations – the Climate Corps, for example, or the Clean Energy Corps at the Department of Energy. They’re using those front-end recruiting systems to attract people and bring them in to submit their resumes and applications, which can create that positive initial candidate experience, and then take ’em through the rest of the process. There’s work being done in automating and developing more effective online assessments through USA Hire, for example, so that if you’re in a particular occupation, you can take an online test when you apply, and that test will qualify you for the certification list on that job.

Those are not emerging technologies, but they are being deployed effectively in government. There are also mobile platforms to quickly and easily communicate with applicants and candidates at different stages of the process. Those things are coming online, or are already online, in many of the agencies.

In addition to some experimentation with AI tools, I think one of the more profound pieces around technologies is what’s happening at the program level that is changing the nature of the jobs government workers do, which then impacts what kind of person an HR manager is looking for.

For example, while there are specific occupations focused on machine learning, AI, and data analytics, data literacy and acumen and the use of these tools are going to be part of everyone’s job in the future. So facility with those analytic tools and with the data visualization tools that are out there is going to have a profound impact on the jobs themselves. Then you back that up to: okay, what kind of person am I looking for here? I need somebody with that skill set coming in, or who can be easily up-skilled into it. That’s true for data literacy, data analytics, and some of the AI skill sets that are coming online. It’s not just the technologies within the talent management arena; it’s the technologies on the front lines and in the programs that determine what kind of person I’m looking for and impact those jobs.

The Significance of Permitting Reform for Hiring

You recently put on a webinar for the Permitting Council. Do you mind explaining what that is and what the goal of the webinar was?

The Permitting Council was created under what’s called the FAST-41 legislation – Title 41 of the Fixing America’s Surface Transportation Act – which is designed to improve the capacity and the speed at which environmental permits are approved so that we can continue with federal projects. Permitting has become a real hot-button issue right now because the Inflation Reduction Act, the CHIPS and Science Act, and the Bipartisan Infrastructure Law created all of these projects in the field – some on federal lands, some on state and local lands, and some on tribal or private sector lands – that then create the need to do an environmental permit of some kind in order to get approval to build.

So under the Bipartisan Infrastructure Law, we’re trying to create internet for all, for example – particularly providing internet access in rural communities that haven’t had it before, and to people who perhaps couldn’t afford it. That requires building cell towers and transmission lines on federal lands, which then requires permits, which in turn require permitting staff or a set of permitting contractors to actually go in and do that work.

Permitting has been, from a talent perspective, under-resourced. Agencies have not had the capacity – they have not had the staff – to keep up with the permits necessitated by these new pieces of legislation. So getting the right people hired and in place – the environmental scientists, community planners, marine biologists, landscape folks, and fish and wildlife people who can advise on how best to do the environmental impact statements or categorical exclusions required under the National Environmental Policy Act – has been a challenge. Building that capacity in the agencies responsible for permitting is really a high-leverage point for these pieces of legislation, because if I can’t build the cell tower, I don’t realize the positive results from the Bipartisan Infrastructure Law. And you can think of the range of things those pieces of legislation have fostered around the country, from clean water systems in underserved communities, to highways, bridges, roads, and airports.

Another example is offshore wind. You need marine biologists to help do the environmental impact statements around building wind turbines offshore and to examine the effects on marine habitats. It’s those people that the Department of the Interior, the Department of Energy, and the Department of Commerce need to hire to come in, run those programs, and do those permits effectively. That’s what the Permitting Council supports.

One of the things we worked on with OPM and the Permitting Council together was creating a webinar that got the hiring managers and the HR staffing specialists in the room at the same time to talk about the common bottlenecks they face in the hiring process. After doing outreach and research, we created journey maps and a set of personas to identify a couple of the most salient, common, and high-leverage challenges they face.

Overcoming Hiring Bottlenecks for Permitting Talent, a webinar presented to government hiring managers, May 2024

We looked at the whole hiring ecosystem – from what gets in the way in recruiting and sourcing all the way through onboarding; to position descriptions, and what you do if you don’t have an adequate one up front when you’re trying to hire that environmental scientist; to the background check and suitability processes. What do you do when things get caught in suitability? If you can’t bring those folks on board in a timely fashion, you risk losing them.

We focused on a couple of the key challenges in that webinar, and we had maybe 60 or 70 people there – hiring managers and HR staffing specialists – who took away a set of tools they can use to accelerate and improve the hiring process and bring high-quality hires on quickly to assist with the permitting.

The Permitting Council has representatives from each of the agencies that do permitting and works with them on cross-agency activities. The council also has funding from some of these pieces of legislation to foster the permitting process – whether through new software or through people and process improvements – so that permits get done as quickly as possible. So that’s what the webinar was about. We’re talking about doing a second one to look at the more systemic, policy-related challenges in permitting hiring.

The Day One Project 2025

FAS has launched its Day One Project 2025, a massive call for nonpartisan, science-based policy ideas that a next presidential administration can utilize on “day one” – whether the new administration is Democrat or Republican. One of the areas we’ve chosen to focus on is Government Capacity. Will you be helping evaluate the ideas that are submitted?

I’ve had input into the Day One Project, particularly around the talent pieces in the government capacity initiative, and also around procurement and innovation in that area. I think it has the potential to help set the stage for talent reform more broadly – be it legislative, policy, or regulatory reform, or change to the risk-averse culture we have in the federal government. The impact of that Day One Project could be pretty profound if we get the right energy behind it. One of the things that I’ve known for a while, but that has become clear to me over the past five months working with FAS, is that there are black boxes in the talent management environment in the federal government. What I mean by that is that work goes into a specialized area of expertise, and nobody knows what happens in that specialized area until something pops out the other end.

How do you shed light on the inside of those black boxes so that what happens is more transparent? Take position descriptions, for instance, when agencies are trying to hire someone. Sometimes the job needs to be reclassified because it has changed dramatically from the previous position description. I know a little about classification and what happens in the classification process, but to most people looking in from the outside – to hiring managers – that’s a black box. They don’t know what goes on within the classification process, or whether it will be worthwhile for them once they have the position description at the other end and are able to do an effective job announcement. Shedding light on that, I think, has the potential to increase transparency and trust between the hiring manager and the HR folks – between the program people and the human capital people.

The same goes for creating that greater transparency for candidates: telling them, when they come in and apply for a job, where they are in the hiring process and whether they made the cert list; if they are on the cert list, what’s next in their assessment and the process; and if they’ve gone through the interview, where we are in the deliberations about offering them the job. Same thing in suitability. There are black boxes all the way across, and creating transparency and communication around them will go a long way, again, toward moving the needle on the perception of what federal work is and what it’s like to work in the system. It’s a long answer to the question, but I can summarize it by saying that I think we are in a target-rich environment here. There’s a lot of opportunity to help change the game.

Photo Depicts Potential Nuclear Mission for Pakistan’s JF-17 Aircraft

Due to longstanding government secrecy, analyzing Pakistan’s nuclear weapons program is fraught with uncertainties. While it is known that Pakistan–along with many other nuclear-armed states–is modernizing its nuclear capabilities and fielding new weapons systems, little official information has been released regarding these plans or the status of its arsenal.

One of the many questions researchers have been asking concerns the modernization of Pakistan’s nuclear-capable aircraft and their associated air-launched cruise missiles (ALCMs). It has long been assumed that the Mirage III and Mirage V fighter-bombers are the two aircraft with a nuclear delivery role in the Pakistan Air Force (PAF). The Mirage V is thought to have a strike role with Pakistan’s limited supply of nuclear gravity bombs, while the Mirage III has been used to conduct test launches of Pakistan’s dual-capable Ra’ad-I (Hatf-8) ALCM, as well as the follow-on Ra’ad-II. The Ra’ad was first tested in 2007 and has since remained Pakistan’s only nuclear-capable air-launched cruise missile.

The U.S. Air Force National Air and Space Intelligence Center (NASIC) reported in 2017 that the Ra’ad cruise missile was “conventional or nuclear,” a term normally used to describe a dual-capable system.

In order to retire its aging Mirage III and V aircraft and bolster defense production, Pakistan has procured over 130 operational JF-17 aircraft–which are jointly produced with China–and plans to acquire more in the future. During the 2024 Pakistan Day Parade, the PAF also announced a JF-17 PFX (Pakistan Fighter Experimental) project to maximize the operational lifespan and modernize the capabilities of the JF-17 aircraft.

Over the past few years, several reports have suggested Pakistan may incorporate the dual-capable Ra’ad ALCM onto the JF-17 so that the newer aircraft could eventually take over the nuclear strike role from the Mirage III/Vs. However, little information has been revealed about the status of this procurement and whether the JF-17s will replace the Mirage III and Vs in the nuclear mission. That is, until March 2023, when an aviation photographer captured an image that could help answer some of these lingering questions.

A possible new nuclear mission

During rehearsals for the 2023 Pakistan Day Parade (which was subsequently canceled), an image surfaced of a JF-17 Thunder Block II carrying what was reported to be a Ra’ad ALCM. Notably, this was the first time such a configuration had been observed in public.

Photo credit: Rana Suhaib/Snappers Crew

FAS was able to purchase the original image. To determine which type of Ra’ad appears in the JF-17 image–the original Ra’ad-I or the extended-range Ra’ad-II–we compared it to the Ra’ad-I and -II missiles displayed in the 2017, 2018, 2019, 2021, 2022, and 2024 Pakistan Day Parades (the parades in 2020 and 2023 were canceled), where they were showcased alongside other nuclear-capable missiles such as the Nasr, Ghauri, Shaheen-IA and -II, and the Babur-1A.

Between 2017–when the Ra’ad-II was first publicly unveiled–and 2022, there were very few observable differences between the Ra’ad-I and -II. During this period, both missiles featured a new engine air intake, and although the Ra’ad-II was presented as having nearly double the range capability, this was not clearly observable through external features.

Comparison of Ra’ad missiles from Pakistan Day Parades

In 2017, Pakistan unveiled the Ra’ad-II air-launched cruise missile, an enhanced version of the nuclear-capable Ra’ad. Both the Ra’ad-I and -II have since been featured in the annual Pakistan Day Parades and share a similar design. However, the latest version of the Ra’ad-II, first displayed in 2022, has a noticeably new tail fin configuration and reportedly has a longer range of up to 600 km.

However, a new version of the Ra’ad-II was displayed at the 2022 Pakistan Day Parade. Notably, this version had an ‘x-shaped’ tail fin configuration, as opposed to the ‘twin-tail’ configuration seen on previous versions of the missile. The subsequent 2024 Pakistan Day Parade also showcased the two distinct versions of the Ra’ad with their respective tail fin arrangements.

The fin arrangement of the missile photographed on the JF-17 appears to more closely match the ‘twin-tail’ configuration of the Ra’ad-I than the newer ‘x-shaped’ tail of the Ra’ad-II – especially since it is unlikely that an outdated version of the Ra’ad-II would be used in a flight test intended to demonstrate state-of-the-art capabilities.

Pakistan is also developing a conventional, anti-ship variant of the Ra’ad ALCM, known as the Taimoor, that can be launched from the JF-17. Photos of the missile indicate that the two designs are highly similar, although the Taimoor also appears to feature an ‘x-tail’ fin configuration, and its length is reported to be 4.38 meters. The ‘x-tail’ configuration would suggest that the missile photographed on the JF-17 was not the Taimoor; however, for additional clarity, we measured multiple parade images of both versions of the Ra’ad and compared their lengths to that of the photographed missile.

We took an image of the Ra’ad-I from the 2019 Pakistan Day Parade and used the Vanishing Point feature in Photoshop to add gridded planes that simulate a 3D space, in order to account for the angle at which the image was taken and the depth at which the missile sits relative to the side of the truck. After identifying the make and model of the vehicle carrying the missile (which appears to be an early version of the Hino 500 Series FM 2630), we used the truck’s trailer axle spread of approximately 1.3 meters and wheelbase of 4.24 meters as reference values and used Photoshop’s measuring tool to render an approximate length. We found it to be around 4.9 meters.

To estimate the dimensions of the Ra’ad-II, we started with a photo from the 2022 Pakistan Day Parade. Using the same methodology with Photoshop’s Vanishing Point tool to account for the angle of the photo and the distance from the missile’s position in the center of the vehicle to the foremost gridded plane that measures the length of the truck, we roughly estimated the size of the missile to be around 4.9 meters.

We also double-checked the dimensions of the cruise missile in the JF-17 image. Since we know that the JF-17 is roughly 14.3 meters long, we used that number as a reference value and employed Photoshop’s Vanishing Point and measuring tool again to render an approximate length of the missile, given its lowered position in relation to the edge of the fuselage. The result was 4.9 meters, which matches the reported dimensions of the Ra’ad-I and -II ALCM as well as our estimated measurements. This measurement is also longer than the Taimoor’s reported 4.38-meter length.
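
All three estimates reduce to the same proportional scaling: once the perspective is corrected with Vanishing Point, a known reference length in the image (the truck’s 4.24-meter wheelbase, or the JF-17’s roughly 14.3-meter fuselage) yields a meters-per-pixel factor that is applied to the missile’s measured extent. A minimal sketch of that arithmetic, with hypothetical pixel values chosen purely for illustration:

```python
def estimate_length_m(ref_len_m, ref_len_px, target_len_px):
    """Convert a pixel measurement to meters using a known reference
    length measured in the same perspective-corrected plane."""
    return ref_len_m / ref_len_px * target_len_px

# Hypothetical pixel values (not the actual measurements): the JF-17
# fuselage, known to be ~14.3 m, spans 1430 px after Vanishing Point
# correction, and the missile spans 490 px in the same plane.
missile_m = estimate_length_m(14.3, 1430, 490)
print(round(missile_m, 1))  # -> 4.9
```

Because the result scales linearly with the reference measurement, the reference dimension is the dominant error source, so decimeter-level differences between estimates should not be over-interpreted.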

While it is possible that the missile could be an old Ra’ad-II, given that the 2017 version also had a ‘twin-tail’ configuration, that version of the Ra’ad-II appears to be outdated and is therefore unlikely to be utilized in a flight test. Still, it is possible that more information, images, or statements from the Pakistan government could surface that answer some of these questions. 

Observing the differences between the Ra’ad-I and Ra’ad-II raises a question: how was Pakistan able to nearly double the Ra’ad’s range – from an estimated 350 km to 550 km, and then to 600 km for the newest version – without noticeably enlarging the missile to carry more fuel? Possible answers include a more efficient Ra’ad-II engine design, lighter-weight construction materials, or a reduced payload.

These measurements offer additional evidence to support our conclusion that the missile observed in the photographed image of the JF-17 is the Ra’ad-I ALCM.

Implications for Pakistan’s nuclear forces

Given the lack of publicly available information from the government of Pakistan about its nuclear forces, we must rely on these types of analyses to understand the status of Pakistan’s nuclear arsenal. From these observations, it is likely that Pakistan has made significant progress toward equipping its JF-17s with the capability to eventually supplement–and possibly replace–the nuclear strike role of the aging Mirage III/Vs. Additionally, it is evident that Pakistan has redesigned the Ra’ad-II ALCM, but little information has been confirmed about the purpose or capabilities associated with this new design. It is also unclear whether either of the Ra’ad systems has been deployed, but this may only be a question of when rather than if. Once deployed, it remains to be seen if Pakistan will also continue to retain a nuclear gravity bomb capability for its aircraft or transition to stand-off cruise missiles only. 

This all takes place against the larger backdrop of an ongoing and deepening nuclear arms competition in the region. Pakistan is reportedly pursuing the capability to deliver multiple independently targetable reentry vehicles (MIRVs) with its Ababeel land-based missile, India is pursuing MIRV technology for its Agni-P and Agni-5 missiles, and China has deployed MIRVs on a number of its DF-5B ICBMs as well as on the DF-41. In addition to the Ra’ad ALCM, Pakistan has also been developing other short-range, lower-yield nuclear-capable systems, such as the Nasr (Hatf-9) ballistic missile, that are designed to counter conventional military threats from India below the strategic nuclear level.

These developments, along with heightened tensions in the region, have raised concerns about accelerated arms racing as well as new risks for escalation in a potential conflict between India and Pakistan, especially since India is also increasing the size and improving the capabilities of its nuclear arsenal. This context presents an even greater need for transparency and understanding about the quality and intentions behind states’ nuclear programs to prevent mischaracterization and misunderstanding, as well as to avoid worst-case force buildup reactions.

The author would like to thank David La Boon and Decker Eveleth for their invaluable guidance and feedback on using Photoshop’s Vanishing Point feature.

Dr. Pierre-Clément Simon and Dr. Casey Icenhour, Idaho National Laboratory, Developing the Future of Fusion Energy

DOE’s Office of Technology Transitions is holding a series of webinars on cutting-edge technologies being developed at the DOE National Labs – and the transformative applications they could have globally for energy access. We sat down with the people behind these technologies – the experts who make that progress possible. These interviews highlight why a strong energy workforce is so important, from the lab into commercial markets. They have been edited for length and do not necessarily reflect the views of the DOE. Be sure to attend DOE’s next National Lab Discovery Series webinar on multiphysics modeling for fusion energy on Wednesday, June 26.

Dr. Pierre-Clément Simon and Dr. Casey Icenhour come from different backgrounds but share similar passions: driving forward progress in fusion energy and mentoring early career scientists. At Idaho National Lab, they do both. As computational scientists at INL, they contribute to the development of groundbreaking technologies in the world of fusion. Dr. Simon works on FENIX, a new multiphysics modeling program, and Dr. Icenhour works on MOOSE, the foundational modeling framework that underpins FENIX – systems used to simulate the coupled physics inside fusion energy devices.

The Road to Idaho

Dr. Simon grew up in France, where he pursued engineering science for both his undergraduate and Master’s degrees. During his studies, he questioned where he would use the skills and knowledge he was gaining. “I wanted to make sure I applied them to something useful to society, and I asked myself what the big challenges were that we’re facing today. And climate change was one of the main ones for me – and from there working in energy became something that I was very passionate about.” (Simon)

He continued on to his PhD at Penn State, focusing his energy on nuclear fission research. After graduating, he faced a tough decision – stay in the US, or head back home to France. What helped him decide was the culture of American research that allows for ambitious ideas, exploration, and even failure early in one’s career. 

“You can’t do science if you’re being too cautious. The early stages of your career are really important to take those risks, grow, challenge the status quo, and have an impact. And for me, the labs – and especially INL – were a great place for that.” (Simon)

Dr. Icenhour started closer to home – growing up in western North Carolina, he got his start at Western Carolina University in the electrical engineering program. His studies there and later at North Carolina State University for his PhD led him to combine his electrical engineering background with a focus in nuclear engineering and plasma physics. 

As a first-generation college student, Dr. Icenhour didn’t immediately see the labs as a career option – he assumed he would head into industry after graduation, maybe returning to academia in the future to fuel his passion for teaching. It wasn’t until he discovered and participated in DOE’s Graduate Student Research Award program at Oak Ridge National Laboratory that that changed. 

“When I went to Oak Ridge, I realized that the labs served this vital need in between academia and industry – they’re doing the big science that [those two] might not be willing or able to pay for because of the size of the investment…I felt that if I wanted to make an impact in a multifaceted way – not only on research, but on collaborations with industry and working with students in service to energy and climate change – that this was the place I wanted to be.” (Icenhour)

Dr. Icenhour continued his work on plasma physics, electromagnetics, and the MOOSE framework at Idaho National Laboratory while finishing his PhD – completing it in 2023 and converting to a full-time staff member at INL.

Physics Modeling of the Future

Dr. Icenhour and Dr. Simon are both computational scientists at INL – but focused on different programs. Dr. Icenhour began working on the MOOSE (Multiphysics Object Oriented Simulation Environment) framework during his PhD, and has helped translate its capabilities to FENIX (Fusion ENergy Integrated multiphys-X). Dr. Simon leads the development of FENIX – a modeling system that is able to incorporate multiple frameworks like MOOSE and apply them to fusion simulations. 

In basic terms, MOOSE is a multiphysics modeling framework that lets a user simulate how systems behave when different, potentially highly coupled areas of physics interact. It allows scientists to test those interactions – how a material responds to heat transfer, electromagnetic forces, solid mechanics, and contact with other materials in different environments.
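
To make this tangible: MOOSE simulations are typically defined in block-structured text input files. The fragment below follows the framework’s documented minimal steady-state diffusion example – an illustration of the input syntax drawn from MOOSE’s public documentation, not an INL fusion model:

```
[Mesh]
  type = GeneratedMesh   # simple built-in 2D grid
  dim = 2
  nx = 10
  ny = 10
[]

[Variables]
  [u]                    # the unknown field being solved for
  []
[]

[Kernels]
  [diffusion]
    type = Diffusion     # the physics applied to the variable
    variable = u
  []
[]

[BCs]
  [left]
    type = DirichletBC
    variable = u
    boundary = left
    value = 0
  []
  [right]
    type = DirichletBC
    variable = u
    boundary = right
    value = 1
  []
[]

[Executioner]
  type = Steady
  solve_type = NEWTON
[]

[Outputs]
  exodus = true
[]
```

Each bracketed block declares one piece of the problem – the mesh, the unknowns, the physics kernels, the boundary conditions, and the solver – and that modular structure is what lets applications like FENIX compose additional physics on top of the same framework.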

Scientists use programs like MOOSE to test out designs and functions of new technologies like fission and fusion that are expensive and time- and labor-intensive to test out in reality. Having modeling systems that can reliably simulate how certain designs will interact with different environments saves money and people power, and allows for more creative and ambitious experiments. These programs can rapidly increase the pace of research, development, and deployment of new technologies. 

FENIX takes that work a step further – using the lessons learned and the capabilities of MOOSE and other frameworks, combining them, and applying them to fusion systems. For example, a researcher could use MOOSE to validate how a material tile responds to heat exposure, and then use FENIX to incorporate other programs like Cardinal (developed at Argonne) and TMAP8 (Tritium Migration Analysis Program, version 8, developed at INL) to understand how neutron-generated heat and tritium implantation would affect the same tile.

It sounds complex, but Dr. Simon and Dr. Icenhour describe it as using building blocks – starting small and slowly increasing the complexity. 

“Under the MOOSE ecosystem we’re building whole reactor models – the entirety of an advanced reactor core. That’s what we want to do for fusion…Being able to do these fully integrated models can help us develop resources that industry can use for rapid design iteration and engineering. We’re starting small and then building big simulations that can be utilized in an intelligent way to get the answers we need to solve these challenges.” (Icenhour)

Importantly, these programs are or will be completely open source – anyone with an internet connection can download, use, and contribute to MOOSE, and will soon be able to do the same with FENIX. This makes collaboration with a much wider network of scientists possible, and the team at Idaho has worked with labs and students across the US, Italy, and the UK. Dr. Simon explained that his team has developed trainings for these programs, and why it’s necessary to keep them open source: “The fusion industry will not exist if we don’t have the workforce that’s needed.” (Simon)

The Power of Mentorship

Dr. Icenhour and Dr. Simon work tirelessly to move these technologies forward, and they have already accomplished quite a bit. Dr. Icenhour actually developed the electromagnetics module of the MOOSE framework as part of his PhD – now it’s being used as part of the larger initiative. “My greatest accomplishment is being able to contribute something that I worked for years on that other folks see as important to their research. The sense of accomplishment I feel from that is incomparable to a lot of other aspects of the job.”

Dr. Simon has had his share of technical accomplishments as well, but shared the pride he felt when he recently received the INL Lab Director’s Award for Inclusive Diversity, given for his efforts to support international and early career researchers at the lab. “When you want to do great science, you need a fantastic community with a lot of diverse ideas. If you only have the same type of people doing the research, you’re always going to end up with the same outputs, with the same limitations.”

But more than any awards or achievements, Dr. Simon and Dr. Icenhour both emphasized that they feel their most important work is mentoring other researchers. They are both still early career themselves, and feel a responsibility to support others in pursuing lab careers. Both are members of the Early Career Researchers Association at INL, with Dr. Simon acting as the current Chair and Dr. Icenhour as the Professional Development Officer. 

Dr. Simon spoke about the challenges of first coming to the US as an international student – “My first full discussion in English was at customs. I was blessed to have a lot of people that were willing and able to mentor and guide me – there’s a long list of people that really changed my career. I want to do my best to pay it forward.”

Dr. Icenhour’s experience during his internship at ORNL was similar: “[My experience] at Oak Ridge really introduced me to that way of working and the opportunities I might have, and that changed my career. The mentorships and experiences I received there and the opportunity to go made all the difference.” 

Combined, they oversee five interns, and spoke about one student in particular that they are mentoring currently – a graduate student intern who, with the support of Dr. Simon and Dr. Icenhour, has been accepted to multiple National Science Foundation and DOE Fellowships. “I have never been so proud of a student as when we were proud of [our intern] at the end of the summer…It’s [his] accomplishment, and he did the work – but that showed me that I was doing something right as far as being a mentor, and that made me feel really proud.” (Icenhour)

Ultimately, both scientists are contributing a great deal not only to fusion and fission science, but to the field of computational science as a whole. Their journeys haven’t been easy, but their perseverance and commitment to bringing others along with them makes it possible. “My ability to be resilient – even when things go wrong, I keep going. Solving these problems is very challenging, and my ability to keep going and stay motivated is something I’m very proud of.” (Simon)

We’re Entering a New Period, as Revealed by FAS Nuclear Arsenal Data Published in SIPRI Yearbook 2024

Goodbye, decades of nuclear weapon reduction?

Hans Kristensen and Matt Korda with the FAS Nuclear Information Project write in the new SIPRI Yearbook, released today, that the world’s nuclear arsenals are on the rise, massive modernization programs are underway, and nuclear weapons are becoming more prominent in military strategies and rhetoric.

It is clear that the gradual reductions in nuclear stockpiles that characterized the post-Cold War period are over, and that the world is sliding back into nuclear competition and––in some cases––an arms race.

The development is in stark contrast to the promises made by many nuclear-armed states to reduce nuclear risks and seek a world without nuclear weapons.

The SIPRI Yearbook is published by the Stockholm International Peace Research Institute (SIPRI) and is one of the most widely cited sources of information on nuclear weapons. The nuclear data is derived from the research and analysis the Nuclear Information Project uses to produce the Status of World Nuclear Forces on the FAS website.

The SIPRI chapter describes the nuclear weapon modernization programs underway in each nuclear-armed state and provides estimates of how many nuclear warheads each country possesses. The research team, which includes Kristensen, Korda, Eliana Johns, and Mackenzie Knight, estimates that the combined global inventory of nuclear warheads is approximately 12,120. Of these, around 3,900 are estimated to be deployed on missiles and aircraft (2,100 of which are on high operational alert on ballistic missiles). Thousands more (some 5,680) are stored in special depots for deployment if necessary, and the remaining 2,540 warheads or so are retired and awaiting dismantlement.
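
The categories in the estimate sum exactly to the reported global total, which can be checked with simple arithmetic (figures as given above):

```python
# Warhead categories as reported in the SIPRI Yearbook 2024 estimate
deployed = 3_900   # on missiles and at bomber bases
stored = 5_680     # held in depots, available for deployment
retired = 2_540    # retired, awaiting dismantlement

total = deployed + stored + retired
print(total)  # -> 12120, the reported global inventory
```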

Moreover, with the increased nuclear competition, the research team reports that government transparency of nuclear forces is decreasing.

Read it here: SIPRI Yearbook 2024 nuke chapter

This research was carried out with generous contributions from the Joseph Rowntree Charitable Trust, Longview Philanthropy, New-Land Foundation, Ploughshares Fund, the Prospect Hill Foundation, and individual donors.

Get Ready, Get Set, FESI!: Putting Pilot-Stage Clean Energy Technologies on a Commercialization Fast Track

It may sound dramatic, but “Valleys of Death” are slowing the technology development the United States needs to achieve its energy security and innovation goals this decade. As emerging clean energy technologies move along the innovation pipeline from first concept to commercialization, they encounter hurdles that can prove to be a death knell for young startups. These “Valleys of Death” are gaps in funding and support that the Department of Energy (DOE) hasn’t quite figured out how to fill – especially for projects that require less than $25 million.

The International Energy Agency (IEA) estimates that almost 35% of the CO2 emissions reductions needed to reach net-zero by 2050 depend on technologies that are not yet past the demonstration stage – and this share is even higher in harder-to-decarbonize sectors like long-haul transportation and heavy industry. Meeting that target will require a massive effort over the next ten years to bring these technologies to deployment readiness in time.

Although programs exist within DOE to address different barriers to innovation, they are largely constrained to specific types of technologies and limited in the type of support they can provide. This has led to a discontinuous support system with gaps that leave technologies stranded as they wait in the “valleys of death” limbo. A “Fast Track” program at DOE – supported by the CHIPS and Science-authorized Foundation for Energy Security and Innovation (FESI) – would remove obstacles for rapidly-growing startups that are hindered by traditional government processes. FESI is uniquely positioned to be a valuable tool for DOE and its allies as they seek to fill the gaps in the technology innovation pipeline.

Where does FAS come in?

The Department of Energy follows the lead of other agencies that have established agency-affiliated foundations to help achieve their missions, like the Foundation for the National Institutes of Health (FNIH) and the Foundation for Food & Agriculture Research (FFAR). These models have proven successful at facilitating easier collaboration between agencies and philanthropy, industry, and communities while guarding against conflicts of interest that might arise from such collaboration. Notably, in 2020, the FNIH coordinated a public-private working group, ACTIV, between eight U.S. government agencies and 20 companies and nonprofits to speed up the development of the most promising COVID-19 vaccines. 

As part of our efforts to support DOE in standing up its new foundation with the Friends of FESI Initiative, FAS is identifying potential use cases for FESI – structured projects that the foundation could take on as it begins work. The projects must forward DOE’s mission in some way, with a particular focus on accelerating clean energy technology commercialization.

In early April, we convened leaders from DOE, philanthropy, industry, finance, the startup community, and fellow NGOs to workshop a few of the existing ideas for how to implement a Fast Track program at DOE. We kicked things off with some remarks from DOE leaders and then split off into four breakout groups for three discussion sessions.

In these sessions, participants brainstormed potential challenges, refinements, likely supporters, and specific opportunities for each idea. Each discussion focused on FESI’s unique value-add for each concept and on how FESI and DOE could best complement each other’s work to operationalize the idea. The four main ideas are explored in more detail below.

Support Pilot-scale Technologies on the Path to Commercialization 

The technology readiness level (TRL) framework has been used to determine an emerging technology’s maturity since NASA first started using it in the 1970s. The TRL scale begins at “1” when a technology is in the basic research phase and ends at “9” when the technology has proven itself in an operating environment and is deemed ready for full commercial deployment. 

However, getting to TRL 9 alone is insufficient for a technology to actually get to demonstration and deployment. For an emerging clean technology to be successfully commercialized, it must be completely de-risked for adoption and have an established economic ecosystem that is prepped to welcome it. To better assess true readiness for commercial adoption, the Office of Technology Transitions (OTT) at the Department of Energy (DOE) uses a joint “TRL/Adoption Readiness Level (ARL)” framework. As depicted by the adoption readiness level scale below, a technology’s path to demonstration and deployment is less linear in reality than the TRL scale alone suggests.

Source: The Office of Technology Transitions at the Department of Energy

There remains a significant gap in federal support for technologies trying to progress through the mid-stages of the TRL/ARL scales. Projects that fall within this gap require additional testing and validation of their prototype, and private investment is often inaccessible until questions are answered about the market relevance and competitiveness of the technology.
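The joint TRL/ARL view can be sketched in code. The function below is purely a hypothetical illustration – the single-number ARL score and the gap thresholds are our simplifying assumptions, not DOE definitions (DOE’s actual ARL framework scores many adoption-risk dimensions) – of how pairing the two scales identifies technologies stranded in this mid-stage gap.

```python
# Illustrative sketch only: the 1-9 single-number ARL and the threshold
# values below are simplifying assumptions for illustration, not DOE's
# official Adoption Readiness Level methodology.

def in_support_gap(trl: int, arl: int) -> bool:
    """Return True when a technology sits in the mid-stage funding gap:
    technically maturing (mid TRL) but not yet adoption-ready (low ARL)."""
    if not (1 <= trl <= 9) or not (1 <= arl <= 9):
        raise ValueError("TRL and ARL are scored here on a 1-9 scale")
    return 4 <= trl <= 7 and arl <= 4

# A lab-validated prototype (mid TRL) with little established market
# (low ARL) falls in the gap; a demonstrated, market-ready one does not.
print(in_support_gap(5, 2))   # True
print(in_support_gap(9, 8))   # False
```

The point of the sketch is simply that technical maturity and adoption readiness are independent axes: a technology can climb the TRL scale while remaining stranded on the ARL scale, which is exactly the population a gap-filling program would target.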

FESI could contribute to a pilot-scale demonstration program to help small- and medium-scale technologies move from mid-TRLs to high-TRLs and low to medium ARLs by making flexible funding available to innovators that DOE cannot provide within its own authorities and programs. Because of its unique relationship as a public-private convener, FESI could reach the technologies that are not mature enough, or don’t qualify, for DOE support, and those that are not quite to the point where there is interest from private investors. It could use its convening ability to help identify and incubate these projects. As it becomes more capable over time, FESI might also play a role in project management, following the lead of the Foundation for the NIH.

Leverage the National Labs for Tech Maturation 

The National Laboratories have long worked to facilitate collaboration with private industry to apply Lab expertise and translate scientific developments to commercial application. However, there remains a need to improve the speed and effectiveness of collaboration with the private sector.

A Laboratory-directed Technology Maturation (LDTM) program, first proposed by the EFI Foundation, would enable the National Labs to allocate funding for technology maturation projects. This program would be modeled after the highly successful DOE Office of Science Laboratory-directed Research and Development (LDRD) program and would focus on taking ideas at the earliest Technology Readiness Levels (TRLs) and translating them to proof of concept – from TRL 1 and 2 to TRL 3. It would translate scientific discoveries coming out of the Labs into technology applications with great potential for demonstration and deployment. FESI could increase the effectiveness of this effort by lowering the transaction costs of working with the private sector. It could also be a clearinghouse for LDTM-funded scientists who need partners for their projects to succeed, or could support an Entrepreneur-in-Residence or entrepreneurial postdoc program to house such partners.

While FESI would be a practical convener of non-federal funding for this program, the magnitude of the funding needed to establish it may not be well-suited to an initial project for the foundation to take on. It is estimated that each project would be in the ballpark of $5-20 million, and funding a full portfolio – which private sponsors are more likely to be interested in – is a nine-figure venture. Supporting an LDTM program is a promising idea for further down the line as FESI grows and matures.

Align Later-stage R&D Market Needs with Corporate Interest via a Commercialization Consortium

Industry and investors often struggle to connect with government-sponsored technologies that fit their plans and priorities. At the same time, government-sponsored researchers often struggle to navigate the path to commercialization for new technologies.

Based on a model widely used by the Department of Defense (DOD), an open consortium is a mechanism to convene industry and highlight relevant opportunities coming out of DOE-funded work. The model creates an accessible and flexible pathway to move U.S.-funded inventions toward commercial outcomes.

FESI could function as the Consortium Management Organization (CMO), pictured below, to help structure interactions and facilitate communications between DOE sponsors and award recipients while freeing government staff from “picking winners.” As the CMO, FESI would issue task orders and handle contracting per the consortium agreement, which would be organized under DOE’s other transactions authority (OTA). In this model, FESI could work with DOE staff in the applied R&D offices and the Office of Clean Energy Demonstrations (OCED) to identify opportunities and needs in the development pipeline, and in parallel work with consortium members (including holders of DOE subject inventions, industry partners, and investors) to build teams and scope projects that advance targeted technology development efforts.

This consortium could help work out the kinks in the pipeline, ensuring that successful technologies in the applied offices have sufficient “runway” to reach TRL 7 and that OCED has a healthy pipeline of candidate technologies for scaled demonstrations. FESI could also mitigate the offtake risk that is known to stall first-of-a-kind projects, such as financing for a lithium extraction project. Partners in industry and the investment community would be aligned – and could potentially provide cost share – in order to gain access to technologies emerging from DOE subject inventions.

The Time is Right

This workshop comes at a prime time for FESI. The Secretary of Energy appointed the inaugural FESI board—composed of 13 leaders in innovation, national security, philanthropy, business, science, and other sectors—in mid-May. In the coming months, the board will formally set up the organization, hire staff, adopt an agenda, and begin to pursue projects that will make a real impact to advance DOE’s mission. As Friends of FESI, we want to see the foundation position itself for the grand impact it is designed to have.

The above proposals are actionable and affordable projects that a young FESI is uniquely positioned to achieve. That said, supporting pilot-stage demonstrations is only one area where FESI can make an impact. If you have additional ideas for how FESI could leverage its unique flexibility to accelerate the clean energy transition, please reach out to our team at fesifriends@fas.org. You can also keep up with the Friends of FESI Initiative by signing up for our email newsletter.

“I knew FAS is a group that really seeks to do good”: A Conversation with Dr. Rosina Bierbaum

Trying to sum up a varied and impressive career can be an impossible task – especially when that career is still going strong. But as Rosina Bierbaum steps down from her position as Vice Chair of FAS’s Board of Directors, Jonathan Wilson sat down to find out more about how her science career began, and to glean just a few pearls of wisdom that she’s picked up during her time at the forefront of science policy in this country.

Jonathan Wilson: I know that you started off early on with an interest in marine biology. Where did that come from? 

Rosina Bierbaum: Well, I think it was because my dad had a small boat store. And the family went water-skiing, canoeing, and sail-boating on the rivers and small lakes in Pennsylvania. I grew up in the smoggy steel town of Bethlehem, Pennsylvania, so visits to these pristine lakes and waters were special and close to my heart. And then I read Rachel Carson’s book, The Sea Around Us. And that really made me want to preserve the waters of the planet and especially got me excited about the oceans. It exposed me to an amazing example of a woman in science – and even now, there are still some antiquated ideas about women [not belonging] in science.

On that note, I’m curious about when you were coming up early on, whether you got any kind of discouragement or pushback on pursuing a career in science or even studying science? 

Well, not really. Both my parents had not gone to college and really wished that they could have. And so they encouraged all of us to do so. We would wake up for every NASA space launch, no matter what time of day or night it was, to watch ‘science in action’ on our little black and white TV. My parents were always very interested in science. They encouraged me to enter the science fairs. My older brother did. My older sister did. And I did. So, I felt exactly the opposite – science was cool. And then in high school, I was lucky enough to have freshman and sophomore science teachers who encouraged me to do after-school work with them to help prepare labs. In fact, they also encouraged me to take summer courses in math at Lehigh University, which was only six blocks away from me, but at that time didn’t yet enroll women. I actually never felt the discouragement that I know a lot of women have. My older sister is an atmospheric chemist. And she definitely felt it was much harder for her than I think it is for ecologists like me, because there were already more women in biology. When I think back on it, though, the two high school teachers who encouraged me were women in my crucial teen years. But most of my mentors in college and graduate school who also believed in me and encouraged me to go further were men.

It’s interesting because you have a sister who’s a chemist. You have this glittering science policy career. It strikes me that your parents must have had this kind of innate curiosity about the world. Do you ever think, Okay, if my dad or my mom had gone to college, this is what they would have done? Did they have scientific minds?

Yes, I think so. My mom actually did become a nurse before the five children showed up. And so she was fascinated by all things medical for the rest of her life, and other disciplines of science, too. And Dad followed in his father’s footsteps initially, which was as a grocer and a butcher, in small-town Bethlehem. You had populations from all over the world who would walk to the steel plant near us and buy things from the store on the way home. He had ultraviolet lamps in the store to keep down bacteria, for example. And so he was always thinking about, ‘Why does this work? How does this work?’ And he was very intrigued with our science experiments. So yes, I think he had an “engineering” mind. He did say he wished that he had been able to go to college. In his 70s, he actually took chemistry courses at the local community college, intending, of course, to impress my older sister! And I remember being in graduate school myself and we would often talk about homework assignments and the design of my experiments together.

Reading about your early career and your education, it’s clear that pretty early on you set yourself apart. Of course, being a woman in a field dominated by men at the time, that’s one element. But there’s also the element of the tension back then between scientists and government policy workers. You’ve said that some of your scientist colleagues were very negative about you going to do a Congressional Fellowship – they weren’t crazy about you working with politicians. I’m curious if these tensions ever grated on you – being one of the few women in some of these scientific environments, and then being one of the few scientists eager to go work on Capitol Hill. 

Well, first of all, I was very lucky that I went right from graduate school into the Congressional Office of Technology Assessment, the late great “OTA”, which is only defunded [meaning, Bierbaum says, Congress could vote to fund it again and resurrect it], but was done away with in the [former Speaker Newt] Gingrich Congress. There I was able to learn how to work in a policy domain in a less scary or startling fashion – how to take what had been a narrow and deep science PhD and expand into learning about politics and economics, the social science aspects, and the engineering aspects with a team.

But it was true that I was exceedingly shocked the very first day that I was a congressional fellow. I went to a House Science Committee hearing, and it was on ozone depletion in the stratosphere. And there were eight men who were wonderful academic leaders in this field trying to speak to one member of Congress who was, of course, a lawyer, as most of them are – and it was a terrible conversation. There was really no information shared between the two sides. And then that whole team of experts from a ‘great University in the Northeast’ got offstage. And one of the environmental groups’ lawyers got up and talked to a lone member of Congress who was there and they were able to exchange real information. 

It was one of those epiphanies. I realized that all the hard and good scientific research and accomplishments out in the ivory towers that aren’t translated into usable information simply won’t get used. That made me think for the first time that maybe this shouldn’t just be a one-year congressional fellowship to learn how policy works, but to actually work to bring science into the policy world, and – equally important – to bring the policy needs back out to the academic world. 

Did it ever become frustrating or old to you – the work of translating between these two communities of politicians and scientists?

It was actually very exciting. What was surprising in conducting the first congressional assessment on acid rain was how little the scientific uncertainties stopped the Congress from deciding what to do! There were huge questions in the 1980s of which pollutants to control, over how big of a region, how much to reduce, and what ecological endpoints even exist. And they answered those questions fairly quickly: let’s go for sulfur dioxide first, and let’s tackle a big region of the country. About a 50% decrease in the loading of hydrogen ions in the Eastern lakes could come from about 50% emissions reduction from the Midwest. After quickly deciding that, then Congress spent 10 years arguing over who pays and the political aspects. 

My first boss, Bob Friedman, asked me to draw a diagram of how we were going to do this assessment, how Congress should think about the impacts of climate change, and how they could build it into the Clean Air Act of 1990. So, I drew a very linear diagram – start by thinking about the sources. You should think about reactions as they’re moving through the atmosphere. You should think about deposition products. What will the impacts be? And out of that will fall the solutions. And he burst into laughter. Somewhere I still have that diagram today. To me, science was driving everything, and the miracle happens, and [the answer] falls out the bottom. He redrew it so that science was in the bottom right of the box, surrounded by societal concerns and interests, which were surrounded by, of course, the political exigencies and possibilities.

I learned that science is never the loudest voice in the room, but it must be in the room. And what it says and how it can guide regulations or legislation is something that became a principle that I tried to abide by in the years in the Congress and then in the White House. And so, it never got old, because it was really interesting to figure out how to be scientifically accurate, but also politically expedient, and translate things into usable information. This is very obviously very important, and very key to what FAS is trying to do these days. 

I’m curious how over the course of your career working in science policy and watching how science interacts with government policy – how you’ve seen that change. Have you seen science on the Hill and in the White House more often just following the winds of political trends? Or do you see real progression with how the government interacts with scientists and hard science? 

Well, I certainly would say in the 1980s, during the era of the acid rain bill and the reauthorization of the Clean Air Act, it was an interesting time because the federal agencies were not particularly helping the Congress think very hard about this. It was the time of [former Environmental Protection Agency administrator] Anne Gorsuch. And so this little congressional agency [OTA] was very useful. We actually analyzed 19 different acid rain bills in the course of three or four years. I do think, though, also there were more statespeople in the Congress than I feel there are today, and there was definitely more collaborative work. And one of the things that OTA required was that both the chair and the ranking member of committees had to ask for assessments, so it belonged to both sides. Then there was also a Technology Assessment Council of Democrats, Republicans, House, and Senate people who reviewed the process of producing it. So reports were considered relatively apolitical when completed. But I do think that it was a different time. 

I mean, the main thing that Congress has done on climate change was pass the 1990 Global Change Act. And thank goodness they created that because it requires an annual research plan. It requires an assessment every four years or so of the impacts [of climate change] on the U.S. And the 5th National Climate Assessment that just came out has very strong indications of impacts already being felt: the issues of inequity, the issues of extreme events, costs to livelihood, regional impacts, etc.

So I think you’re right. There are political winds that blow. And timing is everything. Sometimes issues are more relevant, and sometimes they are not. But I feel that the steady collection of information that used to happen in the 1980s – and somewhat into the 1990s – from real debates, and committee hearings on topics, has changed. I would say back then in the Science Committee, the Democrats’ and Republicans’ staff would meet together to figure out who they were going to bring in as people to testify. And they would work on questions together. If the questions didn’t get asked by one side, they’d get asked by the other. I think partisanship has really diminished that, and I think the frequency of science-based committee hearings has decreased a lot too. You’ll often see, depending on whether it’s a Republican or Democrat committee chair – there might be just one person who defends a scientific point of view lined up against three or four people arguing against it, as opposed to a rigorous debate. 

So you spent two decades at the intersection of science and policy, serving in both the legislative and executive branches, and you even ran the first Environment Division of the White House’s Office of Science and Technology. Along the way, you were introduced to the Federation of American Scientists. So what made you want to serve on FAS’s board?

I knew about the Pugwash conferences – FAS came into being in response to nuclear weapons and seeking to prevent their use. So the same advisor – Bentley Glass – who urged me to do that Congressional fellowship had been very active in Pugwash and in speaking out against future arms races. And he got me involved in student Pugwash. I did that for many years, too, during my times at OTA and OSTP and even beyond, when I came to [the University of] Michigan. But over the years, John Holdren (former Chair of FAS and winner of two of its awards) had talked to me about FAS’s value. Henry C. Kelly was the President [of FAS], and he had worked with me at OSTP. He asked me to join the Board because FAS was thinking about energy and climate, and how to expand their mission into that area. I think I was added early on as a kind of “other”, for expertise in things slightly tangential, but within the orbit of future FAS work.

I knew FAS is a group that really seeks to do good. And we were hoping we could engage more young scholars and stretch the confines of FAS into other security issues like climate change, energy, et cetera. 

It strikes me again – here you are at another point in your career where you’re unafraid to be a little bit of a pioneer, or different from everyone else at the table. You have this organization that is very historically nuclear-focused: FAS. And you’re not afraid to jump into that room with all these nuclear scientists and try something new. What was that like at first?

Well, one thing, Jonathan – I think you started by asking about being a woman in science. And I have to say, for almost all of my career in the policy world, I hardly thought about being the only woman in the room. But that was often true. It was in the policy world where I was going to be the only scientist in the room. And I think being undaunted by that goes back to my parents, who believed in me and said I could do anything I wanted to. But with FAS, I was in a room with scientists. They were different scientists than me. But it was fascinating.

It was a world that was a bit alien. But again, it was trying to figure out what the role of FAS can be in these new and emerging issues and how to communicate it. So it actually didn’t feel as alien as it did being in the policy world [in government]. It was fun thinking about how FAS could move into these areas. And of course, I think the world of Gilman [Louie – current FAS board chair], who is just a fabulous chair and a joy to work with, he’ll be impossible to replace. 

I’ve been very happy to serve. I’m so happy about where it is now with the expansion into science policy, the issues of artificial intelligence, technology, and innovation, etc. You’re in a great place to tackle emerging issues. I think of all of these as relevant to security issues, expanding the scope of FAS. And being a central place in D.C. with access to the Congress and the executive agencies and the NGO world is just fabulous.

What are you going to be up to now? I mean – you’re not retiring. So you still have a lot of other stuff to do. So what interests you the most right now? 

I’m on many other boards. I’m on the Gordon and Betty Moore Foundation Board. And as you know, they do a huge amount of work on the environment and on basic science. I find that really interesting: thinking about how you can both effect change in practice and advance science research.

The most time-consuming duty is my work as chair of the Scientific and Technical Advisory Panel of the Global Environment Facility. The Global Environment Facility exists to implement the environmental treaties in the less developed countries. And so my little team of scientists screens every project of $2 million or greater, and tries to make sure that there’s a sound theory of change, that the desired outcome can be achieved, and that they’ve thought about climate risk screening – both the effect of the project on climate change and whether the outcome will persist as the climate changes.

I’m also on Al Gore’s Climate Reality Project, and we train thousands of young climate scholars all around the world. I serve on the Environmental and Energy Study Institute Board that briefs the Congress on key environmental issues. I’m on the Board of the Wildlife Conservation Society working to save wildlife and wild places around the world. I’m on the Global CO2 Initiative board at the University of Michigan and on an advisory board for Colorado State, developing an environmental program for undergraduate and graduate students. I teach both at the University of Michigan, mainly on Climate Adaptation, and at the University of Maryland on Science Policy with new FAS Board member and another member of the former Obama PCAST, Jim Gates, who’s a fabulous string theorist. And we’re able to pull in graduate students from the sciences, because he’s a physicist, and graduate students from public policy – because that’s the school I’m in at Maryland. And we do create a wonderful clash of cultures. We require that the students write policy memos. And each year, some of the students then decide, ‘Hey, maybe this is a noble profession – going into science policy!’. 

As you step down from your time with FAS, what excites you about what FAS can accomplish in coming years? What would you like to see FAS either expand into or do more of? 

Well, I think one of the things that they now have the capability to do is to work with the next generation of FAS scholars. I think FAS has an incredible potential to do convenings on a variety of topics, also potentially at a variety of universities. I think this generation hasn’t had to think about the core of FAS – nuclear security issues – as much as they should. Certainly with us celebrating Oppenheimer [at 2023’s FAS Public Service Awards], the time is ripe to do that. But I also think holding convenings on other particularly contentious issues makes sense. I think FAS can be seen as a neutral facilitator to bring together both sides of an issue – whether it be on artificial intelligence or other science and technology topics – and bring together academics, the NGO community, and people from the Hill or the agencies to talk through some of these things. FAS, being where it is and led as it is, has certainly proven to have its ear to the rail, as it were, for upcoming topics. I think that being an enabler of wise discussion and communication on emerging topics is much needed, especially in this time of both polarization and an increase in misinformation.

I was both horrified and heartened that the World Economic Forum listed misinformation as its fifth most worrisome risk over the next decade. The first four were all environmental, but misinformation was the next one, and then misuse of AI was the sixth one. And so all the security issues – environmental security, et cetera – are, I think, squarely in FAS’s domain. I think it’s a time of incredible growth and potential for FAS. And I just can’t wait to see what it becomes in this next generation.

Thinking Big To Solve Chronic Absenteeism

Across the country in small towns and large cities, rural communities and the suburbs, millions of young people are missing school at astounding rates. They’re doing it with such frequency that educators are now tracking “chronic absenteeism.” 

It’s an important issue the White House is prioritizing. On May 15, the Biden-Harris Administration will host a summit on addressing chronic absenteeism. You can watch the livestream here, starting at 9:30 am ET.

This brand of truancy – where students miss more than 10 percent of school days – is a problem in every state: Between 2018 and 2022, rates of chronic absenteeism nearly doubled, meaning an estimated 6.5 million more students are chronically absent today than six years ago. The New York Times recently reported that “something fundamental has shifted in American childhood and the culture of school, in ways that may be long lasting.”

But, like so many other issues in our country, chronic absenteeism hits some places harder than others. According to the non-profit organization Attendance Works, students from low-income and under-served communities are “much more likely to be enrolled in schools facing extreme levels of chronic absence.” When Attendance Works crunched the numbers, it found that in schools where at least 75 percent of students received a free or reduced-price lunch, the rates of chronic absenteeism nearly tripled, increasing from 25 percent to 69 percent between 2017 and 2022.

This alarming trend has educators and policymakers scrambling for solutions, from better bus routes to automated messaging systems for parents to “early warning” attendance tracking. These are important pursuits, but alone they won’t solve the problem.

Why? Because experts and research show that chronic absenteeism is only a symptom of a larger, more complex problem. For too many young people of color, school can be out of touch with the lives they live, so they’ve stopped going – to the point that experts predict attendance rates won’t return to pre-COVID levels until 2030.

In these schools, the curriculum can lack rigor and their inflexible policies can harm students’ mental health and stifle the inquisitive optimism they might otherwise bring to school each day. Enrichment programs are few and far between, and students lack meaningful relationships with faculty and staff. For many kids, school is irrelevant and unwelcoming. 

If schools and policymakers want to solve the problem of chronic absenteeism – particularly in under-served communities – then they must invest in new ideas, research, and tools that will make school a place where kids feel welcomed and engaged, and where learning is relevant. In short, a school needs to be a place where kids want to be. Every. Single. Day.

Teachers, principals, and superintendents know this, and they work to make their schools and classrooms warm, fun, and challenging. But they are swimming against the tide, and they cannot be expected to do this alone. The U.S. must direct and support its brightest minds and boldest innovators to attack this problem. It can do so by making a national investment in research and development efforts to explore new approaches to learning.

The U.S. has already made a big bet on innovation for sectors like defense and health – and in space exploration in the 1960s when JFK challenged the nation to put men on the moon. This kind of “imagine if…” R&D has not yet been applied to education. 

Let’s create a National Center for Advanced Development in Education (NCADE), inspired by DARPA, the R&D engine behind the Internet and GPS. This new center would enable informed-risk, high-reward R&D to come up with new approaches and systems that would make learning relevant and fun. It could also produce innovations and creative new ways to increase family engagement – a big factor that contributes to absenteeism – improve access to technology, and even test and assess alternative discipline programs aimed at keeping kids in school rather than suspending them.

As one example, a study shows that texting parents with attendance tips and alerts effectively reduces absenteeism. Another study worked with a school district to send over 32,000 texts to families and saw attendance increase by 15 percent.

As the nation’s schools face the daunting task of post-COVID recovery, efforts to stem chronic absenteeism that tinker around the edges won’t solve the problem. NCADE could drive the transformative solutions that are needed with a nimble, multidisciplinary approach to advance bold, “what if…” R&D projects based on their potential to transform education. 

Consider the possibilities of virtual reality. In partnership with edtech startup Transfr, several Boys & Girls Clubs are leveraging virtual reality to help students plan for their future careers. With VR technology, students can peek into a cell or stand on a planet’s surface. Imagine if NCADE could further develop an early concept for an AI-assisted “make your own song” program for students with speech-language development challenges. Or, it could support the creation of customized, culturally relevant assessments, made possible through machine learning, that make test-taking less intimidating. 

Chronic absenteeism is a complex problem caused by a number of factors, but the theme running through all of them is that for too many students, schools don’t offer the types of learning opportunities or supports that make learning engaging, meaningful, and relevant to their lives. It doesn’t have to be this way. Let’s act boldly to harness innovation and make school inviting, accessible, and worthwhile for all students.

The Importance of Standards for the U.S. Bioeconomy & National Security: A Conversation with Congressman Jake Auchincloss

The U.S. bioeconomy, the portion of the economy driven by biology, is valued at roughly $1 trillion and projected to grow to more than $30 trillion over the next two decades. With such enormous potential, ensuring the U.S. bioeconomy's continued economic growth and global leadership has become a matter of national security. Despite this potential, the U.S. bioeconomy, and the biomanufacturing industry in particular, is currently limited by the lack of standards for the sector. 

The need for standards within the biomanufacturing sector has been discussed at length by experts, and the U.S. government has acknowledged and prioritized the establishment of standards by creating a National Standards Strategy for Critical and Emerging Technologies. Both the CHIPS & Science Act of 2022 and the Visions, Needs, and Proposed Actions for the Data for the Bioeconomy Initiative (2023) highlight standardization as critical to the advancement of our domestic biomanufacturing sector. Furthermore, the National Security Commission on Emerging Biotechnology (NSCEB), a commission tasked with reviewing advancements in biotechnology and their nexus with national security, emphasized in its interim report that creating standards for this sector is a matter of national security.

Most recently, the Select Committee on Strategic Competition between the United States and the Chinese Communist Party (Select Committee on the CCP) held a hearing, "Growing Stakes: The Bioeconomy and American National Security," that focused on the threats posed by adversaries in the industry. During the hearing, Congressman Jake Auchincloss (D-MA, 4th District) entered into the record a bipartisan letter containing recommendations for developing and implementing standards for the bioeconomy, urging action from the National Institute of Standards and Technology (NIST). 

To get a better understanding of how Congress and the Select Committee on the CCP view the need for standards for the bioeconomy, FAS interviewed Congressman Jake Auchincloss.

FAS: In your opinion, why does creating standards for biotechnology and biomanufacturing boost economic competitiveness and national security? What benefits does this pose for different regions across the nation?

Congressman Auchincloss: Standards are important in every industry to solve coordination and collective action problems, and the government plays a critical role in establishing them so that markets can work more effectively. Standards provide concrete benchmarks for individuals and companies within American industry, all of whom need a stable environment to make long-term investments. This sentiment is echoed in the National Security Commission on Emerging Biotechnology’s interim report which has noted that “biomanufacturing faces barriers to innovation because of…lack of standardization.” 

With standardization, more time can be spent innovating, researching, and building instead of compensating for uncertainty due to a lack of definitions that carry the weight of the federal government. Establishing standards will only increase productivity. As I stated in the letter I entered into the record during the Select Committee on the CCP’s hearing around the bioeconomy, “standardization will help advance industrial biomanufacturing, create a more resilient and dynamic supply chain, and establish a durable, competitive U.S. bioeconomy. In turn, strengthening the U.S. bioeconomy will improve Americans’ well-being, promote well-paying jobs, and create a competitive and advantageous U.S. science and technology enterprise to achieve our national and societal goals.”

What is the role of international collaboration in creating standards for biotechnology and biomanufacturing in light of increased tensions with China?

International collaboration is critical to the success of biotechnology, but we must ensure we are working with reliable and responsible partners. For the U.S. to remain a leader in the bioeconomy, it must ensure its domestic standards become the international benchmark. 

As we standardize the bioeconomy, we must also set standards for ethical intent and conduct. Some companies' loyalties do not lie with responsible science. We shouldn't partner with companies like BGI, whose technologies have clear ties to the CCP's repression and ongoing ethnic cleansing of its Uyghur communities. The increased tensions are a direct result of President Xi Jinping's disregard for human rights, and excluding the CCP from projects that could be weaponized against its own people is the correct response.

Where does the U.S. bioeconomy stand in comparison with China or other countries?

China isn't currently ahead, but it is neck-and-neck with the U.S. because of the rate at which it is investing in its domestic bioeconomy. That's already showing up in its patent and publication volume. The CCP's R&D investment in biotechnology increased from $26 million USD in 1986 to $99 million USD in 2005. From 2008 to 2020, their investments increased to $3.8 billion USD. China's spending on research and development overall climbed 10.3 percent to 2.44 trillion Chinese yuan ($378 billion USD) in 2020, according to the nation's National Bureau of Statistics. Further, according to the health care information company IQVIA, China was the world's second-largest national biopharmaceutical market in 2017, worth $122.6 billion USD. 

The 117th Congress and Biden Administration edged science and technology funding upwards, but Republicans have proposed slashing federal R&D funding. We are moving in the wrong direction with this self-defeating approach. Congress must prioritize basic science: the curiosity-driven, peer-reviewed research that the private sector won't fund and the public sector under-funds. We can start by expanding NIH funding and fully appropriating the $170 billion authorized over the next ten years for the Science portion of the CHIPS and Science Act.

NIST has been directed to create standards and metrology for the U.S. bioeconomy through the FY24 appropriations bills, the FY25 Presidential Budget, the Bioeconomy EO, and the letter you submitted into the record during the hearing "Growing Stakes: The Bioeconomy and American National Security." Given all these priorities, what needs to happen for NIST to fulfill its directives for the bioeconomy?

There needs to be continued pressure applied to ensure NIST is prioritizing this important work. Congress can further support NIST by appropriating more funding for them to achieve the work laid out in front of them, as I have advocated during the current appropriations process.

The recent AI EO also assigns NIST many different tasks. Is the AI EO overtasking NIST and making bioeconomy-related efforts an afterthought for them?

The AI EO can be implemented simultaneously with standardization efforts, if NIST is appropriately resourced. But those who think that AI is more important than biotech are wrong, and should not point NIST in that direction. Care should be taken not to duplicate work unnecessarily, but all of these tasks assigned to NIST will take time and labor. Again, NIST needs to be adequately funded to do all the work it is being asked to do.

In the letter, you suggest NIST collaborate with Manufacturing USA institutes, NIIMBL & BioMADE. However, in the Department of Commerce’s FY25 budget request, they ask for $37M for the Manufacturing USA program, the same amount that they have received since FY23. Should Congress prioritize and raise funding levels for programs like Manufacturing USA and why would this be important for ensuring a competitive edge for the U.S. bioeconomy?

Absolutely. Many federal R&D programs are continuously underfunded, even as they are tasked with more responsibilities. The programs we fund reflect our priorities. We need to be prioritizing science, technology, and centers of excellence for manufacturing to gain that competitive edge. Increasing funding would give these agencies and programs the resources they need to set standards and increase R&D. That’s why I sent a letter to the House Committee on Appropriations asking for a 10 percent increase above FY24 enacted funding levels for NIST, Manufacturing USA, and the Manufacturing Extension Partnership.

Lastly, in your opinion, what potential does the U.S. bioeconomy have that we are not capitalizing on, and what would you like to see occur for the U.S. bioeconomy in the next year?

I would like to see standardization, or at least the beginning of standardization, within the next year. With standardization for industrial biomanufacturing in place, the U.S. bioeconomy will be able to reach new heights and enable our talented citizens to delve deeper into their research without being hindered by the lack of baseline definitions. Fully appropriating the $170 billion that was authorized for the Science portion of CHIPS and Science would be a step in the right direction to maintain U.S. innovation and competitiveness in biotechnology and biomanufacturing. Furthermore, we need state capacity and funding for R&D, which includes the staffing and programming for regulation and standardization as well as the funding for peer-reviewed basic research. Finally, we need to expand the productive capacity of the bioeconomy through workforce development compacts that bring together employers, educators, and trade associations; through skilled immigration pathways; and through technology-agnostic tax credits that are transferable for R&D and biomanufacturing. In the next year, I am working toward legislation that advances all of these components to strengthen and secure the U.S. bioeconomy.

The Federation of American Scientists values diversity of thought and believes that a range of perspectives — informed by evidence — is essential for discourse on scientific and societal issues. Contributors allow us to foster a broader and more inclusive conversation. We encourage constructive discussion around the topics we care about.