How Should FESI Work with DOE? Lessons Learned From Other Agency-Affiliated Foundations

In May, Secretary Granholm took the first official step towards standing up the Foundation for Energy Security and Innovation (FESI) by naming its inaugural board. FESI, authorized in the CHIPS and Science Act of 2022 and appropriated in the FY24 budget, holds a unique place in the clean energy ecosystem. It can convene public-private partnerships and accept non-governmental and philanthropic funding to spur important projects. FESI holds tremendous potential for empowering the DOE mission and accelerating the energy transition. 

Through the Friends of FESI Initiative at FAS, we’ve identified a few opportunities for FESI to notch some big wins early on – including boosting next-generation geothermal development and supporting pilot-stage demonstrations for nascent clean energy technologies. We’ve also written about how important it is for the FESI Board to be ambitious and to think big. It’s equally important that FESI be intentional and thoughtful about the way it is structured and connected to the Department of Energy (DOE). The advantage of an entity like FESI is that it’s independent, non-governmental, and flexible. Its relationship with DOE should therefore complement the agency’s mission without being tethered too tightly to it; FESI should not be bound by the same rules as DOE. 

While the board has been organizing itself and selecting a leadership team, we’ve been gathering insights from leaders at other Congressionally-chartered foundations to provide best practices and lessons learned for a young FESI. Below, we make the case for the mutually beneficial agreement that DOE and FESI should pursue, outline the arrangements that three of FESI’s fellow foundations have with their anchor agencies, and highlight which elements FESI would be wise to incorporate based on existing foundation models. Structuring an effective relationship between FESI and DOE from the start is crucial for ensuring that FESI delivers impact for years to come.

Other Transactions Agreements (OTA)

If FESI is going to continue to receive Congressional appropriations through DOE, which we hope it will, it should be structured from the start in a way that allows it to be as effective as possible while it receives both taxpayer dollars and private support. The legal arrangement between FESI and DOE that most lends itself to supporting these conditions is an Other Transactions (OT) agreement. Congress has granted several agencies, including DOE, the authority to use OTs for research, prototype, and production purposes, and these agreements aren’t bound by the same regulations as government contracts or grants. FESI and DOE wouldn’t have to reinvent the wheel to design a mutually beneficial OT agreement; other agencies offer shining examples to draw from. 

Effective Use of an Other Transactions Agreement Between FNIH and NIH 

Many consider the gold standard of public-private accomplishment – made possible through an Other Transactions agreement – to be a partnership first ideated in the early days of the COVID-19 pandemic. Leaders at the National Institutes of Health (NIH) and the Foundation for the National Institutes of Health (FNIH) were faced with an unprecedented need to develop a vaccine on an accelerated timeline. In a matter of weeks, these leaders pulled together a government-industry-academia coalition to coordinate research and clinical testing efforts. The resulting partnership, ACTIV (Accelerating COVID-19 Therapeutic Interventions and Vaccines), includes eight U.S. government agencies, 20 biopharmaceutical companies, and several nonprofit organizations.

Like the COVID-19 pandemic, climate change is a global crisis. Expedited energy research, commercialization, and deployment efforts require cohesive collaboration between government and the private sector. Other Transactions consortia like ACTIV pool funding to support some of the brightest minds in the field, in alignment with the national agenda, and return discoveries to the public domain. Pursuing an OT agreement allowed the FNIH and NIH to act swiftly and at the scale required to tackle the task of developing a life-saving vaccine. 

What We Can Learn from Other Agency-Affiliated Foundations 

FESI Needs to Find its Specific Value-Add and then Execute

The allure of an independent, non-governmental foundation like FESI is straightforward. Unencumbered by traditional government processes, agency-affiliated foundations are nimble, fast-moving, and don’t face the same operational barriers as government when working with the private sector. They can raise and pool funds from private and philanthropic donors. For that reason, it’s crucial that FESI differentiates itself from DOE and doesn’t become a shadow agency. Although FESI’s mission aligns with DOE’s, and may focus on programs similar to those of ARPA-E, there is a drastic difference between being a federal agency and being a foundation affiliated with a federal agency.  

FESI’s potential relies on its ability to be independent enough to take risks while still maintaining a strong relationship with DOE and the agency’s mission. FESI’s goals should be aligned with DOE’s through frequent communication with the agency – to understand its priorities, its opportunities, and the barriers it might face in achieving those goals. In practice, neither FESI nor DOE can directly tell the other what to do, but the two entities should remain aligned and aware of what the other is doing at all times. 

Additionally, a young FESI should figure out what it can do that DOE can’t and then capitalize on it. The Foundation for Food & Agriculture Research (FFAR), for example, was established with the specific purpose of convening public-private partnerships. At the time, the USDA struggled to connect with industry. FFAR found its niche by serving as a more flexible extension of the agency’s aims. FESI could play a similar role – acting in concert with DOE, but playing different instruments. 

More important than the agreement are the relationships between FESI and DOE leaders and staff

Pursue a flexible agreement that can be revisited and revised

Whatever relationship structure DOE and FESI decide on needs to be flexible enough that the two can work together as required to tackle problems. The agreement needs to be more than a list of what FESI can and can’t do. Based on other foundations’ experiences, it is best to revisit, revise, and refresh the document every so often. A stale contract collects dust and serves neither FESI nor DOE. Luckily, Other Transactions agreements can be amended at any time. 

Select a strategic executive director with a vision

DOE is racing against time to commercialize the clean energy technology needed to solve difficult decarbonization challenges. With FESI’s strength being its agility and ability to act quickly, the foundation is poised to be an invaluable asset to DOE’s mission. Whomever the FESI Board designates to lead this fight must walk in on day one with a clear, focused vision, ready to fund projects that earn wins and to work with the board to make good on its promises. One of the first challenges they will face will be educating the ecosystem about FESI’s role and purpose. A clearly articulated answer to “What does FESI hope to accomplish?” is key for fundraising and for program design and execution. 

Being FESI’s first-ever executive director is no small feat, and the board’s selection should be a quick study with proven experience fundraising and managing nine-figure budgets – the scale we hope FESI is one day able to operate at. A successful leader will have high credibility throughout the energy system and with both political parties. They will bring networks that span sectors and add to those of the board members. With these assets in tow, the Secretary of Energy should be excited about the FESI executive director and eager to work with them. 

The agreement is the backstop, but the game is played at the plate

An overarching theme across each agency-affiliated foundation is the importance of agency-foundation relationships that are based on deep trust. One foundation leader even said, “It’s really not about the paper – [the structuring agreement] – at all.” Instead, they said, the success of an agency and its foundation runs on “tacit knowledge and relationships” that will grow over time between foundation and agency. Clearly, an agreement needs to be in place between FESI and DOE, but if the organization “runs exclusively off those pieces of paper, it won’t be its best self.” 

As a young FESI grows over time, leaders of each organization – the FESI executive director and the Secretary of Energy – and the board and the executive director should all be in close contact with one another. Any of these folks should be able to pick up the phone, dial their counterpart, and give them good – or bad – news directly. These relationships should be prioritized and fostered, especially early on.

Create and raise the profile of FESI as early as possible

By far the greatest benefit of DOE having an agency-affiliated foundation is that FESI can raise and distribute funding more quickly and more efficiently than DOE will ever be able to. This can be a great driver for DOE success, as FESI’s role is to support the agency. The FNIH, for example, can raise funding from biopharma and grant it out to projects while avoiding cumbersome procedures, since that money doesn’t belong to taxpayers.

To fundraise successfully, FESI will need the staff and infrastructure to identify and execute on promising projects. Leaders at other foundations have found that their respective funder ecosystems are drawn to projects that fill a gap and that convene the public and private sectors. To the extent possible, FESI should aim to pool funding from different streams by convening consortia – in order to avoid the procedural strings attached to receiving federal dollars. One example, the Biomarkers Consortium, led by the FNIH, pools funding from government, for-profit, and non-profit partners. Members of this consortium pay annual dues to participate and contribute their scientific and technical expertise as advisors. 

How Do Other Congressionally-chartered Foundations Work with Their Agencies? 

The Foundation for Food & Agriculture Research and the U.S. Dept. of Agriculture 

In its first year, the Foundation for Food & Agriculture Research (FFAR) had $200 million from Congress, one staff member, and no reputation to fundraise off of. By year three, FFAR had established its first public-private consortium – composed of companies and global research organizations working to develop crop varieties that meet global nutritional demands in a changing environment. FFAR provided the Crops of the Future Collaborative with an initial $10 million investment that was matched by participants for a total investment of $20 million. The law requires that the foundation match every dollar of public funding with at least one dollar from a private source. This partnership marked FFAR’s first big, early win and set the young foundation on a road to success.    

FFAR is unique among its fellow foundations in that it doesn’t receive any funding from USDA. Instead, FFAR receives appropriations through the Farm Bill about every five years as mandatory funding that doesn’t go through the regular appropriations process. Because of this, FFAR’s funding stream remains separate from USDA’s and the two are not in competition with one another. While FFAR doesn’t receive money from USDA, USDA can receive grants from FFAR, and the two entities conduct business in close coordination with one another. Whenever FFAR identifies a program for launch, its staff run the possibility past USDA to ensure that FFAR is filling a USDA gap and that there isn’t any programmatic overlap. 

A memorandum of understanding (MOU) is the legal agreement of choice that structures the relationship between USDA and FFAR. This document describes how the two work with each other and is updated every other year. In addition, FFAR has USDA representatives sit as ex officio members of its board. While FFAR remains to this day quite independent of USDA, according to staff, the agency is a “valued piece” of the foundation’s work.  

In addition to having an MOU with USDA, FFAR has MOUs and funding agreements with each of the corporations in its consortia. These funding agreements either give FFAR money or fund the project directly. The foundation’s public-private partnerships are generally funded through a competitive grants process or through direct contract; however, the foundation also uses prize competitions to encourage the development of new technologies. 

When it comes to fundraising as a science-based organization, FFAR has encountered distinct challenges. Most of its fundraising is done by its executive director and scientists, who solicit funding for each of its main research focus areas. Initially, in 2016, six “Challenge Areas” were selected by the board of directors using stakeholder input to address urgent food and agricultural needs. Recently, FFAR pivoted to a framework based on four overarching priority areas – Agroecosystems, Production Systems, Healthy Food Systems, and Scientific Workforce Development. Defining focus areas creates clarity and structure for a foundation working amid an overwhelming abundance of opportunity. It would be wise for FESI leadership to define a handful of focus areas to home in on in its early rounds of projects. 

Most of FFAR’s fundraising efforts are on a project and program basis, rather than courting high-net-worth individuals who will donate large sums of unrestricted money. To fundraise successfully, FFAR leaders must be able to clearly articulate the vision of the foundation, locate projects that will appeal to donors, and articulate the benefits to donors (e.g., early access to information or notice of publications). FFAR leaders have found that projects promising to fill gaps between the public and private sectors are highly enticing to the funder community. 

The Foundation for the National Institutes of Health and The National Institutes of Health 

The Foundation for the National Institutes of Health (FNIH) is going on its 35th year advancing the mission of the NIH and leading public-private partnerships that produce breakthrough biomedical discoveries. Its authorizing statute has been amended slightly since it was initially passed in 1990, but its language served as a model for FESI’s authorizing legislation.

The FNIH statute does not lay down specific rules or regulations for the projects or programs the organization can pursue. Instead, it allows the foundation to do whatever its leaders decide, as long as the work relates to NIH and there’s a partner from the NIH involved. Per law, the NIH Director is required to transfer “not less than $1.25 million and not more than $5 million” of the agency’s annual appropriations to FNIH. Between FY2015 and FY2022, NIH transferred between $1 million and $1.25 million annually to FNIH for administrative and operational expenses (less than 0.01% of NIH’s annual budget). The FNIH and the NIH also have a signed Memorandum of Understanding (MOU) that facilitates the legal relationship between the two organizations, though this agreement has aged since it was signed and the relationship in practice is more informal.

The National Fish and Wildlife Foundation and the Fish and Wildlife Service

The National Fish and Wildlife Foundation (NFWF), chartered by Congress to work with the Fish and Wildlife Service (FWS), is the nation’s largest non-governmental conservation grant-maker. In fiscal year 2023 alone, the NFWF awarded $1.3 billion to 797 projects that will generate a total conservation impact of $1.7 billion. 

NFWF doesn’t have a guiding agreement, like an MOU, with FWS. Instead, it relies on the language of its initial authorizing legislation. Since its inception, NFWF has built cooperative agreements with roughly 15 other agencies and holds 150 active federal funding sources. These agreements function as mechanisms through which agencies can transfer appropriated funds to NFWF to administer and deploy to projects on the ground. These cooperative agreements are revisited on a program-specific basis; some are revised annually, while others last over a five-year period. 

Congress mandates that each federal dollar NFWF awards be matched with a non-federal dollar or “equivalent goods and services.” NFWF also sets its own goal of achieving at least a 2:1 return on its project portfolio — $2 raised in matching contributions for every federal dollar awarded. This non-federal funding comes from conservation-focused philanthropic foundations, but also from project developers needing to fulfill regulatory obligations, and even from legal settlements, as when NFWF received $2.544 billion from BP and Transocean to fund Gulf Coast projects impacted by the Deepwater Horizon oil spill. 

To distribute this money, NFWF issues its own requests for proposals (RFPs), separate from FWS, and awards roughly 98% of its grants to NGOs or state and local governments. If it wanted, FWS could itself apply, or be a joint applicant, for a grant issued by NFWF. Earlier this year, NFWF announced an RFP – the “America the Beautiful Challenge” – that pooled $119 million from multiple federal agencies and the private sector to award to applicants working to address conservation and public access needs across public, Tribal, and private lands. NFWF has review committees composed of NFWF staff and third-party expert consultants or members of other involved agencies. These committees convene to discuss a proposed slate of projects and decide which move forward before the NFWF Board delivers its seal of approval.

While NFWF is regarded as a successful model of a foundation supporting several federal agencies, its accomplishments are somewhat distinct from what FESI has been created to do. As a 501(c)(3), NFWF is able to channel funds from various sources, both public and private, to support projects that comply with federal conservation and resilience requirements. NFWF works closely with the Department of Defense to fund resilience projects that protect military bases and nearby towns against natural disasters in coastal areas. With just under 200 employees, NFWF is also able to serve as a “release valve” for agencies that do not have the workforce capacity to handle the influx of work generated by the Bipartisan Infrastructure Law (BIL) or Inflation Reduction Act (IRA), for example. While FESI could take on projects that DOE doesn’t have the capacity or agility to handle, it should also operate independently and aim to act on ideas that originate from outside of DOE. 

Takeaways for FESI

The foundations that have preceded FESI, each chartered by Congress to support the mission of a federal agency, have proven that these models can be successful. They have supported public-private partnerships that produced life-saving vaccines, generated breakthrough discoveries in food and agriculture, and distributed grants more quickly to conservation organizations on the ground. FESI was authorized and appropriated by Congress to accelerate innovation in support of the global transition to affordable and reliable low-carbon energy. Its inaugural board is now tasked with choosing leadership and pursuing strategic projects that will put FESI on a path to accomplishing the goals set before it. 

In order to deliver on its potential, FESI should initially select focus areas that will guide the foundation’s projects intentionally and methodically, as FFAR has done. Foundation leaders should also pursue a flexible legal arrangement with DOE that allows leaders from both entities to work together freely. An Other Transactions agreement is an ideal choice to structure this relationship, as it can be revisited as often as desired and frees transactions between DOE and FESI from the regulations that bind government contracts or grants. FESI’s potential contributions to the global energy transition and national security rely on its ability to be independent enough to take risks while simultaneously pursuing projects that complement DOE’s mission. An effective legal agreement structuring the foundation’s relationship with DOE will ensure that FESI delivers impact for years to come. 

Dr. Max Delferro, Argonne National Lab, Building a World With Sustainable Plastics

The U.S. Department of Energy’s (DOE) Office of Technology Transitions is holding a series of webinars on cutting-edge technologies being developed at the DOE National Labs – and the transformative applications they could have globally for clean energy. We sat down with the people behind these technologies – the experts who make that progress possible. These interviews highlight why a strong energy workforce is so important, from the lab into commercial markets. These interviews have been edited for length and do not necessarily reflect the views of the DOE. Be sure to attend DOE’s next National Lab Discovery Series webinar on catalysts for plastics upcycling on Thursday, August 22.

Dr. Max Delferro has spent his career bringing research on plastics recycling to the forefront of scientific discussions. Since 2008, he has focused on how to effectively and economically recycle plastics – but also how to turn them into more valuable materials like synthetic oils and waxes. As Group Leader of the Catalysis Science Program at Argonne National Laboratory, Dr. Delferro does work that could help take plastics out of landfills and put them to good use elsewhere. 

A Post-doctorate in Plastics Manufacturing

After completing three degrees – including a PhD – at the University of Parma in Italy, Dr. Delferro moved to Chicago, where he took up a postdoc position at Northwestern University. With a background in organometallic chemistry, Dr. Delferro hadn’t considered working on polymer synthesis until he started his postdoc. 

“When I first came to Northwestern after my PhD, they were working on the development of  new catalysts to make polymers, such as polyethylene,” Dr. Delferro said. “I had zero knowledge of how to do this at first, but I initiated my first ethylene polymerization – it took a few seconds – and a white powder started to precipitate. It was amazing.”

That was 16 years ago – a time when plastics accumulation in the environment wasn’t yet a hot topic for public discussion like it is today. Now, as Dr. Delferro says, there is at least one article in every journal or newspaper at any given time discussing the role and impact of plastics in our society.

“It took circa 100 years of research on how to make efficient and functional polymers. We’ve created a polymer that can go into space, and we’ve produced a polymer that uses the same material but has completely different properties to make a plastic bag. Scientists have mastered the engineering and design of how to make these polymers, but we still don’t know how to efficiently deconstruct them.”

After he joined Argonne National Laboratory in 2016, Dr. Delferro started to think about how to convert plastics into new products. However, the first feedback he received questioned why anyone should care about plastics deconstruction when there are plenty of landfills to dispose of plastics in.

“Maybe in  20, 30, 50 years there will be no more room in landfills,” Dr. Delferro responded. “What are we going to do with that material then? We will need to discover how to selectively deconstruct plastics, in particular polyolefins.”

Polyolefins are a class of polymers that make up about 50 to 64% of the world’s plastic waste. They are not biodegradable, so they accumulate in landfills or wherever else they’re disposed of.

Over dinner, Dr. Delferro and his colleagues discussed this idea and how to apply for funding to research this exact question. A white paper was sent to Basic Energy Sciences in DOE’s Office of Science, which initially funded the project, followed by funding through an Energy Frontier Research Center, the Institute for Cooperative Upcycling of Plastics (ICOUP). Dr. Delferro’s research could begin.

Uncovering the Problem of Plastics Recycling

Everything started when Dr. Delferro and his colleagues paid a visit to a nearby recycling facility to take a peek “behind the scenes.” Unbeknownst to him, what he would see that day would alter his research trajectory.  

“It’s really not efficient how we do recycling in the United States. All of the plastics move along on a very, very fast conveyor belt. To separate a plastic bottle from the rest, there are detectors that recognize the bottles and then activate an air jet that pushes the bottle off of the belt. But if you have any light plastics, like a plastic bag, the bag also flies away with the bottle. Then the recyclable bottles, made of polyethylene terephthalate (PET, #1), become contaminated with an unrecyclable plastic bag, which makes the PET items unrecyclable.”  

It was at this moment that Dr. Delferro first decided to focus his attention on the problem of plastics recycling. 

“It was because of this experience that I thought, ‘Okay, we need to do something.’ Which is when we started to focus on plastics. Seeing with my own eyes was a game changer. I could not believe how we are managing plastic waste right now.”

In Dr. Delferro’s native Italy, for example, the recycling system is more robust.

“When I visit my mom in the summer, she has seven different bins – one yellow, another blue, a green, a brown, etc, and the city gives her a bag with an RFID tag that they scan. And if they open it and they find, for example, unwashed glass or plastic in the metal bin, she gets a fine.”

Without a system like this in place, the U.S. relies on individuals who recycle to know how to do it properly. During Argonne Open Lab Day, Dr. Delferro informs the public about what goes into recycling, but is sometimes met with confusion.  

“I think 95% of the people who come to our open labs don’t understand how to recycle correctly. On every piece of plastic, there is a symbol with a number from one to seven. To date, only numbers one and two should go into the blue bin. For three to seven, we need to develop new technologies and new processes to target [those plastics] – which is what we’re doing at the DOE National lab system.” 

Dr. Delferro’s lab especially focuses on plastics that fall in categories four and five. Number four plastics are known as LDPEs, like the kind used in grocery bags and shrink wrap. Number five – polypropylene – is used for items like yogurt cups, bottle caps, and ketchup bottles. Consider a plastic soda bottle, for instance, that uses both kinds of plastic – a sturdy twist-off cap made from polypropylene and a wraparound brand label that uses a stretchier LDPE.   

“In theory, the PET bottle is 100% recyclable and can be recycled an infinite number of times without losing its mechanical and thermal properties. For the polypropylene cap, a few new technologies are just now coming out on the market. Up until a few years ago, this cap would end up in a landfill.”

While the LDPE used to make the brand label plastic film now costs less than a penny to manufacture, the research and development required to design a material with its specific properties took decades. Now, Dr. Delferro and his team are faced with the challenge of discovering a technology to selectively deconstruct LDPE and other kinds of polyolefins like it.   

“One technology doesn’t solve all of the problems that we have. We probably need one hundred technologies to solve all of these problems, and we are only at the beginning phases of the inception of each of these.”

One Man’s Trash Is Another Man’s Synthetic Oil 

Working at Argonne, Dr. Delferro gets the opportunity to explore these solutions – and has developed a catalyst technology that could help not only recycle, but upcycle plastic into more valuable items.

“I pay taxes for the city to come pick up my trash, but what if, instead, we had an economy where trash became valuable? Then I could leave my plastic jug and someone would be incentivized to pick it up for money. If we were to value plastic bottles at, say, one cent per bottle, it would make a big difference. How many bottles are in one blue bin? How many blue bins per alley? How many alleys per street? My job is to develop the technologies to incentivize this process.”

Dr. Delferro is confident that developing these technologies is not a matter of if, but when.

“From a technological and scientific point of view, we are going to solve the plastic deconstruction problem. I don’t know when, but we, as a science community, are going to develop the technologies to solve it. The most difficult part is really the economics.”

While the economics may be the “most difficult part” in Dr. Delferro’s mind, the polymer science itself is no walk in the park. The most established deconstruction process today is pyrolysis. In this process, plastic waste is heated in the absence of oxygen to produce a mixture of gas, liquid, and solid products. The liquid oils, after separation, can be blended with crude oil and sent back to a refinery to remake new olefins (like propene, ethene, or butene). 

“Pyrolysis is great, but we think that there are more opportunities that exist outside of it. It’s not very selective, it creates a lot of byproducts that require separation, there are gasses involved, and so on.”

So, Dr. Delferro and his team began exploring hydrogenolysis – a highly selective process that uses hydrogen in the presence of a catalyst to chop polymeric chains through carbon-carbon bond cleavage. They found that they could fine-tune the properties of the newly discovered catalysts to make a selective product – like waxes and lubricants – without requiring any separation. 

“Our dream is for you to be able to go into the car mechanic, or wherever you change the oil in your car, and they can change the oil in your car with synthetic oil that comes directly from plastic waste. That is our dream and our vision.”

The Argonne scientists aren’t the only believers in this vision of converting waste into a resource. The Bioenergy Technologies Office (BETO) and the Advanced Materials & Manufacturing Technologies Office (AMMTO) at DOE, in collaboration with Chevron Phillips – one of the biggest producers of lubricant base oils – are supporting this project.

“Maybe one day, your plastic bag could become the oil for your car. And that is our goal.”

The Power of Lab Science

Max’s work is only possible because of the institution supporting him. His support comes from “all over the place” within DOE offices. His core program is primarily supported by the DOE Office of Basic Energy Sciences, in particular the Chemical Sciences, Geosciences, and Biosciences Division (CSGB).

Dr. Delferro is involved with two Energy Frontier Research Centers (EFRC) – basic research programs funded by DOE. He is the Deputy Director of the Institute for Cooperative Upcycling of Plastics, led by Ames Lab, and a principal investigator at the Catalyst Design for Decarbonization Center (CD4DC) at the University of Chicago that is working on hydrogen management.  

“At UChicago, we are using metal organic frameworks – porous materials that we designed to do hydrogen management. These materials add or remove hydrogen to molecules to transport the molecules from point A to point B, which will be very, very important in the future for the hydrogen economy.”

When it comes to the plastic conversion conversation, Argonne and Ames National Laboratories are leading the basic science research efforts, joined by the BOTTLE Consortium – led by the National Renewable Energy Laboratory (NREL) and composed of experts from other labs and universities – on the applied science side. 

“There are a lot of people that are working in this conversion area and the National Labs play a pivotal role. [Argonne and Ames Labs] just had our first joint meeting [with the BOTTLE Consortium] in May of this year to have an open conversation and see what we are each doing and how we can work together to push for new technology and knowledge.”

And it’s not just funding that makes a difference. Max’s position at Argonne means collaboration is often just a few doors down. 

“In my research group, I have a physicist, a computational engineer … Everyone can bring their expertise to the table to tackle really difficult projects that one person could not do alone. When I want to learn about quantum computing, I go down one set of stairs where there’s a quantum computer. The National Labs are more about us than about me.” 

It doesn’t stop there – Max’s research can be licensed by private companies. 

“We have a series of patents that go from catalyst design to process and applications. They are available for everyone to license,” Dr. Delferro said. “Two of my postdocs who left when they finished their postdoc here got some of this IP and started their own startup company called Aeternal Upcycling, with ideas to try to take this technology from the lab to the market.”

Putting the Impact in Plastic

The beauty of working at an institution like Argonne National Lab is that researchers like Dr. Delferro can join forces with scientists in other fields to tackle complex problems. Dr. Delferro brought his expertise in catalyst design to the Manufacturing Group and the Life Cycle Analysis Group at Argonne, whose researchers stepped in to fill gaps in his own knowledge. 

After several rounds of research, the team found that making oil from plastic waste produces 50% lower carbon emissions than oil from a refinery.

“Right then you start to think, ‘Oh maybe this could have an impact.’”

Dr. Delferro sees the importance of his work not only in the lab, but out in the world, too.

“When you go on a beautiful beach, when you see a straw, you say, ‘Dang it, why use a straw? Why do you need a straw? You don’t need a straw. Why is there a straw here? My beautiful beach, can I do something?’”

An example like this illustrates that while Dr. Delferro is working to deconstruct polyethylenes in his lab, his friends and neighbors are also interacting closely with these same materials in their own lives.  

“The beauty of this research compared with other research is if I talk with my neighbor about quantum computing, they don’t have a clue about the physics behind it. If I talk about plastic waste and how we should do recycling, everyone can understand.” 

Hot-Launch Yoga: Cobra Pose Reveals Nuke Repose

The Indian Navy has integrated yoga into its training practices for decades, and in recent years it has conducted yoga sessions onboard its warships during port visits as a form of cultural diplomacy. These events, and the social media posts documenting them, occasionally offer fascinating data points about the status of specific military capabilities.

In particular, yoga-related social media posts and satellite imagery now indicate that one of India’s oldest naval missiles capable of launching nuclear weapons has likely been retired as the country continues to develop its sea-based nuclear deterrent. 

For nearly 15 years, India’s naval nuclear forces solely consisted of two offshore patrol vessels that had been specially configured to launch nuclear-capable Dhanush missiles. 

The Dhanush––a variant of India’s Prithvi short-range ballistic missile––had always been somewhat of an odd capability for India’s navy. Given its relatively short range and liquid-fuel design––meaning that it would need to be fueled immediately prior to launch––the Dhanush’s utility as a strategic deterrence weapon was severely limited. The ships carrying these missiles would have to sail dangerously close to the Pakistani or Chinese coasts to target facilities in those countries, making them highly vulnerable to counterattack. 

For those reasons, we have consistently assessed that the Dhanush would eventually be phased out as India’s long-planned nuclear-powered ballistic missile submarines became operational.

New data points from social media and satellite imagery indicate it is very likely that this has now happened. 

––
For years, India’s nuclear-capable Dhanush missiles were carried by two specially configured Sukanya-class offshore patrol vessels, known as INS Subhadra (hull number P51) and INS Suvarna (P52). These two vessels have been most clearly distinguishable from India’s four other Sukanya-class patrol vessels by the presence of missile stabilizer platforms on their aft decks that could be clearly seen through satellite imagery, including in this image from April 2018.

The last time that any official Indian source had indicated that the Dhanush capability was still operational was in 2019, when two Facebook posts by the Indian Navy’s official page specifically mentioned the capability and implied that it was still active on both INS Subhadra and INS Suvarna.

In December 2021, satellite imagery from Airbus showed two Sukanya-class vessels at Naval Base Karwar, one with missile stabilizers and the other without. The vessel without stabilizers also featured new aft deck markings in a cross pattern that had not been seen before on other vessels of that class. Without additional images, it was unclear whether the vessel featuring the new markings was one of the two nuclear-capable ships or another ship in the Sukanya-class that is also home-ported at Karwar––INS Sukanya (P50). If the vessel without stabilizers was the INS Sukanya, then the markings on the ship would not have necessarily indicated any change to the nuclear mission, since INS Sukanya had never been equipped with missile stabilizers. If the new pattern belonged to one of the two vessels that had previously been equipped with those stabilizers, however, then it would indicate that those stabilizers had been removed, thus likely eliminating that vessel’s nuclear strike role and removing the Dhanush missile from combat duty.

Clarity arrived through a strange medium: a series of yoga-related Instagram posts published by India’s public broadcaster during port visits to Seychelles in October 2022, indicating that the vessel with the new deck markings was indeed INS Suvarna. This meant that the missile stabilizers on INS Suvarna had been removed by December 2021 at the latest, and that the vessel has since been unable to launch nuclear-capable Dhanush ballistic missiles.

At the exact same time that the crew of INS Suvarna was practicing yoga in Seychelles, another satellite image captured by Maxar Technologies showed another Sukanya-class patrol vessel at Naval Base Karwar with its aft deck under construction. As in the previous case, it remained unclear whether this vessel was the nuclear-capable INS Subhadra or the non-nuclear INS Sukanya. A subsequent satellite image in April 2023 indicated that the aft deck had been repainted with a new cross pattern with a circle––likely to be used as a helipad. 

This same unique deck pattern was then on full display at another yoga session in Seychelles, during a port visit by INS Subhadra in February 2024. This indicates that the vessel lost its ability to deliver nuclear-capable Dhanush missiles when work on its aft deck began around October 2022. 

Since then, neither vessel has been seen with its missile stabilizer platforms returned to the aft deck, suggesting that the nuclear-capable Dhanush has finally been removed from active service and that the nuclear strike mission for the Sukanya-class patrol vessels has likely been retired. Given that the Dhanush is a close variant of India’s land-based Prithvi SRBM, it is likely that the Dhanush’s associated warheads have not been dismantled, but instead have been returned to India’s stockpile for use by these short-range systems. 

— 

Although the yoga-related source of the news may have been surprising, the Dhanush missile’s retirement in itself was not. For years, we have assumed that the system would be eliminated once India’s sea-based deterrent reached a higher level of maturity. That time appears to be now: after years of delays, India’s second ballistic missile submarine––INS Arighat––is expected to be commissioned into the Navy before the end of 2024. Two more ballistic missile submarines are expected to follow over the course of this decade, and satellite imagery indicates that they will be able to carry double the number of missiles as India’s first two submarines. 

More details on these developments, as well as other elements of India’s evolving nuclear arsenal, will be available in our forthcoming September publication, Indian Nuclear Weapons, 2024, in the Bulletin of the Atomic Scientists.

This research was carried out with generous contributions from the Carnegie Corporation of New York, the Jubitz Foundation, the New-Land Foundation, Ploughshares, the Prospect Hill Foundation, and individual donors.

Energy Justice for All: Keeping Disadvantaged Populations Cool in a Heating World

Extreme heat is the deadliest weather phenomenon in the United States — more lethal than hurricanes, floods, and tornadoes combined. And, as extreme temperatures rise, so do American household energy bills. An alarming 16% (20.9 million) of U.S. households are behind on their energy bills and at an increased risk of utility shut-offs. 

Many households rely on electrically powered systems like air conditioning (AC) to combat immediate heat effects, increasing energy demand, straining power transmission capabilities, and deepening grid unreliability and energy insecurity. Even as warmer winter months due to climate change reduce the need for heating, overall energy demand is set to rise because of cooling. In the U.S., projected changes in cooling degree days — the metric used to estimate how much cooling is needed to maintain a comfortable indoor air temperature — are expected to drive a 71% increase in household cooling demand by 2050, according to the latest annual energy outlook from the U.S. Energy Information Administration (EIA).
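
To make the cooling degree day metric concrete, the short sketch below sums each day’s mean-temperature excess above a base temperature. It is a minimal illustration, not drawn from the EIA outlook; the 65°F base and the sample week of temperatures are assumptions for illustration only.

```python
# Illustrative sketch (assumed values, not from the EIA outlook): cooling degree
# days (CDD) accumulate whenever the daily mean temperature exceeds a base
# temperature. 65°F is a commonly used base and is assumed here.

BASE_TEMP_F = 65.0

def cooling_degree_days(daily_mean_temps_f):
    """Sum of daily mean-temperature excesses above the base temperature."""
    return sum(max(0.0, t - BASE_TEMP_F) for t in daily_mean_temps_f)

# Hypothetical week of summer daily mean temperatures (°F)
week = [78, 82, 91, 88, 85, 79, 95]
print(cooling_degree_days(week))  # 143.0 CDD accumulated over the week
```

The higher the accumulated CDD, the more energy a household must buy to keep indoor temperatures comfortable, which is why projected CDD growth translates directly into projected cooling demand.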

Increasingly excessive heat is, therefore, a financial burden for many people, particularly low-income households. This is especially the case as low-income households tend to live in less energy-efficient homes that are more expensive to cool. The inability to afford household energy needs — that is, energy insecurity — makes it a challenge to stay cool, comfortable, and healthy during periods of extreme heat. Thus, as the impacts of extreme heat and energy insecurity are not distributed evenly, it is increasingly essential for the federal government to consider equity and prioritize disadvantaged populations in its efforts to tackle these intertwined crises.

Extreme Heat Drives Energy Burdens and Utility Insecurities

Energy insecurity refers to an individual or household’s inability to adequately meet basic household energy needs, like cooling and heating. Extreme heat compounds existing energy insecurities by surging the need for AC and other electrical sources of cooling technology. Thus, as the demand for cooling during summer increases energy consumption, many households cannot afford to run their AC, leading to life-threatening living conditions. According to the latest EIA Residential Energy Consumption Survey (RECS), a striking one in five households reported reducing or forgoing necessities like food and medicine to pay an energy bill. Over 10% of households reported keeping their home at an unhealthy or unsafe temperature due to costs.

Additionally, while household access to AC has increased over the years, one in eight U.S. homes still lacks AC, with renters going without at a higher rate than homeowners. The lack of access to cooling may be particularly hazardous for low-income, renter, rural, or elderly households, especially for those with underlying health conditions and those living in heat islands—urban areas where temperatures are higher than surrounding rural and suburban areas.

Another critical issue is that at least three million U.S. utility customers have their power disconnected every year due to payment challenges, yet 31 states have no policy preventing energy shut-offs during excessive heat events. The states that do have such policies vary widely in their cut-off points and protections. Across the board, low-income people disproportionately face disconnections and are often charged a “reconnect fee” or a deposit after a cutoff.

As of 2021, 29 states had seasonal protections and 23 had temperature-based disconnection protections that prohibit utility companies from disconnecting power. Still, research from the Indiana University Energy Justice Lab shows that these do not fully prohibit disconnections, often putting the onus on customers to demonstrate eligibility for an exemption, such as medical need. Further, while 46 states, along with Washington D.C., give customers the option to set up a payment plan as an alternative to disconnection, high interest rates may be costly, and income-based repayment is rarely an option.

These cut-off policies are all set at the state level, and there is still an ongoing need to identify best practices that save lives. Policymakers can use utility regulations to protect residents from the financial burden of extreme heat events. For example, Phoenix passed a policy that went into effect in 2022 and prevents both residential energy disconnections in summer months (through October 2024) and late fees incurred during this time for residents who owe less than $300. Also, some stakeholders in Massachusetts are considering “networked geothermal energy with microdistricts” — also known as geogrids — to build the physical capability to transfer cool air through a geothermal network. This could allow different energy users to trade hot and cool air,  in order to move cool air from members who can pay for it to those who cannot.

Grid Insecurity

Extreme heat also poses a significant threat to national energy grid infrastructure by increasing the risk of power outages due to increased energy consumption. Nationwide, major power outages have increased tenfold since 1980, largely because of damages from extreme weather and aging grid infrastructure. Regions not accustomed to experiencing extreme heat and lacking the infrastructure to deal with it will now become particularly vulnerable. Disadvantaged neighborhoods in urban heat islands also face heightened risks as they more frequently lack the essential infrastructure needed to adapt to the changing climate. For example, grid infrastructure in California’s urban disadvantaged communities has been shown to be weaker and less ready to support electric appliances. 

Similarly, rural communities face unique challenges in preparing for disasters that could lead to power loss. Geographic isolation, limited resources, and older infrastructure are all factors that make power outages more frequent and long-lasting in rural areas. These factors also affect the frequency of maintenance service and speed of repairs. In addition, rural areas often have less access to emergency services and cooling centers, making power outages during extreme heat events additionally hazardous. Power outages during extreme heat events increase the risks for heat-related illnesses such as fainting, heat exhaustion, and heatstroke. And, for rural homes that rely on well water, losing power can mean losing access to water, since well systems rely on electrical pumps to bring water into the home. 

Adding to the urgency of considering these especially vulnerable populations and regions, research has shown that the time to restore power after an outage is significantly longer in rural communities and in low-income communities of color. Power restoration time reflects which communities are prioritized and, as a result, which communities are neglected. 

Equity Considerations For Different Housing Types

Extreme heat and energy insecurity cannot be addressed without considering equity, as the impacts are not distributed evenly, especially by race, income, and housing type. One example of this intersection: Black renters have faced disproportionate burdens of extreme heat and energy insecurity, as wealth is deeply correlated with race and homeownership in the U.S. In 2021, the EPA reported that Black people are 40% more likely than non-Black people to live in areas with the highest projected increase in mortality rates due to extreme temperatures. Simultaneously, a 2020 analysis by the Brookings Institution found that Black renters had greater energy insecurity than white renters, Black homeowners, and white homeowners. Beyond the cost burden of staying cool, this energy insecurity puts lives at greater risk of heat-related illness and death and hinders economic mobility.

This paradigm reflects historic, discriminatory housing policies like redlining. Such policies segregated neighborhoods, induced lower homeownership rates, and ensured underinvestment in low-income communities of color—all of these factors play a significant part in making Black residents more vulnerable to the effects of extreme heat. Further compounding this, a 2020 study of more than 100 cities across the U.S. found that 94% of formerly redlined areas are hotter than non-redlined areas; in the summer, this difference can be as high as 12.6℉. 

Nationwide, households living in manufactured homes also face disproportionate risks and impacts, with about 25% having incomes at or below the federal poverty level. At the same time, manufactured homes consume 70% more energy per square foot than site-built homes, while using 35% less energy overall due to their smaller size. Notably, about 70% of all manufactured homes are located in rural areas, which on average have higher median energy burdens than metropolitan areas.

Further, older manufactured housing units are often in inadequate condition and do not meet building codes established after 1976. However, research has shown that the vulnerability of households living in manufactured housing units to extreme temperatures is only partially related to whether they have AC systems installed. Other key factors that residents report as driving vulnerability include AC units that do not work efficiently, are located in less-used parts of the house, or are ineffective in maintaining comfortable temperatures. These factors hamper manufactured housing residents’ ability to control their home’s thermal environment — thereby driving thermal insecurity.

Manufactured housing households facing challenges with resource access (such as exclusions from assistance programs and lack of credit access), physical and mental health limitations, care burdens, and documentation status may be at disproportionate risk. These households deal with multiple, intersecting vulnerabilities and often must engage in trade-off behavior to meet their most immediate needs, sacrificing their ability to address unsafe temperatures at home. They often take adapting to the increasingly extreme climate into their own hands, using various ways to cope with thermal insecurity, such as installing dual-pane windows, adding insulation, planting shade trees, using supplemental mobile AC units, and even leaving home to visit local air-conditioned malls.

These overlapping paradigms showcase the intrinsic interconnectedness among climate justice, energy justice, and housing justice. Essentially, housing equity cannot be pursued without energy justice and climate justice, as the conditions for realizing each of these concepts entail the conditions for the others. Realizing these conditions will require substantial investment and funding for climate programs that include heat governance, housing resilience, and poverty alleviation policies. 

Policy Considerations

Energy Burdens and Utility Insecurity

Update the Low Income Home Energy Assistance Program (LIHEAP)

The Low Income Home Energy Assistance Program (LIHEAP) exists to relieve energy burdens, yet was designed primarily for heating assistance. Thus, the LIHEAP formulas advantage states with historically cooler climates. To put this in perspective, some states with the highest heat risk — such as Missouri, Nevada, North Carolina, and Utah — offer no cooling assistance funds from LIHEAP. Despite their warm climates, Arizona, Arkansas, Florida, and Hawai’i all limit LIHEAP cooling assistance per household to less than half the available heating assistance benefit. And, since most states use their LIHEAP budgets for heating first, very little remains for cooling assistance — in some cases, cooling assistance is not offered at all. As a result, from 2001 to 2019, only 5% of national energy assistance went to cooling.

For vulnerable households, the lack of cooling assistance is compounded by a lack of disconnection protections from extreme heat. Thus, with advance forecasts, LIHEAP should also be deployed both to restore disconnected electric service and to make payments on energy bills, which may surge even higher as demand-response pricing rises with more extreme temperatures. The distribution of LIHEAP funds to the most vulnerable households should also be maximized. As most states do not have firm guidelines on which households to distribute LIHEAP funds to, and instead use a modified “first-come, first-served” approach, a small number of questions specific to heat risk could be added to LIHEAP applications and used to generate a household heat vulnerability score.
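
To illustrate how such a heat vulnerability score might be generated, the sketch below combines a handful of hypothetical yes/no heat-risk questions into a weighted score. The questions, weights, and example household are illustrative assumptions, not an existing or proposed LIHEAP formula.

```python
# Purely hypothetical sketch of how a few heat-risk questions on a LIHEAP
# application might be combined into a household heat vulnerability score.
# The factors and weights below are illustrative assumptions only.

HEAT_RISK_WEIGHTS = {
    "lacks_working_ac": 3,          # no functioning air conditioning at home
    "elderly_or_medical_need": 3,   # elderly member or heat-sensitive condition
    "prior_disconnection": 2,       # utility shut-off within the past year
    "manufactured_home": 2,         # lives in a manufactured housing unit
    "no_nearby_cooling_center": 1,  # no accessible public cooling center
}

def heat_vulnerability_score(answers):
    """Sum the weights of every heat-risk factor the household reports."""
    return sum(w for factor, w in HEAT_RISK_WEIGHTS.items() if answers.get(factor))

# Example household: no AC, an elderly member, and a manufactured home.
household = {"lacks_working_ac": True, "elderly_or_medical_need": True,
             "manufactured_home": True}
print(heat_vulnerability_score(household))  # 8: could flag the household for priority
```

A state office could then rank applications by score rather than by arrival time, directing limited cooling assistance to the households at greatest heat risk first.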

Further, the LIHEAP program is massively oversubscribed and can only serve a portion of families in need. To adapt to a hotter nation and world, annual budgets for LIHEAP must increase, and the allocation formulas will need to be made more cooling-aware and equitable for hot-weather states. The FY25 presidential budget keeps LIHEAP’s funding at $4.1 billion while also proposing to expand eligible activities that will draw on available resources. Analysis from the National Energy Assistance Directors Association found that this funding level could cut off up to 1.5 million families from the program and remove program benefits like cooling. 

Reform the Public Utility Regulatory Policies Act of 1978 (PURPA) 

While PURPA prohibits electric utilities from shutting off home electricity for overdue bills when doing so would be dangerous for someone’s health, it does not have explicit protections for extreme temperatures. The federal government could consider reforms to PURPA that require utilities to have moratoriums on energy shut-offs during extreme heat seasons.

Housing Improvements

Expand Weatherization Assistance Programs

Weatherization aims to make homes more energy efficient and comfortable in various climates through actions such as attic and wall insulation, air sealing, or adding weather stripping to doors and windows. More than half of cities have a weatherization program. The Department of Energy (DOE) Weatherization Assistance Program (WAP) funding is available for states and other entities to retrofit older homes to improve energy efficiency and better power cooling technologies like AC. However, similar to LIHEAP, WAP primarily focuses on support for heating-related repairs rather than cooling. For all residential property types, weatherization audits, through WAP and LIHEAP, can be expanded to consider the heat resilience and cooling efficiency of the property and then identify upgrades such as more efficient AC, building envelope improvements, cool roofs, cool walls, shade, and other infrastructure. 

Further, weatherization can be complicated when trying to help the most vulnerable populations. Because some houses are in such poor condition that they do not qualify for weatherization, there is a need for nationwide access to pre-weatherization assistance programs. These programs address severe conditions that would cause a home to be deferred from the federal WAP because they would make weatherization measures unsafe or ineffective. Pre-weatherization assistance programs are typically run by the state WAP office or administered in partnership with another state office.  

Additionally, as the Infrastructure Investment and Jobs Act (IIJA) allocated roughly $3.5 billion to WAP, states should use this funding to target energy-insecure neighborhoods with high rates of rental properties. Doing so will help states prioritize decreasing energy insecurity and its associated safety risks for some of the most vulnerable households.

Increase Research on Federal Protections for Vulnerable Housing Types

There is a need for a nationwide policy for secure access to cooling. While the Department of Housing and Urban Development (HUD) does not regulate manufactured home parks, it does finance them through Section 207 mortgages, and it could stipulate that park owners must guarantee resident safety. HUD could also update the Manufactured Home Construction and Safety Standards so that AC and other cooling provisions in local building codes apply to manufactured homes, as they do for other forms of housing, and require that homes perform to a defined level of cooling under high-heat conditions. Additionally, to support lower-cost retrofit methods for manufactured homes and other vulnerable housing types, new approaches to financing, permitting, and incentivizing building retrofits should be developed, per the Biden-Harris Administration’s Climate Resilience Game Changers Assessment. HUD’s Green and Resilient Retrofit Program, which provides climate resilience funding to affordable housing properties, can serve as a model.

Further, because home heat risk remains under-studied and under-addressed in hazard mitigation planning and policy processes, better measures of home thermal security are needed. Without better data, homes will continue to be overlooked in state and federal climate adaptation efforts.

Grid Resilience and Energy Access

Prioritize access to affordable, resilient energy alternatives for energy-insecure individuals

The most durable long-term investment in reducing energy insecurity and climate vulnerability is ensuring that the most energy-insecure populations have access to alternative, renewable energy sources such as wind and solar. This is a core focus of the DOE Energy Futures Grant (EFG) program, which provides $27 million in financial and technical assistance to local- and state-led partnership efforts to increase access to affordable clean energy. EFG is a Justice40 program and is required to ensure that 40% of the overall benefits of its federal investments flow to disadvantaged communities. Programs like EFG can serve as a model for federal efforts to reduce energy cost burdens while simultaneously reducing dependence on nonrenewable energy sources like oil and natural gas.

Accelerate Energy-Efficient Infrastructure

Efficient AC technologies, such as air source heat pumps, can help make cooling more affordable. Resilient cooling strategies, like high-efficiency cooling systems, demand-response systems, and passive cooling interventions, therefore need federal policy action to scale rapidly for a warming world. For example, cool roofs, walls, and surfaces can keep buildings cool and less reliant on mechanical cooling, but they are often not considered part of weatherization audits and upgrades. District cooling, such as through networked geothermal, can keep entire neighborhoods cool while using little electricity, though it is still at the demonstration stage in the U.S. Initiatives like the DOE Affordable Home Energy Shot can bring new resilient cooling technologies into reach for millions of Americans, but only if they are given sufficient financial resources. The Environmental Protection Agency’s Energy Star program can further incentivize low-power, resilient cooling technologies if rebates are designed to favor them.

Putting FESI on a Maximum Impact Path

The Foundation for Energy Security and Innovation is now a reality: an affiliated but autonomous non-profit organization authorized by Congress to support the mission of the U.S. Department of Energy and to accelerate the commercialization of energy technologies. FESI’s establishment was a vital first step, but its value depends on what happens next. In order to maximize FESI’s impact, the board and staff should think big from the start, identify unique high-leverage opportunities to complement DOE’s work, and systematically build the capacity to realize them. This memo suggests that:

  1. FESI should align with DOE’s energy mission,
  2. FESI should serve as catalyst and incubator of initiatives that advance this mission, especially initiatives that drive public-private technology partnerships, and
  3. FESI should develop lean and highly-networked operational capabilities that enable it to perform these functions well.

Three appendices to this memo provide background information on FESI’s genesis, excerpt its authorizing legislation, and link to other federal agency-affiliated foundations and resources about them.

Thinking Big: FESI’s Core Mission

DOE is responsible for managing the nation’s nuclear stockpile, cleaning up the legacy of past nuclear weapons development, and advancing basic scientific research as well as transforming the nation’s energy system. Although FESI’s authorizing legislation allows it to support DOE in carrying out the Department’s entire mission [Partnerships for Energy Security and Innovation Act Section b(3)(A)], the detailed description of FESI’s purposes [Sections b(1)(B)(ii), b(3)(B)] and the qualifications specified for its board members [Section b(2)(B)] signal that Congress viewed the energy mission as FESI’s primary focus. This conclusion is also supported by the hearing testimony gathered by the House Science Committee.

“Catalyz[ing] the timely, material, and efficient transformation of the nation’s energy system and secur[ing] U.S. leadership in energy technologies,” the two pillars of DOE’s energy mission, are extremely challenging responsibilities. The energy system makes up about 6% of the U.S. economy, or about $4000 per person per year, and its importance outweighs this financial value. This system keeps Americans warm in the winter and cool in the summer, gets us to our jobs and schools, and allows us to work, learn, and enjoy life. The system’s transformation to cleaner and more secure resources must not interrupt the affordable and reliable provision of these and many other vital services.
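As a rough back-of-the-envelope check of that figure (assuming U.S. GDP of roughly $23 trillion and a population of about 330 million, both assumptions rather than figures from this memo):

0.06 \times \$23\ \text{trillion} \approx \$1.4\ \text{trillion}, \qquad \frac{\$1.4\ \text{trillion}}{330\ \text{million people}} \approx \$4{,}200\ \text{per person per year},

which is consistent with the memo’s rough $4,000-per-person estimate.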

In addition to posing daunting system management challenges, the incipient energy transition is testing America’s global technological leadership. The United States now leads the world in oil and natural gas production, thanks in part to breakthroughs enabled by DOE. But the risks imposed by the use of conventional energy resources have risen. Other nations, notably China and Russia, have taken aggressive actions to establish leadership positions in new energy technologies, such as advanced nuclear power, solar panels, and lithium-ion batteries. DOE is tasked with helping the United States reclaim leadership in these fields.

DOE’s ambitious energy mission would benefit more from FESI’s support than would DOE’s other responsibilities. The energy system, unlike the nuclear stockpile or cleanup, and to a far greater extent than basic science, lies outside federal control. To transform it and secure global leadership in key technologies, DOE will have to collaborate closely with the private sector, philanthropy, and non-profits. Strengthening such collaboration, particularly to accelerate commercialization of energy technologies, is precisely the purpose specified for FESI by Congress. [Sections b(1)(B)(ii); b(3)(B)(i)]

FESI’s alignment with DOE’s energy mission should be resilient to changes in Congress and the administration. Its authorizing legislation was sponsored by members of both parties across three Congresses and won overwhelming majorities when voted on as a freestanding bill. By law, a majority of its board members must have experience in the energy sector, energy research, or technology commercialization [Section b(2)(B)(iii)(III)]. FESI will have difficulty building strong collaborations, and thus achieving its congressional mandate, unless it is seen as a long-term partner with a clear and stable mission.

Filter, Catalyst, and Incubator: FESI’s Core Functions

DOE brings many assets to its mission of energy system transformation and global technological leadership. It invests over $9 billion per year in energy research, development, and demonstration, far more than any other entity in the world. Its network of 17 national laboratories and thousands of academic collaborators converts those funds into a vast store of knowledge and opportunities for real-world impact. It possesses financial and regulatory tools that allow it to shape energy markets to varying degrees.

FESI’s responsibility – and opportunity – is to help DOE use these assets to advance its energy mission more effectively. More effective public-private partnerships to accelerate technology commercialization, including such dimensions as technology maturation, new product development, and regional economic development [Sections b(3)(B)(ii), (iii), and (v)], will be an enduring priority. But the specific use cases and projects that FESI invests in will change as the global energy landscape does. Indeed, the dynamic nature of that landscape, along with structural constraints on DOE, is a key justification for FESI’s creation. FESI must develop processes that enable it to quickly identify and act on points of leverage that enhance the impact of DOE’s assets in a rapidly changing system.

These processes should perform three vital functions, all of which will benefit from collaboration between FESI and the national laboratory-affiliated foundations [Section b(4)(G)].  The first is to serve as a filter that helps DOE gather and sift valuable insights about the global energy landscape that the department’s leadership might otherwise miss. Information flows in a large bureaucracy like DOE are inevitably shaped by its organizational structure. The structure of DOE’s energy-focused units and the national labs is in many respects a legacy of the times in which they were established and does not map well to today’s energy system. In addition, DOE’s immense scale means that the voices of newer and less powerful players in the system, such as start-up companies and community groups, may be drowned out. Some voices of the grassroots internal to DOE and the labs may also be hard to discern at the leadership level. The Secretary of Energy’s Advisory Board helps to fill these gaps, but it is constrained by the Federal Advisory Committee Act and other laws and regulations. FESI’s flexibility, bipartisan character, and non-governmental status, bolstered by a strong relationship with the lab foundations, will allow it to recognize DOE’s blind spots, whether internal or external. 

FESI should draw on this new or neglected information to perform the second function: catalyzing actionable opportunities that advance DOE’s energy mission. It can develop these opportunities (jointly, as appropriate, with one or more lab foundations) by convening a broad range of stakeholders in formats that DOE cannot effectively utilize and at a pace that DOE cannot match. For example, a group of firms in an emerging clean energy industry may identify a shared technological need that international competitors are pursuing aggressively. FESI could support these firms to articulate their need, identify DOE-affiliated assets that could address it, and rapidly assemble a public-private partnership that aligns the two. Such a partnership might have a regional focus and engage state and local governments and regionally-focused philanthropy as well. If FESI’s information filter were to pick up unrecognized obstacles to effective community engagement or lack of attention to end-user priorities, it could assemble cross-sectoral partnerships appropriate to those opportunities. The catalyst function could be particularly important for crisis response, when speed and agility are essential, and DOE’s formal processes are likely to slow the agency down. 

FESI’s third core function should be to incubate and ultimately spin out the initiatives that it has catalyzed. The process of assembling each initiative will require FESI to provide basic administrative support, like internal communication and coordination. FESI should frequently go several steps further by raising seed funding for each initiative, particularly from non-governmental sources, and serving as its external champion. FESI should not, however, become the permanent home of mature partnerships. The managerial demands imposed by carrying out this function risk undermining the filter and catalyst functions. Spinning out the successes will permit FESI’s leadership to hunt more effectively for new opportunities. The destination for the spinoffs might be new or expanded programs within DOE, an existing non-profit like an industry consortium or community foundation, or a new organization.

Lean and Intensely Networked: FESI’s Operational Capabilities

FESI’s high ambition, dynamic functions, and unique institutional position determine the capabilities it will need to operate effectively. Above all, it must be plugged intensively into a broad network that spans the energy industry; DOE and the national labs; states, communities, and Congress; and philanthropy. FESI will only be able to spot what DOE could do better by having a savvy understanding of what DOE is already doing and what its potential partners want to be doing. FESI must be able to gather and interpret this information continuously at a modest cost, which puts a premium on networking. 

FESI board members must be vital nodes of its network. FESI’s authorizing statute specifies that the board represent “a broad cross-section of stakeholders.” The members will hold positions that provide insights and contacts of value to FESI and should be selected to build and maintain the network’s breadth. The board’s ex officio representatives from DOE will provide complementary perspectives and connections inside the Department. FESI’s staff will only have the knowledge and resources required to do their jobs well if the entire board is active and engaged (but not micro-managing the staff).

FESI’s staff should be led by an executive director who is responsible for its day-to-day operations [Section b(5)(A)] and has high credibility throughout the energy system and with both political parties. Staff members should bring sector-spanning networks to the organization that leverage those of the board. Even more important, the staff must possess the entrepreneurial skills, and technological and market knowledge, to recognize and act on promising opportunities. Prior experience in business, social, or public entrepreneurship – building new companies, non-profit organizations, and government programs – is likely to be particularly valuable to FESI.

Running lean should be a value for FESI and will likely be a necessity as well. The value lies in taking initiative and moving quickly. The necessity arises from the likely limits on federal appropriations for operations, which are authorized at $3 million annually [Section b(11)] and may not rise to that level. To be sure, FESI must raise resources from non-federal sources – indeed, that will be one of its core challenges. But those resources are likely to be much easier to raise if they are devoted to projects rather than operations.

Finally, FESI will need to mitigate risks to its reputation that might arise from real and perceived conflicts of interest of the board and staff as well as from the images and interests of its potential partners. A pristine reputation will be vital to maintaining the confidence of DOE, Congress, and external stakeholders. FESI should seek to reduce the cost in money and time of rigorous vetting and disclosure, but ultimately this investment is an essential one that must be borne.


Appendix 1. A Brief Prehistory of FESI


Appendix 2. Other Federal Agency-Affiliated and National Laboratory Foundations

Numerous federal agencies have Congressionally authorized non-governmental foundations that work with them to advance their missions. The National Park Foundation (NPF) is the oldest, dating back to 1935. Anyone who wants to support a particular national park, or the system as a whole, can do so through a contribution to NPF. Similarly, donors who care about public health can give to the CDC Foundation or the Foundation for NIH (FNIH). A 2021 report by the National Academy of Public Administration (NAPA), which recommended establishing a foundation for DOE, reviews a wide range of agency-related foundations, as do the 2020 ITIF “Mind the Gap” report and a 2019 CRS report.

As the NAPA report describes, all of these foundations leverage federal investment with private contributions to complement and supplement their agency affiliates while guarding against potential conflicts of interest. Yet more remarkable than this commonality is the foundations’ diversity. Each seeks to complement and supplement its partner agency, but because each agency has a different mission, structure, and functions, each affiliated foundation is unique.

FESI will likely have much in common with the FNIH. Like NIH, DOE is a major research funder that advances a critical national mission. Like NIH, DOE must rely on the private sector to turn advances made possible by the R&D it funds into technologies that make a difference on the ground. FNIH’s contributions to fighting the pandemic illustrate how a flexible non-profit partner can advance an agency’s mission in a moment of need. Its Pandemic Response Fund and its Accelerating COVID-19 Therapeutic Interventions and Vaccines (ACTIV) partnership with NIH, private firms, other federal agencies, and allied governments aid the search for treatments and vaccines and prepare the nation to defend against future pandemics.

The Foundation for Food and Agriculture Research (FFAR), which is affiliated with the U.S. Department of Agriculture, is another potential source of inspiration and learning for FESI. One notable innovation made by FFAR is its use of prizes and challenges, along with more traditional competitive, cost-shared grants. To ensure technologies can scale, FFAR brings industry experts into its project design and administration. In a review of FFAR’s progress, the Boston Consulting Group (BCG) found that FFAR’s “Congressional funding allows it to bring partners to the table and serve as an independent, neutral third party.”

Links to agency-affiliated foundations not linked above:


Appendix 3. Selected Provisions of FESI’s Authorizing Statute

Partnerships for Energy Security and Innovation (42 USC 19281)

CHIPS AND SCIENCE ACT SEC. 10691. FOUNDATION FOR ENERGY SECURITY AND INNOVATION

(b)(1)(B) MISSION.—The mission of the Foundation shall be—

(i) to support the mission of the Department; and

(ii) to advance collaboration with energy researchers, institutions of higher education, industry, and nonprofit and philanthropic organizations to accelerate the commercialization of energy technologies.

(b)(2)(B)(iii)(II) REPRESENTATION.—The appointed members of the Board shall reflect a broad cross-section of stakeholders from academia, National Laboratories, industry, nonprofit organizations, State or local governments, the investment community, and the philanthropic community.

(III) EXPERIENCE.—The Secretary shall ensure that a majority of the appointed members of the Board— (aa)(AA) has experience in the energy sector; (BB) has research experience in the energy field; or (CC) has experience in technology commercialization or foundation operations; and (bb) to the extent practicable, represents diverse regions, sectors, and communities.

(b)(3) PURPOSES.—The purposes of the Foundation are—

(A) to support the Department in carrying out the mission of the Department to ensure the security and prosperity of the United States by addressing energy and environmental challenges through transformative science and technology solutions; and

(B) to increase private and philanthropic sector investments that support efforts to create, characterize, develop, test, validate, and deploy or commercialize innovative technologies that address crosscutting national energy challenges, including those affecting minority, rural, and other underserved communities, by methods that include—

(i) fostering collaboration and partnerships with researchers from the Federal Government, State governments, institutions of higher education, including historically Black colleges or universities, Tribal Colleges or Universities, and minority-serving institutions, federally funded research and development centers, industry, and nonprofit organizations for the research, development, or commercialization of transformative energy and associated technologies;

(ii) strengthening and sharing best practices relating to regional economic development through scientific and energy innovation, including in partnership with an Individual Laboratory-Associated Foundation;

(iii) promoting new product development that supports job creation;

(iv) administering prize competitions—

(I) to accelerate private sector competition and investment; and

(II) that complement the use of prize authority by the Department;

(v) supporting programs that advance technology maturation, especially where there may be gaps in Federal or private funding in advancing a technology to deployment or commercialization from the prototype stage to a commercial stage;

(vi) supporting efforts to broaden participation in energy technology development among individuals from historically underrepresented groups or regions; and

(vii) facilitating access to Department facilities, equipment, and expertise to assist in tackling national challenges.

(b)(4)(G) INDIVIDUAL AND FEDERAL LABORATORY-ASSOCIATED FOUNDATIONS.—

(ii) SUPPORT.—The Foundation shall provide support to and collaborate with covered foundations.

(iv) AFFILIATIONS.—Nothing in this subparagraph requires—

(I) an existing Individual Laboratory-Associated Foundation to modify current practices or affiliate with the Foundation.

(b)(5)(I) INTEGRITY.—

(i) IN GENERAL.—To ensure integrity in the operations of the Foundation, the Board shall develop and enforce procedures relating to standards of conduct, financial disclosure statements, conflicts of interest (including recusal and waiver rules), audits, and any other matters determined appropriate by the Board.

(b)(6) DEPARTMENT COLLABORATION.—

(A) NATIONAL LABORATORIES.—The Secretary shall collaborate with the Foundation to develop a process to ensure collaboration and coordination between the Department, the Foundation, and National Laboratories.

Don’t Fight Paper With Paper: How To Build a Great Digital Product With the Change in the Couch Cushions

Barriers abound. If there were a tagline for most people’s experience building tech systems in government, that would be a contender. At FAS, we constantly hear about barriers agencies face in building systems that can help speed permitting review, a challenge that’s more critical than ever as the country builds new infrastructure to move away from a carbon economy. But breaking down barriers isn’t just a challenge in the permitting arena. So today we’re bringing you an instructive and hopefully inspiring story from Andrew Petrisin, Deputy Assistant Secretary for Multimodal Freight at the U.S. Department of Transportation. We hope his success in building a new system to help manage the supply chain crisis provides the insight – and motivation – you need to overcome the barriers you face.

To understand Andrew’s journey, we need to go back to the start of the pandemic. Shelter-in-place orders around the world disrupted global supply chains. Increased demand for many goods could not be met, creating a negative feedback loop that drove up costs and propelled inflation. In June of 2021, the Biden Administration announced it would establish a Supply Chain Disruption Task Force to address near-term supply and demand misalignments. Andrew joined the team alongside Port Envoy and former Deputy Secretary of Transportation John Porcari.

Operating out of the Port of LA, Porcari pulled together all the supply chain stakeholders on a regular basis to build situational awareness. That included the ports, terminal operators, railroads, ocean carriers, trucking associations, and labor. These meetings, happening three times each week during the height of the crisis, allowed stakeholders to share data and talk through challenges from different perspectives. Before the supply-chain crisis, meetings with all of the key players – in what Petrisin calls a “wildly interdependent system” – were rare. Now, railroads and truckers had better awareness of dwell times at the port (i.e., how long a container sits on the terminal), and ocean carriers and ports had a greater understanding of what might be causing delays inland.

Going with the FLOW

These meetings were helpful, but to better see around corners, the effort needed to evolve into something more sophisticated. “The irony is that the problem was staring us right in the face,” Andrew told us, “but at the time we really had limited options to proactively fix it.” The meetings were building new relationships and strengthening existing ones, but there was a clear need for more than what had, thus far, consisted mostly of exchanges of slide decks. This prompted Petrisin to start asking some new questions: “How could we provide more value? What would make the data more actionable for each of you?” And critically, “Who would each of you trust with the data needed to make something valuable for everyone?” This was the genesis of FLOW (Freight Logistics Optimization Works).

Looking back, it might be easy to see a path from static data in slide decks shared during big conference calls to a functional data system empowering all the actors to move more quickly. But that outcome was far from certain. To start, the DOT is rarely a direct service provider, and there was little precedent for the agency taking on such a role. The stakeholders Andrew was dealing with saw the Department as either a regulator or a grantmaker, both roles with inherent power dynamics. Under normal circumstances, if the Department asked a company for data, the purpose was to evaluate the company to inform either a regulatory or grantmaking decision. That makes handing over data to the Department something private companies do with great caution and often trepidation. In fact, one company told Andrew, “we’ve never shared data with the federal government that didn’t come back to bite us.” Yet to provide the service Andrew was envisioning, the stakeholders would need to willingly share their data on an ongoing, near real-time basis. They would need to see DOT in a whole new light, and a whole new role. DOT would need to see itself in a new light as well.

Oh, This is Different: Value to the Ecosystem

Companies had no obligation to give DOT this data, and until now, had no real reason to do so. In fact, other parts of government had asked for it before, and been turned down. But companies did share the data with Andrew’s team, at least enough of them to get started. Part of what Andrew thinks was different this time was that DOT wasn’t collecting this data primarily for its own use. “Oh, this is very different,” one of his colleagues said. “You are collecting data for other people to use.” The goal in this case was not a decision about funding or rules, but rather the creation of value back to the ecosystem of companies whose data Andrew’s system was ingesting. 

To create that value, Andrew could not rely on a process that presumed to know up front what would work. Instead, he, his team, and the companies would need to learn along the way. And they would need to learn and adjust together. Instead of passive customers, Andrew’s team needed active participants who would engage in tight ‘build-measure-learn’ cycles – partners who would immediately try out new functionality and not only provide candid, quick feedback, but also share how they were using the data provided to improve their operations. “I was very clear with the companies: I need you guys to be candid. I need to know if this is working for you or not; if it’s valuable for you or not. I don’t need advice, I need active participation,” Petrisin says.

This is an important point. Too often, leaders of tech projects misunderstand the principle of listening to users as soliciting advice and opinions from a range of stakeholders and trying to find an average or mid-point among them. What product managers should be trying to surface are needs, not opinions. Opinions get you what people think they want. “If I’d asked them what they wanted, they would have said faster horses,” Henry Ford is wrongly credited with saying. It’s the job of the digital team to uncover and prioritize needs, find ways to meet those needs, serve them back to the stakeholders, learn what works, adjust as necessary, and continue that cycle. The FLOW team did this again and again.

Building Trust through Partnership

That said, many of the features of FLOW exist because of ideas from the users/companies that the team realized would create value for a larger set of stakeholders. “People sometimes ask, ‘How’d you get the shippers to give you purchase order data?’ The truth is, it was their idea,” Petrisin says. But this brings us back to the importance of an iterative process that doesn’t presume to know what will work up front. If the FLOW team had asked shippers to give them purchase order data in a planning stage, the answer most certainly would have been no, in part because the team hadn’t built the necessary trust yet, and in part because the shippers could not yet imagine how they would use this tool and how valuable it would be to them.

Co-creation with users relies on a foundation of trust and demonstrable value, which is built over time. It’s very hard to build that foundation through the traditional, requirements-heavy, up-front planning process that is the assumed norm in government. An iterative – and more nimble – process matters. One industry partner told Petrisin, “Usually the government comes in and tells us how to do our jobs, but that isn’t what you did. It was a partnership.”

One way that iterative, collaborative process manifested for the FLOW team was regular document review with the participating companies. “Each week we’d send them a written proposal on something like how we’re defining demand side data elements, for example,” Petrisin told us. “It would essentially say, ‘This is what we heard, this is what we think makes sense based on what we’re hearing from you. Do you agree?’ And people would review it and comment every week. Over time, you build that culture, you show progress, and you build trust.” 

Petrisin’s team knew you can’t cultivate this kind of rapid, collaborative learning environment at scale from day one. So he started small, with a group of companies that were representative of the larger ecosystem. “So we got five shippers, two ocean carriers, three ports, two terminals, two chassis companies, three third-party logistics firms, a trucking company, and a warehouse company,” he told us, trying to keep the total number under 20. “Because when you get above 20, it becomes hard to have a real conversation.” In these early stages, quality mattered more than quantity, and the quality of the learning was directly tied to the ability to be in constant, frank communication about what was working, what wasn’t, and where value was emerging.

Byrne’s Law states that you can get 85% of the features of most software for 10% of the price. You just need to choose the right priorities. This is true not only for features, but for data, too. The FLOW team could have specified that a system like this could only succeed with access to all of the relevant data, and thorough requirements-gathering processes often reinforce that thinking. But even with only 20 companies participating early on, FLOW ensured those companies were representative of the industry, which enabled real insights from very early in the process. Completeness and thoroughness, so often prized in government software practices, are often neither practical nor desirable.

Small Wins Yield Large Returns

Starting small can be hard for government tech programs. It’s uncomfortable internally because it feels counter to the principle that government serves everyone equally; external stakeholders can complain if a competitor or partner is in the program and they’re not. (Such complaints can be a good problem to have; it can mean that the ecosystem sees value in what’s being built.) But too often, technology projects are built with the intent of serving everyone from day one, only to find that they meet a large, pre-defined set of requirements but don’t actually serve the needs of real users, and adoption is weak. Petrisin didn’t enjoy having to explain to companies why they couldn’t be in the initial cohort, but he stuck to his guns. The discipline paid off. “Some of my favorite calls were to go back to those companies a few months later and say, ‘We’re ready! We’re ready to expand, and we can onboard you.’” He knew he was onboarding them to a better project because his team had made the hard choices they needed to make earlier. 

Starting small can ironically position products to grow fast, and when they do, strategies must change. Petrisin says his team really felt that. “I’ve gone from zero to one on a bunch of different things before this, but never really past the teens [in terms of team size], so to speak,” he says. “And now we’re approaching something like 100. So a lot of the last fiscal year for me was learning how to scale.” Learning how to scale a model of collaborative and shared governance was challenging. 

Petrisin had to maintain FLOW’s commitment to serving the needs of the broader public, while also being pragmatic about who DOT could serve with given resources and continuing to build tight build-measure-learn cycles. Achieving consensus or even directional agreement during a live conversation with 20 stakeholders is one thing, but it’s much harder, and possibly counterproductive, with 60 or 100. So instead of changing the group of 20, which provided crucial feedback and served as a decision-making body, Andrew developed a second point of engagement: a bi-weekly meeting open to everyone, where the FLOW team shared progress against the product roadmap, providing transparency and another opportunity to build trust by communicating feature delivery.

Fighting Trade-off Denial

One thing that didn’t change as the project scaled up was the team’s commitment to realistic and transparent prioritization. “We have to be very honest with ourselves that we can’t do everything,” Petrisin tells us. “But we can figure out what we want to do and transparently communicate that to the industry. They [the industry members] run teams. They manage P&Ls [profit and loss statements]. They understand what it is to make trade-offs with a given budget.” There was a lot of concern about not serving all potential supply chain partners, but Petrisin fought that “trade-off denial.” At that point, his team could either serve a smaller group well or serve everyone poorly. Establishing the need for prioritization early allowed for an incremental and iterative approach to development.

What drove that prioritization was not the number of features, lines of code, or fidelity to a predetermined plan, but demonstrable value to the ecosystem, and to the public. “Importers are working to better forecast congestion, which improves their ability to smooth out their kind of warehouse deliveries. Ocean carriers are working to forecast their bookings using the purchase order data. The chassis providers have correlated the flow demand data to their chassis utilization.” These are all qualitative outcomes, highly valuable but ones that could not necessarily have been predicted. There are quantitative measures too. FLOW aims to reduce operational variance, smoothing out spikes in the supply chain because the actors can better manage changing demands. That means a healthier economy, and it means Americans are more likely to have access to the goods they need and want.

Major Successes Don’t Have to Start With Major Funding

What did the FLOW team have that put them in a position to succeed that other government software products don’t have? Given that FLOW was born out of the pandemic crisis, you might guess that it had a big budget, a big team, and a brand-name vendor. It had none of those. The initial funding was “what we could find in the couch cushions, so to speak,” says Andrew. More funding came as FLOW grew, but at that point there was already a working product to point to, and its real-world value was already established, both inside DOT and with industry. What did the procurement look like and how did they choose a vendor? They didn’t. So far, FLOW has been built entirely by developers led by DOT’s Bureau of Transportation Statistics, and the FLOW team continues to hire in-house. Having the team in-house and not having to negotiate change orders has made those build-measure-learn cycles a lot tighter. 

FLOW did have great executive support, all the way up to the office of Transportation Secretary Pete Buttigieg, who understood the critical need for better digital infrastructure for the global supply chain. It’s unfortunately not as common as it should be for leadership to be involved and back development the way DOT’s top brass showed up for Petrisin and his team. But the big difference was just the team’s approach. “The problem already existed, the data already existed, the data privacy authorities already existed, the people already existed,” he told us. “What we did was put those pieces together in a different way. We changed processes and culture, but otherwise, the tools were already there.”

FLOW into the Future

FLOW is still an early stage product. There’s a lot ahead for it, including new companies to onboard that bring more and more kinds of data, more features, and more insights that will allow for a more resilient supply chain in the US. When Andrew thinks about where FLOW is going, he thinks about his role in its sustainability. “My job is to get the people, processes, purpose, and culture in place. So I’ve spent a lot of time on making sure we have a really great team who are ready to continue to move this forward, who have the relationships with industry. It’s not just my vision. It’s our vision.” He also thinks about inevitability, or at least the perception of it. “Five years from now we should look back and think, why did we not do this before? Or why would we ever have not done it? It’s digital public infrastructure we need. This is a role government should play, and I hope in the future people think it’s crazy that anyone would have thought government can’t or shouldn’t do things like this.” 

Just this spring, when the Baltimore bridge collapsed, FLOW allowed stakeholders to monitor volume changes and better understand the impact of cargo rerouting to other ports. Following the collapse of the Francis Scott Key Bridge, Ports America Chesapeake, the container terminal operator at the Port of Baltimore, committed to joining FLOW to support supply chain resiliency in the Baltimore region. But FLOW’s success was not inevitable. Anyone who’s worked on government tech projects can rattle off a dozen ways a project like this could have failed. And outside of government, there’s a lot of skepticism, too. Petrisin remembers talking to the staff of one of the companies recently, who admitted that when they’d heard about the project, they just assumed it wouldn’t go anywhere and ignored it. He admits that’s a fair response. Another one, though, told him that he’d realized early on that the FLOW team wasn’t going to try to “press release” their way to success, but rather prove their value. The first company has since come back and told him, “Okay, now that everyone’s in it, we’re at a disadvantage to not be in it. When can we onboard?”

“You can’t fight paper with paper.” Ultimately, this sentiment sums up the approach Petrisin and his team took, and why FLOW has been such a success with such limited resources. It reminds us of the giant poster Mike Bracken, who founded the Government Digital Service in the UK, and inspired the creation of such offices as the US Digital Service, used to have on the wall behind his desk. In huge letters, it said “SHOW THE THING.” There’ll never be a shortage of demands for paperwork, for justification, for risk mitigation, for compliance in the game of building technology in government. These “government needs” (as Mike would call them) can eat up all your team has to give, leaving little room for meeting user needs. But momentum comes from tangible progress — working software that provides your stakeholders immediate value and that they can imagine getting more and more from over time. That’s what the FLOW team delivered, and continues to. They fought paper with value.

If Andrew and the FLOW team at DOT can do it, you can too. Where do you see successes like this in your work? What’s holding you back and what help do you need to overcome these barriers? Share on LinkedIn.


Lessons

Use a precipitating event to change practices. The supply chain crisis gave the team both the excuse to build this infrastructure that now seems indispensable, and the room to operate a little differently in how they built it. Now the struggle will be to sustain these different practices when the crisis is perceived to be over. 

Make data function as a compass, not a grade. Government typically uses data for purposes of after-the-fact evaluation, which can make internal and external actors wary of sharing it. When data instead informs one’s actions in closer to real time, its value to all parties becomes evident. As Jennifer says in her book Recoding America, make data a compass, not a grade.

Build trust. Don’t try to “press release your way to success.” Actively listen to your users and be responsive to their concerns. Use that insight to show your users real value and progress frequently, and they’ll give you more of their time and attention. 

Trust allows you to co-create with your users. Many of FLOW’s features came from the companies who use it, who offered up the relevant data to enable valuable functionality. 

But understand your users’ needs, don’t solicit advice. The FLOW roadmap was shaped by understanding what was working for its stakeholders and what wasn’t, by what features were actually used and how. The behavior and actions of the user community are better signals than people’s opinions.

Start small. A traditional requirements-heavy up front planning process asks you to know everything the final product will do when you start. The FLOW team started with the most basic needs, built that, observed how the companies used it, and added from there. This enables them to learn along the way and build a product with greater value at lower cost.

Be prepared to change practices as you scale. How you handle stakeholders early on won’t work as you scale from a handful to hundreds. Adapt processes to the needs of the moment. 

Fight trade-off denial. There will always be stakeholders, internal and external, who want more than the team can provide. Failing to prioritize and make clear decisions about trade-offs benefits no one in the end.

Don’t just reduce burden – provide value. There’s been a huge focus on reduction of burden on outside actors (like the companies involved in FLOW) over the past years. While it’s important to respect their time, FLOW’s success shows that companies will willingly engage deeply, offering more and more of their time, when there’s a real benefit to them. Watch your ratio of burden to value. 

Measure success by use and value to users. Too many software projects define success as on-time and on-budget. Products like FLOW define success by the value they create for their users, as measured by quantitative measures of use and qualitative use cases. 

Fund products, not projects. FLOW started with a small internal team trying to build an early prototype, not a lengthy requirements-building stage. It was funded with “what we could find in the couch cushions” followed by modest dedicated allocations, and continues to grow modestly. This matches Jennifer’s model of product funding, as distinct from project funding.

Overcoming Historical Barriers in Mid-Tier Agriculture

Introducing A New Podcast Series with an FAS Food Impact Fellow and the Racial Equity in the Food System Working Group

In October 2023, the Federation of American Scientists launched a multi-year Food Supply Chain Impact Fellowship that placed 28 food systems professionals dedicated to strengthening mid-tier agriculture value chains across the U.S. Many of these Fellows are currently working to support the Regional Food Business Centers and the Resilient Food Systems Infrastructure Program and to advance local and regional food systems research.

I serve as one of the FAS Fellows contributing to local and regional food systems efforts at the national level. During our initial fellowship onboarding, we developed research proposals that could tie directly to or prepare us for our future federal service. I am not new to understanding and addressing the issues that plague regional agricultural value chains and mid-tier agriculture. With much of my previous work in food systems focused on strengthening regional processing (think hops growing and processing for craft brewing operations, or capital access and sourcing for a startup food manufacturing business), I was up for the challenge of researching and discussing food systems transformation and the pathways we might all consider for reaching a more resilient, regionally focused food system.

One of the major questions around food systems transformation and successful local and regional food systems has been right-fitting technical assistance, and investment in technical assistance, to meet the diverse needs of producers, particularly Black, Hispanic, and Tribal producers. Not all agricultural education is built the same, and many of the long-standing methods of engaging with farmers, producing education, investing in technical assistance, and marketing that access to the public do not meet the needs of farmers long un-prioritized by local, state, and federal resources.

There is not only personal value to farmers, but also intrinsic value to mid-tier agriculture, in ensuring resources are widely available and accessible without bias. Diversifying and centering equity in agricultural technical assistance can improve the resilience and market growth of mid-tier agricultural value chains. The first value factor, according to the Local and Regional Food Systems Playbook (LRFSP), is that systemic injustice can impair people’s ability to respond to food system disruptions, and not having a disaster response that is fast, nimble, and widespread is in itself disastrous.

A second value factor focuses on generational wealth for agricultural producers. Economists suggest that increases in programmatic capital and long-term investment in wealth-generating resources, like land and business ownership, will more effectively support historically underserved and low-resourced Americans in building agricultural and generational wealth over the long term.

Finally, a third value factor focuses on the critical link between food insecurity and rates of nutrition-related disease. Rates of food insecurity and detrimental health outcomes are highest in BIPOC communities. Moving toward program funding that begins to reverse historical food insecurity and loss of food sovereignty, a process that will take years and generations, will more consistently contribute to advancing human health across all people and across our economic and biological lives.

To help advance research-based discussions, I took this topic to colleagues in the Racial Equity in the Food Systems Working Group (REFS). REFS is a community of practice of extension educators, rural sociologists, economists, other agricultural and food systems professionals, and community stakeholders who connect, learn, and collaborate to facilitate change within our institutions and society to build racial equity within the food system. The result is a three-part podcast series hosted by Kolia Souza and me with REFS guest experts, exploring the value of increasing long-term investment in mid-tier technical assistance for historically underserved producers. Michigan State University’s Center for Regional Food Systems produced three episodes as an extension of its Reaching for Equity in All Lives (REAL) Talks series, with two episodes currently available on all podcast platforms. The first two episodes focus on “Scaling Up with Trust and Relationships” and “Systems are People.”

The podcasts are real conversations with experts in the fields of agriculture and equity, including Dr. Marcus Coleman, Professor of Practice in Economics at Tulane University; Keesa V. Johnson, MDes, Food Systems Strategy Design Specialist at Michigan State University; and Rachel Lindvall, MLIS, Consultant on Sustainable and Indigenous Food Systems. The third episode of REAL Talks will launch in August, and all eight episodes of the REAL Talks podcast will be available at the link or across Apple and YouTube podcast platforms.

Equitable resources are ones that center historically underserved producers; they focus on longitudinal access, offer higher funding caps, lower barriers in application processes, and provide the direct technical assistance needed to support diverse applications from historically underserved organizations. Equitable resources include debt-financing mechanisms specifically tailored to historically underserved producers and crop insurance that is accessible and affordable. I welcome you to check out all the podcast episodes and review the additional resources in this post, along with the body of evidence and knowledge that examines historical inequities in funding and access within our food systems.

To learn more about the work of the Center for Regional Food Systems at Michigan State University, you can check them out here. You can reach out about this blog post to Maria Graziani, Food Supply Chain Impact Fellow, mgraziani@fas.org

Cover image: Corona Farmers Market in Queens, New York is one of the most dynamic and diverse farmers markets in the city and is steps off the subway and mass transit system for the city | USDA Photo by Preston Keres

Fortifying America’s Future: Pathways for Competitiveness

The Federation of American Scientists (FAS) and Alliance for Learning Innovation (ALI) Coalition, in collaboration with the Aspen Strategy Group and Walton Family Foundation, released a new paper “Fortifying America’s Future: Pathways for Competitiveness,” co-authored and edited by Brienne Bellavita, Dan Correa, Emily Lawrence, Alix Liss, Anja Manuel, and Sara Schapiro. The report delves into the intersection of education, workforce, and national security preparedness in the United States, summarizing key findings from roundtable discussions in early 2024. These roundtable discussions gathered field experts from a variety of organizations to enrich the discourse and provide comprehensive recommendations for addressing this challenge. Additionally, a panel of topical experts discussed the subject matter of this report at the Aspen Security Forum on July 18th, 2024.

Read the full report here

The United States faces a critical human talent shortage in industries essential for maintaining technological leadership, including workforce sectors related to artificial intelligence, quantum computing, semiconductors, 5G/6G technologies, fintech, and biotechnology. Without a robust education system that prepares our youth for future careers in these sectors, our national security and competitiveness are at risk. Quoting the report, Dr. Katie Jenner, Secretary of Education for the State of Indiana, reiterated the idea that “we must start treating a strong educational system as a national security issue” during the panel discussion. Addressing these challenges requires a comprehensive approach that bridges the gaps between national security, industry, higher education, and K-12 education while leveraging local innovation. The paper outlines strategies for creating and promoting career pathways from K-12 into high-demand industries to maintain the U.S.’s competitive edge in an increasingly global landscape, including:

National security has historically driven educational investment (think Sputnik) and remains a bipartisan priority, providing a strong foundation for new legislation addressing emerging technologies like AI. For example, the CHIPS and Science Act, driven by competition with China, has spurred states to innovate, form public-private partnerships, and establish Tech Hubs. 

Mapping out workforce opportunities in other critical sectors such as aviation, AI, computer science, and biosecurity can ensure that the future workforce is gaining necessary skills to be successful in high-need careers in national security. For example, Ohio created a roadmap for advanced manufacturing with the Governor’s Office of Workforce Transformation and the Ohio Manufacturers’ Association outlining sector-specific competencies.

Innovative funding streams, employer incentives, and specialized intermediaries promoting career-connected learning can bridge gaps by encouraging stronger cross-sector ties in education and the workforce. For example, Texas allocated incentive funding to Pathways in Technology Early College High Schools (P-TECH) encouraging explicit career-connected learning opportunities that engage young people in relevant career paths. 

A Technical Assistance (TA) Center would offer tailored support based on each state’s emerging industries, guided by broader economic and national security needs. The center could bring together stakeholders such as community colleges, education leaders, and industry contacts to build partnerships and cross-sector opportunities. 

Virginia streamlined all workforce initiatives under a central state department, enhancing coordination and collaboration. The state also convenes representatives and cabinet members with backgrounds in workforce issues regularly to ensure alignment of education from K-12 through postsecondary.

Education R&D lacks sufficient investment and the infrastructure to support innovative solutions addressing defining challenges in education in the U.S. The New Essential Education Discoveries (NEED) Act would establish an agency called the National Center for Advanced Development in Education (NCADE) that would function as an ARPA-ED, developing and disseminating evidence-based practices supporting workforce pathways and skills acquisition for critical industries.

Giving young students opportunities to learn about different careers in these sectors will inspire interest and early experiences with diverse options in higher education, manufacturing, and jobs in critical industries, ensuring American competitiveness.

Implementing these recommendations will require action from a diverse group of stakeholders, including the federal government and leadership at the state and local levels. Check out the report to see how these steps will empower our workforce and uphold the United States’ leadership in technology and national security.

United States Discloses Nuclear Warhead Numbers; Restores Nuclear Transparency

Note: The initial NNSA release showed an incorrect graph that did not accurately depict the size of the stockpile for the period 2012-2023. The corrected graph is shown above.

[UPDATED VERSION] The Federation of American Scientists applauds the United States for declassifying the number of nuclear warheads in its military stockpile and the number of retired and dismantled warheads. The decision is consistent with America’s stated commitment to nuclear transparency, and FAS calls on all other nuclear states to follow this important precedent.

The information published on the National Nuclear Security Administration (NNSA) website today shows that the U.S. nuclear weapons stockpile as of September 2023 included 3,748 nuclear warheads, only 40 warheads off FAS’ estimate of 3,708 warheads.

The information also shows that the United States last year dismantled only 69 retired nuclear warheads, the lowest number since 1994.

FAS has previously requested that the United States release the size of the US nuclear arsenal for FY2021, FY2022, and FY2023, but those requests were denied. FAS believes the information was wrongly withheld and that today’s declassification decision vindicates our position that stockpile disclosures do not harm U.S. security and should be provided to the public.

With today’s announcement, the Biden Administration has restored the nuclear stockpile transparency that was created by the Obama administration, halted by the Trump administration, revived by the Biden administration in its first year, but then halted again for the past three years.

While applauding the U.S. disclosure, FAS also urged other nuclear-armed states to disclose their stockpile numbers and warheads dismantled. Excessive nuclear secrecy creates mistrust, fuels worst-case planning, and enables hardliners and misinformers to exaggerate nuclear threats.

What The Nuclear Numbers Show

The declassified stockpile numbers show that the United States maintained a total of 3,748 warheads in its military stockpile as of September 2023. The military stockpile includes both active and inactive warheads in the custody of the Department of Defense. The information also discloses weapons numbers for the previous two years, numbers that the U.S. government had previously declined to release.

Although there have been minor fluctuations, the numbers show that the U.S. nuclear weapons stockpile has remained relatively stable for the past seven years. The fluctuations during that period do not reflect decisions to increase or decrease the stockpile but are the result of warhead movements in and out of the stockpile as part of warhead life-extension and maintenance work.

Although the warhead numbers today are much lower than during the Cold War and there have been reductions since 2000, the reduction since 2007 has been relatively modest. Although the New START treaty has had some indirect effect on the stockpile size due to reduced requirements, the biggest reductions since 2007 have been caused by changes in presidential guidance, strategy, and modernization programs. The initial chart released by NNSA did not accurately show the 1,133-warhead drop during the period 2012-2023. NNSA later corrected the chart (see top of article). The chart below shows the number of warheads in the stockpile compared with the number of warheads deployed on strategic launchers over the years.

This graph shows the size of the U.S. nuclear stockpile over the years plus the portion of those warheads deployed on strategic launchers. The stockpile number for 2024 and the strategic launcher warheads are FAS estimates.

The information also shows that the United States last year dismantled only 69 retired nuclear warheads. That is the lowest number of warheads dismantled in a year since 1994. The total number of retired nuclear warheads dismantled 1994-2023 is 12,088. Retired warheads awaiting dismantlement are not in the DOD stockpile but in the DOE stockpile.

The information disclosed also reveals that there are currently another approximately 2,000 retired warheads in storage awaiting dismantlement. This number is higher than our most recent estimate (1,336) because of the surprisingly low number of warheads dismantled in recent years. Because dismantlement appears to be a lower priority, the number of retired weapons awaiting dismantlement today (~2,000) is only 500 weapons lower than the inventory was in 2015 (~2,500).

FAS’ Work For Nuclear Transparency

The Federation of American Scientists has worked since its creation to increase responsible transparency on nuclear weapons issues; in fact, the nuclear scientists who built the bomb created the “Federation of Atomic Scientists” to enable some degree of transparency in discussing the implications of nuclear weapons (see FAS newsletter from 1949). There are of course legitimate nuclear secrets that must remain so, but nuclear stockpile numbers are not among them.

This FAS newsletter from 1949 describes the debate and FAS effort in support of transparency of the US weapons stockpile.

One part of FAS’ efforts, spearheaded by Steve Aftergood, who for many years directed the FAS Project on Government Secrecy, has been to report on the government’s discussions about what needs to be classified and to repeatedly request declassification of unnecessary secrets such as the stockpile numbers. This work yielded stockpile declassifications in some years (2012-2018, 2021), while in other years (2019-2020 and 2022-2024) FAS’ declassification requests were initially denied. Most recently, in February 2024, an FAS declassification request for the stockpile numbers as of 2023 was denied, although the letter added: “If a different decision is made in the future, we will notify you of this decision” (see rejection letters). Given these denials, FAS in March 2024 sent a letter to President Biden outlining the important reasons for declassifying the numbers.

The new disclosure of the stockpile numbers suggests that denial of earlier FAS declassification requests in 2023 and 2024 may not have been justified and that future years’ numbers should not be classified.

The other part of FAS’ efforts has been the Nuclear Information Project, which works to analyze, estimate, and publish information about the U.S. nuclear arsenal. In 2011, when the Obama administration first declassified the history of the stockpile, the FAS estimate was only 13 warheads off from the official number of 5,112 warheads. The Project also works to increase transparency of the other nuclear-armed states by providing the public with estimates of their nuclear arsenals. This work enabled Matt Korda on our team and others to discover the large missile silo fields China was building, and NATO officials say our data is “the best for open source information that doesn’t come from any one nation.”

Why Nuclear Transparency Is Important

FAS has since its founding worked for maximum responsible disclosure of nuclear weapons information in the interest of international security and democratic values. In a letter to President Biden in March 2024, we outlined those interests.

After denials in 2023 and February 2024 of FAS declassification requests, FAS in March sent President Biden a letter outlining why the denials were wrong. Click here to download the full version of the letter.

First, responsible transparency of the nuclear arsenal serves U.S. security interests by supporting deterrence and reassurance missions with factual information about U.S. capabilities. Equally important, transparency of stockpile and dismantlement numbers demonstrates that the United States is not secretly building up its arsenal but is dismantling retired warheads instead of keeping them in reserve. This can help limit the mistrust and worst-case planning that can fuel arms races.

Second, the United States has for years advocated and promoted nuclear transparency internationally. Part of its criticism of Russia and China is their lack of disclosure of basic information about their nuclear arsenals, such as stockpile numbers. U.S. diplomats have for years correctly emphasized the importance of nuclear transparency, but their efforts are undermined if stockpile and dismantlement numbers are kept secret, because secrecy enables other nuclear-armed states to dismiss the United States as hypocritical.

Third, nuclear transparency is important for the debate in the United States (and other Allied democracies) about the status and future of the nuclear arsenal and strategy, and about how the government is performing. Opponents of declassifying nuclear stockpile numbers tend to misunderstand the issue, claiming that disclosure gives adversaries a military advantage or that the United States should not disclose stockpile numbers unless adversaries do so as well. But nuclear transparency is not a zero-sum issue; it is central to the democratic process, enabling informed citizens to monitor, debate, and influence government policies. Although the U.S. disclosure is not dependent on other nuclear-armed states releasing their stockpile numbers, Allied countries such as France and the United Kingdom should follow the U.S. example, as should Russia and China and the other nuclear-armed states.

Acknowledgement: Mackenzie Knight, Jon Wolfsthal, and Matt Korda provided invaluable edits.

More information on the FAS Nuclear Information Project page.


This research was carried out with generous contributions from the Carnegie Corporation of New York, the New-Land Foundation, the Ploughshares Fund, the Prospect Hill Foundation, Longview Philanthropy, and individual donors.

The Blackouts During the Texas Heatwave Were Preventable. Here’s Why.

On Monday, July 8, nearly 3 million homes and businesses in Texas were suddenly without power in the aftermath of Hurricane Beryl. Today, four days later, over 1 million Texans are entering a fourth day powerless. The acting governor, Dan Patrick, said in a statement that restoring power will be a “multi-day restoration event.” As people wait for this catastrophic grid failure to be remedied, much of southeast Texas, which includes Houston, is enduring dangerous, extreme heat with no air conditioning amid an ongoing heatwave.

Extreme Heat is the “Top Weather Killer”

As our team at FAS has explained, prolonged exposure to extreme heat increases the risk of developing potentially fatal heat-related illnesses, such as heat stroke, where the human body reaches dangerously high internal temperatures. If a person cannot cool down, especially when the nights bring no relief from the heat, this high core temperature can result in organ failure, cognitive damage, and death. Extreme heat is often termed the “top weather killer”: it is responsible for roughly 2,300 officially recorded deaths a year and about 10,000 attributed via excess-deaths analysis. With at least 10 lives already lost in Texas amid this catastrophic tragedy, excess heat and power losses are further compounding vulnerabilities, making the situation more dire.

Policy Changes Can Save Lives

These losses of life and power outages are preventable, and it is the job of the federal government to help prevent them. Our team at FAS has previously called for attention to the soaring energy demands and unprecedented heat waves that have placed the U.S. on the brink of widespread grid failure across multiple states, potentially jeopardizing millions of lives. In the face of widespread blackouts, restoring power across America is a complex, intricate process requiring seamless collaboration among various agencies, levels of government, and power providers amid constraints extending beyond just the loss of electricity. There is also a need for transparent protocols for safeguarding critical medical services and frameworks to prioritize regions for power restoration, ensuring equitable treatment for low-income and socially vulnerable communities affected by grid failure events.

As a proactive federal measure, an Executive Order or an interagency Memorandum of Understanding (MOU) should mandate the expansion of public health and emergency response planning for widespread grid failure under extreme heat. This urgently needed action would help mitigate the worst impacts of future grid failures under extreme heat, safeguarding lives, the economy, and national security as the U.S. moves toward a more sustainable, stable, and reliable electric grid system.

Therefore, given the gravity of these high-risk, increasingly probable scenarios facing the United States, it is imperative for the federal government to take a leadership role in assessing and directing planning and readiness capabilities to respond to this evolving disaster.

Image via NWS/Donald Sparks

Increasing the “Policy Readiness” of Ideas

NASA and the Defense Department have developed an analytical framework called the “technology readiness level” for assessing the maturity of a technology – from basic research to a technology that is ready to be deployed.  

A policy entrepreneur (anyone with an idea for a policy solution that will drive positive change) should realize that it is also possible to increase the “policy readiness” level of an idea by taking steps that improve its chances of success if it is adopted and implemented. Given that policy-makers are often time constrained, they are more likely to consider ideas where more thought has been given to the core questions they may need to answer as part of the policy process.

A good first step is to ask questions about the policy landscape surrounding a particular idea:

1. What is a clear description of the problem or opportunity?  What is the case for policymakers to devote time, energy, and political capital to the problem?

2. Is there a credible rationale for government involvement or policy change?  

Economists have developed frameworks for both market failure (such as public goods, positive and negative externalities, information asymmetries, and monopolies) and government failure (such as regulatory capture, the role of interest groups in supporting policies that have concentrated benefits and diffuse costs, limited state capacity, and the inherent difficulty of aggregating timely, relevant information to make and implement policy decisions).

3. Is there a root cause analysis of the problem? 

One approach that Toyota has used to answer this question is the “five whys,” which can prevent an analyst from providing a superficial or incomplete explanation with respect to a given problem.

4. What can we learn from past efforts to address the problem?  If this is a problem U.S. policymakers  have been working on for decades without much success, is there a new idea worth trying, or an important change in circumstances?

5. What can we learn from a comparative perspective, such as the experiences of other countries or different states and regions within the United States?

6. What metrics should be used to evaluate progress? What strategy should policy-makers have for dealing with Goodhart’s Law?

Goodhart’s Law states that when a measure becomes a target, it ceases to be a good measure. A police chief under pressure to reduce the rate of violent crime might reclassify certain crimes to improve the statistics.

7. What are the potential policy options, and an assessment of those options?  Who would need to act to approve and implement these policies?

This question – as is often the case – leads to more questions:

8. What are the documents that are needed to both facilitate a decision on the idea, and implement the idea?  

In the U.S. context, examples of these documents or processes include:

9. Has the idea been reviewed and critiqued by experts, practitioners, and stakeholders?  Is there a coalition that is prepared to support the idea?  How can the coalition be expanded?

10. How might tools such as discovery sprints, human-centered design, agile governance, and pilots be used to get feedback from citizens and other key stakeholders, and generate early evidence of effectiveness?

11. What steps can be taken to increase the probability that the idea, if approved, will be successfully implemented? 

For example, this might involve analyzing the capacity of the relevant government agencies to implement the recommended policy.

12. How can the idea be communicated to the public?  

For example, if you were a speechwriter, what stories, examples, quotes, facts and endorsements would you use to describe the problem, the proposed solution, and the goal?  What are the questions that reporters are likely to ask, and how would you respond to them?

Perhaps you have some experience with policy entrepreneurship and have suggestions on the right questions to ask about a policy idea to increase its “readiness level”. Comment on Tom’s LinkedIn post, where you can add wisdom that could be helpful to others learning about how to make positive change through policy.

Improving Government Capacity: Unleashing the capacity, creativity, energy, and determination of the public sector workforce

Peter Bonner is a Senior Fellow at FAS.

Katie: Peter, first, can you explain what government capacity means to you?

Peter: What government capacity means to me is ensuring that the people in the public sector, the federal government primarily, have the skills, the tools, the technologies, the relationships, and the talent they need to do their jobs and meet their agency missions.

Those agency missions are really quite profound. I think we lose sight of this: if you’re working at the EPA, your job is to protect human health and the environment. If you’re working at the Department of the Interior, it’s to conserve and protect our natural resources and cultural heritage for the benefit of the public. If you’re working for HHS, you’re enhancing the health and well-being of all Americans. If you’re working for the Department of Transportation, you’re ensuring a safe and efficient transportation system. And you can get into the national security agencies, which protect us from our enemies, foreign and domestic. These missions are amazing. Building that capacity so that people can do their jobs better and more effectively is a critical and noble undertaking. Government employees are stewards of what we hold in common as a people. To me, that’s what government capacity is about.

Mr. Bonner’s Experience and Ambitions at FAS

You’ve had a long career in government – but how is it that you’ve come to focus on this particular issue as something that could make a big difference?

I’ve spent a  couple of decades building government capacity in different organizations and roles, most recently as a government executive and political appointee as an associate director at the Office of Personnel Management. Years ago I worked as a contractor with a number of different companies, building human capital and strategic consulting practices. In all of those roles, in one way or another, it’s been about building government capacity.

One of my first assignments when I worked as a contractor was on the Energy Star program, helping to bridge the gaps between the public sector interests – wanting to create greater energy efficiency and reduce energy usage to address climate change – and the private sector interests – making sure their products were competitive and using market forces to demonstrate the effectiveness of federal policy. This work promoted energy efficiency across energy production, computers, refrigerators, HVAC equipment, even commercial buildings and residential housing. Part of the capacity building piece of that was working with the federal staff and the federal clients who ran those programs, but also making sure they had the right sets of collaboration skills to work effectively with the private sector around these programs and with other federal agencies. Agencies not only needed to work collaboratively with the private sector, but across agencies as well. Those collaboration skills – the skills to make sure they’re working jointly, inter-agency – don’t always come naturally, because people feel protective about their own agency, their own budgets, and their own missions. So that’s an example of building capacity.

Another project I was involved in early on was helping to develop a training program for inspectors of underground storage tanks. That’s pretty obscure, but underground storage tanks have been a real challenge in the nation, creating groundwater pollution. We developed an online course using simulations on how to detect leaks in underground storage tanks. The capacity building piece was getting the agencies and tank inspectors at the state and local level to use this new learning technology to make their jobs easier and more effective.

Capacity building examples abound – helping OPM build human capital frameworks and improve operating processes, improving agency performance management systems, enhancing the skills of Air Force medical personnel to deal with battlefield injuries, and so on. I’ve been doing capacity building through HR transformation, learning, leadership development, strategy and facilitation, human-centered design, and looking at how to develop HR and human capital systems that support that capacity building in the agencies. So across my career, those are the kinds of things that I’ve been involved in around government capacity.

What brought you to FAS, and what are you doing now?

I left my job as the associate director for HR Solutions at the Office of Personnel Management last May with the intent of finding ways to continue to contribute to the effective functioning of the federal government. This opportunity came about from a number of folks I’d worked with while at OPM and elsewhere.

FAS is in a unique position to change the game in federal capacity building through thought leadership, policy development, strategic placement of temporary talent, and initiatives to bring more science and technical professionals to lead federal programs. 

I’m really trying to help change the game in talent acquisition and talent management and how they contribute to government capacity. That ranges from upfront hiring in the HR arena through to onboarding and performance management and into program performance.

I think what I’m driven by at FAS is to really unleash the capacity, the creativity, the energy, the determination of the public sector workforce to be able to do their jobs as efficiently and effectively as they know how. So many people I know in the federal government have great ideas on how to improve their programs sitting in the bottom left-hand drawer of their desk or on their computer desktop, ideas they can never get around to because of everything else that gets in the way.

There are ways to cut through the clutter to help make hiring and talent management effective. Just in hiring: creative recruiting and sourcing for science and technical talent, using hiring flexibilities and hiring authorities on hand, equipping HR staffing specialists and hiring managers with the tools they need, working across agencies on common positions, and accelerating background checks are all ways to speed up the hiring process and improve hiring quality.

It’s the stuff that gets in the way that inhibits their ability to do these things. So that unleashing piece is the real reason I’m here. On the talent management side, the goal is to move the needle a little bit on the perception of public sector work and federal government work, because the negative perception of what it’s like to work in the federal government, and the distrust in the federal government, is just enormous. The barriers there are profound. But if we can move the needle on that just a little bit, and if we can change the candidate experience of the person applying for a federal job so that the process, while it may be arduous, results in a positive experience for them and for the hiring manager and HR staffing specialist, that then becomes the seed bed for a positive employee experience in the federal job. That in turn becomes the seed bed for an effective customer experience, because the linkage between employee experience and customer experience is direct. So if we can shift the needle on those things just a little bit, we start to change the perception of what public sector work is like and tap into the energy of what brought people to the public sector job in the first place, which by and large is the mission of the agency.

Using Emerging Technologies to Improve Government Capacity

How do you see emerging technologies assisting or helping that mission?

The emerging technologies in talent management are things that other sectors of the economy are working with and that the federal government is quickly catching up on. Everybody thinks the private sector has this figured out. Well, not necessarily. Private sector organizations also struggle with HR systems that effectively map to the employee journey and that provide analytics to guide HR decision-making along the way.

A bright spot for progress in government capacity is in recruiting and sourcing talent. The Army Corps of Engineers and the Department of Energy are using front-end recruiting software to attract people into their organizations – the Climate Corps, for example, or the Clean Energy Corps at the Department of Energy. So they’re using those front-end recruiting systems to bring people in and attract people to submit the resumes and applications that can, again, create that positive initial candidate experience, then take ’em through the rest of the process. There’s also work being done in automating and developing more effective online assessments through USA Hire, for example, so that if you’re in a particular occupation, you can take an online test when you apply, and that test will qualify you for the certification list for that job.

Those are not emerging technologies, but they are being deployed effectively in government. There are also mobile platforms to quickly and easily communicate with applicants and candidates at different stages of the process. Those things are coming online, or are already online, in many of the agencies.

In addition to some experimentation with AI tools, I think one of the more profound pieces around technologies is what’s happening at the program level that is changing the nature of the jobs government workers do, which in turn impacts what kind of person an HR manager is looking for.

For example, while there are specific occupations focused on machine learning, AI, and data analytics, data literacy and acumen and using these tools are going to be part of everyone’s job in the future. So facility with those analytic tools and with the data visualization tools that are out there is going to have a profound impact on the jobs themselves. Then you back that up to, okay, what kind of person am I looking for here? I need somebody with that skill set coming in, or who can be easily up-skilled into it. That’s true for data literacy, data analytics, and some of the AI skill sets that are coming online. It’s not just the technologies within the talent management arena; it’s the technologies that are happening in the front lines and the programs that then determine what kind of person I’m looking for and impact those jobs.

The Significance of Permitting Reform for Hiring

You recently put on a webinar for the Permitting Council. Do you mind explaining what that is and what the goal of the webinar was?

The Permitting Council was created under what’s called the FAST-41 legislation, which is legislation to improve the capacity and the speed at which environmental permits are approved so that federal projects can move forward. Permitting has become a real hot-button issue right now because the Inflation Reduction Act, the CHIPS and Science Act, and the Bipartisan Infrastructure Law created all of these projects in the field – some on federal lands, some on state and local lands, and some on tribal or private sector lands – that then create the need to do an environmental permit of some kind in order to get approval to build.

So under the Bipartisan Infrastructure Law, we’re trying to create internet for all, for example, and particularly provide internet access in rural communities where they haven’t had it before, and for people who perhaps couldn’t afford it. That requires building cell towers and transmission lines on federal lands, which then requires permits, and permits require permitting staff or a set of permitting contractors to actually go in and do that work.

Permitting has been, from a talent perspective, under-resourced. The agencies have not had the capacity, they have not had the staff, even to keep up with the permits necessitated by these new pieces of legislation. So getting the right people hired, getting them in place, getting the productive environmental scientists, community planners, scientists of different types, marine biologists, landscape folks, the fish and wildlife people who can advise on how best to do those environmental impact statements or categorical exclusions under the National Environmental Policy Act – it has been a challenge. Building that capacity in the agencies that are responsible for permitting is really a high leverage point for these pieces of legislation, because if I can’t build the cell tower, I then don’t realize the positive results from the Bipartisan Infrastructure Law. And you can think of the range of things that those pieces of legislation have fostered around the country, from clean water systems in underserved communities, to highways, to bridges, to roads, to airports.

Another example is offshore wind. You need marine biologists to be able to help do the environmental impact statements around building the wind turbines offshore and examine the effect on marine habitats. It’s those people that the Department of the Interior, the Department of Energy, and the Department of Commerce need to hire to come in and run those programs and do those permits effectively. That’s what the Permitting Council does.

One of the things we worked on with OPM and the Permitting Council together was creating a webinar that got the hiring managers and the HR staffing specialists in the room at the same time to talk about the common bottlenecks they face in the hiring process. After doing outreach and research, we created journey maps and a set of personas to identify a couple of the most salient, common, and high-leverage challenges they face.

Overcoming Hiring Bottlenecks for Permitting Talent, a webinar presented to government hiring managers, May 2024

We looked at the ecosystem within hiring: what gets in the way in recruiting and sourcing, all the way through to onboarding; position descriptions, and what you do if you don’t have an adequate position description up front when you’re trying to hire that environmental scientist; and the background check and suitability processes. What do you do when things get caught in the suitability process? If you can’t bring those folks on board in a timely fashion, you risk losing them.

We focused on a couple of the key challenges in that webinar, and we had, I don’t know, 60 or 70 people there – hiring managers and HR staffing specialists – who took away a set of tools they can use to accelerate and improve the hiring process and get high-quality hires on board quickly to assist with permitting.

The Permitting Council has representatives from each of the agencies that do permitting and works with them on cross-agency activities. The council also has funding from some of these pieces of legislation to foster the permitting process, whether through new software or people and process improvements, to get the permits done as quickly as possible. So that’s what the webinar was about. We’re talking about doing a second one to look at the more systemic and policy-related challenges in permitting hiring.

The Day One Project 2025

FAS has launched its Day One Project 2025, a massive call for nonpartisan, science-based policy ideas that a next presidential administration can utilize on “day one” – whether the new administration is Democrat or Republican. One of the areas we’ve chosen to focus on is Government Capacity. Will you be helping evaluate the ideas that are submitted?

I’ve had input into the Day One Project, particularly around the talent pieces in the government capacity initiative, and also procurement and innovation in that area. I think it has the potential to help set the stage for talent reform more broadly, whether the barriers are legislative, regulatory, or the risk-averse culture we have in the federal government. The impact of the Day One Project could be pretty profound if we get the right energy behind it. So one of the things that I’ve known for a while, but that has become clear to me over the past five months working with FAS, is that there are black boxes in the talent management environment in the federal government. What I mean by that is that work goes into a specialized area of expertise and nobody knows what happens in that specialized area until something pops out the other end.

How do you shed light on the inside of those black boxes so it’s more transparent what happens? For instance: position descriptions when agencies are trying to hire someone. Sometimes what happens with position descriptions is that the job needs to be reclassified because it’s changed dramatically from the previous position description. Well, I know a little about classification and what happens in the classification process, but to most people looking from the outside, to hiring managers, that’s a black box. Nobody knows what goes on. I mean, they don’t know what goes on within that classification process to know that it’s going to be worthwhile for them once they have the position description at the other end and are able to do an effective job announcement. Shedding light on that, I think, has the potential to increase transparency and trust between the hiring manager and the HR folks, or between the program people and the human capital people.

If we’re able to create that greater transparency. If we’re able to tell candidates when they come in and apply for a job where they are in the hiring process and whether or not they made the cert list. If they are on the cert list, what’s next in terms of their assessment and the process? If they’ve gone through the interview, where are we in the decision deliberations about offering them the job? Same thing in suitability. Those are black boxes all the way across. Creating transparency and communication around them, I think, will go a long way, again, to moving that needle on the perception of what federal work is and what it’s like to work in the system. So it’s a long answer to a question that I guess I can summarize by saying: I think we are in a target-rich environment here. There’s lots of opportunity here to help change the game.