This series of interviews spotlights scientists working across the country to implement the U.S. Department of Energy’s massive efforts to transition the country to clean energy, and improve equity and address climate injustice along the way. The Federation’s clean energy workforce report discusses the challenges and opportunities associated with ramping up this dynamic, in-demand workforce. These interviews have been edited for length and do not necessarily reflect the views of the DOE.
Dr. Adria Brooks’ journey to the Department of Energy has been a winding road. From the forests of Western Massachusetts, to the desert mountains of Arizona, to the frosty fields of Wisconsin, she has made a career out of teaching others why they should care about clean energy.
From Felling Trees to Harnessing Sunshine
Dr. Brooks’ pathway to clean energy began as an undergraduate, when she took time off from her Bachelor’s degree to spend nine months on a forest trail crew in Western Massachusetts. “I was trained to be a lumberjack, basically, but for conservation purposes – so felling trees to build bridges or trails and things. I loved that job; it was really fun and helped me connect with the environment.” Interacting with the environment in such a physical, tangible way encouraged her to change her course of study from space sciences to climate change and energy issues.
Soon after switching her academic focus, she found work at a solar test facility managed by her alma mater, the University of Arizona. Very quickly she got hands-on experience in every facet of solar energy, from installation of modules and inverters to running experiments, collecting data, doing analysis, and writing reports. In addition, she honed her science communication skills by giving tours to visiting audiences – ranging from Girl Scouts to the late Senator John McCain.
Understanding how solar and its supporting power systems worked on the ground illuminated a new lesson for Dr. Brooks: “Solar [was] not the problem – the power grid is the reason we can’t get more clean energy.” With this new understanding, Dr. Brooks pursued both a Master’s and a PhD in electrical engineering, with a certificate in energy analysis and policy.
“I loved the policy piece of it, because it brought together economists and engineers and policy folks,” she says. This cohort came from different disciplines into the energy analysis and policy program at the University of Wisconsin. “It was a really cool program; I loved it.”
State Government Service
While pursuing her dissertation, Dr. Brooks started working at the Wisconsin Public Service Commission as a transmission engineer. This demanding state government role proved to be a valuable training ground, building on the communication skills she honed in Arizona. As an engineer she worked across two different administrations, explaining electrical transmission systems, their challenges, and how different policies might impact reliability and clean energy goals. The key to effectively engaging her audience? Understanding their specific goals and meeting them where they were.
“The information [on power systems] I was providing was essentially the same. The question became: What lens am I using? Am I focusing on reliability and consumer cost? Am I focusing on decarbonization? From my view, it didn’t really matter. The solutions wind up being pretty similar, but it was eye-opening for me to learn how to communicate the science to folks who don’t have that background, but who have the ability to make big decisions affecting the power grid. I thoroughly loved that job. And that’s what set me off wanting to do more policy work at the federal level.”
Joining DOE and the GDO
Setting her sights on the federal government, Dr. Brooks joined the U.S. Department of Energy (DOE) in 2020 as an AAAS Science & Technology Policy Fellow in the Department’s Office of Energy Efficiency and Renewable Energy. The position was meant to be research heavy and focused on maximizing taxpayer investments in different investigative projects. But when a new administration came in with a long list of renewable energy goals and a serious focus on transmission, Dr. Brooks found herself reassigned to the Office of Electricity, and later hired into the newly created Grid Deployment Office (GDO).
GDO – tasked with investing in critical generation facilities, increasing grid resilience, and improving and expanding transmission and distribution systems to provide reliable, affordable electricity – needed internal staff who understood the science of renewable energy and grid deployment, and who could translate it for cross-cutting program teams and leadership who weren’t mired in the details day-to-day. Dr. Brooks found her groove by drawing on skills from her days at the solar test facility and the Wisconsin Public Service Commission. “My job became a lot more policy focused, trying to explain the science to stand up new programs related to transmission and the power grid,” she said. Her communication skills, combined with her technical background, are hugely important because the science of electrical transmission – and how it shapes what clean energy development can occur and how quickly – is often opaque to people, including policymakers.
Communication remains a crucial part of Dr. Brooks’ role and contribution at DOE.
A big win for Dr. Brooks and GDO was the October 2023 release of the National Transmission Needs Study. This study is a useful planning reference to efficiently and effectively deploy resources to update and expand the nation’s transmission grid infrastructure. Conducted every three years, this most recent study is more expansive in scope than previous versions. “Future policy decisions that the Department makes are going to be based on the findings of this report. It also provides a lot of valuable insight for utilities, developers, and other decision makers across the country, so that’s very big,” Dr. Brooks said. Though modest about it, she played a major role in planning the report, analyzing its data, and rolling it out.
The journey that brought Dr. Brooks to DOE seems almost preordained, as she is bringing her specific knowledge to bear on the urgent problem of climate change.
“Now I feel more impactful being so close to the policymaking, getting to have one foot in the engineering analysis and one foot in policy development. That is really exciting. I do think a lot of that is a very specific opportunity that matched the specific skill set that I had. I have felt very lucky in that regard, to be seen as an expert around the Department. Lots of different offices will reach out, or policymakers will reach out to try to get clarity on the transmission system, and that is exciting. But I also know that it’s luck that I stepped in at the exact time to make that opportunity for myself.”
While the transmission issues she works on can often feel insurmountable, Dr. Brooks feels optimistic about the future.
“I am hopeful about how much transmission we’re going to be able to build over the next 10 to 15 years. The word ‘transmission’ is now a common term; people understand it. A couple of years ago, I would talk to folks about my job and they would say, ‘I don’t understand what the power grid is.’ Now, more people at least understand what the grid is, and that it is a bottleneck to getting clean energy online. That’s huge. I think we’re going to make a lot more progress than I had any hope of us making even a couple of years ago.”
More than just the policy implications of her work, Dr. Brooks is impressed by how many young professionals want to join government service to play an active role in fighting climate change. Starting in Tucson, continuing in Wisconsin, and now from her home in Boston, she’s volunteered in a variety of roles unrelated to energy systems and grid work that facilitate climate discussions. “I’ve always found kids to be super eager and curious to learn,” she said, providing even more hope that the work will continue with the support of future generations.
From the rugged snowbanks of Alaska to the tropical seaside of Hawai’i, Dr. Olivia Lee Mei Ling has sought to improve the access to, and delivery of, energy. To understand her journey to the Department of Energy and her work today, our story begins in Alaska.
Women in Polar Science
After obtaining her PhD in Wildlife and Fisheries Science from Texas A&M University, Dr. Lee headed north to accept a teaching position at the University of Alaska, Fairbanks. She spent ten years there, first in the Geophysical Department and later in the International Arctic Research Center. While there she developed future energy scenarios for Alaskans, working with federal, state, tribal and local governments, expert stakeholders and non-governmental organizations. Those conversations were sometimes difficult – bringing together a wide range of perspectives and personalities and asking them to align on a plan – but were vital to the state’s future.
“Building those relationships [between energy stakeholders], and helping those conversations continue to happen was a fantastic opportunity to delve into how policy and science can co-occur.”
While at the university she did a short stint with the National Science Foundation as an IPA [Intergovernmental Personnel Act, a temporary position in the federal government] supporting researchers doing work in the Arctic. There she was involved in interagency programs, with a lot of emphasis on developing diversity, equity, and inclusion initiatives across agencies.
During this time Dr. Lee supported growing outreach for a group of scientists, Women in Polar Science. She identified a need for this group after submitting an article to a geophysical journal about the group’s work – which was rejected because it ‘wasn’t of interest to a wide enough audience.’
Dr. Lee said it was “appalling to think that the science community is not interested or doesn’t believe there is enough value in sharing what we’ve learned about the women who face adversity doing research in polar environments. And so I co-founded the Interagency Arctic Research Policy Committee’s (IARPC) community of interest on diversity, equity, and inclusion issues.” The group has since grown and taken off, bringing more scientists together to work on DEI within Arctic research.
Dr. Lee’s work in polar science led to more social ties within Alaska’s Tribal communities, and a deeper understanding of their unique needs. These experiences showed the value of skills beyond traditional scientific training, and empathy quickly became her guiding principle. Alaska is an oil-rich state, and it became clear that any energy plan there needed to address community needs first. “In some areas, like in Alaska, diesel will have to continue to be a part of the energy mix until they’re able to support something more reliable year-round than renewables can offer right now. We need to push clean energy, but not at the cost of livelihoods and safety of communities,” says Dr. Lee. To non-scientists, this statement might be surprising – isn’t the goal to eliminate all fossil fuels? No, the goal is to support a just transition in every region.
Moving States, Territories and Tribes to Clean Energy As Quickly As Possible at DOE
When the Department of Energy began ramping up hiring through the Clean Energy Corps, Dr. Lee was immediately interested. When she interviewed with the Grid Deployment Office, the office recognized that her knowledge and skills – in particular, her combination of scientific expertise and familiarity with the needs of Tribal communities – were unique and vital to its work.
“We work with a lot of tribes, and it’s a skill set that not everyone has – to take the time to self-educate on the history of colonization, to respectfully interact with tribes, understand that they are self-governing entities and continue to face a lot of challenges in developing their economies.”
At GDO, Dr. Lee supports grid resilience projects. Her team thinks critically about what specific infrastructure investments could help communities be more resilient to impacts from climate change – and what resources or guidance communities need to implement those ideas. “We’re not just reacting to disasters as they happen, but thinking about 10 years, 20 years down the road, where do we need to be? How do we sustain energy access, and what partnerships can we help build now to make sure that this is an ongoing process?”
She continues: “It’s really exciting to know that you are a part of modernizing the grid in a way that will have tangible benefits in the near term – and in the long term as well, if we’re able to help states and tribes plan how [IRA funds] can shape their sustainability moving forward.”
There are lots of unknowns: what kind of infrastructure exists today, and what kind of investment is required to hasten transition? What resources specific to that location are available now, and how can productive programs be amplified? The work involves measuring and modeling to ensure waste and harm are minimized, while maximizing positive environmental and economic opportunities across the lifecycle of any energy plan.
“In my particular program, we’re supporting projects that develop grid resilience. And there’s a very strong emphasis on going beyond theoretical into implementation. Like: what specific infrastructure investments and projects are going to be done to make the electrical grid more resilient to impacts from climate change?”
Unsurprisingly, this work is more than spreadsheets of numbers. To deploy an energy upgrade, so much more must be considered: a region’s history, its present-day health, and how the region may evolve based on climate change prediction models. How can the department meet communities where they are and at the same time prepare them for a changing environment?
Unexpected Opportunities to Build Grid Resilience
Dr. Lee shares one example of how her team did just that. One of the Alaskan tribes they work with requested funding for a project that seemed outside the bounds of grid resilience. They didn’t ask for wiring, poles or grounding, but for snow removal equipment for their wind facility.
During a recent snowstorm, the community couldn’t access their wind facilities because they lacked updated snow removal equipment. Without ready access to those facilities, if anything had gone wrong, the grid would have had problems as well. “It’s so important to have energy in Alaska winters – it’s life or death. You can’t just say ‘I’ll put on an extra blanket.’ Responding to a request for something as simple as snow removal equipment is an actual, valid, small step that we can take to support grid resilience.”
Dr. Lee’s creative thinking and her understanding of the needs of remote communities are the skills that make her an exceptional team member. Without that level of understanding, that tribe may not have gotten support for equipment that, at first glance, isn’t immediately related to grid resilience.
Advice for Those Seeking Roles in Government Clean Energy Work
Dr. Lee’s achievements underscore the importance of a strong federal workforce. She offered advice for those entering government for the first time: “Find mentors who can help you navigate how the government works, and be open to new opportunities and trying new things.” She adds that being open to learning, and finding mentors in different offices and at different career stages, brings more opportunities than a so-called “straight career path.”
She says another benefit is that the people working on clean energy technology in government are some of the most optimistic people. Her work at GDO – helping modernize and fortify the grid – is vital to the resilience and livelihood of communities across the country.
The Federation of American Scientists (FAS) seeks to advance a broad suite of contemporary issues where science, technology, and innovation policy can deliver dramatic progress. In recognition of her work in public service, FAS will honor Dr. Alondra Nelson with the Public Service Award next month, alongside other distinguished figures including Senators Chuck Schumer (D-NY) and Todd Young (R-IN) for their work in Congress making the CHIPS & Science Act a reality to ensure a better future for our nation.
In addition to my role as Senior Fellow in Science Policy for FAS, I have the pleasure of chairing Membership Engagement for Section X, a governance committee of the American Association for the Advancement of Science (AAAS) focused on Societal Impacts of Science and Engineering. I had the honor of co-moderating a session featuring Dr. Alondra Nelson last week, titled Big Issues for Science Policy in a Challenging World—A Conversation with Dr. Alondra Nelson at the American Educational Research Association (AERA) in Washington D.C.
The hybrid event was co-organized with Section K (Social, Economic, and Political Sciences) and co-moderated with Dr. Barbara Schneider, John A. Hannah University Distinguished Professor in the College of Education and the Department of Sociology at Michigan State University and Immediate Past Chair of AAAS Section K. We led a targeted Q&A discussion informed by audience questions in-person and online.
The conversation focused on how scientific and technical expertise can have a seat at the policymaking table, which aligns with the mission of FAS, and provided key insights from an established leader. Opening remarks featured reflections from Dr. Alondra Nelson on the current state of key issues in science policy that were priorities during her time in the Biden-Harris administration, and her views on the landscape of challenges that should occupy our attention in science policy today and in the future. Dr. Alondra Nelson is the Harold F. Linder Professor at the Institute for Advanced Study and a distinguished senior fellow at the Center for American Progress. A former deputy assistant to President Joe Biden, she served as acting director of the White House Office of Science and Technology Policy (OSTP) and the first ever Deputy Director for Science and Society.
FAS is highly invested in ensuring that federal government spending is directed towards enhancing our nation’s competitiveness in science and technology. Dr. Nelson emphasized the idea of innovation for a purpose, and how scientific research and technology development have the potential to improve society, including through STEM education and the infrastructure necessary for research investments to be successful. She also discussed how science and technology can advance democratic values, and highlighted three examples from her time at OSTP that provide promise for the future, including: the cancer moonshot; expanding access to federally funded research across the country; and the need for bringing new voices into science and technology.
Public trust in science and public engagement. The moderated discussion began with the idea of public trust in science in order to set the stage for the current policy landscape. Dr. Nelson noted that we are operating in a low-trust environment for science, and that we should make scientific data more accessible to the public. She also highlighted that we need to engage the public in the design process of science and technology, which is why the OSTP Division of Science and Society was initially created. On this point, Dr. Nelson also said that “science policy is a space of possibility” and that we need to expand these opportunities more widely.
Scientific workforce, federal investments and international collaboration. Dr. Nelson described the need to make the implementation of CHIPS and Science a reality and to bring more young voices into science and technology. She remarked that the promise of the CHIPS and Science Act is the intention around investments, and that “we need the ‘and science’ part to be fully funded in order to support the future scientific workforce.” To the question of how we should target federal investments in science and technology, she emphasized the need for collaborative research, bipartisan opportunities, and continuing to study the ‘science of science’ in order to understand the best ways for improving the system, while recognizing that the ROI from the investments we make today may take a few generations to be evident. Relatedly, on the question of ensuring our nation’s competitiveness in science and technology while fostering international collaboration, Dr. Nelson reminded the audience that “national security is a concern around many STEM areas of research.”
Including marginalized voices and technological development. A significant part of the conversation focused on ensuring that marginalized voices have a seat at the table in science and technology. Dr. Nelson stated bluntly that “you can’t have good science without diversity” and that we need to support institutions across the country and engage with different types of educational institutions that may have been traditionally marginalized. To this end, as an example, she emphasized that OSTP previously engaged indigenous knowledge in its work around science and technology governance. The field of artificial intelligence (AI) was also discussed as an example of an area where we need to elevate the visibility of ethical issues that marginalized communities face. The CHIPS and Science Act focused on key technology areas that could create jobs in fields such as AI, leading to a discussion on the need for better policy around emerging technologies, creating high quality jobs, and a stronger focus on workers in the innovation economy.
The event concluded with a high-level discussion on policy impact, during which Dr. Nelson remarked that “if you want your science to have an impact, you should find ways to elevate the visibility of your findings among policymakers.” She stated that this will necessitate expanding our current methods to include broader voices in science and technology in the future. We look forward to honoring Dr. Nelson’s impact in the field during next month’s FAS event.
With the news that SpaceX’s Starship is nearing readiness for another test launch, FAS CEO Dan Correa has been thinking more about what its technology could mean for national security, space science, and commercial space activities. Correa believes policymakers should be thinking and talking more about the implications of Starship and other competing space efforts as well. He recently sat down with Karan Kunjur and Neel Kunjur, founders of space technology startup K2 Space, to find out just how big of a leap the next generation of launch vehicles will represent.
Dan Correa, FAS CEO: Let’s start with reminding people exactly what SpaceX’s Starship is – and why it could be such a paradigm shifter.
Karan Kunjur, K2 Space Co-founder, CEO: Starship is a next-generation launch vehicle and spacecraft being developed by SpaceX; when operational, it will change the game in space exploration. It’s the largest and most powerful launch system ever developed (150+ tons of payload capacity to LEO) and is intended to be fully reusable.
A single Starship launching at a cadence of three times per week will be capable of delivering more mass to orbit in a year than humanity has launched in all of history.
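The claim above is easy to sanity-check with back-of-envelope arithmetic, using the 150-ton payload figure quoted in this interview. The comparison figure for humanity's historical launched mass is an outside assumption, not a number from the interview.

```python
# Back-of-envelope check of the "more mass in a year than all of history" claim.
# payload_to_leo_tons comes from the interview; the historical comparison is
# a rough outside estimate, not an official figure.
payload_to_leo_tons = 150
launches_per_week = 3
weeks_per_year = 52

annual_mass_tons = payload_to_leo_tons * launches_per_week * weeks_per_year
print(f"One Starship at 3 launches/week: ~{annual_mass_tons:,} tons/year to LEO")
# Total mass humanity has placed in orbit to date is commonly estimated in the
# low tens of thousands of tons, so a single vehicle at this cadence would
# plausibly match or exceed it within a year.
```

At roughly 23,000 tons per year, the arithmetic supports the scale of the claim, though the sustained three-per-week cadence is itself an assumption.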
With Starship-class access to space, we’re about to move from an era of mass constraint to an era of mass abundance. In this new era, what we put in space will look different. The historic trades made around mass vs. cost will be flipped on their heads, and the optimal spacecraft for science, commercial, and national security missions will change.
DC: Can you be more specific about what types of economic sectors are likely to be affected by Starship and other similar next generation launch vehicles? In other words, is there a broader ecosystem of products and services that you think are likely to emerge to take advantage of Starship or similar capabilities from other companies?
Neel Kunjur, K2 Space Co-founder, CTO: Historically, almost every application in space has been constrained by something commonly known as ‘SWAP’ – Size, Weight and Power. Satellite bus manufacturers have been forced to use expensive, lightweight components that are specially designed for spacecraft that need to fit inside current rockets. Payload designers have been forced to pursue compact sensor designs and complicated, sometimes unreliable deployables. Taken together, these constraints have resulted in lightweight but necessarily expensive vehicles.
A perfect example of this is the James Webb Space Telescope (JWST). In order to fit required mission capabilities within SWAP constraints, the designers of JWST had to 1) Develop a highly complex deployable segmented mirror to fit within the volume budget, 2) Use expensive and novel Beryllium mirrors to fit within the mass budget, and 3) Design low power instruments and thermal conditioning hardware to fit within the power budget. This kind of complexity dramatically increases the cost of missions.
KK: Exactly. In a world with Starship, things will become significantly simpler. Instead of a complex, unfolding, segmented mirror, you could use a large monolithic mirror. Instead of expensive Beryllium mirrors, you could use simpler and cheaper materials with lower stiffness-to-mass ratios, similar to those used in ground-based telescopes. Instead of expensive, power-optimized instruments, additional power could be used to make simpler and cheaper instruments with more robust thermal conditioning capabilities.
The potential for change exists across every type of mission in space. It will become possible to have a satellite bus platform that has more power, more payload volume and more payload mass – but one that comes in at the cost of a small satellite. In a world with launch vehicles like Starship, satellite-based communications providers will be able to use the increased power to have greater throughput, remote-sensing players will be able to use more volume to have larger apertures, and national security missions will no longer need to make the trade-off between single exquisite satellites and constellations of low capability small satellites.
DC: Can we get more specific about what we think the new costs would be? If I’m a taxpayer thinking about how my government financially supports space exploration and activity, that’s important. Or even if I’m a philanthropic supporter of space science – it matters. So what are some “back of the envelope” estimates of cost, schedule, and performance of Starship-enabled missions, relative to status quo approaches?
KK: Here’s an example: the MOSAIC (Mars Orbiters for Surface-Atmosphere-Ionosphere Connections) concept, identified as a priority in the National Academies’ 2022 Planetary Decadal Survey, was a 10-satellite constellation to understand the integrated Mars climate system from its shallow ice, up through Mars’ atmospheric layers, and out to the exosphere and space weather environment. The study envisioned deploying one large “mothership” satellite and nine smaller satellites in orbit around Mars using SpaceX’s Falcon Heavy Rocket. Development of these spacecraft was expected to cost ~$1B (excluding recommended 50% reserves).
In a world with Starship, the same mission could cost $200M in spacecraft costs. With this next generation launch vehicle, you could launch 10 large satellites in a single Starship. Each satellite would be redesigned to optimize for Starship’s mass allowance (150 tons), allowing the use of cheaper but heavier materials and components (e.g. aluminum instead of expensive isogrid and composite structures). Each satellite would have more power (20 kW), payload mass, and payload volume than the large “mothership” satellite envisioned in the original MOSAIC study.
DC: You’ve told me that standardization and modularization possibilities with Starship as it relates to satellites and scientific instruments is crucial. Can you elaborate on that idea?
NK: Longer term, having mass will allow us to do interesting things like over-spec the SWAP capabilities of the satellite bus to meet the requirements of various space science missions – thereby driving standardization. With sufficient SWAP, we could start to include a consistent bundle of instruments (rather than selecting a few to fit within limited SWAP budgets) – reducing the level of customization and non-recurring engineering (NRE) required for each mission.
Although there will always be some level of customization required for each individual scientific mission, the potential to standardize a large portion of the hardware will make it possible to mass produce probes, increasing the potential frequency of missions and reducing the potential cost per mission. Examples here include standardized build-to-print suites of spectrometers, cameras, and particle and field sensors.
DC: What are the implications for the Defense Department? What are some of the important opportunities to deliver capabilities to solve national security problems in less time, at a lower cost, and with greater resilience?
NK: In 2022, the Space Force made resilience its No. 1 priority. One of the ways it hoped to achieve resilience was through the use of cheaper, more quickly deployed satellites. Unfortunately, the only path historically to going cheaper and faster was by going smaller, thereby sacrificing capabilities (e.g. low-cost satellites typically come in at <2 kW of array power).
With Starship, and companies like K2, agencies such as the Department of Defense will have access to larger, more capable satellites that are built more cheaply, more quickly, and with lower NRE. Instead of a single exquisite satellite with 20 kW of power, the DoD will be able to deploy constellations of 40 satellites, each with 20 kW of power, all within a single Starship. With the rise of refueling and next generation propulsion systems, these high-power constellations will be deployable in higher orbits like Medium Earth Orbit (MEO) and Geostationary Orbit (GEO), providing a much needed alternative to a potentially crowded Low Earth Orbit (LEO).
DC: The NASA Commercial Orbital Transportation Services (COTS) program used firm, fixed-price milestone payments to solve a problem (deliver and retrieve cargo and crew to the International Space Station) at a fraction of the cost of “business as usual” approaches. NASA also gave companies such as SpaceX greater autonomy with respect to how to solve this problem. What are some key lessons that policymakers should learn from the NASA COTS program and similar efforts at the Space Development Agency?
KK: The NASA COTS program and SDA have demonstrated that policy can be as effective as technology in driving positive change in space. The move toward firm, fixed-price models incentivized reductions in cost and time, and pushed commercial entities to be thoughtful about what it would take to deliver against stated mission requirements. The autonomy given to companies like SpaceX was critical to achieving the unprecedented results that were delivered.
Moving forward, other areas that could benefit from this approach include deep space communications infrastructure and space debris identification and remediation.
NK: Take the communications capabilities around Mars. The current infrastructure is aging and throughput-limited – we just have a collection of Mars orbiters that are operating beyond their primary design lifetimes. With the ramp-up of ambitious scientific missions expected to be launched over the next decade (including eventual human exploration efforts), this aging infrastructure will be unable to keep up with a potentially exponential increase in data demands.
Rather than addressing this through conventionally procured missions, where the end-to-end mission is prescribed, a new approach using mechanisms like data buys or Advance Market Commitments could fit well here. Assigning a price for deployed throughput on a $/Gbps basis – stating what the U.S. government would be willing to pay, without prescribing how those capabilities are deployed – could result in a cheaper, faster, and more effective solution. Companies could then raise capital against the potential market, build out the infrastructure, and shoulder the majority of the risk, much like any other early-stage venture.
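The $/Gbps data-buy mechanism described above reduces to simple arithmetic: the government commits to a price per unit of deployed capacity, and pays only for what is actually delivered, up to its commitment. A minimal sketch, where the price per Gbps and the committed volume are invented for illustration, not proposed values:

```python
# Hypothetical sketch of a $/Gbps data buy. All figures are illustrative
# assumptions, not real program numbers.

def data_buy_payment(deployed_gbps: float, price_per_gbps: float,
                     committed_gbps: float) -> float:
    """Pay a fixed price per Gbps of relay capacity actually deployed,
    capped at the government's committed purchase volume."""
    purchased = min(deployed_gbps, committed_gbps)
    return purchased * price_per_gbps

# Example: a 10 Gbps commitment at a hypothetical $50M per Gbps.
payment = data_buy_payment(deployed_gbps=6.0,
                           price_per_gbps=50e6,
                           committed_gbps=10.0)
print(f"Payment: ${payment / 1e6:.0f}M")  # pays only for the 6 Gbps delivered
```

The cap matters: it bounds the government's exposure while still letting companies raise capital against a known demand curve.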
DC: What new commercial capabilities might Starship unlock? Would any of these capabilities benefit from some government involvement or participation, in the same way that the NASA COTS program helped finance the development of the Falcon 9?
KK: Almost every new commercial space customer has been forced to operate with sub-scale unit economics. Given capital constraints, their only option has been to buy a small satellite and compromise on the power, payload mass or payload volume they actually need. In a world with Starship, commercial players will be able to deploy capable constellations at a fraction of the cost. They’ll be able to multi-manifest in a single Starship, amortizing the cost of launch across their full constellation (instead of just 4-8 satellites). The mass allowance of Starship will make previously infeasible commercial businesses feasible, from large fuel depots, to orbital cargo stations, to massive power plants.
As we think about development across the solar system, as future deep space missions increase the demand for data, the lack of communications capabilities beyond the Deep Space Network (DSN) is going to be a limiting factor. A concerted effort to start building these capabilities to handle future data demand could be an interesting candidate for a COTS-like approach.
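The launch-cost amortization KK describes above is simple division: the same launch price spread across many more satellites. The launch price and satellite counts below are hypothetical assumptions for illustration only:

```python
# Illustrative launch-cost amortization across a constellation.
# All numbers are hypothetical, not published pricing.

def launch_cost_per_satellite(launch_price: float,
                              satellites_per_launch: int) -> float:
    """Amortize a single launch price evenly across its manifest."""
    return launch_price / satellites_per_launch

legacy = launch_cost_per_satellite(100e6, 8)      # assume 8 sats per legacy launch
starship = launch_cost_per_satellite(100e6, 40)   # assume 40 sats per Starship
print(f"Legacy: ${legacy / 1e6:.1f}M/sat, Starship: ${starship / 1e6:.1f}M/sat")
```

Even holding the launch price constant, multi-manifesting at Starship scale cuts the per-satellite launch cost by the ratio of manifest sizes.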
DC: For policy-makers and program managers who want to learn more about Starship and other similar capabilities, what should they read, and who should they be following?
KK: There are a number of great pieces on the potential of Starship, including:
- Starship will be the biggest rocket ever. Are space scientists ready to take advantage of it?
- Starship is still not understood
- Accelerating Astrophysics with SpaceX
DC: Great recommendations. Thanks to you both for chatting.
KK: Thank you.
The U.S. bioeconomy has been surging forward, charged by Presidential Executive Order 14081 and the CHIPS and Science Act. However, many difficult challenges lie ahead for the U.S. bioeconomy, including for U.S. biomanufacturing capabilities. U.S. biomanufacturing has been grappling with issues in fermentation capacity, including challenges related to scale-up, inconsistent supply chains, and downstream processing. While the U.S. government works on clearing these roadblocks, it will be important to bring industry perspectives into the conversation to craft solutions that not only address the current set of issues but also look to mitigate challenges that may arise in the future.
To get a better understanding of industry perspectives on the U.S. bioeconomy and the U.S. biomanufacturing sector, the Federation of American Scientists interviewed Dr. Sarah Richardson, the CEO of MicroByre. MicroByre is a climate-focused biotech startup that specializes in providing specialized bacteria based on the specific fermentation needs of its clients. Dr. Richardson received her B.S. in biology from the University of Maryland in 2004 and a Ph.D. in human genetics and molecular biology from Johns Hopkins University School of Medicine in 2011. Her extensive training in computational and molecular biology has given her a unique perspective regarding emerging technologies enabled by synthetic biology.
FAS: The U.S. Government is focused on increasing fermentation capacity, including scale-up, and creating a resilient supply chain. In your opinion, are there specific areas in the supply chain and in scale-up that need more attention?
Dr. Sarah Richardson: The pandemic had such an impact on supply chains that everyone is reevaluating the centralization of critical manufacturing. The United States got the CHIPS and Science Act to invest in domestic semiconductor manufacturing. The voting public realized that almost every need they had required circuits. Shortages in pharmaceuticals are slowly raising awareness of chemical and biomedical manufacturing vulnerabilities as well. The public has even less insight into vulnerabilities in industrial biomanufacturing, so it is important that our elected officials are proactive with things like Executive Order 14081.
When we talk about supply chains we usually mean the sourcing and transfer of raw, intermediate, and finished materials — the flow of goods. We achieve robustness by having alternative suppliers, stockpiles, and exacting resource management. For biomanufacturing, an oft-raised supply chain concern is feedstock. I can and will expound on this, but securing a supply of corn sugar is not the right long-term play here. Shoring up corn sugar supplies will not have a meaningful impact on industrial biomanufacturing and should be prioritized in that light.
Biomanufacturing efforts are different from the long-standing production of consumer goods in that they are heavily tied to a scientific vendor market. As we scale to production, a large share of our supply chain consists of sterile, disposable plastic consumables. We compete with biomedical sectors for those, for personal protective equipment, and for other appliances. This supply chain issue squeezed not just biomanufacturing, but scientific research in general.
We need something that isn’t always thought of as part of the supply chain: specialized infrastructural hardware, much of which may not be manufactured domestically. Access to scale-up fermentation vessels is already squeezed. The other problem is that no matter where they are built, these vessels are designed for the deployment of canonical feedstocks and yeasts. Addressing the manufacturing locale would offer us the chance to innovate in vessel and process design, and to support the kinds of novel fermentations on alternate feedstocks needed to advance industrial biomanufacturing. There are righteous calls for the construction of new pilot plants. We should make sure we take the opportunity to build for the right future.
One of the indisputable strengths of biomanufacturing is the potential for decentralization! Look at microbrewing: fermentation can happen anywhere without country-spanning feedstock pipelines. As we onboard overlooked feedstocks, it may only be practical to leverage them if some fermentation happens locally. As we look at supply chains and scale up we should model what that might look like for manufacturing, feedstock supply chains, and downstream processing. Not just at a national level, but at regional and local scales as well.
There are a lot of immediate policy needs for the bioeconomy, many of which are outlined in Executive Order 14081. How should these immediate needs be balanced with long-term needs? Is there a trade-off?
Counterintuitively, the most immediate needs will have the most distant payoffs! The tradeoff is that we can’t have every single detail nailed down before work begins. We will have to build tactically for strategic flexibility. Climate change and manufacturing robustness are life-or-death problems. We need to be open to more creative solutions in funding methods and timeline expectations; in who comes to the table, in who around the table is given the power to effect change, and in messaging! The comfortable, familiar, traditional modes of action and funding have failed to accelerate our response to this crisis.
We have to get started on regulation yesterday, because the only thing that moves slower than technology is policy. We need to agree on meaningful, aggressive, and potentially unflattering metrics to measure progress and compliance. We need to define our terms clearly: what is “bio-based,” does it not have petroleum in it at all? What does “plant-based” mean? What percentage of a product has to be renewable to be labeled so? If it comes from renewable sources but its end-of-life is not circularizable, can we still call it “green”?
We need incentives for innovation and development that do not entrench a comfortable but unproductive status quo. We need to offer stability to innovators by looking ahead and proactively incubating the standards and regulations that will support safety, security, and intellectual property protection. We should evaluate existing standards and practices for inflexibility: if they only support the current technology and a tradition that has failed to deliver change, they will continue to deliver nothing new as a solution.
We need to get on good footing with workforce development, as well. A truly multidisciplinary effort is critical and will take a while to pull off; it takes at least a decade to turn a high school student into a scientist. I only know of one national graduate fellowship that actually requires awardees to train seriously in more than one discipline. Siloing is a major problem in higher education and therefore in biomanufacturing. What passes for “multidisciplinary” is frequently “I am a computer scientist who is not rude to biologists” or “our company has both a chemical division and an AI division.” A cross-discipline “bilingual” workforce is absolutely critical to reuniting the skill sets needed to advance the bioeconomy. Organizations like BioMADE with serious commitments to developing a biomanufacturing workforce cannot effectively address the educational pipeline without significantly more support.
When we emphasize the collection of data — which data are we talking about? Is the data we have already collected a useful jumping-off point for what comes next? Are the models relevant for foreseeable changes in technology, regulation, and deployment? For some of it, absolutely not. As every responsible machine learning expert can tell you, data collection and curation are not things you want to skimp on. We have to be deliberate about what we collect, and why. Biases cannot all be avoided, but we have to take a beat to evaluate whether extant models, architectures, and sources are relevant, useful, or adaptable. A data model is as subject to the sunk cost fallacy as anything else. There will be pressure to leverage familiar models, and excuses made about the need for speed and the utility of transfer learning. We cannot let volume or nostalgia keep us from taking a sober look at the data and models we currently have, and which ones we actually need to get.
What are the major pain points the biomanufacturing industry is currently facing?
Downstream processing is the work of separating target molecules from the background noise of production. In purely chemical and petrochemical fields, separation processes are well established, extensively characterized, and relatively standardized. This is not the case in industrial biomanufacturing, where upstream flows are arguably more variable and complex than in petrochemicals. Producers on the biomedical side of biomanufacturing who make antibiotics, biologics, and other pharmaceuticals have worked on this problem for a long time. Their products tend to be more expensive and worth specialized handling. The time the field has spent developing the techniques in the urgent pursuit of human health works in their favor for innovation. However, separating fermentation broth from arbitrary commodity molecules is still a major hurdle for a bioindustrial sector already facing so many other simultaneous challenges. Without a robust library of downstream processing methods and a workforce versant in their development and deployment, new industrial products are viewed as significant scaling risks and are funded accordingly.
There is fatigue as well. For the sake of argument, let us peg the onset of the modern era of industrial biomanufacturing to the turn of the latest century. There have been the requisite number of promises any field must make to build itself into prominence, but there has not been the progress that engenders trust in those or future promises. We have touted synthetic biology as the answer for two and a half decades, but our dependence on petroleum for chemicals is as intense as ever. The goodwill we need to shift an entire industry is not a renewable resource. It takes capital, it takes time, and it takes faith that those investments will pay off. But now the chemical companies we need to adopt new solutions have lost some confidence. The policy makers we need to lean into alternative paths and visionary funding are losing trust. If the public, from whence government funding ultimately springs, descends into skepticism, we may lose our chance to pivot and deliver.
This dangerous dearth of confidence can be addressed by doing something difficult: owning up to it. No one has ever said “oh goody — a chance to do a postmortem!” But such introspective exercises are critical to making effective changes. A lack of reflection is a tacit vote for the status quo, which is comfortable because we’re rarely punished for a lack of advocacy. We should commission an honest look at the last thirty years — without judgment, without anger, and without the need to reframe disappointing attempts as fractional successes for granting agencies, or to position singular successes as broadly representative of progress for egos.
Biomanufacturing is so promising! With proper care and attention it will be incredibly transformative. The right investment right now will spell the difference between life and death on this planet for billions of people. We owe it to ourselves and to science to do it right — which we can only do by acknowledging what we need to change and then truly committing to those changes.
Corn sugar tends to be the most utilized biomass in the bioeconomy. What are the issues the U.S. faces if it continues to rely solely on corn sugar as biomass?
History shows that low-volume, high-margin fine chemicals can be made profitable on corn sugar, but high-volume, low-margin commodity chemicals cannot. Projects that produce fine chemicals and pharmaceuticals see commercial success but suffer from feedstock availability and scaling capacity. Success in high-margin markets encourages people to use the exact same technology to attempt low-margin markets, but then they struggle to reduce costs and improve titers. When a commodity chemical endeavor starts to flag, it can pivot to high-margin markets. This is a pattern we see again and again. As long as corn sugar is the default biomass, it will not change; the United States will not be able to replace petrochemicals with biomanufacturing because the price of corn sugar is too high and cannot be technologically reduced. This pattern is also perpetuated because the yeast we usually ask to do biomanufacturing cannot be made to consume anything but corn sugar. We also struggle to produce arbitrary chemicals in scalable amounts from corn sugar. We are stuck in an unproductive reinforcing spiral.
Even if commodity projects could profit using corn sugar, there is not enough to go around. How much corn sugar would we have to use to replace even a fifth of the volume of petroleum commodity chemicals we currently rely on? How much more land, nitrogen, water, and additional carbon emissions would be needed? Would chemical interests begin to overpower food, medical, and energy interests? What if a pathogen or natural disaster wiped out the corn crop for a year or two? Even if we could succeed at manufacturing commodities with corn sugar alone, locking out alternatives makes the United States supply chain brittle and vulnerable.
Continued reliance on corn sugar slows our technological development and stifles innovation. Specialists approaching manufacturing problems in their domain are necessarily forced to adopt the standards of neighboring domains. A chemical engineer is not going to work on separating a biomass into nutrition sources when no microbiologist is offering an organism to adopt it. A molecular biologist is not going to deploy a specialized metabolic pathway dependent on a nutrition source not found in corn sugar. Equipment vendors are not going to design tools at any scale that stray from a market demand overwhelmingly based on the use of corn sugar. Grantors direct funds with the guidance of universities and industry leaders, who are biased towards corn sugar because that’s what they use to generate quick prototypes and spin out new start-up companies.
The result of relying on corn sugar is an entrenched field and consequently we might lose our chance to make a difference. Without introducing low-cost, abundant feedstocks like wastes, we run the risk of disqualifying an entire field of innovation.
What does the U.S. need to do in order for other biomass sources to be utilized beyond corn sugar? Are there ideas (or specific programs) that the U.S. government could supercharge?
Federal agencies must stop funding projects that propose to scale familiar yeasts on corn sugars to produce novel industrial chemicals. We must immediately stop funding biomass conversion projects meant to provide refined sugars to such endeavors. And we must stop any notion of dedicating arable land to corn sugar solely for the purpose of biomanufacturing new industrial products. The math does not and will not work out. The United States must stop throwing money and support at things that seem like they ought to succeed any minute now, even though we have been waiting for that success for 50 years without any meaningful changes in the economic analysis or technology available.
Ironically, we need to take a page from the book that cemented petroleum and car supremacy in this country. We need the kind of inglorious, overlooked, and subsequently taken-for-granted survey that enabled the Eisenhower Interstate System to be built.
We need to characterize all of the non-corn feedstocks and their economic and microbial ecosystems. We need to know how much of each biomass exists, what it is composed of, and who is compiling it, and where. We need to know what organisms rot it and what they produce from it. We need to make all of that data as freely available as possible to lower the barriers of entry for cross-disciplinary teams of researchers and innovators to design and build the necessary logistical, microbiological, chemical, and mechanical infrastructure. We need to prioritize and leverage the complex biomasses that cannot simply be ground into yeast food.
We need to get the lay of the land so – to use the roadway analogy – we know where to pour the asphalt. An example of this sort of effort is the Materials Genome Initiative, which is a crosscutting multi-agency initiative for advancing materials and manufacturing technology. (And which has, to my chagrin, stolen the term “genome” for non-biological purposes.) An even more visible example to the public is a resource like the Plant Hardiness Zone Map that provides a basis for agricultural risk assessment to everyone in the country.
The United States needs to lean into an old strength and fund infrastructure that gives all the relevant specialties the ability to collaborate on truly divergent and innovative biomass efforts. The field of industrial biomanufacturing must make a concerted effort to critically examine a history of failed technical investments, shake off the chains of the status quo, and guide us into true innovation. Infrastructure is not the kind of project that yields an immediate return. If venture capital or philanthropy could do it, they would have already. The United States must flex its unique ability to work on a generational investment timeline; to spend money in the very short term on the right things so as to set everyone up for decades of wildly profitable success — and a safer and more livable planet.