Watch This Space: Looking at the Next Generation of Space Launch Technology
With the news that SpaceX’s Starship is nearing readiness for another test launch, FAS CEO Dan Correa has been thinking more about what its technology could mean for national security, space science, and commercial space activities. Correa believes policymakers should be thinking and talking more about the implications of Starship and other competing space efforts as well. He recently sat down with Karan Kunjur and Neel Kunjur, founders of space technology startup K2 Space, to find out just how big a leap the next generation of launch vehicles will represent.
Dan Correa, FAS CEO: Let’s start with reminding people exactly what SpaceX’s Starship is – and why it could be such a paradigm shifter.
Karan Kunjur, K2 Space Co-founder, CEO: Starship is a next-generation launch vehicle and spacecraft being developed by SpaceX, and when operational it will change the game in space exploration. It’s the largest and most powerful launch system ever developed (150+ tons of payload capacity to LEO) and is intended to be fully reusable.
A single Starship launching at a cadence of three times per week will be capable of delivering more mass to orbit in a year than humanity has launched in all of history.
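To make that concrete, here is a rough back-of-the-envelope check in Python. The 150-ton payload and three-per-week cadence are the figures above; the historical total is an assumed round number, since estimates of all mass ever launched to orbit vary.

```python
# Rough check of the cadence claim (all figures approximate).
payload_per_launch_t = 150    # Starship payload to LEO, per the figures above
launches_per_week = 3         # cadence assumed in the claim
annual_mass_t = payload_per_launch_t * launches_per_week * 52
historical_mass_t = 18_000    # assumed round number for all mass ever launched; not from this interview

print(f"Mass to orbit per year: {annual_mass_t:,} t")        # 23,400 t
print(f"Assumed historical total: {historical_mass_t:,} t")  # a single Starship exceeds it within a year
```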
With Starship-class access to space, we’re about to move from an era of mass constraint to an era of mass abundance. In this new era, what we put in space will look different. The historic trade-off between mass and cost will be flipped on its head, and the optimal spacecraft for science, commercial, and national security missions will change.
DC: Can you be more specific about what types of economic sectors are likely to be affected by Starship and other similar next generation launch vehicles? In other words, is there a broader ecosystem of products and services that you think are likely to emerge to take advantage of Starship or similar capabilities from other companies?
Neel Kunjur, K2 Space Co-founder, CTO: Historically, almost every application in space has been constrained by something commonly known as ‘SWAP’ – Size, Weight and Power. Satellite bus manufacturers have been forced to use expensive, lightweight components specially designed for spacecraft that must fit inside current rockets. Payload designers have been forced to pursue compact sensor designs and complicated, sometimes unreliable deployables. Taken together, these constraints have resulted in lightweight but necessarily expensive vehicles.
A perfect example of this is the James Webb Space Telescope (JWST). In order to fit the required mission capabilities within SWAP constraints, the designers of JWST had to 1) develop a highly complex deployable segmented mirror to fit within the volume budget, 2) use expensive and novel beryllium mirrors to fit within the mass budget, and 3) design low-power instruments and thermal conditioning hardware to fit within the power budget. This kind of complexity dramatically increases the cost of missions.
KK: Exactly. In a world with Starship, things will become significantly simpler. Instead of a complex, unfolding, segmented mirror, you could use a large monolithic mirror. Instead of expensive beryllium mirrors, you could use simpler and cheaper materials with lower stiffness-to-mass ratios, similar to those used in ground-based telescopes. Instead of expensive, power-optimized instruments, additional power could be used to make simpler and cheaper instruments with more robust thermal conditioning capabilities.
The potential for change exists across every type of mission in space. It will become possible to have a satellite bus platform with more power, more payload volume, and more payload mass – but one that comes in at the cost of a small satellite. In a world with launch vehicles like Starship, satellite-based communications providers will be able to use the increased power for greater throughput, remote-sensing players will be able to use more volume for larger apertures, and national security missions will no longer need to trade off between single exquisite satellites and constellations of low-capability small satellites.
DC: Can we get more specific about what we think the new costs would be? If I’m a taxpayer thinking about how my government financially supports space exploration and activity, that’s important. Or even if I’m a philanthropic supporter of space science – it matters. So what are some “back of the envelope” estimates of cost, schedule, and performance of Starship-enabled missions, relative to status quo approaches?
KK: Here’s an example: the MOSAIC (Mars Orbiters for Surface-Atmosphere-Ionosphere Connections) concept, identified as a priority in the National Academies’ 2022 Planetary Decadal Survey, was a 10-satellite constellation to understand the integrated Mars climate system from its shallow ice, up through Mars’ atmospheric layers, and out to the exosphere and space weather environment. The study envisioned deploying one large “mothership” satellite and nine smaller satellites in orbit around Mars using SpaceX’s Falcon Heavy rocket. Development of these spacecraft was expected to cost ~$1B (excluding recommended 50% reserves).
In a world with Starship, the same mission could cost ~$200M in spacecraft costs. With this next-generation launch vehicle, you could launch all 10 large satellites in a single Starship. Each satellite would be redesigned to optimize for Starship’s mass allowance (150 tons), allowing the use of cheaper but heavier materials and components (e.g. aluminum instead of expensive isogrid and composite structures). Each satellite would have more capability in terms of power (20 kW), payload mass, and payload volume than the large “mothership” satellite envisioned in the original MOSAIC study.
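The per-satellite arithmetic, as a minimal sketch (the totals are the figures above; the even split across satellites is an illustrative simplification):

```python
# Per-satellite cost comparison for the MOSAIC example (totals from the discussion above;
# the even split across satellites is an illustrative simplification).
n_sats = 10
status_quo_usd = 1.0e9    # ~$1B spacecraft development cost, excluding 50% reserves
starship_usd = 0.2e9      # ~$200M spacecraft cost in a Starship-era redesign

print(f"Status quo:   ${status_quo_usd / n_sats / 1e6:.0f}M per satellite")  # $100M
print(f"Starship era: ${starship_usd / n_sats / 1e6:.0f}M per satellite")    # $20M
print(f"Reduction:    {status_quo_usd / starship_usd:.0f}x")                 # 5x
```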
DC: You’ve told me that the standardization and modularization possibilities Starship creates for satellites and scientific instruments are crucial. Can you elaborate on that idea?
NK: Longer term, having mass to spare will allow us to do interesting things like over-spec the SWAP capabilities of the satellite bus to meet the requirements of various space science missions – thereby driving standardization. With sufficient SWAP, we could start to include a consistent bundle of instruments (rather than selecting a few to fit within limited SWAP budgets) – reducing the level of customization and non-recurring engineering (NRE) required for each mission.
Although there will always be some level of customization required for each individual scientific mission, the potential to standardize a large portion of the hardware will make it possible to mass produce probes, increasing the potential frequency of missions and reducing the potential cost per mission. Examples here include standardized build-to-print suites of spectrometers, cameras, and particle and field sensors.
DC: What are the implications for the Defense Department? What are some of the important opportunities to deliver capabilities to solve national security problems in less time, at a lower cost, and with greater resilience?
NK: In 2022, the Space Force made resilience its No. 1 priority. One of the ways it hoped to achieve resilience was through the use of cheaper, more quickly deployed satellites. Unfortunately, the only historical path to cheaper and faster satellites was to go smaller, sacrificing capability (e.g. low-cost satellites typically come in at <2 kW of array power).
With Starship, and companies like K2, agencies such as the Department of Defense will have access to larger, more capable satellites that are built more cheaply, more quickly, and with lower NRE. Instead of a single exquisite satellite with 20 kW of power, the DoD will be able to deploy constellations of 40 satellites, each with 20 kW of power, all within a single Starship. With the rise of refueling and next-generation propulsion systems, these high-power constellations will be deployable in higher orbits like Medium Earth Orbit (MEO) and Geostationary Orbit (GEO), providing a much-needed alternative to a potentially crowded Low Earth Orbit (LEO).
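A quick mass-budget check on that 40-satellite example, using the 150-ton figure quoted earlier; the dispenser allowance is an assumption, not a number from this conversation.

```python
# Mass-budget sanity check for a 40-satellite manifest on one Starship.
starship_capacity_t = 150    # payload to LEO, per the figures above
n_sats = 40
dispenser_fraction = 0.10    # assumed mass reserved for dispensers and adapters; not from this interview

usable_t = starship_capacity_t * (1 - dispenser_fraction)
print(f"Allowance per satellite: {usable_t / n_sats:.2f} t")  # ~3.4 t for a 20 kW-class bus
```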
DC: The NASA Commercial Orbital Transportation Services (COTS) program used firm fixed-price milestone payments to solve a problem (delivering cargo and crew to and from the International Space Station) at a fraction of the cost of “business as usual” approaches. NASA also gave companies such as SpaceX greater autonomy with respect to how to solve this problem. What are some key lessons that policy-makers should learn from the NASA COTS program and similar efforts at the Space Development Agency?
KK: The NASA COTS program and SDA have demonstrated that policy can be as effective as technology in driving positive change in space. The move towards firm fixed-price models incentivized reductions in cost and schedule, and pushed commercial entities to be thoughtful about what it would take to deliver against stated mission requirements. The autonomy given to companies like SpaceX was critical to achieving the unprecedented results that were delivered.
Moving forward, other areas that could benefit from this approach include deep space communications infrastructure and space debris identification and remediation.
NK: Take the communications capabilities around Mars. The current infrastructure is aging and throughput-limited – we just have a collection of Mars orbiters that are operating beyond their primary design lifetimes. With the ramp-up of ambitious scientific missions expected to launch over the next decade (including eventual human exploration efforts), this aging infrastructure will be unable to keep up with a potentially exponential increase in data demands.
Rather than addressing this via conventionally procured missions, where the end-to-end mission is prescribed, a new approach that uses mechanisms like data buys or Advance Market Commitments could fit well here. Assigning a price for deployed throughput on a $/Gbps basis – what the U.S. government would be willing to pay, without prescribing how those capabilities are deployed – could result in a cheaper, faster, and more effective solution. Companies could then raise capital against the potential market, build out the infrastructure, and shoulder the majority of the risk, much like any other early-stage venture.
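A minimal sketch of how such a $/Gbps data buy might be structured; the price, cap, and payout function are hypothetical illustrations of the mechanism, not a proposed program.

```python
# Hypothetical $/Gbps data buy: the government commits to a price per unit of demonstrated
# Mars-relay throughput without prescribing how that capacity is deployed.
PRICE_PER_GBPS_YEAR_USD = 5.0e6    # hypothetical committed price per Gbps-year
BUDGET_CAP_GBPS = 20.0             # hypothetical ceiling on purchased capacity

def annual_payout(delivered_gbps: float) -> float:
    """Pay only for throughput actually demonstrated on orbit, up to the cap."""
    return min(delivered_gbps, BUDGET_CAP_GBPS) * PRICE_PER_GBPS_YEAR_USD

print(f"8 Gbps delivered:  ${annual_payout(8.0):,.0f}")   # $40,000,000
print(f"30 Gbps delivered: ${annual_payout(30.0):,.0f}")  # capped at $100,000,000
```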
DC: What new commercial capabilities might Starship unlock? Would any of these capabilities benefit from some government involvement or participation, in the same way that the NASA COTS program helped finance the development of the Falcon 9?
KK: Almost every new commercial space customer has been forced to operate with sub-scale unit economics. Given capital constraints, their only option has been to buy a small satellite and compromise on the power, payload mass or payload volume they actually need. In a world with Starship, commercial players will be able to deploy capable constellations at a fraction of the cost. They’ll be able to multi-manifest in a single Starship, amortizing the cost of launch across their full constellation (instead of just 4-8 satellites). The mass allowance of Starship will make previously infeasible commercial businesses feasible, from large fuel depots, to orbital cargo stations, to massive power plants.
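To illustrate the amortization point, a quick sketch with an assumed launch price (the per-launch figure is hypothetical; the 4-8 vs. full-manifest satellite counts come from the point above):

```python
# Launch-cost amortization: the same launch price spread across more satellites.
launch_price_usd = 100e6    # assumed heavy-lift launch price; not from this interview

for n_sats in (6, 40):      # a typical ride-share today vs. a full Starship manifest
    print(f"{n_sats:>2} satellites -> ${launch_price_usd / n_sats / 1e6:.1f}M launch cost each")
# 6 satellites -> $16.7M each; 40 satellites -> $2.5M each
```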
As we think about development across the solar system, and as future deep space missions increase the demand for data, the lack of comms capabilities beyond the Deep Space Network (DSN) is going to be a limiting factor. A concerted effort to start building these capabilities to handle future data demand could be an interesting candidate for a COTS-like approach.
DC: For policy-makers and program managers who want to learn more about Starship and other similar capabilities, what should they read, and who should they be following?
KK: There are a number of great pieces on the potential of Starship, including:
- Starship will be the biggest rocket ever. Are space scientists ready to take advantage of it?
- Starship is still not understood
- Accelerating Astrophysics with SpaceX
DC: Great recommendations. Thanks to you both for chatting.
KK: Thank you.
NK: Thanks.
“The US needs to lean into an old strength”: Maintaining Progress and Growing US Biomanufacturing
The U.S. bioeconomy has been surging forward, charged by Presidential Executive Order 14081 and the CHIPS and Science Act. However, many difficult challenges lie ahead for the U.S. bioeconomy, including for U.S. biomanufacturing capabilities. U.S. biomanufacturing has been grappling with issues in fermentation capacity, including challenges related to scale-up, inconsistent supply chains, and downstream processing. While the U.S. government works on clearing these roadblocks, it will be important to bring industry perspectives into the conversation to craft solutions that not only address the current set of issues but also look to mitigate challenges that may arise in the future.
To get a better understanding of industry perspectives on the U.S. bioeconomy and the U.S. biomanufacturing sector, the Federation of American Scientists interviewed Dr. Sarah Richardson, the CEO of MicroByre. MicroByre is a climate-focused biotech startup that develops specialized bacteria tailored to the specific fermentation needs of its clients. Dr. Richardson received her B.S. in biology from the University of Maryland in 2004 and a Ph.D. in human genetics and molecular biology from Johns Hopkins University School of Medicine in 2011. Her extensive training in computational and molecular biology has given her a unique perspective on emerging technologies enabled by synthetic biology.
FAS: The U.S. Government is focused on increasing fermentation capacity, including scale-up, and creating a resilient supply chain. In your opinion, are there specific areas in the supply chain and in scale-up that need more attention?
Dr. Sarah Richardson: The pandemic had such an impact on supply chains that everyone is reevaluating the centralization of critical manufacturing. The United States got the CHIPS and Science Act to invest in domestic semiconductor manufacturing. The voting public realized that almost every need they had required circuits. Shortages in pharmaceuticals are slowly raising awareness of chemical and biomedical manufacturing vulnerabilities as well. The public has even less insight into vulnerabilities in industrial biomanufacturing, so it is important that our elected officials are proactive with things like Executive Order 14081.
When we talk about supply chains we usually mean the sourcing and transfer of raw, intermediate, and finished materials — the flow of goods. We achieve robustness by having alternative suppliers, stockpiles, and exacting resource management. For biomanufacturing, an oft-raised supply chain concern is feedstock. I can and will expound on this, but securing a supply of corn sugar is not the right long-term play here. Shoring up corn sugar supplies will not have a meaningful impact on industrial biomanufacturing, and it should be prioritized in that light.
Biomanufacturing efforts differ from the long-standing production of consumer goods in that they are heavily tied to a scientific vendor market. As we scale to production, part of our supply chain is a lot of sterile plastic disposable consumables. We compete with biomedical sectors for those, for personal protective equipment, and for other appliances. This supply chain issue has squeezed not just biomanufacturing, but scientific research in general.
We also need something that isn’t always thought of as part of the supply chain: specialized infrastructural hardware, which may not be manufactured domestically. Access to scale-up fermentation vessels is already squeezed. The other problem is that no matter where you build them, these vessels are designed for the deployment of canonical feedstocks and yeasts. Addressing the manufacturing locale would offer us the chance to innovate in vessel and process design and to support the kinds of novel fermentations on alternate feedstocks that are needed to advance industrial biomanufacturing. There are righteous calls for the construction of new pilot plants. We should make sure that we take the opportunity to build for the right future.
One of the indisputable strengths of biomanufacturing is the potential for decentralization! Look at microbrewing: fermentation can happen anywhere without country-spanning feedstock pipelines. As we onboard overlooked feedstocks, it may only be practical to leverage them if some fermentation happens locally. As we look at supply chains and scale up we should model what that might look like for manufacturing, feedstock supply chains, and downstream processing. Not just at a national level, but at regional and local scales as well.
FAS: There are a lot of immediate policy needs for the bioeconomy, many of which are outlined in Executive Order 14081. How should these immediate needs be balanced with long-term needs? Is there a trade-off?
Counterintuitively, the most immediate needs will have the most distant payoffs! The tradeoff is that we can’t have every single detail nailed down before work begins. We will have to build tactically for strategic flexibility. Climate change and manufacturing robustness are life-or-death problems. We need to be open to more creative solutions in funding methods and timeline expectations; in who comes to the table, in who around the table is given the power to effect change, and in messaging! The comfortable, familiar, traditional modes of action and funding have failed to accelerate our response to this crisis.
We have to get started on regulation yesterday, because the only thing that moves slower than technology is policy. We need to agree on meaningful, aggressive, and potentially unflattering metrics to measure progress and compliance. We need to define our terms clearly: what is “bio-based,” does it not have petroleum in it at all? What does “plant-based” mean? What percentage of a product has to be renewable to be labeled so? If it comes from renewable sources but its end-of-life is not circularizable, can we still call it “green”?
We need incentives for innovation and development that do not entrench a comfortable but unproductive status quo. We need to offer stability to innovators by looking ahead and proactively incubating the standards and regulations that will support safety, security, and intellectual property protection. We should evaluate existing standards and practices for inflexibility: if they only support the current technology and a tradition that has failed to deliver change, they will continue to deliver nothing new as a solution.
We need to get on good footing with workforce development, as well. A truly multidisciplinary effort is critical and will take a while to pull off; it takes at least a decade to turn a high school student into a scientist. I only know of one national graduate fellowship that actually requires awardees to train seriously in more than one discipline. Siloing is a major problem in higher education and therefore in biomanufacturing. What passes for “multidisciplinary” is frequently “I am a computer scientist who is not rude to biologists” or “our company has both a chemical division and an AI division.” A cross-discipline “bilingual” workforce is absolutely critical to reuniting the skill sets needed to advance the bioeconomy. Organizations like BioMADE with serious commitments to developing a biomanufacturing workforce cannot effectively address the educational pipeline without significantly more support.

MicroByre is working to advance alternatives to substrates currently favored by the bioeconomy.
When we emphasize the collection of data — which data are we talking about? Is the data we have collected already a useful jumping-off point for what comes next? Are the models relevant for foreseeable changes in technology, regulation, and deployment? For some of it, absolutely not. As every responsible machine learning expert can tell you, data is not something you want to skimp on when collecting or curating. We have to be deliberate about what we collect, and why. Biases cannot all be avoided, but we have to take a beat to evaluate whether extant models, architectures, and sources are relevant, useful, or adaptable. A data model is as subject to the sunk cost fallacy as anything else. There will be pressure to leverage familiar models, and excuses made about the need for speed and the utility of transfer learning. We cannot let volume or nostalgia keep us from taking a sober look at the data and models we currently have, and which ones we actually need to get.
FAS: What are the major pain points the biomanufacturing industry is currently facing?
Downstream processing is the work of separating target molecules from the background noise of production. In purely chemical and petrochemical fields, separation processes are well established, extensively characterized, and relatively standardized. This is not the case in industrial biomanufacturing, where upstream flows are arguably more variable and complex than in petrochemicals. Producers on the biomedical side of biomanufacturing who make antibiotics, biologics, and other pharmaceuticals have worked on this problem for a long time. Their products tend to be more expensive and worth specialized handling. The time the field has spent developing these techniques in the urgent pursuit of human health works in their favor for innovation. However, separating arbitrary commodity molecules from fermentation broth is still a major hurdle for a bioindustrial sector already facing so many other simultaneous challenges. Without a robust library of downstream processing methods and a workforce versed in their development and deployment, new industrial products are viewed as significant scaling risks and are funded accordingly.
There is fatigue as well. For the sake of argument, let us peg the onset of the modern era of industrial biomanufacturing to the turn of the latest century. There have been the requisite number of promises any field must make to build itself into prominence, but there has not been the progress that engenders trust in those or future promises. We have touted synthetic biology as the answer for two and a half decades, but our dependence on petroleum for chemicals is as intense as ever. The goodwill we need to shift an entire industry is not a renewable resource. It takes capital, it takes time, and it takes faith that those investments will pay off. But now the chemical companies we need to adopt new solutions have lost some confidence. The policy makers we need to lean into alternative paths and visionary funding are losing trust. If the public whence government funding ultimately springs descends into skepticism, we may lose our chance to pivot and deliver.
This dangerous dearth of confidence can be addressed by doing something difficult: owning up to it. No one has ever said “oh goody — a chance to do a postmortem!” But such introspective exercises are critical to making effective changes. A lack of reflection is a tacit vote for the status quo, which is comfortable because we are rarely punished for a lack of advocacy. We should commission an honest look at the last thirty years — without judgment, without anger, and without the need to reframe disappointing attempts as fractional successes to please granting agencies, or to position singular successes as broadly representative of progress to flatter egos.
Biomanufacturing is so promising! With proper care and attention it will be incredibly transformative. The right investment right now will spell the difference between life and death on this planet for billions of people. We owe it to ourselves and to science to do it right — which we can only do by acknowledging what we need to change and then truly committing to those changes.
FAS: Corn sugar tends to be the most utilized biomass in the bioeconomy. What are the issues the U.S. faces if it continues to rely solely on corn sugar as biomass?
History shows that low-volume, high-margin fine chemicals can be made profitable on corn sugar, but high-volume, low-margin commodity chemicals cannot. Projects that produce fine chemicals and pharmaceuticals see commercial success but suffer from feedstock availability and scaling capacity. Success in high-margin markets encourages people to use the exact same technology to attempt low-margin markets, but then they struggle to reduce costs and improve titers. When a commodity chemical endeavor starts to flag, it can pivot to high-margin markets. This is a pattern we see again and again. As long as corn sugar is the default biomass, it will not change; the United States will not be able to replace petrochemicals with biomanufacturing because the price of corn sugar is too high and cannot be technologically reduced. This pattern is also perpetuated because the yeast we usually ask to do biomanufacturing cannot be made to consume anything but corn sugar. We also struggle to produce arbitrary chemicals in scalable amounts from corn sugar. We are stuck in an unproductive reinforcing spiral.
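To see why the margins fail, consider a worked example. Every number here is a hypothetical chosen for illustration, not real pricing or yield data:

```python
# Hypothetical feedstock economics (every number is an assumption for illustration).
sugar_cost_usd_per_kg = 0.40    # assumed corn sugar price
product_yield = 0.30            # assumed kg of product per kg of sugar consumed

feedstock_cost = sugar_cost_usd_per_kg / product_yield  # ~$1.33 of sugar per kg of product
for name, sale_price in (("fine chemical", 100.0), ("commodity chemical", 1.0)):
    print(f"{name}: feedstock alone is {feedstock_cost / sale_price:.0%} of a ${sale_price:.2f}/kg sale price")
# fine chemical: ~1% of sale price; commodity chemical: ~133% -- underwater before any other cost
```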
Even if commodity projects could profit using corn sugar, there is not enough to go around. How much corn sugar would we have to use to replace even a fifth of the volume of petroleum commodity chemicals we currently rely on? How much more land, nitrogen, water, and additional carbon emissions would be needed? Would chemical interests begin to overpower food, medical, and energy interests? What if a pathogen or natural disaster wiped out the corn crop for a year or two? Even if we could succeed at manufacturing commodities with corn sugar alone, locking out alternatives makes the United States supply chain brittle and vulnerable.
Continued reliance on corn sugar slows our technological development and stifles innovation. Specialists approaching manufacturing problems in their own domain are necessarily forced to adopt the standards of neighboring domains. A chemical engineer is not going to work on separating a biomass into nutrition sources when no microbiologist is offering an organism to consume it. A molecular biologist is not going to deploy a specialized metabolic pathway dependent on a nutrition source not found in corn sugar. Equipment vendors are not going to design tools at any scale that stray from a market demand overwhelmingly based on the use of corn sugar. Grantors direct funds with the guidance of universities and industry leaders, who are biased towards corn sugar because that is what they use to generate quick prototypes and spin out new startup companies.
The result of relying on corn sugar is an entrenched field and consequently we might lose our chance to make a difference. Without introducing low-cost, abundant feedstocks like wastes, we run the risk of disqualifying an entire field of innovation.
FAS: What does the U.S. need to do in order for other biomass sources to be utilized beyond corn sugar? Are there ideas (or specific programs) that the U.S. government could supercharge?
Federal agencies must stop funding projects that propose to scale familiar yeasts on corn sugars to produce novel industrial chemicals. We must immediately stop funding biomass conversion projects meant to provide refined sugars to such endeavors. And we must stop any notion of dedicating arable land to corn solely for the purposes of biomanufacturing new industrial products. The math does not and will not work out. The United States must stop throwing money and support at things that seem like they ought to succeed any minute now, even though we have been waiting for that success for 50 years without any meaningful change in the economic analysis or the technology available.
Ironically, we need to take a page from the book that cemented petroleum and car supremacy in this country. We need to do the kind of inglorious, overlooked, and subsequently taken-for-granted survey that enabled the Eisenhower Interstate System to be built.
We need to characterize all of the non-corn feedstocks and their economic and microbial ecosystems. We need to know how much of each biomass exists, what it is composed of, who is compiling it, and where. We need to know what organisms rot it and what they produce from it. We need to make all of that data as freely available as possible to lower the barriers to entry for cross-disciplinary teams of researchers and innovators to design and build the necessary logistical, microbiological, chemical, and mechanical infrastructure. We need to prioritize and leverage the complex biomasses that cannot just be ground into yeast food.
We need to get the lay of the land so – to use the roadway analogy – we know where to pour the asphalt. An example of this sort of effort is the Materials Genome Initiative, which is a crosscutting multi-agency initiative for advancing materials and manufacturing technology. (And which has, to my chagrin, stolen the term “genome” for non-biological purposes.) An even more visible example to the public is a resource like the Plant Hardiness Zone Map that provides a basis for agricultural risk assessment to everyone in the country.
The United States needs to lean into an old strength and fund infrastructure that gives all the relevant specialties the ability to collaborate on truly divergent and innovative biomass efforts. The field of industrial biomanufacturing must make a concerted effort to critically examine a history of failed technical investments, shake off the chains of the status quo, and guide us into true innovation. Infrastructure is not the kind of project that yields an immediate return. If venture capital or philanthropy could do it, they would have already. The United States must flex its unique ability to work on a generational investment timeline; to spend money in the very short term on the right things so as to set everyone up for decades of wildly profitable success — and a safer and more livable planet.