Ahead of the Wildland Fire Commission Report Release, a Roundup of FAS’s Efforts to Provide Input on Wildland Fire Policy

FAS is committed to producing science-based policy recommendations that improve people’s lives – and over the last year we’ve devoted considerable effort to understanding wildfire in the context of U.S. federal policy. We hope that some of our work will be reflected in the forthcoming Congressionally authorized Wildland Fire Mitigation and Management Commission Report, anticipated for release in the coming days.

Over the past year, together with partner organizations COMPASS, Conservation X Labs, and the California Council on Science and Technology, FAS recruited diverse experts to participate in our Wildland Fire Policy Accelerator to develop actionable ideas, presented in a concise format, to inform the work of the Commission.  

In early 2023, FAS hosted a convening that provided stakeholders (including Commission members and Accelerator participants) from the science, technology, and policy communities with an opportunity to exchange forward-looking ideas with the shared goal of improving the federal government’s approach to managing wildland fire. 

Wildland Fire Policy Accelerator: Developing Actionable Recommendations

Working together, FAS and Accelerator participants produced policy recommendations that provide targeted suggestions for addressing wildfire challenges in several domains, including landscapes and communities; public health and infrastructure; science, data, and technology; and the workforce. With climate change worsening wildfire impacts, a holistic overhaul of wildland fire policy is urgent, but also within reach, if policymakers work collaboratively to implement a broad suite of changes.

FAS Wildland Fire Policy Memos

Effective wildfire management will require a thoughtful, multi-pronged approach, as detailed in these memos by Accelerator participants.

Wildland Fire in Context: Ensuring Broad Perspectives Are Incorporated 

Funding and Implementation of Wildland Fire Programs: Mapping the Landscape 

Our wildland fire work is not finished. We look forward to reviewing the policy recommendations of the forthcoming Commission Report and helping to amplify its messages in the halls of Congress and in federal agencies. We applaud the efforts of the Commission to incorporate and synthesize diverse perspectives over an incredibly short time frame, and we hope that the report will be as robust and comprehensive as is required to improve how we live with wildland fire. Our staff, partners, engaged subject matter experts, and others are sure to have thoughts, which we look forward to sharing in the coming weeks. Stay tuned.

It’s Time to Move Towards Movement as Medicine

For over 10 years, physical inactivity has been recognized as a global pandemic with widespread health, economic, and social impacts. Despite the wealth of research supporting movement as medicine, financial and environmental barriers limit the implementation of physical activity interventions and prevention efforts. The need to translate research findings into policies that promote physical activity has never been greater, as the aging population in the U.S. and worldwide is expected to increase the prevalence of chronic medical conditions, many of which can be prevented or treated with physical activity. Action at the federal and local levels is needed to promote health across the lifespan through movement.

Research Clearly Shows the Benefits of Movement for Health

Movement is one of the most important keys to health. Exercise benefits heart health and physical functioning, such as muscle strength, flexibility, and balance. But many people are unaware that physical activity is closely tied to the health conditions they fear most. Of the top five health conditions that people reported being afraid of in a recent survey conducted by the Centers for Disease Control and Prevention (CDC), the risk for four—cancer, Alzheimer’s disease, heart disease, and stroke—is increased by physical inactivity. It’s not only physical health that is impacted by movement, but also mental health and other aspects of brain health. Research shows exercise is effective in treating and preventing mental health conditions such as depression and anxiety, rates of which have skyrocketed in recent years, now impacting nearly one-third of adults in the U.S. Physical fitness also directly impacts the brain itself, for example, by boosting its ability to regenerate after injury and improving memory and cognitive functioning. The scientific evidence is clear: Movement, whether through structured exercise or general physical activity in everyday life, has a major impact on the health of individuals and as a result, on the health of societies.

Movement Is Not Just about Weight, It’s about Overall Lifelong Health

There is increasing recognition that movement is important for more than weight loss, which was the primary focus in the past. Overall health and stress relief are often cited as motivations for exercise, in addition to weight loss and physical appearance. This shift in perspective reflects the growing scientific evidence that physical activity is essential for overall physical and mental health. Research also shows that physical activity is not only an important component of physical and mental health treatment, but it can also help prevent disease, injury, and disability and lower the risk for premature death. The focus on prevention is particularly important for conditions such as Alzheimer’s disease and other types of dementia that have no known cure. A prevention mindset requires a lifespan perspective, as physical activity and other healthy lifestyle behaviors such as good nutrition earlier in life impact health later in life.

Despite the Research, Americans Are Not Moving Enough

Even with so much data linking movement to better health outcomes, the U.S. is part of what has been described as a global pandemic of physical inactivity. A national CDC survey published in 2022 found that 25.3% of Americans reported that, outside of their regular job, they had not participated in any physical activity, such as walking, golfing, or gardening, in the previous month. Rates of physical inactivity were even higher among Black and Hispanic adults, at 30% and 32%, respectively. Another survey highlighted rural-urban differences in the number of Americans who meet CDC physical activity guidelines, which recommend ≥ 150 minutes per week of moderate-intensity aerobic exercise and ≥ 2 days per week of muscle-strengthening exercise. Respondents in large metropolitan areas were the most active, yet only 27.8% met both the aerobic and muscle-strengthening guidelines. Even fewer people (16.1%) in non-metropolitan areas met the guidelines.
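
To make the two-part threshold concrete, here is a minimal sketch in Python of how a week of self-reported activity could be checked against the guideline; the function name and sample data are hypothetical, not part of any CDC tool.

```python
# Minimal sketch: check one week of self-reported activity against the CDC
# guideline of >=150 minutes of moderate-intensity aerobic activity and
# >=2 days of muscle-strengthening activity. Sample data are hypothetical.

def meets_cdc_guidelines(aerobic_minutes_per_day, strength_days):
    """Return True if the week meets both the aerobic and strength guidelines."""
    total_aerobic = sum(aerobic_minutes_per_day)
    return total_aerobic >= 150 and strength_days >= 2

# Example week: five 30-minute brisk walks plus two strength-training days.
week = [30, 0, 30, 30, 0, 30, 30]
print(meets_cdc_guidelines(week, strength_days=2))  # -> True
```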

Why are so many Americans sedentary? The COVID-19 pandemic certainly exacerbated the problem; however, data from 2010 showed similar rates of physical inactivity, suggesting long-standing patterns of sedentary behavior in the country. Some of the barriers to physical activity are internal to the individual, such as lack of time, motivation, or energy. But other barriers are societal, at both the community and federal level. At the community level, barriers include transportation, affordability, lack of available programs, and limited access to high-quality facilities. Many of these barriers disproportionately impact communities of color and people with low income, who are more likely to live in environments that limit physical activity due to factors such as accessibility of parks, sidewalks, and recreation facilities; traffic; crime; and pollution. Action at the state and federal government level could address many of these environmental barriers, as well as financial barriers that limit access to exercise facilities and programs.

Physical Inactivity Takes a Toll on the Healthcare System and the Economy

Aside from a moral responsibility to promote the health of its citizens, the government has a financial stake in promoting movement in American society. According to recent analyses, inactive lifestyles cost the U.S. economy an estimated $28 billion each year in medical expenses and lost productivity. Physical inactivity is directly related to the non-communicable diseases that place the highest burden on the economy, such as hypertension, heart disease, and obesity. In 2016, these types of modifiable risk factors accounted for 27% of total healthcare spending. These costs are mostly driven by older adults, which highlights the increasing urgency of addressing physical inactivity as the population ages. Physical activity is also related to healthcare costs at an individual level, with savings ranging from 9% to 26.6% for physically active people, even after accounting for increased costs due to longevity and injuries related to physical activity. Analysis of 2012 data from the Agency for Healthcare Research and Quality’s Medical Expenditure Panel Survey (MEPS) found that each year, people who met World Health Organization aerobic exercise guidelines, which correspond with CDC guidelines, paid on average $2,500 less in healthcare expenses related to heart disease alone compared to people who did not meet the recommended activity levels. Changes are needed at the federal, state, and local levels to promote movement as medicine. If physical activity patterns do not change by 2030, it is estimated that an additional $301.8 billion in direct healthcare costs will be incurred.

Government Agencies Can Play a Role in Better Promoting Physical Activity Programs

Promoting physical activity in the community requires education, resources, and the removal of barriers so that programs can reach all citizens, including communities that are disproportionately impacted by the pandemic of physical inactivity. Integrated efforts from multiple agencies within the federal government are essential.

Past initiatives have met with varying levels of success. For example, Let’s Move!, a campaign initiated by First Lady Michelle Obama in 2010, sought to address the problem of childhood obesity by increasing physical activity and healthy eating, among other strategies. The Food and Drug Administration, Department of Agriculture, Department of Health and Human Services (including the Centers for Disease Control and Prevention), and Department of the Interior were among the federal agencies that collaborated with state and local governments, schools, advocacy groups, community-based organizations, and private sector companies. The program helped improve the healthy food landscape, increased opportunities for children to be more physically active, and supported healthier lifestyles at the community level. However, overall rates of childhood obesity have remained constant or even increased in some age brackets since the program started, and there is no evidence of an overall increase in physical activity levels in children and adolescents since that time.

More recently, the U.S. Office of Disease Prevention and Health Promotion’s Healthy People 2030 campaign established data-driven national objectives to improve the health and well-being of Americans. The campaign was led by the Federal Interagency Workgroup, which includes representatives from several federal agencies, including the U.S. Department of Health and Human Services, the U.S. Department of Agriculture, and the U.S. Department of Education. One of the campaign’s leading health indicators—a small subset of high-priority objectives—is increasing the proportion of adults who meet current minimum guidelines for aerobic physical activity and muscle-strengthening activity from 25.2% in 2020 to 29.7% by 2030. There are also movement-related objectives focused on children and adolescents as well as older adults, for example:

Unfortunately, there is currently no evidence of improvement in any of these objectives. All of the objectives related to physical activity with available follow-up data either show little or no detectable change, or they are getting worse.

To make progress towards the physical activity goals established by the Healthy People 2030 campaign, it will be important to identify where breakdowns in communication and implementation may have occurred, whether it be between federal agencies, between federal and local organizations, or between local organizations and citizens. Challenges brought on by the COVID-19 pandemic (e.g., less movement outside of the house for people who now work from home) will also need to be addressed, with the recognition that many of these challenges will likely persist for years to come. Critically, financial barriers should be reduced in a variety of ways, including more expansive coverage by the Centers for Medicare & Medicaid Services for exercise interventions as well as exercise for prevention. Policies that reflect a recognition of movement as medicine have the potential to improve the physical and mental health of Americans and address health inequities, all while boosting the health of the economy.

Watch This Space: Looking at the Next Generation of Space Launch Technology

With the news that SpaceX’s Starship is nearing readiness for another test launch, FAS CEO Dan Correa has been thinking more about what its technology could mean for national security, space science, and commercial space activities. Correa believes policymakers should be thinking and talking more about the implications of Starship and other competing space efforts as well. He recently sat down with Karan Kunjur and Neel Kunjur, founders of space technology startup K2 Space, to find out just how big of a leap the next generation of launch vehicles will represent.

Dan Correa, FAS CEO: Let’s start by reminding people exactly what SpaceX’s Starship is – and why it could be such a paradigm shifter.

Karan Kunjur, K2 Space Co-founder, CEO: Starship is a next generation launch vehicle and spacecraft being developed by SpaceX, and when operational it will change the game in space exploration. It’s the largest and most powerful launch system ever developed (150+ tons of payload capacity to LEO) and is intended to be fully reusable.

A single Starship launching at a cadence of three times per week will be capable of delivering more mass to orbit in a year than humanity has launched in all of history. 
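
As a rough back-of-the-envelope check on that claim, here is a sketch under stated assumptions (not SpaceX figures): the historical total mass launched to orbit is an illustrative, unsourced estimate.

```python
# Back-of-the-envelope check of the "more mass in a year than all of history"
# claim. Per-launch payload and cadence come from the interview; the historical
# total is an illustrative assumption, not a sourced figure.

payload_per_launch_t = 150        # Starship payload to LEO, per the interview
launches_per_week = 3
weeks_per_year = 52

annual_mass_t = payload_per_launch_t * launches_per_week * weeks_per_year
print(f"Mass to orbit per year: {annual_mass_t:,} t")   # 23,400 t

# Published estimates of all mass ever launched to orbit are on the order of
# tens of thousands of tons; 18,000 t is used here purely for illustration.
historical_total_t = 18_000
print(annual_mass_t > historical_total_t)               # True
```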

With Starship-class access to space, we’re about to move from an era of mass constraint to an era of mass abundance. In this new era, what we put in space will look different. The historic trade-offs between mass and cost will be flipped on their heads, and the optimal spacecraft required for science, commercial, and national security missions will change.

DC:  Can you be more specific about what types of economic sectors are likely to be affected by Starship and other similar next generation launch vehicles? In other words, is there a broader ecosystem of products and services that you think are likely to emerge to take advantage of Starship or similar capabilities from other companies?

Neel Kunjur, K2 Space Co-founder, CTO: Historically, almost every application in space has been constrained by something commonly known as ‘SWAP’ – Size, Weight, and Power. Satellite bus manufacturers have been forced to use expensive, lightweight components specially designed for spacecraft that need to fit inside current rockets. Payload designers have been forced to pursue compact sensor designs and complicated, sometimes unreliable deployables. Taken together, these constraints have resulted in lightweight but necessarily expensive vehicles.

A perfect example of this is the James Webb Space Telescope (JWST). In order to fit required mission capabilities within SWAP constraints, the designers of JWST had to 1) Develop a highly complex deployable segmented mirror to fit within the volume budget, 2) Use expensive and novel Beryllium mirrors to fit within the mass budget, and 3) Design low power instruments and thermal conditioning hardware to fit within the power budget. This kind of complexity dramatically increases the cost of missions. 

KK: Exactly. In a world with Starship, things will become significantly simpler. Instead of a complex, unfolding, segmented mirror, you could use a large monolithic mirror. Instead of expensive Beryllium mirrors, you could use simpler and cheaper materials with lower stiffness-to-mass ratios, similar to those used in ground-based telescopes. Instead of expensive, power-optimized instruments, additional power could be used to make simpler and cheaper instruments with more robust thermal conditioning capabilities.    

The potential for change exists across every type of mission in space. It will become possible to have a satellite bus platform that has more power, more payload volume, and more payload mass – but one that comes in at the cost of a small satellite. In a world with launch vehicles like Starship, satellite-based communications providers will be able to use the increased power to have greater throughput, remote-sensing players will be able to use more volume to have larger apertures, and national security missions will no longer need to make the trade-off between single exquisite satellites and constellations of low-capability small satellites.

DC: Can we get more specific about what we think the new costs would be? If I’m a taxpayer thinking about how my government financially supports space exploration and activity, that’s important. Or even if I’m a philanthropic supporter of space science – it matters. So what are some “back of the envelope” estimates of cost, schedule, and performance of Starship-enabled missions, relative to status quo approaches?

KK: Here’s an example: the MOSAIC (Mars Orbiters for Surface-Atmosphere-Ionosphere Connections) concept, identified as a priority in the National Academies’ 2022 Planetary Decadal Survey, was a 10-satellite constellation to understand the integrated Mars climate system from its shallow ice, up through Mars’ atmospheric layers, and out to the exosphere and space weather environment. The study envisioned deploying one large “mothership” satellite and nine smaller satellites in orbit around Mars using SpaceX’s Falcon Heavy Rocket. Development of these spacecraft was expected to cost ~$1B (excluding recommended 50% reserves). 

In a world with Starship, the same mission could cost $200M in spacecraft costs. With this next generation launch vehicle, you could launch 10 large satellites in a single Starship. Each satellite would be redesigned to optimize for Starship’s mass allowance (150 tons), allowing the use of cheaper but heavier materials and components (e.g., aluminum instead of expensive isogrid and composite structures). Each satellite would have more capability in terms of power (20kW), payload mass, and payload volume than the large “mothership” satellite envisioned in the original MOSAIC study.
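
A hedged back-of-the-envelope comparison using only the two figures cited in this exchange (spacecraft costs only; launch, operations, and reserves are excluded, and the per-satellite averages are simplifications):

```python
# Rough comparison of the MOSAIC spacecraft-cost figures cited above.
# The baseline ~$1B covered one "mothership" plus nine smaller satellites,
# so per-satellite values here are simple averages for illustration.

baseline_cost = 1_000_000_000    # ~$1B spacecraft development (Falcon Heavy-era design)
starship_cost = 200_000_000      # ~$200M spacecraft cost in a Starship-enabled design
satellites = 10

print(f"Average per-satellite cost (baseline): ${baseline_cost / satellites / 1e6:.0f}M")  # ~$100M
print(f"Average per-satellite cost (Starship): ${starship_cost / satellites / 1e6:.0f}M")  # ~$20M
print(f"Spacecraft-cost reduction: {1 - starship_cost / baseline_cost:.0%}")               # 80%
```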

DC: You’ve told me that the standardization and modularization possibilities Starship creates for satellites and scientific instruments are crucial. Can you elaborate on that idea?

NK: Longer term, having mass to spare will allow us to do interesting things like over-spec the SWAP capabilities of the satellite bus to meet the requirements of various space science missions – thereby driving standardization. With sufficient SWAP, we could start to include a consistent bundle of instruments (rather than selecting a few to fit within limited SWAP budgets) – reducing the level of customization and non-recurring engineering (NRE) required for each mission.

Although there will always be some level of customization required for each individual scientific mission, the potential to standardize a large portion of the hardware will make it possible to mass produce probes, increasing the potential frequency of missions and reducing the potential cost per mission. Examples here include standardized build-to-print suites of spectrometers, cameras, and particle and field sensors.

DC:  What are the implications for the Defense Department?  What are some of the important opportunities to deliver capabilities to solve national security problems in less time, at a lower cost, and with greater resilience?

NK: In 2022, the Space Force made resilience its No. 1 priority. One of the ways it hoped to achieve resilience was through the use of cheaper, more quickly deployed satellites. Unfortunately, the only historical path to going cheaper and faster was by going smaller, thereby sacrificing capabilities (e.g., low-cost satellites typically come in at under 2kW of array power).

With Starship, and companies like K2, agencies such as the Department of Defense will have access to larger, more capable satellites that are built more cheaply, faster, and with lower NRE. Instead of a single exquisite satellite with 20kW of power, the DoD will be able to deploy constellations of 40 satellites, each with 20kW of power, all within a single Starship. With the rise of refueling and next generation propulsion systems, these high-power constellations will be deployable in higher orbits like Medium Earth Orbit (MEO) and Geostationary Orbit (GEO), providing a much needed alternative to a potentially crowded Low Earth Orbit (LEO).
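
To put rough numbers on that manifest, here is an illustrative sketch based only on the figures cited in the interview; the per-satellite mass budget is a derived assumption, not a K2 or DoD specification.

```python
# Illustrative budget for a 40-satellite, single-Starship manifest.
# Inputs come from the interview; everything derived here is an assumption.

starship_capacity_t = 150      # payload to LEO cited earlier in the interview
satellites = 40
power_per_satellite_kw = 20

mass_budget_per_sat_t = starship_capacity_t / satellites
total_constellation_power_kw = satellites * power_per_satellite_kw

print(f"Average mass budget per satellite: {mass_budget_per_sat_t:.2f} t")    # 3.75 t
print(f"Total constellation power: {total_constellation_power_kw} kW")        # 800 kW
```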

DC: The NASA Commercial Orbital Transportation Services (COTS) program used firm, fixed-price milestone payments to solve a problem (deliver and retrieve cargo and crew to the International Space Station) at a fraction of the cost of “business as usual” approaches. NASA also gave companies such as SpaceX greater autonomy with respect to how to solve this problem. What are some key lessons that policy-makers should learn from the NASA COTS program and similar efforts at the Space Development Agency?

KK: The NASA COTS program and SDA have demonstrated that policy can be as effective as technology in driving positive change in space. The move towards firm, fixed-price models incentivized reductions in cost and time, and pushed commercial entities to be thoughtful about what it would take to deliver against stated mission requirements. The autonomy given to companies like SpaceX was critical to achieving the unprecedented results that were delivered.

Moving forward, other areas that could benefit from this approach include deep space communications infrastructure and space debris identification and remediation. 

NK: Take the communications capabilities around Mars. The current infrastructure is aging and throughput-limited – we just have a collection of Mars orbiters that are operating beyond their primary design lifetimes. With the ramp-up of ambitious scientific missions expected to launch over the next decade (including eventual human exploration efforts), this aging infrastructure will be unable to keep up with a potentially exponential increase in data demands.

Rather than addressing this via conventional missions, where the end-to-end mission is prescribed, a new approach that uses mechanisms like data buys or Advance Market Commitments could fit well here. Assigning a price for throughput deployed on a $/Gbps basis – what the U.S. government would be willing to pay, without actually prescribing how those capabilities are deployed – could result in a cheaper, faster, and more effective solution. Companies could then raise capital against the potential market, build out the infrastructure, and shoulder a majority of the risk, much like any other early stage venture.
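
As an illustration of how such a throughput data buy could be framed, here is a minimal sketch with entirely hypothetical numbers; the interview does not specify prices, capacities, or contract structure.

```python
# Hypothetical sketch of a "$/Gbps" data-buy payment: the government assigns a
# price per Gbps of deployed, verified throughput and pays for what is delivered,
# without prescribing how the capability is built. All numbers are placeholders.

def payment_for_throughput(deployed_gbps: float, price_per_gbps: float) -> float:
    return deployed_gbps * price_per_gbps

# Example with made-up figures: 2 Gbps of Mars relay capacity priced at $50M/Gbps.
print(f"${payment_for_throughput(2.0, 50e6):,.0f} per year")  # $100,000,000 per year
```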

DC: What new commercial capabilities might Starship unlock? Would any of these capabilities benefit from some government involvement or participation, in the same way that the NASA COTS program helped finance the development of the Falcon 9?

KK: Almost every new commercial space customer has been forced to operate with sub-scale unit economics. Given capital constraints, their only option has been to buy a small satellite and compromise on the power, payload mass or payload volume they actually need.  In a world with Starship, commercial players will be able to deploy capable constellations at a fraction of the cost. They’ll be able to multi-manifest in a single Starship, amortizing the cost of launch across their full constellation (instead of just 4-8 satellites). The mass allowance of Starship will make previously infeasible commercial businesses feasible, from large fuel depots, to orbital cargo stations, to massive power plants. 

As we think about development across the solar system, and as future deep space missions increase the demand for data, the lack of comms capabilities beyond the Deep Space Network (DSN) is going to be a limiting factor. A concerted effort to start building these capabilities to handle future data demand could be an interesting candidate for a COTS-like approach.

DC:  For policy-makers and program managers who want to learn more about Starship and other similar capabilities, what should they read, and who should they be following?

KK: There are a number of great pieces on the potential of Starship, including:

DC: Great recommendations. Thanks to you both for chatting.

KK: Thank you.

NK: Thanks.

Opening Up Scientific Enterprise to Public Participation

This article was written as part of the Future of Open Science Policy project, a partnership between the Federation of American Scientists, the Center for Open Science, and the Wilson Center. This project aims to crowdsource innovative policy proposals that chart a course for the next decade of federal open science. To read the other articles in the series, and to submit a policy idea of your own, please visit the project page.

For decades, communities have had little access to scientific information despite paying for it with their tax dollars. The August 2022 Office of Science and Technology Policy (OSTP) memorandum thus catalyzed transformative change by requiring all federally funded research to be made publicly available by the end of 2025. Implementation of the memo has been supported by OSTP’s “Year of Open Science”, which is coordinating actions across the federal government to advance open access research. Access, though, is only the first step to building a more responsive, equitable research ecosystem. A more recent memorandum from the Office of Management and Budget (OMB) and OSTP outlining research and development (R&D) policy priorities for fiscal year (FY) 2025 called on federal agencies to address long-standing inequities by broadening public participation in R&D. This is a critical demand signal for solutions that ensure that federally funded research delivers for the American people.

Public engagement researchers have long documented the importance of partnerships with key local stakeholders — such as local government and community-based organizations — in realizing the full breadth of participation within a given community. The lived experience of community members can be an invaluable asset to the scientific process, informing and even shaping research questions, data collection, and interpretation of results. Public participation can also benefit the scientific enterprise by enabling active translation and implementation of research findings, helping to return essential public benefits from the $170 billion invested in R&D each year.

The current reality is that many local governments and community-based organizations do not have the opportunities, incentives, or capacity to engage effectively in federally funded scientific research. For example, Headwaters Economics found that a significant proportion of communities in the United States do not have the staffing, resources, or expertise to apply for, receive, and manage federal funding. Additionally, community-based organizations (CBOs) — the groups most connected to people facing problems that science could be activated to solve, such as health inequities and environmental injustices — face similar capacity barriers, especially around compliance with federal grants regulations and reporting obligations. Few research funds exist to facilitate the building and maintenance of strong relationships with CBOs and communities, or to provide capacity-building financing to ensure their full participation. Thus, relationships between communities and academia, companies, and the federal government often consume those communities’ time and resources without much return on their investment.

Great participatory science exists if we know where to look

Place-based investments in regional innovation and R&D unlocked by the CHIPS and Science Act (i.e., the Economic Development Administration’s (EDA) Tech Hubs and the National Science Foundation’s (NSF) Regional Innovation Engines and Convergence Accelerator) are starting to provide transformative opportunities to build local research capacity in an equitable manner. What they’ll need are the incentives, standards, requirements, and programmatic ideas to institutionalize equitable research partnerships.

Models of partnership between community organizations, academic institutions, and/or the federal government have been established that focus on equitable relationships to generate evidence and innovations that advance community needs.

An example of an academic-community partnership is the Healthy Flint Research Coordinating Center (HFRCC). The HFRCC evaluates and must approve all research conducted in Flint, Michigan. It helps shape proposed studies so they align better with community concerns and context, and it ensures that benefits flow directly back to the community. Health equity is assessed holistically, considering the economic, environmental, behavioral, and physical health of residents. Finally, all work done in Flint is made open access through this organization. From these efforts we learn that communities can play a vital role in defining the problems to solve and ensuring that research is done with equity in mind.

An example of a federal agency-community partnership is the Environmental Protection Agency’s (EPA) Participatory Science Initiative. Through citizen science processes, the EPA has enabled data collection in under-monitored areas to identify climate-related and environmental issues that require both technical and policy solutions. The EPA helps facilitate these citizen-science initiatives by providing resources on choosing air monitoring equipment and on visualizing field data. These initiatives specifically empower low-income and minority communities, who face greater environmental hazards but often lack the power and agency to voice their concerns.

Finally, communities themselves can be the generators of research projects, initially without a partner organization. In response to the lack of innovation in diabetes care management, patients with Type 1 diabetes founded openAPS. This open-source effort spurred the creation of an overnight, closed-loop artificial pancreas system to reduce disease burden and save lives. Through decentralized deployment to over 2,700 individuals, there are 63 million hours of real-world “closed-loop” data, with the results of prospective trials and randomized controlled trials (RCTs) showing fewer highs and less severe lows, i.e., greater quality of life. This innovation is now ripe for federal investment and partnership so it can reach critical scale.

Scaling participatory science requires infrastructure

Participatory science and innovation is still an emerging field. Yet effective models for building participation infrastructure within scientific research enterprises have emerged over the past 20 years to build the community engagement capacity of research institutions. Participatory research infrastructure (PRI) could take the form of the following:

  1. Offices that develop tools for interfacing with communities, like citizen’s juries, online platforms, deliberative forums, and future-thinking workshops.
  2. Ongoing technology assessment projects to holistically evaluate innovation and research along dimensions of equity, trust, access, etc.
  3. Infrastructure (physical and digital) for research, design experimentation, and open innovation led by community members.
  4. Organized stakeholder networks for co-creation and community-driven citizen science.
  5. Funding resources to build CBO capacity to meaningfully engage (examples including the RADx-UP program from the NIH and Civic Innovation Challenge from NSF).
  6. Governance structures with community members in decision-making roles and requirements that CBOs help to shape the direction of the research proposals.
  7. Peer-review committees staffed by members of the public, demonstrated recently by NSF’s Regional Innovation Engines.
  8. Coalitions that utilize research as an input for collective action and making policy and governance decisions to advance communities’ goals.

Call to action

The responsibility of federally funded scientific research is to serve the public good. And yet, because so few interventions have been scaled, participatory science will remain a “nice to have” rather than an imperative for the scientific enterprise. To bring participatory science into the mainstream, there will need to be creative policy solutions that create incentive mechanisms, standards, funding streams, training ecosystems, assessment mechanisms, and organizational capacity for participatory science. To meet this moment, we need a broader set of voices contributing ideas on this aspect of open science and countless others. That is why we recently launched an Open Science Policy Sprint, in partnership with the Center for Open Science and the Wilson Center. If you have ideas for federal actions that can help the U.S. meet and exceed its open science goals, we encourage you to submit your proposals here.

Kindergarten, Once Radical, Needs a Revamp to Provide More Equitable Learning Outcomes

In its early days, kindergarten was considered a radical approach to education. The foundation of the kindergarten curriculum was developmentally appropriate practice: hands-on, engaging activities designed for the developmental stages of young children. Hands-on activities, play, and socialization, the ways children learn best, were the key strategies used to support children’s learning. Today, kindergarten is more closely associated with academics, worksheets, and learning to read, as the pressure to meet certain standards is pushed down onto our young children, their families, and teachers. This shift has resulted in the more engaging hands-on activities falling by the wayside.

One might assume that this more intensive introduction to public school would produce better long-term results for our students. Why do it otherwise? However, the most recent data from the Progress in International Reading Literacy Study (PIRLS) reports that the average reading score for fourth graders in the U.S. was lower than the averages for 12 education systems across the world, many of which wait until children are developmentally ready to read, closer to age 7, before beginning formal literacy instruction. If children are not faring as well as they progress through the grades as students in countries with less rigorous kindergarten curricula, is our more intensive academic approach in the early years working? Is it time to radicalize kindergarten again?

Kindergarten Today is More Advanced Than You Remember

According to the Center on the Developing Child at Harvard, emotional well-being and social competence provide a strong foundation for emerging cognitive abilities, and together they are the bricks and mortar of brain architecture. The emotional and physical health, social skills, and cognitive-linguistic capacities that emerge in the early years are all important for success in school, the workplace, and the larger community. Children develop these important skills through positive relationships with caring adults, play-based, engaging activities, and opportunities to explore. In recent years, the kindergarten classroom curriculum has shifted away from meeting the developmental needs and abilities of children, instead following a highly academic, one-size-fits-all approach to learning. Coincidentally, or not, the majority of teacher preparation programs in the United States do not require (or in some cases even offer) a course on child development or the science of learning in young children.

Today, kindergarten in the United States looks much more like first, second, or even third grade; yet, developmentally, our children remain the same.

According to a study conducted by the University of Virginia, between 1998 and 2006, kindergarteners were held to increasingly higher academic expectations both prior to and during kindergarten, including the expectation that parents would teach children all (presumably English) letters before entering school. Teachers reported dedicating more time to advanced literacy and math content, teacher-directed instruction, and assessment and substantially less time to art, music, science, and child-selected activities. This trend continues today.

While some states require 30 minutes of recess for kindergartners, other states do not, and some have reduced outdoor play time to 15 minutes or less per day. According to Eric Jensen’s book Teaching With the Brain in Mind, “A short recess arouses students and may leave them ‘hyper’ and less able to concentrate. Children benefit from an extended recess session (approximately an hour in length), because it gives their bodies time to regulate the movement and bring their activity level back down again.” Our kindergarteners are playing less and ‘studying’ more.

The Inequities of Kindergarten Have Lasting Consequences

Just as one child’s experience prior to entering the public school system may differ from the next child’s, once they enter kindergarten, their experience can vary greatly based on the state, district, and community in which they live. According to the most recent 50 State Comparison: K-3 Policies released by the Education Commission of the States, every district in the country offers at least part-day kindergarten, with 16 states requiring full-day kindergarten. In some states, districts are required to offer over 1,000 hours of kindergarten instruction per school year, whereas others require as few as 50 hours. Some kindergarten teachers are alone in a classroom with as many as 33 children, while children in other states may be in a class with half as many children. Six states do not require kindergarten attendance.

Since the pandemic, enrollment and attendance in kindergarten have declined across the country, primarily in communities of color. Based on a report released by Attendance Works in 2011, we know that children with low or at-risk attendance in kindergarten and first grade were less likely to reach grade-level standards in third grade in English language arts and math. National estimates suggest that one in 10 kindergarten and first grade students misses 18 or more days of the school year, or nearly a month. More recent data suggests this rate is most likely higher. These missed days in the early years can add up to weaker reading skills, higher rates of retention, and lower attendance rates in later grades. This is especially true for children from low-income families, who depend on school for literacy development. Students from lower-performing schools and/or low-income families were more likely to have attendance issues in the early years compared to their peers from higher-performing schools.

Bridging the Gap

For many students, we know that kindergarten is their first experience in the public school system. While some may not start school until first grade, kindergarten is often the bridge from early experiences to the K-12 system. Children of color and/or those living in low-income communities may face a perfect storm that challenges the integrity of the bridge that is kindergarten. For example, access to kindergarten may be limited, cultural and linguistic appropriateness may be absent, chronic health issues may impact attendance, and transportation may be challenging. For many working families, a half-day kindergarten does not meet the family’s needs. Full-day programs may be out of reach for families either because they are not offered or because families live in communities where the first half of the kindergarten day is free but the second half is fee-based, excluding lower-income families.

Based on 2021 U.S. Census data, 14% of 5-year-old children in the United States are not enrolled in school. This means we have over 150,000 potentially eligible children not enrolled in kindergarten. How will every child reach the Common Core standards for kindergarten if they are not in kindergarten? And even if they are present, are the standards being implemented equitably across the country? Are our kindergarteners experiencing the most appropriate learning possible?

In order to ensure all children are provided the same opportunities for growth and success, we must ensure that all schools are ready for all children. To do so, it is important we explore opportunities to: 

As kindergarten focuses on academic performance and excludes those without classroom or transportation access, we tip the scales further between the “haves” and “have nots” – to the detriment of all students and American competitiveness. How a child is introduced to school and how a child is prepared for formal education has lasting effects. If the U.S. wants to develop a workforce ready to lead and compete globally, we have to start at the very beginning of a student’s school experience. Kindergarten, once radical, today needs a radical reinvention that provides for today’s challenges and readies children for tomorrow.

Buying in Bulk: Electrifying Fleets Across the Country

The Electrification Coalition (EC) is a nonpartisan, nonprofit organization that develops and implements a broad set of strategies to facilitate the widespread adoption of electric vehicles (EVs) to overcome the economic, public health, and national security challenges that stem from America’s dependence on oil. They provided technical support to the Climate Mayors initiative to help cities, counties, school districts, and other public entities leverage their collective buying power and accelerate the conversion of public fleets to EVs. 

Sarah Reed is the EC’s Director of Programs, managing EV innovation projects including the Climate Mayors Electric Vehicle Purchasing Collaborative program. Matthew Stephens-Rich is the Director of Technical Services for the EC’s programs.

This interview is part of an FAS series on innovative procurement.

Ryan Buscaglia: Could you give a broad overview of what the Climate Mayors’ EV purchasing collaborative was and what the role of the Electrification Coalition was within it?

Sarah: This project, the Climate Mayors purchasing collaborative, was launched in 2018 by then-Mayor Eric Garcetti of Los Angeles and a couple of other partners. It was an unprecedented effort to create a one-stop shop for local governments, schools, and universities to reduce some of the upfront challenges that exist with fleet electrification. This project is really focused on providing technical assistance to all of these types of nonprofit, local government, and public entities as they look to purchase EVs.

We ask, ‘How can we make sure that every vehicle that’s being bought within the next year is electric? How can we provide that support, knowledge, and information on the fleet side to make that shift?’ So we pair that information and guidance from the Electrification Coalition with an easy option to buy this equipment.

You can buy all types of vehicles including school buses, garbage trucks, street sweepers, EV charging stations, and the EVs that we all pretty much know at this point (Chevy Bolt, the Nissan LEAF, etc.) through Sourcewell, which is a purchasing platform that sells all kinds of things. 

Many local governments are familiar with Sourcewell from buying equipment for playgrounds, pens, chairs, or anything else that schools and local governments purchase. This initiative brought together us, that purchasing mechanism, and the Climate Mayors, who were saying ‘we need to transition our fleet’. That’s how we looked at that approach, especially in 2018, when there weren’t quite as many EVs out there. Our role is the overall organization as well as that technical support.

Why are public fleets an important leverage point in advancing electrification?

Sarah: The EC’s mission is focused on addressing economic and national security challenges and reducing dependence on oil. We have a history of working with communities and cities on EV adoption and knew that fleets were a sweet spot for several reasons. Fleets have predictable routes. So in general, most fleet vehicles drive within the city or they drive to and from different facilities. They have higher mileage than a vehicle you or I might drive, which increases the cost savings of an electric vehicle. And they also usually go home to one spot at night where they can charge. So they’re really good candidates to make electric. And we can help show some of the cost savings and the total cost of ownership. Matt can share more about that part.

Matt: 2018 doesn’t sound like that long ago, but given where the market was at that point, there were barriers that needed better solutions. First and foremost was market growth. Consumer adoption was going on in 2018, but definitely not at the clip it is right now.

We saw an immediate opportunity to position public fleets as leaders in proving out EVs and where they can be a best fit in fleet deployment. When you think about a fleet—average people don’t think about fleets often—you’re thinking about dump trucks, step vans, Amazon delivery, that type of thing. But it turns out public fleets have a significantly large light-duty deployment: sedans, pickup trucks, etc. It’s everything from a parking enforcement vehicle to courier vehicles, a whole slew of things that are running around town.

This made them a really good fit for electrification. It’s also a great way for cities to communicate their sustainability priorities and demonstrate what EVs look like in the wild. To this day we have a number of partners that signed on to the initial commitments with Climate Mayors that still send us back pictures. It gives that demonstration of what EVs look like in the wild, so it’s not just a hypothetical, but something that’s real, right in front of you.

While we had a lot of initial municipalities that were excited to go electric, they didn’t have reliable access. Typically you have to go through usual procurement—you have to go through a publicly bid RFP process, you have to get three vendors responding back, and go through a whole criteria set. It takes time, effort, and energy. The biggest critics of RFPs will be quick to point out you often end up with an inferior product to what you originally were hoping for.

Working with Sourcewell and our procurement partners, we were able to use pre-bid contracts to eliminate the need for that and instantly go and purchase the vehicles you wanted. We heard and still hear stories of somebody saying “I’m looking for a Chevy Bolt or a Kia Niro or something like that and my local dealer didn’t respond to my RFP, so I can’t buy the vehicle I’m looking for. What do I do?” So that was a specific opportunity.

But as fleet options have grown to include electric school buses, electric street sweepers, charging infrastructure, and transit buses, we’ve continued to grow the offering in the EV purchasing collaborative. For instance, the F-150 Lightning and the E-Transit, the Ford options, are amazing utility applications and big requests right now for a lot of folks, which has been a really helpful asset.

As Sarah also mentioned, in terms of the technical support, we’re here to help usher folks through that procurement process. Pre-IRA [Inflation Reduction Act] and Bipartisan Infrastructure Law [BIL], there was an opportunity to use the earlier EV tax credits through leasing. By leasing the vehicle from a leasing company or maybe a dealer or your procurement partner, you’re able to claim a pass-through portion of the tax credit. A lot of municipalities never lease, so this was the first time they were ever leasing a product. Often you just buy it outright using bonded money or something to that effect. So there was a lot of technical support in terms of procurement.

We also provide fleet analysis support through our free-to-use DRVE tool—the Dashboard for Rapid Vehicle Electrification—which you can download from our website. Anyone can use it. It’s designed to provide a quick analysis. A lot of fleet managers are bought into electrification; they don’t need the whole proofing of “what is an EV? where do I charge it?” They have a lot of that figured out. It’s really just the question of “okay, what is the exact model that I should be thinking about? Or what would a Total Cost of Ownership (TCO) look like?” They’re pretty bought in but just need some of those final touches to gut check it.
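
For readers unfamiliar with how a fleet TCO comparison is typically framed, here is a minimal sketch in Python. It is not the DRVE tool's methodology; every input is a placeholder assumption that a fleet would replace with its own purchase prices, fuel and electricity rates, efficiency, incentives, and maintenance data.

```python
# Minimal total-cost-of-ownership (TCO) sketch comparing an EV and a gasoline
# vehicle over a holding period. NOT the DRVE tool's methodology; all inputs
# are placeholder assumptions for illustration only.

def simple_tco(purchase_price, annual_miles, years, energy_cost_per_mile,
               annual_maintenance, incentives=0.0, resale_value=0.0):
    energy = energy_cost_per_mile * annual_miles * years
    maintenance = annual_maintenance * years
    return purchase_price - incentives + energy + maintenance - resale_value

years, annual_miles = 8, 12_000

ev_tco = simple_tco(purchase_price=40_000, annual_miles=annual_miles, years=years,
                    energy_cost_per_mile=0.04,   # ~0.30 kWh/mile at ~$0.13/kWh
                    annual_maintenance=400, incentives=7_500, resale_value=8_000)

gas_tco = simple_tco(purchase_price=30_000, annual_miles=annual_miles, years=years,
                     energy_cost_per_mile=0.14,  # ~25 mpg at ~$3.50/gallon
                     annual_maintenance=900, resale_value=6_000)

print(f"EV TCO:  ${ev_tco:,.0f}")   # illustrative output only
print(f"Gas TCO: ${gas_tco:,.0f}")
```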

That brings up so many follow-up questions from me. The first is: why do electric street sweepers have such high demand?

Matt: Street sweepers were really popular in part because, what’s one thing that every city has? A street sweeper or a refuse truck. At that moment, it was kind of a cool thing that we had an electric street sweeper, so we said let’s do an informational webinar and get the word out on it. That was the most attended webinar we had hosted to date. And it was because there were a lot of good use applications, and a lot of city fleet vendors were really excited to hear about it. Definitely a great example of finding vehicle options ripe for electrification.

We’ve grown, like Sarah mentioned, with nonprofits, and we have done a lot of work with universities and state fleets as well. While city fleets are a great place to start, we work with all public entities, big and small. We’ve had the chance to work with them and help them through the process, especially given the market growth. One thing we really emphasize is that you are not expected to use our procurement partner; it’s just one of many options. So above all else, we’re here to help you find the most cost-effective, most time-efficient way to get your vehicles deployed. In those moments, we’re really just on hand to help folks through any specific part of the process.

How did the Climate Mayors’ collaborative that you all are a part of come together? And what were some of the challenges of putting together such an aggregated purchasing vehicle rather than working with people in a one-off fashion?

Sarah: Mayor Garcetti had a really strong sustainability platform, and he did a ton of really exciting things in the city of Los Angeles. One of those things was around electric vehicles at the community and fleet level across the city, and he was a cofounder of the Climate Mayors organization.

A group of cities put together a request for information (RFI) to automakers to say, “We’re a couple of really large cities (Seattle, LA, etc.) and we want to buy this many fleet vehicles, we want to electrify them, and you need to make vehicles for us to buy.” This was a market indicator from all these large cities that the demand is there. The EC had a history with Sourcewell and built up some of those other relationships to turn that initial action into a project that could provide the support [for cities].

To make sure I’m understanding it right—there were three main entities involved here. You have the Climate Mayors that came together issuing a call and committing to purchasing a certain number of electric vehicles that they thought would meet their goals as cities and locales. You have the Electrification Coalition functioning as an overarching organization and technical support provider, working through details and analysis. And then you have Sourcewell, the actual vehicle provider that cities can go to as an “easy button”: once cities determine something meets their needs, they can procure it quickly. Is that right?

Sarah: Yes, perfect! And you did ask about initial commitments. So there was that RFI, and then when we launched this program, there were about 20 cities and counties committing to a couple hundred EVs. This project has been focused on immediate action. So we didn’t care about saying, “Oh, by 2030 we’re gonna electrify our whole fleet.” That wasn’t what this was about. This was about next year—which vehicles are already turning over, and how can we make them electric?

The peak of commitments was over 6,000 vehicles that fleets were looking at purchasing. Several thousand were purchased. And so that initial commitment grew from 20 or so folks, when we launched this program in 2018, to several hundred local governments that are a part of this effort.

Besides the thousands of vehicles, are there any other impact or success measures that you have from this specific program that you’d like to talk about?

Sarah: So it’s about 450 cities and counties, and the actual commitments to purchasing vehicles have the potential to reduce gasoline use by over 2 million gallons annually, as well as 46 million annual tons of CO2 emissions. We’ve also written some case studies and provided other reports and resources.

Matt: A big key to success, too, especially in those early days, was that we focused on investing in the relationships we were building with cities. The biggest thing about procurement is that it’s always happening. So you’re always planning for the next one, two, or three rounds, forecasting procurement planning across a number of years. So we focused on those initial quick hits, and then focused on how we can grow those purchase orders over time with partners. Having all the analysis set really helped. We also did a lot of work with cities on setting up what we call “EV first” procurement policies. So essentially, that means taking the city’s own internal procurement policy and kind of flipping it on its head.

Traditionally, the assumption is that you’re going to buy a gasoline or diesel vehicle; that’s not something you have to defend. We flip it and say the EV is actually the assumed norm, and if it’s not going to be an EV, they have to work down the decision tree to get to buying a regular gasoline or diesel vehicle. Now, we are still working with a lot of those original cities. Many of them are staying on target for broader 2030 goals, like Sarah mentioned, but even then, there are all types of barriers that can crop up along the way.

Do you do any work with localities around planning for the end-of-life asset problem? Those vehicles are going to break down, and they’re going to need to be replaced on some timeline eventually. How do you manage that transition and handle scrapping or selling to a secondary market?

Matt: To be honest, it doesn’t just start with EVs. Very few fleets can just outright scrap and replace vehicles. Often you have something in a primary use and then you’re putting it into a secondary use. It’s quite literally in the back; we’re going to use it on an as-needed basis. Should a vehicle have to go in the shop, we have the backup that can come into deployment. So we actually started into that with those first EV purchases. Often those gasoline vehicles were not being immediately retired or sold off or sent to a scrapyard, but put into a secondary use, on hand as needed. Very rarely driven, to be honest, but still there.

That asset management is a critical piece of it. Going back to the leasing structure—that creates a whole new world for secondary life and for addressing how you deal with the end of life for that vehicle. For those public fleets that were going down the leasing route, often that was a closed-ended lease or one with an option to buy the vehicle out at a later date.

So that was a way to hedge a bet, because electric vehicles in general are following an arc of evolution more similar to your smartphone’s than a traditional vehicle’s. The range only gets better, charging speeds get better. So that was one way folks were a little more comfortable committing to procuring. With a closed-ended lease, they knew they were giving the vehicle back to the dealer and not going to be left with it on their hands, working out what to do with it.

As for sending vehicles to scrapyards, it’s interesting because there is actually a lot of asset value. There are so many rare and critical minerals in the vehicles and battery packs. It’s a growing industry—Redwood Materials is a growing recycler I heard speak once. They put it saliently, describing how all the material of a battery is there from day one to the last day. There are a lot of amazing efforts on materials reclamation. We actually have a Critical Minerals Center that is a sister program and effort that goes upstream and thinks about how to bring mineral security onshore. That’s all to say that public fleets will be a part of that flow. Fleets are a really good test case because fleet managers take their jobs very seriously when retiring an asset, whether that’s selling, recycling, or taking it to auction.

Are there any other ways to de-risk the end of life problems for public procurement?

Matt: Another example is EV transit buses. There are a number of transit bus companies with great EV options and clever leasing structures that lease the bus and the battery as separate assets. That addresses questions about how battery life will fare. Decoupling the battery from the vehicle and thinking about them as two separate assets that work together creates fascinating procurement options. There were terms, for instance, where you lease a bus for a ten-year period but have a five-year term on the battery, so you get a fresh battery at the five-year mark. Assuming you keep up with the other things—suspensions, tires, etc.—that just need inspection and upkeep, you can keep the bus on the road for a while.

Sarah: I’ll clarify that while we do work on transit buses and have helped many cities with them, they are not a part of this program because there are very strict FTA rules about how you can sell them.
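
For readers who want a feel for why the decoupled structure Matt describes can be attractive, here is a minimal sketch in Python. Every dollar figure is a hypothetical placeholder, not program data; the point is simply that leasing the battery as its own asset smooths the year-to-year outlay and removes the lump-sum battery-replacement risk that comes with outright ownership.

```python
# Hypothetical comparison: buying a bus outright (with a mid-life battery
# replacement) versus a decoupled lease (bus on a ten-year term, battery on a
# five-year term). All figures are illustrative assumptions.

YEARS = 10
BUS_PURCHASE = 750_000          # assumed purchase price of an electric transit bus
BATTERY_REPLACEMENT = 150_000   # assumed cost of a replacement pack at year five
BUS_LEASE_ANNUAL = 70_000       # assumed annual lease payment for the bus
BATTERY_LEASE_ANNUAL = 20_000   # assumed annual lease payment for the battery

def purchase_cash_flow():
    """Year-by-year outlay if the fleet buys the bus and swaps the pack itself."""
    flows = [BUS_PURCHASE] + [0] * (YEARS - 1)
    flows[5] += BATTERY_REPLACEMENT  # lump-sum battery swap at the start of year six
    return flows

def decoupled_lease_cash_flow():
    """Year-by-year outlay if the bus and battery are leased as separate assets."""
    return [BUS_LEASE_ANNUAL + BATTERY_LEASE_ANNUAL] * YEARS

if __name__ == "__main__":
    for label, flows in (("outright purchase", purchase_cash_flow()),
                         ("decoupled lease", decoupled_lease_cash_flow())):
        print(f"{label}: total ${sum(flows):,}, largest single-year outlay ${max(flows):,}")
```

Under these made-up numbers the totals land in the same ballpark, but the lease caps any single year’s outlay and hands battery-degradation risk back to the lessor, which is the hedge Matt describes.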

Could you talk more about the DRVE tool and how important that information is when it comes to helping cities with their planning?

Matt: Call it right place right time. As we were launching the EV purchasing collaborative we were talking with friends at Atlas Public Policy who do a lot of market research and tool creation. We were grabbing a drink at a conference talking about how all of the total cost of ownership calculators out there are clunky and hard to use. From that, the DRVE tool was born.

We wanted to build the tool for the masses, not for the folks who can dedicate a lot of resources and time. We want people to be in and out in an hour with a good assessment of where EVs could be a good fit. We wanted it to be open-source and free to use, something anybody could download and run with—it’s Excel based. We designed it to talk to a variety of federal source databases like fueleconomy.gov, the Alternative Fuel Data Center, etc. The vehicles in it update automatically: if you run it in a couple of months, you’ll start to see 2024 model year vehicles populate it.

It was something we needed for our own practice, but realized it was effective and that we could release it publicly. You can upload any fleet data tracking that you use— we’ve worked with folks who have had to fax us their data sets. We saw a need to work with a wide variety of file formats.
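
As an illustration of the kind of comparison such calculators automate, here is a minimal total-cost-of-ownership sketch in Python. This is not the DRVE tool itself; the vehicle names and every number below are hypothetical placeholders, whereas the real tool pulls vehicle and fuel data from federal sources like fueleconomy.gov.

```python
# Illustrative total-cost-of-ownership (TCO) comparison for a fleet vehicle.
# All inputs are hypothetical placeholders, not data from DRVE or any agency.

from dataclasses import dataclass

@dataclass
class Vehicle:
    name: str
    purchase_price: float        # upfront cost after any incentives ($)
    energy_cost_per_mile: float  # fuel or electricity cost ($/mile)
    maintenance_per_mile: float  # routine maintenance ($/mile)
    resale_value: float          # estimated value at end of service life ($)

def total_cost_of_ownership(v: Vehicle, annual_miles: float, years: int) -> float:
    """Simple undiscounted TCO: purchase plus operating costs minus resale value."""
    miles = annual_miles * years
    operating = miles * (v.energy_cost_per_mile + v.maintenance_per_mile)
    return v.purchase_price + operating - v.resale_value

if __name__ == "__main__":
    # A hypothetical light-duty fleet sedan driven 12,000 miles a year for 8 years.
    gas = Vehicle("gasoline sedan", 28_000, 0.15, 0.09, 6_000)
    ev = Vehicle("battery-electric sedan", 34_000, 0.05, 0.06, 8_000)
    for v in (gas, ev):
        print(f"{v.name}: ${total_cost_of_ownership(v, 12_000, 8):,.0f}")
```

A real assessment would also discount future costs, add charging-infrastructure and incentive line items, and run the comparison across an entire fleet inventory, which is the bookkeeping a purpose-built tool saves fleet managers from doing by hand.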

How has the IRA affected what you’re trying to do with the Electrification Coalition, and what is your vision for the next couple of years that you’re excited to work on?

Matt: Our vision is we’re not going to sleep!

We have continually added new features to the DRVE tool, including more guidance on navigating EV incentives and charging procurement incentives. Between the BIL and IRA there are a lot of new provisions. Specific to the IRA, the commercial clean vehicle tax credit is going to be most relevant for public fleets. We’re updating the site daily now on what was formerly called “direct pay,” now called “elective pay.” Everything we described about the pass-through of working with dealers and leasing has changed, because public fleets can now file for these tax credits directly and claim the entire benefit. The challenge is on us to make sure we’re digesting that information, getting it out to the masses, and making sure folks understand the steps it will take to get set up. Another implication of this new funding is the focus on charging infrastructure. The charging and fueling infrastructure round that just became available as part of the $7.5 billion provided by BIL—we supported 50 applications to it and did countless webinars and phone calls.

Any last thoughts from you, Sarah?

Sarah: This program has helped a lot of folks get ready to take advantage of these incentives and become more familiar with electric vehicles. We’re focused on helping cities of all sizes, not just the usual suspects. We helped 50 cities, and many of them were very small, or rural, or didn’t have a fleet sustainability person. We’re trying to expand who has access to these vehicles on the city side of things. We’re also creating wholesale tools—our individual support is great, but there are only a few of us. 

This is the way the market is going. Five years ago you couldn’t have had everybody saying that, but today I doubt there are many folks in cities who are unsure that this is what the future is going to look like. Taking advantage now, while there are incentives, tax credits, and state programs supporting this transformation, is critical. If you don’t act now you’ll be behind, and there will be fewer resources out there. Now is the time to act and build buy-in. 

Revolutionary Advances in AI Won’t Wait

The Pentagon has turned innovation into a buzzword, and everyone can agree on the need for faster innovation. It seems a new innovation office is created every week. Yet when it comes to AI, the DoD is still moving too slowly, hampered by a sluggish procurement process. How can it make innovation core to the organization and leverage the latest technological developments?

We have to first understand what type of innovation is needed. As Harvard Business School professor Clayton Christensen wrote, there are two types of innovation: sustaining and disruptive. Sustaining innovation makes existing products and services better. It’s associated with incremental improvements, like adding new features to a smartphone or boosting the performance of the engine on a car, in pursuit of better performance and higher profits.

Disruptive innovation occurs when a firm with fewer resources challenges one of the bigger incumbents, typically either with a lower-cost business model or by targeting a new customer segment. Disruptive firms can start with fewer resources because they have less overhead and fewer fixed costs, and they often leverage new technologies.

Initially, a disruptor goes unnoticed by an incumbent, who is focused on capturing more profitable customers through incremental improvements. Over time, though, the disruptor grows enough to capture large market share, threatening to replace the incumbent altogether.

Intel Illustrates Both Types of Innovation

Intel serves as an illustrative example of both types of innovation. It was the first company to manufacture DRAM memory chips, creating a whole new market. However, as it focused on sustaining innovation, it was disrupted by low-cost Japanese firms that were able to offer the same DRAM memory chips at a lower cost. Intel then pivoted to focus on microprocessors, disrupting the personal computer industry. However, more recently, Intel is at risk of being disrupted again, this time by lower-power microprocessors, like ARM, and application-specific processors, like Nvidia GPUs.

The DoD, like the large incumbent it is, has become good at sustaining innovation. Its acquisitions process first outlines the capabilities it needs, then sets budgets, and finally purchases what external partners provide. Each part of this – the culture, the procedures, the roles, the rules – has been optimized over time for sustaining innovation. This lengthy, three-part process has allowed the Pentagon to invest in steadily improving hardware, like submarines and airplanes, and the defense industrial base has followed suit, consolidating to just five major defense contractors that can provide the desired sustaining innovation.

The problem is that we are now in an era of disruptive innovation, and a focus on sustaining innovation doesn’t work for disruptive innovation. As a result of decreasing defense budgets in the 1990s and a parallel increase in private-sector funding, companies now lead the way on innovation. With emerging technologies like drones, artificial intelligence, and quantum computing advancing every month in the private sector, a years-long process to outline capabilities and define budgets won’t work: by the time the requirements are defined and shared, the technology will have moved on, rendering the old requirements obsolete. To illustrate the speed of change, consider that the National Security Commission on Artificial Intelligence’s lengthy 2021 report on how the U.S. can win in the AI era failed to mention generative AI or large language models, which have seen revolutionary advances in just the past few years. Innovation is happening faster than our ability to write reports or define capabilities.

The Small, Daring, and Nimble Prevail

So how does an organization respond to the threat of disruptive innovation? It must create an entirely new business unit to respond, with new people, processes, and culture. The existing organization has been optimized to the current threat in every way, so in many ways it has to start over while still leveraging the resources and knowledge it has accumulated.

Ford learned this lesson the hard way. After trying to intermix production of internal combustion cars and electric vehicles for years, Ford recently carved out the EV group into a separate business unit. The justification? The “two businesses required different skills and mind-sets that would clash and hinder each area if they remained parts of one organization”, reported the New York Times after speaking with Jim Farley, the CEO of Ford.

When the personal computer was first introduced by Apple, IBM took it seriously and recognized the threat to its mainframe business. Due to bureaucratic and internal controls, however, its product development process took four or five years. The industry was moving too quickly for that. To respond, the CEO created a secretive, independent team of just 40 people. The result? The IBM personal computer was ready to ship just one year later.

One of the most famous examples of creating a new business unit comes from the defense space: Skunkworks. Facing the threat of German aircraft in World War II, the Army Air Forces asked Lockheed to design a plane that could fly at 600 mph, which was 200 mph faster than Lockheed’s existing planes. And they wanted a working prototype in just 180 days. With the company already at capacity, a small group of engineers, calling themselves Skunkworks, set up shop in a different building with limited resources – and miraculously hit the goal ahead of schedule. Their speed was attributed to their ability to avoid Lockheed’s bureaucratic processes. Skunkworks would expand over the years and go on to build some of the most famous Air Force planes, including the U-2 and SR-71.

DoD’s Innovation Approach to Date

The DoD appears to be re-learning these lessons today. Its own innovation pipeline is clogged by bureaucracy and internal controls. Faced with the threat of a Chinese military that is investing heavily in AI and moving toward AI-enabled warfare, the DoD has finally realized that it cannot rely on sustaining innovation to win. It must reorganize itself to respond to the disruptive threat.

It has created a wave of new pathways to accelerate the adoption of emerging technologies. SBIR open topics, the Defense Innovation Unit, SOFWERX, the Office of Strategic Capital, and the National Security Innovation Capital program are all initiatives created in the spirit of Skunkworks or the “new business unit”. Major commands are doing it too, with the emergence of innovation units like Navy Task Force 59 in CENTCOM.

These initiatives are all attempts to respond to the disruption by opening up alternative pathways to fund and acquire technology. SBIR open topics, for example, have been found to be more effective than traditional approaches because they don’t require the DoD to list requirements up front, instead allowing it to quickly follow along with commercial and academic innovation.

Making the DoD More Agile 

Some of these initiatives will work, others won’t. The advantage of DoD is that it has the resources and institutional heft to create multiple such “new business units” that try a variety of approaches, provided Congress continues to fund them.

From there, it must learn which approaches work best for accelerating the adoption of emerging technologies, pick a winner, and scale that approach to replace its core acquisitions process. These new pathways must be integrated into the main organization; otherwise they risk remaining fringe programs with limited impact. The best contractors from these new pathways will also have to scale up, disrupting the defense industrial base. Only with these new operating and business models – along with new funding policies and culture – can the DoD become proficient at acquiring the latest technologies. Scaling up the new business units is the only way to do so.

The path forward is clear. The hard work to reform the acquisitions process must begin by co-opting the strengths of these new innovation pathways. The good news is that the DoD, through its large and varied research programs, partnerships, and funding, has clear visibility into emerging and future technologies. Now it must figure out how to scale the new innovation programs or risk getting disrupted.

Systems Thinking in Climate: Positive Tipping Points Jumpstart Transformational Change

This blog post is the second piece in a periodic series by FAS on systems thinking. The first is on systems thinking in entrepreneurial ecosystems.

News was abuzz two weeks ago with a flurry of celebratory articles showcasing the first-year accomplishments of the Administration’s signature clean energy law, the Inflation Reduction Act (IRA), on its August 16 anniversary. The stats are impressive. Since the bill’s passage, some 270 new clean energy projects have been announced, with investments totaling some $132 billion, according to a Bank of America analyst report. President Biden, speaking at a White House anniversary event, reported that the legislation has already created 170,000 clean energy jobs and will create some 1.5 million jobs over the next decade, while significantly cutting the nation’s carbon emissions.

The New York Times also headlined an article last week: “The Clean Energy Future Is Arriving Faster Than You Think,” citing that “globally, change is happening at a pace that is surprising even the experts who track it closely.” In addition, the International Energy Agency, which provides analysis to support energy security and the clean energy transition, made its largest-ever upward revision to its forecast of renewable capacity expansion. But should this accelerated pace of change really be such a surprise? Or can rapid acceleration of transformation be predicted, sought after, and planned for?

FAS Senior Associate Alice Wu published a provocative policy memo last week entitled “Leveraging Positive Tipping Points To Accelerate Decarbonization.” Wu asserts that we can anticipate and drive toward thresholds in decarbonization transitions. A new generation of economic models can enable the analysis of these tipping points and the evaluation of effective policy interventions. But putting this approach front and center will require an active research agenda and a commitment to use the framework to inform policy decisions. If done successfully, a tipping points framework can help forecast multiple aspects of the decarbonization transition, such as food systems transformation, and can help ensure that accelerated transitions happen in a just and equitable manner.
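
To make the idea of a positive tipping point concrete, here is a toy model in Python. It is not one of the economic models referenced above; every parameter is an illustrative assumption. The sketch couples a learning curve (costs fall as cumulative deployment grows) to a simple choice model (market share responds to the cost gap), which is enough to produce the nonlinear, self-reinforcing acceleration that tipping-point analysis looks for, and to show how a modest policy lever can shift when that acceleration arrives.

```python
# Toy positive-tipping-point model: technology cost falls with cumulative
# adoption (a Wright's-law learning curve) and market share responds to the
# cost gap. Past a threshold, the feedback becomes self-reinforcing.
# All parameters are illustrative assumptions, not empirical estimates.

import math

def simulate(subsidy: float, years: int = 25) -> list:
    incumbent_cost = 30_000.0   # assumed steady cost of the incumbent technology
    initial_cost = 40_000.0     # assumed starting cost of the new technology
    learning_rate = 0.2         # 20% cost decline per doubling of deployment
    cumulative = 1.0            # cumulative units deployed (normalized)
    new_cost = initial_cost
    shares = []
    for _ in range(years):
        gap = incumbent_cost - (new_cost - subsidy)
        share = 1.0 / (1.0 + math.exp(-gap / 3_000.0))  # logistic choice response
        cumulative += share                              # this year's sales (normalized)
        new_cost = initial_cost * cumulative ** math.log2(1 - learning_rate)
        shares.append(share)
    return shares

if __name__ == "__main__":
    for subsidy in (0, 2_000, 4_000):
        trajectory = simulate(subsidy)
        print(f"subsidy ${subsidy:>5,}: share year 5 = {trajectory[4]:.0%}, "
              f"year 25 = {trajectory[-1]:.0%}")
```

Running the sketch shows the hallmark of a tipping dynamic: small differences in the policy lever mostly change when the adoption curve takes off rather than whether it does, and identifying that threshold in advance is exactly what a tipping-points research agenda aims to do.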

Over the past year, FAS has made positive tipping points a central organizing principle in how we think about systems change in climate and beyond. We are part of a global community of scholars, policymakers, and nonprofit organizations that recognize the potential power of harnessing a positive tipping points framework for policy change. The Global Systems Institute at the University of Exeter, Systemiq, and the Food and Land Use Coalition are a few of the leading organizations working to apply this framework in a global context. FAS is diving deep into the U.S. policy landscape, unpacking opportunities with current policy levers (like the IRA) to identify positive tipping points in progress and, hopefully, to build capacity to anticipate and drive toward positive tipping points in the future.

Through a partnership between FAS and Metaculus, a crowd-forecasting platform, a Climate Tipping Points Tournament has provided an opportunity for experienced and novice forecasters alike to dive deep into climate policy questions related to Zero Emissions Vehicles (ZEVs). The goal is to anticipate some of these nonlinear transformation thresholds before they occur and explore the potential impacts of current and future policy levers.

While the tournament is still ongoing, it is already yielding keen insights on when accelerations in systems behavior are likely to occur, on topics that range from the growth of the ZEV workforce to the supply chain dynamics for the critical minerals needed for ZEV batteries. FAS plans to publish a series of memos that turn insights from the tournament into actionable policy recommendations. Planned future topics include: 1) ZEV subsidies; 2) public vs. private charging stations; 3) sodium-ion battery research and development; and 4) ZEV battery recycling and the circular economy.

Going forward, FAS will continue to elevate the concept of positive tipping points in the climate space and beyond. We believe that if scientists and policymakers work together to operationalize this framework, positive tipping points can move quickly from the realm of the theoretical to become an instrument of policy design, one that enables decision makers to craft laws and executive actions that promote systems change toward the beneficial transformations we are seeking.

Increasing Evidence that the US Air Force’s Nuclear Mission May Be Returning to UK Soil

Significant modernization is underway at RAF Lakenheath for F-35A aircraft, a planned “surety dormitory,” and other infrastructure indicating that the nuclear weapons mission may be returning after a hiatus of 15 years.

New U.S. Air Force budgetary documents strongly imply that the United States Air Force is in the process of re-establishing its nuclear weapons mission on UK soil.

The Air Force’s FY 2024 budgetary justification package, dated March 2023, notes the planned construction of a “surety dormitory” at RAF Lakenheath, approximately 100 kilometers northeast of London. The “surety dormitory” was also briefly mentioned in the Department of Defense’s testimony to Congress in March 2023, but with no accompanying explanation. “Surety” is a term commonly used within the Department of Defense and Department of Energy to refer to the capability to keep nuclear weapons safe, secure, and under positive control.

The justification documents note the new requirement to “Construct a 144-bed dormitory to house the increase in enlisted personnel as the result of the potential Surety Mission” [emphasis added]. To justify the new construction, the documents note, “With the influx of airmen due to the arrival of the potential Surety mission and the bed down of the two F-35 squadrons there is a significant deficiency in the amount of unaccompanied housing available for E4s and below at Royal Air Force Lakenheath” [emphasis added].

The U.S. Air Force’s FY 2024 budgetary justification package describes the proposed construction of a “surety dormitory” at RAF Lakenheath. 

Construction of the facility is scheduled to begin in June 2024 and end in February 2026.

We previously documented the UK’s addition to the Department of Defense’s FY2023 budgetary documents for the NATO Security Investment Program, which stated that “NATO is wrapping up a thirteen-year, $384 million infrastructure investment program at storage sites in Belgium, Germany, the Netherlands, the UK, and Turkey to upgrade security measures, communication systems, and facilities” [emphasis added]. The UK had not been explicitly mentioned in earlier budgetary documents, and the reference was removed from this year’s documents after we reported on its inclusion.

The removal of country names from the Pentagon’s latest list of nuclear base upgrades is yet another example of the United States reducing the transparency of its own nuclear posture while criticizing nuclear secrecy in other nuclear-armed countries.

The removal of country names from the Pentagon’s Military Construction Program budget request follows the denial of a recent FAS declassification request of previously available nuclear warhead numbers. These decisions contradict and undermine the Biden administration’s appeal for nuclear transparency in other nuclear-armed states.

The past two years of budgetary evidence strongly suggests that the United States is taking steps to re-establish its nuclear mission on UK soil. The United States has not stored nuclear weapons in the United Kingdom for the past 15 years, since we reported in 2008 that nuclear weapons had been withdrawn from RAF Lakenheath.

The Weapons Storage and Security Systems (known as WS3) at RAF Lakenheath are contained within Protective Aircraft Shelters; each WS3 includes an elevator-driven vault that can be lowered into the concrete floor, as well as the associated command, control, and communications software needed to unlock the weapons. A total of 33 WS3 vaults were installed at RAF Lakenheath in the 1990s, each of which can hold up to four B61 bombs, for a maximum capacity of 132 warheads. Whenever nuclear weapons have been withdrawn from European air bases in the past, their vaults have been put into “caretaker” status, but as Harold Smith, the former U.S. Assistant Secretary of Defense for Nuclear, Chemical, and Biological Defense Programs, stated at the time, these vaults were “mothballed in such a way that if we chose to go back into those bases we can do it.”

A total of 33 underground nuclear weapons storage vaults in aircraft shelters at RAF Lakenheath were mothballed, but documents indicate that they may be reactivated soon.

The nuclear-related upgrades to RAF Lakenheath are taking place as the new 495th Fighter Squadron (hosted at RAF Lakenheath) prepares to become the first U.S. Air Force squadron in Europe to be equipped with the nuclear-capable F-35A Lightning II. The upgrades coincide with the long-planned delivery of the new B61-12 gravity bombs to Europe, which will replace the approximately 100 legacy B61-3s and -4s currently estimated to be deployed in Europe.

There are an estimated 100 B61 nuclear bombs deployed at six bases in five European countries, with preparations underway at RAF Lakenheath to potentially receive nuclear weapons as well.

In December 2021, in response to a media question about potentially stationing nuclear weapons in Poland, NATO Secretary General Jens Stoltenberg announced that “we have no plans of stationing any nuclear weapons in any other countries than we already have…” However, it is difficult to square his statement with the planned “arrival of the potential Surety mission” at RAF Lakenheath, as well as the addition of the base to the list of sites receiving nuclear upgrades.

One possible explanation is that the United States is currently preparing the infrastructure at RAF Lakenheath to allow the base to potentially receive nuclear weapons in the future or in the midst of a crisis, without necessarily having already decided to permanently station them there or increase the number of weapons currently stored in Europe. The budget language of a “potential Surety mission” indicates that a formal deployment decision has not yet been made.

This would be consistent with construction at other known nuclear storage bases across Europe, where new upgrades are taking place that are designed to facilitate the rapid movement of weapons on- and off-base to increase operational flexibility. In the midst of a genuine nuclear crisis with Russia, for example, a portion of U.S. nuclear weapons could be redistributed from more vulnerable eastern bases to RAF Lakenheath.

Background information:

Lakenheath Air Base Added To Nuclear Weapons Storage Site Upgrades
NATO Steadfast Noon Exercise and Nuclear Modernization in Europe
The C-17A Has Been Cleared To Transport B61-12 Nuclear Bomb To Europe
FAS Nuclear Notebook: US nuclear weapons, 2023


This research was carried out with generous contributions from the John D. and Catherine T. MacArthur Foundation, the New-Land Foundation, the Ploughshares Fund, the Prospect Hill Foundation, Longview Philanthropy, the Stewart R. Mott Foundation, the Future of Life Institute, Open Philanthropy, and individual donors.

“The US needs to lean into an old strength”: Maintaining Progress and Growing US Biomanufacturing

The U.S. bioeconomy has been surging forward, charged by Presidential Executive Order 14081 and the CHIPS and Science Act. However, many difficult challenges lie ahead for the U.S. bioeconomy, including for U.S. biomanufacturing capabilities. U.S. biomanufacturing has been grappling with issues in fermentation capacity, including challenges related to scale-up, inconsistent supply chains, and downstream processing. While the U.S. government works on shoring up these roadblocks, it will be important to bring industry perspectives into the conversation to craft solutions that not only address the current set of issues but also look to mitigate challenges that may arise in the future.

To get a better understanding of industry perspectives on the U.S. bioeconomy and the U.S. biomanufacturing sector, the Federation of American Scientists interviewed Dr. Sarah Richardson, the CEO of MicroByre. MicroByre is a climate-focused biotech startup that specializes in providing specialized bacteria based on the specific fermentation needs of its clients. Dr. Richardson received her B.S. in biology from the University of Maryland in 2004 and a Ph.D. in human genetics and molecular biology from Johns Hopkins University School of Medicine in 2011. Her extensive training in computational and molecular biology has given her a unique perspective regarding emerging technologies enabled by synthetic biology.

FAS: The U.S. Government is focused on increasing fermentation capacity, including scale-up, and creating a resilient supply chain. In your opinion, are there specific areas in the supply chain and in scale-up that need more attention?

Dr. Sarah Richardson: The pandemic had such an impact on supply chains that everyone is reevaluating the centralization of critical manufacturing. The United States got the CHIPS and Science Act to invest in domestic semiconductor manufacturing. The voting public realized that almost every need they had required circuits. Shortages in pharmaceuticals are slowly raising awareness of chemical and biomedical manufacturing vulnerabilities as well. The public has even less insight into vulnerabilities in industrial biomanufacturing, so it is important that our elected officials are proactive with things like Executive Order 14081.

When we talk about supply chains we usually mean the sourcing and transfer of raw, intermediate, and finished materials — the flow of goods. We achieve robustness by having alternative suppliers, stockpiles, and exacting resource management. For biomanufacturing, an oft-raised supply chain concern is feedstock. I can and will expound on this, but securing a supply of corn sugar is not the right long-term play here. Shoring up corn sugar supplies will not have a meaningful impact on industrial biomanufacturing and should be prioritized in that light.

Biomanufacturing efforts are different from the long-standing production of consumer goods in that they are heavily tied to a scientific vendor market. As we scale to production, part of our supply chain is a lot of sterile plastic disposable consumables. We compete with biomedical sectors for those, for personal protective equipment, and for other appliances. This supply chain issue has squeezed not just biomanufacturing, but scientific research in general.

We need something that isn’t always thought of as part of the supply chain: specialized infrastructural hardware. This hardware may not be manufactured domestically. Access to scale-up fermentation vessels is already squeezed. The other problem is that no matter where you build them, these vessels are designed for the deployment of canonical feedstocks and yeasts. Addressing the manufacturing locale would offer us the chance to innovate in vessel and process design and to support the kinds of novel fermentations on alternate feedstocks that are needed to advance industrial biomanufacturing. There are righteous calls for the construction of new pilot plants. We should make sure we take the opportunity to build for the right future.

One of the indisputable strengths of biomanufacturing is the potential for decentralization! Look at microbrewing: fermentation can happen anywhere without country-spanning feedstock pipelines. As we onboard overlooked feedstocks, it may only be practical to leverage them if some fermentation happens locally. As we look at supply chains and scale up we should model what that might look like for manufacturing, feedstock supply chains, and downstream processing. Not just at a national level, but at regional and local scales as well.

There are a lot of immediate policy needs for the bioeconomy, many of which are outlined in Executive Order 14081. How should these immediate needs be balanced with long-term needs? Is there a trade-off?

Counterintuitively, the most immediate needs will have the most distant payoffs! The tradeoff is that we can’t have every single detail nailed down before work begins. We will have to build tactically for strategic flexibility. Climate change and manufacturing robustness are life-or-death problems. We need to be open to more creative solutions in funding methods and timeline expectations; in who comes to the table and who around the table is given the power to effect change; and in messaging! The comfortable, familiar, traditional modes of action and funding have failed to accelerate our response to this crisis.

We have to get started on regulation yesterday, because the only thing that moves slower than technology is policy. We need to agree on meaningful, aggressive, and potentially unflattering metrics to measure progress and compliance. We need to define our terms clearly: what is “bio-based,” does it not have petroleum in it at all? What does “plant-based” mean? What percentage of a product has to be renewable to be labeled so? If it comes from renewable sources but its end-of-life is not circularizable, can we still call it “green”?

We need incentives for innovation and development that do not entrench a comfortable but unproductive status quo. We need to offer stability to innovators by looking ahead and proactively incubating the standards and regulations that will support safety, security, and intellectual property protection. We should evaluate existing standards and practices for inflexibility: if they only support the current technology and a tradition that has failed to deliver change, they will continue to deliver nothing new as a solution. 

We need to get on good footing with workforce development, as well. A truly multidisciplinary effort is critical and will take a while to pull off; it takes at least a decade to turn a high school student into a scientist. I only know of one national graduate fellowship that actually requires awardees to train seriously in more than one discipline. Siloing is a major problem in higher education and therefore in biomanufacturing. What passes for “multidisciplinary” is frequently “I am a computer scientist who is not rude to biologists” or “our company has both a chemical division and an AI division.” A cross-discipline “bilingual” workforce is absolutely critical to reuniting the skill sets needed to advance the bioeconomy. Organizations like BioMADE with serious commitments to developing a biomanufacturing workforce cannot effectively address the educational pipeline without significantly more support.

MicroByre is working to advance alternatives to substrates currently favored by the bioeconomy.

When we emphasize the collection of data — which data are we talking about? Is the data we have collected already a useful jumping-off point for what comes next? Are the models relevant for foreseeable changes in technology, regulation, and deployment? For some of it, absolutely not. As every responsible machine learning expert can tell you, data is not something you want to skimp on when collecting or curating. We have to be deliberate about what we collect, and why. Biases cannot all be avoided, but we have to take a beat to evaluate whether extant models, architectures, and sources are relevant, useful, or adaptable. A data model is as subject to the sunk cost fallacy as anything else. There will be pressure to leverage familiar models, with excuses made about the need for speed and the utility of transfer learning. We cannot let volume or nostalgia keep us from taking a sober look at the data and models we currently have, and which ones we actually need to get.

What are the major pain points the biomanufacturing industry is currently facing?

Downstream processing is the work of separating target molecules from the background noise of production. In purely chemical and petrochemical fields, separation processes are well established, extensively characterized, and relatively standardized. This is not the case in industrial biomanufacturing, where upstream flows are arguably more variable and complex than in petrochemicals. Producers on the biomedical side of biomanufacturing who make antibiotics, biologics, and other pharmaceuticals have worked on this problem for a long time. Their products tend to be more expensive and worth specialized handling, and the time that field has spent developing techniques in the urgent pursuit of human health works in its favor for innovation. However, separating fermentation broth from arbitrary commodity molecules is still a major hurdle for a bioindustrial sector already facing so many other simultaneous challenges. Without a robust library of downstream processing methods and a workforce conversant in their development and deployment, new industrial products are viewed as significant scaling risks and are funded accordingly.

There is fatigue as well. For the sake of argument, let us peg the onset of the modern era of industrial biomanufacturing to the turn of the latest century. There have been the requisite number of promises any field must make to build itself into prominence, but there has not been the progress that engenders trust in those or future promises. We have touted synthetic biology as the answer for two and a half decades, but our dependence on petroleum for chemicals is as intense as ever. The goodwill we need to shift an entire industry is not a renewable resource. It takes capital, it takes time, and it takes faith that those investments will pay off. But now the chemical companies we need to adopt new solutions have lost some confidence. The policymakers we need to lean into alternative paths and visionary funding are losing trust. If the public from whence government funding ultimately springs descends into skepticism, we may lose our chance to pivot and deliver.

This dangerous dearth of confidence can be addressed by doing something difficult: owning up to it. No one has ever said “oh goody — a chance to do a postmortem!”. But such introspective exercises are critical to making effective changes. A lack of reflection is a tacit vote for the status quo, which is comfortable because we’re rarely punished for a lack of advocacy. We should commission an honest look at the last thirty years — without judgment, without anger, and without the need to reframe disappointing attempts as fractional successes for granting agencies, or position singular successes as broadly representative of progress for egos. 

Biomanufacturing is so promising! With proper care and attention it will be incredibly transformative. The right investment right now will spell the difference between life and death on this planet for billions of people. We owe it to ourselves and to science to do it right — which we can only do by acknowledging what we need to change and then truly committing to those changes.

Corn sugar tends to be the most utilized biomass in the bioeconomy. What are the issues the U.S. faces if it continues to rely solely on corn sugar as biomass?

History shows that low-volume, high-margin fine chemicals can be made profitable on corn sugar, but high-volume, low-margin commodity chemicals cannot. Projects that produce fine chemicals and pharmaceuticals see commercial success but are constrained by feedstock availability and scaling capacity. Success in high-margin markets encourages people to use the exact same technology to attempt low-margin markets, but then they struggle to reduce costs and improve titers. When a commodity chemical endeavor starts to flag, it can pivot to high-margin markets. This is a pattern we see again and again. As long as corn sugar is the default biomass, it will not change; the United States will not be able to replace petrochemicals with biomanufacturing because the price of corn sugar is too high and cannot be technologically reduced. The pattern is also perpetuated because the yeast we usually ask to do biomanufacturing cannot be made to consume anything but corn sugar, and because we struggle to produce arbitrary chemicals in scalable amounts from corn sugar. We are stuck in an unproductive reinforcing spiral.

Even if commodity projects could profit using corn sugar, there is not enough to go around. How much corn sugar would we have to use to replace even a fifth of the volume of petroleum commodity chemicals we currently rely on? How much more land, nitrogen, water, and additional carbon emissions would be needed? Would chemical interests begin to overpower food, medical, and energy interests? What if a pathogen or natural disaster wiped out the corn crop for a year or two? Even if we could succeed at manufacturing commodities with corn sugar alone, locking out alternatives makes the United States supply chain brittle and vulnerable.

Gloved hands holding petri dish showing light green bacterial striations.

MicroByre is working to advance alternatives to substrates currently favored by the bioeconomy.

Continued reliance on corn sugar slows our technological development and stifles innovation. Specialists approaching manufacturing problems in their domain are necessarily forced to adopt the standards of neighboring domains. A chemical engineer is not going to work on separating a biomass into nutrition sources when no microbiologist is offering an organism to adopt it. A molecular biologist is not going to deploy a specialized metabolic pathway dependent on a nutrition source not found in corn sugar. Equipment vendors are not going to design tools at any scale that stray from a market demand overwhelmingly based on the use of corn sugar. Grantors direct funds with the guidance of universities and industry leaders, who are biased towards corn sugar because that’s what they use to generate quick prototypes and spin out new start up companies. 

The result of relying on corn sugar is an entrenched field and consequently we might lose our chance to make a difference. Without introducing low-cost, abundant feedstocks like wastes, we run the risk of disqualifying an entire field of innovation. 

What does the U.S. need to do in order for other biomass sources to be utilized beyond corn sugar? Are there ideas (or specific programs) that the U.S. government could supercharge?

Federal agencies must stop funding projects that propose to scale familiar yeasts on corn sugars to produce novel industrial chemicals. We must immediately stop funding biomass conversion projects meant to provide refined sugars to such endeavors. And we must stop any notion of dedicating arable land to corn sugar solely for the purpose of biomanufacturing new industrial products. The math does not and will not work out. The United States must stop throwing money and support at things that seem like they ought to succeed any minute now, even though we have been waiting for that success for 50 years without any meaningful change in the economic analysis or the technology available.

Ironically, we need to take a page from the book that cemented petroleum and car supremacy in this country. We need to do the kind of inglorious, overlooked, and subsequently taken-for-granted survey that enabled the Eisenhower Interstate System to be built.

We need to characterize all of the non-corn feedstocks and their economic and microbial ecosystems. We need to know how much of each biomass exists, what it is composed of, and who is compiling it where. We need to know what organisms rot it and what they produce from it. We need to make all of that data as freely available as possible to lower the barriers to entry for cross-disciplinary teams of researchers and innovators to design and build the necessary logistical, microbiological, chemical, and mechanical infrastructure. We need to prioritize and leverage the complex biomasses that cannot just be ground into yeast food.

We need to get the lay of the land so – to use the roadway analogy – we know where to pour the asphalt. An example of this sort of effort is the Materials Genome Initiative, which is a crosscutting multi-agency initiative for advancing materials and manufacturing technology. (And which has, to my chagrin, stolen the term “genome” for non-biological purposes.) An even more visible example to the public is a resource like the Plant Hardiness Zone Map that provides a basis for agricultural risk assessment to everyone in the country.

The United States needs to lean into an old strength and fund infrastructure that gives all the relevant specialties the ability to collaborate on truly divergent and innovative biomass efforts. The field of industrial biomanufacturing must make a concerted effort to critically examine a history of failed technical investments, shake off the chains of the status quo, and guide us into true innovation. Infrastructure is not the kind of project that yields an immediate return. If venture capital or philanthropy could do it, they would have already. The United States must flex its unique ability to work on a generational investment timeline; to spend money in the very short term on the right things so as to set everyone up for decades of wildly profitable success — and a safer and more livable planet.

Governance of AI in Bio: Harnessing the Benefits While Reducing the Risks

Artificial Intelligence (AI) has gained momentum in the last six months and has become impossible to ignore. The ease of use of new tools such as AI-driven text and image generators has driven significant discussion about the appropriate use of AI. Congress has also started digging into AI governance. Discussion has focused on a wide range of social consequences of AI, including biosecurity risks that could arise. To develop an overarching framework that addresses bio-related risks, it will be crucial for Congress, federal agencies, and various non-governmental AI stakeholders to work together.

Bio Has Already Been Utilizing AI For Decades

Artificial intelligence has a long history in the life sciences; the principles are not new. Turing developed the idea in the 1950s and, by the turn of the century, bioinformaticians (data scientists for biological data) were already using AI in genome analysis. One focus of AI tools for biology has been proteins. Nearly every known function in your body relies on proteins, and their three-dimensional shapes determine their functions. Predicting the shape of a protein has long been a critical challenge. In 2020, Alphabet’s DeepMind published AlphaFold 2, an AI-enabled software tool capable of doing just that. While not perfect, scientists have been able to use it and related tools to predict the shapes of proteins faster and even to create new proteins optimized for specific applications. Of course, the applications of AI in biotechnology extend beyond proteins. Medical researchers have taken advantage of AI to identify new biomarkers and to improve diagnostic tests. Industrial biotechnology researchers are exploring the use of AI to optimize biomanufacturing processes and improve yield. In other natural sciences, AI can already drive entire courses of experiments with minimal human input, and similar biological labs are in development. Unfortunately, these same tools and capabilities could also be misused to cause harm by actors trying to develop toxins, pathogens, and other potential bio risks.

Proposed Bio x AI Solutions Are Incomplete 

Congress is looking for ways to reduce AI risks, beginning with social implications such as disinformation, employment decision making, and other areas encountered by the general public. These are excellent starting points and echo some concerns abroad. Some Congressional action has also called for sweeping studies, new regulatory commissions, or broadly scoped risk management frameworks (see the AI Risk Management Framework developed by NIST). While some recently proposed bills address AI concerns in healthcare, there have been few solutions for reducing risks specifically related to intersections of AI with biosciences and biotechnology. 

The Biden Administration recently reached agreements with leaders in the development of AI models to implement risk mitigation measures, including ones related to biosecurity. However, all of the current oversight mechanisms for AI models are voluntary, which has generated discussion on how to provide incentives and whether a stronger approach is needed. As the availability of AI models increases and models specific to biosciences and biotechnology become more sophisticated, this question about how to establish enforceable rules and appropriate degrees of accountability while minimizing collateral impact on innovation will become more urgent.

Approaches to governance for AI’s intersections with biology must also be tailored to the needs of the scientific community. As AI-enabled biodesign tools drive understanding and innovation, they will also decrease hurdles for malicious actors seeking to do harm. At the same time, data sharing, collaboration, and transparency have long been critical to advances in biosciences. Restricting AI model development or access to data, models, or model outputs without hampering legitimate research and development will be challenging. Implementing guardrails for these tools should be done carefully and with a solid understanding of how they are used and how they might be misused. Key questions for oversight of AI in bio include:

Now, While the Policy Window is Open

Recently, the National Defense Authorization Act for Fiscal Year 2022 created the bipartisan, intergovernmental National Security Commission on Emerging Biotechnology. The NSCEB has been tasked with creating an interim report by the end of 2023 and a final policy recommendation report by the end of 2024 with recommendations for Congressional action. One of the areas they are looking into is the intersection of AI and biosciences, specifically how AI technology can enable innovation in the biosciences and biotechnologies while mitigating risks. 

The current attention on AI and the upcoming interim report to Congress by the NSCEB provide an important policy window and act as a call to action: stakeholder input is needed to create governance and policy recommendations that enable innovation while mitigating risks. If you are an AI or bio expert within academia, the biotech industry, an AI lab, or another non-governmental organization and are interested in contributing policy ideas, we have just launched our Bio x AI Policy Development Sprint here. Timely, well-considered policy recommendations that address the key questions listed above will lead to the best possible implementation of AI in biosciences and biotechnology.

The Future of Mobility in Michigan

The Detroit Regional Partnership (DRP) will use $52 million from the EDA to transition the region’s legacy automotive industry into a globally competitive advanced mobility cluster. The Global Epicenter of Mobility (GEM) coalition will do this through a new Supply Chain Transformation Center and a Mobility Accelerator Innovation Network that will bolster existing pillars of support in the region’s ecosystem.

Maureen Krauss is the President and Chief Executive Officer of the Detroit Regional Partnership. In December 2022, Maureen testified at a House Science, Space, and Technology subcommittee hearing on Building Regional Innovation Economies. You can find her testimony here. Christine Roeder is the Executive Vice President of the GEM coalition and has over 20 years of experience in economic development and the automotive industry across Michigan. She previously held various senior leadership roles at the Michigan Economic Development Corporation (MEDC).

This interview is part of an FAS series on Unleashing Regional Innovation, in which we talk to leaders building the next wave of innovative clusters and ecosystems in communities across the United States. We aim to spotlight their work and help other communities learn by example. Our first round of interviews is with finalists of the Build Back Better Regional Challenge run by the Economic Development Administration as part of the American Rescue Plan.

Ryan Buscaglia: Could you tell me a little bit about the history of your coalition and how it came together in Detroit?

Maureen Krauss: We represent a region of five and a half million people. That is not a federal formula region or a state formula region. It is a self-chosen region where people and communities have decided they want to work together. So that always makes things a little easier. We’ve been doing this for a long time in economic development. And when you look at the seven clusters we focus on, the mobility cluster—advanced mobility—is right now responsible for about 70% of our workload. So it’s one we’re immersed in. Everyone knows Detroit as the auto town, the auto region. Michigan—the auto state. We have been seeing this transformation every day from ICE to EV (internal combustion engine to electric vehicle), but it’s really more than auto, right? It has to do with aerospace and defense and a lot of other industries here that are shaped by how the mobility industry is changing.

So it was interesting when we first convened regional partners who work in this space. We actually didn’t pick mobility. We said: ‘What should it be? What topics should it be?’ And we were really pleased when people got back to us on that: 19 out of 20 said mobility. So that was a clear sign that this was a space we needed to really focus on. 

The one other thing I will say, Christine and I worked together back in 2009 and 2010 when our auto economy imploded, and we learned a lot from that. We did not want that to happen again. Our global epicenter of mobility approach is to proactively embrace this change and ensure that our talent and our small and medium sized companies and our entrepreneurs can keep pace with the global shift, with the big original equipment manufacturers (OEMs), with the huge tier-one suppliers, and make sure they have access to the resources and the research that they need to make the transition. At one point I said to the EDA: we don’t want to ever come back to the federal government for a bailout. We want to show that we’re proactively recognizing this change and doing something about it. Christine, I don’t know if you want to share any other insight around that on how people got involved in it [the coalition].

Christine Roeder: My prior work for 20 plus years was with the state level economic development group doing business development. And so a lot of my time—if not 90 plus percent of my time over that couple of decades—was spent on automotive projects in Southeast Michigan.

Knowing this ecosystem of players in workforce development, the entrepreneurs, the different incubators we have, and the work with research institutions like the University of Michigan and Wayne State—we’re really fortunate in this region. To the DRP’s credit, when they were pulling this application together, it was not a situation of “well, this is the group we have and this is how we’re moving forward.” It was “how can we make the table bigger?” I’ve heard Maureen say that a number of times: how can we make the table bigger? Who else is missing, and how can we bring them to the table?

Dodge Factory, 1916

Everyone knows Detroit as the auto town, the auto region – but it’s really much more than auto.

So I’m really thrilled about the work we’re doing within these different pillars of projects for the EDA, because they have brought together groups that had never worked together in their whole time doing workforce development or economic development in the Detroit metro region. We’re building trust across different organizations and educating them about the capabilities each one has.

You mentioned the historic roots of this coalition coming out of the incredibly tough period of 2009 and 2010. Could you talk about the lessons from that period and how that informed the projects and coalition that you’re working on today?

Maureen: There were some really significant programs that came out of that period of time. Some did not work out. They didn’t pan out, they weren’t needed. But we had to really take a look and say— okay, we’re very grateful for this auto industry here, what would we be without it? It provides great jobs, great quality of life, and jobs across the spectrum from manufacturing to technical research and development. We have these assets, what else can we do with them? 

So what happened as this coalition was building: we saw the different components be created to address very specific needs, and our approach on this EDA grant was we don’t want to invent something new. It was funny how the EDA asked us in all the meetings, well, is there a construction project? Or are we building a new thing? No, we have some existing pillars that are quite strong! We want the funds to accelerate their work, and make sure that historically excluded communities (HECs) will be able to have access. 

We looked at a very broad definition of HECs not just from a racial lens, but for instance, our region is very urban and suburban and rural. So how do we ensure that a successful program in Ann Arbor or Detroit can reach companies and people in the rural areas of our region too? So it was really, as Christine said, expanding that table and not creating yet another program.

And then we looked at the shared components that we can all benefit from. That’s the strength of one of our components called GEM-Central and the whole research piece. Entrepreneurs and small and medium sized companies can’t hire high-end firms like McKinsey or Boston Consulting Group to do their research, but there’s a lot of shared information there. So we wanted to strengthen that and have it available to all. 

And then very importantly in the DEI space, we’re a very diverse region, and it’s very authentic here. But we want to ensure that our small and medium sized companies also understand how to be more diverse, how to really embrace all of the cultures that are here in the Detroit area. You know, a lot of these smaller firms barely have an HR person, let alone a DEI officer. So that’s one of our activities. We want to make sure that small and medium sized companies understand how to incorporate DEI into their companies to provide greater opportunities for all because we do believe that we have a great talent pool here. Sometimes you just have to look in different places than the traditional sources. And that really encompasses a lot of our DEI work and allowing access to those findings and those paths for companies that might just have a part time HR person doing payroll.

How are you bringing people to the table and trying to be inclusive in the process of developing this future-oriented cluster? Did it look like weekly meetings, or did it look like town halls? Did you go to people in the community?

Maureen: Our diversity here is very authentic. So it’s not like we had to one day say, who can we call that represents Black entrepreneurs or whatever. We work with these people every day. It was really as Christine said. We met every week for I don’t know how many weeks, and we typically had about 80 people on each call. We never met in person, either. Remember, this all started during COVID. But we had these meetings and we invited a big group of everyone we knew. Who was the Black business Chamber of Commerce here? Get them in. Detroit Future City? Get them in. Our rural areas’ economic development partners and others? Get them in. It never seemed forced; it was just making sure they were in the conversations and there was that representation. It was the weekly mantra—who’s not here? Tell your friend, tell us who we can include on an invite.

We have the second largest Arab population outside the Middle East in the Detroit area. There’s a very strong Hispanic Chamber of Commerce here. So Ernst and Young (EY) helped us bring all those people together and have those conversations. We did two sessions with them to really listen to what their thoughts and ideas and approaches would be to make sure that we were inclusive. I don’t think any of us feel comfortable that we’ve solved this issue. Right? But we’re gonna work super hard to be better and more inclusive in everything we do. And then Christine, if you want to talk about the global initiative as well and how they were engaged?

Christine: As this project came together we split it up into different pillars of activities. The talent pillar includes Global Detroit, which is an organization that works day in and day out bringing immigrants and the world community to Michigan, and specifically Southeast Michigan, for job opportunities, entrepreneurship opportunities, and to become part of our ecosystem, as well as standing up for those groups that are already here in Detroit. As Maureen mentioned, we have a very strong Arab American presence here. We have a growing Bangladeshi population in Hamtramck. It’s wonderful to see, and integrating those communities is really important, especially with the need to help them build businesses and hire people.

The entrepreneurship program at Global Detroit is one of the funded partners of the EDA under our umbrella of GEM. Whether it’s with that group or another partner: for example, yesterday and the day before, here in downtown Detroit at our large convention center, the Michigan Minority Business Development Council had their annual minority procurement conference. They had hundreds of companies and hundreds of exhibits on the floor, and two of our partners exhibited there. One of them is the group that’s reaching out to the legacy companies, the ICE (internal combustion engine) companies that Maureen mentioned, and the other is our talent and workforce pillar. So we had two of our pillars represented there, reaching out into the minority business community and talking about GEM. We’re looking to do things differently and dig way deeper into these historically excluded communities to make sure that they’re part of the solution.

I know that a conversation people are having is around the struggle of trying to link up a series of small and mid-sized manufacturers who may or may not be able to plug into different OEM supply chains at different places. So hearing that you’ve connected that with the talent piece is wonderful, hopefully creating a value chain that is inclusive and meets all the needs of a globally competitive market at the end of the day. Is that initiative connected with the new Supply Chain Transformation Center that is a critical part of your project?

Christine: The Supply Chain Transformation Center is the legacy company component, and they were there yesterday. So yes, it’s tied into that and working with those companies, particularly companies they’ve never worked with before in more rural areas and/or minority-owned. All of those companies have components and products that are at risk of being extinct in the next 20 years. One of the lessons that Maureen and I learned back in 2008 and 2009 was that companies that only made one part for one or two customers suddenly had no idea what they were going to do when those customers filed for bankruptcy.

Diversifying those companies into other products is what the supply chain transformation pillar is really doing. It is identifying those companies and working with them: what is the product, and where else could it be applied? What’s your machinery like, and what else could it make? What’s your talent like, and what can we upskill them into doing? And then how do we make sure that company continues to be a company as the product line they’re currently making shrinks down to a handful of suppliers that will continue to build or produce those parts? ICE vehicles are not going away; it’s going to take decades for them all to come off the roads, right? But we’re not going to have as many of them, and there aren’t going to be as many produced at the large scale. So how do we help those companies find other customers to build their parts?

Maureen: We’re trying to create a path for the customer journey and connect all the different components of GEM. We have six of these really strong pillars—we do hate to call them pillars, but ‘co-recipient’ sounds very government-like, so we haven’t found the perfect phrase—and we have to make sure that they are aligned. It’s almost like a handoff. Your company is going to make a new part at your legacy company and you’re going to transition, but then we have to talk about the talent: what do we do with your existing talent so they don’t just get left behind, so they have the skills to make that new part? We really are trying to be thoughtful about making sure these programs are connected, talking to one another, and sharing this customer journey. And in the end our customer is our people, right? Whether they represent individual talent or small or medium sized companies. We just need to ensure that all the pieces flow together nicely.

Mural reading “Nothing stops Detroit”

How have you been navigating this transition from people thinking of Detroit as the auto town, and Michigan the auto state, and moving from that vision to the next 10-20 years as people make this transition from ICE to EV? I’m wondering how you tell the story of that transition to people and if you ever get pushback from people who might want to double down on the historic focus on ICE automobiles. 

Maureen: You’re always going to have naysayers. I never got my late parents to use a cellphone properly, and that was frustrating. Bless them, but it just didn’t work. I do think our people are very resilient. We’ve learned about change before, and we learned what happens when you don’t embrace that change in the right way. There are still going to be naysayers: “oh, we will never have enough charging stations,” “oh, who is going to supply all the hydrogen?” Those things are going to be a part of the conversation.

That’s why our research component is so important: so we have the right data to show it’s not either/or. There’s not going to be one date in the future where we switch from ICE to EV. We have to transition, and it’s happening very quickly. We just had a big announcement in DC this week with Secretary Raimondo about a project that’s coming to Michigan with 600 jobs in the hydrogen electrolyzer space; the company is called NEL. So this is happening in a big way, but we want to be sensitive to the fact that change is hard.

Christine: I would just add that with federal regulations coming down that companies need to meet in a short amount of time, there’s no question that this transition is happening. We are going to do what we can to save as many jobs and as many companies as we can, and continue to attract companies in this new supply chain, as Maureen just mentioned with this week’s announcement in the hydrogen space. It’s not just electric vehicles that run on batteries; it is hydrogen too. There are many other ways that our mobility companies are going to wean us from fossil fuels and move us towards more renewable choices. There are going to be naysayers, but we need to follow where the puck is going, not just follow the puck.

Maureen, you testified before the House Science, Space, and Technology Committee’s Research and Technology Subcommittee last December. How did it feel to stand up on such a stage and speak up for your region like that?

Maureen: I’m extraordinarily proud. I grew up here; it’s my home. Five years ago they wouldn’t have asked Detroit to testify for this, so it meant so much that they were asking us. I had members of Congress from around the country asking me, “How did you do this? How did you work together and avoid somebody going rogue?” I’m proud our region realized the strength of working together. Five years ago we would have had five different applications from this region, and none of them would have been strong enough to be successful. It was interesting when you saw who called us initially to tell us DRP needed to lead this: it was our council of governments, SEMCOG, somebody from Ann Arbor, from Automation Alley. These were all mature organizations who recognized, “Why would we apply individually? Let’s do this together.”

Those of us who are here know the great innovation that happens here. We tour the facilities and it’s a marvel. For a long time our story got overshadowed by other issues: bankruptcy or crime or vacant buildings. Now, not to be sappy, but being recognized as an example of innovation made me very proud to tell that story on a national scale.

If you’re successful in doing what you proposed, what’s your hopeful vision for what Detroit will look like in 10-15 years?

Maureen: You’ve probably seen a shirt somewhere that says ‘Detroit vs Everybody’. It came out a few years ago, when the city got beat up quite a bit. Where we’re at today is so different; it’s such a different place. But it takes a while for a community to go down, and it takes a while for it to come back. When I tell my future grandchildren what grandma did, I want to show them the amazing people and companies that chose to come here because they saw it as an opportunity to be innovative, to be their best selves, and to be successful. People from all over the country and the world.

Christine: Ten years from now, the seeds we are planting will grow into a recognized entrepreneurial ecosystem. Because of the jobs available in the auto industry, Southeast Michigan has not historically been an area that nurtures entrepreneurship and brings new ideas to market. Ten years from now I hope we’re seeing headlines of incubators bustling and of companies that we put into the funnel as an idea raising their Series A in venture capital to grow jobs and keep them in Michigan, using the talent we’re training through GEM and working with universities to make sure commercialization is happening here, so we are able to grow these new business owners. We adore and support our automakers and all the supply chains through them. My daughters are teenagers right now, and I would love for my children to have that choice coming out of school: if they have an idea for something they want to create, they can do it here in Detroit. They don’t have to go to a coast to do it. The GEM coalition, and the entrepreneurship and startup piece especially, is where I would like to see the work we’re doing have a real impact.