A Focused Research Organization to Build the Foodome Project for the Future of Nutrition
Our current knowledge of the biochemical compounds in food is incredibly limited, but existing databases of MassSpec scans contain massive amounts of untapped, unannotated information about food ingredients. A project to leverage these databases with the tools of data mining, AI, and high-throughput measurement will systematically unveil the chemical composition of all food ingredients and revolutionize our understanding of food and health.
Diet is the single biggest determinant of health over which we have direct control. An unhealthy diet poses a greater risk of morbidity than alcohol, tobacco, drug use, and unsafe sex combined. Indeed, our diet exposes us to thousands of food molecules, many of which are known to play an important role in multiple diseases, including coronary heart disease, cancer, stroke, and diabetes. Despite the demonstrated and complex role of diet in health, nutrition science remains focused on molecules that serve as energy sources, such as sugars, fats, and vitamins, leaving most disease-causing compounds uncatalogued and invisible to researchers and health care professionals. Further, our current understanding of how food affects health is limited to nutritional guidelines that rely on a panel of 150 essential micro- and macro-nutrients in our diet. This is a tiny fraction of the more than 130,000 compounds known to be present in food, limiting our ability to unveil the health implications of our diet.
Project Concept
The Foodome project aims to unveil this “dark matter of nutrition” by creating an open-access high-resolution compendium of food compounds through a strategy that combines Big Data, ML/AI, and experimental techniques, implemented by a focused cross-disciplinary team, motivated to bring transformative change and maximize public benefits.
In the past five years, BarabásiLab has curated the largest library of compounds in food, consisting of more than 135,000 biochemicals linked to 3,500 foods. While the number of biochemicals is exceptional, the coverage is highly uneven, sparse, and largely unquantified. Yet information about the missing biochemicals is carried by the unannotated peaks present in each MassSpec scan of food ingredients. Because these chemicals are invisible to the one-chemical-one-peak annotation tools employed today, we have designed a strategy that relies on data mining, AI, and high-throughput measurements to resolve them: we plan to collect the more than 3,000,000 MassSpec scans already available in databases and mine the full scientific literature to collect knowledge on food composition. We also plan to take advantage of the growing number of annotated genomes to infer the chemical makeup of food ingredients. These data will serve as input for an ML/AI platform designed to learn associations between biochemical structures and an ingredient's phylogenetic position, helping us systematically unveil the chemical composition of all food ingredients.
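As a rough illustration of this idea, the sketch below frames compound detection as a supervised prediction problem: an ingredient's phylogenetic features and its binned, unannotated MassSpec peak intensities are used to predict whether a given compound is present. Everything in the sketch (features, data, and model choice) is a hypothetical placeholder, not the project's actual pipeline.

```python
# Hypothetical sketch: predict whether a compound is present in an ingredient
# from (a) phylogenetic features and (b) binned, unannotated MassSpec peak
# intensities. The data below are random placeholders, not real Foodome data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_ingredients, n_phylo, n_peak_bins = 500, 20, 200

phylo_features = rng.normal(size=(n_ingredients, n_phylo))        # e.g., encoded taxonomy
peak_intensities = rng.random(size=(n_ingredients, n_peak_bins))  # binned m/z intensities
X = np.hstack([phylo_features, peak_intensities])
y = rng.integers(0, 2, size=n_ingredients)  # placeholder label: compound present or not

model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"Cross-validated AUC (placeholder data): {scores.mean():.2f}")
```

In principle, one such model per compound (or a single multi-label model) trained on literature-mined annotations could fill in the unmeasured entries of the food-compound matrix, though the actual platform design may differ.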
What is a Focused Research Organization?
Focused Research Organizations (FROs) are time-limited mission-focused research teams organized like a startup to tackle a specific mid-scale science or technology challenge. FRO projects seek to produce transformative new tools, technologies, processes, or datasets that serve as public goods, creating new capabilities for the research community with the goal of accelerating scientific and technological progress more broadly. Crucially, FRO projects are those that often fall between the cracks left by existing research funding sources due to conflicting incentives, processes, mission, or culture. There are likely a large range of project concepts for which agencies could leverage FRO-style entities to achieve their mission and advance scientific progress.
This project is suited for an FRO-style approach because the Foodome platform and knowledge base will address problems in health science beyond the capacity of any single academic group or start-up. The project started in an academic environment involving groups at Northeastern University, Harvard Medical School, and Tufts Medical School, but typical academic researchers and institutions are driven by short-term publication strategies and unable to devote the years needed to develop a public resource. Federal nutrition research funding exists, but it is fragmented, and normal funding channels are generally unable to offer sustained support for a project of this size. With VC funding, we were able to move the project to a startup environment to standardize the toolset and develop key technologies, but company management decided that the Foodome platform's timeline was too far from the market. Based on these experiences, an FRO appears to be the best framework to accomplish the vision of Foodome. The project enters a field limited by technological stagnation, and it will fundamentally change our understanding of health and disease, impacting multiple fields and industries.
How This Project Will Benefit Scientific Progress
A high-resolution knowledge base on the composition of food will revolutionize our ability to explore the role of each food-borne molecule in human health, impacting multiple fields: 1) It will be transformative for health care, changing our ability to prevent and control disease. 2) It will aid the development of healthier, more nutritious, and biochemically balanced foods. 3) It will facilitate the development of novel pharmaceuticals. 4) By improving MassSpec annotations, it will provide more accurate biochemical descriptions of any sample, empowering diagnosis and detection. 5) It will unlock innovations in personalized nutrition and precision medicine, allowing clinicians to offer precision advice to patients on how to use diet to prevent and manage disease.
Key Contacts
Author
- Albert-László Barabási, Northeastern and Harvard Medical School, alb@neu.edu
Referrers
- Alice Wu, Federation of American Scientists, awu@fas.org
Flexible Hiring Resources For Federal Managers
From education to clean energy, immigration to wildfire resilience, and national security to fair housing, the American public relies on the federal government to deliver on critical policy priorities.
Federal agencies need to recruit top talent to tackle these challenges quickly and effectively, yet often are limited in their ability to reach a diverse pipeline of talent, especially among expert communities best positioned to accelerate key priorities.
FAS is dedicated to bridging this gap by providing a pathway for diverse scientific and technological experts to participate in an impactful, short-term “tour of service” in federal government. The Talent Hub leverages existing federal hiring mechanisms and authorities to place scientific and technical talent into places of critical need across government.
The federal government has various flexible hiring mechanisms at its disposal that can help federal teams address the complex and dynamic needs they have while tackling ambitious policy agendas and programs. Yet information about how to best utilize these mechanisms can often feel elusive, leading to a lack of uptake.
This resource guide provides an overview of how federal managers can leverage their available hiring mechanisms and the Talent Hub as a strategic asset to onboard the scientific and technical talent they recruit. The accompanying toolkit includes information for federal agencies interested in better understanding the hiring authorities at their disposal to enhance their existing scientific and technical capacities, including how to leverage the Intergovernmental Personnel Act (IPA) Mobility Program and Schedule A(r) fellowship hiring.
FAS Forum: Envisioning the Future of Wildland Fire Policy
In this critical year for reimagining wildland fire policy, the Federation of American Scientists (FAS) hosted a convening that provided stakeholders from the science, technology, and policy communities with an opportunity to exchange forward-looking ideas with the shared goal of improving the federal government’s approach to managing wildland fire.
A total of 43 participants attended the event. Attendee affiliations included universities, federal agencies, state and local agencies, nonprofit organizations, and philanthropies.
This event was designed as an additive opportunity for co-learning and deep dives on topics relevant to the Wildland Fire Mitigation and Management Commission (the Commission) with leading experts in relevant fields (the convening was independent from any formal Commission activities).
In particular, the Forum highlighted, and encouraged iteration on, ideas emerging from leading experts who participated in the Wildland Fire Policy Accelerator. Coordinated by FAS in partnership with COMPASS, the California Council on Science and Technology (CCST), and Conservation X Labs, this accelerator has served as a pathway to source and develop actionable policy recommendations to inform the work of the Commission.
A full list of recommendations from the Accelerator is available on the FAS website.
A PDF summarizing discussions and key takeaways from the event is available for participant reference. We look forward to building on the connections made during this event.
AI for science: creating a virtuous circle of discovery and innovation
In this interview, Tom Kalil discusses the opportunities for science agencies and the research community to use AI/ML to accelerate the pace of scientific discovery and technological advancement.
Q. Why do you think that science agencies and the research community should be paying more attention to the intersection between AI/ML and science?
Recently, researchers have used DeepMind’s AlphaFold to predict the structures of more than 200 million proteins from roughly 1 million species, covering almost every known protein on the planet! Although not all of these predictions will be accurate, this is a massive step forward for the field of protein structure prediction.
The question that science agencies and different research communities should be actively exploring is – what were the pre-conditions for this result, and are there steps we can take to create those circumstances in other fields?
One partial answer to that question is that the protein structure community benefited from a large open database (the Protein Data Bank) and what linguist Mark Liberman calls the “Common Task Method.”
Q. What is the Common Task Method (CTM), and why is it so important for AI/ML?
In a CTM, competitors share the common task of training a model on a challenging, standardized dataset with the goal of achieving a better score. One paper noted that common tasks typically have four elements:
- Tasks are formally defined with a clear mathematical interpretation
- Easily accessible gold-standard datasets are publicly available in a ready-to-go standardized format
- One or more quantitative metrics are defined for each task to judge success
- State-of-the-art methods are ranked in a continuously updated leaderboard
Computational physicist and synthetic biologist Erika DeBenedictis has proposed adding a fifth component, which is that “new data can be generated on demand.” Erika, who runs Schmidt Futures-supported competitions such as the 2022 Bioautomation Challenge, argues that creating extensible living datasets has a few advantages. This approach can detect and help prevent overfitting; active learning can be used to improve performance per new data point; and datasets can grow organically to a useful size.
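To make these elements concrete, here is a minimal, hypothetical sketch of a common-task harness: a fixed gold-standard test set, one quantitative metric, and a leaderboard ranking submissions that all arrive in the same standardized format. The task, data, and team scores are invented placeholders, not a real benchmark.

```python
# Hypothetical common-task harness: fixed test set, one metric, a leaderboard.
# The dataset and submissions below are placeholders, not a real benchmark.
import numpy as np

rng = np.random.default_rng(42)
y_true = rng.integers(0, 2, size=1000)  # gold-standard labels (placeholder)

def score(y_pred, y_true=y_true):
    """Task metric: classification accuracy on the shared test set."""
    return float(np.mean(np.asarray(y_pred) == y_true))

# Each team submits predictions in the same standardized format.
submissions = {
    "baseline": rng.integers(0, 2, size=1000),
    "team_a":   np.where(rng.random(1000) < 0.8, y_true, 1 - y_true),  # ~80% correct
    "team_b":   np.where(rng.random(1000) < 0.9, y_true, 1 - y_true),  # ~90% correct
}

# Continuously updated leaderboard: rank every submission by the shared metric.
leaderboard = sorted(((score(p), name) for name, p in submissions.items()), reverse=True)
for acc, name in leaderboard:
    print(f"{name:10s} accuracy = {acc:.3f}")
```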
Common Task Methods have been critical to progress in AI/ML. As David Donoho noted in 50 Years of Data Science,
The ultimate success of many automatic processes that we now take for granted—Google translate, smartphone touch ID, smartphone voice recognition—derives from the CTF (Common Task Framework) research paradigm, or more specifically its cumulative effect after operating for decades in specific fields. Most importantly for our story: those fields where machine learning has scored successes are essentially those fields where CTF has been applied systematically.
Q. Why do you think that we may be under-investing in the CTM approach?
U.S. agencies have already started to invest in AI for Science. Examples include NSF’s AI Institutes, DARPA’s Accelerated Molecular Discovery, NIH’s Bridge2AI, and DOE’s investments in scientific machine learning. The NeurIPS conference (one of the largest scientific conferences on machine learning and computational neuroscience) now has an entire track devoted to datasets and benchmarks.
However, there are a number of reasons why we are likely to be under-investing in this approach.
- These open datasets, benchmarks and competitions are what economists call “public goods.” They benefit the field as a whole, and often do not disproportionately benefit the team that created the dataset. Also, the CTM requires some level of community buy-in. No one researcher can unilaterally define the metrics that a community will use to measure progress.
- Researchers don’t spend a lot of time coming up with ideas if they don’t see a clear and reliable path to getting them funded. Researchers ask themselves, “what datasets already exist, or what dataset could I create with a $500,000 – $1 million grant?” They don’t ask the question, “what dataset + CTM would have a transformational impact on a given scientific or technological challenge, regardless of the resources that would be required to create it?” If we want more researchers to generate concrete, high-impact ideas, we have to make it worth the time and effort to do so.
- Many key datasets (e.g., in fields such as chemistry) are proprietary, and were designed prior to the era of modern machine learning. Although researchers are supposed to include Data Management Plans in their grant applications, these requirements are not enforced, data is often not shared in a way that is useful, and data can be of variable quality and reliability. In addition, large dataset creation may sometimes not be considered academically novel enough to garner high impact publications for researchers.
- Creation of sufficiently large datasets may be prohibitively expensive. For example, experts estimate that the cost of recreating the Protein Data Bank would be $15 billion! Science agencies may need to also explore the role that innovation in hardware or new techniques can play in reducing the cost and increasing the uniformity of the data, using, for example, automation, massive parallelism, miniaturization, and multiplexing. A good example of this was NIH’s $1,000 Genome project, led by Jeffrey Schloss.
Q. Why is close collaboration between experimental and computational teams necessary to take advantage of the role that AI can play in accelerating science?
According to Michael Frumkin of Google Accelerated Science, what is even more valuable than a static dataset is a data generation capability, with a good balance of latency, throughput, and flexibility. That's because researchers may not immediately identify the right “objective function” that will result in a useful model with real-world applications, or the most important problem to solve. This requires iteration between experimental and computational teams.
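One simple way to picture that iteration is an active-learning loop: a surrogate model is refit to the measurements collected so far, and the next experiment is chosen where the model is most uncertain. The sketch below is a minimal, hypothetical illustration of that loop; run_experiment() is a stand-in function, not a real assay or lab interface.

```python
# Hypothetical closed-loop sketch: a Gaussian-process surrogate picks the next
# experiment where its uncertainty is highest. run_experiment() is a stand-in
# for a real measurement pipeline.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def run_experiment(x):
    """Placeholder for a wet-lab measurement at condition x."""
    return float(np.sin(3 * x) + 0.1 * np.random.randn())

candidates = np.linspace(0, 2, 200).reshape(-1, 1)   # conditions we could try
X_done, y_done = [[0.0], [2.0]], [run_experiment(0.0), run_experiment(2.0)]

gp = GaussianProcessRegressor()
for round_id in range(5):
    gp.fit(np.array(X_done), np.array(y_done))
    _, std = gp.predict(candidates, return_std=True)
    x_next = candidates[int(np.argmax(std))]          # most uncertain condition
    X_done.append(list(x_next))
    y_done.append(run_experiment(float(x_next[0])))
    print(f"round {round_id}: measured x = {float(x_next[0]):.2f}")
```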
Q. What do you think is the broader opportunity to enable the digital transformation of science?
I think there are different tools and techniques that can be mixed and matched in a variety of ways that will collectively enable the digital transformation of science and engineering. Some examples include:
- Self-driving labs (and eventually, fleets of networked, self-driving labs), where machine learning is not only analyzing the data but informing which experiment to do next.
- Scientific equipment that is high-throughput, low-latency, automated, programmable, and potentially remote (e.g. “cloud labs”).
- Novel assays and sensors.
- The use of “science discovery games” that allow volunteers and citizen scientists to more accurately label training data. For example, the game Mozak trains volunteers to collaboratively reconstruct complex 3D representations of neurons.
- Advances in algorithms (e.g. progress in areas such as causal inference, interpreting high-dimensional data, inverse design, uncertainty quantification, and multi-objective optimization).
- Software for orchestration of experiments, and open hardware and software interfaces to allow more complex scientific workflows.
- Integration of machine learning, prior knowledge, modeling and simulation, and advanced computing.
- New approaches to informatics and knowledge representation – e.g. a machine-readable scientific literature, and an increasing number of experiments that can be expressed as code and are therefore more replicable.
- Approaches to human-machine teaming that allow the best division of labor between human scientists and autonomous experimentation.
- Funding mechanisms, organizational structures and incentives that enable the team science and community-wide collaboration needed to realize the potential of this approach.
There are many opportunities at the intersection of these different scientific and technical building blocks. For example, use of prior knowledge can sometimes reduce the amount of data that is needed to train an ML model. Innovation in hardware could lower the time and cost of generating training data. ML can predict the answer that a more computationally intensive simulation might generate. So there are undoubtedly opportunities to create a virtuous circle of innovation.
Q. Are there any risks of the common task method?
Some researchers are pointing to negative sociological impacts associated with “SOTA-chasing” – e.g. a single-minded focus on generating a state-of-the-art result. These include reducing the breadth of the type of research that is regarded as legitimate, too much competition and not enough cooperation, and overhyping AI/ML results with claims of “super-human” levels of performance. Also, a researcher who makes a contribution to increasing the size and usefulness of the dataset may not get the same recognition as the researcher who gets a state-of-the-art result.
Some fields that have become overly dominated by incremental improvements in a metric have had to introduce Wild and Crazy Ideas as a separate track in their conferences to create a space for more speculative research directions.
Q. Which types of science and engineering problems should be prioritized?
One benefit to the digital transformation of science and engineering is that it will accelerate the pace of discovery and technological advances. This argues for picking problems where time is of the essence, including:
- Developing and manufacturing carbon-neutral and carbon-negative technologies we need for power, transportation, buildings, industry, and food and agriculture. Currently, it can take 17-20 years to discover and manufacture a new material. This is too long if we want to meet ambitious 2050 climate goals.
- Improving our response to future pandemics by being able to more rapidly design, develop and evaluate new vaccines, therapies, and diagnostics.
- Addressing new threats to our national security, such as engineered pathogens and the technological dimension of our economic and military competition with peer adversaries.
Obviously, it also has to be a problem where AI and ML can make a difference, e.g. ML’s ability to approximate a function that maps between an input and an output, or to lower the cost of making a prediction.
Q. Why should economic policy-makers care about this as well?
One of the key drivers of the long-run increases in our standard of living is productivity (output per worker), and one source of productivity is what economists call general purpose technologies (GPTs). These are technologies that have a pervasive impact on our economy and our society, such as interchangeable parts, the electric grid, the transistor, and the Internet.
Historically, GPTs have required other complementary changes (e.g. organizational changes, changes in production processes and the nature of work) before their economic and societal benefits can be realized. The introduction of electricity eventually led to massive increases in manufacturing productivity, but not until factories and production lines were reorganized to take advantage of small electric motors. There are similar challenges for fostering the role that AI/ML and complementary technologies will play in accelerating the pace of scientific and technological advances:
- Researchers and science funders need to identify and support the technical infrastructure (e.g. datasets + CTMs, self-driving labs) that will move an entire field forward, or solve a particularly important problem.
- A leading academic researcher involved in protein structure prediction noted that one of the things that allowed DeepMind to make so much progress on the protein folding problem was that “everyone was rowing in the same direction,” “18 co-first authors .. an incentive structure wholly foreign to academia,” and “a fast and focused research paradigm … [which] raises the question of what other problems exist that are ripe for a fast and focused attack.” So capitalizing on the opportunity is likely to require greater experimentation in mechanisms for funding, organizing and incentivizing research, such as Focused Research Organizations.
Q. Why is this an area where it might make sense to “unbundle” idea generation from execution?
Traditional funding mechanisms assume that the same individual or team who has an idea should always be the one who implements it. I don't think this is necessarily the case for datasets and CTMs. A researcher may have a brilliant idea for a dataset, but may not be in a position to liberate the data (if it already exists), rally the community, and raise the funds needed to create the dataset. There is still value in getting researchers to submit and publish their ideas, because their proposals could catalyze a larger-scale effort.
Agencies could sponsor white paper competitions with a cash prize for the best ideas. [A good example of a white paper competition is MIT’s Climate Grand Challenge, which had a number of features which made it catalytic.] Competitions could motivate researchers to answer questions such as:
- What dataset and Common Task would have a significant impact on our ability to answer a key scientific question or make progress on an important use-inspired or technological problem? What preliminary work has been done or should be done prior to making a larger-scale investment in data collection?
- To the extent that industry would also find the data useful, would they be willing to share the cost of collecting it? They could also share existing data, including the results from failed experiments.
- What advance in hardware or experimental techniques would lower the time and cost of generating high-value datasets by one or more orders of magnitude?
- What self-driving lab would significantly accelerate progress in a given field or problem, and why?
The views and opinions expressed in this blog are the author’s own and do not necessarily reflect the view of Schmidt Futures.
The Magic Laptop Thought Experiment
One of the main goals of Kalil’s Corner is to share some of the things I’ve learned over the course of my career about policy entrepreneurship. Below is an FAQ on a thought experiment that I think is useful for policy entrepreneurs, and how the thought experiment is related to a concept I call “shared agency.”
Q. What is your favorite thought experiment?
Imagine that you have a magic laptop. The power of the laptop is that any press release that you write will come true.
You have to write a headline (goal statement), several paragraphs to provide context, and 1-2 paragraph descriptions of who is agreeing to do what (in the form “organization A takes action B to achieve goal C”). The individuals or organizations could be federal agencies, the Congress, companies, philanthropists, investors, research universities, non-profits, skilled volunteers, etc. The constraint, however, is that it has to be plausible that the organizations would be both willing and able to take the action. For example, a for-profit company is not going to take actions that are directly contrary to the interests of its shareholders.
What press release would you write, and why? What makes this a compelling idea?
Q. What was the variant of this that you used to ask people when you worked in the White House for President Obama?
You have a 15-minute meeting in the Oval Office with the President, and he asks:
“If you give me a good idea, I will call anyone on the planet. It can be a conference call, so there can be more than one person on the line. What’s your idea, and why are you excited about it? In order to make your idea happen, who would I need to call and what would I need to ask them to do in order to make it happen?”
Q. What was your motivation for posing this thought experiment to people?
I’ve been in roles where I can occasionally serve as a “force multiplier” for other people’s ideas. The best way to have a good idea is to be exposed to many ideas.
When I was in the White House, I would meet with a lot of people who would tell me that what they worked on was very important, and deserved greater attention from policy-makers.
But when I asked them what they wanted the Administration to consider doing, they didn’t always have a specific response. Sometimes people would have the kernel of a good idea, but I would need to play “20 questions” with them to refine it. This thought experiment would occasionally help me elicit answers to basic questions like who, what, how and why.
Q. Why does this thought experiment relate to the Hamming question?
Richard Hamming was a researcher at Bell Labs who used to ask his colleagues, “What are the most important problems in your field? And what are you working on?” This would annoy some of his colleagues, because it forced them to confront the fact that they were working on something that they didn’t think was that important.
If you really did have a magic laptop or a meeting with the President, you would presumably use it to help solve a problem that you thought was important!
Q. How does this thought experiment highlight the importance of coalition-building?
There are many instances where we have a goal that requires building a coalition of individuals and organizations.
It’s hard to do that if you can’t identify (1) the potential members of the coalition; and (2) the mutually reinforcing actions you would like them to consider taking.
Once you have a hypothesis about the members of your coalition of the willing and able, you can begin to ask and answer other key questions as well, such as:
- Why is it in the enlightened self-interest of the members of the coalition to participate?
- Who is the most credible messenger for your idea? Who could help convene the coalition?
- Is there something that you or someone else can do to make it easier for them to get involved? For example, policy-makers do things with words, in the same way that a priest changes the state of affairs in the world by stating “I now pronounce you husband and wife.” Presidents sign Executive Orders, Congress passes legislation, funding agencies issue RFPs, regulatory agencies issue draft rules for public comment, and so on. You can make it easier for a policy-maker to consider an idea by drafting the documents that are needed to frame, make, and implement a decision.
- If a member of the coalition is willing but not able, can someone else take some action that relaxes the constraint that is preventing them from participating?
- What evidence do you have that if individual or organization A took action B, that C is likely to occur?
- What are the risks associated with taking the course of action that you recommend, and how could they be managed or mitigated?
Q. Is this thought experiment only relevant to policy-makers?
Not at all. I think it is relevant for any goal that you are pursuing — especially ones that require concerted action by multiple individuals and organizations to accomplish.
Q. What’s the relationship between this thought experiment and Bucky Fuller’s concept of a “trim tab?”
Fuller observed that a tiny device called a trim tab is designed to move a rudder, which in turn can move a giant ship like the Queen Elizabeth.
So, it’s incredibly useful to identify these leverage points that can help solve important problems.
For example, some environmental advocates have focused on the supply chains of large multinationals. If these companies source products that are more sustainable (e.g. cooking oils that are produced without requiring deforestation) – that can have a big impact on the environment.
Q. What steps can people take to generate better answers to this thought experiment?
There are many things – like having a deep understanding of a particular problem, being exposed to both successful and unsuccessful efforts to solve important problems in many different domains, or understanding how particular organizations that you are trying to influence make decisions.
One that I’ve been interested in is the creation of a “toolkit” for solving problems. If, as opposed to having a hammer and looking for nails to hit, you also have a saw, a screwdriver, and a tape measure, you are more likely to have the right tool or combination of tools for the right job.
For example, during my tenure in the Obama Administration, my team and other people in the White House encouraged awareness and adoption of dozens of approaches to solving problems, such as:
- Sponsoring incentive prizes, which allow agencies to set a goal without having to choose the team or approach that is most likely to be successful;
- Making open data available in machine-readable formats, and encouraging teams to develop new applications that use the data to solve a real-world problem;
- Changing federal hiring practices and recruiting top technical talent;
- Embracing modern software methodologies such as agile and human-centered design for citizen-facing digital services;
- Identifying and pursuing 21st century moonshots;
- Using insights from behavioral science to improve policies and programs;
- Using and building evidence to increase the share of federal resources going to more effective interventions;
- Changing procurement policies so that the government can purchase products and services from startups and commercial firms, not just traditional contractors.
Of course, ideally one would be familiar with the problem-solving tactics of different types of actors (companies, research universities, foundations, investors, civil society organization) and individuals with different functional or disciplinary expertise. No one is going to master all of these tools, but you might aspire to (1) know that they exist; (2) have some heuristics about when and under what circumstances you might use them; and (3) know how to learn more about a particular approach to solving problems that might be relevant. For example, I’ve identified a number of tactics that I’ve seen foundations and nonprofits use.
Q. How does this thought experiment relate to the concept that psychologists call “agency?”
Agency is defined by psychologists like Albert Bandura as “the human capability to influence … the course of events by one's actions.”
The particular dimension of agency that I have experienced is a sense that there are more aspects of the status quo that are potentially changeable as opposed to being fixed. These are the elements of the status quo that are attributable to human action or inaction, as opposed to the laws of physics.
Obviously, this sense of agency didn’t extend to every problem under the sun. It was limited to those areas where progress could be made by getting identifiable individuals and organizations to take some action – like the President signing an Executive Order or proposing a new budget initiative, the G20 agreeing to increase investment in a global public good, Congress passing a law, or a coalition of organizations like companies, foundations, nonprofits and universities working together to achieve a shared goal.
Q. How did you develop a strong sense of agency over the course of your career?
I had the privilege of working at the White House for both Presidents Clinton and Obama.
As a White House staffer, I had the ability to send the President a decision memo. If he checked the box that said “yes” – and the idea actually happened and was well-implemented, this reinforced my sense of agency.
But it wasn’t just the experience of being successful. It was also the knowledge that one acquires by repeatedly trying to move from an idea to something happening in the world, such as:
- Working with the Congress to pass legislation that gave every agency the authority to sponsor incentive prizes for up to $50 million;
- Including funding for dozens of national science and technology initiatives in the President’s budget, such as the National Nanotechnology Initiative or the BRAIN Initiative;
- Recruiting people to the White House to help solve hard and important problems, like reducing the waiting list for an organ transplant, allowing more foreign entrepreneurs to come to the United States, or launching a behavioral sciences unit within the federal government; and,
- Using the President’s “bully pulpit” to build coalitions of companies, non-profits, philanthropists, universities, etc. to achieve a particular goal, like expanding opportunities for more students to excel in STEM.
Q. What does it mean for you to have a shared sense of agency with another individual, a team, or a community?
Obviously, most people have not had 16 years of their professional life in which they could send a decision memo to the President, get a line in the President’s State of the Union address, work with Congress to pass legislation, create a new institution, shape the federal budget, and build large coalitions with hundreds of organizations that are taking mutually reinforcing actions in the pursuit of a shared goal.
So sometimes when I am talking to an individual, a team or a community, it will become clear to me that there is some aspect of the status quo that they view as fixed, and I view as potentially changeable. It might make sense for me to explain why I believe the status quo is changeable, and what are the steps we could take together in the service of achieving a shared goal.
Q. Why is shared agency important?
Changing the status quo is hard. If I don’t know how to do it, or believe that I would be tilting at windmills – it’s unlikely that I would devote a lot of time and energy to trying to do so.
It may be the case that pushing for change will require a fair amount of work, such as:
- Expressing my idea clearly, and communicating it effectively to multiple audiences;
- Marshaling the evidence to support it;
- Determining who the relevant “deciders” are for a given idea;
- Addressing objections or misconceptions, or responding to legitimate criticism; and
- Building the coalition of people and institutions who support the idea, and may be prepared to take some action to advance it.
So if I want people to devote time and energy to fleshing out an idea or doing some of the work needed to make it happen, I need to convince them that something constructive could plausibly happen. And one way to do that is to describe what success might look like, and discuss the actions that we would take in order to achieve our shared goal. As an economist might put it, I am trying to increase their “expected return” of pursuing a shared goal by increasing the likelihood that my collaborators attach to our success.
Q. Are there risks associated with having this strong sense of agency, and how might one mitigate against those risks?
Yes, absolutely. One is a lack of appropriate epistemic humility, by pushing a proposed solution in the absence of reasonable evidence that it will work, or failing to identify unintended consequences. It’s useful to read books like James Scott’s Seeing Like a State.
I also like the idea of evidence-based policy. For example, governments should provide modest amounts of funding for new ideas, medium-sized grants to evaluate promising approaches, and large grants to scale interventions that have been rigorously evaluated and have a high benefit to cost ratio.
The views and opinions expressed in this blog are the author’s own and do not necessarily reflect the view of Schmidt Futures.
2022 Bioautomation Challenge: Investing in Automating Protein Engineering
Thomas Kalil, Chief Innovation Officer of Schmidt Futures, interviews biomedical engineer Erika DeBenedictis
Schmidt Futures is supporting an initiative – the 2022 Bioautomation Challenge – to accelerate the adoption of automation by leading researchers in protein engineering. The Federation of American Scientists will act as the fiscal sponsor for this challenge.
This initiative was designed by Erika DeBenedictis, who will also serve as the program director. Erika holds a PhD in biological engineering from MIT and has also worked on machine learning for protein design in biochemist David Baker's lab at the University of Washington in Seattle.
Recently, I caught up with Erika to understand why she’s excited about the opportunity to automate protein engineering.
Why is it important to encourage widespread use of automation in life science research?
Automation improves the reproducibility and scalability of life science research. Today, it is difficult to transfer experiments between labs. This slows progress across the entire field, both among academic groups and from academia to industry. Automation allows new techniques to be shared frictionlessly, accelerating their broader availability. It also allows us to make better use of our scientific workforce. Widespread automation in life science would shift time away from repetitive experiments and toward more creative, conceptual work, including designing experiments and carefully selecting the most important problems.
How did you get interested in the role that automation can play in the life sciences?
I started graduate school in biological engineering directly after working as a software engineer at Dropbox. I was shocked to learn that people use a drag-and-drop GUI to control laboratory automation rather than an actual programming language. It was clear to me that automation has the potential to massively accelerate life science research, and there’s a lot of low-hanging fruit.
Why is this the right time to encourage the adoption of automation?
The industrial revolution was 200 years ago, and yet people are still using hand pipettes. It's insane! The hardware for doing life science robotically is quite mature at this point, and there are quite a few groups (Ginkgo, Strateos, Emerald Cloud Lab, Arctoris) that have automated robotic setups. Two barriers to widespread automation remain: developing robust protocols that are well adapted to robotic execution, and overcoming cultural and institutional inertia.
What role could automation play in generating the data we need for machine learning? What are the limitations of today’s publicly available data sets?
There are plenty of life science datasets available online, but unfortunately most are unusable for machine learning purposes. Datasets collected by individual labs are usually too small, and combining datasets across labs, or even across different experimentalists, is often a nightmare. Today, when two different people run the ‘same’ experiment they will often get subtly different results. That’s a problem we need to systematically fix before we can collect big datasets. Automating and standardizing measurements is one promising strategy to address this challenge.
Why protein engineering?
The success of AlphaFold has highlighted to everyone the value of using machine learning to understand molecular biology. Methods for machine-learning-guided, closed-loop protein engineering are increasingly well developed, and automation makes it that much easier for scientists to benefit from these techniques. Protein engineering also benefits from “robotic brute force”: when you engineer any protein, it is always valuable to test more variants, making the discipline uniquely positioned to benefit from automation.
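As a minimal, hypothetical sketch of one round of such a closed loop, the snippet below proposes candidate variants, scores them with a stand-in model, and selects a batch for automated testing. The parent sequence and the scoring function are placeholders, not a published method or a trained model.

```python
# Hypothetical sketch of one round of ML-guided protein engineering:
# propose variants, score them with a learned model, pick a batch to test.
# predict_fitness() is a random placeholder, not a trained model.
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
parent = "MKTAYIAKQR"  # toy parent sequence (placeholder)

def propose_variants(seq, n=50):
    """Generate single-point mutants of the parent sequence."""
    variants = []
    for _ in range(n):
        pos = random.randrange(len(seq))
        aa = random.choice(AMINO_ACIDS)
        variants.append(seq[:pos] + aa + seq[pos + 1:])
    return variants

def predict_fitness(seq):
    """Placeholder for a learned sequence-to-fitness model."""
    return random.random()

variants = propose_variants(parent)
batch = sorted(variants, key=predict_fitness, reverse=True)[:8]  # batch for the robot
print("Variants selected for automated testing:", batch)
```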
If it’s such a good idea, why haven’t academics done it in the past?
Cost and risk are the main barriers. What sort of methods are valuable to automate and run remotely? Will automation be as valuable as expected? It’s a totally different research paradigm; what will it be like? Even assuming that an academic wants to go ahead and spend $300k for a year of access to a cloud laboratory, it is difficult to find a funding source. Very few labs have enough discretionary funds to cover this cost, equipment grants are unlikely to pay for cloud lab access, and it is not obvious whether the NIH or other traditional funders would look favorably on this sort of expense in the budget for an R01 or equivalent. Additionally, it is difficult to seek out funding without already having data demonstrating the utility of automation for a particular application. Altogether, there are just a lot of barriers to entry.
You’re starting this new program called the 2022 Bioautomation Challenge. How does the program eliminate those barriers?
This program is designed to allow academic labs to test out automation with little risk and at no cost. Groups are invited to submit proposals for methods they would like to automate. Selected proposals will be granted three months of cloud lab development time, plus a generous reagent budget. Groups that successfully automate their method will also be given transition funding so that they can continue to use their cloud lab method while applying for grants with their brand-new preliminary data. This way, labs don’t need to put in any money up-front, and are able to decide whether they like the workflow and results of automation before finding long-term funding.
Historically, some investments that have been made in automation have been disappointing, like GM in the 1980s, or Tesla in the 2010s. What can we learn from the experiences of other industries? Are there any risks?
For sure. I would say even “life science in the 2010s” is an example of disappointing automation: academic labs started buying automation robots, but it didn’t end up being the right paradigm to see the benefits. I see the 2022 Bioautomation Challenge as an experiment itself: we’re going to empower labs across the country to test out many different use cases for cloud labs to see what works and what doesn’t.
Where will funding for cloud lab access come from in the future?
Currently there’s a question as to whether traditional funding sources like the NIH would look favorably on cloud lab access in a budget. One of the goals of this program is to demonstrate the benefits of cloud science, which I hope will encourage traditional funders to support this research paradigm. In addition, the natural place to house cloud lab access in the academic ecosystem is at the university level. I expect that many universities may create cloud lab access programs, or upgrade their existing core facilities into cloud labs. In fact, it’s already happening: Carnegie Mellon recently announced they’re opening a local robotic facility that runs Emerald Cloud Lab’s software.
What role will biofabs and core facilities play?
In 10 years, I think the terms “biofab,” “core facility,” and “cloud lab” will all be synonymous. Today the only important difference is how experiments are specified: many core facilities still take orders through bespoke Google forms, whereas Emerald Cloud Lab has figured out how to expose a single programming interface for all their instruments. We’re implementing this program at Emerald because it’s important that all the labs that participate can talk to one another and share protocols, rather than each developing methods that can only run in their local biofab. Eventually, I think we’ll see standardization, and all the facilities will be capable of running any protocol for which they have the necessary instruments.
In addition to protein engineering, are there other areas in the life sciences that would benefit from cloud labs and large-scale, reliable data collection for machine learning?
I think there are many areas that would benefit. Areas that struggle with reproducibility, are manually repetitive and time intensive, or that benefit from closely integrating computational analysis with data are all good targets for automation. Microscopy and mammalian tissue culture might be two other candidates. But there’s a lot of intellectual work for the community to do to articulate problems that can be solved with machine learning approaches, if given the opportunity to collect the data.
Growing Innovative Companies to Scale: A Listening Session with Startups in Critical Industries
On September 16th, 2021, the Day One Project convened a closed-door listening session for interagency government leaders to hear from co-founders and supply-chain leaders of 10 startups in critical industries — bioeconomy, cleantech, semiconductor — about challenges and opportunities to scale their operations and improve resilience in the United States. The panel was moderated by Elisabeth Reynolds, Special Assistant to the President for Manufacturing and Economic Development. The overarching theme is that for innovative companies in critical industries, the path of least resistance for scaling production is not in the United States — but it could be.
Unlike many startups that are purely software based and can scale quickly with little capital expenditure, these companies produce a product that requires manufacturing expertise and can take longer and require more capital to grow to scale. Capital markets and government programs are often not well aligned with the needs of these companies, leaving the country at risk that many of the most cutting-edge technologies are invented here but made elsewhere. Because there is a tight relationship between the learning-by-building phase of scale-up and innovation capacity, outsourcing production poses a threat to U.S. competitiveness. The country also risks losing the downstream quality manufacturing jobs that could stimulate economic growth in regions across the country.
Key Takeaways:
Challenges
- Overseas government incentives and manufacturing ecosystems, including intellectual property support, subsidies, and a wider range of advanced manufacturing technology options, are more attractive than U.S. offerings.
- Shortcomings with existing federal programs and funding include a lack of government outreach to help navigate the complexity of opportunities, regulations that delay access to funding on appropriate timelines, and insufficient emphasis on commercialization.
- Supply chain gaps and opportunities for sustainable manufacturing in the United States were identified in quantum and bioindustry sectors.
Solutions
- Additional government financing mechanisms, like tax credits for R&D and renewable tech, and government co-investment opportunities through an expanded EXIM Bank, In-Q-Tel, J2 Ventures, and programs like the Development Finance Corporation (DFC), were highly encouraged.
- Improving government processes and regulations, for example by reducing funding application timelines in the Department of Energy’s Loan Programs Office or providing better guidance on Committee on Foreign Investment in the United States (CFIUS) restrictions on quantum companies’ foreign acquisitions, was also recommended.
- Government demand-pull incentives were the most important solution recommended by startups in order to guide the development of technology from basic science to commercialization.
Challenges
There are significant challenges to taking advanced technology from earlier R&D phases to manufacturing products that demonstrate viability at scale. Available financing opportunities do not adequately support longer time horizons or larger capital requirements. A lack of manufacturing and engineering skills poses another barrier to scaling a product from prototype to pilot to commercial production. After many decades of disinvestment in the country’s manufacturing base, overcoming these challenges will be difficult but essential if we are to grow and benefit from our most innovative, emerging companies. As two of the bioeconomy startups stated:
“The USG knows how to fund research and purchase finished products. There is not enough money, and far more problematically, not nearly enough skilled Sherpas to fill the gap in between.”
“Manufacturing … has been considered as a “cost center,” … reducing cost of manufacturing (e.g., moving manufacturing sites offshore) is one of the major themes … Rarely there are investments or financing opportunities coming to the sector to develop new technologies that can drive innovation…the types of investment are usually very large (e.g., capex for building a manufacturing plant). As a result, it has been very hard for startups which dedicate themselves to novel, next generation manufacturing technologies to raise or secure sufficient funding.”
During the conversation, three specific challenges were identified that speak to key factors that contribute to this manufacturing gap in the United States:
1) Overseas Government Incentives and Manufacturing Ecosystems
The startups largely agreed that overseas governments provide more incentives to manufacture than the United States. Often, these countries have developed “manufacturing-led” ecosystems of private companies and other institutions that can reliably deliver critical inputs, whether as part of their value chain, or in terms of their broader development needs. Some examples from the companies include:
- “A Dutch-owned manufacturing plant in Brazil was designed to produce 30 million gallons of oil from algae; the favorable rates for such an endeavor were only made possible by the financing provided by the Brazilian Development Bank to bridge the gap of scale.” —Bioeconomy startup.
- “Currently, with a lack of biomanufacturing capacity in the US, there is a scramble for companies trying to secure capacity at contract manufacturing facilities, which are usually filled. Many are looking to other countries for scale up.” — Bioeconomy startup.
- “There are far more off-shore companies that support semiconductor packaging, manufacturing, and testing … given the volumes of these companies, it is much cheaper to do this off-shore…as foreign fabrication facilities have more technically advanced semiconductor manufacturing processes.” —Semiconductor startup.
- “The Taiwan Semiconductor Manufacturing Corporation (TSMC) also offers a much wider variety of IP support in their process libraries for semiconductors—much of this IP is production proven with a roadmap for support on future technology nodes.” —Semiconductor startup.
2) Shortcomings with Existing Federal Programs and Funding
The U.S. government has a wide range of programs that focus on supporting innovation and manufacturing. However, these programs are targeted at the earlier stages of R&D rather than manufacturing scale-up, are relatively small in scope, or involve time-consuming and complicated processes to access.
- “National Quantum Initiative allocates $1.4B towards the growth of the quantum industry and workforce, but most of this funding is going to National Labs. Larger contracts and grants to small businesses are needed.” —Semiconductor startup.
- “[Obtaining federal funding] is more complicated than going for venture backing or going to other countries.” —Semiconductor startup.
- “Regarding our knowledge of financing or other government support options currently available, I would say that any additional awareness, education or direction you can provide us with would be welcomed … even just making a single connection could prove beneficial and lower the burden [of figuring out how to access federal programs].” —Bioeconomy startup.
- “As we grew, we ran into early challenges convincing capital providers about the market drivers and perceptions about the waste industry and infrastructure …The time/timing of Department of Energy Loan Program Office funding has been a challenge taking over a year from review to award to project funding. Process doesn’t always match up with the speed of technology development and market need for commercialization.” —Cleantech startup.
- “Key U.S. federal agencies, most notably the National Security Agency (NSA), have publicly indicated a non-embrace of emerging quantum secure communication technology … The NSA position (which prefers new algorithm development) is greatly hampering VC investment in quantum component suppliers.” — Semiconductor startup.
3) Supply Chain Gaps and Opportunities for Sustainable Manufacturing in the U.S.
A few specific instances were described where the United States lacks access to critical inputs for bioeconomy and quantum development, as key suppliers are located abroad. However, as these emerging fields develop, critical inputs will change and present an opportunity to course correct. Therefore, improving our domestic manufacturing base now is vital for driving demand and establishing innovation ecosystems for industries of the future.
- “Quantum tech is going to be critical to the next century of product and tech development…The quantum supply chain currently has significant choke points for US suppliers. Many critical components (such as single frequency lasers, non-linear crystals, and InGaAs single photon detectors) are not available through US vendors.” —Semiconductor startup.
- “A lack of current capacity in the U.S. is also the opportunity for the industry as well to develop more sustainable processes, as new material inputs are developed and created, there will need to be new processes and new ways of manufacturing … With new bio inputs there is an opportunity to build up our biomanufacturing ability, produce on demand and rebuild that converter level, train a new workforce and create the building blocks for new materials that we need here to make products.“ —Bioeconomy startup.
Solutions
Startups commented on the importance of expanding funding opportunities, such as co-investment and tax credit solutions, as well as key process and regulatory changes. Most importantly, startups highlighted the importance of demand-pull mechanisms to help commercialize new technologies and create new markets.
1) Additional Government Financing Mechanisms
Several companies commented on the need to provide additional financing to support manufacturers, as equipment is often too expensive for venture funding and other forms of capital are not readily available. These solutions include expanding government co-investment and leveraging tax credits.
- “Expanding the scope of the EXIM to support manufacturers looking to export materials and products abroad would be of interest. In the case of new, advanced biomaterials we have found that the EU is driving a great deal of demand interest as well as early adoption. Helping smaller companies with new materials developed in the US navigate this might benefit the speed of commercialization.” — Bioeconomy startup.
- “Two specific financing mechanisms to be expanded are 1) Increased funding for accelerator programs that support emerging US-based manufacturing companies and 2) government funding of venture capital companies like In-Q-Tel and J2 Ventures that can help support early-stage technology companies.” — Semiconductor startup.
- “Policymakers should support, maintain, and expand programs like the recent Development Finance Corporation’s investments [under Defense Production Act authority] to accelerate the domestic production of pharmaceuticals and active ingredients, particularly for essential medicines that are in short supply and often needed in response to public health emergencies.” —Bioeconomy startup.
- “The R&D Tax Credit is one of the most effective and efficient federal tax incentives for driving innovation across a variety of industries…an enhanced R&D tax credit for companies investing in research and development of technology to enable the domestic production of active pharmaceutical ingredients … would fully unlock the potential of synthetic biology and result in meaningful and immediate changes to the U.S. pharmaceutical supply chain.” —Bioeconomy startup.
- “Provided the incentives are set at adequate levels, incentives such as renewable energy and storage tax credits could be structured with bonuses to support domestic manufacturing in our selection of suppliers.” —Cleantech startup.
2) Improving Government Processes and Regulations
A few of the startups identified specific government processes or regulations that could be improved upon, such as application times for funding in energy sectors or restrictions in procurement or foreign acquisitions.
- “Department of Energy loan mechanisms appear to be developing positively, but for them to be effectively utilized, there needs to be a simple and swift process for application and decision making. We hear horror stories of 9 months to 3 years for this process currently.” —Bioeconomy startup.
- “In-Q-Tel is an excellent funding vehicle … However, they require a Department of Defense (DoD) sponsor. Given the stance of NSA, DoD sponsors are unwilling to support an investment in a non-NSA desired technology. This eliminates a critical government financing mechanism.” —Semiconductor startup.
- “Securing the US supply chain in quantum is going to require the acquisition of key foreign owned companies and processes. For small companies, these acquisitions are only possible when equity sharing is included in the negotiations. The Committee on Foreign Investment in the United States (CFIUS) restrictions greatly complicate these negotiations since foreign ownership in a US company that is producing products with national security implications is restricted. Although these restrictions are still needed, it would be helpful if CFIUS could provide more direct guidance to US companies negotiating the acquisition and transfer of foreign manufacturing capabilities.” —Semiconductor startup.
3) Government Demand-pull Incentives
Most, if not all, startups felt that the best role for the government is in creating demand-pull incentives to support the development of technology from basic science to commercialization and help create new markets for leading-edge products. This can range from procurement contracts to new regulatory standards and requirements that can incent higher quality, domestic production.
- “The best role that the Department of Defense or other agencies can play is that they can help drive the demand for these technologies.” —Cleantech startup.
- “The most important thing that the Government can do to help speed the financing and adoption of new advanced materials is to help to build the pull through from research and new materials development to the final product … We believe that we have a once in a generation opportunity to re-imagine how we manufacture goods and products here in the US. We need to look at the whole of the supply chain from molecule and material inputs to final products. There has been a hollowing out of the middle of the supply chain for manufacturing here in the United States.” — Bioeconomy startup.
- “Policymakers should continue to invest in public-private partnerships like the Manufacturing USA Institutes, which brings together experts from government, academia, and industry to support projects from the research phase through the commercialization of innovative technologies.” —Bioeconomy startup.
- Examples:
- “If there were a mechanism to incent companies and early adopters to reduce their petroleum inputs and bridge the gap all the way to the domestically sourced chemical inputs to their domestic manufacturing it could help to create more secure supply chains.” —Bioeconomy startup.
- “Domestic procurement requirements … could stimulate investment in domestic pharmaceutical supply chain[s] … Policymakers should consider maintaining a list of essential medicines and give preference to domestically sourced pharmaceuticals (and active ingredients) in any federal procurement programs” —Bioeconomy startup.
- “Policymakers should consider directing the Food and Drug Administration to create a priority review designation for domestically sourced pharmaceutical products and active ingredients. By moving these products to the top of the priority list, manufacturers would have another incentive to change supply chain practices and shift production back to the United States.” —Bioeconomy startup.
Conclusion
These anecdotes provide a small window into some of the challenges startups face scaling their innovative technologies in the United States. Fixing our scale-up ecosystem to support more investment in the later-stage manufacturing and growth of these companies is essential for U.S. leadership in emerging technologies and industries. The fixes are many, large and small, financial and regulatory, product- and process-oriented, but now is a moment of opportunity to change pace from the past several decades. By addressing these challenges, the United States can build the next generation of U.S.-based advanced manufacturing companies that create good quality, middle-skill jobs in regions across the country. The Biden-Harris Administration has outlined a new industrial strategy that seeks to realize this vision and ensure U.S. global technological and economic leadership, but its success will require informing policy efforts with on-the-ground perspectives from small- and medium-sized private enterprises.
Session Readout: Rebuilding American Manufacturing
Our roundtable brought together senior leadership from the White House National Economic Council and the U.S. Department of Health and Human Services, along with a diversity of viewpoints across political ideologies from Breakthrough Energy, American Compass, MIT’s The Engine, and Employ America. Participants discussed competing with China in advanced manufacturing sectors (bioeconomy, semiconductor, quantum, etc.), supply chain resilience, and new visions for industrial policy that can stimulate regional development. This document contains a summary of the event.
Topic Introduction: Advanced Manufacturing & U.S. Competitiveness
The session began with an introduction by Bill Bonvillian (MIT), who shared a series of reflections, challenges, and solutions to rebuilding American manufacturing:
Advanced manufacturing and supply chain resilience are two sides of the same coin. The pandemic awoke us to our overdependence on foreign supply chains. Unless we build a robust domestic manufacturing system, our supply chains will crumble. American competitiveness therefore depends on how well we can apply our innovation capabilities to the historically underfunded advanced manufacturing ecosystem. Other nations are pouring tremendous resources into manufacturing because they recognize its importance for the overall innovation cycle. To rebuild American manufacturing, an ecosystem of the private sector, educational institutions, and government is needed to create an effective regional workforce and advanced manufacturing technology pipeline.
Panel 1: Framing the Challenge and Identifying Key Misconceptions
Our first panel hosted Arnab Datta (Employ America), Chris Griswold (American Compass), and Abigail Regitsky (Breakthrough Energy). The questions and responses are summarized below:
What would you say are some misconceptions that have posed obstacles to finding consensus on an industrial policy for advanced manufacturing?
Chris Griswold: The largest misconception is the ideological view that industrial policy is central planning, with the government picking winners and losers—that it is un-American. That’s simply not true. From Alexander Hamilton, to Henry Clay and Abraham Lincoln, and through the post-war period and America’s technological rise, the American way has involved a rich public-private sector innovation ecosystem. Recent decades of libertarian economics have weakened supply chains and permitted the flight of industry from American communities.
Arnab Datta: People like to say that market choices have forced manufacturing overseas, but really it’s been about policy. Germany has maintained a world-class manufacturing base with high wages and regulations. We have underrated two important factors in sustaining a high-quality American manufacturing ecosystem: financing and aggregate demand. Manufacturing financing is cash-flow intensive, making asset-light strategies difficult. And when you see scarce aggregate demand, you see a cost-cutting mentality that leads to things like consolidation and offshoring. We only need to look back to the once-booming semiconductor industry that lost its edge. Our competitors are making the policy choices necessary to grow and develop strategically; we should do the same.
Abigail Regitsky: For climate and clean energy, startups see the benefit of developing and manufacturing in the United States—that’s a large misconception, that startups do not want to produce domestically. The large issue is that they do not have financing support to develop domestic supply chains. We need to ensure there is a market for these technologies and there is financing available to access them.
With the recently introduced bill for an Industrial Finance Corporation from Senator Coons’ Office, what would you say are the unique benefits of using government corporations and why should the general public care? And how might something like this stimulate job and economic growth regionally?
Arnab Datta: The unique benefits of a government corporation are two-fold: flexibility in affordability and in financing. In some of our most difficult times, government entities were empowered with a range of abilities to accomplish important goals. During the Great Depression and World War II, the Reconstruction Finance Corporation was necessary to ramp up wartime investment through loans, purchase guarantees, and other methods. America has faced difficult challenges, but government corporations have been a bright spot in overcoming these challenges. We face big challenges now. The Industrial Finance Corporation (IFC) bill arrives at a similar moment, granting the government the authority to tackle big problems related to industrial competition: national security, climate change, etc. We need a flexible entity, and the public should care because they are taking risks in this competition with their tax dollars. They should be able to have a stake in the product, and the IFC’s equity investments and other methods provide that. It will also help with job growth across regions. Currently, we are witnessing rising capital expenditures to a degree not seen for a very long time. We were told manufacturing jobs would never come back, but the demand is there. Creating an institution that establishes permanence for job growth in manufacturing should not be an exception but a norm.
Abigail Regitsky: We need a political coalition to get the policies in place to support the clean energy agenda. An IFC could support a factory that leverages hydrogen in a green way, or something even more nascent. These moves require a lot of capital, but we can create a lot of economic returns and jobs if we see the long-term linkage and support it.
What would you say might be the conservative case for industrial policy for advanced manufacturing? And in what specific aspects of the advanced manufacturing ecosystem do you see opportunities and needs?
Chris Griswold: It’s the same as the general case—it’s a common sense, good idea. Fortunately, there is much more consensus on this now than there was just a few years ago. Some specific arguments that should appeal to both sides include:
- The national security imperative to bolster our currently vulnerable supply chain and industrial base.
- Having national economic resiliency to keep up with competitors. It’s almost unanimous at this point that it will be difficult to compete without an effective advanced manufacturing sector and resilient supply chain. Offshoring all of our capacity has diminished our know-how and degraded our ability to innovate ourselves back out of this situation. We can’t just flip the innovation switch back on—it takes time to get our manufacturing ecosystem up to speed with the pace of technological progress.
- Deindustrialization has hurt working communities and created regional inequality. It has made not just our country weaker in general, but it has harmed many specific working-class local communities. Working class people without a college degree have been hit the hardest. Working class communities of color have been harmed in unique ways. At the heart of these large discussions is a moral imperative about workers and their families. They matter. We must do more to support local economies, which means caring about the composition of those economies.
Abigail Regitsky: It’s the idea of losing the “know-how” or “learning-by-building” phase of innovation. This is crucial for developing solutions to solve climate change. With climate, time is of the essence; when you are able to tie manufacturing to the innovation process, it fosters a faster scale up of new technology. We need the manufacturing know-how to scale up emerging technologies and reduce emissions to zero by mid-century.
Panel 2: Ideas for Action
Our second panel hosted Dr. Elisabeth Reynolds (WHNEC), Joseph Hamel (ASPR), and Katie Rae (MIT’s The Engine). The questions and responses are summarized below:
In the last panel, we heard from a variety of perspectives on this deep and comprehensive issue. What are a few priorities you have for improving the manufacturing base?
Elisabeth Reynolds: The last panel presented the imperative and opportunity of today’s moment perfectly. The administration is working to reframe the nation’s thoughts on industrial policy. All of those problems outlined existed before the pandemic. What we’re addressing now is a new commitment and understanding that this is not just about national security; it’s about economic security. We don’t need to build and make everything here, but we need to build and make a lot here, from commodities to next-gen technology. We have to support small and medium-sized businesses. The administration’s plans complement the Industrial Finance Corporation bill and the initiatives included in it. There is a real effort to include and support communities, schools, and people who have not been included. We’re focusing on the regional level, aiming to provide workforce training there to build a pipeline for the next generation of workers in manufacturing. Another critical component is the climate agenda; manufacturing facilities should leverage demonstration funding, tax credits, and procurement, especially the latter, with the government acting as a buyer. Finally, each of these issues must be approached through an equity lens, in terms of geography, race, small versus big business, and more. We need to create a level playing field; that is where America will thrive.
President Biden recently issued Executive Order 14017 directing the U.S. government to undertake a comprehensive review of six critical U.S. industrial base sectors. ASPR is the lead for the public health and biological preparedness industrial base review. What can you tell us about these efforts to date?
Joseph Hamel: These efforts are focused on furthering the relationships and leveraging the partnerships that were discovered during the pandemic response, from the Food and Drug Administration to the Defense Advanced Research Projects Agency and the National Institute of Standards and Technology; it is important to explore the right level of coordination. We are conducting a review of essential medicines to identify the most critical and relevant, then exploring potential threats and ways to invest in and improve the supply chain for these drugs. We’re bringing in clinicians, manufacturers, and distributor partners to ask questions like “What is the most vulnerable item in our global supply chain, and how can we act on it?” We’re also establishing an innovation laboratory with FDA to evaluate a wide array of products that are subject to shortages and geographic production dependencies. We are also investigating overlooked capacities for the assembly of these products and leveraging opportunities inside and outside of government so manufacturers can realize new capabilities at scale. As the pandemic demonstrated, we need a more resilient global supply chain. And we have to think about doing this with lower costs and a lower environmental footprint so that we can become competitive inside a larger ecosystem.
A few weeks ago, the Day One Project held a listening session with several startups in the cleantech, semiconductor, and bioeconomy industries, who reported that governments overseas provide more incentives, from subsidies to more readily available tools, to manufacture there than in the United States. What is the most important way to make it easier for small and medium-sized companies to manufacture in the United States?
Katie Rae: The Engine was founded to go after the world’s biggest problems. Advanced manufacturing is one of them: ensuring foundational industries are built here. This collides with everything, including our supply chains. The impact is not theoretical: how do we get vaccines to everyone? There has been a lot of innovation, but our current system didn’t want to support the ideas because they were out of favor for investment. We had the ideas, but we didn’t have the financing; this was a market failure. We need funding to bring these ideas to life. When startups begin scaling, they need capital to do so. It is not inherently provided by the private market, so governments are not picking winners and losers but rather ensuring that money goes to a list of potential winners.
Elisabeth Reynolds: The comments about the financing gap are exactly right. We have less support for the scale-up of cutting-edge technologies at their later stage of development. We need more time and capital to get these ideas there. Katie’s team is focused on finding this capital and supporting commercialization into government. We are also seeing a growing shift in the country’s mindset: the first thought has long been to take manufacturing offshore, but the equalization of costs is bringing some of this production back to our shores.
If you were to ask the audience to work on a specific domain, what would you challenge them to do?
Elisabeth Reynolds: We should build on the positive programs we have; Joe’s is a great example. We also can’t forget about the examples of work outside of government. We innovate well across a wide range of places, and the government needs to be a partner in supporting this.
Katie Rae: Loan guarantee programs in procurement are a must-have. Other governments will do it, and our companies will relocate their headquarters there.
Joseph Hamel: Furthering investments in platform technology development. We need to leverage what is growing as a bioeconomy initiative and use these applications to create end products that we never thought were achievable. We should explore material science applications and innovation in quality by design, up front.
Interview with Erika DeBenedictis
2022 Bioautomation Challenge: Investing in Automating Protein Engineering
Thomas Kalil, Chief Innovation Officer of Schmidt Futures, interviews biomedical engineer Erika DeBenedictis
Schmidt Futures is supporting an initiative – the 2022 Bioautomation Challenge – to accelerate the adoption of automation by leading researchers in protein engineering. The Federation of American Scientists will act as the fiscal sponsor for this challenge.
This initiative was designed by Erika DeBenedictis, who will also serve as the program director. Erika holds a PhD in biological engineering from MIT, and has also worked in biochemist David Baker’s lab on machine learning for protein design at the University of Washington in Seattle.
Recently, I caught up with Erika to understand why she’s excited about the opportunity to automate protein engineering.
Why is it important to encourage widespread use of automation in life science research?
Automation improves the reproducibility and scalability of life science. Today, it is difficult to transfer experiments between labs. This slows progress in the entire field, both amongst academics and also from academia to industry. Automation allows new techniques to be shared frictionlessly, accelerating their broader availability. It also allows us to make better use of our scientific workforce. Widespread automation in life science would shift time spent away from repetitive experiments and toward more creative, conceptual work, including designing experiments and carefully selecting the most important problems.
How did you get interested in the role that automation can play in the life sciences?
I started graduate school in biological engineering directly after working as a software engineer at Dropbox. I was shocked to learn that people use a drag-and-drop GUI to control laboratory automation rather than an actual programming language. It was clear to me that automation has the potential to massively accelerate life science research, and there’s a lot of low-hanging fruit.
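To make the contrast with drag-and-drop GUIs concrete, here is a minimal sketch of what a lab protocol could look like when expressed in an actual programming language. The `cloudlab` module, the `Protocol` class, and every method shown are hypothetical names invented for illustration only; they are not the API of any real automation vendor.

```python
# Hypothetical sketch: a serial dilution written as code instead of assembled
# in a drag-and-drop GUI. The cloudlab library and all method names below are
# invented for illustration; no real vendor API is implied.
from cloudlab import Protocol  # hypothetical client library

protocol = Protocol(name="serial_dilution_v1")
plate = protocol.load_plate("96-well", slot=1)
diluent = protocol.load_reservoir("dilution_buffer", slot=2)

# Fill columns 2-12 with diluent, then perform a two-fold serial dilution
# starting from the sample assumed to be in well A1.
for col in range(2, 13):
    protocol.transfer(volume_ul=100, source=diluent, dest=plate.well("A", col))
for col in range(1, 12):
    protocol.transfer(volume_ul=100,
                      source=plate.well("A", col),
                      dest=plate.well("A", col + 1),
                      mix_after=True)

protocol.submit()  # queue the run on a remote (cloud) instrument
```

The point is not the particular syntax but that a protocol written this way can be reviewed, version-controlled, and rerun exactly by another lab, which is much harder to guarantee for a sequence of GUI clicks.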
Why is this the right time to encourage the adoption of automation?
The industrial revolution was 200 years ago, and yet people are still using hand pipettes. It’s insane! The hardware for doing life science robotically is quite mature at this point, and there are quite a few groups (Ginkgo, Strateos, Emerald Cloud Lab, Arctoris) that have automated robotic setups. Two barriers to widespread automation remain: developing robust protocols that are well adapted to robotic execution, and overcoming cultural and institutional inertia.
What role could automation play in generating the data we need for machine learning? What are the limitations of today’s publicly available data sets?
There are plenty of life science datasets available online, but unfortunately most of them are unusable for machine learning purposes. Datasets collected by individual labs are usually too small, and combining datasets between labs, or even amongst different experimentalists, is often a nightmare. Today, when two different people run the ‘same’ experiment they will often get subtly different results. That’s a problem we need to systematically fix before we can collect big datasets. Automating and standardizing measurements is one promising strategy to address this challenge.
Why protein engineering?
The success of AlphaFold has highlighted to everyone the value of using machine learning to understand molecular biology. Methods for machine-learning guided closed-loop protein engineering are increasingly well developed, and automation makes it that much easier for scientists to benefit from these techniques. Protein engineering also benefits from “robotic brute force”: when you engineer any protein, it is always valuable to test more variants, which makes the discipline uniquely suited to automation.
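For readers unfamiliar with the term, here is a minimal sketch of what one "closed-loop" design-build-test-learn cycle can look like. The one-hot encoding, the random-forest surrogate model, and the `measure_activity` stand-in for a robotic assay are all simplifications chosen for illustration, not a description of any particular lab's pipeline.

```python
# Minimal sketch of machine-learning guided, closed-loop protein engineering:
# propose variants, assay them, retrain a surrogate model, and use the model
# to choose the next round. The assay here is a toy scoring function standing
# in for an automated wet-lab measurement.
import random
from sklearn.ensemble import RandomForestRegressor

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def one_hot(seq):
    # Flattened one-hot encoding of a fixed-length protein sequence.
    return [1.0 if aa == letter else 0.0 for aa in seq for letter in AMINO_ACIDS]

def mutate(seq, n_mutations=2):
    # Introduce a few random point mutations into the parent sequence.
    seq = list(seq)
    for _ in range(n_mutations):
        pos = random.randrange(len(seq))
        seq[pos] = random.choice(AMINO_ACIDS)
    return "".join(seq)

def measure_activity(seq):
    # Placeholder for a robotic assay; real pipelines measure activity in the lab.
    return sum(1 for aa in seq if aa in "ILVFW") / len(seq)

parent = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
data = [(s, measure_activity(s)) for s in (mutate(parent) for _ in range(50))]

for cycle in range(5):
    # Learn: fit a surrogate model on every variant assayed so far.
    model = RandomForestRegressor(n_estimators=100, random_state=cycle)
    model.fit([one_hot(s) for s, _ in data], [y for _, y in data])

    # Design: mutate the current best variant and rank candidates with the model.
    best_seq = max(data, key=lambda d: d[1])[0]
    candidates = [mutate(best_seq) for _ in range(200)]
    ranked = sorted(candidates,
                    key=lambda s: model.predict([one_hot(s)])[0],
                    reverse=True)

    # Build/test: assay only the top-ranked candidates, closing the loop.
    data += [(s, measure_activity(s)) for s in ranked[:20]]

print("Best variant found:", max(data, key=lambda d: d[1]))
```

Each cycle uses the model to decide which variants are worth the cost of a physical assay, which is where cloud-lab automation and "robotic brute force" pay off.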
If it’s such a good idea, why haven’t academics done it in the past?
Cost and risk are the main barriers. What sort of methods are valuable to automate and run remotely? Will automation be as valuable as expected? It’s a totally different research paradigm; what will it be like? Even assuming that an academic wants to go ahead and spend $300k for a year of access to a cloud laboratory, it is difficult to find a funding source. Very few labs have enough discretionary funds to cover this cost, equipment grants are unlikely to pay for cloud lab access, and it is not obvious whether or not the NIH or other traditional funders would look favorably on this sort of expense in the budget for an R01 or equivalent. Additionally, it is difficult to seek out funding without already having data demonstrating the utility of automation for a particular application. Altogether, there are just a lot of barriers to entry.
You’re starting this new program called the 2022 Bioautomation Challenge. How does the program eliminate those barriers?
This program is designed to allow academic labs to test out automation with little risk and at no cost. Groups are invited to submit proposals for methods they would like to automate. Selected proposals will be granted three months of cloud lab development time, plus a generous reagent budget. Groups that successfully automate their method will also be given transition funding so that they can continue to use their cloud lab method while applying for grants with their brand-new preliminary data. This way, labs don’t need to put in any money up-front, and are able to decide whether they like the workflow and results of automation before finding long-term funding.
Historically, some investments that have been made in automation have been disappointing, like GM in the 1980s, or Tesla in the 2010s. What can we learn from the experiences of other industries? Are there any risks?
For sure. I would say even “life science in the 2010s” is an example of disappointing automation: academic labs started buying automation robots, but it didn’t end up being the right paradigm to see the benefits. I see the 2022 Bioautomation Challenge as an experiment itself: we’re going to empower labs across the country to test out many different use cases for cloud labs to see what works and what doesn’t.
Where will funding for cloud lab access come from in the future?
Currently there’s a question as to whether traditional funding sources like the NIH would look favorably on cloud lab access in a budget. One of the goals of this program is to demonstrate the benefits of cloud science, which I hope will encourage traditional funders to support this research paradigm. In addition, the natural place to house cloud lab access in the academic ecosystem is at the university level. I expect that many universities may create cloud lab access programs, or upgrade their existing core facilities into cloud labs. In fact, it’s already happening: Carnegie Mellon recently announced they’re opening a local robotic facility that runs Emerald Cloud Lab’s software.
What role will biofabs and core facilities play?
In 10 years, I think the terms “biofab,” “core facility,” and “cloud lab” will all be synonymous. Today the only important difference is how experiments are specified: many core facilities still take orders through bespoke Google forms, whereas Emerald Cloud Lab has figured out how to expose a single programming interface for all their instruments. We’re implementing this program at Emerald because it’s important that all the labs that participate can talk to one another and share protocols, rather than each developing methods that can only run in their local biofab. Eventually, I think we’ll see standardization, and all the facilities will be capable of running any protocol for which they have the necessary instruments.
In addition to protein engineering, are there other areas in the life sciences that would benefit from cloud labs and large-scale, reliable data collection for machine learning?
I think there are many areas that would benefit. Areas that struggle with reproducibility, are manually repetitive and time intensive, or that benefit from closely integrating computational analysis with data are all good targets for automation. Microscopy and mammalian tissue culture might be two other candidates. But there’s a lot of intellectual work for the community to do in order to articulate problems that can be solved with machine learning approaches, if given the opportunity to collect the data.
An interview with Martin Borch Jensen, Co-founder of Gordian Biotechnology
Recently, I caught up with Martin Borch Jensen, the Chief Science Officer of the biotech company Gordian Biotechnology. Gordian is a therapeutics company focused on the diseases of aging.
Martin did his Ph.D. in the biology of aging, received a prestigious NIH award to jumpstart an academic career, but decided to return the grant to launch Gordian. Recently, he designed and launched a $26 million competition called Longevity Impetus Grants. This program has already funded 98 grants to help scientists address what they consider to be the most important problems in aging biology (also known as geroscience). There is a growing body of research which suggests that there are underlying biological mechanisms of aging, and that it may be possible to delay the onset of multiple chronic diseases of aging, allowing people to live longer, healthier lives.
I interviewed Martin not only because I think that the field of geroscience is important, but also because I think the role that Martin is playing has significant benefits for science and society, and should be replicated in other fields. With this work, essentially, you could say that Martin is serving as a strategist for the field of geroscience as a whole, and designing a process for the competitive, merit-based allocation of funding that philanthropists such as Juan Benet, James Fickel, Jed McCaleb, Karl Pfleger, Fred Ehrsam, and Vitalik Buterin have confidence in, and have been willing to support. Martin’s role has a number of potential benefits:
- Many philanthropists are interested in supporting scientific research, but don’t have the professional staff capable of identifying areas of research that we are under-investing in. If more leading researchers were willing to identify areas where there is a strong case for additional philanthropic support, and design a process for the allocation of funding that inspires confidence, philanthropists would find it easier to support scientific research. Currently, there are almost 2,000 families in the U.S. alone that have $500 million in assets, and their current level of philanthropy is only 1.2 percent of their assets.
- Researchers could propose funding mechanisms that are designed to address shortcomings associated with the status quo. For example, at the beginning of the pandemic, Tyler Cowen and Patrick Collison launched Fast Grants, which provided grants for COVID-19 related projects in under 14 days. Other philanthropists have designed funding mechanisms that are specifically designed to support high-risk, high-return ideas by empowering reviewers to back non-consensus ideas. Schmidt Futures and the Astera Institute are supporting Focused Research Organizations, projects that tackle key bottlenecks in a given field that are difficult to address using traditional funding mechanisms.
- Early philanthropic support can catalyze additional support from federal science agencies such as NIH. For example, peer reviewers in NIH study sections often want to see “preliminary data” before they recommend funding for a particular project. Philanthropic support could generate evidence not only for specific scientific projections, but for novel approaches to funding and organizing scientific research.
- Physicist Max Planck observed that science progresses one funeral at a time. Early career scientists are likely to have new ideas, and philanthropic support for these ideas could accelerate scientific progress.
Below is a copy of the Q&A conducted over email between me and Martin Borch Jensen.
Tom Kalil: What motivated you to launch Impetus grants?
Martin Borch Jensen: Hearing Patrick Collison describe the outcomes of the COVID-19 Fast Grants. Coming from the world of NIH funding, it seemed to me that the results of this super-fast program were very similar to the year-ish cycle of applying for and receiving a grant from the NIH. If the paperwork and delays could be greatly reduced, while supporting an underfunded field, that seemed unambiguously good.
My time in academia had also taught me that a number of ideas exist that have great potential impact but fall outside of the most common topics or viewpoints, and thus have trouble getting funding. And within aging biology, several ‘unfundable’ ideas turned out to shape the field (for example, DNA methylation ‘clocks’, rejuvenating factors in young blood, and the recent focus on partial epigenetic reprogramming). So what if we focused funding on ideas with the potential to shape thinking in the field, even if there’s a big risk that the idea is wrong? Averaged across a lot of projects, it seemed like that could result in more progress overall.
TK: What enabled you to do this, given that you also have a full-time job as CSO of Gordian?
MBJ: I was lucky (or prescient?) in having recently started a mentoring program for talented individuals who want to enter the field of aging biology. This Longevity Apprenticeship program is centered on contributing to real-life projects, so Impetus was a perfect fit. The first apprentices, mainly Lada Nuzhna and Kush Sharma, with some help from Edmar Ferreira and Tara Mei, helped set up a non-profit to host the program, designed the website and user interface for reviewers, communicated with universities, and did a ton of operational work.
TK: What are some of the most important design decisions you made with respect to the competition, and how did it shape the outcome of the competition?
MBJ: A big one was to remain blind to the applicant while evaluating the impact of the idea. The reviewer discussion was very much focused on ‘will this change things, if true’. We don’t have a counterfactual, but based on the number of awards that went to grad students and postdocs (almost a quarter) I think we made decisions differently than most funders.
Another innovation was to team up with one of the top geroscience journals to organize a special issue where Impetus awardees would be able to publish negative results – the experiments showing that their hypothesis is incorrect. In doing so, we both wanted to empower researchers to take risks and go for their boldest ideas (since you’re expected to publish steadily, risky projects are disincentivized for career reasons), and at the same time take a step towards more sharing of negative results so that the whole field can learn from every project.
TK: What are some possible future directions for Impetus? What advice do you have for philanthropists that are interested in supporting geroscience?
MBJ: I’m excited that Lada (one of the Apprentices) will be taking over to run the Impetus Grants as a recurring funding source. She’s already started fundraising, and we have a lot of ideas for focused topics to support (for example, biomarkers of aging that could be used in clinical trials). We’re also planning a symposium where the awardees can meet, to foster a community of people with bold ideas and different areas of expertise.
One thing that I think could greatly benefit the geroscience field, is to fund more tools and methods development, including and especially by people who aren’t pureblooded ‘aging biologists’. Our field is very limited in what we’re able to measure within aging organisms, as well as measuring the relationships between different areas of aging biology. Determining causal relationships between two mechanisms, e.g. DNA damage and senescence, requires an extensive study when we can’t simultaneously measure both with high time resolution. And tool-building is not a common focus within geroscience. So I think there’d be great benefit to steering talented researchers who are focused on that towards applications in geroscience. If done early in their careers, this could also serve to pull people into a long-term focus on geroscience, which would be a terrific return on investment. The main challenges to this approach are to make sure the people are sincerely interested in aging biology (or at least properly incentivized to solve important problems there), and that they’re solving real problems for the field. The latter might be accomplished by pairing them up with geroscience labs.
TK: If you were going to try to find other people who could play a similar role for another scientific field, what would you look for?
MBJ: I think the hardest part of making Impetus go well was finding the right reviewers. You want people who are knowledgeable, but open to embracing new ideas. Optimistic, but also critical. And not biased towards their own, or their friends’, research topics. So first, look for a ringleader who possesses these traits, and who has spent a long time in the field so that they know the tendencies and reputations of other researchers. In my case, I spent a long time in academia but have now jumped to startups, so I no longer have a dog in the fight. I think this might well be a benefit for avoiding bias.
TK: What have you learned from the process that you think is important for both philanthropists considering this model and scientists that might want to lead an initiative in their field?
MBJ: One thing is that there’s room to improve the basic user interface of how reviews are done. We designed a UI based on what I would have wanted while reviewing papers and grants. Multiple reviewers wrote to us unprompted that this was the smoothest experience they’d had. And we only spent a few weeks building this. So I’d say, it’s worth putting a bit of effort into making things work smoothly at each step.
As noted above, getting the right reviewers is key. Our process ran smoothly in large part because the reviewers were all aligned on wanting projects that move the needle, and not biased towards specific topics.
But the most important thing we learned, or validated, is that this rapid model works just fine. We’ll see how things work out, but I think that it is highly likely that Impetus will support more breakthroughs than the same amount of money distributed through a traditional mechanism, although there may be more failures. I think that’s a tradeoff that philanthropists should be willing to embrace.
TK: What other novel approaches to funding and organizing research should we be considering?
MBJ: Hmmm, that’s a tough one. So many interesting experiments are happening already.
One idea we’ve been throwing around in the Longevity Apprenticeship is ‘Impetus for clinical trials’. Fast Grants funded several trials of off-patent drugs, and at least one (fluvoxamine) now looks very promising. Impetus funded some trials as well, but within geroscience in particular, there are several compounds with enough evidence that human trials are warranted, but which are off-patent and thus unlikely to be pursued by biopharma.
One challenge for ‘alternative funding sources’ is that most work is still funded by the NIH. So there has to be a possibility of continuity of research funded by the two mechanisms. Given the amount of funding we had for Impetus (4-7% of the NIA’s budget for basic aging biology), what we had in mind was funding bold ideas to the point where sufficient proof of concept data could be collected so that the NIH would be willing to provide additional funding. Whatever you do, keeping in mind how the projects will garner continued support is important.
Biden, You Should Be Aware That Your Submarine Deal Has Costs
For more than a decade, Washington has struggled to prioritize what it calls great power competition with China — a contest for military and political dominance. President Biden has been working hard to make the pivot to Asia that his two predecessors never quite managed.
The landmark defense pact with Australia and Britain, AUKUS, that Mr. Biden announced this month is a major step to making that pivot a reality. Under the agreement, Australia will explore hosting U.S. bombers on its territory, gain access to advanced missiles and receive nuclear propulsion technology to power a new fleet of submarines.
Why and How Faculty Should Participate in U.S. Policy Making
If the U.S. Congress is to produce sound policies that benefit the public good, science and technology faculty members must become active participants in the American policy-making process. One key element of that process is congressional hearings: public forums where members of Congress question witnesses, learn about pressing issues, develop policy initiatives and conduct oversight of both the executive branch and corporate practices.
Faculty in science and technology should contribute to congressional hearings because: 1) legislators should use data and scientifically derived knowledge to guide policy development, 2) deep expertise is needed to support effective oversight of complex issues like the spread of misinformation on internet platforms or pandemic response, and 3) members of Congress are decision makers on major issues that impact the science and technology community, such as research funding priorities or the role of foreign nationals in the research enterprise. A compelling moment during a hearing can have a profound impact on public policy, and faculty members can help make those moments happen.