The Department of Energy says COVID-19 was caused by a lab leak. But that doesn’t mean COVID-19 was definitely caused by a lab leak. In fact, the agency’s report, which made headlines last week, states it has “low confidence” in its own conclusion. Scientific evidence, on the other hand, has overwhelmingly pointed toward a natural spillover from animals to humans — the same origin of nearly every other outbreak in history and a growing threat.
What this DOE report does prove is that COVID-19’s origins have become extremely politicized. With congressional hearings, a federal advisory board and government reports pitting the conclusions of the U.S. intelligence community against those of scientists, it raises the question: If a lab leak were the true origin of COVID-19, would it change how we prepare for pandemics? Clearly, a sizable portion of the American political system is engaged with the idea that proving COVID-19 began with a lab leak matters. But why does it matter? Is it motivated by the political goal of assigning blame or the public health goal of planning for the future?
This is not, after all, the first time scientists have thought about lab safety. Or the first time anyone has talked about, or implemented, biosecurity regulations. We spoke to Angela Rasmussen, a virologist and one of the scientists investigating COVID-19’s origins, to understand how scientists were thinking about lab safety before the pandemic politicized it — and what needs to happen to protect the world from pandemics, wherever they might come from.
Maggie Koerth: Whether COVID-19 came from a lab leak or from a zoonotic spillover, does that change what we have to do to prepare for or mitigate the risk of future pandemics?
Angela Rasmussen: Yes, it matters. But also, no, it doesn’t. So we already know that lab accidents are potentially dangerous. We also already know that they’ve happened before. But as far as I know, there’s been one lab-associated pandemic, and that was a flu pandemic in 1977, which started … likely as a result of a vaccine trial. Lab-acquired infections are a known risk. We also know how to mitigate that risk pretty well. That doesn’t mean we eliminate the risk. But it does mean we can mitigate it significantly, and of course we should. No matter what, we’re not going to be like, “Oh, it was zoonotic. So let’s not worry about biosafety and biosecurity anymore.” Nobody at all is ever going to say that because we were already worried about biosafety and biosecurity.
But I do think that the reason why it matters where the pandemic came from is that if people are saying that it’s a lab leak, one of the consequences of this is that people are going to say, “It was that person’s fault, get them out of here, hold them accountable, whatever. And maybe shut down a lab. But don’t worry about zoonosis.” When in reality, we should not neglect lab safety. And we should certainly hold people accountable. But we also shouldn’t be taking resources away from the threat of zoonosis because it’s a much, much bigger threat.
MK: Biosafety and biosecurity, and regulation around those things, aren’t new ideas. What was the discussion around lab safety like in the years leading up to COVID, before it was politicized?
AR: If you really want to go all the way back to the beginning, that was in response partly to the anthrax attacks after 9/11. I think that was a wake-up call to the government that people could conduct bioterrorism with few resources. What came out of that was the select agent regulations, which basically made it a federal crime to mishandle a list of pathogens that are considered to be exceptionally dangerous or candidates for development into bioweapons. This differs from country to country.
Then in 2010, two experiments were done: one by Yoshihiro Kawaoka and one by Ron Fouchier. They basically selected variants of H5N1 [influenza virus] that were transmissible amongst ferrets by the airborne route. And that got a lot of people’s attention because H5N1 doesn’t transmit very efficiently between people by the airborne route. It really doesn’t transmit efficiently between people at all. The fear of so-called gain of function experiments began with those data, and there was a whole big discussion about, you know, should they publish their findings? Should they publish their methods? They did end up publishing both. But then there was a moratorium that was imposed on all research on influenza and coronaviruses that would potentially be considered gain of function. [That led to] the Potential Pandemic Pathogen Care and Oversight Framework, and that is how, currently, the [National Institutes of Health] or the Department of Health and Human Services regulates [biosafety].
MK: You’re talking here about two different things that both sort of get lumped together under the word “biosafety”: regulation of lab security and practice, and regulation of types of experiments that could be dangerous. Should we be talking about those separately?
AR: Boy, on this topic I’m seeing everything getting mixed up. I think some of it reflects the fact that some people, because of the lab leak theory, which can be a very compelling story, have just decided, “This seems like a pretty dangerous thing. I guess I better jump into this field now.” And I think there’s a lot of people who are essentially newcomers to it, who don’t necessarily have a lot of technical knowledge. So I do think that these efforts are being sort of wrongly conflated.
MK: What are your thoughts on the recommendations made by the National Science Advisory Board for Biosecurity, which earlier this year issued a report that was particularly centered on new proposed regulations for gain of function research?
AR: They basically said that now we need to start regulating everything that deals with a pathogen. Their new recommendations were maddening because they were at once so vague and so potentially broad in scope that they could mean essentially a stoppage of any research that involves a virus, even if it’s a virus that doesn’t make anything or anybody sick.
It has kind of turned into this creep, where a lot of people who don’t actually do any of this work and don’t have a very good technical understanding of it can come in and say, “Oh, look at this, this experiment could cause a zombie apocalypse, or this could cause the extinction of the human race with transmissible bird flu.” And none of these things are really very accurate or likely. But a lot of that sort of fear-based messaging is, I think, really motivating. It really is a big problem. And many of my colleagues and I have been talking about this. If we can’t do a lot of the work that we do, [the result is] no more vaccines, no more antiviral therapies, no more understanding zoonotic pathogens that are going to emerge.
MK: I think it’s easy to see this, though, as an industry — in this case, virology — pushing back against regulation, just because that’s what industries do. Are there regulations you would support? Things you’d like to see changed?
AR: I do think that making the [Potential Pandemic Pathogen Care and Oversight Framework] more broadly applicable to multiple agencies across the U.S. government, and making it smarter, [would be good].
We already have a lot of regulation. And we’re happy to comply with more, if that makes sense. But right now, I haven’t seen any proposals that actually make sense or that would meaningfully address actual threats that we do work with.
MK: If there were a lab leak situation that led to a pandemic, what kinds of changes would you expect to see? What would you like to see people who believe COVID-19 was a lab leak pushing for?
AR: The one thing that we should all be pushing for, and one thing that has kind of gotten lost in this conversation, is that every country regulates biosafety and biosecurity differently. Even between Canada and the U.S., for example, things are regulated differently. And this is kind of a big problem when it comes to responding to “lab leaks” or trying to think about this as an international effort. There is no sort of treaty mechanism that would allow an organization like [the World Health Organization] to put together a multinational working group that would have any enforcement power. There is discussion about a pandemic treaty at the World Health Assembly. So I think that really a lot of the focus should go toward that, because that’s going to address international collaborative efforts to get to the bottom of any sort of new emerging pathogen, whether it’s from a lab or whether it’s from nature.
MK: I’m curious if there’s anything valuable that COVID-19 and the arguments over where it came from have taught you about those cultural differences within science, and how science is done in different parts of the world.
AR: Yeah, it has been really instructive. I know people who are scientists, either from China or in China. What I’ve learned about the situation in China is that there are very, very different political pressures that scientists are working under. I’m familiar with the U.S. and the Canadian system. There’s always some politics involved. [But] there’s a whole different set of considerations that I think scientists in China have to deal with. There’s also a whole different reward system.
One of the arguments that, to me, is crazy is [that] the Wuhan Institute of Virology developed a secret reverse genetics clone of the [SARS-CoV-2] precursor and didn’t publish it, because [virologist] Shi Zhengli was afraid that she’d get blamed for starting the pandemic that hadn’t started yet. That makes zero sense to me because I found that in many Chinese institutions, people get significant cash bonuses when they write a Nature, Science or Cell paper. If you’re the senior author of these high-profile papers, you get compensated for it quite a lot, actually thousands of dollars.
I’ve also noticed that a lot of times when people are talking about safety practices especially, there’s an assumption that things are inferior in China. That’s not the case at all. These so-called safety problems that were occurring at the Wuhan Institute of Virology’s BSL-4 lab were actually not safety problems. They were, to me, indicative of a maximum containment lab doing what such labs normally do, which is constantly reevaluating how you might be able to do things better and installing multiple redundant systems so that you don’t have a containment breach.
MK: So you’re actually coming away from this a little bit more confident in international standards?
AR: I wouldn’t say I’m more confident in international standards because there aren’t any. I’m more confident, though, that in some ways, while China is different, and there are clearly different pressures that scientists in China have to work under, its institutions are kind of like the institutions in any other country I’ve worked in. I’ve never seen anything that indicates to me that there’s a culture of lax regulation of high containment labs.
Add to that, individual labs will have individual challenges. The U.S. Army Medical Research Institute of Infectious Diseases has had these terrible plumbing problems where their containment labs are getting flooded. There’s all these problems. So it’s not just China that would have potential safety issues, or literally physical problems, engineering failures with their containment labs. That can happen anywhere. I think the difference is how proactive your institution is about trying to prevent those from happening in the first place. But it looks to me like WIV is constantly working to improve its biosafety and biocontainment practices.