Adam and Eve. Pandora. Frankenstein. Pick your dangerous knowledge myth, and you’ll pick up on humanity’s deep ambivalence about scientists experimenting on the covid virus almost three years into the pandemic.
The latest reminder came last week, when a Boston University virology team released a preliminary report of an experiment that stitched part of the omicron variant of covid — the more infectious variant now most prevalent worldwide — onto the original 2020 version of the virus, creating a chimera, or combination of past SARS-CoV-2 types. It’s a classic example of the “dual-use” conundrum: investigating how viruses might go bad also teaches you how to make viruses go bad, knowledge that can be misused to cause harm.
The United Kingdom’s Daily Mail, which broke the news, quoted scientists calling the chimera experiment “playing with fire” and “profoundly unwise.” A federal health official said she wished the experimenters had informed her institute, which partly funds their lab. Boston University called the news reports “false and inaccurate,” noting that the study had gone through a biosecurity review at the school.
Scientists and governments have wrestled for decades with how to balance the potential harms of dual-use research — in debates over pandemic prevention spanning the anthrax investigation of the 2000s, the transmissible flu experiments in ferrets a decade ago, the “lab leak” fight over the origins of covid and now the questions surrounding the chimera experiment. Only last month, the National Institutes of Health’s National Science Advisory Board for Biosecurity (NSABB) called for stronger experiment oversight, saying current rules miss pathogens that “pose a severe threat to public health or national security.” The pandemic and the Boston incident show that decades of stumbling in the dark toward an effective oversight system for pandemic pathogens haven’t produced a solution for dual-use virus research — and that finding one is more urgent than ever, with covid still ripping around the globe and throwing off new variants.
Pandemic prevention and lab safety rules “only move in fits and starts,” said biodefense professor Gregory Koblentz of George Mason University, pointing to the long list of past controversies. “And we only make progress where there is some crisis, or perceived crisis, that grabs people’s attention.”
“Unfortunately, if these things get politicized, that makes it harder to find a solution,” he added.
That may have already happened. Even as the NIH’s biosecurity board is moving forward releasing new biosecurity guidelines for both dual use and “potential pandemic pathogen” research, the lab leak debate has devolved into Sen. Rand Paul (R-Ky.) sparring with — and fundraising off — National Institute of Allergy and Infectious Diseases chief Anthony Fauci in hearings. Republican members of Congress, who have promised lab leak investigations after the November elections, on Wednesday sent a demand for documents on the variant mash-up experiment to Boston University.
“We’ve got to show that we’re being responsible, because there’s all these people out there who are trying to characterize virologists as a bunch of cowboys in the Wild West, just doing whatever the hell they want with no attention to safety or security,” said molecular biologist and virologist Michael Imperiale of the University of Michigan.
“That’s not the case,” he added. “I don’t know a single scientist who’s not concerned. If for no other reason than that if someone’s doing an experiment in their lab, and they end up with something that’s pathogenic, who’s going to be the first people exposed? People in that lab, right? No one wants to do that.”
In the Boston University experiment, the researchers investigated why the omicron variant of the covid virus seems less deadly than the original version from 2020. Was it because of mutations to the spike protein the newer variant uses to infect people, or due to mutations in the body of the virus, which contains the viral machinery that replicates inside infected cells?
To find out, they attached the omicron variant’s spike — which first latches onto cells to start an infection — to the older, original body of the virus.
The results — which are still preliminary and await peer review — suggest that the spike plays only a small role in the deadliness of the virus: The chimera creation of an original virus body equipped with omicron spikes proved 80 percent fatal to lab mice. That falls between the unaltered original virus, which kills 100 percent of mice, and unaltered omicron, which by itself didn’t kill any mice. The chimera also produced neurologic symptoms, hunched postures and unresponsiveness in the animals that omicron by itself didn’t cause. That suggests the body of the omicron virus particle doesn’t attack the nervous system like the original virus does.
In the days after the Daily Mail story, which highlighted the 80 percent fatality rate, public debate centered on whether the virus displayed a dangerous new gain of function in the experiment. Scientific observers disagreed on this point, with Harvard University epidemiologist Marc Lipsitch saying that it was “unquestionably” a gain of function, because adding the more infectious spike to the original kind of covid made it more infectious. Others such as Science Magazine columnist Derek Lowe, a medicinal chemist, disagreed, noting the chimera was less fatal than the original covid virus. “So this was not a gain-of-function experiment,” he wrote.
But debate over whether research is gain of function (or not) is a distraction, not the point, said security scholar Klaus-Peter Saalbach of Germany’s University of Osnabrueck. “Any genetic modification which results in new functions is gain of function research and obviously a chimeric virus with new properties is gain of function research,” he said by email.
What scientists are really arguing about — and what matters — is “gain of function research of concern” (perhaps inevitably acronym-ed as GOFROC), which creates potentially dangerous viruses, or “potential pandemic pathogens,” said Saalbach — the latter being the actual term used in science policies worldwide.
“‘Gain of function’ — we should retire that term, it really doesn’t help us in that debate,” said Koblentz. “It has become shorthand for a class of research that people are worried about because of the risks it poses, but it is a term that really has outlived its usefulness.”
Confusion over what “gain of function” means contributed to the repeated circular, tail-chasing exchanges between Fauci and Paul over chimeric bat coronavirus experiments conducted before the pandemic. (“Senator Paul, you do not know what you are talking about,” said Fauci, at one point.) It also made the term an unlikely political football. The National Institutes of Health noted in a 2021 letter to Sen. Charles Grassley (R-Iowa) that potential pandemic pathogen standards, not gain of function ones, were the determining factors in approving these now-controversial experiments conducted by U.S. and Chinese researchers since “they were not reasonably expected to increase transmissibility or virulence of these viruses in humans.”
“I don’t think ‘gain of function’ is a red herring, but I think we have to make clear what it means and what it doesn’t mean,” said Imperiale. Many labs regularly do experiments that cause some incremental gain of function to viruses, bacteria and other organisms, after all. More significantly, Science Magazine has reported that chimera experiments similar to the Boston one have already been conducted at the Food and Drug Administration and by a privately funded lab in Texas, without any uproar.
In the final analysis, said Imperiale, the Boston University chimeras were not potential pandemic pathogens, because the omicron spike in the experiment has already been outcompeted in the real world by even more transmissible versions. That means the accidental release of the lab-made Boston virus wouldn’t have led to a pandemic, because the latest variants would have kept it from transmitting. “We’re really concerned about gain of function as it relates to pathogens that have pandemic potential, right? That’s the concern,” he said.
In 2004, spurred by the deaths of five people in anthrax mailings traced to an Army biodefense strain of the pathogen, the National Research Council’s “Biotechnology Research in an Age of Terrorism” report first warned of the “dual use dilemma” and led a year later to the creation of the NIH’s NSABB, meant to review potentially dangerous virus experiments.
The board was not in the news again until 2011, when two labs announced they had made forms of avian influenza transmissible between mammals — ferrets in this case — triggering NSABB votes on their publications, followed by new U.S. regulations on 15 kinds of experimental organisms and toxins. Three years later, debate arose after Centers for Disease Control and Prevention high-containment labs mishandled both anthrax and avian flu samples. The ensuing furor divided scientists into camps arguing for (Scientists for Science) and against (Cambridge Working Group) dual-use virus research in these more secure labs. Many of these voices, and their unchanged positions, figure prominently today in both the lab leak and chimera debates.
Last month, the NSABB released a draft report calling for expanding oversight of lab pathogens beyond those with deadly pandemic risks to include ones that are merely highly transmissible (even if less fatal). The board also suggested that the kind of biosafety review Boston University conducted on its own on the chimera research should be incorporated into the federal oversight system.
“It’s good to see that we’ve now got thoughtful people considering this issue seriously again,” said Imperiale. Too often in the past, he suggested, the NIH board has gone dormant between crises, snapping awake only to play catch-up in a calamity.
Nature of the beast
There’s one point all of the experts who spoke to Grid agreed on — the Boston University chimera experiments do point to a need for stronger federal government oversight of potentially dangerous bugs. The fact that we are still debating whether to review genetically altering known pandemic pathogens, not even potential ones, said Koblentz, “is an indictment of both the self-governance model that the virology community largely supports and the current policy.”
For starters, the chimera experiment shows NIH should better define what research is viewed as risky — particularly on the question of transmissibility as well as lethality — and better explain responsibilities so scientists don’t learn only later, from news reports, that funding agency officials wish they had been consulted beforehand. The chimera case also shows that, surprise, scientists might conduct pandemic bug research apart from their federal grants, another loophole in current rules that needs to be fixed.
Without new laws, the NIH cannot compel private foundations or companies to follow these tighter rules. But Imperiale said they likely would anyway, both because the biomedical community largely takes its lead from the $45 billion research agency, and because NIH funding penetrates almost every lab in some fashion, from training to equipment to collaborators. Just following federal rules means fewer headaches.
The rub is that those new rules must draw the line at truly dangerous experiments, Imperiale added: “If you start reviewing too many things, you do potentially throw sand in the gears of scientific progress.”
There is an inevitable tension built into biomedical research between people who want to experiment as fast as possible and those who want to slow down, he added.
“We’re never going to have a perfect system,” he said. “It’s just the nature of the beast. But it would be great if we could come to some agreement on what experiments cross the line.”
Thanks to Lillian Barkley for copy editing this article.