Pittsburgh Post-Gazette

LET’S MAKE SURE LAB-GROWN VIRUSES STAY IN THE LAB

By Faye Flam, a Bloomberg Opinion columnist covering science.

Researchers at Boston University sparked alarming headlines this week by creating a more lethal version of the omicron covid variant. At the heart of the uproar is the fact that the researchers didn’t have any obligation to inform anyone beyond an internal review board about what they were doing. Some officials at the National Institutes of Health only heard about the research through the media.

Another recent development could prove even more concerning: Nature reported last week on plans for 40 new virology labs being built around the world. Known as BSL-4 labs, they are designed to handle the most dangerous pathogens, and they’re being built in India, Singapore, the Philippines, Kazakhstan and Russia, among other countries. The ostensible aim is to make us safer, but even before this current pandemic, some virologists saw these BSL-4 labs as a problem — the germ equivalent of nuclear proliferation.

The bottom line is that the speed of scientific research has to be balanced with public safety.

The issue of lab safety has become politicized during the covid pandemic, as the political right has been more likely to favor the possibility that SARS-CoV-2 originated from a lab accident in China. (Only truly fringe conspiracy theorists think it came from a deliberate leak.) People on the left have been more likely to insist that the virus jumped from bats to humans, perhaps via another animal. So far, I don’t think there has been enough evidence to tell us definitively where it came from. But regardless of covid’s true origin, the best way to prevent the next pandemic is to increase precautions surrounding all potential avenues, whether that’s wet markets, bat guano collection, or research labs.

In the case of the BU researchers, there seems to be a gray area about how much detail they were obligated to report to government funding agencies. Even if they followed existing guidelines to the letter, though, we need clearer rules for researchers and stronger oversight to make sure the risks inherent in live-virus research don’t outweigh the potential benefits.

The debate has also put “gain of function” research back in an unflattering spotlight. That term isn’t well defined, but generally refers to research that alters viruses to change what they’re capable of doing. Such experiments have been extremely controversial, including an endeavor to create bird flu viruses that can be transmitted between mammals, attempts to alter bat coronaviruses to infect human cells, and experiments aimed at finding new possible iterations of SARS-CoV-2. But “gain of function” might also describe techniques that use altered viruses to deliver gene therapy to treat cancer and hereditary diseases. With such a broad definition, it’s not feasible or in the public’s best interest to ban all gain-of-function research.

One solution could be an outside body, such as the Office of Science and Technology Assessment, to judge whether experiments using live viruses are safe enough. That’s something Rutgers University biologist Richard Ebright suggested to me last year. That way, independent experts can weigh the risks and benefits of research with public safety as the overriding goal.

It’s possible that more oversight could slow down valuable research. Where does necessary transparency end and micromanaging begin?

The best we can do is find a balance between research speed, public safety and transparency when dealing with the modification of live viruses. More oversight here won’t necessarily bog down our understanding of the current pandemic. Many experiments can be done with so-called pseudoviruses, which use key structures from real viruses but don’t have the ability to replicate. These were important in work that was done quickly to understand the omicron variant when it emerged in South Africa last year, work that probably saved many lives by showing that the mRNA vaccines could still protect against this variant if people got a booster shot.

Unexpected things can go wrong when scientists work with dangerous viruses and bacteria. Accidents and even deliberate leaks have happened in the past.

Purdue University virologist David Sanders once told me that he’d been on a team inspecting a lab called Vector in Siberia, site of a 2004 Ebola leak that killed a worker and a suspected 1977 leak of a previously extinct flu strain, which subsequently spread worldwide. The book The Hot Zone, later adapted for the screen, is based on a true story about a deadly relative of Ebola cropping up in a primate facility in Virginia in 1989.

Or consider the anthrax attacks that took place in 2001 in the wake of Sept. 11. The U.S. biodefense community assumed they must be the work of foreign terrorists. But it turned out the attacks were carried out by an American scientist who worked in a high-security lab.

Blind trust in scientists isn’t being “pro-science.” Scientists can have motives other than the public’s best interest, including producing high-impact publications to further their careers. And sometimes, even with the best of intentions, they make mistakes.

Photo: Greg Baker/AFP via Getty Images. The speed of scientific research has to be balanced with public safety.
