Controversy erupted in 2011 when two researchers performed experiments on a highly transmissible form of bird flu virus. Now, the U.S. government has announced new policies for reviewing such potentially risky research before it gets funded.
The U.S. government released a framework yesterday (Feb. 21), detailed in a forum in the journal Science, for determining whether to fund research that could create a version of the H5N1 bird flu virus that could infect mammals by airborne droplets. The recent bird flu experiments sought to do exactly that, in the hope of understanding how such a virus might evolve in the wild. The White House also put out a draft policy yesterday to help research institutions assess so-called "dual use" research that could do both good and harm. Some people fear the mutant virus and other pathogens could escape the lab or be used as a bioweapon by terrorists.
About 600 confirmed human infections with the H5N1 virus have occurred since 2003, roughly 60 percent of which were fatal (though this fatality rate may be inflated, since mild cases can go undiagnosed and uncounted). The bug does not pass easily between humans, but if it were to acquire that ability, it could potentially cause a pandemic.
The first of the two policies, a framework for dealing with research on highly infectious strains of H5N1 virus, requires that funding agencies and the Department of Health and Human Services both review the research. The document lays out seven criteria that must be met in order to grant funding. For instance, the research must be done only on viruses that could evolve naturally, and the risks to lab workers and the public must be manageable.
Some scientists feel the framework is a step in the right direction. "I think the government has done a good job here in terms of framing the discussion," virologist Michael Imperiale of the University of Michigan told LiveScience. The framework document provides "a mechanism for reviewing this type of research before it gets funded that I think is fair and comprehensive," Imperiale said.
Others are heavily critical. "What was initially a weak policy has been transformed into an empty policy," molecular biologist Richard Ebright of Rutgers University, in New Jersey, told LiveScience. The framework applies to a very narrow set of risky experiments, and does not provide a true risk-benefit assessment, Ebright said.
The second policy released yesterday is a draft set of guidelines for how research institutions should handle controversial research more generally. It applies to research on 15 deadly pathogens, including highly infectious H5N1, Ebola virus and others, as well as seven categories of experiments that make a germ or toxin more lethal.
If the second policy is implemented, Imperiale thinks it will help institutions manage the risks and benefits of research with these pathogens. One concern has been that the extra level of scrutiny applied to this research could hinder or prevent work that is vital to public health. "I'm going to be interested to see what institutions notice. Are they going to see things in [the policy] that are burdensome?" Imperiale asked.
On the flip side, the policy might not be restrictive enough. The responsibility of evaluating risk falls to research institutions, and not all of them are equipped to make those assessments, Ebright said. What's more, he said, having universities evaluate work by their own scientists is a conflict of interest.
What's at stake
One of the greatest concerns over research with deadly pathogens is the possibility of accidental release by a researcher who becomes infected. Another risk is that individuals or rogue governments could get hold of the pathogens and use them for terrorism, as in the 2001 anthrax attacks.
Researchers who work with pathogens like H5N1 argue that studying them is critical for understanding how dangerous pathogens might evolve naturally and cause an outbreak, and that such work would improve preparedness should that scenario occur.
Ultimately, any policies that attempt to manage risky research will need to address issues at a global level. "There's nothing said [in the policies] about any international agreement," biosecurity expert Harvey Rubin of the University of Pennsylvania told LiveScience. "The level of involvement by every country is so critical, whether they're doing the research, or they're the recipient of the value of the research to public health. Everybody has a stake in this."
The disputed experiments on the H5N1 virus were conducted in ferrets in 2011 by teams at the University of Wisconsin and Erasmus University in the Netherlands. The controversy centered on whether the studies ought to be published, or indeed should have been conducted in the first place. The concerns prompted a voluntary moratorium on the research, which was recently lifted.