BALTIMORE — When University of Maryland School of Medicine researchers were looking for volunteers to test a vaccine for Ebola, which was killing thousands of people in West Africa, Andrea Buchwald raised her hand in Baltimore.
“Scientific curiosity,” the 29-year-old graduate research assistant in Maryland’s department of epidemiology said, explaining why — along with her trust in the system governing treatment of human subjects — she was willing to be experimented on.
“Consent for clinical trials is a very stringent process,” Buchwald added. “You’re expected to do your best to ensure your participants are fully informed and doing this of their own volition. Things have changed a lot since the 1940s.”
It was during that decade that hundreds of Guatemalans were infected with sexually transmitted diseases in the name of research — a horrific chapter brought back to light last month when the Johns Hopkins University was sued for $1 billion by research subjects and their families over its role in approving federal funds for the study. Hopkins officials said the university didn’t develop or oversee the study and was not responsible.
The days when researchers used impoverished populations, prisoners, prostitutes, orphans and others as human guinea pigs are largely in the past, most would agree. In the most infamous case — the Tuskegee study that ran from the 1930s to the 1970s — black Alabama men with syphilis were left untreated so researchers could trace the terrible progression of the disease.
But even today, concerns arise periodically about the use of human subjects in clinical trials, especially as research institutions and pharmaceutical companies increasingly go abroad to test new drugs and vaccines in countries where oversight can be more lax.
In India, for example, a rash of reports several years ago of people dying during clinical trials, or being enrolled without proper consent, led to an uproar and a government crackdown on what had become a booming industry for the fast-developing country.
“I think there is lingering fear and suspicion of research in many quarters,” said Dr. Daniel Kuritzkes, a Harvard virologist who had three research studies in India interrupted by the government’s scramble to enact new regulations.
“That’s unfortunate because for the most part, there has been worldwide adoption of laws governing how human subjects are protected in research,” said Kuritzkes, who chairs the AIDS Clinical Trials Group, a National Institutes of Health program that conducts research in conjunction with institutions around the world. “Things are done very differently than in the past.”
Case by case
Researchers who receive NIH funding, for example, must get the approval of their organization’s Institutional Review Board, which determines whether a proposed study protects human subjects, properly weighs the risks and benefits to them and can document that they provided informed consent.
At Hopkins Medicine alone, there are six such boards that meet on its Baltimore campuses weekly for three hours apiece to handle the volume of research. The boards approve about 1,800 new protocols a year and oversee about 6,100 trials, according to Hopkins.
But the boards operate in private — Hopkins officials would not allow reporters to observe a meeting for this article — so it can be difficult to assess their work.
“There are tremendous amounts of variability,” said Laura Stark, a professor at Vanderbilt University who wrote the 2011 book “Behind Closed Doors: IRBs and the Making of Ethical Research.”
“The ethical thing to do in one system may not be considered ethical in another system.”
The Institutional Review Board system has its origins in a medical past when doctors had much freer rein. For example, she writes, in the 1940s and 1950s, Mennonites, Quakers and other religious objectors to war were put in service to their country as research subjects for the NIH.
Some of them were marooned on what is now Roosevelt Island in New York so scientists could study the minimum amount of food and water shipwreck victims might need, Stark writes. Additionally, NIH researchers would drive from their campus to nearby Jessup or to Lorton, Virginia, to avail themselves of prisoners for studies, she writes.
Once the NIH started funding more research off its own campus, officials realized they would need a way to make sure those institutions followed certain standards — and to limit their own liability should something go wrong.
“Basically, the review system got tied to money: If you wanted the money, you had to have a review process,” Stark said.
The research community has long acknowledged the need to protect human subjects. The modern framework traces to the Nuremberg Code of 1947, which established the idea of consent in response to German physicians who experimented on prisoners during World War II; in the decades that followed, officials began enacting laws and regulations for researchers receiving federal dollars.
Then, in 1974, the U.S. passed the National Research Act to codify protections for research subjects. That led to the landmark Belmont Report, which spelled out principles of ethical treatment.
Dr. Christopher Plowe, a malaria researcher, said the rules were a “strong and appropriate reaction to Tuskegee, among others.”
A lot of current researchers began their careers after the rules were put in place and know no other way, said Plowe, the new director of the University of Maryland School of Medicine’s Institute for Global Health. Many researchers, like him, have even voluntarily strengthened the consent process overseas to address lingering distrust and ensure a study’s integrity.
Keeping up with the times
For all the improvement in protections for human subjects, there are those who say that laws and regulations have failed to keep up with changes in medicine and research.
Seema Shah, head of the NIH unit on international research ethics, said the last revision of regulations stemming from the research law came in 1991, before researchers used social media or could map the human genome, both of which raise privacy questions that remain unaddressed.
“The goal in creating laws and ethical norms is to prevent the scandals of the past from happening in the future,” Shah said. “A lot of studies in the past prompted concern from the public and led to changes. But since there is no big crisis now, maybe some things aren’t being put into law.”
When tragedy strikes in the midst of a trial, the repercussions can extend beyond the issue of victim compensation and lead to institutionwide changes.
Hopkins was shaken when Ellen Roche, 24, died in 2001 after inhaling an experimental chemical during a study of how healthy people’s lungs defend against asthma attacks.
The federal government briefly suspended virtually all of the medical institution’s experiments involving human subjects, an astonishing blow to Hopkins, which perennially garners the most research dollars in the nation. While the trials eventually resumed, federal officials faulted the researcher for failing to get proper approval of the experimental drug or to disclose its risks to volunteers.
Liz Martinez, a Hopkins research participant advocate who has worked at the institution for about 30 years, remembers that “painful” time.
“She was one of us,” Martinez said of Roche, who had worked as a lab tech at Hopkins’ asthma center.
The crisis led to changes that make Martinez feel research subjects are much better protected. Hopkins increased the number of review boards and created the research participant advocate position that Martinez has held for eight years.
“Participating in research is never perfectly safe,” she said. “It wouldn’t be research if there wasn’t risk. You can’t take it away. But you can improve it.”