A HISTORICAL PERSPECTIVE ON EPIDEMIOLOGY RESEARCH ETHICS

Harrison Helms

As the globe stands engulfed in the coronavirus pandemic, Emory University has emerged as a city upon a hill for epidemiology research. “Tucked away in Druid Hills,” proclaims Malcolm Gladwell, “is a really remarkable place that has played a key role in our understanding of infectious diseases for 75 years.” U.S. News ranked Emory’s infectious diseases program fifth in the nation and ninth internationally. The prestige of Emory’s epidemiology research, heightened by the current pandemic, makes ethical questions about how that research is conducted all the more important and relevant. The use of both animal and human subjects requires ethical consideration, yet history is rife with examples of abominable treatment of test subjects. The implementation of research ethics has historically been reactive, with institutions responding to disasters in human research. In general, however, research has become increasingly ethical as stricter standards have been applied, and viewing research at Emory against the backdrop of this history offers a new perspective on questions of ethics. 


Vaccinations have become the keystone of public health, illustrated by the race for a coronavirus vaccine, in which Emory is participating. English physician Edward Jenner is credited with laying the foundation for vaccine research. In 1796, he inoculated an eight-year-old boy with fluid from a cowpox pustule, providing him immunity to smallpox. This is not the sole example of vaccination research on young children. French scientist Louis Pasteur continued Jenner’s work on vaccine development, inoculating a nine-year-old boy with an experimental rabies vaccine. This early research foreshadowed the ethical questions that would later be raised. 


Abhorrent research practices became rampant during World War II. Between 1932 and 1945, Imperial Japan used Chinese prisoners of war as test subjects in a myriad of atrocious experiments, including vaccine research. Nazi Germany also used concentration camp prisoners for despicable medical experiments. Following the end of the war, these morally reprehensible and starkly unethical experiments were examined at the Nuremberg trials, where the victorious Allied Powers prosecuted war criminals. A product of the trials was the Nuremberg Code, which established ethical standards for research involving human experimentation. The Code emphasized the necessity of consent to participation and the protection of test subjects’ welfare by minimizing risk to the greatest extent possible. It was the first international ethics code for human research; many more would follow in the second half of the twentieth century. 


The Declaration of Helsinki was published in 1964 by the World Medical Association. As with the Nuremberg Code, consent and the welfare of subjects were of paramount importance. Unethical research practices were becoming increasingly recognized and exposed within the scientific community. In 1966, Henry Beecher published “Ethics and Clinical Research” in the New England Journal of Medicine, documenting twenty-two unethical studies. In 1972, the Tuskegee syphilis study, in which Black American men with syphilis were deliberately left untreated and never informed of their diagnosis, was exposed to the public. Beecher’s article, along with growing awareness of the unethical Tuskegee study, set off a chain reaction of new ethical rules governing human research.


Along with the exposé of unethical studies, the animal rights movement was gaining traction, led in part by Australian philosopher Peter Singer. In his book Animal Liberation, Singer maintains that the vast majority of animal research is unethical. Questions of ethics regarding animal research certainly pertain to Emory, which operates the Yerkes National Primate Research Center. The Center has repeatedly been embroiled in controversy over animal rights, with protesters assembling to demand its closure as recently as August 2020. 


Congress eventually took action on research ethics in 1974, spurred in part by the Tuskegee revelations. The National Research Act, signed into law by President Richard Nixon, authorized federal agencies to regulate human research. This oversight was carried out through Institutional Review Boards, which monitor human research at individual institutions. Major revisions to the federal code of research ethics were later made by the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, which the Act created. The Commission published the Belmont Report in 1979, which, like previous ethics codes, emphasized consent and also called for a risk-benefit analysis when using human test subjects. The federal regulations were revised again in 1981. 


With academia serving as the training ground for young researchers, graduate students at the end of the twentieth century began receiving formal education in research ethics. In 1989, the National Institutes of Health required all graduate students funded by its grants to receive training in the responsible conduct of research. The same year, the National Academy of Sciences published On Being a Scientist, an accessible resource on ethical research practices. The Laney Graduate School at Emory enrolls approximately 1,800 graduate students each year, and these future researchers are exposed to the ethical questions and standards surrounding research. 


The Tuskegee syphilis study was the first great catalyst of the twentieth century, but as its horrors receded into the past, the spotlight on research ethics began to fade. The spark was reignited in 1999, when a test subject at the University of Pennsylvania died in a gene therapy experiment. The federal government responded by requiring all researchers conducting human experiments to be trained in research ethics. The National Academy of Sciences also published Integrity in Scientific Research in 2002, which set guidelines for educating researchers in ethical research conduct. 


Scientific research is not a Machiavellian game. While great strides such as the development of vaccines have vastly improved human life, the consensus is that these advances do not justify the inhumane and unethical treatment of human test subjects. Research at Emory, such as studies of the association between blood markers and the outcomes of COVID-19 patients, demands ethical consideration. While ethical research standards have typically been the product of research disasters, they have improved greatly, particularly over the course of the twentieth century. With the abundance of epidemiological research occurring at Emory University, much of it involving animal and human test subjects, the paramount importance of ethical research practices must always be remembered.