Do Scientists Need an Ethics Code?

The turn of the year always brings along juicy lists to browse. As a list-obsessed person, I have recently read: Barnes & Noble’s “The 50 Best Works of Historical Fiction” (the Odyssey-inspired “Cold Mountain” by Charles Frazier is in there, along with books by Alice Hoffman and Philip K. Dick); Paul Rabin’s Top 25 Christmas Song countdown (including Lou Monte’s infamous “Dominick the Donkey”); “Critically Hated Movies that are Actually Awesome” (agree about “Constantine,” disagree about “The Cell”) on looper.com; the “Top 10 Modern Love columns of 2017” outlining some of the best entries appearing on the New York Times’ wildly popular page; and Business Insider’s “The most famous books that take place in every state” (New Jersey’s book is Junot Diaz’s “Drown.”)

But here’s one list that scared me, mainly because the larger roll it draws from keeps growing. The “Top 10 Retractions of 2017” was published earlier this month in The Scientist, a magazine for researchers in the life sciences. Among the transgressions reported is an effort by editors at the Journal of Translational Science in May 2017 to republish an already retracted study linking vaccines to autism. Thankfully, the article was retracted again within days. Also painful is the case of the Cornell University food scientist on the list whose work this year generated five retractions and 13 corrections.

Perhaps most spectacularly unnerving is the action by the publisher Springer to retract 107 (yes, 107) papers from the journal Tumor Biology. The retractions came after editors learned that the required “peer review,” a central practice of science in which expert colleagues vet research papers for qualities such as rigor, originality, and reproducibility prior to publication, never took place. As a result, the Chinese government concluded that about 500 researchers it supported who were tied to these papers were guilty of misconduct. They have been temporarily banned from working at their universities and institutes while the government continues its investigation.

Why would the number of scientists committing the kind of ethical boo-boos that lead to retractions be on the rise? Isn’t it absolutely clear how they should behave? Sadly, the answer is no. Unlike physicians, who swear fealty to the ancient Hippocratic Oath with its famous principle of “first, do no harm,” scientists do not follow any clearly defined universal code. Certainly, various associations and institutions boast ethical guidelines that members and employees are expected to follow. But a more generalizable one for the folks whose activities run the gamut from inquiries into dark matter to investigations into gene patterns? No.

In an article in Science Careers, the writer Beryl Lieff Benderly described the twin benefits the Hippocratic Oath bestows on the medical community and the larger public. “The powerful, ancient ritual of publicly promising to observe a select community’s standards has long symbolized the intense professionalism that binds physicians. It also places the obligation to ethical behavior at the center of professional identity,” Benderly wrote.

Scientists for decades have been working to establish a codified set of principles for their community. Their ideas go well beyond setting an ideal for honesty in research and attribution, the behaviors where the retractees enumerated in The Scientist fell short. In journalism, we follow a code outlined by the Society of Professional Journalists. Imperfect as the public may think we are, we journalists know our rules and our values. The case may be similar for a scientific code of conduct: to be used successfully, it should be explicit in its values and be widely known and accepted. Some advocates also believe the existence of such a statement would re-establish bonds among scientists, connections that have withered over time under the forces of a commercialism that drives labs to grow larger and subjects them to intense pressure to produce results.

The idea of an oath or code of ethics has been debated in the scientific community for more than a century. Philosophers such as Karl Popper wanted to find a mechanism by which scientists would consider the ethical implications of their advances. Not everyone, then or now, has agreed that such an oath is needed. The scientific community, opponents contend, could never agree on the wording of an oath. Others argue that each field of science is so specialized that a one-size-fits-all code could never be constructed.

The central issue for researchers seems to be deciding for themselves the answer to this question: Is the creation of knowledge separate from how it is used? The question applies, of course, to the development of the atomic bomb and to whether the Manhattan Project researchers bear responsibility for the destruction that ensued. But it is also deeply relevant today, in an era when biologists are employing CRISPR, a simple and powerful genome-editing tool, and artificial intelligence researchers are pursuing ever more capable and autonomous systems. I would like to believe that such scientists are aware of the implications of their actions. But wouldn’t it be better to be sure?

Should scientists follow a code of ethics? I vote “yes.” Looking to the future, wouldn’t it make us all feel better?

Kitta MacPherson