Thursday, August 17, 2017

The Nuremberg Code 70 Years Later

JAMA Viewpoint
The Nuremberg Code 70 Years Later. By Jonathan D. Moreno, PhD; Ulf Schmidt, PhD; Steven Joffe, MD, MPH

JAMA. Published online August 17, 2017. doi:10.1001/jama.2017.10265

Seventy years ago, on August 20, 1947, the US military tribunal in Nuremberg, Germany, delivered its verdict in the trial of 23 doctors and bureaucrats accused of war crimes and crimes against humanity for their roles in cruel and often lethal concentration camp medical experiments. As part of its judgment, the court articulated a 10-point set of rules for the conduct of human experiments that has come to be known as the Nuremberg Code. Among other requirements, the code called for the “voluntary consent” of the human research subject, an assessment of risks and benefits, and assurances of competent investigators. These concepts have become an important reference point for the ethical conduct of medical research. Yet there has long been considerable debate among scholars about the code’s authorship, scope, and legal standing in both civilian and military science. Nonetheless, the Nuremberg Code has undoubtedly been a milestone in the history of biomedical research ethics.1-3

Writings on medical ethics, laws, and regulations in a number of jurisdictions and countries, including a detailed and sophisticated set of guidelines from the Reich Ministry of the Interior in 1931, set the stage for the code. The same focus on voluntariness and risk that characterizes the code also suffuses these guidelines. What distinguishes the code is its context. As lead prosecutor Telford Taylor emphasized, although the Doctors’ Trial was at its heart a murder trial, it clearly implicated the ethical practices of medical experimenters and, by extension, the medical profession’s relationship to the state, understood as an organized community living under a particular political structure. The embrace of Nazi ideology by German physicians, and the subsequent participation of some of their most distinguished leaders in the camp experiments, demonstrate the importance of professional independence from, and resistance to, the ideological and geopolitical ambitions of the authoritarian state.

The circumstances in which the code was promulgated thus signified a tension between professional standards and duties to the state. There had long been an intense debate within the medical profession about its ethical obligations in the course of human experiments, dating back to 18th- and 19th-century objections to objectifying human beings for scientific purposes. The increased demands placed on modern states to promote the health and welfare of citizens in the 20th century required state agencies to respond to public pressure to protect participants in clinical trials. These debates were often stimulated by medical ethics transgressions or medical errors that attracted the wider attention of state agencies and the public at large, or by concerned physicians who regarded themselves as reformers and wished to improve their colleagues’ practices. At the center of these debates was the question of how to balance participants’ rights and welfare with the progress of medical science, for example, through professional guidelines and ethics codes or through greater state intervention, laws, and regulations. That the Nazi doctors’ crimes occurred despite the vigorous and sophisticated ethical debates of the time and place should serve as a cautionary tale for physicians today.

Exactly a century ago, for example, Heiman leveled substantial criticism at what he regarded as his colleagues’ irresponsible experimental practices in his book Etyka Lekarska (“Medical Ethics”).4 He vigorously criticized the reduction of human beings to experimental material, insisted on consent, and warned against taking advantage of desperate patients who might fear their physician will abandon them if they do not cooperate. This tradition in Eastern European medicine, which has persisted over the generations even amid catastrophic wars and massive political turmoil, demonstrates the importance of professional self-governance and advocacy in the face of an undemocratic and in many ways authoritarian state with little respect for individual rights. In 1967, 50 years after the publication of Heiman’s book and 20 years after publication of the code, the Polish Medical Association published strict “Deontological Ethical Rules” (Deontologiczno-Etyczne Zasady) about consent, voluntariness, risks and benefits, publication, and qualified investigators.5 Those rules were expanded 10 years later. Although rules do not necessarily translate into practices, it is notable that even at the height of the Cold War, under a socialist government, the association sent a copy of these rules to every physician in Poland. Whether this was effective in protecting patients who participated in research is unknown.

More recent scholarship has found a similar preoccupation with “medical deontology” (medical ethics) in other countries in that region, perhaps partly owing to their proximity to the concentration camps, the collective experiences of the Second World War, and the need to assert professional ethics against a background of authoritarian rule. For example, in East Germany, medical ethicists collaborated with state commissions, such as the Central Expert Committee for Drug Commerce (Zentraler Gutachterausschuss für Arzneimittelverkehr), to monitor the safety and ethical conduct of pharmacological experiments on thousands of patients. Rather than working in an entirely different ethical universe behind the so-called Iron Curtain, these experts, though seemingly removed from the fundamental ethical principles of the code, applied their detailed knowledge of modern research ethics to the practical challenges of experimental medical science in undemocratic societies. The infamous state-sponsored “doping” of world-class athletes, in which physicians were involved, was not necessarily typical of medical and ethical practices.

It is essential, however, to distinguish the inherent structural problems of nondemocratic and authoritarian government systems from legitimate government regulation for the protection of patient-participants. The two are not the same. For example, in the late 1940s and early 1950s, critics of the newly founded postwar British National Health Service (NHS), which does support clinical research, alleged that greater government direction of medicine and medical science would inevitably lead to a Nazi or Soviet style of government. Organizations such as the Society for Freedom in Science likewise denounced as totalitarian almost any government program that promoted a greater degree of central or state planning of clinical care and medical research.2 These tensions regarding professional vs governmental control over medical science are pervasive and longstanding.

In a symbolic sense, the Nuremberg Code is part of the infrastructure of the democratic international system that emerged after World War II, with its focus on respect for human rights, individual autonomy, and informed consent. But even that symbolic role is intangible at best. In the field of human research ethics, the code was eclipsed by the World Medical Association Declaration of Helsinki in 1964. While the code may have created greater awareness of the importance of human rights in medical science among wide sections of the medical profession, its specific role in international human rights law is modest compared with the 1948 Universal Declaration of Human Rights, created by the United Nations General Assembly in light of 2 world wars. In 1987, 40 years after the code was written, the US Supreme Court ruled that it did not apply in the case of a retired Army sergeant who claimed to have been injured in an LSD experiment. Likewise, in 2004, an inquest into the death of a British serviceman from a Cold War sarin nerve agent experiment ruled that the code—despite its importance as a reference point in medical ethics—did not apply in UK domestic law. Indeed, the code has not been adopted by any government, with the partial exception of the US Department of Defense in 1953 regarding defensive experiments concerning atomic, biological, and chemical agents.1

The story of the Nuremberg Code is not one of ethical norms taking on the force of law. Rather, its legacy shows the fundamental importance of a robust organized medical profession that protects its independence from political interests and its ability to chart its own moral course, yet is at the same time open to the essential role of nations and government agencies that respect broadly defined and agreed-upon rules to protect the rights and well-being of human research participants. Renewed allegations in 2017 that psychologists working with the Central Intelligence Agency on detainee interrogations engaged in unethical human experimentation demonstrate that these matters are not merely of historical interest.6 It is therefore worth asking, 70 years after the judgment at Nuremberg and in the face of increasing authoritarianism worldwide, whether contemporary physicians can claim membership in a profession that reflects the spirit of the code.

Corresponding Author: Jonathan D. Moreno, PhD, Department of Medical Ethics and Health Policy, University of Pennsylvania Perelman School of Medicine, 423 Guardian Dr, BH1415, Philadelphia, PA 19104.

Conflict of Interest Disclosures: All authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest. Drs Schmidt and Moreno report grants from Wellcome Trust during the conduct of the study. No other disclosures were reported.

Funding/Support: Drs Schmidt and Moreno acknowledge the support of a Wellcome Trust Seed Award 2016, “Cold War Bioethics: Human Research Ethics in Central Eastern Europe, 1945-present” (WT: 201441/Z/16/Z).

Role of the Funder/Sponsor: Wellcome Trust had no role in the preparation, review, or approval of the manuscript or in the decision to submit the manuscript for publication.

References
1.  Moreno  JD.  Undue Risk: Secret State Experiments on Humans. New York, NY: Routledge; 2001.
2.  Schmidt  U.  Justice at Nuremberg: Leo Alexander and the Nazi Doctors’ Trial. New York, NY: Palgrave Macmillan; 2004.
3.  Schmidt  U.  Secret Science: A Century of Poison Warfare and Human Experiments. New York, NY: Oxford University Press; 2015.
4.  Heiman  T. Etyka lekarska i obowiązki lekarza: (deontologia). Warszawa, Poland: skł. gł. Gebethner i Wolff; 1917.
5.  Polish Medical Association.  Deontologiczno-etyczne zasady.  Służba Zdrowia. 1967;26:3.
6.  Dougherty  S, Allen  SA. Nuremberg Betrayed: Human Experimentation and the CIA Torture Program. June 2017. http://physiciansforhumanrights.org/assets/multimedia/phr_humanexperimentation_report.pdf. Accessed June 22, 2017.

The Behavior of Ethicists (ch 15 of A Companion to Experimental Philosophy)

Chapter 15. The Behavior of Ethicists. Eric Schwitzgebel and Joshua Rust. In A Companion to Experimental Philosophy, edited by Justin Sytsma and Wesley Buckwalter. DOI: 10.1002/9781118661666.ch15

Summary: We review and present a new meta-analysis of research suggesting that ethicists in the United States appear to behave no morally better overall than do non-ethicist professors. Measures include: returning library books, peer evaluation of overall moral behavior, voting participation, courteous and discourteous behavior at conferences, replying to student emails, paying conference registration fees and disciplinary society dues, staying in touch with one's mother, charitable giving, organ and blood donation, vegetarianism, and honesty in responding to survey questions. One multi-measure study found ethicists tending to embrace more stringent moral views, especially about meat eating and charitable donation. The same multi-measure study found ethicists and other professors to show similarly small-to-medium correlations between their expressed normative attitudes and their self-reported or directly measured behavior.

Keywords: ethics; attitude-behavior correlation; metaphilosophy; experimental philosophy; applied ethics