
From the issue: July–August 2021, Volume 109, Number 4, Page 194
You have likely heard of the Stanford Prison Experiment, a famous 1971 study by Stanford University psychology professor Philip Zimbardo in which students were randomly assigned to play “prisoners” and “guards.” The “guards” became unnecessarily cruel, the “prisoners” broke down, and the study was widely cited as evidence of how readily people conform to their social roles. However, the study has long been panned as being both unethical and poorly designed: The participants were self-selected, the investigator took part in his own study, the guard behavior was coached, and the consent forms did not make it clear that participants had a right to leave at any time, among other issues.
Many researchers in psychological science call the experiment an embarrassment, yet it appeared unchallenged in introductory psychology textbooks as recently as 2015. Popular internet memes criticize generalizing the results to anyone who is not a college-aged white male, but paradoxically ignore that the study has already fallen from favor. All of this goes to show that once an idea takes hold, getting corrections to sink in at a similar level can be extremely difficult. Scientists face this type of challenge all the time. The research enterprise is designed to be subject to revision as new data become available, but old paradigms can be hard to dislodge. That is all the more reason to remain vigilant about scientific ethics, so as to reduce errors from the start.
Recently, the idea of an erosion of public trust in science has become its own soundbite, regularly repeated without qualifications or empirical support. In this special issue on “Trustworthy Science,” we delve deeper into a wide range of topics related to trust and the ethical underpinnings of scientific research, exploring why we should consider science as worthy of trust.
We start by establishing a baseline with data about public trust. Cary Funk of the Pew Research Center provides an overview of public attitudes toward science and scientists ("What the Public Really Thinks About Scientists"). This baseline has a counterpoint later in the issue, in “The Trust Fallacy” by Nicole Krause and her colleagues, who discuss how the nuances of trust in science raise ethical issues of their own, and how a lack of information is not the only barrier to gaining public trust. The issue spans many fields of science, touching on oversights in biomedical research, digital regulation, the role of humans in the natural world, and what our efforts to contact intelligent life elsewhere say about our own culture. In addition, our authors examine the ethics of the scientific enterprise more broadly, exploring how missing genome research hampers science, arguing that scientific objectivity does not prevent researchers from acting on the consequences of their work, engaging underserved communities in public health, putting people first in human-subjects research, looking at the toll of calling out unethical work, and reconsidering what inclusive academic environments should look like.
One overarching theme that emerges from this issue is that the research community as a whole would benefit from taking more time to step back and consider the implications and limitations of the data researchers collect. Another central message is that if we want to improve public trust in science, we need to acknowledge the history of how the scientific enterprise has interacted with different communities, including some that have been excluded or mistreated in the past. Trust can be achieved only when researchers listen to people’s concerns, find common ground, and demonstrate that they are willing to speak up for what’s right.
Scientists are rightly proud of their remarkable accomplishments, but they should be vigilant about their biases as well, and should see ethics as a partner in progress. As bioethicist Insoo Hyun states in this issue, “Good guidelines and ethical standards do not get in the way of science. They help pave the way.” —Fenella Saunders (@FenellaSaunders)