From the January-February 2020 issue, Volume 108, Number 1, page 55. DOI: 10.1511/2020.108.1.55

THE MISINFORMATION AGE: How False Beliefs Spread. Cailin O’Connor and James Owen Weatherall. 266 pp. Yale University Press, 2019. $26.

Scientists face a dilemma. Our jobs involve discovery, explaining what we discover, and informing others about what we learn. But discovery and explanation require funding, and informing others requires communication. Both funding and communication can have unintended consequences that don’t always serve the pursuit of good science. This problem looms large when science makes contact with powerful or monied interests, but vigilance is always required if the spread of misinformation is to be avoided.


In The Misinformation Age: How False Beliefs Spread, philosophers of science Cailin O’Connor and James Owen Weatherall discuss how the dissemination of facts and theories can get hijacked in science and other areas. Historically, misinformation has distorted discourse about foreign lands, war, and social policy, and it continues to do so today. Irrational beliefs can arise not only because people’s ability to reason is limited but also because of how information flows among perfectly rational agents. People form communities, and those groupings can act as echo chambers, amplifying bad ideas and untrustworthy data. The problem is made worse by the fact that we often care more about conforming to our group’s norms and narratives than about discovering truth. What the book does best is to reveal yet another contributor to misinformation: interference on the part of interest groups, with the intention of shaping understanding in pernicious ways.

Some of those ways are obvious. Interest groups can fund a biased research agenda. For instance, tobacco companies, with the aim of diverting attention from smoking, have funded research that emphasizes other causes of lung cancer, such as asbestos. The book explains how industry funding of research can corrupt the scientific process, how industry-funded propaganda can selectively report self-serving findings, and how political interference can delay action on problems such as acid rain.

The authors also discuss less obvious techniques for manipulating the flow of information. One approach is to merely raise the specter of uncertainty. The tobacco industry for years broadcast ads emphasizing that the evidence that smoking causes cancer was not 100 percent conclusive, just as climate change deniers today are proclaiming that human-caused climate change is only a theory. These “merchants of doubt,” as Naomi Oreskes and Erik Conway have dubbed them, are right in that all data include some margin of error and theories are, well, theories—not indisputable pronouncements. But if we acted only when we were 100 percent certain, we would never drink a glass of water or cross the street. When it comes to the use of evidence to inform difficult decisions, certainty is a pipe dream.


The authors of The Misinformation Age bypass work on the fallibility of human reasoning; their strategy for revealing how misinformation spreads is to focus instead on social networks, on how evidence gets transmitted among participants who communicate with one another. They rely on a model of information transmission developed by economists Venkatesh Bala and Sanjeev Goyal. The model assumes that individuals make choices not only by seeing the outcomes of their own past actions but also by hearing about the outcomes of actions taken by others who communicate with them. Thus, decision-making depends largely on whom one is connected to via a communication channel, for that determines much of the information to which any individual has access. Individuals are also assumed to be “rational” in the sense that they use the information in reasonable ways. If enough people connected to them report that burning fossil fuels warms the Earth, and few report otherwise, they will conclude that climate change is anthropogenic. The Bala-Goyal model is a reasonable first approximation of social decision-making. However, it fails to take into account that groups shape belief even in the absence of information transfer. For instance, the model doesn’t capture people’s bias toward conformity with the beliefs of their community. Everyone holds knowledge taken on faith (the teachings of their religious leader or teacher, for instance) without knowing many of the details, while deeming other sources (such as a foreign religious leader) untrustworthy.
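To make the setup concrete, here is a minimal sketch, in Python, of the kind of network learning model this framework describes: agents repeatedly choose between an established action and a possibly better new one, share their results with network neighbors, and update their credences by Bayes's rule. The ring-shaped network, the particular success rates and trial counts, and the helper function bayes_update are illustrative assumptions made for this example, not details taken from the book.

```python
# A minimal sketch of Bala-Goyal-style social learning (illustrative
# parameters and network layout; not the book's own implementation).
import random

N_AGENTS = 10            # agents arranged in a ring (an assumption)
N_TRIALS = 20            # experiments each acting agent runs per round
P_OLD, P_NEW = 0.5, 0.6  # true success rates of the old and new actions
N_ROUNDS = 200

# Each agent's credence that the new action is the better one.
credence = [random.random() for _ in range(N_AGENTS)]
# Ring network: each agent hears from itself and its two neighbors.
neighbours = [{(i - 1) % N_AGENTS, i, (i + 1) % N_AGENTS}
              for i in range(N_AGENTS)]

def bayes_update(c, successes, trials):
    """Bayes's rule for the two-point hypothesis 'the new action succeeds
    at rate P_NEW' versus 'it succeeds only at rate P_OLD'."""
    like_new = P_NEW ** successes * (1 - P_NEW) ** (trials - successes)
    like_old = P_OLD ** successes * (1 - P_OLD) ** (trials - successes)
    return c * like_new / (c * like_new + (1 - c) * like_old)

for _ in range(N_ROUNDS):
    # Agents who currently favor the new action try it out.
    results = {}
    for i, c in enumerate(credence):
        if c > 0.5:
            successes = sum(random.random() < P_NEW for _ in range(N_TRIALS))
            results[i] = (successes, N_TRIALS)
    # Every agent updates on all the evidence produced in its neighborhood.
    for i in range(N_AGENTS):
        for j in neighbours[i]:
            if j in results:
                credence[i] = bayes_update(credence[i], *results[j])

print([round(c, 2) for c in credence])
```

In runs like this the community usually converges on the better action, but if a misleading early streak pushes every agent's credence below one-half at once, experimentation stops and the false belief locks in; exposing outcomes of that kind is exactly what the network perspective is for.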

One intriguing problem to which the authors devote most of a chapter is a difficulty that plagues political life today: polarization, the fact that different communities hold diametrically opposed, deeply entrenched beliefs. It turns out that polarization can arise within a social network even when all agents are perfectly rational and update their beliefs in optimal ways. This outcome can occur via information bubbles: Each cluster transmits information mostly within itself, with only weak channels to other clusters, so that different clusters settle on different conclusions. In other words, polarization does not require bias in the assimilation of evidence; it is enough for different findings to reach different subnetworks for their members to arrive at different beliefs. The Misinformation Age shows that these ideas can explain why it took so long for medical science to accept that stomach ulcers are caused by bacteria: There was too much competition from the idea that stomach acid was the culprit. They also explain why a debate still rages about how to treat Lyme disease: Competing views about the value of antibiotics have led to polarized camps.
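The polarized outcome can be seen with a small variation on the sketch above: keep the agents and the update rule, but replace the ring with two tightly knit clusters joined by a single weak channel. The cluster sizes, the one-way direction of the channel, and its placement are again arbitrary choices made for illustration.

```python
# Variation on the sketch above (same illustrative assumptions): two
# tight clusters of five agents joined by a single weak, one-way channel,
# the "information bubble" layout described in the review.
N_AGENTS = 10
half = N_AGENTS // 2
cluster_a = set(range(half))            # agents 0-4 hear mostly one another
cluster_b = set(range(half, N_AGENTS))  # agents 5-9 hear mostly one another

# Each agent listens to everyone in its own cluster (itself included)...
neighbours = [cluster_a if i in cluster_a else cluster_b
              for i in range(N_AGENTS)]
# ...plus one weak channel: agent 0 also hears agent 5's results.
neighbours[0] = neighbours[0] | {half}
```

Run with the same update loop, this layout can leave the two clusters holding opposite beliefs: if an unlucky early streak pushes everyone in the second cluster below a credence of one-half, its members stop experimenting, and because the lone channel carries results only from agent 5 to agent 0, no contrary evidence ever reaches them, even though each of them updated rationally on everything they saw.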

Social network analyses support O’Connor and Weatherall’s suggestions for social interventions to limit the spread of misinformation. One proposal they offer would limit the development of misinformed clusters. Many news organizations today operate on a principle of fairness: On controversial issues, both sides deserve representation. But this idea can be taken too far. When the best evidence strongly supports one side, guaranteeing the other side a hearing gives the false impression that the evidence is balanced. Fairness does not require that we hear from a climate-change denier every time we hear new evidence that climate change is anthropogenic, because the preponderance of the evidence does not demand it. Another suggestion the authors make is to puncture information bubbles by encouraging trusted politicians to say surprising things to their own constituencies. It was likely the very hawkishness of U.S. President Richard Nixon that allowed him to reopen relations with China. His conservative credentials gave extra credence to his support of cooperation with a heretofore sworn enemy. A word from a trusted source is better than a thousand words from someone not trusted. This conclusion seems right but perhaps naïve, given the reluctance of so many politicians today to express support for an idea that is anathema to their donors or other critical supporters.

This insight raises a bigger issue: How much does evidence even matter to people? No amount of analysis of social networks will get this one right, because it’s a question about how humans think. The data from psychology and other fields show that people often care more about narrative than about truth. We all have stories that we use to organize our knowledge. Our understanding of gravity comes with a tall tale about Galileo dropping spheres from the Leaning Tower of Pisa, and our understanding of political events tends to be instantiated as a story about some heroic act by white hats faced with maltreatment by selfish black hats. Until the truth forces itself upon people—until they or someone close to them gets cancer from smoking, experiences flooding caused by climate change, gets mercury poisoning from eating fish, or the like—there’s a strong tendency for people to do whatever is necessary to maintain their narrative. O’Connor and Weatherall admit as much when they discuss how major cable and online news outlets use novelty to select the news that’s worth reporting. Whether an event is perceived as novel depends on the story that one is telling. A story about a bomb going off in Baghdad is novel only if one is living with the narrative that violence in Iraq is receding.

The book addresses the value of evidence and other topics related to fake news and its history in an engaging and sophisticated way. But scientists need to keep in mind when interacting with the public (and with other scientists) that communication does more than share information and misinformation. It relates a narrative that affects not only people’s understanding, but also their well-being, and it can affect which community they affiliate with. People frequently care more about how their beliefs fit in with those of their tribe than about whether those beliefs are true or false. Try telling a fan that his or her team has no chance. The reaction will be much like that of a scientist told that his or her theory is bunk.

Philosophers of science, including the authors of this book, are coming to appreciate the importance of social groupings in forming beliefs and attitudes. A field called social epistemology is on the rise; it attempts to understand how knowledge can be based almost entirely on the testimony of others. What is clear—psychologically—is that beliefs can be based on what others believe, whether or not their beliefs have merit.

The Misinformation Age captures the threat posed by misinformation in the current political moment. It ends by describing some critical implications of the ease of spreading misinformation, made even easier by modern technology, and offers some ideas about how to deal with the ensuing problems. First, we have to give up on the idea that we live in a free “marketplace of ideas” that allows only the best to prevail. Information has to be regulated to make sure that it conforms to the facts. Second, human beings are too vulnerable to manipulation by misinformation to be able to sustain a democracy. Democracy may be a moral imperative, but we need institutions that allow us to make decisions based on evidence rather than ignorance. These implications may be unappetizing, but they are insights that must be digested if we want a more informed society that makes wiser decisions.
