
From The Staff

Having Faith in Science, Equivocally

Unequivocally deny scientific results? No, that’s saying there's certainty when there must be doubt.

November 7, 2016


Lake Harriet, Minneapolis, MN. Photo by the author.

“Politics aside, you can't unequivocally deny scientific results,” I argued. We were discussing anthropogenic climate change.

“Go take a long walk down a short pier,” came the reply.

When taking someone’s suggestion to go for a long walk down a short pier, it’s worth considering the location and time of year.

I chose Minnesota in wintertime. It turns out that a person can walk on water, and, as I’d been taught in elementary school, I tested my hypothesis three times before drawing that conclusion. There are caveats.

These caveats of scientific knowledge, however, are often minimized or even eliminated in an attempt to explain scientific results quickly and simply and, simultaneously, to create the conflict that allows many in our profession to sell their news: “Science is New: People Can Walk On Water” and “Science is Wrong: Summer Experiments Show People Attempting to Walk On Water All Wet.” Despite the hype, rarely is a scientific discovery nowadays so exciting as to prompt the discoverer to leap out of the bathtub and run naked through the streets exclaiming “Eureka,” as Archimedes is purported to have done more than two millennia ago.

In part, that’s because scientific discoveries today often involve long, arduous processes, expensive equipment, or huge teams of people working together. While the results can be exciting, stimulating, or even alarming—recent examples include evidence of the Higgs boson, the first quantum machine, gene editing via CRISPR, climate-change predictions—they’re far from “revolutionary” (at least according to my dictionary) because they don’t turn anyone’s thinking on its head. Still, understanding those discoveries—the processes that led to them and their caveats—requires work. To simplify that work, and for the sake of expediency, people in my profession often ask scientists to explain their discoveries as if they were talking to a child (as in the quote often attributed to Einstein, that “if you cannot explain it simply, then you don’t understand it well enough”). But explaining science simply and explaining science completely are different things. Someone who requires a simple explanation of a scientific concept rarely has the patience to hear a complete one. Explaining a scientific discovery completely can take both speaker and listener many years, which is perhaps why that process is so often undertaken across primary, secondary, undergraduate, and graduate school, is perhaps completed along the way, but very often requires even more time than that.


But I still have faith in science even though the resulting headlines and their repetition—“Science is New!” and “Science is Wrong!”—fuel doubt about the knowledge scientists generate. Indeed, scientists themselves may show that what is new today is wrong tomorrow.

Proponents of science have framed this characteristic as a positive, describing science as a “self-correcting process”—an old trope if ever there was one. But the reason I have faith in science, the reason I’ve chosen not to be a die-hard skeptic, has less to do with science’s self-correction than it does with what I understand to be human nature: I trust that we are social creatures who share our knowledge; I trust that we like to look good before our peers; I trust that sharing knowledge that shows others’ knowledge to be wrong—especially showing incorrect knowledge of those we laud, such as Darwin or Einstein—can make us look good before our peers.

These characteristics of human nature are not particularly flattering, but they are very well documented. At least one religion labels a potential motivation for demonstrating someone else’s knowledge to be wrong—envy—and a potential feeling that follows from doing so—pride—each as a “deadly sin.” And so I trust that our human nature will include such characteristics for a very long time to come, continuing to motivate scientific achievement (or “self-correction,” if you wish). Those aren’t the only motivations for doing science, of course. But the more reputable the idea or the scientist, the greater the envy; the better the takedown, the greater the resulting pride.

It seems to me, though, that science’s self-correction nowadays almost always comes about by adding new caveats rather than by overturning well-established theories. In part, that is because science’s most powerful form of reasoning, formal induction, is a relatively new form of reasoning for our species. In induction, the only evidence that matters is evidence that was—and can be—repeatedly demonstrated. So new results must make sense of why past results could be repeatedly demonstrated. We don’t live our lives this way.

If I lived my life reasoning exclusively by formal induction, I should be quite willing to repeat my walking-on-water experiment in summertime. Having equipped myself with a bathing suit, I can report that one caveat to my previous scientific conclusion—that we humans can walk on water—is that it must be wintertime to do so. A better caveat to my walking-on-water conclusion is that it must be wintertime in Minnesota. A still better caveat is that the water must be frozen. An even better caveat is that the water must be frozen and be a few decimeters thick. These better caveats I reached through gedankenexperiment—thought experiment—as living one's life reasoning entirely by induction would mean an early death. Well, at least, that's what I think because I'm unwilling to live my life by induction in order to demonstrate that an early death is the result. My death would be only one data point, anyway, and I would then be entirely unable to repeat the experiment.

Even with results from thought experiments, there are almost always caveats to our knowledge, at least from a philosophical standpoint. In the Western tradition, Socrates, in the 5th century BCE, was reported to believe that the only knowledge he actually had was that he knew nothing. René Descartes’s even more famous conclusion, “I think, therefore I am,” in the 17th century CE posed the idea that the only knowledge anyone can be absolutely certain about is that there is doubt and, therefore, that there is a thing that doubts. (Descartes then went too far, because there is no reason to conclude “I think” from having reasoned that there is a thing that doubts.) Outside the Western tradition, there are similar philosophical doubts, including in Taoism, Jainism, and Islamic thought. And I believe these philosophical treatises underpin today’s thinking that being skeptical—having doubts—is a Good, with a capital G.

But I still have faith in science, even though I believe being skeptical of scientific results is also a Good. “Real Patriots Ask Questions” is how Carl Sagan and Ann Druyan put it in the title of the final chapter of their 1995 book (Sagan’s penultimate book, and the last he saw published), The Demon-Haunted World: Science as a Candle in the Dark, reminding us of science’s many imperfections and highlighting U.S. Supreme Court Justice Robert Jackson’s statement in 1950 that “it is not the function of our government to keep the citizen from falling into error; it is the function of the citizen to keep the government from falling into error.” To keep science from falling into error, scientific experiments are (or should be) repeated or their results reinterpreted, and new experiments are built (or should be built) upon old ones. As a result, new scientific results sometimes challenge old scientific results. Sometimes new scientific knowledge replaces old scientific knowledge. The word scientific, after all, etymologically means “producing knowledge.” There is no guarantee in science that knowledge newly produced will not conflict with knowledge previously produced. To borrow a phrase from U.S. Supreme Court Justice Oliver Wendell Holmes, Jr., “a page of [scientific] history is worth a volume of logic.”

Cartoon by the author.

So I have faith in science because of that history and because of the inescapable humanity of those who have undertaken what is often a lifelong process of trying to completely understand what is known about a scientific field, or even just a scientific theory, and so are in the best position to generate new scientific knowledge. They will share their knowledge with others. They will want to look good in front of their peers. And so they too will try to overturn the results of others or add caveats to the knowledge generated by others—whether out of envy, curiosity, friendship, or something else—or even overturn or add caveats to the knowledge they themselves have generated. They too will sometimes feel pride in their achievements. When they do, and even when they don’t, people like me will ask them to tell us stories about their research, ask them for explanations of what they did, whether it worked, or how it failed. The stories people like me then tell—even if they are long—are comparatively simple and far from complete. In other words, there are caveats to the stories we tell, and so there are caveats to the stories we read, hear, and view. There is no way around it, because there are caveats to the scientific knowledge itself, too.

For the remainder of my lifetime, then, I believe there will continue to be caveats to almost all of our knowledge and there will continue to be these characteristics of human nature. So I have faith in science even though its most powerful reasoning process, by induction, is a relatively new way of thinking for our species, even though we don't live our lives by thinking this way, even though thinking this way can and does lead to error, including what could be a cold, icy death out in the middle of a Minnesota lake in wintertime. But science is not the only way to think about the world, not the only way to reason, not the only way to come to a conclusion.

For the realms in which science reliably produces and reproduces knowledge, though, even if those realms are politically charged—such as climate change—choosing to ignore scientific knowledge is choosing to ignore the nature of the people who are striving to check and possibly overturn that knowledge. Choosing to ignore that scientific knowledge is choosing to ignore the nature of knowledge itself as conjectural. So to unequivocally call climate change “bullshit” or, worse, a “hoax” is to reject not only robust and worthy science, done and checked and constantly challenged by people who have dedicated many years and their intellectual lives to understanding our planet’s climate, but also the one thing that Socrates, Descartes, and philosophical thinkers the world over have reasoned for millennia: The only thing we can be sure of is that there is doubt.

I doubt anyone’s faith is perfect; doubt creeps in. Decisions must be made, though, and so although unequivocal statements may play well in politics, they are worthy of reproach when made about science. Doubt scientific results? Sure, even if you demonstrate you know almost nothing about the subject, say, by confusing “climate” with “weather.” But unequivocally deny scientific results? No, that’s saying there's certainty when there must be doubt.


The views and opinions expressed in this post are the author’s own and do not necessarily represent the views of American Scientist or its publisher, Sigma Xi.

