Two articles in the March-April issue of SKEPTICAL INQUIRER deal with critical, scientifically oriented mental habits that are relevant to thoughtful world-building. I subscribe to this magazine, which tackles pseudoscientific beliefs and theories of all types, mainly because exploration of topics such as UFOs, Bigfoot, poltergeists, and many other subjects in the fields of the paranormal and cryptozoology can yield story ideas (and also keep fictional characters who encounter such phenomena from seeming too gullible, if they're aware of the major arguments against, say, telepathy or channeling spirits). Some articles do take a blatantly anti-religious stance, but not enough to put me off the magazine as a whole. "Skepticism" doesn't mean "cynicism" or stubbornly doubting everything. As used in this publication, it means keeping an open mind, asking questions, and being ready to change one's beliefs as evidence demands.
The parent organization that publishes SKEPTICAL INQUIRER is the Center for Inquiry.
"Why We Believe—Long After We Shouldn't," by Carol Tavris and Elliot Aronson, analyzes the well-known phenomenon of cognitive dissonance. Once we've made up our minds on a topic, further information that contradicts or invalidates our belief or position makes us uncomfortable. The idea that we've made a mistake in holding a certain belief threatens to undermine our self-concept as intelligent, informed, morally upright people. We tend to pay more attention and give more credence to data that support our position (confirmation bias). Social media exacerbate this problem. As everyone knows, Facebook (for instance) makes it easy to control our feed so that we end up in a bubble where we encounter only information that agrees with the beliefs we already embrace. Confronting evidence that we made a mistake in choosing the last car we bought (one of the authors' examples) and consoling ourselves by seeking out facts that reinforce our original high opinion of the vehicle is one thing. Letting confirmation bias rule us in matters such as politics or religion is more serious. This article uses the metaphor of a pyramid to illustrate how confirmation bias can drive people on opposite sides of an issue further apart. Imagine two people starting near the top of the pyramid, pretty close together. Often, at this point, "we are faced not with a clear go-or-no-go decision, but instead with ambiguous choices whose consequences are unknown or unknowable." Forced to make a decision, often an "impulsive" one, "we justify it to reduce the ambiguity of that choice." The more actions we take to justify our commitment to that initial choice, the nearer to the bottom of the pyramid we move, so that the two people who started close together at the top end up getting further and further apart. The authors acknowledge that "it's good to hold an informed opinion and not change it" every time a possible objection comes along.
At the same time, though, it's "essential to be able to let go of that opinion when the weight of the evidence dictates." I'm reminded of C. S. Lewis's discussion of faith, which, he explains, doesn't mean blindly believing apparently impossible things. It means that once we've reached a certain belief (in his example, in God) for what we consider good reasons, we should stick to that belief unless we encounter solid evidence to disprove it, not let every adverse life event or shift in our emotions override our rational commitment.
"The Virtuous Skeptic," by Massimo Pigliucci, outlines the ethical principles a person intelligently seeking truth should embrace. Humility—knowing one's limitations and recognizing what kinds of expertise are needed to produce an informed opinion on any particular question—heads the list. The author lays out a table of "epistemic virtues"—curiosity, honesty, objectivity, parsimony (Occam's Razor), etc.—and the opposite "epistemic vices"—closed-mindedness, dogmatism, gullibility, self-deception, etc. The article ends with a list of questions we should ask ourselves, which apply well to any argument, scientific or not (slightly paraphrased and shortened): Did I carefully consider my opponent's arguments instead of dismissing them? Did I interpret my opponent's statements in the most charitable way possible (very important in politics!)? Did I entertain the possibility that I could be wrong? Am I an expert in this area, and, if not, have I consulted experts? Did I check the reliability of sources? Finally, "do I actually know what I'm talking about, or am I simply repeating somebody else's opinion?"
Critical thinking is hard work!
Margaret L. Carter
Carter's Crypt