I’m frequently asked, “Are you an atheist or not?”
I rarely answer.
Why? Because most people have a simplistic and inaccurate system of labels, often used merely to sort people into stereotyped buckets. Most of us are binary thinkers: we assume the choices are “dark red or dark blue,” and that if you’re not dark red, you must be dark blue.
Hopefully this chart can help us appreciate the complexity of the situation a little more. These positions aren’t necessarily on a single bidirectional scale: some positions overlap, some people hold more than one, others hop around all the time (I’ve been known to do that), and significant things are still missing from this chart. But at least it’s a start.
The sad reality is that most of us would side with a bumper sticker over a philosophy book, simply because one is easier to understand.
“The more difficult it becomes to process a series of statements, the less credit you give them overall. During metacognition, the process of thinking about your own thinking, if you take a step back and notice that one way of looking at an argument is much easier than another, you will tend to prefer the easier way to process information and then leap to the conclusion that it is also more likely to be correct.
“In experiments where two facts were placed side by side, subjects tended to rate statements as more likely to be true when those statements were presented in simple, legible type than when printed in a weird font with a difficult-to-read color pattern. Similarly, a barrage of counterarguments taking up a full page seems to be less persuasive to a naysayer than a single, simple, powerful statement.”
Facts alone rarely win arguments, because we are more likely to dig in our heels when we hear facts that contradict our beliefs. Psychologists have observed this in numerous studies and call it the “Backfire Effect.”
David McRaney says, “Once something is added to your collection of beliefs, you protect it from harm. You do this instinctively and unconsciously when confronted with attitude-inconsistent information. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens those misconceptions instead.”
Researchers have likewise found that “under a lot of conditions, the mere existence of contradictory facts [to their beliefs] made people more sure of themselves — or made them claim to be more sure.”