Confirmation Bias - Island of Sanity

Confirmation Bias

Lists of logical fallacies or muddled thinking often include "confirmation bias". Confirmation bias is the tendency to accept uncritically claims that agree with what you already believe, but to question or dismiss claims that contradict what you already believe.

For example, I recently saw stories in the news claiming that several politicians were engaged in questionable business dealings with foreign governments. And I noticed a fascinating thing about how reporters presented these claims. If the reporter opposed the politician in question, he would instantly accept the charges as true and the actions as inappropriate, possibly criminal and/or treasonous. But if the reporter supported the politician, he would question whether the charges were true or whether this was a smear campaign, or he would say that the business dealings were completely legitimate and that political opponents were trying to create a scandal out of totally innocent behavior.

People often engage in this kind of thinking whether we're talking about politics, religion, science, diet plans, or whatever.

But that said ... confirmation bias isn't entirely irrational. People do it because it is often the sensible approach to contradictory information. If some fact or principle is well-established, if you have seen evidence that it is true over and over again over a long period of time, the fact that someone comes along and claims he has "proof" that it is false is not likely to make you abandon it. And it shouldn't.

To take a deliberately silly example: Suppose you saw a news headline that said, "Scientist proves that theory of gravity is false". Well, I suppose my first thought would be that it is a teaser, that the story isn't claiming that rocks float in mid-air when you drop them but rather that scientists have found some special case that happens only in distant galaxies where gravity does not work as present theories describe, or that the formulas scientists use to calculate the force of gravity are wrong by one one-millionth of 1%, or some such technicality. But suppose the body of the article really did say that a scientist is claiming that rocks don't fall when you drop them, but in fact hover in mid-air. Would you say, "Zounds! Science has proven that? I guess I was wrong to believe in gravity all these years!" Or would you say, "That's nonsense. I've seen rocks fall when you drop them a million times. I don't know who this 'scientist' is or what experiment he performed, but either the scientist or the reporter is nuts".

Okay, the gravity example is extreme. But my point is, you have probably come to the conclusion that gravity is real over the course of your lifetime. You have seen it act many times. The evidence for it is overwhelming. It would take a very strong argument to convince you that a lifetime's worth of experiences with gravity were all an illusion or a lie. You're not going to abandon belief in gravity because one person claims to have proven that it isn't real.

This could be called "confirmation bias", but it's rational thinking. If you changed your mind every time anyone claimed to have "proof" that something you believed was wrong, your mind would be a muddled mess. You'd believe the world is round one day and flat the next and then the next day back to round. It makes good sense to come to conclusions based on the weight of the evidence, and then insist on a comparable weight of evidence to change your mind. That's not being obstinate; that's being reasonable.

When someone presents new evidence, especially on a controversial subject, that evidence is often debatable. Just because I read in the news that "a new study proves X" doesn't mean that a new study really proved X. Maybe the study is biased. Maybe the people who did the study or the reporter who wrote the story are exaggerating or misrepresenting the results. Or even if the study is persuasive, if there are ten other studies that conclude the opposite, why should I take this study over the other ten? I've been in many conversations where someone expected everyone who disagreed with him to concede the argument because he could quote some famous person or respected organization who said that he was right. Often this is laughable because the authority he is citing has an obvious bias. Like, "You say that Ruritania is a dangerous place to visit? No, you're wrong. Look, here's a statement from the Ruritanian Tourist Board clearly stating that Ruritania is safe for tourists." (BTW, "Ruritania" is a fictional country -- I'm not trying to stir up any real arguments.) Even if the authority has no obvious bias, the fact that on a controversial question you can find someone who agrees with you is not impressive. I can name many who disagree with you. That's why we call a question "controversial": because there are people on both sides.

So okay, I don't know anyone claiming that gravity isn't real. But there are plenty of controversial issues in the world. There are plenty of questions where people disagree and there are intelligent and rational people on both sides. I have strong opinions on many controversial issues. In most cases, I've arrived at those opinions based on hundreds of facts that I've seen over many years and based on serious thought and logic. Often my opinions are based on the weight of evidence. I can think of 10 good reasons to believe X and only 2 good reasons to believe Y, so I conclude that the weight of evidence is on the side of X. If you bring up some argument why I should believe Y, okay. Even if I don't see any holes in it, if it's a good solid argument, so now the score is 10 to 3 instead of 10 to 2. I'm still going to believe X.

Sure, sometimes a new fact or argument will change my mind. Maybe before I thought the score was 6 to 5 and you bring up two arguments for your side that I find persuasive so now it's 6 to 7, and so I shift from "it's a tough one but I tend to think X" to "it's a tough one but I tend to think Y". But that just means that there are cases where confirmation bias does not apply. That doesn't prove that confirmation bias isn't real or isn't sometimes rational.

Let me give a concrete example. Let me make clear that my point here is not to argue this particular issue, but rather to give an example of how thoughtful people approach difficult questions. And let me say up front that I'll present the opposite side in a moment so if you disagree with me, please bear with me and I'll get to your side in a moment!

So here's my example: I'm a Christian and I believe the Bible. Over the years I've heard many claims of errors and contradictions in the Bible. I've looked into them and I've found that almost all of them evaporate when studied. You can add "in my opinion" to that if you like. They turn out to be taking things out of context, circular reasoning, etc. I'm not going to go into detail because, as I said, my point here is not to prove that I am right about this particular issue but to discuss how reasonable people approach difficult questions.

So every now and then an atheist will tell me about some supposed error or contradiction in the Bible. 95+% of the time I've either heard it before and already know a rebuttal, or I see a flaw in the argument. But okay, what about the remaining handful? You might say, If there's just one error in the Bible, that proves that it is not divinely inspired and infallible. True. But "I don't see a flaw in this argument" is not at all the same as "There is no flaw in this argument." In my experience, time after time when atheists claim to have found an error in the Bible, further investigation proves that the Bible was right and the atheist was wrong. So if you have some claim that I can't refute, does that prove that the Bible is wrong? Or just that I don't know enough history or science or whatever to see the flaw? If there are one or two cases that I cannot refute, I think it's more likely that further study or new discoveries will prove that the Bible is right, than that the hundreds of cases where the Bible did prove to be right will all turn out to be mistakes or dumb luck.

I said I would present the flip side. Suppose you are an atheist. Over the course of your life you have examined hundreds of arguments for atheism and found them very convincing. Then a Christian comes along and says, "How can you not believe in God? I prayed for my brother to be healed and he was! Prayer works! The existence of God is proven!" Would you instantly abandon your atheism? Probably not. More likely you would say, "It wasn't your prayers that healed your brother but the skill of the doctors and the effects of the medicine. Your brother would have recovered whether you had prayed or not." You could point to times when a Christian prayed for something and what he asked for didn't happen. You would be unlikely to abandon your atheism because of one piece of evidence against it.

In either case, is the person who refuses to change his beliefs being obstinate and refusing to look at the evidence? No. He's saying, Okay, you bring up an interesting piece of evidence. But even if I can't refute it, it's one fact against the hundreds that convince me of the other side. Interesting. Maybe it causes me to re-examine my beliefs or have a twinge of doubt. Or maybe not. Maybe it's so minor compared to all the evidence for the other side that it doesn't make me pause for a moment.

So all that said, what's the difference between "demanding serious evidence before I change my mind" and "confirmation bias"? What's the difference between rejecting claims because they contradict the weight of established evidence, and rejecting claims because you just don't want to believe them? And that's a problem, because there's no easy answer. If there were a simple, objective way to determine what evidence is valid and what is not, maybe we wouldn't have controversial questions.

© 2022 by Jay Johansen
