*

Yosemite morning

Monday, October 4, 2021

Confirmation bias

People tend to listen only to information that confirms their beliefs, even when doing so spreads misinformation online.

...the team paired participants with below-average scores with other below-average performers, while those with above-average scores were paired with other above-average performers. Each participant believed they were paired with someone with above-average scores and so freely exchanged beliefs about what they wished to be true. Some had an optimistic view of their pairings, while others were pessimistic.

The team found that pessimistic participants who were paired with optimistic participants tended to become more optimistic. In contrast, optimistic participants were unlikely to adjust their outlook even when paired with pessimistic participants. Moreover, optimistic participants in below-average IQ pairs were unlikely to waver from their motivated beliefs.

That means those in the below-average group accepted only information that confirmed their beliefs or reinforced their biases. The findings suggest that motivated beliefs drive confirmation bias, with subjects selectively assigning higher informational value to sources that reinforce their beliefs.

Motivated reasoning - the story of Semmelweis. 

Motivated reasoning is, quite simply, when a person's emotions incentivize them to produce justifications for a position they desire to hold rather than the one that is actually supported by the evidence. It is closely related to confirmation bias, which is when people embrace ideas and "evidence" that confirm what they want to believe and give less weight to facts that do not. While a person practicing motivated reasoning could also be correct about a given issue, that will usually be a coincidence. When a person holds a view due to motivated reasoning, they are driven by their emotions, not by their reason, no matter how vehemently they insist otherwise.

Truthers and truth. 

*

I was reading a great article Saturday that I unfortunately failed to bookmark and can no longer find. But it discussed trutherism and cults, and the tendency for adherents and true believers to double down on misinformation when they are debunked or exposed (e.g., Lin Wood calling the Twin Towers bombing and collapse a hoax the other day).

But the thing I found most interesting in the article was a scenario the author mentioned. Let me use a hypothetical: a doomsday cult calls for the end of the world, say next Tuesday. If the world miraculously does not end, they will take credit for forestalling its collapse, being true believers. So the megalomaniacal wackos put themselves on a solipsistic belief platform that can never be challenged, only reinforced. A closed loop.

I found it. From the MIT Press Reader - How cognitive bias can explain post-truth.

What if we have taken a public stand on something, or even devoted our life to it, only to find out later that we’ve been duped? Festinger analyzed just this phenomenon in a book called “The Doomsday Cult,” in which he reported on the activities of a group called The Seekers, who believed that their leader, Dorothy Martin, could transcribe messages from space aliens who were coming to rescue them before the world ended on December 21, 1954. After selling all of their possessions, they waited on top of a mountain, only to find that the aliens never showed up (and of course the world never ended). The cognitive dissonance must have been tremendous. How did they resolve it? Dorothy Martin soon greeted them with a new message: Their faith and prayers had been so powerful that the aliens had decided to call off their plans. The Seekers had saved the world!

From the outside, it is easy to dismiss these as the beliefs of gullible fools, yet in further experimental work by Festinger and others it was shown that — to one degree or another — all of us suffer from cognitive dissonance.

There is another aspect of cognitive dissonance that should not be underestimated, which is that such “irrational” tendencies tend to be reinforced when we are surrounded by others who believe the same thing we do. If just one person had believed in the “doomsday cult” perhaps he or she would have committed suicide or gone into hiding. But when a mistaken belief is shared by others, sometimes even the most incredible errors can be rationalized. In his path-breaking 1955 paper “Opinions and Social Pressure,” Solomon Asch demonstrated that there is a social aspect to belief, such that we may discount even the evidence of our own senses if we think that our beliefs are not in harmony with those around us. In short, peer pressure works. Just as we seek to have harmony within our own beliefs, we also seek harmony with the beliefs of those around us.

The author, research fellow Lee McIntyre, ends his essay with a powerful message:

If we are already motivated to want to believe certain things, it doesn’t take much to tip us over to believing them, especially if others we care about already do so. Our inherent cognitive biases make us ripe for manipulation and exploitation by those who have an agenda to push, especially if they can discredit all other sources of information. Just as there is no escape from cognitive bias, a news silo is no defense against post-truth. For the danger is that at some level they are connected. We are all beholden to our sources of information. But we are especially vulnerable when they tell us exactly what we want to hear.

From the introduction to When Prophecy Fails:

“A man with a conviction is a hard man to change.” And when that conviction is as important as the promise of salvation coming from the sky, “it may even be less painful to tolerate the dissonance than to discard the belief and admit one had been wrong.”

*

How to Dispute Irrational Beliefs (Without Arguing) 
