Confirmation bias
Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values.[1] It is an important type of cognitive bias that has a significant effect on the proper functioning of society by distorting evidence-based decision-making. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. For example, a person may cherry-pick information that supports their belief, ignoring what is not supportive. People also tend to interpret ambiguous evidence as supporting their existing position. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs.
Confirmation bias is a broad construct covering a number of explanations. Biased search, interpretation and memory have been invoked to explain attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence), belief perseverance (when beliefs persist after the evidence for them is shown to be false), the irrational primacy effect (a greater reliance on information encountered early in a series) and illusory correlation (when people falsely perceive an association between two events or situations).
A series of psychological experiments in the 1960s suggested that people are biased toward confirming their existing beliefs. Later work re-interpreted these results as a tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives ("myside bias", an alternative name for confirmation bias). In certain situations, this tendency can bias people's conclusions. Explanations for the observed biases include wishful thinking and the limited human capacity to process information. Another explanation is that people show confirmation bias because they are weighing up the costs of being wrong, rather than investigating in a neutral, scientific way. However, even scientists and intelligent people can be prone to confirmation bias.
Confirmation biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. Poor decisions due to these biases have been found in political, organizational, financial and scientific contexts. For example, confirmation bias produces systematic errors in scientific research based on inductive reasoning (the gradual accumulation of supportive evidence). Similarly, a police detective may identify a suspect early in an investigation, but then may seek only confirming rather than disconfirming evidence.