Belief perseverance is the tendency to cling to our beliefs all the more firmly in the face of evidence to the contrary. This tendency lies at the heart of many social conflicts as well as debates about controversial subjects. Charles Lord and his colleagues studied this phenomenon in 1979 with two groups holding opposing views on capital punishment. Each side examined two purportedly new research findings on the deterrent effect of capital punishment; the findings were drawn from the same evidence but presented so that they supported one side or the other. Each side found far more favor with the study that agreed with its beliefs and rigorously disputed the opposing one. Thus Lord showed that, given the same evidence, the two sides' disagreement actually increased.
The remedy for this is to "consider the opposite". When the researchers gave the studies to a similar group, they asked one sub-group to be "as unbiased as possible" and asked another to consider whether they would have made the same evaluations if the same findings had favored the opposite side. The second instruction produced far less biased participants, who assessed the information for what it was.
Even when presented with evidence that completely discredits their beliefs, people will often hold fast to their presuppositions. A 1980 study by Mark Lepper, Craig Anderson, and Lee Ross asked two groups of people to consider whether risk-takers or cautious people make better firefighters. One group was given accounts of a risk-taker who was a good firefighter and of a cautious person who was a poor one. They came to the conclusion that risk-takers are braver and therefore make better firefighters. When the researchers later told the participants not only that statistics show cautious people make better firefighters, but also that the stories had been fabricated for the purposes of the study, the participants still kept their beliefs and continued to explain why they thought risk-takers would make better firefighters. The evidence had been pulled out from under them, yet they still believed their own incorrect theories.
See also: Confirmation bias