
In 2006, researchers Brendan Nyhan and Jason Reifler created fake newspaper articles about polarizing political issues. The articles were written in a way that would confirm a widespread misconception about certain ideas in American politics. As soon as a person finished reading a fake article, the experimenters handed over a true article that corrected the first. For instance, one article suggested that the United States had found weapons of mass destruction in Iraq. The next article corrected the first and said that the United States had never found them, which was the truth. Those opposed to the war or who had strong liberal leanings tended to disagree with the original article and accept the second. Those who supported the war and leaned more toward the conservative camp tended to agree with the first article and strongly disagree with the second.

These reactions shouldn't surprise you. What should give you pause, though, is how conservatives felt about the correction. After reading that there were no WMDs, they reported being even more certain than before that there actually were WMDs and that their original beliefs were correct.
The researchers repeated the experiment with other wedge issues, such as stem cell research and tax reform, and once again they found that corrections tended to increase the strength of the participants' misconceptions if those corrections contradicted their ideologies. People on opposing sides of the political spectrum read the same articles and then the same corrections, and when new evidence was interpreted as threatening to their beliefs, they doubled down. The corrections backfired.
Researchers Kelly Garrett and Brian Weeks expanded on this work in 2013. In their study, people already suspicious of electronic health records read factually incorrect articles about the technology that supported their existing beliefs. In those articles, the scientists had already identified any misinformation and placed it within brackets, highlighted it in red, and italicized the text. After they finished reading the articles, people who said beforehand that they opposed electronic health records reported no change in their opinions and felt even more strongly about the issue than before. The corrections had strengthened their biases instead of weakening them.
Once something is added to your collection of beliefs, you protect it from harm. You do this instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens those misconceptions instead. Over time, the backfire effect makes you less skeptical of those things that allow you to continue seeing your beliefs and attitudes as true and proper.

( David McRaney )
[ You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself ]