A body of studies promises to explain a lot about how people respond to political misinformation. In politics, people are willing to believe misinformation that reinforces their existing beliefs. And correcting misinformation sometimes seems to entrench it further. Examples exist from both major U.S. parties, so let’s pick on the Democrats for a change:
[Yale political scientist John] Bullock found a similar effect when it came to misinformation about abuses at the U.S. detention facility at Guantanamo Bay, Cuba. Volunteers were shown a Newsweek report that suggested a Koran had been flushed down a toilet, followed by a retraction by the magazine. Where 56 percent of Democrats had disapproved of detainee treatment before they were misinformed about the Koran incident, 78 percent disapproved afterward. Upon hearing the refutation, Democratic disapproval dropped back only to 68 percent — showing that misinformation continued to affect the attitudes of Democrats even after they knew the information was false.
I’m not sure that logically follows. It would follow only if the Newsweek article reported the false Koran-flushing incident and no other detainee-mistreatment allegations (which seems rather unlikely), or if the volunteers were shown only a paragraph or so of the article (and in that case they could infer there were more allegations, true or false, anyway). If the article did mention other abuses, the lingering disapproval could reflect those reports rather than continued belief in the retracted one.
Still, without seeing the original study and its materials, it’s hard to say. And I don’t find the conclusion hard to agree with – I’m just not convinced by the methodology as described here. 🙂
I haven’t seen the original studies (there was more than one, by different research teams); I’ve only seen the news story. Healthy skepticism is certainly justified (as usual).
I can imagine it playing out in somebody’s mind like this: “ZOMG, so the Koran incident didn’t happen… but it could happen. What’s to stop it? Maybe the critics of Gitmo are right after all.”
(Yeah, “ZOMG” plays out in my mind. I need a life.)