Sunday, April 14, 2019

Corrected misinformation was presented alongside equal presentations of affirmed factual statements; participants reduced their belief in the misinformation but did not reduce their feelings towards the politician

They Might Be a Liar But They’re My Liar: Source Evaluation and the Prevalence of Misinformation. Briony Swire‐Thompson et al. Political Psychology, April 13 2019. https://doi.org/10.1111/pops.12586

Abstract: Even if people acknowledge that misinformation is incorrect after a correction has been presented, their feelings towards the source of the misinformation can remain unchanged. The current study investigated whether participants reduce their support of Republican and Democratic politicians when the prevalence of misinformation disseminated by the politicians appears to be high in comparison to the prevalence of their factual statements. We presented U.S. participants either with (1) equal numbers of false and factual statements from political candidates or (2) disproportionately more false than factual statements. Participants received fact‐checks as to whether items were true or false, then rerated both their belief in the statements as well as their feelings towards the candidate. Results indicated that when corrected misinformation was presented alongside equal presentations of affirmed factual statements, participants reduced their belief in the misinformation but did not reduce their feelings towards the politician. However, if there was considerably more misinformation retracted than factual statements affirmed, feelings towards both Republican and Democratic figures were reduced—although the observed effect size was extremely small.

---
In the wake of the 2016 presidential election, 88% of Americans reported that fabricated news had caused confusion about basic facts regarding current events (Barthel, Mitchell, & Holcomb, 2016). From the Cambridge Analytica scandal to reports of Russian troll factories (Chappell, 2018; Stewart, Arif, & Starbird, 2018), it has been difficult to escape debate about how misinformation can affect political discourse and the problematic nature of a media environment where veracity cannot be guaranteed. Misinformation in the public sphere can cause long-term damage to democratic discourse, not least because reasoning is often influenced by misinformation even after people have been presented with a valid correction (Johnson & Seifert, 1994; Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012; Thorson, 2016). Even if people do update their belief after a correction, they might not similarly update their attitudes about the issue or their opinion of the person who is spreading the misinformation. The present study assesses whether people’s feelings towards political figures are affected when a large amount of invalid information is disseminated in comparison to the amount of valid information. In other words, if a politician tends to tell mostly lies, to what extent do their supporters lose faith in them?
Events that influence trust, such as political scandals, often affect people’s voting preferences (Funk, 1996). However, political reputation is also surprisingly resilient; more than half of U.S. incumbents who are implicated in scandals are subsequently reelected (Basinger, 2013). Moreover, not all scandals are created equal. For example, a financial scandal such as tax evasion is more likely to permanently diminish feelings towards a political candidate as compared to a moral scandal (such as an extramarital affair; Doherty, Dowling, & Miller, 2014). The continued spreading of inaccurate information, however, differs from a one-time scandal as it is an ongoing violation of the pervasive but tacit assumption that people are generally truth tellers (Grice, 1975). Although people assume that speakers by and large are truthful, they are sensitive to violations of that maxim (Okanda, Asada, Moriguchi, & Itakura, 2015). Regardless of whether a politician is actually lying with intent to deceive or simply making repetitive unintentional errors, it is unclear how forgivable continued falsehoods are in the eyes of voters.

In previous research, Swire, Berinsky, Lewandowsky, and Ecker (2017) found that feelings towards a politician who disseminated misinformation remained unchanged even when participants acknowledged that their favored politician’s statements were incorrect. Specifically, Swire et al. asked participants to rate their belief in eight statements that Donald Trump made on the campaign trail, four of which were accurate and four inaccurate. The statements were either attributed to Trump (e.g., “Donald Trump said that vaccines cause autism”) or presented without attribution (“Vaccines cause autism”). After inaccurate items were corrected and true items affirmed, participants rerated their belief in those items either immediately or after a week-long delay. Results indicated that even if Trump supporters reduced their belief in misinformation attributed to Trump, they did not change their voting preferences or their feelings towards him.

This null effect, however, could potentially have resulted from participants being presented with true and false statements in equal quantities. While it is unclear a priori why factual statements should “balance out” misinformation, it is possible that a 50/50 split of true and false statements is insufficient to sway supporters’ feelings, as it may not sufficiently violate people’s expectations of truthfulness. This also could be in accordance with the tallying heuristic where people count the number of arguments (for example, pros and cons) and disregard the relative importance of each argument (Bonnefon, Dubois, Fargier, & Leblois, 2008; Gigerenzer, 2004). Perhaps the prevalence of misinformation must be more extreme—that is, the perceived amount or ratio of misinformation in comparison to factual information might have to be greater for falsehoods to influence opinion of the source. Additionally, it may be that participants update their beliefs about the specific items that are corrected, while opinion regarding the general amount of misinformation spread by a politician may be more stable. In other words, participants may accept that particular claims are false, but they maintain the perception that the candidate is accurate day to day, enabling them to have stable feelings towards the candidate.

Nyhan, Porter, Reifler, and Wood (2019) conducted a similar study to Swire et al. (2017) and also found that participants who reduced their belief in corrected misinformation did not change their feelings towards Donald Trump. However, unlike Swire et al., Nyhan and colleagues only presented one false (and no true) statement, suggesting that the preservation of support was not due to an equal number of factually accurate statements alongside it. There is additionally evidence to suggest that this phenomenon could extend beyond political figures. With Israeli participants, Nyhan and Zeitzoff (2017) found that corrections successfully reduced individual misperceptions regarding the Israeli-Palestinian conflict, but this did not extend to participants’ feelings towards the outgroup nor their support for the peace process.

A separate question is whether (null) effects of misinformation on feelings and voting preferences are similar on both sides of the political spectrum, as both Swire et al. (2017) and Nyhan et al. (2019) exclusively used claims made by Donald Trump. Political symmetry is important to consider because there is still debate as to whether there are notable cognitive differences between people on opposing sides of the political spectrum. Some argue that these psychological differences are the reason that the rejection of well-established scientific propositions is mainly found on the political right (Jost, 2017; Lewandowsky & Oberauer, 2016). Experimental support for political asymmetry was provided by Ecker and Ang (2018), who investigated whether partisan political attitudes affected how people updated their beliefs after corrections were presented. Ecker and Ang found that retractions of attitude-dissonant misinformation were effective in participants on both sides of the political spectrum, whereas retractions of attitude-congruent misinformation were only effective in left-wing (but not right-wing) participants. In other words, left-wing participants were more willing than right-wing participants to reject erroneous information that had supported their worldview.

However, other researchers argue that identity-protective cognition occurs on both sides of the political spectrum (Kahan, 2013; Kahan, Peters, Dawson, & Slovic, 2017). For example, Washburn and Skitka (2017) tested the propensity of conservatives and liberals to misinterpret scientific claims that conflicted with preexisting beliefs. The authors found that both groups had motivated interpretations of scientific studies and were less likely to discover correct interpretations of the results when they conflicted with participants’ attitudes. Additionally, Claassen and Ensley (2016) found no difference between Republicans and Democrats when it came to their concern about politicians using dirty campaign tricks. There was little change in attitude towards a politician if the respondents were politically aligned with the politician, but participants were highly concerned if the politician came from the opposite side. It is therefore possible that cognitive processes guiding preferences towards politicians are similar regardless of partisanship. Whether correcting misinformation is more likely to reduce feelings towards Republican or Democratic political figures (or whether symmetry will be observed) is yet to be studied empirically.
