Tuesday, December 4, 2018

Why Smart People Are Vulnerable to Putting Tribe Before Truth

Why Smart People Are Vulnerable to Putting Tribe Before Truth. Dan M Kahan. Scientific American, Dec 03 2018. https://blogs.scientificamerican.com/observations/why-smart-people-are-vulnerable-to-putting-tribe-before-truth/

Excerpts (full text and links in the link above):

What intellectual capacities—or if one prefers, cognitive virtues—should the citizens of a modern democratic society possess? For decades, one dominant answer has been the knowledge and reasoning abilities associated with science literacy. Scientific evidence is indispensable for effective policymaking. And for a self-governing society to reap the benefits of policy-relevant science, its citizens must be able to recognize the best available evidence and its implications for collective action.

This account definitely isn’t wrong. But the emerging science of science communication, which uses scientific methods to understand how people come to know what’s known by science, suggests that it is incomplete.

Indeed, it’s dangerously incomplete. Unless accompanied by another science-reasoning trait, the capacities associated with science literacy can actually impede public recognition of the best available evidence and deepen pernicious forms of cultural polarization.

The supplemental trait needed to make science literacy supportive rather than corrosive of enlightened self-government is science curiosity.

Simply put, as ordinary members of the public acquire more scientific knowledge and become more adept at scientific reasoning, they don’t converge on the best evidence relating to controversial policy-relevant facts. Instead they become even more culturally polarized.

This is one of the most robust findings associated with the science of science communication. It is a relationship observed, for example, in public perceptions of myriad societal risk sources—not just climate change but also nuclear power, gun control and fracking, among others.

In addition, this same pattern—the greater the proficiency, the more acute the polarization—characterizes multiple forms of reasoning essential to science comprehension: polarization increases in tandem not only with science literacy but also with numeracy (an ability to reason well with quantitative information) and with actively open-minded thinking—a tendency to revise one’s beliefs in light of new evidence.

The same goes for cognitive reflection. The Cognitive Reflection Test (CRT) measures how much people rely on two forms of information processing: “fast,” preconscious, emotion-driven forms of reasoning, often called “System 1”; or a conscious, deliberate, analytical, “slow” form, designated “System 2.”

There’s no doubt that scientific reasoning demands a high degree of proficiency in System 2 information processing. But as ordinary members of the public become more adept at this style of reasoning, they don’t think more like scientists. Instead, they become more reliable indicators of what people who share their group commitments think about culturally contested risks and related facts.

This relationship is readily apparent in public opinion survey studies (Figure 1). It has also been documented experimentally. Experiments catch these thinking capacities “in the act”: proficient reasoners are revealed to be using their analytical skills to ferret out evidence that supports their group’s position, while rationalizing dismissal of such evidence when it undermines their side’s beliefs.

Figure 1. Increasing polarization associated with various reasoning capacities and issues. Credit: Dan M. Kahan

What explains this effect? As counterintuitive as it sounds, it is perfectly rational to use one’s reason this way in a science communication environment polluted by tribalism.

What an ordinary member of the public thinks about climate change, for example, has no impact on the climate. Nor does anything that she does as a consumer or a voter; her individual impact is too small to make a difference. Accordingly, when she is acting in one of these capacities, any mistake she makes about the best available scientific evidence will have zero impact on her or anyone she cares about.

But given what positions on climate change have now come to signify about one’s group allegiances, adopting the “wrong” position in interactions with her peers could rupture bonds on which she depends heavily for emotional and material well-being. Under these pathological conditions, she will predictably use her reasoning not to discern the truth but to form and persist in beliefs characteristic of her group, a tendency known as “identity-protective cognition.”

One doesn’t have to be a Nobel prizewinner to figure out which position one’s tribe espouses. But if someone does enjoy special proficiency in comprehending and interpreting empirical evidence, it is perfectly predictable that she’ll use that skill to forge even stronger links between what she believes and who she is, culturally speaking.

Now consider curiosity.

Conceptually, curiosity has properties directly opposed to those of identity-protective cognition. Whereas the latter evinces a hardened resistance to exploring evidence that could challenge one’s existing views, the former consists of a hunger for the unexpected, driven by the anticipated pleasure of surprise. In that state, the defensive sentries of existing opinion have necessarily been made to stand down. One could reasonably expect, then, that those disposed toward science curiosity would be more open-minded and as a result less polarized along cultural lines.

This is exactly what we see when we test this conjecture empirically. In general population surveys, diverse citizens who score high on the Science Curiosity Scale (SCS) are less divided than are their low-scoring peers.

Indeed, rather than becoming more polarized as their science literacy increases, those who score highest on SCS tend to converge on what the evidence signifies about climate change, private gun ownership, nuclear power and the other risk sources.

Experimental data suggest why. Afforded a choice, low-curiosity individuals opt for familiar evidence consistent with what they already believe; high-curiosity citizens, in contrast, prefer to explore novel findings, even if that information implies that their group’s position is wrong (Figure 2). Consuming a richer diet of information, high-curiosity citizens predictably form less one-sided and hence less polarized views.

Figure 2. Selection of position-threatening news story. N= 750, nationally representative sample. Dotted lines denote 0.95 confidence intervals. Credit: Dan M. Kahan

This empirical research paints a more complex picture of the cognitively virtuous democratic citizen. To be sure, she knows a good deal about scientific discoveries and methods. But of equal importance, she experiences wonder and awe—the emotional signatures of curiosity—at the insights that science affords into the hidden processes of nature.

The findings on science curiosity also have implications for the practice of science communication. Merely imparting information is unlikely to be effective—and could even backfire—in a society that has failed to inculcate curiosity in its citizens and that doesn’t engage curiosity when communicating policy-relevant science.

What, then, should educators, science journalists, and other science communication professionals do to enlist the benefits of science curiosity?

The near-term answer to this question is straightforward: join forces with empirical researchers to study science curiosity and the advancement of their craft.

The value of such collaborations was a major theme of the National Academy of Sciences’ recent expert-consensus report Communicating Science Effectively. Indeed, connected lab-field initiatives of the kind envisioned by the NAS Report are already in place. The Science Curiosity Scale is itself the product of a collaborative project between social science researchers affiliated with the Cultural Cognition Project at Yale Law School (CCP) and the Annenberg Public Policy Center at the University of Pennsylvania (APPC), on the one hand, and science film producers at Tangled Bank Studios, on the other.

Results from that initiative, in turn, inform a collaboration between APPC social scientists and science communicators at the public television station KQED. Funded by the National Science Foundation and the Templeton Foundation, that partnership is performing field studies aimed at making science films and related forms of communication engaging to science-curious members of culturally diverse groups—including the groups that are bitterly divided on climate change and other issues.

For now, there are no proven protocols for using science curiosity to help extinguish the group rivalries that generate public disagreement over policy-relevant science, particularly among the most science literate members of such groups.

But if the science of science communication is not yet in a position to tell science communicators exactly what to do to harness the unifying effects of curiosity, it unmistakably does tell them how to figure that out: by use of the empirical methods of science itself.

Check also:

The key mechanism that generates scientific polarization involves treating evidence generated by other agents as uncertain when their beliefs are relatively different from one’s own:
Scientific polarization. Cailin O’Connor, James Owen Weatherall. European Journal for Philosophy of Science. October 2018, Volume 8, Issue 3, pp 855–875. https://www.bipartisanalliance.com/2018/12/the-key-mechanism-that-generates.html

Polarized Mass or Polarized Few? Assessing the Parallel Rise of Survey Nonresponse and Measures of Polarization. Amnon Cavari and Guy Freedman. The Journal of Politics, https://www.bipartisanalliance.com/2018/03/polarized-mass-or-polarized-few.html

Tappin, Ben M., and Ryan McKay. 2018. “Moral Polarization and Out-party Hate in the US Political Context.” PsyArXiv. November 2. https://www.bipartisanalliance.com/2018/11/moral-polarization-and-out-party-hate.html

Forecasting tournaments, epistemic humility and attitude depolarization. Barbara Mellers, Philip Tetlock, Hal R. Arkes. Cognition, https://www.bipartisanalliance.com/2018/10/forecasting-tournaments-epistemic.html

Does residential sorting explain geographic polarization? Gregory J. Martin & Steven W. Webster. Political Science Research and Methods, https://www.bipartisanalliance.com/2018/10/voters-appear-to-be-sorting-on-non.html

Liberals and conservatives have mainly moved further apart on a wide variety of policy issues; the divergence is substantial quantitatively and in its plausible political impact: intra-party moderation has become increasingly unlikely:

Peltzman, Sam, Polarizing Currents within Purple America (August 20, 2018). SSRN: https://www.bipartisanalliance.com/2018/09/liberals-and-conservatives-have-mainly.html

Does Having a Political Discussion Help or Hurt Intergroup Perceptions? Drawing Guidance From Social Identity Theory and the Contact Hypothesis. Robert M. Bond, Hillary C. Shulman, Michael Gilbert. Vol 12 (2018), https://www.bipartisanalliance.com/2018/10/having-political-discussion-with-out.html

All the interactions took the form of subjects rating stories offering ‘ammunition’ for their own side of the controversial issue as possessing greater intrinsic news importance:
Perceptions of newsworthiness are contaminated by a political usefulness bias. Harold Pashler, Gail Heriot. Royal Society Open Science, https://www.bipartisanalliance.com/2018/08/all-interactions-took-form-of-subjects.html

When do we care about political neutrality? The hypocritical nature of reaction to political bias. Omer Yair, Raanan Sulitzeanu-Kenan. PLOS, https://www.bipartisanalliance.com/2018/05/when-do-we-care-about-political.html

Democrats & Republicans were both more likely to believe news about the value-upholding behavior of their in-group or the value-undermining behavior of their out-group; Republicans were more likely to believe & want to share apolitical fake news:
Pereira, Andrea, and Jay Van Bavel. 2018. “Identity Concerns Drive Belief in Fake News.” PsyArXiv. September 11. https://www.bipartisanalliance.com/2018/09/democrats-republicans-were-both-more.html

In self-judgment, the "best option illusion" leads to Dunning-Kruger (failure to recognize our own incompetence). In social judgment, it leads to the Cassandra quandary (failure to identify when another person’s competence exceeds our own):
The best option illusion in self and social assessment. David Dunning. Self and Identity, https://www.bipartisanalliance.com/2018/04/in-self-judgment-best-option-illusion.html

People are more inaccurate when forecasting their own future prospects than when forecasting others’, in part the result of biased visual experience. People orient visual attention and resolve visual ambiguity in ways that support self-interests:
Visual experience in self and social judgment: How a biased majority claim a superior minority. Emily Balcetis & Stephanie A. Cardenas. Self and Identity, https://www.bipartisanalliance.com/2018/04/people-are-more-inaccurate-when.html

Can we change our biased minds? Michael Gross. Current Biology, Volume 27, Issue 20, 23 October 2017, Pages R1089–R1091. https://www.bipartisanalliance.com/2017/10/can-we-change-our-biased-minds.html
Summary: A simple test taken by millions of people reveals that virtually everybody has implicit biases that they are unaware of and that may clash with their explicit beliefs. From policing to scientific publishing, all activities that deal with people are at risk of making wrong decisions due to bias. Raising awareness is the first step towards improving the outcomes.

People believe that future others' preferences and beliefs will change to align with their own:
The Belief in a Favorable Future. Todd Rogers, Don Moore and Michael Norton. Psychological Science, Volume 28, issue 9, page(s): 1290-1301, https://www.bipartisanalliance.com/2017/09/people-believe-that-future-others.html

Kahan, Dan M. and Landrum, Asheley and Carpenter, Katie and Helft, Laura and Jamieson, Kathleen Hall, Science Curiosity and Political Information Processing (August 1, 2016). Advances in Political Psychology, Forthcoming; Yale Law & Economics Research Paper No. 561. Available at SSRN: https://ssrn.com/abstract=2816803
Abstract: This paper describes evidence suggesting that science curiosity counteracts politically biased information processing. This finding is in tension with two bodies of research. The first casts doubt on the existence of “curiosity” as a measurable disposition. The other suggests that individual differences in cognition related to science comprehension - of which science curiosity, if it exists, would presumably be one - do not mitigate politically biased information processing but instead aggravate it. The paper describes the scale-development strategy employed to overcome the problems associated with measuring science curiosity. It also reports data, observational and experimental, showing that science curiosity promotes open-minded engagement with information that is contrary to individuals’ political predispositions. We conclude by identifying a series of concrete research questions posed by these results.
Keywords: politically motivated reasoning, curiosity, science communication, risk perception

Facebook news and (de)polarization: reinforcing spirals in the 2016 US election. Michael A. Beam, Myiah J. Hutchens & Jay D. Hmielowski. Information, Communication & Society, http://www.bipartisanalliance.com/2018/03/our-results-also-showed-that-facebook.html

The Partisan Brain: An Identity-Based Model of Political Belief. Jay J. Van Bavel, Andrea Pereira. Trends in Cognitive Sciences, http://www.bipartisanalliance.com/2018/02/the-tribal-nature-of-human-mind-leads.html

The Parties in our Heads: Misperceptions About Party Composition and Their Consequences. Douglas J. Ahler, Gaurav Sood. Aug 2017, http://www.bipartisanalliance.com/2018/01/we-tend-to-considerably-overestimate.html

The echo chamber is overstated: the moderating effect of political interest and diverse media. Elizabeth Dubois & Grant Blank. Information, Communication & Society, http://www.bipartisanalliance.com/2018/01/the-echo-chamber-is-overstated.html

Processing political misinformation: comprehending the Trump phenomenon. Briony Swire, Adam J. Berinsky, Stephan Lewandowsky, Ullrich K. H. Ecker. Royal Society Open Science, published on-line March 01 2017. DOI: 10.1098/rsos.160802, http://rsos.royalsocietypublishing.org/content/4/3/160802

Competing cues: Older adults rely on knowledge in the face of fluency. By Brashier, Nadia M.; Umanath, Sharda; Cabeza, Roberto; Marsh, Elizabeth J. Psychology and Aging, Vol 32(4), Jun 2017, 331-337. http://www.bipartisanalliance.com/2017/07/competing-cues-older-adults-rely-on.html

Stanley, M. L., Dougherty, A. M., Yang, B. W., Henne, P., & De Brigard, F. (2017). Reasons Probably Won’t Change Your Mind: The Role of Reasons in Revising Moral Decisions. Journal of Experimental Psychology: General. http://www.bipartisanalliance.com/2017/09/reasons-probably-wont-change-your-mind.html

Science Denial Across the Political Divide — Liberals and Conservatives Are Similarly Motivated to Deny Attitude-Inconsistent Science. Anthony N. Washburn, Linda J. Skitka. Social Psychological and Personality Science, 10.1177/1948550617731500. http://www.bipartisanalliance.com/2017/09/liberals-and-conservatives-are.html

Biased Policy Professionals. Sheheryar Banuri, Stefan Dercon, and Varun Gauri. World Bank Policy Research Working Paper 8113. http://www.bipartisanalliance.com/2017/08/biased-policy-professionals-world-bank.html

Dispelling the Myth: Training in Education or Neuroscience Decreases but Does Not Eliminate Beliefs in Neuromyths. Kelly Macdonald et al. Frontiers in Psychology, Aug 10 2017. http://www.bipartisanalliance.com/2017/08/training-in-education-or-neuroscience.html

Individuals with greater science literacy and education have more polarized beliefs on controversial science topics. Caitlin Drummond and Baruch Fischhoff. Proceedings of the National Academy of Sciences, vol. 114 no. 36, pp 9587–9592, doi: 10.1073/pnas.1704882114, http://www.bipartisanalliance.com/2017/09/individuals-with-greater-science.html

Expert ability can actually impair the accuracy of expert perception when judging others' performance: Adaptation and fallibility in experts' judgments of novice performers. By Larson, J. S., & Billeter, D. M. (2017). Journal of Experimental Psychology: Learning, Memory, and Cognition, 43(2), 271–288. http://www.bipartisanalliance.com/2017/06/expert-ability-can-actually-impair.html

Public Perceptions of Partisan Selective Exposure. Perryman, Mallory R. The University of Wisconsin - Madison, ProQuest Dissertations Publishing, 2017. 10607943. http://www.bipartisanalliance.com/2017/10/citizens-believe-others-especially.html

The Myth of Partisan Selective Exposure: A Portrait of the Online Political News Audience. Jacob L. Nelson, and James G. Webster. Social Media + Society, http://www.bipartisanalliance.com/2017/09/the-myth-of-partisan-selective-exposure.html

Echo Chamber? What Echo Chamber? Reviewing the Evidence. Axel Bruns. Future of Journalism 2017 Conference. http://www.bipartisanalliance.com/2017/09/echo-chamber-what-echo-chamber.html

Fake news and post-truth pronouncements in general and in early human development. Victor Grech. Early Human Development, http://www.bipartisanalliance.com/2017/09/fake-news-and-post-truth-pronouncements.html

Consumption of fake news is a consequence, not a cause of their readers’ voting preferences. Kahan, Dan M., Misinformation and Identity-Protective Cognition (October 2, 2017). Social Science Research Network, http://www.bipartisanalliance.com/2017/10/consumption-of-fake-news-is-consequence.html
