Belief perseverance

From Wikipedia, the free encyclopedia

Belief perseverance (also known as conceptual conservatism[1]) is maintaining a belief despite new information that firmly contradicts it.[2]

Since rationality involves conceptual flexibility,[3][4] belief perseverance is consistent with the view that human beings act at times in an irrational manner. Philosopher F.C.S. Schiller holds that belief perseverance "deserves to rank among the fundamental 'laws' of nature".[5]

If beliefs are strengthened after others attempt to present evidence debunking them, this is known as a backfire effect.[6] There are psychological mechanisms by which backfire effects could potentially occur, but the evidence on this topic is mixed, and backfire effects are very rare in practice.[7][8][9] A 2020 review of the scientific literature on backfire effects found widespread failures to replicate their existence, even under conditions that would be theoretically favorable to observing them.[8] Given this lack of reproducibility, as of 2020 most researchers believe that backfire effects are unlikely to occur at the broader population level, occur only in very specific circumstances, or do not exist.[8] For most people, corrections and fact-checking are very unlikely to have a negative impact, and there is no specific group of people in which backfire effects have been consistently observed.[8]

Evidence from experimental psychology

According to Lee Ross and Craig A. Anderson, "beliefs are remarkably resilient in the face of empirical challenges that seem logically devastating".[10]

The first study of belief perseverance was carried out by Festinger, Riecken, and Schachter.[11] These social psychologists spent time with members of a doomsday cult who believed the world would end on December 21, 1954.[11] Despite the failure of the forecast, most believers continued to adhere to their faith.[11][12][13] In When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World (1956) and A Theory of Cognitive Dissonance (1957), Festinger proposed that human beings strive for internal psychological consistency to function mentally in the real world.[11] A person who experiences internal inconsistency tends to become psychologically uncomfortable and is motivated to reduce the cognitive dissonance.[11][12][14] They tend to make changes to justify the stressful behavior, either by adding new parts to the cognition causing the psychological dissonance (rationalization) or by avoiding circumstances and contradictory information likely to increase the magnitude of the cognitive dissonance (confirmation bias).[11][12][14]

When asked to reappraise probability estimates in light of new information, subjects displayed a marked tendency to give insufficient weight to the new evidence. They refused to acknowledge the inaccurate prediction as a reflection of the overall validity of their faith. In some cases, subjects reported having a stronger faith in their religion than before.[15]

In a separate study, mathematically capable teenagers and adults were given seven arithmetic problems and first asked to estimate approximate solutions by hand. They were then asked for exact answers using a calculator that had been rigged to give increasingly erroneous figures (e.g., yielding 252 × 1.2 = 452.4, when the correct answer is 302.4). About half of the participants went through all seven tasks commenting on their own estimating abilities or tactics, never letting go of the belief that calculators are infallible. They simply refused to admit that their previous assumptions about calculators could have been incorrect.[16]
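The rigged-calculator setup can be sketched in code. Only the single reported figure (252 × 1.2 displayed as 452.4 instead of 302.4) comes from the study as described; the drift schedule, function names, and task numbering below are illustrative assumptions.

```python
# Sketch of the rigged-calculator experiment. Only the reported
# example (252 * 1.2 shown as 452.4, actually 302.4) is from the
# text; the 10%-per-task drift schedule is an assumption.

def correct(a, b):
    """An honest calculator: the true product, rounded to 1 decimal."""
    return round(a * b, 1)

def rigged(a, b, task_number, drift_per_task=0.10):
    """A 'calculator' whose answers get progressively worse
    with each task (assumed error model)."""
    return round(a * b * (1 + drift_per_task * task_number), 1)

# Reported example: the true answer is 302.4, but the rigged
# display showed 452.4 (roughly 50% too high).
assert correct(252, 1.2) == 302.4
# Under the assumed drift, a ~50% error corresponds to a late task:
print(rigged(252, 1.2, task_number=5))  # 453.6 under this model
```

The point of the design is that the discrepancy between the participants' own estimates and the calculator's output grows task by task, yet most participants questioned their estimates rather than the device.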

Lee Ross and Craig A. Anderson led some subjects to the false belief that there existed a positive correlation between a firefighter's stated preference for taking risks and their occupational performance. Other subjects were told that the correlation was negative. The participants were then thoroughly debriefed and informed that there was no link between risk taking and performance. These authors found that post-debriefing interviews pointed to significant levels of belief perseverance.[17]

In another study, subjects spent about four hours following the instructions of a hands-on instructional manual. At a certain point, the manual introduced a formula which led them to believe that spheres are 50 percent larger than they actually are. Subjects were then given an actual sphere and asked to determine its volume: first by using the formula, and then by filling the sphere with water, transferring the water to a box, and directly measuring the volume of the water in the box. In the last experiment in this series, all 19 subjects held a Ph.D. degree in a natural science, were employed as researchers or professors at two major universities, and carried out the comparison between the two volume measurements a second time with a larger sphere. All but one of these scientists clung to the spurious formula despite their empirical observations.[18]
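The size of the discrepancy the subjects faced can be sketched as follows. The true formula V = (4/3)πr³ is standard; the study's actual spurious formula is not given in the text, so a hypothetical version yielding volumes exactly 50 percent too large (2πr³, since 1.5 × 4/3 = 2) is assumed here for illustration.

```python
import math

def true_volume(r):
    """True volume of a sphere of radius r: V = (4/3) * pi * r^3."""
    return (4 / 3) * math.pi * r ** 3

def spurious_volume(r):
    """Hypothetical spurious formula (not from the study): 2 * pi * r^3,
    i.e. exactly 50% larger than the true volume."""
    return 2 * math.pi * r ** 3

r = 5.0  # radius in cm (example value, not from the study)
ratio = spurious_volume(r) / true_volume(r)
print(round(ratio, 2))  # 1.5
```

A 50 percent overestimate is far outside any plausible measurement error in the water-displacement check, which is what makes the subjects' continued reliance on the formula striking.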

Even when we deal with ideologically neutral conceptions of reality, when these conceptions have been recently acquired, when they came to us from unfamiliar sources, when they were assimilated for spurious reasons, when their abandonment entails little tangible risks or costs, and when they are sharply contradicted by subsequent events, we are, at least for a time, disinclined to doubt such conceptions on the verbal level and unlikely to let go of them in practice.

–Moti Nissani[1]

Backfire effects

If beliefs are strengthened after others attempt to present evidence debunking them, this is known as a backfire effect (compare boomerang effect).[6] For example, this would apply if providing information on the safety of vaccinations resulted in increased vaccination hesitancy.[19][20] Types of backfire effects include the Familiarity Backfire Effect (from making myths more familiar), the Overkill Backfire Effect (from providing too many arguments), and the Worldview Backfire Effect (from providing evidence that threatens someone's worldview).[8] There are a number of techniques for debunking misinformation, such as emphasizing the core facts rather than the myth, providing explicit warnings that the upcoming information is false, and providing alternative explanations to fill the gaps left by debunking the misinformation.[21] However, more recent studies have provided evidence that backfire effects are not as likely as once thought.[22]

There are psychological mechanisms by which backfire effects could potentially occur, but the evidence on this topic is mixed, and backfire effects are very rare in practice.[7][8][9] A 2020 review of the scientific literature on backfire effects found widespread failures to replicate their existence, even under conditions that would be theoretically favorable to observing them.[8] Given this lack of reproducibility, as of 2020 most researchers believe that backfire effects are unlikely to occur at the broader population level, occur only in very specific circumstances, or do not exist.[8] Brendan Nyhan, one of the researchers who initially proposed the occurrence of backfire effects, wrote in 2021 that the persistence of misinformation is most likely due to other factors.[9]

For most people, corrections and fact-checking are very unlikely to have a negative impact, and there is no specific group of people in which backfire effects have been consistently observed.[8] Presenting people with factual corrections has been demonstrated to have a positive effect in many circumstances.[8][23][24] For example, this has been studied in the case of informing believers in 9/11 conspiracy theories about statements by actual experts and witnesses.[23] One possibility is that criticism is most likely to backfire if it challenges someone's worldview or identity. This suggests that an effective approach may be to provide criticism while avoiding such challenges.[24]

In many cases, when backfire effects have been discussed by the media or by bloggers, they have been over-generalized from studies on specific subgroups to incorrectly conclude that backfire effects apply to the entire population and to all attempts at correction.[8][9]

In cultural innovations

Physicist Max Planck wrote that "the new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it".[25] For example, the heliocentric theory of the Greek astronomer Aristarchus of Samos had to be rediscovered about 1,800 years later, and even then it underwent a major struggle before astronomers took its veracity for granted.[26]

Belief persistence is frequently accompanied by intrapersonal cognitive processes. "When the decisive facts did at length obtrude themselves upon my notice," wrote the chemist Joseph Priestley, "it was very slowly, and with great hesitation, that I yielded to the evidence of my senses."[27]

In education

Students often "cling to ideas that form part of their world view even when confronted by information that does not coincide with this view."[28] For example, students may spend months studying the solar system and do well on related tests, but still believe that moon phases are produced by Earth's shadow. What they learned could not displace the beliefs they held before acquiring that knowledge.[29]

Causes

The causes of belief perseverance remain unclear. Experiments in the 2010s suggest that neurochemical processes in the brain underlie the strong attentional bias of reward learning. Similar processes could underlie belief perseverance.[30]

Peter Marris suggests that the process of abandoning a conviction is similar to the working out of grief. "The impulse to defend the predictability of life is a fundamental and universal principle of human psychology." Human beings possess "a deep-rooted and insistent need for continuity".[31]

Philosopher of science Thomas Kuhn points to the resemblance between conceptual change and Gestalt perceptual shifts (e.g., the difficulty encountered in seeing the hag as a young lady). Hence, the difficulty of switching from one conviction to another could be traced to the difficulty of rearranging one's perceptual or cognitive field.[32]

See also

  • Anussava – "Do not go upon what has been acquired by repeated hearing."

References

  1. ^ a b Nissani, Moti (December 1990). "A Cognitive Reinterpretation of Stanley Milgram's Observations on Obedience to Authority". American Psychologist. 45 (12): 1384–1385. doi:10.1037/0003-066X.45.12.1384. Retrieved November 21, 2021.
  2. ^ Baumeister, R. F.; et al., eds. (2007). Encyclopedia of Social Psychology. Thousand Oaks, CA: Sage. pp. 109–110. ISBN 9781412916707.
  3. ^ Voss, J. F.; et al., eds. (1991). Informal Reasoning and Education. Hillsdale: Erlbaum. p. 172.
  4. ^ West, L.H.T.; et al., eds. (1985). Cognitive Structure and Conceptual Change. Orlando, FL: Academic Press. p. 211.
  5. ^ Beveridge, W. I. B. (1950). The Art of Scientific Investigation. New York: Norton. p. 106.
  6. ^ a b Silverman, Craig (June 17, 2011). "The Backfire Effect: More on the press’s inability to debunk bad information". Columbia Journalism Review, Columbia University (New York City).
  7. ^ a b Lazić, Aleksandra; Žeželj, Iris (18 May 2021). "A systematic review of narrative interventions: Lessons for countering anti-vaccination conspiracy theories and misinformation". Public Understanding of Science. 30 (6). SAGE Publications: 644–670. doi:10.1177/09636625211011881. ISSN 0963-6625.
  8. ^ a b c d e f g h i j k Swire-Thompson, B.; DeGutis, J.; Lazer, D. (2020). "Searching for the Backfire Effect: Measurement and Design Considerations". Journal of Applied Research in Memory and Cognition. 9 (3): 286–299. doi:10.1016/j.jarmac.2020.06.006. PMC 7462781. PMID 32905023.
  9. ^ a b c d Nyhan, B. (2021). "Why the backfire effect does not explain the durability of political misperceptions". Proceedings of the National Academy of Sciences. 118 (15). doi:10.1073/pnas.1912440117. PMC 8053951. PMID 33837144.
  10. ^ Kahneman, Daniel, ed. (1982). Judgment Under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press. p. 144.
  11. ^ a b c d e f Dawson, Lorne L. (October 1999). "When Prophecy Fails and Faith Persists: A Theoretical Overview" (PDF). Nova Religio: The Journal of Alternative and Emergent Religions. 3 (1). Berkeley: University of California Press: 60–82. doi:10.1525/nr.1999.3.1.60. ISSN 1092-6690. LCCN 98656716. Retrieved 20 September 2021.
  12. ^ a b c Festinger, L. (1962). "Cognitive dissonance". Scientific American. 207 (4): 93–107. Bibcode:1962SciAm.207d..93F. doi:10.1038/scientificamerican1062-93. PMID 13892642. S2CID 56193073.
  13. ^ Festinger, Leon; et al. (1956). When Prophecy Fails. Minneapolis: University of Minnesota Press.
  14. ^ a b Festinger, L. (1957). A Theory of Cognitive Dissonance. California: Stanford University Press.
  15. ^ Kleinmuntz, B., ed. (1968). Formal Representation of Human Judgment. New York: Wiley. pp. 17–52.
  16. ^ Timnick, Lois (1982). "Electronic Bullies". Psychology Today. 16: 10–15.
  17. ^ Anderson, C. A. (1983). "Abstract and Concrete Data in the Conservatism of Social Theories: When Weak Data Lead to Unshakeable Beliefs" (PDF). Journal of Experimental Social Psychology. 19 (2): 93–108. doi:10.1016/0022-1031(83)90031-8. Archived from the original (PDF) on 2016-10-05. Retrieved 2016-07-18.
  18. ^ Nissani, M.; Hoefler-Nissani, D. M. (1992). "Experimental Studies of Belief-Dependence of Observations and of Resistance to Conceptual Change". Cognition and Instruction. 9 (2): 97–111. doi:10.1207/s1532690xci0902_1.
  19. ^ Romm, Cari (December 12, 2014). "Vaccine Myth-Busting Can Backfire". The Atlantic.
  20. ^ Nyhan, Brendan; Reifler, Jason (January 9, 2015). "Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information". Vaccine.
  21. ^ Cook, J.; Lewandowsky, S. (November 5, 2011). The Debunking Handbook. St. Lucia, Australia: University of Queensland. ISBN 978-0-646-56812-6.
  22. ^ Lewandowsky, Stephan; Cook, John; Lombardi, Doug (2020), Debunking Handbook 2020, Databrary, pp. 9–11, doi:10.17910/b7.1182, retrieved 2021-01-20
  23. ^ a b van Prooijen, Jan-Willem; Douglas, Karen M. (2018). "Belief in conspiracy theories: Basic principles of an emerging research domain". European Journal of Social Psychology. 48 (7): 897–908. doi:10.1002/ejsp.2530. ISSN 0046-2772. PMC 6282974. PMID 30555188.
  24. ^ a b Moyer, Melinda Wenner (1 March 2019). "People Drawn to Conspiracy Theories Share a Cluster of Psychological Features". Scientific American. Retrieved 16 October 2020.
  25. ^ Eysenck, Hans J. (1990). Rebel with a Cause. London: W. H. Allen. p. 67.
  26. ^ Koestler, Arthur (1990). The Sleepwalkers: A History of Man's Changing Vision of the Universe. Penguin Books. ISBN 978-0140192469.
  27. ^ Roberts, Royston M. (1989). Serendipity. New York: Wiley. p. 28.
  28. ^ Burbules, N.C.; et al. (1992). "Response to contradiction: scientific reasoning during adolescence". Journal of Educational Psychology. 80: 67–75. doi:10.1037/0022-0663.80.1.67.
  29. ^ Lightman, A.; et al. (1993). "Teacher predictions versus actual student gains". The Physics Teacher. 31 (3): 162–167. Bibcode:1993PhTea..31..162L. doi:10.1119/1.2343698.
  30. ^ Anderson, Brian A.; et al. (2016). "The Role of Dopamine in Value-Based Attentional Orienting". Current Biology. 26 (4): 550–555. doi:10.1016/j.cub.2015.12.062. PMC 4767677. PMID 26877079.
  31. ^ Marris, Peter (1986). Loss and Change. London: Routledge. p. 2.
  32. ^ Kuhn, Thomas (1962). The Structure of Scientific Revolutions. Chicago: University of Chicago Press.
