I just finished The Folly of Fools: The Logic of Deceit and Self-Deception in Human Life (Basic Books, 2011) by Robert Trivers, a professor of anthropology and biology at Rutgers. The book covers a lot of ground, moving from deception as a strategy in plants and animals to human deception and self-deception, and finally to the role of deception and self-deception in human organizations, including government and religion. As humans, we are pretty good at sniffing out deception (aka lying) in others; self-deception is a mental adaptation that makes our deceptions harder to detect and boosts our self-confidence. I found the discussion of self-deception by organizations in the latter third of the book to be the most interesting material.
Chapter 7 looks at the psychology of self-deception. How do we pull it off? "Misrepresentation of self to others is believed to be the primary force behind misrepresentation of self to self" (p. 139). In other words, self-deception makes us better liars. But we are not necessarily conscious of our own self-deception, this self-serving manipulation of information inside our heads. The ways that our human information processing apparatus leads to self-deception include:
- Avoiding Conflicting Information - We censor our own input: "We seek out publications that mirror or support our prior views and largely avoid those that don't" (p. 140).
- Biased Encoding and Interpretation of Information - "Our perceptual systems are set up to orient very quickly toward preferred information" and "our initial biases may have surprisingly strong effects" (p. 142).
- Biased Memory - "Memories are constantly distorted in self-serving ways." Memory is "both reconstructed and easily manipulated" and "we can invent completely fictitious memories" (pp. 143-44).
- Rationalization and Biased Reporting - "We reconstruct internal motives and narratives to rationalize otherwise bad or questionable behavior" (p. 145).
In Chapter 9, Trivers examines the role of self-deception within organizations, looking specifically at aviation and space disasters. These make good case studies because of the detailed investigations and reports that follow such disasters. The discussion shows how self-deception works at the level of the organization. "When an organization practices deception toward the larger society, this may induce self-deception within the organization ..." (p. 201). One feature of organizational self-deception is the downgrading and marginalizing of the departments responsible for internal reviews or safety procedures. Regarding NASA's failure to properly assess the risk of the failed O-ring seals that caused the Challenger disaster, Trivers comments that "in service of the larger institutional deceit and self-deception, the safety unit was thoroughly corrupted to serve propaganda ends, that is, to create the appearance of safety where none existed" (p. 203).
Chapter 10, "False Historical Narratives," considers as examples the US, Japan, Turkey, Israel, and Arab states. Chapter 12 finally gets us to "Religion and Self-Deception." Cynics view all religions as thoroughly deceptive and self-deceptive, whereas sectarian believers view their own sect or religion as true and sincere and only other sects or religions as deceptive. Trivers is quick to point out that both accounts are simplistic: supposedly sophisticated critics of religion can be just as simplistic in their critique as religious defenders are in their belief. He tries to identify how self-deception works and why it persists in religious organizations. He notes the in-group and out-group effect of religious affiliation, which increases cooperation within the group at the cost of lowering cooperation outside it. But it is the informational aspect that really merits discussion, as "certain features of religion provide a recipe for self-deception, removing nearly all restraints from rational thought" (p. 179). Such features include:
- A Unified, Privileged View of the Universe for Your Own Group - You are the chosen people.
- There May Be a Series of Interconnected Phantasmagorical Things - He rather cynically notes, "Once you have signed on to a few of these notions, there are hardly any boundaries left, and very small details can turn out to be critical features of dogma" (p. 283).
- The Deification of a Prophet - "The deification of Jesus is unlike the treatment of prophets in either Islam or Judaism" (p. 283).
- Sometimes a Book Is Treated as Received Wisdom Direct from God - "The key is that you — or your group — control the document and its interpretation" (p. 284).
- Faith Supersedes Reason - "The degree to which we believe something now becomes a determinant of its truth value" (p. 284).
- We Are Right - "We have the true religion, and as believers we are superior to those around us. (We have been "saved"; they have not.) ... [O]ur God is a just God, so our actions can't be evil when they're done in God's name" (p. 285).
There is no discussion of Mormonism anywhere in the book. I'll leave it as an exercise for the reader to judge the extent to which these features of self-deception in religion are part of the Mormon way of doing religion or of the LDS Church as an institution. It would probably be misleading to undertake such a discussion, suggesting as it might that Mormonism is uniquely subject to self-deception, whereas Trivers thinks all organizations and religions employ a healthy dose of self-deceptive practices. It's likely self-deception is evident in Mormonism to more or less the same degree as in other denominations. So why even talk about it, you might say? Because in the long run self-deception is harmful. Being aware of self-deception in ourselves and in organizations we are part of is a form of self-defense, protecting us from ourselves (we can be our own worst enemy) and from being used for ends that are not our own.
There is plenty more in the book, of course. If you want to dig deeper, read Trivers' academic paper "The evolution and psychology of self-deception."