11 Helpful Heuristics
Happily, there are a number of heuristics that people have suggested, aimed at resisting the worst tendencies of self-interested biases. For instance, the US political philosopher John Rawls proposed an interesting way of using rational self-interest to subvert exceptionalist thinking in political theory.[1] He suggested that when we are trying to determine a just solution to a social problem, or even what a just society might be, we should imagine ourselves behind a veil of ignorance. By this, he meant that when we are trying to decide what social arrangement would be the most just, we should imagine that we don’t know what our position or role in society might be. He thought that if we make decisions about social arrangements imagining that we might be in the worst possible position in society, then we will favour fairer decisions.[2] The veil of ignorance heuristic can be extended to thinking about how our own actions might affect or be judged by others, but it is particularly useful in thinking about policy decisions in various applied ethics contexts.
Many cultures share another kind of tool to guard against self-regarding cognitive biases. In the Christian tradition, it is called the Golden Rule: “do to others what you want them to do to you” (Matthew 7:12). Unsurprisingly, the two other major religions that share historical roots with Christianity also treat this maxim as central. Jewish law teaches, “Love your neighbour as yourself” (Leviticus 19:18), and we can find much the same idea in Islamic thought.[3] It is perhaps more striking that we can find the same basic idea in India, China, and Africa. In the Mahabharata, one passage counsels, “One should not behave towards others in a way which is disagreeable to oneself. This is the essence of morality. All other activities are due to selfish desire.”[4] Similarly, Kongfuzi taught, “What you do not want done to yourself, do not do to others.”[5] These are just a few examples of the traditions where we can find this important reminder to avoid exceptionalist and selfish thinking.
Of course, the veil of ignorance and the Golden Rule are just heuristics and cannot offer perfect guidance. Some philosophers have complained that our imaginations are limited, and that it is much better to attend to what the least fortunate in our society say they want, rather than imagining ourselves in their shoes. Similarly, you may have some pretty peculiar tastes and preferences, so others may not want to be treated the same way as you would like to be. But arguably, such responses risk missing the point. These heuristics are helpful because they remind us that other individuals have lives as rich, complicated, and worthy of respect as our own.
Helpful heuristics aside, some of the best tools for challenging our own cognitive biases and self-interested attitudes have already been described in this primer. It’s not just the substance of these ideas and methods that makes them so effective; it is also that employing them slows down our deliberative and decision-making processes. Daniel Kahneman, who won the Nobel Prize for his research on human decision-making, describes two ways that neurotypical human brains process information and make decisions: fast thinking and slow thinking.[6] The fast thinking system takes past experiences, judgements, and decisions and unreflectively applies these previous patterns to current situations, thus saving mental effort. Fast thinking is common and effective in daily tasks and decision-making. However, it also makes us susceptible to cognitive biases and errors in thinking, like the self-interested biases discussed above. Slow thinking describes the rational and deliberative thought processes that allow humans to think critically about their beliefs, values, and experiences when making choices.[7] The slow thinking system, however, requires significant mental effort. Kahneman has found that—although, as rational agents, we identify with the slow thinking process—we depend more on fast thinking. Kahneman’s research shows that when we allow our brains to run on autopilot, are cognitively busy or exhausted, or are avoiding tasks that engage our slow thinking processes, then we are more susceptible to selfish, unethical, and even discriminatory behaviour.[8]
Stop and Think
Can you identify a situation in which, on reflection, you behaved badly because you were relying on fast thinking?
How might things have gone differently if you had made time for careful, critical reflection?
Using the tools that we have described in this primer will help you engage your slow thinking processes and avoid the cognitive biases that might lead to unethical attitudes or behaviour. Productive debate and philosophical argumentation can help you understand alternative positions and evaluate the strength of your own reasons for accepting a given conclusion. The ethical lenses can help you think through decisions about how to act from various perspectives that have different starting points or central commitments. And, when our choices matter morally, carefully employing the helpful heuristics described above can disrupt the fast thinking that we subconsciously use to make most of our small, everyday decisions. Ethical reflection and discussion help us think critically about the attitudes, beliefs, and practices we might take for granted—attitudes, beliefs, and practices that may be ingrained in our automatic, fast thinking habits.
11.1 Conclusion
These final thoughts ultimately bring us back to the point that we made right at the beginning of this primer in the Introduction. You cannot avoid ethics. Whatever path we take in life, ethical problems will arise, and we will be challenged to determine for ourselves what it means to live well and do the right thing in the circumstances in which we find ourselves. Each of us plays a part in creating more ethical practices in our society and a more just community. Different professions and applied ethics contexts come with their own particular challenges, but each can be assessed by employing much the same tools—including the ideas offered in this primer.
- John Rawls, Justice as Fairness: A Restatement, ed. Erin Kelly (Cambridge: Harvard University Press, 2001). ↵
- The BBC has a nice brief introduction to this thought experiment. See Nigel Warburton, "The Veil of Ignorance," BBC, April 2015, video, https://www.bbc.co.uk/programmes/p02n3sgv. ↵
- For example, see Justin Parrott, “Al-Ghazali and the Golden Rule: Ethics of Reciprocity in the Works of a Muslim Sage,” Journal of Religious and Theological Information 16, no. 2 (2017): 68-72, http://dx.doi.org/10.1080/10477845.2017.1281067. ↵
- Richard H. Davis, “A Hindu Golden Rule, in Context,” in The Golden Rule: The Ethics of Reciprocity in World Religions, ed. Jacob Neusner and Bruce Chilton (London: Continuum, 2008), 146-56. ↵
- Confucius, The Analects (New York: Open Road Integrated Media, 2016), xv, 23, https://ebookcentral.proquest.com/lib/dal/detail.action?docID=4697581. ↵
- Daniel Kahneman, Thinking, Fast and Slow (Toronto: Anchor Canada, 2013), 10-12. ↵
- Kahneman, Thinking, 21. ↵
- Kahneman, Thinking, 37. ↵
Mental shortcuts that tend toward error when used uncritically
A thought experiment proposed by John Rawls that asks us, when evaluating the justice of social arrangements, to imagine that we don't know what our position or role in that society would be
The principle of treating others as you want to be treated
The cognitive system that we use for most of our daily decision-making; it relies on mental shortcuts to make quick decisions but is more prone to errors as a result
The cognitive system, requiring considerable mental effort, that allows humans to think critically and rationally about their beliefs, values, experiences, and choices when making a decision