The psychology of actual versus perceived risks.
Risk is an important basis for our decisions, both as individuals and as communities. For example, an individual assesses the risks of switching jobs, the medical industry assesses the risks of approving a drug with side effects, and recently many have been contemplating the risks of a state remaining in or leaving the European Union. And then there is enterprise (or cost-related) risk, which is quantifiable on the basis of financial gains and losses.
But how do we deal with risk from a psychological perspective when it comes to cost-related risk and decision making? Technically, risk is defined as the product of an incident’s likelihood and the incident’s impact. However, we – as humans – often struggle to assess either factor.
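The definition above can be illustrated with a short sketch; all figures below are invented purely for illustration:

```python
# Risk as the product of an incident's likelihood and its impact.

def risk(likelihood: float, impact: float) -> float:
    """Expected loss: probability of the incident times its cost."""
    return likelihood * impact

# A rare but severe incident versus a frequent but minor one:
rare_but_severe = risk(0.01, 100_000)  # 1% chance of a £100,000 loss
frequent_minor = risk(0.50, 2_000)     # 50% chance of a £2,000 loss

print(rare_but_severe)  # 1000.0
print(frequent_minor)   # 1000.0
```

Both incidents carry the same technical risk of £1,000, yet – as the rest of this article argues – they rarely feel equally risky to us.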
A major complication in our understanding of risk arises from the fact that we exhibit a so-called neglect of probability. The study by Monat et al. shows this clearly, based on the responses of students told they faced different likelihoods (from 0% to 100%) of receiving an electric shock during the study: students who knew there was no chance of a shock understandably showed no stress reaction, but, interestingly, students informed that there was some likelihood X of receiving a shock showed equal levels of stress regardless of whether X was 5%, 50% or 95%. The main conclusion is that we tend to respond not to the likelihood of an event, but only to its possible occurrence.
Furthermore, our brains also get confused when we try to assess the magnitude of an occurrence’s impact. The representativeness and affect heuristics are common biases that lead us to characterise an event subjectively and emotionally, and thus often out of its actual proportions. Our natural, perhaps even evolutionary, defensiveness makes us fear loss more than we value gain (see the milestone paper by Kahneman and Tversky). Think about it: losing £100 will cost you a greater amount of happiness than the delight you’d feel if someone gave you that amount of money.
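Kahneman and Tversky’s prospect theory put numbers on this asymmetry. Here is a minimal sketch of their value function, using the parameter estimates commonly cited from their 1992 follow-up work (curvature α ≈ 0.88, loss-aversion coefficient λ ≈ 2.25):

```python
# Prospect-theory-style value function: gains are valued concavely,
# and losses are weighted about 2.25 times as heavily as equal gains.
# Parameters (alpha, lam) follow the commonly cited 1992 estimates.

def subjective_value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Perceived value of a monetary gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

gain = subjective_value(100)   # pleasure of receiving £100
loss = subjective_value(-100)  # pain of losing £100

print(round(-loss / gain, 2))  # 2.25 -- the loss looms over twice as large
```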
Besides loss aversion, our decisions are also governed by risk aversion – a strong preference for certainty over gambles, even when the gamble has an equal or higher expected value – and by its close relative, ambiguity aversion, famously demonstrated by the Ellsberg paradox. Ellsberg presented a test population with two urns. One urn contained 50 white marbles and 50 black marbles; the other also contained 100 marbles, but the ratio of white to black was unknown, with every ratio as likely as any other. “The game” was to draw a black marble in one pick, without looking, in order to win a prize. Although the probability of drawing a black marble was identical for either urn, people overwhelmingly chose to draw from the urn with known proportions rather than take a chance on the urn with the unknown ratio. This is one demonstration that we exhibit a strong aversion to ambiguity and uncertainty – an inherent preference for the known over the unknown.
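The claim that the two urns offer identical odds follows from symmetry: averaging over all equally likely ratios gives a probability of one half. A quick Monte Carlo sketch of the experiment confirms this:

```python
import random

random.seed(0)

def draw_black_known() -> bool:
    """Draw from the urn with exactly 50 black marbles out of 100."""
    return random.random() < 0.5

def draw_black_unknown() -> bool:
    """Draw from an urn whose black count is uniform on 0..100."""
    blacks = random.randint(0, 100)
    return random.random() < blacks / 100

trials = 200_000
p_known = sum(draw_black_known() for _ in range(trials)) / trials
p_unknown = sum(draw_black_unknown() for _ in range(trials)) / trials

print(round(p_known, 2), round(p_unknown, 2))  # both come out near 0.50
```

The rational choice is therefore indifference between the urns; the overwhelming preference for the known urn is exactly what makes the result a paradox.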
All of the above cognitive biases lead us to misjudge both the likelihood of an event and the magnitude of its impact, and consequently to a distorted perception of risk. But thanks to a plethora of socio-psychological studies, we can become conscious of our misleading biases, and we possess the knowledge needed to distinguish between actual and perceived risks. A scientific approach helps in assessing risks more realistically. The methodologies include suitable mathematical concepts (e.g. probability theory, data mining, quantification of expert judgement, objective records of past occurrences) and predefined cost-based thresholds of acceptable risk.
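As one hypothetical illustration of such a cost-based threshold, each incident’s expected loss can be compared against a predefined acceptable level; every name and figure below is invented for the sake of the example:

```python
# Accept or mitigate risks against a predefined cost-based threshold.
ACCEPTABLE_ANNUAL_LOSS = 5_000  # assumed threshold, in pounds

incidents = {
    "server outage": (0.20, 10_000),   # (annual likelihood, impact in pounds)
    "data breach":   (0.02, 500_000),
    "laptop theft":  (0.10, 1_500),
}

verdicts = {}
for name, (likelihood, impact) in incidents.items():
    expected_loss = likelihood * impact
    verdicts[name] = "accept" if expected_loss <= ACCEPTABLE_ANNUAL_LOSS else "mitigate"
    print(f"{name}: expected loss £{expected_loss:,.0f} -> {verdicts[name]}")
```

Such a scheme responds to the actual product of likelihood and impact, rather than to how frightening an incident happens to feel.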
Some further reading:
Proske (2008): ‘Catalogue of risks: natural, technical, social and health risks’
Dobelli (2013): ‘The Art of Thinking Clearly’
References:
Glossary: representativeness heuristic
Kahneman and Tversky (1979): ‘Prospect Theory: An Analysis of Decision under Risk’. In: Econometrica