Behavioural Economics: When irrationality is the remarkably logical decision

The link between Behavioural Economics and the design of effective Security Awareness programmes may not, at first, be apparent. Yet its theories and lessons have greatly influenced our work here at Marmalade Box. Let me explain.

For two centuries, the idea that anyone, given the opportunity, would always try to maximise benefit for themselves, was a central tenet of economics. “It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner, but from their regard to their own interest,” wrote the father of economics, Adam Smith, in The Wealth of Nations, in 1776. Then in the 1970s, psychologists Daniel Kahneman and Amos Tversky introduced the world to cognitive bias — the often illogical thought systems we develop and apply to different situations.

The cat was out of the bag. Humans, it appeared, were not rational creatures after all, in ways that contradicted the fundamental assumptions of utility theory. In fact, it transpired that rather than carefully weighing up facts to make sensible choices, we often made rash decisions based on scant information, misguided thinking and ill-informed “knowledge”. And yet, rather than this being to our disadvantage, in a world of increasing complexity, where we lack the time or capacity to accurately calculate risk and probability, acting irrationally but doing so in a predictable way actually works in our favour.

This is the realm of behavioural economics, and the insights it offers have widespread applicability, not least to the world of information security awareness and security behaviour. We know, because we (and, in turn, our clients) have derived enormous benefit from using its principles to design our SABC™ programmes over the last few years.

Behavioural economics rules

This October, behavioural scientist Richard Thaler won the Nobel Prize for economics, the third behavioural scientist to do so (Kahneman won it in 2002 and Robert Shiller in 2013). Thaler’s particular contribution was to show how, by using the principles of behavioural research to better understand the mental shortcuts people use to make decisions, you could ‘nudge’ them into making choices that are more ‘desirable’.

It is the antithesis of traditional economic rationality, but may actually be the rational solution in increasingly complex environments. Employing mental shortcuts to focus on a single key factor enables us to tune out the ‘white noise’ of competing information. Not only does this save time, but it also achieves surprisingly good outcomes, including at work, where we are exposed to a myriad of often contrary opinions, ideas and misperceptions that colour how we think and behave.

Behavioural economics sheds light on this ‘gut instinct reasoning’ and helps explain why communication campaigns and transformation programmes that appeal solely to logic often don’t just fail, but can actually reinforce already entrenched attitudes. Anyone who has challenged someone’s worldview will have experienced the ‘backfire effect’ as they become ever more stubborn. Yet most information security awareness strategies, aimed at influencing security behaviour, are based on disseminating logic and data, trying to change behaviour through rational argument, when in fact doing nothing would give a similar outcome.

Of course, this is not the proactive ‘can do’ approach that any self-respecting CISO would expect or want to hear, which leads us to how behavioural economics can guide us in developing more effective information security awareness campaigns.

Using cognitive bias to our advantage in designing Security Awareness programmes

By applying a knowledge of heuristics – the rules of thumb that are hard-wired into us – and cognitive bias, we can design information security awareness programmes that are far more effective at changing behaviour.

While there’s no agreement about exactly how many cognitive biases there are, research suggests there are possibly 166, each one of them colouring the way we look at the world and providing a potential opportunity for creating behavioural change. These are just a few of the more common ones you could look at employing:

  • Loss Aversion. We hate losing and do all we can to avoid it, even if it’s something we never wanted or had an interest in previously.
  • Status Quo Bias. Think of this as a fancy name for inertia. It means that we change behaviour only when the incentive to do so is sufficiently strong, otherwise we stick with what we know even when we are presented with a better alternative.
  • Anchoring. What people see first creates a powerful ‘anchor’ that affects their subsequent thinking. That’s why retailers present the most expensive item first, so that others seem cheap in comparison.
  • Framing. How we think about something changes with the context. In other words, we will think differently about something depending on where we are, who we are with and what is around us.
  • Bandwagon effect. We may believe we’re independent by nature, but we follow others more often than we think. The probability that one person will change their behaviour increases with the number of people they see exhibiting that behaviour.

Improving choice architecture

If there is one thing that the whole area of behavioural research reveals, it is that facts on their own are effectively meaningless, or at least so open to misinterpretation as to make them effectively so. That’s because how we respond to them is intimately connected to our own individual preferences and inclinations, and to what gives us pleasure or pain. Given this background, we are not so much using reason to arrive at the best decision; rather, we are looking to construct a rationale for making the choices that we do.

So behavioural economics, and the research that surrounds it, provides us with a means to direct the choices of others in specific directions and to favourably alter security behaviour. Having a greater understanding of these underlying mechanisms, such as cognitive bias, enables us to be better choice architects when it comes to creating the context in which people make decisions about information security awareness.

To find out more about how we apply the lessons of Behavioural Economics within our SABC™ methodology, why not get in touch? You can learn how to apply these lessons and SABC™ in your own organisation here. Of course, if you have questions or comments on this post then just add them here and I’ll be happy to answer them.