Accounting ethics has long demanded that accountants not be influenced by bias. But bias isn’t just about racism, sexism, and all those other -isms. Recent revisions to section 120 of the SAICA Code of Professional Conduct make it clear that we need to avoid cognitive biases too – a much bigger challenge, because we are all guilty.
In his brilliant book Thinking, Fast and Slow, the Nobel Prize-winning psychologist Daniel Kahneman reveals that we have two systems of thinking: one fast (used to make quick judgements and decisions) and one slow (for when we require more mental processing).
The first of these systems is often called intuition. Intuition is of course extremely helpful – many of our decisions need to be made quickly and without a great deal of cognitive effort: if we had to think seriously about how to get out of bed, the best route to take to the bathroom, the correct order of our morning ablutions, and so on, we would never get anything done! And in some cases intuition is not just more efficient than the slower thinking system: it is absolutely vital, such as when you spontaneously swerve your car to avoid a pedestrian.
Nonetheless, intuition makes mistakes, such as initially misjudging a person based on their appearance. Consider the following exercise, the original version of which was developed by the psychologist Shane Frederick:
Determine, as quickly as you can, the answer to the following question: A child’s cricket bat and ball together cost R1 100. The bat costs R1 000 more than the ball. How much does the ball cost?
Many people, when they come across a question like this in an article like this, anticipate that it will be tricky, and they therefore slow down and force themselves to ignore their intuition. If you did this, perhaps you came up with the correct answer, which is R50. If instead you just went with the response provided by your intuition, you almost certainly would have incorrectly answered ‘R100’. (If you don’t see why that is wrong, add R100 for the ball to the R1 100 the bat would then have to cost: the total comes to R1 200, not R1 100.) Even those who successfully resisted intuition would likely admit that had they not been actively looking for a catch, they would have been fooled.
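For readers who want to see the correct answer worked out, a quick bit of algebra settles it. Let the ball cost x, so that the bat costs x + R1 000. Then:

x + (x + 1 000) = 1 100
2x = 100
x = 50

So the ball costs R50 and the bat R1 050: the two prices differ by exactly R1 000 and together total R1 100, as the question requires.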
Why is this? Well, it seems that our slower, logical thinking system kicks into action only when the intuitive thinking system tells it to – when the quicker system realises that it cannot handle the matter on its own, as when you are asked to multiply 27 by 33. If the intuitive system does not identify a need to call in backup, we will just go with our snap judgement. This is what happens when the answer ‘R100’ pops into our heads. Our intuition thinks ‘this looks like a problem I can handle – leave it to me’, yet it has mistaken a challenging question for a simple one. Because it has missed the complexity, it blurts out the wrong answer.
Cognitive biases are the patterns by which the intuitive system routinely makes this sort of error. Psychologists have discovered and named hundreds of them – just take a look at Wikipedia’s entry for ‘List of cognitive biases’. (It might be quite fun to read through them and for each one ask yourself, ‘Do I know anyone who does that?’. And then, after you’ve come across bias blind spot, perhaps you’ll be better placed to ask the more revealing question, ‘Do I do that?’.)
How is this relevant to ethics? Well, being ethical is not just about doing the right thing. First and foremost it’s about figuring out what the right thing is; only once we’ve applied our minds can we actually act on our judgement. And this is often far from easy. Accountants and other businesspeople regularly confront difficult ethical dilemmas: how to weigh the opposing interests of different stakeholders, how to balance competing ethical values, how to find a moral course of action that is also aligned with our self-interest. Yet there is a real danger that our overactive intuitive thinking system will fall prey to cognitive biases, bypassing the deep reflection these dilemmas demand and instead serving up a simplistic and ethically unsound result.
To know that cognitive biases exist, to recognise their enormous influence, to be able to identify at least some of them, and to understand the basic mechanism by which they arise: these are key weapons in combating them. Such knowledge helps us develop specific actions for particular contexts. For example, Kahneman suggests that before an important decision is discussed, business executives should each write down their initial views and start the meeting by reading out what they have written. This reduces the tendency of anchoring bias – the first cognitive bias listed in the revised SAICA Code – to take effect and grant disproportionate influence to those who speak first.
There is also a general approach that will strengthen our defence against cognitive biases and their onslaught on our reasoning processes. When a decision is important – as all ethical decisions are – we must remember that forming a conclusion is not the responsibility of our intuition. We should not cut intuition off entirely, because its contributions may have value, but they must ultimately be rigorously tested. Perhaps one idea has intuitive appeal and others are counterintuitive, but this is just a starting point: no proposed solution to an ethical challenge should be accepted or rejected until it has been subjected to a thorough evaluation using the methods of our slower, deliberate, logical thinking system.
In other words, when a situation contains an ethical component, we ought to respond as suspicious readers did to the question about the cricket bat and ball: be on our guard, anticipate tricks, and retain a healthy scepticism about our ability to know the ethical course of action instantly.
Author
Jimmy Winfield, Associate Professor in the College of Accounting at the University of Cape Town (UCT).