The BBC’s Horizon series had a cracking episode last week on the subject of how we make decisions. At the programme’s core was an exploration of the work of Daniel Kahneman, whose book Thinking, Fast and Slow makes a great deal more sense to me now.
Kahneman’s work identifies two sorts of thinking patterns that we use in navigating our day-to-day lives: System 1 is instinctive, sub-conscious and fast; System 2 is logical, considered and slow. Most of our lives rely on System 1, and whilst that works for us much of the time, it has a number of evolutionary traps into which we all regularly fall.
One of these is confirmation bias: we make decisions that fit within our world view, even when the logic points elsewhere. In recent weeks I’ve seen this manifest itself in an information system that a client has started to use to rank influencers in their field of expertise. Whilst this system is “objective” (or at least as objective as any algorithm can be), when confronted with a ranked list of influencers, many folks’ response has been that something is wrong because the ranking doesn’t reflect their own view. There’s been little consistency across individuals as to where the “wrong” is – which leads me to think that the ranking is probably fairly objective.
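To make that inference a bit more concrete, here’s a minimal sketch – entirely hypothetical numbers, not the client’s actual system. If each reviewer’s complaints land on different entries, the pairwise overlap between their complaints sits near what chance alone would produce, which is what you’d expect if the objections come from individual priors rather than from a shared flaw in the algorithm:

```python
import random
from itertools import combinations

# Hypothetical: 10 reviewers each flag the 5 entries (out of 50) they feel
# are "wrongly" ranked. If the ranking had a systematic flaw, complaints
# would cluster on the same entries; if they just reflect each reviewer's
# own world view, the overlap stays near chance.
random.seed(1)
ENTRIES, FLAGS, REVIEWERS = 50, 5, 10

complaints = [set(random.sample(range(ENTRIES), FLAGS)) for _ in range(REVIEWERS)]

overlaps = [len(a & b) for a, b in combinations(complaints, 2)]
mean_overlap = sum(overlaps) / len(overlaps)

# Chance expectation for two random 5-item flag sets from 50: 5 * 5/50 = 0.5
print(f"mean pairwise overlap: {mean_overlap:.2f} "
      f"(chance would give about {FLAGS * FLAGS / ENTRIES:.2f})")
```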
If we didn’t suffer confirmation bias, the response to this system would uniformly be “Oh, that’s not what I expected. How interesting!” rather than “Oh, that’s not what I expected. It’s wrong.” Confirmation bias is one of the reasons why I think Big Data will be harder to make happen than many of the technology’s proponents may think. That, though, is of course in its own right an opinion born of these biases, and I naturally filter out evidence that runs contrary to my own opinions.
Another topic covered in the Horizon programme that got me thinking was loss aversion. In particular, when we are on a losing streak, we tend towards making increasingly risky decisions (I guess this is also reflected somewhat in the sunk cost fallacy).
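By way of illustration only, here’s a minimal sketch of that dynamic – the classic “double down to get back to even” pattern. The probabilities and stakes are made up; the point is simply how quickly the stake escalates during a losing streak:

```python
import random

# A toy sketch of escalating risk on a losing streak (hypothetical numbers,
# not a model from the programme): a loss-averse player raises the stake
# after every loss, trying to get back to break-even in a single bet.
random.seed(7)
bankroll, stake = 100.0, 1.0

for round_no in range(1, 11):
    win = random.random() < 0.5          # a fair coin-flip bet
    bankroll += stake if win else -stake
    if win:
        stake = 1.0                      # back to the baseline stake
    else:
        stake *= 2                       # double down to recoup the loss
    print(f"round {round_no:2d}: {'win ' if win else 'loss'} "
          f"stake now {stake:5.1f}, bankroll {bankroll:6.1f}")
```

A few losses in a row and the stake dwarfs the original bet – exactly the kind of escalation a “logical” risk management process would be fighting against.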
At a time when organisations are increasingly attempting to make “failure” a positive learning experience, it strikes me that these kinds of psychological factors (alongside other sociological and cultural ones) make this really, really hard.
The reason? Well, because when we start to see losses, we don’t act rationally – we do more and more to try to stem them. If an innovation initiative isn’t delivering what was expected, it’s really very hard indeed for those involved to stop what they are doing. And “logical” risk management processes placed around them are going to be hard to make happen.