Another day, another airline algorithm atrocity – this time not a doctor being hauled down the aisle by United, but a ten-year-old child being barred from travelling with his family by Air Canada.
For all the talk of PR disasters and customer service woes, one piece of dialogue I'm not hearing clearly in our automated, algorithm-obsessed world is:
“YOU REALISE THAT THIS IS WHAT WILL HAPPEN WHEN WE ALLOW ALGORITHMS TO RULE THE WORLD, DON’T YOU?!?!?!?!?”
Sorry, I’ll calm down.
The thing is that these kinds of events are the inevitable consequence of a world run by algorithms. A challenge of chickens ruling over pigs.
If you’re not already familiar with the chicken/pig analogy, very popular in certain agile circles, it goes a little something like this…
Think about a cooked breakfast (if you’re vegetarian or vegan you’re going to need to run with this one for a while). Eggs and bacon. In a cooked breakfast a chicken has involvement. A pig is truly committed.
And so it is with (ironically given the way the analogy is used in software development) the difference between developers and someone on the front line of customer service. A developer creating algorithms can never be committed to the outcomes in the way that someone on the front line could be. It would be an act of idiocy to refuse a ten-year-old child passage with their family if you or I were confronted with the decision of who to eject from the plane. But the algorithms couldn’t cope because the human sensing required isn’t there.
A few weeks ago I provoked a bit of a debate with an article about the conundrum of an absence of self-driving trains. I asked a straight question on Twitter – what's holding back the ubiquity of self-driving trains? – and the overwhelming answer from the masses was "The Unions". How well we are trained into knee-jerk responses by the mass media.
The actual answers are many and varied, but were summed up by my great friend (and proper expert in the field of public transportation) Dan in one word: "Money". Put simply, it's cheaper for the time being to stick a person in the front of a train, self-driving or not, and make them responsible for the ultimate success of the vehicle than to develop complex technological systems. Again, a pig rather than a chicken acting as an integration layer in a complex web of safety-critical systems.
“But airlines are old industry!” I hear the Valley Acolytes shout. “They can’t possibly have algorithms as good as those that, say, Elon Musk has got!” But Yield Management technologies have been around for over thirty years, and if they are still throwing up garbage like we’ve seen in the past few weeks then there is something important to be learned.
Maybe we should be thinking harder about how we maintain human supervision over such technology. Algorithms are often seen as pure, objective devices that produce perfect answers unsullied by the subjectivity of human interaction. But as Cathy O’Neil argues in her book Weapons of Math Destruction, many are just as prone to subjective distortion, merely many steps removed. And much more importantly, the heuristic way in which humans make sense of the world through cognitive biases should be seen for what it is – the reason we have been so successful as a species, not a series of bugs to be ironed out.
Simple intuition would be enough to tell most people that the Air Canada incident was the wrong thing to do. Put algorithms into the equation, though, and all of a sudden sane, intelligent people become subservient to dumb machines.