Long-time readers of our newsletter may remember that, some time ago, we wrote about Littlewood’s law, which postulates that “miracles” do happen, and that they happen more regularly than one would expect. Of course, when we talk about miracles, the natural inclination is to think of them as good events: miracle cures of illnesses, miraculous escapes from danger, or your mother-in-law’s car breaking down on the way to your house. You understand.
However, what we described as “miracles” should really have been “unlikely events”, both good and bad. This tells us that when bad events occur, they too are often the result of unlikely coincidences – or even pure randomness. It is, for example, well known that one such bad event – an airplane crash – is usually not the result of a single thing going wrong, but is generally the end result of a sequence of errors. The recent Boeing crashes will almost certainly demonstrate this again.
Malcolm Gladwell, in his book Outliers, sums this up best: “The typical [airplane] accident involves seven consecutive human errors. One of the pilots does something wrong that by itself is not a problem. Then one of them makes another error on top of that, which combined with the first error still does not amount to catastrophe. But then they make a third error on top of that, and then another and another and another and another, and it is the combination of all those errors that leads to disaster. These seven errors, furthermore, are rarely problems of knowledge or flying skill. It’s not that the pilot has to negotiate some critical technical maneuver and fails. The kinds of errors that cause plane crashes are invariably errors of teamwork and communication. One pilot knows something important and somehow doesn’t tell the other pilot. One pilot does something wrong, and the other pilot doesn’t catch the error. A tricky situation needs to be resolved through a complex series of steps — and somehow the pilots fail to coordinate and miss one of them.”
In investment markets, bad events such as a debt default generally do not happen out of pure randomness alone; more likely, a sequence of errors occurred. In the South African context, we can examine the Steinhoff failure. While fraudulent behaviour was clearly at play, it should have been picked up earlier; a sequence of human errors allowed it to continue for far longer than it should have.
But before we digress too far, let us return to investments, and investment processes in particular. What we are dealing with here is a sequence of human actions and interactions, so the inference from Gladwell’s writing is clear: despite the best efforts, an investment process can also “crash”, and one should have, as with a debt default, a recovery process. A reboot button, perhaps? Team dynamics become critically important at this point. In the same way that we believe it is good to have whistleblowers in broader society, we also want to have “whistleblowers” in the investment process: someone who is prepared to spot the mistake and call it out – the devil’s advocate, so to speak.

So, a great investment process is not just about delineating a sequence of steps, but also about team dynamics, communication and roles. At Ashburton Investments, we do not believe that simply delineating a sequence of steps (a process) is itself a guarantee of investment success. It is the way that we check each other within the process that goes a long way towards achieving that. Holding each other accountable in a mature and intellectually rigorous manner is key to elevating ourselves to a higher plane of investment stewardship.