CHAPTER 22
RULE NUMBER FIVE

Ever Stuck

In March 2021, a 200,000‐ton container ship called the Ever Given made news headlines around the world when it got stuck in the Suez Canal. A journey that should have taken around 13 hours ended up taking over six days, during which time the Ever Given blocked the main shipping route from the Atlantic to the Indian Ocean. In doing so, it caused a tailback of over 200 vessels and disrupted global supply chains. You might not remember the name of the ship, but you'll probably remember seeing pictures of an enormous cargo boat run aground on a sand bank, being dug out by what looked like a toy digger.

Like mine, your initial reaction on hearing the story or seeing the image of the ship stuck was probably to blame the captain for what seems like a basic navigation error. To misquote Brian Cullinan's famous words, “It doesn't sound very complicated, but you have to make sure you're steering the ship straight down the canal” (see the Introduction). But then I discovered that ships travelling through the Suez Canal are required by law to have a local pilot on board to help guide them through it. All of a sudden, I drew an entirely different conclusion. Surely, I now reasoned, it must be the pilot's fault. After all, preventing that kind of incident is the sole reason they're there.

Until, that is, I learned that this wasn't the first time the Ever Given had been involved in an accident. Just over a year before the Ever Given got stuck in the Suez Canal, it was leaving the German port of Hamburg when it moved too close to a pontoon and hit a local passenger ferry. Fortunately, no one was killed or injured. Reading about this and looking at the pictures, I began to see the Suez incident as part of a pattern, rather than a one‐off freak accident.

Then, as I was writing this book in early 2022, news broke that the Ever Forward – like the Ever Given, a ship owned by the Evergreen Marine Corporation – had strayed outside normal shipping lanes and run aground in shallow water in Chesapeake Bay, Maryland. It took an entire month before the Ever Forward could once again live up to its name.

Why, I began to wonder, was a company operating ships that seemed to be so difficult to steer? Then it occurred to me that someone had to design and build the ships in the first place; what on earth were they thinking? Perhaps, I reasoned, the fault lay not with the pilots or the captains after all, but with the design of the ships. So I did some more investigating. It turns out that these aren't the only recent accidents involving cargo ships owned by that company. In 2017, for example, the Ever Smart lost a large number of containers mid‐ocean.

But then the Evergreen Marine Corporation is by no means the only shipping company to suffer incidents of this kind. They're far more common than we might think. It's just that the Ever Given provided a “made for news” visual spectacle, which means that any incident involving an Evergreen Marine ship now automatically becomes newsworthy.

The Bigger Picture

On the one hand, my (pun intended) voyage of discovery about the Ever Given is a perfect illustration of WYSIATI in action (see Chapter 8). But it also illustrates the power of looking at a series of data points, rather than individual ones.

Every time I discovered new information about the story, my perspective on who was to blame for the incident changed. Of course, I'm no maritime expert, so it is perfectly possible I'm joining dots that I shouldn't. But even if the insights I discovered aren't as relevant as I've suggested, knowing that the incidents in Chesapeake Bay and Hamburg happened must, on some level, also be useful when we're analysing the Suez incident.

Just as I had initially held the captain responsible for the Suez incident, our inclination, when we hear that someone has broken a rule, is often to blame them. This makes perfect sense. After all, it is their actions, or inactions, that have caused the rule to be broken. And it is important to hold people to account for their decisions.

Obviously, we need to investigate individual incidents as and when they occur, particularly when they're on the scale of the Suez one. But if we want to build a more robust compliance framework, it's important to also look at where similar events are occurring to see what lessons we can learn.

The Power of the Collective

In that spirit, Rule Number Five encourages us to look at collective, rather than individual, behaviours. In other words, to look at situations where lots of people aren't doing what we want them to. It reads as follows:

If one person breaks a rule, you've got a people problem. If lots of people break a rule, you've got a rule problem.

The basis for the rule is that if the majority of people are able to comply with a rule and one person isn't, then that suggests there's an issue with the individual. But if lots of people aren't complying with a rule, then it's unlikely that they've all set out to deliberately break it.

After all, you're unlikely to have lots of people deliberately setting out to break a rule. There must be some explanation as to why it's happening. That needn't mean the rule itself is the problem, though it might be. Perhaps the reason for the noncompliance is that the rule looks good in theory but is incredibly hard to comply with in practice. It might be that the training on the rule is inadequate, or that the systems people need to use to comply with it are confusing. Whatever the reason, the fact that lots of people aren't complying is a clue that something needs to be looked at.

The other reason for looking at collective behaviour is that if the “rule” – by which I mean the rule itself, or the training, or communications about the rule, etc. – has a problem that isn't fixed, then you're just asking for the same problem to reappear. As the Dutch psychologist Alexander den Heijer puts it: “When a flower doesn't bloom, you fix the environment in which it grows, not the flower.” The risk in not fixing the problem, and concluding it must all be the fault of the individuals concerned, is that you could get rid of all the people, hire new ones, and find the same problem occurring again.

The Wisdom of the Crowd

Another benefit of focussing on collective behaviours is that we can deploy resources more appropriately. If one person is breaking a rule and everyone else isn't, then we can respond accordingly. But if the problem is more widespread, it is worth deploying resources to do a detailed investigation, because we can see there is something in or around the rule that must, on some level, be contributing to the compliance problem. In other words – whisper it! – it's not entirely the fault of those who aren't complying.

To look at this another way, we're using the same logic we use when we consult Tripadvisor, and channelling the “wisdom” of the crowd. Of course, when lots of people break a rule, it doesn't feel as if they are displaying wisdom, but on some level there will be a reason for what they are doing. If we can work that out, we can stop or deter others from doing the same thing. Because we're using collective breaches to inform us, we've got more data and we're more likely to be able to identify the reasons why they are occurring. In simple terms, that also means we've got more people we can talk to about what's happening.

Ergodicity

One of the reasons we want to look at collective behaviour is that averages can be misleading. To help illustrate this, let me introduce a concept called ergodicity.

Imagine we ask 100 people to each roll a dice once and record their result. Then we ask one person to roll a dice 100 times and record the results. In both cases, we'd expect the same average result. There's no reason why 100 people rolling one dice each should produce a different result to one person rolling a dice 100 times.

Now imagine we play a game of Russian roulette. For those unfamiliar with it, it's a “game” that involves loading a single round into a revolver. The person playing spins the cylinder, points the revolver at their head and pulls the trigger. There's a one in six chance of being killed or seriously injured. While we might be able to persuade someone to play the game once, we're unlikely to be able to persuade them to play it lots of times. Unlike the dice game, there's a huge difference in outcomes between one person playing Russian roulette six times and six people playing Russian roulette once.

In an ergodic world, the expected outcome of an activity performed once each by a group of people is the same as the expected outcome for one individual performing that activity repeatedly. The dice game is ergodic; Russian roulette is non‐ergodic.
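For readers who like to see the arithmetic, the following is a minimal simulation sketch in Python. It's purely illustrative and not part of the original example: the dice averages converge whichever way you slice them, while the risk faced by one person pulling the trigger six times (roughly two in three) is very different from the risk faced by each of six people pulling it once (one in six).

```python
import random

random.seed(1)

# Dice: the group average (100 people, one roll each) and the
# individual average (one person, 100 rolls) land in the same place.
group_rolls = [random.randint(1, 6) for _ in range(100)]
solo_rolls = [random.randint(1, 6) for _ in range(100)]
print(sum(group_rolls) / 100, sum(solo_rolls) / 100)  # both close to 3.5

# Russian roulette: each trigger pull has a 1-in-6 chance of disaster.
def hurt_after(pulls: int) -> bool:
    """True if at least one pull in the sequence hits the loaded chamber."""
    return any(random.randint(1, 6) == 1 for _ in range(pulls))

trials = 100_000

# Six people play once each: roughly 1 in 6 of them comes to harm.
share_of_group_hurt = sum(hurt_after(1) for _ in range(trials)) / trials

# One person plays six times: their chance of harm is about
# 1 - (5/6)**6, i.e. roughly two-thirds, not one-sixth.
repeat_player_hurt = sum(hurt_after(6) for _ in range(trials)) / trials

print(share_of_group_hurt)  # ≈ 0.17
print(repeat_player_hurt)   # ≈ 0.67
```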

Removing “Air Cover”

Eliminating the impediments to compliance that we learn from looking at collective behaviour has one further benefit: it removes what I call “air cover” from people who are wilfully or negligently noncompliant. If there are lots of people who are not complying with a particular rule, for reasons we could prevent, then they are acting as air cover for those who are doing it for different reasons.

Imagine lots of people aren't complying with a rule and we discover that this is likely to be down to a system that is hard to use. If we don't fix that system, then we provide air cover to those who are failing to comply because they can't be bothered: not just by giving them an excuse, but also because, when we review compliance risks, they will be “lumped in” with the others. Solving the problem will ensure we isolate their behaviour. To put it another way, if we make it easier for people to be compliant, then we'll know that those who aren't are more likely to be doing it wilfully, and we can react accordingly.

It's the System, Stupid

Now I'm going to share a case study that illustrates how this Rule works in practice. A financial services client of mine had a compliance problem with a rule that required employees who wanted to trade listed shares to obtain permission before doing so. Rules like this are standard within the industry, and the concept behind them is widely understood.

Since people in the industry can, in the course of their work, be in possession of price‐sensitive non‐public information, the firm can prevent staff from trading if there's a risk of an actual or perceived conflict of interest. It's not a rule that people would be unfamiliar with, or that is in any way complex. Yet large numbers of their employees were simply failing to comply with it.

The solution the client had proposed was entirely logical. Any employees who had breached the policy would be disciplined, and required to do further training on the relevant rules. A firm‐wide email would also be sent, reminding everyone of the policy.

In the spirit of this rule, I suggested that we look in a little more detail to see whether there were any obvious patterns to the breaches. It turned out that in the majority of cases the story was identical. The shares those employees had been trading were the company's own. For entirely legitimate reasons, staff were selling shares awarded to them as deferred bonuses on, or just after, the day the shares vested; in other words, at the earliest possible moment.

Under the firm's rules, this required pre‐approval. Yet the system they needed to use to sell the shares – which was built internally – didn't warn anyone trying to sell shares that they would need permission. Nor was there any control to ensure they actually had permission. Suddenly, it was possible to see why some people might reasonably have concluded that because the system allowed it, they didn't need separate pre‐approval.

In theory, the staff should have known they needed pre‐approval to trade. But, in practice, it is easy to see how they might not have thought about it. Not least because the timing of the share sale wasn't their choice: they didn't want shares they'd been forced to hold, they wanted cash, so they were keen to sell as soon as possible. The fact that lots of people broke the rule suggested the behaviour was driven more by confusion than by malicious intent. It's also worth remembering that if someone were going to engage in insider dealing, they'd be unlikely to do it on a system owned by their employer; they'd go “off the grid”.

All of which led me to suggest that they focus on changing the system. By all means, discipline people who should have known the rules, but why not prevent others from making the same mistake? So that's what they did. A warning was added to the system and, as a result, the number of breaches of that rule declined substantially. There was a further benefit: management time was no longer wasted on alarming statistics, and if breaches of that policy did occur, they knew they'd removed one obvious excuse.
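To make the shape of the fix concrete, here is a hypothetical sketch of the missing control. The client's internal system isn't described anywhere in code, so every name, field and detail below is invented purely for illustration: before accepting a sale of company shares, the system checks whether a pre‐approval has been recorded and, if not, warns the employee rather than silently letting the trade through.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical illustration only: the names, fields and the 30-day
# validity window are invented, not taken from the client's system.

@dataclass
class SellOrder:
    employee_id: str
    ticker: str
    quantity: int

def has_preapproval(order: SellOrder, approvals: dict) -> bool:
    """True if compliance has recorded a still-valid pre-approval
    for this employee and this stock."""
    granted = approvals.get((order.employee_id, order.ticker))
    return granted is not None and (date.today() - granted).days <= 30

def submit_sell_order(order: SellOrder, approvals: dict) -> str:
    # This check is the control that was missing: the original system
    # simply executed the sale. The client's fix added a warning here;
    # a stricter variant would block the trade until approval exists.
    if not has_preapproval(order, approvals):
        return ("Warning: personal-account dealing rules require "
                "pre-approval before you sell these shares.")
    return f"Order accepted: sell {order.quantity} x {order.ticker}"

# Example: vested bonus shares, but no approval recorded yet.
print(submit_sell_order(SellOrder("e123", "ABC", 500), approvals={}))
```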
