I recently read an article on why we fail to prepare for disasters.
Obviously this relates to the lack of preparation for the viral pandemic we are currently experiencing. But it is also familiar territory: it is exactly what health and safety advisers face when trying to explain theories of risk assessment.
In the article, psychologists described inaction in the face of danger as 'normalcy bias': a failure to take appropriate action, and a tendency to carry on with normal patterns of behaviour that are not suited to the emergency at hand.
People struggle both to recognise what is happening (so-called 'herd thinking' plays a part here) and to understand how to be of practical use when they are uncertain of the risks. Beyond COVID-19 there are numerous examples of this failure: the New Orleans floods, earlier epidemics such as SARS and MERS, and health and safety disasters such as the Piper Alpha oil platform explosion and the Three Mile Island nuclear accident.
Very often in the H&S field one sees the Health and Safety Executive and other regulators or commentators being wise after the event, saying it was easy to see the danger.
It’s certainly easy to go in after a serious incident and be smart after the event – so easy that there’s a name for it (‘hindsight bias’) – but when running a business it’s not always easy to know which ‘low likelihood/high hazard’ events are actually going to come about in any given year.
Costs and Efficiency
High consequence, low likelihood events can be easily and airily dismissed – the costs of solving the ‘problem’, compared with other investments that bring a predictable return, will often seem far too high to even contemplate.
And alongside the question of money there is a natural bias: people discount future events and are overly optimistic, underestimating the likelihood that adverse events will happen to them.
For example, in the case of Hurricane Sandy in America, the coastal residents there were well aware of the risks of the storm. They expected even more damage than the weather forecasters did, but they were relaxed and confident that it would be other people who suffered.
It’s also well known that younger people discount future effects – this is why you see them refusing to wear personal protective equipment against hazards whose effects would only show, say, after a decade or two of exposure.
One of the drivers of our capitalistic system is the continual drive towards efficiency and wringing out every last advantage from resources by keeping a close match between demand and supply. Dedicating money to an event that would be seen as unlikely or extremely rare is therefore seen as an inefficient use of resources.
For example, in the NHS, having intensive care beds 80% full rather than 100% full is seen in normal times as inefficient. This leads to long bureaucratic negotiations over the use of such beds for ‘elective surgery’ patients versus the need to keep space for emergencies that might arise. But when a pandemic strikes, that extra capacity is suddenly extremely valuable, acting as a buffer against the immediate cancellation of other patients’ treatment.
Is Risk Assessment the answer?
It’s important from the outset to recognise that risk assessment for general business purposes is not a scientific technique.
There are so many competing items and values that it can only at best be the honest judgement of people trying to rank one type of risk against another to aid in sensible decision-making.
One of the potential disadvantages of the standard grid-style ‘risk assessment matrix’ – hazard scores on one axis, likelihood on the other – is that, if the two axes are weighted evenly, high-consequence events with extremely low likelihood tend to be under-scored relative to the general sense of ‘dread’ they evoke and the public’s expectation that we guard against them.
And so you sometimes see these scores deliberately weighted in an attempt to steer action towards serious events, even where the likelihood of them happening looks low.
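As a minimal sketch of this weighting idea – the scales and weights below are hypothetical illustrations, not taken from any standard or regulator’s guidance – compare a conventional severity × likelihood matrix score with a variant that deliberately boosts high-consequence events:

```python
# Hypothetical 5x5 risk matrix scoring: plain vs severity-weighted.
# All numbers are illustrative choices, not from any published standard.

def plain_score(severity: int, likelihood: int) -> int:
    """Conventional matrix: score = severity x likelihood (each 1-5)."""
    return severity * likelihood

def weighted_score(severity: int, likelihood: int) -> int:
    """Squaring severity deliberately boosts high-consequence events,
    so a catastrophic-but-rare hazard is harder to quietly dismiss."""
    return severity ** 2 * likelihood

# A catastrophic but very unlikely event (severity 5, likelihood 1)
# versus a moderate, fairly likely one (severity 2, likelihood 3):
print(plain_score(5, 1), plain_score(2, 3))        # 5 6  - rare disaster ranks lower
print(weighted_score(5, 1), weighted_score(2, 3))  # 25 12 - weighting reverses the ranking
```

The point of the sketch is simply that the choice of scoring rule is itself a judgement call – and, as noted next, any rule can still be gamed.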
However, it’s still too easy to ‘game’ scoring systems, and those doing the assessing will naturally shy away from selecting score combinations that would demand large outlays of resources for a risk easily dismissed by those who won’t be so involved if the dreaded incident comes about.
So in my opinion fixing the scores isn’t enough – after all, managers also have to understand what they’re choosing – and more subtle methods are needed.
Role of H&S Advisers
In emergency situations it can be a big challenge for business managers to get the health and safety balance right.
When lack of preparation combines with the immediacy of having to make the right judgement calls during an emergency, wrong decisions come easily: people are often slow to recognise the danger and then confused about how to respond.
Health and safety advisers need to offer opinions and make recommendations as clearly as possible, steering in a way that is proportionate and not confusing, without ‘overstepping’ into areas with which they are unfamiliar – COVID-19 advice might for some be one such area.
And the other aspect is to provide clear advice and concrete plans – practical steps that contribute to a solution when this kind of rare event occurs.
One of the reasons that I’ve called this blog ‘Look ahead before it’s too late’ is because it might be that the only way to ‘get your head around this’ is by projecting one’s actions into the future and asking: ‘How stupid am I going to look if this happens and I did nothing about it?’ That step alone can provide the driving motivation to take action.
Here are some conclusions that come to mind – no doubt for your organisation there are others that you can think of:
- We need leaders to imagine themselves being in charge when the rare event hits. They need to put themselves there, ahead of time, and ask how they would look if it came about. If they were inactive during the good times, when warnings were being issued of the need to prepare for foreseeable (albeit unlikely) disasters, they are likely to be judged badly.
- When helping to estimate risk levels for high-consequence/low-likelihood events, a consultant should counter over-optimism and the discounting of future risks, and look out for people trying to game risk assessment and risk control efforts (e.g. items mysteriously dropping off H&S committee action plans or agendas).
- By including the workforce in consultation about risks, the opportunity is there to collect ideas on mitigating the effects of these rare events whilst shoring up more concrete plans in the background.
- Managers and Directors need to make overt, explicit, sensible decisions and avoid resorting to the common political trick of ‘kicking things into the long grass’, e.g. by deliberately under-resourcing the efforts that are needed.
- Organisations need to invest in their own future security and resilience. Instead of ignoring ‘Low likelihood/High Hazard’ events altogether and hoping they won’t happen, carry out regular (recorded) reviews to see whether the likelihood has gone above the original assessment, and put some of the annual budget to one side to make an investment to solve this in the future.
- And a final aspect is to be wary of fatigue building up. There’s a natural tendency to get bored with oft-repeated subjects that never seem relevant in the ‘here and now’, so a firm plan is needed, with a constant eye kept on building future resilience.