By Andy Cuerel
“Stock prices have reached what looks like a permanently high plateau”
Economist Irving Fisher in October 1929, three days before the stock market crash that triggered the Great Depression
“Y2K is a crisis without precedent in human history”
Byte magazine editor Edmund DeJesus, 1998
“We can close the book on infectious diseases”
Surgeon General of the United States William H. Stewart, speaking to the U.S. Congress in 1969
With the recent publication of the 2013 National Risk Register stating that Pandemic Influenza is still the most significant civil emergency risk in current times, I thought I would look at some of the issues associated with predicting the next big thing, and the balancing act between ensuring people are prepared but do not become desensitised to important warning signs.
Much to the chagrin of their providers, incorrect predictions can linger long after their impact has ceased to be relevant, occasionally becoming a person’s defining moment (“Read my lips… no new taxes” – George Bush Senior, accepting the Republican presidential nomination in 1988, only to raise taxes two years later).
Putting aside the potential for personal humiliation, there are often common themes in these utterances – more specifically a well-intentioned denial of risk or, in the case of the Y2K prediction, an overstated one.
In the pursuit of my own legacy, I will add the following:
“The likelihood of an avian flu virus mutating sufficiently to cause a global human pandemic in the foreseeable future is negligible”
This statement may be sound, in much the same way that the statistical likelihood of the Titanic sinking on its maiden voyage wasn’t a great cause for concern. Unfortunately, it also smacks of the curse of many low-frequency, high-impact events in that it breeds complacency – “It’ll never happen to us… or will it?!”
Unfortunately, avian-borne influenza has some other facets to add to its unpredictability, at least from the risk and continuity manager’s point of view.
New sub-types of Influenza Type A are periodically identified, the most recent of which have been H7N9 and, before that, H5N1 in 2007-9. The newsworthy factor in these developments is always, sadly, reports of fatality and serious illness – often amongst Asian poultry farmers and others with direct contact to sick/dead birds. Confirmation of human-to-human transmission is often unclear and prone to media hysteria – “it’s coming for YOU!!”
This is against a backdrop of many experts who assert that global influenza epidemics are recurrent over decades and centuries, and that we are well overdue a ‘biggie’. Oh, and an extra little twist in the form of evolving sub-type strains, which renders vaccines quickly out of date, hence making control measure effectiveness and the impacts arising very difficult to predict.
The combination of these factors makes the timing and details of planning very difficult to gauge, particularly when there are significant human factors that you wouldn’t have in, for example, IT disaster recovery. The need for organisations to be seen to be doing something for their employees is a powerful driver, but deciding how much to invest against a risk that seems to rear up and then subside time and again with limited global impact is a challenge.
One solution to being prepared for all risks – even those that may turn out not to be so risky – is to adopt a best-practice principle of continuity planning and avoid making it scenario-specific. There are a number of related risks that fall into the same category as illness epidemics, at least in terms of business impact. Some examples are:
- Loss of a key goods/service supplier
- Medium-term denial of access to premises, e.g. resulting from a neighbour’s misfortune – a fire or explosion leading to a cordon zone, etc.
- Industrial action
No one is suggesting that preparatory actions for an illness epidemic mirror those in the list above exactly – an organisation whose employees travel regularly to rural Vietnam, for instance, will clearly need bespoke risk controls as an ongoing activity. Nonetheless, there is considerable cost-benefit justification in adopting common mitigations for themed risks, particularly for über-rare events with unpredictable consequences. In the words of Admiral William D. Leahy, airing his views on the US Atomic Bomb Project to President Harry Truman in 1945, “The bomb will never go off. I speak as an expert in explosives”.