Caltrop Consulting
Independent Security Counsel

Ambiguous Threats and Plan Continuation Bias.
​
It’s a clue, but what does it mean?
​
The term ‘ambiguous threats’ was coined in a 2006 Harvard Business Review article. In it, the authors discuss the Columbia Space Shuttle disaster of three years earlier, and in particular the signals seen well before the disaster - not obvious, direct threats, but more ambiguous or ‘fuzzy’ ones. A direct threat, and its response, may be clear - living on a flood plain, in an earthquake zone or in a highly active conflict zone, for example. A lot of time, effort and money is usually spent on mitigating or planning a response to these risks.
​
An ambiguous threat is a slight signal which may or may not matter at all in the context of a wider risk, but which may also be seen, with hindsight, as an obvious precursor of disaster. You may see chunks of foam strike your spacecraft on launch on multiple occasions with only minor damage, only to be faced with a catastrophe that costs lives and millions of dollars; or you may see a disease affect a country on the other side of the world for the fourth time in two decades and watch, almost transfixed, as it suddenly and unprecedentedly spreads through your own population, causing large-scale loss of life and subsequent financial instability.
​
The response to these signals may be less clear than the response to the general background risk, and committing time, effort and money may seem inappropriate, especially if the same signal has appeared before with seemingly no impact. Ambiguous threats of this sort are usually downplayed or met with inappropriate strategies such as ‘wait and see’, which can, as we have seen, end in disaster. The catastrophe is then further exacerbated because, with 20/20 hindsight, it is very easy to appear negligent for not having taken the threat seriously.
​
A recent example was the killing of the leader of the Algerian Islamic militant group Al-Qaeda in the Islamic Maghreb [AQIM]. In isolation, this could be seen as an event that diminishes the threat to Western companies operating in Algeria, which are expressly targeted by the group. Shortly afterwards, however, rumours appeared in some press articles that Algeria had indirectly assisted the French in their operation; these were bolstered by other small signals that AQIM had been planning a revenge attack on Western interests anyway, and that the revenge motive only increased the probability. What then to do with these ambiguous signals - signals that would be seen as a clear indicator if something did happen, but as mere background noise if nothing did? This is the crux of the ambiguous threat problem.
​
It may then be better to have a strategy or framework to work with, rather than leave these signals to immediate, ‘on the spot’ judgement. These immediate or embedded judgements are usually poor: they are commonly binary [do nothing / panic], necessarily reactive because they are unplanned for, and, more importantly, they involve a cognitive bias in the perception of risk - a reluctance to budge from a solidified notion of the future - the Plan Continuation Bias.
​
Plan continuation bias is well noted in aviation safety, where it is sometimes roughly called ‘get-there-itis’; it is the bias of continuing with a plan despite subtle signals that the situation has changed, even dangerously so. It appears stronger the closer you are to ‘arrival’, and generally involves a failure to ‘step back’, re-assess the situation properly from a clean sheet, and revise plans and actions accordingly. The grounding of the oil tanker SS Torrey Canyon in 1967 is a good example: small systemic errors provided signals, and the dangers grew more obvious by the minute, yet they were ignored as Captain Pastrengo Rugiati, under pressure to deliver, failed to accept or properly interpret the realities and ploughed on into them.
​
What then to do with these ambiguous threats and signals? Asking the question is at least a step in the right direction - it will lead to a reaction that is better than ignoring or downplaying the signal by instinct. If we treat it as a window of opportunity between a sign and a catastrophe, rather than a nuisance addition to an already well-known threat, we can begin to think about it in a better way. A strategy to ‘clean sheet’ and re-assess situations as new information comes in, and to amplify those signals, may spur us on to better assessment. Having someone ‘red team’ the situation - coming in with fresh eyes to see a less biased and more complete picture outside the conventional process - may be of benefit. We can also look for measures that are neither complex, onerous, costly nor cumbersome, but which mitigate to a great degree either the disaster itself or the later repercussions of negligence claims. Workshopping the teams involved in a more stressed and realistic way is a further tactic, one that benefits any risk mitigation effort but more so in these circumstances. Any signal that widens the funnel of expected future timelines should prompt a re-assessment of stance.
​
In the case of the Algerian terrorist leader, we first looked at the signals in great detail, following the track of the information and the people involved in it to spot bias and ambiguity of meaning. We grounded that information with a re-assessment of the stance of the security forces across multiple projects. We contextualised the death in terms of operative capabilities and expectations for complex attacks, and saw that although that threat was diminished, the threat of a smaller-scale, less sophisticated, ‘lone wolf’ type incident could be increased. We then advised slightly shifting the client’s stance. Their physical security was already set up to face the general threat, so this shift involved several small ‘nudges’: a refresh of the airports section of our Hostile Environment Coaching package; communicating the situation to people in a balanced way so they treated it seriously but without panic; small changes in guard procedures and small inputs into their training to raise cultural awareness for the short term; and a revision of plans for relocation or draw-down [we were in the midst of Coronavirus, which had its own impacts on that…] so they were current and practised. In general, a ‘pre-mortem’ approach was taken rather than the more usual ‘it’ll be fine’ one.
​
-
​
During this process we noted that other operators were quite complacent about the information and did nothing with it except note its existence. Our client spent no additional cash, but was better prepared both for the reality of an attack and against claims that they had been negligent with the information they held. We cannot, in all honesty, say the same for others.
-
​
Facing Ambiguous Threats by Michael Roberto, Richard M.J. Bohmer and Amy C. Edmondson: https://hbr.org/2006/11/facing-ambiguous-threats
​
Plan Continuation Bias in Aviation by Tyler Briton: http://aviationsafetyblog.asms-pro.com/blog/the-internal-aviation-sms-threat-plan-continuation-bias
​
Danger, rocks ahead! - An excellent podcast by Tim Harford on the underlying human errors of the SS Torrey Canyon disaster: https://timharford.com/2019/11/cautionary-tales-ep-1-danger-rocks-ahead/