The WanaCrypt0r ransomware worm weaponises 'EternalBlue', a US National Security Agency exploit that formed part of a cache of NSA digital weapons leaked last year. It exploits a vulnerability in Microsoft's implementation of the SMB file-sharing protocol, which Microsoft patched for supported systems in March 2017.
Many NHS trusts still run Windows XP, for which Microsoft no longer provides free support. Reports suggest that, despite warnings, the government cut a £5.5m Microsoft support programme for XP, leaving NHS XP systems vulnerable. According to security researcher Keren Elazari, unsupported software systems are most prevalent in "healthcare, energy and transport; as well as finance".
The NSA has long been criticised for keeping system vulnerabilities from the likes of Microsoft and developing them into cyber weapons. No doubt, then, the UK's Government Communications Headquarters (GCHQ) participates in this practice, given its vast powers of hacking, cyber espionage and cyber warfare.
All organisations leak. Cyber espionage and warfare tools can be found by anyone sufficiently interested, from calculating criminal collectives to bored bedsit boys, including enemies and the disaffected. There can be no doubt that a tiny but skilled group could use such tools to design and execute a dramatic, catastrophic event – these are perfect weapons for the weak.
Brad Smith, the president and chief legal officer at Microsoft, warns: “Repeatedly, exploits in the hands of governments have leaked into the public domain and caused widespread damage. An equivalent scenario with conventional weapons would be the U.S. military having some of its Tomahawk missiles stolen. And this most recent attack represents a completely unintended but disconcerting link between the two most serious forms of cybersecurity threats in the world today – nation-state action and organized criminal action.”
No organisation can guarantee the security of its know-how against a determined external attacker or internal leaker. The question is whether intelligence agencies and those who control them understand and realistically evaluate the risks of their tools being turned against themselves and allies.
Leaders are notoriously bad at evaluating small risks of very serious harm. Complacency and groupthink lead insiders to believe “it” will “never happen to us”; they discount the scale of harm until it has materialised. Then they over-react, desperately blaming someone else.
A big cyber attack built on systemic vulnerabilities hoarded by agencies such as the NSA and GCHQ could leave us facing the failure of systems that deliver food, water, power, money, communications, transport or ‘the cloud’.
So the twin challenges for the civil service are to educate itself sufficiently, and to force these unfixed vulnerabilities onto ministerial agendas before something far nastier happens.
But there is a broader family of lessons from this crisis, lessons that come from beyond the well-understood fields of crisis management, business continuity and cyber policy planning.
It is easy to blame this crisis on some hapless leader who saved money by ending support for XP. But that is like attributing an air crash to ‘pilot error’, as was normal 30 years ago. Stanley Roscoe, an aviation psychologist of that era, described such conclusions as “the substitution of one mystery for another”. He thought aviation investigators could do much better. They did, and so should we.
Persistently digging to root causes typically reveals an unseen web of human weaknesses that can lie latent, incubating for years – until luck runs out and they cause a crisis. When the Public Accounts Committee reports, we expect it to find a web of behavioural and organisational risks at the root of this one: internal silos; professionals who saw it coming but were not heard; gaps in leadership skill and experience; leaders resistant to unwelcome news; decision-makers who neither understood IT nor sought explanations; inability to learn from history and from minor failures; incentives that undermine the system’s integrity; communication failures; cultural weaknesses; complacency; and complexity. All of these drive reputational risk through perceptions of organisational competence.

Derek Atkins and I explain behavioural and organisational risks and their reputational consequences in our book ‘Rethinking Reputational Risk: How to Manage the Risks that can Ruin Your Business, Your Reputation and You’. These risk areas are not dealt with by the government’s Orange Book guidance on risk management.
Here lies the challenge for leaders. The civil service is full of intelligent, well-intentioned people. But we humans are, as Dan Ariely famously put it, ‘predictably irrational’. This is both a strength and a weakness. People are every organisation’s greatest asset and its greatest risk. For sustainable success, leaders must understand behavioural and organisational risks and how to manage them effectively.