The final keynote at RSA Conference Abu Dhabi 2015 was given by Richard Clarke, always an interesting and challenging speaker. As I listened to his discussion of responding to cyber threats, however, I was struck by how strongly he emphasized preventative measures and how little he said about the essential role of ongoing visibility and analysis in responding to the cyber threats we face. His emphasis seemed to me to reflect the Innovator’s Dilemma that Clayton Christensen described in his 1997 book.
The Innovator’s Dilemma describes how the very innovations that establish an individual’s or organization’s success become impediments to future innovation, making it difficult to respond to new opportunities and challenges. It takes conscious and courageous action to set aside past successes and take a new path, even when one knows that this is necessary. Responding to new opportunities can require “disruptive innovation” that turns away from past successes, re-shaping not just products or technologies but the organization itself. Clinging to past successes when the world around you has changed can be disastrous.
Unfortunately, much of the message in Mr. Clarke’s talk seemed to endorse the old mind-set of “stop the attackers before they get into your organization”, ignoring the necessity of dealing with attackers who get past preventative measures like firewalls and intrusion detection. For example, during the question-and-answer session at the end of the keynote, he seemed to agree with a member of the audience who asserted that Trusted Platform Modules (TPMs) are the essential and woefully ignored underpinning of security, the key to a stronger defense against cyber attackers. But in a world in which more than half of targeted attacks are launched using the stolen credentials of privileged users (Verizon 2015 Data Breach Investigations Report), focusing on hardware roots of trust can leave you more vulnerable, not less: it concentrates attention and investment on technologies, processes and organizational structures that do not address all the risks the organization faces.
Inherent in many of the technologies focused on preventing attackers from getting inside an organization – particularly those that build defensive walls, but also those aimed at known threats – is an assumption that past success in detecting and stopping attackers defines effective strategies and technologies for the future. Like businesses focused on past product successes, enterprise and vendor focus on past successes in cybersecurity has certainly led to improvements in firewalls and anti-virus technologies, in intrusion detection and blacklisting. But that focus on past success has left many organizations – including those responsible for Smart Grid security – unprepared for the innovations in attacker strategies: a threat landscape not just of polymorphic malware but, even more importantly, of exploitation of our human predisposition to trust others.
Is there a way out of the cybersecurity Innovator’s Dilemma? The key is to be willing to set aside the very things you counted on in the past. Desperately holding onto what worked in the past when confronted by a landscape of new risks and challenges is a sure recipe for failure. In cybersecurity, including for Smart Grid, failure to learn from the Innovator’s Dilemma puts all of us at risk.