Disclaimer: The opinions of the columnists are their own and not necessarily those of their employer.
C. Warren Axelrod

Security Metrics, Recency Bias and Availability Heuristics

I “recently” came across an article by Tom Chatfield titled “The Trouble with Big Data? It’s Called The ‘Recency Bias,’” which is available at http://www.bbc.com/future/story/20160605-the-trouble-with-big-data-its-called-the-recency-bias. The article was published on June 5, 2016, which is not so recent compared to many other articles on the fast-moving subject of big data, so, by the author’s own argument, it may not carry the weight that up-to-the-minute articles would.

In any event, the article got me thinking about whether cybersecurity metrics suffer the same fate as big data in general. After all, the emphasis is typically on the latest data breach or piece of malware, and the earlier stuff is soon forgotten, much to the chagrin of the authors of Verizon Business’s annual Data Breach Investigations Report (DBIR), the 2016 edition of which is downloadable from http://www.verizonenterprise.com/verizon-insights-lab/dbir/2016/. Year after year, the report attributes many data breaches to organizations not having updated their software even though the patches may have been available for months, if not a year or more. As the report says: “Half of all exploitations happen between 10 and 100 days after the vulnerability is published …” So clearly there are opportunities for organizations to improve their vulnerability mitigation (a.k.a. patching) programs.
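To make that point concrete, here is a minimal sketch, in Python with entirely made-up records (the record layout, identifiers, dates, and as-of date are my assumptions, not DBIR data), of the kind of metric a patching program might track: how many known vulnerabilities remain open past the 10-day mark at which, per the DBIR, exploitation typically begins.

from datetime import date

# Hypothetical patch records: (vulnerability id, date published, date patched).
# The identifiers and dates are illustrative, not taken from the DBIR.
patch_records = [
    ("CVE-A", date(2016, 1, 4), date(2016, 1, 20)),
    ("CVE-B", date(2016, 1, 4), date(2016, 5, 2)),
    ("CVE-C", date(2016, 2, 1), None),  # still unpatched
]

def days_exposed(published, patched, as_of=date(2016, 6, 5)):
    """Days from publication to patching (or to 'as_of' if still unpatched)."""
    end = patched if patched is not None else as_of
    return (end - published).days

# Count vulnerabilities that stayed (or remain) open past day 10, the point
# at which the DBIR says exploitation typically begins.
open_past_10 = sum(1 for _, pub, pat in patch_records
                   if days_exposed(pub, pat) > 10)
print(f"{open_past_10} of {len(patch_records)} vulnerabilities exposed beyond day 10")

Tracked over time, a number like this keeps the old, unglamorous patching backlog visible alongside whatever made this morning’s headlines.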

Perhaps even more interesting is Chatfield’s use of the term “availability heuristic,” which is defined as follows at https://www.verywell.com/availability-heuristic-2794824:

“An availability heuristic is a mental shortcut that relies on immediate examples that come to mind. When you are trying to make a decision, a number of related events or situations might immediately spring to the forefront of your thoughts. As a result, you might judge that those events are more frequent and possible than others. You give greater credence to this information and tend to overestimate the probability and likelihood of similar things happening in the future.”

How true is this of security programs? Are we so engrossed in responding to ransomware, DDoS attacks, and nation-state hacks that we neglect patching software vulnerabilities and fine-tuning our ability to recognize insider threats? Do the metrics that currently take center stage block out the more insidious threats that can have far greater impact?
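As a rough illustration of how the availability heuristic creeps into the numbers themselves, the sketch below, again with invented incident data and an assumed 30-day half-life, compares a plain tally of incident types against a recency-weighted one that exponentially discounts older events. The older threat dominates the raw count, yet the newest attack type tops the weighted view.

from collections import Counter

# Hypothetical incident log: (days ago, incident type). The data is invented.
incidents = [
    (2, "ransomware"), (5, "ransomware"), (9, "ddos"),
    (40, "insider"), (55, "insider"), (70, "insider"),
    (90, "unpatched exploit"), (120, "insider"), (150, "unpatched exploit"),
]

HALF_LIFE_DAYS = 30  # assumed half-life for the recency weighting

def recency_weight(days_ago, half_life=HALF_LIFE_DAYS):
    """Exponential decay: an event 'half_life' days old counts half as much."""
    return 0.5 ** (days_ago / half_life)

raw = Counter(kind for _, kind in incidents)
weighted = Counter()
for days_ago, kind in incidents:
    weighted[kind] += recency_weight(days_ago)

print("raw counts:      ", dict(raw))
print("recency-weighted:", {k: round(v, 2) for k, v in weighted.items()})
# Insider events dominate the raw tally (4 of 9), but ransomware tops the
# recency-weighted view -- the availability heuristic in numeric form.

If our dashboards are, in effect, computing the weighted column, we should not be surprised when the insider threat quietly slips off them.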

Yes, we have to worry about denial-of-service attacks that use the Internet of Things to flood large segments of the Internet, but not so much that a major security agency allows a contractor to exfiltrate highly classified information for almost two decades. Let’s put things in perspective and keep significant earlier threats, which have not yet been fully resolved, top of mind. Let’s avoid the availability-heuristic syndrome, especially since many of the greatest cybersecurity risks are “oldies but goodies.”

 
