Eleven Shades of Security Intelligence Knowability

I thought that the title of this piece would at least be au courant, but it seems that others got there before me. In a December 3, 2015 presentation, “Common Quality Enumeration (CQE): See Your Total Quality Picture,” John Marien and Robert Martin of MITRE state that “Within the realm of quality issues there is pure black (security focused) on one end and pure white (other quality issues) on the other with varying shades of gray in between.” The presentation is available at https://interact.gsa.gov/sites/default/files/ThuAM1-CQE%20SSCA%20WG%202015.pdf and describes MITRE’s newest work on categorizing software security characteristics, following on their outstanding work on enumerating common weaknesses (CWE), vulnerabilities (CVE) and malware (MAEC).

I’m not sure about the analogy here of security focus as black and other quality aspects as white … but I digress. I hope to get back to CQE at another time, but for now let’s discuss security intelligence knowability.

I was first introduced to the concept of “unk-unks” (unknown unknowns) in an early presentation on Y2K back in 1994. No, Donald Rumsfeld didn’t invent the term, as is often claimed. It apparently (per Wikipedia) harkens back at least to 1979.

Since then, I have been fascinated by the range of knowledge along what I call “the knowability spectrum,” as it applies to cybersecurity. I have written on the topic, most recently in the September/October 2015 issue of CrossTalk. The article, titled “Software Security Assurance: SOUP to NUTS,” is available at http://static1.1.sqspcdn.com/static/f/702523/26502340/1441086786230/201509-Axelrod.pdf?token=apA1bQ9mWD5RbxK%2Banmj2wjB5qg%3D

What got me thinking about this subject once again was something I read in The Atlantic, in a review of a biography of spy novelist John le Carré. It went as follows:

“… negativeland, the silvery counterworld of the thing that you know but don’t want to know that you know …”

In the CrossTalk article, I compare a Known/Unknown model to David Snowden’s Cynefin complexity framework and relate these concepts to software supply-chain risks.

More recently I put together a “knowability spectrum,” where knowledge ranges from fully knowable to totally unknowable. Intermediate knowability categories are as follows:

  • Know you know (certain)
  • May know (not certain)
  • Can know (requires some effort)
  • Should know (if you were on the ball)
  • Could know (if you were interested in putting in the effort)
  • Don’t know you know (because you just don’t happen to know)
  • Don’t want to know you know (“negativeland” – see above)
  • Know you don’t know (and may not need to know)
  • Don’t want to know you don’t know (don’t tell me because then I’ll have to do something about it)
  • Don’t know you don’t know (the world of unk-unks)
  • Cannot know (absolutely unknowable)

As the list expands, which it has, it becomes more confusing, but also more interesting. Apart from “cannot know,” which is really the definition of unknowable, everything else is knowable to some degree if you know how and are willing to expend the resources. Even the unknowable is not a complete blank, since inferences can often be made, which is the essence of Douglas W. Hubbard’s definitive book “How to Measure Anything.” However, it is important to recognize that most categories are objective: the item is either known, unknown, or partially known. But there are a couple of subjective categories, where the person doesn’t want to know, regardless of whether something is known or not.
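For readers who think in code, the spectrum above can be sketched as a small classification structure. This is just an illustrative model of the list, assuming my own shorthand names and glosses; the “subjective” flag marks the two categories that, as noted above, hinge on the desire to know rather than on the state of knowledge itself.

```python
from enum import Enum

class Knowability(Enum):
    """The eleven-point knowability spectrum, from fully knowable
    to totally unknowable. Each member carries a short gloss and a
    flag for whether the category is subjective (about the desire
    to know) rather than objective (about the state of knowledge)."""
    KNOW_YOU_KNOW = ("certain", False)
    MAY_KNOW = ("not certain", False)
    CAN_KNOW = ("requires some effort", False)
    SHOULD_KNOW = ("if you were on the ball", False)
    COULD_KNOW = ("if willing to put in the effort", False)
    DONT_KNOW_YOU_KNOW = ("you just don't happen to know", False)
    DONT_WANT_TO_KNOW_YOU_KNOW = ("negativeland", True)
    KNOW_YOU_DONT_KNOW = ("may not need to know", False)
    DONT_WANT_TO_KNOW_YOU_DONT_KNOW = ("don't tell me", True)
    DONT_KNOW_YOU_DONT_KNOW = ("the world of unk-unks", False)
    CANNOT_KNOW = ("absolutely unknowable", False)

    def __init__(self, gloss, subjective):
        # Enum unpacks each tuple value into these attributes.
        self.gloss = gloss
        self.subjective = subjective

# The two "don't want to know" categories stand apart from the rest.
subjective = [k.name for k in Knowability if k.subjective]
print(subjective)
```

Nothing here is canonical; it simply makes the objective/subjective split explicit, which is the distinction the email-scanning anecdote below turns on.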

I have experienced a couple of situations in which not wanting to know that something is known came into play. One in particular had to do with the implementation of an email-scanning product. The argument, made by those responsible, was that if they were informed about a particular infringement, they would have to act upon it. It was better not to know, or so they thought, because then they could not be held culpable if something was missed or not followed up.

This gets us into a very interesting space: the difference between the ability to know and the desire to know. There is an old saying, “What you don’t know won’t hurt you.” Nobody really believes that anymore (if it were ever held to be true), especially in today’s dangerous cyber and physical worlds, so it’s about time that we took on the responsibility and costs of getting to know about more threats before they result in catastrophes.
