The Misleading Nature of Schneier’s Security Mindset

Recently Bruce Schneier wrote an essay on the Security Mindset. In it he wrote:

Security requires a particular mindset. Security professionals — at least the good ones — see the world differently. They can’t walk into a store without noticing how they might shoplift. They can’t use a computer without wondering about the security vulnerabilities. They can’t vote without trying to figure out how to vote twice. They just can’t help it.

He further wrote:

This kind of thinking is not natural for most people. It’s not natural for engineers. Good engineering involves thinking about how things can be made to work; the security mindset involves thinking about how things can be made to fail. It involves thinking like an attacker, an adversary or a criminal. You don’t have to exploit the vulnerabilities you find, but if you don’t see the world that way, you’ll never notice most security problems.

While I agree that certain security roles involve thinking about how things can fail (and be made to fail), that's not the full picture. In particular, I disagree that thinking about how things fail is not natural for engineers and, conversely, that thinking about how to build things is not natural for security professionals.

For the first case, imagine an engineer who is building a bridge and does not consider whether the suspension cables will sustain the weight of the roadway beneath. Clearly, building a bridge also involves considering the consequences of not building it correctly. Engineers may not take failure analysis to its extreme, trying to figure out every way a design could be subverted, but that does not mean they do not think about failure.

For the second case, it's best to turn to corporate and government security needs. Security people want to figure out how best to architect solutions so that employees can securely access their desktops remotely, monitor log events throughout the enterprise, reduce costs by using VPN solutions, etc. Corporate security people in particular can no longer be the "No" person, but must be the "Here's how to do it securely" person. And with this latter approach, architecture and engineering are part and parcel of the security professional's toolkit.

Reducing the security mindset to that of "an attacker, an adversary or a criminal" limits the paradigm of security to one general class of security roles: namely, the auditor. To put it in whitehat terms, to think like an attacker is to constantly conduct a vulnerability assessment, which (again) is an auditing function, regardless of which corporate function conducts it.

The strict model of "the security mindset as only an attacker" may have been appropriate pre-2001. Since 9/11 and Sarbanes-Oxley, engineers are increasingly expanding their understanding of security requirements, and information security professionals are increasingly focusing on how to enable business securely. Granted, the two circles of the Venn diagram will never fully overlap, but they are overlapping more now than ever.


  1. shrdlu Apr 10, 2008 at 8:36 am

    Well said. Maybe Schneier should have specified that engineers tend to think about *functional* failures, but not those initiated by an intelligent adversary.

    People do think about their own areas of risk. Builders of military systems *do* think about threats from an intelligent adversary (or a few million of them). Project managers only ask themselves, “What is the risk of this project not getting done?” CFOs think about the risk of losing money.

    So it only makes sense that within IT security, we think about both accidental and deliberate misuse of the systems. (And for the record, I *never* think about shoplifting when I go into a store. I don’t know why that should be a given for security professionals.)

  2. Osama Salah Apr 12, 2008 at 11:35 pm

    “For the first case, imagine an engineer that is building a bridge and does not consider whether or not the suspension cables will sustain the weight of the roadway beneath.”

    Looks to me like an example of how to make it work.

    Thinking about how to make it fail would look like this:

    “Hm, if a car crashes into this pillar at a speed of X, would it bring the bridge down?”
    “Hm, if a boat under the bridge rams into this column, would it collapse the bridge?”
    “If a car blows up on the bridge, where would the best location be to make the bridge collapse, and how strong would that explosion have to be? Would a car be enough? Would it take a petrol truck?”

  3. Kenneth F. Belva Apr 13, 2008 at 7:27 am

    Engineers do consider failure paradigms. Consider this interview from one of the architects of the World Trade Center as posted on CNN:

    “HARRIS: As a member of the team, and having such insight to how this building was constructed, could you believe that a plane could bring these buildings down?

    SWIRSKY: The criterion was that if a plane hits, it would go right through it. And nobody could foresee something like that. The tower was protected in such a way that the damage would be limited to one story, but it wouldn’t travel to the other stories.”

    I suspect that this is a case of engineers thinking about how things can be made to fail…

  4. Osama Salah Apr 13, 2008 at 10:50 am

    “(And for the record, I *never* think about shoplifting when I go into a store. I don’t know why that should be a given for security professionals.)”

    I guess it’s a process of always challenging yourself, to see if you could have improved on the existing security design and made it better.


    The WTC example makes your point clearer. Something still went wrong: they forgot that the WTC was going to exist for quite some time, and during that period planes got bigger. They were imagining a 100-passenger plane, not a 757. This is actually a perfect example of something else Bruce repeatedly mentions, namely that a control that suffices today is probably going to be less of a challenge in the future.

    Engineers do think to a degree about how things can fail, but they are generally not persistent in doing so. In the case of construction engineers, they have their standards to comply with, and that is pretty much their main target.

    I suppose engineers at NASA think much more about how things could fail or be made to fail, and maybe the same goes for the airplane industry.

  5. barbara Jan 27, 2010 at 12:22 pm

    “The security mindset as only an attacker” has never been a reality. Prior to 9/11/2001, I asked the IT Auditor who worked for me about the possibility of a fire at WTC towers 1 and 2. He replied, “But there are sprinklers throughout the buildings; how would a fire occur?” Without missing a beat I said, “An airplane could hit one of the towers.” There was no reply. The manager rolled his eyes (as if to say “you’ve really gone over the deep end this time”) and walked away from me. I knew at that instant there would be no follow-up on my query as far as he was concerned…

    I believe that Schneier’s mindset is especially appropriate for the DSO, CIO, COO and/or the IT Auditor who cannot imagine the possibilities of a system failure. I would go further than Schneier and say that you almost have to put yourself in the role of a ‘deviant’ in order to imagine the possibilities. Even so, there are events which may be impossible to imagine prior to their actual occurrence. So, in fact, you’re both right! You must be able to understand what ought to occur (in other words, what the system was designed to do) as well as open yourself up to the possibility that there exist gaps/holes within the system that could eventually undermine operations.
