C. Warren Axelrod

A “Fluid and Pragmatic” Approach to Security

It really is disheartening to read time after time about the inadequacies of the information security approaches upon which we so heavily depend. A brief interview of Moti Yung by Laura DiDio in the July 2014 issue of the Communications of the ACM is one such article. Dr. Yung is known for his work on cryptography. He is a Google research scientist and a 2013 ACM Fellow, and was formerly a cryptographer and scientist at the IBM Thomas J. Watson Research Center; he has also done pioneering work in other research environments.

Dr. Yung starts out with the disturbing statement that “There are terrific security tools and they work well, assuming no one has penetrated your computing systems.” [emphasis added] Since it is commonly held that all computing systems and networks of any significance have already been penetrated, whether or not you know it, one can only conclude from Yung’s statement that today’s security tools don’t “work well,” however “terrific” they might be.

So where does that leave us? The article goes on to say “In an era of global and mobile systems, however, Yung believes businesses need more pragmatic and fresher approaches to secure systems [that have been] already ‘partially penetrated.’” Whatever that means! However, I have also been guilty of making similar normative statements. The book “Enterprise Information Security and Privacy” (Artech House, 2009), which I co-edited, is an attempt to take a fresh look at information security. Some of that goal was achieved, but the exercise really revealed how much more there still is to be done … we don’t even have a good handle on very basic risk assessment!

Later in DiDio’s column, Yung says “It makes no sense to build a Maginot line that people can circumvent; security systems need to be fluid … Security professionals must thrive on challenges, accept failures as learning experiences, and move on.” That’s all very well for a researcher to say, but it really doesn’t work that way in the business world. For security professionals “in the real world” there is little tolerance of failure. Fail and you’re out (unless you can blame someone else).

Unfortunately, the attitude that things aren’t working but we’re smart enough to make them work is prevalent in academia, research institutions, and a number of government agencies. There is a view that, if only we can come up with the right “game-changing” technology, all will be well in the cybersecurity world. Well, it ain’t gonna happen! (I really do hope that I will live to regret those words.)

As long as we are too timid, or too greedy, to restrict the use of computer systems that are not secure (which is practically all of them), we will see accelerating abuse of these systems. There is no appetite at present for the Draconian measures that might have a chance of working … nobody wants them, and most folks seem willing to pay the price of not taking them. Until that situation changes, security will not protect against incessant attacks. And it doesn’t appear that it will change any time soon. Prevention is not working; it’s time to give avoidance and deterrence a chance.

One Comment

  1. Jessica Dodson Sep 16, 2014 at 3:59 pm

    “For security professionals ‘in the real world’ there is little tolerance of failure. Fail and you’re out (unless you can blame someone else).”

    Excellent point. Failure means the loss of your job, and maybe dozens of others if your company is crippled enough. There has to be pragmatism because the other options leave you up the proverbial creek.
