Disclaimer: The opinions of the columnists are their own and not necessarily those of their employer.
C. Warren Axelrod

Simplicity or Complexity – Which is More Secure?

On May 19, 2010, Dr. Patricia Muoio of the ODNI (Office of the Director of National Intelligence) gave a thought-provoking presentation at a symposium hosted by NITRD (Networking and Information Technology Research and Development), a Federal program that “… provides a framework in which many Federal agencies come together to coordinate their networking and information technology (IT) research and development (R&D) efforts” (see www.nitrd.gov).

The name of the symposium was “Toward a Federal Cybersecurity Research Agenda: The Game-Changing Themes,” and the particular topic covered by Dr. Muoio in the “Government Overview” session had to do with “moving target” approaches. The underlying proposition is that, in order to outwit the bad guys, one should be agile and stay one step ahead of the attackers by making security systems so complex that those with evil intentions will not be able to keep up. At present, the shoe appears to be on the other foot: attackers keep victims on the defensive by using state-of-the-art techniques and staying ahead of owners’ efforts to protect their information assets. (A toy sketch of one such moving-target technique follows.)
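To make the moving-target idea concrete, here is a sketch of one well-known technique, port hopping, in which a service and its legitimate clients independently re-derive the listening port from a shared secret at fixed intervals, so that an outsider must continually rediscover where the service lives. This is only an illustration; the secret, interval, and function names are my own assumptions, not anything presented at the symposium.

    import hashlib
    import hmac
    import time

    SHARED_SECRET = b"rotate-me"   # hypothetical pre-shared key
    ROTATION_INTERVAL = 60         # seconds each port assignment remains valid

    def current_port(secret=SHARED_SECRET, interval=ROTATION_INTERVAL):
        # Map the current time window to a port in the dynamic/ephemeral
        # range (49152-65535); client and server compute the same value
        # without ever publishing it.
        window = int(time.time() // interval)
        digest = hmac.new(secret, str(window).encode(), hashlib.sha256).digest()
        return 49152 + int.from_bytes(digest[:2], "big") % 16384

Note that the defender’s complexity here is modest; the burden of keeping up falls on the attacker, which is the intent of the approach.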

More than a decade ago, Bruce Schneier wrote a piece in his Crypto-Gram Newsletter of March 15, 2000, with the title “Software Complexity and Security,” available at http://www.schneier.com/crypto-gram-0003.html. The opening sentence of the article is: “The future of digital systems is complexity, and complexity is the worst enemy of security.” He ends the article with the words: “Secure systems should be cut to the bone and made as simple as possible. There is no substitute for simplicity. Unfortunately, simplicity goes against everything our digital future stands for.”

While Bruce was referring to the underlying software that infosec professionals are trying to protect, I believe that the same goes for security systems. If we embark on a complexity race with the bad guys, we may end up the losers. The real, though likely unattainable, answer is to simplify the underlying software, platforms, and infrastructure and to build correspondingly simple security systems. However, even if we cannot roll back complexity, we still need to do something about the designs and structures of our systems. The subprime mortgage fiasco, the “flash crash” of May 6, 2010, and the Gulf oil gush catastrophe are all recent examples of complexity leading to disaster and making recovery far more complicated.

I bemoaned this increasing complexity in a 1994 column, “The Death of K.I.S.S.,” in Securities Information Magazine. I pointed out that systems had become so complex that it was no longer possible for an individual, or even a group of individuals, to know and understand every aspect of the systems they were responsible for supporting.

A real-life example of the detrimental impact of complexity is Elisabeth Bumiller’s April 27, 2010 front-page New York Times article “We Have Met the Enemy and He Is PowerPoint,” with due deference to Pogo. The article is accompanied by a very complicated “… PowerPoint diagram meant to portray the complexity of American strategy in Afghanistan …” According to Gen. Stanley A. McChrystal, then leader of the American and NATO forces in Afghanistan, “When we understand that slide, we’ll have won the war.” As with the war in Afghanistan, so it is with computer systems and their protection. We continue to build ever more complex systems, so how can we hope to protect them?

I believe that hope lies in decoupling system components … that is, in introducing the computer-system equivalent of circuit breakers, as sketched below. If a module is overheating, there should be some way to disconnect it from the rest of the system in order to avoid a ripple effect. This is similar to isolating sections of the electricity grid so that the failure of one segment does not bring down the others.
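A minimal sketch of such a software circuit breaker might look like the following. The class name, failure threshold, and reset timeout are illustrative assumptions on my part, not a prescribed design.

    import time

    class CircuitBreaker:
        # Isolates a failing module from the rest of the system, much as an
        # electrical breaker isolates a faulty segment of the grid.

        def __init__(self, failure_threshold=3, reset_timeout=30.0):
            self.failure_threshold = failure_threshold  # failures before tripping
            self.reset_timeout = reset_timeout          # seconds before retrying
            self.failures = 0
            self.opened_at = None                       # None means circuit closed

        def call(self, func, *args, **kwargs):
            # While the breaker is open, refuse calls until the timeout elapses.
            if self.opened_at is not None:
                if time.monotonic() - self.opened_at < self.reset_timeout:
                    raise RuntimeError("circuit open: module isolated")
                self.opened_at = None  # half-open: permit one trial call
            try:
                result = func(*args, **kwargs)
            except Exception:
                self.failures += 1
                if self.failures >= self.failure_threshold:
                    self.opened_at = time.monotonic()  # trip: isolate the module
                raise
            self.failures = 0  # success resets the count
            return result

A caller wraps each inter-module request in breaker.call(...); once the module has failed repeatedly, the breaker trips and subsequent requests fail fast, so the trouble does not ripple through to the rest of the system.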

In my opinion, the answer lies in simplifying systems and minimizing their attack surfaces. Perhaps we’ll have to give up some functionality, perhaps not. Perhaps it will cost more, perhaps not, especially when you include the cost of system compromises and failures due to attacks. In any event, it is worth a try since the greater-complexity approach does not appear to be working.
