
Learned Lessons Are Not the Whole Picture

I am certainly a strong proponent of learning from disasters, as asserted, for example, in my June 14, 2010 column “Cyber Lessons Learned from the Gulf Oil Catastrophe.” Consequently, I felt somewhat vindicated in that view by an article by William J. Broad on the front page of the Science Times section of The New York Times of July 20, 2010. The article, titled “Taking Lessons From What Went Wrong,” describes a series of catastrophes and details their impact on subsequent endeavors. One is the Tacoma Narrows Bridge failure, in which a fairly modest 40 mph wind caused the bridge to oscillate wildly and then break apart. I first mentioned the bridge in my June 24, 2008 column “Security Mindset: Nature or Nurture?” Broad describes how the Tacoma failure delayed the opening of the Verrazano Bridge in New York so that engineers could make sure its design would withstand high winds.

It is, of course, very necessary to fix deficiencies that caused or contributed to previous catastrophes. However, there is, in my opinion, an even more important, though frequently missed, lesson: preventing the next disaster from happening. One of the most serious problems with the practice of information security is that we always seem to be trying to ensure that past successful attacks will not succeed in the future. While there is clearly value in patching known vulnerabilities, this approach only encourages the bad guys to come up with new and more effective ways to attack. It’s a big mistake to be constantly looking through the rear-view mirror for direction going forward. What is needed is a forward-looking approach: let’s try to anticipate what will be attacked and work to protect against future threats. It really shouldn’t be that hard to do. There are some basic rules to follow, as I will describe.

The coolest new features are the most vulnerable. This paraphrases the old joke about the earliest slaves getting the hungriest lions in the Roman arenas. It is when some new capability or technology is introduced, often without adequate testing and without much thought given to how it might be misused, that the bad guys launch their successful attacks. Perhaps there needs to be a “cooling off period” before a major new device or piece of software goes mainstream.

The fraudsters go to where the money is. Again, a paraphrase, this time of Willie Sutton’s response to being asked why he robbed banks. It doesn’t take much thought to realize that the bad guys will seek out the weak spots in financial and commercial systems. And often the weakest links are the small- to midsize entities. Maybe we should be investing more time, effort and money in coming up with ways to protect the smallest entities and individuals, rather than the large institutions, which generally are already investing in security, albeit solving yesterday’s problems.

Here’s a coincidence: while I was writing this column, The Wall Street Journal of July 22, 2010 published an article by Sarah E. Needleman in the Small Business section with the title “Lights Out Means Lost Sales: Business Owners Take Precautions Against Power Outages; ‘It’s Insurance.’” The article describes situations in which significant business was lost for want of a backup power supply or generator. Small businesses often have not implemented even the most rudimentary security precautions.

Take heed of warnings. Most catastrophic events, from financial meltdowns to major oil spills, were anticipated well before they happened.

Another coincidence … as I was writing this, Ian Urbina published an article in The New York Times, dated July 22, 2010, about the Gulf oil spill with the title “Workers on doomed rig voiced safety concerns: ‘Run it, break it, fix it. That’s how they work,’ said one.” It reports that “… many of [the workers who had been surveyed, prior to the fire on the oil rig] were concerned about safety practices and feared reprisals if they reported mistakes or other problems.”

While it is financially and practically impossible to address every concern, management and policy makers should at least be more realistic about the likelihood of something bad happening. There are clearly segments of the critical infrastructure that will eventually fail or be brought down, so perhaps we should be designing electrical power grids, highways, railroad tracks and the like to have greater redundancy and resiliency through multiple diverse paths, as well as alternatives that do not depend on the various grids. The collapse of a single bridge can have a major economic impact on a region. Why not have a fleet of ferries that can be deployed to many areas? Should we encourage more local electricity generators for communities and individual buildings? Should there be several ways of achieving the same goal?

We in security talk about defense in depth, and that makes a lot of sense. However, one must be aware of common points of failure and recognize that practically everything fails eventually, whether communications, power, transportation, or something else. The main difference is that information security is more subject to active attempts at break-ins or compromise. As circumstances change and new threats emerge, the lessons learned from previous disasters become less valuable.