C. Warren Axelrod

Security, Safety and the “Wall of Constricted Thinking”

There is an interesting article by Jack Hitt in “The Idea” column of the Sunday Business section of The New York Times of August 18, 2013. It is about how a newly-minted astrobiologist, Meredith Perry, came up with an idea for charging devices wirelessly by combing through concepts from a number of disciplines, including sound, electricity and battery technology. Hitt describes how specialists in each silo would generally nix the concepts associated with the other silos. Perry would then do some of her own research and circumvent the objections of the specialists, who, when confronted with her proposed solution, grudgingly agreed that the approach “should work.” Hitt wrote the following: “Each expert seemed to dwell in his own private silo, so whenever she [Perry] crossed from one discipline to another, she would run into the same wall of constricted thinking.” [emphasis added]

It struck me that this phrase, the “wall of constricted thinking,” also applies to security and safety as they relate to software systems. My recent book, Engineering Safe and Secure Software Systems (Artech House, 2012), attempts to cut across two major disciplines, populated respectively by information security professionals and software safety engineers, in order to achieve software systems that are both safe and secure. I must admit that I experienced a wall of constricted thinking when presenting to a group of CISOs at a recent conference. Delegates did not show much interest in general, and some of those who did attend my presentation did not want to worry about system safety, as they had neither the time nor the inclination to take on another responsibility. A couple of attendees did show interest and recognized that they had this problem in their own companies, but they were in the small minority.

It is proving much more difficult than anticipated to raise the commitment and involvement of both groups to a level at which they will actually do something to resolve this increasingly urgent problem. That is not to say that some industries and sectors are not trying to insert limited security measures into their safety-critical systems. They are. However, one gets the impression that such initiatives are merely stop-gap measures that do not go far enough, largely because information security professionals do not appear to have any interest in learning about safety-critical systems, and software safety engineers seem to be intimidated by cybersecurity. Consequently, efforts to date have been somewhat superficial and do not address the real issues. There will have to be a major cultural shift before those focusing on security or safety in their software systems will work together, share information, knowledge and skills, and really get to understand and mitigate the other group’s issues.

By the way, some information security folks appear to be guilty of constricted thinking within their own field. You may recall that I claimed that availability and integrity might actually trump confidentiality; see my BlogInfoSec columns “It’s About Availability and Integrity (not so much Confidentiality),” posted July 23, 2012, and “Integrity First … Then Availability … Then Confidentiality,” posted October 29, 2012. Some readers responded to the effect that matters such as availability and integrity are under the purview of DevOps and are not the responsibility of infosec.

A number of senior-level colleagues are moving from IT-oriented infosec departments to operational-risk functions within the risk management area. There is increasing recognition that infosec professionals must develop broader, more holistic roles encompassing business functions and risk management in order to advance, while day-to-day technical functions, such as managing firewalls and intrusion detection and prevention systems, move increasingly into IT operations as they mature and can be incorporated into standard processes and procedures. In some organizations, this shift is represented by distinct titles, such as BISO (Business Information Security Officer) and TISO (Technology Information Security Officer), reporting into an operational risk area. In other cases, entirely new titles are being introduced, such as SIRO (Senior Information Risk Officer/Owner), to express the evolving roles of those concerned with information risk. It seems that traditional information security roles may themselves be at risk.
