There’s an interesting article in the September 2016 issue of the Communications of the ACM by Kate Matsudaira with the title “Bad Software Architecture Is a People Problem: When people don’t work well together they make bad decisions.” The article describes many issues with design, development, testing, and bug fixing that arise because “… you don’t have a team that works well together …,” which “… can hurt your software design, along with its maintainability, scalability, and performance.” If you add “security” and “safety” to Matsudaira’s list, you have the underlying theme of my book “Engineering Safe and Secure Software Systems” (Artech House, 2012), in which I attribute many software safety and security issues to a lack of common knowledge and inadequate communication between cybersecurity professionals and software safety engineers.
Matsudaira provides six strategies for helping teams work better together and produce improved software:
- Define how you will work together
- Decide how you will test the whole system
- When bugs happen, work together to solve them
- Use versioning
- Create coding standards
- Do code reviews
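Matsudaira’s strategies are process advice, but several of them, such as creating coding standards and doing code reviews, can be partially automated so that the team’s agreements are enforced consistently rather than relying on memory. As a minimal illustrative sketch (not from Matsudaira’s article, and with hand-rolled rules that a real project would replace with an established linter such as flake8 or clang-tidy), a coding-standards check that could run as a git pre-commit hook might look like:

```python
import re

# Illustrative, team-agreed rules; the specific limits and patterns here
# are assumptions for the sketch, not recommendations from the article.
MAX_LINE_LENGTH = 100
# Flag bare TODOs that lack an assigned owner, e.g. allow "TODO(alice)".
TODO_PATTERN = re.compile(r"\bTODO\b(?!\()")

def check_source(text: str) -> list:
    """Return a list of (line_number, message) violations of the standard."""
    violations = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        if len(line) > MAX_LINE_LENGTH:
            violations.append((lineno, f"line exceeds {MAX_LINE_LENGTH} characters"))
        if TODO_PATTERN.search(line):
            violations.append((lineno, "TODO without an assigned owner"))
        if line.rstrip() != line:
            violations.append((lineno, "trailing whitespace"))
    return violations

if __name__ == "__main__":
    import sys
    # A pre-commit hook would pass the staged files on the command line
    # and reject the commit when any violations are reported.
    failed = False
    for path in sys.argv[1:]:
        with open(path, encoding="utf-8") as f:
            for lineno, message in check_source(f.read()):
                print(f"{path}:{lineno}: {message}")
                failed = True
    sys.exit(1 if failed else 0)
```

The point of such a gate is exactly Matsudaira’s: the standard is written down, applied to everyone’s code, and becomes a shared agreement rather than a reviewer’s personal preference.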
Again, all of these strategies can and should be applied to security and safety. Of course, “the devil is in the details.” But through my books, articles, presentations, and BlogInfoSec columns, I have worked hard to fill in many of the details needed to accomplish this, not only in terms of the knowledge and experience required, but also with regard to the processes that need to be integrated and followed.
Matsudaira sums it all up in a rousing final paragraph that reads:
“… the real key to great software architecture for a system developed by lots of different people is to have great communication. You want everyone to talk openly to everyone else, ask questions, and share ideas. This means creating a culture where people are open and have a sense of ownership—even for parts of the system they didn’t write.”
I agree wholeheartedly with this statement, and it must also be applied to developing and implementing security-critical and safety-critical systems. As we increasingly combine information processing systems with control systems, the knowledge gap widens and the cultures become more varied. The task at hand is to bring it all together, not only within the teams building the various components but also across the big divide between the security and safety cultures.