C. Warren Axelrod

Aircraft Safety and Security … It’s the Training!

As we dig deeper into the software jungle, we become increasingly subject to the vagaries of software and the difficulty of ensuring that it is both safe and secure. I tackled many of these issues in my book “Engineering Safe and Secure Software Systems” (Artech House). There I covered the SDLC (system development life cycle) in detail for both security-critical and safety-critical systems, pointing out the cultural and technical differences between cybersecurity professionals and safety engineers that need to be resolved in order to meet the safety and security needs of today’s cyber-physical systems.

However, one area which I didn’t emphasize—but will in the future—is the training of operators, be they aircraft pilots, train engineers, ships’ captains or road-vehicle drivers like you and me. This was driven home by a front-page article in The New York Times of February 3, 2019 by James Glanz et al. with the title “Jet’s Software Was Updated, Pilots Weren’t: Boeing and F.A.A. Face Scrutiny After Crash.” The article describes how Boeing updated the software of its 737 Max to accommodate the changes in engines, made to improve efficiency and range, between that model and its predecessor. However, Boeing apparently decided that pilots did not require special training on the new version of the software. It is alleged that this lack of training was a contributory factor in the crash of Lion Air’s Boeing 737 Max aircraft in October 2018.

Too often, we assume that those operating vehicles will have received the necessary training or have carefully read the user manuals—and will follow the rules! That may generally be the case for commercial craft, such as “trains and boats and planes,” as in the Burt Bacharach song sung by Dionne Warwick at https://www.youtube.com/watch?v=TwOngoEuVGw. However, it is definitely not the case for automobiles, which are becoming ever more complex yet depend on drivers to train themselves in how to operate them, and for which there are no standards as to the placement and operation of controls. This becomes even more serious when auto manufacturers download software upgrades “on the fly,” especially when it turns out that not fully understanding the upgrades can lead to safety issues.

Perhaps the most disturbing aspect disclosed by the NYT article is that engineers decided unilaterally that additional training for the new version of the software was not needed and that pilots familiar with the prior system would know how to deal with the physical and logical changes that had been introduced. It is one thing to have properly trained operators make errors or blatantly disregard their training; it is quite another not to have trained them adequately in the first place.

As software increasingly takes over the control of physical objects—vehicles, sensors, diagnostic devices, and “things” generally—there is an urgent need to make sure that software-controlled devices come with complete and understandable instructions as to their use and what to do when there is a problem. Too often, we are left to rely on intuition and trial-and-error to master these physical objects, and it is too easy to get it wrong—sometimes with catastrophic consequences.

While some politicians, celebrities and colleagues claim to be all-knowing, they aren’t, and they set a bad example for the masses. It is okay to admit that you don’t know how this button works … after all, some you just push and others you must hold down for three or more seconds. How are you supposed to know that if there are no instructions or when those that exist are incomplete or difficult to understand? The consequences are not terrible if the device is a clock radio (unless you miss an important appointment as a result of not setting the alarm properly), but they can be horrific if you are dealing with vehicles or medical devices, for example.

It is interesting to speculate how the above situation arises. I think that many engineers who develop software and cyber-physical systems are so immersed in design, development and testing that they become intimately familiar with every nuance and characteristic of the systems they are working on. Somehow, they presume that everyone else will know what they know and that it is not necessary to educate them on new features, for example. Well, they’re wrong, terribly wrong.

Furthermore, many of these same engineers have little concept of the social impact of what they are designing and developing. Just look at Facebook, Twitter …

I just happened across a blog post that was reprinted on pages 8-9 of the February 2019 issue of Communications of the ACM. It is by Robin K. Hill, is about “Tech User Responsibility,” and is available at https://cacm.acm.org/blogs/blog-cacm/231489-tech-user-responsibility/fulltext.

Hill explains many of the reasons—such as “this is a nuisance,” “this is clerical,” “this is supposed to be easy”—why users don’t spend the time and effort to learn how to use applications. One passage is worth noting:

“We impose a minimal amount of responsibility on someone checking a book out of a library … We impose a high degree of responsibility for driving a car, because it can kill people.”

So it is for aircraft. The pilot’s job is to fly the plane, whereas a car driver, although required to take lessons and obtain a license, does not usually drive for a living. Suffice it to say, there are many reasons why users (tech or otherwise) pay scant attention to training and user manuals, why developers short-change training materials, and why manufacturers try to minimize training costs. This can be okay for non-critical software, but it can be very dangerous for the critical cyber-physical systems that are increasingly running the world.
