Aircraft Safety … And Security

There was once a time when aircraft crashes appeared to be due mostly to mechanical failures or malfunctioning aircraft control systems. Yes, some incidents were caused by terrorists, and there were some accidents due to “human error,” but a goodly number of those also had system and/or physical components.

In the past couple of years, however, aircraft have proved to be extremely reliable and resilient, with few actual mechanical failures, and many of the crashes were reported to have resulted from some form of human interaction with the avionics and other onboard systems. In a recent catastrophe, the co-pilot of a Germanwings A320 Airbus flight from Barcelona to Dusseldorf apparently locked the pilot out of the cockpit and proceeded to reprogram the autopilot so that the plane would crash into a mountain.

In the cases of the well-known Air France and AirAsia plane crashes, the pilots may have been misled by instruments that displayed speeds different from the aircraft's actual speed, and stalled their planes.

Malaysia Airlines flight MH370 is still lost, and what happened to it remains a mystery more than a year later. There are a number of conflicting theories about what may have happened in this case, ranging from pilot suicide to takeover by terrorists. In any case, each scenario involves a compromise of the aircraft control systems.

Each time there is a major incident, a litany of “experts” is paraded on television with their readings of what might have happened and what needs to be done to prevent the same thing from happening again. In his interesting and well-written book, “Safety-I and Safety-II: The Past and Present of Safety Management,” Erik Hollnagel describes the current way of responding to safety-related incidents as “trying to make sure that things do not go wrong, either by eliminating the causes of malfunctions and hazards, or by containing their effects.” The goal of Safety-I is to reduce the number of adverse outcomes to as few as possible.

Hollnagel proposes the Safety-II approach, the goal of which is to maximize the number of successful outcomes, which “is achieved by trying to make sure that things go right, rather than preventing them from going wrong.”

In some respects, Hollnagel’s approach is quite similar to my suggestions regarding Functional Security Testing (FST), which I have propounded a number of times in this column as well as in articles and presentations. I maintain that it is insufficient merely to test the functionality of software (i.e., that it does what it is supposed to do); we also need to make sure that the system does not do what it is not supposed to do from a security point of view. I recognize that such “negative testing” is extremely costly and a drain on resources, but it pays off handsomely if major catastrophes can be avoided.
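To make the distinction concrete, here is a minimal sketch of the idea, using an entirely hypothetical flight-settings object (the class and method names are illustrative, not from any real avionics system). A conventional functional test confirms that the system does what it should; a negative, FST-style test confirms that it refuses what it must not do.

```python
# Hypothetical toy model: only an authorized crew member may change a setting.
class FlightSettings:
    def __init__(self):
        self.altitude_ft = 38000
        self._authorized = {"pilot", "copilot"}

    def set_altitude(self, user, altitude_ft):
        # Reject unauthorized users and out-of-range values outright.
        if user not in self._authorized:
            raise PermissionError("unauthorized user")
        if not (0 <= altitude_ft <= 45000):
            raise ValueError("altitude out of range")
        self.altitude_ft = altitude_ft

def test_positive():
    # Functional test: the system does what it is supposed to do.
    s = FlightSettings()
    s.set_altitude("pilot", 35000)
    assert s.altitude_ft == 35000

def test_negative():
    # Negative test: the system must NOT accept these, and must not
    # change state when it rejects them.
    s = FlightSettings()
    for user, alt in [("passenger", 10000), ("pilot", -500)]:
        try:
            s.set_altitude(user, alt)
            assert False, "insecure change was accepted"
        except (PermissionError, ValueError):
            pass
    assert s.altitude_ft == 38000  # state unchanged after rejected attempts

test_positive()
test_negative()
```

The expense of negative testing comes from the second test: the space of things a system must not do is far larger than the space of things it should, so enumerating hostile users and bad inputs is where the real effort lies.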

This brings us back to the aircraft incidents. It is clear that, in designing the various avionics and safety systems found on modern aircraft, software engineers did not think much about how to protect those systems from bad actors, whether external evildoers or misguided insiders. And it is clear from the pundits’ comments on TV that they do not know much about security either.

There need to be highly effective identity and access management (IAM) systems in place so that only authorized individuals can make changes to the flight control systems, and then only if at least two of the flight crew (and perhaps ground control as well) approve the changes. The engineers need to think through every possible scenario … a suicidal pilot or co-pilot, terrorists taking over, and so on.

Each crew member also needs at least two identifiers, so that a crew member can enter one code to indicate that there is no coercion and a different code if a terrorist is peering over the crew member’s shoulder. Both should be accepted by the system, but in the first situation the ground staff works with the pilot and co-pilot, while in the second they must contact law enforcement and possibly the military.
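The two-code scheme above can be sketched in a few lines. Everything here is illustrative (the crew table, the codes, the function name are my assumptions, not any real avionics interface); the essential property is that the visible response is identical for the normal code and the duress code, so an attacker watching the entry cannot tell that an alert has been raised.

```python
import hmac

# Hypothetical crew code table; real systems would store only hashed secrets.
CREW_CODES = {
    "copilot": {"normal": "7391", "duress": "7392"},
}

def check_in(member, code):
    """Return (accepted, duress_alert).

    The cockpit-side behavior is the same whether accepted is reached via
    the normal or the duress code; only the silent duress_alert flag,
    delivered to ground staff, differs.
    """
    codes = CREW_CODES.get(member)
    if codes is None:
        return (False, False)
    # compare_digest avoids leaking information through comparison timing.
    if hmac.compare_digest(code, codes["normal"]):
        return (True, False)
    if hmac.compare_digest(code, codes["duress"]):
        return (True, True)   # system proceeds normally; alert goes out silently
    return (False, False)
```

For example, `check_in("copilot", "7391")` and `check_in("copilot", "7392")` both unlock the system, but only the second sets the duress flag that tells ground staff to contact law enforcement.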
