Outsourcing, Cost Cutting and the Boeing 737 Max Debacle

Just when we thought that Boeing had come up with ways to mitigate the risks that resulted in two major air crashes, we learn that the company has been outsourcing its software development to Indian firms that hired inexperienced temporary programmers for as little as $9 per hour, as described in Peter Robison’s June 28, 2019 article, “Boeing 737 Max software outsourced to $9-an-hour engineers,” which is available at https://www.bloomberg.com/news/articles/2019-06-28/boeing-s-737-max-software-outsourced-to-9-an-hour-engineers

Boeing denies that its software compromised safety in any way. Be that as it may, IT outsourcing carries risk factors relating to security and safety, as described in my book “Outsourcing Information Security” (Artech House, 2004), that need to be considered. Apparently, at the same time that it has been outsourcing, Boeing has been laying off experienced engineers. In my book, I point out the importance of retaining sufficient in-house expertise to manage outsourcing relationships and the quality of the products produced by third parties. This aspect of vendor management is critical.

In a later book, “Engineering Safe and Secure Software Systems” (Artech House, 2012), I discuss the importance of ensuring the quality of software as it passes through the various phases of the System Development Lifecycle (SDLC). I particularly praised the proactive assurance of safety-critical systems produced by and for the avionics industry. It is extremely disappointing and disquieting to learn of the possibility that the rigorous standards of the industry have been compromised.

We also learn from news reports, such as Alexandra Ma’s July 29, 2019 article, “A former Boeing 737 Max engineer said he was ‘incredibly pressurized’ [sic] to keep costs down and downplay new features to avoid FAA scrutiny,” which is available at https://markets.businessinsider.com/news/stocks/boeing-737-max-former-engineer-pressure-costs-avoid-faa-scrutiny-2019-7-1028393024, that there were alleged efforts “to characterize major changes in flight software as minor changes to avoid Federal Aviation Administration scrutiny.” Boeing has denied these claims.

There is the additional concern that such practices may exist among developers of the software that controls autonomous vehicles, now and going forward, especially as automakers have not been held to the same rigorous standards for the security-critical and safety-critical software systems that are increasingly being integrated into road vehicles. Yes, the NHTSA crash-tests cars extensively and rates them accordingly. However, I question whether the auto testers have the software-system testing capabilities described in my 2012 book. The road-vehicle situation is exacerbated by the practice of some manufacturers (such as Tesla) of downloading software revisions in real time. For the record, I greatly admire Tesla’s technological prowess, but I question the practice of having car owners beta-test new software, no matter how good it is. And, by the way, are automakers outsourcing their software development? I don’t know, but I would guess that in some cases they are.

Aside from the fundamental issues of testing software systems and training users, we are now confronted with questions about whether seasoned engineers are being retained to oversee the SDLC and to sign off on the quality of the software being produced.

I am increasingly concerned about whether those who are designing and developing complex algorithms, and incorporating them into cyber-physical systems of systems, have the critical contextual knowledge required to ensure overall security and safety. If we do not take immediate and strong government action to ensure that appropriate software assurance, including verification and validation, is enforced for security-critical and safety-critical systems, then we will more than likely see a rapid decline in the trustworthiness of the cyber-physical systems that increasingly govern our lives. Such risk taking on our behalf, without our knowledge or approval, is unconscionable.

One Comment

  1. Richard Schmitt Feb 10, 2021 at 7:18 pm | Permalink

    I know many engineers from India, and some of them are top notch. But I’ve often been surprised at how some real-time concepts are not intuitive to the remote teams. One example concerned load burstiness. A system may, on average, run at 70% utilization, but that does not mean the CPU is never saturated. Although the average utilization may be 70%, there may be significant intervals where the load is 1.2 to 1.5 or more (meaning 20% to 50% oversubscribed), and during those periods of oversubscription the system will fail. The teams could not understand how that could happen. It was not that they lacked the education or intelligence; they lacked the intuitive understanding. I do not know why, but when I read about the 737 Max problems, I thought of this situation. Real-time systems require different design decisions that I feel at least the engineers I’ve worked with from India may not be as prepared for as American engineers.
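    The commenter’s point, that a comfortable average utilization can coexist with failure-inducing bursts, can be sketched with a few lines of Python. The workload shape and numbers below are hypothetical, chosen only to make the arithmetic obvious: seven quiet seconds at 50% load followed by three burst seconds above 100%, repeated.

    ```python
    # Hypothetical workload: one load sample per second.
    # Seven quiet seconds at 0.5 load, then a three-second burst
    # above saturation (load > 1.0), repeated 100 times.
    quiet = [0.5] * 7
    burst = [1.2, 1.1, 1.2]
    samples = (quiet + burst) * 100  # 1000 one-second samples

    mean_load = sum(samples) / len(samples)
    saturated = [s for s in samples if s > 1.0]

    print(f"mean load over the run: {mean_load:.2f}")
    print(f"saturated seconds: {len(saturated)} of {len(samples)}")
    ```

    The mean works out to exactly 0.70, yet 30% of the seconds are oversubscribed; a capacity check that looks only at the average would pass a system that is saturated nearly a third of the time.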
