Juiced Beetle Not a Bug – Intentional Malware

An editorial column in the September 24, 2015 Wall Street Journal carried the title “The VW Emission Bug.” But the “defeat-device” software, which Volkswagen confirmed it had installed in some of its diesel cars, certainly wasn’t a bug, as my October 5, 2015 BlogInfoSec column “When is a Glitch Not a Glitch” points out for the general case … it was a deliberate action by VW, taken with the intent of misleading vehicle-emissions testers and customers.

The U.S. EPA ordered Volkswagen to recall about half a million VW and Audi diesel-powered vehicles, including the VW Beetle, because the defeat-device software enables the vehicles to pass annual emissions tests but then reverts the engines to a mode that emits as much as 40 times the permitted level of pollutants, including nitrogen oxide. According to a front-page article in The New York Times of September 19, 2015, by Coral Davenport and Jack Ewing with the title “U.S. Orders Major VW Recall Over Emissions Test Trickery,” nitrogen oxide and other emissions contribute “to the creation of ozone and smog” and “are linked to a range of health problems, including asthma attacks, other respiratory diseases and premature death.”

The defeat-device software could be considered malware (malicious software), except that it was purposely installed by the manufacturer rather than inserted by some hacker, and its purpose was to boost sales by overstating the engines’ “cleanliness.” Perhaps it should be called “deceitware.” In many ways, intentionally fooling customers is much worse than being subjected to an external attack. The fact is that, in this example and for other vehicular control systems in aircraft, trains, trucks, and the like, few beyond the manufacturers know what functions reside in such systems. Researchers have already demonstrated that they can take over vehicle control systems remotely and perform such tasks as cutting the engine, braking the vehicle, and controlling the steering. But now we have a new model, whereby vehicle manufacturers decide what software should be in a particular vehicle without the purchasers’ knowledge or consent, even though that software may be hazardous to human health and the environment (avoiding such hazards is the basic definition of safety). We can expect more cases like this … perhaps this is a new example of “unsafe at any speed,” the title of the Ralph Nader book that changed safety standards for the auto industry.
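To see how little code such deceit requires, here is a minimal, purely hypothetical sketch of defeat-device logic. Every signal name and threshold below is invented for illustration; none of it reflects VW’s actual software. It exploits the fact that a dynamometer test spins the drive wheels while the steering wheel stays centered, a pattern rarely seen in real driving:

```python
# Hypothetical sketch of defeat-device logic. All signal names and
# thresholds are invented for illustration only.

def looks_like_dyno_test(steering_angle_deg, wheel_speed_kph, duration_s):
    """A dynamometer test drives the wheels at speed while the steering
    wheel stays essentially centered for an extended period."""
    return (abs(steering_angle_deg) < 1.0
            and wheel_speed_kph > 20
            and duration_s > 60)

def select_emissions_mode(steering_angle_deg, wheel_speed_kph, duration_s):
    # Apply full emissions controls only when a test appears to be running.
    if looks_like_dyno_test(steering_angle_deg, wheel_speed_kph, duration_s):
        return "clean"        # meet the limits on the test stand
    return "performance"      # normal driving: controls dialed back

print(select_emissions_mode(0.2, 50, 120))   # test-stand conditions -> "clean"
print(select_emissions_mode(15.0, 50, 120))  # road conditions -> "performance"
```

The point of the sketch is that the cheat is a handful of lines buried in millions, which is precisely why it survived unnoticed for years.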

We need to see new forms of disclosure, much like those appearing on packs of cigarettes. How about … “This vehicle will pass annual inspections, but the rest of the time it may emit up to 40 times the amount of noxious and dangerous gases deemed acceptable by the EPA”?

A reporter for The New York Times, Farhad Manjoo, has noted in several of his articles on the VW case, such as “Our Cars Need More Technology” in the October 1, 2015 issue, that some experts recommend making vehicle control software open source, so that many more eyes can examine the source code and watch for unintended security vulnerabilities and intentional malware. But that guarantees nothing. There are valid security arguments both for and against open-source software. In some cases open-source software is well supported and reliable; in others it is not given the support it needs, as with the Heartbleed and Shellshock bugs in two open-source programs used extensively on commercial websites … see my November 3, 2014 BlogInfoSec column “Heartbled and Shellshocked … What Can We Do?” The other concern is that automobile companies will use the open-source model to palm off responsibility and liability for vulnerabilities and intentionally malicious software onto open-source communities.

However, the real issue here is that, like many incidents in the cybersecurity space, the defeat-device malware, which VW had been running for some seven years, was discovered by accident. This brings us to the general question of the quality and completeness of testing of custom-built and third-party software. Modern vehicles purportedly host ten million or more lines of code (supposedly more than aircraft do!), and this number will increase significantly as automobiles and trucks become semi-autonomous and then fully autonomous. Testing programs of such size for security and safety is a Herculean task, especially as each manufacturer crams its vehicles with its own proprietary software. And, as the VW case highlights, you have to test not only that the software performs its intended and allowable functions, but also that it doesn’t do what it is not intended or not supposed to do. I have written many times about functional security testing, which aims to ensure that software doesn’t have vulnerabilities allowing it to do what it’s not supposed to, and I have also referred to functional safety testing of control systems, which is used to make sure that software doesn’t result in harmful actions. Functional security and safety testing takes orders of magnitude more effort than regular functionality testing, which suggests that automated testing programs are essential for the testing to get done quickly.
