C. Warren Axelrod

Heartbleed Lessons – FST and Lab Certification

There has been much written following the “discovery” of the Heartbleed bug that plagues OpenSSL … some informative, some constructive, some neither. Perhaps the most useful article to date is Nicole Perlroth’s “Heartbleed Highlights a Contradiction in the Web,” published in The New York Times on April 18, 2014. It’s available at http://nyti.ms/1iyqgeA

In the article, Perlroth describes the lack of incentives for underfunded open-source community volunteers and the lack of interest by those who benefit from the use of such software in funding the support effort. Perlroth quotes security experts as saying that “companies and governments should pay for regular code audits, particularly when the security of their own products depends on the trustworthiness of the code.”

What it basically comes down to is two requirements that I have been touting for some time. One is functional security testing, and the other is the mandatory use of certification laboratories to determine that critical software in particular, whether open-source or off-the-shelf, meets pre-specified security standards.

Functional security testing, or FST, consists of extensively testing software-intensive systems in order to assure providers and users that the software does not do anything it is not supposed to do. I describe this in greater detail in my article “The Need for Functional Security Testing” in CrossTalk (March/April 2011), which is available online at http://www.crosstalkonline.org/storage/issue-archives/2011/201103/201103-axelrod.pdf. While expensive and time-consuming, such rigorous testing can help to avoid the kinds of software deficiencies that we encounter every day.
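To make the idea concrete, here is a minimal sketch of what a negative functional security test might look like, written in Python against a hypothetical heartbeat handler loosely modeled on the Heartbleed scenario. The function build_heartbeat_response and its record layout are illustrative assumptions, not OpenSSL’s actual API; the point is simply that the test asserts what the software must NOT do, namely echo back more bytes than the requester actually sent.

import struct
import unittest


def build_heartbeat_response(record: bytes) -> bytes:
    # Parse a toy heartbeat record: a 2-byte big-endian claimed payload
    # length, followed by the payload itself.
    claimed_len = struct.unpack(">H", record[:2])[0]
    payload = record[2:]
    # A Heartbleed-style bug would trust claimed_len and read past the
    # payload; here we deliberately clamp to the bytes actually received.
    return payload[: min(claimed_len, len(payload))]


class HeartbeatFST(unittest.TestCase):
    def test_response_never_exceeds_actual_payload(self):
        payload = b"ping"
        # Adversarial request: claims 65535 bytes but carries only 4.
        record = struct.pack(">H", 65535) + payload
        response = build_heartbeat_response(record)
        # The negative assertion: the handler must not return more
        # bytes than the requester actually sent.
        self.assertLessEqual(len(response), len(payload))


if __name__ == "__main__":
    unittest.main()

In a real FST regime, suites of such adversarial, must-not-happen assertions would run alongside conventional does-it-work tests throughout the development lifecycle.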

While FST is a process that can be introduced relatively easily into the system development lifecycle, certification is much more difficult to achieve. Certification in a qualified laboratory setting requires establishing economic and regulatory models that encourage or force software writers, manufacturers, and users to put critical software through demanding security assurance processes. Such testing is only a partial solution, since many software implementations incorporate customized components and often operate in differing contexts. Nonetheless, certification of standard versions of software operating in their most common environments can be extremely valuable in steering customers towards more secure products, whether open-source, off-the-shelf, custom-built, or combinations of these types of software.

Perhaps the Heartbleed flaw in OpenSSL will provide the necessary impetus to drive these initiatives … or perhaps not. Many vendor and customer organizations are more willing to endure the costs and inconvenience of flaws showing up from time to time, along with fairly frequent successful hacker attacks, than to invest the large sums needed for reviewing and certifying software. That trade-off can generally be justified until the deployment of the software becomes so widespread that the compromise of a single product can rampage through critical networks and systems, as in the case of OpenSSL, which is reportedly installed on some two-thirds of all websites.

Again we find ourselves victims of the tragedy-of-the-commons syndrome, where no specific person or company is responsible for fixing bugs that might affect the security of open-source software. And even when supporting individuals and organizations are identified, it remains difficult, if not impossible, to force them to respond on a timely basis. As Professor Steven Bellovin of Columbia University is quoted by Perlroth as saying, “Everybody’s job is not anybody’s job.” With no one taking on the responsibility and liability of ensuring that commonly used software has been properly tested and appropriately certified, the security of software falls through a gaping hole.

Despite exhortations to make the necessary changes, it is not likely that much will be done, except perhaps superficially. The costs can be very high, and the returns are often seen as so low that no one organization is going to volunteer to ensure the confidentiality, integrity, and availability of such software. Unfortunately, the only way to get something of this nature done is for government to intervene and mandate the implementation of such demanding processes as FST and certification, and it is unclear whether government is willing to take on this responsibility. At some point, if things get bad enough, we might see some response. But until then, don’t count on it.
