C. Warren Axelrod

Was Citi Sleeping? Could Functional Security Testing Have Saved the Day?

Do you remember reading over the summer about Citigroup having a security hole in an iPhone app, which stored all manner of nonpublic personal information in a file … and that the data could then be transferred to a PC? In the Technology section of the July 27, 2010 issue of The Wall Street Journal, Spencer E. Ante, assisted by Ben Worthen, wrote an article, “Citi Offers Fix for Security Flaw: Free iPhone Banking App Accidentally Saved Personal Data in a Hidden File.”

It is interesting to note Citigroup’s claim that “… it performed security tests before and after releasing the application, but failed to detect the problem.” According to John Hering, CEO of Lookout, a provider of mobile security, “… his company is discovering more apps that could inadvertently expose or leak personal data …”

Mr. Hering believes that such security flaws will only increase, given the quick pace at which these apps are being introduced. If he is right, then what is needed is much better testing … and not just the functional testing and nonfunctional security testing that apparently were done for the Citi app.

What is not clear is whether any functional security testing was done, and the results imply that it was not. I have again and again touted the benefits of testing the functionality of applications from the perspective of their not doing what they shouldn’t. See, for example, my August 30, 2010 column “Eureka! Professor Does FST (Functional Security Testing),” my article “The Application Security Testing Gap” in the Perspective column of the November 2010 issue of Information Security magazine, and my December 20, 2010 column “Reinventing the Functional Security Testing Wheel.” Clearly, copying personal data to a hidden file on an iPhone is an unwanted function. Just as clearly, according to Mr. Hering, there is a growing number of such apps for which no one tests that the apps don’t do what they are not supposed to do.
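To make the idea concrete, here is a minimal sketch of what such a functional security test might look like for an iOS app, written as an XCTest that runs inside the app’s sandbox. Everything here is hypothetical: the test class name, the exerciseBankingWorkflow placeholder, and the sample fixture values are mine for illustration and have nothing to do with Citi’s actual code or test suite. The point is simply that the test asserts what the app must not do, namely leave nonpublic personal information behind in its local files, hidden or otherwise.

import XCTest

// A sketch of a "negative" functional security test, assuming a unit-test
// target hosted inside a hypothetical banking app so the test shares the
// app's sandbox. Nothing here reflects any real banking application.
final class FunctionalSecurityTests: XCTestCase {

    // Placeholder for driving the real app functionality, e.g., logging in
    // with a test account and requesting a balance. Assumed, not real.
    private func exerciseBankingWorkflow() {
        // ... exercise the app with known test data ...
    }

    func testNoPersonalDataPersistedToLocalFiles() throws {
        exerciseBankingWorkflow()

        // Known test-fixture values that must never appear on disk.
        let sensitiveStrings = ["TEST-ACCOUNT-9876", "123-45-6789"]

        // Enumerate every file in the app sandbox's Documents directory.
        // Hidden files are deliberately NOT skipped.
        let fileManager = FileManager.default
        let documentsURL = try fileManager.url(for: .documentDirectory,
                                               in: .userDomainMask,
                                               appropriateFor: nil,
                                               create: false)
        let enumerator = fileManager.enumerator(at: documentsURL,
                                                includingPropertiesForKeys: nil,
                                                options: [])
        while let fileURL = enumerator?.nextObject() as? URL {
            // Skip anything that cannot be read as text in this simple sketch.
            guard let contents = try? String(contentsOf: fileURL, encoding: .utf8) else {
                continue
            }
            for secret in sensitiveStrings {
                XCTAssertFalse(contents.contains(secret),
                               "Sensitive value written to \(fileURL.lastPathComponent)")
            }
        }
    }
}

A fuller suite would presumably also sweep the Library and Caches directories, the pasteboard, and anything else the app can write to, but even this simple form of test is aimed at the unwanted function itself rather than at the presence of required functionality.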

One Comment

  1. Greg Barnes Jan 16, 2011 at 2:31 pm

    Excellent article! For what it’s worth, I think you ‘got the idea across’ quite well.

    I like the point you made when you said “Security methods are used to implement privacy requirements, but ensuring that privacy requirements are met requires a separate privacy audit.”

    …Of course, none of us can remember everything all the time, but this concept is something that would do well to be added to a number of the testing frameworks already in play. In fact, I find it difficult to accept that something along these lines hasn’t been added as an element of the PCI DSS standard.
