Sam Dekay

Does Security Awareness Work (Pt. 2)? It All Depends on What You Mean by “Work”

Several weeks ago this column printed an article entitled, “Does Security Awareness Work? Some Answers from Experimental Research.” The article presented results from three published experiments concerning the effectiveness of awareness programs. In the final paragraph of that piece, readers were presented with a “teaser”: A second article, discussing the results of the experiments and attempting to extract lessons from their results, would soon be published. So here we are.

Are the Experiments Valid?

It is entirely fair to acknowledge that these experiments—like controlled experiments in all fields—raise questions of validity. In other words, the design of an experiment may cast doubt on whether its results accurately measure the intended outcomes. In the case of these experiments, the “outcomes” were the extent to which security awareness programs actually changed users’ behaviors to reflect best security practices.

The validity of all three experiments may be questioned. For example, published results of the Carnegie-Mellon study do not describe the educational materials presented to Groups 1 and 2. Therefore, we don’t know whether the content of the materials may have inadvertently affected experimental results. The same criticism may be raised against the German study. Interestingly, for instance, the consultants were able to gain vital information and penetrate security controls by means of social engineering; however, the brief published account of the study does not mention whether social engineering was even discussed in the corporate security awareness program. The Military Academy study may be tainted because the cadets were sent bogus email messages concerning possible problems with grades. The messages were sent at a point in the semester when grades were especially important to the students. Had the messages been sent at a different time, the experimental results might have been entirely different.

However, if we are willing to set aside these possible issues of validity, the three experiments still yield remarkably consistent results that can deepen our understanding of how security awareness programs might actually help our users to adopt behaviors that reflect information security best practices.

Information about “Best Practices” is Not Enough

All three experiments demonstrate that the simple presentation of security-related information does not, by itself, result in altering behavior. Thus, posters, security awareness messages, or webinars are not by themselves sufficient to establish a culture of security-conscious users. This result should not be surprising, because behavioral psychologists have, for several decades, emphasized that desirable behaviors must be reinforced with positive or negative feedback. (The so-called school of “neo-behaviorists” maintains that negative feedback is actually more effective.) The mere presentation of information offers no feedback, either positive or negative. To inform users, for example, that “Security is Everyone’s Business” is unlikely to generate excitement—or a genuinely aware workforce.

The only experiment that incorporated the use of feedback was the Carnegie-Mellon study. Here, when the duped members of Group 1 clicked on an embedded link in their email message, they were informed of their careless behavior and routed to security awareness information. One week later, these Group members remembered their experience and refused to respond to a second bogus email. However, members of the other two Groups—who had not experienced the negative reinforcement of responding to a phisher—demonstrated no increased security awareness from the first week to the second. The results of this experiment—together with the findings of behavioral psychology—tend to emphasize that awareness instruction must also be accompanied by positive or negative feedback in order to influence users’ security practices.
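For readers who run their own “self-phishing” exercises, the feedback loop described above can be sketched in a few lines of code. This is purely illustrative—the class, the training URL, and the user IDs are all made-up names, not part of any real tool or of the studies discussed here. The idea is simply that a click on a simulated phishing link is logged and immediately answered with awareness material, so that click rates can be compared across campaigns.

```python
# Hypothetical sketch of an "embedded training" feedback loop: users who
# click a simulated phishing link are redirected to awareness material,
# and clicks are logged so behavior change can be measured over time.
# All identifiers below are illustrative assumptions.

from collections import defaultdict

TRAINING_URL = "https://intranet.example.com/security-awareness"  # placeholder


class PhishingCampaignTracker:
    def __init__(self):
        # campaign name -> set of user IDs who clicked the bogus link
        self.clicks = defaultdict(set)

    def record_click(self, campaign, user_id):
        """Log the click and return the training page the user is routed to."""
        self.clicks[campaign].add(user_id)
        return TRAINING_URL

    def click_rate(self, campaign, recipients):
        """Fraction of campaign recipients who clicked (lower is better)."""
        return len(self.clicks[campaign]) / len(recipients)


recipients = ["u1", "u2", "u3", "u4"]
tracker = PhishingCampaignTracker()
for uid in ["u1", "u2", "u3"]:       # week 1: three of four users click
    tracker.record_click("week1", uid)
tracker.record_click("week2", "u4")  # week 2: only one user clicks
print(tracker.click_rate("week1", recipients))  # 0.75
print(tracker.click_rate("week2", recipients))  # 0.25
```

The point of the sketch is the pairing emphasized by the experiment: the click itself triggers the (negative) feedback, and the per-campaign log supplies the before-and-after comparison.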

Unfortunately, however, security awareness is an elusive goal. Some components of awareness (such as informing employees that they must not open suspicious email messages) may be designed to include some form of reinforcement. Other components, however, are not as amenable to creating a user “experience.” For instance, what positive (or negative) reinforcement can accompany the instruction not to share passwords with another employee? If employees adhered to a strict honor code, and dutifully reported all instances of password sharing, then reinforcement would definitely be available. Similarly, if supervisors regularly admonished employees for sharing, the practice would probably be curtailed. In the absence of these cultural and environmental factors, however, it will be difficult to reinforce behaviors related to passwords.

Certain aspects of security awareness, such as informing users which behaviors must be avoided to prevent an infestation of malware, may be reinforced by technical means. If, for example, forensic investigation identifies that a viral infestation originated at a specific desktop, it is possible to speak with the relevant user and deliver an ad hoc lecture concerning the dangers of downloading unknown content from the Web. However, it may be difficult to locate the “relevant” user.

The Metrics of Security Awareness

Of the three experiments cited, probably the German Study is most successful from the perspective of metrics commonly used to gauge security awareness programs. Often, auditors and regulators raise the following questions regarding awareness:

  • Do you have a security awareness program?
  • What budget have you allocated for security awareness?
  • What kinds of awareness materials do you distribute?
  • How frequently do you distribute the materials?
  • What are the topics discussed in your awareness messages and posters?

By all accounts, the managers responsible for the German study could have responded positively, and in considerable detail, to all these questions. Yet none of these questions addresses the actual effectiveness of the program.

So, does security awareness work? Well, it depends on what you mean by “work.” Most information security professionals are not really required to demonstrate that their awareness programs are influencing the behaviors of users. The most easily acquired metrics, those concerning the structure of the program itself, usually satisfy auditors and other investigators. And metrics that might indicate true program effectiveness, such as indicators that users are sharing fewer passwords than in the past, may be difficult or impossible to acquire.

If you are truly interested in designing an awareness program that can demonstrate modification of users’ behaviors, it seems you must be quite realistic. That is, the content of awareness messages must be planned in such a manner that user behavior can actually be monitored and progress (or lack thereof) measured. One example of this sort of content is our continual battle against malware. Because the presence of viruses, worms, and other assorted nastiness can be detected, it is possible to determine if an awareness campaign to eradicate infestation has been effective. Also, please remember that your focus is upon influencing users’ behaviors; this will require some form—positive or negative—of reinforcement. Posters, no matter how eye-catching the graphics, will not a community of security-conscious users make.
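The anti-malware example above lends itself to a concrete metric. The sketch below is one assumed way of quantifying it—comparing average monthly infection counts before and after a campaign. The numbers and the function name are invented for illustration; real data would come from your antivirus or forensic tooling, and a real evaluation would also need to control for other factors (new controls, seasonal traffic) before crediting the awareness campaign.

```python
# Illustrative only: quantify whether an anti-malware awareness campaign
# coincided with fewer detected infections, by comparing average monthly
# detection counts before and after the campaign. Data is made up.

def relative_change(before_counts, after_counts):
    """Percentage change in average monthly infections (negative = improvement)."""
    before_avg = sum(before_counts) / len(before_counts)
    after_avg = sum(after_counts) / len(after_counts)
    return 100.0 * (after_avg - before_avg) / before_avg


before = [40, 38, 42]  # infections per month, pre-campaign
after = [30, 28, 32]   # infections per month, post-campaign
print(f"{relative_change(before, after):+.1f}%")  # -25.0%
```

A number like this is exactly the kind of effectiveness metric the article argues auditors rarely ask for—but that an awareness program designed around measurable behaviors can actually produce.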

3 Comments

  1. Scott Wright Apr 22, 2008 at 8:16 am

    As with any awareness campaign, the act of communicating content itself is only a small part of a successful campaign or program. As Sam points out above, things like budget, frequency and topics are considerations that ultimately contribute to changes in an organization’s outcomes and business performance.

    One of the things I believe is most important is creating “sticky messages”. A great book that helped me understand the essentials of sticky message content is “Made to Stick” by Chip Heath and Dan Heath. The subtitle is, “Why some ideas survive and others die.”

    Without a memorable message that people can act on, success is unlikely, no matter how you measure it. With a good message, the effectiveness of penetration can be pretty obvious.

  2. Gary Hinson May 13, 2008 at 11:40 pm

    Personally, I’m interested in (a) promoting information security awareness, making people more aware of the security issues they face at work and at home, (b) modifying individual behaviours, for example taking security into account when doing things and hopefully avoiding overtly risky activities, and (c) making this a widespread change, in other words a cultural shift embedded in the organization.

    Measuring and demonstrating the effectiveness of the awareness program is a separate issue. Tests, surveys and trials (such as the self-phishing or pen test ideas you mention) certainly generate information and, if properly designed, can generate valid statistics, but there are many other ways of measuring and reporting, for example counting the increased number of calls to the IT help desk or information security team, or page views on information security’s intranet website, directly relating to a recent security awareness initiative. If you don’t mind spending some $$ to get the stats, you could even conduct behavioral assessments using observation and recording of employees in real-life or experimental conditions – there’s the whole field of behavioral and sociological sciences at your disposal.

    The bigger question, though, is why do you need stats? A truly effective security awareness program should be self evidently effective. There should be a buzz around the place when new awareness topics are covered. Managers, staff and specialists should be talking about the program, doing the things it is suggesting, and ideally coming up with their own ideas (good communications are not broadcasts but two way – like comments on blogs!). Security policies should be up to date and referenced. People should be using and updating security-related standards, guidelines and procedures. Employees should be getting feedback when they ‘do the right thing’ as often as they are chastised for doing something insecure. People should be looking forward to the next awareness topic, and keen to get involved in the seminars, competitions and other learning opportunities. Information security should be nearer everyone’s ‘front of mind’ than if no awareness program was in place. Perhaps MRI scans would give you the stats you seek!!

    Kind regards,
    Gary Hinson

  3. Randolph Smith May 14, 2008 at 8:07 am

    Consider that if a security awareness program is delivering useful results, then the target audience should be able to:
    1. Know what is expected of them
    2. Distinguish between acceptable and unacceptable behavior
    3. Take appropriate actions

    It is good that you point out that constructing valid experiments is not as easy as we would hope. Yet, good experiments are necessary to answer the most important question about an awareness program – Was it worth the effort?

    Sometimes just asking the right questions gets useful answers to the first two elements. Whether or not people will do the right thing when confronted with a security challenge is the heart of the matter. Regardless of what they know, what people actually do in a given situation is the true measure of an awareness program.

    If we are not willing to observe and measure real behavior, we will never know if the messages ever had the desired effects.

One Trackback

  1. […] (click HERE) This is the second post by Sam in recent weeks on examples of real security awareness campaigns, […]
