The Pitfalls of Awareness

We’ve just reached the end of Cyber Security Awareness Month and, as well as the various initiatives aimed at driving awareness, we saw some huge breaches and cyber-attacks hitting the headlines, from Yahoo’s historic breach to the DDoS on Dyn. If the aim of Cyber Security Awareness Month is to talk about cybersecurity and generate discussion of the threat and likely impacts, it can probably be counted as a success.

But what we really need to consider is: what is the aim of awareness?

Awareness-raising is often perceived to be a good thing in and of itself. Many organizations, unfortunately, still put awareness-raising training in place because they think they should, or do it to tick the compliance box. It can’t do any harm, right? There may be hope that when people undertake the training some messages will stick, but probably not much expectation that there will be a big impact. When passwords don’t improve and people keep clicking on phishing links, eyes will inevitably roll at the ‘stupid users’ who ‘just don’t get it’.

The problem with this approach is that awareness-raising done simply for the sake of it usually has a negative impact, leading to worse behaviors. Awareness-raising training and communications should be designed and delivered carefully, with a focus on the intended outcome: changing behaviors for the better. Awareness-raising also needs to be carried out in consideration of the pitfalls of awareness and how to avoid them.

I consider the three pitfalls of awareness to be fatigue, fear and false flags.

Fatigue
NIST recently published research which found that people are experiencing security fatigue. One participant commented: “I think I am desensitized to it… people get weary of being bombarded by ‘watch out for this, watch out for that’”.

Awareness-raising done simply for the sake of it usually has a negative impact, leading to worse behaviors

This sums up one of the problems with awareness-raising. When it is done the wrong way, awareness-raising can actually encourage people to switch off from security. Rather than becoming more engaged, people can feel overwhelmed and exhausted. To overcome this, NIST recommends limiting the number of security decisions users need to make, making it simple for users to choose the right security action and designing for consistent decision-making wherever possible. In terms of awareness-raising, I would add to this list the need for training and communications to be engaging, innovative and impactful. Training should not be about giving people a list of don’ts. When you’re asking people to change their behaviors, the priority should be on telling them why they should do so. Helping people understand how the threat actually becomes reality is key to constructive awareness-raising.


Fear

In this industry, we are often accused of peddling fear, uncertainty and doubt. I do a lot of awareness-raising work in organizations, and talking about scary stuff is intrinsic to talking about cybersecurity. We have to discuss the threats people face and, when we do so, people will feel scared. It became apparent to me early on that we too often do this in a way that pushes people into denial (“I won’t get hacked, no-one would be interested in my data”) or avoidance (“the internet is full of criminals, I’m not going to use it”). For this reason, I’ve explored the psychology and sociology of fear to see what lessons we can learn about communicating cybersecurity messages. When people are simply scared, they engage with the emotion of fear, and they process that emotion by retreating into denial or avoidance. However, research shows (and my experience confirms) that when people are exposed to scary messages in the right way, they engage with the danger, the actual threat, and you are far more likely to see a positive change in behaviors. Discussing scary things with people in the right way means explaining why the threat is relevant to them, why the behavioral changes you are recommending will be effective and, crucially, supporting them in putting those responses in place. If you are asking people to have more complicated and unique passwords, how are you going to recommend they manage those passwords? If you want people to use two-factor authentication, will you take them through the set-up process step-by-step? Good awareness-raising training doesn’t simply scare people and push demands onto them; it supports and empowers them.

If you are asking people to have more complicated and unique passwords, how are you going to recommend they manage those passwords?

False Flags

Raising awareness of the malicious insider threat, and discussing the profile of the ‘typical’ malicious insider, can encourage people to falsely identify co-workers who seem to fit the profile as engaging in malicious activity. This has parallels with the Baader-Meinhof phenomenon, in which we learn something new and then repeatedly encounter the same knowledge again soon after. Human brains love patterns, and when we start to identify a pattern, we have a tendency to ignore the information which doesn’t fit into it. When we raise awareness of the malicious insider threat, people can go away thinking that they see patterns of malicious insider activity where there are none. This can lead to a couple of problems. Firstly, people who are not malicious can be falsely accused, potentially causing an HR issue and a whole host of unwanted consequences. Secondly, too many false reports can lead to a dismissive attitude both from those reporting and from those receiving the reports. As the boy who cried wolf taught us, this can result in truly malicious activity going unidentified or unreported when it does occur. It is important, therefore, to raise awareness of the profile of malicious insider activity within the context of an understanding that ‘one swallow does not a summer make’. Just because someone fits the profile of a malicious insider does not mean they are one. Awareness-raising training should help contextualize the malicious insider threat and not over-emphasize the profiling of malicious insiders.

When taking forward the messages of Cyber Security Awareness Month, I hope you’ll remember the three f’s of awareness pitfalls and take steps to avoid fatigue, fear and false flags.


This blog was written by Dr Jessica Barker. With a background in sociology and civic design, Dr Jessica Barker specializes in the human side of cybersecurity. As an independent consultant, Jessica is engaged by FTSE100 companies, central government and SMEs across the defense, health, financial and retail sectors to advise organizations on how they can keep their information safe while getting the most out of it. Jessica’s consultancy work involves designing and delivering research projects as well as leading information security audits. Jessica also specializes in learning and development packages which raise cybersecurity awareness and improve behaviors.

About The Risk Avengers:

The Risk Avengers is a collaboration of three well-respected and experienced industry experts. Dr Jessica Barker, Sharon Conheady and Toni Sless pool their extensive knowledge and experience in the fraud prevention, physical security, cybersecurity, social engineering and penetration testing arenas. The Risk Avengers provide consultancy and training to businesses on the minefield that is information security, fraud awareness and prevention.
