A Culture of Security, Not of Blame


For some time now, our industry has been blaming employees for security breaches. Terms like the ‘insider threat’ and the ‘weakest link’ are flourishing, making for easy sales for phishing assessment providers and security awareness content suppliers. As some people are beginning to recognize, there are a number of problems with this approach to improving security.

Bruce Schneier is often quoted as saying “if you think you can solve the security problems with technology, you don’t know technology”. Likewise, I say: ‘if you think you can solve security problems with awareness, you don’t know people’.

Technology continuously introduces a huge number of security challenges and risk factors, which we keep blaming employees for not handling correctly. Blaming people for not handling poor technology correctly is - in my opinion - simply wrong. In fact, blaming the employee for clicking on a phishing link, or opening an attachment, is like building a car with poor brakes and then blaming the driver when the car crashes. Guess what: with cars, the manufacturer does not get away with blaming the weakest link or the stupid driver. Why do we expect IT security to be any different? Perhaps we are just too lazy to properly secure our technology?

It took quite some time, and a large number of car accidents, before Volvo became one of the first car manufacturers to make seat belts a standard item in the 1960s. Did accidents go down? Did people stop dying in accidents? No, not to any large extent. People had the technology, and it was easy to use: wearing a seat belt did not require formal training.

People were certainly aware, too: newspapers covered car accidents (much as they write about security breaches today), and governments (at least in developed countries) invested in driver safety campaigns just as you invest in security awareness programs.

Given how we insist on working with security awareness in our industry, surely we must believe that these efforts by car manufacturers, governments and news outlets reduced the number of accidents, mutilations and deaths? I mean, why else do you still insist on running those security awareness training programs? Eventually they will work, I hear you say. We both know you don’t really believe that.

The fact is that they don’t, just as traffic safety awareness programs did not work back in the day. A large and growing body of research into human behavior clearly demonstrates that even when we know what the right thing to do is, we consistently do something else. We even know why: we don’t care about doing the right thing, we care about feeling right!

Our aim, then, should be to make people feel right when they do what they should be doing. Which leads to the question: how do we make people feel right? Again, science has a bundle of clues.

Humans are social beings, even we individualistic Westerners. Social beings rely heavily on rules of engagement in their social interactions. What makes humans different from other social beings is our ability to generate meaning outside of our physical world. Using stories, we create huge social constructs with intricate regulations controlling our ideas, customs and social behaviors. These stories, and the rules that regulate them, are a very strong controlling mechanism, one that makes people fall in line, change behaviors and dedicate themselves to purposes larger than themselves.

As security experts, we should tap into these social constructs to change the behaviors of our colleagues. We need to understand how people interact and how we are influenced by one another. Here, social engineering, applied correctly, is key: it provides a window into how people in general are influenced by others through cues, messages and (not so) subtle signals.

One reason social engineering competitions are so successful at events like Def Con/Black Hat is that, most of the time, you can simply follow a script and eventually get the results you want. The interesting part is not that someone gets a prize for winning the ‘competition’, but that the fact it is a simple formula, one that works over and over again, is lost on most people, even the competitors themselves.

We are so bound to our patterns and rules that science calls us predictably irrational - our irrational nature can be controlled predictably.

We can apply the same understanding that social engineers and social scientists demonstrate to our own security culture programs. Jenny Radcliffe is doing some great work in this area with her Human Firewall project.

A growing number of people are starting to realize that security awareness is simply one part of security culture, and a small one at that. As our own research demonstrates, if you want to change behaviors you must work with a complete, holistic program, one that incorporates technology, people and policies. Not one by one, but together.

We can learn from the mistakes of others too, for example the aforementioned car industry. Technology and safety awareness campaigns were not enough to make people start wearing seat belts. It took regulatory change, where governments imposed fines for not wearing them and then had the police enforce the policy change. Technology, people and policies. Working together for a better future.
