The user is not the enemy: How to increase information security usability

“I know to err is human”, Agatha Christie’s alter ego, Ariadne Oliver, remarked in the 1969 novel Hallowe’en Party, “but a human error is nothing to what a computer can do if it tries.” It was a great line, except for one thing: behind every computer error, no matter how massive, are one or more humans.

“From a security perspective”, says Peter Wood, a member of the ISACA Conference Committee and founder of First Base Technologies, “classic human error remains the biggest vulnerability in most organisations I visit.”

Wood specialises in social engineering and staff training – “opposite sides of the same coin.” While many of the organisations he talks to blame user stupidity, he disagrees. “They don’t understand how to do the job because they’re not trained very well, or they’re doing wrong by doing the right thing.”

For example, from an information security perspective it’s a bad thing when users forward internal confidential email to their personal web email accounts, or take internal data home unencrypted on a flash drive or laptop – but those users are trying to be good employees by finishing their work. Meanwhile, users often fail to comply correctly with information security policies because they don’t understand them, think they don’t apply, or find them too difficult.

Take, for example, Wood says, the oft-implemented standard password policy: the 30-day change requirement, based on an outdated model from the mainframe era, when 30 days was the estimated length of time it took to crack a password (now, on a Windows machine, it’s under two minutes).

“Typically, the guys who set security policy will go by the book – eight characters, mix of upper and lower case, numbers, symbols, and Microsoft provides enforcement in Windows – but they don’t think about how people actually might have to write it down to remember it, and that sort of mistake.”
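
Purely as an illustration (this is not Microsoft’s enforcement code; the rules are the generic ones Wood lists), the whole of such a policy fits in a few lines of Python – which is exactly the problem: a checker can verify the character classes, but not the human cost of remembering a fresh password every 30 days.

    import string

    def meets_policy(password):
        """Check a password against the complexity rules described above:
        at least eight characters, mixed case, a digit and a symbol.
        It says nothing about whether anyone can remember the result."""
        return (
            len(password) >= 8
            and any(c.isupper() for c in password)
            and any(c.islower() for c in password)
            and any(c.isdigit() for c in password)
            and any(c in string.punctuation for c in password)
        )

    # 'Summer2009!' satisfies every rule - which is exactly the kind of password
    # people pick, reuse and write down when forced to change it every month.
    print(meets_policy("Summer2009!"))          # True
    print(meets_policy("correcthorsebattery"))  # False: lower case only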

"The moment where they fall for an attack, or think they have, seems to be a great teachable moment. People don't think it applies to them."
Lorrie Cranor, Carniege-Mellon University

Too difficult

This is not a new issue. Angela Sasse, head of information security research for the Human Centred Systems Group at University College London, began her career in human factors in 1990, when BT – trying to stem the rapidly escalating costs of its internal help desks – asked her to look into why the company’s users had so much trouble with passwords. The result, which Sasse wrote up in her paper Users Are Not the Enemy, concluded that the password regime was simply too difficult for users to cope with.

More recently, Sasse was part of a project that interviewed 17 employees from two major commercial organisations to understand their compliance, or lack thereof, with information security policies. This research, published in 2008, developed the concept of the ‘compliance budget’. Most information security failures are due to human error, yet punishing employees is ineffective at changing behaviour. The approach the resulting paper proposes is to think of employee compliance as “a finite resource that needs to be carefully managed”.

The easier it is to comply with information security policies, the more likely that users will do it, and the point where users will comply (the ‘compliance threshold’) varies according to organisational culture, the visibility of monitoring, the consistency of sanctions, and how much is asked of the user and when. Airport security, for example, is a much greater burden at the end of a long queue after a long night flight with small children than it is unaccompanied in an empty airport after a good night’s sleep.
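
The paper presents the idea qualitatively; purely as an illustrative sketch of the concept (not code from the research, and the tasks and numbers are invented), the compliance budget can be thought of as a running total of perceived effort measured against a threshold:

    from dataclasses import dataclass

    @dataclass
    class SecurityTask:
        name: str
        effort: float  # perceived cost to the user, in arbitrary units

    def complied_with(tasks, threshold):
        """Toy model of the 'compliance budget': the user complies while the
        cumulative perceived effort stays within their threshold, then stops
        (or starts working around whatever demands remain)."""
        spent, done = 0.0, []
        for task in tasks:
            if spent + task.effort > threshold:
                break
            spent += task.effort
            done.append(task.name)
        return done

    day = [
        SecurityTask("lock screen when leaving desk", 1),
        SecurityTask("log in via VPN token", 2),
        SecurityTask("encrypt USB stick before copying files", 4),
        SecurityTask("choose and memorise a new password", 5),
    ]
    print(complied_with(day, threshold=12))  # a quiet day: everything gets done
    print(complied_with(day, threshold=5))   # a tired, rushed user: later demands are skipped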

An obvious approach is to force employees to comply through technology. The security company Overtis, for example, sells software intended to help organisations manage insider information security threats. The software, says Richard Walters, director of product management, runs on endpoints (computers, some PDAs) and provides onscreen prompts and dialog boxes reminding users what’s appropriate and what’s not. It can ensure that data put on removable media is encrypted, or that confidential spreadsheets are not emailed outside the company.

"[The solution] is 80% training and awareness, 20% about putting in technology that will detect and prevent a user breaking something accidentally or mailciously."
Peter Bassill

“We’re really about taking somebody’s infosecurity policy and embedding it into this framework that we have, so we can block and prevent certain activities. But we provide a dialog box to the user explaining why this particular activity has been blocked. Security is a process – it’s all about people and processes, and less about technology”, says Walters.
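
How such an agent is implemented is vendor-specific and not described here, but conceptually it evaluates each user action against a rule and, crucially, returns the explanation that dialog box will show rather than failing silently. A hypothetical sketch (not Overtis code; the rules and field names are invented):

    from dataclasses import dataclass

    @dataclass
    class FileAction:
        filename: str
        destination: str     # e.g. "removable_media", "external_email", "internal"
        encrypted: bool
        classification: str  # e.g. "public", "confidential"

    def evaluate(action):
        """Hypothetical endpoint rule check: decide whether to allow the action
        and always return the message the user will see in the dialog box."""
        if action.destination == "removable_media" and not action.encrypted:
            return False, "Blocked: data copied to removable media must be encrypted first."
        if action.destination == "external_email" and action.classification == "confidential":
            return False, "Blocked: confidential documents may not be emailed outside the company."
        return True, "Allowed."

    allowed, message = evaluate(
        FileAction("q3-forecast.xlsx", "external_email", encrypted=False, classification="confidential")
    )
    print(allowed, "-", message)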

An inherent conflict

There are limits to this approach. Donald Norman, author of many influential books on usability, including the 1988 classic The Design of Everyday Things, inspired a generation of human-computer interaction researchers and the creation of usability departments in every software company of any note. Yet his principles – that user error is usually the fault of poor system design – have yet to make headway in the information security world.

Norman expresses the conundrum of information security this way: “The more secure you make a system, the less secure it becomes.” In other words, “When you make it too secure, people do workarounds.” From a human factors point of view, “There is an inherent conflict between security, which is trying to make it hard for inappropriate people to have access, and usability, which is trying to make it easy for people to do their work.”

Yet security and usability have something significant in common: both tend to be tacked on at the end, after systems have already been designed. For that reason, both tend to be patch jobs that don’t work very well. “It means that both are inappropriate and you have neither good security nor is it easy to use.”

For now, the most common answer to the problem of human error in information security is training and raising awareness of the consequences of mistakes.

Testing, testing

Peter Bassill, a member of the London chapter of ISACA and the CISO of a large company, says, “A lot comes down to users not being aware of what they’re doing.” The solution, he says, is “80% training and awareness, 20% about putting in technology that will detect and prevent a user breaking something accidentally or maliciously.” The information security awareness programme his company recently rolled out relies on small workshop sessions and a requirement for users to navigate a web-based security portal twice a year and achieve a pass mark on the questions at the end.

“There is an inherent conflict between security, which is trying to make it hard for inappropriate people to have access, and usability, which is trying to make it easy for people to do their work.”
Donald Norman

“Some users hate it and resent having to go through it”, he admits, “but the majority go through and have nothing but good things to say. We tend to try to impart not just corporate infosecurity, but small messages that they can use at home as well – making sure their home machine is patched, things they don’t really think of too often.”

It’s vital, he says, to “make it personal to the user”. Unfortunately, he says, too many CISOs “design a training programme that’s very technical and all about security, which tends to make users turn off very quickly”.

Lorrie Cranor, assistant professor in computer science, engineering, and policy at Carnegie Mellon University, and convenor of the Symposium on Usable Privacy and Security, has been trying to find a way around users’ lack of interest in information security.

Recently, Cranor led a research project to study effective information security training, which she is turning into a start-up company, Wombat Security.

National Academies of Science Usability, Security, and Privacy Committee

“I’m an end user”, says Norman, “but I’m non-typical”. He means that he is an organiser of the National Academies of Science (NAS) Usability, Security, and Privacy Committee, which has a series of meetings scheduled throughout 2009.

The goal: to find ways to advance the usability, security, and privacy of computer systems and investigate how to replace today’s uncomfortable and uneven trade-offs with information security solutions that satisfy all three characteristics. Cranor is also on the committee along with leading experts from IBM, Sun, and others; Angela Sasse is among the ‘provocateurs’.

“Usability, security, and privacy people all have to be part of the design team”, says Norman. “It’s the only way we’re going to reach a solution, by working together so each understands the concerns of the other.”

The initial chat among the committee revealed the first problem. “There are surprisingly few experts on security, and most of the stuff that’s designed is designed by programmers and people who don’t have any real understanding of security.

"Plus, there are surprisingly few experts on usability, and most stuff is designed by people who don’t know much about making things usable. So there are maybe one or two people in the world.”

“Basically, we discovered that you really need a way to hook people and get them interested”, says Cranor, “and the moment where they fall for an attack, or think they have, seems to be a great teachable moment. People don’t think it applies to them.” Cranor’s particular project focused on phishing attacks.

“Being fooled is very powerful, and gets them paying attention.”
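
One way to read that finding as a process: a harmless simulated attack is followed, for anyone who falls for it, by a short lesson delivered at that exact moment rather than in an annual course. The sketch below is illustrative only – the function names and click-rate stub are invented, not Wombat’s product or API:

    import random

    def run_simulated_campaign(users, send_lure, show_lesson):
        """Illustrative 'teachable moment' loop: send a harmless simulated
        phishing email and show a short lesson to anyone who clicks,
        at the moment they realise they have been fooled."""
        caught = []
        for user in users:
            if send_lure(user):       # returns True if the user clicked the link
                show_lesson(user)     # the teachable moment
                caught.append(user)
        return caught                 # candidates for follow-up training

    # Example wiring with stubbed behaviour:
    staff = ["alice", "bob", "carol"]
    clicked = run_simulated_campaign(
        staff,
        send_lure=lambda u: random.random() < 0.3,  # stub: roughly a 30% click rate
        show_lesson=lambda u: print(f"{u}: here is how to spot that email next time"),
    )
    print("Follow up with:", clicked)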

Many hats

Phishing is only one risk, but users’ problems with securing their home computers may provide an important opportunity for increasing information security awareness within companies.

Even users who do not believe their mistakes can put an entire company at risk will still take a personal interest in protecting their children online, staying free of viruses, and avoiding falling for phishing attacks. The information security principles they learn for home use, argues David King, chair of the Information Security Awareness Forum, can be carried into their company.

“One of the things that’s important in changing the culture, which is what we’re talking about here”, he says, “is recognising that people have many hats.” The corporate employee is also a private individual, and perhaps a charity volunteer, or a club member, for example.

“For a few years now”, he says, “anti-virus vendors have been allowing companies to provide their software to users for home use, and it helps create a protected domain around the organisation. You can apply the same model to awareness.” Users, he concludes – as Norman argued 20 years ago – are not stupid. “They might just not be making the appropriate decisions around security because they don’t necessarily see the world from a security point of view”.
