The User is Not the Enemy: How to Increase Information Security Usability

“I know to err is human”, Agatha Christie’s alter ego, Ariadne Oliver, remarked in the 1969 novel Hallowe’en Party, “but a human error is nothing to what a computer can do if it tries.” It was a great line, except for one thing: behind every computer error, no matter how massive, there is at least one human.

“From a security perspective, classic human error remains the biggest vulnerability in most organizations I visit”, says Peter Wood, a member of the ISACA Conference Committee and founder of First Base Technologies, the penetration-testing specialist.

Wood specializes in social engineering and staff training – “opposite sides of the same coin.” While many of the organizations he talks to blame user stupidity, he disagrees. “They don’t understand how to do the job because they’re not trained very well, or they’re doing the wrong thing while trying to do the right thing.”

For example, from an information security perspective it’s a bad thing when users forward internal confidential email to their personal web email accounts, or take internal data home unencrypted on a flash drive or laptop – but those users are trying to be good employees by finishing their work. Meanwhile, users often fail to comply correctly with information security policies because they don’t understand them, think they don’t apply, or find them too difficult.

Take, for example, Wood says, the frequently implemented standard password policy: the 30-day change requirement. It is based on an outdated model from the days of mainframes, when 30 days was the estimated time needed to crack a password (today, on a Windows machine, it can take under two minutes).

“Typically, the guys who set security policy will go by the book – eight characters, mix of upper and lower case, numbers, symbols, and Microsoft provides enforcement in Windows – but they don’t think about how people actually might have to write it down to remember it, and that sort of mistake.”
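To make the gap concrete, the kind of ‘by the book’ complexity rule Wood describes can be sketched in a few lines. The check below is an illustrative Python sketch based only on the rules quoted above (eight characters, mixed case, numbers, symbols) – it is not the actual Windows enforcement mechanism.

```python
import re

def meets_policy(password: str) -> bool:
    """Illustrative 'by the book' complexity check, assuming the rules quoted
    above: at least eight characters with upper case, lower case, a digit and
    a symbol. Not the actual Windows policy engine."""
    if len(password) < 8:
        return False
    required = [
        r"[a-z]",          # lower case letter
        r"[A-Z]",          # upper case letter
        r"\d",             # digit
        r"[^A-Za-z0-9]",   # symbol
    ]
    return all(re.search(pattern, password) for pattern in required)

# A password can satisfy every rule and still be weak in practice:
print(meets_policy("Summer2009!"))  # True, yet predictable and easy to guess
```

Passing such a check says nothing about whether the user can remember the result without writing it down, which is exactly the mistake Wood describes.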

Too Difficult


The easier it is to comply with information security policies, the more likely users are to do so. The point at which users will comply (the ‘compliance threshold’) varies with organisational culture, the visibility of monitoring, the consistency of sanctions, and how much is asked of the user and when. Airport security, for example, is a much greater burden at the end of a long queue after a long night flight with small children than it is unaccompanied in an empty airport after a good night’s sleep.

An obvious approach is to use technology to force employees to comply. The international integrated security company Overtis, for example, sells software intended to help organizations manage insider information security threats. The software, says Richard Walters, the firm’s director of product management, runs on endpoints and provides onscreen prompts and dialog boxes reminding users of what is appropriate and what is not. It can ensure that data copied to removable media is encrypted, or that confidential spreadsheets are not emailed outside the company.

“We’re really about taking somebody’s infosecurity policy and embedding it into this framework that we have, so we can block and prevent certain activities. But we provide a dialog box to the user explaining why this particular activity has been blocked. Security is a process – it’s all about people and processes, and less about technology”, he says.
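The pattern Walters describes – intercept a risky action, block it if necessary, and tell the user why – can be outlined in a short sketch. The Python example below is a hypothetical illustration only; the FileAction fields and notify_user helper are invented for the sketch and have nothing to do with the actual Overtis product or its interfaces.

```python
from dataclasses import dataclass

@dataclass
class FileAction:
    path: str
    destination: str     # e.g. "removable_media" or "external_email"
    encrypted: bool
    classification: str  # e.g. "public" or "confidential"

def notify_user(message: str) -> None:
    # Stand-in for the onscreen prompt an endpoint agent would display.
    print(message)

def apply_policy(action: FileAction) -> bool:
    """Hypothetical endpoint check: block risky actions and explain why,
    rather than failing silently. Returns True if the action may proceed."""
    if action.destination == "removable_media" and not action.encrypted:
        notify_user("Blocked: files copied to removable media must be encrypted. "
                    "Encrypt the file and try again.")
        return False
    if action.destination == "external_email" and action.classification == "confidential":
        notify_user("Blocked: confidential documents may not be emailed outside the company.")
        return False
    return True

# Example: an unencrypted confidential spreadsheet headed for a USB stick is stopped,
# and the user is told the reason instead of being left to guess.
apply_policy(FileAction("q3_forecast.xlsx", "removable_media",
                        encrypted=False, classification="confidential"))
```

The design choice in the sketch mirrors Walters’ point: the block comes with an explanation rather than a silent failure.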

An Inherent Conflict

There are limits to this approach. Donald Norman, author of many influential books on usability, including the 1988 classic The Design of Everyday Things, inspired a generation of human-computer interaction researchers and the creation of usability departments in every software company of any note. Yet his principles – that user error is usually the fault of poor system design – have yet to make headway in the information security world.

Norman expresses the conundrum of information security this way: “The more secure you make a system, the less secure it becomes.” In other words, “When you make it too secure, people do workarounds.” From a human factor point of view, “There is an inherent conflict between security, which is trying to make it hard for inappropriate people to have access, and usability, which is trying to make it easy for people to do their work.”

Yet information security and usability have something significant in common: both tend to be tacked on at the end, after systems have already been designed. For that reason, both tend to be patch jobs that don’t work very well. “It means that both are inadequate – you have neither good security nor ease of use.”

For now, the most common answer to the problem of human error is training and raising awareness of the consequences of mistakes.

Testing, Testing

Peter Bassill, a senior member of ISACA and the CISO of a large e-commerce firm, says, “A lot comes down to users not being aware of what they’re doing.” The solution, he says, is “80% training and awareness, 20% about putting in technology that will detect and prevent a user breaking something accidentally or maliciously.” The security awareness program his company recently rolled out relies on small workshop sessions and a requirement for users to navigate a web-based security portal twice a year and achieve a pass mark on the questions at the end.

“Some users hate it and resent having to go through it”, Bassill admits, “but the majority go through and have nothing but good things to say. We tend to try to impart not just corporate infosecurity, but small messages that they can use at home as well – making sure their home machine is patched, things they don’t really think of too often.”

It’s vital, he says, to “make it personal to the user”. Unfortunately, he adds, too many CISOs “design a training program that’s very technical and all about security, which tends to make users turn off very quickly”.

National Academies of Science Usability, Security, and Privacy Committee

“I’m an end user”, says Norman, “but I’m non-typical”. By that he means that he is an organiser of the National Academies of Science (NAS) Usability, Security, and Privacy Committee, which has a series of meetings scheduled throughout 2009.

The goal: to find ways to advance the usability, security, and privacy of computer systems, and to investigate how to replace today’s uncomfortable and uneven trade-offs with solutions that satisfy all three. Cranor is also on the committee, along with leading experts from IBM, Sun, and others.

“Usability, security, and privacy people all have to be part of the design team”, says Norman. “It’s the only way we’re going to reach a solution, by working together so each understands the concerns of the other.”

The committee’s initial discussions revealed the first problem. “There are surprisingly few experts on security, and most of the stuff that’s designed is designed by programmers and people who don’t have any real understanding of security.

“Plus, there are surprisingly few experts on usability, and most stuff is designed by people who don’t know much about making things usable. So there are maybe one or two people in the world.”

Lorrie Cranor, a professor of computer science and of engineering and public policy at Carnegie Mellon University, and convenor of the Symposium on Usable Privacy and Security, has been trying to find a way around users’ lack of interest.

Recently, Cranor led a research project to study effective information security training, which she is turning into a start-up company, Wombat Security.

“Basically, we discovered that you really need a way to hook people and get them interested”, she says, “and the moment where they fall for an attack, or think they have, seems to be a great teachable moment. People don’t think it applies to them.” Cranor’s particular project focused on phishing attacks. “Being fooled is very powerful, and gets them paying attention.”
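The ‘teachable moment’ Cranor describes can be pictured as a simulated-phishing workflow: send a harmless decoy message and show the lesson only to the people who click. The sketch below is a hypothetical Python illustration of that idea, not Wombat Security’s actual system; the campaign data and training text are invented.

```python
from datetime import datetime

# Hypothetical record of a simulated (harmless) phishing campaign.
clicks = {}  # user email -> timestamp of click, if any

def record_click(user: str) -> str:
    """Called when a user clicks the decoy link: this is the teachable moment,
    so the lesson is delivered immediately rather than in an annual course."""
    clicks[user] = datetime.now()
    return (
        "This was a simulated phishing message sent by your own security team.\n"
        "Warning signs to check next time: an unexpected sender, a mismatched\n"
        "link address, and urgent requests for credentials."
    )

def campaign_summary(recipients: list[str]) -> dict:
    """Aggregate results so follow-up training can target the people who need it."""
    clicked = [u for u in recipients if u in clicks]
    return {"sent": len(recipients), "clicked": len(clicked), "who": clicked}

# Example: one user falls for the decoy and sees the lesson straight away.
print(record_click("alice@example.com"))
print(campaign_summary(["alice@example.com", "bob@example.com"]))
```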

Many Hats

Phishing is only one risk, but users’ problems with securing their home computers may provide an important opportunity for increasing information security awareness within companies.

Even users who do not believe their mistakes can put an entire company at risk will still take a personal interest in protecting their children online, staying free of viruses, and avoiding falling for phishing attacks. The principles they learn for home use, argues David King, chair of the Information Security Awareness Forum, can be carried into their company.

“One of the things that’s important in changing the culture, which is what we’re talking about here”, he says, “is recognizing that people have many hats.” The corporate employee is also a private individual, and perhaps a charity volunteer, or a club member, for example.

“For a few years now”, he says, “anti-virus vendors have been allowing companies to provide their software to users for home use, and it helps create a protected domain around the organisation. You can apply the same model to awareness.” Users, he concludes – echoing Norman’s argument of 20 years ago – are not stupid. “They might just not be making the appropriate decisions around security because they don’t necessarily see the world from a security point of view”.
