Digging Into the Science of Behavior to Tackle Cyber Extortion

Cyber extortion takes many forms, affecting both individuals and organizations. In all cases, the victim is coerced into giving up something of value – typically money. If they fail to do so, they risk having sensitive information exposed to the wider public or suffering further disruption to business operations. The most prominent example is ransomware, brought to the fore by recent, high-profile incidents such as the Colonial Pipeline attack.

Law enforcement authorities seeking to end such crimes are at a disadvantage, and criminals face next to no risk as they take advantage of ‘desperate buyers,’ driving extortion prices up in return for considerable profits.

Employees need to know what they are up against and how to detect threats and prevent initial compromise. How, then, should organizations go about this for optimal results?

Security Awareness Is Not One-Dimensional

Traditionally, awareness efforts relied solely on distributing information in the expectation that it would eventually trickle down into people’s actions and behaviors. However, awareness should not be treated as one-dimensional. Instead, behavioral interventions must be introduced to bridge the gap between merely knowing what threats are out there and facilitating intentional, lasting changes in habits to protect against them. As the social scientist BJ Fogg demonstrated, behavior results from three elements converging: motivation, ability and prompt.

Lessons From Fogg’s Behavior Model

Looking first at motivation, Fogg’s behavior model highlights three core motivators – sensation, anticipation and belonging – each of which has two sides: pleasure vs. pain, hope vs. fear, and acceptance vs. rejection. All of these play into the human experience and can be tapped to steer an individual’s attitude and behavior in a desired direction.

On the one hand, pleasure and positive associations can be fostered through visually appealing content, story-based techniques and humor. While humor is highly effective at grabbing people’s attention and aiding memory retention, it should be used with caution. Going too far with humorous approaches can undermine your efforts, with audiences choosing to disregard your core messaging. Moreover, if humor is employed at the expense of others or without sensitivity to the audience’s culture, it can backfire.

Conversely, organizations could capitalize on fear. Although this might work for some people, its effect is short-lived and often leads employees to harbor resentment, distrust or apathy toward security teams and their initiatives. Instead, organizations are better off providing employees with the knowledge and tools to feel empowered to defend themselves.

This might include leveraging the power of leadership or celebrity. For example, appoint a well-respected member of the company as a security advocate who can invoke a sense of belonging and shared responsibility for security among employees. Training should also be made personally relevant, for instance by including lessons on protecting family members. Furthermore, offer public recognition and positive feedback when a phishing attempt is successfully thwarted.

Overall, the key lies in building a strong relationship, grounded in trust, between security teams and the rest of the community. When there is trust, employees will be willing to do the work and adopt new, secure behaviors. They will also be more open with the security team about any mistakes they have made, so that issues are rectified sooner rather than later. To see why training should not be about tricking people, one need look no further than the West Midlands Trains phishing simulation test conducted in 2021. The incident led to unhappy employees, bad press and the rail union calling out the test as a “cynical and shocking stunt.”

Next, we consider ‘ability.’ Organizations should expect resistance when reshaping behaviors, as humans are creatures of habit. The transition therefore needs to be as straightforward and uncomplicated as possible to break down this barrier. For example, organizations can introduce tools such as a password manager or a phish-alert button that makes it easy to flag suspicious emails to the company’s security team. Games that ‘train’ individuals to spot phishing attacks through repetition can also help convert knowledge into intuitive situational awareness.

Finally, do not forget to use prompts, otherwise referred to as cues, triggers or calls to action. These are small reminders or notices that get an individual to implement a security measure, such as enabling multi-factor authentication or thinking twice before opening an attachment. When building a ransomware awareness campaign, such prompts should be placed with intention: when a user first joins the company, within the user’s email client or alongside notes about the latest circulating scams.

Closing Thoughts

When building a strategy to tackle cyber extortion, it is crucial to remember the complexity of human behavior. As demonstrated, doing so requires a careful balance of humor, gamification and appropriate tooling, among other considerations. Remember that it is not solely down to the IT team to plan and execute the cybersecurity strategy. Executives must recognize their role both in securing budget approval for the campaign and in serving as its face. All departments should also step in to offer their insights and approval. Cybersecurity awareness is a team effort.
