“In this world nothing can be said to be certain, except death and taxes”, Benjamin Franklin wrote in 1789. If he were alive now, he might have added auditors to the list.
Today's bounty of security regulations and standards – Sarbanes-Oxley, the UK’s Data Protection Act, ISO 27001, the PCI DSS standard for credit card data – all carry detailed compliance requirements. The result, said Josh Corman, Akamai's director of security intelligence, at the October 2012 RSA Conference in London, is that many businesses actually fear auditors more than they fear hackers. You can see his point: a hacker may come, but the auditor will come.
“Auditors are feared”, agrees Allan Boardman, international vice president for ISACA; his career includes extensive experience in auditing, security, and risk management. Boardman finds this attitude frustrating: “Auditors have fairly wide and general knowledge, and they're an asset to the organization if they're used properly, but people just want to get them out of the way and not tell them anything they don't have to, out of fear of revealing inner secrets. Auditors are frustrated, because they feel they don't get full cooperation.”
This attitude may even be completely logical given the company's circumstances. Sue Milton, the president of the London ISACA chapter and a freelance consultant on corporate good governance, points out that the stick regulators use – that a company, especially in the financial sector, may lose its license if it does not demonstrate compliance – creates a bigger and more immediate threat to the business than anything else.
“A lot of the smaller banks would say keeping their license is everything”, she remarks. “It's meant to be a virtuous circle, but quite often a player will say the biggest risk to my business is not the hackers but the fact that I can lose my license – so at all costs I must show my compliance and that becomes my number one risk to mitigate.”
Worse, Milton notes, are cases she’s seen where because the regulators can ask for just about any document they want from within the organization, the internal rule becomes: never write down anything that might be really bad news. It means the regulator never gets the documentation necessary to do the job properly – and that red flags within the organization never get seen by the right people.
“The incentives are wrong”, she says. And so is the attitude: “Business units should be demonstrating themselves that they are compliant. The auditor should be saying that the methods they're using to demonstrate compliance are appropriate and you're using them in the right way.”
The Problem with Auditors Is…
Peter Wood, CEO of First Base Technologies, a specialist in penetration testing and security audit services, blames – at least in part – the way auditors work and report their findings.
“Some auditors can get obsessed with checklists”, he says. “The problem with that is that it isn't always in the business' context. That's not a damning statement against all auditors, but when an auditor, external or internal, does that, typically the key actions are pulled out of the report and pushed to the people required to fix these issues without the reasoning behind it.” The upshot is that the controls applied may not actually address the specific threats the company faces.
“From my perspective", Wood continues, “if instead of working from a checklist the auditor were to work through threats and risk analysis, then they would end up with a set of recommendations that weren't just tailored to that system but also that described the whole issue from threat actor to impact. Actor – capability – motivation – value.” Working this way (he recommends conducting it as a round-table exercise) forces those involved to verbally model an attack and the relevant steps, rather than just dealing in theoretical concepts.
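Wood's "actor – capability – motivation – value" framing lends itself to a simple record that a round-table group could fill in as it talks through each scenario. The sketch below is a minimal illustration of that idea, not Wood's actual methodology; the field names, the 1–5 scales, and the multiplicative scoring are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Threat:
    """One row from a round-table threat walk-through."""
    actor: str        # who might attack, e.g. an opportunist or insider
    capability: int   # 1-5: how able is this actor? (assumed scale)
    motivation: int   # 1-5: how driven are they? (assumed scale)
    value: int        # 1-5: how valuable is the target to them? (assumed scale)

    def score(self) -> int:
        # Illustrative ranking only: higher means discuss (and control) first.
        return self.capability * self.motivation * self.value

threats = [
    Threat("opportunist phisher", capability=2, motivation=3, value=2),
    Threat("organised card-data thief", capability=4, motivation=5, value=5),
]

# Rank so the group tackles the most pressing scenario first.
for t in sorted(threats, key=Threat.score, reverse=True):
    print(f"{t.actor}: {t.score()}")
```

The point of writing the scenarios down this way is the one Wood makes: each recommendation ends up tied to a named actor and a modelled attack path, rather than to a line on a generic checklist.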
|"A hacker may come, but the auditor will come"|
It sounds expensive at first, he admits, but the results can be worth it. In one case, the staff of a museum wanted access to their HR records at home; management blocked it based on security concerns. After discussion, the group settled on 20 risks they thought were realistic, none of which were more than medium-level, and all of which could be countered with four or five controls that were realistic to implement. “They went away happy – and it only cost two hours of their time”, Wood recalls.
In an ideal world, he says, things like PCI DSS requirements – which many sectors cannot disregard – should be a subset of a larger picture.
“The annoying thing is that once you've done the process, everyone who was part of it is really in favor and believes that it's added significant value”, Wood observes. “But until you've done it once, everyone is resistant because they just want to deal with the checklist.”
The View from the Other Side
Life can be just as frustrating for auditors. Richard Hollis, whose consultancy, The Risk Factory, enables customers to buy security services online, agrees: “They're getting dressed up for us, and we're not the bad guys.”
Hollis makes his living from auditing, and yet his 25 years of experience in risk management mean he couldn't sound less happy about the way the focus on compliance diverts attention from the risks companies really face. Yes, compliance, like hacktivism, raises awareness, but companies put money where they think it will improve the bottom line.
“I go into a company and they only have money for compliance”, he laments. The PCI framework only protects credit card data; what about the company's human resources database? But when Hollis tries to ask what the company really needs to protect, “I then become a vendor who's just trying to upsell”.
The consequence, he says, is that, “I do not know of anybody who is correctly spending an IT security or risk management budget”. Money is allocated for specific things: data protection or PCI. “There is no general risk management budget in companies. It's all compartmentalized and line-itemed.”
And yet, Hollis admits, “If I'm Joe Sixpack, I am grateful for these standards, because companies weren't doing anything, and won't unless they have to. As a consumer, that's better than nothing.”
Michael Hamelin, chief security architect at the security management solutions company Tufin, thinks the problem is more deeply rooted.
|"Quite often a player will say the biggest risk to my business is not the hackers but the fact that I can lose my license"|
|Sue Milton, ISACA|
Give a company a very technical standard, he argues, and unless it has a mature information security group, the standard becomes its default: it assumes that everything it needs to do is contained within the standard, and that following it makes it secure. Basics like writing compliance policies and defining security zones – well known for years – have been lost in the scramble to adapt to new technologies.
“We throw things in the data center with no thought as to how we separate zones”, Hamelin says. “And then auditors show up with a checklist and they hate it when we get off the checklist. They think it's scope creep, but the reality is they haven't defined a good information security policy to start with.” Meanwhile, he sees companies that don't know what the rules in their firewalls are intended for, and often don't really know the value of the information they have on a particular server.
For Hollis, the risks of this approach are overwhelming because of the psychic distance many people are able to put between themselves and the data in their care. “Risk managers, CIOs, etc., could care more”, he says. “They don't understand that people's lives are in this data.”
As an example, he cites a facility where he assisted with forensics: a database containing more than 20,000 names of HIV-positive patients was stolen off a server three times in 90 days. “The CIO was unfazed by this”, Hollis recalls. Then he slid across to her an Excel spreadsheet taken from the same server more than 20 times in the same period – containing the names, addresses, Facebook pages, mobile numbers, and Twitter IDs of her daughter's football team.
Then, “She went white”, Hollis recounts. “It illustrates the problem. People don't understand what the data means. The people in these critical positions do not get it. They see IT as IT, and can't make that connection – nor that if we lose the data it causes the business harm and we can lose our competitive edge.”
Some of the examples of what happens when companies get focused on compliance are both funny and absurd. David Gibson, vice president of strategy for the data governance company Varonis, cites the company whose administrator wanted a report of every login attempt because the auditors said she had to check unauthorized attempts and review them every night. But there was no clear process for reporting or acting upon anything she spotted.
In another example, a large bank was required by its auditors to fix access controls that had folders containing sensitive data open to a number of user groups. Because the auditor only cared about the “everyone” group, that was the only one the bank was going to fix, even though the others posed as great a risk.
Sometimes, notes Gibson, problems arise because companies consolidate older systems without understanding the consequences. Say, for example, 100 small fileservers are replaced by two or three network-attached storage devices. “The 100 distinct servers were used by 50 people each, but the three NAS are used by 5,000 people each – and the problem is that those small fileservers may have had permissions that gave global access.”
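Gibson's arithmetic is easy to make concrete. The sketch below uses the figures from his example; the `exposed_users` helper and the "Everyone" group check are hypothetical illustrations of why a permissive ACL that was tolerable on a small fileserver becomes a much larger problem after consolidation.

```python
def exposed_users(acl: set[str], device_user_count: int) -> int:
    """Users who can read a folder: everyone on the device if the ACL
    contains a global-access group, otherwise just the named users."""
    return device_user_count if "Everyone" in acl else len(acl)

# A folder whose global-access permission survives the migration
# (user counts taken from Gibson's example).
acl = {"Everyone"}
before = exposed_users(acl, 50)    # one small fileserver, ~50 users
after = exposed_users(acl, 5000)   # one NAS device, ~5,000 users

print(after // before)  # -> 100: a hundred-fold wider exposure
```

Nothing about the folder or its permissions changed; only the population able to reach it did, which is exactly the consequence Gibson says companies fail to anticipate.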
Overall, says Gibson, “There's no substitute for thinking. If you're blindly following a prescriptive guide, you really have to be pretty lucky for it to do a lot of good.”