
Learning from Operation PRISM

We’re obviously not even close to the end of the revelations around both Operation PRISM and Edward Snowden, but (avoiding political or ethical commentary on what happened) a few things are becoming clear around the challenges and best practices of dealing with the malicious privileged insider, and the potential impact of this whole mess on public cloud use.

Let’s address the last one first. Here’s an interesting piece from Dr. Dobb’s on the subject:

“The revelation of secret government eavesdropping is likely to substantially reshape companies' understanding of their data's safety.”

I can see their point. If it’s clear that the government can simply harvest vast amounts of data from an organization you do business with, then the risks of moving information to public clouds could slow down adoption. However, this really shouldn’t be news to anyone. The government (and the service providers) have not been terribly shy about admitting that this can happen, or about the fact that you won’t be notified if it does.

In truth, the way to deal with this is to be more proactive in protecting your own data. Simply relying on a service provider’s assurance that data is encrypted is not enough. Encrypted data is only secure if the key is safe, and if you don’t control the key, you don’t control the data. It’s that simple.
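The point about key ownership can be made concrete with a toy sketch. This is a one-time pad, chosen only because it is simple and uses nothing outside the Python standard library; real systems would use an authenticated cipher such as AES-GCM via a vetted library. The function names and the sample message are invented for illustration:

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # One-time pad: the key is random, as long as the message, and must
    # never be reused. Impractical at scale, but it makes the point:
    # whoever holds the key holds the data.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"customer records"
ciphertext, key = otp_encrypt(message)

# The provider can store the ciphertext; without the key it is noise.
assert ciphertext != message

# Only the key holder can recover the data.
assert otp_decrypt(ciphertext, key) == message
```

The design point is simply where the key lives: if encryption happens client-side and the key never leaves your control, a subpoena served on the provider yields only ciphertext.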

The important aspect of taking ownership of protecting your own data is that it addresses concerns about the US government taking a look (if they think there’s a good enough reason to do so), and, more importantly, it stops other attackers, including the agents of other, less friendly regimes.

The other element of this story, and one that has also started to become a little clearer, is exactly how a junior contractor working at the world’s most secret spy agency managed to get access to so much data.

This article in the LA Times suggests that, once again, the failure of access controls (and activity monitoring) associated with a privileged systems administrator was at fault, as it is in so many breaches.

So no great surprise there then.

In some ways it’s comforting to know that even the NSA has problems dealing with privileged users – in much the same way as almost every other organization on the planet. The challenge is always the same. The people who have the keys to the kingdom are the ones who can really mess you up if they want to, and stopping them can be tricky.

The general approach that many organizations have successfully implemented is based on a three-step process:

  1. Reduce the number of privileged users. Most organizations have too many “systems administrators” (and the NSA had at least ONE too many).
  2. Reduce the privileges that those administrators have. It limits the damage they can do should anyone go rogue.
  3. MONITOR the living heck out of them. If an admin makes a change to a critical server, accesses sensitive data, or does anything unusual or anomalous, then capture that event and maybe even look into it.
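The monitoring step above can be sketched as a simple rules pass over an audit trail. Everything here is invented for illustration: the admin names, the action profiles, and the list of sensitive targets are assumptions, and a real deployment would feed events from an actual audit log into a SIEM rather than hard-coded rules:

```python
# Hypothetical audit events: (admin, action, target). All names invented.
SENSITIVE_TARGETS = {"hr_database", "key_vault"}

# Baseline of actions each admin normally performs (assumed profiles).
NORMAL_ACTIONS = {
    "alice": {"restart_service"},
    "bob": {"read_config"},
}

def flag_events(events):
    """Return events worth a second look: anything touching a sensitive
    target, or an action outside an admin's normal profile."""
    alerts = []
    for admin, action, target in events:
        if target in SENSITIVE_TARGETS:
            alerts.append((admin, action, target, "sensitive target"))
        elif action not in NORMAL_ACTIONS.get(admin, set()):
            alerts.append((admin, action, target, "unusual action"))
    return alerts

events = [
    ("alice", "restart_service", "web01"),   # routine, no alert
    ("bob", "bulk_export", "hr_database"),   # flagged: sensitive target
]
print(flag_events(events))
# → [('bob', 'bulk_export', 'hr_database', 'sensitive target')]
```

The design choice worth noting is that the rules encode steps 1 and 2 as a side effect: a short list of admins with narrow action profiles makes the anomalous event stand out.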

It’s not easy to do, but it is done well in many organizations today, and there are great tools available to make it easier to manage.

As the employee population becomes more mobile, data moves into the public cloud, and the workforce of contractors grows, businesses and governments will need to get smarter about how they monitor and manage access to their data.

And while Mr. Snowden might disagree, I think all our data will be safer when they do.
