Keeping Software Defined Data Centers Secure

Written by Max Cooter

Max Cooter looks at the concept of a software-defined data center, and asks whether it will put those “how secure is the cloud” debates to bed once and for all.

Anyone who has been paying attention to current thought around data center design will be aware of the increasing trend towards the software-defined data center (SDDC). It mirrors the growing implementation of software-defined networks (SDN) and follows on from the widespread acceptance of cloud computing and virtualized servers.

What are the risks of deploying such a data center?

In the same way that cloud was, initially at least, seen as insecure, does running an SDDC mean that an organization is more at risk? It is important to state that the point of deploying an SDDC is to increase security. Yet according to a recent Infosecurity Magazine webinar, 55% of respondents had no plans to implement a software-defined data center at any point in the future.

VMware’s security and compliance specialist, Peter Bury, says he can understand the reasoning behind this. “There’s a tendency to look at SDDC as a zero-sum game,” he argues. In other words, companies take all the aspects of an SDDC and think that everything has to be implemented at once.

However, Bury explains that the growth of the software-defined data center (and a quarter of the aforementioned webinar viewers had already implemented one) arises from well-established business reasons. He says it’s clear why companies are looking to overhaul their data centers. “I’m looking at a world with highly competitive, agile new companies who are able to bring out new products very quickly at a rate that big organizations can’t do.”

Many companies are therefore looking at how to broaden their portfolios and react more quickly, and the answer that many come up with is the SDDC, but that causes concerns of its own, says Bury. He adds that companies have moved away from a traditional infrastructure to a cloud-based one with little understanding. “A good example is where IT can’t move fast enough. In that case, you will find people going to third party providers: just a few seconds with a credit card and a browser, infrastructure and compute can be made available.”

Bury proposes that this could cause concerns for CIOs who have no control over that environment, but the people in the organization will point out that they need the infrastructure quickly because IT hasn’t been able to deliver it for them.

The reason why IT can’t brings us to the heart of the problem. “You can try to make that available but you’ll have a lot of legacy and processes around security, and you won’t be sure that you can implement and verify them at the same speed that you can do everything else,” he cautions.

“Organizations willingly admit that they have taken security shortcuts just to get a service out to their customers. But can you maintain all the checks, verifications, user interfaces, crawling, patch management that you used to do when you continually stream into a live environment?”

There’s a balancing act between speed, complexity and security. Some small, agile companies have worked out how to deliver this, but traditionally the view has been that doing things faster and faster will lead to operational chaos. This is the underlying fear in the construction of data centers: there is a great deal of complexity, and a fear of that complexity growing out of control.

“Complexity is the enemy of security,” Bury remarks. “The old model was CIA – confidentiality, integrity, availability. It’s very easy to get one of those right but hard to get all three correct at the same time.”

This brings us to the crux of the matter. The key to designing a software-defined data center, therefore, is to ensure that security doesn’t exist as a separate entity from the compute and storage, but is built in as part of the same domain. In other words, there’s no gap between how the compute is configured and how the network and security are configured.

What does this really mean?

A lot of the problems are operational: it’s one thing to build a data center and make it secure; it’s another to operate it securely on an ongoing basis. Look at some recent security breaches: Target, for example, had passed its audit and was PCI compliant; the problem lay in what happened between those audits. This opened up a gap that could be breached.

That brings us on to the second part of the process: automation. “Whether you’re taking an open source approach, or a vendor-specific or multi-vendor approach, you have to blend together compute, storage, network and security,” says Bury.

After that, he says, you can start to predefine it. “You can build templates that predefine the compute, so why not have objects in that template which predefine the networking and define the security controls you’re going to have?”

In other words, you’re not just pre-defining the compute but an entire application: not only the compute, but the network and security around it. That gives you confidence, because you know that anything you create from it, whether you scale up or down or add capacity, is going to come from that particular blueprint.

Rather than ordering, building, verifying, tuning, verifying and signing off each time something new is needed, that process is done beforehand. The networking and security are all defined in advance, so you’re consuming them with the same operational model as a virtual machine: pre-defined and on-demand.
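To make the idea concrete, here is a minimal sketch of what such an application blueprint might look like, written in Python purely for illustration. The names (AppBlueprint, Tier, FirewallRule) are hypothetical and not any vendor’s actual API; real SDDC platforms have their own template formats. The point is simply that compute, networking and security controls are declared together, so anything instantiated from the blueprint inherits a configuration that has already been verified.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical, simplified model of an application blueprint in which
# compute, networking and security are defined together. The names are
# illustrative only, not any vendor's actual template format.

@dataclass
class FirewallRule:
    source: str        # a logical tier name, not an IP address
    destination: str
    port: int
    action: str = "allow"

@dataclass
class Tier:
    name: str
    vcpus: int
    memory_gb: int
    network_segment: str   # logical segment created alongside the tier

@dataclass
class AppBlueprint:
    name: str
    tiers: List[Tier]
    firewall_rules: List[FirewallRule]
    default_action: str = "deny"   # anything not explicitly allowed is blocked

    def instantiate(self, instance_name: str) -> dict:
        """Every deployment stamped from the blueprint carries the same
        pre-verified compute, network and security configuration."""
        return {
            "instance": instance_name,
            "tiers": [vars(t) for t in self.tiers],
            "firewall": [vars(r) for r in self.firewall_rules],
            "default_action": self.default_action,
        }

# A three-tier application defined once, then deployed on demand.
blueprint = AppBlueprint(
    name="web-shop",
    tiers=[
        Tier("web", vcpus=2, memory_gb=4, network_segment="seg-web"),
        Tier("app", vcpus=4, memory_gb=8, network_segment="seg-app"),
        Tier("db", vcpus=4, memory_gb=16, network_segment="seg-db"),
    ],
    firewall_rules=[
        FirewallRule("web", "app", 8443),
        FirewallRule("app", "db", 5432),
    ],
)

print(blueprint.instantiate("web-shop-prod"))
```

Because the security controls live inside the blueprint rather than in a separate change process, scaling up simply stamps out more copies of the same pre-verified configuration.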

Bury says that in the world his VMware customers inhabit, the principle is that, as the infrastructure is designed, networking elements such as switching, routing and load balancing should be built in, so that companies have consistent, repeatable processes built around automation. He admits that automation is not an easy concept for his customers to get to grips with, but when you predefine things, you build in security from the outset.

Graham Brown, managing director at Gyrocom, believes that the SDDC is a concept that customers are really getting to grips with, thanks to the flexibility it gives them. “We see [SDDC] going to mission-critical environments. Financial institutions are really at the heart, dealing with the complexity of that environment by using automation and templates, as the traditional approach to security doesn’t scale into the software defined data center.”

He adds that the growth in workload volumes is showing the limitations of the traditional tiered approach. “The firewall boundaries between a web and a database tier, for example, are increasingly difficult to manage and maintain, to troubleshoot in case of difficulties and to audit. That method is becoming less and less viable.”

In such circumstances, automating compute and storage, without automating network and security, defeats the object. “It’s only when you put all of these together that the true value of automating a data center becomes a reality,” says Brown.
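As a rough illustration of the alternative, the sketch below (Python, with purely hypothetical names and data) shows a security policy written once in terms of logical groups and then expanded automatically into a rule for every workload pair, with everything else denied by default. This is the kind of per-workload enforcement, rather than a single firewall at the tier boundary, that automation makes practical.

```python
from itertools import product

# Hypothetical sketch: a group-level security policy expanded into
# per-workload firewall rules, so every VM gets its own enforcement
# point rather than relying on one firewall at the tier boundary.

# Logical groups and their member workloads (illustrative names only).
groups = {
    "web": ["web-01", "web-02", "web-03"],
    "db": ["db-01", "db-02"],
}

# The policy is written once, in terms of groups, not addresses.
policy = [
    {"from": "web", "to": "db", "port": 5432, "action": "allow"},
]

def expand(policy, groups):
    """Turn group-level intent into one rule per workload pair.
    Anything not generated here is denied by default."""
    rules = []
    for p in policy:
        for src, dst in product(groups[p["from"]], groups[p["to"]]):
            rules.append((src, dst, p["port"], p["action"]))
    return rules

for rule in expand(policy, groups):
    print(rule)
```

The code itself is trivial; the operational point is that adding another web server generates the extra rules from the same declared intent, with no manual firewall change to track or audit afterwards.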

It’s not a decision companies should shrink from as workloads increase. “We’re entering a world where data center complexity is already far beyond human scale,” says Brown. “Without throwing huge amounts of resources at it, the ability to segment that security problem is absolutely critical for us. The only way to achieve this at scale is through automation.”

The rollout of software-defined data centers will ultimately mean far greater security. “We will have the ability to deploy a firewall to secure every conversation, something that hasn’t been possible up till now.

“It’s like the old example of a castle versus a hotel. A castle has a big lock, but once you break it, you’re in, while the hotel has a security lock on every door,” says Brown. “And what’s more, every hotel room can be different.”
