Keeping Software Defined Datacenters Secure

Anyone who has been paying attention to current thought around datacenter design will be aware of the increasing trend towards a software defined datacenter (SDDC). It mirrors the growing implementation of software-defined networks (SDN) and follows on from the widespread acceptance of cloud computing and virtualized servers.

But what are the risks of deploying such a datacenter? In the same way that cloud was, initially at least, seen as insecure, does running an SDDC mean that an organization is more at risk?

Zero Sum Game

It is important to state that the point of deploying an SDDC is to increase security. Yet according to a recent Infosecurity Magazine webinar, 55% of respondents had no plans to implement a software-defined datacenter at any point in the future.

VMware’s security and compliance specialist, Peter Bury, says he can understand the reasoning behind this. “There’s a tendency to look at SDDC as a zero sum game,” he argues. In other words, companies take all the aspects of an SDDC and think that everything has to be implemented at once.

However, Bury explains that the growth of the software-defined datacenter (and a quarter of the aforementioned webinar viewers had already implemented one) arises from well-established business reasons.

He says it’s clear why companies are looking to overhaul their datacenters.  “I’m looking at a world with highly competitive, agile new companies who are able to bring out new products very quickly at a rate that big organizations can’t do.”

Legacy Issues

Many companies are looking into how to broaden their portfolios and react more quickly, and the answer that many come up with is SDDC, but that causes concerns of its own, says Bury.

He adds that companies have moved away from a traditional infrastructure to a cloud-based one with little understanding of what that involves. “A good example is where IT can’t move fast enough. In that case, you will find people going to third party providers: just a few seconds with a credit card and a browser, [and] infrastructure and compute can be made available.” Bury suggests this causes concerns for CIOs, who have no control over that environment, but the people in the organization will point out that they need the infrastructure quickly and that IT hasn’t been able to deliver it for them.

And the reason why they can’t brings us to the heart of the problem. “You can try to make that available, but you’ll have a lot of legacy and processes around security, and you won’t be sure that you can implement and verify them at the same speed that you can do everything else,” he cautions. “Organizations willingly admit that they have taken security shortcuts just to get a service out to their customers. But can you maintain all the checks, verifications, user interfaces, crawling, [and] patch management that you used to do when you continually stream into a live environment?”

Building Security into the Datacenter Configuration

There’s a balancing act between speed, complexity and security. There are small, agile companies who have worked out how to deliver this, but traditionally the view has been that doing things faster and faster leads to operational chaos. This is the underlying fear in the construction of datacenters: there is a great deal of complexity, and a fear of that complexity growing out of control.

“Complexity is the enemy of security,” Bury remarks. “The old model was CIA—confidentiality, integrity, availability. It’s very easy to get one of those right but hard to get all three correct at the same time.”

This brings us to another key element. Designing a software-defined datacenter means ensuring that security doesn’t exist as an entity separate from compute and storage but is built in: that is, part of the same domain. In other words, there’s no gap between how the compute is configured and how the network and security are configured.

What does this really mean? A lot of the problems become operational: it’s one thing to build a datacenter and make it secure; it’s another thing to continually operate it securely. “If you look at some recent security breaches, Target for example had passed its audit, it was PCI compliant; the problem was what happened between those audits. This opened up a gap that could be breached.”

And that brings us to the second part of the process: automation. “Whether you’re taking an open source approach; whether you’re taking a vendor-specific or multi-vendor approach: you have to blend together compute, storage, network and security.”

After that, says Bury, you can start to predefine it. “You can build templates that predefine the compute so why not have objects in that template which predefine the networking and which define the security controls you’re going to have?”
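
To make the idea concrete, here is a minimal Python sketch of such a template. The class and field names are hypothetical rather than VMware’s or any other vendor’s API; the point is simply that the network segment and firewall rules live in the same object as the compute, so a workload cannot be described without its security controls.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class FirewallRule:
        """A single allow rule; anything not listed is denied by default."""
        source: str
        destination: str
        port: int
        protocol: str = "tcp"

    @dataclass
    class WorkloadTemplate:
        """Blueprint that predefines the compute and the network and
        security objects that must travel with it."""
        name: str
        vcpus: int
        memory_gb: int
        network_segment: str                       # logical switch or segment
        firewall_rules: List[FirewallRule] = field(default_factory=list)

    # Security controls are part of the template, not bolted on afterwards.
    web_frontend = WorkloadTemplate(
        name="web-frontend",
        vcpus=2,
        memory_gb=4,
        network_segment="dmz-web",
        firewall_rules=[
            FirewallRule(source="any", destination="web-frontend", port=443),
            FirewallRule(source="web-frontend", destination="app-tier", port=8443),
        ],
    )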

Pre-Defined and On-Demand Security

In other words, you’re not just pre-defining the compute but an entire application: the compute plus the network and security around it. And that can give you confidence, because you know that anything you create from that combination, whether you scale up or down or create new capacity, is going to come from that particular blueprint.

Rather than ordering, building, verifying, tuning and signing off each time something new is needed, the process is completed beforehand. The networking and security are defined in advance, so you’re consuming them with the same operational model as a virtual machine: pre-defined and on-demand.
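
A rough sketch of that operational model, again using hypothetical names rather than any real product’s interface: provisioning simply stamps out copies of a blueprint that has already been signed off, and every copy arrives with its network segment and firewall policy attached, so there is no per-instance approval step.

    from copy import deepcopy

    # A pre-approved blueprint: compute, network and security defined up front.
    BLUEPRINT = {
        "compute": {"vcpus": 2, "memory_gb": 4},
        "network": {"segment": "dmz-web"},
        "security": {"allow": [("any", 443), ("app-tier", 8443)], "default": "deny"},
    }

    def provision(blueprint, count):
        """Scale up by stamping out instances of a signed-off blueprint.

        Because the security policy is copied along with the compute
        definition, every new instance starts out compliant instead of
        being secured and re-verified after the fact.
        """
        return [{**deepcopy(blueprint), "instance": i} for i in range(count)]

    for workload in provision(BLUEPRINT, count=3):
        print(workload["instance"], workload["security"])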

Bury says that, in the world his VMware customers inhabit, the principle is that networking elements such as switching, routing and load balancing should be built in as the infrastructure is designed, so that companies have consistent, repeatable processes all built around automation. He admits that automation is not an easy concept for his customers to come to grips with, but when things are predefined, security is built in from the outset.

Limitations of Traditional Approaches

Graham Brown, managing director at Gyrocom, believes that SDDC is a concept that customers are really getting to grips with, thanks to the flexibility it gives them. “We see [SDDC] going into mission-critical environments. Financial institutions are really at the heart of it, dealing with the complexity of that environment by using automation and templates, as the traditional approach to security doesn’t scale into the software-defined datacenter.”

He adds that the growth in workload volumes is showing the limitations of the traditional tiered approach. “The firewall boundaries between a web and a database tier, for example, are increasingly difficult to manage and maintain, to troubleshoot in case of difficulties and to audit. And that method is becoming less and less viable.”

In such circumstances, automating compute and storage without automating network and security defeats the object. “It’s only when you put all of these together that the true value of automating a datacenter becomes a reality,” says Brown.
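
The contrast can be sketched in a few lines of Python, with purely illustrative names. A traditional design relies on one coarse rule at the tier boundary; a micro-segmented design generates an explicit allow rule for each permitted conversation and denies everything else, and those rules can be produced automatically alongside the workloads themselves.

    # Traditional approach: a single coarse rule at the tier boundary,
    # inherited implicitly by every workload behind it.
    TIER_BOUNDARY_RULE = {"source": "web-tier", "destination": "db-tier", "port": 5432}

    def microsegment(workloads):
        """Generate one explicit allow rule per permitted conversation."""
        rules = []
        for w in workloads:
            for destination, port in w["needs"]:
                rules.append({"source": w["name"], "destination": destination, "port": port})
        return rules

    web = {"name": "web-01", "needs": [("app-01", 8443)]}
    app = {"name": "app-01", "needs": [("db-01", 5432)]}
    db = {"name": "db-01", "needs": []}

    for rule in microsegment([web, app, db]):
        print(rule)   # only web-01 -> app-01:8443 and app-01 -> db-01:5432 are allowed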

Security as a Castle, Not a Hotel

And it’s not a decision companies should shrink from as workloads increase. “We’re entering a world where datacenter complexity is already far beyond human scale. Without throwing huge amounts of resources at it, the ability to segment that security problem is absolutely critical for us,” says Brown. “And the only way to achieve this at scale is through automation.”

The rollout of software-defined datacenters will ultimately mean far more security. “We will have the ability to deploy a firewall to secure every conversation, something that hasn’t been possible up till now. It’s like the old example of a castle versus a hotel. A castle has one big lock, but once you break it, you’re in, while a hotel has a lock on every door,” says Brown. “And what’s more, every hotel room can be different.”

CASE STUDY – FINANCIAL INSTITUTION

Graham Brown describes how one of Gyrocom’s customers moved away from the traditional datacenter set-up.

“For a financial institution, security was paramount. When we first talked to them, they were about to upgrade to a traditional architecture. The company wanted a complete infrastructure refresh for the delivery of financial products through bespoke web-based applications.

“We were able to show how a software-defined approach offered advantages over the traditional way of doing things.  We implemented VMware’s NSX within four weeks. We introduced consolidation, showed there was no need to take on an east/west firewall and ensured every workload was segmented with the correct security policy.

“There were large operational savings, but performance also increased, as we were able to segment and overlay policies on templates. For example, we could create templates based on Windows machines, templates based on Linux machines and so on. And through the micro-segmentation we could introduce a flatter infrastructure to offer more granular security.”
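
As a purely illustrative sketch of what overlaying policies on templates might look like (the names below are hypothetical and do not reflect Gyrocom’s or VMware’s actual tooling), an operating-system baseline can be composed with a role-specific security overlay, so a flatter infrastructure still ends up with granular, per-role policy.

    # Baseline templates per operating system (hypothetical values).
    OS_TEMPLATES = {
        "windows": {"image": "win2022-base", "agent": "endpoint-av"},
        "linux": {"image": "ubuntu2204-base", "agent": "osquery"},
    }

    # Role-specific security overlays applied through micro-segmentation.
    ROLE_OVERLAYS = {
        "web": {"inbound_ports": [443], "outbound": ["app:8443"]},
        "db": {"inbound_ports": [5432], "outbound": []},
    }

    def build_template(os_name, role):
        """Compose an OS baseline with a role security overlay so that
        every workload created from the result is segmented with the
        correct policy from the start."""
        return {**OS_TEMPLATES[os_name], "role": role, "policy": ROLE_OVERLAYS[role]}

    print(build_template("linux", "db"))
    print(build_template("windows", "web"))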

Brown claims that the company now enjoys a number of key benefits from the software-defined approach: improved security, infrastructure flexibility, simplified processes and reduced cost.
