Virtualization: virtually a commodity

Is virtualization safe?
Chris Ingle, IDC
Paul Shaul Swartz, Netkeepers
Ernesto Lobo, Kroll Ontrack
Weighing up the benefits with the risks of virtualization

The rapid success of virtualization is beyond doubt; once introduced, enterprises just can't get enough. Gartner recently predicted that revenue from this category of software will total $2.7 billion this year, a 43% increase from $1.9 billion in 2008. Its research suggests that global virtualization penetration will reach 20% in 2009.

“Organizations are seeking ways to reduce ongoing support and server administration costs while increasing the availability of IT services,” says Mike Shirer, corporate communications director at IDC. “Early data suggests that via a virtualization project, an organization can fundamentally change its server to admin ratios — from 20–30 to 1 in the physical world to 60–100 to 1 in the virtual world”.

Leading vendor VMware – with 120,000 customers worldwide, 40% annual growth and $1.3 billion in revenue – has capitalized on users' desire to utilize spare capacity, reduce server costs and even close redundant data centers.

Yet, arguably, if operating systems could be trusted to run multiple applications without conflict, there would be less of an imperative for virtualization.

"People have got systems with a huge capacity that is just unused," says Chris Ingle, an associate vice president in consulting at IDC. As it is, loading a server to capacity with dozens of applications is unwise. Running one application on a server, but doing it reliably, is a reasonable goal.

Server virtualization enables just such a strategy. A small piece of software, the hypervisor, is loaded onto the machine before the operating system (OS) and presents itself to the OS as an entire machine, mediating all access to the hardware. It acts as a barrier and access controller so that the OS can function without battling other software on the same physical machine.

This enables the confident deployment of development environments, reliable and rapid software demonstrations and a secure runtime base for risk-sensitive applications, without conflicts between applications competing for the same resources. Each OS behaves as if it has the hardware to itself, even though it may be sharing it with others. A single physical machine can be loaded with many virtual servers, usefully employing any spare capacity: vendors report utilization rising from 15% to 85%, and the figures are largely supported by analysts.

But that is not the end of the story. While physical servers can be safely filled with virtual machines, the virtual machines themselves can be switched from one physical machine to another, sometimes without the user being aware of the change. If a physical server is running too hot, or is close to 100% utilized, virtual machines can be moved on the fly. This has obvious implications for reliability, demand loading, and business continuity.
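The rebalancing idea described above can be sketched as a toy model. This is purely illustrative, not a real hypervisor API (platforms such as VMware perform live migration transparently); the host data structures, the 85% threshold and the largest-VM-first policy are all assumptions made for the sketch:

```python
# Toy model of VM load balancing: each host has a CPU capacity and a set of
# VM loads; when a host runs too hot, the scheduler migrates VMs to the
# least-loaded host that can absorb them.

def utilization(host):
    """Fraction of a host's capacity consumed by its VMs."""
    return sum(host["vms"].values()) / host["capacity"]

def rebalance(hosts, threshold=0.85):
    """Migrate VMs off any host whose utilization exceeds `threshold`.

    Returns a list of (vm_name, source_host, dest_host) migrations.
    """
    migrations = []
    for src in hosts:
        # Move the largest VMs first until the host drops below threshold.
        for vm, load in sorted(src["vms"].items(), key=lambda kv: -kv[1]):
            if utilization(src) <= threshold:
                break
            # Pick the least-loaded host with headroom for this VM.
            candidates = [h for h in hosts
                          if h is not src
                          and sum(h["vms"].values()) + load <= threshold * h["capacity"]]
            if not candidates:
                continue  # nowhere to put it; leave the VM where it is
            dst = min(candidates, key=utilization)
            del src["vms"][vm]
            dst["vms"][vm] = load
            migrations.append((vm, src["name"], dst["name"]))
    return migrations

hosts = [
    {"name": "host-a", "capacity": 100, "vms": {"web": 50, "db": 45}},  # 95%: too hot
    {"name": "host-b", "capacity": 100, "vms": {"mail": 20}},           # 20%: headroom
]
moves = rebalance(hosts)  # moves "web" from host-a to host-b
```

Real schedulers weigh memory, network locality and migration cost as well, but the principle is the same: because the VM is decoupled from the hardware, load can be shifted without touching the application.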

These abilities are the drivers behind virtualization. Despite the technology being some 40 years old, having first appeared on mainframe computers in the 1960s, now is high noon for virtualization.

Enterprises aren’t the only companies to benefit from the virtualization concept. Hosting firms such as NetKeepers, a Canadian cloud computing provider for SMBs, are also finding virtualization a useful means of driving efficiencies into their operations. The company used VMware to increase its profitability by 30%.

NetKeepers supports Microsoft Windows Server 2000, 2003 and 2008 as well as various versions of Red Hat Linux. “We have had significant success virtualizing servers on all of these platforms. NetKeepers has consolidated approximately 75 servers to date into our cloud infrastructure and has slated an additional 40 servers to be virtualized this year. We are hoping to be 80% virtualized by the end of 2010,” said Paul Shaul Swartz, CEO of NetKeepers.

“We have had significant savings on data center costs, ranging from reduced power consumption to avoiding the need to increase our footprint at our data centers,” he added. “Additionally we have not had to grow our switching infrastructure to accommodate additional customer networks, and out-of-band network management equipment has become redundant as a result of our VMware cloud deployment.”

Heavyweight interest

Virtualization is expected to become a commodity technology, particularly with the recent entrance of Microsoft to the market. In June 2008, Microsoft's virtualization offering, Hyper-V, became generally available, bundled free with the Windows Server 2008 operating system.

Microsoft is attacking the enormous potential of the 90-95% of servers yet to be virtualized, and a head-to-head in large early-adopting enterprises is also certain as Windows Server product manager, Gareth Hall, explains: "In the enterprise space where people have gone for other higher-cost options, we are finding we are being treated seriously. We know there are feature differences between Hyper-V and the other products out there but we are finding that it's very rare those features are actually used."

Explosive growth in the population of virtual machines is a critical danger for enterprises employing virtualization. Because the link between server software and hardware is broken, new servers can be commissioned instantly, even, dare it be said, by users themselves, without going through the trials of requisition via the IT department.

Doubtless this can be an enormous efficiency gain, and increases an organization’s flexibility, but it can also be a recipe for disaster.

Ernesto Lobo, data recovery engineer for Kroll Ontrack, explains that the number of calls over lost data that the firm has received has increased ten-fold in the last year due to the formidable growth in virtualization.

"It adds a lot of complexity. It's a whole new way of taking care of your servers and for a lot of people it may catch them off guard," he says.

Weighing it up

Perhaps unbelievably, even basic data security lessons, hard won over the decades, need to be re-taught. "When companies move to virtualized systems they often leave behind all the backup routines," Lobo reports.

He suggests that virtualization may inadvertently have caused such a cavalier attitude to return to companies that were once careful. With a virtualized server, it is easy to back up entire systems, OS and all. "They will probably have some redundancy system," says Lobo, "but they forget about file-level backups."

Virtualization does not prevent applications from going wrong, corrupting databases or scribbling over their own data. Neither does it prevent a user deleting files or overwriting data partitions. Human error is a particular risk when many virtual machines are present, and Lobo visits many companies that have allowed rampant sprawl of virtual machines.

Common causes of VMware data loss
  • VMware vStorage VMFS volume corruption.
  • Deleted VMware vStorage VMFS volumes.
  • Virtual Machine Disk Format (VMDK) corruption.
  • Traditional RAID and hardware failures.
  • Deleted or corrupt files contained within virtualized storage systems.

"We have to find out which machines are important. Too often the customer doesn't know, there are so many people creating new machines and they don't keep a tight control," he says.

This may be easier said than done. "Companies without good management are going to have to start getting it pretty quickly, because it's one thing sticking post-it notes on servers when you have a piece of physical hardware, but that approach doesn't work for virtualization," says IDC's Ingle.

"If you go to any virtualization shows you will see hundreds and hundreds of companies offering management solutions," Ingle adds. "All of this adds complexity. You still need an OS and an application, then you add the virtualization layer on top and now you're talking about all the management software that goes with that."

Market consolidation

Ingle expects further mergers and acquisitions, creating a simpler marketplace for virtualization tools. "When you get out of the hosting environment of large organizations and go mainstream, it's just far too complex and expensive. The hosting companies can afford it because they are charging back to their customers, but for your mainstream IT director, it's adding another layer of complexity and cost."

With Microsoft offering virtualization to all as part of Server 2008, Ingle says the competition will shift to management tools. "I can stick my applications on virtual machines, I can say to my users 'go and deploy a virtual machine rather than buy a new server,' but a lot of organizations have found their users hear this message and deploy lots of virtual machines, and it becomes a mess. There is a need to integrate this mess into your management systems."

Yet, as Chris Wolf, senior analyst at Burton Group explains, virtualization can assist with some management problems, particularly in the area of disaster recovery. “That’s what got me into virtualization in the first place,” he says. “The mobility aspect of virtualization has allowed a lot of organizations to test their disaster recovery, to ensure that it works.” As with any enabling technology, it brings together opportunities and challenges.

The bottom line is that virtualization does not mean you can forget good practice. Indeed, there is greater need for control when chaos is so easily unleashed. "The most important thing is backups. If you have good backups you won't need to call us," concludes Kroll Ontrack's Lobo.