Security is the main cause of complexity in the data center

The primary area of data center complexity, cited by two-thirds of the respondents, is the increasing number of business-critical applications

This is the subject of Symantec’s annual ‘State of the Data Center’ survey, now in its fifth year. During that period, Apple has revolutionized corporate computing by launching first the iPhone and later the iPad, and virtualization has progressed from a nice idea to a de rigueur part of the data center. Coupled with explosive growth in the generation of data, and in the need to retain and manage huge volumes of it, these trends have made the data center increasingly complex.

This year’s Symantec survey examines just that: causes and solutions in data center complexity. Symantec commissioned ReRez Research to gather responses from 2,453 IT professionals at organizations in 32 countries – including 500 in North America and 700 in Europe. “We asked,” writes Danny Milrad in the Symantec company blog, “about the complexity in different areas of IT, and respondents rated every area 6.7 or higher out of 10 in complexity. The highest rating was for the area of security, at 7.1, followed by infrastructure, disaster recovery, storage and compliance.”

The drivers behind the complexity are not necessarily intuitive. The primary area of complexity, cited by two-thirds of the respondents, is the increasing number of business-critical applications. The more obvious complexities of the modern data center rank lower: a mobile workforce (44%), virtualization (43%) and the public cloud (41%). The main cause of this complexity, across all areas, is security.

IT departments are responding to the complexity through training, standardization, centralization and virtualization – and increasing budgets. But 9 out of 10 are also implementing an information governance program. The hope is that improved governance will improve security, make it easier to locate information, reduce management and storage costs, decrease legal and compliance risks, and enable a move to the cloud.

Symantec offers six of its own recommendations on how to mitigate data center complexity. Most of these are standard or can be inferred from the body of the report: get C-level ownership of the governance project; understand the business need behind the data requirement; understand your own assets; and reduce the number of backup applications. One, however, stands out as not mentioned elsewhere in the report: “deploy deduplication everywhere.”

Data deduplication, points out Steve Whitfield at Orange Information Systems Group, “can vastly reduce the amount of data we have to store, and therefore the amount of storage we need to buy and administer.” In a single stroke it can reduce storage requirements, improve efficiency and potentially reduce costs. It can, says Whitfield, “reduce physical storage requirements by up to 95%.”
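The mechanics behind that saving are straightforward to illustrate. In a common form of deduplication, data is split into blocks, each block is hashed, and only one copy of each unique block is actually written to disk; repeated blocks are replaced by references to the stored copy. The sketch below is a minimal, hypothetical illustration of that idea (the function names are illustrative, not any vendor’s API):

```python
import hashlib

def deduplicate(blocks):
    """Keep one copy of each unique block, keyed by its content hash."""
    store = {}   # hash -> block: the unique data actually stored
    index = []   # per-block hash references, so the stream can be rebuilt
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # store only the first copy
        index.append(digest)
    return store, index

def restore(store, index):
    """Reassemble the original stream from the dedup store."""
    return b"".join(store[d] for d in index)

# A backup stream with heavy repetition: 6 blocks, only 3 unique
blocks = [b"AAAA", b"BBBB", b"AAAA", b"AAAA", b"CCCC", b"BBBB"]
store, index = deduplicate(blocks)
print(f"{len(blocks)} blocks in, {len(store)} unique blocks stored")
assert restore(store, index) == b"".join(blocks)
```

Here the physical storage holds half the blocks of the original stream; in real backup workloads, where successive backups are largely identical, the ratio can be far more dramatic, which is where figures like Whitfield’s 95% come from.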
