Comment: The SSL Offload Dilemma

It is predicted that by 2031, 2048-bit certificates will no longer be sufficient, prompting a switch to 3072-bit security or higher
Nathan Pearce, F5 Networks

Put the term ‘data breach’ into a news search, and you’ll receive stories on the latest organizations to lose data. A new company seems to lose data every single day, with customer records, healthcare information, financial/card payment details and other vital statistics being waylaid by hackers, disgruntled insiders or simply negligent employees. No organization wants to be the next under the spotlight, but there are so many potential breach points that it is often hard for managers to know where to begin.

Worse yet, many consumers simply don’t know whether their account information has been compromised. Should they change their banking details in the wake of a data breach? A hefty fine from the Information Commissioner’s Office, which can fine organizations up to £500,000 for data breaches, won’t secure customer data or compensate those affected.

The State of The Art

In response to these issues, many companies have called for security consultants to become more application-aware, as they work to defend their systems against application-level rather than network-level attacks.

Indeed, regulations and processes around payment card data are certainly being tightened. We have seen PCI DSS regulations becoming more firmly enforced over the last few years, and a key part of this is SSL, the encryption technology that allows a secure connection between a customer or prospect’s browser and a company server. The US National Institute of Standards and Technology (NIST) has recently recommended that all SSL certificates move from a minimum of 1024-bit security to 2048-bit security.

Raising The Game

As most of us know, SSL transforms your details at the payment checkout, for example, into an unintelligible cipher, or string of characters, using a ‘key’: a secret value that determines how the data is transformed, so that only the right people can read it. At a basic level, the length of the key, in bits, determines how hard it is for a hacker to break that transformation and read all of the encrypted data.
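SSL’s real ciphers are far more sophisticated, but the basic idea of a key-driven, reversible transformation can be sketched with a toy repeating-key XOR cipher. This is purely illustrative – the key and card number below are invented, and a cipher this simple should never protect real data:

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Repeating-key XOR: applying the same key twice restores the original,
    # so one function both encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"not-a-real-key!!"                # a toy 16-byte (128-bit) key
plaintext = b"4111 1111 1111 1111"       # card details at the checkout
ciphertext = xor_cipher(plaintext, key)  # unintelligible without the key
recovered = xor_cipher(ciphertext, key)  # the right key reverses it

assert recovered == plaintext
# Each extra key bit doubles the keyspace a brute-force attacker must search:
print(f"a 128-bit key gives {2**128} possible keys")
```

The last line hints at why key length matters: every additional bit doubles the number of keys an attacker would have to try.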

In 2007, researchers cracked 700-bit RSA security using 300–400 consumer laptops in 11 months, and in 2010, researchers at the University of Michigan cracked 1024-bit security in just 100 hours. This latter case was achieved using some very sophisticated equipment, but it is a sign that we need to change our encryption.

It may seem obvious, but why do we need to change? Essentially, as the processing power of our PCs increases – and it increases very quickly – computers can run decryption algorithms faster. It is predicted that by 2031, 2048-bit certificates will no longer be sufficient, prompting us to switch to 3072-bit security or higher.

However, there is another side to increasing SSL certificate complexity – companies must then use these certificates to decrypt the data themselves. Although a 2048-bit key is approximately 4.3 billion times more secure than a 1024-bit key, it is also four to eight times more expensive to process, significantly increasing the server load.
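That four-to-eight-fold cost can be glimpsed even with Python’s built-in big-integer arithmetic: RSA decryption is dominated by modular exponentiation over a key-sized modulus, and doubling the operand width makes the work substantially dearer. A rough, machine-dependent sketch (the modulus and exponent here are arbitrary stand-ins, not real keys, and the exact ratio will vary):

```python
import time

def modexp_seconds(bits: int, rounds: int = 20) -> float:
    # RSA's core private-key operation is pow(base, exponent, modulus)
    # where the modulus and exponent are as wide as the key.
    modulus = (1 << bits) - 159   # arbitrary odd number of the right size
    exponent = (1 << bits) - 1    # private exponents are key-sized too
    base = 0xC0FFEE
    start = time.perf_counter()
    for _ in range(rounds):
        pow(base, exponent, modulus)
    return time.perf_counter() - start

t1024 = modexp_seconds(1024)
t2048 = modexp_seconds(2048)
print(f"2048-bit work costs roughly {t2048 / t1024:.1f}x the 1024-bit work")
```

On typical hardware the ratio lands in the same four-to-eight-times range the article cites, which is exactly the extra load a busy checkout server has to absorb.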

Raised server loads mean that consumers at the coal face have to wait longer for their transactions to go through. SSL encryption usually happens at the checkout, and if a checkout takes too long, consumers can go elsewhere.

How to Cope with the Flood

So far, I’ve highlighted that the switch to 2048-bit security is a necessary evil to keep consumer data safer as processing power rises, making commercial hacking more viable. However, to avoid processing slowdown, companies may need to increase server power. This can be an expensive option, and fortunately there is an alternative.

For many years, companies have been able to split loads between servers – at a basic level, for example, between two web servers so that one does not get overloaded and cause the website experience to slow for some viewers.

Similarly, companies can redirect their SSL processing to a bespoke system optimized to handle it. This is called SSL offloading, and it is no different in principle from any other kind of load balancing: a specialist system deals with the SSL decryption traffic better than a general-purpose server would, avoiding a jam.
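In practice, an offload tier of this kind is often expressed as a TLS-terminating reverse proxy. The sketch below uses nginx-style configuration to show the shape of the arrangement; the host names, addresses and certificate paths are purely illustrative:

```nginx
# Hypothetical SSL-offload front end: TLS terminates here, so the web
# servers behind it never carry the expensive decryption work.
upstream web_backends {
    server 10.0.0.11:80;   # plain HTTP inside the trusted network
    server 10.0.0.12:80;
}

server {
    listen 443 ssl;
    server_name shop.example.com;

    ssl_certificate     /etc/ssl/certs/shop.example.com.crt;  # 2048-bit cert
    ssl_certificate_key /etc/ssl/private/shop.example.com.key;

    location / {
        proxy_pass http://web_backends;  # decrypted traffic is forwarded on
    }
}
```

The same division of labour applies whether the terminating device is software like this or dedicated hardware: the certificate, and the cost of using it, live in one place.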

Customizations include specialized SSL acceleration hardware that improves the performance of encryption and decryption during SSL communication, enabling significantly higher SSL transaction rates than a general-purpose server can achieve.

Large enterprises have significant IT infrastructure today and are frequently unwilling to invest in new, costly equipment to solve this problem. Nevertheless, this issue can be solved by simply making more of what enterprises already have.

Back to Business as Usual?

Putting SSL decryption onto a separate system will put companies ahead in the security arms race, at least where encryption of personal and payment information is concerned. It will assure consumers that their data is safe, and it won’t hold up sales. In fact, this kind of upgrade can future-proof the business, with many systems already ready for 3072- and 4096-bit SSL encryption.

Although this may be seen as a technology that simply allows a company to function with ‘business as usual’, the impact of enhancing security arrangements should not be underestimated. Even in the event of a data breach, the assurance of encryption standards that would take months, if not years, to crack will go a long way toward ensuring consumer confidence. Indeed, if widely publicized, it may be enough to discourage hackers.

The end result is that consumers can keep buying from a site, and remain secure, without checkout processes slowing down. Maintaining sales without compromising security, at a time when many companies are still in a fragile state, is a worthy goal to strive for.


Nathan Pearce has worked at F5 Networks for four years, and is a subject matter expert in next-generation datacenter agility and virtualization technologies. Pearce holds VCP4 certification from VMware and is an active blogger on dynamic infrastructure, application delivery and virtualization.
