
Tackling the Authentication Problem

Photo credit: Richard Paul Kane/Shutterstock.com

As well as being a partner at Accourt, Neira Jones is chair of the Cybercrime Advisory Board at the Centre for Strategic Cyberspace + Security Science (CSCSS), and the problems of identity and authentication have been high on her agenda for the last 18 months. If anyone is in a position to explain just how serious a threat weak authentication poses to data in the real world, she is. “The analysts continue to tell us that lax password management and policies continue to put individuals and organizations at risk”, Jones tells us.

Add to this the string of high-profile credential breaches in recent months, most of which led the organizations involved to deploy some form of two-factor authentication only after the fact: a classic case of waiting until the security horse had bolted before shutting the network stable door.

Additionally, there’s the fact that the identity protection market is starting to consolidate and coalesce. This, explains Jones, is evidenced by Equifax buying ID protection start-up TrustedID for about $30 million, EMC buying identity management provider Aveksa for $225 million, and Symantec buying authentication company PasswordBank for $25 million, to name but a few, with many analysts predicting exponential growth of the identity and authentication market. So, what can we conclude from all of this? “Very simply”, Jones admits, “that we haven’t cracked it yet”.

The Password Problem

Cracking is, of course, at the heart of the authentication problem. Enterprises have typically focused on securing the network perimeter, and relied on static passwords to authenticate users inside the firewall, but this is insufficient given the nature of today’s advanced persistent threats (APTs), and internal risks associated with bring your own device (BYOD) adoption.

With passwords almost universally accepted as the weakest link, but also the easiest authentication method to deploy, is there any real chance of getting rid of them any time soon? Tony Ball, senior vice president for identity and access management at HID Global, understands the problems in doing so. “Strong authentication provides significantly higher security by combining a password (something the user knows) with something the user has (such as mobile and web tokens)”, he says.

“Unfortunately, users have found hardware one-time passwords (OTPs), display cards and other physical devices for two-factor authentication to be inconvenient”, Ball observes. And there lies the rub, the double-edged sword run through the heart of the authentication debate: balancing convenience with security. “The alternative of software tokens on mobile phones may be more convenient”, he explains, “but they have raised numerous security concerns. Using public key infrastructure (PKI) smart card technology with embedded OTP delivers the necessary security, but can be complex and costly to implement”.

Effective Alternatives

From the organizational perspective, are there really any novel and effective solutions to handling authentication issues out there being used? There’s certainly a general trend toward linking the identity of a user to the identity of a local hardware device, such as a mobile phone or other token. There is also a newer trend of using PKI to strengthen authentication, minimizing the risk of a single point of attack.

“Twitter’s new authentication method follows these trends; PKI authentication is integrated into the Twitter client and only a ‘blessed’ smartphone can be used for securing access to the service”, explains Stina Ehrensvärd, CEO and founder of Yubico, a firm that specializes in authentication solutions. “It is easy to deploy for services that already have a user client. However, though it offers better security than just a static password, an application on a device connected to the internet is exposed to malware”.

Neira Jones, meanwhile, is more excited about biometrics. “As far as I can tell, apart from sci-fi buffs and Big Bang Theory fans, biometrics started to enter public consciousness in 2009–2010 and since then, we have experienced increased user acceptance”. The most significant development was the use of biometrics for border security, driven by technology advances and large-scale national ID deployments.

“In my opinion”, Jones continues, “the biometrics debate is an integral part of the identity and authentication space”. And she’s right, with the main barriers to adoption being a lack of knowledge and cost. “Over time, I believe these can be addressed”, Jones insists. “With the ever-present debate on what may or may not replace usernames and passwords, potential emerging approaches aren’t mutually exclusive.

“There remains the concept that if core identity information was available and secure (say, encrypted on a user device), technologies such as biometrics could be used to unlock the appropriate encrypted credential information”. Where a person performs a transaction, the relevant identities and credentials for that transaction could be unlocked via biometrics, so an online transaction could access both the citizen and retail identities for that individual.
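The pattern Jones describes can be sketched as a small credential vault whose decryption key is only released after a local biometric match. This is an illustrative toy, not a real design: the `CredentialVault` class, the enrolled-template comparison, and the HMAC-keystream XOR “cipher” are all assumptions for the sake of a runnable stdlib-only example; a production vault would match fuzzy biometric samples in a secure element and encrypt with AES-GCM.

```python
import hashlib
import hmac
import os

def _keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher for illustration only -- never use in production."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hmac.new(key, counter.to_bytes(4, "big"), hashlib.sha256).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

class CredentialVault:
    """Stores encrypted credentials; the key is gated behind a biometric check."""

    def __init__(self, biometric_template: bytes):
        self._template = biometric_template       # enrolled template (hypothetical)
        self._salt = os.urandom(16)
        self._store: dict[str, bytes] = {}

    def _key(self) -> bytes:
        # Derive the vault key from the enrolled template and a per-vault salt.
        return hashlib.pbkdf2_hmac("sha256", self._template, self._salt, 100_000)

    def enroll(self, name: str, credential: bytes) -> None:
        self._store[name] = _keystream_xor(self._key(), credential)

    def unlock(self, name: str, biometric_sample: bytes) -> bytes:
        # Only a matching biometric sample releases the decrypted credential.
        if not hmac.compare_digest(biometric_sample, self._template):
            raise PermissionError("biometric mismatch")
        return _keystream_xor(self._key(), self._store[name])

vault = CredentialVault(biometric_template=b"fingerprint-hash")
vault.enroll("retail-id", b"user: alice / token: 12345")
print(vault.unlock("retail-id", b"fingerprint-hash"))
```

The point of the design is that the credentials at rest are useless without the biometric gate, which is what lets one device safely hold several identities (citizen, retail, and so on) and unlock only the ones a given transaction needs.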

Trusting Big Brother

Do governmental regulatory efforts have a part to play, with the likes of the Online Assurance program in the UK and the National Strategy for Trusted Identities in Cyberspace (NSTIC) in the US? Andy Kemshall, co-founder of SecurEnvoy, also happens to have worked alongside GCHQ to create the UK Government Online Assurance program. “We helped develop the document that provides good practice guidance to HMG public service providers to support user authentication to HMG Online Services”, Kemshall told Infosecurity. “The government has recognized the need for two-factor authentication, which they call multi-factor authentication”.

But Jamie Cowper, senior director of worldwide marketing and business development at Nok Nok Labs, isn’t sure that government is addressing the right problem. “Both the US and UK efforts are more focused on solving an ‘identity’ problem as opposed to an ‘authentication’ one”, he says, insisting the distinction is important. “It makes sense to use the appropriate commercial identity provider to access government resources”, Cowper explains, “but many of these programs are still defaulting to usernames and passwords as the standard way to access these resources”. In other words, the broader need to upscale user authentication to something that’s both stronger and easier for users is part of these programs, but it’s not built into their DNA. Still, recognizing the problem is a step in the right direction, and the fact that governments on both sides of the Atlantic are establishing nationally accepted and well-informed criteria by which public services can operate has to be applauded.

The problem, according to Andrew Hindle, global technical director at Ping Identity, is how well the government-driven standards and processes will be adopted by the private sector. Not least, Hindle points out, as “not all of these government efforts are designed to be used outside the public sector”.

There’s no doubting that the convergence of physical and logical access control, so that they can be managed concurrently, is particularly important in the US federal space. “In the US, the 2005 Federal Information Processing Standards Publication 201 (FIPS 201) defined requirements for standardized personal identity verification (PIV) smart credentials that leverage smart card and biometric technology and support strong authentication methods on the desktop and at the door”, Philip Hoyer, director of strategic solutions at HID Global, told us. “Until now, FIPS 201 multifactor authentication has primarily been used for logical access and digital document signing using PKI-based validation”, he continued, “but these capabilities will also be highly effective for PKI at the door, which is expected to become more widely adopted as a federal identity best practice in the future.”

Working in High Resolution

Authentication should always be viewed as a process rather than a bewildering set of technologies, according to Richard Moulds, VP of strategy at Thales e-Security. “Authentication in the physical world is a multi-layered, risk-based activity – we naturally get more skeptical as the stakes get higher, and this should be reflected online”, he says. Of course, the more sensitive the information to which one tries to gain access, the more stringent the access controls should be. “Working in high resolution comes at a cost, and this is where advances in Big Data can play a significant role”. Moulds believes that “static clues like IP addresses, machine IDs and even stored credentials help build the picture, but dynamic, behavioral analysis really fills in the details.”
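Moulds’s layered, risk-based picture can be sketched as a scoring function that combines static clues with behavioral signals and demands more skepticism as the stakes rise. All of the signal names, weights, and thresholds below are hypothetical, invented purely to illustrate the shape of such a system:

```python
from dataclasses import dataclass

@dataclass
class LoginAttempt:
    known_device: bool       # machine ID seen before (static clue)
    known_ip_range: bool     # IP within the user's usual range (static clue)
    typical_hours: bool      # inside the user's normal login window (behavioral)
    typing_cadence_ok: bool  # keystroke dynamics match the profile (behavioral)

def risk_score(a: LoginAttempt) -> int:
    """Sum of hypothetical risk weights; higher means more suspicious."""
    score = 0
    score += 0 if a.known_device else 40
    score += 0 if a.known_ip_range else 20
    score += 0 if a.typical_hours else 15
    score += 0 if a.typing_cadence_ok else 25
    return score

def step_up_decision(score: int, stakes: str) -> str:
    """Lower tolerance as the stakes get higher -- skepticism scales with value."""
    threshold = {"low": 60, "medium": 40, "high": 20}[stakes]
    return "challenge" if score >= threshold else "allow"

attempt = LoginAttempt(known_device=False, known_ip_range=True,
                       typical_hours=True, typing_cadence_ok=True)
print(risk_score(attempt), step_up_decision(risk_score(attempt), "high"))  # 40 challenge
```

The same unfamiliar device that passes silently on a low-stakes page triggers a step-up challenge on a high-stakes one, which is the “more skeptical as the stakes get higher” behavior in miniature.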

But with so much data in view, authentication systems no longer just control access to target systems; they hold so much personal data that they become a target themselves. “Protecting all this stored personal data goes well beyond the usual password database problem in terms of both volume and sensitivity”, Moulds concludes. Soon authentication will move from being a password problem to a Big Data problem.

The FIDO Alliance

Infosecurity spoke to Jeff Hodges, FIDO Alliance Technical Working Group co-chair and senior member of technical staff on the Ecosystem Security Team within PayPal Information Risk Management, about the future of authentication. “In terms of replacing the use of user-wielded, reusable, sharable passwords, there are various efforts underway, key among them being the FIDO Alliance”, Hodges told us.

“PayPal co-founded the FIDO Alliance in an effort to design and establish a standardized, multi-vendor, multi-platform, cryptographically based, biometric-friendly authentication protocol. We believe that if we can devise a client-side API to leverage emerging cryptographic and biometric components (authenticators), couple it with an authentication protocol that can accommodate the differences in various components, and make it transparently deployable on the client side, then it stands a chance of attaining traction”, he explains. Such a system could therefore yield either a cryptographically strong second factor, where the user continues to provide some form of password, or, in the case of a FIDO-compliant biometric, a multi-factor authentication event: a cryptographic proof that the connection is from the user’s device (something one has), generated once the biometric authenticator identifies the user (something one is).
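The flow Hodges outlines is, at its core, a challenge–response protocol: the server sends a fresh challenge, and the device produces a cryptographic proof only after verifying the user locally. The sketch below mimics that shape using HMAC with a shared key as a stand-in, purely so it runs on the standard library; real FIDO authenticators hold a per-service asymmetric key pair, so the relying party stores only a public key, and the class and method names here are invented for illustration.

```python
import hashlib
import hmac
import os

class FidoStyleAuthenticator:
    """Device side: signs server challenges only after a local user check."""

    def __init__(self, template: bytes):
        self._template = template            # enrolled biometric (hypothetical)
        self.key = os.urandom(32)            # per-service key; FIDO uses a keypair

    def sign(self, challenge: bytes, biometric_sample: bytes) -> bytes:
        # "Something one is" gates the proof from "something one has".
        if not hmac.compare_digest(biometric_sample, self._template):
            raise PermissionError("user verification failed")
        return hmac.new(self.key, challenge, hashlib.sha256).digest()

class RelyingParty:
    """Server side: issues fresh challenges and verifies the device's proof."""

    def __init__(self, registered_key: bytes):
        self._key = registered_key           # with a keypair, public part only

    def challenge(self) -> bytes:
        return os.urandom(32)                # fresh nonce defeats replay

    def verify(self, challenge: bytes, proof: bytes) -> bool:
        expected = hmac.new(self._key, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, proof)

device = FidoStyleAuthenticator(template=b"enrolled-biometric")
server = RelyingParty(registered_key=device.key)   # exchanged at registration
c = server.challenge()
proof = device.sign(c, biometric_sample=b"enrolled-biometric")
print(server.verify(c, proof))  # True
```

Note that the biometric never leaves the device and no reusable password crosses the wire: the server only ever sees a one-time proof bound to its own fresh challenge, which is the property that makes this model resistant to the credential-breach pattern discussed earlier.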

 
