OMG! TLS! You BEAST!


A couple of security researchers are due to present a way to compromise TLS 1.0 at a conference in Argentina next week. Thai Duong and Juliano Rizzo have found a way - codenamed "BEAST" - to use malicious JavaScript in tandem with a network sniffer to perform a plaintext-recovery attack on the traffic. It's currently terribly slow - apparently around 30 bytes per minute - so it's not a clear and present threat right now, but it is a major concern in principle nevertheless. The really sad point is that only TLS 1.0 is susceptible, yet practically no current browser supports either of its two successors, TLS 1.1 and 1.2.
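The fine detail is the researchers' to present, but the TLS 1.0 weakness they lean on is well known: CBC records are chained, so the IV for the next record is simply the last ciphertext block of the previous one - and that block is visible to anyone sniffing the wire. The toy sketch below is my own illustration of that predictable-IV guess check, not the BEAST exploit itself; the session key, the "secret cookie" and the ChainedCbcSender class are all invented for the example, and the real attack's "malicious JavaScript" is stood in for by a plain method call.

    // Toy model (my own sketch, not the BEAST code) of TLS 1.0's chained CBC IVs.
    import { createCipheriv, randomBytes } from "crypto";

    const BLOCK = 16;
    const key = randomBytes(32); // the victim's session key - unknown to the attacker

    class ChainedCbcSender {
      // Only the very first IV is random; after that, TLS 1.0 reuses the last
      // ciphertext block of the previous record, which a sniffer can read off the wire.
      nextIv: Buffer = randomBytes(BLOCK);

      sendRecord(block: Buffer): Buffer {
        const c = createCipheriv("aes-256-cbc", key, this.nextIv);
        c.setAutoPadding(false); // keep the toy to exactly one block
        const ct = Buffer.concat([c.update(block), c.final()]);
        this.nextIv = ct.subarray(ct.length - BLOCK); // the predictable next IV
        return ct;
      }
    }

    const xor = (a: Buffer, b: Buffer) => Buffer.from(a.map((x, i) => x ^ b[i]));

    const victim = new ChainedCbcSender();
    const ivUsedForSecret = victim.nextIv;                                // sniffed
    const secretCt = victim.sendRecord(Buffer.from("secret cookie!!!"));  // sniffed

    // The attacker guesses the secret and crafts a block that, under the predictable
    // next IV, reproduces the same cipher input if and only if the guess is right.
    const guess = Buffer.from("secret cookie!!!");
    const crafted = xor(xor(guess, ivUsedForSecret), victim.nextIv);
    const probeCt = victim.sendRecord(crafted);

    console.log("guess confirmed:", probeCt.equals(secretCt));

Repeat that guess-and-check often enough and you get exactly the slow, grinding plaintext recovery described above. TLS 1.1 and 1.2 close the hole by giving every record its own unpredictable IV, which is why only TLS 1.0 is affected.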
 
This is not good news, but is it really an encryption problem? I say not - at its root it's a malicious-script problem yet again - and again, and again, and again...
 
Absent the malicious JavaScript, the attack could not take place. So the real hazard to which ordinary web users are exposed is the uncontrolled download and execution of scripts from who knows where. Several social networking site pages I've examined draw scripts from half a dozen different domains, entirely unbeknown to the user, and government and commercial sites are often just as bad. The UK "Parliament TV" site requires scripts from two domains - one of them a trendy and unexpected ".tv" - just to view a video. This widespread (indeed almost universal) practice forces the user to browse with no restrictions on the sources of executable content, which is a well-recognised security hazard. However cunning the actual exploit, almost all automatically triggered attacks are in the first instance injected by malicious scripts.
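There is at least plumbing for the opposite discipline, even if hardly anyone uses it: a site can declare which script sources it trusts via a Content-Security-Policy response header (browsers are only beginning to support it, under experimental vendor-prefixed names). A minimal, purely illustrative policy - the CDN hostname is a placeholder of my own - would look like this:

    Content-Security-Policy: default-src 'self'; script-src 'self' https://cdn.example.com

A policy like that won't save you if a trusted source is itself compromised, but it does at least make the list of places executable content can come from explicit and auditable, rather than leaving the user to guess.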
 
NoScript for Firefox was created to control this problem and it's a pretty good tool, but it's becoming increasingly redundant: to make use of most web pages now you have to accept the scripts regardless. And scripting is being used in the most unnecessary and idiotic ways. I think the most illustrative example is currently the UK National Computing Centre web site. Try accessing the home page with JavaScript disabled and you're faced with a massive mush of jumbled scraps of text - a completely unusable page. Turn on scripts and you get a small, neat page with drop-down menus. In all fairness, the NCC site only draws scripts from one domain - but they're utterly unnecessary. Drop-down menus can be created perfectly well using just HTML and CSS, as the sketch below shows. So why use CSS to format the fragments and then use scripts to position and animate them? Is it thoughtlessness or incompetence?
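To be concrete, here is a minimal sketch of a script-free drop-down: plain HTML lists for structure and a few lines of CSS that hide the submenu and reveal it on hover. The menu labels and links are invented for the example.

    <style>
      nav ul { list-style: none; margin: 0; padding: 0; }
      nav > ul > li { display: inline-block; position: relative; }
      nav li ul { display: none; position: absolute; left: 0; top: 100%; }
      nav li:hover > ul { display: block; }
    </style>
    <nav>
      <ul>
        <li><a href="/services">Services</a>
          <ul>
            <li><a href="/services/consulting">Consulting</a></li>
            <li><a href="/services/training">Training</a></li>
          </ul>
        </li>
        <li><a href="/about">About</a></li>
      </ul>
    </nav>

Not perfect - a pure hover menu has its accessibility limitations - but it makes the point: the visual effect being scripted for is achievable with markup and styling alone.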
 
Until our web developers accept that scripts are a potential security hazard and so should be a carefully administered last resort drawn from sources the user can predict and trust, we're not going to escape the huge problem of the zombie pool - millions of domestic computers infected via malicious scripts in drive-by attacks - that is used to attack businesses and governments world-wide and to send out all that spam. It's not good enough to say "I know my scripts are clean" either. Let's suppose a JavaScript repository you're calling code from remotely has been tampered with - a not unrealistic scenario. The Mozilla repository and DigiNotar were both recently breached, and I bet they were secured (even DigiNotar) better than many free script repositories, so how can you guarantee the code? These are the attacks of the future, as they're likely to be so easy to execute and have such far-reaching results.
 
But maybe I'm missing something here - maybe it's not the need to accept that scripts are a potential security problem. Maybe it's the need to think beyond how slick the page looks - to be interested enough to look beneath the hood and ensure that security is accounted for by doing everything in the simplest, most transparent manner - what used to be called "good engineering practice". But maybe in the end it's the need to actually care.

 
