Cybersecurity from Capitol Hill to Whitehall

Written by Wendy M. Grossman

Proclamations on cybersecurity and government surveillance have ignited political discourse in early 2015. Wendy M. Grossman cuts through the spin to find out what this means for technologists and citizens

Early 2015 saw multiple announcements on cybersecurity from US president Barack Obama and British prime minister David Cameron. Both were responding to recent events, primarily the Sony hack (estimated to have cost the company $15m) and the January shooting at the Paris offices of the satirical magazine Charlie Hebdo, in which 12 people were killed. The two countries also announced joint ‘cyber wargames’, in which teams from each country will attack the other to test critical infrastructure defenses.

Obama proposed improving cybersecurity information-sharing between government and the private sector; criminalizing the overseas sale of stolen US financial information; extending the RICO (Racketeer Influenced and Corrupt Organizations) laws to cover cybercrime; and establishing a national data breach notification requirement.

The Electronic Frontier Foundation has described the resulting Cybersecurity Information Sharing Act (CISA), introduced in March, as a “terrible surveillance bill”, in part because it would allow companies to launch countermeasures against attackers. The EFF and the Center for Democracy and Technology also complain that the bill bypasses existing privacy protections for private-sector information.

In the run-up to the UK’s May general election, Cameron and the home secretary, Theresa May, proposed reviving long-contentious policies: the principle that government must be able to read all communications, and the Communications Data Bill, which opponents have dubbed the ‘Snooper’s Charter’.

These policies would add to an already substantial framework for communications surveillance established in multiple pieces of legislation stretching back to 2001. In March, in the first of a series of planned reviews, the Intelligence and Security Committee (ISC) declared GCHQ’s activities as leaked by Edward Snowden to be legal, but said the law lacks transparency and accountability and could be interpreted as a ‘blank cheque’ for the security services.

Britain’s data protection regulator, information commissioner Christopher Graham, criticized the report for a basic misunderstanding: “At one point in the report they say specifically that if citizens are relatively OK about the security services reading letters and tapping phones with appropriate authorization, then why is the internet any different?

“I thought that represented a very naïve view of what the internet actually is, because it isn’t just another communications channel, it’s the universe through which we are transacting, doing business, [running] our companies, our work, our personal life, and so on. And the idea that that has got to be left open to be inspected by the authorities, whether good or bad, just seems to me to be ludicrous.”

Meanwhile, he adds, the same politicians speak regularly about cybersecurity, yet there is a fundamental tension between securing communications and infrastructure against myriad threats and guaranteeing that the authorities have access. “I thought it was naïve of the committee to assume that the bad actors wouldn’t take advantage of the vulnerabilities that might be left,” Graham said.

Content: Return of the Crypto Wars

Cameron is not alone in wanting access to encrypted communications. In March 2015, FBI director James Comey asked Congress to enact legislation requiring technology companies such as Apple and Google to include back doors in any encryption built into their products. Around the same time, the FBI removed from its website advice that consumers should protect their data by using encryption.

There are two kinds of objections to key escrow: ideological and technical. Susan Landau, professor of cybersecurity policy in Worcester Polytechnic Institute’s Department of Social Science and Policy Studies, describes the technical objection.

Both Obama and the newly re-elected Cameron have pushed cybersecurity up their governments' agenda

“Communications tools built with law-enforcement access to the keys will not be secure against skilled opponents. But the use of encryption where the end-users – and not Apple or Google, for example – hold the keys, means, as the president observed, ‘Even though the government has a legitimate request [to wiretap], technologically we cannot do it.’”
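Landau’s distinction can be made concrete with a short sketch. The example below is illustrative only and not drawn from the article: it assumes the PyNaCl library and shows end-to-end public-key encryption in which the private keys are generated and held solely on the users’ devices, so there is nothing for Apple, Google, or an escrow agent to hand over.

```python
# Illustrative sketch (assumption: PyNaCl library; not from the article).
# End-to-end encryption in which only the correspondents hold private keys.
from nacl.public import PrivateKey, Box

# Each user generates a keypair locally; the private half never leaves the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Only the public keys are exchanged between the parties.
alice_public = alice_private.public_key
bob_public = bob_private.public_key

# Alice encrypts to Bob (Curve25519 key agreement, XSalsa20-Poly1305 authenticated encryption).
ciphertext = Box(alice_private, bob_public).encrypt(b"meet at 9")

# Only Bob's private key can decrypt; no provider-held key exists to be escrowed or subpoenaed.
plaintext = Box(bob_private, alice_public).decrypt(ciphertext)
assert plaintext == b"meet at 9"
```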

Herb Lin, a senior research scholar for cyber-policy at Stanford University, says the ideological objection is simpler: individuals should have full control over access to their own communications.

However, Lin says, it’s impossible to build a mechanism that will stay locked down forever, because computing power continues to advance. But 1,000 (or even 100) years of security is long enough, while 10 seconds is clearly inadequate. “Somewhere between 10 seconds and 100 years there’s a crossover point,” he says.

Performing a risk analysis based on specific proposals and an estimate of how long the cryptography is likely to be secure in that application “would at least get the debate off the theological argument and on to the technical argument.”
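To put rough numbers on Lin’s point about security lifetimes, a toy calculation like the one below (not from the article; the attacker speed of 10^12 keys per second is an arbitrary assumption) shows how the expected lifetime of a brute-force attack swings from fractions of a second to geological timescales as key length grows.

```python
# Toy calculation (assumptions: brute force is the only attack; the attacker
# tests 1e12 keys per second). Illustrates Lin's "crossover point" between
# 10 seconds and 100 years of security; not drawn from the article.
SECONDS_PER_YEAR = 365 * 24 * 3600
KEYS_PER_SECOND = 1e12  # assumed attacker throughput

for key_bits in (40, 56, 80, 128):
    # On average, half the keyspace is searched before the key is found.
    expected_seconds = (2 ** key_bits / 2) / KEYS_PER_SECOND
    print(f"{key_bits}-bit key: ~{expected_seconds / SECONDS_PER_YEAR:.1e} years "
          f"(~{expected_seconds:.1e} seconds)")
```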

Lin also raises a practical issue: company helpdesks are already overwhelmed with requests to retrieve and reset user passwords. “I will bet anything that two to three years after all this unimpeachable encryption gets deployed, they will start offering recovery features,” he says. “People will not want to lose access to their data.”

Likely true, though privacy advocates will argue that choosing a (possibly third-party) key recovery scheme isn’t the same as having one forced upon you.

With six years of communications intelligence experience behind him, John Walker, visiting professor at the School of Science and Technology at Nottingham Trent University, takes a view more in line with law enforcement concerns about ‘going dark’.

“I respect privacy and I would like to have privacy,” he says, “but what we have to look at with a liberal attitude is whether we can allow insurgents – we’re talking about a global insider threat of which we have to be aware. If the price I have to pay to keep my legs attached to my torso is privacy, then so be it.” The key, he says, is ensuring that the use and exercise of such powers is proportionate and appropriately limited.

Metadata: Bulk Collection

The requirement for ISPs to retain communications traffic data for up to two years was established by the EU Data Retention Directive in 2006, a response to the July 7, 2005 London bombing attacks. The UK had long favored data retention; a giant centralized database to store the flow was mooted as early as 2000. The 2012 attempt, the Communications Data Bill, would have required communications service providers to collect many forms of data they do not currently collect, and to disclose it to a wide range of bodies under oversight that opponents such as the Open Rights Group argued was insufficient. The bill failed politically.

In April 2014, the European Court of Justice ruled that the Data Retention Directive was incompatible with the EU Charter of Fundamental Rights and declared it invalid, undermining the national legislation built on it. In July, Parliament hastily enacted the Data Retention and Investigatory Powers Act (DRIPA) to ensure that ISPs did not begin deleting the stored data during the summer recess.

Security is irrevocably weakened when keys are handed over to a third party

A key element of the Communications Data Bill as proposed in 2012 was the ‘black boxes’ to be installed on ISPs’ networks; traffic would pass through these boxes, which would extract the metadata for retention. The Internet Service Providers Association complained about the likely loss of speed; advocacy organizations such as the Open Rights Group compared the idea to a man-in-the-middle attack.
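The distinction between the communications data such a box would retain and the content it would, in principle, ignore can be sketched in a few lines. The example below is illustrative only, not the CDB’s actual design: it assumes the scapy packet library and a hypothetical capture file, and records only endpoints, ports, timing and volume while never reading the payload.

```python
# Illustrative sketch (assumptions: scapy library, hypothetical file 'traffic.pcap';
# not the Communications Data Bill's actual design). Separates communications
# data (who, when, how much) from content (the payload), which is never read.
from scapy.all import rdpcap, IP, TCP

records = []
for pkt in rdpcap("traffic.pcap"):
    if IP not in pkt:
        continue
    record = {
        "timestamp": float(pkt.time),  # when
        "src": pkt[IP].src,            # who...
        "dst": pkt[IP].dst,            # ...talked to whom
        "bytes": len(pkt),             # how much
    }
    if TCP in pkt:
        record["src_port"] = pkt[TCP].sport
        record["dst_port"] = pkt[TCP].dport
    records.append(record)  # the payload itself is deliberately never stored

print(f"Retained {len(records)} metadata records; zero bytes of content.")
```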

Retention practices such as this raise further questions about whether the principles of necessity and proportionality are being applied in the filtering of data – ‘filtering’ being a term used in early versions of the CDB but never explained in satisfactory detail. There is also a grey area around intelligence demands for data that is never used in legal proceedings; this is problematic, as is the general opacity of the law.

That opacity is the one point on which everyone can agree. “They already had Tempora,” says Privacy International researcher Richard Tynan. “The police and security agencies said ‘we want this, so make it lawful for us to do what we’re already doing’. To have that as the mindset is the opposite to me of any legal course I’ve done on the rule of law. They will say they can’t do it without authorization, but we don’t know what cannot be authorized by Theresa May. To me, that is an unconstrained system.”

Will Semple, vice-president of security operations for Houston-based Alert Logic and a veteran of both intelligence and financial services, has seen both sides, yet does not think that Cameron’s proposals are “a balanced approach, especially from a military intelligence background and understanding the risks I experienced day in and day out.”

Simon Crosby, co-founder and CTO of the security company Bromium, also calls the government’s policies poorly conceived: “Once [technology companies] start to engineer for security, the ability to provide arbitrary back doors to arbitrary interested parties is just not going to happen – or at the very least Theresa May will have to answer the question of, ‘should Yahoo! provide a back door to China?’”

More bluntly, he says, “The ‘Snooper’s Charter’ is techno-babble. It’s nonsense.”

Crosby, too, agrees that today’s genuine threats require access to data in some circumstances, but he’s scathing about the methods proposed. “They’ve only come out with two so far. One: break everything and be a bad guy, really terrible. Two: they’re going to pass stupid laws for technologies that are literally impossible to develop.”

What’s needed instead, he says, “is a rational debate about how one could legitimately achieve and deliver data in the national interest – and not just the UK and US. The internet is a big place; it’s an international problem.”


Resources

"'Emergency' Ushers in a New Era in British Communications Surveillance", by Wendy M. Grossman, IEEE Security & Privacy, Issue 06, November-December 2014: http://www.computer.org/csdl/mags/sp/2014/06/msp2014060084-abs.html

EFF analysis of the Senate Intelligence Committee's Cybersecurity Information Sharing Act (2015): https://www.eff.org/deeplinks/2015/03/senate-intelligence-committee-advances-terrible-cybersecurity-bill-surveillance

Susan Landau on Obama's encryption policies: http://www.lawfareblog.com/2015/02/finally-some-clear-talk-on-the-encryption-issue/

Obama's announcements: https://www.whitehouse.gov/blog/2015/01/14/what-you-need-know-about-president-obama-s-new-steps-cybersecurity

"It Came from Planet Clipper: The Battle Over "Key Escrow", by Michael Froomkin, 1996: http://osaka.law.miami.edu/~froomkin/articles/planet_clipper.htm


This feature was originally published in the Q2 2015 issue of Infosecurity.
