Problems with the EU’s proposed ‘right to be forgotten’

One problem at the heart of the EU’s proposal is the very scope of personal data. The existing definition is broad: personal data is any data that can be used to identify a natural person. But, says the European Network and Information Security Agency (ENISA), it doesn’t specify whether that identification can be with a high degree of probability but not absolute certainty (such as “a picture of a person or an account of a person’s history”), or whether it includes identifying a person not uniquely, “but as a member of a more or less small set of individuals, such as a family.” This issue will only become more complex with the growth of big data analytics, where an individual may never be overtly named, but may still become recognizable through the accumulation and association of different items.
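To make that re-identification risk concrete, here is a toy sketch of the classic linkage attack. All data and field names below are invented for illustration; the point is simply that attributes which are individually non-identifying can, in combination, single a person out.

```python
# Toy illustration of a linkage attack: all data below is invented.
# Neither dataset uniquely names the person on its own, but joining
# quasi-identifiers (zip code, birth date, sex) singles them out.
health_records = [  # "anonymized" dataset: names stripped, attributes kept
    {"zip": "10115", "birth": "1975-03-02", "sex": "F", "diagnosis": "flu"},
    {"zip": "10117", "birth": "1980-07-19", "sex": "M", "diagnosis": "asthma"},
]
voter_roll = [      # separate public dataset that does carry names
    {"name": "A. Example", "zip": "10115", "birth": "1975-03-02", "sex": "F"},
]

for record in health_records:
    for voter in voter_roll:
        if (record["zip"], record["birth"], record["sex"]) == \
           (voter["zip"], voter["birth"], voter["sex"]):
            # The combination of attributes re-identifies the individual.
            print(voter["name"], "->", record["diagnosis"])
```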

Another difficulty concerns who can request the deletion of data. One example given is a photograph of two people, where one party would like it removed and the other party would like it retained. “Who gets to decide if and when the photo should be forgotten?” asks ENISA. And what about embarrassing news reports? “Should a politician or government be able to request removal of some embarrassing reports?”

ENISA then moves to the technical difficulties involved in the right to be forgotten. It notes that in an open global system such as the web, anybody can copy any data and store it anywhere. In short, “enforcing the right to be forgotten is impossible in an open, global system, in general... [since] unauthorized copying of information by human observers is ultimately impossible to prevent by technical means.” It can only be achieved within ‘closed systems’ such as access-controlled public networks entirely within the jurisdiction of the EU.

One approach to data expiration currently being explored is encryption. The basic principle is that the personal data is stored only in encrypted form, under a public/private key pair, and is readable only with the private (decryption) key. On demand, or at a pre-specified date, that decryption key is destroyed, rendering the data concerned unviewable or unintelligible. There are technical issues, such as the scalability of the key-management system and indeed the security of the keys themselves – if a key can be compromised or copied before it is destroyed, then the content remains accessible even after it has been ‘forgotten’. But even without these issues, the data or photos can be copied or captured while in plaintext and stored outside of the closed system, either privately or publicly on the internet.
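This key-deletion idea, sometimes called crypto-shredding, can be sketched in a few lines. The example below is illustrative only: for brevity it uses symmetric encryption from Python’s third-party cryptography package, whereas ENISA discusses public/private key pairs, but the principle is identical – destroy the key, and any surviving ciphertext becomes unintelligible.

```python
# Minimal sketch of "crypto-shredding": personal data is stored only as
# ciphertext, and is "forgotten" by destroying the key, not the data.
# Requires the third-party 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet, InvalidToken

# 1. Encrypt the personal data at rest.
key = Fernet.generate_key()               # held by a key-management system
ciphertext = Fernet(key).encrypt(b"subject's personal data")

# 2. On demand, or at a pre-specified date, destroy the key. Any surviving
#    copies of the ciphertext (backups, caches) are now unintelligible.
del key

# 3. Decryption without the original key fails by design.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("ciphertext is unrecoverable once its key is destroyed")
# Caveat noted above: none of this helps if the plaintext was copied
# before the key was destroyed.
```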

A final suggestion offered by ENISA is hiding the data rather than removing it. Since most data is found via the major search engines – Google, Yahoo and Bing – removing expired data from their results would make it difficult to find. “A possible partial solution,” suggests ENISA, “may be a legal mandate aimed at making it difficult to find expired personal data, for instance, by requiring search engines to exclude expired personal data from their search results.”
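As a rough illustration of what such a mandate might ask of a search engine, the hypothetical sketch below filters expired records out of the results while leaving the underlying index untouched. The ‘expires’ field and the filtering mechanism are assumptions for illustration, not part of ENISA’s proposal or any real search engine’s design.

```python
# Hypothetical sketch of ENISA's "hide, don't delete" suggestion: a search
# engine omits expired personal data from results while the underlying
# pages remain online. The 'expires' field is an assumed mechanism.
from datetime import date

index = [
    {"url": "https://example.org/old-profile", "expires": date(2012, 1, 1)},
    {"url": "https://example.org/news-story",  "expires": date(2099, 1, 1)},
]

def search_results(index, today=None):
    """Return only records whose expiry date has not yet passed."""
    today = today or date.today()
    return [record for record in index if record["expires"] > today]

print(search_results(index))  # the expired profile page is filtered out
```

The data itself still exists and remains reachable by other means – it is merely harder to find, which is why ENISA calls this only a partial solution.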

ENISA’s report effectively says that the right to be forgotten cannot be guaranteed by technical means. It suggests that the EU should tighten its definitions of what is meant by personal data, and of what is meant by ‘forgotten’ (is it, for example, merely making the data inaccessible to the public, or actually deleting it?). It also points to its earlier report on privacy and behavioral tracking, noting that tighter control over what personal data is collected in the first place gives data subjects better control over their personal data – which is, after all, the purpose underlying the right to be forgotten.
