The danger in service operators’ censorship filters

This report follows the publication of an Independent Parliamentary Inquiry into Online Child Protection, led by Conservative MP Claire Perry and published on 17 April. The BBC subsequently reported that Prime Minister David Cameron would ‘consult’ on the issue: "I want to fully explore every option that might help make children safer - including whether internet filters should be switched on as the default, so that adult content is blocked unless you decide otherwise."

The Claire Perry report makes eight specific recommendations, including tasking ISPs with rolling out “single account network filters for domestic broadband customers” and supporting this with “backstop legal powers to intervene should the ISPs fail to implement an appropriate solution.”

The new report from the Open Rights Group and the LSE points out that something similar is already in place with the mobile providers. While it stresses that the mobile operators haven’t got it all wrong, it identifies and discusses four specific problems: some sites are incorrectly classified; the operators are not transparent about how their systems work; it is often unclear how to report mistakes; and it can be difficult to turn the filtering off. (Your author has personal experience of T-Mobile blocking the blog of James Firth, CEO of the Open Digital Policy Organization, and the website of an international packaging company – possibly because the filter didn’t like talk of its large package.)

The Open Rights Group and the LSE investigated 60 incidents of what they term “incorrectly blocked sites between 1st January and 31st March 2012.” The report gives a sample list of ten, in which the political danger of allowing censorship is immediately clear: four of these ten sites are Tor (the project designed to provide anonymity on the web), La Quadrature du Net (a French civil liberties group currently taking a lead against ACTA), Septicisle.info (a blog of political opinions), and Biased-BBC (a site that challenges the BBC’s impartiality). None of these sites contains ‘adult’ content.

The absurdity that can arise is further detailed in a case study involving the website of a Sheffield church. This was blocked by O2 “throughout the second half of 2011”, the operator claiming that it featured adult content. When a church member noticed the block and reported it, all he could achieve was the removal of the block from his own phone. He received “a text informing him he could ‘now access 18-rated content’. He was told that the church website itself could not be removed from the filter.”

The Open Rights Group is not against filtering per se, but makes three ‘asks’ of the operators. Firstly, the choice should be simple and straightforward. It advocates ‘active choice’ rather than either opting in or opting out, and quotes last year’s Bailey Review: “...when a new device or service is purchased or contract entered into, customers would be asked to make an active choice about whether filters should be switched off or on...”.

Secondly, it asks the operators for greater transparency about what kinds of content might be blocked, who provides any filtering that is used, and what information is made available to website operators, as well as easier ways to complain about errors.

Thirdly, there should be greater effort put into ‘redress and review’: website operators should be able to challenge a refusal to remove a block, and the mobile operators should review the performance of their filters.

“What mobile filtering already helps to demonstrate,” concludes the report, “is that seemingly simple, laudable goals such as protecting children through technical intervention may have significant harmful and unintended consequences for everybody’s access to information.”
