Child Porn Under the Carpet

UK culture secretary Maria Miller has held a summit with the world’s biggest ISPs to discuss the proliferation of child pornography and graphic images of child abuse online. "Child abuse images are horrific and widespread public concern has made it clear that the industry must take action", she said, adding "enough is enough". In response, Google has pledged £3m in grants to child-protection schemes, including an allocation for the Internet Watch Foundation (IWF), the body which trawls the internet in search of illegal and abusive images.

Of course, as more and more people are quickly learning these days, there is another side to the internet in the form of the ‘dark web’, where illegal activities go on largely unchecked. Using the Tor browser, users can buy drugs and hire assassins, as well as access child pornography. Bearing this in mind, the idea that a few high-profile search engines and ISPs could stop the proliferation or downloading of these images seems rather naïve. At best, Miller can hope to make it slightly more difficult for members of the public to find these images, and to send a clear message that such behaviour is both illegal and socially unacceptable.

More than anything, the summit seems to be a gesture, with little real hope of achieving anything beyond closer regulation of internet content for ordinary users of the non-dark web, and sweeping illegal behaviour back where it belongs: under the carpet and out of sight in the dark web. As the old adage goes, out of sight, out of mind. The invisibility of these images would surely comfort worried members of the public, despite their continued proliferation on the other side of the web. However, this slight deterrent may also be enough for many people, whose passing curiosity will now never be sparked by a stumbled-upon image or an idle search term. In this sense the old adage rings doubly true: for many people, keeping these images out of sight may well keep them out of mind.

One of the major problems with online pornography seems to be its ability to neutralise taboo and to normalise the previously unacceptable. Many researchers, including Gail Dines and William Struthers, have written extensively about the negative effects of watching pornography, which include the tendency to "fall deeper into the mental habit of fixating on [pornographic images]", creating automatic neural pathways through which every real-life sexual experience is then routed. It is easy to see how violent and abusive pornographic images could create an addiction to violent and abusive real-life sexual encounters. Commentators argue that cases such as that of Mark Bridger, who was found to have watched violent pornography before murdering April Jones, highlight the very real problem presented by pornography.

It is true, however, that not everyone who enjoys violent pornography goes on to commit violent crimes. Many people would argue that they never forget the very important moral difference between fantasy and reality. Recently, a new peer-reviewed academic journal has been created to study pornography. The journal, set up by academics Feona Attwood and Clarissa Smith, has been accused of being pro-pornography, with Gail Dines describing its editors as "akin to climate-change deniers" who are "taking a bit of junk science and leaping to all sorts of unfounded conclusions". However, as Diane Abbott, Labour MP and UK shadow minister for public health, said in response to news of Maria Miller’s summit, we are facing the "sexualisation and pornification of our society" as online pornography proliferates. It seems legitimate to want to study the effects of this ‘pornification’.

The argument about what we should and should not be allowed to see online rages on. In the meantime, Britain’s main ISPs, Virgin Media, BSkyB, BT and TalkTalk, have all agreed to work with the IWF and the Child Exploitation and Online Protection Centre to crack down on illegal images of child abuse. Presumably, this means that every image on their servers will have to be manually checked for illegal content. Although this represents a further step towards the regulation of the internet, the hope is that the move will in some small way protect more children against the abuse perpetrated by pornographers.
