Blog - Commentary

Law against revenge porn should target social media platforms

We must focus on the social media sites that publish or host 'revenge porn' images rather than on criminalising those who post them to effectively tackle the problem, writes Dr Samantha Pegg of Nottingham Law School.


The growing problem of revenge porn, the dissemination of sexual images either online or in hard copy without the consent of the person featured, continues to attract significant media attention. The offence, formally titled “disclosing private sexual photographs and films with intent to cause distress”, was introduced in England and Wales in April 2015 and has since resulted in more than two hundred prosecutions.

This figure needs to be read in light of the number of reported incidents, which stands at more than a thousand. Although many cases fail to proceed due to a lack of evidence, a withdrawal of support by the victim has also been a contributory factor in driving down the number of prosecutions. As this is not a sexual offence, in which identification of the complainant is prohibited, many victims have been unwilling to draw further attention to disclosures by pursuing cases. Consequently, there have been calls to reclassify the offence as sexual, both to introduce anonymity and to recognise that these disclosures are often part of a campaign of sexual harassment or abuse.

The focus of the revenge porn offence is on the dissemination of the image, on or offline, with the intention of causing distress to the individual featured. Private sexual images are defined as images that show an individual's exposed genitals or pubic area or are otherwise something a reasonable person would consider to be sexual by virtue of their nature or context. Disclosure must be made to a third party and, unsurprisingly, most cases have concerned images posted online.

Facebook has been a popular method for the non-consensual disclosure of these sexual images, with offenders uploading pictures to their own pages and to pages set up for the purpose of humiliation. While Facebook has been reasonably responsive in removing sexual photographs and videos – as have other social media platforms – there is no requirement for them to proactively weed out these images. Currently their duty extends only to removing images within a reasonable period of time after they have received notification. Facebook, Twitter and Reddit also have policies that ban the posting of revenge porn on their platforms with accounts restricted, removed or locked if the policy is breached. Yet this, like the revenge porn offence, is a response that can only be deployed after the horse has bolted.

This leads us to a claim that has been brought in Belfast’s High Court. A naked image of a 14-year-old girl was posted on Facebook as an act of revenge. Facebook removed the image when notified, but did not prevent it from being republished on the site multiple times. Although the Northern Ireland Assembly enacted a revenge porn offence in 2016, which mirrors that introduced in England and Wales, and a criminal action can be taken against the person who posted the image, the girl has also brought a civil claim against the social media platform.

Interestingly, this case takes us back to the debates that preceded the introduction of the revenge porn offences. When it was announced that revenge porn was to become a specific offence in England and Wales, I was dismissive of the move – pointing to the myriad civil actions that could be taken to suppress such content without singling out its sexual nature or criminalising the person who posted the images. True, these actions are slow, expensive and offer no guarantee that the image will be permanently removed, but much of this is also true of the criminal response.

I remain unconvinced that the correct approach has been taken in tackling revenge porn. Although the conviction of offenders has been welcomed – albeit less so in cases where the sentence has been nominal – the focus remains too firmly placed on criminalising the individuals who disseminated the image.

The criminalisation of revenge porn has not succeeded in deterring those who are driven by spite, stupidity or a lack of awareness from posting sexual photographs and videos. It is this that poses the real problem, as once images are published it is almost impossible to suppress them. Instead we must focus on the sites that publish or host such images. Sites that raise revenue from the traffic, the advertisers and the subscribers they attract must take some responsibility. Facebook already uses software to identify and block child abuse images, but this is not, as the Northern Ireland case demonstrates, infallible.

While the threat of substantial damages may compel social media sites to police their domains more proactively, we should also reappraise the legal approach that is taken. It is not beyond the reach of the law to enact more creative civil or criminal responses to deter social platforms from hosting such images at all.

Dr Samantha Pegg is a senior lecturer at Nottingham Law School, part of Nottingham Trent University.

Posted by: Dr Samantha Pegg

18 October 2016

