Online Harms & Image-Based Sexual Abuse

These forms of abuse are becoming more widely recognised and understood. We are campaigning for law reforms and robust regulation.

Image-based sexual abuse is a pervasive and pernicious form of sexual abuse. We use the term ‘image-based sexual abuse’ to refer to a broad range of abusive behaviours: the taking and/or distribution of nude or sexual images without consent, and threats to do so. This covers so-called ‘revenge porn’, ‘upskirting’, fake porn, sexual extortion and videos of sexual assaults and rapes.

The EVAW Coalition believes that the new law and policy being considered by the Government in relation to Online Harms, in close ‘negotiation’ with the tech giants, is a critical area where abuse of women and girls is very real, is increasing and needs specific naming and commitments. Any proposals in this area need to focus on changing the behaviour and attitudes of those who perpetrate online harms, and the systems that enable them, rather than telling individuals how to ‘stay safe’ online.

UPDATE September 2020: A new survey and report by Glitch and EVAW reveals increased abuse during Covid and calls on tech companies and the Government to act. Read more here.

In spring/summer 2019 the Government consulted on its Online Harms White Paper, which was designed to ‘make the UK the safest place in the world to be online’. Two of the main harms the paper sought to address were child sexual exploitation and terrorism.

The paper proposed public education and awareness-raising, alongside a technical and complex regulatory framework for digital companies. It did not include any gendered analysis or proposals.

In our response, we sought to bring a gendered analysis to the Government. Some of our main points are outlined below:

  • The Government’s framework for regulating and attempting to prevent online harms should include detailed recognition of violence against women and girls (VAWG) and its online forms.
  • Online VAWG should be recognised as a wide and growing set of harms, including (but not limited to) image-based abuse, online harassment, sending unsolicited images, and the creation and sharing of ‘deepfake’ pornography. These harms should be recognised as related to one another because they have common drivers and intersect with other inequalities.
  • The Government should make financial provision for independent, trauma-informed support for victims of online harms.
  • A regulator with teeth and independence.
  • Legal reform that properly addresses different forms of online VAWG.
  • Mandatory transparency reports for tech companies which are accessible and easy to find, with high levels of disaggregated data on the types and instances of harms reported and the demographics of victims and perpetrators.
  • Liability for online harms should be recognised as resting with the tech companies and their designers as well as individual perpetrators; responsibility or liability for harm should never be imputed to a victim who should have “kept themselves safe”.
  • Relating especially to ‘private communications’ enabled by online platforms, there should be a new high-level principle at the tech commissioning and design stage: a company that seeks to build in private chat for users whose identity is not verified should be required to take into account, address and reasonably mitigate potential harms. This should apply to applications aimed and marketed at adults or children, and would include, for example, gaming platforms aimed at children and young people; dating sites; and social media aimed at children and young adults.
  • A commitment to ‘future-proofing’ in the area of online harms, including online VAWG regulation, to ensure that the ever-growing use of AI (see, for example, deepfakes) and other ways in which online harms will be perpetrated in future are within the scope of the emerging policy and the regulator’s powers.
  • Tech companies should create safety by design.

In February 2020 the Government published an initial response to the consultation, which looks predominantly at the quantitative rather than the qualitative responses.

The main points are as follows:

General

1. Freedom of expression – some organisations were concerned about the impact on freedom of expression online, and the Government has indicated that, as a fundamental right, it will be protected. It has also clarified that the legislation will only apply to organisations that provide services, or use functionality on their websites, which facilitate the sharing of user-generated content or user interactions, for example through comments, forums or video sharing.

2. Identity of regulator – the Government is minded to go with OFCOM.

3. Transparency reporting – the Government has formed a multi-stakeholder Transparency Working Group.

4. A full response will follow in the spring.

5. The Government will issue interim codes of practice on how to tackle online terrorist and child sexual exploitation and abuse (CSEA) content and activity; the codes will be voluntary and will bridge the gap until the regulator is up and running.

Findings from the consultation

1. Profile of respondents – 1,531 respondents via the portal and 908 via email. 84% were individuals and just 16% organisations.

2. Of the 90% of individual respondents for whom demographic data were collected:

  • 72% of respondents were men
  • The largest age group was 45-54, at 21%
  • 60% were white (British, Northern Irish, Welsh, English or Scottish), followed by 11% white other.
  • 11% considered themselves disabled.

The Government accepts that this sample may not be representative of the population as a whole.

3. Some organisational responses were co-ordinated by Samaritans, Hacked Off and the Open Rights Group; these responses were identical or very similar and were submitted through central co-ordination.

4. Key themes: respondents welcomed the targeted, proportionate and risk-based approach that the regulator is expected to take. Responses highlighted that flexibility is vital, and many asked for more detail on the breadth of services and harms that fall within regulation.

5. Key themes: respondents highlighted the importance of companies having appropriate and accessible reporting mechanisms for harmful content. Organisations showed more support for super-complaints than individuals did.

6. Key themes: Most respondents were keen on transparency reporting, although some companies expressed reservations, stating that requirements should be proportionate.

7. Key themes: Most respondents favoured a tiered and proportionate enforcement approach, with civil society groups in favour of firm enforcement. Some industry and rights groups expressed concern about the impact of measures on the UK’s attractiveness to the tech sector and on freedom of expression.

8. Key themes: Most respondents did not think private communications services should come within scope; however, some felt they should be included and that platforms should be responsible for users’ safety.

9. Key themes: Most respondents agreed that there should be a mechanism for appealing the regulator’s decisions.

10. Key themes: Most respondents agreed with the idea of safety by design.

The majority of the response then looks at the answers provided through the online portal, specifically the percentages of organisations and individuals who agreed or disagreed with the Government’s proposals. There will be a second, more detailed response in due course, which we anticipate may look at the qualitative data in more depth and may include further analysis of responses that did not come through the portal.

On 16 October 2019 the Government announced that it would be dropping age verification for porn sites, and that the issue would instead be dealt with through the Online Harms regulation.

The Government ran a consultation on its Online Harms White Paper from April to July 2019. We have attached our final response to the Online Harms consultation here.

We also spoke at the launch of “Shattering Lives and Myths” in July 2019, a report on image-based sexual abuse by Professor Clare McGlynn, Professor Erika Rackley and Assistant Professor Kelly Johnson, which draws on interviews with 25 victim-survivors of image-based sexual abuse and over 25 stakeholders and sets out a series of recommendations.

On 25 June 2018 the Ministry of Justice announced that the Law Commission would review the laws around the non-consensual taking, making and sharing of sexual images.

 
