A Safer Life Online for Women and Girls: our response to Ofcom's guidance

Introduction

Today we welcome the publication of guidance by Ofcom for tech platforms on how to keep women and girls safer online, a milestone in our collaborative efforts to ensure that the disproportionate abuse experienced by women and girls online is addressed.

During the passage of the Online Safety Bill in 2022, the Online Safety Act Network’s Prof Lorna Woods and Maeve Walsh (then at Carnegie UK) worked alongside leading VAWG organisations and experts, including the NSPCC, 5Rights, EVAW, Refuge, Glitch and Professor Clare McGlynn, on the development of a code of practice on online VAWG. This work was a key component in the successful campaign to amend the Bill to secure greater protections for women and girls, addressing the disproportionate levels of harm and abuse they experience online.

Whilst the journey to this day has not always been smooth, we are delighted to have played a role in the publication of Ofcom’s guidance, which would not exist were it not for the tireless campaigning of civil society organisations in the VAWG sector and Parliamentarians from all parties, including Baroness Nicky Morgan and Alex Davies-Jones MP.

The new guidance sets out a raft of ‘foundational steps’ and ‘good practice’ that companies should take to ensure that their platforms do not enable the harassment and abuse of women, and that they act swiftly when such abuse does occur. We provided detailed feedback on the previous iteration of the guidance, which Ofcom consulted on earlier this year.

Whilst we note several positive additions to the guidance below, we are disappointed that many of the weaknesses we flagged to Ofcom during the consultation remain, several of which replicate structural weaknesses that prevail in the Illegal Harms and Children’s Codes. We respond to some of the main changes below and will provide further analysis once we have absorbed the details.

Since Ofcom carried out its consultation process, it has:

  • Included new measures on prompts and nudges, timeouts, promoting diverse views, and de-monetising accounts that promote misogynistic abuse.
  • Replaced “domestic abuse” with “stalking and coercive control” to align more clearly with the illegal harms duties.
  • Expanded thinking on sexual violence, particularly where linked to pornographic content, even when the pornography is not criminal but depicts sexual violence.
  • Tightened the definition of “online misogyny”, now framed as “online misogynistic abuse and sexual violence” as its own harm category. Content expressing traditional gender roles will not be included in the definition of misogyny unless it promotes abuse.
  • Broadened the scope of who the Guidance supports, clarifying that its protections should be afforded to everyone, and added case studies on intersectional risks, which specifically include boys, to recognise that risk is not uniform.

Our response:

We are pleased to see new measures on prompts and nudges, friction, rate-limiting and mass blocking, which signal a greater focus on upstream measures and safety by design, as well as the intersectional understanding of misogyny as overlapping with other forms of discrimination such as racism and transphobia.

However, we are disappointed that, despite express reference to safety by design, it is not well defined within the guidance, as is the case in the Act itself, and examples of ‘good practice’ tend to focus on interventions made ex post, such as content moderation and user empowerment. Whilst it is positive that Ofcom acknowledges safety by design must be a continuous process and that “a priority of safety-by-design should be designing out risks”, further clarity on how platforms should interpret safety by design would ensure greater consistency and a better understanding of how a platform’s ecosystem can contribute to users’ experiences of abuse.

We particularly welcome the recognition that certain types of ‘legal’ pornography can still be harmful when depicting sexual violence. For example, pornography depicting incest, rape, or adult actors playing the role of a child is not criminalised in the UK and is easily accessible to adults, who often encounter it inadvertently on social media sites such as X without actively seeking it out. Such content can still be abusive in nature and extremely harmful to women and girls.

Responding to feedback from respondents that stalking was not well covered in the guidance, Ofcom has changed its overarching harm types to specifically reference stalking, and has included new measures on geolocation removal to ensure that perpetrators cannot use location settings on platforms to stalk an individual. This is an important recognition that stalking is a distinct form of harm from domestic abuse, as the perpetrator is not always known to the victim.

Whilst changing ‘domestic abuse’ to ‘stalking and coercive control’ more clearly aligns with the Act, the term ‘coercive control’ is less well understood by the public and may be misinterpreted. Although Ofcom’s Illegal Content Judgements Guidance should help provide clarification to regulated services, it is unlikely to be read more broadly. We encourage platforms to be clear and ambitious when it comes to tackling and responding to domestic abuse on their platforms, to ensure survivors are adequately protected.

Notably, this iteration of the guidance focuses more on the behaviour of men and boys online, both as victims of gender-based harms and as perpetrators of online abuse. We hope this focus on men and boys’ behaviour will lead to the more preventative, upstream approach from platforms that the VAWG sector has been calling for. However, we will examine these measures in further detail to ensure they do not weaken the ambition of the guidance, which was always to recognise the disproportionate harassment and abuse of women and girls online.

Despite feedback from the online VAWG network that ‘foundational steps’ should be renamed ‘minimum steps’ in the guidance, Ofcom has decided not to change its framing, citing the “safe harbour” provision in the Act (para 3.70). We have already set out our concerns about safe harbour provisions, and had hoped that the flexibility afforded by the non-enforceable nature of the guidance might have encouraged Ofcom to be more ambitious here than in the codes.

Ofcom’s commitment to publish a follow-up report in 18 months’ time is very much welcome, and it will be an important measure of how tech platforms are tackling harm against women and girls on their platforms. Whilst we still await clear direction on how Ofcom will measure success, we urge it to be robust in holding platforms to account, calling out bad practice when it sees it and taking enforcement action where appropriate.