Statement on Ofcom's protection of children codes

This statement from the OSA Network on Ofcom's protection of children online proposals and the related codes of practice is also attached as a PDF at the bottom of this page, with a supporting annex.

This statement reflects the views of the Online Safety Act Network and is supported by those who have signed on below. It focuses on the proposals set out in Ofcom’s recent Statement on Protecting Children from Harms Online, which the regulator published on 24 April, and the final versions of the codes of practice for user-to-user services and search, which have been laid by the Secretary of State in Parliament.

Summary

This statement builds on our summary and full response to Ofcom’s consultation on the protection of children proposals, which it published in March last year, as well as the collective response we coordinated on behalf of the Violence Against Women and Girls (VAWG) sector to that consultation.

It also - regrettably - revisits many of the issues we raised in our Network statement after Ofcom had published its illegal harms proposals last December. While the publication of the children’s codes is undoubtedly an important step forward in the protection of children online and a significant milestone in the implementation of the Online Safety Act, it does not go as far as we would like nor as far as the Act, as passed by Parliament, allows Ofcom to go. Despite the consistent feedback the regulator has received from civil society organisations for over 18 months, our concerns about Ofcom’s cautious interpretation of the legislation remain. In sum, while there have been changes to individual measures, the fundamental approach does not locate responsibility for safety with the services themselves.

We have updated the table we submitted to Ofcom as part of its consultation, which compares the risks the regulator identifies with the lack of related measures in the Codes to address them. There was - and remains - a clear gap between the scale of potential harm and the measures that providers are expected to take to address it, with the safe harbour provision in the Act (which we discuss below) exacerbating this.

Ofcom’s responses to our consultation recommendations

As the supporting material sets out, plenty of evidence was provided to Ofcom on the concerns of civil society about its fundamental approach to the implementation of the Act. While there are some small changes in response to civil society representations, which we set out in the next paragraph, the fundamental issues we flagged with that approach remain. The substance of these concerns - raised across multiple consultations - is informed by the lived and often traumatic experience of victims and survivors of harms, and their advocates, which the many Parliamentary supporters on all sides of both Houses intended to address through passing the legislation in the first place.

We appreciate that there are some small changes in the final codes in the following areas:

  • the expectation that services should consider providing age-appropriate experiences particularly as it relates to content
  • an acknowledgement that services can use age assurance systems to establish age below 18 years
  • expectations relating to age assurance in end-to-end encrypted spaces
  • expectations around the introduction of additional barriers to children accessing group chats where reports of harmful content have been made
  • small changes in relation to material that has a bearing on misogynistic content insofar as it affects children.

Businesses, conversely, have convinced the regulator to remove a number of measures from small, low-risk platforms, thus weakening protections still further. Their representations have also led to the final text of the measure on content moderation being weakened by the inclusion of the caveat that swift takedown is required only if “technically feasible”; a similar change to a measure in the illegal content codes prompted considerable anger earlier this year (see our response here), although there has been some welcome movement in this set of codes around the regulator’s expectations on what may or may not be “technically feasible”.

While Ofcom has shown somewhat more flexibility in this consultation than in the illegal harms consultation that preceded it, the regulator has still, in general, responded more positively to industry representations than to those from civil society, which - despite the level of engagement - rarely result in more than some tinkering around the edges.

We set out in the annex to this statement (attached to the PDF available below) a comparison of the points we and other Network members made in response to the consultation on the draft children’s codes and the changes Ofcom has made in the final versions (if any) to illustrate this point. As with our responses on the illegal harms codes, our recommendations to Ofcom fell into two broad categories:

  • Those that stem from our assessment that Ofcom could have interpreted the Act in a less cautious way in order to ensure that the obligations placed on regulated services - and, consequently, the protections afforded to users - were as stretching and effective as possible.
  • Those that highlighted where Ofcom’s choices about what regulated services were required to do in order to comply with their duties - eg in Ofcom’s risk assessment guidance, or in the content of the draft codes - were limited, even within Ofcom’s preferred interpretation of the legislation.

Ofcom’s interpretation of the legislation

As we highlight in the annex, Ofcom’s responses in the statement when rejecting civil society recommendations are brief, with no alternative offered to address the substantive issues. These include:

  • the risk of harm being left unmitigated at scale
  • the gaps between the risk assessment duties and the measures companies must take to address their risks
  • the skewed approach to proportionality that prioritises an economic view over user safety - and indeed a bald statement that Ofcom has not even considered the economic or societal costs of online harm in its approach to regulation
  • the loopholes which companies might exploit as technologies develop and change.

Ofcom’s response therefore frequently falls back - even more explicitly than in its response to the illegal harms consultation - onto its interpretation of the Act as justification for inaction.

We also highlight in the annex some of the material where Ofcom sets out these apparent limitations. This is particularly pertinent because in February we provided DSIT with a number of recommendations to amend the Act to remove these very barriers to action that stem from Ofcom’s interpretation of it, and produced pre-drafted amendments for many of these, which we subsequently submitted to the Data (Use and Access) Bill Committee.

We are disappointed that the Department did not take the opportunity to pursue these amendments: had it done so, it might then have been able to announce that, alongside the children’s codes, it was laying these amendments for the Report stage of the Data (Use and Access) Bill on 7 May, to ensure that the next iterations of the codes could be more expansive. Instead, the Secretary of State responded to the publication of the codes by floating a number of unconnected policy interventions in an interview with the Telegraph, rather than providing tangible legislative responses to address the gaps in the regime.

Iterations and delays

As a Network, we are concerned about the significant missed opportunities here. While Ofcom was keen in its media approach to characterise these codes as “transformational”, its consultation documents suggest otherwise. Meanwhile, a further illegal harms consultation to add new measures to those codes - many addressing gaps that had to be flagged to the regulator by civil society in response to its first consultation in February 2024 - was promised first in December 2024, then in April this year, and has now been pushed back to later this summer; the measures proposed there will not be in force for at least another 18 months. No timescale has been proposed for further consultations on new measures for the child protection codes.

The time taken in drawing up, consulting on and implementing the codes is unacceptable, leaves victims and vulnerable users open to significant harm, and undermines the repeated assurances from Ofcom that it understands the material impact of the concerns that have been expressed to it. The failure of DSIT to amend the Act in the meantime - such that the barriers Ofcom repeatedly refers to are removed and those future codes could be more robust - is a further, regrettable missed opportunity.

In summary, as we said in response to the illegal harms codes in January, “the purpose of the OSA is for regulated services to assess and mitigate the risk of foreseeable harm to users of online services. Organisations in our Network fought hard, and engaged in detailed policy development and engagement work, over many years to ensure that the legislation delivered this”.

We stand by our assessment, which we made first in February 2024 in response to the illegal harms consultation and again in January 2025: “interpretations of the Act involve some degree of judgement and choice. Ofcom has chosen an interpretation of the Act that does not use all the flexibility it provides, resulting in a first set of codes that set a weak foundation for user safety as the OSA regime takes effect.”

Many of the mistakes made in the illegal harms codes have been entrenched in these new codes. We now call on DSIT to intervene urgently to amend the Online Safety Act to ensure that these codes’ future iterations deliver on the expectations of Parliament - as well as the objectives for the regime that the Secretary of State has himself set out in his Draft Statement of Strategic Priorities.


OSA Network

May 2025

Signed by

Antisemitism Policy Trust

5Rights Foundation

Center for Countering Digital Hate

Molly Rose Foundation

Parentzone

CEASE (Centre to End All Sexual Exploitation)