The coming into force of the children's safety duties: what can we expect?

This Friday (25th July) marks a significant milestone in the implementation of the Online Safety Act: the children’s safety duties come into force, and companies in scope must comply with them in addition to the illegal harms duties already in force. But what does this mean in practice? And will it make a difference?

The duties

As of Friday, user-to-user services that are “likely to be accessed by children” must ensure that children are prevented from encountering primary priority content (including suicide/self-harm material and pornography) and protected from being exposed to a list of other priority content. (Prior to this deadline, those services should have carried out a children’s access assessment and a risk assessment.) For some services, this will mean introducing “highly effective” age assurance (HEAA) to keep children off the entire service; others will need to use HEAA to target a series of safety measures (which Ofcom has set out in its codes of practice) at children, protecting them from exposure to priority content through content moderation or recommender feeds. (Age assurance not only has to be highly effective but must also comply with privacy rules, as per the ICO’s guidance on the children’s code.)

For search services, the obligation is slightly different. They must minimise the risk of children of any age encountering primary priority content, as well as minimise the risk that children in age groups judged to be at risk of harm from other harmful content encounter it. In essence, search services “likely to be accessed by children” will need a search moderation function designed to downrank and/or blur content that is harmful to children and, in the case of large services, to filter out primary priority content for users believed to be children.

We have a helpful explainer on what the child safety duties cover which you can access here.

The impact

There is a lot riding on the implementation of these duties. Dame Melanie Dawes, the Ofcom Chief Executive, has called them “transformational” and a “game changer”. We, along with many of our civil society partners, think the codes of practice that underpin them could have been much stronger. These conflicting assessments are about to be tested - and much will depend on Ofcom swiftly enforcing against breaches from Friday onwards.

There are positive signs from its early enforcement of the part 5 pornography duties. A number of investigations have been opened into porn sites which have not provided adequate risk assessments and, last month, Ofcom announced that the major commercial porn providers - including Pornhub - were going to introduce HEAA to keep children off their sites by Friday’s deadline. Ofcom has also announced a series of investigations into sites which have not engaged with the prior requirement to provide children’s risk assessments (a duty that came into force in April), and investigations are underway into a number of sites relating to their compliance with CSAM measures under the illegal harms duties.

However, with the duties coming into force in just a few days’ time, there is surprisingly little industry noise in relation to HEAA for compliance with the children’s duties. Dating sites, notably Grindr, have implemented age verification. Bluesky announced a couple of weeks ago that it would introduce age verification for UK users, and Reddit announced it would be doing similar to stop under-18s accessing “certain mature content”; Discord have also announced their plans in the last couple of days. Ofcom said they “expect other companies to follow suit, or face enforcement if they fail to act”. (Notably, the major platforms are now arguing, both in the US and most recently in Europe, that age assurance should be carried out at device level.) At this stage, a few days before the compliance deadline, one might have expected more companies - particularly the biggest social media platforms - either to fall into line on HEAA where relevant, or to make clear to their users what measures they will be taking to comply with their wider children’s codes duties, whether or not they want to make a PR song and dance about it.

Public perception

But communication to the public about why these age checks are coming in, and what the legal requirements are on the companies that do (or do not) introduce them, is noticeably lacking. If we look at Ofcom’s approach to enforcement to date - such as its open investigations into services’ children’s risk assessments - the regulator seems as likely to go after services for procedural errors as to wait for a failure to comply with a big-ticket duty. But harnessing public pressure is as valuable a tool in effective enforcement as regulatory sanctions - and there is certainly plenty of that pressure being directed at the Government at present from parents, with calls for it to go further and faster and ban children’s access to social media, or smartphones, entirely.

Conversely, effective enforcement also requires public understanding and acceptance of the rationale for a new regulatory requirement, and of the impact the new rules will have on individuals beyond those they are primarily designed to protect. This is as true for parents who are desperate to see long-overdue safety improvements to the online world their children are accessing as it is for adults, who may shortly find their access to legal content on some sites restricted unless they can prove their age.