Ofcom and ICO call for tech companies to strengthen online protections for children
Last week, both Ofcom and the ICO issued letters to major tech companies demanding greater clarity on the actions they are taking to protect children online. Ofcom wrote to Facebook, Instagram, Roblox, Snapchat, TikTok, and YouTube, giving them until the end of April to outline the steps they are taking to strengthen age checks and prevent online grooming. At the same time, the ICO sent letters to TikTok, Snapchat, Facebook, Instagram, YouTube, and X, asking them to explain how their age check policies are keeping children safe.
It is positive to see both regulators increase pressure on large social media platforms in this way. Ofcom's demand that platforms report on how they are enforcing their existing minimum age restrictions is long overdue, as is the move to a more public accountability process on this and other key safety measures. Campaigners have repeatedly been rebuffed when seeking transparency, so greater openness in Ofcom's oversight of the largest platforms is welcome. However, it is unclear what the consequences or penalties will be for companies that fail to comply. And while compliance with product testing and risk assessment duties is comparatively easy to measure, the creation of safer feeds for children is much harder to quantify without a clear metric of success.
Furthermore, there appears to be an enforcement gap where platforms have complied with existing duties, but only minimally. For example, at the end of last year Ofcom published key insights from the risk assessments submitted by platforms in the first year of the Act being in force, confirming that investigations had been launched into platforms that failed to submit a risk assessment at all. However, the report cited numerous platforms that had failed to include key components of their risk assessments: many assessed illegal and harmful content inconsistently and accounted poorly for risks arising from their service design choices, in particular the potential impact of a larger user base, of allowing child users, and of encrypted spaces. Despite these serious weaknesses, it remains unclear whether Ofcom has taken enforcement action, or whether companies will be held to account if their responses do not tangibly improve platform safety.
After a prolonged absence from this space, it is good to see the ICO taking a more proactive role on age assurance and data protection. Privacy violations under the guise of age gating are not acceptable, and we look forward to a detailed assessment of whether current industry standards at the major platforms deliver highly effective age assurance.
We have consistently called for the Online Safety Act to be strengthened to tackle the scale of harm Ofcom itself identified in its risk register. Removing the Act's 'safe harbour' provisions and its 'technically feasible' clause would allow Ofcom to take a far more ambitious approach, and would ensure that the codes of practice are the floor, not the ceiling. A clearly defined approach to safety by design, enshrined in legislation, would ensure tech platforms are held accountable for unsafe products and functionalities released onto the market.
In addition, Melanie Dawes' comments supporting further legislation are encouraging, though in many areas - including minimum age enforcement - Ofcom could already have acted more robustly under its existing powers. Given parental and political demands for urgent action, it is positive to see that Ofcom is now willing to be bolder in its approach to both current and emerging harms.
We look forward to seeing Ofcom's report in May, which should provide real insight into how effective the regime has been to date.