The Online Safety Act at Two
The Online Safety Act is two years old, having received Royal Assent on 26 October 2023. We marked its first birthday by talking about the wider landscape of digital regulation. But now, with the two main pillars of the Act - the illegal harms safety duties and the protection of children duties - fully in force and Ofcom’s enforcement well underway, it’s worth a deep dive into where we are, where we’re heading and what more is left to do.
Where we are
Our Guide to the OSA and its Implementation provides all the latest “what, when and how” detail on this. What we’ll focus on here is whether we are where we expected to be. And by “we”, we mean the civil society organisations we work with on a daily basis, whose overall mood we reflect here - experts and advocates who campaigned tirelessly for the Act during its passage through Parliament and who have continued since then to push the Government and the regulator to go further and faster to protect users from a multitude of online harms.
Enforcement
There is much to celebrate: having a regulator that is now holding tech platforms to account, for a start, with a series of enforcement programmes already underway (for example, monitoring companies’ compliance with the children’s risk assessment duties). Some of Ofcom’s initial investigations (for example, into file-sharing services) have resulted in businesses changing their services to come into compliance. The threat of escalating regulatory intervention has led some high-harm services - such as Bitchute - to prevent UK users from accessing the service rather than comply. Voluntary geoblocking is not necessarily a definitive win for safety in all circumstances, however: Ofcom has paused its investigation into an online suicide forum linked to over 130 UK deaths, putting it instead on a watchlist, but this is unlikely to be the end of the story given the harm that particular site has caused. The first fines of the new regime have also been issued (to 4Chan) in recent weeks, though it appears the platform is unlikely to pay them any time soon.
The large mainstream porn platforms have, in the main, complied with the requirement for age verification - restricting access to material that children would not be able to access offline; indeed Ofcom, at a recent Lords Committee hearing, said it was itself surprised at how well this compliance had gone. And there are some early signs of success in relation to the impact of the children’s codes measures on what under-18s can see online. So, despite the noise from opponents of the Act when the age assurance requirements came into force at the end of July, the case for rowing back on this new normal is flimsy at best, as our expert adviser, Prof Lorna Woods, argues here.
Expansion
The Act has been strengthened further since 2023, too. Successful civil society campaigns have led to the Government bringing non-consensual intimate imagery (NCII), cyberflashing and self-harm content into the priority offences schedule, which requires platforms to take proactive steps to identify and prevent access to such content. A new offence of encouraging or assisting serious self-harm is contained in the Crime and Policing Bill, which is currently going through the Lords and which would replace the OSA’s existing communications offence. And a number of measures in the Data (Use and Access) Act responded to unfinished business from the OSA, including a new offence of creating or soliciting sexually explicit deepfakes without consent, measures on the retention of social media data related to the death of a child, and the creation of the social media data access regime (though with further consultations still required on the regulations to bring that into force, it'll be a while before researchers have that much-needed access).
The Government has also - eventually - been responsive to campaigns to extend the criminal law to take account of mounting evidence of harm in other areas. Additional offences relating to AI-generated CSAM and the taking of intimate images without consent have been added to the Crime and Policing Bill, with a new offence of depicting strangulation in porn to be added as that Bill progresses through the Lords.
Engagement
Constructive engagement with Ofcom - allied with some significant pressure and campaigning where necessary - has led to the regulator moving to fill gaps in its initial suite of codes measures in its recent Additional Safety Measures consultation (see our response here). Its Illegal Content Judgements Guidance has also been updated to take account of feedback, with further changes included in the recent ASM consultation; and guidance on super-complaints and data preservation has been turned around (relatively) quickly once the regulatory enablers were put in place.
The Government stayed true to its commitment to keep the Online Safety Act out of the UK-US Trade Deal - a notable win. DSIT also responded positively to feedback from civil society in its consultations on both the Statement of Strategic Priorities for Online Safety and the proposals on the OSA’s super-complaints regime, strengthening both final documents significantly - though Ofcom’s response on the former was little more than a weak round-up of what it is already doing, rather than a statement of intent on how it plans to meet the Government’s ambitions.
Where we’re heading
But it’s been a long two years. The Labour Government - particularly during Peter Kyle’s term of office in DSIT - has been reluctant to act proactively to improve legislation that, when in Opposition, it knew was weak and had promised to strengthen.
Gaps
An early weakness in the Act was exposed within a month of the Government coming to power, when the online response to the murders in Southport led to weeks of violent disorder on Britain’s streets. Ofcom’s response has been to propose a fairly insubstantial measure on crisis response mechanisms in the recent consultation. The Government’s response has been to refuse to accept any of the more substantial legislative recommendations from the Science, Innovation and Technology Committee inquiry into social media, misinformation and harmful algorithms - an inquiry set up specifically to investigate where the gaps in the regime had been exposed in the aftermath of Southport and how to fill them. Instead, the Government has continued to propose that we “wait and see” how OSA implementation goes. (See our response to that, in relation to the Committee’s recommendations on online advertising, here.) Similarly - as also highlighted in the SIT Committee’s report - the Government is refusing to fill gaps in relation to AI chatbots, despite concerns about the limits of the OSA’s coverage of them being flagged over a year ago.
Concerns also remain about how well digital regulators are working together on policies that cut across regulatory regimes, and about the challenges posed by emerging technologies such as AI. For example, on age assurance, Ofcom is not mandating that privacy be incorporated in its core criteria - which is ultimately what services will be judged on - despite the fact that ensuring these systems are privacy-preserving is crucial for building trust in them, and in the online safety regime at large.
Delays
It's taken a long time to get to full implementation of the main pillars of the Act, and Ofcom’s future implementation timetable has also slipped (see our implementation table, here). The judgment (and particularly the judge’s remarks) in the case brought against the Government by Wikimedia in relation to the categorisation regulations has delayed Ofcom’s publication of the register of categorised services, despite the fact that the ruling upheld the Government’s approach. The register is a necessary first step before transparency reporting requirements (on which Ofcom has already consulted) and other additional duties on those services (for example, relating to fraudulent advertising and the provision of user empowerment tools) can be implemented. On current timescales, we’re unlikely to see the register until early 2026 or the first transparency reports until the summer, and it will be mid-2027 before the additional duties come into force.
Reversals
Finally, while the volume of investigations launched by Ofcom to date (21 investigations into 69 sites) is welcome and undoubtedly signals the regulator’s intent, much of the action is based on the procedural shortcomings of the services under scrutiny. The biggest platforms have so far been untouched, despite mounting evidence that exposure to harms on many of them is getting worse - see, for example, recent research from the Molly Rose Foundation on Instagram and TikTok; Global Witness on TikTok; and CCDH on X. Many platforms have rolled back protections in the last year and global trust and safety teams are being decimated, with the latest moves by TikTok to make hundreds of UK-based content moderators redundant a particular concern. And while the UK Government’s win in relation to the UK-US Trade Deal was welcome, the question of whether DSIT and Ofcom have the appetite to continue to mitigate the global impacts of the pro-tech, anti-regulation Trump administration and to stand firm in ongoing rows about free speech - never mind to go further and extend regulatory protections - is all the more pressing.
What more is left to do
In short, lots.
Ambition
There is still much of the regime to implement: the guidance on protecting women and girls is due from Ofcom in the coming weeks and, while its consultation showed some welcome ambition, the regulator’s appetite to drive adoption will be critical given that the guidance is not something which can be enforced against. The extent of Ofcom’s ambition when it brings forward the proposals on the duties on categorised services is also tbc: those relating to fraudulent advertising are particularly key given the scale of the problem, as well as the wider, significant gaps in meaningful action to address the harms arising from online advertising more generally. And there is much still to do if the regulator is to fully realise the intent of the Act, as we have continued to argue since the publication of its very first proposals on illegal harms.
Urgency
Representations from civil society on numerous issues - on significant risks and harms from AI (particularly chatbots), on a more sophisticated, age-differentiated approach to safety by design, and on some of the major gaps in the Act - have largely fallen on deaf ears. It is down to the new Secretary of State to demonstrate a change of approach from that of her predecessor - not relying on a “wait and see” mantra with regard to Ofcom’s implementation, but looking ahead (especially given how long it takes to develop and enact legislation) and engaging with the policy problems and consulting with experts on how to resolve them well in advance of them becoming intractable.
The inertia does not begin and end in DSIT, however. The Government has not responded to the independent Pornography Review, leaving it to Baroness Bertin to table her own amendments to the Crime and Policing Bill to try to get her recommendations delivered; nor has it shown any urgency in setting out how it will deliver its target to halve violence against women and girls (VAWG) in a decade, with the much-delayed VAWG strategy still awaited. Other things in the “tbc” column which have relevance to, and could be vehicles for, improved online safety measures include a Fraud Strategy, an Elections Bill and the much-deferred AI Bill.
In the meantime, Ofcom needs to remain vigilant for any further rollback in companies’ Terms of Service and to fully test whether the biggest services’ compliance with their duties is more than just “more of the same”. That means rigorously monitoring potential loopholes in effective implementation (such as the “technically feasible” get-out), using its transparency and information-gathering powers to the fullest extent, and acting responsively to flags - from wherever they originate - about the impact and nature of emerging harms.
Advocacy
For civil society organisations - such as the brilliant ones we work with - maintaining effective advocacy and engagement campaigns across so many fronts, in the face of so many hurdles, is something of a miracle. But, as we noted at the start of this piece, there have been significant wins in the past two years which - without such pressure - are unlikely to have been achieved. However, as we have consistently flagged, this level of engagement is not without costs. The drain on already stretched civil society resources of responding to the scale and volume of Ofcom’s consultations over the past 24 months, never mind the sometimes painstaking persistence required to deliver even the smallest concessions, would be intolerable were it not for the resilience, collaboration and solidarity that define this remarkable sector and its collective determination to stand up for those most at risk of preventable harms.
We’re by no means done.