Online Safety Act Network

Response to the Digital Regulation Cooperation Forum call for input on its 2024/5 workplan


This is the response submitted by the OSA Network to the Digital Regulation Cooperation Forum (DRCF) call for input on its 2024/5 workplan; it can also be downloaded as a PDF.

We welcome the opportunity to provide input to the DRCF’s workplan for 2024/5. The Online Safety Act Network has been set up to continue the work that Carnegie UK took forward on the online harms agenda, in particular providing policy development and advice to civil society organisations and convening discussions on priority issues arising from the Ofcom consultation programme.

We remain, as previously under the auspices of Carnegie UK, particularly interested in regulatory coherence and the need for regulatory bodies with an interest in the services that will fall under this regime to work closely together. As Ofcom picks up the reins on implementing the Act, we believe it remains incumbent on the DRCF to stand ready to identify where better coordination or more robust information-sharing powers between the constituent regulators would assist Ofcom in its role as lead regulator.

What are the most important areas of technology or digital regulation where you are observing intersections between the responsibilities and work of the DRCF regulators?

We would suggest that there may be more to do between regulators with regard to the challenges posed by the decentralised internet. We note that the DRCF has done some work on web3, but the landscape continues to shift: Threads, for example, has launched and is interoperable with Mastodon and other federated services. There are specific issues arising here relating to the remits of individual regulators, but there may also be value in considering the general problem collectively, either as part of the enabling innovation or horizon-scanning workstreams.

There are also new issues arising in relation to the working arrangements between digital regulators, particularly Ofcom, and the police. Just recently we have seen the National Crime Agency's (NCA) intervention in response to Meta's decision to introduce end-to-end encryption on its services; there are questions about the impact on the detection of crimes as well as the prosecution of offences. These concerns affect regulators: the Online Safety Act places obligations on providers to report child sexual abuse content to the NCA, and section 68 amends the Crime and Courts Act 2013 to give the NCA information-sharing powers with Ofcom. The DRCF could look in the round at how regulatory enforcement and law enforcement interact.

There are also significant questions that could be explored regarding data protection and privacy in relation to identity and age verification, as well as in relation to the end-to-end encryption debate. While these issues clearly involve the ICO and Ofcom, the question around encryption probably involves the FCA too. While there are some independent identity providers, the market does not seem fully established; we wonder if there is a risk that dominant social media, search and other digital service providers foreclose this market by leveraging their sign-on services. In this area there is also a question about the extent to which powers overlap: for example, whether Ofcom's obligations under the OSA mean it will need to enforce privacy rules itself, or whether it can hand that off to the ICO.

We have also raised in discussions with the DRCF the issue of the safety- and security-by-design sector and the outsourcing of these functions to third parties: this cuts across competition and market concerns (CMA), privacy and security concerns (ICO) and effectiveness concerns (Ofcom/FCA).

What specific joint action by DRCF member regulators would you like to see? What shared processes, guidance or other outputs would you find useful?

We recognise that there is work already underway on AI and algorithms, but we feel that developments in the past 12 months justify more specific work to address the cross-sectoral and cross-regulatory challenges of generative AI (GenAI). We see that the harms arising have relevance to:

  • the CMA (competition issues arising from the potential market dominance of a small number of large language models, which affects choice, consumer competition and access to a diversity of information, as well as media mergers);
  • the ICO (data protection concerns in relation to the access and use of personal data to train AI; for example, the recent story about Dropbox allowing access to OpenAI); and
  • Ofcom, as it takes forward the OSA regime, not just in relation to emerging harms arising from the use of GenAI and immersive tech but also to some of the tools used for content moderation and threat detection. There is also an issue relating to the impact of consolidation in the GenAI market on freedom of expression and diversity.

Finally, with regard to positive impacts, we would like to acknowledge that there is much to applaud in the work of the DRCF to date, particularly its approach to cooperation and coordination and its forward-looking and transparent agenda. We look forward to continuing to work with you in the year ahead.
