Online Safety Act Network

Categorisation of services in the Online Safety Act

A PDF version of this piece is available to download at the bottom of the page.

Issue

Ofcom has recently published its advice to the Secretary of State on thresholds for categorisation, along with a call for evidence to inform the implementation of the related duties. The categorisation of services under the Online Safety Act determines which of the regulated user-to-user or search services will be subject to additional duties. These include user empowerment duties and additional responsibilities relating to terms of service - duties which are the only remaining routes to providing additional safety protections for adults since the Government’s decision to remove the wider adult safety provisions from the Online Safety Bill in autumn 2022.

Following that decision, many concerns were raised in Parliament that - if the threshold conditions for Category 1 were determined solely by size - many small, high-harm services would be excluded. A late amendment - the result of a concession by the Government in the House of Lords - ensured that Ofcom could consider size or functionality when setting the Category 1 thresholds, giving the regulator greater flexibility while also taking account of the other provisions in the Act. Ofcom’s advice to the Secretary of State does not, however, use the flexibility that was granted to it, nor explain in any detail why it has chosen not to do so - despite the intent of Parliament.

This blog post:

  • Puts the development of the provisions in the Act in context, from the very start of the legislative process to the late concession by the Government before the end of its Parliamentary passage;
  • Sets out what the Online Safety Act (OSA) requires in terms of Ofcom’s role in relation to the categorisation of services, and what duties fall on services as a result;
  • Considers Ofcom’s advice to the Secretary of State in the context of what’s in the OSA, Parliament’s intent and Government assurances; and
  • Outlines what might happen next.


The history of this issue in the Online Safety Act

The OSA imposes duties on Part 3 services – that is, regulated user-to-user services and search services – in addition to the general illegal content duties and children’s safety duties, depending on whether they fall into a set of further categories of service. The categories for user-to-user services are Category 1 and Category 2b. For search, there is just Category 2a. In particular, Category 1 services have additional obligations with regard to user empowerment tools, content of democratic importance and journalistic content, enforcement of terms of service, and more stringent obligations with regard to freedom of expression and privacy. A full list of the obligations in relation to each category can be found here. Ofcom must maintain a register of the services falling in each of the categories, and that register must be made public (see s 94 onwards).

While the obligations for the different categories are clear on the face of the Act, the precise thresholds for the categories are not. These are to be set out by the Secretary of State in secondary legislation (regulations) following advice from Ofcom. The process is set out in paragraph 2 of Schedule 11. It envisages Ofcom carrying out research on specified matters relating to size and service functionality before providing advice to the Secretary of State based on that research (para 2(5)). The advice is to be published, and the Secretary of State must then make the regulations “as soon as reasonably practicable” after Ofcom has provided its advice.

The Secretary of State may do something different from what Ofcom suggests, but if so must publish a statement explaining why (see Sch 11, para 2(8) and (9)). Schedule 11 specifies issues that must be included in the regulations (Sch 11, para 1(1)-(3)), as well as minimum types of requirement that must be satisfied by a service to reach a categorisation threshold (Sch 11, para 1(4)). Certain matters that must be taken into account are also identified (Sch 11, para 1(5)-(7)). The first regulations made under para 1(1) of Sch 11 may not be made unless a draft has been laid before and approved by each House (s 225(8)); any other regulation under paragraph 1 of Schedule 11 is subject to annulment (s 225(9) and (10)(g)).

Revisions during the progress of the Bill through Parliament meant Ofcom was additionally required to maintain a list of emerging Category 1 services (cl 88 OSB – now s 97 OSA) so that it could keep an eye on rapidly scaling companies and add them to Category 1 without delay once the thresholds have been met (EN, para 502). However, concern remained that the position of small high-harm platforms was unaddressed, a point that was picked up in the Commons and, more successfully, in the Lords.

Parliamentary debate

The impact of the thresholds for categorisation and the potential for these to exempt small, high-harm user-to-user services from the Category 1 duties emerged at various stages in the progress of the Online Safety Bill. Category 1 is particularly important given that service providers in this category are the only providers subject to any form of duties, beyond the illegal content duties, in relation to adults. These are the duties to provide user empowerment tools, including enabling the blocking of unverified accounts, and to enforce terms of service.

Labour’s Alex Davies-Jones MP made the case repeatedly in the Public Bill Committee debates – both in the first scrutiny period and when the Bill was recommitted – and further argued at the first stage of Commons Report (12 July 2022) that “categorisation of services based on size rather than risk of harm will mean that the Bill will fail to address some of the most extreme harms on the internet”. Throughout the passage of the Bill, the small sites that were of most concern to Parliamentarians and campaigners included suicide forums (recent news reporting has linked one such forum to 50 UK deaths) and sites which propagate targeted hate, misogyny and abuse (such as 4Chan, implicated in the Buffalo shooting in the US).

When the revised Bill – after a hiatus during which the Government removed the “harms to adults” provisions and introduced the “Triple Shield” measures in its place – was reconsidered at the second stage of Commons Report, the former DCMS Secretary of State Jeremy Wright (who was responsible for the Online Harms White Paper) put forward amendments aimed at ensuring that size was not the dominant criterion for bringing platforms into scope of additional duties. These were not accepted.

However, it was in the House of Lords debates on the Bill that cross-party momentum built up on this issue, largely due to the cogent arguments and effective campaigning of Conservative Peer Baroness Morgan, another former DCMS Secretary of State who had had responsibility for the Bill at an earlier point in its legislative progress. The first substantive debate on the issue, related to an amendment that would have introduced a requirement for Ofcom to consider the “significant risk of harm” in its advice on categorisation, occurred in Committee on 25 May – the transcript is here – with the debate highlighting the range of cross-party support to ensure consideration of small suicide sites, extremist sites, eating disorder sites, etc.

Baroness Morgan noted that, at Second Reading in the Lords, the Government had argued that there was insufficient evidence to show that the Category 1 designation was required, “despite Ofcom’s road map for online safety making it clear that it had already identified a number of small platforms that are clearly giving cause for concern” (Col 1098). She noted that the amendment would:

“not compel Ofcom to add burdens to all small platforms but provides a specific recourse for the Secretary of State to consider the risks of harm as part of the process of categorisation. A small number of well-known, small high-harm sites would be required to add what will ultimately be minimal friction and other measures proportionate to their size”.

Lord Parkinson set out a number of reasons why the Government would not accept the proposals and, although Morgan withdrew that particular amendment, she signalled in her closing speech that the issue was not going to be dropped:

“Ultimately, the Government of the day—whoever it might be—will want the powers to be able to say that a small platform is deeply harmful in terms of its content and reach. When the Bill has been passed, there will be pressure at some point in the future on a platform that is broadcasting or distributing or amplifying content that is deeply harmful. Although I will withdraw the amendment today, my noble friend’s offer of further conversations, and more detail on categorisation and of any review of the platforms as categorised as category 1, 2 and beyond, would be very helpful in due course.”

Baroness Morgan subsequently tabled at Report stage amendment 245 (the simple amendment that changed the word “and” to “or” – so that Ofcom would be able to consider either the size or the functionality of services when setting thresholds, rather than being limited to considering both), co-signed by both the Labour and Liberal Democrat front benchers and the cross-bencher, Baroness Kidron. In her contribution to the debate on 19 July 2023, Morgan stressed the importance of this amendment in relation to small, high-harm sites that propagated suicide or self-harm material or that spread incel, hate or extremist propaganda. She provided real-world examples of the influence that hate sites had had on fatal shootings, and of the influence that a suicide forum had had on the suicide of Zoe Lyall. Morgan then reflected on an exchange of correspondence she had had with the DSIT Secretary of State, Michelle Donelan, who had indicated that:

“she wanted to catch some of the platforms we are talking about with outsized influence. In my reply, I said that those sites on which people are encouraged to take their own lives or become radicalised and therefore take the harms they are seeing online into the real world undoubtedly exercise influence and should be tackled”.

Morgan also reflected on the fact that “the Government have … said that they want to avoid burdens on small but low-harm platforms. I agree with that, but with an “or” it would be perfectly possible for Ofcom to decide by looking at size or functionality and to exclude those smaller platforms that do not present the harm we all care about.” She ended her contribution by indicating that she would put the matter to a vote – which, after a number of further speeches in favour of her amendment, she won.

By the time the Bill reached its Third Reading in the Lords, the Government had already indicated that it would accept Morgan’s amendment. In the Commons debate that followed, the then DSIT Minister Paul Scully said:

“The Government are grateful to Baroness Morgan of Cotes and my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), who like many in the House have steadfastly campaigned on the issue of small but risky platforms. We have accepted an amendment to the Bill that changes the rules for establishing the conditions that determine which services will be designated as category 1 or category 2B services and thus have additional duties. In making the regulations used to determine which services are category 1 or category 2B, the Secretary of State will now have the discretion to decide whether to set a threshold based on the number of users or the functionalities offered, or both factors. Previously, the Secretary of State was required to set the threshold based on a combination of both factors. It is still the expectation that only the most high risk user-to-user services will be designated as category 1 services. However, the change will ensure that the framework is as flexible as possible in responding to the risk landscape.” (Commons Consideration of Lords Amendments 12 September 2023)

What the OSA requires

Schedule 11 remains the relevant schedule. The provisions that specify the conditions that should be covered in the regulations remain unchanged from the revised Bill (see Sch 11, para 1). As regards Category 1 services, the regulations must specify conditions “relating to each of the following -

(a) number of users,

(b) functionalities of that part of the service, and

(c) any other characteristic of that part of the service or factors relating to that part of the service that the Secretary of State considers relevant” (Sch 11, para 1(1)).

“Functionality” is a defined term (see s 233). For user-to-user services, it “includes any feature that enables interactions of any description between users of the service by means of the service …” including creating accounts, using emojis, following users and using hyperlinks. A “characteristic” is defined as including the user base, business model, governance and other systems and processes (Sch 11, para 7).

It seems that while there must be provision in the regulations concerning size and functionalities, provision regarding characteristics is not required by the Act, though it is permitted. Further, though the Secretary of State must consider the impact of size and of functionality on the virality of content, there is no express requirement for a provision on this linkage. The provisions on the issues that Ofcom must research reflect this (Sch 11, para 2). Ofcom is required to consider the spread of content, number of users and functionalities. Additionally, it may choose, where it deems it relevant, to consider other characteristics.

While the considerations to be included in the regulations in Sch 11 para 1(1)-(3) have not changed, the requirement in the Bill that a user-to-user service satisfy both a size and a functionality condition to be categorised as either Cat 1 or Cat 2B was, as noted, removed. Instead, para 1(4)(a) states that “at least one specified condition about the number of users or functionality must be met” (emphasis added).

The Explanatory Notes state that the regulations could still “specify that a service must meet a combination of conditions, for instance conditions relating to both the number of users and the functionalities on offer” (EN, para 483). Limiting the scope of Category 1 in this way might not reflect the debates and concerns expressed in Hansard. The concern identified in the Explanatory Notes is that “the factors which are set out on the face of the Act as being the most important in determining whether it is proportionate to place additional duties on the provider of a user-to-user service or a search service will be reflected in each designation decision”. Ofcom may still, of course, include other characteristics or factors in its advice to the Secretary of State, based on Ofcom’s research (Sch 11, para 2(6); EN, para 486).
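To make the logical effect of that drafting concrete, here is a minimal sketch of the two tests in code. It is purely illustrative: the threshold figure, the functionality label and the field names are invented for the example, and are not drawn from the Act or from Ofcom’s advice.

```python
# Illustrative sketch only: hypothetical threshold and functionality values.
# Under the Bill as introduced, a service had to meet BOTH a user-number
# condition AND a functionality condition to be designated Category 1.
# Under the Act as amended (Sch 11, para 1(4)(a)), at least one specified
# condition about user numbers OR functionality must be met - although the
# regulations may still specify a combination (EN, para 483).
from dataclasses import dataclass, field


@dataclass
class Service:
    uk_users: int                                  # hypothetical user count
    functionalities: set[str] = field(default_factory=set)


def meets_user_condition(s: Service, threshold: int) -> bool:
    return s.uk_users > threshold


def meets_functionality_condition(s: Service, required: set[str]) -> bool:
    return required <= s.functionalities           # subset test


def category_1_under_bill(s: Service) -> bool:
    # Original Bill: size AND functionality conditions both required.
    return (meets_user_condition(s, 10_000_000)
            and meets_functionality_condition(s, {"content_recommender"}))


def category_1_under_act(s: Service) -> bool:
    # Act as amended: either limb alone may be specified in the regulations.
    return (meets_user_condition(s, 10_000_000)
            or meets_functionality_condition(s, {"content_recommender"}))


# A small, high-harm service with a risky functionality but few users:
small_risky = Service(uk_users=50_000, functionalities={"content_recommender"})
assert not category_1_under_bill(small_risky)      # excluded under "and"
assert category_1_under_act(small_risky)           # capturable under "or"
```

The point of the “or” is visible in the final two lines: under the original conjunctive test, a small service could never be designated however risky its functionalities; under the amended test, the regulations could be drawn so as to capture it.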

One point to note is that the Schedule does not identify the purpose of the research, or how Ofcom’s findings are to be translated into advice on where the thresholds should sit. Schedule 11 should not be understood in isolation, however, but in the context of the Act. Specifically, s 1 identifies the purpose of the Act as understanding the risk of harm to UK users and acting to mitigate it. The significance of size, functionality and characteristics therefore lies in their respective impact on the risk of harm. The Category 1 obligations are the regime’s response to services that, as a result of these features, present an increased risk of contributing to harm.

This seems to leave us with the position that large services which do not have inherently “risky” functionalities can still be risky by virtue of their size, while small services can be risky by virtue of their functionalities alone. It is unclear how services which are risky by virtue of the nature of their content alone (which would seem to be an issue of characteristics (eg terms of service and moderation policy) rather than functionality) will be dealt with. This seems to be a question of when Ofcom and/or the Secretary of State deem it relevant.

Ofcom’s advice to the Secretary of State

Ofcom submitted its advice – and the underpinning research that informed it – to the Secretary of State on 29 February 2024 and published it on 25 March. In summary, its advice is as follows:

Category 1

Condition 1:

  • Use a content recommender system; and
  • Have more than 34m UK users on the U2U part of the service

Condition 2:

  • Allow users to forward or reshare UGC; and
  • Use a content recommender system; and
  • Have more than 7m UK users on the U2U part of the service

Ofcom estimates that there are 9 services captured by Condition 1 and 12-16 likely to be captured by Condition 2. There is a single brief reference in the annex noting that the 7m+ monthly user threshold corresponds to that used in the EU’s Digital Services Act (DSA) (A6.15).


Category 2a (search)

  • Not a vertical search service; and
  • Have more than 7m UK users

Ofcom estimates that there are just 2 search services that currently sit (a long way) above this threshold, but considers that setting it at this level is justified in order to catch emerging services.

Category 2b (children)

  • Allow users to send direct messages; and
  • Have more than 3m UK users on the U2U part of the service

Ofcom estimates that there are “approximately 25-40 services” that may meet this threshold. There is an important justification for this threshold set out at para 5.28: “many of the most-used services which have the functionality of direct messaging also met the category 1 thresholds, however a small number did not. As services will only be designated as one category of service, we are of the view that the user number threshold for category 2B must be lower than the user number threshold for category 1. This is to ensure that category 2B captures an appropriate number of different services to category 1. However, we found there are several high-reach services that have the functionality of direct messaging that do not meet the other proposed category 1 thresholds, so we consider it appropriate to not place an upper limit on the user number threshold (and capture those services where relevant).”
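Taken together, Ofcom’s proposed conditions amount to a simple decision rule, which the sketch below restates in code. The numerical thresholds and functionalities come from the advice itself; the field and function names are our own shorthand, and treating Category 1 as taking precedence over Category 2B is a simplification of the advice’s statement that services will only be designated as one category of service.

```python
# A paraphrase of Ofcom's proposed threshold conditions as a decision rule.
# Thresholds and functionalities are from the advice; names are illustrative.
from dataclasses import dataclass


@dataclass
class U2UService:
    uk_users: int                         # UK users on the U2U part
    has_recommender: bool = False         # content recommender system
    allows_resharing: bool = False        # forwarding/resharing of UGC
    allows_direct_messages: bool = False  # direct messaging


def category_1(s: U2UService) -> bool:
    # Condition 1: recommender system and more than 34m UK users.
    condition_1 = s.has_recommender and s.uk_users > 34_000_000
    # Condition 2: resharing of UGC, a recommender system and more
    # than 7m UK users.
    condition_2 = (s.allows_resharing and s.has_recommender
                   and s.uk_users > 7_000_000)
    return condition_1 or condition_2


def category_2b(s: U2UService) -> bool:
    # Direct messaging and more than 3m UK users, with no upper user limit;
    # a service designated Category 1 is not also designated Category 2B.
    return (s.allows_direct_messages and s.uk_users > 3_000_000
            and not category_1(s))


@dataclass
class SearchService:
    uk_users: int
    is_vertical: bool = False


def category_2a(s: SearchService) -> bool:
    # Not a vertical search service, and more than 7m UK users.
    return (not s.is_vertical) and s.uk_users > 7_000_000


# Example: a high-reach messaging service with no recommender system
# meets the Category 2B threshold but not Category 1.
messenger = U2UService(uk_users=10_000_000, allows_direct_messages=True)
assert not category_1(messenger)
assert category_2b(messenger)
```

The example at the end illustrates the rationale quoted above: there are high-reach services with direct messaging that do not meet the other proposed Category 1 thresholds, which is why the advice places no upper limit on the Category 2B user number threshold.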

At paragraph 3.30, there is a short and somewhat oblique reference to the change that was made to Schedule 11 as a result of the Morgan concession:

“We have discounted a recommendation that allowed for the categorisation of services by reference exclusively to functionalities and characteristics since the research indicates that user reach has an important role to play too. For instance, there are services where the functionalities and characteristics discussed above are core to the service, but whose smaller number base means that the dissemination of user-generated content on the service is comparatively less pronounced in its speed and breadth relative to other services with a greater number of users and the same functionalities.”

Ofcom’s advice was published alongside a letter from Donelan to Ofcom’s Chief Executive, Melanie Dawes, seeking further information. The letter referred to Donelan’s obligations to “consider for Category 1, the likely impact of the number of users of a service, and its functionalities, on how easily, quickly and widely regulated user-generated content is disseminated by means of a service” (our emphasis). While this reflects the requirements in Sch 11, para 1(5), it does not refer to the fact that the regulations may rely on “the number of users of a service, or its functionalities”, as permitted by the Act. Donelan went on to ask for more information on how Ofcom had used its “regulatory judgement” in setting the user number thresholds, and for additional evidence relating to the six functionalities it had considered, along with the recommender algorithms. The letter did not ask whether there were any sufficiently risky types of service that could be categorised without reference to a numerical threshold.

Melanie Dawes responded on 16 April. On the user number threshold query, she restated the methodological approach and evidence that Ofcom had used to inform its thresholds and said that in each case Ofcom needed to make its own assessment of what comprised “targeted and proportionate regulatory action”. It is implicit in the following paragraph, from the end of page 2, that Ofcom made no judgements based on “size or functionality”:

“preliminary indicative analysis suggests that approximately 12-16 services may meet one or both of the user number thresholds we proposed, when factoring in the impact of the functionality requirements recommended in our advice for Category 1. We explained that in our view, this estimated number of services “indicates that our recommended user number thresholds are likely to strike the right balance in terms of targeting those services where content is likely to be disseminated easily, quickly and widely, while ensuring that the duties apply to a sufficiently targeted number of services”.

Furthermore, on page 3, the decisions about functionality are framed in terms of “speed, ease and breadth of dissemination”:

“Of the 69 functionalities reviewed as part of that methodology, most were discounted as irrelevant to the research question at hand due to a lack of evidence of their impact on the speed, ease and breadth of content dissemination. We identified six functionalities that were relevant to the criteria, based on evidence from the literature review and/or the internal logic analysis. We weighed the available evidence about each of these six functionalities to develop our proposals and took a decision based on functionalities for which evidence of links to content dissemination was strongest.”

The decision to limit the functionalities included in the assessment on the basis of “content dissemination” certainly reflects some central aspects of Schedule 11. Moreover, the amendment allowing regulations to consider size or functionality did not require the drawing up of rules containing thresholds based on just one of these requirements. Nonetheless, the Schedule does not limit Ofcom’s considerations to cases where there is evidence of functionalities or characteristics having an impact on dissemination, and the failure to discuss “small but risky” services at all seems to run counter to the Parliamentary intent of revising the Act to ensure that functionality as a means to cause harm can also be considered when the Secretary of State makes their designation.

In her explanation of the evidence that was considered in determining this advice, Dawes acknowledges the limitations of the evidence base – referring to the “minimal” and “low” volume of academic studies and research. In our submission to Ofcom on the proposals set out in its illegal harms consultation, we describe at length the problems Ofcom has created for itself by setting its evidential threshold so high (see section 3, page 30 onwards) – and the significant impact this will have on the effectiveness of the regime as implemented.

In this context, it is notable that none of the evidence submitted by civil society organisations to Ofcom in its call for evidence (published here) is referenced in its research. This includes submissions from the Antisemitism Policy Trust and Carnegie UK, both of whom were instrumental in supporting Baroness Morgan to win her concession; from 5 Rights (which is headed up by Baroness Kidron); and from Refuge and Glitch, both of which reference Morgan’s amendment in their responses. Industry submissions are, however, referenced as having been considered: at paragraphs 2.13, 3.13, 5.3 (it is “currently challenging for services to accurately identify and measure the number of child users”), A6.6, A6.10 (“we have sought (where possible) to ensure our approach accommodates existing industry practice”) and A6.15. Here we provide further analysis of the evidence submitted to this call and the extent to which it is reflected in the subsequent advice.

What happens next?

Ofcom sets out the next steps on page 6: the Secretary of State will set out the threshold conditions in secondary legislation (a Statutory Instrument, or SI) which, once passed, will allow Ofcom to then gather information from regulated services to publish a register of categorised services that meet each threshold condition.

“Assuming secondary legislation on categorisation is finalised by summer 2024, we expect to publish the register of categorised services by the end of 2024. We are aiming to publish draft proposals regarding the additional duties on these services in early 2025. Over time, we will revisit and update our register of categorised services as appropriate.”

What this does not set out is the intervention points between the submission of Ofcom’s advice and the laying of the secondary legislation. It is not without precedent for a Secretary of State to reject advice that has been required by legislation. The courts have generally held that public authorities to whom independent regulatory advice is addressed should follow it unless there are “cogent reasons” for rejecting it (R. (on the app of Equitable Members Action Group) v Her Majesty’s Treasury [2009] EWHC 2495 (Admin)). The question of whether there are cogent reasons focuses on the rejection, not the quality of the advice in the first place (see R. (on the app of Bradley) v SoS for Work & Pensions [2008] EWCA Civ 36, which concerned the Secretary of State’s decision to reject the Parliamentary Ombudsman’s finding that the Government was guilty of maladministration). Merely having a different perspective from the regulator would not be sufficient.

It is therefore possible that in this case the Secretary of State could disregard Ofcom’s advice (given that the concerns underlying the amendment allowing Ofcom to consider size or functionality when providing its advice were not addressed). Doing so would require the articulation of legally sound reasons that outweigh the regulator’s advice, having regard to the general principles of transparency, accountability and the rule of law. In any event, the Secretary of State must stay within the authority conferred by the Act, and any decision to disregard the advice of the regulator could be subject to judicial review.

There are also further points at which Parliament can intervene once the Secretary of State has published draft regulations: the Cabinet Office Guide to Making Legislation sets out that the Government has committed to 12-week consultation periods for draft statutory instruments, and the proposals will need to go through Parliamentary scrutiny, most likely involving one or more of the Joint Committee on Statutory Instruments, the Lords Secondary Legislation Scrutiny Committee and the Delegated Legislation Committee.

Pressure to secure parliamentary scrutiny at key points during the implementation phase of the Online Safety Act was also a critical part of the Lords’ debates during the passage of the Bill. The Lords Delegated Powers and Regulatory Reform Committee included a recommendation in its report on the OSB that the first regulations for the Category 1 thresholds should be subject to the affirmative procedure – that is, they will need to be voted on. In accepting the recommendation, Lord Parkinson said:

“The change will ensure that there are adequate levels of parliamentary scrutiny of the first regulations specifying the category 1 threshold conditions. This is appropriate given that the categorisation of category 1 services will lead to the most substantial duties on the largest and most influential services. As noble Lords are aware, these include the duties on user empowerment, user identity verification, journalistic and news publisher content, content of democratic importance, and fraudulent advertising.” (Hansard 19 July 2023 Col 2339)

The Opposition in the Lords had also long been arguing for more formalised oversight and scrutiny arrangements to ensure that Parliament could scrutinise the Government’s plans and consultations in the implementation phase. In the same debate, Parkinson made a number of commitments in this regard:

“First, where the Bill places a consultation requirement on the Government, we will ensure that the relevant committees have every chance to play a part in that consultation by informing them that the process is open. Secondly, while we do not wish to see the implementation process delayed, we will, where possible, share draft statutory instruments directly with the relevant committees ahead of the formal laying process. These timelines will be on a case-by-case basis, considering what is appropriate and reasonably practical. It will be for the committees to decide how they wish to engage with the information that we provide, but it will not create an additional approval process to avoid delaying implementation.” (our emphasis; Hansard 19 July 2023 Col 2352)

This might suggest that, in addition to the procedural Committees which have a standing role in relation to secondary legislation, Select Committees in both Houses (for instance, the Science and Technology Committee in the Commons and the Communications and Digital Committee in the Lords) may expect to play a role in the scrutiny prior to the laying of the regulations.

At the end of the Bill’s passage through the Lords, Parkinson was also at pains to stress that the Secretary of State would be accountable for the decisions made on categorisation thresholds, saying in the closing speeches on 6 September: “I stress that it is the Secretary of State who will set the categorisation thresholds. She is, of course, a Member of Parliament, and accountable to it. Ofcom will designate services based on those thresholds, so the decision-making can be scrutinised in Parliament”.

Once the Parliamentary scrutiny has finished and the affirmative resolution vote has taken place, the regulations will come into force 40 days later.

