OSA amendments and Henry VIII clauses
Following the Government's recent press release announcing that it would act swiftly to implement the findings of its consultation on children’s online safety (“Growing up in an Online World”), the Government has put forward two amendments: one to the Children’s Wellbeing and Schools Bill (full text of the amendment here) and one to the Crime and Policing Bill (full text here). The amendments seek to give the Government powers to implement, by secondary legislation, changes to the Online Safety Act in respect of social media bans and AI-generated content (including chatbots) respectively.
This note does not express an opinion on the policy positions themselves; rather, it explains the scope of the two sets of powers before making some general comments about “Henry VIII clauses”.
Children’s Wellbeing and Schools Bill
The Children’s Wellbeing and Schools Bill has its Commons consideration of Lords amendments on Monday (9th March), the start of “ping pong”. The Government has proposed amendments relating to a ban by inserting a new provision, s 214A, into the Online Safety Act, together with a provision giving the Secretary of State the power to amend the digital age of consent and provisions on age verification in relation to that consent. These are intended to replace Lords amendments 37 and 38, which include Lord Nash’s amendment to bring in an under-16 ban. A discussion of the data protection provisions lies outside the scope of this note.
Proposed s 214A(1) OSA would empower the Secretary of State to make regulations – a form of statutory instrument (secondary legislation) – to introduce a social media ban or curfew. As the Government suggested in its press release, this would allow it to act (comparatively) swiftly once the policy decisions had been made, as statutory instruments do not require the same amount of Parliamentary time and debate as primary legislation. The proposal is an empowering provision: the Government can act once its consultation has finished, in a manner yet to be determined, or it could choose to do nothing. The provision requires no action.
Subsection (1) sets out the extent of any regulations made under this power: any regulations must be for the purposes specified in s 214A(1)(a) or (b). The text does not specify whether (a) and (b) operate in the alternative or cumulatively but, given that the power is permissive, an interpretation allowing either (a) or (b) seems likely. The Secretary of State may set up a regulatory system addressed to providers of “internet services”. This is a defined term in the OSA, meaning “a service that is made available by means of the internet” (s 228(1) OSA). The definition is broad – it does not cover just “user-to-user services” (social media and messaging) or search; it could cover free-standing blogs or payment services, for example. Paragraph (a) seems to relate to age gates for either a service or a functionality on a service; paragraph (b) seems to relate to curfews or time limits, an interpretation supported by subsection (3). This maps onto some of the options on which the Government is consulting. The degree of choice left to the Secretary of State is therefore significant: the amendment allows the Secretary of State to determine which services or types of functionality could be caught, whether to adopt a ban, curfew or time-limiting approach, and the age of children triggering any such measure. All this is to be “specified” in the regulations.
The following subsections give more detail about what the regulations could include. Proposed s 214A(2) OSA identifies three categories: provisions detailing how a provider would comply with the duties the regulations impose; provisions about monitoring compliance; and provisions about enforcement. The amendment merely identifies the categories themselves and contains no further detail. There is, however, more detail in relation to proposed s 214A(1)(b). There, it seems that the matters in (3)(a) (quantitative limits on a service or functionality) and (3)(b) (curfew provisions) should be included. It is not clear whether these operate in the alternative or whether both must be included regardless of whether the intention is to have time limits or a ban, though presumably the regulations could specify that there be no limits.
In contrast to the amendment in the Crime and Policing Bill (see below), it is not so easy to borrow wholesale the duties in the OSA. The proposals do, however, specifically acknowledge that new regulations could borrow structures already in place in the OSA to fill out the regulatory regime. For example, subsection (4) specifically envisages that the regulations could be “enforceable requirements” for the purposes of Ofcom’s OSA enforcement powers. The proposals do not expressly require a regulator, nor specify one, but the provision will sit in the OSA, most of which is subject to Ofcom’s enforcement powers. The provisions do envisage a role for Ofcom – it may be required to carry out research for the Secretary of State – though the subject of such research (if any) and its timing are down to the Secretary of State. The Secretary of State does not need to consult any particular body in making the regulations.
Crime and Policing Bill
The Crime and Policing Bill is currently at Lords Report stage, with a number of days of debate still scheduled. The Government has tabled an amendment giving it the power to amend the Online Safety Act in relation to AI-generated content. The amendment would introduce proposed s 216A OSA. It is much longer than the social media ban amendment. As with that provision, though, the first subsection delineates the scope of the powers granted to the Secretary of State. Again, it empowers the Secretary of State but does not require her to take action – it is permissive. The powers would allow the Secretary of State to amend any provision in the OSA (other than s 234, which defines “harm” for the purposes of the OSA). The definitions make clear that “amend” here includes repealing and applying (with or without modifications) (see s 216A(17)). This is a very broad power, though any amendment must be for the purpose specified in proposed s 216A(1) OSA.
The purpose must be linked to minimising or mitigating risks of harm from a variety of illegal content that the amendment introduces: “illegal AI-generated content”, or the use of AI services to commit or to facilitate the commission of priority offences (as set out in Schedules 5-7 OSA). The proposed provisions define “illegal AI-generated content” as “illegal content” for the purposes of the OSA that is generated by AI. There is no specific definition of AI; the definitions merely note that “AI” is short for “artificial intelligence”. The current definition of illegal content limits the regulated content to regulated user-generated content; this limitation is removed for these purposes. The proviso in s 59(12) OSA, that illegal content is not to be disregarded simply because it is created by a bot, clearly remains relevant – though the proposed regulation expressly specifies that this provision too may be amended. “Content” is very broad (see the definition in s 236 OSA) and therefore covers text, images, video, audio and any combination thereof. Presumably this would cover the use of AI to generate scripts for grooming children, for example, as well as non-consensual purported intimate images.
The Independent Reviewer of Terrorism Legislation has suggested that this power is broad enough to allow the Secretary of State to create criminal offences. While he does not expand on the point, an offence could, for example, be imposed on providers for failure to comply with their duties (this would be wider than the offences currently contained in the OSA).
The regulations envisage that the illegal content duties (or analogous duties) could be imposed on AI services. An “AI service” is defined as “an internet service that is capable (or part of which is capable) of generating AI-generated content (no matter what proportion of content on the service is AI-generated)”. Interestingly, the regulations specifically envisage that the duties in relation to priority illegal content could apply to AI services in relation to illegal AI-generated content even if that content does not relate to a priority offence. It is also envisaged that the Category 1 duties and the duties in relation to fraudulent advertising (applicable to Category 1 and Category 2A services) could apply too. Again, the borrowing of the existing regime is clearly envisaged, including the responsibilities of Ofcom, and the Government’s delegated legislation memorandum suggests that the power “is therefore appropriately constrained and predictable”, indicating that, if the Secretary of State acts, she should include the provisions identified. Of course, the drafting here is permissive: the Secretary of State does not have to do anything at all.
It is clear that the proposals here do not fully respond to the concerns raised in relation to AI, particularly chatbots. It is noticeable that the Secretary of State could not use these powers to deal with content harmful to children (save to the extent that that content was also illegal content); nor could she use the powers to deal with non-content harms (eg addiction). Societal harms which are not well described by individual experiences of harm are also excluded from the scope of these powers (as, of course, they are from the underlying OSA).
The drafting here specifically envisages that any regulations brought in might have extraterritorial effect in the same way that the OSA itself does – the proposal refers to the necessity for “UK links”, as provided for in s 4 OSA. By contrast, the provision in the Children’s Wellbeing and Schools Bill is silent on extraterritoriality.
Henry VIII Clauses
Both proposals are what are termed “Henry VIII clauses” – that is, clauses which give the executive the power to amend primary legislation by statutory instrument. This means that there is much less Parliamentary oversight, even when the affirmative resolution procedure is used (as proposed here). Put simply, Parliament cannot amend such secondary legislation – it has a single choice: vote “yes” or “no”. It is thus difficult to improve secondary legislation where a proposal is flawed but its basic principle is sound, leaving Parliamentarians in a very difficult position. There was a Parliamentary convention against Henry VIII clauses, but over the years such clauses have become more common. It is, however, accepted that they should not be used to create criminal offences. The Lords have suggested that, insofar as they can be acceptable, clear and cogent reasons should be given for their use. Moreover, the powers granted should in general be targeted rather than broad. The Government has pointed to the need to update the regime to catch issues outside the current OSA framework and the need to act at speed. It is debatable, however, whether this generic claim really satisfies constitutional expectations. Here, we have not only Henry VIII clauses but also great breadth of power. This is constitutionally problematic, and the Lords have said that “it is constitutionally objectionable for the Government to seek delegated powers simply because substantive policy decisions have not yet been taken”.