The Crime and Policing Act 2026: An Online Safety Perspective

The Crime and Policing Act, which received Royal Assent at the end of the last Parliamentary session, is a wide-ranging piece of legislation containing a smorgasbord of offences and supporting provisions, not all of which are relevant to online safety. Commentary on those other issues can be found, for example, here, here, here, here and here. As is usual, most provisions do not come into force on Royal Assent but require commencement orders. Additionally, some aspects of the Act require further detail to be provided by secondary legislation, and some of the provisions in the Act constitute Henry VIII clauses. It may therefore be a while before parts of the Act take effect.

Within this wide field, this blog focusses on the aspects of the Act that relate to online safety. The Children’s Wellbeing and Schools Act, which also received Royal Assent at the end of the last Parliamentary session, contains provisions relevant to online safety: it gives the Secretary of State powers to introduce a social media ban, or restrictions on specified functionalities, through secondary legislation.

1 Content takedown

There are two sets of provisions dealing with the removal or takedown of specified types of content: that relating to illegal weapons content and that relating to non-consensual intimate images (NCII).

a) Weapons Content

The Act introduces a number of measures concerning weapons as part of the Government’s Safer Streets mission, implementing changes proposed in the Independent end-to-end review of online knife sales. In relation to the online environment, it mandates stricter age verification checks for the online sale and delivery of knives and crossbows. It also introduces a mechanism requiring the removal of certain weapons-related content, backed up by personal liability measures for senior managers of online platforms who fail to take action on such content, in line with the outcome of a consultation on executive liability. This was also part of the Labour manifesto.

The provisions do not extend the scope of the criminal law as regards weapons content – such content already falls within existing criminal offences (see s 20) and also constitutes priority offences under the Online Safety Act (OSA) (Schedule 7). Removal is defined in s 27 as meaning, for user-to-user services,

“any action that results in the content being removed from the service, or being permanently hidden, so users of the service in any part of the United Kingdom in which the content is unlawful weapons content cannot encounter it”

and for search services as

“taking measures designed to secure, so far as possible, that the content is no longer included in the search content of the service that is available in any part of the United Kingdom in which the content is unlawful weapons content”.

The takedown process borrows the definitions of “user-to-user service” and “search service” from the OSA (see s 13).

The Act provides for a coordinating police officer, designated by the Secretary of State (s 14), who may, by notice, require relevant services to have a designated content manager (s 15). Failure to comply with this requirement may lead to the imposition of a civil penalty of an amount not exceeding £60,000 (s 19). The coordinating officer may notify the relevant service, by means of a “content removal notice”, of “unlawful weapons content”, which the service must then remove within 48 hours (though there are rights to request a review of the issuing of a notice) (s 21). A civil penalty notice may be imposed for failure to comply; the designated content manager may also receive a penalty notice (s 24 and Schedule 4). The Act envisages that the Secretary of State will issue guidance as to the exercise of these powers (s 25), which must be laid before Parliament using the negative procedure.

b) NCII

While not included in the original Bill, the Crime and Policing Act now contains provisions (s 100) amending the Online Safety Act’s illegal content duties for both user-to-user and search services. They require a regulated service to have a system allowing the takedown of relevant intimate images within 48 hours of notification (borrowing the time frame from the US Take It Down Act) and mandate specific reporting processes for such content. The provision also contains powers for the Secretary of State to specify additional requirements for an intimate image content report, or to make further provision about how the existing requirements are to be met. Relevant images are those amounting to an offence under s 66B(1), (2) or (3) of the Sexual Offences Act 2003 (non-consensual intimate images).

As this amends the illegal content duties already in place by virtue of the OSA, the cross-cutting duties in that Act will apply here too. Thus the duty will be subject to proportionality safeguards and to the cross-cutting duty on providers to consider, in relation to their safety measures, the importance of the right to freedom of expression (sections 22 and 33 OSA); the complaints and appeals procedures will also apply. By contrast with the weapons content provisions, there is no definition of “takedown” – though it should be noted that the phrase was already used in the OSA.

It is unclear whether the obligations here, and in the OSA more generally, regarding takedown mean the same thing as “removal” for illegal weapons content purposes. Although both sets of provisions are focussed on ensuring users do not encounter content, different actors trigger the processes in each case, and there are different review/oversight mechanisms.

It should be noted that these amendments to the OSA, which target NCII, do not cover child sexual abuse material (CSAM). As yet, there is no formal takedown mechanism for such material.

While not part of the takedown regime, and also following on from the work of Baroness Owen, the Act introduces measures to control the spread of NCII – these, however, require further secondary legislation to implement. There are three aspects, each requiring action by the Secretary of State:

  1. The Act places a duty on the Secretary of State to designate a trusted flagger for the purpose of providing information to ISPs and providers of internet access services (“IASPs”) about verified NCII content which, if shared, is likely to amount to an offence under section 66B of the Sexual Offences Act 2003.
  2. It confers a power on the Secretary of State to amend OSA in connection with the creation of duties on both user-to-user and search services to provide information to the designated trusted flagger.
  3. The Secretary of State is given a power to establish a register of intimate image content. As part of this power, the Secretary of State may require Ofcom to have regard to the register when exercising certain of its functions under OSA.

These changes were necessary because “internet service providers (“ISPs”) consider that they are restricted from blocking this content both under net neutrality legislation and because they are not competent to make decisions about whether content has been shared illegally.”

The powers are intended to operate in two phases. The first relates to the trusted flagger: the Government intends to designate the Revenge Porn Helpline. The register is a more complex matter, and the Government intends to carry out a scoping exercise before those powers are exercised. Note that while the trusted flagger designation will use the negative resolution procedure, the provisions implementing the register will require the affirmative resolution procedure. Of course, the issue of potentially illegal content is relevant not just to NCII but to other areas too (eg encouraging suicide or serious self-harm); the NCII measures do not address those.

The Act also creates a new form of court order: the “image deletion order”, available to a sentencing court when dealing with an offender convicted of an intimate image offence, a breastfeeding voyeurism recording offence, or the sharing of semen-defaced images. While the terms of the order will be for the court in each case, in principle the court could require an offender to destroy or delete an intimate image (or breastfeeding image or semen-defaced image) to which the offence relates and, where it has been posted online, to take it down. The order could also apply to other intimate images of the same victim under the offender’s control. The court’s powers to make deprivation orders have also been amended so that photographs or copies are to be treated as being used for the purpose of the offence. Existing principles about when orders may be made will still apply: there must have been a sufficient investigation to justify a finding that the property is the product of an offence, and the court must be satisfied that the order is proportionate and justified.

2 AI

There are three sets of provisions dealing with AI, though none of them amounts to a comprehensive approach; on the whole, they focus on image generation. Two sets of provisions introduce criminal offences, whilst the remainder envisages regulatory solutions. There is also an ancillary offence relating to manuals and guidance.

a) CSAM

The Act contains a number of provisions relating to the protection of children. In particular, there is a ban relating to AI models optimised to produce child sexual abuse material – called a CSA image-generator. CSAM is defined in accordance with existing legislation, i.e. photographs/pseudo-photographs (section 1 of the Protection of Children Act 1978) or prohibited images (section 62 of the Coroners and Justice Act 2009). The images need not necessarily be photorealistic but could be drawings/cartoons. Prior to the Crime and Policing Act, these fine-tuned versions of off-the-shelf AI models were not illegal. The offence covers tools that are made to generate CSA images or are adapted for that purpose. It is unclear whether this extends to tools which are likely to produce CSAM but were neither designed with that aim in view nor adapted for that purpose, though Government guidance states that the “offence will not criminalise AI developers”. There are red-teaming defences as well as defences for neutral intermediaries. The intermediary defence is conditional, in that hosting services which are notified of the CSA image-generator lose immunity if they do not act on notice to remove it.

b) NCII and nudification tools

There are also provisions relating to deepfakes and the supply of image generators (s 99). Interestingly, the language used here for purported image generators differs from that used for CSA image-generators. The description here is as follows:

“A “generator of purported intimate images” is a thing for creating, or facilitating the creation of, purported intimate images of a person” and this will be the case “if a reasonable person (having regard to all the circumstances) would consider that they do so”.

A person has a defence if they can show that they took all reasonable steps to prevent such usage. This is therefore a strict liability offence, with a reverse burden of proof as regards the reasonable steps defence. The Act envisages that the offence may be committed by a company and provides that, in that case, senior officers of the company may also be liable. Again, the intermediary immunity defence is available.

c) Guidance and Manuals

There is also a new offence relating to manuals. It closes a gap in the previous regime (s 69 Serious Crime Act 2015), which excluded pseudo-photographs. The Crime and Policing Act makes it a criminal offence to possess guidance on creating synthetic or partially synthetic CSAM – for example, guidance on how to nudify photos for the purposes of sextortion, or scripts for grooming. Note that there is no analogous offence in relation to guidance on nudification etc in the NCII context. Indeed, it is questionable whether online courses on the nudification of women such as here (US example) would be caught by the law.

d) AI and Chatbots

As noted in a previous blog, the Act also contains a Henry VIII clause giving the Secretary of State powers to apply the illegal content duties to an “AI service”, that is, “an internet service that is capable (or part of which is capable) of generating AI-generated content (no matter what proportion of content on the service is AI-generated)”.

3 Moderators and Administrators

It will be noted that, in the main, the Act provides for intermediary immunity. However, the Act expressly criminalises the carrying out of a “relevant internet activity” with the intention of facilitating child sexual exploitation and abuse. This would include moderators and administrators of websites that host child sexual abuse material. These measures sit in Part 5 of the Act, which deals with sexual offences, specifically Chapter 1 on the protection of children. They have been introduced because, in the NCA’s view, the spread of CSAM is underpinned by certain sites dedicated to supporting and distributing it.

4 Pornographic Content and the Online Safety Act

The Crime and Policing Act lays the groundwork for strengthened requirements around the verification of performer age and consent, with implementation via secondary legislation. This reflects proposals from the Bertin Review on pornography. The Act envisages a further review which will look at, amongst other things, the role of internet service providers in ensuring the consent and age of performers, and specifies that there should be a report.

To implement the report, the Secretary of State is given two sets of powers to make regulations. First, the Act amends the OSA to give the Secretary of State powers to require the providers of regulated services (which could include Part 5 services) to verify the age of performers and to verify that they have consented to the performance. Ofcom may be required to give services guidance on these duties. Secondly, the Secretary of State must appoint a monitoring/enforcement body, which will be able to apply for measures similar to the business disruption measures in the OSA (s 110). This second set of powers opens up the possibility of free-standing proposals rather than necessarily relying on the OSA and Ofcom. As the Memorandum on Delegated Legislation notes, it is intended that these powers be used in the alternative, though as a matter of drafting the Crime and Policing Act does not preclude the possibility of their being used cumulatively. Further, while both are linked to the review, there is nothing in the Act requiring them to implement its outcome.

5 Changes to Offences

Section 98 notes that Schedule 13 contains a list of amendments to the Sexual Offences Act 2003 in connection with offences relating to semen-defaced images, intimate photographs or films, and voyeurism. They strengthen the law in relation to the taking of intimate images without consent and the installation of equipment with the intention of enabling either oneself or another to do so. The previous exposure offence was considered narrow because the intention had to be about sexual gratification. Following the Law Commission’s report, the offence has been extended so as to capture circumstances where the purpose is to humiliate or to obtain sexual gratification. Sections 66, 69 and 70 of the Sexual Offences Act, concerning exposure, sex with an animal and sex with a dead body, are all amended.

Following the recommendations of the Bertin Review (and after significant pressure from Baroness Bertin and campaigners), there are also pornography offences relating to pornographic images of strangulation or suffocation, of sex between relatives, or of sexual activity with a child under 16 where an adult is roleplaying a child. The intermediary immunity provision in the Criminal Justice and Immigration Act 2008, into which these offences are inserted, is extended to cover them. Most of these offences have been made priority illegal content under the OSA – the exceptions are the amended offences in ss 113 (exposure), 114 (sex with animals) and 115 (sex with a corpse). It is likely that they will still be relevant criminal offences under the OSA even if not designated as priority offences.

The assisting self-harm offence replaces the version in the OSA; it is more broadly drafted than the previous offence. There is an intermediary immunity provision for this offence, although not in the standard wording. Unlike the standard version of the immunity, here the internet service provider would not seem to lose immunity should it fail to act once on notice.

6 Enforcement of Serious Offences

The Act also contains powers for investigative agencies to apply to the courts for an order, addressed to third parties involved in the provision of domain names or IP addresses, to suspend the provision of those names or addresses to specified persons. The purpose is to target servers that are being used to facilitate crimes. The crimes in relation to which these powers may be used are limited, according to Sch 18 to the Act, to “serious crimes”, that is, offences in relation to which a first-time offender could reasonably be expected to be sentenced to three years’ imprisonment or more.