Online racist abuse of English footballers at the Euros: the Government’s Groundhog Day

Just over four years ago, the OSA Network’s Prof Lorna Woods and Maeve Walsh, along with William Perrin, wrote a blog post, as part of their work on online harms at Carnegie UK, responding to the shocking online racist abuse of the three black England footballers who missed penalties in the Euro 2020 Championship final in July 2021.

In our post, we made a number of specific recommendations for the Government to strengthen the draft Online Safety Bill, which was then going through pre-legislative scrutiny in Parliament. The abuse caused national outrage and sparked Parliamentary debate, including challenges at PMQs to the then Prime Minister, Boris Johnson, from the current Prime Minister, Keir Starmer, and an Urgent Question in Parliament in which the current Cabinet Office Minister, Nick Thomas-Symonds, accused the then Government of “dragging its feet” on the Online Safety Bill.

Roll forward four years, and the shocking online racist abuse suffered by one of England’s women footballers, Jess Carter, during this month’s Euros has prompted a police investigation and shone a spotlight on the growing problem of online abuse in the women’s game. The Online Safety Act 2023 is now in force. The Prime Minister, Keir Starmer, has said on X there is “no place for racism in football or anywhere in society” and that he stands “with Jess, the Lionesses and any players who have suffered racism, on and off the pitch”.

So why is the legislation not making a difference? The answers reflect shamefully on both the previous and current Governments and their persistent refusal to get to grips with a systemic problem on social media and to hold the platforms that enable it to account.

We called for three things in our 2021 blog post:

  1. Greater clarity on how the criminal law applies on social media, so that the threshold for triggering the OSA obligations on platforms relating to illegal content would be effective in reducing its spread, particularly in fast-moving, high-profile cases such as this. Unfortunately, Ofcom’s approach in the Illegal Content Judgements Guidance (ICJG) focuses only on priority offences - such as racial hatred - and omits consideration of the lower-grade offences that would catch a broader range of abuse. It is therefore largely ineffective in cases such as Carter’s. Prof Woods has written more generally on the problems with the ICJG here.

  2. The strengthening of the adult safety duties as they applied to racist abuse. These duties were removed entirely by the previous Conservative Government at the end of 2022 and replaced by a “Triple Shield” comprising: making sure illegal content is removed (see above on the limitations of that); the provision of user empowerment tools (enabling users to filter out content they do not want to see); and a requirement on platforms to enforce their Terms of Service (ToS). The last two duties are not yet in force and - on Ofcom’s current timescales - are unlikely to be so until early 2027. Nor are they sufficient.

    As we wrote then:

    “relying on terms of service suggests that the regulatory focus is at the point when companies tell users what they can and cannot do – content moderation policies and take down rules. What it does not seem to do is to require companies to change their upstream systems and processes, which are more likely to be effective at scale than tighter terms of service. Such mechanisms include giving people tools to protect themselves, not algorithmically promoting racism, not recommending groups etc.”

    Moreover, the Act sets no minimum standard for ToS, nor does it impose any sanctions on companies that reduce protections in those ToS - as we have seen Meta and X do in the past couple of years. We have written on why this is a problem and have repeatedly called on the Government to bring in urgent amendments to the OSA to rectify it; see here. The impact of these rollbacks on women and people of colour is already being felt: in a recent global survey, 76% of women respondents reported that harmful content targeting protected characteristics has increased since Meta’s decision to dilute its protections in January this year, and 78% of respondents of colour reported the same.

  3. The inclusion in the legislation of a general duty to take reasonable steps to prevent reasonably foreseeable harms. We remain of the view that this should be a policy priority for any Government that is serious about addressing the scale of harm online in a way that is systemic, risk-based, design-focused and - most importantly - scalable at times of emergency or crisis, such as the seemingly inevitable online targeting of female and/or minoritised ethnic sports stars with abuse when they participate in international events. Such a general duty would solve a number of problems with the current, complex Act, given its predominant focus on content moderation and takedown.

Next year, the England men’s football team will be competing in the 2026 World Cup, co-hosted by the United States. The time for platitudes is over. If the Prime Minister really wants to “stand with” footballers suffering racism, and to use the levers that he and his Government now have to deliver his aspiration that there is “no place for racism in football or anywhere in society”, he should draw a line under the cowardly capitulation of the foot-dragging previous Government, stand up to the (US-owned) platforms and amend the Online Safety Act before then.

Only then will the online protections desperately needed by our footballers - and, indeed, by all women and minorities - be delivered.