The Speaker's Conference report on security of candidates, MPs and elections
The Speaker's Conference has published the final report from its year-long inquiry into the security of MPs, candidates and elections. It is a long, sobering read, detailing the extent and impact of online and offline abuse faced by our democratic representatives. This second report focuses on the criminal justice system and social media, and the Conference’s headline summary on the latter topic offers a very clear analysis of the current regulatory and legislative protections:
“For as long as people continue to use social media as a mouthpiece for abuse, responsibility for tackling this issue lies with the platforms. However, we have no faith that the social media companies we have spoken to will address the underlying factors that drive abuse on their platforms unless they are legally obliged to do so. The Online Safety Act 2023 has the potential to deal with many of these issues, but to do so effectively Ofcom must make full use of the regulatory powers it has been given. It must be given time to implement the Act fully, but when appropriate, success of the Act as it relates to the issues under our consideration must be measured by the lived experience of politicians online. Notwithstanding the Online Safety Act’s potential, the centrality of social media and constant progress of technology make it highly likely that further social media regulation will be needed at some point, and the Government should be preparing for this now.” (Speaker's Conference Second Report, p3)
We welcome the recommendations that the Speaker's Conference has made with regard to the effectiveness of the Online Safety Act and the areas for Government action. In particular, we are delighted that the Conference has adopted the recommendation in our submitted evidence for the Government to “consider the merits of mandating Ofcom to produce an elections code of practice for social media platforms, and the feasibility of introducing this requirement as part of the Bill it has said it will bring forward during this Parliament on electoral reform.” This approach - which would put specific risk assessment and mitigation responsibilities on platforms to provide additional protections from online abuse for candidates during high-risk election periods - would align with the regulatory framework of the OSA; we recommend that Ofcom should work with the National Police Chiefs’ Council and the Electoral Commission in developing it. You can read more on our proposal and the suggested OSA amendments to enact it, along with a skeleton code, here.
The Conference’s analysis, informed by the testimony of MPs, research from academics and the evidence from representatives of the two platforms “that are the main source of threats to MPs and candidates” (Facebook and X), is spot on. It notes that “the design of a platform affects the level of abuse” (p52); we have long called for a definition of safety by design, as well as a supporting code of practice, to be introduced under the OSA in order to deliver one of the Act’s primary objectives (see section 1).
The Conference also noted that “women consistently receive a disproportionate amount of sexualised abuse and disproportionately high abuse if they become more prominent or senior in a party, while politicians from ethnic minorities receive a disproportionate amount of hate speech. Female ethnic minority candidates ‘faced significantly more hate speech than their male counterparts, underscoring the compounding nature of discrimination based on both ethnicity and gender’.” (p55-6); and that “social media platforms have become increasingly unwilling to remove abusive content.”
One particularly damning finding, from evidence provided by the Parliamentary Security Department, was that “while platforms have previously been willing to work with PSD to remove abusive and intimidating content, they no longer have ‘any interest whatsoever’.” (p56) The engagement of the two main platforms at an oral evidence session also came in for criticism, leading the Conference “to doubt their sincerity”; their follow-up written evidence “was often more evasive than it was informative”.
This led the Conference to conclude:
“We have no faith that Meta and X will resolve these issues [removal of abuse as well as algorithmic design prioritising incendiary and threatening content] unless they are legally obliged to do so. There is therefore no point in us recommending wholesale change of policies or modus operandi. The only viable solution is for Ofcom and the Government to compel them to protect their users. Currently, this will be via the Online Safety Act 2023. But should that legislation fail to tackle the root issues that influence the level of abuse and intimidation on social media platforms, further legislation will be needed.” (p69)
(It is perhaps worth noting here that - had the previous Government not removed the duties relating to content harmful to adults when the Online Safety Bill was going through Parliament at the end of 2022 - new legislation might not now be needed to deal with sub-criminal online abuse.)
The Conference report rightly emphasises the importance of phase 3 of the implementation of the Online Safety Act, which will bring in transparency reporting, consistent enforcement of terms of service, user empowerment tools and user verification duties, but notes that “this opportunity will only materialise if Ofcom makes full use of all the regulatory powers it has been given under the Act”. It calls for greater opportunities for Parliament to scrutinise Ofcom’s implementation and assess its effectiveness, echoing calls from the Science, Innovation and Technology Committee and the Lords Communications and Digital Committee for better advance notice to Parliament of OSA implementation milestones, consultations and draft codes; see their joint letter to the DSIT Secretary of State here.
The Conference also potentially pre-empts the likely Government response on OSA implementation - “wait and see” - by flagging that the Government is likely to need to bring in more legislation and that “preparations should be made ahead of time so that options for further regulation are ready for the Government to act on when needed” (p77). In particular, it points the Government to a review of “the enforcement and effectiveness of the false communications offence introduced by the Online Safety Act 2023. In line with the recommendation in our first report, and in order to address the terrible spread of sexually explicit deepfakes targeting female candidates, the Government should ensure that current legislation and any amended legislation on false statements about candidates explicitly cover the creation of deepfakes” (p79), and to considering legislation to address doxing and pile-on harassment, which are “not sufficiently addressed by existing legislation” (p80).
The Government has until 23 December to respond to the Speaker's Conference recommendations.