Online Safety Act Network

Ofcom's approach to human rights in the illegal harms consultation


Issue

The Online Safety Act directs Ofcom to consider freedom of expression (Article 10 ECHR) and privacy (Article 8 ECHR), but these are not the only relevant rights – as indeed Ofcom notes.

All the rights protected by the European Convention on Human Rights should be taken into account when assessing the impact of the regime – or the lack of it. So, as well as the qualified rights of freedom of expression and private life and the other rights noted by Ofcom (e.g. the right to association), we should consider the remaining rights, including the unqualified rights: the right to life, freedom from torture and inhuman and degrading treatment, and the prohibition on slavery and forced labour (e.g. people trafficking). Note also that rights can entail positive obligations as well as an obligation to refrain from action; a public body can infringe human rights by failing to protect an individual’s rights as well as by interfering with them itself.

Article 14 ECHR requires that people not be discriminated against in the enjoyment of their rights; all people (and not just users of a particular service) should be considered. This reflects the general principle of human rights that all people’s rights should be treated equally – and indeed that the starting point is that no right (for example, freedom of expression) has automatic priority over another. It also means that the European Court has adopted a specific methodology for balancing rights of equal weight, 1 rather than its typical approach, under which a qualified right may suffer an interference in the public interest provided that interference is limited. This difference in methodology reaffirms the significance of identifying all the rights in issue when carrying out balancing exercises. A failure by national authorities to carry out a proper balancing exercise has itself led to findings of a violation of the procedural aspects of the relevant right.

Note also that Article 17 prohibits the abuse of rights, so that “any remark directed against the Convention’s underlying values would be removed from the protection of Article 10 by Article 17”. 2 While this applies only to a narrow sub-set of speech, it is nonetheless a factor that should form part of the balancing exercise where relevant. Areas where Article 17 might be relevant include threats to the democratic order; 3 racial hatred; 4 Holocaust denial; 5 religious 6 or ethnic 7 hate; hatred based on sexual orientation; and incitement to violence and support for terrorist activity. 8

What the Act says

Section 22 sets out duties with regard to freedom of expression and privacy, imposing on all user-to-user services:

(2) When deciding on, and implementing, safety measures and policies, a duty to have particular regard to the importance of protecting users’ right to freedom of expression within the law.

(3) When deciding on, and implementing, safety measures and policies, a duty to have particular regard to the importance of protecting users from a breach of any statutory provision or rule of law concerning privacy that is relevant to the use or operation of a user-to-user service (including, but not limited to, any such provision or rule concerning the processing of personal data).

An analogous provision (section 33) applies in relation to search services.

As a public body, Ofcom falls within section 6 of the Human Rights Act, which specifies that “[i]t is unlawful for a public authority to act in a way which is incompatible with a Convention right”.

Ofcom’s proposals

The concern here is that Ofcom’s approach, as set out in its illegal harms consultation, considers only the rights of users (as speakers) and has principally focused on their freedom of expression. In doing so, it has not really considered the nature of the speech (which the European Court does take into account), nor provided evidence that speech in some instances would be chilled 9 – it has instead hypothesised a rather theoretical concern. It has not considered the rights of other users and non-users, which require steps to be taken against rights-infringing harms – harms whose rights-infringing character has been recognised in the judgments of the European Court and in the opinions of UN Special Rapporteurs. This means that any balancing exercise is skewed towards not taking action, for fear of inconveniencing users (who could well be infringing the rights of others) and companies.

We set out a number of examples below taken from various sections of the Illegal Harms consultation to demonstrate our concern. These intersect with our concerns about the proportionality judgements that underpin the overall approach in the consultation on which we will be writing further.

Ofcom on prioritising rights of users over rights of intended victims with regard to content moderation

  • “Content moderation is an area in which the steps taken by services as a consequence of the Act may have a significant impact on the rights of individuals and entities - in particular, to freedom of expression under Article 10 ECHR and to privacy under Article 8 of the European Convention on Human Rights (‘ECHR’).” (Vol 4, 12.57)

Ofcom on applying a proportionality test to including measures in the codes

  • “to include a measure in the Codes, we need to assess that the measure is proportionate (with reference to both the risk presented by a service, and its size, kind and capacity) and does not unduly interfere with users’ rights to freedom of expression and privacy.” (Vol 4, 11.22) – no mention of the rights of others – whether this be their freedom of expression and privacy or other aspects of Article 8, let alone Articles 2, 3 or 4.

Ofcom on recommending cumulative risk scoring systems

  • “We consider that cumulative risk scoring systems could provide various benefits for tackling illegal harms such as fraud, drugs and weapons offences, child sexual exploitation and abuse, terrorism, and unlawful immigration. We recognise however that there is significant complexity involved in these systems, and that there could be adverse impacts on user privacy or freedom of expression if the operation of the system were to result in inappropriate action being taken against content or user accounts. We have limited evidence on this at present. As a result, we are not proposing to include a recommendation that services use cumulative risk scoring systems in our Codes of Practice at this time” (Vol 4, 14.322) – the adverse impact on freedom of expression and privacy trumps the “various benefits” for tackling illegal harms, with no consideration of the need to protect the fundamental rights of victims. (A minimal sketch of how such a scoring system might work, and where the false positives Ofcom worries about arise, follows below.)
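
Ofcom’s consultation does not describe how a cumulative risk scoring system would be built, so the following is a purely illustrative sketch under our own assumptions – the signal names, weights and threshold are all hypothetical. It shows the basic mechanism: weighted signals are summed per account, and action is triggered above a threshold, which is also where false positives against innocent accounts (and hence the rights concerns on both sides of the balance) arise.

```python
# Illustrative sketch only: Ofcom's consultation does not define how
# cumulative risk scoring systems work. All signal names, weights and
# thresholds below are hypothetical.

# Weighted signals a service might track per account.
RISK_WEIGHTS = {
    "reported_by_multiple_users": 0.4,
    "matched_fraud_keyword": 0.3,
    "new_account_bulk_messaging": 0.2,
    "linked_to_flagged_payment": 0.5,
}

ACTION_THRESHOLD = 0.8  # hypothetical cut-off for acting on an account


def cumulative_risk_score(signals: dict[str, bool]) -> float:
    """Sum the weights of all signals present for an account."""
    return sum(RISK_WEIGHTS[name] for name, present in signals.items() if present)


def should_act(signals: dict[str, bool]) -> bool:
    """Act when the combined score crosses the threshold.

    No single signal here triggers action on its own, but several weak
    signals can combine to do so - including for an entirely innocent
    account, which is the false-positive risk (and hence the Article 8
    and Article 10 concern) that Ofcom identifies.
    """
    return cumulative_risk_score(signals) >= ACTION_THRESHOLD


# Two signals that are innocuous in isolation combine to trigger action.
flags = {"reported_by_multiple_users": True, "linked_to_flagged_payment": True}
print(cumulative_risk_score(flags))  # 0.9
print(should_act(flags))             # True
```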

Ofcom’s interpretation of the “chilling effect”

  • “In addition, there could be a risk of a more general ‘chilling effect’ if users were to avoid use of services which have implemented a more effective content moderation process as a result of this option.” (Vol 4, 13.52)
  • “Potential interference with users’ freedom of expression arises insofar as content is taken down on the basis of a false positive match for CSAM or of a match for content that is not CSAM and has been wrongly included in the hash database. In addition, there could be a risk of a more general ‘chilling effect’ if users were to avoid use of services which have implemented hash matching in accordance with our option.” (Vol 4, 14.87)
  • “Potential interference with users’ freedom of expression arises insofar as content detected by services deploying keyword detection technology in accordance with this option does not amount to a priority offence regarding articles for use in frauds, but is wrongly taken down on the basis that it does. There could also be a risk of a more general ‘chilling effect’ if users were to avoid use of services which have implemented keyword detection technology in accordance with this option.” (Vol 4, 14.281)
  • “We recognise that these user support measures may have a limited chilling effect on the rights to freedom of expression and freedom of association in that they would briefly delay children from disabling defaults and may result in children being less likely to do so (preserving the existing restrictions on their rights outlined in paragraph 18.65 above). The measures may also result in children being less likely to establish new connections or communicate with new users online.” (Vol 4, 18.135)
  • The “chilling effect” here is the dissuading or inconveniencing of users from using services that act robustly on illegal harms, including CSAM and fraud. There is no mention of a “chilling effect” in relation to the impact of individual users on others. (A simplified sketch of how the hash-matching false positives quoted above arise follows below.)
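
Ofcom does not set out the mechanics in these passages, but they matter to the rights analysis. Hash matching compares a fingerprint of uploaded content against a database of known CSAM fingerprints; deployed systems use perceptual hashes (so that re-encoded or cropped images still match) and vetted hash lists. The sketch below is a deliberately simplified stand-in – an exact cryptographic hash and hypothetical entries – so it can only illustrate the second failure mode Ofcom mentions (content wrongly included in the hash database); the first (a false positive match) arises because perceptual hashing tolerates near-misses.

```python
# Illustrative sketch only. Real deployments use perceptual hashing
# (PhotoDNA-style fingerprints) and vetted hash lists from bodies such
# as the IWF; this stand-in uses SHA-256 and hypothetical entries.
import hashlib

HASH_DATABASE = {
    # Hypothetical "known CSAM" entry (SHA-256 of b"foo" for the demo).
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
    # Hypothetical wrongly included entry (SHA-256 of b"bar"): lawful
    # content hashing to this value would be taken down - the failure
    # mode Ofcom describes as content "wrongly included in the hash
    # database".
    "fcde2b2edba56bf408601fb721fe9b5c338d10ee429ea04fae5511b68fbf8fb9",
}


def matches_database(content: bytes) -> bool:
    """Fingerprint the content and check it against the hash list."""
    return hashlib.sha256(content).hexdigest() in HASH_DATABASE


print(matches_database(b"foo"))  # True - intended match
print(matches_database(b"bar"))  # True - lawful content, bad database entry
print(matches_database(b"baz"))  # False - no match, nothing happens
```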

Ofcom on balancing freedom of expression rights with recommending measures for strikes or blocking of accounts

  • “Although blocking and strikes may be a way of tackling illegal content, there are also concerns about the use of these systems on lawful speech. Preventing a user from accessing a service means removing their ability to impart and receive information and to associate with others on that service. It therefore represents, for the duration of the block and in respect of that service, a significant interference with that user’s freedom of expression and association. The impact also extends to other users, who will be unable to receive information shared by the blocked user on the service in question. Restricting access to certain functionalities as part of a strikes system may also interfere with user rights, for example if the user is prevented from posting content on the service.” (Vol 4, 21.39) – no consideration of the rights protected through such blocking, nor of the value ascribed to the speech in the blocked account, both as regards the speaker’s rights and those of people receiving the information.
  • “Our proposed recommendation around strikes and blocking in this consultation relates to proscribed groups. We are inviting further evidence from stakeholders to be able to explore broadening this in future work; in particular, we are aiming to explore a recommendation around user blocking relating to CSAM early next year. We are particularly interested in the human rights implications, how services manage the risk of false positives and evidence as to the effectiveness of such measures.” (Vol 4, 11.15) – as a result, measures for CSAM blocking are not recommended in the first codes, despite the impact on children and the likely interference with children’s Article 8 and Article 3 rights, and possibly also their Article 2 and Article 4 rights.

Evidence

1. The Silencing Effect of Abuse – Article 10 ECHR

As Ofcom has recognised, women and other minoritised groups receive a disproportionate amount of abuse 10 – abuse here can take various forms, from direct threats to gendered or racist misinformation and the use of deepfakes to undermine and harass, to name but a few. Yet it has been established for at least five years that “online gender-based abuse and violence assaults basic principles of equality under international law and freedom of expression”. 11 Dubravka Šimonović, the UN Special Rapporteur on violence against women, highlighted in 2018 the importance of applying a human rights-based approach to online violence against women, 12 and it has been recognised that women in particular are being targeted, especially those in public life. 13 As the UN Special Rapporteur on freedom of expression emphasises, there should be no trade-off between the right to be safe and the right to speak. 14

This point can be made in relation to other minoritised groups too – and those with intersectional identities suffer particularly. In short, the failure to provide a safe environment in which to express themselves – which the European Court of Human Rights recognises as part of the positive obligations under Article 10 15 – constitutes an infringement of the victims’ expression rights, as well as those of people who share relevant characteristics with them. The point about freedom of expression is particularly important for those in public life, but the underlying facts in any given case may also implicate Article 8 and have an even wider impact.

This point about shared characteristics is also important. Of course, men receive abuse online too, but that abuse seems more often addressed expressly to ideas (and thus could be categorised as an extreme form of debate), whereas women seem to be targeted for their characteristics 16 – pure abuse, which receives little protection under the Convention, if any at all. 17 By contrast, as discussed below, negative stereotyping of a group, when it reaches a certain level, is capable of impacting on the group’s sense of identity and the feelings of self-worth and self-confidence of its members. It is in this sense that it can be seen as affecting their “private life” within the meaning of Article 8(1). 18

Moreover, the approach to dealing with misogynistic trolling in particular – imposing obligations on the victim – itself contributes to an environment in which victims are not taken seriously 19 and in which rape culture (through symbolic violence) continues. The impacts, while clearly affecting speech, are deep, wide-ranging and often misunderstood and undervalued 20 – especially when the violation of rights is not recognised and what is going on is characterised vaguely as merely harmful.

2. CSAEM/Grooming of Children – Articles 8 and 3

Article 8 is not just about the confidentiality of communications. The text of Article 8 covers four groups of interests: private life, family life, home and correspondence, each of which has been interpreted broadly. As with Article 10, in addition to protection against interference with these rights by public authorities, there are positive obligations to ensure that Article 8 rights are respected even as between private parties. Positive obligations are particularly significant when the interests at stake involve “fundamental values” or “essential aspects” of private life, and the Court looks for deterrence and effective protection. So, granting an amnesty to the perpetrator of a sexual assault constituted a breach of Article 8 (and also Article 3) – a point that should be borne in mind when assessing severity of harm and the appropriate balance against the impact on service providers or on the user speaking. 21

As regards the relevance of Article 8 to CSAEM offences, KU v Finland established that an advert of a sexual nature placed in the name of a 12-year-old boy on an Internet dating site, leaving him open to approach by paedophiles, fell indisputably within Article 8, which covers the physical and moral integrity of a person. Here the Court emphasised the potential threat to the boy’s physical, mental and moral welfare, as well as his vulnerability because of his age. This was a grave threat: “sexual abuse is unquestionably an abhorrent type of wrongdoing, with debilitating effects on its victims. Children and other vulnerable individuals are entitled to State protection, in the form of effective deterrence, from such grave types of interference with essential aspects of their private lives”. 22 In this instance, although there were laws in place, they were ineffective, and the Finnish Government had failed to “put in place a system to protect child victims from being exposed as targets for paedophiliac approaches via the Internet”. 23

In Söderman, 24 the step-father of a 14-year-old covertly filmed her undressing before showering; the film was destroyed without anyone seeing it, and he was subsequently acquitted because the act of filming was not in itself illegal. Again, this fell within Article 8 and the State’s obligation to protect the physical and psychological integrity of an individual from other persons – especially where that person is a child. Rape and sexual abuse of a child implicate fundamental values and essential aspects of private life, and aggravating factors include the offence taking place in the child’s home, where the child should feel safe. So, even “in respect of less serious acts between individuals, which may violate psychological integrity”, there is an “obligation of the State under Article 8 to maintain and apply in practice an adequate legal framework affording protection” – a framework that must be “sufficient”. 24 It is therefore not just the legislative framework that is considered, but also the implementation of that framework.

KU was dealt with under Article 8; it did not involve physical assault of the child, and there is some suggestion that verbal abuse without physical violence would fall within Article 8 rather than Article 3. 25 More serious assaults may trigger Article 3 or even Article 2 (both of which are unqualified rights). Here too there are positive obligations on the State to protect the personal integrity of a child, with breaches of the right being found where State procedures are ineffective. 26

3. Violence Against Women – Articles 3 and 8

A similar analysis can be made in relation to a whole range of offences related to violence against women. Coercive control, domestic violence and similar behaviours likewise infringe survivors’ Article 8 and, in some instances, Article 3 rights. 27 While many of the cases involve a State’s failure to protect the victim of domestic violence against physical injuries and psychological damage, there are cases involving digital tools. In Volodina v Russia (No 2), the applicant complained of breaches of her Convention rights arising from the State’s failure to take action in respect of cyber-harassment: her former partner had used her name, personal details and intimate photographs to create fake social media profiles, had planted a GPS tracker in her handbag, and had sent her death threats via social media. In this case the Court found a violation of Article 8. Image-based sexual abuse likewise engages Article 8: Ismayilova. 28 More generally, in Buturugă v Romania the Court recognised ‘cyber-bullying’ as an aspect of violence against women and girls, noting that it can take a variety of forms, including cyber breaches of privacy, intrusion into the victim’s computer and the capture, sharing and manipulation of data and images, including private data; in that case the Court found a violation of Articles 3 and 8.

These offences also contravene the Istanbul Convention 29 and the UN Convention on the Elimination of Discrimination Against Women. They should be seen as infringing fundamental rights, infringements in respect of which the State has positive obligations. Ofcom’s balancing and proportionality assessments should take this into account; so far the analysis has not recognised this.

The President of the European Court of Human Rights recently noted that “the victims of domestic and gender-based violence are not born vulnerable. They are rendered vulnerable, on their journey from girl to womanhood, by the imbalanced social structures into which they are born, by the law and by law-makers, and by attitudes and patterns of behaviour in their regard which are ignored, permitted or endorsed by society, including the State.” She suggests that the focus of the Court “must remain the actions and omissions of State authorities”, the key question being: “were the applicants accorded equal and sufficient protection before the law?” 30

4. Trafficking – Article 4

Article 4(1) prohibits “slavery or servitude”, while Article 4(2) prohibits forced or compulsory labour; the Court has distinguished between these terms. 31 In S.M. v Croatia, the Court clarified that “forced or compulsory labour” covers serious exploitation, for example forced prostitution, irrespective of whether it is linked to trafficking. Trafficking in human beings, by its very nature, is based on the exercise of powers attaching to the right of ownership; it threatens the human dignity and other fundamental freedoms of its victims. While the prohibition in Article 4(1) arguably relates to more serious infractions of fundamental rights, both Article 4(1) and Article 4(2) constitute rights that should be considered as such in any balancing of interests.

Recommendation

Ofcom should review its recommendations in the light of its obligations, weighing in particular the gravity of the rights violations at stake against companies’ costs.

References


  1. See e.g. Perinçek v. Switzerland (27510/08) [GC], 15 October 2015, para 198; Axel Springer AG v. Germany (39954/08) [GC], 7 February 2012, paras 83-84 on the balance between Articles 8 and 10 – the precise factors taken into account in the balance will vary depending on the underlying facts of a case and the rights involved. ↩︎

  2. Seurot v France (57383/00), decision 18 May 2004 ↩︎

  3. Schimanek v Austria (32307/96), decision 1 February 2000 ↩︎

  4. Glimmerveen and Hagenbeek v Netherlands (8348/78 and 8406/78), decision 11 October 1979 ↩︎

  5. Garaudy v France (65831/01), decision 24 June 2003 ↩︎

  6. Belkacem v Belgium (34367/14), decision 27 June 2017 ↩︎

  7. Pavel Ivanov v Russia (35222/04), decision 20 February 2007 ↩︎

  8. Roj TV A/S v Denmark (24683/14), decision 18 April 2018 ↩︎

  9. In Wille v Liechtenstein (28396/95) [GC], 28 October 1999, the Court suggested that interferences could take a wide range of forms (“formality, condition, restriction or penalty”), but this does not seem to map onto what Ofcom is suggesting; see Metis Yayıncılık Limited Şirketi v Turkey (4751/07), decision 20 June 2017, suggesting that there should be some substance to any claim of a chilling effect – swiftly terminated criminal proceedings were not deemed to have such an effect; see also Schweizerische Radio- und Fernsehgesellschaft and Others v. Switzerland (68995/13), decision 12 November 2019, para 72. ↩︎

  10. For example, in volume 2’s discussion of the nature of the risks arising from the harassment, stalking, threats and abuse offences, Ofcom notes: “Reviewing Ofcom and third-party evidence indicates that these offences disproportionately affect certain identity groups – most notably women – alongside other intersecting risk factors; the impact on those individuals can be severe.” (para 6E:15) ↩︎

  11. Press Release: ‘UN experts urge States and companies to address online gender-based abuse but warn against censorship’, https://www.ohchr.org/en/press-releases/2017/03/un-experts-urge-states-and-companies-address-online-gender-based-abuse-warn?LangID=E&NewsID=21317 ↩︎

  12. UN Human Rights Council. Report of the Special Rapporteur on violence against women, its causes and consequences on online violence against women and girls from a human rights perspective. June 2018. 38th session ↩︎

  13. Naser Al Wasmi, “UN High Commissioner for Human Rights says women are being silenced”, The National, 28 September 2018, https://www.thenationalnews.com/world/the-americas/un-high-commissioner-for-human-rights-says-women-are-being-silenced-1.774871

    Ofcom’s own risk analysis finds that “Experiencing abuse and harassment can have a silencing effect, making victims and survivors feel unsafe in expressing themselves on social media services. Human rights organisation Amnesty International found that 24% of women experiencing online abuse and harassment said they stopped posting their opinions on certain issues. A study from 2016 found that 27% of US internet users censor their own online posts for fear of being harassed. This silencing effect can harm victims and survivors’ careers. A study of women journalists found that those facing abuse and harassment reported making themselves less visible (38%), missing work (11%), leaving their jobs (4%), with some deciding to abandon journalism altogether (2%).” (Volume 2, 6E:23)
    Protecting People From Online Harm Volume 2: “The causes and impacts of online harms”,  https://www.ofcom.org.uk/__data/assets/pdf_file/0019/271243/volume-2-illegal-harms-consultation-1.pdf ↩︎

  14. UN, Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, Irene Khan (A/78/288), https://documents-dds-ny.un.org/doc/UNDOC/GEN/N23/233/65/PDF/N2323365.pdf?OpenElement ↩︎

  15. Özgür Gündem v. Turkey (23144/93), 16 May 2000; Dink v Turkey (2668/07), judgment 14 September 2010; Khadija Ismayilova v. Azerbaijan (65286/13 and 57270/14), 10 January 2019 ↩︎

  16. Marjan Nadim and Audun Fladmoe, ‘Silencing Women? Gender and Online Harassment’ (2021) 39(2) Social Science Computer Review 245, DOI: https://doi.org/10.1177/0894439319865518 ↩︎

  17. See eg Norwood v UK (23131/03), decision 16 November 2004, on racist abuse and the impact of Article 17 ECHR ↩︎

  18. Aksu v Turkey (4149/04 and 41029/04) [GC], judgment 15 March 2012 ↩︎

  19. This can also be seen in media portrayals of trolling: Karen Lumsden and Heather Morgan, “Media framing of trolling and online abuse: silencing strategies, symbolic violence, and victim blaming” (2017) 17(6) Feminist Media Studies, https://www.tandfonline.com/doi/full/10.1080/14680777.2017.1316755 ↩︎

  20. Azmina Dhrodia, ‘Social media and the silencing effect: why misogyny online is a human rights issue - A survey of women in eight countries shows a clear trend’ The New Statesman, 23 November 2017, https://www.newstatesman.com/culture/social-media/2017/11/social-media-and-silencing-effect-why-misogyny-online-human-rights-issue ↩︎

  21. EG v Moldova (37882/13), judgment 13 April 2021 ↩︎

  22. KU v Finland (2872/02), judgment 2 December 2008, para 46 ↩︎

  23. KU para 48 ↩︎

  24. Söderman v Sweden (5786/08) [GC], judgment 12 November 2013 ↩︎

  25. Söderman, paras 85 and 91 ↩︎

  26. Association Accept and Others v Romania (19237/16), judgment 1 June 2021; F.O. v Croatia (29555/13), judgment 22 April 2021 ↩︎

  27. X and Others v Bulgaria (22457/16) [GC], judgment 2 February 2021; Z and Others v UK (29392/95) [GC], judgment 10 May 2001 ↩︎

  28. See eg Vuckovic v Croatia, judgment 12 December 2023 ↩︎

  29. Ismayilova, para 116

    Ofcom’s risk profile analysis (Volume 2 of the illegal harms consultation) sets out some of the evidence around coercive control offences: “To put the risks of harm into context, a survey by Women’s Aid of victims and survivors of domestic abuse in 2013 found that 45% had experienced abuse online during their relationship. Research by Refuge conducted in 2022 found that 82% of victims and survivors of tech abuse had experienced harassment or stalking on social media, 41% had experienced threats of violence and 29% had experienced intimate image abuse. However, getting up-to-date measurement of CCB, including the specific prevalence of online CCB, is challenging. It is not only a relatively new offence, but also overlaps with other offences (see in 6G.10).” (Vol 2, 6G:14)

    Also “Low incidences of reporting further complicate the measurement of CCB; like related offences, CCB is consistently under-reported. Refuge found that half of victims and survivors (49%) said they told nobody about the abuse, with only 13% of women reporting the abuse to the social media platform where the abuse happened. Only one in ten victims and survivors (10%) reported it to the police. One in five victims and survivors (18%) did not tell anyone because they were not sure how to report the abuse. Evidence also suggests that women may not identify as victims and survivors if asked directly; responses are higher if women are given specific examples of abuse to relate to. (6G:15) … Research by Refuge also shows that women fear for their physical safety following online CCB. Almost one in five (17%) said they felt afraid of being attacked or subjected to physical violence following tech abuse. Fifteen per cent felt their physical safety was more at risk, 5% felt more at risk of ‘honour’-based violence and 12% felt afraid to leave the house. (6G.26) … Some victims and survivors are left feeling uncomfortable online. If victims and survivors disengage from online services because of CCB, this can have significant adverse effects including isolating them from family, friends, professional and social networks (thereby reducing their ability to access support). Being inaccessible online can in fact heighten abuse or amplify the risk of physical contact to enable the perpetration of abuse. (6G.28)”

    Protecting People From Online Harm Volume 2: “The causes and impacts of online harms”,  https://www.ofcom.org.uk/__data/assets/pdf_file/0019/271243/volume-2-illegal-harms-consultation-1.pdf ↩︎

  30. Síofra O’Leary, Speech on the Opening of the Judicial Year 2024, Strasbourg, 26 January 2024, p. 5, https://www.echr.coe.int/documents/d/echr/speech-20240126-oleary-jy-eng ↩︎

  31. S.M. v Croatia (60561/14) [GC], judgment 25 June 2020; Rantsev v Cyprus and Russia (25965/04), judgment 7 January 2010; Siliadin v France (73316/01), judgment 26 July 2005 ↩︎
