The Government's VAWG strategy: our response

Today the Government has published its long-awaited Violence Against Women and Girls (VAWG) strategy, which sets out its ambition for how it will tackle VAWG and deliver on its mission to halve VAWG within the next decade.

To tackle the full continuum of VAWG, it is vital that online abuse is taken as seriously as offline abuse. Yet, whilst pervasive in our society, online abuse remains poorly understood by criminal justice agencies and is not robustly dealt with by tech platforms. We are therefore pleased to see several specific measures to tackle online abuse and technology-facilitated abuse, which we respond to in more detail below.

Ban on nudification apps and other tools

Nudification apps facilitate the abuse of women and girls through the creation of non-consensual intimate and sexualised images, which have often been used to extort children and drive women offline. These apps have made image abuse easily accessible to perpetrators with relatively little technical ability. We therefore particularly welcome proposals to ban nudification apps, which follows the tireless campaign efforts of numerous organisations that are part of the OSA Network. We acknowledge, however, that there may be challenges in defining the scope of the ban so that it achieves its purpose whilst not being over-broad or easily circumvented.

Furthermore, the strategy pledges to “make it impossible for children in the UK to take, share or view a nude image, to help protect our children”, noting that the Government is “working constructively with companies to make this a reality”. Self-generated CSAM is a pressing issue, and one that should not be dealt with lightly. However, we are yet to see the detail of what this action would look like, or the implications it will have for user privacy, particularly where data containing nude images of children are held. It is vital that any proactive steps taken to combat the sharing of nude images amongst young people do not come at the expense of their safety or security.

Tackling extreme pornography

As has already been announced, the Government is legislating to make it a criminal offence to possess or publish pornography which depicts strangulation or suffocation. This is an extremely welcome move, particularly given that recent research from the Institute for Addressing Strangulation (IFAS) found that 32% of 16-17-year-olds who had prior sexual experiences reported having strangled others during sex. This is accompanied by action to designate the depiction of strangulation or suffocation in pornography as a priority offence under the Online Safety Act. Platforms will therefore be held accountable for taking proactive steps to prevent users from seeing illegal strangulation and suffocation content in pornography, and search services will need to minimise the risk of individuals encountering content of this type. Ofcom will need to consider which measures are best suited to implementing these requirements effectively.

The strategy also commits to addressing the issues detailed in Baroness Bertin’s Independent Pornography Review and creating a joint team, across the Home Office, Department for Science, Innovation and Technology, Ministry of Justice and Department for Culture, Media and Sport, “to rigorously examine the evidence to inform the government’s approach to pornography policy”. Baroness Bertin’s pornography review set out numerous evidence-based recommendations which were previously rejected by the Government in its response, including banning pornography depicting incest and rape. We are encouraged by this commitment to take another look at the review and urge the Government to act swiftly on all of the recommendations set out in it.

Safety by Design

We are delighted to see numerous references to safety by design in the Government’s action plan (Volume 2 of the strategy), including references to working on what more Government can do to “encourage ‘safety by design’ of smart and connected technology to better protect victims and survivors and help stop perpetrators using this type of technology to further their abuse”. Furthermore, the strategy makes specific reference to utilising the Government’s Statement of Strategic Priorities (SSP) for online safety, which includes important points on safety by design and algorithmic transparency.

Safety by design, whilst a key tenet of the Online Safety Act, is not defined in the legislation. Although Ofcom’s more recent work does include express reference to safety by design, its interpretation of “safe by design” is limited mainly to a few ex-post measures. Ofcom has not taken a holistic approach to what “by design” means for the creation and operation of services, their systems and processes, or their business models, even where indicated by the risk register. We are therefore calling on the Government to amend the OSA to insert a definition of safety by design into the Act, making clear to Ofcom and services what Parliament intended, and to lay secondary legislation requiring Ofcom to produce a “safety by design” code of practice: a cross-cutting code to sit beneath the content-based codes.

AI and VAWG

On Artificial Intelligence (AI), the strategy restates previous commitments on deepfake abuse and AI-generated CSAM, aiming to ensure that these technologies cannot assist with VAWG offending. It says, “we are introducing new legislation to strengthen safeguards against AI-generated sexual abuse. The testing defence for non-consensual intimate images, child sexual abuse and extreme pornography will empower designated bodies – such as AI developers and child protection agencies – to carry out safe and secure testing of AI models. This will help to embed robust safeguards from the outset and drive continuous improvement.” This action is much-needed and vital, creating space and removing claimed obstacles to product testing.

However, we must not wait for more reports of new and developing AI technologies being used to perpetrate VAWG. We know what the risks are and what is at stake when technology providers ignore them. Whilst we welcome the commitment from Government to develop its “own testing on VAWG risks to improve our understanding of the capabilities and threats posed by AI”, we would argue that we already have enough information in this space to mandate that tech providers make their products safe by design, so that future action is not merely a reaction to abuse that has already occurred.

Furthermore, these commitments remain hollow whilst plans to introduce an AI Bill have stalled. We urge the Government to reconsider their previous promise to introduce legislation on this issue to ensure AI products are safe by design, supported by robust risk-based regulation and transparency. This includes particular action to bring AI chatbots fully into the scope of the OSA, given that there is already evidence to demonstrate how they have been used to both facilitate and directly perpetrate VAWG crimes.

Civil redress for survivors of Image-Based Abuse (IBA)

Whilst there have been positive developments in legislation in the last few years to tackle IBA, even where a conviction for IBA has been secured, survivors are often retraumatised by their intimate images remaining online. We therefore note with interest that the strategy commits the Government to exploring “routes to ensure that intimate images that are taken, created or shared without consent are removed online.” This is further detailed in their commitment to establish “a global mechanism to share and remove online intimate image abuse as well as agreeing a new Model National Framework for Adult Non-Consensual Intimate Image (NCII) sharing”.

As we highlighted in our response to the Women and Equalities Select Committee (WESC) on Non-Consensual Intimate Image Abuse (NCII), “while courts do have powers to make some orders to remove material, they are not comprehensive or well-known. They are also difficult to access as specific legal knowledge and advice is required which can be costly.” It is vital that survivors have access to civil redress which is comprehensive and survivor-centred. An example of this happening effectively is in British Columbia, which introduced a straightforward, online court regime to process claims and provide the orders. We therefore urge the Government to take steps to ensure that existing powers are used by police and courts and also to look closely at this regime.

Abuse of female politicians

The use of online abuse to censor and silence female politicians disrupts the democratic process and leads to women dropping out of political life. The strategy outlines action to “support a Foreign, Commonwealth and Development Office-funded Online Violence Alert System for women political actors, to detect and predict abuse and coordinate rapid responses.” This commitment is welcome if designed and implemented effectively, and we look forward to seeing the detail of how this would work in practice.

Alongside this, we urge the Government to consider the recommendation made by the Speaker’s Conference in its recent report: “noting that elections are high risk periods for abuse and given the significance and authority of codes of practice within the structures of the Online Safety Act, the Government should consider the merits of mandating Ofcom to produce an elections code of practice for social media platforms, and the feasibility of introducing this requirement as part of the Bill it has said it will bring forward during this Parliament on electoral reform.” This recommendation is based on work produced by Professor Lorna Woods and Maeve Walsh, and would ensure that service providers are compelled to act proactively so that female politicians are protected from abuse and have access to critical support.

Prevention of Online VAWG

The implementation of preventative measures to stop VAWG from occurring in the first place is very much welcome, and reflects the many calls from the VAWG sector for a proactive approach to tackling VAWG. This includes updates to the school curriculum to cover topics such as misogyny, pornography, healthy relationships and media literacy.

The strategy adds that the Government will “improve media literacy among adults in England and Wales, helping people to develop the digital skills they need to protect themselves online, and to tackle the prevalence of misogynistic content. We will work with Ofcom to embed their three-year Media Literacy Strategy which reflects this priority. So far, this has included Ofcom research exploring why harmful behaviour occurs and how individuals are drawn into communities promoting discriminatory views.” In our recent response to Ofcom’s media literacy consultation, we noted that its Media Literacy Strategy lacked ambition beyond platforms’ existing duties in the illegal harms codes and children’s code.

Furthermore, it remained unclear whether media literacy in this context was aimed at helping users better understand the online environment generally or at helping them navigate the specific platform they are using. Media literacy should go beyond providing better information about the design of the particular platform a user is on: it should deliver clear information about how users can navigate the broader online community and ecosystem in a safe and critically engaged manner, encourage better user behaviour, and consider pro-social design as part of a media literacy by design approach.

It is important that the Government focuses on improving media literacy amongst adults beyond platform-level interventions by engaging with communities and providing them with the critical skills they need to navigate the online world safely, treat other users with respect, and critically analyse false or harmful information.

Police response to Online VAWG

The strategy also proposes to invest almost £2m in “crack police squads” made up of “covert online investigators” to target internet abusers. The introduction of dedicated teams within police forces is a welcome measure; however, there is still a fundamental gap in knowledge within police forces when it comes to online VAWG. Without a proper understanding of the nature of online VAWG and of the rapid evolution of new technologies used to perpetrate it, such as AI, there remains a concern that this investment will not lead to results.

Facial recognition

We are particularly concerned by the strategy’s commitments to using facial recognition technology to identify and pursue perpetrators. Beyond the cross-cutting concerns about privacy and excessive state surveillance, many facial recognition systems have been trained on non-representative datasets that contain disproportionately more images of White individuals. This imbalance means the algorithms learn patterns that work best for lighter skin tones and perform worse on darker ones. Facial recognition technology is therefore prone to racial bias, and may lead to the over-criminalisation of Black and minoritised communities, particularly given the systemic racism in policing.

Metric for halving VAWG

The Government also sets out its metric for delivering on the mission to halve VAWG, which relies on a combined measure of domestic abuse, sexual assault and stalking recently developed by the Office for National Statistics (ONS). We are concerned that this statistic will not properly represent all forms of online VAWG, and will not capture the complexities of new and emerging forms of online abuse.

Next steps

We are pleased to see that this much-needed strategy delivers on many of the asks put forward by the VAWG sector and civil society, including prevention, victim support and a whole-society approach. We acknowledge the hard work undertaken by the Government to develop a comprehensive, cross-cutting strategy that faces the current epidemic of VAWG head on. However, it is hard to ignore how long this strategy took to arrive. Urgency of action is now key to its success, so more detail will need to follow quickly on how this raft of measures will be funded, delivered, and monitored for impact.