Response to Ofcom's call for evidence on age assurance and app stores

We welcome Ofcom’s call for evidence to inform its two forthcoming statutory reports and the clarity of the questions in the call. We note that the report on age assurance will also be informed by “a comprehensive programme of research” and information sought from providers via your information-gathering powers. This will be one of the first reports assessing the effectiveness of OSA implementation and we are reassured that Ofcom is approaching this in such a comprehensive way.

We also welcome the clarity set out in the consultation document about the scope of the app store report. As Ofcom will be aware, this statutory report was added to the Online Safety Act as a concession by the previous Government in response to pressure in the House of Lords for app stores to be included in the scope of the legislation, specifically with regard to the risk of harms to children from accessing age-inappropriate content via such stores. It is reassuring to note that Ofcom’s report for the Secretary of State will include “app store design, governance processes and safeguarding measures (such as app review processes)” and how they “may shape children’s online experiences, and the effectiveness of existing protective measures, such as age assurance, age ratings and parental controls at protecting children from harmful content.”

We do not have evidence to share in response to the specific questions, but refer Ofcom to the submissions from Reset and the Internet Watch Foundation, which we support. However, we would like to put forward a few points and observations for consideration as Ofcom progresses this work.

Framing of the call for evidence

While there is undoubtedly much common ground between the two statutory reports, and administrative efficiencies for Ofcom’s teams in merging the call for evidence for both, they are distinctly different products with very different statutory underpinnings. The report on the implementation and effectiveness of the roll-out of age assurance under the Online Safety Act focuses on the services already regulated by the Act: namely user-to-user services, with social media platforms the most prominent. There was much noise and furore when these duties came into force at the end of July this year - some of it likely fuelled by services which, whether by accident or design, implemented their new obligations badly. The high compliance rate and relative success of the roll-out of age assurance by pornography providers demonstrated that services which prepared appropriately and engaged with Ofcom in good time before the deadline were able to introduce age checks smoothly and seamlessly, without much public or media pushback. By contrast, the user-to-user services which either - by accident - failed to prepare properly ahead of the compliance date, or - by design - were content with shoddy implementation because they could deflect their users’ irritation and anger onto the regulator or the Government, remain responsible for those outcomes. We trust that this distinction will be under close scrutiny when Ofcom studies the information it receives from industry or via its information-gathering powers, and we look forward to seeing it reflected in the analysis in the final report.

This particularly matters when the call for evidence on the effectiveness of the roll-out of age assurance by user-to-user services is combined with the call for evidence to inform recommendations to the Government on the merits of age assurance for app stores. We do not object to age assurance at the app store level as an important safety measure to protect children from unsafe or inappropriate apps which are not in scope of the OSA, including gaming or other interactive apps which do not have user-to-user functionality. We would see this as an additional protection for children and an important levelling of the regulatory playing field in relation to children’s safety, bringing in device-level/operating-system checks to further protect child users from harm in addition to the requirements for age assurance on user-to-user services.

However, Ofcom will be aware that there is a significant push by social media platforms - most prominently Meta - to shift all responsibility for age checks to protect children onto operating system providers. This has been seen in the US, in Europe and most recently in Canada, backed in each of these jurisdictions by extensive, well-funded industry lobbying. The rationale behind this is absolutely clear: to absolve Meta and its platforms from having to take responsibility for keeping children safe, pushing this instead onto Apple, Google and Microsoft at the operating system level. Meta and its supporters are not arguing for this to level the playing field or to provide a “belt and braces” approach to child protection. Instead, they do not want to see any further laws introduced requiring them to carry out age verification on their users, and they will use this lobbying push to undermine the existing laws, such as the OSA, with which they currently have to comply.

We are concerned, therefore, that in framing this consultation as a “two-for-one” ask of industry - to provide evidence on the effectiveness of age verification under the OSA and to also provide evidence for introducing that at the app store level - Ofcom has left itself exposed to the risk of significant bad-faith industry lobbying and the presentation of evidence that will skew the consideration of the distinct and separate issues covered by these two statutory reports.

We also note Ofcom’s focus on age assurance. While this is understandable in context, we would suggest that there are other mechanisms Ofcom might consider in addition - for example, age ratings - and that Ofcom’s own research on audience protection measures might also provide valuable insight, even though the sectors are different. In general, in so far as similar protection measures are used across service sectors, there is an argument for using similar baseline information tools. For example, age rating scales and logos should be consistent across services: they can be accessed on the same device, and it would be less confusing for users if the standards were the same.

OS-level checks

On the specifics of OS-level age checks - where these are considered as an alternative to, rather than an addition to, platform-level age checks - we would also raise the following points for Ofcom’s consideration.

As the communications regulator, Ofcom has a duty under the Communications Act “to further the interests of consumers in relevant markets, where appropriate by promoting competition”. It is also required under the Government’s growth duty to “have regard to the desirability of promoting economic growth”. The UK safety tech industry is a recent economic success story, with many of the digital ID companies currently providing age checks under the OSA a key part of that success - driving innovation, providing consumer choice and offering a diversity of privacy-preserving options for users. Any regulatory decision that shifts online age verification solely to monopolistic, global tech firms will therefore undermine both Ofcom’s duty to promote competition in consumer markets and its duty to promote growth. There would also be significant risks to user privacy and data protection (including identity theft, cross-app tracking and profiling, and child surveillance) and a significant impact on Ofcom’s (and the Government’s) democratic and regulatory oversight and the effectiveness of its transparency levers.

As we state above, we are not against this as a route to reduce harm to children from unregulated apps and profit-driven developers and to ensure that there is a consistent standard of safety across app stores. But we do not believe that this should be an “either/or” choice: in the context of global industry lobbying for OS-level checks to absolve social media platforms from child safety responsibilities, we are concerned that Ofcom’s call for evidence suggests just that.