Collaboration is a core value of our firm’s client service. Every day, our lawyers with in-depth experience in different practice areas work together to find joined-up, multi-faceted solutions to the legal issues facing our clients. This is particularly so in the field of online safety, where several legal regimes overlap. We have already discussed how the application of the new Online Safety Act (“OSA”) will need to align with the UK competition rules (here). On 1 May 2024, Ofcom and the Information Commissioner’s Office (“ICO”) issued a joint statement (here) acknowledging that some of the OSA provisions also overlap with UK data protection rules and indicating how the two regulators intend to collaborate, including by exchanging information to tackle issues of common interest in a coordinated and effective manner. While collaboration in enforcement is welcome – and it makes just as much sense in compliance – it can create some unique challenges. Data protection and online safety are no exception to this rule.

Collaboration themes

The joint statement lists the examples set out below as issues of ‘common interest’ to Ofcom and the ICO. The regulation of these topics falls at the intersection of the OSA, the UK GDPR and the Data Protection Act 2018, as well as the ICO’s and Ofcom’s relevant codes of practice. The issues of common interest include:

  • age assurance;
  • recommender/ranking systems;
  • proactive tech and relevant AI tools;
  • default settings and geolocation settings for child users;
  • online safety privacy duties; and
  • upholding terms, policies and community standards.

Age assurance is currently a particular area of focus for Ofcom and the ICO. Ofcom has recently opened an investigation into OnlyFans’ age verification measures, and the ICO announced earlier this year that it had conducted voluntary audits relating to age assurance measures as part of the two regulators’ work to align on age assurance requirements. As part of its consultation on protecting children from harms online, Ofcom has issued a series of draft codes (here) which set out numerous safety measures for online service providers, including in connection with ‘robust age checks’.

In practice, Ofcom’s draft codes would require online services that do not ban harmful content to introduce highly effective age checks, either to prevent children from accessing the entire site or app or to restrict younger users’ access to parts of the service. On the face of it, this sits somewhat uneasily with the ICO’s call, in its guidance on ‘Age assurance for the Children’s code’ (here), for services to collect only an amount of information about a user that is proportionate to the risk the service poses. Naturally, ‘robust age checks’ will require a larger volume of information, especially when profiling or biometric technology is deployed. However, both the ICO and Ofcom agree that service providers must first understand whether their services could be accessed by children and, if so, what level of risk they pose, before turning to the question of which age assurance measures might be the most appropriate.

In practice, both regulators currently appear willing to support a risk-based approach which focuses on the level of risk posed to children by the content that is made accessible to them via the service. Both the ICO and Ofcom support the use of robust age checks when the content of the service could be harmful to children, with the ICO calling for the ‘highest possible’ certainty on the age of users and Ofcom calling for ‘highly effective’ age assurance. Further developments are expected in this area, especially as Ofcom finalises its guidance on protecting children from online harms. 

Companies of mutual interest

The joint statement also indicates that Ofcom and the ICO will draw up a list of companies or services which are subject to both the online safety and data protection regimes and are therefore of current regulatory interest to both regulators. The joint statement is vague here: it appears the list could be extensive, with no additional qualitative or quantitative criteria to narrow it. Quite apart from issues of legal certainty, such an open-ended approach would be challenging for businesses from a practical compliance perspective.

Information exchange

The joint statement also confirms that the ICO and Ofcom will share certain specified information on a routine basis in connection with companies of common interest, including:

  • dates of requests for information sent to, or meetings with, companies of mutual interest, the expected response time/deadline and the company names – but not the content of the requests or meetings;
  • publicly available information which may be of interest to one another, for example, information about a company that is currently of mutual interest that has been reported in the press; and
  • current intelligence work without specifying the name of the companies, for example, ‘Ofcom’s supervision team are currently looking into recommender systems and how services reduce the risk of recommending harmful content to children’ or ‘In the next quarter, the ICO plans to look at data protection issues arising from use of automated content moderation across a number of social media platforms.’

All other information may be shared between Ofcom and the ICO on a case-by-case basis only, subject to the following statutory safeguards:

  • for Ofcom – section 393 of the Communications Act 2003 prohibits the disclosure of certain information about a business without consent unless a statutory gateway applies. One such gateway is where disclosure is made for the purpose of facilitating the carrying out by Ofcom of any of its functions.
  • for the ICO – section 132 of the Data Protection Act 2018 prohibits the disclosure of confidential information about an individual or business unless the disclosure is made with lawful authority. Lawful authority is defined in section 132(2) and includes where the disclosure is made for the purposes of, and is necessary for, the discharge of one or more of the ICO’s functions, and where the disclosure is made in the public interest having regard to the rights, freedoms and legitimate interests of any person (it is envisaged that, in many cases, the sharing of information with Ofcom for the purposes of discharging its functions is likely to be in the public interest). There are other grounds in section 132(2) which may be applicable in a particular case.

There are obviously some legal issues that may affect how much cooperation is possible – the sharing of information obtained through formal investigations will be limited by the legal framework unless the parties waive confidentiality, as often happens in cross-border merger control. Parties may find it advantageous to waive confidentiality in exchange for a coordinated regulatory approach, particularly when trying to close multiple investigations into the same conduct by way of a common set of remedies. These legal issues will become more sensitive the more closely the regulators work together on individual cases. It is reasonable to expect that Ofcom and the ICO, as well as the other members of the UK Digital Regulation Cooperation Forum (which also includes the CMA and FCA), will need to keep the existing joint statements and other memoranda of understanding between regulators under review to reflect best practices as they emerge on a case-by-case basis.

Disclaimer: While every effort has been made to ensure that the information contained in this article is accurate, neither its authors nor Squire Patton Boggs accepts responsibility for any errors or omissions. The content of this article is for general information only, and is not intended to constitute or be relied upon as legal advice.