ICO publishes guidance on content moderation

Published on 01 August 2024

The question

What steps should businesses operating content moderation systems in the UK be taking to comply with new guidance published by the UK Information Commissioner’s Office (ICO)?

The key takeaway

Following guidance published by the ICO in February 2024, businesses engaged in the moderation of online content should review their systems’ design and processes to ensure they meet data protection requirements around data minimisation, transparency and fairness. The guidance outlines the ICO’s specific expectations as to how these systems should be structured, safeguarded and documented. Particular care is needed where decisions are made solely by automated means, as users must be provided with information about the decision, the logic involved and its potential consequences.

The background

The UK’s Online Safety Act 2023 (OSA) requires digital platforms to implement processes to remove illegal content quickly and prevent children from accessing harmful content. In practice, this generally requires the moderation of large volumes of user-generated content, which will often involve the processing of considerable amounts of personal data, not least as this content will often include contextual information that identifies its creator.

To assist organisations navigating these requirements, the ICO collaborated with Ofcom to publish guidance outlining the impact the moderation of online content can have on information rights and how this will engage UK data protection laws.

The development

The ICO’s guidance defines content moderation in two parts: first, the analysis of user-generated content to evaluate whether it meets certain standards; and second, any action an organisation takes as a result of that analysis, such as removing the content or banning a user from the service.

The guidance explains that moderation of online content will involve the processing of personal data where the content makes users identifiable; in practice, this means data about a user or data connected to the user account that uploaded the content. These services may also process contextual information such as age, location and previous user activity to aid their decisions, as the OSA requires consideration of such “reasonably available” information. This contextual information may also constitute personal data about the users. As a result, the operators of the platform will need to comply with applicable data protection laws, including the Data Protection Act 2018 and the EU General Data Protection Regulation as it forms part of retained EU law in the UK (the UK GDPR).

Among the requirements of data protection laws is the identification of a lawful basis for the processing under Article 6 UK GDPR. In the guidance, the ICO identifies the most relevant potential bases as either compliance with a legal obligation (which may in this case include the platform’s compliance with the OSA) or legitimate interests. Organisations will also be required to conduct a Data Protection Impact Assessment (DPIA) in respect of this processing, as the guidance states that the ICO would generally consider this form of content moderation to present a high risk to the rights of individuals.

The ICO also reiterates that organisations will need to comply with the requirements of data protection laws more generally, and provides specific guidance on certain of the key data protection principles set out in Article 5 UK GDPR, namely:

  • organisations will need to design content moderation systems with fairness in mind, to ensure that the system produces consistent outputs and avoids bias or discrimination
  • the principle of transparency will also be important in the design of the system. Users will need to be informed upfront, in the Terms of Service, of any content identification technology used. More broadly, organisations must provide users with clear and accessible information about the processing of their personal data, and
  • the principle of data minimisation requires that the processing of the contextual information described above is limited through safeguards in the moderation process, with that information accessed only where reviewing the content alone is insufficient for moderators to reach a decision.

The guidance reiterates data subjects’ right to request access to their personal data processed in content moderation, meaning that systems must be designed so that moderation information can easily be retrieved on a per-user basis. Users must also be given details of any automated decision-making involving the processing of their personal data, as these technologies are widely used in content moderation. Importantly, the guidance also highlights that the use of these technologies will be subject to much more stringent legal requirements under Article 22 UK GDPR where any decision to exclude users from the platform is made solely on the basis of such automation, without a human in the loop.

Why is this important?

This guidance is essential for businesses grappling with the intersection between the monitoring requirements of the Online Safety Act and compliance with applicable data protection laws. By applying the guidance, businesses will be better equipped to design and operate moderation systems that are compatible with users’ information rights. This is particularly relevant when navigating the areas of AI and automated decision-making in this context.

The guidance also clarifies that DPIAs are generally required when processing personal data for the purposes of content moderation, so organisations can plan and allocate compliance resources accordingly.

Any practical tips?

Content moderation systems are clearly in the ICO’s sights. Businesses operating these systems should consider:

  • completing a DPIA in respect of any personal data processed in content moderation on their platform, and taking any remediation actions that arise as a result
  • ensuring that their privacy notices and other user-facing documentation inform users of any content identification technology used and how this will affect the processing of personal data
  • reviewing whether the personal data processed is easily accessible on a per-user basis in the event of a data subject access request, and whether adequate safeguards are in place to limit the use of contextual information containing personal data to what is strictly necessary
  • conducting regular audits to check that any decisions made based on the processing of personal data are consistent and fair.

Summer 2024
