Social media and video sharing platforms targeted by ICO over children's privacy practices

Published on 17 October 2024

The question

What must social technology platforms be aware of to ensure they are following the ICO's codes of practice for children's online safety?

The key takeaway

As social media and video sharing services continue to evolve and become part of day-to-day life, children's access to these services grows. Online platform providers must ensure they design their services with mechanisms in place to protect children's privacy, or face regulatory scrutiny.

The background

The ICO's Children's Codes (the Codes) outline the standards that social technology platforms should meet to ensure their services safeguard children's personal information. The technologies within the scope of the Codes include social media platforms and those that enable the sharing of videos. The ICO emphasises that children's safety should be a primary consideration for platform operators in the development of online services. The accompanying Children's Codes Strategy (the Strategy) further details the ICO's focus on improving how platforms protect children's personal information online.

The development

An August 2024 update published by the ICO reveals varying levels of adherence to the Codes among providers of social media and video sharing platforms. The ICO confirmed that 11 of the 34 online platforms it reviewed to inform the update will face further questions from the regulator about their children's online privacy procedures. The regulator has announced it is prepared to take enforcement action against platforms that fall short of their legal obligations.

Back in April 2024, the ICO announced the Strategy, which pushed the following focus areas to the forefront of its children's privacy protections:

  (a) default privacy and geolocation settings;
  (b) the profiling of children for targeted advertising;
  (c) controlling the use of children's personal information in certain categories of machine learning and algorithms; and
  (d) the use of personal information of children under 13 years old.

The ICO's approach to its Strategy review was to observe the sign-up processes for children on 34 social media and video sharing platforms. This involved creating proxy accounts posing as children of various ages and using them to register with the different platforms. The ICO then examined the key settings and privacy information presented to children before they could interact with other users.

The concerns raised by the review have seen 11 unnamed social media and video sharing platforms face scrutiny over their practices and adherence to the Codes. These concerns include age and geolocation privacy issues, as well as further queries about compliance with the ICO's expectations on targeting advertising at children.

Interested stakeholders have been invited to provide views and evidence on Strategy focus areas (c) and (d), relating to how algorithms use children's personal information and to age assurance for identifying children under 13. This will inform further Strategy reviews of the Codes and help measure the regulator's success in guiding the market to structure these technologies in a way that protects privacy and personal information. The ICO has reaffirmed its aim of supporting opportunities for young people to explore and develop via online platforms, whilst obliging platform operators to improve safeguarding.

Why is this important?

The ICO is further strengthening its commitment to children's online safety, alerting social media and video sharing platforms that they must take responsibility for the use of their services by those under 18. Given the rapidly evolving social media sphere, children's privacy remains a prominent regulatory focus, and this looks set to continue. The ICO's Deputy Commissioner, Emily Keaney, has commented that online platforms "have a duty of care to children" and warns that poorly regulated services can increase the risk of harm to children from bullying and abuse.

Any practical tips?

The operators of social media and video sharing platforms should be alert to the ICO's increasing regulatory scrutiny. These organisations must regularly review and adapt their children's privacy practices to meet their regulatory requirements. Clearly, close adherence to the Codes can help achieve this.

Autumn 2024
