ICO issues preliminary enforcement notice against Snap for its “My AI” Chatbot
The question
How can organisations wishing to join the world of generative AI ensure that they adequately assess the risks from the perspective of the Information Commissioner’s Office (ICO)?
The key takeaway
Organisations should ensure that the risk assessment they conduct before implementing generative AI technologies adequately addresses both the benefits of those technologies and the risks they pose to data subjects, especially where some of those data subjects are children.
The background
On 6 October 2023, the ICO announced that it had issued a preliminary enforcement notice against Snap Inc. and Snap Group Limited (Snap), alleging that they had failed to adequately assess the risks associated with Snap’s rollout of its AI-powered chatbot “My AI”.
When it launched on 27 February 2023, “My AI” was the first generative AI technology to be built into a messaging platform in the UK. It was initially available to Snapchat+ subscribers as a feature of the Snapchat app, before being rolled out to all Snapchat users on 19 April 2023.
The tool, powered by OpenAI’s GPT technology, is a chatbot that Snapchat users can ask questions such as what gift to buy, which hiking trip to go on at the weekend, or what to make for dinner. The more a user interacts with “My AI”, the more personalised it becomes, learning about the user over time and making the experience feel like chatting with a friend.
The development
Following its investigation, the ICO provisionally found that the risk assessment Snap carried out before rolling out the “My AI” feature did not sufficiently evaluate the risks associated with implementing generative AI technology, particularly given that Snap would be using the technology to process the personal data of children aged 13 to 17. The ICO’s findings are provisional, and Snap will have an opportunity to respond to the ICO before any final decision is taken or enforcement action imposed. The preliminary enforcement notice outlines the steps Snap could take to address the ICO’s concerns. If the ICO ultimately issues a final enforcement notice, Snap could be prevented from processing UK users’ personal data for the purposes of the “My AI” feature.
Why is this important?
The preliminary notice issued by the ICO underlines the importance of organisations conducting thorough risk assessments before launching a product which incorporates new, innovative technologies. These risk assessments should analyse both the benefits and the risks that new technologies may pose to every category of data subject concerned. In its “Generative AI: eight questions that developers and users need to ask” (see here), the ICO emphasises that conducting an adequate Data Protection Impact Assessment (DPIA), and keeping it updated as the processing of personal data evolves, will help organisations assess and mitigate data protection risks before they start processing personal data.
Any practical tips?
The preliminary enforcement notice issued by the ICO is an important reminder that organisations need to be alive to the privacy risks that implementing generative AI technologies poses to their data subjects. As a starting point, any organisation considering developing or implementing generative AI technology should consult:
- the ICO’s “Generative AI: eight questions that developers and users need to ask” (see here)
- the ICO’s updated guidance on “AI and data protection” (see our Summer 2023 Snapshot article here)
- the ICO’s guidance on “Data protection by design and by default” (see our Summer 2023 Snapshot article here).
Winter 2023