ICO forces Serco Leisure to stop using facial recognition technology for employees

Published on 01 August 2024

The question

In what circumstances can facial recognition technology (FRT) be acceptable to monitor employees in the workplace?

The key takeaway

Using FRT to monitor employees for attendance purposes is highly unlikely to comply with the strict data protection rules governing the processing of special category biometric data, especially where alternative processes and technologies (such as key fobs and radio-frequency identification) can achieve the same purpose.

The background

On 19 February 2024, the UK Information Commissioner’s Office (ICO) issued a series of enforcement notices against Serco Leisure, Serco Jersey, and seven other associated community leisure trusts (together, Serco). The notices ordered Serco to stop monitoring its employees through FRT and fingerprint scanning across 38 leisure facilities.

The ICO’s investigation into Serco’s use of biometric data was prompted by an employee’s complaint in 2019 that the leisure company was using FRT in the workplace. This use of FRT involves the collection and use of biometric data, which is a special category of personal data under the UK GDPR and is therefore subject to stricter rules on its processing.

Serco argued that biometric technology was the “sole technology capable of eliminating buddy punching and falsified timecards”, making it “more accurate and secure than cards or keys, because a fingerprint or face scan cannot be lost, stolen or easily replicated”. On this basis, it considered it had a lawful basis for processing this data under Article 6(1)(b) (contractual necessity) and Article 6(1)(f) (legitimate interests) of the UK GDPR. Serco also argued that it satisfied a condition for processing special category data under Article 9(2)(b) (employment law obligations), as the processing was necessary to comply with employment regulations by monitoring working hours, calculating the national living wage, paying its employees correctly for the time worked, and ensuring that it could comply with right to work, tax, and accounting obligations.

Serco claimed that employees concerned about the use of biometric technology could opt out and be placed on an alternative attendance system. However, the ICO found that no such “opt-out” policy was in place until April 2021. In fact, when an employee complained to Serco about the use of FRT, they were directed to a representative of ShopWorks (the company which supplied Serco with the biometric recognition technology) to ease their privacy concerns. The employee was nevertheless advised that they would be required to use the system upon their return to work, so in practice they were not given the opportunity to object to this processing.

The development

The ICO held that Serco, as the data controller, had failed to establish a lawful basis for processing personal data and special category biometric data. It found that less intrusive methods of recording employee attendance than biometric data, such as key fobs and radio-frequency identification cards, could have been used. When presented with these alternatives, Serco could not justify why they were unsuitable or why they would lead to widespread abuse.

The ICO held that legitimate interests cannot be relied on as a basis for processing personal data where the controller could reasonably achieve “the same result in a less intrusive way”. There was no evidence that Serco had considered less intrusive alternatives, such as taking disciplinary action against individuals who had abused previous attendance recording methods. The ICO also noted that whilst “necessity” does not mean the processing must be “absolutely essential”, it must nevertheless be “more than just useful”. In this instance, the use of biometric technology was not held to be directly necessary to pay employees appropriately, nor was it proportionate, given that only a minority of employees had previously abused attendance recording systems.

Serco had argued that its processing of special category personal data was permitted because it was required by law, citing regulation 9 of the Working Time Regulations 1998, which requires employers to maintain adequate timekeeping records, and s.13 Employment Rights Act 1996, under which workers have the right not to suffer unauthorised deductions from their wages. However, the ICO held that Serco had not in fact relied on these laws when processing the biometric data and had not referred to them during the ICO’s investigation. This argument therefore failed, and Serco was held to have no valid condition for processing special category data under Article 9 UK GDPR.

The ICO held the infringement to be particularly serious as the processing of the personal data had taken place since May 2017 and was estimated to involve at least 2,283 data subjects. The processing was considered by the ICO to be “highly intrusive” and could potentially have caused distress to the data subjects concerned, especially given the imbalance of power inherent in the employment relationship. This was compounded by the lack of an alternative mechanism and the fear that objecting could result in disciplinary action.

As well as immediately stopping biometric data processing for the purposes of attendance monitoring, Serco was required to destroy, within three months, all biometric data that it is not legally required to retain. If Serco fails to comply with the enforcement notices, the ICO may serve a penalty notice requiring the payment of up to £17,500,000 or 4% of the organisation’s worldwide annual turnover, whichever is higher.

Why is this important?

Organisations must be mindful of what special category personal data they expect to process. In an age of fast-developing technology, biometric technology may seem appealing as a next-generation security measure, but it comes with many legal and regulatory strings attached.

Any practical tips?

Businesses must be able to demonstrate a real necessity for FRT processing, where no suitable alternative could achieve the same purpose. Furthermore, policies surrounding the use of such technology must be made clear to all relevant parties, and “opt-out” options must be clear and accessible to data subjects.

Summer 2024
