The ICO has taken enforcement action against an employer to prevent it from using biometric technology to monitor its staff. This is a useful reminder to all businesses using this type of technology to ensure that they have a lawful basis for processing such data and can justify why less intrusive data collection methods would not achieve the same objectives.

In this blog we summarise the ICO's enforcement notice and consider the steps that businesses should follow if they are considering using biometric technology such as fingerprint or iris scanners, and voice and facial recognition systems.

The monitoring

The Serco companies and associated leisure trusts each collected and processed data through a mixture of facial recognition and fingerprint scanning technologies. They used the data to monitor employee attendance and, in turn, to calculate payments due to their employees.

The law

The data that Serco and the other leisure trusts collected when using facial recognition and fingerprint scanning systems is known as "biometric data". This is a form of personal data resulting from "specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images…".

When biometric data is used for the purposes of identification, it is classified under data protection law as special category data. Special category data is given special protection because processing can give rise to significant risks to someone's fundamental rights and freedoms (e.g., freedom of expression and freedom from discrimination).

Therefore, in order to process biometric data lawfully, the controller must:

  1. Identify a lawful basis for processing from Article 6 UK GDPR (and, where legitimate interests is the basis, complete a legitimate interests assessment); and
  2. Identify a lawful basis for processing from Article 9 UK GDPR (which may include one of the conditions for processing set out in Schedule 1 of the Data Protection Act 2018).

Serco was purporting to rely on the contractual necessity and legitimate interests bases of Article 6, and the employment law necessity basis under Article 9 "on the basis that Serco needs to process attendance data to comply with a number of regulations, such as working time regulations, national living wage, right to work rules and tax/accounting regulations".

Serco also stated in its staff privacy notice that all employees were expected to use the technologies.

The enforcement action

While the ICO recognised that Serco gave reasons for using the technologies, the ICO found that Serco had not shown that their use was "necessary" to achieve its stated aims.

This is a requirement of relying upon the contractual necessity basis – Serco could not show that less intrusive means could achieve the same objectives. For example, the ICO references "radio-frequency identification cards or fobs, or manual sign-in and sign-out sheets" as alternatives Serco could have considered.

In relation to the reliance on legitimate interests, Serco could not show that the objectives could not be achieved by less intrusive means.

While Serco cited some alleged abuse of 'normal' time recording measures (e.g. manual written time sheets), the ICO said it had not evidenced abuse so widespread that it could only be combated by the use of biometric data processing. While it may have been beneficial to Serco to use biometric technology, it had not demonstrated that it was necessary to do so over and above other measures, such as disciplinary action.

On Article 9, Serco tried to base its processing on provisions of the Working Time Regulations 1998 and Employment Rights Act 1996 relating to the right to be paid. However, Serco failed to demonstrate how these laws required Serco to undertake the processing at issue.

Serco also failed to give its employees clear information about how they could object, and the ICO found there to be such an imbalance of power between the employees and Serco that, even if employees wanted to object, they may have felt unable to do so.

As well as ordering Serco to stop processing, the ICO also ordered it to destroy any biometric data it holds that was obtained through the use of the facial recognition and fingerprint scanning technologies.

ICO issues Biometric Guidance

The Serco enforcement notice was issued on the same day as the ICO's new guidance on the use of biometric recognition systems.

While some other Article 9 conditions may apply in particular circumstances, as the ICO's enforcement notice against Serco shows, the employment law necessity condition in paragraph 1 of Schedule 1 of the 2018 Act is unlikely to be available to most employers.

The guidance therefore states that explicit consent is likely to be the most appropriate condition when processing personal data via biometric systems. However, the guidance also recognises that it might be difficult to obtain in an employer-employee context as employees may feel pressurised to consent and may not be presented with a free choice.

The guidance also recommends using a Data Protection Impact Assessment when using any biometric recognition systems to identify the risks associated with processing this type of data and to evidence why other less intrusive measures are not suitable to achieve the organisation's aims.

Any individual whose data is collected via such technology has the standard rights that apply to all data subjects, including the rights of access, objection and erasure. The guidance considers how these rights might apply in practice to biometric data, which often consists of complex mathematical outputs in a machine-readable format and is, as such, less accessible to individuals requesting such data.

Using biometric technology

As the Serco case shows, it can be difficult to justify the use of biometric technology, even where it may make the lives of employees easier or have operational benefits.

There is also a risk that businesses assume that biometric technology solutions are lawful based on marketing materials issued by technology vendors or simply because the technology is available for them to use.

If you are considering using biometric technology, here are five steps to follow to help mitigate these risks:

  • Diligence: Carry out detailed diligence on the proposed technology to understand how it works and to identify why your organisation wishes to use it.
  • Legal basis: Identify a valid legal basis under both Article 6 and Article 9 for using the biometric technology.
  • Proportionality: Consider whether you can demonstrate that your proposed use is both proportionate and necessary for the purpose (including why less intrusive methods are not appropriate).
  • Consent: If no other legal basis is available and you are relying upon explicit consent, ensure that employees are given sufficient information on how the technology is to be used and have a free choice. This includes providing information on alternative processes that do not involve the use of biometric data (for example, a swipe card or access PIN).
  • Transparency: Ensure that you can comply with data subject rights, including providing information in your privacy notice.

The best way to identify a valid legal basis and document the risks and your proposed mitigations is to carry out a data protection impact assessment.

Should you wish to discuss any of the issues raised in this blog, please contact Martin Sloan, Rachel Lawson or your usual Brodies contact.

Contributors

Martin Sloan

Partner

Rachel Lawson

Associate