The UK’s supervisory authority, the Information Commissioner’s Office (ICO), announced on 23 July 2024 that it had issued a reprimand to a school in Essex for the unlawful use of facial recognition technology, acting under its powers pursuant to Art. 58 para. 2 lit. b UK GDPR.
What happened?
In March 2023, the school began using facial recognition technology in the school canteen to enable students to make cashless payments.
Facial recognition technology (FRT) processes students’ biometric data for the purpose of unique identification. The technology was intended to replace the – likewise biometric – method of fingerprint scanning, which had been in use at the school since 2016.
The school’s data protection officer was not initially involved in the introduction of the technology. In March 2023, a letter was sent to the students’ parents, which they were asked to return only if they did not consent to their child’s participation in facial recognition. It was not until November 2023 that consent was obtained through an affirmative action. At this point the data protection officer was also involved, carrying out a data protection impact assessment (DPIA) and submitting it to the data protection authority.
The ICO’s assessment
The ICO identified high data protection risks in this case. The high school, which is attended by around 1,200 students aged 11 to 18, had failed to carry out the required DPIA prior to deployment; there had been no prior assessment of the risks to minors’ data. For lawful and responsible use of facial recognition technology, a DPIA within the meaning of Art. 35 para. 1 UK GDPR would have had to be carried out before the technology was implemented.
Like other data protection authorities, the ICO has published a list and guidance, in accordance with Art. 35 para. 4 UK GDPR, on the situations in which a DPIA must be carried out. That list covers, among other things, the processing of data of particularly vulnerable groups – here, minors – as well as the use of biometric data.
In addition to the lack of a DPIA, the school also failed to obtain valid and unambiguous consent for the processing of the data subjects’ biometric data. The students therefore had no genuine opportunity to decide whether or not their biometric data could be used in this way. Neither the UK GDPR nor the EU GDPR accepts a mere opt-out – here, parents returning the letter to object – as explicit consent. Until affirmative consent was obtained in November 2023, there was therefore no legal basis for the processing.
The ICO also points out that a large number of the students were old enough to give their own consent to the data processing. The opt-out procedure, directed at the parents, deprived these students of the opportunity to exercise their rights and freedoms.
Lynne Currie, Head of Privacy Innovation at the ICO, is quoted giving the following reasons for the move:
“Handling people’s information correctly in a school canteen environment is as important as the handling of the food itself. We expect all organisations to carry out the necessary assessments when deploying a new technology to mitigate any data protection risks and ensure their compliance with data protection laws.
We’ve taken action against this school to show introducing measures such as FRT should not be taken lightly, particularly when it involves children.
We don’t want this to deter other schools from embracing new technologies. But this must be done correctly with data protection at the forefront, championing trust, protecting children’s privacy and safeguarding their rights.”
In the press release, the ICO further quotes Lynne Currie as follows:
“A DPIA is required by law – it’s not a tick-box exercise. It’s a vital tool that protects the rights of users, provides accountability and encourages organisations to think about data protection at the start of a project.”
The full text of the reprimand, including the ICO’s further recommendations to the high school, can be viewed on the ICO website.
Although it recognised that the school ultimately carried out a data protection impact assessment – albeit belatedly – the ICO sees further room for improvement. The necessity and proportionality assessment should also take into account the risks of bias and discrimination arising from the system used. In this regard, the ICO refers to its own published case study on the requirements for facial recognition systems in the education sector.
Conclusion
Data controllers should bear in mind that carrying out a data protection impact assessment for high-risk processing operations is required by law in order to identify the risks and address them with appropriate measures.
This case combined several high-risk factors: the use of new technologies, the processing of biometric data as special category data, and minors as data subjects.
Early involvement of the data protection officer would have been necessary here and would have helped with the identification and evaluation of the risks and the selection of suitable measures to reduce them. Had the ICO’s published guidance been followed, the consent of the data subjects would likely have been legally effective from the outset.
This particular high school is not the first school in the UK to introduce facial recognition technology in its canteens. Back in 2021, there were media reports that the technology was being used in Scottish schools; following strong criticism from data protection organisations and the ICO, ten schools deactivated the facial recognition systems they had been using. The organisation Defend Digital Me has also published a list of demands for the protection of children’s data in the education sector, which includes a prohibition on the collection of biometric data. With regard to minors, this technology should therefore only be used after careful consideration and always on the basis of a necessity and proportionality assessment.