Facial Recognition Technology: UK Regulator Reprimands School for Failure to Conduct Data Protection Impact Assessment
The UK data protection regulator, the Information Commissioner’s Office (“ICO”), has issued a formal reprimand to an academy school in Essex for failing to conduct a Data Protection Impact Assessment (DPIA) prior to introducing facial recognition technology in respect of its cashless catering system for pupils.
The decision, which can be found on the ICO’s website, serves as an important reminder of the legal requirement to conduct a DPIA prior to introducing new technological solutions for vulnerable data subjects, such as children.
When is a DPIA required?
DPIAs are a fundamental, but often overlooked, part of the accountability obligations for all data controllers under the UK GDPR. They are an important legal requirement designed to identify, assess, and mitigate privacy risks at an early stage and avoid non-compliance. Failure to carry out a DPIA when legally required to do so may leave organisations vulnerable to enforcement action from the ICO and fines of up to £8.7 million, or 2% of global annual turnover if higher.
Article 35(1) of the UK GDPR states that a controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data, where this processing is likely to result in a high risk to the rights and freedoms of natural persons.
The ICO has published a list of processing activities that require a DPIA to be undertaken. This list specifically includes the use of biometric data in relation to children’s personal data.
What went wrong?
In this case, the school had not conducted a DPIA prior to introducing facial recognition technology for its catering provision to 1,200 students aged between 11 and 18. The school did not seek appropriate advice from its own data protection officer (“DPO”) prior to implementation. Nor did it conduct any consultation with parents or pupils before introducing the technology.
Instead, for approximately six months after introducing the system, the school had effectively relied on “assumed consent” to operate it. The only exception was where parents or carers (but not pupils themselves) had exercised the opportunity to opt out. The ICO pointed out that, in any case, “assumed consent” would not have provided a valid legal basis under the UK GDPR for processing biometric data, as valid “consent” (to the UK GDPR standard) requires a clear affirmative action. The ICO was also critical of the fact that the parental opt-out deprived children who were old enough to make their own decisions about the use of their data of the opportunity to do so.
In deciding to limit the regulatory action taken to a formal reprimand rather than a fine, the ICO recognised that the school had taken steps to mitigate its failure to conduct an initial DPIA by refreshing consents and obtaining specific affirmative opt-ins from pupils using the system. The school had also conducted a comprehensive DPIA some months after the event. Nonetheless, the decision should serve as a timely reminder for schools to conduct and keep on record a written DPIA, with the advice of their DPO, prior to implementing new projects using biometric data (or any other processing that may have an impact on the data privacy rights of students).
For more information on DPIAs or any other data privacy issue, please contact James Quartermaine, Legal Director, or another member of our data privacy team.
James Quartermaine
James Quartermaine is a legal director in our data privacy team, advising clients on a wide range of privacy and data protection issues.
- Legal Director
- T: +44 (0)20 3750 2494
The articles published on this website, current at the date of publication, are for reference purposes only. They do not constitute legal advice and should not be relied upon as such. Specific legal advice about your own circumstances should always be sought separately before taking any action.