UK Information Commissioner's Office: Biometrics: data protection by design and default
With any new biometric technology, building public trust and confidence is essential to ensuring that its benefits can be realised. Where more sensitive categories of personal data are processed, for example via Live Facial Recognition (LFR) for the purposes of unique identification, the public must have confidence that its use is lawful, fair, transparent and meets the other standards set out in data protection law.
Biometric data extracted from a facial image can be used to uniquely identify an individual in a range of different contexts. It can also be used to estimate or infer other characteristics, such as their age, sex, gender or ethnicity. The UK courts, in the case of R (Bridges) v Chief Constable of South Wales Police and Others, have concluded that “like fingerprints and DNA” [a facial biometric template] is information of an “intrinsically private” character.[1] LFR in particular can collect this data without any direct engagement with the individual. This means it has greater potential to be used in a privacy-intrusive way.
The UK GDPR requires controllers to take a data protection by design and default approach to help them comply with the UK GDPR’s fundamental principles and requirements, and forms part of the focus on accountability. This approach is explored further below, and is particularly important in the context of LFR because many issues of fairness, necessity and proportionality need to be addressed during the planning and design stage of a system.
LFR: The Information Commissioner’s opinion
The Information Commissioner previously published an Opinion on the use of LFR in a law enforcement context. It concluded that data protection law sets high standards for the use of LFR to be lawful when used in public places. The Information Commissioner’s Office (ICO) has built on this work by assessing and investigating the use of LFR outside of law enforcement. This has covered controllers who are using the technology for a wider range of purposes and in many different settings.
This work has informed the ICO’s view on how LFR is typically used today, the interests and objectives of controllers, the issues raised by the public and wider society, and the key data protection considerations. The Information Commissioner has published a separate Opinion to explain how data protection law applies to this complex and novel type of data processing.
Key requirements under data protection law
Any use of personal data must be lawful, fair, necessary and proportionate. These are key requirements set by data protection law. Where the personal data in question is particularly sensitive, such as biometric data, there are stronger legal protections. Where the processing is automatic and individuals have little choice or control, the protections are stronger still. This means that when LFR is used in public places for the automatic and indiscriminate collection of biometric data, there is a high bar for its use to be lawful.
The Information Commissioner has identified a number of key data protection issues which can arise where LFR is used for the automatic collection of biometric data in public places. These have been identified through the ICO’s investigations, our work reviewing data protection impact assessments (DPIAs) and wider research. These issues include:
- the governance of LFR systems, including why and how they are used;
- the automatic collection of biometric data at speed and scale without clear justification, including the necessity and proportionality of the processing;
- a lack of choice and control for individuals;
- transparency and data subjects’ rights;
- the effectiveness and the statistical accuracy of LFR systems;
- the potential for bias and discrimination;
- the governance of watchlists and escalation processes;
- the processing of children’s and vulnerable adults’ data; and
- the potential for wider, unanticipated impacts for individuals and their communities.
For any use of biometric data, controllers must perform a DPIA for any processing that is likely to result in a high risk to individuals. This includes processing the broad spectrum of biometric characteristics that may be used to uniquely identify an individual, and monitoring publicly accessible places on a large scale.
Data protection by design and default
Data protection by design and default is about embedding data protection into everything controllers do, throughout all processing operations. They should therefore consider privacy and data protection when procuring, purchasing or developing any biometric systems. They should also ensure biometric products or services they adopt from vendors have been designed with appropriate data protection and privacy-preserving features built-in. Controllers, not technology vendors, are responsible for this under the law.
It is important that controllers do not deploy “off-the-shelf” solutions without adequate due diligence to understand the technical processing and its privacy implications. Controllers should also consider whether a biometric system, such as LFR, is designed with data subjects’ rights in mind. For example, they should have the capability to isolate and extract personal data in response to a subject access request, to requests exercising other individual rights, or for disclosures to authorised third parties, unless valid exemptions apply.
In the specific context of LFR, it is especially important that controllers set out clear processes and policies governing its use, including:
- the circumstances in which the controller may activate the LFR system;
- clear criteria and governance for any watchlists;
- well-defined procedures for intervention in the event of a match and clear escalation measures;
- how data subjects are informed, how controllers will handle complaints, and how they will fulfil the public’s data protection rights; and
- processes to continually monitor the impact of the LFR system and assess whether it continues to be fair, necessary and proportionate.
The Information Commissioner recommends integrating data protection considerations at the very start of any biometric processing and documenting any decisions made in a DPIA. This will help controllers comply with their legal obligations under data protection law, and enable them to adopt the data protection by design and default approach.
[1] R (Bridges) v Chief Constable of South Wales Police and Others [2019] EWHC 2341, paragraph 59
UK Information Commissioner's Office
Steven Wright
Member of the Future Direction Group, Biometrics Institute
Joined in 2020