You are walking along a side street towards your office. Unbeknownst to you, a private company has installed closed-circuit television with facial recognition capabilities and is tracking your movements. Is this lawful?
On 15 August 2019, the Information Commissioner, Elizabeth Denham, released a statement which may address this very issue.
Ms Denham advised that she would be investigating the use of facial recognition technology in King’s Cross, London, by a privately-owned development. Ms Denham stated that she is “deeply concerned about the growing use of facial recognition in public spaces, not only by law enforcement agencies but also increasingly by the private sector.”
The investigation follows a previous blog post by Ms Denham in July, in which she stated that live facial recognition software was a high-priority area for the ICO and that her office was investigating and monitoring trials by South Wales Police and the Metropolitan Police.
Live facial recognition technology is different to CCTV monitoring. By using biometrics (certain physical and physiological features), the technology can map facial features to identify particular individuals by matching these with a database of known faces. This technology has been in use for some years by certain public and government agencies but with the advent of AI and machine learning, it has become more prevalent in the private sector.
Facial Recognition Concerns
Whilst the privacy legal framework for law enforcement is different to that for private companies, the privacy concerns about the use of facial recognition software in public spaces remain the same.
Some threats to privacy include:
Processing of Personal Data by Private Companies
Organisations which process personal data within the UK must comply with the General Data Protection Regulation (GDPR) and the Data Protection Act 2018.
Processing of personal data may only be undertaken if one of the grounds under Article 6 of the GDPR applies. Further, where the processing involves special category data, which includes biometric data, a further justification must be found within Article 9.
Article 6(1) lists the lawful bases for processing. In the context of video surveillance, the applicable bases include:
- the consent of the individuals concerned (Article 6(1)(a)). For consent to be valid under the GDPR, it must be freely given, specific, informed and unambiguous, and given prior to the processing.
- processing necessary for the legitimate interests pursued by the controller (Article 6(1)(f)). Processing for the purpose of the ‘legitimate interests of a controller’ will be lawful unless such interests are overridden by the fundamental rights and freedoms of an individual.
Facial Recognition may be processing Special Category Data
If the processing of personal data involves special category data, then in addition to identifying a lawful basis under Article 6, an exemption must also be found in Article 9 to justify such processing.
Facial recognition technology will collect biometric data, a type of special category data, if it is capable of uniquely identifying an individual. Biometric data relates to the physical, physiological or behavioural characteristics of a person.
Therefore, if facial recognition technology is used to identify a particular individual, as opposed to a category of persons (such as the profiling of customers by race, gender or age), this will constitute processing of biometric data.
Article 9(2) of the GDPR lists a limited number of exemptions which may justify processing special category data. These grounds include:

- the explicit consent of the data subject;
- the vital interests of the data subject (such as an immediate medical emergency);
- processing necessary for the establishment, exercise or defence of legal claims;
- processing relating to personal data already made public by the individual;
- substantial public interest;
- various medical and public health reasons; and
- scientific research or statistical purposes.
The European Data Protection Board (EDPB) recently issued draft guidelines for public consultation, Guidelines 3/2019 on processing of personal data through video devices. The draft guidelines specifically address the use of facial recognition technology.
The EDPB is an independent European body established under the GDPR and publishes guidance on the application of European data protection laws.
The draft guidelines make some interesting observations about both the use of CCTV and facial recognition:
If the conclusions of the ICO’s investigation into the King’s Cross matter reflect the views of the EDPB (that explicit consent is likely to be required for facial recognition), the flow-on effects for companies could be widespread. Certainly, companies which use facial recognition on individuals in public areas ought now to be reviewing their data protection compliance and procedures.