Clarkslegal LLP - Solicitors in Reading and London

Legal Updates

Police use of Facial Recognition - Artificial Discrimination?

21 February 2020 #Data Protection #Information Technology


It has been some months since the High Court delivered its judgment in R v Bridges concerning the use by South Wales Police (SWP) of live facial recognition (LFR) in a public place. For further details see our case analysis here. The claimant has since appealed the decision to the Court of Appeal, and the SWP remains under investigation by the Information Commissioner's Office (ICO).

Now, the Met Police has come under scrutiny for deploying LFR at short notice, last week in East London and, yesterday, at Oxford Circus in central London.

Privacy groups have criticised the move as what they consider an unjustified intrusion into the public’s expectation of privacy, which the Court in R v Bridges confirmed exists when LFR is used on an innocent member of the public going about their business in a public place.

This comes amidst controversy over the Scottish police’s intended use of live facial recognition software as part of its 10-year strategy. A Scottish Parliament sub-committee recently delivered a report criticising LFR as known to discriminate and describing its use as a radical departure from the Scottish police’s fundamental principle of policing by consent. The EU is also considering banning the use of facial recognition in public spaces for up to five years.

 

Right to Privacy

Under Article 8(2) of the European Convention on Human Rights (ECHR), interference by police authorities with an individual’s right to privacy is only permitted on certain grounds. An authority may not interfere with such rights unless the interference is “in accordance with the law” and “is necessary in a democratic society” in the interests of certain purposes, including public safety.

In the UK, the Data Protection Act 2018 (DPA) also governs the use of personal data (including biometric data such as that used in facial recognition). Law enforcement may only use personal data where it is strictly necessary for the law enforcement purposes pursued and in accordance with a DPA-compliant policy document.

The question is therefore not just whether the Met Police’s interference was necessary in the interests of law enforcement, but also whether it was in accordance with Article 8(2), the DPA and other relevant legislation such as the Equality Act 2010.

 

Fairness and Transparency

In R v Bridges, the Court found that the SWP had complied with the DPA and conducted the trial in a fair and transparent way: the force advertised the fact that it would be using LFR in the region with “significant public engagement”, and the LFR was used for a limited time in a limited area for the purpose of seeking particular individuals wanted for serious offences, resulting in two arrests.

The Court held that the infringement of privacy was therefore fair, transparent and proportionate.

In the case of the Met Police, it is reported that only two hours’ notice was given before the cameras were deployed. Further, whilst the aim of the deployment in East London last week was reportedly to apprehend serious offenders in that location, none were arrested.

The Court in R v Bridges found that privacy concerns will arise once a “permanent record comes into existence in the public domain” and that the extraction of biometric data by use of LFR goes beyond the “expected and unsurprising” (para 53, R v Bridges).

Interestingly, in its data protection impact assessment released on 10 February 2020, the Met Police reported that 57% of those surveyed felt police use of LFR is acceptable, rising to 83% acceptance where LFR is used to search for serious offenders; 50% of those surveyed felt that the technology would make them feel safer, and approximately one third raised concerns about the impact on their privacy.

 

Discriminatory Impacts

Another criticism levelled at the use of LFR by the SWP in R v Bridges was the inherent unreliability of LFR software. LFR requires a base set of faces against which the automated facial recognition software compares faces captured in the live feed. If a face in the feed “matches” one in the base set, an alert is triggered, and police officers must then determine whether their person of interest has been identified.
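The matching step described above is, in essence, a threshold comparison between face representations. The sketch below is a minimal, hypothetical illustration of that idea only; the embedding vectors, similarity measure, threshold value and function names are all assumptions and do not reflect the actual systems used by the SWP or the Met Police.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def scan_feed(live_embedding: np.ndarray,
              watchlist: dict,
              threshold: float = 0.8) -> list:
    """Return the watchlist IDs whose similarity to the face in the live
    feed meets the threshold. Each hit is only an *alert*: an officer
    must still verify whether the person of interest has been found."""
    return [person_id for person_id, ref in watchlist.items()
            if cosine_similarity(live_embedding, ref) >= threshold]
```

The threshold is the critical design choice: set it lower and more faces trigger alerts (more false positives); set it higher and genuine matches may be missed.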

In a number of trials, LFR has recorded disproportionate false positives for certain sectors of the public, such as women and the BAME community. This is widely acknowledged as a limitation of LFR systems. It has been reported that eight trials conducted by the Met Police during 2016–2018 resulted in a 96% false positive rate.
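A false positive rate in this context is the share of alerts that did not in fact correspond to the person of interest. The figures below are hypothetical round numbers chosen purely to illustrate the arithmetic behind the reported 96% figure; the source does not give the underlying trial counts.

```python
def false_positive_rate(false_alerts: int, total_alerts: int) -> float:
    """Share of LFR alerts that turned out not to match the person sought."""
    return false_alerts / total_alerts

# Hypothetical illustration: 96 incorrect alerts out of 100 raised
rate = false_positive_rate(96, 100)
print(f"{rate:.0%}")  # prints "96%"
```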

In its data protection impact assessment on the use of LFR released on 10 February 2020, the Met Police stated that any database watchlist it uses will be bespoke and typically drawn from the EWMS (Emerald Warrant Management System), MPS custody records and the Police National Computer, as well as from other “police forces and agencies associated with law enforcement as well as the wider public”. As the composition of the watchlist is crucial for accurate results, discriminatory impacts may be heightened if a watchlist contains few images, or images of low quality.

Questions concerning the police selection of images inevitably arise: Are the base images verified to source? Who chooses the images and why are they used? Should images of persons that have not been charged with an offence be included? How long should the images be used for?

One key question is the source of the images, including those shared by “the wider public”. Last year the BBC reported that the Met Police had provided images to a private corporation, the managers of the King’s Cross Estate development, for use with its own facial recognition technology. The Surveillance Camera Commissioner, Mr Tony Porter, who is responsible for investigating public sector use of surveillance cameras, is reportedly looking into the sharing arrangement and was quoted as being concerned by the lack of oversight of private companies working with the police.

Which circumstances are serious enough to justify police deployment of LFR needs careful consideration. Currently, LFR may be deployed on the premise of apprehending a particular suspect wanted for serious crime, but its use could conceivably be extended to relatively minor infractions, investigated as a result of an image secured via a match from an LFR watchlist. Will your face then be included in a watchlist and targeted for identification? At what point should LFR deployment require legislative authorisation, such as a warrant? Such a measure has recently been introduced in the United States under the Facial Recognition Technology Warrant Act 2019.

It remains to be seen whether the Met Police have complied with the DPA or the Surveillance Camera Code of Practice; these issues will no doubt need to be assessed on the particular facts. However, for some, compliance with the current regulatory framework does not go far enough.

The UK’s Biometrics Commissioner, Mr Paul Wiles, who oversees police use of biometrics (such as DNA samples and LFR), responded to the Met Police’s recent use of LFR on 11 February 2020, stating that proper governance of LFR is needed through legislation to decide whether LFR ought to be used by the police and, if so, for what purposes.

This view is shared by the Information Commissioner, Elizabeth Denham, who released her own statement regarding the Met Police’s use of LFR on 24 January 2020. Ms Denham states that whilst police use of LFR is supported by the public, it must adhere to the DPA and be strictly necessary for law enforcement. She has called on the government to introduce a statutory and binding code of practice for LFR as a matter of priority.

As with other rapid advances in technology, the law appears to be slow in addressing the particular privacy concerns raised by use of LFR. Until the law catches up, the public may well expect the police to take a thoroughly cautious approach to its use.

Disclaimer
This information is for guidance purposes only and should not be regarded as a substitute for taking legal advice. Please refer to the full General Notices on our website.

Chrysilla de Vere CIPP/E
Partner

E: cdevere@clarkslegal.com
T: 0118 960 4636
M: 0774 863 8845

Contact

Data Protection team
+44 (0)118 958 5321