Case Analysis: R (Bridges) v CCSWP and SSHD [2019] EWHC 2341

On 4 September 2019, the world’s first decision regarding the privacy implications of facial recognition was handed down by the High Court in Cardiff. The implications of the case were of such significance that both the Information Commissioner and the Surveillance Camera Commissioner joined as Interveners.

The case concerned the trialled use of facial recognition software on members of the public by the South Wales Police.

One such member of the public, Mr Edward Bridges, a civil liberties campaigner, alleged that on two particular occasions he had been present and caught on camera when the South Wales Police (SWP) deployed automated facial recognition (AFR) software.

Mr Bridges was not the subject of any investigation or action by the SWP, nor was there any evidence that Mr Bridges’ image had in fact been collected (as any images not of interest to the SWP would have been immediately deleted).

Mr Bridges claimed that the use of AFR by the SWP was in breach of the right to privacy as contained in Article 8 of the European Convention on Human Rights (ECHR) and that the SWP had failed to comply with the Data Protection Act 1998 and its successor, the Data Protection Act 2018 (DPA 2018). Mr Bridges also claimed that the application of the AFR was in breach of the Equality Act 2010 in that the SWP failed to consider that the use of facial recognition might produce results that were indirectly discriminatory.

On the two occasions on which Mr Bridges claimed to have been present, the SWP deployed a single marked van equipped with a camera employing AFR: the first on a street in Cardiff city centre, the second at a public event at the Motorpoint Arena. Both deployments lasted for a limited period, between 8am and 4pm. On both occasions the SWP published a fair processing notice on social media and posted signs within range of the clearly marked police van.

The SWP stated that its purpose in using AFR was to locate and detain certain wanted offenders who had already been identified on police watchlists. The watchlists comprised images of known offenders and persons of interest compiled from police records. The AFR software would extract biometric information from the faces of members of the public captured on CCTV and compare it with the biometric information of persons on the watchlist.
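For readers who want to picture the mechanics, the matching exercise can be sketched as follows. This is a minimal illustration, not SWP’s actual system: the function names, the match threshold and the representation of a “template” are all assumptions, and a real deployment would rely on a vendor’s proprietary face-detection and template-extraction algorithms.

```python
from dataclasses import dataclass

# A biometric "template" is a numeric measurement of facial features.
# A plain list of floats stands in for one here; real AFR systems
# derive templates with proprietary algorithms.
Template = list[float]

@dataclass
class WatchlistEntry:
    name: str           # known offender or person of interest
    template: Template  # derived from a custody or intelligence image

def similarity(a: Template, b: Template) -> float:
    """Toy similarity score in (0, 1]; real systems use vendor-specific metrics."""
    distance = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return 1.0 / (1.0 + distance)

def scan_frame(captured_templates: list[Template],
               watchlist: list[WatchlistEntry],
               threshold: float = 0.9) -> list[WatchlistEntry]:
    """Compare every face captured in a CCTV frame against the watchlist.

    Templates that match no watchlist entry are simply not kept: they go
    out of scope as soon as the comparison completes, reflecting the
    immediate deletion of images not of interest described in the case.
    """
    alerts = []
    for captured in captured_templates:
        for entry in watchlist:
            if similarity(captured, entry.template) >= threshold:
                alerts.append(entry)  # flagged for an officer to review
    return alerts
```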

The High Court found that on the facts of that case, the SWP had not unlawfully interfered with Mr Bridges’ right to privacy, nor had the SWP failed to comply with the various provisions of the DPA 2018 concerning the processing of personal data by law enforcement. It also found the SWP had given due regard to its Equality Act obligations.

Right to Privacy under Article 8

Article 8 of the ECHR provides individuals with the right to a private life, which is given effect in the UK by the Human Rights Act 1998. This right, which protects individuals from interference by public authorities and those exercising public functions, is not absolute: it may be derogated from on certain grounds, which are found in Article 8(2).

Article 8(2) provides that a public authority may not interfere with an individual’s right to privacy unless the interference is in accordance with the law and is necessary in a democratic society for certain purposes, including public safety and the prevention of disorder or crime.

In considering various other forms of collection of personal data, such as DNA and fingerprints, the High Court determined that the initial collection of biometric data by the AFR did constitute an interference with the right to privacy under Article 8(1).

This was apparently also the view of Ms Denham, the UK’s Information Commissioner, who had submitted that without appropriate safeguards “such data could be collected in a manner amounting to a serious interference with privacy rights.” (para 60)

The question then became whether the SWP had interfered with such privacy rights “in accordance with law”, a requirement which includes that there must be some basis in domestic law and that the law must afford adequate legal protection against arbitrariness and contain safeguards against disproportionate interference with ECHR rights.

The High Court answered this in the affirmative and determined that:

  1. The police have well-established powers at common law and explicit statutory powers to make reasonable use of imagery for the purpose of the prevention and detection of crime and the apprehension of suspects, and such powers were “amply sufficient” to enable the police to use AFR (paras 72, 78); and
  2. There was a clear and sufficient legal framework to govern when and how the software was to be used. This included the Data Protection Act (both the DPA 1998 and DPA 2018), the Surveillance Camera Code of Practice issued under the Protection of Freedoms Act 2012 and SWP’s own policies (para 84).

The High Court then considered whether the interference with Mr Bridges’ Article 8(1) privacy rights was justified under the four-limb test contained in Bank Mellat v Her Majesty’s Treasury (No 2) [2014] AC 700 (the Bank Mellat test).

The third and fourth limbs of the test were at issue. These provide:

  • whether a less intrusive measure could have been used without unacceptably compromising the objective; and
  • whether, having regard to these matters and to the severity of the consequences, a fair balance has been struck between the rights of individuals and the interests of the community.

In its application of this test, the Court commented that there is no reason “to draw any distinction between the levels of protection for individual rights under the Human Rights Act 1998” and the DPA 2018. (para 100)

The High Court was satisfied that the deployment of the AFR struck a fair balance and was not disproportionate.

The factors the Court found relevant included that:

  • the use of AFR was conducted in an open and transparent way, with significant public engagement, and was undertaken for a limited time over a limited footprint;
  • its use was for a specific and limited purpose, namely to identify particular individuals, and on one occasion it led to two arrests;
  • no data was retained: the interference was limited to near-instantaneous algorithmic processing and discarding of biometric data (pictured in the sketch after this list);
  • nobody was wrongly arrested, and no-one complained of mistreatment;
  • no personal information of the claimant was made available to any human agent, nor was there any attempt to identify the claimant.
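On the third factor, the “process and immediately discard” behaviour can be pictured in code. The following is a hypothetical sketch of the pattern the judgment describes, not SWP’s implementation; `extract_template` is a stand-in for a vendor’s proprietary algorithm.

```python
from contextlib import contextmanager
from typing import Iterator

def extract_template(raw_frame: bytes) -> list[float]:
    # Stand-in for a vendor's proprietary template-extraction algorithm.
    return [float(b) for b in raw_frame[:16]]

@contextmanager
def transient_template(raw_frame: bytes) -> Iterator[list[float]]:
    """Hold a biometric template only for the duration of one comparison.

    Mirrors the near-instantaneous processing and discarding the Court
    relied on: the template never touches disk, and it is destroyed as
    soon as the block exits, whether or not a match was found.
    """
    template = extract_template(raw_frame)
    try:
        yield template
    finally:
        template.clear()  # nothing retained once the comparison completes
```

A caller would run the watchlist comparison inside a `with transient_template(frame) as template:` block; by the time execution leaves the block, the biometric data no longer exists.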

The Court did point out that the question of proportionality is fact-sensitive and that there is inevitably a limit to what can sensibly be said in relation to future uses (para 108).

Data Protection Act Claims

Mr Bridges made various claims under both the DPA 1998 and the DPA 2018.

He claimed that the SWP had failed to comply with the data protection principles contained in Part 1 of Schedule 1 to the DPA 1998, in particular the first principle, which provides that personal data must be processed fairly and lawfully, must meet one of the conditions in Schedule 2 and, in the case of sensitive personal data, at least one of the conditions in Schedule 3.

A key issue in dispute was whether the SWP had processed the personal data of Mr Bridges and of all members of the public captured by the AFR.

The SWP had argued that it did not process the personal data of Mr Bridges, only that of persons on the watchlist, as they were the only persons the SWP could identify by name.

The High Court disagreed and held that the SWP had processed the personal data of all members of the public. It considered that the term “personal data” covered circumstances of indirect identification (such as with dynamic IP addresses) as well as individuation, the singling out of an individual. The High Court held that the AFR fell within the second circumstance: it uniquely identifies an individual because the biometric facial data is used to distinguish that person from any other person in its matching exercise.
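The individuation point can be made concrete: a biometric template acts as an identifier even when no name is attached, because it distinguishes its subject from everyone else. A toy illustration, assuming for simplicity that template extraction is deterministic (real systems match fuzzily rather than exactly):

```python
import hashlib

def pseudonymous_key(template: list[float]) -> str:
    """Derive a stable key from a biometric template.

    No name is needed: the template alone singles its subject out from
    every other person, which is why the Court treated it as personal
    data. (Simplifying assumption: the same face always yields the same
    template; real biometric matching is approximate.)
    """
    encoded = ",".join(f"{value:.4f}" for value in template).encode()
    return hashlib.sha256(encoded).hexdigest()
```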

The High Court also held that the SWP had met the requirements of the first data protection principle for the same reasons as it had given in the context of the Article 8 claim.

Mr Bridges then made two claims under the DPA 2018. The first claim again alleged that the use of the AFR software did not comply with the first data protection principle and that its use entailed sensitive processing as described in section 35(8) of the DPA 2018. Sensitive processing is defined as including the processing of biometric data for the purpose of uniquely identifying an individual.

The SWP again contended that it had not processed the biometric data of members of the public, only that of the individuals on the watchlist.

The High Court held that the sensitive processing of biometric data had taken place in respect of all members of the public, not just those on the watchlist, as the comparison run by the software could only be undertaken if each biometric template uniquely identified that individual (para 133).

The SWP therefore needed to comply with the additional requirements for sensitive processing contained in the law enforcement provisions in Part 3 of the DPA 2018.

Section 35(5), applicable in this case, provides that sensitive processing can only be conducted if the processing is strictly necessary for the law enforcement purpose, meets one of the conditions in Schedule 8 (for example, that it is necessary for the exercise of a function conferred by a rule of law) and, at the time of the processing, an appropriate policy document is in place.

The High Court found that the processing by the SWP did meet these conditions within the DPA 2018, again referring back to the same grounds it relied on in respect of the Article 8 claim. In its reasoning the Court did note that it could not comment on whether the policy document put in place by the SWP fully met the required generic standard, and observed that it would be desirable for the Information Commissioner to provide specific guidance on what is required to meet that standard.

Mr Bridges also argued that the SWP did not conduct a proper data protection impact assessment as required by section 64 of the DPA 2018.

The Court found that the SWP had conducted an assessment which met the conditions of that section, but noted that it “will not necessarily substitute its own view for that of the data controller on all matters. The notion of an assessment brings with it a requirement to exercise reasonable judgment” and that “when conscientious assessment has been brought to bear, any attempt to second-guess that assessment will overstep the mark.” (para 146)

Equality Act Claim

Mr Bridges claimed that the SWP had failed to comply with its statutory duty under section 149(1) of the Equality Act 2010, which requires public authorities to have due regard to the need to eliminate discrimination. Mr Bridges contended that in its assessment the SWP had failed to consider the possibility that the AFR software might produce results that were indirectly discriminatory.

The basis of the contention was commentary by Mr Bridges’ expert witness that the accuracy of AFR systems generally could depend on the data set used to train the system.


The SWP’s technical officer had advised that, whilst there was a proportionately higher rate of female false positives than male false positives, this could be explained by the presence on the watchlist of two female faces with significant generic features (para 154).
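A “false positive” here is an alert the software raised that human review showed to be wrong, and the concern is with the share of wrong alerts differing between groups. A hypothetical illustration of that computation (the data and labels are invented, not figures from the trial):

```python
def false_positive_share(alerts: list[tuple[str, bool]]) -> dict[str, float]:
    """For each group, the share of alerts that proved to be false.

    `alerts` pairs a group label with whether that alert was a false
    positive. Unequal shares across groups are the statistical pattern
    behind the indirect-discrimination concern raised in the case.
    """
    totals: dict[str, int] = {}
    false_counts: dict[str, int] = {}
    for group, was_false in alerts:
        totals[group] = totals.get(group, 0) + 1
        false_counts[group] = false_counts.get(group, 0) + was_false
    return {group: false_counts[group] / totals[group] for group in totals}

# Invented illustrative data:
sample = [("female", True), ("female", True), ("female", False),
          ("male", True), ("male", False), ("male", False), ("male", False)]
print(false_positive_share(sample))  # female: 2/3, male: 1/4
```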

The High Court found that the SWP did have the due regard required when it commenced the trial, and that at the time there was no specific reason why the SWP ought to have assumed the software would produce less reliable results depending on whether a face was male or female, white or minority ethnic (para 157). The Court did counsel that, as a result of the evidence developed during the course of the trial, the SWP ought to consider whether the software may produce discriminatory impacts. The Court also suggested a possible failsafe: that no step be taken unless an officer has reviewed and made his or her own assessment of the potential match generated by the software.
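The suggested failsafe maps naturally onto a human-in-the-loop gate: the software may surface a candidate match, but no step follows until an officer has made an independent assessment. A minimal sketch under that assumption (the types and function names are illustrative, not drawn from SWP’s procedures):

```python
from dataclasses import dataclass

@dataclass
class CandidateMatch:
    watchlist_name: str
    score: float  # similarity reported by the AFR software

def officer_confirms(match: CandidateMatch) -> bool:
    """Stand-in for an officer comparing the live image against the
    watchlist image and making his or her own assessment."""
    answer = input(f"Confirm match with {match.watchlist_name} "
                   f"(score {match.score:.2f})? [y/n] ")
    return answer.strip().lower() == "y"

def act_on(match: CandidateMatch) -> None:
    # The algorithmic match is only ever a prompt for human review,
    # never itself the trigger for an intervention.
    if officer_confirms(match):
        print(f"Officer-approved intervention for {match.watchlist_name}")
    else:
        print("Match discarded after officer review; no action taken")
```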

The case may be unsatisfactory for privacy advocates who had been hoping for a judicial call to enact separate legislation governing the use of the technology. They may also have hoped for some overarching direction on future uses by police. However, the Court declined to comment on whether any future use of the technology would in fact infringe privacy rights, stating rather that “we are satisfied that there is no systemic or proportionality deficit such that it can be said that future use of AFR Locate would be inevitably disproportionate.” (para 108)

At the time of writing, the ICO has yet to conclude its own investigations into both the SWP’s and the Met Police’s use of facial recognition technology, and may well provide some additional guidance.

About this article

Disclaimer
This information is for guidance purposes only and should not be regarded as a substitute for taking legal advice. Please refer to the full General Notices on our website.

