UK Court of Appeal finds facial recognition technology unlawful
The UK Court of Appeal has handed down its judgment in R (Bridges) v Chief Constable of South Wales Police ([2020] EWCA Civ 1058).
On 11 August 2020, the UK Court of Appeal overturned the High Court’s previous dismissal of a challenge to South Wales Police’s use of automated facial recognition technology (AFR), finding that its use was unlawful and violated the European Convention on Human Rights (ECHR).
Background
The claimant brought judicial review proceedings after South Wales Police launched a surveillance project using facial recognition technology. The technology, a system called ‘AFR Locate’, was deployed at certain events and in locations where crime was considered likely to occur, and was used to scan members of the public, capturing up to 50 faces per second. Using biometric data analysis, these images were then matched against a ‘watchlist’ compiled by the police. Where an image did not match any on the watchlist, it was automatically deleted. It has been estimated that, over the 50 deployments undertaken in 2017 and 2018, the technology may have captured sensitive facial biometric data (i.e. measurements of facial features) from around 500,000 people without their consent.
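For readers less familiar with how such screening works, the following is a minimal sketch of the capture-match-delete step described above. It is purely illustrative: the cosine-similarity measure, the 128-dimensional templates and the 0.8 threshold are assumptions made for the example, not details of the actual AFR Locate system.

```python
import numpy as np

MATCH_THRESHOLD = 0.8  # hypothetical similarity cut-off, not a real AFR Locate parameter

def cosine_similarity(a, b):
    """Cosine similarity between two biometric feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def screen_face(template, watchlist):
    """Compare one captured face template against every watchlist entry.

    Returns the matching watchlist identifier, or None. In AFR Locate,
    a non-match meant the captured image was deleted automatically;
    here the caller is expected to discard the template on None.
    """
    for person_id, enrolled in watchlist.items():
        if cosine_similarity(template, enrolled) >= MATCH_THRESHOLD:
            return person_id
    return None  # no match: the template would be deleted immediately

# Purely illustrative usage, with random vectors standing in for real templates.
rng = np.random.default_rng(0)
watchlist = {"person-001": rng.normal(size=128)}
captured = rng.normal(size=128)
print(screen_face(captured, watchlist))  # prints None, i.e. data deleted
```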
The claimant contended that the use of the technology was unlawful, as it was in contravention of both Article 8 of the ECHR (the right to respect for private and family life) and data protection law in the UK. It was also claimed that the police did not properly comply with the Public Sector Equality Duty (“PSED”) under section 149 of the Equality Act 2010, as the Equality Impact Assessment undertaken was “obviously inadequate” and had failed to take into account the risk of indirect discrimination on the basis of sex or race.
At first instance, the High Court held in September 2019 that, while facial recognition technology does engage the privacy and data protection rights of those scanned, the legal framework then in force provided sufficient safeguards for the use of this technology. It also held that the use of the technology to obtain biometric data from members of the public fell within the police’s common law powers to prevent and detect crime.
Judgment
The Court of Appeal allowed the appeal on the following grounds:
The use of AFR by South Wales Police was found to breach Article 8 ECHR: the technology automatically, and without consent, collected and processed the biometric data of members of the public, engaging the right to respect for a person’s image. The interference was not ‘in accordance with the law’, as the legal framework left too broad a discretion to individual officers over who could be placed on a watchlist and where AFR could be deployed.
The Data Protection Impact Assessment (“DPIA”) conducted by the police did not comply with section 64(3)(b) and (c) of the Data Protection Act 2018, as it failed to properly assess the risks the technology posed to people’s rights and freedoms.
The Appellant argued that the DPIA was inadequate in the following three ways:
- firstly, it failed to recognise that the personal data of individuals who were not on the watchlists (whose data was automatically deleted) was still being ‘processed’ within the meaning of data protection law, even if this processing was momentary;
- secondly, it failed to acknowledge that the rights of individuals under Article 8 of the ECHR were engaged by the processing in a way that might lead to the infringement of these rights; and
- thirdly, it was silent as to the risks to other rights which are likely to be affected by the use of AFR, namely the rights to freedom of assembly under Article 11 of the ECHR and freedom of expression under Article 10 of the ECHR.
The police did not properly comply with the PSED under section 149 of the Equality Act 2010 prior to, or in the course of, their use of AFR on these occasions.
The Court found that the police had failed to gather sufficient evidence to establish whether the technology presented a risk of indirect discrimination prior to its use, for two reasons:
- firstly, because the data of individuals whose images did not match those on the watchlists were automatically deleted, they could not be analysed for the purpose of assessing bias; and
- secondly, the police were not aware of the dataset on which the technology had been ‘trained’, and so were not able to establish whether there had been a demographic imbalance in that training data; the kind of per-group error analysis this would have required is sketched below.
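To illustrate what such a bias assessment would involve, the sketch below computes false-match rates per demographic group from hypothetical evaluation records. The records, group names and figures are assumptions made up for the example; they are not drawn from the judgment.

```python
# Hypothetical evaluation records: (demographic_group, was_false_match).
# Assembling records like these is exactly what automatic deletion prevented.
records = [
    ("group_a", False), ("group_a", True), ("group_a", False), ("group_a", False),
    ("group_b", True), ("group_b", True), ("group_b", False), ("group_b", True),
]

def false_match_rate_by_group(records):
    """Aggregate false-match outcomes per demographic group."""
    totals, errors = {}, {}
    for group, is_false_match in records:
        totals[group] = totals.get(group, 0) + 1
        errors[group] = errors.get(group, 0) + int(is_false_match)
    return {group: errors[group] / totals[group] for group in totals}

# Markedly different rates between groups would signal a risk of indirect
# discrimination: the check the Court said the police could not perform.
print(false_match_rate_by_group(records))
# {'group_a': 0.25, 'group_b': 0.75}
```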
The judgment demonstrates that, despite the key safeguards UK data protection law provides for the processing of sensitive personal data, the application of the legal framework to the deployment and use of AFR is not as clear-cut as it might appear. AFR is a complex technology and, as the Court of Appeal acknowledged, its future development is likely to require periodic re-evaluation of the sufficiency of the legal regime.
Businesses making use of AFR or similar technology should pursue and maintain the highest possible level of compliance with applicable privacy and data protection laws. Where required, adequate and thorough DPIAs should be carried out before processing begins, taking into account any potential engagement of human rights and equality laws as well as privacy laws.