The Equality and Human Rights Commission (EHRC) has stated that the live use of facial recognition technology should be suspended until its impact has been independently scrutinised and the laws governing its application have been improved.
The statutory non-departmental public body covering England and Wales, established under the Equality Act 2006, raised the concerns in a report to the United Nations on civil and political rights in the UK.
“The law is clearly on the back foot with invasive AFR [automated facial recognition] and predictive policing technologies,” said Rebecca Hilsenrath, chief executive at the EHRC. “It is essential that their use is suspended until robust, independent impact assessments and consultations can be carried out, so that we know exactly how this technology is being used and are reassured that our rights are being respected.”
The report explained that evidence indicates many AFR algorithms disproportionately misidentify black people and women and therefore operate in a potentially discriminatory manner, adding that “such technologies may replicate and magnify patterns of discrimination in policing and have a chilling effect on freedom of association and expression”.
Police in London and South Wales have already begun using facial recognition technology, which uses cameras to capture images of faces and checks them against databases of suspects.
Last month, the Metropolitan Police deployed cameras to scan shoppers in Stratford and Oxford Circus in London, while South Wales police used the technology at a Slipknot concert at the Cardiff City football stadium in January.
Peter Fussey, a professor at Essex University who conducted the only independent review of the Metropolitan Police’s public trials on behalf of the force, found the technology was verifiably accurate in just 19 per cent of cases.
Last September, the High Court dismissed a judicial review challenge to South Wales Police’s use of the technology, ruling that although it interfered with privacy rights, there was a lawful basis for its use and the legal framework relied on by the force was proportionate.
Metropolitan Police commissioner Cressida Dick recently defended the force’s use of facial recognition technology, calling critics ill-informed.
“I and others have been making the case for the proportionate use of tech in policing, but right now the loudest voices in the debate seem to be the critics, sometimes highly incorrect and/or highly ill-informed – and I would say it is for the critics to justify to victims of crimes why police shouldn’t use tech lawfully and proportionately to catch criminals,” she stated.