Amazon stops police using its facial recognition software

Amazon has announced a one-year moratorium on police use of its artificial intelligence-driven facial recognition software, following a growing backlash over its potential for racial bias in the wake of recent protests.

When it was first released in 2016, the Rekognition software was criticised by advocacy groups, which said the technology could have a disproportionately negative effect on non-white people.

Congresswoman Alexandria Ocasio-Cortez echoed this complaint in a recent tweet, stating: “Facial recognition is a horrifying, inaccurate tool that fuels racial profiling and mass surveillance... it regularly falsely [identifies] black and brown people as criminal”.

An experiment run by the American Civil Liberties Union (ACLU) in 2018 showed Rekognition incorrectly matched 28 members of Congress to photos of people arrested for a crime - overwhelmingly misidentifying Congress members who were not white.

In a statement, Amazon said it would withdraw police access to the technology until stronger regulation is in place.

“We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge,” it read. “We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.”

The move follows IBM chief executive Arvind Krishna writing to the US Congress earlier this week, stating it would no longer offer general purpose facial recognition or analysis software, adding that the company "firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and principles of trust and transparency".

The Washington County sheriff’s office in Oregon, the first law enforcement agency in the country to contract with Amazon to use the technology, confirmed it would suspend its use of the Rekognition product in light of Amazon's announcement.

The suspension does not mean all partnerships with law enforcement will be halted, however: Amazon noted that the International Center for Missing and Exploited Children, as well as technology companies Thorn and Marinus Analytics, will still have access to Rekognition for human trafficking cases.

Paul Bischoff, a privacy advocate at Comparitech, commented: "Amazon's and IBM's announcements about moratoriums on police use of face recognition is welcome news - at this critical moment in our history, now is not the time to empower police with the ability to identify protesters or restrict freedoms of movement and assembly.

"We need more regulation that stipulates how, when, where, and in what context police are allowed to use face recognition, and with whom the police can share face recognition data," he continued. "Allowing police to purchase face recognition services without oversight could have serious consequences, both predictable and unforeseen."
