UK police accused of using ‘racist’ prediction tools

Amnesty International has accused UK police of using racist crime prediction technology that breaches international human rights obligations.

In a report published by the human rights charity on Thursday, it said that at least 33 police forces – including the Metropolitan Police, West Midlands Police, Greater Manchester Police, and Essex Police – have used predictive, profiling or risk prediction systems.

The 120-page report highlights two main types of allegedly racist predictive policing systems: location-based prediction and individual profiling.

Amnesty says that location-based technology used by police predicts the likelihood of crimes being committed in particular geographic areas, which it claims specifically targets people from Black, Asian, and other ethnic minority backgrounds.

The non-profit added that in the year ended March 2023, there were 24.5 stops and searches for every 1,000 Black people; 9.9 stops and searches for every 1,000 people with mixed ethnicity; 8.5 for every 1,000 Asian people; and 5.9 for every 1,000 white people, with the majority – 69 per cent – leading to no further action.

Another method that the organisation has raised concerns about is individuals being placed in a “secret database” and profiled as being at risk of committing certain crimes in the future.

Amnesty says that areas such as London, the West Midlands and Manchester, where there are high populations of Black people and people from other ethnic minority backgrounds, are “repeatedly targeted by police and therefore crop up in those same police records.”

“No matter our postcode or the colour of our skin, we all want our families and communities to live safely and thrive," said Sacha Deshmukh, chief executive, Amnesty International UK. "The use of predictive policing tools violates human rights.

"It doesn’t keep communities safer because it does not lower crime. We are all much more than computer generated risk scores. These technologies have consequences. The future they are creating is one where technology decides that our neighbours are criminals, purely based on the colour of their skin or their socio-economic background."

A spokesperson from the National Police Chiefs' Council (NPCC) said that policing uses a "wide range of data" to inform its response to tackling and preventing crime.

“Hotspot policing and visible targeted patrols are the bedrock of community policing, and effective deterrents in detecting and preventing anti-social behaviour and serious violent crime, as well as improving feelings of safety," they continued. “We are working hard to improve the quality and consistency of our data to better inform our response, ensuring that all information and new technology is held and developed lawfully, ethically and in line with the Data Ethics Authorised Professional Practice (APP).

“It is our responsibility as leaders to ensure that we balance tackling crime with building trust and confidence in our communities whilst recognising the detrimental impact that tools such as stop and search can have, particularly on Black people. The Police Race Action Plan is the most significant commitment ever by policing in England and Wales to tackle racial bias in its policies and practices, including an ‘explain or reform’ approach to any disproportionality in police powers."

The NPCC added that the national plan is "working with local forces and driving improvements in a broad range of police powers, from stop and search and the use of Taser through to officer deployments and road traffic stops".

It said that the plan also contains a specific action around data ethics, which has directly informed the consultation and equality impact assessment for the new APP.

Speaking about its Knife Crime and Violence Model, which Amnesty claims can affect criminal justice sentencing and prisoner categorisation decisions for those assessed by the system, Essex Police said the model does not seek to criminalise any individual.

When first launching the project in 2021, the force said that the model works by using "police-derived (factual) data" to best predict whether an individual is on the trajectory towards using a knife to commit an act of serious violence.

“Its aim is to tackle the root cause of knife crime violence and prevent young people who are at heightened risk of becoming involved in that from doing so - and ultimately save lives," it said in response to Amnesty International's claims. “The model uses the Cambridge Crime Harm Index, which measures the seriousness of crime harm to victims, and not simply the number of officially recorded crimes.

"This approach means larger weight is given to more harmful crimes, which can assist in identifying the most harmed victims and the most harmful offenders and those at risk of becoming the most harmed victims or the most harmful offenders. The offences which feed into the assessment model are violence with injury, violence without injury and robbery."

Essex Police added that to "remove any risk of bias", the model – which is currently being evaluated by the College of Policing – does not include stop and search data (including possession of weapon offences), the ethnicity of those involved in any incidents, any intelligence data, or any police-generated mental health or drug markers.
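The Cambridge Crime Harm Index referenced by Essex Police scores recorded offences by the harm they cause rather than simply counting them. The short Python sketch below illustrates that weighting approach only; the weights, offence names and function are hypothetical, and the actual index values and scoring logic inside the Knife Crime and Violence Model are not detailed in the report.

    # Minimal sketch of harm-weighted scoring, as opposed to a simple offence tally.
    # The weights below are hypothetical illustrations only; the Cambridge Crime
    # Harm Index derives its values from sentencing starting points, and the
    # figures actually used by Essex Police are not published in the article.
    HARM_WEIGHTS = {
        "violence with injury": 365,
        "violence without injury": 10,
        "robbery": 180,
    }

    def harm_weighted_score(offences):
        # Sum the harm weight of each recorded offence; unknown offences score 0.
        return sum(HARM_WEIGHTS.get(offence, 0) for offence in offences)

    # Two recorded offences count the same under a simple tally,
    # but a harm-weighted score distinguishes between them.
    print(harm_weighted_score(["robbery", "violence without injury"]))  # 190
    print(harm_weighted_score(["violence without injury"] * 2))         # 20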

In response to Amnesty's report, a Greater Manchester Police spokesperson said: “Our priority is preventing crime and stopping people from coming to harm. We have a range of ways we proactively do this, including engaging with communities and working with partners and charities to divert people away from criminality.

“Proactive policing is a particularly vital part of how we’ve brought down violent crime which saw 1,600 fewer victims come to harm last year across Greater Manchester. We focus our local and specialist resources in the areas where a combination of community information and recent reporting suggests there is risk of people being subjected to harm.

“The work of the XCalibur Task Force in the past two decades has seen us engage with areas of Manchester where violent crime has previously blighted communities. It does not work from any ‘gang profiling’ database and instead uses information from the community to inform the life-changing work it continues to achieve alongside local community groups and agencies.”

National Technology News has also approached several other police forces mentioned in the report, which did not immediately respond.

The Met Police declined to comment.


