19,403 people called on the UK to ban "crime predicting" technology

Many thanks to the thousands of you who took action calling on the UK to ban "crime predicting" technology. Your response has been incredible.
What’s been happening?
Almost three-quarters of police forces across the UK are using technology to try to "predict crime", with little regard for our human rights.
Restricting our rights in this way does not keep us safe.
In February 2025, Amnesty International UK published a report, Automated Racism, outlining how automated and ‘predictive’ policing violates many rights and its use is in breach of the UK’s national and international human rights obligations.
Is your police force using this technology in YOUR area?
The map will give you information on whether your local police force uses "crime predicting" technology, and if so, which one.
The map will also include stop and search rates and demographics in your area, as well as how the stop and search rate compares to other police forces.
Can you predict the future? Police think they can, by racially profiling communities across the UK.
These technologies have consequences. The future they are creating is one where technology decides that our neighbours are criminals, purely based on the colour of their skin or their socio-economic background.
Policing in the UK is already biased against minoritised communities, with many forces acknowledging that they are institutionally racist. So when they add data-driven technology, we get automated racism.
These tools to “predict crime” harm us all by treating entire communities as potential criminals, making society more racist and unfair.
Governments across the UK must prohibit the use of these technologies. In the meantime, they can demand transparency on how these systems are being used. People and communities subjected to these systems must have the right to know about them and meaningful routes to challenge policing decisions made using them.