Facial Recognition Bias: Black People More Targeted
19 Mar
Summary
- AI facial recognition cameras are significantly more likely to correctly identify Black individuals than others.
- Essex police have paused live facial recognition use over bias risks.
- The study also found AI systems identify men more accurately than women.

Live facial recognition (LFR) technology, used by at least 13 police forces across England and Wales, is facing scrutiny after a study found it disproportionately targets Black individuals. Essex police have suspended their use of the AI systems after the study identified potential accuracy and bias risks.
The study, commissioned from University of Cambridge academics and conducted in Chelmsford, involved 188 actors and actively deployed police cameras. Overall identification accuracy was around 50% and incorrect matches were rare, but the system showed a statistically significant higher likelihood of correctly identifying Black participants than others. According to the study's authors, this disparity warrants further investigation.
This issue differs from the more familiar concern of misidentifying innocent individuals, as in a recent case where retrospective scanning led to a wrongful arrest. Experts suggest the algorithms may have been overtrained on Black individuals' faces, a cause that could potentially be rectified by adjusting system settings.
The Home Office reports that LFR cameras in London led to more than 1,300 arrests between January 2024 and September 2025. Critics, however, argue that this latest research validates long-standing warnings about inherent bias in LFR technology and are calling for immediate action by police forces nationwide.
