Facial Recognition Bias: Black People More Targeted
19 Mar
Summary
- AI-driven cameras are significantly more likely to identify Black individuals than people from other groups.
- Essex police paused live facial recognition use due to bias risks.
- The study also found the AI systems identify men more reliably than women.

Live facial recognition (LFR) technology, used by at least 13 police forces across England and Wales, is facing scrutiny after a study found it disproportionately targets Black individuals. Essex police have suspended their use of these AI systems after the study identified potential accuracy and bias risks.
The commissioned study, carried out in Chelmsford by University of Cambridge academics, involved 188 actors and police cameras in active deployment. Overall identification accuracy was around 50% and incorrect matches were rare, but the system showed a statistically significant higher likelihood of correctly identifying Black participants than participants from other groups. This disparity, according to the study's authors, warrants further investigation.