UK Police Lobby for Biased Facial Tech
10 Dec
Summary
- The facial recognition system disproportionately misidentifies women and Black people.
- Police successfully lobbied to retain the biased system after an initial mitigation.
- The system's bias had been known for over a year, despite Home Office claims.
- At some settings, the algorithm may misidentify Black women almost 100 times more often than white women.

Facial recognition software used by UK police forces has been found to disproportionately misidentify women and Black individuals. Documents reveal that police forces lobbied against an initial decision to raise the system's confidence threshold, a change intended to mitigate the known biases. They argued that the adjustment significantly reduced 'investigative leads', prioritizing operational effectiveness over addressing the technology's discriminatory impact.
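The dispute turns on a single tunable parameter. The sketch below (in Python, with entirely hypothetical scores and labels, none drawn from the actual police system) illustrates the trade-off at stake: raising a confidence threshold cuts false positives, but it also shrinks the pool of matches returned as investigative leads.

```python
# Illustrative sketch only: hypothetical match scores, not real system data.
# Shows why raising a confidence threshold reduces misidentifications
# (false positives) but also reduces the number of "investigative leads".

candidate_matches = [
    {"id": "A", "score": 0.62, "correct": False},
    {"id": "B", "score": 0.71, "correct": True},
    {"id": "C", "score": 0.78, "correct": False},
    {"id": "D", "score": 0.93, "correct": True},
]

def leads_at_threshold(matches, threshold):
    """Return only the matches whose confidence score meets the threshold."""
    return [m for m in matches if m["score"] >= threshold]

for threshold in (0.60, 0.75, 0.90):
    leads = leads_at_threshold(candidate_matches, threshold)
    false_positives = sum(not m["correct"] for m in leads)
    print(f"threshold={threshold:.2f}: {len(leads)} leads, "
          f"{false_positives} false positives")
```

At a low threshold every candidate is surfaced, including the incorrect ones; at a high threshold the false matches drop out, but so do some leads. This is the operational cost the forces reportedly objected to.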
Experts have raised concerns about the police forces' apparent prioritization of convenience over fundamental rights. The system's known bias persisted for over a year; at some settings it could incorrectly identify Black women nearly 100 times more frequently than white women. The episode highlights a potential disconnect between anti-racism commitments and practical implementation within policing.