
© 2026 Advergame Technologies Pvt. Ltd. ("ATPL"). Gamezop ® & Quizzop ® are registered trademarks of ATPL.



UK Police Lobby for Biased Facial Tech

10 Dec, 2025

Summary

  • Facial recognition system disproportionately misidentifies women and Black people.
  • Police successfully lobbied to keep the biased system after initial mitigation.
  • The system's bias was known for over a year, despite Home Office claims.
  • Current algorithm may misidentify Black women almost 100 times more than white women.

Facial recognition software used by UK police forces has been found to disproportionately misidentify women and Black individuals. Documents reveal that police forces actively lobbied against an initial decision to increase the system's confidence threshold, which aimed to mitigate known biases. They argued that the adjustment significantly reduced 'investigative leads', prioritizing operational effectiveness over addressing the technology's discriminatory impact.

Concerns have been raised by experts regarding the police forces' apparent prioritization of convenience over fundamental rights. The system's known bias persisted for over a year, with some settings showing it could incorrectly identify Black women nearly 100 times more frequently than white women. This situation highlights a potential disconnect between anti-racism commitments and practical implementation within policing.

The Home Office has stated that a new, independently tested algorithm with no statistically significant bias has been procured and will be tested soon. However, a recent review by the National Physical Laboratory (NPL) highlighted significant issues with the existing technology. The government is currently consulting on widening the use of facial recognition, while critics call for strict national standards and independent scrutiny to ensure it does not exacerbate existing disparities.

Disclaimer: This story has been auto-aggregated and auto-summarised by a computer program. This story has not been edited or created by the Feedzop team.
FAQs

Q: Who is the system most likely to misidentify?
A: The system is more likely to misidentify women, young people, and ethnic minority groups compared to white men.

Q: Were police aware of the bias?
A: Yes, police bosses were informed of the system's bias in September 2024, and police forces successfully argued to keep it operational.

Q: What is being done to address the bias?
A: The Home Office has procured a new algorithm independently tested to have no statistically significant bias, which will be tested soon.
