EU Parliament Blocks Child Exploitation Scan Law
10 Apr
Summary
- Law allowing big tech to scan for child abuse expired April 3.
- EU Parliament did not vote to extend the temporary measure.
- Child safety experts warn crimes may go undetected.

A critical law enabling big tech firms to scan their platforms for child sexual exploitation has expired in the European Union, creating a legal vacuum. The temporary measure, enacted in 2021 as a derogation from the EU's ePrivacy rules, lapsed on April 3 after the European Parliament, citing privacy concerns, did not vote to extend it. This has created uncertainty for companies such as Google, Meta, Snap, and Microsoft, which are now legally barred from scanning yet remain liable for removing illegal content under the Digital Services Act.
Child safety experts have expressed grave concerns, warning that the lapse will likely result in a steep decline in reports of child abuse. They point to a previous 18-week period in 2021 with a similar legal gap, which saw a 58% drop in reports from EU-based accounts to the National Center for Missing and Exploited Children. This lack of visibility, they argue, directly impacts the ability to protect victims, as "detection goes dark" while abuse continues.
Despite the EU's decision, major tech companies have stated they will continue to voluntarily scan their platforms for child sexual abuse material (CSAM), calling the parliament's inability to reach an agreement an "irresponsible failure." The European Parliament says it is prioritizing new legislation to combat online child sexual abuse, though it has given no timeline for agreement or implementation.
Privacy advocates, however, argue that scanning technologies infringe upon fundamental privacy rights and could open the door to mass surveillance, referring to the measures as "chat control." Conversely, child safety proponents counter that blocking CSAM is not an invasion of privacy and that "free speech does not include sexual abuse of children." The technology generates digital fingerprints of content and compares them against databases of known abuse material, with machine learning used to flag abuse patterns and language, so matching content can be identified and blocked without human review.
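The fingerprint-matching step described above can be sketched as follows. This is an illustrative simplification, not any vendor's actual pipeline: production systems use perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-encoding, whereas the SHA-256 hash here is a stand-in that matches only byte-identical files, and the "database" is a harmless placeholder.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Compute a digital fingerprint of a file's contents.

    Real scanners use perceptual hashes that tolerate re-encoding;
    SHA-256 is used here purely as a simplified illustration.
    """
    return hashlib.sha256(data).hexdigest()

def should_block(upload: bytes, known_fingerprints: set[str]) -> bool:
    """Return True if the upload matches the database of known material.

    The decision rests entirely on fingerprint comparison: no human
    ever views the content being checked.
    """
    return fingerprint(upload) in known_fingerprints

# Placeholder "database" built from an innocuous byte string.
known = {fingerprint(b"previously identified file bytes")}

print(should_block(b"previously identified file bytes", known))  # True
print(should_block(b"an ordinary holiday photo", known))         # False
```

Because only fingerprints are compared, the platform never needs to store or display the flagged material itself, which is the basis for the proponents' claim that blocking known CSAM does not require reading users' private content.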