
Google, Meta, Snap and Microsoft criticize EU lapse in child abuse scanning law

by Maya Albright

Google, Meta, Snap and Microsoft have criticized the European parliament after it blocked an extension of a law that allowed tech firms to scan for child sexual exploitation on their platforms.

The temporary measure, introduced in 2021 as a derogation from the EU's ePrivacy rules, let companies use automated detection tools to search for material linked to child sexual abuse, grooming and sextortion. It expired on 3 April after lawmakers declined to vote for an extension, leaving a legal gap that child safety experts say could allow crimes to go undetected.

The lapse has raised alarm among campaigners and industry figures, who warn that the absence of the law could sharply reduce reports of abuse. Those concerns are grounded in part in the experience of 2021, when a similar legal gap coincided with a 58% drop in reports.

Privacy concerns played a role in the parliamentary decision, with some lawmakers arguing against prolonging the exception. But child protection advocates say the lapse removes an important tool used by companies to detect child sexual abuse material, often referred to as CSAM, across their services.

The expiry leaves a gap between the EU's privacy protections and its child safety enforcement. Experts warn that without a replacement, more illegal activity may remain hidden from investigators and platform safety teams.

The row underscores the continuing tension in Europe over how to balance digital privacy rules with measures aimed at identifying serious abuse online. For now, major platforms no longer have the legal basis they previously relied on to run automated scanning for these specific harms.
