News

Published: 26 March, 2026

Children’s online protection is under threat


What has happened?

The temporary ePrivacy derogation from 2021 allows services to voluntarily detect and report child sexual abuse material. The derogation expires on April 3, 2026, unless it is extended. The European Commission has proposed an extension, but the European Parliament wants to narrow the derogation so that only already identified or flagged material may be scanned. Negotiations have stalled, and if no agreement is reached before the deadline, a legal vacuum will arise.

What does this mean for children?

Reporting to the National Center for Missing & Exploited Children (NCMEC) shows how crucial the derogation is:

  • When the underlying privacy rules took effect in late 2020, before the derogation was in place, reporting dropped by 58% in 18 weeks
  • In 2024, NCMEC received 20.5 million reports, corresponding to 29.2 million incidents and over 62 million files – almost all from the platforms themselves
  • Without the derogation, the ability to detect and stop abuse material risks decreasing dramatically

What are platforms doing today?

Major platforms such as Meta, TikTok, Snapchat, and Roblox use the derogation to:

  • Scan images, videos, and chats to identify both known and new abuse material
  • Detect grooming and inappropriate behavior
  • Proactively flag suspicious material and report it to NCMEC

Without the derogation, these measures risk disappearing:

  • Hash matching of known material
  • AI-based detection of new material
  • Grooming detection and behavioral analysis
  • Proactive flagging before users report
  • Reporting to NCMEC

These tools are central to protecting children and to effective law enforcement.

What do the platforms themselves say?

Snapchat, Google, LinkedIn, Meta, Microsoft, and TikTok are urging the EU to extend the derogation. Allowing it to expire would weaken children's protection and shut down vital tools for stopping the spread of abuse material.

How does this affect Sweden?

If the derogation expires:

  • Swedish children will have less protection against abuse and grooming online
  • Platforms will no longer be able to proactively analyze chats or content
  • Law enforcement will receive fewer reports, making investigations fragmented and less effective

Examples of Swedish cases where the derogation has been crucial:

  • Stockholm 2026: Tips via NCMEC led to convictions for serious child pornography offenses
  • Linköping 2026: Snapchat detected material and reported it via NCMEC, leading to a conviction
  • Haparanda 2024: A man was convicted following an NCMEC report of suspected file sharing

Even today, the spread of this material is massive and heavily underreported. Without technical detection tools, it would be impossible for the police to effectively stop abuse.


What does ChildX demand?

Full extension of the ePrivacy derogation
– Platforms must continue to be able to detect and report abuse material, including grooming

Preserve voluntary detection
– The ability to analyze both known and new material must remain

Introduce permanent legislation with clear requirements
– Platforms must actively detect, analyze, and report abuse material
– Child protection systems must operate continuously with legal support

Turning off these systems does not stop abuse – it only makes children invisible online.
ChildX urges decision-makers not to dismantle existing protections, and at the same time to quickly establish a permanent legal basis that protects children from sexual exploitation online.