European Union member states have reached a common position on the contentious child sexual abuse (CSA) regulation, aiming to systematically remove abusive material from online platforms. The agreement, reached this Wednesday by Justice ministers, seeks to bolster protections for children while navigating concerns about privacy and potential mass surveillance. The proposed law would establish a new EU Centre on Child Sexual Abuse and empower national authorities to demand content removal.
The agreement follows years of debate and failed attempts to forge consensus, particularly around the sensitive issue of scanning private communications. While the initial proposal faced strong opposition, a compromise has been reached that avoids mandatory scanning of end-to-end encrypted messages by authorities, though platforms can still scan messages themselves.
The Road to a CSA Regulation
Negotiations surrounding the CSA regulation began in 2022, with successive rotating presidencies struggling to reconcile differing viewpoints. The core challenge centered on “detection orders” – the extent to which authorities should be able to proactively search for illegal content. Previous presidencies from the Czech Republic, Spain, Belgium, Hungary, and Poland were unable to secure a workable agreement.
Denmark’s presidency ultimately brokered a compromise by removing the requirement for authorities to scan private communications. This concession addresses significant concerns raised by technology companies and privacy advocates. However, the agreement still allows platforms like Facebook Messenger and Instagram to implement their own scanning mechanisms.
Big Tech Response
The technology industry has largely welcomed the compromise, though with reservations. CCIA Europe, a Brussels-based lobby group, emphasized the importance of balancing child protection with the confidentiality of communications. It stated that maintaining end-to-end encryption is crucial and expressed hope that this principle will guide further negotiations.
Additionally, industry stakeholders are keen to see a clear legal framework that avoids unintended consequences and ensures the responsible use of detection technologies.
Privacy Concerns Remain
Despite the compromise, online privacy advocates continue to express concerns about potential mass surveillance. Former Pirate MEP Patrick Breyer argues that the agreement legitimizes the “warrantless, error-prone mass surveillance” of European citizens by US corporations. He contends that the “voluntary” scanning of content effectively creates a system of widespread monitoring.
A key concern is the accuracy of artificial intelligence (AI) systems used to detect CSA imagery. Data from the German Federal Police indicates a high rate of false positives, with 50% of reports proving to be criminally irrelevant. The potential for errors raises questions about the impact on innocent individuals and the erosion of privacy.
Furthermore, the introduction of age-verification systems, potentially relying on ID cards or facial recognition, is seen as a threat to online anonymity and data protection. The European Data Protection Board has consistently highlighted the risks associated with such technologies.
What’s Next for the CSA Regulation?
With the Council reaching a common position, the next step involves “trilogue” negotiations with the European Parliament and the European Commission, scheduled to begin in 2026. These negotiations will be critical in finalizing the details of the CSA regulation and ensuring a balanced approach to child protection and fundamental rights.
These talks must conclude before the temporary ePrivacy derogation, which permits platforms' voluntary scanning practices, expires. The outcome of these negotiations will significantly shape the future of online safety and privacy within the European Union. Stakeholders should continue to monitor developments and engage in the ongoing debate surrounding this important legislation.