Use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online (temporary derogation from certain provisions of Directive 2002/58/EC)  
2020/0259(COD) - 11/12/2020  

The Committee on Civil Liberties, Justice and Home Affairs adopted the report by Birgit SIPPEL (S&D, DE) on the proposal for a regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online.

As a reminder, the proposal aims to introduce limited and temporary changes to the rules governing the privacy of electronic communications so that over-the-top (“OTT”) interpersonal communications services, such as web messaging, voice over Internet Protocol (VoIP), chat and web-based email services, can continue to detect, report and remove child sexual abuse online on a voluntary basis.

The committee recommended that the European Parliament’s position adopted at first reading under the ordinary legislative procedure should amend the Commission proposal.

Scope

Members considered that the proposed Regulation should apply only to videos or images exchanged over messaging or email services. It should not apply to the scanning of text or audio communications, which remain fully subject to the provisions of the ePrivacy Directive.

In view of its temporary nature, the material scope of the proposed Regulation should be limited to the established definitions of so-called ‘child pornography’ and ‘pornographic performance’ as defined in Directive 2011/93/EU.

Additional safeguards

The committee stated that voluntary measures applied by providers offering number-independent interpersonal communications services in the EU, for the sole purpose of detecting and reporting child sexual abuse online and of detecting, removing and reporting child sexual abuse material, should be subject to the following conditions:

- a mandatory prior data protection impact assessment and a mandatory prior consultation procedure before the technology is used;

- human oversight and intervention are ensured for any processing of personal data, and no positive result is sent to law enforcement authorities or organisations acting in the public interest without prior human review;

- appropriate procedures and redress mechanisms are in place;

- no interference with any communication protected by professional secrecy;

- effective remedies are provided by the Member States at national level.

All of these conditions must be met to ensure that the restriction of fundamental rights which this activity entails remains proportionate.

Data retention

When no online child sexual abuse has been detected, all data should be deleted immediately, according to Members. Only in confirmed cases can the strictly relevant data be stored for use by law enforcement for a maximum of three months.

Public register

The Commission should establish a public register of organisations acting in the public interest against child sexual abuse with which providers of number-independent interpersonal communications services can share personal data under this Regulation.

Time limitation of the proposed Regulation

Members proposed shortening the period of application of the Regulation by bringing forward its expiry date from 31 December 2025 to 31 December 2022.