Digital Services Act

2020/0361(COD)

The Committee on the Internal Market and Consumer Protection adopted the report by Christel SCHALDEMOSE (S&D, DK) on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC.

The committee recommended that the European Parliament’s position adopted at first reading under the ordinary legislative procedure should amend the Commission's proposal as follows:

Scope

Members stipulated that the proposed regulation should apply to intermediary services provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services. It should not apply to any service that is not an intermediary service or to any requirements imposed in respect of such a service, irrespective of whether the service is provided through the use of an intermediary service.

Removing illegal content

The amended text stated that online platforms should:

- be entitled to suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide illegal content whose illegality can be established without conducting a legal or factual examination, or in respect of which the platform has received two or more orders to act regarding illegal content in the previous 12 months, unless those orders were later overturned;

- suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaint-handling systems, respectively, by individuals, entities or complainants that frequently submit unfounded notices or complaints.

Advertisements to minors

The report stressed that online platforms should take steps to phase out the collection or processing of personal data for the purpose of targeting recipients with non-commercial and political advertising, in favour of contextual advertising. The same would apply to targeting people on the basis of sensitive data and to targeting minors.

For the purpose of targeting the recipients to whom advertisements for commercial purposes are displayed, online platforms should offer users the possibility to opt out easily from micro-targeted tracking and from advertisements based on their behavioural data or other profiling techniques.

Algorithms and dark patterns

The amended text called for providers of intermediary services to include, in their terms and conditions, information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review.

Online platforms should refrain from using deceptive or nudging techniques to influence users’ behaviour through ‘dark patterns’. Such techniques can be used, for instance, to push users into accepting terms and conditions, including giving consent to the sharing of personal and non-personal data.

Moreover, the Digital Services Coordinator of each Member State may, by means of national legislation, request a very large online platform to cooperate with it in handling cases involving the removal of lawful content that has been taken down erroneously.

The report also called for greater accountability in the use of algorithms.

Pornographic content

Where an online platform is primarily used for the dissemination of user-generated pornographic content, the platform should take the necessary technical and organisational measures to ensure:

- that users who disseminate content have verified themselves through a double opt-in e-mail and cell phone registration;

- professional human content moderation, trained to identify image-based sexual abuse, including content with a high probability of being illegal;

- the accessibility of a qualified notification procedure whereby individuals may notify the platform that image material depicting them, or purporting to depict them, is being disseminated without their consent, and may supply the platform with prima facie evidence of their physical identity.

The amended text stipulated that content notified through this procedure is to be suspended without undue delay.

Complaints and compensation

The amended text noted that online platforms should handle complaints submitted through their internal complaint-handling system in a timely, non-discriminatory, diligent and non-arbitrary manner, and within ten working days of the date on which the online platform received the complaint.

Upon receipt of a complaint, the Digital Services Coordinator of establishment should assess the matter in a timely manner and should inform the Digital Services Coordinator of the Member State where the recipient resides or is established, within six months, whether it intends to proceed with an investigation.

Recipients of the service should have the right to seek compensation from providers of intermediary services for any direct damage or loss suffered as a result of an infringement by those providers of the obligations established under the proposed regulation.