Digital Services Act  
2020/0361(COD) - 20/01/2022  

The European Parliament adopted amendments to the proposal for a Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC.

The matter was referred to the committee responsible for inter-institutional negotiations.

The main amendments adopted in plenary relate to the following points:

Subject matter and scope

The draft legislation on digital services clearly defines the responsibilities and obligations of intermediary service providers, in particular online platforms, such as social media and marketplaces. It would apply to intermediary services provided to service recipients whose place of establishment or residence is in the EU, irrespective of the place of establishment of the service providers. Micro and small enterprises would be exempt from certain obligations related to the legislation.

Notification and action procedure

All hosting service providers, regardless of their size, should put in place easily accessible, comprehensive and user-friendly notification and action mechanisms that allow the notifying party to easily notify the hosting service provider concerned of material it considers to be illegal content (‘notification’).

Hosting service providers should act on notifications without undue delay, taking into account the type of illegal content notified and the urgency to act, and should inform the person or entity that notified the specific content of their decision as soon as possible.

Members also provided for stronger safeguards to ensure that notifications are dealt with in a non-arbitrary and non-discriminatory manner and with respect for fundamental rights, including freedom of expression.

Removal of illegal content

Members suggested linking the concept of ‘illegal content’ to the general idea that ‘what is illegal offline should also be illegal online’. The proposed measures include clearly defined procedures for removing illegal products, services and content online.

Online platforms would be empowered to suspend, for a reasonable period of time and after issuing a prior warning, the provision of their services to recipients of the service who frequently provide illegal content, whose illegal content can be established without legal or factual examination, or who have received two or more orders to act regarding illegal content in the previous 12 months, unless those orders were later overturned.

Traceability of traders

Online platforms that allow consumers to conclude distance contracts with traders should obtain additional information on the trader and on the products and services the trader intends to offer on the platform. Prior to offering its services to the trader, the online platform operator should make best efforts to assess whether the information provided by the trader is reliable.

Online platforms should demonstrate that they are doing their utmost to prevent the dissemination of illegal products and services by traders and should inform recipients when a service or product they have acquired through the platform is illegal.

Where an online platform becomes aware that a product or service offered by a trader on its interface is illegal, it should promptly remove the illegal product or service from its interface, where appropriate inform the relevant authorities, and make available to the public a register containing information on illegal products and services removed from its platform in the last 12 months.

Targeted advertising

Online platforms should ensure that recipients of the service can refuse or withdraw their consent for targeted advertising purposes in a way that is no more difficult or time-consuming than giving that consent. Refusing consent to the processing of personal data for advertising purposes should not result in access to the functionalities of the platform being disabled. Alternative access options, such as options based on tracking-free advertising, should be fair and reasonable for both regular and one-time users.

Online platforms should also not use the personal data of minors for commercial purposes related to direct marketing, profiling and behaviourally targeted advertising. Targeting individuals on the basis of special categories of data that allow vulnerable groups to be targeted should not be permitted.

Increased transparency of algorithms

Providers' terms and conditions should be drafted in clear and unambiguous language. They should include information on the policies, procedures, measures and tools used for content moderation purposes, including algorithmic decision-making, human review, and the right to terminate the use of the service.

Online platforms should refrain from using deceptive or nudging techniques to influence users’ behaviour through ‘dark patterns’. Such techniques may be used to encourage the acceptance of terms and conditions, including consent to the sharing of personal and non-personal data.

Members also called for more choice over algorithm-based ranking: the parameters of recommender systems should be presented in an easily understandable way so that recipients understand how information is prioritised for them. Very large online platforms should let recipients decide whether they want to be subject to recommender systems that rely on profiling and ensure that a non-profiling option is available.

Compensation

Recipients of the service should have the right to seek, in accordance with relevant Union and national law, compensation from providers of intermediary services for any direct damage or loss suffered due to an infringement by those providers of the obligations established under this Regulation.

Other aspects

Other amendments adopted in plenary include:

- accessibility requirements for online platforms to ensure full, equal and unrestricted access to intermediary services for all recipients of services, including persons with disabilities;

- additional obligations for platforms used primarily for the distribution of user-generated pornographic content, including the obligation to ensure professional moderation of content by a human being trained to detect image-based sexual abuse;

- the need for providers to respect freedom of expression and the freedom and pluralism of the media in their terms and conditions, as well as a provision on the right to use and pay for digital services anonymously;

- combating the spread of disinformation by introducing provisions on mandatory risk assessments and risk mitigation measures, as well as making very large online platforms accountable through independent external audits.