Digital Services Act

2020/0361(COD)

PURPOSE: to contribute to the proper functioning of the internal market and to ensure a safe, predictable and reliable online environment in which the fundamental rights enshrined in the Charter are duly protected.

LEGISLATIVE ACT: Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act).

CONTENT: the Regulation lays down harmonised rules for the provision of intermediary services in the internal market. It clearly defines the responsibilities and obligations of intermediary service providers, in particular online platforms, such as social media and marketplaces.

The Digital Services Act is based on the principle that what is illegal offline should also be illegal online. It aims to protect the digital space against the spread of illegal content and to ensure the protection of users’ fundamental rights.

Scope of application

The Regulation applies to intermediary services offered to recipients of the service that have their place of establishment or are located in the Union, irrespective of where the providers of those services are established. Very large online platforms and online search engines (those with more than 45 million active users per month in the EU) are subject to stricter requirements. Micro and small enterprises are exempted from certain obligations under the Regulation.

Governance

The Commission will have exclusive powers to supervise very large online platforms and online search engines for compliance with their obligations. These services will be monitored at European level, in cooperation with the Member States.

Measures and protection against misuse

Hosting service providers will put in place mechanisms enabling any individual or entity to report illegal content online. These mechanisms must be easy to access and use and must allow notifications to be submitted exclusively by electronic means. The Regulation also requires platforms to cooperate with specialised ‘trusted flaggers’ to identify and remove illegal content.

Due diligence obligations

Providers of intermediary services should clearly indicate, and keep up to date in their terms and conditions, information on the grounds on which they may restrict the provision of their services. In particular, they should include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint-handling system.

Traceability of traders

In order to discourage traders from selling products or services in breach of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders must ensure that traders can be traced. Once they have authorised the trader to offer a product or service, the online platform providers concerned will endeavour to randomly check whether the products or services offered have been identified as illegal in online databases.

Right to information

Marketplaces will collect and display information about the products and services offered, so that consumers are properly informed. Where an online platform provider becomes aware that a trader is offering an illegal product or service to consumers in the EU through the use of its services, the provider should inform consumers, insofar as it has their contact details, of: (i) the fact that the product or service purchased is illegal; (ii) the identity of the trader; and (iii) any relevant means of redress.

Advertising on online platforms

Providers of online platforms that present advertising on their online interfaces will ensure that service recipients can identify, for each specific advertisement presented to each individual recipient, in a clear, concise, unambiguous manner and in real time: (i) that the information presented is an advertisement, including through prominent markings; (ii) the natural or legal person on whose behalf the advertisement is presented; and (iii) the natural or legal person who paid for the advertisement.

Providers of online platforms will not present advertising to recipients of services based on profiling using special categories of sensitive data.

Dark patterns

For online platforms and interfaces covered by the Digital Services Act, the co-legislators have agreed to prohibit misleading interfaces, known as ‘dark patterns’, and other practices aimed at misleading users.

Recommender system transparency

The legislation lays down transparency requirements for the parameters of recommender systems, so as to improve information for users and the choices they make. Very large online platforms and search engines will have to offer users a system for recommending content that is not based on profiling.

Crisis mechanism

In the context of the Russian aggression against Ukraine and its particular impact in terms of the manipulation of online information, a new article has been added to the text introducing a crisis response mechanism. This mechanism will be activated by the Commission on the recommendation of the board of national Digital Services Coordinators. It will make it possible to analyse the impact of the activities of very large online platforms and search engines on the crisis in question, and to decide on proportionate and effective measures to be put in place while ensuring respect for fundamental rights.

Protection of minors online

Providers of online platforms accessible to minors will put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service. They will not present advertisements based on profiling using personal data of the recipient of the service when they know with reasonable certainty that the recipient is a minor.

Systemic risks presented by very large platforms

Very large online platforms and search engines are required to analyse the systemic risks they create and to assess how those risks can be mitigated.

This assessment will be carried out annually and will allow continuous monitoring to reduce risks related to: (i) the dissemination of illegal content; (ii) adverse effects on fundamental rights; (iii) misinformation or the manipulation of elections; and (iv) cyber-violence against women and harm to minors online. The corresponding mitigation measures must be balanced against restrictions on freedom of expression and will be subject to independent audits.

New rights

Users will have new rights, including the right to complain to the platform, to request out-of-court dispute settlement, to complain to their national authority in their own language, and to seek redress for violations of the rules.

ENTRY INTO FORCE: 16.11.2022.

APPLICATION: from 17.2.2024.