
Digital Services Act: the new Digital Services Regulation

Published in: Intellectual Property
by Margherita Manca

The Digital Services Act (DSA) is the European Union’s new regulation on digital services, aimed at updating the rules that govern digital intermediary service platforms.

In this article we will discuss:

  1. DSA goals and innovations
  2. Who the Regulation applies to
  3. The obligations for platforms and for very large platforms
  4. The institutions introduced by the DSA
  5. When the DSA comes into force

DSA goals and innovations

The main objective of the new European Union Regulation is to create an updated legal framework for digital services, while ensuring greater protection of consumer rights and promoting fair competition among digital businesses.

The DSA ensures greater transparency of online activities and prevents the dissemination of illegal or harmful content, while promoting freedom of expression and the protection of users’ privacy.

In particular, digital service providers will be required to comply with stricter rules regarding the removal of illegal content (such as content related to pedophilia, incitement to hatred and violence, etc.). In addition, the DSA provides for the establishment of a system for users to report such content and an appeals mechanism for those who believe their content has been unjustly removed.

The Regulation also introduces an obligation for digital service providers to give users more information about content moderation and about how the algorithms that determine content visibility operate.

Who does the Regulation apply to?

The Regulation will apply to all providers of online intermediary services, including social networks, e-commerce platforms, search engines and cloud computing services, as well as hosting service providers that manage user-generated content.

Digital services operated by small businesses (e.g., small e-commerce platforms or online forums) that do not exceed certain activity thresholds remain outside the scope of the DSA, as do intermediary services provided by public entities or non-commercial nonprofit organizations.

What are the obligations for platforms?

The Digital Services Act requires digital service providers to take measures to ensure the security and protection of users’ rights and to prevent the dissemination of illegal content. Among the obligations the DSA places on intermediary services are:

  1. Transparency obligations: digital service providers must provide clear and accessible information to users, such as regarding conditions of use, content moderation policies, and advertising.
  2. Cooperation obligations: digital service providers must cooperate with national authorities to prevent and counter the dissemination of illegal content, such as child pornography and terrorist content.
  3. Notification obligations: digital service providers must notify national authorities in case of incidents that could compromise user security or data privacy.
  4. Obligations to cooperate with law enforcement authorities: digital service providers must cooperate with law enforcement authorities to prevent and counter the dissemination of illegal content.
  5. Algorithm transparency obligations: digital service providers must provide transparent information regarding the use of algorithms that influence users’ choices.

At the same time, the Digital Services Act does not impose on platforms a general obligation to monitor user-uploaded content.

National authorities may request information from platforms about users’ activities, as long as such requests are reasoned and respect users’ fundamental rights, privacy, and freedom of expression.

As a final point, the DSA requires platforms to provide users with effective complaint and redress channels in case of unjustified removal of content or violation of their rights.

What are the obligations for large platforms?

The Digital Services Act (DSA) introduces stricter obligations for digital platforms designated as “very large”: those with more than 45 million average monthly active users in the European Union. Among the stricter obligations are:

  1. Designation of a legal representative in the European Union to ensure greater accountability of digital platforms.
  2. Conducting risk analyses and impact assessments on security and protection of users’ fundamental rights.
  3. Adoption of measures to ensure transparency and publicity of advertising activities, particularly with regard to the identification of advertisers and advertising messages.
  4. Adoption of measures to prevent and counter abuse of market power, manipulation of information, and dissemination of illegal content.

What institutions does the DSA introduce?

The DSA envisions the creation of the European Centre for Algorithmic Transparency (ECAT), which will be tasked with monitoring and regulating the use of algorithms by digital platforms. ECAT will monitor the impact of algorithms on user behavior and on the marketplace, and will ensure transparency in platforms’ use of algorithms. In addition, ECAT will provide technical support to national authorities and the European Data Protection Board (EDPB) to ensure uniform application of the DSA rules across the European Union.

The DSA also provides for the creation of an independent mediation body, the Board for Digital Services (BDS), which users will be able to contact to resolve disputes related to content moderation. This body will be composed of independent experts and will be tasked with ensuring the proper application of content moderation policies by digital service providers.

When will the DSA come into force?

The Regulation will not apply in full until February 17, 2024, although some provisions took effect on November 16, 2022.

The most recent milestone was the requirement for service providers to report the average number of users they have in the European Union. This figure is essential for the subsequent designation of a platform as a “very large platform”, which in turn entails a set of stricter obligations that only this category of platforms must take on. Within a couple of months, providers with more than 45 million users will begin paying the supervisory fee. They will also have to undergo independent audits before July 2023.

Another deadline to consider is February 17, 2024, the date by which member states must designate the competent national authority (the Digital Services Coordinator) that will oversee the application of the Regulation in line with the European Commission’s guidelines. Meanwhile, the Commission is setting up ECAT to support its new oversight role with internal and external multidisciplinary expertise.

© Canella Camaiora Sta. All rights reserved.
Publication date: 15 June 2023
Last update: 7 September 2023

Textual reproduction of the article is permitted, even for commercial purposes, within the limit of 15% of its entirety, provided that the source is clearly indicated. In the case of online reproduction, a link to the original article must be included. Unauthorised reproduction or paraphrasing without indication of source will be prosecuted.

Margherita Manca

Lawyer at the Canella Camaiora Law Firm and member of the Milan Bar, she specialises in industrial law.