The Digital Services Act (DSA) is the European Union’s new regulation on digital services, aimed at updating the rules governing digital intermediary service platforms.
The main objective of the new European Union Regulation is to create an updated legal framework for digital services, while ensuring greater protection of consumer rights and promoting fair competition among digital businesses.
The DSA aims to ensure greater transparency of online activities and to prevent the dissemination of illegal or harmful content, while safeguarding freedom of expression and users’ privacy.
In particular, digital service providers will be required to comply with stricter rules on the removal of illegal content (such as child sexual abuse material, incitement to hatred and violence, etc.). In addition, the DSA provides for a system through which users can report such content, as well as an appeals mechanism for those who believe their content has been unjustly removed.
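As a rough illustration of how such a notice-and-action flow might be modeled in software, here is a minimal Python sketch; the class names, fields, and decision function are hypothetical illustrations, not anything prescribed by the DSA or used by any real platform.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class NoticeStatus(Enum):
    PENDING = "pending"
    REMOVED = "removed"    # content taken down
    REJECTED = "rejected"  # notice found unfounded


@dataclass
class IllegalContentNotice:
    """A user's report of allegedly illegal content (hypothetical model)."""
    content_id: str
    reporter_id: str
    reason: str                       # e.g. "incitement to hatred"
    status: NoticeStatus = NoticeStatus.PENDING
    decision_reason: Optional[str] = None


@dataclass
class Appeal:
    """An appeal by the content's author against a removal decision."""
    notice: IllegalContentNotice
    appellant_id: str
    argument: str
    upheld: Optional[bool] = None     # None until reviewed


def decide_notice(notice: IllegalContentNotice,
                  is_illegal: bool, reason: str) -> None:
    """Record a moderation decision together with its stated reason."""
    notice.status = NoticeStatus.REMOVED if is_illegal else NoticeStatus.REJECTED
    notice.decision_reason = reason
```

The point the sketch captures is that every moderation decision carries a stated reason, which is what makes a meaningful appeal possible.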
The Regulation also introduces an obligation for digital service providers to give their users more information about content moderation and about how the algorithms that determine content visibility work.
The Regulation will apply to all providers of online intermediary services, including social networks, e-commerce platforms, search engines and cloud computing services, as well as hosting service providers that manage user-generated content.
Digital services operated by small businesses (e.g., small e-commerce platforms or online forums) that do not exceed certain activity thresholds remain outside the scope of the DSA, as do intermediary services provided by public entities or noncommercial nonprofit organizations.
The Digital Services Act defines a number of obligations requiring digital service providers to take measures to ensure the security and protection of users’ rights and to prevent the dissemination of illegal content. Among the obligations the DSA imposes on intermediary services are:
At the same time, the Digital Services Act does not impose a general obligation on platforms to monitor user-uploaded content.
National authorities may request information from platforms about users’ activities, as long as such requests are reasoned and respect users’ fundamental rights, privacy, and freedom of expression.
As a final point, the DSA requires platforms to provide users with effective complaint and redress channels in case of unjustified removal of content or violation of their rights.
The Digital Services Act (DSA) introduces stricter obligations for digital platforms designated as “very large”: those with more than 45 million monthly active users in the European Union. Among these stricter obligations are:
The DSA envisions the creation of the European Centre for Algorithmic Transparency (ECAT), which will be tasked with monitoring and regulating the use of algorithms by digital platforms. ECAT will monitor the impact of algorithms on user behavior and the marketplace, and ensure transparency in how platforms use them. In addition, ECAT will provide technical support to national authorities and the European Data Protection Board (EDPB) to ensure uniform application of the DSA rules across the European Union.
The DSA also provides for the creation of an independent mediation body that users will be able to contact to resolve disputes related to content moderation. This body, called the Board for Digital Services (BDS), will be composed of independent experts and will be tasked with ensuring that digital service providers apply their content moderation policies properly.
The Regulation will not be fully applicable until February 17, 2024, although some provisions took effect on November 16, 2022.
The most recent provision to take effect is the requirement for service providers to report the average number of European users they serve. This figure is the basis for designating a platform as a “very large platform,” a designation that in turn triggers a set of stricter obligations that only this category of platforms must take on. Within a couple of months, providers with more than 45 million users will begin paying the supervisory fee, and they will also have to undergo independent audits before July 2023.
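To make the designation arithmetic concrete, here is a small Python sketch. The 45 million threshold is in the DSA itself; the monthly user counts below are invented purely for illustration.

```python
# The 45 million threshold comes from the DSA; the sample counts are invented.
VLOP_THRESHOLD = 45_000_000  # monthly active users in the EU


def average_monthly_active_users(monthly_counts: list[int]) -> float:
    """Average of the reported monthly active EU user counts."""
    return sum(monthly_counts) / len(monthly_counts)


# Invented six-month sample of monthly active EU users for one platform.
reported = [43_900_000, 44_800_000, 46_200_000,
            47_100_000, 45_600_000, 46_900_000]

avg = average_monthly_active_users(reported)
print(f"average MAU: {avg:,.0f} -> very large platform: {avg >= VLOP_THRESHOLD}")
```

Run on these made-up figures, the average comes to 45,750,000, so the hypothetical platform would cross the threshold even though individual months fall below it.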
Another deadline to keep in mind is February 17, 2024, the date by which member states must designate the national competent authority that will oversee application of the Regulation, in line with the European Commission’s guidelines. Meanwhile, the Commission is establishing ECAT to support its new oversight role with internal and external multidisciplinary expertise.
Margherita Manca