An important step has been taken with the provisional political agreement reached on the Digital Services Act (DSA) between the Council and the European Parliament.
In terms of its ambition, the nature of the actors regulated and the innovative character of the supervision involved, the DSA is a world first in the field of digital regulation.
The DSA follows the principle that what is illegal offline must also be illegal online. It aims to protect the digital space against the spread of illegal content, and to ensure the protection of users’ fundamental rights.
The DSA will apply to all online intermediaries providing services in the EU.
The obligations introduced are proportionate to the nature of the services concerned and tailored to the number of users, meaning that very large online platforms (VLOPs) and very large online search engines (VLOSEs) will be subject to more stringent requirements. Services with more than 45 million monthly active users in the European Union will fall into the categories of very large online platforms and very large online search engines.
To safeguard the development of start-ups and smaller enterprises in the internal market, micro and small enterprises with fewer than 45 million monthly active users in the EU will be exempted from certain new obligations.
The EU aims for a secure digital marketplace for citizens and businesses
In order to ensure the effective and uniform implementation of the requirements under the DSA, the Council and the Parliament have decided to confer on the Commission exclusive power to supervise VLOPs and VLOSEs with regard to the obligations specific to this type of actor.
They will be supervised at European level in cooperation with the member states. This new supervisory mechanism maintains the country-of-origin principle, which will continue to apply to other actors and requirements covered by the DSA.
Given the important role played by these actors in the daily lives of European consumers, the DSA will impose a duty of care on marketplaces vis-à-vis sellers who sell their products or services on their online platforms.
Marketplaces will in particular have to collect and display information on the products and services sold in order to ensure that consumers are properly informed.
Systemic risks of very large platforms and search engines
The DSA introduces an obligation for very large digital platforms and services to analyse the systemic risks they create and to carry out a risk-reduction analysis.
This analysis must be carried out every year and will enable continuous monitoring aimed at reducing risks associated with:
- dissemination of illegal content
- adverse effects on fundamental rights
- manipulation of services having an impact on democratic processes and public security
- gender-based violence, harm to minors, and serious consequences for the physical or mental health of users
For online platforms and interfaces covered by the DSA, the co-legislators have agreed to prohibit misleading interfaces known as ‘dark patterns’ and practices aimed at misleading users.
Recommender systems feature in many online services, allowing users to quickly access relevant content. Transparency requirements for the parameters of recommender systems have been introduced in order to better inform users about the choices these systems make on their behalf. VLOPs and VLOSEs will have to offer users a content recommendation system that is not based on profiling.
In the context of the Russian aggression in Ukraine and the particular impact on the manipulation of online information, a new article has been added to the text introducing a crisis response mechanism.
This mechanism will be activated by the Commission on the recommendation of the board of national Digital Services Coordinators. It will make it possible to analyse the impact of the activities of VLOPs and VLOSEs on the crisis in question and to decide on proportionate and effective measures, with due respect for fundamental rights, to be put in place.
Protecting minors online
Platforms accessible to minors will have to put in place special protection measures to ensure their safety online, in particular when the platforms are aware that a user is a minor. Platforms will be prohibited from presenting targeted advertising based on the use of the personal data of minors as defined in EU law.
More information: European Council – Press release