The European Parliament and the Council reached a provisional agreement on the Digital Services Act (DSA), landmark legislation that imposes new content-moderation and transparency duties on online platforms and services. The Act aims to create a safer and more transparent digital environment by preventing the spread of illegal, harmful, and misleading content in the digital space.
The DSA will apply to all online intermediaries that provide services in the single market, regardless of whether they are based in the European Union.
The Act’s obligations will be proportional to the number of users and the nature of the platforms’ services. Very large platforms and services with more than 45 million monthly active users in the European Union, such as Meta and Google, will be subject to strict requirements. On the other hand, micro and small businesses will be exempt from certain obligations in order to support the development of start-ups and smaller businesses in the European market.
Dita Charanzová, Vice-President of the European Parliament, warned: "Google, Meta and other large online platforms will have to act to better protect their users. Europe has made clear that they cannot act as independent digital islands."
Under the legislation, very large platforms and services will be required to conduct regular risk assessments to prevent their services from being used for illegal, manipulative, or abusive purposes.
The Act contains a number of measures to combat illegal content, goods, and services, including a mechanism for users to flag illegal content so that platforms can respond quickly. Online platforms will also have to collect basic information from business users before granting them access to online marketplaces, making it possible to identify people selling illegal goods or services.
The DSA requires online platforms to provide further transparency to their users on a variety of topics, including the algorithms used for content and product recommendations. Under the new rules, tech firms may be obliged to provide authorities and researchers access to their key data to allow them to gain insight into how online risks evolve.
Under the DSA, targeted advertising aimed at minors, as well as advertising based on sensitive data such as a user's ethnicity, gender, or religion, will be prohibited. Dark patterns, deceptive design tactics used to manipulate users into doing something or buying specific products or services, will also be banned.
A crisis response mechanism has been introduced by the Act in the context of the invasion of Ukraine. Under the mechanism, very large companies could be required to take specific measures in response to crises, particularly to avoid misinformation during crises affecting public security or public health.
The Commission will have the authority to directly supervise very large platforms and impose fines of up to 6% of a company's global turnover for noncompliance with the law. In the most serious cases, a non-compliant platform's services could be temporarily suspended. Users will also be able to challenge platforms' content moderation decisions and seek redress when the rules are violated.
The law is now pending formal approval from the Council and the European Parliament and is expected to enter into force in 2024. For very large online platforms and search engines, the rules will apply earlier, within four months of their designation.