Digital Services Act: EU institutions seek to tighten rules for online players

by Mathilde Quembre on 13 Jan 2022

The European Union’s Digital Services Act (DSA), currently working its way through the legislative process, will create an unprecedented set of new rules for intermediary service providers. The legislation will establish a framework for content moderation and reinforce the rules for platforms, further protecting the fundamental rights of all users of digital services across Europe. This blog looks at the DSA’s progress and the positions recently taken by the European Parliament and the Council of the EU. We explore how they are seeking to reshape the original proposal and what this means for businesses and consumers.

The DSA was proposed by the European Commission in 2020 as part of the Digital Services Package. For the past year, the European Parliament and the Council of the European Union have been preparing their positions on the text, with each institution working to strengthen the rules proposed by the Commission. The institutions will very soon start the three-way ‘trilogue’ negotiations to agree the final text of the regulation.

European Parliament

The European Parliament’s Internal Market and Consumer Protection (IMCO) Committee has agreed its report on the DSA. This paves the way for the Parliament to adopt its final position on the proposal in early 2022. The delay in adopting the report, originally scheduled for November 2021, highlights the difficult negotiations surrounding the provisions setting new rules for platforms, particularly on content moderation and the use of algorithms. The European Parliament’s position is decisively consumer-oriented, using the regulation as an opportunity to hold intermediary service providers accountable for the services that they provide.

"The result of the IMCO vote has shown that we are trying to send two main signals: that we would like to protect users and consumers, but at the same time make sure that we have a digital economy that keeps on growing in a transparent, trustworthy and democratic framework."

Christel Schaldemose MEP (S&D, Denmark), Rapporteur for the IMCO Committee.

Under the successive Portuguese and Slovenian Presidencies, the Council of the European Union agreed on its General Approach to the DSA, which received support from all Member States, allowing the interinstitutional negotiations to officially begin once the European Parliament is ready.

The European Parliament has sought to strike a balance between preventing illegal content from appearing online and preserving users’ freedom of speech. It wants to encourage platforms to take voluntary action on this front, as opposed to establishing a general monitoring obligation, whereby platforms would be required to monitor the entirety of user activity on their websites. Such an obligation has been a red line which neither policymakers nor the industry wanted to cross, as it would represent an unrealistic burden for companies and force them to assess the legality of all content posted on their websites.

The Parliament’s position remains that platforms should ensure that illegal content is removed diligently. If a platform notices content that possibly amounts to a ‘serious criminal offence’, it must contact the relevant authorities. However, Members of the European Parliament added safeguards to the text so that intermediary service providers are not forced to assess the legality of content, re-affirming that there is no general monitoring obligation.

Under the text proposed by the Parliament, platforms would also be responsible for restoring content that has wrongly been removed, as determined by the outcome of legal proceedings. The risk of platforms ‘over-removing’ content has been a concern of several MEPs, especially within the Civil Liberties Committee. Accordingly, the European Parliament’s report clarifies that providers should not be held liable if they do not remove content while an assessment of that content’s legality is still pending. All voluntary actions taken by intermediary service providers should be ‘effective, specific and targeted, as well as accompanied by appropriate safeguards’, including human oversight.

Another hotly debated topic in the European Parliament’s report concerns the use of algorithms and profiling. The Parliament has strengthened provisions related to algorithmic transparency, especially for very large online platforms (VLOPs). According to the text, VLOPs must inform users when they are subject to algorithms or automated tools and allow users to opt out. VLOPs must also disclose the methodology and systems undergirding their algorithms if requested by digital service coordinators (DSCs); perform assessments of automated decisions taken by algorithmic tools; and correct their algorithms if found non-compliant by the European Commission.

While Socialist & Democrat and Green MEPs strongly defended a ban on targeted advertising (based on users’ data, collected via tracking technologies), the European Parliament agreed on additional transparency measures and easy opt-out options, giving users the possibility to choose contextual advertising (based on the visited website’s content) over targeted advertising. The ban on targeted advertising will be limited to users under 18 years old. This compromise comes after intense negotiations with MEPs from the European People’s Party, which argued that small and medium-sized enterprises rely on targeted advertising to reach consumers at a reasonable cost.

Regarding online marketplaces, the European Parliament’s position advocates providing end-users with more information on products and services offered by third-party traders online. For instance, online marketplaces would require traders to provide an array of information and would need to ensure that this information is ‘complete and reliable’ before making it accessible. Marketplaces would also need to make best efforts to ensure that illegal content, particularly the offering and distribution of dangerous products, is not disseminated on their platforms. Measures such as random checks should be conducted, and marketplaces would need to keep a database of removed offers.

Council of the European Union

The Council of the European Union’s General Approach is more conservative and only moderately amends the European Commission’s initial proposal. The Council’s position makes it clear that the DSA should remain applicable to all sectors and proportionate.

In the General Approach, the scope of the DSA has been clarified so that the number of active recipients of a service, which is the basis for designating very large online platforms, is calculated by including all internet users browsing a given platform’s website ‘for the purposes of seeking information or making it accessible’. This amendment could be a game-changer for designating large platforms and VLOPs, as this definition would not only include users pro-actively using their services.

The enforcement regime has been at the heart of discussions in the Council. Member States have decided to give the European Commission exclusive powers to supervise VLOPs and large online search engines. The European Commission, in cooperation with Member State authorities, would therefore be responsible for designating and regulating VLOPs. This decision was taken so as not to repeat problems identified when enforcing the General Data Protection Regulation (GDPR), where long delays in enforcement actions have been widely criticised. For intermediary service providers that are not designated as VLOPs, authorities located in each provider’s respective country of origin would be responsible for enforcing the DSA.

In its proposal, the Council has further strengthened requirements for VLOPs to mitigate risks linked to their size and influence. The Council proposes that VLOPs must adjust their content moderation practices, making them quicker and more efficient, while also taking awareness-raising measures that would alert users to illegal content. Finally, VLOPs would also have additional transparency reporting obligations and would be held responsible for conducting improved risk assessments.

Next Steps

Once the European Parliament adopts the IMCO report in plenary session, the trilogue negotiations will begin. While the Parliament’s text is more focused on users’ rights and consumer protection, the Council’s position contains additional provisions on the supervision of VLOPs and on enforcement. There is no manifest contradiction between the two institutions’ positions, which makes it harder to predict which topics could trigger difficult discussions.

One of the most crucial issues in the negotiations might be diverging expectations regarding speed and timing, with Member States aiming to find an agreement as fast as possible, while, as Ms Christel Schaldemose MEP indicated, the European Parliament is ready to take more time to discuss the provisions of the future regulation.

The interinstitutional negotiations should begin in January 2022 and will be led under the auspices of the French Presidency of the Council. In a press conference on 9 December 2021, French President Emmanuel Macron stated that digital policies will be a top priority, especially work on the Digital Services Act, which will contribute to ‘the regulation and accountability’ of platforms. President Macron’s statements were re-affirmed in the 1 January release of the Presidency’s Programme.

Online platforms will be taking a keen interest in the trilogue negotiations over the next few months, as policymakers try to reach common ground on the rules that will affect their activities for the coming years. How the rules will affect them will depend on their size, with exemptions for small companies and additional provisions for very large ones. The institutions will also discuss the timeline for the entry into force of the regulation, which could be as soon as six months after its publication in the Official Journal, only giving businesses a short period of time to implement new measures.

If you have any questions about online platform regulation, or are interested in an informal chat, please contact us at enquiries@inlinepolicy.com.

Topics: European Politics, E-commerce, Data policy, Big Tech, Online Platforms, EU Digital Services Act, Technology

Written by Mathilde Quembre

Mathilde provides policy analysis and monitoring to clients in the tech sector.