The European Parliament and the EU member states have reached political agreement on the European Commission’s draft Digital Services Act (DSA). The DSA was originally proposed in December 2020. The agreement comes just weeks after the Online Safety Bill was introduced to the UK parliament.

The DSA’s scope covers online intermediary services, whose obligations under the DSA depend on their role, size, and impact on the online environment. They include:

  • intermediary services offering network infrastructure;
  • hosting services;
  • online platforms such as marketplaces and social media platforms; and
  • very large online platforms and very large online search engines, each with a reach of more than 10% of the 450 million consumers in the EU, which in the Commission’s view could pose specific risks in relation to illegal content and societal harms, and which therefore bear greater responsibility for curbing illegal content online.

Under the new rules, intermediary services such as social media platforms and marketplaces will be required to implement measures to protect their users from illegal content, goods and services. The rules include:

  • The European Commission and the member states will have access to the algorithms of very large online platforms with the aim of ensuring algorithmic accountability;
  • Swift removal of illegal content online, including illegal products and services: a clearer “notice and action” procedure under which users will be able to report illegal content online and online platforms will have to act quickly;
  • Stronger safeguards to ensure notices are processed in a non-arbitrary and non-discriminatory manner and with respect for fundamental rights, including the freedom of expression and data protection;
  • Online marketplaces will be required to ensure that consumers can purchase safe products or services online, by strengthening checks that the information provided by traders is reliable (the “Know Your Business Customer” principle) and by making efforts to prevent illegal content from appearing on their platforms, including through random checks;
  • Victims of cyber violence will be better protected, especially against the non-consensual sharing of illegal content (“revenge porn”), which will be subject to immediate takedowns;
  • Fewer burdens and more time to adapt for SMEs: there will be a longer period to apply the new rules to support innovation in the digital economy. The Commission says that it will closely follow the potential economic effects of the new obligations on small businesses.

The DSA also includes new transparency obligations for platforms, aimed at allowing users to be better informed about how content is recommended to them and at offering them at least one recommendation option not based on profiling.

Users will have better control over how their personal information is used. Targeted advertising will be banned when it comes to sensitive data (for example, based on sexual orientation, religion, ethnicity) but significantly, there will not be a general ban despite earlier calls for one. However, platforms accessible to minors will have to take specific measures to protect them, including by fully banning targeted advertising.

Manipulating users’ choices through ‘dark patterns’ will be prohibited: online platforms and marketplaces should not nudge people into using their services, for example by giving more prominence to a particular choice or urging the recipient to change their choice via interfering pop-ups. Moreover, cancelling a subscription for a service should become as easy as subscribing to it.

Enforcement and redress

Recipients of digital services will have a right to seek redress for any damages or loss suffered due to infringements by platforms.

Online platforms and search engines that do not comply can be fined up to 6% of their worldwide turnover. In the case of very large online platforms (those with more than 45 million users), the Commission will be able to require compliance directly.

Misinformation and harmful content

Very large online platforms will have to comply with stricter obligations under the DSA, proportionate to the significant societal risks the Commission says they pose when disseminating illegal and harmful content, including disinformation. Very large online platforms will have to assess and mitigate systemic risks and be subject to independent audits each year. In addition, those large platforms that use so-called “recommender systems” (algorithms that determine what users see) must provide at least one option that is not based on profiling. The DSA also contains measures which provide that when a crisis occurs, such as a public security or health threat, the Commission may require very large platforms to limit any urgent threats for up to three months.

Next steps

The text will need to be finalised before the European Parliament and the Council give their formal approval. Once that process is complete, the DSA will be directly applicable across the EU and will apply fifteen months after entry into force or from 1 January 2024, whichever is later. For very large online platforms and very large online search engines, the DSA will apply four months after their designation.

Although the DSA will not apply directly in the UK, many of its provisions are similar to those in the draft Online Safety Bill, as well as to the proposals on subscriptions in the UK government’s recent response to its consultation on changes to consumer law. In addition, UK platforms and marketplaces doing business with EU-based suppliers and customers will need to bear the DSA’s requirements in mind.