On October 4, 2022, the Council of the European Union gave its final approval to the Regulation on a Single Market for Digital Services, also referred to as the “Digital Services Act” or DSA. This marks the final step for the DSA to come to life. Its main goal is to create a safe, predictable, and trustworthy online environment, and to protect the digital space against the spread of illegal content, online disinformation, and other societal risks. When adopting the DSA in July 2022, the European Parliament praised it as “strong, ambitious regulation of online platforms,” enabling “the protection of users’ rights online.” And indeed, the DSA is more than just a refresh of the EU’s existing regulatory framework for the online sector. It introduces a comprehensive regime of content moderation rules for a wide range of businesses across the EU.

1. Regulatory Approach

The DSA harmonizes the diverging national rules on how online services must moderate the third-party content they distribute, rules that had emerged across the EU Member States under the E-Commerce Directive (2000/31/EC) since its entry into force in 2000. With the DSA, the EU is now introducing a complete makeover of the relevant rules, particularly aiming to better reflect the importance of (large) online intermediary services for a democratic society. Unlike the E-Commerce Directive, the DSA will apply directly in all Member States, without the need for further implementation into national law.

It may, however, still take some time until the DSA’s obligations are fully fleshed out. The DSA provides that the EU Commission will lay out certain details in delegated and implementing acts supplementing the main text. In particular, the Commission shall adopt acts clarifying the criteria for identifying “very large online platforms” and “very large online search engines” (together, VLOPs) and laying down certain technical specifications. Those acts are not yet available, not even in draft form. It will therefore take some additional time until the full scale of the DSA’s provisions is clear.

2. Scope

The DSA will apply to online intermediary services offered to consumers and business users in the EU. Services are typically offered in the EU if, for example, users can order products or services in one or more Member States, pay in euros, select an EU language, or access the service via a relevant top-level domain. The provider’s place of establishment or location, whether within or outside the EU, is irrelevant in this regard.

Specifically, the DSA addresses four different categories of online intermediaries:

  • Intermediary services are online services that transmit, cache, or store third-party information, e.g., online search engines, wireless local area networks, domain name system (DNS) services, top-level domain name registries, virtual private networks, cloud infrastructure services, or content delivery networks that enable, locate, or improve the functions of other providers of intermediary services.
  • Hosting services are intermediary services that store third-party information provided by users of the service, e.g., cloud computing, web hosting, paid referencing services, or services enabling the sharing of information and content online, including file storage and sharing.
  • Online platforms are hosting services that also disseminate the information they store to the public, e.g., social networks or online marketplaces. Email and private messaging services typically fall outside the scope if the relevant communication takes place among a finite number of persons.
  • VLOPs are online platforms and search engines with at least 45 million monthly active users in the EU. VLOP status will be designated by the EU Commission in dedicated proceedings. Any VLOP-specific obligations under the DSA will then apply only as of four months after the provider has been formally notified of the Commission’s designation decision.

To safeguard the development of start-ups and smaller enterprises in the internal market, micro- and small enterprises will be exempted from certain of the DSA’s obligations.

3. Key Obligations

The DSA takes a tiered approach to its substantive obligations. While its most basic obligations apply to all providers of intermediary services, additional obligations apply cumulatively to providers in the other categories, with the heaviest regulation targeting VLOPs.

Obligations applicable to all providers of intermediary services:

  • Liability: The DSA retains the existing provider liability regime of the E-Commerce Directive. Providers of online intermediary services will thus generally continue to not be liable for illegal user-generated content unless they have actual knowledge of such content or, upon obtaining such knowledge, fail to act expeditiously to remove or block access to the content.
  • Official orders: All intermediary services that receive an order to act against illegal content must inform the relevant national authority of any follow-up given to the order, specifying whether and when they complied with it. The same obligation applies to orders to provide information.
  • Terms and conditions: Intermediary services must have clear terms and conditions for their content moderation practices. They must also provide easily accessible information on the right to terminate the use of the service. If their terms change significantly, they must notify their users.
  • Transparency reports: Intermediary services must publish annual transparency reports on their content moderation activities, including the measures they take to apply and enforce their terms and conditions.
  • Points of contact: Service providers must designate single points of contact for electronic communication with regulators in the EU and with their users, and non-EU-based providers must also appoint a legal representative in the EU.

Additional obligations applicable only to providers of hosting services:

  • Notice and action mechanisms: Hosting services must establish effective notice and action mechanisms allowing users to flag content they consider illegal.
  • Statement of reasons: Hosting services must provide users with a statement of reasons whenever they delete or block access to user content for content moderation purposes. They must also provide such a statement whenever they restrict payments, suspend or terminate the provision of the service, or suspend or terminate the user’s account.
  • Notification of suspicions of criminal offenses: If hosting services become aware of information giving rise to a suspicion of a serious criminal offense involving a threat to the life or safety of persons, they must notify national law enforcement or judicial authorities.

Additional obligations applicable only to providers of online platforms:

  • Complaint-handling system: In addition to the requirements described above, online platforms must facilitate the submission of complaints through internal complaint-handling systems.
  • Dispute settlement: Online platforms must implement mechanisms for out-of-court dispute settlement to resolve any dispute regarding their decisions to delete or block access to any user content.
  • Trusted flaggers: Online platforms must treat reports of illegal content submitted by designated “trusted flaggers” as high priority.
  • Misuse: Online platforms must impose effective sanctions for repeated misuse of their service, such as the frequent posting of illegal content or the frequent submission of unfounded notices or complaints about allegedly illegal content.
  • Transparency reports: In addition to the general transparency obligations for all intermediary services, online platforms are subject to additional transparency requirements, including reports on dispute settlements and the misuse of their service.
  • Interface design: Online platforms must not use any so-called “dark patterns,” i.e., practices that deceive or nudge users into undesired decisions through the design of their interface.
  • Advertising: Online platforms must ensure that users can easily identify advertisements and understand who presents or pays for the advertisement. Among other restrictions, they must not display personalized advertising directed towards minors or based on sensitive personal data (e.g., sexual orientation).
  • Recommender systems: Online platforms must make transparent the main parameters that determine how they suggest or prioritize information for users. Users must also be able to select or modify their preferred option for sorting/ranking content on the online platform if several options are available.
  • Protection of minors: In addition to the prohibition of personalized advertisements directed towards minors, online platforms have a general obligation to ensure a high level of privacy, safety, and security of minors on their service.
  • Traceability of traders: If online platforms enable consumers to conclude contracts with traders (e.g., on online marketplaces), they must ensure traceability of those traders by collecting basic trader information and assessing its veracity. If the platforms become aware of an illegal product or service offered by a trader, they must inform affected consumers.
  • Compliance by design: If online platforms enable consumers to conclude contracts with traders, they must design their service in such a way that traders can comply with certain of their own information obligations under EU law, e.g., regarding pre-contractual information under consumer protection rules.

Additional obligations applicable only to providers of VLOPs:

  • Systemic risks: In addition to the obligations above, VLOPs must conduct a specific risk assessment and implement mitigation measures addressing the distribution of illegal content; related risks for fundamental rights, civic discourse, electoral processes, and public security; intentional manipulation of their service; and adverse effects on minors, gender-based violence, and physical or mental health.
  • Transparency obligations: VLOPs are subject to further enhanced transparency obligations, which include annual independent compliance auditing, additional obligations regarding public advertising databases, and even more stringent semi-annual transparency reporting.
  • Recommender systems: VLOPs must offer users at least one option for each of their recommender systems that is not based on profiling.
  • Data access: VLOPs must allow regulators to access their data to assess compliance and grant vetted researchers access to their data to identify systemic risks of illegal or harmful content.
  • Compliance function: VLOPs must establish an independent compliance function and appoint at least one compliance officer who reports directly to management.
  • Supervisory fee: VLOPs must pay an annual supervisory fee, the amount of which will depend on the Commission’s estimated costs of supervision. The individual annual supervisory fee must not exceed 0.05% of the provider’s annual worldwide net income (see the illustrative sketch after this list).
  • Crisis response: The Commission may instruct VLOPs to take proportionate and effective measures in the event of a crisis, such as extraordinary threats to public security or public health in the EU.
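
As a purely illustrative sketch, and not part of the DSA’s text, the supervisory fee cap boils down to a simple calculation; the income figure below is hypothetical:

```python
def max_supervisory_fee(annual_worldwide_net_income: float) -> float:
    """Cap on the annual supervisory fee: 0.05% of the provider's
    annual worldwide net income (illustrative only)."""
    return 0.0005 * annual_worldwide_net_income

# Hypothetical provider with EUR 10 billion in annual worldwide net income:
print(f"Fee cap: EUR {max_supervisory_fee(10_000_000_000):,.0f}")  # Fee cap: EUR 5,000,000
```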

4. Enforcement

Each Member State will have to designate a competent authority as a Digital Services Coordinator (DSC) responsible for supervising the intermediary services established in its territory and for enforcing the DSA against them. The Commission will be solely competent to supervise and enforce the VLOP-specific obligations under the DSA.

Each DSC will serve as a single point of contact for the Commission and will be required to participate in the EU cooperation mechanism. The national DSCs will cooperate within an independent advisory group, called the European Board for Digital Services (the “Board”). The Board will work with the Commission and the DSCs to promote effective cooperation and, above all, a consistent application of the DSA.

Each EU Member State will determine its own rules on penalties for DSA infringements. The DSA only specifies the maximum fines that a Member State can impose. For a failure to comply with a DSA obligation, the maximum fine will be 6% of the offender’s annual worldwide revenue. For the supply of incorrect, incomplete, or misleading information, the failure to reply to or rectify such information, or the failure to submit to an inspection, the maximum fine will be 1% of the offender’s annual worldwide revenue. The maximum amount of a periodic penalty payment will be 5% of the offender’s average daily worldwide revenue. The Commission will be able to impose equivalent fines, but only on VLOPs. The interplay of these caps is illustrated in the sketch below.
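
As a purely illustrative sketch, and again not part of the DSA’s text, the three fine caps can be expressed as follows; the revenue figures are hypothetical:

```python
def max_fine_noncompliance(annual_worldwide_revenue: float) -> float:
    """Cap for failing to comply with a DSA obligation: 6% of annual worldwide revenue."""
    return 0.06 * annual_worldwide_revenue

def max_fine_incorrect_information(annual_worldwide_revenue: float) -> float:
    """Cap for incorrect, incomplete, or misleading information (and related failures): 1%."""
    return 0.01 * annual_worldwide_revenue

def max_periodic_penalty_per_day(average_daily_worldwide_revenue: float) -> float:
    """Cap per day for periodic penalty payments: 5% of average daily worldwide revenue."""
    return 0.05 * average_daily_worldwide_revenue

# Hypothetical provider with EUR 2 billion annual worldwide revenue:
annual_revenue = 2_000_000_000
print(f"Noncompliance cap: EUR {max_fine_noncompliance(annual_revenue):,.0f}")           # EUR 120,000,000
print(f"Information cap:   EUR {max_fine_incorrect_information(annual_revenue):,.0f}")   # EUR 20,000,000
print(f"Daily penalty cap: EUR {max_periodic_penalty_per_day(annual_revenue / 365):,.0f}")  # EUR 273,973
```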

Failure of an intermediary to comply with the DSA can also trigger private claims: users can seek compensation for damages or losses suffered due to a provider’s infringement of its DSA obligations. Such compensation can also be claimed on other grounds, especially under applicable national law or consumer protection rules. National authorities, including courts, must not take decisions that run counter to Commission decisions adopted under the DSA.

5. Interplay with Other EU Laws

The DSA complements sector-specific legislation such as the Audiovisual Media Services Directive, the Terrorist Content Online Regulation, and other EU laws that regulate other aspects of intermediary services. The DSA builds on the E-Commerce Directive, which has been the main legal framework for the provision of digital services in the EU for the last 20 years. To the extent that the DSA does not amend the E-Commerce Directive, the Directive’s application remains unaffected. Specifically, the DSA replaces the E-Commerce Directive’s provisions on provider liability. Otherwise, the E-Commerce Directive remains in effect and therefore continues to provide, for example, the country-of-origin privilege for online services and to govern imprint and general advertising obligations for online services.

6. Interplay with Member State Laws

The DSA seeks to fully harmonize the rules applicable to intermediary services. Accordingly, Member States must not adopt or maintain additional national requirements on matters falling within the scope of the DSA. Member States can, however, still adopt and enforce national rules that pursue public interest objectives other than those pursued by the DSA. On this basis, the German government, for example, expects that the DSA will largely replace the German Network Enforcement Act (Netzwerkdurchsetzungsgesetz, NetzDG), which regulates how social networks and video-sharing platforms must treat user reports of alleged criminal content. Against this background, it has already announced a revision of the NetzDG, and it also plans to revise the German Telemedia Act (Telemediengesetz, TMG) and the Youth Protection Act (Jugendschutzgesetz, JuSchG) following the DSA’s entry into force.

7. Entry into Force

After the final approval of the Council on October 4, 2022, the DSA will be published in the EU’s Official Journal. The DSA will enter into force 20 days after that publication. Affected service providers will then have until February 17, 2024, to comply with its provisions.

For VLOPs, which are directly supervised by the Commission, the new rules might kick in earlier. To allow VLOP designations, the obligations for online platforms to report on their user numbers, as well as the rules on the VLOP designation process, will apply already from the DSA’s entry into force. And, once designated by the Commission, providers of VLOPs have only four months to comply with the DSA, even if that deadline falls before February 17, 2024.

8. Conclusion

The DSA establishes a gigantic set of new rules for digital services in the EU. And unlike the recently adopted Digital Markets Act, its scope is not limited to just a few “gatekeeper” companies and their “core platform services.” The DSA will ultimately affect every company doing business with third-party content in the EU, regardless of size and place of establishment. All of these providers of intermediary services will need to comply with the numerous DSA obligations, unless they are exempted from certain rules as micro- or small enterprises. Affected companies can expect significant organizational and operational challenges to ensure compliance and to avoid potential regulatory fines.