
MediaWrites

By the Media, Entertainment & Sport group of Bird & Bird

4 minute read

New content moderation rules in Germany: flexibility for community standards and justice by procedure

In its latest “hate speech” judgements, the German Federal Supreme Court confirms that platforms have the right to set their own frameworks for user communication in their community standards. As a result, platforms may prohibit content that falls below the threshold of criminal liability in order to create a respectful debate and advertising environment for their users. At the same time, user interests must be safeguarded through content moderation by procedure. These seminal judgements set a legal framework applicable to all content sharing platforms.

What has happened in both “hate speech” cases?

The German Federal Supreme Court has ruled on two cases concerning the deletion of inappropriate yet lawful “hate speech” content. In each case, a user had posted xenophobic text on a social network in 2018. The social network at issue subsequently deleted both posts for infringement of its community standards (which ban hate speech) and limited both user profiles to a “read only” mode. Each user filed an action challenging the deletion of the content as well as the blocking of their account. In the first proceeding (court docket number: III ZR 179/20), the District Court and the Court of Appeal of Nuremberg rejected the user’s action. In the other case (court docket number: III ZR 192/20), the District Court of Regensburg partially granted the user’s claim, but the Court of Appeal again rejected the user’s challenge. The German Federal Supreme Court (Bundesgerichtshof) has now partially overruled both decisions.

What has the German Federal Supreme Court ruled on content moderation?

The Federal Supreme Court provides much-sought guidance on one of the most debated issues in content moderation in Germany: it confirms that a platform is entitled to delete unwelcome, yet lawful content for non-compliance with its own terms and conditions, in particular with its community standards. In detail:

  • On the substance, the German Federal Supreme Court found that a platform is generally obliged to provide its services to its users, enabling them to upload and share content as well as to contact and chat with other users. Any deletion therefore requires a contractual justification under the social network’s terms and conditions.
  • Terms and conditions can justify the deletion of unwelcome, yet lawful content if the provisions at issue are valid under the German Civil Code. To this end, the terms and conditions must not unreasonably disadvantage users, which is determined under a comprehensive balancing test involving the countervailing fundamental rights at issue. Considering, in particular, the users’ freedom of speech and the platform’s freedom to conduct a business, the Court found that a platform is generally entitled to delete content which, although not subject to criminal punishment, infringes its community standards. Platforms may indeed require their users to comply with certain communication standards which go beyond the requirements of criminal law. Therefore, a social network may also reserve the right to take action against violations (e.g., by deleting posts or by blocking accounts).
  • However, a platform’s right to establish community standards and to take enforcement measures is not unlimited under the Court’s holding. The interests of platforms must be balanced against those of users (such as the freedom of expression) so as to ensure that users’ rights remain as effective as possible. The Court drew on previous libel case law which sought to strike a fair balance between the interests of users, third parties and the platform by means of “procedural” safeguards.

What needs to be done to comply with this German “hate speech” case law?

Accordingly, a platform’s terms and conditions on content moderation will be fair (and thus enforceable) to the extent that the platform expressly commits in them to:

  • inform users about deletions of content (at least after the fact) and about any envisaged blocking of their account in advance;
  • give objective reasons for the respective action; and
  • proactively offer the opportunity to respond in an appeal process, followed by a potential second decision taken by a human employee.

A failure to comply with these criteria will render the terms and conditions invalid under German law, and measures taken against users’ hate speech content on that basis are then likely to be unlawful. The Court allows “narrowly tailored” exceptions to these principles, provided they are clearly defined in the general terms and conditions. Eligible exemptions include deletions for purely technical reasons that do not relate to the content of the statement, e.g., removing the posts of a user who has already irrevocably deleted their account. These potential exceptions are very likely to take centre stage in future court decisions.

What impact do these “hate speech” decisions have on platform liability in general?

Both judgements align with the German “justice by procedure” approach to platform liability for copyright infringement. The legislator drew on the same mechanism for the German implementation of Art. 17 of the DSM Directive, codified in the Copyright Service Provider Act (Urheberrechts-Diensteanbieter-Gesetz, “UrhDaG”). Under the UrhDaG, a similar procedure applies to all blocking actions: platforms are obliged to inform users about planned blockings and to give those affected the opportunity to respond. In future, German courts are likely to take a similar approach in other areas of IP, such as trade mark law.

It is also likely that German case law will apply or adapt the principles developed in these seminal judgements to other content sharing platforms. Under the new case law, every platform is generally entitled to set up its own framework for unwelcome, yet still lawful user behaviour, content or communication in order to ensure an attractive interaction and advertising environment, as long as it does not arbitrarily restrict individual expressions of opinion or limit user communication by subject matter (such as political or socially critical statements). For instance, platforms can prevent certain flirting attempts (e.g. vaguely salacious remarks in a chat) or specific advertising campaigns in a Metaverse to create a safe space for their users. While the interests of users must be safeguarded by “procedural” means, platforms do not have to permit every expression of opinion in a state-like manner (comparable to the diversity of opinions required in German public broadcasting). As a result, platforms can set the framework for their user communication based on objective criteria.

Tags

social media, hate speech, germany