Online Platforms' Speech Rights According to the European Court of Human Rights: Google v. Russia
If content moderation is protected expression, as the recent ECtHR ruling suggests, what could this mean for the DSA?
When is it justified to require an online platform to remove content or accounts, and what limits and safeguards should apply? Can states force a platform to leave content up, purportedly to protect their users' right to freedom of expression? These are two highly relevant questions in current debates about freedom of expression on online platforms.
More specifically, these issues are addressed in platform regulation frameworks, both within the European Union's legal order, primarily through the Digital Services Act (DSA), and in the United States under Section 230 of the Communications Decency Act as well as the First Amendment itself.
Regarding the DSA, Article 8 outlines a detailed regime for platforms’ obligations upon receipt of an order to act against illegal content, “issued by the relevant national judicial or administrative authorities,” and Article 16 establishes comprehensive notice-and-action mechanisms. Furthermore, Article 14 requires intermediaries to inform users, in their terms and conditions, of any restrictions they impose on user-provided content, and specifies that platforms must apply such restrictions with “due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter”.
In the United States’ legal system, platforms enjoy clear liability exemptions whether they decide to leave up or take down content. But recent policy debates and legislative proposals have emerged in areas such as platform transparency mandates for content, including disinformation and hate speech (in New York and California), as well as requirements of viewpoint-neutral content moderation and restrictions on platforms’ ability to moderate posts about political candidates (in Texas and Florida, respectively). As Daphne Keller has explained, it is worth noting that courts have blocked “democracy-protective” laws from both camps. While a majority of Supreme Court justices rejected Texas’s and Florida’s must-carry rules for social media, First Amendment challenges to state laws such as those in New York and California have succeeded in lower courts.
The European Court of Human Rights (ECtHR) is at the forefront of regional human rights case law on freedom of expression online. It adopted early, landmark decisions on the blocking of online content and applications (notably, the 2012 judgment in Ahmet Yildirim vs Turkey), while also endorsing more problematic approaches to the relationship between freedom of expression online and platform liability exemptions (in the saga that began with the much-questioned decision in Delfi AS vs Estonia and continued with the more recent, yet still problematic, approach adopted in Sanchez vs France).
In the recent judgment on the case of Google LLC and others vs Russia, the Court clearly reprimands the arbitrary and disproportionate regulations in place in Russia, particularly affecting online expression by political dissenters. It is also important to note that, although the Russian Federation ceased to be a party to the European Convention on Human Rights in September 2022 following its exclusion from the Council of Europe, the ECtHR remains competent to deal with applications concerning actions or omissions that occurred before that date.
The importance of this case lies in the Court's focus on Google's position and rights as an intermediary platform. The case concerns a series of requests that Google received from Russian State authorities (in particular, the telecommunications regulator Roskomnadzor) to block specific content and channels on YouTube on the grounds that they contained “socially significant disinformation” or “calls for extremist activities.”
In addition to this, Google Russia had also suspended and refused to reinstate the Gmail and YouTube accounts of Tsargrad, a Russian media group owned by a Russian businessman sanctioned by the European Union, the United States and Canada for providing material support to Russian-backed separatists in eastern Ukraine and publicly supporting Russia’s annexation of Crimea. The Ninth Commercial Court of Appeals decided in favor of Tsargrad’s claims, and access to the accounts was restored, although features that would enable Tsargrad to generate revenue from the accounts (“monetization”) were not. Google Russia was subsequently charged with an administrative offence for failure to comply with the court order.
As a result of these decisions, domestic courts imposed a series of coercive penalties on various Google entities, both in Russia and abroad, which, at the time of the application to the Court in Strasbourg, had accrued to the amount of 16 trillion (!) US dollars. It is worth noting that the awards in favor of Tsargrad paved the way for other plaintiffs, predominantly Russian State-owned and affiliated television channels, to file more than twenty repeat or “copycat” claims. All tangible property owned by Google Russia was seized by the competent authorities, and in 2022 the company filed for bankruptcy.
Considering these elements, the ECtHR finds that the right to freedom of expression recognized under Article 10 of the Convention was violated in two respects.
First, the Court establishes that “in principle, any measure compelling a platform operator to restrict access to content under threat of penalty constitutes interference with freedom of expression.” Based on this general principle, and when assessing the legitimacy of such interference, the Court observes that the measures consisting of forcing Google to take down content and accounts deemed illegal were applied “indiscriminately to a broad range of content” on YouTube, including political expression, criticism of the Russian Government, reporting on Russia’s invasion of Ukraine by independent news outlets and content supporting LGBTQ rights.
At the same time, the Court finds it difficult “to ascertain how such expressions of political opinion or independent reporting could constitute a genuine threat to national security, territorial integrity or public safety.” The Court then reaches the strong conclusion that the sole basis for requiring the removals in question “appears to have been their capacity to inform public debate on matters which the authorities preferred to suppress”.
Taking into consideration the role of a platform such as YouTube, the Court establishes that the platform’s significance lies “in its role as a forum where users can share diverse viewpoints on matters of public interest, including those that may not find expression in traditional media.” Even though “facilitating and shaping public debate engenders duties of care and due diligence,” the Court holds that “penalizing Google LLC for hosting content critical of government policies or presenting alternative views on military actions, without demonstrating a pressing social need for its removal, strikes at the very heart of the Internet’s function as a means for the free exchange of ideas and information”.
All these arguments, together with the completely disproportionate severity of the penalties imposed, led the ECtHR to find that there had been a violation of Article 10 of the European Convention in this first part of the case.
Second, regarding the coercive penalties imposed by the Russian courts to compel Google to host content from Tsargrad TV, as well as to enable its monetization, the Court states that they directly impacted Google’s right to “determine what content it was prepared to host on its platform.” It also establishes that this right falls within the scope of Article 10, “which protects not only the content of information but also the means of its transmission.”
Interestingly, the Court acknowledges that Tsargrad’s YouTube account was suspended due to sanctions imposed on its owner. However, it also notes that, while purporting to defend the freedom to receive information in Tsargrad’s case, the Russian authorities were in fact “simultaneously demanding that the applicant companies remove content critical of government policies, including political expression regarding Russia’s invasion of Ukraine and reporting from independent news outlets.”
These inconsistencies raised doubts as to whether the measures pursued any genuine “pressing social need” relating to the protection of the right to freedom of expression. All these elements, together with the grossly disproportionate penalties and the misuse of “copycat” claims to harm the applicant companies’ interests, led the Court to again find a violation of the right to freedom of expression under Article 10.
When it comes to this second restriction, it is noteworthy how the Court refuses to sympathize with Russian authorities in their purported efforts to protect the right to freedom of expression vis-à-vis the impact of foreign sanctions against local media owners. On the contrary, it implicitly accuses them of hypocrisy for using the freedom of expression argument to force the full reinstatement of specific YouTube accounts while simultaneously engaging in broad and systematic repression of legitimate political content.
On a related note, the Court clearly indicates that deciding what content to leave up or take down is a right that a platform, such as YouTube, has in direct connection with the protection granted under Article 10 of the Convention. In other words, content moderation decisions are therefore manifestations of the freedom of expression rights that online platforms enjoy in this area.
Even though, in line with the previously mentioned jurisprudence on intermediary liability for third-party content, the Court reiterates that the role of online platforms engenders duties of care and due diligence, it is, for the first time, quite explicit in declaring that content moderation policies and their enforcement constitute expressive activities deserving of protection. Obviously, this also means that certain types of State interference with how platforms handle content, and with their decisions on the publications and accounts they allow or deactivate, may run afoul of the protections included in Article 10 of the Convention.
Without prejudice to the interest of this decision, what remains to be properly defined in subsequent decisions by the Court is how to strike a fair balance between the mentioned duties of care and platforms’ right to determine which content is made available, and how. The case analyzed here was quite clear-cut, given the gross arbitrariness of the national authorities’ conduct, combined with the overall authoritarian political situation in Russia.
But it will be interesting to see how the ECtHR might analyze possible future cases, especially as legislation like the aforementioned DSA might be used by the EU or national authorities to force platforms to take systemic actions that could have excessive or unjustified effects on the right to freedom of expression.
Joan Barata is a Senior Fellow at The Future of Free Speech and a Fellow of the Program on Platform Regulation at the Stanford Cyber Policy Center. He works on freedom of expression, media regulation, and intermediary liability issues.