The Free Flow — January 23, 2025
Introducing: A Global Free Speech Roundup for the Digital Age
The Digital Age

» U.S. Supreme Court Upholds TikTok Ban, Citing National Security Concerns
On January 17, 2025, the Supreme Court upheld a law that requires ByteDance to sell TikTok or shut it down in the United States.
Background:
The Protecting Americans from Foreign Adversary Controlled Applications Act, passed in April 2024, obliges TikTok’s parent company, ByteDance, to divest from TikTok or face a U.S. ban by January 19, 2025, due to national security concerns over its Chinese ownership.
TikTok unsuccessfully challenged the ban on First Amendment grounds.
As we argued in our amicus brief with the Foundation for Individual Rights and Expression (FIRE), “[a]llowing the TikTok ban to stand would mark an unprecedented departure from our longstanding commitment to free speech exceptionalism, which sets the United States apart not only from authoritarian states but also other democracies that mandate significant government regulation of online speech, such as the European Union and Germany.”
The Latest: Soon after taking office, President Trump signed an executive order to try to delay the ban by 75 days.
Dive Deeper: You can read Jeff Kosseff’s full analysis of the TikTok decision here.
» Meta Ends Current Fact-Checking Program, Will Move to Community Notes
On January 7, Meta, the parent company of Facebook and Instagram, announced that it would be making some significant changes to its content moderation policies, including:
Moving away from centralized fact-checking to a “community notes” model where users can add context and corrections to posts, similar to a feature on X (formerly Twitter).
Ending restrictions on controversial topics like immigration and gender identity and loosening the platform’s hate speech policies.
Giving users the ability to see more political content.
Shifting the moderation focus to illegal and high-severity violations like terrorism, child sexual exploitation, drugs, fraud, and scams.
Background:
Meta CEO Mark Zuckerberg stated that the FBI and the Biden Administration pressured the company’s former fact-checking program to censor, among other things, references to Hunter Biden’s laptop (the story could be shared, but the algorithm would not promote it) and lab-leak coronavirus theories.
The company has faced criticism for bending to the incoming Trump administration’s demands. President Trump boasted about how his past threats against Zuckerberg incentivized the company to change its policies.
For more analysis of these changes, read Jacob Mchangama at Persuasion and here at The Bedrock Principle.
The EU’s Reaction:
The European Commission’s chief spokesperson, Thomas Regnier, declined to comment on Meta’s policy changes, and it remains uncertain how the company’s new content moderation strategy will fare on the Continent.
Meta currently has “no immediate plans” to disband fact-checkers internationally. Were it to do so, under the EU’s Digital Services Act (DSA), it would have to submit a risk-assessment report to the European Commission.
The DSA empowers the European Commission to fine companies that fail to adequately mitigate systemic risks (which include negative impacts on civic discourse and electoral processes) up to 6 percent of global revenue.
The vague nature of these obligations has caused concern among experts and companies.
The Brussels Effect: Europe and Beyond
» Revised Code of Conduct on Illegal Hate Speech Integrated into the DSA
Background:
On January 20, the European Commission integrated its revised Code of Conduct on Countering Illegal Hate Speech Online into the Digital Services Act (DSA) framework.
This voluntary code builds on a similar Code of Conduct adopted in 2016 and is signed by companies such as Google, Meta, Microsoft, Snap, TikTok, Twitch, and X.
What It Means:
Platforms designated under the DSA can adhere to this code to demonstrate compliance with the DSA's requirements, including the mitigation of illegal content risks.
The code also introduces commitments like transparency reporting, fast review times for flagged content, and structured cooperation with civil society organizations.
Signatories Commit To:
Allow a network of Monitoring Reporters—not-for-profit or public entities with expertise in illegal hate speech—to regularly assess how platforms handle hate speech notices.
Aim to review at least two-thirds of hate speech notices from Monitoring Reporters within 24 hours.
Engage in specific transparency measures, including the use of automated tools to detect and reduce hate speech prevalence.
Cooperate with civil society organizations and experts to monitor trends and prevent waves of hate speech from going viral.
Raise user awareness about illegal hate speech and provide procedures to flag illegal content.
Provide detailed reporting on outcomes of implemented measures, including data on recommender systems and the reach of illegal content prior to removal.
Present country-specific data on hate speech, classified by characteristics such as race, religion, gender identity, or sexual orientation, and address findings from multi-stakeholder cooperation.
Next Steps:
Platforms’ adherence to the code will be monitored through annual independent audits, with the results informing updates to the code and its enforcement mechanisms under the DSA.
» Google Tells EU Regulators It Will Not Add Fact-Checks
Google has said it will not add fact-checks to its search results or to YouTube, nor use them to rank or remove content.
Background:
The DSA requires large tech companies to mitigate risks for civic discourse and electoral processes, two nebulous concepts.
A DSA Code of Conduct on Disinformation, building on a similar Code adopted in 2022, is expected to be adopted soon. This Code would provide companies with guidance on how to implement the DSA.
According to Axios, the DSA code would require Google to incorporate fact-check results and to build fact-checking into its ranking systems and algorithms. Axios implies that Google would otherwise fail to meet DSA requirements.
Misleading Headlines?
The Future of Free Speech’s Senior Research Fellow Jordi Calvet-Bademunt explains how this reporting is misleading.
It is true that companies, including Google, have incentives to comply with these codes. Refusing to participate in certain codes could have uncertain consequences in DSA investigations.
Still, companies can demonstrate compliance with the law in other ways, adopting alternative measures when a commitment does not comport with their product or content-moderation approach. Different approaches to content moderation should remain possible, depending on companies’ preferences and products.
» MEPs Urge EU Commission to Investigate Elon Musk Livestream
X-owner Elon Musk held an X Livestream with the leader of Germany’s far-right AfD party.
Background:
Germany is scheduled to hold elections in February 2025.
Some MEPs are calling Musk’s livestream “election interference,” urging the EU Commission to investigate X’s actions under the EU’s Digital Services Act (DSA).
The incident shows how easy it is for politicians to appeal to the DSA to investigate companies for hosting controversial speech.
In an interview with the Spanish newspaper eldiario.es, Jordi warns about the risks of a misguided application of the DSA.
It’s understandable that many are worried about the political climate in Europe and Musk’s reckless behavior. But interpreting the DSA as granting the EU such broad powers to determine what constitutes acceptable political speech could set a dangerous precedent.
» The UK’s Controversial Online Safety Act Comes Into Force
The UK’s Online Safety Act, which passed under the previous Conservative government in October 2023, officially came into force on December 16, 2024.
Background:
The law sets new standards for child safety and harm-prevention online.
But it also contains a provision criminalizing the spread of “disinformation” in certain circumstances.
As a result, the Act’s enforcement—even its very existence, in its current form—could cause chilling effects on free speech.
» Spain's Proposal to Repeal Blasphemy Law
Spain’s Socialist Party government moves to scrap the country’s blasphemy law.
The party’s parliamentary spokesperson, Patxi Lopez, noted the law “is constantly used by extremist and fundamentalist organizations to persecute artists, activists (and) elected representatives, subjecting them to costly criminal proceedings.”
It’s important to note that this promise has been made repeatedly without action and could be delayed yet again.
Quick Hits
Trump Issues Executive Order on Disinformation: President Trump has signed an executive order directing his administration to end federal dollars and staff resources devoted to policing dis- and misinformation.
TikTok Users Move to RedNote, Encounter Censorship: As TikTok’s fate lingers in uncertainty, users have moved to another Chinese social media app: RedNote. Here, they are, unsurprisingly, met with strict censorship.
U.S. House Introduces Anti-SLAPP Legislation: On December 5, 2024, lawmakers introduced a bipartisan bill, the Free Speech Protection Act, which aims to shield individuals and organizations from baseless lawsuits designed to silence free speech and political engagement.
FIRE Defends Iowa Pollster: After President Trump sued Iowa pollster J. Ann Selzer under a “consumer fraud” claim for her inaccurate polling of the 2024 election, FIRE signed on to defend her against this frivolous lawsuit. Clearly, pollsters have a First Amendment right to report their poll results, regardless of whether the candidates like the findings.
GEC Shuts Down: The U.S. Congress’ bipartisan federal spending bill did not include funding for the Global Engagement Center (GEC), a State Department body initially created to counter foreign propaganda. Critics maintain the GEC often overreached its mandate.
Press Freedom Watch:
Palestinian Journalists Killed in Gaza: Article 19 calls on Israel to stop targeting journalists after 11 Palestinian journalists were killed in Gaza.
Iran Releases Italian Journalist: Italian journalist Cecilia Sala returned home after Italy agreed to release a detained Iranian engineer accused of helping Iran build drones used by Russia in Ukraine.
Russia further targets journalists: A new law in Russia confiscates assets of journalists designated as “foreign agents.” Most of these are journalists who fled the country in the wake of its invasion of Ukraine.
Local reporters in India face violence: The murder of journalist Mukesh Chandrakar sparked renewed concerns about press freedom in India. Currently, the country’s press freedom score, compiled by Reporters Without Borders (RSF), ranks even lower than that of its neighbor, Pakistan.
Protest Watch:
Violence in Mozambique erupted over the governing party’s disputed election victory.
South Korea’s President was arrested over his declaration of martial law amid mass protests.
Want to learn more about The Future of Free Speech? Check out our website here.