Recent changes on Facebook support the spread of misinformation. What do we think of Meta’s new direction?
On January 7, Mark Zuckerberg, the CEO of Meta, announced changes to the systems for filtering and moderating content on Facebook, Instagram, and Threads.¹
These changes include eliminating the fact-checking program in favor of “community notes” – a system that effectively shifts the responsibility for combating misinformation onto the platforms’ users. Restrictions on discussions of politically and socially sensitive topics, such as migration or gender identity, will also be loosened. Additionally, Meta will no longer proactively scan for hate speech or other violations of its Community Standards, instead limiting moderation to user-reported cases. Existing automated systems will focus on severe violations, such as content related to terrorism, drugs, fraud, or child exploitation.
These changes took effect immediately in the United States, affecting three of the world’s largest social media platforms, which host a combined user base of over 3 billion. It remains unclear how this will affect the content available in countries covered by the EU Digital Services Act, which holds digital companies operating in the EU accountable for the content on their platforms, with the aim of creating a safer online environment and countering societal risks on the internet.²
In the video announcing the changes, Zuckerberg directly linked this “return to free speech” to the upcoming presidency of Donald J. Trump. (Notably, Trump’s Facebook account was suspended by Meta over statements he made during the attack on the U.S. Capitol in January 2021.³) Zuckerberg stated that the outcome of the recent U.S. elections appears to be “a cultural tipping point towards, once again, prioritizing speech.”
In explaining his decision, Zuckerberg also pointed to flaws in the previous systems, which were often overly sensitive – erroneously removing content that was permitted under the platforms’ Community Standards (e.g., works of art, educational materials, or historical content⁴). However, removing the fact-checking program is unlikely to improve the poorly tuned filtering algorithms or the overall quality of content on Meta’s platforms. Moreover, an open letter from fact-checking organizations worldwide (including Poland’s Demagog) describes the move as “a decision that threatens to undo nearly a decade of progress in promoting credible information online.”⁵ A more appropriate approach might involve clarifying the existing Community Standards or providing greater substantive support to moderators, whose decisions could in turn improve the moderation filters.
The adoption of the “community notes” system recalls the decline of Twitter/X, where this system was first introduced. Twitter transformed from a vibrant platform into a hotbed of political extremism, hate speech, misinformation, and bot-generated content.⁶ This is particularly harmful given that platforms like Twitter can serve as tools for significant social change – facilitating grassroots movements like Black Lives Matter or Poland’s Women’s Strike, and improving public education through direct connections between countless public institutions, NGOs, and their audiences. Sudden changes to (or collapses of) platforms leave a mark on the communities that rely on them. Although these companies are private, commercial enterprises, they carry obligations stemming from the trust users place in them, as well as a tangible impact on public sentiment and on our environment (even if we personally don’t use these platforms). This underscores the importance of fostering a responsible ecosystem that supports accurate information, and of ensuring long-term, proactive efforts to maintain the quality of content and discussion on social media.
1. Joel Kaplan, Meta, More Speech and Fewer Mistakes.
2. Council of the European Union, Digital Services Act.
3. Oversight Board, Former President Trump’s Suspension.
4. Agata Pyka, Notes from Poland, Facebook apologises for flagging Auschwitz Museum’s posts as violating community standards.
5. Demagog, Open letter to Mark Zuckerberg from fact-checkers worldwide.
6. Anna Mierzyńska, Oko.press, Kremlin bots have strengthened their attacks on X. It affects not only Poles.