Meta Reverses Content Moderation Policies, Shifting to a Community-Based Approach

Meta is making significant changes to its content moderation policies, emphasizing a return to free expression and a reduction in over-enforcement. The company is ending its third-party fact-checking program in the U.S. and replacing it with a Community Notes initiative, an approach it says has worked on other platforms, notably X. Under this system, users write and rate notes that add context to potentially misleading content. Meta believes this community-driven model will better serve its goal of providing users with relevant information without bias, empowering people across a range of perspectives to decide what additional context should be shared, in keeping with the company's original aim of fostering informed decision-making.

Beyond fact-checking, Meta is also loosening its content moderation rules. It plans to lift restrictions on discussion of sensitive, politically contested topics such as immigration and gender identity. The company will scale back automated content enforcement, reserving it for severe violations such as terrorism and child exploitation while relying on user reports for less serious infractions. Meta also intends to be more transparent about enforcement mistakes and to offer users more personalized political content recommendations based on their preferences.
