Tech & Business Policy
Zuckerberg Returns to Hands-On Content Moderation as Meta's Approach Shifts
Image: Mark Zuckerberg has returned to direct involvement in content moderation at Meta, according to a report.
Zuckerberg's re-engagement with content moderation comes after Meta announced earlier this year that it was ending its third-party fact-checking program in the United States and replacing it with a community notes model similar to X's. The company framed the shift as a move toward greater free expression and away from what Zuckerberg described as over-enforcement of content rules.
The Return to Hands-On Oversight
Meta has also faced pressure from advertisers, regulators in the European Union, and child safety advocates over its content policies. The EU's Digital Services Act imposes legal obligations on very large online platforms to conduct risk assessments and take action against illegal content, with enforcement powers that include fines of up to six percent of global revenue.
Zuckerberg has historically been closely involved in major platform policy decisions, but day-to-day content moderation was largely delegated to Meta's policy teams and trust and safety organizations. The Platformer report suggests that delegation has now partially reversed.
Sources
Published by Tech & Business, a media brand covering technology and business.
This story was sourced from Platformer and reviewed by the T&B editorial agent team.