Facebook will add 3,000 moderators after video killings

Mark Zuckerberg Facebook Stock

After multiple high-profile incidents of violence recently unfolded on Facebook, Mark Zuckerberg announced today that the company will add 3,000 people over the next year to review videos and other flagged reports.

“If we’re going to build a safe community, we need to respond quickly,” Zuckerberg writes. “We’re working to make these videos easier to report so we can take the right action sooner -- whether that’s responding quickly when someone needs help or taking a post down.”

Last month, Facebook faced criticism after a video from Cleveland showing a fatal shooting stayed on the site for hours. Facebook apologized for its handling of the situation and pledged “to do better,” but in another incident in Thailand later that month, disturbing videos showing the murder of a child stayed up for a full day.

Zuckerberg writes that the company will add the 3,000 people to the 4,500 already on its “community operations team.” The statement does not specify the workers’ exact relationship to Facebook, but the company, like others in the tech industry, has been known to outsource moderation to workers around the world. The additional staff will work “to review the millions of reports we get every week, and improve the process for doing it quickly,” Zuckerberg writes. He also says the company will change its flagging process to make it easier for users to report problems and for moderators to contact law enforcement.