Facebook plans to hire 3,000 more people to review videos and posts after being criticized for responding too slowly to a murder that was live-streamed and posted on its platform, according to the New Zealand Herald.
These hires will add to the 4,500 reviewers already employed to identify criminal and other illegal content for removal.
CEO Mark Zuckerberg wrote Wednesday that the company is "working to make these videos easier to report so we can take the right action sooner - whether that's responding quickly when someone needs help or taking a post down."
Facebook's rules prohibit content that promotes or glorifies violence, but the company has faced heavy criticism recently for being slow to remove violent videos of a murder in Cleveland and the killing of a baby in Thailand.
In most cases, content is reviewed and potentially removed only after users complain.
Policing live video streams is difficult, as their raw, unedited, in-the-moment nature is a large part of their appeal.
Zuckerberg said Facebook workers review "millions of reports" every week.
In addition to removing videos of crime or getting help for anyone injured in these videos, he said the reviewers will "also help us get better at removing things we don't allow on Facebook like hate speech and child exploitation."
Wednesday's announcement is a clear sign that Facebook still needs humans to review content, even as it shifts some of that work to software, in part because of the platform's sheer size and the volume of posted content.