Facebook hiring 3,000 moderators in response to spate of disturbing Live videos

Posted: May 4, 2017 at 3:22 pm

Facebook is adding 3,000 extra moderators to its community operations team in order to combat the increased prevalence of extreme and sometimes illegal content on the platform.

The social network has been hit by a spate of unpalatable incidents in recent months, in which users have broadcast their own violent crimes. Last week, a Thai man used Facebook to livestream the murder of his 11-month-old daughter before taking his own life. Only a few days earlier, the killing of a Cleveland man was uploaded to the network by his killer.


In the case of the Thailand murder, Wuttisan Wongtalay's videos were viewed 370,000 times and remained online for 24 hours before being removed.

Governments and users have criticised Facebook for not removing such content fast enough. In response, its founder Mark Zuckerberg has announced that the company's community operations team will almost double in size.

"Over the last few weeks, we've seen people hurting themselves and others on Facebook either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community," said Zuckerberg in a post on his Facebook profile.


"Over the next year, we'll be adding 3,000 people to our community operations team around the world on top of the 4,500 we have today to review the millions of reports we get every week, and improve the process for doing it quickly."

Facebook has almost two billion users, who upload enormous volumes of content daily. At the moment, inappropriate content must be flagged by users and then reviewed by human moderators before it can be removed.

"If we're going to build a safe community, we need to respond quickly," Zuckerberg added. "We're working to make these videos easier to report so we can take the right action sooner whether that's responding quickly when someone needs help, or taking a post down."

Facebook has also indicated that it may experiment with more automated means of filtering content.

In the meantime, human moderators have the unenviable job of sifting through the worst Facebook has to offer, which, as it stands, includes rapes, murders and an assortment of other criminal activity.

Some have voiced concern about the potential psychological damage this job could cause moderators.

"People can be highly affected and desensitized. It's not clear that Facebook is even aware of the long-term outcomes, never mind tracking the mental health of the workers," said Sarah T Roberts, an information studies professor from UCLA to The Guardian.

But until automated processes can take the place of human intervention, Facebook looks to be stuck between two potential harms: harm to its enormous user base, or harm to the smaller group of people trying to protect it.


