To censor or sanction extreme content? Either way, Facebook can’t win – The Guardian

Posted: May 23, 2017 at 10:22 pm

The documents detail what is and is not permitted on the platform, covering graphic violence, bullying, hate speech, sexual content, terrorism and self-harm. Photograph: Shailesh Andrade/Reuters

Facebook allows people to live-stream their suicide attempts as long as they are engaging with viewers, but will remove the footage once there's no longer an opportunity to help the person. Pledges to kill oneself made through hashtags or emoticons, or those that specify a date more than five days in the future, shouldn't be treated as a high priority.

These are tiny snippets from a cache of training materials that Facebook content moderators need to absorb, in just two weeks, before policing the world's largest social network.

The guidelines also require moderators to learn the names and faces of more than 600 terrorist leaders, decide when a beheading video is newsworthy or celebratory, and allow Holocaust denial in all but four of the 16 countries where it's illegal: those where Facebook risks being sued or blocked for flouting local law.

The documents detail what is and is not permitted on the platform, covering graphic violence, bullying, hate speech, sexual content, terrorism and self-harm. For the first time, the public has a glimpse of the thought process behind some of the company's editorial judgements, one that goes beyond the vague wording of its community standards or statements made in the wake of a live-streamed murder.

"This might be the most important editorial guide sheet the world has ever created. It's surprising it's not even longer," said Carl Miller, research director at the Centre for the Analysis of Social Media at the London-based thinktank Demos. "It's come out of a mangle of thousands of different conversations, pressures and calls for change that Facebook gets from governments around the world."

It is clear that Facebook has an unprecedented challenge on its hands. The platform has inadvertently become the world's largest media organization, with nearly 2bn readers and contributors encompassing the full spectrum of humanity's capacity to entertain, sadden, bore, horrify and disgust.

In order to provide simple instructions to moderators, the documents highlight specific, visceral examples. And it's not pretty.

Footage of animal abuse is allowed but must be marked as disturbing if there is, among other things, dismemberment or visible innards. Images of physical child abuse are acceptable unless shared with sadism and celebration. Comments such as "Irish are stupid" are removed, while moderators are told to ignore "Blonde women are stupid". A picture of a child who appears to have Down syndrome captioned "I can count to potato" does not have to be deleted.

The files explain that people use violent language to express frustration online without stopping to think about the consequences, because they feel indifferent towards their target thanks to the lack of empathy created by communicating via devices rather than face to face. It is a neat description of the so-called online disinhibition effect.

This appears to contradict much of the 5,700-word manifesto that Facebook CEO Mark Zuckerberg published in February, which placed heavy emphasis on the social network fostering human connections. It is jarring to see so many examples of human cruelty and depravity laid bare, but they raise important questions over whether Facebook's users are comfortable with the lines the company has drawn.

Either way, Facebook cannot win.

On one hand, it is expected to clamp down on terrorist recruitment, glorified violence and live-streamed crime, while on the other it is accused of overzealous censorship and collaboration with oppressive regimes. "This is a terrible bind," Miller said. "They found themselves with all these responsibilities and power they never anticipated getting and would rather do without.

"Private companies are doing what we've only really expected constituted officials of sovereign power to do."

Part of the challenge for Facebook is ensuring its rules are applied correctly across the world, taking into account linguistic and cultural nuance. "They want to create a systematic check-box exercise in order to universalize the operation, but that's not the way human language works," said Miller, who studies how terror networks use social media.

One solution being explored by Facebook is a global voting system to allow users to set their own levels of comfort with content, an idea floated by Zuckerberg in his manifesto.

"Where is your line on nudity? On violence? On graphic content? On profanity? What you decide will be your personal settings," he wrote.

"Although we will still block content based on standards and local laws, our hope is that this system of personal controls and democratic referenda should minimize restrictions on what we can share."
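Zuckerberg did not spell out how those personal controls would work. Purely as an illustration, here is a minimal sketch, assuming hypothetical category names, severity scores and thresholds that are not Facebook's actual settings, of how per-user content preferences of the kind he describes could filter what each person sees:

    # Illustrative sketch only: hypothetical per-user content preferences,
    # not Facebook's actual system. Categories, scores and thresholds are
    # invented for the example.
    from dataclasses import dataclass

    @dataclass
    class ContentItem:
        text: str
        scores: dict  # severity per category, 0.0 (none) to 1.0 (extreme)

    @dataclass
    class UserSettings:
        thresholds: dict  # "where is your line?" expressed per category

    def visible_to_user(item: ContentItem, settings: UserSettings) -> bool:
        """Hide an item if any category score exceeds the user's tolerance."""
        for category, score in item.scores.items():
            if score > settings.thresholds.get(category, 1.0):
                return False
        return True

    # A user with a low tolerance for violence would not see this item.
    item = ContentItem("graphic clip", {"violence": 0.8, "profanity": 0.3})
    user = UserSettings({"violence": 0.5, "profanity": 0.9})
    print(visible_to_user(item, user))  # False

Even in this toy form, the hard part is not the filtering but producing the severity judgements in the first place, which is where the human moderators and their guidelines come in.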

Moderators also need to take into account local laws, but not in every case.

"We respect, but do not welcome, local law that stands as an obstacle to an open and connected world," Facebook said in the training documents. "Given this, we will not censor content unless a nation has demonstrated the political will to enforce its censorship laws."

Facebook has restricted content in Pakistan, Russia and Turkey in the past and has reportedly developed software to accommodate China's censorship demands.

The cherished principles of free speech on which the internet was founded go out the window when they don't align with business interests.

"So many of these policies are at odds with each other," said Sarah T Roberts, a UCLA professor who studies large-scale moderation of online platforms. "The company's commitment to these things appears to wax and wane depending on public sentiment."

It's no wonder the company errs so regularly, whether that's censoring the "Napalm Girl" photograph or live-streaming the murder of a grandfather in Cleveland. In response to the mounting slip-ups, Zuckerberg pledged to add 3,000 more content reviewers to the community operations team.

Facebook's moderators, known as community operations analysts, are typically low-paid contractors. The Guardian found job listings offering an annual salary of between $23,000 (in Dublin) and $40,000 (at Facebook's headquarters in California), although many others will earn less in places such as the Philippines. The 4,500-strong community operations team reviews more than 100m pieces of content every month, which leaves around 10 seconds to make a judgement call about each one.

Facebook told the Guardian that it recognizes the work can often be difficult and offers every person reviewing content psychological support and wellness resources.

In order to make moderation more efficient, Facebook is developing artificial intelligence to identify offending content. It also uses algorithms to spot suicidal users and is exploring ways to use AI to distinguish between news stories about terrorism and actual terrorist propaganda.
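To see why that last distinction is hard to automate, consider a deliberately naive sketch, assuming an invented keyword list rather than anything Facebook actually uses: a simple word-matching rule flags a news report and a propaganda post identically, because both draw on the same vocabulary.

    # Illustrative sketch only, not Facebook's system: a naive keyword filter
    # cannot tell reporting about terrorism apart from terrorist propaganda,
    # because both share the same words. The term list is invented.
    SUSPECT_TERMS = {"attack", "martyr", "bomb"}

    def naive_flag(text: str) -> bool:
        """Flag any post containing a suspect term, regardless of context."""
        words = {w.strip(".,!?").lower() for w in text.split()}
        return bool(words & SUSPECT_TERMS)

    news_report = "Police say the bomb attack on the market killed twelve people."
    propaganda = "Join the attack and become a martyr for the cause."

    print(naive_flag(news_report))  # True: a legitimate news report is flagged
    print(naive_flag(propaganda))   # True: indistinguishable from the report above

Telling the two apart requires the kind of contextual judgement about intent, framing and audience that the moderators' guidelines try to codify.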

As much as Facebook might want to replace human moderators with automated systems, doing so will not be easy.

"It's impossible for algorithms alone to manage human experience," said Peter Friedman, CEO of LiveWorld, which provides moderation services to big brands.

"This is highly complex work," Roberts said, "and requires a mastery of many topics, current events, other cultures and languages, so it's interesting that it's so devalued."

Miller agrees: "However clever Facebook is, so much of this is impressionistic and contextual and difficult to interpret."

He said he frequently struggles to judge suspected terrorist content because he doesn't understand the tropes, language or internal slang. "The idea that anyone can learn that, in addition to all the other bodies of content they need to make judgements about, in two weeks is very surprising to me.

"It must be one of the worst jobs on the internet," he said.

In the UK, the Samaritans can be contacted on 116 123. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is on 13 11 14.
