Murder on Facebook raises big censorship questions: What should social-media companies do about violent content? – Salon

Posted: April 23, 2017 at 12:21 am

On Easter Sunday, horrific footage of a 74-year-old man being gunned down on a Cleveland sidewalk was posted on Facebook by his killer, reigniting an ongoing debate over how social-media content should be policed.

But effective strategies for blocking every piece of offensive and illegal content have been elusive and may never be 100 percent effective, according to some experts. Others, including Facebook itself, say more can and should be done to root out offensive content, including hate speech, horrific and illegal snuff videos and fake news items that mold the opinions of gullible users.

Facebook says it receives millions of complaints objecting to content every week from its nearly 2 billion active users. When the company receives a complaint, an algorithm automatically flags the content, which is then reviewed by moderators to quickly determine if it violates the law or the company's terms and conditions.

Footage of the murder of Robert Godwin Sr. by deranged killer Steve Stephens, 37, who committed suicide on Tuesday following a police chase in Pennsylvania, was publicly viewable on Stephens' Facebook profile for about two hours on Sunday. Facebook said it disabled Stephens' account 23 minutes after it received reports of the murder video, but it was publicly viewable long enough for users to capture the footage, prompting a plea on Twitter from one of Godwin's grandchildren for people to stop sharing the video.

Desmond Patton, an assistant professor of social work at Columbia University, said that while the Godwin murder video should clearly have been taken down, it is one extreme example of a larger issue. Companies like Facebook, Twitter and Google (which owns YouTube), he said, need to recruit specialists and elicit feedback from community leaders to improve how content is moderated, including material that might not seem offensive to every user.

"I study violence on social media, and all of the [problematic] content that I see almost never gets taken down," Patton told Salon. "If you're just using tech people from Silicon Valley [as content monitors], you're going to miss a lot of things. You need to diversify who makes these decisions."

Facebook declined to comment to Salon about the strategies it's considering to fortify its efforts to block objectionable and illegal content uploaded by its users, but having a more aggressive content filtering system could have unintended consequences. For example, would a stricter policy lead to the censorship of footage like the July 2016 shooting of Philando Castile by a Minnesota police officer? It could be argued that this video serves the public's interest because it viscerally highlights the ongoing problem of excessive force inflicted on African-Americans by members of law enforcement.

Sarah Esther Lageson, a sociologist at Rutgers University's School of Criminal Justice, said that Facebook is under intense pressure to take a stance and define its position on monitoring user-uploaded content, which could lead to more surveillance, something that not all Facebook users will welcome. But she said the benefits of having an open and easy way to produce and share online videos, which can highlight injustices and expose crimes, outweigh the negative effects of giving people so much freedom.

"Facebook will likely provide an array of creative solutions and will likely do their best to streamline oversight of user-uploaded content using [artificial intelligence] or machine learning, but I won't make an argument that those efforts would catch every instance of an extremely rare event like this," Lageson told Salon in an email.

Besides, she said in a follow-up phone conversation, horrific crimes take place in public no matter what we do to prevent them; it's the new medium by which criminals can advertise their crimes that concerns people.

"This is clearly an innovative way of doing something that has always been done: People have always killed people in public; mass shootings happen," she said. "That being said, the internet is a way to get into people's homes, which I think is what scares people: that you can't even feel protected from witnessing a crime on your cell phone or your laptop. It's one thing to see a crime happen on the street and another thing to see it when you're on your couch."

As Facebook and other social-networking service providers struggle to moderate the immense content stream coming at them from their users, the solution to the many problems that can arise is complicated. It requires, as Patton suggested, more feedback from experts and community members about how to establish policies for all types of harmful, violent and offensive content. And as Lageson pointed out, the fact that people can produce and share content so easily has helped fight crime and injustice.

The solution to the problem of preventing offensive, hateful, violent and murderous content from being distributed on social networks is as complicated as people are themselves, and there may never be a solution that satisfies everyone's concerns.
