Social media companies say they want to be transparent. So why aren't they? – Wired.co.uk

Who should have the right to a platform, and who should make decisions about what expression is allowed online? These conversations are nearly as old as the internet itself.

One of the earliest instances of civil society collaboration with social media occurred in 2007, when Egyptian journalist Wael Abbas lost his YouTube account for posting videos depicting police brutality in his country. By removing the content, YouTube, a subsidiary of Google, was simply enforcing its own rules, but the story nevertheless made international headlines.

Back then, YouTube's rules were pithy and light. "Respect the YouTube community," they began. "We're not asking for the kind of respect reserved for nuns, the elderly, and brain surgeons. We mean don't abuse the site." Regarding graphic violence, however, they were fairly clear: "Graphic or gratuitous violence is not allowed. If your video shows someone getting hurt, attacked or humiliated, don't post it."

Abbas reached out to friends at influential groups around the world, as well as to the US Embassy, according to a cable leaked by WikiLeaks. The cable details the videos' contents and ends with a request that the State Department contact Google to try to resolve the matter. Whether the US government or civil society activists reached Google first is unknown, but Abbas's account was quickly restored.

Content moderation was a nascent field at the time, but over the years it has grown into a multi-billion-dollar industry. With that growth have come changes, some positive, others not so much. In the years following Abbas's experience, most major tech companies began issuing transparency reports: regular reports that show how they respond to government demands for user data and content removal. This initiative came after a push from civil society, one that continues today. But there's still much more they could be doing.

The Santa Clara Principles were created by a coalition of civil society and academic partners in 2018 to set baseline standards for transparency and accountability. The principles, supported by more than 100 institutions around the world, are simple: they demand that companies publish numbers about posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines; provide detailed notice to users when their content is removed or their account suspended; and, perhaps most importantly, ensure that every user has the right to remedy, the chance to appeal to a human being when a potentially erroneous decision is made.

A year after their release, the principles were endorsed by Facebook, Twitter, YouTube, Snap, Apple, GitHub, Instagram and a handful of other companies, but only one, Reddit, has actually implemented them in full. (Disclosure: Reddit is an independent subsidiary of Condé Nast's parent company, Advance Publications.) In fact, amid the Covid-19 pandemic, much of the progress made in recent years in providing appeals to users has been rolled back, as commercial content moderators working in countries like the Philippines have been sent home, unable to do their jobs because of (valid) concerns about privacy and mental well-being.

Automation has been touted as the next big thing in content moderation. But although it could alleviate some of the problems facing human moderators, most notably the horrific content these workers have to look at on a daily basis, it is not particularly good at detecting nuance, leading to all sorts of mistakes that threaten the ability of users around the world to express themselves.

The rules that social media companies put in place to maintain user safety are sometimes necessary and sometimes absurd (think of the banning of women's bodies or Facebook's requirement of authentic names). But it is imperative that these rules be enforced fairly and evenly, and that when mistakes are inevitably made, users are able to seek justice.

The Santa Clara Principles are just a starting point: if Silicon Valley's tech giants have any interest in contributing to a just world, there is more to be done. We must diversify the boardrooms and root out existing biases. Rules, many of which were put in place more than a decade ago, should be audited and updated for the 21st century. And key voices, not just from the US but from around the world, and particularly the global south, need to be brought into the policymaking process. A more equitable internet is possible.
