Trump, nipples and the hypocrisy of the social media giants

It took a mob storming the US Capitol before former US President Donald Trump, whose online and offline lies had incited the attack, was finally banned by social media platforms like Twitter and Facebook. This came after four arduous years of the President repeatedly breaching their community guidelines.

Yet, just over the other side of the political fence, many women around the world, particularly women of colour, along with LGBTQ+ communities, continue to be shadow-banned (having their content quietly hidden or demoted) and have their posts deleted, especially those containing nudity.

Where does this hypocrisy originate? Who writes these policies? How do the social media giants monitor and enforce them? And what do they mean for society and those most vulnerable, both on and off these platforms?

Social media platforms have long been governed by community guidelines that dictate behaviour on the platform. Trump's Twitter ban and the removal of the microblogging and networking platform Parler from app stores and Amazon's web hosting services were due to content inciting violence, as laid out in the community guidelines of Twitter and Amazon.


Twitter's hateful conduct policy bans hateful or threatening content against people on the basis of race, ethnicity, national origin, caste, sexual orientation, gender, gender identity, religious affiliation, age, disability, or serious disease. These guidelines and overall governance structures are crucial for platforms that host billions of diverse users around the globe.

A long-standing and contentious example is the banning of female nipples from Facebook and its affiliate Instagram, which require users to post photos and videos that are appropriate for a diverse audience, a standard under which nudity is viewed as inappropriate.

This policy has been widely challenged and criticised because the label of inappropriateness seems to apply only to female nudity. In response, the platforms had to amend their policies in 2014 to allow photos of mastectomy scars and breastfeeding mothers.

But photographers, models and other Instagram users are still barred from posting pictures of female nipples, while sex workers face consistent and ongoing shadow banning.

In short, social media platforms censor users all the time.

But the enforcement of these guidelines and policies is woefully inconsistent. In 2018, for example, Facebook was blamed by the United Nations for facilitating the spread of hate speech that it said incited violence against the Rohingya Muslims of Myanmar.

And Trump isn't the only political figure who has been allowed to use social media platforms to incite violence; others include Brazil's President Jair Bolsonaro and the Philippines' President Rodrigo Duterte.


Part of the problem is the lack of diversity and dissenting voices within the tech industry, highlighted recently by the controversy over the apparent forced exit of Timnit Gebru, the former co-leader of Google's Ethical AI research team.

How can these platforms design inclusive, ethical and considerate guidelines when only certain people are brought to the table? Why do some groups face more aggressive content policing than others?

Our research in Digital Ethics from the Faculty of Engineering and IT and the Centre for AI and Digital Ethics (CAIDE) has shown there is a significant lack of diversity amongst authors in computer science and its various subfields.

The research, which has been made available for the ACM Conference on Fairness, Accountability, and Transparency (FAccT), to be held online from March 3 to 10, found that women were significantly under-represented, based on a statistical analysis of female-sounding and gender-neutral author names against published computer science articles in the Microsoft Academic database.

In a direct male-to-female comparison, the study found that publications featuring men outnumbered those featuring women by 5:1. This disproportion may be worse than the already well-known under-representation of women in these fields; previous research has suggested that women account for 26 per cent of IT professionals worldwide.
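
The paper's own pipeline isn't detailed here, but as a rough illustration of what a name-based comparison like this involves, the Python sketch below tallies publications featuring female-sounding and male-sounding author first names and compares the totals. Everything in it, the tiny name lookup, the record format and the function names, is a hypothetical stand-in rather than the study's actual method or data.

```python
# Illustrative sketch only (not the study's actual pipeline): count how many
# bibliographic records feature authors with female-sounding versus
# male-sounding first names, then compare the two totals.
from collections import Counter

# Hypothetical lookup: first name -> "f", "m", or None for gender-neutral/unknown.
# A real analysis would rely on large name-frequency datasets, not a hand list.
NAME_GENDER = {
    "alice": "f", "maria": "f", "fatima": "f",
    "john": "m", "david": "m",
    "sam": None, "wei": None,
}


def infer_gender(full_name):
    """Return a coarse gender label inferred from an author's first name."""
    first = full_name.strip().split()[0].lower()
    return NAME_GENDER.get(first)


def publication_counts(records):
    """Count publications featuring at least one author of each inferred gender."""
    counts = Counter()
    for record in records:
        genders = {infer_gender(name) for name in record["authors"]}
        if "f" in genders:
            counts["featuring_women"] += 1
        if "m" in genders:
            counts["featuring_men"] += 1
    return counts


if __name__ == "__main__":
    # Toy records standing in for a bibliographic database export.
    sample = [
        {"title": "Paper A", "authors": ["John Smith", "Alice Jones"]},
        {"title": "Paper B", "authors": ["David Lee", "John Brown"]},
        {"title": "Paper C", "authors": ["Maria Garcia"]},
        {"title": "Paper D", "authors": ["Sam Taylor", "David Kim"]},
    ]
    counts = publication_counts(sample)
    ratio = counts["featuring_men"] / max(counts["featuring_women"], 1)
    print(counts, f"male-to-female ratio of {ratio:.1f}:1")
```

Analyses of this kind lean on large name-frequency datasets and carry well-known limitations, since names are an imperfect proxy for gender, which is why gender-neutral names are treated separately.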

The research argues that the field of computer science (and, by extension, the tech industry) should be actively working to ensure greater representation of women and other minority groups in computer science publications if it is to uphold the diversity and inclusion standards set out by professional organisations.


The research draws on care ethics (CE), which questions traditional masculine moral and ethical approaches and assumptions that have historically gone largely unexamined.

For example, traditional thinking views moral decisions as being made by people who are independent, unattached, self-sufficient, unemotional and rationalistic, but this is far from reality. It downplays the biases, prejudices and self-interest that inevitably affect human relationships and interactions.

It means that for tech to be more inclusive, the industry must be more diverse, not just in terms of people but in how it is structured and who has input into the decisions that are made. Women, people of colour and all marginalised groups must be represented in the field of computer science, and therefore throughout companies, research and platform design.

If women and marginalised groups are included by design, governance and community guidelines are more likely to reflect an inclusive group and to be policed in a fair and consistent way. Diverse voices will be better placed to critically examine, react to and prevent the kind of inconsistency we are now seeing play out.

The bans and censoring of women's bodies on Twitter and Facebook are one such flow-on effect. Women had to fight to display breastfeeding pictures because the male-dominated industry viewed breasts as primarily pornographic, rather than mothering and nurturing.

The reality is that as long as we have social media platforms we will need community guidelines, and somehow, in a world that seems to grow more fractious every day, we must find ways to design guidelines that encourage unity and harmony, not hate.

And these guidelines must be consistent for all actors: not one rule for the powerful and another for everyone else.

Gabby Bush is CAIDE project officer and Mariam Nadeem is a CAIDE intern.

Banner: Supporters of then-President Donald Trump storming the US Capitol building, January 6, 2021. Getty Images
