New paper calls on Instagram to do more to protect women and vulnerable users online

A new paper from City, University of London argues that Instagram's algorithm is censoring women and vulnerable users while doing little to stop abusers.

The paper, "How Instagram's algorithm is censoring women and vulnerable users but helping online abusers", which was published in Feminist Media Studies, argues that Instagram's algorithm censors female accounts for showing skin and nudity out of fear that this promotes or facilitates prostitution.

Accounts regularly affected range from sex workers to carnival dancers and athletes, who have had their posts deleted or hidden by Instagram.

The censoring includes shadow banning, where accounts and posts are hidden from the platform's explore and search functions. In most cases, users are not notified that they have been shadow banned and only notice when engagement with their content decreases.

Written by Carolina Are, PhD Candidate and Visiting Lecturer in the Department of Sociology, the paper argues that Instagram then fails to protect some of those same users from online harassment such as cyber flashing (the unsolicited sending of explicit photos to other users via Bluetooth or direct message) and trolling, showing a bias against certain accounts.


From her research, Carolina calls on Instagram to do more to check accounts that claim to have been censored or harassed, to call out unfair moderation practices, whether censorship or harassment, and to provide better moderation through its help and report-a-problem features.

"Harassment has emotional, psychological and economic costs for victims, making women stop contributing to online spaces and cutting them off from work and/or public life," says Carolina.

"The same platforms that were going to give them a voice are also giving users new opportunities to harass, insult and silence them," she continues.

Carolina's research goes further, identifying that Instagram's algorithm discriminates against women: female users have seen increased engagement rates after changing their profile gender to male.

As a pole dancing instructor, Carolina has experienced Instagram's shadow banning first hand; the practice prevents people in such industries from reaching bigger audiences or finding work.


Another affected group is sex workers, for whom social media platforms offer a safe space to advertise their work and a source of income, and who are now seeing that space taken away from them.

Carolina said: "Social media platforms have become a form of civic space. Because of this, platforms need to be held accountable for their biases, and they need to be more transparent about the rules that govern them.

"I have witnessed hateful comments, and the lack of moderation surrounding them, drive women off platforms and force them to deal with the traumatic consequences without support.

"It is not sustainable for large parts of Instagram's user population to continue to be silenced and targeted with abuse on social media.

"If social media architecture is kept as it is, offline inequalities may become even greater online, and the value that social media platforms could provide to our society will be lost."

Read the full paper: "How Instagram's algorithm is censoring women and vulnerable users but helping online abusers", published in Feminist Media Studies.

For more information on Sociology at City, see the department's webpage.

