The 5 Most Explosive Claims Made In The Four Corners x Hack Investigation Into TikTok – Pedestrian TV

CONTENT WARNING: This article discusses disordered eating.

A joint investigation by the ABC's Four Corners and Triple J's Hack, which aired on Monday night, has made explosive claims about TikTok's algorithm and data practices, saying the app can expose users to dangerous content with real-life impacts through its recommendation algorithm, data harvesting, and censorship.

Here are the five biggest claims to come from the report.

According to several researchers cited in the Four Corners report, it takes less than 30 seconds to find harmful content on TikTok, and only a few hours for the algorithm to dominate someone's feed with offensive videos.

The report referenced experiments by tech advocacy organisation Reset Australia, which found it takes about four hours for the algorithm to learn that a 13-year-old is interested in racist content, and about seven hours for sexist videos to swamp someone's feed.

Lauren Hemmings, a university student, told Four Corners she joined the app to watch funny videos, but said that after she followed a fitness influencer, the algorithm appeared to push her toward viral calorie-counting trends.

After four months on TikTok, Lauren was diagnosed with an eating disorder.

According to Swinburne University's Dr Suku Sukunesan, who advises TikTok on how to make the app safer, TikTok videos can "basically teach people how to have an eating disorder" because the algorithm sends vulnerable young people toward similar content.

"I was immediately given all this eating disorder content. After a couple of hours, TikTok suggested 30 different accounts to follow and they were all people living with eating disorder issues," he said on the episode, having embedded himself in TikTok's eating disorder community.

"It's almost like a pit with no end and you find that these kids would ultimately harm themselves more."

Claire Benstead, a 22-year-old who has been in and out of hospital over the last five years with an eating disorder, was in recovery when she joined TikTok. The algorithm quickly suggested eating disorder videos to her, which she claims eventually led to her relapse.

Benstead tried cleaning up her feed by reporting videos that promoted eating disorders, but says she was told the videos she reported did not breach TikTok's guidelines.

The app also claims to ban content "depicting, promoting, normalising, or glorifying activities that could lead to suicide, self-harm, or eating disorders", with a TikTok spokesperson telling the ABC:

"Our teams consult with NGOs and other partners to continuously update the list of keywords on which we intervene."

Another TikTok user told Hack and Four Corners that she reported a viral video of a man taking his own life, and claims that she was also told it did not breach any community guidelines.

Claims that TikTok has a racial bias are not new. Last year, as thousands of creators complained about being silenced, TikTok apologised for hiding posts with the hashtags "Black Lives Matter" and "George Floyd", citing a glitch.

Earlier this month, TikTok user Ziggi Tyler went viral for showing how the platform flagged phrases such as "Black", "Black success", and "Black Lives Matter" in his bio as inappropriate content, but not terms such as "neo-Nazi" and "white supremacist".

TikTok shared a statement with Forbes which read: "Our TikTok Creator Marketplace protections, which flag phrases typically associated with hate speech, were erroneously set to flag phrases without respect to word order."

"We recognise and apologise for how frustrating this was to experience, and our team has fixed this significant error. To be clear, Black Lives Matter does not violate our policies and currently has over 27 billion views on our platform."

The Four Corners x Hack report featured interviews with two creators of colour, Unice Wani (@unicewani) and Paniora Nukunuku (@pnuks), who discussed being shadow banned on TikTok for creating videos discussing race. (Shadow banning is a term for when videos or posts are hidden from a platform's feed without explicitly being banned or taken down.)

Nukunuku told the ABC that his videos on life with a disability are sometimes pinged for violating community guidelines, despite not breaking any rules.

Wani claims that a video he posted about Black Lives Matter saw his account banned for a week, and a video he put up in support of Palestinian protests was removed just hours after he posted it.

"You tend to get a lot of shadow bans for speaking up about stuff such as racism. I guess they focus more on the white girls dancing and stuff like that," Wani said.

The Four Corners report claimed that TikTok doesn't just mine facial data from the videos uploaded to the app, but also from videos users might record on the app and never upload, and from any videos and photos in their camera rolls.

The report alleged that the app analyses faces for personality and demographic traits, using that information to create a profile of the user and create a more accurate algorithm.

Anne Longfield, the former Children's Commissioner for England, is leading a class-action lawsuit alleging that every child who has used TikTok since May 25, 2018, may have had private personal information illegally collected by ByteDance (TikTok's parent company) through the platform, for the benefit of unknown third parties.

"Parents and children have a right to know that private information, including phone numbers, physical location, and videos of their children are being illegally collected," she said.

The lawsuit is demanding TikTok delete any personal information it has stored regarding children.

TikTok has strongly denied the allegations, with a representative saying the company's top priorities are privacy and safety, and that the platform has plenty of policies, processes and technologies in place to protect all its users, including the younger end of the demographic.

"We believe the claims lack merit and intend to vigorously defend the action," the representative for TikTok told the ABC.

The Four Corners x Hack report referenced an academic investigation by the Australian Strategic Policy Institute (ASPI), which found that TikTok appears to use its algorithm to hide political speech it considers controversial.

The study, funded by the US State Department, found that hashtags relating to the mass detention of the Uyghurs, China's Muslim minority, the pro-democracy Hong Kong protests, LGBTQI issues, and anti-Russian-government videos were just some of the content that appeared to be hidden by TikTok.

"We see evidence of how content moderation that takes place in China, how that type of thinking is still applied to TikTok outside of China," ASPI's Fergus Ryan said.

"As it has expanded around the world, and particularly after it's received a lot of scrutiny, the company has tried to, as much as possible, disconnect TikTok, the company, from its roots in China. But ultimately, those links can't be fully severed."

In a statement, TikTok vehemently denied any involvement in political censorship.

"We do not moderate or remove content based on political sensitivities. We have never removed content at the request of the Chinese government, nor have we been asked to."

You can read the full investigation into the TikTok spiral over at the ABC, or watch the Four Corners episode here.

If you need support, give Butterfly Foundation a call on 1800 33 4673 or chat online.

If you are in distress, please call Lifeline on 13 11 14 or chat online.

Under 25? You can reach Kids Helpline at 1800 55 1800 or chat online.

