Alt-Right Algorithms – YouTubers – Study Breaks

Posted: May 29, 2024 at 2:05 am

How is extremist right-wing content on TikTok's For You page so prevalent that it reappears no matter how often it's skipped or reported? Thanks to the alt-right algorithm.

Around 2015, some of the most popular YouTubers were white men working in a genre they labeled "dark humor." In this era, it was common for these men to use racial and homophobic slurs, make ableist comments and poke fun at women in their videos.

As young audiences grew older, problematic YouTubers such as PewDiePie and iDubbbz were quickly called out for their hateful comments and actions. The dark humor category of YouTubers lost more and more viewership over time, especially as TikTok became the hottest form of social media.

So where does that put us today? Despite the fallen fame of alt-right-aligned YouTubers, TikTok's algorithm still leads viewers down the same pipeline. Clips of Andrew Tate speaking hatefully about women, or snippets of Ben Shapiro mocking the LGBTQ+ community on his podcast, regularly appear on users' For You pages. People from all kinds of backgrounds come across these videos, even after swiping away or reporting the content.

Studies have evaluated the radicalization pipelines on both YouTube and TikTok and shown that this extremist content often violates community guidelines, yet it is still not taken down. This may be because extremist ideologies attract views, whether from people agreeing or disagreeing, so the videos generate active engagement. It could also be because the quantity of content on platforms like TikTok is so vast that it's hard to have every single video monitored and reviewed by staff.
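To make the engagement point concrete, here is a minimal, purely illustrative sketch of how a recommender that ranks videos by raw engagement can keep surfacing polarizing content. Every name, weight and number below is a hypothetical assumption for illustration, not a description of TikTok's actual ranking system.

```python
from dataclasses import dataclass

# Hypothetical example only -- not TikTok's real algorithm or weights.

@dataclass
class Video:
    title: str
    likes: int
    comments: int        # angry disagreement still counts as engagement
    watch_seconds: float
    reports: int

def engagement_score(v: Video) -> float:
    """Rank purely by engagement volume; reports barely dent the score."""
    positive = v.likes + 2 * v.comments + 0.1 * v.watch_seconds
    penalty = 0.5 * v.reports          # weak moderation signal
    return positive - penalty

feed = [
    Video("cat compilation", likes=500, comments=40, watch_seconds=9_000, reports=0),
    Video("rage-bait rant", likes=300, comments=900, watch_seconds=15_000, reports=120),
]

# The polarizing video outranks the benign one because arguments in the
# comments and hate-watching both count toward its score.
for v in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(v):8.1f}  {v.title}")
```

Under these assumed weights, the "rage-bait rant" scores roughly 3,540 against the cat video's 1,480, even after 120 reports, which is one simple way engagement-first ranking can outweigh user pushback.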

While it's mostly an annoyance for adult viewers to see alt-right pipeline videos on their For You pages, what about younger, more impressionable audiences? A recent study showed that 28% of teenage boys look up to Andrew Tate, who is known for his misogynistic takes, toxic masculinity and for being charged with rape and human trafficking. Andrew Tate is a false role model for young men and teenage boys, attracting audiences with toxic ideas of masculinity and wealth.

The alt-right pipeline typically includes ideas of misogyny, homophobia, racism, white supremacy and violence. Some content creators even go as far as encouraging their viewers to target and harass women, such as by accusing women gamers of cheating. This is extremely harmful to young viewers, who easily follow instructions from someone with power or influence. Who knows what level of violence followers of these ideologies are willing to commit?

As Generation Alpha is the first generation raised fully fluent in and immersed in technology and social media, society must evaluate the effects of this accessibility. With the alt-right algorithm still thriving on TikTok, it's hard to say how many young viewers are having their views and values shaped by it. The alt-right algorithm has simply moved to a new platform, and it's a neglected problem that needs to be addressed urgently.
