The dangers of deplatforming

The first time I saw an Andrew Tate video (and I'll be honest), I kinda laughed. Not because my thought process was, haha women r stupid, but because I thought he was playing a character. He was so over the top, I thought it must be ironic. Over the next few months, I watched in horror as self-proclaimed sigma males fawned over every word from his mouth. We all know how this story ends: Tate got clapped off the internet. He was deplatformed, and we all rejoiced.

In the aftermath, Tate lunged at the scraps of attention left over from his moment in the spotlight. It was clear I would never see his face again, outside of some niche and postmodern meme slideshows. Yet his impact never quite faded: new men picked up with misogyny right where he left off. I looked for something to blame for those months when I felt like kicking Tate off the internet meant we'd solved misogyny, and then I found it.

Deplatforming can be dangerous, even when we do it to people who deserve it, like Andrew Tate. It runs three risks: it stalls our search for real solutions to online hate, it could be weaponized against people who aren't agitators if they draw enough public ire, and it concentrates hateful discourse into spaces where it can fester and become worse.

Let me start by saying that deplatforming does an excellent job of getting hateful language out of spaces with high online traffic. However, it is important to recognize that deplatforming only lowers the visibility of content we don't like and often does not do more than that. It takes one user off of one platform; it does not prevent that user from finding other ways to spread hate. Deplatforming is not an effective solution for hatred because it does not address hatred's ideological roots, focusing instead on cleaning up a platform's political aesthetic.

That's all a wordy way of saying that while deplatforming does make sure that we don't have to see the ugliness of hatred very often, it is not the Swiss Army knife of ending phobia, despite its situational effectiveness. Deplatforming works sometimes, but the consequences are scary.

That being said, I'll try to prove myself by presenting my worst argument first. When we rush to deplatform, we focus on getting bad content out of our faces and feeds. Only secondarily do we consider what sort of impact deplatforming has on the world and whether it is an appropriate solution to what we really need to address: the ideologies that encourage aggressive rhetoric. White supremacy, patriarchal norms and a whole bunch of other buzzwords persist after their mouthpiece is deplatformed. It is easy to mistake deplatforming for a win against any of the aforementioned issues because we incorrectly identify agitators like Andrew Tate as the entire problem rather than as representatives of a harmful ideology. Simply put, we are giving those idiots way too much credit; you can just ignore them and they'll go away.

That is the entire nature of this game: none of the points that people like Andrew Tate make are new. There has been an Andrew Tate in every single generation, making misogyny look cool to a whole generation of impressionable boys. Deplatforming does not solve the problem we should be mad at; it tucks harmful ideology away until someone new comes around. Therefore, when we leap to it, we stop ourselves from seeking other solutions that may better address phobia.

This feels like an appropriate time to point out that getting canceled and deplatforming are almost uniquely American phenomena. There's a reason why companies such as YouTube, Twitter and Facebook dedicate most of their content screening efforts to our servers, and that reason is simple: profit. They filter our content first and foremost because we fuss the loudest, and our outcry is unprofitable for them. The only thing that matters during this outcry is its severity, not which group is being loud. I highlight this as a concern because the right-wing niche is growing in this country, whether we like it or not. I'll keep this point short and sweet: deplatforming can become anti-progressive very fast if political will changes.

Finally, let's take a look at the aftermath of deplatforming. Like I said at the beginning of my tirade, there is nothing more to deplatforming than getting something you don't like out of your face. In this instance, what you don't like is hateful speech, and your face is your feed. So where does this speech go when it's not in your face? It finds a new face. The analogy got weird at the end, but my point is that there's always a place online that accepts hate. When it's not on YouTube, it's on Reddit. In those lower-visibility spaces, however, the alt-right decentralizes and the things its members say become more concentrated. When there's no pushback, the radicalization has no floor. In those spaces, though we cannot see it, the alt-right continues to grow. That's why even after we get rid of Andrew Tate, people continue believing the same things he does. Worse still, deplatforming makes it so that those same propaganda-vulnerable people are fed even more poison.

With all that being said, I still believe that deplatforming has its place. Given the recent statements from Ye (aka Kanye West), I had to reckon with the fact that my favorite artist is morally horrible now. I think he's said enough; we can deplatform him. However, we should deplatform with the understanding that we are not solving any problems; we are just making our feeds match a political aesthetic that media companies profit from. In both Ye's and Tate's cases, the ban was warranted, but it's important that we recognize the underlying ideologies they represent and fight those with more rigor. We should not let an itch to deplatform distract us from that.
