Deplatforming Works

The dust is still settling after Alex Jones's InfoWars was more or less simultaneously banned by YouTube, Spotify, Apple, and Facebook. The move has spawned thousands of takes about whether deplatforming Jones was the right move or a slippery slope toward more censorship. But just as important to consider: Will it work?

This is called "deplatforming," or "no platforming": social media companies (sans Twitter, which says he hasn't broken its rules) have decided to stop being complicit in spreading Jones's conspiracy theories and hate. And we've seen no indication Jones will stop. But will his business remain viable, and will his influence wane?

The knee-jerk reaction, among Jones and some parts of the conservative movement, is that banning Jones will only make him stronger. InfoWars noted (correctly) that Google searches for InfoWars "skyrocket after tech purge," and added that "Silicon Valley's censorship campaign backfires as interest in InfoWars goes through the roof." The Ringer, meanwhile, noted "the paradox of Alex Jones, Infowars, the alt-right, and this whole unfortunate orbit of web-taught, forum-dwelling eugenicists: They are nothing without YouTube, and yet they're nothing without getting banned, dramatically, from YouTube (or wherever else)."

It's true that Silicon Valley's lethargy on the far right, aided by endless press coverage, helped amplify their message and turned the far right into a real, powerful political force in the United States. And the Streisand Effect is definitely a real thing: Trying to censor something, even if that censorship is warranted, is only going to drive interest in it.

But the belief among people who have studied deplatforming is that, in the long term, it does work, though it may have some unintended consequences that have not been fully understood yet.

"We've been running a research project over the last year, and when someone relatively famous gets no-platformed by Facebook or Twitter or YouTube, there's an initial flashpoint, where some of their audience will move with them," Joan Donovan, Data & Society's platform accountability research lead, told me on the phone, "but generally the falloff is pretty significant and they don't gain the same amplification power they had prior to the moment they were taken off these bigger platforms."

There's not a ton of research on this, but the work that has been done so far is promising. A study published by researchers at Georgia Tech last year found that banning Reddit's most toxic subreddits resulted in less hate speech elsewhere on the site, especially from the people who had been active on those subreddits.

Early results from Data & Society sent to an academic listserv in 2017 noted that "it's unclear what the unintended effects of no-platforming will be in the near and distant future. Right now, this can be construed as an incredibly positive step that platforms are making in responding to public complaints that their services are being used to spread hate speech and further radicalize individuals. However, there could be other unintended consequences. There has already been pushback on the right about the capacity and ethics of technology companies making these decisions. We've also seen an exodus towards sites like Gab.ai and away from the more mainstream social media networks."

There are lots of examples of people who have been deplatformed and have seen their power wane. After he lost his Fox News show, Glenn Beck couldn't sustain his influence; The Blaze reaches only a fraction of the people he used to. Milo Yiannopoulos, the former Breitbart personality, was permanently banned from Twitter for inciting targeted harassment campaigns against actress Leslie Jones, and he resigned from Breitbart over comments he made about pedophilia on a podcast. His general prominence in public discourse has waned ever since.

"I think the anecdotes are what makes a difference here: each individual, when you add them up, you get a net effect. You don't need much data behind it to point out that with Milo, he lost Twitter, and the result was he lost a lot," Angelo Carusone, president of Media Matters, which monitors conservative media and is studying deplatforming, told me on the phone. "He lost his ability to be influential, or at least to project a veneer of influence."

Deepfakes, the AI-assisted technology used to create fake celebrity porn, grew in popularity after Motherboard reported on them. When Reddit, Pornhub, Gfycat, and others banned the content, there were brief cries of censorship and a brief spike in interest. But today, deepfakes are only being made and shared on the margins, in private forums and smaller, 4chan-like image boards.

"Yes, we must take away the kinds of coordinative power theyre able to gain on platforms"

One of the most important things to keep in mind when predicting what may happen to InfoWars is how most people consume media these days. These platforms are so powerful for a reason: the vast majority of Americans use them every single day, and many people use social media as their only source of news. Social media is designed to be habit-forming, and many thousands of hours of research have been put into making sure these platforms are a daily habit; the question is whether Alex Jones and InfoWars are going to remain a daily habit after the initial Streisand Effect spike.

"A lot of Jones's programming is impromptu, where he's doing emergency broadcasts drunk in his house at 1 AM," Carusone said. "Without YouTube's push notifications or algorithms, there's no way anyone would be watching that."

Of course, getting banned from his major platforms won't make Jones disappear, just like it hasn't made Milo or Beck completely disappear. As platforms have begun to ban certain types of content, alternative platforms like Gab.ai and Voat have popped up, where more or less anything goes. These platforms, too, were hyped as potentially powerful alternatives to the big social media players, but are largely struggling and arguably no more relevant than the standard message boards that have been used by the far right to organize for decades.

Nonetheless, the concern among academics is that, as hate groups move to darker corners of the internet, some of their old followers may move with them and become further radicalized.

"The good that comes with deplatforming is, their main goal was to redpill or get people within mainstream communities more in line with their beliefs, so we need to get them off those platforms," Robyn Caplan, a PhD student at Rutgers University and Data & Society affiliate, told me on the phone. "But now we've put them down into their holes where they were before, and they could strengthen their beliefs and become more extreme."

The question is whether it's more harmful to society to have many millions of people exposed to kinda hateful content or to have a much smaller number of ultra-radicalized true believers.

Donovan believes that, ultimately, it's important to deplatform people when their rhetoric is resulting in negative, real-world consequences: "The way Jones activates his audiences has implications for people who have already been victimized," she said. "We have always had groups of white supremacists, misogynists, and violent insurrectionists joining message boards. But social media has made these tools much more powerful. So yes, we must take away the kinds of coordinative power they're able to gain on platforms."

"Alex Jones is not the only person being deplatformed or who has been deranked"

There are a couple other things worth mentioning. First, Jones has been deplatformed before, in the late 1990s and early 2000s, when he lost his radio shows. Jones was able to build something of a dedicated, organic audience long before YouTube in part by executive producing and distributing the 9/11 conspiracy theory documentary Loose Change. InfoWars was streaming video on its own website before it was streaming on YouTube. He is nothing if not persistent.

Second, the people who will stay on YouTube but won't follow Jones are not suddenly going to be earnestly consuming the New York Times. Jones became popular on social media because he was vitriolic, and because social media algorithms favor vitriolic, high-velocity content.

"Not only did Jones threaten and pantomime shooting Robert Mueller, the reason he did is because he said Mueller's a demon," Carusone said. "If you're already plugged into the demon algorithm on YouTube, there's plenty of other people spewing demon stuff for you there."

Most importantly, we need to remember that Jones's banning, and to a lesser extent Milo's, only became major national news because it fits into a false narrative that Silicon Valley censors only conservatives, which has been posited by Ted Cruz, Congressman Jim Jordan, and, recently, Donald Trump.

Deplatforming works best when the people being deplatformed don't have any power to begin with. And we're not talking here about people from marginalized communities who have self-censored or left social media because of far-right harassment and hate campaigns (and who could, in theory, come back with more proactive moderation by large platforms).

"Alex Jones is not the only person being deplatformed or who has been deranked," Caplan said. "We need to puncture this myth that it's only affecting far-right people. Trans rights activists, Black Lives Matter organizers, LGBTQI people have been demonetized or deranked. The reason we're talking about far-right people is that they have coverage on Fox News and representatives in Congress holding hearings. They already have political power."
