De-platforming Is a Fix, But Only a Short-Term One – Just Security

Though we are in the midst of a techno-reckoning that has been neatly packaged as the Great Deplatforming, the measures taken by individual companies following the attack on the Capitol have been radical to varying degrees. Applying their now-customary methods of content moderation to high-profile U.S. users, major social media companies like Facebook and Twitter took action against many on the right, including President Donald Trump, Representative Marjorie Taylor Greene, and 70,000 accounts linked with QAnon, for inciting violence and violating terms of service. More atypically, those operating the previously mostly invisible digital infrastructure on which platforms like Facebook and Twitter are built also displayed their power, taking down Parler, a "free-speech" alternative to Twitter. Apple, after issuing a short-fuse warning Parler couldn't possibly heed, removed Parler from its App Store; Google removed the social network from its Google Play store; and Amazon Web Services (AWS) took down its service entirely. These emergency measures should not be taken lightly, but they have reportedly greatly reduced the spread of election disinformation online. While new technologies and technology companies have contributed to, perhaps even caused, some of the problems that led to the events of Jan. 6, Pandora's box has been opened. Conspiracy theorists and disinformation remain in the world and online. At the end of the day, solving that problem will require more than just a technical solution.

The App Store and Google Play store are the gatekeepers to hundreds of millions of American smartphones. Although Parler had previously been warned about its bad behavior, the landscape shifted quickly after the events of Jan. 6. When Parler was taken off the App Store and Google Play store, would-be users could no longer download it onto their phones, and the company could no longer push updates to its app. While Parler's statistics are not public, 93 percent of video views on Twitter come from mobile devices. When Google and Apple remove an app from their respective stores, they do not quite issue a platform like Parler a death sentence, but such actions are extremely damaging to a network's ability to grow and maintain a user base.

While Apple and Google can create serious or even fatal business problems, a web hosting company can spark an immediate crisis for a hosted service should it pull the plug. AWS provides computing and data storage for much of the internet; in 2019, the company maintained about 45 percent of the internet's cloud infrastructure. When the company abruptly canceled Parler's contract after the Capitol attack, the website went down. But death on the internet is short-lived. About a week after being taken down, Parler registered its domain with Epik, a company known for hosting far-right content, and announced, "We will resolve any challenge before us and plan to welcome all of you back soon." It is possible that Parler will follow in the footsteps of the far-right social network Gab and host its own servers at an undisclosed data center. This lifeline is more expensive and more difficult to set up and maintain.

The internet has many entry points for the moderation of content. Anything that provides for the movement of bits could theoretically become a selective barrier: with varying degrees of precision, a cloud provider, wifi router maker, ISP, or owner of a copper or fiber wire can determine what passes across some portion of the internet, whether through literal technical intervention or by using its broader leverage to compel an application provider that relies upon it to function to take action. But the deeper into the infrastructure one goes, further removed from specific instances of communication between people and the specific apps they use to communicate, the rarer and more significant a deplatforming becomes. AWS justified its takedown of Parler by noting that its acceptable use policy states that users may not host certain content, including content that "violate[s] the rights of others, or that may be harmful to others," nor may they use the service in a way that "poses a security risk to the Service Offerings or any third party." These broad terms are seldom enforced outside of services engaged in outright fraud or hacking against their own users.

In its opposition to Parler's application for a temporary restraining order on the basis of breach-of-contract and antitrust claims, AWS noted that it reported to Parler dozens of examples of content that encouraged violence, including calls to hang public officials, kill Black and Jewish people, and shoot police officers in the head. Crucially, on Parler, hatred did not lurk on the fringes, and calls to violence were not isolated voices lost in a sea of content. Instead, it was common to come across posts and comments calling for violence or civil war. And the dozens of examples cited by AWS are only a fraction of a fraction of the content that incited violence.

As part of my research, I joined Parler in early November, right before the election. It was clear that the network facilitated and bred hatred and disinformation, and its ecosystem was immediately evident. A prominent figure, such as Republican Senator Ted Cruz or Fox News commentator Sean Hannity, would post a provocative article, usually with no direct incitement to violence. However, their post, or "Parley," was meat for the piranhas. For instance, on both Parler and Twitter, Cruz posted an article from the Washington Examiner titled "Graphic warning: Reported Black Lives Matter counterprotesters sucker punch and stomp on man leaving DC Trump rally," commenting, "Why is the media ignoring this? Why are Dems silent?" The response to the posts on the two platforms was drastically different.

On Twitter, many of the responses were critical of the article, noting that the video had been edited to remove the beginning of the incident, in which the man who is eventually sucker punched shoves a man to the ground and kicks him as other Trump supporters shout "kill them." Some even condoned the sucker punch, with one user tweeting, "[…] he got exactly what he deserved. Make better choices, then." Other commenters bashed the media and liberals, tweeting, "They blame the victim and make up lies," or, "Because they've gotten away with it since it started. […] There are no politicians that will hold them accountable." A small minority foreshadowed future violence: "We are only going to put up with the crap for a little while […] We will stand and fight and have no mercy on them."

But on Parler, the comments were far more extreme. Comments on the platform read: "[…] We have the 2nd amendment on our side. Put a damn mask on to cover you[r] identi[ty] lock and load, and start cleaning the streets of this vial filth," and "[If] I see any BLM or ANTIFA and I am going to pull my gun and start shooting! F[***] those a[******] communist f[****]!"

Even more dangerously, conservatives used Twitter and Parler differently. For instance, Sean Hannity posted on Parler that "Antifa and Stalinist Sympathizers Disguised in Trump Gear Identified in DC Protests," a claim based on an article that is false. There is no corresponding post on his Twitter account. Politicians and commentators can thus tailor different messages to different social media platforms. As journalist Nick Martin noted, Representative Paul Gosar (R-Ariz.) seemingly parleyed in support of the Jan. 6 raid of the Capitol while simultaneously tweeting a soft condemnation of it.

Thus, when it comes to political purpose, Parler served two main roles for those on the right: it was a community, a safe space to express and consume disinformation and radical viewpoints, and it provided a forum in which to collect and reinterpret mainstream messages. To hear a dog whistle, you need a dog's ear, after all. Parler is not the first network to have served these functions, and it won't be the last. In fact, even without the internet, Parler can be replaced.

Without Parler or Twitter, disinformation and hatred, coded or overt, will continue to be broadcast. The Trump White House and Fox News were, by some researchers' findings, the largest spreaders of fake news. Even without Twitter or the White House, Trump will retain a spotlight, if only through right-wing news organizations. Plus, the U.S. Congress now has at least two members who have publicly supported QAnon, including Marjorie Taylor Greene, who in 2017 expressed her belief that having Trump as president provided the country with a "once-in-a-lifetime opportunity to take [the] global cabal of satan-worshipping pedophiles out." Hannity no longer has his Parler account, but he still spewed disinformation about Antifa masquerading as pro-Trumpers at the Capitol on his cable show, which averages 4.5 million viewers a night.

And without Parler or Twitter, disinformation and hatred, coded or overt, from radical elites will continue to be noted and interpreted. As recently as this summer, QAnon Facebook groups had millions of members. When those groups were shut down, many members moved to Parler, a platform consisting of adults who joined it consensually, presumably to have discussions like the ones they were having and to consume content like what they were seeing. With Parler gone and Twitter and Facebook cracking down on conspiracy theories, millions have downloaded the encrypted messaging apps Signal and Telegram, the latter of which allows groups of up to 200,000 people. This doesn't absolve Parler of responsibility, nor does it mean that any action taken is futile. But conspiracy theorists won't disappear; they'll migrate.

In the wake of all this, the question arises: what is there to be done? Big Tech has already answered that question in its own way, with Apple, Google, and AWS taking aggressive measures to disable the platform. The moves they made were probably the right ones, at least in the short term, but problems remain. Millions of Americans believe the "big lie" that the 2020 election was stolen, a problem for which there is no technical solution. At a certain point, the question of what to do with Parler is only part of the broader one of how society should cope with the fact that segments of the population are living in different realities. And that's a far trickier problem, one that, when the dust settles and platforms can no longer reasonably cite the imminent threat of violence, we will have to solve.
