Content Moderation, Section 230, and the First Amendment

Posted: May 29, 2020 at 1:03 am

Introduction

On Wednesday, White House press secretary Kayleigh McEnany told reporters that President Donald Trump intended to sign an executive order on Thursday regulating social media companies. This move comes after Twitter placed a fact-checking label on one of President Trump's tweets concerning voting by mail. While the digital age has led to an explosion of speech of many different forms and opinions, social media platforms have faced criticism from both the left and the right for the decisions they make regarding what content to leave up, take down, or otherwise moderate. Nevertheless, those who value freedom of expression or see the benefits technology brings should be concerned about calls for government regulation of private actors in this area.

Free Speech Rights and Regulation of Social Media

Critics of content-moderation decisions, whether to remove certain users or content or to add warnings or fact checks to that information, question whether platforms' choices to do so violate free speech rights. These internet platforms should be neutral toward all speech, the argument goes, and such decisions are currently biased. But these claims misunderstand what actually constitutes a violation of the First Amendment.

First, with each new content-moderation controversy, it has been pointed out that these are private platforms. The First Amendment's speech protections restrain government, not private actors, when it comes to the regulation of speech. Therefore, the First Amendment does not directly bind private actors such as social media companies.

Second, government regulation of private platforms, such as the regulations proposed in the executive order, could raise serious First Amendment concerns. Platforms themselves have First Amendment speech rights, and they exercise these rights when they themselves speak, such as by attaching a fact check to user-generated content. As Judge Andrew Napolitano explained on Fox News, "The president can say what he wants about Twitter and they can say what they want about him." Government attempts to control or regulate such decisions do not further free speech, but rather undermine the free speech rights of the platforms themselves.

It should also be concerning how these regulations could spill over into expression beyond social media. While the executive order may only concern the regulation of social media platforms, it could, if upheld, set a dangerous precedent that allows future government intervention into other speech rights. Particularly given a vague standard or catchall such as "otherwise objectionable," different officials could weaponize such terms to remove unpopular opinions from the other side.

Case Law Does Not Support Government Intervention Into Decisions Concerning Online Speech

The expected executive order argues that social media platforms serve as "the functional equivalent of a traditional public forum." This argument has been repeatedly rejected by the courts.

These arguments for the executive order rely on Packingham v. North Carolina, in which the Supreme Court held that state actors could not impose restrictions on access to internet platforms. Since Packingham, however, courts have repeatedly stated that private social media companies are not required to apply First Amendment free speech standards to their own content-moderation decisions. Both California state and federal courts have rejected such claims in cases brought by Prager University after YouTube placed some of its videos in restricted mode and limited its advertising. Earlier this week, in a lawsuit brought by activist Laura Loomer and Freedom Watch, the U.S. Court of Appeals for the D.C. Circuit ruled that private social media platforms were not places of public accommodation as defined by the D.C. Human Rights Act, and thus that arguments against private moderation based on requirements for places of public accommodation failed. Both federal and state courts have come to the same conclusion for a variety of platforms following decisions to ban users or remove content.

By carrying others' speech, social media platforms are not transformed into a public square. This principle has been applied to traditional media as well as to new digital platforms. Cases surrounding libraries, bookstores, and wire services reached similar conclusions in a pre-digital age. The protection of platforms regarding their decisions about what content to allow reflects general legal principles and is not a special handout. Additionally, in Manhattan Community Access Corp. v. Halleck, the Supreme Court held, in a decision written by Justice Brett Kavanaugh, that a privately operated public access television station was not a public forum bound by First Amendment standards. This case is likely more legally analogous to the current situation concerning social media platforms than Packingham is.

"Fairness" Would Actually Harm Conservative Voices Online

Many of the calls from the right to regulate social media demand that platforms be accountable and "fair." But requiring neutrality or removing Section 230 could result in a new version of the Fairness Doctrine and actually make it harder for new voices to be heard online.

Section 230, the law that limits the liability of an online platform for content created by its users and enables it to make moderation decisions regarding such content, makes it easier for new platforms to emerge. In doing so, it provides speakers with new ways to express themselves and allows for sets of rules that better fit their preferences. For example, when it comes to fact-checking political speech, Facebook and Twitter have taken different approaches, as seen in comments from Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey. But protection from liability and the ability to make different content-moderation decisions don't just protect the giant incumbents; they also allow new platforms and communities to develop without the risk that they get crushed before they can take hold. This can help expand speech to speakers who would otherwise have been left without a voice, creating a marketplace of ideas. As David French, senior editor of The Dispatch, wrote in Time regarding what Section 230 has allowed, "While different sites have different rules and boundaries, the overall breadth of free speech has been extraordinary." Think about all the ways user-generated content has kept us feeling connected during the current pandemic. Without Section 230, platforms would be forced either to engage in constant moderation that would likely silence many legitimate discussions or to engage in no moderation at all, leaving the internet a place not many people would enjoy.

But should the government require platforms' rules to be fairly enforced? In the past, this was tried with more traditional media under the Fairness Doctrine. The Fairness Doctrine obliged broadcasters licensed by the Federal Communications Commission (FCC) to ensure that coverage included opposing views from interested citizens. This rule resulted in radio and later television stations being required to carry certain responses and information, giving rise to concerns that the doctrine could chill speech and violate First Amendment rights. The FCC removed the rule during the Reagan Administration, and this change in part allowed for the rise of conservative talk radio.

Requiring neutrality or removing Section 230 could backfire on the conservative voices that feel liberal platforms are biased against them. As TechFreedom's Ashkhen Kazaryan explained, if platforms must be neutral to enjoy First Amendment protection, "websites tailored for specific populations cease to exist." This decline in diversity would be concerning both for conservative voices that might want a more family-friendly experience and for communities that may face persecution or discrimination, such as the LGBTQ community. Fairness may sound like an ideal, but government-imposed neutrality would likely result in more silence, not more voices.

Conclusion

Government threats to regulate online speech should be concerning regardless of which side of the aisle they come from. The internet has enabled citizens to hold the government accountable, facilitated communication and creation in innovative ways, and resulted in more opportunities for expression than ever before. Many of the rationales behind such calls for regulation misunderstand the fundamentals of free speech and could damage founding American values as well as the very voices they claim to protect.
