First Amendment & Social Media Censorship: Breaking Down NetChoice, LLC v. Paxton
Social media platforms have become some of the most prominent arenas for debates over freedom of speech in recent years. While the First Amendment is intended to protect citizens from government infringement of their right to freedom of speech, the law is less clear about what role the First Amendment plays in protecting speech expressed on web-based platforms owned and operated by private companies like Meta (Facebook/Instagram), Twitter, TikTok, and the like.
Recent rulings from the nation’s highest court may offer some insights on how the justices interpret the First Amendment as it applies to censorship on social media platforms.
On May 31, the Supreme Court granted an emergency stay request to tech industry companies that petitioned against the Texas Legislature’s House Bill 20. The bill, originally passed in September of 2021, would prohibit social media platforms from blocking, removing or otherwise discriminating against users’ posts on the basis of political views. Notably, implementation of the bill was blocked by the Supreme Court in a 5-4 decision supported by a surprising blend of liberal and conservative justices: Chief Justice John Roberts and Justices Breyer, Sotomayor, Kavanaugh, and Barrett (you can read Justice Alito’s reasons for dissenting here).
While the question of the law’s constitutionality remains unresolved for now, the high court’s ruling means that the law will not be able to take effect while the case—NetChoice, LLC v. Paxton—continues through the U.S. Court of Appeals for the Fifth Circuit.
Given the significance of this case for freedom of speech, let’s dive deeper into some* of the arguments presented by both sides to understand the potential implications for both the First Amendment and social media censorship.
*Note: This blog post covers only a fraction of the hundreds of documents filed for this case so far. It is not intended to be a comprehensive assessment of all arguments involved, but rather an overview of First Amendment-related arguments both for and against government regulation of social media companies, as well as the potential implications this case could have for freedom of speech.
Arguments Presented Against H.B. 20
In the initial application to block the implementation of H.B. 20 filed on May 13, 2022, the applicants (NetChoice, LLC and the Computer and Communications Industry Association) presented several arguments related to three central claims about the harms H.B. 20 would cause. The first claim was that the bill would negatively impact business. The applicants explained this by saying, “because there is no ‘off-switch’ to platforms’ current operations, the cost of revamping the websites’ operations would undo years of work and billions of dollars spent on developing some platforms’ current systems” (p. 3).
Their second claim was that H.B. 20 would lead to the proliferation of “objectionable viewpoints.” The applicants cited several high-profile examples from recent years including Russian propaganda about the invasion of Ukraine, ISIS propaganda, pro-Holocaust content and posts encouraging disordered eating among children and adolescents.
Finally, the applicants claimed that this bill would infringe on the platforms’ editorial freedom. They explained this saying the bill would “[impose] related burdensome operational and disclosure requirements designed to chill the millions of expressive editorial choices that platforms make each day” (p. 1). Their supporting arguments for this claim repeatedly cited the Supreme Court precedent established by Reno v. ACLU (1997) to justify their arguments about social media companies’ editorial control over content published and disseminated on their websites. The applicants pointed out that all platforms have their own hate speech policies, and also “engage in speech they author themselves, through warning labels, disclaimers, links to related sources, and other commentary they deem important” (p. 8).
Amongst the many direct references to the First Amendment, the applicants cited Supreme Court precedents set by Tornillo, PG&E and Hurley to argue that the First Amendment “[protects] the rights of private entities (a newspaper with market power, a monopoly public utility, and parade organizers) not to disseminate speech generated by others (candidates, customers, and parade participants)” (p. 19).
After the initial application was filed, multiple parties submitted additional supporting documents in the middle of May. In one such filing submitted on May 17, the summary of the applicants’ argument was as follows:
“HB20 will have an unprecedented detrimental effect on online platforms as we know them. It will transform social media platforms into online repositories of vile, graphic, harmful, hateful, and fraudulent content, of no utility to the individuals who currently engage in those communities. And it will flood otherwise useful web services with wasteful and irrelevant content. A single-sentence, unreasoned order is an unwarranted way to mandate this devolution.”
There were dozens of pages of documents filed on behalf of the applicants prior to the Supreme Court’s decision to issue the emergency stay order. For more detailed information on the applicants’ arguments, you can find them here.
Arguments in Favor of H.B. 20
On May 18, Attorney General of Texas Ken Paxton filed a response to the applicants. Some of the key arguments brought up by the respondent included:
- Citizens ought to be guaranteed access to the “modern public square”. The argument here is that removing content or blocking users on the basis of their expressed viewpoints may constitute an exclusion from the digital public sphere.
- The “Hosting Rule” in H.B. 20 does not prohibit social media platforms from removing entire categories of content. For example, the respondent argued that these companies could ban all foreign government speech if they don’t want to host Russian propaganda about the invasion of Ukraine (as long as the content bans are applied equally). Furthermore, Paxton argued, H.B. 20 only applies to expressions shared or received in Texas specifically.
- The “Hosting Rule” does not implicate the First Amendment because it “regulates conduct, not speech—specifically, the platforms’ discriminatory refusal to provide, or discriminatory reduction of, service to classes of customers based on viewpoint” (p. 21). The respondent also argued that, even if the First Amendment is implicated by H.B. 20, social media companies may be viewed as “common carriers” for communications because they “hold themselves open as willing to do business with all comers on equal terms; they are communications enterprises; they are demonstrably affected with a ‘public interest’; and they enjoy statutory limitations on liability” (p. 26).
- For more detailed information on the history of common carriage laws, check out this extensive guide from Cybertelecom.
- Social media companies have repeatedly claimed that they neither publish nor edit content (which offers them some legal immunity under Section 230 of the Communications Decency Act of 1996), but they allegedly contradicted themselves when arguing that “HB 20 limits their editorial discretion over user content in violation of the First Amendment” (p. 14).
First Amendment Issues to Consider in Wake of NetChoice, LLC v. Paxton
As mentioned at the beginning of this post, the Supreme Court has not determined whether H.B. 20 infringes upon the First Amendment rights of social media companies. By granting the emergency stay request, the justices effectively blocked the law from taking effect while the lower courts continue to assess its constitutionality. This process could take several months to resolve, so in the meantime, we ask you to reflect on the following questions related to arguments presented by each side in NetChoice, LLC v. Paxton:
- If large social media platforms may be considered “common carriers” for public discourse, should they be required to host [what may be considered] hate speech and/or graphic content on their platforms?
- By what standards should social media companies determine the acceptability of content published and/or disseminated on their platforms? In other words, how could these platforms realistically determine what is merely an expression of speech versus potentially/actually harmful content?
- Should internet-based, user-generated content platforms be held to the same rules and standards as print-based news platforms? Should Supreme Court precedent arising from the unanimous decision in Miami Herald Publishing Co. v. Tornillo (1974) apply?
- Should we consider repealing Section 230 of the Communications Decency Act and hold social media companies liable for what their users publish or disseminate on their platforms?
We welcome you to share your thoughts, insights or additional questions you have about this case in the comments section below this post!