The Supreme Court moderates itself on social media regulation
In Netchoice, SCOTUS charts a careful course
Last year, Justice Kagan quipped that Supreme Court justices are “not, like, the nine greatest experts on the internet.” In keeping with that humility, Justice Kagan penned a wise and restrained decision for the Court clarifying how the First Amendment applies to social media companies. The bottom line: while platforms have robust First Amendment rights to curate content in their main feeds, they are by no means immune from regulation.
The careful approach should prohibit Orwellian speech diktats, but it also maintains space for key regulation of large social media platforms. Striking that balance is a win for free speech and our democracy.
How did we get here?
After Facebook, then-Twitter, YouTube, and other social media companies deplatformed former President Donald Trump in 2021, Florida and Texas passed laws to keep big platforms from doing that by restricting the way they moderate content. Florida’s law prohibited large platforms from “censor[ing]” or “disfavoring” posts based on their content or source and required platforms to “consistent[ly]” apply their moderation policies. Texas’ law prohibited large platforms from “censor[ing]” a user based on viewpoint. Both laws required platforms to make certain disclosures — including notifying users when they were censored — and enabled users to sue platforms that violated the restrictions.
The purpose of these laws was no secret. In signing the Florida bill, Governor DeSantis issued a press release, titled “Governor Ron DeSantis Signs Bill to Stop the Censorship of Floridians by Big Tech.” Governor Abbott asserted the Texas bill was a necessary response to “a dangerous movement by social media companies to silence conservative viewpoints and ideas.” Both governors were echoing the state legislators who drafted and passed the bills.
Following the laws’ passage, NetChoice — a tech industry trade association — sued both states to have the bills struck down. When the cases arrived at the Supreme Court, both sides took extreme positions. Florida argued that platforms’ content moderation is not protected by the First Amendment at all. At the opposite pole, NetChoice argued that the First Amendment bars nearly all regulation of the content platforms carry.
SCOTUS’s decision and its impact
The NetChoice decision, thankfully, rejected both extremes in a unanimous, narrow holding. Technically, the Court ruled only that neither lower court applied the correct standard in evaluating NetChoice’s facial challenges to the laws’ constitutionality, sending the cases back down for a closer look. SCOTUS directed those courts to assess the scope of the laws more precisely, determine how many (if any) of the laws’ applications are constitutional and how many (if any) are not, and decide whether the unconstitutional applications substantially outweigh the constitutional ones.
But beyond its narrow decision, SCOTUS offered some clear guideposts for how to assess what is and is not constitutional, making clear that at least some aspects of the Texas and Florida laws are likely unconstitutional.
The most immediate impact of the Court’s decision is that the core of the Texas and Florida laws will likely fall. That’s a good thing. Both laws were clearly aimed at propping up speakers and views in ways that violate the north star of First Amendment doctrine — i.e., that “no official, high or petty, can prescribe what shall be orthodox in politics, nationalism, religion, or other matters of opinion.” The decision should establish a bulwark preventing the government from the most blatant attempts at compelling large social media companies to distribute speech the government prefers and restrict speech it does not.
On the other hand, just because the First Amendment generally applies to platforms’ primary feeds does not mean the feeds are un-regulable; it only means that regulations must pass constitutional muster. As the Court explained, because the “whole project of the First Amendment” is to ensure “a well-functioning sphere of expression, in which citizens have access to information from many sources,” “the government can take varied measures, like enforcing competition laws, to protect that access.” What’s more, the majority noted that “[m]any possible interests relating to social media” might justify regulation, alluding elsewhere to the effects of social media on adolescents’ mental health. To survive, regulations of primary feeds will have to pass at least intermediate First Amendment scrutiny (see below for a caveat), requiring the government to demonstrate a “substantial governmental interest” that is “unrelated to the suppression of free expression.”
Beyond antitrust and child safety, the broader contours of the decision thus hold space for future regulation in other areas. Consider a few.
Transparency Regulations. Most straightforwardly, the Court’s decision should offer ample space for platform transparency requirements. That is a hopeful development for future transparency policymaking, including Congressional efforts such as the Platform Accountability and Transparency Act.
Public Accommodations Requirements. Another avenue of regulation the decision preserves is applying public accommodation laws to large platforms. Those laws have traditionally prevented bus stations, restaurants, theaters, and other hubs from discriminating against individuals based on protected characteristics such as race, religion, national origin, and disability.
Privacy and Process-Based Restrictions. Broadly, the decision encourages policymakers to pursue lawmaking driven by interests outside of forcing platforms to host content they would otherwise exclude. This should push would-be reformers to redouble efforts to pass comprehensive federal privacy reform. And it moves the conversation beyond focusing purely on platforms’ content moderation toward a broader imagining of how our online information ecosystem can better “be a well-functioning sphere of expression.”
Apart from preserving avenues of regulation, the decision will likely force policymakers to be more precise when defining “social media platforms” and more specific about the activities they aim to regulate. The very breadth of the Texas and Florida laws’ coverage and definitions is what all the justices identified as requiring a remand to the lower courts – an acknowledgement that we are no longer dealing with just a few centralized social media platforms. There is an “ever-growing number of apps, services, functionalities, and methods for communication and connection” that expansive state laws might implicate.
The majority’s decision, however, leaves some major questions unanswered and raises several areas of concern.
The Court declined to decide whether strict or intermediate First Amendment scrutiny applies to regulation of social media platforms – strict scrutiny would severely curtail the universe of permissible regulations. It also left unresolved what precisely distinguishes platforms’ “expressive activity,” which receives First Amendment protection, from their non-expressive activity.
The Court also declined to strike the statutes down outright, despite observing that their purpose was to benefit certain viewpoints and despite long-standing precedent holding such laws presumptively unconstitutional. That restraint is concerning and may signal a watering down of the strong medicine the Court has traditionally applied to viewpoint-based restrictions of speech.
Finally, Justice Alito’s and Justice Thomas’s invocations of “common-carrier doctrine” in their concurrences raise the concern that lower courts might hold that platforms are “common carriers” for all purposes. While applying some common-carrier-like requirements (e.g., a narrow set of public accommodation rules) to the largest platforms makes sense, there is no coherent common-carrier doctrine ready to be applied wholesale to platforms. Without engaging with the novel nature of digital platforms, particularly as compared to existing common carriers, any broad, un-nuanced determination that platforms are “common carriers” for all purposes could upend our digital sphere, severely curtailing platforms’ ability to curate speech.
Takeaway: A Welcome Dose of Humility and Caution
As Justice Kagan put it, the internet and social media platforms have brought a “dizzying transformation in how people communicate” – and this pace of development and change will continue. But the pace of change shouldn’t tempt us to neglect the principles surrounding free speech that are bedrock to our democracy. We can’t afford for this to be an all-or-nothing area of the law.
NetChoice holds space for us to continue to strive towards a place of balance — holding platforms accountable for their role as the digital squares of our democracy without throwing out the First Amendment along the way.