The Supreme Court hears arguments for two cases that could reshape the future of the internet

The Supreme Court is hearing oral arguments in Gonzalez v. Google, a case that could produce a landmark reinterpretation of Section 230 of the Communications Decency Act, and Twitter v. Taamneh, a related case about liability under federal anti-terrorism law. The cases could shape the future of the internet, as plaintiffs seek to hold Google and other companies liable for recommending terrorist content on platforms like YouTube. A major narrowing of Section 230 could affect everyone from tech giants to Wikipedia editors and Reddit mods.

  • The cases are submitted.

    NetChoice got the last word in the arguments over Texas’ law, with Clement saying the state had not proven social media’s common carrier status.

    Clement also pointed to a statement from an anti-child predator group, which said the law’s transparency requirements, barely discussed in these arguments, could give predators a “roadmap” to why their messages aren’t reaching kids.

    And with that, we’re wrapped for the day — we’ll be awaiting a decision in the coming months.


  • Gorsuch asks how market power comes into play here.

    He makes the point that unlike telegraphs, where there’s basically one way to run the wires, “here one can start a new platform, at least in theory, any time.”

    Nielson says when it comes to speech, it’s really not about market power at all.


  • US Solicitor General explains why moderation laws are different from net neutrality.

    Kagan asked what makes Internet Service Providers (ISPs) so different from social media platforms in what they can be required to carry. Prelogar said ISPs are “fundamentally different” because they are not engaging in expressive action but rather simply transmitting data to let users access websites.

    Kavanaugh followed up, asking with a hint of humor if he could buy into Prelogar’s argument without agreeing to net neutrality.

    “You can leave for another day all of the conduit questions that come up in the net neutrality context,” Prelogar said.


  • YouTube would be one heavy newspaper.

    “If YouTube were a newspaper, how much would it weigh?” Alito asked Clement.

    Alito was trying to nail down whether a news publisher is really the right metaphor for social media companies in how they decide what content to host. Clement made the point that even in a case where a parade organizer wanted to exclude an LGBT group, the court decided the organizer could make that decision, even though it welcomed a large swath of other participants.


  • “Why are all the dog photos white?”

    Clement suggests that’s what users might wonder if a white supremacist is posting dog photos on their social media account. He was making the point that a user’s particularly abhorrent views can color everything they post, making it reasonable to moderate their account as a whole.

    It came in response to a question from Kagan about whether an antisemite could be prevented from posting anything on a social media site, even cat videos.


  • Social media companies wouldn’t host suicide prevention posts under the states’ laws, according to NetChoice.

    Clement said that’s because under the states’ laws, the companies would also have to carry posts promoting suicide, since Texas’ law, for example, prohibits discrimination on the basis of viewpoint.


  • That’s a wrap on Florida’s arguments.

    Arguments over Texas’ social media law in NetChoice v. Paxton are just beginning now.


  • Coney Barrett worries about stumbling on “landmines” in a decision.

    The conservative justice asked the Solicitor General whether Section 230 would shield platforms from liability for boosting content like the Tide Pod challenge (in which people challenged each other to eat laundry detergent pods).

    Prelogar said that when a platform’s own conduct causes harm, it might not be protected by Section 230, but that’s beside the point of the First Amendment question here.

    “I totally agree but I also think there are a bunch of landmines,” Coney Barrett said.


  • Are social media platforms like telegraph companies?

    Gorsuch and Prelogar got into a rapid back-and-forth over whether social media companies can be considered common carriers like a telegraph company. Gorsuch argued that telegraph companies, despite being common carriers, would say they’re allowed to exclude some “bare minimum” amount of speech but are otherwise “open to all comers.”

    But Prelogar said it would be wrong to call that kind of exclusion curation by the telegraph companies. Unlike telegraph providers, social media companies compile a large volume of content in a way that represents the companies’ own free expression.


  • Now the US Solicitor General weighs in.

    US Solicitor General Elizabeth Prelogar is now arguing in support of NetChoice. The Biden administration weighed in on the case in briefs last year.

    The state laws in Florida and Texas “don’t withstand constitutional scrutiny,” she says.


  • “Let’s do only puppy dogs in Florida.”

    Clement said that might be the approach of social media sites if the court upholds Florida’s law. That’s because it requires platforms to enforce content moderation rules in a consistent manner, a requirement that can be hard to parse. As a result, platforms might avoid hosting controversial topics altogether.


  • Kagan suggests Venmo could be made to host transactions regardless of viewpoint.

    Kagan is trying to find the boundaries of sites’ First Amendment rights, asking whether direct messaging services and payment platforms like Venmo could be made to host accounts regardless of viewpoint.

    Kagan said NetChoice’s argument about Facebook’s editorial discretion seems to work because Facebook is engaged in speech activities. But Venmo, she said, is not.


  • “Exactly what are they saying?” Thomas asks about social media algorithms.

    That question gets to the issue of whether the platforms have an editorial perspective when they use algorithms to choose what they show.

    “Is it a consistent message?” Thomas asked.

    It’s important for NetChoice to show that platforms exercise editorial discretion similar to a newspaper, which is allowed to accept or reject op-ed submissions as it sees fit.

    Clement said that because there is so much material on their sites, social media companies’ use of algorithms amounts to a huge volume of editorial expression.


  • Alito asks if Google could cut off Tucker Carlson or Rachel Maddow’s Gmail accounts.

    Conservative Justice Samuel Alito asked if Florida’s law would cover Gmail. Clement said he thinks it could.

    Clement seemed unsure whether, without the law, that would mean Gmail could cut off the accounts of the major conservative and liberal talk show hosts.


  • It’s NetChoice’s turn to defend its arguments against the Florida law.

    Paul Clement is arguing for NetChoice, saying Florida’s law violates the First Amendment “several times over.”


  • Could Florida make a bookstore display books only in alphabetical order?

    Conservative Justice Amy Coney Barrett asks if the law would impact how information is organized, not just whether it’s hosted. She asks whether Florida could pass a law requiring a bookstore not to favor certain books in its display.

    “Don’t all methods of organization represent some kind of judgment?” she asked.

    Whitaker said “the question of organization is analytically distinct” from that of hosting and that despite prohibitions on shadow-banning (i.e. severely downranking content), platforms can organize content however they’d like.


  • Whitaker explains why social media companies can be treated like wireless carriers.

    In an exchange with conservative Justice Neil Gorsuch, Whitaker explained why it’s appropriate to compare social media companies to common carriers like wireless carriers, which can be barred from silencing speech.

    The “principal function of a social media site is to enable communication,” Whitaker said, adding that the more public forums social media platforms tend to host don’t change that.

    Verizon wouldn’t be allowed to censor a conference call any more than a one-to-one call, he said.


  • Gorsuch asks if the court will need to look at Section 230 to decide this case.

    Whitaker said the question of whether tech’s legal liability shield for hosting or moderating users’ content preempts the state laws won’t “dispose of the case.” Gorsuch suggested he would return to this topic later on.


  • Kagan points to the motivation behind the content moderation laws.

    Liberal Justice Elena Kagan alluded to what brought the Florida content moderation law about: platforms’ decisions to exclude speech of anti-vaxxers and insurrectionists.

    “That’s what motivated these laws, isn’t it?” Kagan asked.

    Whitaker earlier said common carriers have “always conducted their businesses” according to “general rules of decorum.” But he noted that “upwards of 99 percent of what goes on the platforms is basically passed through without review.”


  • Arguments have begun in Moody v. NetChoice.

    You can tune in directly on the Supreme Court’s site.


    Live Oral Argument Audio: www.supremecourt.gov

  • Adi Robertson

    May 18, 2023

    Supreme Court rules against reexamining Section 230


    The Supreme Court has declined to consider reinterpreting foundational internet law Section 230, saying it wasn’t necessary for deciding the terrorism-related case Gonzalez v. Google. The ruling came alongside a separate but related ruling in Twitter v. Taamneh, where the court concluded that Twitter had not aided and abetted terrorism.

    In an unsigned opinion issued today, the court said the underlying complaints in Gonzalez were weak, regardless of Section 230’s applicability. The case involved the family of a woman killed in a terrorist attack suing Google, which the family claimed had violated the law by recommending terrorist content on YouTube. They sought to hold Google liable under anti-terrorism laws.

  • T.C. Sottek

    Feb 22, 2023

    Guns, banks, and 280 characters.

    During a brief rebuttal, Twitter tried to undercut some of the very colorful hypotheticals raised throughout the day (like Osama Bin Laden walking into a bank). Waxman did not get to finish his thought before the court adjourned, but he seemed to be suggesting the absurdity of comparing Twitter’s functions to giving direct material assistance to a known terrorist who walks through your door.

    “There are 545 pages in this complaint and there are 4 that mention recommendations,” Waxman said.


  • Adi Robertson

    Feb 22, 2023

    An old Twitter statement is coming back to haunt it.

    In 2014, Mother Jones wrote that Twitter was deliberately avoiding taking aim at ISIS, quoting a Twitter official saying that “one man’s terrorist is another man’s freedom fighter.” Taamneh attorney Eric Schnapper brought the quote back up today, this time to argue that Twitter should in fact be found liable for supporting terrorists.


  • Adi Robertson

    Feb 22, 2023

    You wouldn’t sell Osama Bin Laden a hospital.

    Twitter has been making the case that aiding and abetting requires helping a specific terrorist attack, but the justices are referencing cases suggesting that aiding the enterprise at all can be enough, even if the assistance isn’t used for a specific attack.


  • Adi Robertson

    Feb 22, 2023

    Justice Kavanaugh finally introduces the First Amendment to the conversation.

    The subject of speech has been notably absent from today’s arguments. Kavanaugh asks if interviewing a terrorist leader on TV would provide the same material support as a Twitter recommendation. “I think the First Amendment would solve that problem,” says Schnapper.