Published on April 6, 2023

Up next: the lawsuit threatening your YouTube watch queue

By Victoria Nemiah · 9 min read · Video news

How the Wolf of Wall Street almost killed the internet

Back in the 1990s, a lot was going on. While we were learning to do the Macarena and struggling to keep our Tamagotchis alive, the internet was starting to really take off. In early court cases, judges recognized that these new websites were acting like mere “distributors” of content (think newsstands) instead of “publishers” (think newspapers), ensuring that websites wouldn’t drown in legal claims for third-party content they hosted.

Until, that is, the Wolf of Wall Street entered the fray.

In 1994, Jordan Belfort’s popular investment firm Stratton Oakmont was going through some leadership changes after Belfort was ousted for securities fraud. Stratton was a particularly hot topic on Prodigy, an early online bulletin board host, so when an anonymous user posted that Stratton had committed fraud during an IPO and called the firm a “cult of brokers who either lie for a living or get fired,” it didn’t take Stratton Oakmont long to sue.

Stratton argued that Prodigy, as the platform hosting the content, should be treated as the content’s publisher — an argument akin to holding Reddit responsible for an untrue comment in one of their subreddits. After considering the facts, the court held that because Prodigy tried to moderate the content on its site (including screening out offensive language), it could be held responsible for the post.

Thankfully, Congress quickly intervened. In 1996, they enacted the Communications Decency Act, including Section 230, which specifically overrode the court’s approach to Stratton and established new rules around liability for internet platforms.

Roughly speaking, Section 230 says that:

  1. Internet platforms can’t be treated as the publisher of things third parties post on them (i.e., they shouldn’t be treated like newspapers, which are held responsible if they put out misinformation).
  2. Internet platforms can’t be held liable for trying to intervene and take down obscene, violent, harassing, or otherwise objectionable content (i.e., they shouldn’t be punished for attempting “good Samaritan” behavior, even when they sometimes miss things).

With this statutory course correction, Congress ensured immunity for internet platforms, and the web took off.

Same law, different year

Fast forward nearly 30 years, and Section 230 continues to protect online platforms from liability for the content their users post. There’s just one problem: the exact wording of Section 230 was written to address the internet of 1996, but online platforms look a lot different today than the bulletin boards and forums of yore.

This brings us to a current Supreme Court case that’s had the media asking a spicy question: is this a ruling that could end the internet as we know it?

The case of Gonzalez v. Google was brought by a California family after their daughter was tragically killed during an ISIS terror attack in Paris. The family takes issue with the fact that when someone watches an ISIS-related video on YouTube (Google), the sidebar will serve them “recommendations” that include more ISIS-related videos. They argue this “aids and abets” terrorism.

Normally, this is the kind of suit that Section 230 would stop in its tracks. But the Gonzalez family has a novel theory: They argue that when YouTube makes recommendations in the sidebar, it is creating something original rather than simply publishing third-party content. Put simply, they argue that these recommendation choices are YouTube’s own speech.

Not like the nine greatest experts on the internet

Reading the arguments that each side submitted to the Supreme Court, you’d think there were two different versions of the law floating around.

The lawyers for Gonzalez paint a picture of Congress aiming to protect platforms from lawsuits over only two specific choices: failing to take a video down and choosing to take a video down. They claim Congress didn’t mean to make tech companies immune from liability in every case. In their view, YouTube can safely host an ISIS video on an individual page, and it can make content moderation decisions to leave that video up or take it down. But the moment it does something like feature the video in a “similar videos” sidebar, add it to an autoplay queue, or reference it in an email, that content becomes YouTube’s own, and YouTube can be held liable for hyping ISIS. Going a step further at oral argument, Gonzalez’s lawyer asserted that Google could be liable for its choice of which result to show first when someone searches Google for the word “ISIS.”

On the other side, Google envisions a world where Section 230 stops any lawsuit about third-party internet content before it even starts. In their view, all of these YouTube features are part of presenting user-generated content to an audience, no different than the simple individual video pages that gave the platform its start. They argue that in an internet brimming with content, there is literally no way to present things that doesn’t involve some deliberate prioritizing of what to show. Taking their argument to its logical extreme, Google’s lawyers even claimed that Section 230 would protect a platform against suit over a deliberately pro-ISIS algorithm.

As both sides made these leaps during oral arguments, it became clear the Supreme Court wasn’t having it. Despite Justice Kagan’s admission that she and her peers “are not like the nine greatest experts on the internet,” the justices grilled both sides on hypotheticals involving everything from cat videos to AI-generated content. Even Justice Thomas threw in some zingers. Ultimately, the justices expressed frustration that they weren’t being offered any less-extreme approaches to interpreting the statute.

The court also looked closely at an argument advanced by a number of amicus briefs, which are statements submitted by third parties advocating for their own preferred outcome. Organizations like Reddit, Wikimedia, and the ACLU argue that narrowing the interpretation of Section 230 would risk “devastating the Internet.” From the loss of forums for free speech to the decimation of smaller internet platforms, these briefs highlight the many unintended consequences that could flow from a decision to allow the current case against Google to proceed.

What comes next

The Supreme Court is expected to hand down a decision by June, so for now we watch and wait. In the meantime, it’s likely that large user-generated content platforms are studying clues to figure out how the case might go.

A few outcomes are possible. First, the court could uphold the lower court decisions holding that Section 230 immunizes Google from liability in this case. Second, the justices could find in favor of the Gonzalez family, taking 230 immunity off the table and sending the lawsuit on to its next question: whether YouTube’s conduct violates specific antiterrorism laws. Finally, the court could decline to rule on the Section 230 question at all, citing procedural issues or the low likelihood that Gonzalez would succeed on the antiterrorism claims.

A number of scholars expect a decision that sides mostly with Google, preserving most of Section 230’s current scope. Several justices noted that the court’s ultimate job is simply to interpret the law as it’s currently written, sticking to what Congress intended as much as they can. If the law isn’t modern enough for 2023 or needs something new added, that’s a job for lawmakers. Conveniently, Congress has dozens of open proposals to revisit 230, signaling to the Supreme Court that it doesn’t need to jump to action.

Tech companies read the tea leaves

Even with the optimism that a decision this summer will leave Section 230 reasonably intact, it’s natural for user-generated content platforms to consider some contingency planning. In a worst-case scenario, we could see platforms decide to curate/filter nothing (think 4chan) or everything (think Netflix Kids). However, if the court adopts the Gonzalez family’s argument that the liability stems from YouTube proactively serving up content, the solution might be as easy as some extra account settings where users “instruct” YouTube to serve them additional content after an initial search.
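
To make that last idea concrete, here’s a minimal, hypothetical sketch of what an opt-in setting like that could look like. Everything in it is invented for illustration; the UserSettings shape, the allowRecommendations flag, and the stubbed findRelatedVideos function are assumptions, not anything YouTube actually exposes.

```typescript
// Hypothetical sketch only: gate "recommended" content behind an explicit
// user instruction, so related videos are served only when asked for.

interface Video {
  id: string;
  title: string;
}

interface UserSettings {
  // True only if the user has explicitly instructed the platform to
  // serve additional content after an initial search or view.
  allowRecommendations: boolean;
}

// Stub standing in for a real recommendation engine (invented for this sketch).
function findRelatedVideos(videoId: string): Video[] {
  return [{ id: "v2", title: `Related to ${videoId}` }];
}

function buildSidebar(current: Video, settings: UserSettings): Video[] {
  // No explicit instruction: show nothing proactive; the user sees only
  // the content they directly navigated to.
  if (!settings.allowRecommendations) {
    return [];
  }
  // With the opt-in, the sidebar arguably reflects the user's own request
  // rather than the platform's unprompted "speech."
  return findRelatedVideos(current.id);
}

// An opted-out user gets an empty sidebar.
console.log(buildSidebar({ id: "v1", title: "Cat video" }, { allowRecommendations: false }));
```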

Working at a video infra company with petabytes of content flowing through our systems every day, I naturally wondered what the case could mean for us. While there’s lots of uncertainty for platforms that curate user-generated content (like YouTube), infrastructure providers (like Mux) serve as mere conduits for content and don’t themselves prioritize, recommend, or organize it — the behaviors at issue in the Gonzalez case. That means that even if the Supreme Court does rule against Google, those of us providing pure infrastructure services shouldn’t need to make big changes right away.

Instead, we should set our sights on those many other proposals to rewrite Section 230. To really make sure we don’t see “the end of the internet as we know it,” we need Congress to understand that the internet is more than a series of tubes.

Written By

Victoria Nemiah, VP of Information Systems & General Counsel

