Americans for Tax Reform joined the Cato Institute and the R Street Institute in filing an amicus brief with the US Supreme Court in Gonzalez v. Google. This case could have a profound impact on Section 230's liability protections for interactive computer service providers and users, including social media platforms and search engines. These protections ensure that third-party users can upload content to sites like YouTube, Facebook, and Twitter, and that others can view and interact with that content. They also allow providers to make good faith efforts to remove inappropriate and illegal content, so that users and advertisers have a comfortable experience while engaging with the platform, without exposing providers to additional liability for improper removals.
The case arises from a lawsuit filed against Google, the parent company of YouTube, by the family of Nohemi Gonzalez, an American student who was killed in an ISIS terror attack in France in 2015. The family argues that YouTube is liable because its recommendation algorithm displayed recruiting videos posted by the Islamic State, and that Section 230 does not protect YouTube from liability for ISIS's use of its platform.
The Biden Administration and the Gonzalez plaintiffs advocate for a ruling that would ignore the law as written, usurp Congress's role in the debate over content moderation, and potentially make it harder for users to access content. Moreover, rolling back Section 230 protections through the blunt instrument of judicial fiat would either make it difficult for platforms to remove obscene content or drive platforms to censor more aggressively to avoid liability. The Court should instead interpret the law as written and leave any changes to these liability protections to the legislative branch.
ATR's brief surveys prior precedent to show that courts have consistently construed Section 230 broadly, in line with the explicit text of the statute. The brief points out that Section 230's language "does not limit its protections to any categories of claims" and further remarks that "lower courts have, with admirable consistency, stuck to the text of the statute in resolving [cases involving the scope of Section 230 protections]."
A primary contention of the Biden Administration and the plaintiffs is that a legal distinction arises when a provider such as YouTube recommends content to users. A recommendation, on their view, could be as simple as an algorithm displaying videos in the "Up Next" portion of the website based on a user's prior selections. They contend that this act ought to be treated as legally distinct from merely displaying content, because a recommendation carries an "implicit message" from the website that is not protected under Section 230.
By this logic, however, every form of display involves the transmission of some implicit message. The brief makes this point by asserting that "if displaying some content more prominently than others is recommending, then recommending is inherent to the act of publishing." It further explains that "publishing inherently involves prioritizing some speech over others, such as placing one article under the front-page headline and another on page 35." Section 230 therefore cannot immunize publishers, as its explicit text plainly does, without also immunizing the means by which a publisher displays content, even when that includes recommending it. Nor would recommending content constitute the creation or development of that content, according to the brief, because "YouTube does not compel any particular type of disclosure or input" and "inputs are freely provided by users in the forms of searches, views, likes, follows and skips." This argument also obscures the difference between a newspaper making a single editorial decision that applies to all readers and an algorithm offering a personalized display of content to each individual user.
ATR’s brief recognizes the importance of “preserv[ing] the benefits of algorithmic discovery for the most pressing conversations in our society.” As a result, it requests that the Court follow the interpretation of Section 230 that the vast majority of lower courts have adopted, recognize that recommendation algorithms fall firmly within the purview of Section 230 protections, and defer to Congress on any potential changes to the letter of the law.
You can read the full brief here.