1. Introduction

The first social media addiction trial is currently before the Los Angeles Superior Court, centering on a consolidated lawsuit brought by approximately 1,600 plaintiffs. The bellwether case was filed by a 20-year-old woman identified as K.G.M. and her mother against Meta, YouTube, Snap, and TikTok, alleging that the platforms’ design features caused her harm. While K.G.M. settled with Snap and TikTok on undisclosed terms, the case proceeds against Meta and YouTube. Mark Zuckerberg, Meta’s CEO, made his first appearance before a jury, testifying that Meta offers a service of value and that, for this reason, «people tend to use it more».

This case may mark a milestone in the liability regime governing online platforms. At stake are the structural boundaries of Section 230, the doctrinal cornerstone that, for three decades, has cemented digital platforms’ immunity for hosting and moderating third-party content.

What is being called into question is precisely the assumption that the architecture and design features of contemporary digital platforms, such as recommendation systems and infinite scroll, are neutral tools: the allegation is that these features were engineered to shape and control user behaviour, fostering compulsive use that may lead to addiction. The further question is whether liability grounded in platform design falls outside the statute’s scope of protection.

As noted in the literature, the purpose of Section 230 is not static but contingent upon the historical context in which it operates (Kosseff, 2023). The statute’s own language, moreover, leaves no room for a third category between an active role and a passive one. Such a binary framework appears increasingly obsolete in light of the multidimensional nature of contemporary digital platforms, given their reliance on algorithmic tools that organize and structure content (Dickinson, 2025, pp. 3–6).

2. Section 230’s roots: where is the free marketplace of ideas?

Although now recognized as one of the most significant legislative accomplishments of the 1990s and described as «the law that gave us the modern internet» (Goldman, 2017, pp. 1–2), Section 230 was devised as an afterthought to a much broader piece of legislation, the Communications Decency Act (CDA), itself part of the Telecommunications Act of 1996. Ironically, the CDA was originally designed to impose strict liability on internet companies in order to create a safer digital environment for children and families. However, Congress realized that such a provision could prove disruptive, and that internet companies, paradoxically, risked incurring greater liability precisely as a result of their efforts to moderate content (Hawley, 2020, pp. 3–5). Indeed, while in Cubby, Inc. v. CompuServe Inc. (1991) the court had classified the online platform as a distributor, and therefore subject to a less stringent standard of liability, the opposite conclusion was reached in Stratton Oakmont, Inc. v. Prodigy Services Co. (1995), where the establishment of an editorial staff and the adoption of content guidelines led the court to classify the platform as a publisher, thereby subjecting it to a stricter standard of liability for third-party content. Accordingly, the exercise of editorial control, even when undertaken to comply with provisions aimed at enhancing online safety, would expose platforms to publisher liability. As confirmed by Christopher Cox and Ron Wyden, the sponsors of Section 230, the statute was enacted to overrule Stratton Oakmont (Citron, 2020), removing the resulting ambiguity and encouraging platforms to cooperate in maintaining a protected digital environment through two core provisions. The first reinforces the distinction between publishers and distributors, stating that an internet service provider «shall not be treated as the publisher or speaker» of user-generated content it merely distributes. The second affords protection to internet service providers acting as “Good Samaritans”, that is, providers that «in good faith» exercise editorial control to restrict users’ access to obscene, excessively violent, or harassing content.

This protective framework played a foundational role in the rise of digital platforms, ultimately leading to a significant concentration of market power in the hands of a few economic actors. At its birth, however, Section 230 fostered the expectation that the “free marketplace of ideas” theorized by Justice Holmes in his famous dissenting opinion in Abrams v. United States (1919) would flourish (Bassini et al., 2021). Clearly inspired by a laissez-faire capitalist context, Holmes argued that the «best test of truth is the power of the thought to get itself accepted in the competition of the market», fully trusting the market’s capacity to self-correct and select the best information. Remarkably, the U.S. Supreme Court invoked the metaphor of the “free marketplace of ideas” in Reno v. American Civil Liberties Union (1997), explicitly endorsing a non-interventionist approach toward the digital environment and framing its regulation primarily as a First Amendment issue. This decision profoundly shaped the Supreme Court’s subsequent jurisprudence, establishing the metaphor as the doctrinal foundation of internet regulation. Notwithstanding the persuasive force of the concept, however, its application now appears misleading, especially considering the radical change in the economic context, which is currently characterized by dominant positions and is far from free (Morelli and Pollicino, 2020, pp. 640–643). The digital context is also subject to significant constraints, including the practical impossibility of verifying information due to the sheer volume of available sources and, most notably, the creation of filter bubbles resulting from algorithmic personalization. By distributing content according to ostensibly neutral or objective criteria, algorithms in fact produce a form of selective exposure and shape users’ experience (Pitruzzella, 2019, pp. 35–36). This evolution reflects the transformation of the underlying business model: the early internet was primarily access-based, whereas social media platforms have come to rely on advertising revenue and user engagement. In 2006, Facebook introduced its “News Feed”, a ranking algorithm that marked the transition from the chronological organization of content to a system anchored in user preferences. In this context, the maximization of attention became a central objective, and algorithms emerged as core elements of the new digital economy (Kiss, 2019, pp. 145–146).

3. Breaking the shield of immunity: what is «development» of content?

During the first decade following its enactment, courts consistently applied the broad immunity derived from Section 230. This approach was reinforced by the apprehension that narrowing immunity would lead to collateral censorship (Harvard Law Review, 2018, pp. 2035–2042). At the same time, the core of Section 230 consists of just twenty-six words. This simplicity has been key to the provision’s success, but it also left room for uncertainty regarding the scope and range of its protection. For instance, although Section 230 provides a legal definition of «information content provider», the terms «development» and «responsible» that appear in that definition are not further specified. As a result, between 2008 and 2009, two landmark decisions relied on this textual ambiguity to narrow the scope of protection (Kosseff, 2019, pp. 188–202).

First, in Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, the court examined a roommate-matching website that required users to complete a mandatory questionnaire containing potentially discriminatory criteria. The platform’s structure produced housing advertisements reflecting users’ selections, allegedly in violation of federal fair housing laws. Although the district court held that the claims were barred by Section 230, the Ninth Circuit reversed. The panel concluded that the website was not entitled to immunity because its design required users to answer questions that materially contributed to the development of unlawful content, a conclusion later affirmed by the en banc court.

Such conclusions were reinforced in FTC v. Accusearch (Abika.com), in which the Federal Trade Commission brought suit against Accusearch, the operator of a website that sold confidential telephone records obtained by third parties through unlawful means. Accusearch claimed immunity under Section 230, but the Tenth Circuit rejected the claim, clarifying that a website is not entitled to protection when it ceases to act as a neutral conduit and instead materially contributes to the unlawful features of the content. In the European context, a similar distinction between active and passive providers has been drawn by the Court of Justice of the European Union, which interpreted Article 14 of the E-Commerce Directive in light of Recital 42, clarifying that intermediaries may enjoy exemption from liability only where their activities are merely technical and automatic, thereby demonstrating their passive nature (Kuczerawy, 2018).

4. Platforms’ design: an emerging ground of liability? 

The early cases tested Section 230’s textual ambiguity regarding the concepts of development and responsibility, and in doing so paved the way for a further question: what role do algorithms and seemingly neutral design features play in contributing to content? Tragically, this issue intertwines with the dramatic events underlying recent litigation brought by parents who have lost their children as a result of platforms’ allegedly negligent design.

In Dyroff v. Ultimate Software Group (2019), the plaintiff’s son used Experience Project, a social network that enabled anonymous posting in topic-based groups, where he connected with a drug dealer and later died of an overdose. According to the complaint, the platform’s anonymity features, together with its recommendation system and automated email notifications, facilitated drug transactions. However, both the district court and the Ninth Circuit held the claims barred by Section 230. Relying on Fair Housing Council v. Roommates.com, the Ninth Circuit characterized the platform’s features as content-neutral tools that facilitated user communications but did not materially contribute to the unlawfulness of the content.

A radical shift emerged in Lemmon v. Snap (2021), brought by plaintiffs who had lost their children in a car accident. Shortly before the crash, one of the decedents had used Snapchat, a social media platform that allows users to share photos and videos, known as snaps. Among its primary features, users may enhance snaps through filters, digital overlays applied to alter visual attributes or to include additional information. In this context, the “Speed Filter” enabled users to superimpose their real-time speed onto a snap. The app is also designed around a reward system based on user engagement, without clearly disclosing the criteria required to obtain such rewards. According to the plaintiffs, the interplay between the Speed Filter and this opaque incentive structure created a foreseeable risk: users were allegedly encouraged to record and share snaps at increasingly high speeds in order to obtain recognition within the app. The Ninth Circuit found that Section 230 did not immunize Snap from a negligent design claim arising out of this interplay. Indeed, the plaintiffs did not seek to treat Snap as a publisher of third-party content, but rather as a product designer whose app architecture allegedly encouraged dangerous behaviour.

Finally, in Anderson v. TikTok (2024), a digital platform’s recommendation algorithm displayed to a ten-year-old girl a video related to a viral trend encouraging users to perform self-asphyxiation; she attempted the practice and died. Departing from the earlier Ninth Circuit cases, the Third Circuit focused on the nature of TikTok’s recommendation system, concluding that TikTok’s algorithm does more than passively host third-party content: it selects and organizes videos for users based on its own decision-making processes. On that basis, the court held that Section 230 did not bar the claims, exposing TikTok to potential liability for the design and operation of its recommendation system.

5. Conclusion

The judicial evolution outlined above provides evidence of the gradual erosion of immunity under Section 230 as a consequence of the increasingly pervasive role played by platforms in content moderation, which raises serious concerns in terms of neutrality, transparency, and accountability (Bassini, 2019). Current regulation is proving insufficient to address these concerns, leaving digital platforms free to self-regulate and to develop features whose opacity allows platforms to evade accountability (Harvard Law Review, 2025, p. 1663). In this respect, addictiveness is a concrete risk stemming from this opaque framework.

In the European context, the European Commission has recently addressed the subject, finding TikTok’s addictive design to infringe the Digital Services Act (DSA) on account of distinctive features, including infinite scroll, autoplay, and recommendation systems. In its preliminary investigation, the Commission assessed the risks posed by the platform’s architecture, affirming both its addictive nature and the threat it poses to the physical and mental health of users, including minors and vulnerable adults, and therefore requested the implementation of mitigation measures (European Commission, 2026). This decision is consistent with resolution 2023/2043(INI) adopted by the European Parliament (European Parliament, 2023) and was praised by the Director General of BEUC, the European Consumer Organisation, who underscored the role of upcoming legislation, specifically the Digital Fairness Act (BEUC, 2026), aimed, inter alia, at addressing addictive design practices.

The current developments may represent a decisive step toward redefining the global regulatory approach to digital platforms: content promotion through design is not the same as content hosting or content suppression (Dini, 2025, pp. 228–229), and it should therefore not fall within the same sphere of immunity.