Social media and internet companies should not be held liable based on claims that their platforms are addictive and harmful to children. More than 2,500 lawsuits asserting a variety of legal claims are pending against TikTok, Meta, Instagram, YouTube, and others. A case against TikTok that was scheduled to go to trial in California settled just before it was to begin, but many other lawsuits remain, and the issue will be extensively litigated before it is ultimately resolved by the Supreme Court. Even though the Internet and social media can be harmful to some children, holding these companies liable for their content raises serious First Amendment issues and conflicts with federal law.

The core of the lawsuits is that Internet and social media companies should be held liable on the same theory used against tobacco products: they knowingly created an addictive product. But the analogy fails for a simple reason. Internet and social media companies are engaged in speech, which is protected by the First Amendment. No constitutional right is involved in regulating cigarettes and other tobacco products.

The lawsuits against the social media companies contend that those responsible for these platforms design them to keep children on them and coming back. That, though, is true of all media. Books, including those written for children, are often crafted with a cliffhanger at the end of a chapter to keep people reading. Television series do the same, encouraging people to keep watching and even to “binge.” Video games are obviously designed to keep people, including children, playing. Every medium of entertainment wants its users to remain and return. If Internet and social media companies can be held liable on the claim that their content is addictive, then why can’t every other medium be held similarly liable whenever it succeeds in retaining its users and some of those users can be shown to have been harmed?

Holding any media company liable for the content of its speech raises grave First Amendment issues. The plaintiffs in these lawsuits claim that the platforms’ algorithms are tailored to individual users in order to keep them hooked. But algorithms (i.e., the selection, organization, and ranking of content) can be understood as expressive activity, and there is no obvious reason to treat this speech differently from scripts, novels, or the code that makes video games work. As Justice Elena Kagan wrote in an opinion in 2024, “The First Amendment… does not go on leave when social media are involved.” It is also highly questionable to deem an algorithm a “product” for purposes of products liability; a book, a movie, or a video game cannot be deemed a product in order to hold its creator liable, and there is no reason an algorithm should be treated differently.

Precisely because the Internet and social media involve speech, the plaintiffs would face an almost impossible burden in showing that the platforms caused the harms they allege. The Supreme Court’s decision in Brown v. Entertainment Merchants Association (2011) is crucial here. The case involved the constitutionality of a California law that made it a crime to sell or rent violent video games to minors without parental consent. The Supreme Court, in an opinion by Justice Antonin Scalia, declared the California law unconstitutional. At the outset, the Court expressly rejected the argument that the law deserved lesser constitutional scrutiny because it was designed to protect children. The Court declared that “minors are entitled to a significant measure of First Amendment protection, and only in relatively narrow and well-defined circumstances may government bar public dissemination of protected materials to them.”

The Court explained that video games are a form of speech protected by the First Amendment. The Court found that the California law was content-based — its application depended entirely on the content of the video game — and said, therefore, that “it is invalid unless California can demonstrate that it passes strict scrutiny — that is, unless it is justified by a compelling government interest and is narrowly drawn to serve that interest.”

California argued that playing interactive violent video games has a deleterious effect on children and makes them more prone to commit acts of violence. The Court, though, rejected this argument and stressed the heavy burden of proving causation that must be met in regulating speech. Justice Scalia, writing for the majority, concluded: “California cannot meet [strict scrutiny]. At the outset, it acknowledges that it cannot show a direct causal link between violent video games and harm to minors. . . . The State’s evidence is not compelling. . . . They show at best some correlation between exposure to violent entertainment and minuscule real-world effects, such as children feeling more aggressive or making louder noises in the few minutes after playing a violent game than after playing a nonviolent game.” The Court said that the government could not possibly prove the causation necessary to hold video game companies liable for their content.

The same, of course, is true of Internet and social media companies. What keeps users on a platform and returning? Is it that they like what is there? Is it that they are curious about what they’ll learn about their friends? Is it that they are bored and find it more interesting than schoolwork or other activities? Is it that algorithms direct them to the speech that they are likely to want to read and view? Of course, it is all of these things, which makes proving that the Internet and social media cause the alleged harms an almost impossible burden.

There are other legal obstacles to holding Internet and social media companies liable. Section 230 of the Communications Decency Act provides them broad immunity from liability for content provided by others on their sites, including for decisions about whether to include or remove that content. The pending lawsuits against Internet and social media companies cannot overcome this immunity.

The Internet and social media are unique platforms for communication. To be sure, they can cause harms. But, as the Supreme Court recognized in Packingham v. North Carolina (2017), social media platforms are “the principal sources for knowing current events, checking ads for employment, speaking and listening in the modern public square, and otherwise exploring the vast realms of human thought and knowledge.” The Court forcefully concluded that it “must exercise extreme caution before suggesting that the First Amendment provides scant protection for access to vast networks in that medium.”

None of this is to deny that some children are hurt by social media. Studies show that social media use is correlated with depression, low self-esteem, and bullying. But there are also studies linking interactive violent video games to antisocial behavior, and that did not justify restricting speech in Brown. The solution is not to restrict speech or to hold those responsible for it liable. Ultimately, parents need to make more careful choices about when and how to allow their children to be on social media, and social media companies certainly should exercise more care with material directed at children.

TikTok, Meta, YouTube, and Instagram are protected from liability for their content by the First Amendment. Tobacco companies are not. And that distinction should make all the difference in the pending lawsuits.