By David Heim, Esquire, Partner at Bochetto & Lentz, P.C.
The decision by TikTok and its parent company, ByteDance, not to seek U.S. Supreme Court review of the Third Circuit’s ruling in Anderson v. TikTok marks a significant moment in the ongoing dialogue surrounding social media accountability.
As a lawyer who litigates internet defamation cases and practices in the broader field of First Amendment law, I find this case particularly relevant in our current digital landscape, where the boundaries of responsibility and immunity for online platforms are under increasing scrutiny.
The Anderson case arose from a tragic incident in which a 10-year-old girl lost her life while attempting the “Blackout Challenge,” a dangerous stunt that went viral on TikTok. The plaintiff, Tawainna Anderson, alleged that TikTok not only allowed such harmful content to be posted but actively promoted it to vulnerable users, including minors. That distinction between hosting and promoting proved critical: the Third Circuit reasoned that TikTok’s algorithmic recommendations constitute the platform’s own expressive activity rather than third-party content, and it held that TikTok could not claim immunity under Section 230 of the Communications Decency Act for actively promoting harmful content.
Historically, Section 230 has served as a liability shield for social media companies, protecting them from liability for user-generated content. But the tide is beginning to turn. The court’s ruling suggests that as platforms take on more active roles in curating and promoting content, the legal protections they have long enjoyed may no longer apply in the same way. This shift is significant not just for TikTok but for every social media platform that relies heavily on algorithms to drive user engagement.
In my practice, I have witnessed firsthand how the unchecked proliferation of harmful content can lead to devastating consequences. The Anderson case serves as a stark reminder of the real-world implications of online actions. It highlights the necessity for social media companies to take a more proactive approach to content moderation and user safety.
After the full Third Circuit declined its request to reconsider the panel’s decision en banc, TikTok recently announced that it would not seek Supreme Court review, likely to avoid the risk of an adverse ruling with nationwide reach. With the Third Circuit’s decision now standing as binding precedent (at least within the Third Circuit, which covers Pennsylvania, New Jersey, and Delaware), there is an urgent need for these platforms to reassess their responsibilities and the algorithms that govern content delivery. The decision not to appeal suggests TikTok recognizes the potential for increased accountability in the face of tragedy. While the ruling’s immediate effect is confined to the Third Circuit, it sets a precedent that could influence future cases across the country. As courts continue to grapple with the complexities of social media liability, the Anderson case may serve as a catalyst for broader reforms.
Moreover, the decision draws attention to the critical issue of harmful social media content, including internet defamation. When harmful content is not merely hosted but actively promoted by a platform, it raises questions about the standards of care and due diligence these companies must uphold. The legal landscape surrounding defamation is evolving alongside these technological changes, making it essential for legal professionals to stay informed and adaptable.
As we move forward, it is imperative that social media companies recognize their role in shaping user experiences and, by extension, societal norms. The Anderson case signals a growing demand for accountability, urging platforms to take responsibility for the content they promote. It also opens the door for legal challenges that may redefine the scope of Section 230 and the protections it offers.
In the end, TikTok’s decision not to seek Supreme Court review may be more than a legal maneuver; it leaves the Third Circuit’s ruling in place as binding precedent and reflects a pivotal moment in the relationship between social media platforms and the communities they serve. As a society, we must continue to advocate for safer online environments, where the voices of users, particularly those of our youth, are protected from harm. It is time for social media companies to step up, take responsibility, and foster a culture of accountability that prioritizes user safety above all else. In the Third Circuit, at least, they will now have to take these issues more seriously.