Judge Dismisses TikTok Wrongful Death Lawsuit
The parents of a 10-year-old girl who died after attempting the “blackout challenge” filed a wrongful death lawsuit against TikTok, claiming the company was responsible for their child’s death. A judge recently dismissed the lawsuit, holding that TikTok could not be held liable. Below, we’ll discuss the facts presented in the case.
Analyzing the cause of action
TikTok has a “For You” tab that makes suggestions based on your interests. In this case, it furnished a 10-year-old girl with information on how to choke herself until she passed out. This was all part of a social contagion called “the blackout game,” which has been around since I was a kid. Basically, you choke yourself until you pass out, and when you come to, you get a weird high. It is similar to the high you get from inhalants, except considerably cheaper and impossible to confiscate. The girl tried it herself, and her parents found her hanging from a purse strap when they came home.
The lawsuit was dismissed on the grounds of Section 230 of the Communications Decency Act, which does the opposite of what it sounds like. Instead of ensuring decency, it provides a liability shield for platforms like Facebook and TikTok with regard to third-party content. This is despite the fact that their algorithms curate that content and present it in potentially harmful ways, as evidenced above.
So long as Section 230 of the Communications Decency Act remains in effect, it appears impossible to hold a social media company accountable for the content it curates, no matter how terrible the effects.
Another failed attempt based on strong allegations
Parents who are filing suicide lawsuits against Meta were watching this case carefully, as it could have provided insight into how their own lawsuits would be treated. The TikTok lawsuit made stronger allegations than the Meta suicide lawsuits, due in large part to the fact that a 10-year-old was exposed to a potentially fatal behavior. Nonetheless, the court held that TikTok could not be held liable for third-party content.
While TikTok and Facebook are no doubt thrilled about this, the plaintiffs believe that the social media giants are abusing a protection that was never meant to allow them to expose children to dangerous content.
Here’s the thing. We already have filters that prevent children from being exposed to pornographic material. Why aren’t we using them to prevent children from being exposed to other types of material?
Talk to a Florida Product Liability Lawyer Today
Halpern, Santos & Pinkert represents the interests of injured Florida residents filing lawsuits against negligent third parties. Call our Florida personal injury lawyers today to schedule a free consultation and learn more about how we can help.