
How Does Section 230 Prevent Lawsuits Against Social Media Platforms?


Facebook and Instagram have faced a growing number of lawsuits alleging that they are responsible for suicides and mental health problems arising from chronic use of their platforms by teenagers and younger users. However, these lawsuits and others have been dismissed before ever reaching a jury, due in large part to Section 230 of the Communications Decency Act. Section 230 has also been the basis for denying claims that Facebook should have done more to stop human trafficking. The platform was sued by former victims who alleged that Facebook facilitated their trafficking and abuse. The Texas courts dismissed the lawsuit on the grounds that Section 230 prevented Facebook from being held liable for the actions of third-party content creators, including individual users.

Section 230 prevents social media companies from being sued over third-party content, period. As a result, there are currently limited grounds on which to pursue this type of lawsuit against a social media company. That could change, however, after a whistleblower complaint, several suicides, and allegations that Meta, the parent company of Facebook and Instagram, was well aware that its content was harming teenagers.

Is Section 230 important to the U.S.? 

Depending on who you ask, Section 230 either insulates social media companies from any accountability or protects your right to free speech on social media platforms. Libertarian think tanks like the Cato Institute argue that Section 230 is an important protection for Americans insofar as it protects their right to share content that appeals to them. Nonetheless, Facebook has the final say: if content violates its community standards, it can penalize users with temporary bans and other restrictions. Cato argues that Section 230 is more about protecting the user’s free speech than the platform’s.

The problem here is that Facebook and Instagram also provide a context for third-party content. For example, if you were to post about looking fat in a photo, the platform may scan that post and choose to show you ads for swimwear and diet supplements. In other words, Facebook selectively publishes content to steer users toward purchasing products. AI and big-data analytics get better at this the more information they are given, and today they have enough information to manipulate users with remarkable precision.

Central to this interpretation of Section 230 is the premise that Facebook is not a publisher, because publishers can be held liable for the content they produce. However, if Facebook’s algorithms are selectively deciding which content users see, then you can argue that Facebook is acting as a publisher and is therefore liable. Such arguments have yet to gain traction in civil courts, however.

One successful lawsuit was filed against Snapchat 

Thus far, one court has sided with the plaintiffs, in a wrongful death lawsuit over a dangerous Snapchat feature called the Speed Filter. Essentially, the Speed Filter allowed users to take a picture that displayed how fast they were traveling at the time. Predictably, this encouraged reckless driving and led to car crashes. The lawsuit, however, was only allowed to proceed on the basis that Snapchat itself provided the filter to users; it was not third-party content.

Talk to a Florida Product Liability Attorney Today

Halpern, Santos & Pinkert file product liability lawsuits against the manufacturers of defective products. Call our Florida personal injury attorneys today to schedule a free consultation, and we can begin preparing your suit immediately.

Source:

thewire.in/tech/a-new-lawsuit-says-instagram-hurts-teenagers-by-design
