SCOTUS Decision Could Upend Social Media Company Protections
Several types of lawsuits are currently being filed against social media companies over their responsibility for online speech. Courts have held that social media companies may choose what content they put up or take down and, further, that they can almost never be held liable for third-party content. This has insulated social media companies from most lawsuits and forced the nearly 100 litigants with claims against them to make awkward arguments to get their cases heard.
The Supreme Court, however, is now weighing whether to narrow the provisions of the Communications Decency Act that insulate these companies from liability. SCOTUS is also considering whether laws in Texas and Florida that prevent Facebook from taking down certain political posts are constitutional. They probably aren't.
The bigger question is whether the Communications Decency Act's liability shield will be narrowed and how that would affect product liability lawsuits against social media companies.
How is Europe handling the issue?
Free speech is enshrined in the U.S. Constitution, creating a significant barrier to many actions against social media companies. Nonetheless, allowing these companies to operate unchecked has resulted in some serious problems. For a moment, let's set aside the teen suicides.
Before the social media addiction lawsuits, there were complaints about the quality of political content, fake news, and whether foreign governments were successfully manipulating elections. In some cases, European politicians expressed dismay at having to run negative ads just to increase engagement. The result was an air of mistrust, disdain, and polarization.
Fake news was used as a form of blood libel in Myanmar, resulting in genocide against the Rohingya minority. So, as a society, we're trying to figure out how to have free speech that doesn't result in moral decay.
Today, European companies are required to monitor potentially harmful speech, and when an algorithm suggests new content, users are allowed to reject the suggestion and to learn why the algorithm chose it. So there is more oversight and transparency, ostensibly.
The lawsuit that SCOTUS has agreed to hear involves a U.S. citizen who was killed in the 2015 Paris terrorist attacks. The plaintiffs sued YouTube, alleging that the site recommended Islamic State content to prospective terrorists.
The defense claims that if the lawsuit succeeds, it would prevent sites from ever recommending content. In other words, there would be no more recommendation algorithms. That may seem like a fair trade, but it remains unclear whether the law can ban an algorithm, even one that promotes terrorism. Holding the companies responsible for their algorithms, they argue, is tantamount to banning them.
Talk to a Florida Product Liability Lawyer Today
Halpern, Santos & Pinkert represents the interests of Florida residents injured by dangerous or defective products. Call our Florida personal injury lawyers today to schedule a free consultation, and we can begin preparing your lawsuit immediately.