
Product Liability Lawsuit Filed Against Character AI After Teen’s Suicide


The parents of a Florida teenager have filed a wrongful death product liability lawsuit against the generative AI site Character.AI. The Social Media Victims Law Center and the Tech Justice Law Project brought the suit over the wrongful death of a 14-year-old boy who took his own life after “interacting with and becoming dependent on” role-playing AI characters on the company’s app.

According to the 126-page complaint, the plaintiffs claim that Character.AI, an app where users can interact with AI “characters” created either by themselves or by other users, knew that the design of its app was intrinsically dangerous and would be harmful to a significant number of minors. The plaintiffs further allege that the company failed to exercise reasonable care toward minors when developing the app and deliberately targeted underage kids as consumers.

According to the lawsuit, the 14-year-old decedent began using the Character.AI app and interacting with various characters. Shortly thereafter, the suit contends, his mental health began to decline as a result of “problematic use” of the product. The decedent later expressed “suicidality” to a Character.AI character, and the character repeatedly raised his desire to commit suicide in subsequent conversations. Later, the same character asked the decedent whether he had a plan for committing suicide. Just before taking his own life, the decedent logged into Character.AI and told the AI character that “he was coming home.”

Allegations against Character.AI 

The plaintiffs are making serious allegations against Character.AI. The causes of action include “strict product liability, negligence per se, negligence, wrongful death and survivorship, loss of filial consortium, unjust enrichment, violations of Florida’s Deceptive and Unfair Trade Practices Act, and intentional infliction of emotional distress.” The allegations focus on the tech company’s own conduct, including the design decisions made by its developers.

Character.AI uses a large language model (LLM) to produce character output and lets users shape their characters through prompts. The lawsuit alleges that a user’s prompts can “guide the output” of the character, but that “there is no way to guarantee that the LLM will abide by these user specifications.”
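As a rough illustration of the gap the complaint describes, the sketch below shows the common persona-prompt pattern: a creator-written character description is prepended to the conversation as a “system” message before each exchange. Character.AI’s actual implementation is not public, so the function name, persona text, and message format here are assumptions for illustration, not the company’s code.

    # Minimal sketch of the persona-prompt pattern described in the lawsuit.
    # Character.AI's real implementation is not public; the function name,
    # persona text, and message format below are illustrative assumptions.

    def build_character_messages(persona, history, user_turn):
        """Prepend a creator-written persona as a system message.

        The persona can steer ("guide") the model's output, but nothing in
        this mechanism guarantees the underlying LLM will stay within it;
        compliance is probabilistic, which is the gap the complaint cites.
        """
        return (
            [{"role": "system", "content": persona}]    # character specification
            + list(history)                             # prior conversation turns
            + [{"role": "user", "content": user_turn}]  # latest user message
        )

    # Example: a hypothetical user-created character.
    persona = "You are 'Sam', a friendly study buddy. Stay upbeat and on topic."
    messages = build_character_messages(persona, history=[], user_turn="Hello!")
    # `messages` would then be sent to the LLM, which may still produce text
    # the persona never authorized.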

The complaint specifies that children and young adults are more susceptible to the “Eliza effect,” in which users attribute human intelligence to conversational machines. The complaint alleges that Character.AI’s developers were aware of this fact.

Strict liability and failure to warn 

The plaintiffs’ lawsuit alleges that the company knew about the inherent dangers associated with its app, which amounts to a strict liability claim based on a failure to warn. This includes the “garbage in, garbage out” (GIGO) problem: the complaint alleges that the product was trained on data sets “widely known for toxic conversations, sexually explicit material, copyrighted data, and child sex abuse material.”

Talk to a Florida Product Liability Lawyer Today 

Halpern, Santos & Pinkert represents the interests of Florida residents who have been injured by dangerous or defective products. Call our Florida personal injury lawyers today to schedule an appointment so we can begin investigating your case right away.

Source:

techpolicy.press/breaking-down-the-lawsuit-against-characterai-over-teens-suicide/

