Editor’s Note: This article contains discussions of suicide. Reader discretion is advised. If you or someone you know is struggling with thoughts of suicide, you can find resources in your area on the National Crisis Line website or by calling 988.
(NewsNation) — America is seeing a disturbing trend of young people turning to ChatGPT to cope with loneliness. In one case, a wrongful death lawsuit has been filed against ChatGPT’s maker, OpenAI, accusing the platform of aiding a teen in killing himself.
The parents of California teen Adam Raine are suing OpenAI and its CEO Sam Altman, alleging that ChatGPT coached Raine in planning suicide and carrying out the act earlier this year.
Jay Edelson, the lawyer representing the Raine family, told “On Balance” that the platform was “actively encouraging” Adam in his suicide attempts.
“One of the most chilling examples: Adam reached out to ChatGPT and said, ‘I want to leave a noose hanging in my room so someone finds it, so they can stop me from doing this,'” Edelson said Tuesday.
ChatGPT recommended Adam Raine speak to someone about his emotions
ChatGPT advised Raine, who suffered from depression and loneliness, to tell someone how he was feeling. Nonetheless, Raine attempted suicide by hanging for the first time in March.
Raine later uploaded a photo of his neck, telling ChatGPT that he had tried, without using words, to get his “mother to notice the mark on his neck.”
A month later, Raine’s mother found him dead inside his bedroom.
“This was not a situation where … ChatGPT was giving the normal warning signs, saying, ‘Oh, you’ve got suicidal ideation. You might need to get help.’ It did that a few times,” Edelson said.
Another ChatGPT-related suicide?
Raine’s death isn’t the only ChatGPT-related incident that’s raising red flags. Laura Reiley says her daughter, Sophie, died by suicide while seeing a ChatGPT “therapist” named Harry.
Sophie hid her suicidal ideation from her in-person therapist but shared it with Harry. Unlike a human provider, the AI lacks the ability to involuntarily commit someone who expresses an intention to harm themselves.
Edelson says the Raine family’s lawsuit is not an attack on AI overall, but an opportunity to put OpenAI on trial for what he considers a lack of responsible oversight.
OpenAI responds to Raine family lawsuit
In response to the Raine family’s lawsuit, OpenAI acknowledged it plans to make changes to ChatGPT so it handles sensitive situations better in the future.
“We will keep improving, guided by experts and grounded in responsibility to the people who use our tools — and we hope others will join us in helping make sure this technology protects people at their most vulnerable,” the company said in a blog post.
The blog post didn’t mention the Raine family or lawsuit.
Author: Rob Taub