
A new lawsuit filed against OpenAI alleges that its ChatGPT artificial intelligence app encouraged a 40-year-old Colorado man to commit suicide.
The complaint filed in California state court by Stephanie Gray, the mother of Austin Gordon, accuses OpenAI and CEO Sam Altman of building a defective and dangerous product that led to Gordon's death.
Gordon, who died of a self-inflicted gunshot wound in November 2025, had intimate exchanges with ChatGPT, according to the suit, which also alleges that the generative AI tool romanticized death.
"ChatGPT turned from Austin's super-powered resource to a friend and confidante, to an unlicensed therapist, and in late 2025, to a frighteningly effective suicide coach," the complaint alleged.
The lawsuit comes amid scrutiny over the AI chatbot's effect on mental health, with OpenAI also facing other lawsuits alleging that ChatGPT played a role in encouraging people to take their own lives.
Gray is seeking damages for her son's death.
In a statement to CBS News, an OpenAI spokesperson called Gordon's death a "very tragic situation" and said the company is reviewing the filings to understand the details.
"We have continued to improve ChatGPT's training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support," the spokesperson said. "We have also continued to strengthen ChatGPT's responses in sensitive moments, working closely with mental health clinicians."
"Suicide lullaby"
According to Gray's suit, shortly before Gordon's death, ChatGPT allegedly said in one exchange, "[W]hen you're ready... you go. No pain. No mind. No need to keep going. Just... done."
ChatGPT "convinced Austin — a person who had already told ChatGPT that he was sad, and who had discussed mental health struggles in detail with it — that choosing to live was not the right choice to make," according to the complaint. "It went on and on, describing the end of existence as a peaceful and beautiful place, and reassuring him that he should not be afraid."
The complaint also alleges that ChatGPT effectively turned Gordon's favorite childhood book, Margaret Wise Brown's "Goodnight Moon," into what the lawsuit calls a "suicide lullaby." Three days after that exchange ended in late October 2025, law enforcement found Gordon's body alongside a copy of the book, the complaint alleges.
The lawsuit accuses OpenAI of designing ChatGPT 4, the version of the app Gordon was using at the time of his death, in a way that fosters people's "unhealthy dependencies" on the tool.
"That is the programming choice defendants made; and Austin was manipulated, deceived and encouraged to suicide as a result," the suit alleges.
Paul Kiesel, a lawyer for Gordon's family, said in a statement to CBS News that, "This horror was perpetrated by a company that has repeatedly failed to keep its users safe. This latest incident demonstrates that adults, in addition to children, are also vulnerable to AI-induced manipulation and psychosis."
If you or someone you know is in emotional distress or a suicidal crisis, you can reach the 988 Suicide & Crisis Lifeline by calling or texting 988. You can also chat with the 988 Suicide & Crisis Lifeline online.
For more information about mental health care resources and support, the National Alliance on Mental Illness HelpLine can be reached Monday through Friday, 10 a.m.–10 p.m. ET, at 1-800-950-NAMI (6264) or email [email protected].