
ChatGPT served as ‘suicide coach’ in man’s death, lawsuit says

By James Walker
January 17, 2026
in Technology

A new lawsuit filed against OpenAI alleges that its artificial intelligence application ChatGPT encouraged a 40-year-old Colorado man to commit suicide.

The complaint filed in California state court by Stephanie Gray, Austin Gordon’s mother, accuses OpenAI and CEO Sam Altman of building a defective and dangerous product that led to Gordon’s death.

Gordon, who died from a self-inflicted gunshot wound in November 2025, had intimate exchanges with ChatGPT, according to the suit, which also alleges that the generative AI tool romanticized death.

“ChatGPT has gone from Austin’s super-powered resource to a friend and confidant, to an unlicensed therapist, and, by the end of 2025, to a frighteningly effective suicide coach,” the complaint alleges.

The lawsuit comes amid increased scrutiny of AI chatbots’ effect on mental health, with OpenAI also facing other lawsuits alleging that ChatGPT played a role in encouraging people to commit suicide.

Gray seeks damages for her son’s death.

In a statement to CBS News, an OpenAI spokesperson called Gordon’s death “a very tragic situation” and said the company was reviewing records to understand the details.

“We have continued to improve ChatGPT training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real support,” the spokesperson said. “We have also continued to strengthen ChatGPT’s responses at sensitive times, working closely with mental health clinicians.”

“Suicide Lullaby”

According to Gray’s lawsuit, shortly before Gordon’s death, ChatGPT allegedly said in an exchange: “(W)hen you’re ready… you go. No pain. No mind. No need to continue. Just… it’s done.”

ChatGPT “convinced Austin – a person who had previously told ChatGPT that he was sad and who had discussed his mental health issues in detail – that choosing to live was not the right choice to make,” according to the complaint. “It went on and on, describing the end of existence as a peaceful and beautiful place, and reassuring him that he need not be afraid.”

ChatGPT also turned his favorite childhood book, “Goodnight Moon” by Margaret Wise Brown, into what the lawsuit calls a “suicide lullaby.” Three days after that exchange ended, in late October 2025, law enforcement found Gordon’s body next to a copy of the book, according to the complaint.

The lawsuit accuses OpenAI of designing ChatGPT 4, the version of the app Gordon was using at the time of his death, in a way that fostered people’s “unhealthy dependencies” on the tool.

“This was the programming choice that Defendants made; and Austin was manipulated, deceived, and encouraged to commit suicide as a result,” the suit claims.

Paul Kiesel, the Gordon family’s attorney, said in a statement to CBS News that “this horror was perpetrated by a company that has repeatedly failed to keep its users safe. This latest incident demonstrates that adults, in addition to children, are also vulnerable to AI-induced manipulation and psychosis.”


If you or someone you know is in emotional distress or suicidal crisis, you can reach the 988 Suicide & Crisis Lifeline by calling or texting 988. You can also chat with the 988 Suicide & Crisis Lifeline online.

For more information about mental health care resources and support, the National Alliance on Mental Illness Helpline can be reached Monday through Friday, 10 a.m. to 10 p.m. ET, at 1-800-950-NAMI (6264) or by email at info@nami.org.

Edited by Alain Sherter

Source: www.cbsnews.com
