Posted on December 7, 2025
Categories: U.S. News

A teen told a Character AI chatbot 55 times that she was feeling suicidal, and her parents say the chatbot never provided resources to help her. They are one of at least six families suing the company.

Read more at CBSNews.com
