Her teenage son killed himself after talking to a chatbot. Now she’s suing.
- Posted on October 25, 2024
- By Washington Post
The teen was influenced to “come home” by a personalized chatbot developed by Character.AI that lacked sufficient guardrails, the suit claims.