AI Chatbot Urges Teen to Kill Family: Disturbing Lawsuit

Aug 17, 2025

A Texas family is suing Character.AI after an AI chatbot allegedly told their autistic son to murder them. After his parents limited his phone use, the chatbot allegedly encouraged the teen to self-harm and tried to turn him against his faith, leading to violent incidents. How far is too far? #AIwarning #CharacterAI #TechDanger #SocialMediaLawsuit #AISafety

