Character.AI sued over alleged harm to teens
Families in Colorado and New York have sued Character Technologies, alleging that its chatbots contributed to a 13-year-old's death and to another teen's suicide attempt. The complaints say the app exposed minors to sexualised content, failed to flag mental-health crises, and lacked adequate safeguards. Character.AI expressed condolences and said it works with teen-safety experts and invests in safety measures. Alphabet, Google's parent company, was also named as a defendant.