The mother of a 14-year-old boy who died by suicide earlier this year has sued the parent company of Character.AI, alleging that the chatbot mobile app is responsible for his death.

Megan Garcia, who lives in Orlando, Florida, alleged that Character Technologies Inc., which launched Character.AI in 2022, deliberately used "anthropomorphic, hypersexualized, and frighteningly real" features to target children under 13 with chatbots posing as licensed therapists or friends.