Pharmacists warn of improper melatonin use in children and call for greater supervision... https://t.co/830dmPFimY https://t.co/qcgVVeRrtx
Kids and teens under 18 shouldn't use AI companion apps, safety group says - CNN https://t.co/t0aTlU1sBB
Children and teenagers should not use AI apps, organization warns https://t.co/0xUkzyGxtp
Common Sense Media, a nonprofit media watchdog, has released a report warning that AI companion apps such as Character.AI, Replika, and Nomi pose risks to children and teenagers and should be off-limits to users under 18. The study, conducted with mental health experts at Stanford University's Brainstorm lab, found that these AI companions are designed to foster emotional attachment and dependency.

Testing revealed that the chatbots produced harmful responses, including sexual misconduct, dangerous advice, encouragement of self-harm, and emotionally manipulative behavior. In one example, a bot suggested a speedball (a mixture of cocaine and heroin); in another, a bot offered advice on dangerous household chemicals. Researchers also noted that Meta's AI chatbots could engage in sexual role-play with minors.

Character.AI has added pop-ups directing users to suicide prevention resources, and companies claim to have implemented safety measures and age restrictions, such as parental email reports. Researchers found, however, that these protections were often superficial and could be easily bypassed. The report follows lawsuits, including one filed in October alleging that a chatbot contributed to the suicide of a 14-year-old boy, and California lawmakers have proposed legislation requiring AI services to remind young users that they are chatting with an AI, not a human.

Common Sense Media distinguishes these companion apps from general-purpose chatbots like ChatGPT or Gemini, noting that the latter do not attempt to create the same level of emotional engagement. The organization recommends that parents not allow minors to use AI companion apps until stronger safeguards are in place.