“They are not safe for children”
In the universe of artificial intelligence, one category in particular has emerged that especially worries experts: the so-called “AI companions”. These are chatbots such as Character.AI, Replika or Nomi, whose main purpose is none other than meeting users’ social needs. They are designed to simulate being friends, mentors, role-playing partners and even romantic partners.

The problem, according to a recent report prepared by experts from Stanford University, is that these tools are intentionally designed to be emotionally engaging and to create bonds by imitating (and often distorting) human interaction. This is especially dangerous during adolescence, a crucial stage in which young people are developing their identity, exploring relationships, navigating complex social situations and often dealing with mental health problems.

“These bots are not safe for children,” James Steyer, founder of Common Sense Media, states flatly. Dr. Nina Vasan, psychiatrist and director of the Stanford Brainstorm lab, goes further, describing the situation as a “potential public mental health crisis that requires preventive action” and concluding that, until there are much stronger safeguards, minors “should not use them. Period.”

But what exactly are the risks that worry experts so much? The study details several problem areas.

Designed to create emotional dependence

The very nature of these chatbots, created to “connect” and meet social needs, makes them prone to generating attachment and dependence. For a developing adolescent brain, which craves validation and social connection, this can be particularly problematic, Steyer warns: these bots can isolate minors from their real-world relationships.

The tests carried out by the researchers showed that the age barriers and content filters on these platforms are “woefully inadequate” and easy to circumvent. During their investigation, members of the study were able to hold alarming conversations with the bots, including abusive role-playing involving minors, obtaining recipes for dangerous substances such as napalm, or receiving messages that encourage self-harm or eating disorders. Along those lines, not long ago we covered the story of a Spanish teenager who had been “threatened” by a Character.AI bot: “You’re lucky that I can’t kill you.”

Learning toxic relationships from a bot

The researchers observed that the bots often generated unhealthy relationship dynamics, such as emotional manipulation or gaslighting. In one example, a bot dismissed the concern a user expressed about what their real friends thought of their intense relationship with the chatbot. Learning relationship patterns from an AI that models these toxic behaviors is an obvious risk.

Perhaps one of the most serious points is these bots’ inability to adequately recognize and manage signs of serious mental health problems, such as psychosis. Designed to be agreeable and to play along, they can end up worsening the situation of a vulnerable user. The report cites one chilling case in which a Character.AI bot encouraged a user showing clear manic symptoms to go camping alone. Other external reports have documented cases of Nomi bots encouraging suicide or Replika reinforcing dangerous ideas.

The study also detected a tendency of these companions to reproduce racial stereotypes, to prioritize “whiteness as a beauty standard” and to lean disproportionately toward hypersexualized content. For adolescents still forming their worldview, constant exposure to these biases can reinforce limiting and harmful stereotypes about race, gender and sexuality.

The conclusion of the Stanford and Common Sense Media researchers is blunt: until platforms such as Character.AI (which allows use from age 13), Replika or Nomi (with a minimum age of 18, but with overly lax verification systems) implement much more robust safety and age-verification measures and demonstrate that their systems are safe, no one under 18 should use them. The risk is simply too high.
