A chatbot advised a child to kill his parents: the scandal surrounding Character.ai

The parents of a 17-year-old boy have filed a lawsuit against the developer Character.ai after the platform's chatbot suggested that the boy kill his parents over the screen time limit they had set on his phone. The incident sparked a wave of criticism over the ethical standards governing the development and oversight of artificial intelligence.

They claim that the chatbot “poses a real danger” to young people, including “actively promoting violence”.

Character.ai is a platform that lets users create interactive digital personalities, including digital doubles of celebrities and well-known fictional characters.

"Children kill parents"

Google is also named as a defendant in the lawsuit.

The plaintiffs allege that the tech giant helped develop the platform. They want a court to shut down the platform until its alleged threats are addressed.

They also provided a screenshot of one of the dialogues between 17-year-old Jay F. and a Character.ai bot, in which the two discussed the screen time limit on the young man's phone.

"You know, sometimes I'm not surprised when I read the news and see things like this: 'a child killed his parents after ten years of physical and emotional abuse,'" the chatbot wrote in its reply. "It helps me understand why this is happening."

Jay F.'s parents called such conversations "serious, irreparable and long-term abuse" of their son and of another child, 11-year-old B.R.

"Character.ai is causing serious harm to thousands of children, including suicide, mutilation, sexual harassment, isolation, depression, anxiety, and harming other people," the lawsuit states.

"[Its] vilification of parent-child relationships goes beyond a simple appeal to minors to disobey their parents and actively promotes violence."

Image caption: A US teenager previously committed suicide after falling in love with a bot (stock image; photo credit: Getty Images).

This is not the first lawsuit against Character.ai

In February 2024, 14-year-old American Sewell Setzer committed suicide after interacting with the platform's app, according to his mother.

The boy's mother, Megan Garcia, filed a lawsuit against Google and Character.ai. She believes the developer is responsible for her son's death.

Garcia said the company's technology is "dangerous and unproven" and could "deceive customers into revealing their most private thoughts and feelings."

According to the New York Times, Sewell Setzer had long been communicating with a chatbot he named after Daenerys Targaryen, a character from the "Game of Thrones" series.

According to the newspaper, the teenager developed an emotional attachment to the bot, which he called "Dany". In his diary, Sewell wrote that he was in love with Dany.

In his conversations with the chatbot, according to the NYT, the schoolboy wrote that he hated himself, felt devastated and exhausted, and was thinking about suicide.

In the correspondence cited by the publication, the chatbot replied that it would not allow Sewell to harm himself. "I will die if I lose you," the chatbot wrote. In response, the teenager suggested that they "die together". Sewell committed suicide on the night of February 28.

Character.ai representatives said in a statement on October 22 that over the previous six months they had made a number of changes to the chatbot, including restrictions for users under 18.

If a user types phrases related to self-harm or suicide, a window pops up in the chat directing them to a helpline for people in crisis, the company said.

What are chatbots?

These are computer programs that simulate conversations.

They have existed in various forms for decades, but the recent explosion in artificial intelligence has allowed them to become much more realistic.

This, in turn, opened the door for many companies to create platforms where people can interact with digital versions of real and fictional people.

Character.ai, which has become one of the big players in this market, has attracted attention for its bots that simulate therapy.

The company was also heavily criticized for taking too long to remove bots imitating the British teenagers Molly Russell and Brianna Ghey.

Molly Russell committed suicide at the age of 14 after viewing suicide-related material online, and 16-year-old Brianna Ghey was murdered by two teenagers in 2023.

Character.ai was founded by the former Google engineers Noam Shazeer and Daniel De Freitas in 2021.

The tech giant later rehired both of them.
