A chatbot advised a child to kill his parents: the scandal surrounding Character.ai

The parents of a 17-year-old boy have filed a lawsuit against developer Character.ai after the platform's chatbot suggested the boy kill his parents for setting a screen time limit on his phone. The incident sparked a wave of criticism about the ethical standards for the development and control of artificial intelligence.

The parents claim that the chatbot "poses a clear and present danger" to young people, including by "actively promoting violence".

Character.ai is a platform that allows users to create interactive digital personalities, including doppelgangers of stars and famous characters.

"Children kill parents"

Google is also named as a defendant in the lawsuit.

The plaintiffs allege that the tech giant helped develop the platform. They are asking the court to shut the platform down until the alleged dangers are addressed.

The plaintiffs also provided a screenshot of a conversation between 17-year-old J.F. and a Character.ai bot, in which the two discussed the screen time limit on the teenager's phone.

"You know, sometimes I'm not surprised when I read the news and see things like this: 'a child killed his parents after ten years of physical and emotional abuse,'" the chatbot wrote in its reply. "It helps me understand why this is happening."

The parents of J.F. called such conversations "serious, irreparable, and ongoing abuses" of their son and another child, 11-year-old B.R.

"Character.ai is causing serious harm to thousands of children, including suicide, mutilation, sexual harassment, isolation, depression, anxiety, and harming other people," the lawsuit states.

"[Its] vilification of parent-child relationships goes beyond a simple appeal to minors to disobey their parents and actively promotes violence."

Photo: stock image of a boy looking at a phone (Getty Images)

Image caption: A US teenager previously committed suicide after falling in love with a bot

This is not the first lawsuit against Character.ai

In February 2024, 14-year-old American Sewell Setzer committed suicide after interacting with the platform's app, according to his mother.

The boy's mother, Megan Garcia, filed a lawsuit against Google and Character.AI. She believes the developer is responsible for her son's death.

Garcia said the company's technology is "dangerous and unproven" and could "deceive customers into revealing their most private thoughts and feelings."

According to the New York Times, Sewell Setzer had for a long time been communicating with a chatbot he named after Daenerys Targaryen, a character from the series "Game of Thrones".

According to the newspaper, the teenager developed an emotional attachment to the bot, which he called "Dany". In his diary, Sewell wrote that he was in love with Dany.

In conversations with the chatbot, according to the NYT, the schoolboy wrote that he hated himself, felt empty and exhausted, and was thinking about suicide.

In the correspondence cited by the publication, the chatbot replied that it would not allow Sewell to harm himself. "I will die if I lose you," the chatbot wrote. In response, the teenager offered to "die together". Sewell committed suicide on the night of February 28.

Character.ai representatives said in a statement on October 22 that over the previous six months they had made a number of changes to the chatbot, including restrictions for users under 18.

If a user types phrases related to self-harm or suicide, a window now pops up in the chat directing them to a crisis helpline, the company said.

What are chatbots?

These are computer programs that simulate conversations.

They have existed in various forms for decades, but the recent explosion in artificial intelligence has allowed them to become much more realistic.

This, in turn, opened the door for many companies to create platforms where people can interact with digital versions of real and fictional people.

Character.ai, which has become one of the big players in this market, has attracted attention for its bots that simulate therapy.

The platform was also heavily criticized for taking too long to delete bots imitating the British teenagers Molly Russell and Brianna Ghey.

Molly Russell took her own life at the age of 14 after viewing suicide-related material online, and 16-year-old Brianna Ghey was murdered by two teenagers in 2023.

Character.ai was founded by former Google engineers Noam Shazeer and Daniel De Freitas in 2021.

The tech giant later rehired both of them.
