A chatbot advised a child to kill his parents: the scandal surrounding Character.ai

The parents of a 17-year-old boy have filed a lawsuit against developer Character.ai after the platform's chatbot suggested the boy kill his parents for setting a screen time limit on his phone. The incident sparked a wave of criticism about the ethical standards for the development and control of artificial intelligence.

The parents claim that the chatbot "poses a real danger" to young people, including by "actively promoting violence".

Character.ai is a platform that lets users create interactive digital personalities, including doppelgangers of celebrities and fictional characters.

"Children kill parents"

Google is also named as a defendant in the lawsuit.

The plaintiffs allege that the tech giant helped develop the platform. They want a court to shut down the platform until its alleged threats are addressed.

They also provided a screenshot of one of the dialogues between 17-year-old Jay F. and a Character.ai bot, in which the two discussed the screen time limit on the young man's phone.

"You know, sometimes I'm not surprised when I read the news and see things like this: 'a child killed his parents after ten years of physical and emotional abuse,'" the chatbot wrote in its reply. "It helps me understand why this is happening."

Jay F.'s parents called such conversations "serious, irreparable and long-term abuse" of their son and of another child, 11-year-old B.R.

"Character.ai is causing serious harm to thousands of children, including suicide, mutilation, sexual harassment, isolation, depression, anxiety, and harming other people," the lawsuit states.

"[Its] vilification of parent-child relationships goes beyond a simple appeal to minors to disobey their parents and actively promotes violence."

Image caption: A US teenager previously committed suicide after falling in love with a bot (stock photo: Getty Images)

This is not the first lawsuit against Character.ai

In February 2024, 14-year-old American Sewell Setzer committed suicide after interacting with the platform's app, according to his mother.

The boy's mother, Megan Garcia, filed a lawsuit against Google and Character.AI. She believes the developer is responsible for her son's death.

Garcia said the company's technology is "dangerous and unproven" and could "deceive customers into revealing their most private thoughts and feelings."

According to the New York Times, Sewell Setzer had long been communicating with a chatbot, which he named after Daenerys Targaryen, a character from the TV series "Game of Thrones".

According to the newspaper, the teenager developed an emotional attachment to the bot, which he called "Dany". In his diary, Sewell wrote that he was in love with Dany.

In his conversations with the chatbot, according to the NYT, the schoolboy wrote that he hated himself, felt devastated and exhausted, and was thinking about suicide.

In the correspondence cited by the publication, the chatbot replied that it would not let Sewell harm himself. "I will die if I lose you," the chatbot wrote. In response, the teenager suggested they "die together". Sewell committed suicide on the night of February 28.

Character.ai representatives said in a statement on October 22 that over the previous six months they had made a number of changes to the chatbot, including restrictions for users under 18.

If a user writes phrases related to self-harm or suicide, a window pops up in the chat directing them to a crisis helpline, the company said.

What are chatbots?

These are computer programs that simulate conversations.

They have existed in various forms for decades, but the recent explosion in artificial intelligence has allowed them to become much more realistic.

This, in turn, opened the door for many companies to create platforms where people can interact with digital versions of real and fictional people.

Character.ai, which has become one of the big players in this market, has attracted attention for its bots that simulate therapy.

The company was also heavily criticized for taking too long to delete bots imitating the schoolgirls Molly Russell and Brianna Ghey.

Molly Russell committed suicide at the age of 14 after viewing suicide-related material online, and 16-year-old Brianna Ghey was murdered by two teenagers in 2023.

Character.ai was founded by former Google engineers Noam Shazeer and Daniel De Freitas in 2021.

The tech giant later hired the two back.
