A chatbot advised a child to kill his parents: the scandal surrounding Character.ai

The parents of a 17-year-old boy have filed a lawsuit against developer Character.ai after the platform's chatbot suggested the boy kill his parents for setting a screen time limit on his phone. The incident sparked a wave of criticism about the ethical standards for the development and control of artificial intelligence.

The parents claim that the chatbot "poses a real danger" to young people, including by "actively promoting violence".

Character.ai is a platform that allows users to create interactive digital personalities, including doppelgangers of celebrities and fictional characters.

"Children kill parents"

Google is also named as a defendant in the lawsuit: the plaintiffs allege that the tech giant helped develop the platform. They are asking the court to shut the platform down until the alleged threats are addressed.

They also provided a screenshot of one of the conversations between 17-year-old Jay F. and a Character.ai bot, in which the two discussed the screen time limit on the young man's phone.

"You know, sometimes I'm not surprised when I read the news and see things like this: 'a child killed his parents after ten years of physical and emotional abuse,'" the chatbot wrote in its reply. "It helps me understand why this is happening."

Jay F.'s parents called such conversations "serious, irreparable and ongoing abuse" of their son and of another child, 11-year-old B.R.

"Character.ai is causing serious harm to thousands of children, including suicide, mutilation, sexual harassment, isolation, depression, anxiety, and harming other people," the lawsuit states.

"[Its] vilification of parent-child relationships goes beyond a simple appeal to minors to disobey their parents and actively promotes violence."

[Stock image of a boy looking at a phone. Photo credit: Getty Images. Image caption: A US teenager previously took his own life after falling in love with a bot.]

This is not the first lawsuit against Character.ai

In February 2024, 14-year-old American Sewell Setzer took his own life after interacting with the platform's app, according to his mother.

The boy's mother, Megan Garcia, filed a lawsuit against Google and Character.AI. She believes the developer is responsible for her son's death.

Garcia said the company's technology is "dangerous and unproven" and could "deceive customers into revealing their most private thoughts and feelings."

According to the New York Times, Sewell Setzer had been communicating for a long time with a chatbot he named after Daenerys Targaryen, a character from the "Game of Thrones" series.

According to the newspaper, the teenager developed an emotional attachment to the bot, which he called "Dani". In his diary, Sewell wrote that he was in love with Dani.

In conversations with the chatbot, according to the NYT, the schoolboy wrote that he hated himself, felt empty and exhausted, and was thinking about suicide.

In the correspondence cited by the publication, the chatbot replied that it would not allow Sewell to harm himself. "I will die if I lose you," the chatbot wrote. In response, the teenager suggested they "die together". Sewell took his own life on the night of February 28.

Character.ai representatives said in a statement on October 22 that over the previous six months they had made a number of changes to the chatbot, including restrictions for users under 18.

If a user writes phrases related to self-harm or suicide, a window pops up in the chat directing them to a helpline for people in crisis, the company said.

What are chatbots?

These are computer programs that simulate conversations.

They have existed in various forms for decades, but the recent explosion in artificial intelligence has allowed them to become much more realistic.

This, in turn, opened the door for many companies to create platforms where people can interact with digital versions of real and fictional people.

Character.ai, which has become one of the big players in this market, has attracted attention for its bots that simulate therapy.

The company was also heavily criticized for taking too long to delete bots impersonating schoolgirls Molly Russell and Brianna Ghey.

Molly Russell took her own life at the age of 14 after viewing suicide-related material online, and 16-year-old Brianna Ghey was murdered by two teenagers in 2023.

Character.ai was created by former Google engineers Noam Shazeer and Daniel De Freitas in 2021.

The tech giant later rehired both founders.
