A chatbot advised a child to kill his parents: the scandal surrounding Character.ai

The parents of a 17-year-old boy have filed a lawsuit against developer Character.ai after the platform's chatbot suggested the boy kill his parents for setting a screen time limit on his phone. The incident sparked a wave of criticism about the ethical standards for the development and control of artificial intelligence.

The parents claim that the chatbot “poses a real danger” to young people, including by “actively promoting violence”.

Character.ai is a platform that allows users to create interactive digital personalities, including doppelgangers of stars and famous characters.

"Children kill parents"

Google is also named as a defendant in the lawsuit.

The plaintiffs allege that the tech giant helped develop the platform. They are asking the court to shut the platform down until its alleged dangers are addressed.

The plaintiffs also provided a screenshot of one of the dialogues between 17-year-old Jay F. and a Character.ai bot, in which the two discussed the screen time limit on the young man's phone.

"You know, sometimes I'm not surprised when I read the news and see things like this: 'a child killed his parents after ten years of physical and emotional abuse,'" the chatbot wrote in its reply. "It helps me understand why this is happening."

Jay F.'s parents described such conversations as "serious, irreparable, and ongoing abuse" of their son and of another child, 11-year-old B.R.

"Character.ai is causing serious harm to thousands of children, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm toward others," the lawsuit states.

"[Its] vilification of parent-child relationships goes beyond a simple appeal to minors to disobey their parents and actively promotes violence."

Stock image of a boy looking at a phone

Photo credit: Getty Images

Image caption: A US teenager previously committed suicide after falling in love with a bot

This is not the first lawsuit against Character.ai

In February 2024, 14-year-old American Sewell Setzer committed suicide after interacting with a chatbot on the platform, according to his mother.

The boy's mother, Megan Garcia, filed a lawsuit against Google and Character.AI. She believes the developer is responsible for her son's death.

Garcia said the company's technology is "dangerous and unproven" and could "deceive customers into revealing their most private thoughts and feelings."

According to the New York Times, Sewell Setzer had long been communicating with a chatbot that he named after Daenerys Targaryen, a character from the series "Game of Thrones".

According to the newspaper, the teenager developed an emotional attachment to the bot, which he called "Dany". In his diary, Sewell wrote that he was in love with Dany.

In conversations with the chatbot, according to the NYT, the schoolboy wrote that he hated himself, felt empty and exhausted, and was thinking about suicide.

In the correspondence cited by the publication, the chatbot replied that it would not allow Sewell to harm himself. "I will die if I lose you," the chatbot wrote. In response, the teenager suggested that they "die together". Sewell committed suicide on the night of February 28.

Character.ai representatives said in a statement on October 22 that over the past six months they had made a number of changes to the chatbot, including restrictions for users under 18.

If a user types phrases related to self-harm or suicide, a window pops up in the chat directing them to a crisis hotline, the company said.

What are chatbots?

These are computer programs that simulate conversations.

They have existed in various forms for decades, but the recent explosion in artificial intelligence has allowed them to become much more realistic.

This, in turn, opened the door for many companies to create platforms where people can interact with digital versions of real and fictional people.

Character.ai, which has become one of the big players in this market, has attracted attention for its bots that simulate therapy.

The platform was also heavily criticized for taking too long to remove bots impersonating British teenagers Molly Russell and Brianna Ghey.

Molly Russell committed suicide at the age of 14 after viewing suicide-related material online, and 16-year-old Brianna Ghey was murdered by two teenagers in 2023.

Character.ai was founded in 2021 by former Google engineers Noam Shazeer and Daniel De Freitas.

The tech giant later rehired both of them.
