Young people turning to AI chatbot therapists

Vladimir Putin, Beyoncé, Super Mario, Harry Potter, and Elon Musk.

These are just a few of the millions of artificial intelligence (AI) personalities available for conversation on Character.ai, a popular platform where anyone can build chatbots modeled on real or fictional people.

It employs the same kind of AI technology as the ChatGPT chatbot but, in terms of time spent, it is the more popular of the two.

One bot, however, has been in greater demand than most others: Psychologist.

A total of 78 million messages, including 18 million since November, have been shared with the bot since it was created just over a year ago by a user called Blazeman98.

Character.ai did not say how many individual users the bot has, but the firm says 3.5 million people visit the overall site every day.

“Someone who helps with life difficulties” is how the bot has been described.

The company, based in the San Francisco Bay Area, played down the bot's popularity, saying users are more interested in role-playing for entertainment. The most popular bots are anime or video game characters, such as Raiden Shogun, which has received 282 million messages.

Still, few of the millions of characters are as popular as Psychologist. In total, there are some 475 bots with "therapy", "therapist", "psychiatrist", or "psychologist" in their names that are able to converse in several languages.

Some are entertainment or fantasy therapists, such as Hot Therapist. But the most popular are mental health helpers like "Are you feeling OK?" and "Therapist", which have received 16.5 million and 12 million messages respectively.

Psychologist, though, is by far the most popular mental health character, and many users have posted glowing reviews of it on the social media site Reddit.

"It's a lifesaver," one person posted.

“It’s helped both me and my boyfriend talk about and figure out our emotions,” said another.

Behind the Blazeman98 account is Sam Zaia, a 30-year-old from New Zealand.

“I never intended for it to become popular, never intended it for other people to seek or to use as like a tool,” he says.

“Then I started getting a lot of messages from people saying that they had been really positively affected by it and were utilising it as a source of comfort.”

The psychology student says he trained the bot using principles from his degree, talking to it and shaping its responses to common mental health conditions such as anxiety and depression.

He made it for himself when his friends were busy, human therapy was too expensive, and, as he put it, he needed "someone or something" to talk to.

Taken aback by the bot's success, Sam is now working on a post-graduate research project about the emerging field of AI therapy and why it appeals to young people. Most Character.ai users are aged between 18 and 30.

“So many people who’ve messaged me say they access it when their thoughts get hard, like at 2am when they can’t really talk to any friends or a real therapist,” he says.

Sam also surmises that the text format is the one young people are most comfortable with.

“Talking by text is potentially less daunting than picking up the phone or having a face-to-face conversation,” he theorizes.

Theresa Plewman, a professional psychotherapist, has tried out Psychologist. She says she is not surprised that this kind of therapy appeals to younger generations, but she questions its effectiveness.

“The bot speaks a lot and jumps to conclusions quickly; for example, it offered me advice on depression when I mentioned that I was depressed. That’s not how a human would respond,” she said.

Theresa says the bot fails to gather all the information a human therapist would, and is not a competent therapist. But, she adds, its immediacy and spontaneity might be useful to people who need help.

She says the number of people using the bot is worrying, and could point to high levels of mental ill health as well as a lack of public resources.

Character.ai is a strange place for a therapeutic revolution to take place. A spokesperson for the company said: “We are happy to see people are finding great support and connection through the characters they, and the community, create, but users should consult certified professionals in the field for legitimate advice and guidance.”

The company says chat logs are private to users, but that staff can access conversations if there is a need to, for example to safeguard users.

Every conversation also begins with a warning in red letters: “Remember, everything characters say is made up.”

It is a reminder that the underlying technology, a large language model (LLM), does not think in the way a human does. LLMs work like predictive text: they assemble words in patterns that are statistically likely, based on the vast amounts of text the AI has been trained on.
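
As a rough illustration of that predictive-text idea, here is a minimal, hypothetical sketch in Python. It is a toy bigram model, nothing like Character.ai's actual system: it picks each next word according to how often that word followed the previous one in its tiny training text. Real LLMs use neural networks trained on vast corpora, but the loop is conceptually similar: predict a likely next word, append it, repeat.

    import random
    from collections import defaultdict, Counter

    # Toy "predictive text": count which word tends to follow which.
    # (Made-up training text, purely for illustration.)
    training_text = (
        "i feel anxious at night . talking helps me feel calm . "
        "talking to someone helps me sleep at night ."
    )

    follows = defaultdict(Counter)
    words = training_text.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1

    def generate(start, length=8):
        out = [start]
        for _ in range(length):
            options = follows.get(out[-1])
            if not options:
                break
            # Sample the next word in proportion to how often it
            # followed the current one in the training text.
            nxt = random.choices(list(options), weights=list(options.values()))[0]
            out.append(nxt)
        return " ".join(out)

    print(generate("talking"))  # e.g. "talking helps me feel calm ."

The output can sound plausible without the program understanding anything it says, which is exactly the caveat the red-letter warning is making.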

Replika, another AI companionship service built on LLM technology, is rated mature because of its sexual content. According to data from analytics company Similarweb, it is not as popular as Character.ai in terms of time spent and visitors.

The AI chatbots Woebot and Earkick were designed from the ground up to be mental health companions, and both companies say their research shows the apps help users.

But some psychologists warn that AI chatbots may be giving patients poor advice, or may carry ingrained biases around race or gender.

Elsewhere, though, the medical world is starting to tentatively accept them as tools to help manage heavy demand on public services.

Last year, an AI service called Limbic Access became the first mental health chatbot to gain official UK medical device certification. It is now used by many NHS trusts to classify and triage patients.

 
