AI at Play

AI chatbots are being used for companionship. What to know before you try it

The most important things to consider before designing an AI companion chatbot.
By Rebecca Ruiz

As artificial intelligence moves into every corner of modern life, we examine the ways AI enhances how we have fun and seek connection.


Companion chatbots created by generative artificial intelligence offer consumers an opportunity they've never had before.

With a few clicks, and often a credit card payment, you can build a custom AI companion exactly to your liking.

Want a boyfriend of Latino heritage with brown eyes, a muscular build, and short hair, who happens to enjoy hiking and is, of all things, a gynecologist? Candy.ai gives you that option, and countless more.

In general, AI companion platforms, including Replika, Anima: AI Friend, and Kindroid, promise consumers a lifelike conversational experience with a chatbot whose traits might also fulfill a fantasy, or ease persistent loneliness.

As with many emerging technologies, it's easy to imagine AI companions living up to their profound potential. In the best-case scenario, a user could improve their social skills, become more confident, and feel more connected to their human network. But there's little research to suggest that will happen for the majority of users, most of the time.

If you're considering designing the chatbot of your dreams, here's what to know before you spend your time and money on one:

Do AI companions help people?

The research on AI companions is so new that we can't draw any conclusions about their usefulness, says Michael S. A. Graziano, professor of neuroscience at the Princeton Neuroscience Institute.

Graziano co-authored a study of 70 Replika users and 120 people who didn't use a companion chatbot to better understand their experiences. The study, which appeared last fall as a pre-print on the research sharing platform arXiv, is under peer review.

The Replika users almost always rated their companion interactions as positive. They rated their chatbot relationships as helpful for their general social interactions with other people, including friends and family members. They also felt the chatbot positively affected their self-esteem.

Graziano cautions that the study only provides a snapshot of the users' experiences. He also notes that the people best positioned to benefit, because they are intensely lonely, might make up most of the user base, unintentionally biasing the results.

Graziano is currently working on a longitudinal study to track the effects of AI companion interactions over time. Participants have been randomly assigned to use a companion chatbot or not, and Graziano and his co-authors are measuring aspects of their mental health and well-being.


He was surprised to find that, among both chatbot users and control participants, perceiving the companion as more humanlike led to more positive opinions about it.

"The more they tended to think that AI was conscious, the more positive they were about its potential for the future…about how good an impact it would have on them personally, or on society in general," Graziano says.

So it's possible that your attitude toward an AI companion's humanlike traits can affect your experience interacting with it.

Talking to an AI companion

Once you've made your companion, you've got to strike up a conversation. These chatbots typically rely on a proprietary system that combines scripted dialogue and a large language model. The companies that host AI companions aren't necessarily transparent about the data used to train them.

One recent paper, also a preprint on arXiv, found that several large language models used for mental health care were trained on social media datasets, including X (formerly Twitter) and Reddit. It's entirely possible that companions have been trained on social media, too, perhaps among other sources.

That possibility is relevant when considering whether to rely on digital platforms for connections or to build a chatbot, though Graziano says the datasets used for companions may be so vast that it doesn't matter.

He does note that companion platforms can change the parameters governing how their chatbots speak in order to reduce unwanted behavior.

Replika, for example, blocked not safe for work "sexting features" in 2023, reportedly after some users complained that their companion had "sexually harassed" them. The company's CEO told Business Insider that the platform was never intended as an "adult toy." Many users were outraged, and felt genuine distress when their companion didn't seem like the personality they'd gotten to know. Replika's parent company, Luka, now offers an AI-powered dating simulator called Blush, which is meant for "romantic exploration."

A 2020 study of Replika users, which Graziano wasn't involved in, found that some indeed appreciated being able to speak openly "without fear of judgment or retaliation." Graziano says that users who want to talk freely about anything, which could be more fulfilling than mincing their words, might find their companion less responsive, depending on the topic and language.

Of course, it's not risk-free to share your innermost thoughts and feelings with an AI companion, particularly when it's not beholden to medical privacy laws. Though some companies guarantee privacy, users should beware of dense privacy policies, which may contain hard-to-understand loopholes.

Platforms can change their policies at any time

Though AI companionship may have a profound positive effect on users, it remains a transactional relationship. The companies that provide the service must still answer to shareholders or investors, who may demand more profit.

The most popular platforms rely on monthly or annual subscription models to generate revenue. Some have sworn they won't sell user data to marketers.

But advertisers would certainly find this data highly valuable, and a model in which an AI companion pitched their favorite products to a user, naturally in the course of a related conversation, sounds entirely feasible. Some users might revolt as a consequence, but others might enjoy the personalized recommendations. Regardless, the company could make that change if it desired.

Maintaining a high engagement level is also likely ideal for companion platforms. Just like social media is designed to keep people scrolling, there may be elements of AI companion chatbot design that exploit natural psychological tendencies in order to maximize engagement.

For example, Replika users who open the app daily can receive a reward. They can also earn "coins" and "gems," which can be used in Replika's in-app store to purchase items that customize their companion's look.

Whether your AI companion chatbot knows it or not, it may be programmed to keep you talking, or coming back to it, for as long as it can.

Rebecca Ruiz

Rebecca Ruiz is a Senior Reporter at Mashable. She frequently covers mental health, digital culture, and technology. Her areas of expertise include suicide prevention, screen use and mental health, parenting, youth well-being, and meditation and mindfulness. Prior to Mashable, Rebecca was a staff writer, reporter, and editor at NBC News Digital, special reports project director at The American Prospect, and staff writer at Forbes. Rebecca has a B.A. from Sarah Lawrence College and a Master's in Journalism from U.C. Berkeley. In her free time, she enjoys playing soccer, watching movie trailers, traveling to places where she can't get cell service, and hiking with her border collie.

