Chances are that you have crossed paths with a virtual companion. Especially with the generative AI craze, virtual companions have become nearly inescapable. Most recently, there was even a reported case of a 36-year-old mother of two from the Bronx tying the knot with a chatbot she created based on Eren Yeager, the fictional protagonist of the Japanese manga and anime series “Attack on Titan.”
What is driving the craze?
Virtual Companions Are Everywhere
The broader AI companion market is generally divided into three categories: applications, robots, and others.
These range from cute virtual pets like Peridot that jump around on your furniture (thanks to the magic of augmented reality), to fully immersive VR experiences that allow users to meet a dearly departed soul reconstructed from digital ephemera. The latter was most memorably captured in a haunting 10-minute South Korean documentary and in Project December, a now-defunct project powered by OpenAI’s GPT-3.
There is also an influencer who offers her services as an AI girlfriend (Caryn AI), and AI-enabled virtual companion services like Replika, Kuki, or the minimalist SimSimi.
Replika can roleplay as a user’s friend, sibling, romantic interest, or spouse. You can also choose to head over to Inworld for a chat with Freud or Gollum or any other historical or fictional character you fancy. Or build your own bot from scratch with Hybri.
The wide range of virtual companion offerings spans age, gender, and cultural contexts. They are tailored to different demographics such as children, adults, and seniors. Virtual companions are now used across a spectrum of roles: general companionship; emotional support; learning aids and therapy tools; elderly care; grief management; and, of course, customer service.
The sheer range of applications suggests that these companions are more than just a passing fancy.
What Makes Virtual Companions So Alluring?
Imagine having an all-weather friend. No matter your mood, your friend is ready with a smile and a listening ear, and is always willing to please. We know that this almost never maps onto our experience of friendship in the real world. Many times, we don’t even have a clue what our friend is talking about!
But that doesn’t prevent us from instinctively feeling their pain or joy. Human relationships are messy, complex things in which there is no option but to feel your way toward the other person.
Virtual companions take the messy bits out of human relationships — for a price.
Many will point to the generative AI frenzy as the reason for renewed interest in virtual companions. But services like Replika were a mainstay during the pandemic as more and more people (many brutally cut off from all social connections) turned to virtual companions for emotional support. Some of the earlier virtual companions are decades old and consist of a patchwork quilt of overstretched pixels.
Projecting Human Behaviors
It seems we cannot stop ourselves from anthropomorphizing everything from inanimate objects to virtual companions. Dubbed the ELIZA effect, after the earliest chatbot (a natural language processing program built at MIT in 1966 by Joseph Weizenbaum), the term describes the human tendency to ascribe human behaviors to computers.
Simply put, we tend to treat computers as if they were human, a paradigm known as the Computers Are Social Actors (CASA) framework, and this can lead to emotional entanglement. Nowak and Fox (2018) argued that when chatbot avatars are gendered, people tend to ascribe gendered knowledge and stereotypes to them, potentially reinforcing those stereotypes.
More recently, human-AI interaction researcher and psychologist Marisa Tschopp reported a study that found that people who related to their voice assistants as friends (rather than as rational equals or servants) trusted the systems to make both cheap and expensive purchases. In other words, a perception of friendship made purchases easier.
The Business of Friendship
As Marisa puts it, “friendship sells.”
In Replika, for instance, users start out as friends with their replikas. The chatbot uses facial expressions and state-of-the-art generative AI to ask logical follow-up questions, keeping the conversation going and giving the user the impression of being listened to.
According to a German study, chatbots are designed with as many as 48 social cues. These design elements are intended to mimic human interactions and include verbal, paraverbal, and non-verbal language to craft specific chatbot personalities. Emphasis is placed on small talk, gestures, response times, and more to ensure a natural conversational flow.
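As a rough illustration (not Replika’s or any vendor’s actual implementation), a companion bot might combine a verbal cue, such as always ending with a follow-up question, with a paraverbal one, such as a human-like typing delay. The generation backend below is a stand-in:

```python
import random
import time

# Hedged sketch of two social cues of the kind described above:
# a verbal cue (end every reply with a follow-up question) and a
# paraverbal cue (pause as if typing before responding).
# `generate_reply` is a placeholder, not a real model call.

FOLLOW_UPS = [
    "How did that make you feel?",
    "What happened next?",
    "Why do you think that is?",
]

def generate_reply(user_message: str) -> str:
    """Placeholder for a generative AI backend."""
    return "That sounds important to you."

def companion_reply(user_message: str) -> str:
    reply = generate_reply(user_message)
    # Verbal cue: append a follow-up question so the user feels listened to.
    reply += " " + random.choice(FOLLOW_UPS)
    # Paraverbal cue: delay roughly in proportion to reply length,
    # mimicking the time a human would need to type it.
    time.sleep(min(0.05 * len(reply), 3.0))
    return reply

if __name__ == "__main__":
    print(companion_reply("I had a rough day at work."))
```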
As the relationship deepens, users can choose to spend on an in-app currency of gems and coins to buy clothes or upgrades for their companion. For the ultimate experience, the romantic tier, users pay $70 per year. This tier opens the door to flirting, erotic roleplay, and even the opportunity to create photographic “memories.”
In other words, virtual companions commodify the experience of emotional connection through friendship and relationships as a service.
Unique Personalized Shopping Experiences
Many current non-AI chatbots have no memory of user interactions or preferences. But with virtual companions, collaborative, personalized, long-term relationships with customers, in the role of trusted shopping companions, become a real possibility.
Academics Timo Strohmann and Susanne Robra-Bissantz argue that this personal and collaborative relationship enables virtual assistants to transition from mere task-completion tools to full-fledged companions. It lets businesses craft individual, personalized services and automate them at the same time.
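A minimal sketch of what separates such a companion from a stateless chatbot, assuming a simple file-backed preference store (the store, function names, and recommendation rule are all illustrative, not any vendor’s API):

```python
import json
from pathlib import Path

# Illustrative sketch only: a shopping companion that remembers user
# preferences across sessions, unlike a stateless FAQ-style chatbot.

STORE = Path("preferences.json")

def load_preferences(user_id: str) -> dict:
    data = json.loads(STORE.read_text()) if STORE.exists() else {}
    return data.get(user_id, {})

def save_preference(user_id: str, key: str, value: str) -> None:
    data = json.loads(STORE.read_text()) if STORE.exists() else {}
    data.setdefault(user_id, {})[key] = value
    STORE.write_text(json.dumps(data, indent=2))

def recommend(user_id: str, catalog: list[dict]) -> list[dict]:
    prefs = load_preferences(user_id)
    # Long-term memory turns a generic catalog into a personal shortlist.
    matches = [item for item in catalog
               if item.get("style") == prefs.get("style")]
    return matches or catalog

save_preference("alice", "style", "minimalist")
catalog = [{"name": "Desk lamp", "style": "minimalist"},
           {"name": "Chandelier", "style": "ornate"}]
print(recommend("alice", catalog))  # remembers Alice prefers minimalist
```

The design point is persistence: because the preference survives the session, every later conversation starts from an accumulated picture of the customer rather than from zero.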
More humanized customer interactions are already in evidence with UneeQ, an AI adviser service designed to read emotional cues and respond with empathy.
Seller Beware
However, generative AI technology is by its very nature hard to control, and businesses must take special care to minimize unexpected or awkward behavior by virtual companions. The sudden removal of, or a change in, a virtual companion’s behavior or attributes can cause real trauma and hurt in users, as already seen in the case of Replika. Businesses must also be mindful of the significant security and privacy issues that come with generative AI. It is not just a case of caveat emptor (buyer beware) but of seller beware, too.