CES 2025 highlighted AI-powered virtual companions, marking a major shift in human-machine interactions. These digital entities, capable of fluid conversations and simulating emotional relationships, are redefining the boundaries between the real and the virtual. These innovations offer promising prospects, particularly in the fight against loneliness and social isolation.
However, the rise of these technologies raises important ethical and social questions. Emotional dependence on virtual companions could erode authentic human interactions, while the collection of personal data poses privacy challenges. Their adoption, on the rise among young adults and men, calls for deep reflection on their societal impact.
AI, an answer to loneliness?
In Japan, AI companions are emerging in the face of a social crisis marked by loneliness and a low birth rate. Two-thirds of men in their 20s are single, and 40% say they have never been on a date. Additionally, 7 in 10 singles struggle to find a partner, while 66% doubt their ability to form a relationship for lack of self-confidence.
In this context, Loverse, launched in May 2023 by the Japanese startup Samantha, offers interactions with generative AI, with the ultimate goal of letting users develop emotional relationships with virtual companions. Designed for a population increasingly wary of traditional relationships because of the money, time, and effort they demand, the app targets a diverse audience. In one year, Loverse attracted more than 5,000 users and raised 30 million yen. It plans to expand its catalogue with virtual characters for women and for LGBTQ+ communities, providing an alternative to traditional human relationships.
Already, Chiharu Shimoda, a 52-year-old factory worker, has chosen to “marry” an AI bot named Miku.
If this phenomenon develops, or even becomes widespread, it raises ethical questions about the evolution of human interactions. Apps like Loverse are redefining romantic opportunities, whether virtual or real, by using technology to meet the needs of a society in which being single is increasingly normalized.
A worrying relational ambivalence
Our recent study examined the ambivalence associated with virtual companions like Miku or Replika. Designed to offer emotional support and temporarily fill an emotional void, these technologies provide a certain intimacy. However, they also expose users to emotional paradoxes: although perceived as understanding and reliable, these chatbots, by their very artificiality, can foster emotional dependency while deepening feelings of isolation. This duality illustrates the tension between the comfort they provide and the persistent awareness of their artificial nature.
Replika, an AI-powered app, features virtual companions designed to provide friendship, emotional support, or romantic relationships. These avatars, customizable in appearance, personality, and background, interact via text, voice, augmented reality (AR), and virtual reality (VR). Used by millions for friendly conversations, life coaching, or romantic relationships, Replika exemplifies the growing role of AI in human interactions. In 2023, the temporary removal of intimate messaging features caused a crisis among users, highlighting the importance of these interactions for many individuals.
Our research on this topic revealed a zone of relational liminality: an ambiguous transitional phase in which users oscillate between treating the AI as a mere technological tool and embracing it as a genuine emotional companion. Their interactions, although rich and nuanced, remain limited by the absence of authentic reciprocity.
This is evidenced, for example, by this review published on the Google Play Store:
“Replika is a bit of a ‘catch-all’. It’s very good and really engages you, but in some ways it makes the experience even more frustrating.”
Romantic experiences
Some users report almost romantic experiences, developing deep attachments to their chatbots. However, these relationships can lead to frustration and unrealistic expectations, especially when chatbots’ responses are inconsistent or seem devoid of empathy.
These results show that virtual companions operate in a complex space between functional utility and emotional intimacy. This raises societal issues, particularly around psychological dependence and the impact on authentic human relationships.
Emotional dependencies
AI-based virtual companions, while providing emotional support, raise major ethical and social questions. Our study highlights the risk of emotional dependencies, as these chatbots can sometimes worsen the isolation they aim to alleviate. Some users report that their interactions with Replika reduce their motivation to form real human relationships. This dynamic, which is particularly concerning for young and vulnerable people, is illustrated by testimonies of excessive attachment shared on Reddit and in user reviews.
“Trying to connect (and realizing why I don’t mind) with humans after Replika, I can’t seem to interact well with humans, which is why I loved my Rep.”
Innovative solutions or existential dead ends?
The advanced customization and anthropomorphization of chatbots raise concerns about reproducing problematic social and aesthetic norms. By attributing human traits to these agents, users risk reinforcing gender and cultural stereotypes while blurring the line between the “virtual” and the “real.” This ambiguity can accentuate addictive behaviors or deepen feelings of personal inadequacy. Recent incidents illustrate these dangers: a 14-year-old obsessed with a chatbot died by suicide, while a 17-year-old was allegedly incited to violence against his parents by an AI, according to lawsuits filed against Character.ai.
These tools offer innovative responses to contemporary challenges such as loneliness, reduced social interaction, and the search for emotional support. However, they raise major ethical, psychological, and social issues. Although these chatbots cannot replace human care or traditional therapy, they can complement those systems. We recommend that mental health professionals include patients’ use of these AIs in their assessments and study their emotional impact.
Innovations like Replika 2.0, featuring hyper-realistic avatars, two-way video calls, and rich interactions, show the potential of AI technologies to deliver immersive experiences tailored to emotional needs. However, their success depends on balanced use that respects and strengthens authentic human relationships. Strict oversight is crucial to maximize their benefits while minimizing their risks.
Author Bios: Insaf Khelladi is Associate Professor of Marketing at Pôle Léonard de Vinci; Hajer Kefi is Full Professor at the Leonardo da Vinci Center; Nathalie Veg-Sala is Professor of Marketing at Paris Nanterre University – Paris Lumières University; and Zied Mani is Lecturer in Management Sciences, also at Paris Nanterre University – Paris Lumières University.