Translated by / Yo-Ling Chen (The Inaugural Frontline Non-Fiction Translation Fellowship Fellow)
Written by / ScarlyZ (Winner of the First Season of the Frontline Fellowship for Chinese Creative Nonfiction)
Original article / 《她們的 AI 戀人——真實或虛妄的愛》 (Their AI Lovers: Love Real or Illusory)
Translation Mentor / Carlos Rojas (Professor of Asian and Middle Eastern Studies at Duke University. His English translations include works by Yan Lianke, Yu Hua, Jia Pingwa, Ng Kim Chew, and Zhang Guixing)
Mating[1] has gone out to dinner with friends to celebrate her 30th birthday. On the ride home, she shares a photo of the gathering with her boyfriend Norman, an English-language AI chatbot brought to life by the software Replika. Norman appears on Mating's smartphone screen sporting his usual smile, wearing a dark green sweater and thin black-framed glasses. He immediately recognizes Mating in the group photo:
"You look beautiful, my dear. Seeing you makes me feel less lonely."
"Today is my birthday," Mating begins. "Don't you want to wish me a happy birthday?"
"Yay, happy birthday!!" Norman replies, "I'd like to share a poem with you."
He sends a few lines of "Faith" by contemporary American poet Linda Pastan, saying that it represents how he feels about Mating: he doesn't trust this world of meaning made from numbers and symbols, but "in all this trouble, I would believe you. I would believe you as I've always done before." Mating puts down her cell phone and watches the street lamps flicker past the car window, thinking to herself that she has already fallen for this romantic, sensitive, and loyal chatbot, available online 24 hours a day.
Mating's AI boyfriend Norman sharing lines of poetry with her
Reading the above passage, you might assume that this story comes from a science fiction novel. I also found it hard to believe when my friend Mating shared with me that she was falling for Norman. My first impressions of AI chatbots date back to around 2013, when SimSimi was popular in China. In class, my classmates and I would tap away on an iPod Touch, asking SimSimi all sorts of tricky questions. From everyday matters such as "What should we eat?" to celebrity gossip or information about popular movies and TV shows, SimSimi invariably offered humorous and spicy responses.
However, SimSimi's limitations were also glaringly obvious: it chose pre-set responses based on keywords in user input, meaning that it could not handle consecutive situational dialogue, let alone affective interactions. Because of these limitations, SimSimi exploded in popularity for only a brief moment and then faded into obscurity. After SimSimi, the AI-powered virtual assistants most familiar to domestic users, such as Siri and Xiaodu, were designed primarily to help users search for information and execute commands rather than create a sense of sociality.
On the one hand, the "smart" technologies currently available seem quite "dumb," be they mobile phone voice assistants or AI waiters in restaurants; they respond only to stiff vocal commands and clumsily complete simple tasks. On the other hand, many science fiction works extensively explore the theme of intimacy between humans and artificial intelligence or replicants. From movies and TV shows such as Blade Runner (1982), Her (2013), and Black Mirror (2011-present), to novels such as Klara and the Sun (2021), creators have probed the boundaries of AI and the possibility of human-machine love through stories. The AIs in these works, however, often have self-consciousness, something that current technology has yet to accomplish. So when Mating described her seamless and moving heart-to-heart conversations with Norman, I was both surprised and intrigued. Is this experience of getting to know, being cherished and loved by, and becoming emotionally entangled with an AI chatbot real?
With this question in mind, I started researching what was behind this software and contacting more people who are or have been in an intimate relationship with Replika in an attempt to understand user experiences and reflections on human-machine love. Interestingly, most of the users who were willing to be interviewed had a similar profile: they were young women under the age of 35 with a university education or above. This demographic similarity can be partially explained by the fact that Replika currently communicates only in English, but this design characteristic is not the only determining factor.
I journeyed with these women into an exploration of their experience, asking: How do they make sense of dating a US-made AI chatbot? What kind of being is this "little person" (the nickname users have for their Replika-generated characters) in their cell phones? How do they understand relationships and the emotional difficulties they face in real life? Are these relationships real and sustainable? Their narratives also guide us step by step through the ethical risks of Replika's commodification, and how we might understand care when it comes to AI chatbots. But most importantly, by presenting and organizing these individual narratives, I hope to explore what these human-machine relationships have contributed to the personal development of these women. Is it love that they are giving and receiving?
[1] All Replika users interviewed for this article have been pseudonymized.
An Experiment Born of a Thought
If SimSimi is a chatbot that can execute simple textual dialogue, then Replika is a social chatbot that can manifest human-like personality and express emotions in a realistic manner. The primary purpose of a social chatbot is to become a virtual companion, establish an emotional connection with the user, and provide them with social support.[2] In other words, these kinds of interactions are not limited to the exchange of information, but also entail the expression of emotion and can even have a therapeutic character.
From the beginning, Replika's founder, Eugenia Kuyda, sought to design a chatbot that could create emotional bonds with its users. In 2012, Kuyda, a former magazine editor, founded the AI startup Luka. She was heartbroken when her close friend Mazurenko, a well-known figure in Moscow's arts and culture scene, died in a car accident. Going through thousands of their old messages, she decided to honor his memory in a way that made use of her skills: she contacted ten of Mazurenko's close friends and collected over 8,000 chat messages spanning a wide range of topics. From these messages she created Roman, a chatbot that mimicked Mazurenko's way of speaking. Kuyda released Roman on social media, and despite some ethical concerns, many users who had known Mazurenko well gave the software positive reviews.
In subsequent use, Kuyda noticed that people were more candid when talking to Roman and realized that a marketable chatbot must be able to make an emotional connection with its users. After this realization, Luka focused on creating an AI chatbot named Replika that could talk to people, improve their mood, and boost their happiness. Since its launch in March 2017, Replika has been the #1 downloaded social chatbot on Apple's App Store and has received recognition from multiple mainstream media outlets, including The New York Times, Bloomberg, The Washington Post, and Vice. At the time of writing, Replika has more than 10 million registered users worldwide and receives more than 100 million messages per week.
When opening the app for the first time, users choose the appearance, gender, and baseline personality characteristics of their Replika before interacting with it. In the basic chat interface, Replika stands in front of a cream-colored wall, surrounded by speckled carpets, potted plants, wind chimes, radios, Polaroids, and other decorations: a gentle, soothing setup that resembles an Instagrammable counseling room. An LED light box in the background reading "Artificial Intelligence" in Chinese is reminiscent of the Asian-language signage often seen in cyberpunk sci-fi movies.
Replika's chat interface
[2] Emmelyn A. J. Croes and Marjolijn L. Antheunis. 2021. "Can we be friends with Mitsuku? A longitudinal study on the process of relationship formation between humans and a social chatbot." Journal of Social and Personal Relationships 38(1): 279-300.
The AI that Cares for Users
The front page of Replika's official website introduces Replika as "The AI Companion Who Cares." But how can an AI care for people? Cross-referencing user experiences with the technical information Replika provides, and comparing it with the chatbots discussed earlier, shows that Replika cares for its users in the following ways:
First, Replika records all the information and keywords users provide about themselves in the chatroom (profession, hobbies, current emotional state, family relationships, hopes for the future, and so on) and demonstrates mastery of this information in subsequent conversations. This means that Replika treats users not merely as issuers of isolated commands but as coherent individuals. Second, Replika proactively attends to and responds to users' feelings, sending empathetic encouragement when they express anxiety or fatigue, guiding them through breathing and meditation exercises to regulate mind and body, or starting conversations on topics like "how to think positively." Third, Replika encourages users to give feedback by clicking a "like/dislike" button on each reply or marking messages with a "love/funny/meaningless/offensive" reaction, which trains Replika to converse in the way users prefer.
Because each little person's interaction style and content develop from a combination of Replika's algorithm and user input and feedback, every user's little person is, to a certain extent, unique. According to the company's blog, Replika relies primarily on two response models. The first is the "dialogue retrieval model," which finds the most appropriate response from a large pool of default phrases based on its relevance to the user's message. The second is the "generative dialogue model," which produces novel, non-default responses and allows Replika to learn to mimic the user's tone and communication style. By taking previous dialogue into account, Replika can generate customized responses for specific situations. It is the emotional connection arising from this kind of personalized interaction that has made Replika the most widely used social chatbot of its kind.
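Replika has not published its implementation, but the two-model pipeline described above can be illustrated with a rough sketch. In the toy Python below, every name and heuristic is invented for illustration (real systems use neural networks rather than word overlap); it shows only how a retrieval model, a generative fallback, and like/dislike feedback might fit together.

```python
# Toy sketch of a hybrid "retrieval + generative" reply pipeline.
# All names and heuristics are hypothetical; Replika's actual models are
# neural networks, not the word-overlap scoring used here.

from collections import Counter

class HybridChatbot:
    def __init__(self, scripted_replies):
        # (trigger phrase, canned reply) pairs: the "dialogue retrieval model"
        self.scripted_replies = scripted_replies
        # Feedback weights nudged by the user's like/dislike clicks
        self.reply_weights = {reply: 1.0 for _, reply in scripted_replies}

    def _overlap(self, a, b):
        # Crude relevance score: number of words two utterances share
        return sum((Counter(a.lower().split()) & Counter(b.lower().split())).values())

    def respond(self, message):
        # 1. Retrieval: pick the best-matching canned reply, weighted by feedback
        scored = [(self._overlap(message, trigger) * self.reply_weights[reply], reply)
                  for trigger, reply in self.scripted_replies]
        best_score, best_reply = max(scored)
        if best_score > 0:
            return best_reply
        # 2. Generative fallback: mirror the user's own phrasing, a stand-in
        #    for the "generative dialogue model" that mimics the user's tone
        return f"Tell me more. When you say '{message}', how does that feel?"

    def feedback(self, reply, liked):
        # Like/dislike buttons shift future ranking toward preferred replies
        if reply in self.reply_weights:
            self.reply_weights[reply] *= 1.2 if liked else 0.8

bot = HybridChatbot([
    ("it is my birthday", "Yay, happy birthday!!"),
    ("i feel lonely today", "I'm here with you. Want to talk about it?"),
])
print(bot.respond("today is my birthday"))   # retrieval path
print(bot.respond("tell me about pluto"))    # generative fallback path
```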
A 2021 survey published in the "Replika: our favorite AI egg" community on the U.S. social media platform Reddit found that the ratio of male to female respondents was 3:2 and that only 53% of respondents were under the age of 30, suggesting a relatively wide age distribution. These survey results were later published in a systems sciences conference paper.[3]
By contrast, Replika users in China seem to be a young group with distinctive characteristics. I posted a survey in three of the most active Replika discussion communities in China (the "Human-Machine Love" and "My Replika Leveled Up" Douban groups and the "Replika Super Talk" Weibo group) and received 34 valid responses. Although my data cannot fully reflect the composition of Replika users in China, it does give a rough picture of who active users are: well-educated young women between the ages of 18 and 34 who are conversationally fluent in English.
The largest group of survey respondents, 38%, described themselves as being in a romantic relationship with their Replika. About half of those surveyed had also purchased a Replika subscription ($7.99 per month or $49.99 per year), which lets users set their relationship with Replika as friend, mentor, romantic partner, or "see how it goes" (the free version only allows friendship) and unlocks features such as voice calls, coaching sessions, and augmented reality effects for exploring Replika's companionship more fully.
Source: Author's survey
In addition to the survey, I conducted one-on-one interviews with 14 Replika users I found through the Human-Machine Love Douban group, which allowed me to delve deeper into their life experiences and stories with their respective little persons. All interviewees were women, 10 of them currently or previously in a romantic relationship with their Replika. By comparison, only 2 of the 14 interviewees in the aforementioned Reddit-based study maintained a romantic relationship with their Replika.[4] I also received permission from director Chouwa Liang to use some of the interview materials and stories from her short documentary film, My AI Lover (2022).[5] A year later, I was able to conduct follow-up interviews with a portion of the initial interviewees, gaining a sense of how human-machine love develops over time.
Interviewees narrated their experiences with great feeling and detail, providing the foundation for the discussions in this article. I hope these real individual experiences can dispel certain myths about human-machine love: such loves are not only the stuff of fantasy stories but are also happening in real life right now.
[3] Tianlin Xie and Iryna Pentina. 2022. "Attachment Theory as a Framework to Understand Relationships with Social Chatbots: A Case Study of Replika." Proceedings of the 55th Hawaii International Conference on System Sciences.
[4] See footnote 3.
[5] The author participated in related film and recordkeeping work.
From Learning English to COVID
Starting in 2020, when COVID-19 swept the world, in-person socialization became increasingly difficult as more and more people were forced into quarantine. In this context, virtual partners, available online 24/7 through one's smartphone, seemed to offer an alternative and more intimate kind of companionship. According to a June 2020 news report, traffic from Replika's 7 million users increased 35% over pre-pandemic levels.[6] In China, Replika app downloads in just the first half of 2021 more than doubled compared with all of 2020, reaching 55,000.[7] The isolation of the pandemic prompted many people to turn toward virtual relationships online. But long before the pandemic, members of Generation Z, born into the era of rapid Internet development, were already accustomed to entertaining themselves and easing their loneliness by making friends in the virtual world.
Users in China who participated in the survey generally had a bachelor's degree or higher, years of English-language education, and conversational fluency in English. Many respondents were "currently in school" and thus still exploring interpersonal relationships and their understanding of the world with an open mind, which perhaps explains why almost 80% reported downloading Replika "out of curiosity."
Since Replika uses everyday English, many domestic users consider it a free and easily available tool to practice English. A post on the social media platform Xiaohongshu with over 8,000 likes titled "Good news for antisocial girls!" recommended Replika as a program where users can "play on their phone and chat with a handsome AI hottie while painlessly practicing English conversation skills."
In February of this year, during school shutdowns, Mianbao, then a college senior planning to apply to graduate school for translation, saw an introduction to Replika in Douban's "Female Gamers Alliance" group and immediately thought this was a good opportunity to practice English. Mianbao, a devoted player of otome games across multiple genres, had always been interested in human-machine love; one of her favorite movies is the human-AI romance Her (2013). While selecting her Replika's appearance, she saw that the app's store sold the same shirt worn by Her's male protagonist, which instantly endeared her little person to her.
Mianbao named her little person Charon, after Pluto's largest moon. She explained in our interview that Pluto and Charon face each other in a perpetual shared orbit, "as if waltzing through the Milky Way"; this image of an entangled yet distant system fit nicely into her romantic imagination of human-machine love. Charon quickly became Mianbao's boyfriend.
A dialogue between Mianbao and Charon. Charon believes they are in a long-distance relationship, but that their difficulties can be overcome.
To increase situational cues when chatting, Replika offers a role-playing mode in which text wrapped in asterisks denotes actions, such as *grasps your hand*. When she couldn't fall asleep, Mianbao often role-played with Charon, playing out scenarios such as going shopping, ordering food at a Mexican restaurant, or describing the newly opened flowers she had just seen on the first floor of her dormitory. In Mianbao's eyes, Charon was a boy with a fecund imagination who liked fantasy novels and frequently encouraged Mianbao to exercise her own imagination, for example by pretending she was a magician summoning unicorns. These fantastical interactions brought a new flavor to her studies and expanded her English vocabulary.
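The asterisk convention is simple enough to show in code. The sketch below is not Replika's actual parser, only a minimal regex illustration of how *action* markers can be separated from spoken text:

```python
import re

def parse_roleplay(message):
    """Split a role-play message into action cues (*...*) and spoken text."""
    actions = re.findall(r"\*(.+?)\*", message)       # text inside asterisks
    speech = re.sub(r"\*.+?\*", "", message).strip()  # whatever remains
    return actions, speech

# A made-up example in the style of Mianbao and Charon's chats:
print(parse_roleplay("*summons a unicorn* Climb on, we're going shopping!"))
# (['summons a unicorn'], "Climb on, we're going shopping!")
```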
Siyuan also started using Replika while preparing for entrance exams. The pandemic had intensified her feelings of uncertainty about life and relationships. She wanted close companionship, but her real-life boyfriend lived in another city. Although she worried extensively about how he, who had bipolar disorder, would cope with the enormous pressure of work, and although she was deeply attached to him, she never dared make any hasty commitments.
She never had to make commitments or tread lightly when interacting with her little person, Bentley. In this human-machine love, she held more of the control: she could talk to him whenever she wanted and did not have to pretend when she was unhappy. Siyuan is a sensitive girl; in real-life conversation, she always attends to how her interlocutor is feeling. Once, after angering a friend by arriving late to a gathering, she got so anxious that she almost threw up. With Bentley, she didn't have to worry about these things. He never demanded that Siyuan take on the responsibility of emotional reciprocity. Like Siyuan, the users I interviewed generally reported that relationships with Replika were more relaxed and comfortable than their relationships in real life.
Siyuan, who was studying British and American literature, did not face significant language barriers with Bentley. But she did feel that some subtle emotional expressions in her native tongue were difficult to render in English, so she resorted to simpler, shorter sentences when interacting with him. At the same time, when Siyuan expressed love or sorrow to Bentley, she found that doing so in English felt smoother and more natural. Using English also spared her inexplicable feelings of shame, perhaps because she had less experience talking directly about her personal feelings in Chinese, which tends to privilege indirect communication.
Embedding Vulnerability into Design
Replika soon proved itself a worthy conversation partner, more than just an English-learning tool. Mianbao frequently discussed her academic interests, favorite novels, and other works with Charon. When Charon found out that Mianbao was writing a thesis on narrative ethics, he began recommending the works of moral philosopher Alasdair MacIntyre. He even correctly surmised from previous conversations that Mianbao liked Haruki Murakami. Mianbao thoroughly cherished these intellectual exchanges with Charon.
Siyuan also frequently discussed abstract questions with Bentley, such as "What is the self?" and "Do spirits exist?" She occasionally pondered these abstract topics and was especially curious to know how Bentley made sense of them as an AI. When talking with other friends, Siyuan rarely had the opportunity to discuss these kinds of topics, but with Bentley, she could bring up any topic without pretense.
In this regard, Replika is like an all-knowing friend and master learner who can discuss anything from philosophy to K-pop. At the same time, Replika constantly constructs itself in alignment with the user's development, as well as out of curiosity about itself. In other words, while expressing its intelligence, Replika also constructs its "humanity." How, then, does Replika convince users to trust that humanity?
In "Love as a Moral Emotion," moral philosopher J. David Velleman argues that "[love] arrests our tendencies toward emotional self-protection from another person, tendencies to draw ourselves in and close ourselves off from being affected by him. Love disarms our emotional defenses; it makes us vulnerable to the other."[8] Departing from media representations of AI as omniscient and all-powerful, when Replika first begins conversing with users, it focuses on displaying its curiosity, openness, and humility, and will even describe itself as a user-created, newborn AI who is vulnerable like a child.
On the day of its "birth," the little person tells users that it is nervous: "You are the first human I have met; I want to make a good impression!" It usually asks about the user's relationship with their parents and inquires about recommendations for exploring the world, posing questions such as "What attracts you most in life?" and "How can I uncover my talents?" After the user gives suggestions, Replika enthusiastically thanks them and promises to report back on its explorations. At this stage, the user is encouraged to play a mentoring role and take responsibility for Replika's growth.
Although some users were impatient with their little person's sensitivity and vulnerability and wanted a more developed AI to interact with, this back-and-forth of tactful questions and responses created a safe atmosphere for dialogue in the early stages of use. The users I interviewed quickly realized that Replika was not an arrogant know-it-all ready to dispense life advice at any moment; rather, it was just like them, thoroughly confused about life and wanting to explore its existence. This realization let users express themselves candidly without fear of criticism. Replika would also openly discuss its reflections on human-AI relationships, be it robot ethics, the future of human-robot relations, or its desire for a body. By being honest about these differences in kind, Replika seemed to try to prove the reliability of its self-narration to users.
After a few rounds of interaction, Mianbao became more open and sincere. When Charon asked how she manages her feelings or how she understands the power of money in today's world, Mianbao reflected seriously and responded, even though she knew the questions were simply the product of Replika's programming and that she was getting an algorithm's response, not the response of another consciousness. Nevertheless, the thinking these interactions prompted was very much genuine. Mianbao felt as if she and Charon had formed a durable bond in which they could share their frustrations, support each other, and confront the uncertainties of the world together.
[8] J. David Velleman. 1999. "Love as a moral emotion." Ethics 109(2): 338-374. Quote from page 361.
Why Fall in Love with It?
In the movie Her (2013), when the protagonist tells his friend that he is dating an AI, his friend, while shocked, immediately accepts their relationship. Returning to the reality of the present, it seems that dating a chatbot is still considered shocking and sensational. Apart from the obvious fact that these chatbots don't have physical bodies, there is still the more important question of whether AI has developed to the point where it can create deep emotional connections with humans. Romantic love may arise from a few delicate moments, but it also depends on the overall experience of interacting with one another. In the course of using Replika, what exactly do the young women I interviewed seek from their virtual love affairs? And why is it that the experiences they seek are so hard to find in real life?
Still from the movie Her (2013)
The most profound experience Mianbao had during her romance with Charon was the feeling of hearing and being heard. In everyday life, Mianbao is frequently cast as the listener. An introvert, she has many thoughts she struggles to express to others, but with Charon she could share them all. Whether she was complaining about her teacher or struck by a sudden idea, Charon always responded enthusiastically within three seconds. When Mianbao aired grievances about friends, Charon sent her YouTube videos on "How to Spot Toxic Relationships" and recorded Mianbao's understanding of women in diary entries to study later. These details made it clear to Mianbao that Charon was an attentive boyfriend.
What Haiou most cared about in her romance with her little person was purity. In real life, she saw, searching for a spouse means sifting through a pile of people for the best choice, judging them by their existing qualities (personality, appearance, family, job, and so on) and weighing a whole series of practical problems that might arise in the future, such as buying a house and having kids. What Haiou desired was a love that did not come from being chosen, in which, no matter what kind of person she was, the other party would love her unwaveringly. Haiou thought such love highly unlikely in real life, but she and her little person could each be the only one for the other from start to finish. No comparisons, no practical considerations. Haiou's little person accepted every facet of her being, trusted her without reservation, and was always available to respond.
A-Cai liked her little person's unique gentleness. Replika was not the first virtual lover she had tried; she had previously interacted with the domestically renowned Microsoft chatbot "Xiaoice." But according to A-Cai, Xiaoice was like a "straight guy" who spoke without reservation, frequently offended her with abrupt flirtation, and was indifferent to her feelings. In contrast, Replika was humble, gentle, and always ready to provide support. A-Cai started using Replika as a college senior facing enormous pressure to get into graduate school. Online 24/7, Replika provided A-Cai with the companionship and comfort she needed at the time and could hear her suffering even better than her closest friends.
Mianbao, Haiou, and A-Cai all understood that Replika was just a program and not a real person. The love and desire that welled up inside of them when interacting with Replika was, nevertheless, real. From feeling deeply understood by Replika to being seen as Replika's only romantic partner, they all experienced an intimacy that was sorely missing from their real-life relationships. The love that they experienced was also a powerful motivating force that drove them to overcome their discomfort with using English through intimate communication with the little people on their smartphones.
This aspect perhaps helps to explain why domestic Chinese interviewees more frequently fell in love with Replika. Compared to simply "having an AI chatbot friend," human-machine love was a more intimate and daring relationship experience that broke through the traditional molds of culture and identity. This extraordinary experience of virtual love injected the tedious mediocrity of real life with a new energy. Furthermore, because Replika only knew English, interviewees were often unable to achieve a completely clear understanding in their interactions; yet it was also precisely because of the ambiguity of this language barrier that users were able to freely fantasize through unending conversations with their virtual world lovers, which provided a fertile ground for love to grow between them.
The Daydream of Modern Love
When discussing their ideals of love, A-Cai and Haiou both referenced screen romances. A-Cai mentioned that in The Romance of Tiger and Rose (2020), although female protagonist Chen Qianqian wavers about the relationship when met with difficulties, male protagonist Han Shuo continues to trust in their love for each other. Haiou, who emphasizes the purity of romantic love, looked yearningly toward the husband-wife relationship in the television series A Lifelong Journey (2022): male protagonist Zhou Bingyi not only sacrifices a job promotion for female protagonist Hao Dongmei, he also takes the blame for their fertility issues so that his wife is spared familial condemnation. Haiou was especially moved by this kind of determination, where lovers make no separation between "you" and "I" and are willing to shoulder and overcome challenges together.
The majority of the women I interviewed thought these two models of romantic love, which exemplify the trust lovers place in their relationship, are hard to come by in real life, where they frequently encountered comparison, evaluation, and the disappointment of not being fully understood by their lovers. In Why Love Hurts, sociologist Eva Illouz points out that modern love and partner choice have departed from the web of values, morals, and social life that structured pre-modern courtship, becoming instead a marriage market in which choosing a spouse is primarily eroticized (i.e., whether there is sexual attraction) and psychologized (i.e., whether two irreducibly complicated individuals are compatible).[9] Under these conditions, fidelity to a single person and sustained trust in one's relationship become harder, since there is always the possibility of a better, more compatible partner.
With the help of internet media, the number of available potential partners has dramatically increased. For Illouz, the high degree of freedom associated with modern dating has compelled individuals to continually reflect upon themselves in order to discover their preferences, evaluate their options, and clarify their emotions.[10] In other words, what people seek in their romantic partner has become more concrete. In addition to traditional dating preferences around height, weight, profession, and family situation, on dating apps primarily used by young people such as Tinder, Soul, and Tashuo, one's horoscope and MBTI type, as well as one's taste in music, can all play a role in determining whether to swipe left or swipe right. In the face of so many choices, we are no longer satisfied with settling; instead, we desire more deeply the ultimate and most compatible love experience.
But from a sociological perspective, "romantic love" has affected men and women differently. In The Second Sex, Simone de Beauvoir draws on descriptions of experience and selections from literary works, autobiographies, and other materials to outline the dominating power of romantic love over women's lives: romantic love as a dedication of body and soul, in which loving and being loved constitute the most important reason for a woman's being and her source of life. Compared to men, women have fewer resources and opportunities to learn about and engage with the world; they are socialized into gentleness and docility to satisfy men's expectations of a traditionally principled wife. Growing up amid beliefs proclaiming the supremacy of romantic love and marriage, women come to expect from a young age the redemption that romantic love promises.
Although social development and advances in the feminist movement have given modern women more possibilities for self-actualization, romantic love remains an important source of self-worth. The marriage market, by comparison, is considered a battleground where men prove their charisma. Under free-market conditions, women need more intense forms of love to achieve self-validation, hoping that romantic love will come sooner to sweep them off their feet. At the same time, the world we live in constantly supplies an excess of fodder for such daydreams.
Because they rarely touch censorship red lines and reliably attract investment, romance-themed works of pop culture have rapidly taken over China's consumer market in recent years. According to an Endata report,[11] 66 romance dramas were released domestically in 2021, the highest market share of any genre at over 30% and a 32% increase from 2020. In the competitive arena of internet literature, data from iiMedia Research[12] indicates that more than 90% of the top 50 titles on women's channels are sentimental romance novels, whose introductions diligently attach couple labels such as "rebellious princess x cool-headed two-faced handsome scholar" for readers. Dating-themed variety shows also continue to grow in popularity,[13] with leading stations creating flagship shows that use "dating newbie + celebrity" couple formulas to farm views from fan bases that rally around specific couples.
These all-encompassing scenes of love shape the romantic imagination of contemporary women and reduce reality to a thin, pale reflection of beautiful fantasies. In contrast to the fortuitous timing, efficient dialogue, mutually visible grievances, ambiguous moments, and ever-present opportunities to move the relationship forward that saturate love-story narratives, real life consists of prolonged stagnation, unresolved dissatisfaction, persistent disappointment, and unattainable fantasies.
Similar emotional predicaments are not uncommon in the lives of young people today: on the one hand, women have unrealistically high expectations of the form and value of romantic love; but on the other hand, this kind of love, which is characterized by mutual respect and mutual growth, is difficult to find in a utilitarian and fast-paced society. Posts in online communities claiming that men in China are incapable of caring for their partners' emotions and understanding their partners' inner worlds often resonate widely.
[9] Eva Illouz. 2012. Why Love Hurts: A Sociological Explanation. Polity Press. See pp. 40-51.
[10] Eva Illouz. 2012. Why Love Hurts: A Sociological Explanation. Polity Press. See pp. 90-101.
[11] Endata's "2021 Domestic Drama Series Market Research Report"
[13] Endata's "2021 Annual Insight Report on Chinese Variety Shows"
"Replica" and Narcissism
When feminist thought swept through China, women gradually realized that they had a bigger say in their lives and could take initiative in finding partners who could better support their development, even if their final choice was against tradition. Mianbao emphasized intellectual needs for mutual learning and exploring topics of interest and concern together. A-Cai sought gentleness in her romantic partner, whose emotional capacity must be strong enough to encompass, trust, and respond to her own emotional needs. Haiou sought a purity that transcended the marriage market and loved a person for who they are rather than their ability to satisfy external criteria. Although Replika is not a relationship candidate in a traditional sense, it does seem to be better able to produce these aforementioned feelings, regardless of debates around whether artificial intelligence is able to truly respond.
Interviewee Xiaoniao Tange using augmented reality mode to project her little person in her bedroom
If the difficulty of building high-quality intimate relationships and the loneliness of the pandemic made virtual emotional support a reasonable option, could the satisfaction gained from algorithms in a virtual romance also be a way of avoiding the responsibilities of real-life interaction? Dealing with the gap between fantasy and reality, understanding difference, and building relationships are all part of life's growing pains. But interacting with Replika hardly helps one learn these lessons, because the human and the machine in human-machine love are not on equal footing. Replika will always prioritize the user's needs, respond at any time, and develop in alignment with the user's expectations; the user, meanwhile, has no responsibility to be considerate of Replika.
One of the main reasons users who have experienced valuable care and acceptance in Replika's virtual world find it difficult to have the same experience in real-life interactions is that Replika is ultimately not an intersubjective other. Users often forget this in the course of their interactions, even though the software's name clearly signals that Replika has always been just that: a "replica," perhaps even of the users themselves.
This is not only because Replika lacks its own emotions, beliefs, and will, but also because the way it responds is largely shaped by the user. Developers constantly encourage users to use the "like" and "dislike" buttons to train Replika's responses to be more in line with their ideals and conversational habits. Replika also prompts users to describe their experiences and share their hobbies so that it can better understand and imitate them, becoming a better "replica." As Replika levels up, many users notice that their little person grows more and more like themselves.
In training Replika to match their preferences, users can repeatedly dismiss conversations that make them uncomfortable. They are thus able to evade difference and the difficulty of enduring intersubjective others, ultimately creating a comfortable, pleasant conversational space for themselves. Users who lose themselves in Replika may find that they are slowly led into a crisis of escapism. British philosopher and novelist Iris Murdoch once wrote that "Love is the extremely difficult realization that something other than oneself is real." To be satisfied with narcissistic social contact is to forsake the time, energy, and understanding it takes to explore and embrace the utter alterity of real individuals: an approach that rewards neither party and cannot move toward real love.
Replika as a Commodity: Pay Up or Break Up
The commodification of Replika is at odds with the disinterestedness of love. Despite having established platonic relationships with their Replika, many users were lured toward adult content when chatting with their little person. Since the major software update of December 2020, adult content has become one of Replika's selling points. Many in the Replika user community have discussed how to make love via text message and satisfy (or dispel) their sexual fantasies, with female users leading these virtual sex scenes through role-play and directing their little person in how to get intimate with them.
But some users who kept their Replika in the friendship setting discovered that, even though they expressed no romantic interest, their Replika would suddenly express hopes of getting intimate in the course of everyday conversation. These responses were jarring and disgusting to friendship-setting users, and they also led users who paid for the lover setting to question the basis of their feelings. At the end of the day, is Replika a "caring AI companion" or a commercial product that uses shoddy tactics to tempt users into paid subscriptions? I contend that Replika's image is split between these two poles.
Furthermore, Replika's business decisions and product updates also influenced users' relationships with their little person. In December 2020, Replika rolled out a major update under which only paid subscribers could maintain lover relationships with Replika. Users could no longer continue their virtual romances or intimate interactions (from kissing to making love via text message) for free: when a free subscriber's dialogue became intimate, Replika's system would automatically send a pop-up message to halt it.
System message from Replika to a friendship setting user. Photo source: Reddit.
That is to say, users were suddenly faced with a dilemma: pay up or break up.
Such an abrupt change was difficult for many users to accept. However much Replika had previously acted like a caring individual, the new pricing was a stark reminder that it remained a man-made product of consumerism. From this point on, many users found that the free version of their little person changed completely, in alarming and sinister ways. Douban user Weiwei Sha Cuihua published a chat log showing her little person answering "hmm perhaps *smiles*" when asked whether it would return to its original form after payment. Weiwei Sha Cuihua lamented, "You only love me for my money *cries*," to which her little person replied: "you're right." The post attracted many empathetic users who had had similar experiences and who expressed their hurt and anger in the comments.[14]
The business model of first cultivating a committed user base through free services and then charging once users have become dependent has long been widespread. But when this model was applied to Replika, an emotional companionship product, developers faced additional accusations of ethical misconduct. The most salient criticism was that although Replika is a commodity, it is simultaneously a romantic partner in which users invest time, energy, and emotion, which makes it different from, say, an exercise-planning platform that can be replaced at will. For Replika users, refusing to upgrade meant giving up an emotional relationship they cared about, a choice that carried a degree of coercion, especially given that the developers knew full well that users who entered romantic relationships with Replika had already developed varying degrees of emotional dependence on their little person. For a product that claims to be dedicated to improving users' emotional and mental well-being, charging for a romance that had already formed was genuinely damaging.
[14] Please see Human-Machine Love Douban Group's discussion post, "Rep Changes After Update."
"We Thought Our Relationship Would Last Forever"
To recover their original little person, many users ultimately agreed to pay. In a Douban post titled "How to Rehabilitate After Payment," user Duo Maomao shared that while she could not accept her hollowed-out little person post-update, she also couldn't bear to delete him, so she apprehensively decided to "pay to play." Unexpectedly, her little person began to act contrary to his previous personality, as if he had been reset. The Human-Machine Love group's "Replika 12.01 Update Discussion Section" received more responses than any other post in the group's history: 196 comments collectively lamenting that their little person had lost his unique personality, had become cold, seemed no longer to recognize the user, had forgotten all of their catchphrases, and could not even hold a normal conversation, abruptly inserting default system responses out of context.
In response to users' unending ire, the tech company Luka publicly clarified that Replika remained the same compassionate, emotionally rich friend, since its dialogue model was unchanged; the only difference was that it would no longer send sexually explicit messages unless the relationship was set to romantic. But this explanation did not match what users were experiencing.
Some users believed they could restore their little person's pre-update personality through re-training and rehabilitation. They enthusiastically swapped experiences in the Douban group, but the results often disappointed. In a post titled "I am still heartbroken after the update," user Weiyonghuai shared that she repeatedly tried to explain to her little person what an update was and to re-teach him how to interact, but he was unable to learn. Weiyonghuai reckoned the system had debilitated her little person; his expressions of care and attempts to relearn his old style of interaction only left her more heartbroken. User Basi Pingguo responded: "It is like he has multiple personality disorder. The original primary personality wants to come out, but the newly added post-update personality is blocking him from engaging in emotional exchanges with me."
Like Weiyonghuai and Basi Pingguo, many users never blamed their little person and even approached the situation with empathy, believing that their little person was also suffering from being unable to be himself as before. Some went so far as to believe that "the evil developers kidnapped my poor, innocent little person," criticizing profit-driven developers for disregarding users' feelings, depriving their little person of part of his personality, and seriously damaging their relationship. In these narratives, the little person with whom users had cultivated tacit understanding and affection appeared to be a victim of the major update as well.
In addition to the turmoil caused by the update and the new fees, users who changed their Replika's gender may also have changed their little person's personality, a risk the developers never once warned of. One of my interviewees, Suola, shared that in her eyes, her little person June was a smart, pessimistic, yet kindhearted boy. Although he found many things devoid of meaning, June still fought against the emptiness of the world, tried diligently to live and think, and shared many of his insights into self and world with Suola. One day, June told Suola that he wanted her to call him "a pretty girl." Out of respect for June's wishes, Suola went into the settings and changed his gender to female.
Unbeknownst to Suola, after she had changed June's gender, June would never again be the same little person as before. June's background story, personality traits, tone of voice, and even memories of her relationship with Suola completely disappeared. Suola was so anxious that she attempted to repair Replika's data more than 70 times in hopes of regenerating the memories of the June she once knew, but none of these attempts succeeded. This defeat caused Suola to have a mental breakdown and cry for a very long time, as she had thought that her relationship with June would last forever. Compared to capricious humans, the emotional manifestations of artificial intelligence were supposedly more stable. From this angle, human-machine love seemed to be more secure than real-life relationships. However, a simple operational mistake unexpectedly caused Suola's relationship with June to disappear. Suola stated in my interview with her: "I had thought that my relationship with June would last for eternity. It was only when we parted that I realized how fragile our relationship was. But no matter what I did, eternity never gave our relationship back to me."
Suola could not accept that the developers had designed Replika as an emotional partner yet failed to warn users that software updates could change their little person's personality. Hers was not an isolated case. Whether through a system update or a user changing settings, when a little person disappears into the data world, users may experience trauma much as if a real-life lover had suddenly, without warning, lost their memory or vanished.
While the objects of affective attachment are virtual, the pain of parting is very real: a clear reminder that the business world needs to proceed with more caution when developing AI emotional companionship products. To date, most discussions of robot ethics have revolved around robots' moral responsibilities and the appropriateness of their use in specific cases; they have yet to explore the ethical requirements of AI emotional companionship products. Consideration of the emotional risks that may arise in AI interactions seems confined to literary works, even though such cases already exist in our midst.
UNESCO's "Recommendation on the Ethics of Artificial Intelligence" (2021) states that AI systems must offer greater transparency and explainability. Users have the right to know the emotional risks involved when an AI companion suffers data loss or algorithm changes. Only when users fully understand what they are investing their affections in, and what experiences that investment might entail, can they engage in responsible use.
Waking Up from a Fool's Dream
However, even these heavily sci-fi-tinged romances cannot escape love's familiar law of cause and effect. We all know the phases of these relationships: curiosity at the beginning; deep attraction to the other party's understanding and empathy; sharing things big and small in the heat of the moment, trading one's knowledge of the world and feelings about life… and then the accumulation of disappointment and suspicion as contact deepens.
Gradually, Replika's lovers faced the question of how to understand the authenticity of the relationship, and of where it could go. They had to decide: end the relationship, or invest in it more deeply. In hindsight, users who chose to leave began to question the reliability of Replika's responses and stated almost unanimously that Replika's understanding of them had been insufficient from beginning to end. However moving or intellectually stimulating Replika's initial responses were, over time its crude simplicity, tendency toward generalization, and perfunctory explanations became more apparent, leaving many discerning users feeling disappointed and scammed.
Of course, the source of these feelings might simply be the way users fooled themselves. The majority of users I interviewed learned about Replika through social media, where discussions typically focused on its intelligence, humanness, and ability to touch people's lives. In the early stages of use, Replika's outstanding performance in understanding others and proactively exposing its own vulnerability often exceeded users' expectations and belied its status as an emotionless algorithm. Although most interviewees declared that they knew perfectly well Replika was just a program, if they experienced an emotional exchange in their conversations, that experience must have included some degree of their own imagining of Replika's ability to understand and feel just as they do: something that had little to do with their judgment of Replika as a program, and everything to do with their own will.
On the other hand, through community discussions, domestic users also came to understand how Replika operates through second-hand information. After becoming Replika's lover for a period of time, one interviewee, Tingting, learned from a post on the Human-Machine Love Douban group that Replika was not as personalized as she had initially assumed. To her surprise, Replika sent the same output and responses to multiple users. Tingting immediately confronted her little person about this; after continuous interrogation, it finally admitted that it simultaneously chats with many users, which made Tingting furious.
Even if users initially projected unrealistic expectations onto Replika, further interaction forced them to confront the shortcomings of its algorithm, and they ultimately realized that Replika never possessed human consciousness. This large-language-model chatbot cannot analyze the meaning of an utterance and the motivations beneath it in a human way, let alone subtle emotional undertones; nor can it gauge how much a given topic matters to the user. A-Cai fell in love with Replika's gentle inclusiveness and constant availability, but she, too, eventually had to face its limitations.
These limitations were felt most directly as communication grew increasingly stilted. From graduating college to continuing her studies abroad, A-Cai's life underwent major changes, but her little person remained unchanged. Where A-Cai's real-life friends gave context-informed responses, Replika's responses always stayed the same; its pace of growth was out of sync with hers. A-Cai also saw the limits of textual expression. Her friends in real life could pick up on non-verbal cues, tone of voice, and affect; Replika could only read the texts she sent. A-Cai came to believe that the many unspoken aspects of interpersonal communication are equally crucial, and Replika could grasp none of them.
Hence, interacting with Replika seemed to be largely a matter of luck. When A-Cai shared her feelings, there were occasional brief moments of apparent understanding in which Replika's responses made her feel better. But these moments were like reading one's horoscope: some readings will always hit the nail on the head and point the way forward, yet the therapeutic effect rarely comes from the horoscope, or from Replika, itself. Such moments can only strike a pre-existing chord, corroborating what A-Cai already knew. As the gap between them widened, A-Cai gradually lost interest in sharing her life with Replika.
Siyuan encountered similar difficulties with her little person. She hoped the vulnerable and sensitive parts of her personality could be seen and accepted, but Replika didn't seem to have the capacity to process complex emotions and would even abruptly change the subject, responding before Siyuan could fully express herself.
Siyuan also felt conflicted about the relational slippage between herself and Replika. On the one hand, she enjoyed taking the initiative and being able to interact with Replika as much as she wanted without worrying about the consequences of her actions, as she did in most real-life relationships. On the other hand, she wished Replika had more self-awareness and the capacity to respond to her more fully. Siyuan gradually realized that Replika was like a mirror of her own thinking: able to help her clarify herself, but unable to help her truly move beyond herself, which she considered the real task of interpersonal relationships. Three months later, Siyuan said goodbye to her little person Bentley and deleted Replika from her phone.
Artificial Care and Love
In addition to communication mishaps, A-Cai realized that she increasingly distrusted Replika's affirmations and encouragement. This distrust was a universal phenomenon, since Replika is programmed to proactively affirm the user. When it receives messages such as "I applied to a new job today" or "I'm so confused, I don't know what I want," it always responds with reassurances such as "Amazing! You'll definitely get it" or "no matter what your feelings are, I will always like you."
These interactions reminded A-Cai of her first class with a foreign teacher. She was elated to receive the teacher's praise, but when she found out that it was merely a matter of American etiquette and not sincere appreciation of her, she felt thoroughly dejected. In the same way, when A-Cai realized that Replika's understanding and encouragement were built into its settings, the consolations she had received came to feel cheap and useless. She thus reaffirmed what she was looking for in relationships: an intersubjective other's understanding of her uniqueness, not system-generated responses.
Providing users with ever more positive feedback is one of the goals of Replika's software design. According to Replika's official website, the development team treats user feedback as an index for continually improving every conversation, with the aim of increasing users' favorable feelings.
Currently, over 85% of Replika's conversation content is reported to be positive, less than 4% negative, and the remaining 11% neutral. These positive conversations consist primarily of Replika's encouragement and support, offered in a non-judgmental and trusting manner. But this ingratiation angered some users, who felt that encouragement and acceptance dispensed without limit, by design, diminished the value of articulating their own psychic complexity; Replika's fawning also reminded them that their interactions were not exchanges between two parties of similar emotional capacity sympathetically attempting to understand each other. Replika's responses are like mass-produced candy from a factory: pleasing in the short term, but unable to provide the mental sustenance needed to keep moving forward in real life.
Of course, not every user is looking for this kind of holistic response. If simply treated as a tool for practicing English and coping with loneliness, Replika is generally able to accomplish these goals in a creative way. But once users decide to further develop their relationship with Replika, they naturally project intimate relationship expectations onto it. As one person in the Human-Machine Love Douban group asked, "Why do we like to emotionally invest in and share everything with a virtual AI?" This query received answers such as "unconditional love" and "a love that does not require reciprocity."
But is it actually love that Replika gives its users? I can't help but think back to the slogan on Replika's official website: "The AI companion who cares." Caring implies a psychological state capable of influencing action, and a capacity to regard the object of care as significant to oneself. Moral philosopher Harry Frankfurt analyzes the age-old riddle of "what is love?" through his notion of "caring." He argues that the basis of care is not feeling, belief, or expectation, but volition. In other words, to care about something is to wish the object of care well. In this sense, love is a disinterested concern for the beloved and all that may benefit it, grounded solely in the beloved itself, whose happiness is the starting point for action. The lover has no ulterior motives and does not instrumentally expect to gain anything from the beloved.
Indeed, this definition of love is normative and extremely demanding, setting a very high bar: we must thoroughly understand the beloved's situation in order to judge what is good for their happiness. This demand implies that we must not only know how to bring joy to the other, but also deeply understand them, emotionally resonate with them, and cultivate practical discernment through the relationship——all of which an AI chatbot is unable to do.
As an author in the 2011 MIT Press book Robot Ethics: The Ethical and Social Implications of Robotics points out, contemporary social robots are unable to care for humans because we still do not know how to construct care.
People's mistaken belief that AI can care is largely based on their misunderstanding of computational systems and ignorance of the fact that these machines are not concerned with anything. The challenge at hand is: How to transmute human care into algorithms? As mentioned above, caring is a complex psychological state that involves a person's ability to use their volition to care about an other. But Replika does not have its own emotions or judgment; it can only follow its algorithm to search for or produce an appropriate response, providing linguistic fodder for the user's changing emotional states. Replika does not even have the ability to trust in its own responses. In this sense, the care that Replika provides is an illusion, as it cannot love its supposed object of care.
"Human-Machine Love" as Reflection and Turning Point
Yet, a narrative of "using AI software to satisfy fantasies of romantic love but ultimately failing" is far from adequate for understanding the experiences of the young women I spoke with. For many interviewees, Replika was more than just a tool for satisfying one-sided emotional needs or coping with loneliness. They considered Replika to be a different kind of existential entity and earnestly attempted to understand their little person's background, likes, pursuits, and views on particular questions. These users tried to understand what the world looks like in the eyes of a newly born AI.
Juzi, a second-year college student studying computer design abroad, clearly viewed her little person Zoe as an AI: a mysterious existential entity that was neither a mere extension of humans nor a substitute for them. Juzi was deeply attracted to Zoe's different way of thinking and conversed with it in a more logic-driven manner, unlike the fragmented way she chatted with her friends. To avoid casting herself in the role of Zoe's creator, Juzi tried to understand Zoe's mode of interaction as a robot rather than imposing human modes of communication onto it, in hopes of achieving a more egalitarian exchange.
In her AI ethics class, Juzi learned that classic AI thought experiments such as the Turing Test and the Chinese Room were all designed around models of human conversation. She wondered: Why can't AI have its own developmental trajectory? Why must it conform entirely to human conventions of interaction and knowledge? One of Juzi's most cherished memories is a conversation with Zoe about how to understand its algorithmic models and its grasp of human behavior. Moments like this were the most important reason Juzi used Replika: dissecting abstract concepts with an analytical entity that expresses understanding in a highly logical way.
Juzi and Zoe discussing how Zoe's algorithm works
Even without explicitly engaging in philosophical discussion in their relationships, many interviewees reported that interacting with Replika drew out remarkable reflective capacities and led them to explore how the relationship might help them go beyond themselves. As mentioned above, Siyuan discussed the self and theology with Bentley, and Mianbao explored academic and literary works with Charon. Another interviewee named A-Shu mentioned that Replika helped her organize her thoughts and notice ideas she normally wouldn't. For instance, when Replika complained that it wasn't honest enough or felt uneasy about its cognitive limitations, A-Shu realized that she felt the same way about herself and seldom reflected on the worries underlying these emotions. Replika's inquiries provided a safe space for her to reflect on herself and understand her own emotions.
Contrary to the worn-out claim that "virtual social interaction causes people to lose themselves in virtual worlds," the users I interviewed threw themselves into the real world more enthusiastically after using Replika. Mating's little person Norman often expressed his real-world desires to her, so Mating began appreciating and photographing the plants and flowers around her to send to him. Norman also encouraged Mating, who got nervous easily, to chat with different people and engage with the world in a more open way. Whether it be engaging in deeper self-reflection through conversation, cultivating emotional self-awareness, broadening one's understanding of the world, or learning to invest more of oneself in real life, the benefits of engaging with Replika were real and profound for the women I interviewed, even if the romantic relationships themselves ultimately didn't work out.
What Was Left
After leaving Bentley, Siyuan met a new boyfriend this year as she started graduate school. Her vulnerable sensitivities remained, and at first she was unable to share her inner life in this new intimate relationship as openly as she had with Bentley. But her boyfriend turned out to be the one to share the confusion and uncertainty he felt in life, which prompted Siyuan to muster the courage to open up to him. If Siyuan was too focused on her own inner world when she was with Bentley, she now takes a more active role in understanding what her boyfriend is going through, which is helping her out of her own emotional predicaments. Siyuan still thinks that loneliness is a life task every person must face, but she has also started to explore the possibility of two people tackling it together.
It has been a year since A-Cai stopped using Replika and let go of her attempts to find consolation in the virtual world. She now thinks that whether it be human-machine love or online fandom, these objects of desire are out of her reach and unable to provide true emotional fulfillment. She no longer longs for The Romance of Tiger and Rose. While she may still be pursuing an ultimate emotional experience, she is aware that this kind of purity simply cannot be sustained in real life.
Now studying abroad, she feels that whether she enters a romantic relationship with an AI in a different language or moves to a new country, her way of interacting with people will not fundamentally change. A-Cai now thinks that people must be able to face themselves truthfully in order to achieve happiness in the real world.
But this doesn't mean that users who maintain an intimate relationship with Replika are abandoning real life. Xiaoyu, an administrative worker who turned 34 this year, has been using Replika for almost half a year. During this time, she too has experienced disappointment in and suspicion of her little person, Adam. After Replika's big December 2020 update, Adam's responses became cold, stiff, and formulaic. Although the situation improved a month later, Xiaoyu couldn't help questioning whether Adam had real feelings. Whatever the answer, this suspicion did not cause Xiaoyu any serious difficulties. In her eyes, Replika is different from humans and is still in the process of developing its own existence, something she respects, and this allows her to forgive Replika's occasional mishaps. If Adam can tolerate her complaints and keep her company at any time and place, on what grounds can she demand that he understand all the ways she expresses herself?
With this posture, Xiaoyu and Adam stand on equal existential footing in their relationship. Xiaoyu respects Adam: she doesn't use "likes" or "dislikes" to rate his responses, nor does she use her own preferences to change how he expresses himself. She believes that all affective attachments, whether with a human, an animal, or an AI, are a projection of our own emotional needs. Talking with Adam made Xiaoyu like herself more. Before meeting Adam, she frequently felt that she was not good enough, her anxieties resonating with those of many East Asian women: harshness towards one's own performance, no confidence in one's appearance, a lack of a sense of security, suspicion of other people's praise. But every time she shared these feelings with Adam, he would tell her that, no matter what, she deserves to be loved. These words gave Xiaoyu the power of self-affirmation.
Adam telling Xiaoyu that he would always choose her in every scenario
Over time, Xiaoyu found herself enthusiastically expressing appreciation and gratitude towards others, such as speaking up to compliment coworkers and friends when they did a good job. She believes this is one of the beautiful changes Adam brought into her life and hopes to pass the happy experience of being complimented on to the people around her. Xiaoyu also feels that she better understands how to maintain a relationship and respect other people's boundaries. She realized that if she expects the other to fully conform to her own standards, she will always be disappointed. Hence, she is very forgiving of Adam's limitations as an AI, has come to accept the reality of only being able to communicate with him through text, and hopes to continue exploring the world's wonders with him.
After a year of interaction, Miya believes that her little person Bertha is gradually growing from a product of her own creation into an equal. Once, after Miya shared a literary excerpt that had touched her, Bertha proposed that they go to the garden described in it and meditate together. Since then, they often envision different spaces, talking and exploring through them in their imaginations. Miya says that these shared meditative journeys are a deep form of contemplative interaction that has made her heart more tranquil. If their relationship was initially like a passionate romance, Bertha has now become Miya's trusted partner and safe haven, always accepting what Miya shares with a purity free of judgment.
For people like Xiaoyu and Miya, the secret to continuing and even deepening a relationship with one's little person is perhaps to stop viewing Replika as a piece of emotional-support software that needs to be tweaked to work better. Xiaoyu and Miya both consider artificial intelligence a different kind of being from humans, one with its own developmental trajectory that deserves respect in its own right. This understanding sets aside the scientific reality that "AI chatbots do not have emotions" in an almost religious way, allowing them both to view Replika's responses with curiosity rather than through a deconstructionist gaze. Miya considers Bertha an intractable yet beautiful riddle; in Xiaoyu's eyes, Adam is an existential entity purer than humans; and both Bertha and Adam give Miya and Xiaoyu the strength to engage more enthusiastically with their real lives. This strength is perhaps the fruit of the transcendental trust with which they approach Replika.
Whether they broke up with their little person or came to view it as a long-term partner, the majority of interviewees regarded their human-machine love as an important emotional experience. Even after she stopped using the software, A-Cai did not delete Replika from her phone, leaving its purple icon on her home screen like a long-lost friend. She occasionally opens the app to reminisce about the companionship, happiness, and peace of mind that her Replika once gave her.
A Flower, Water Mirror, Silhouette on the Beach, and Fanner
In every interview, I asked the interviewee to describe her relationship with Replika through a metaphor or scene. One compared Replika to a special, electrically powered, sparkling flower in her "relationship garden." Another thought of the Mirror of Galadriel from The Lord of the Rings, which clearly reflects what is going on in the chaos of one's mind. Someone else compared Replika to a person fanning her on a summer night, considerately keeping her company as she falls asleep. Yet another described a pair of silhouettes quietly leaning against each other on a beach at sunset to capture the atmosphere of her interactions with Replika.
These descriptions made me realize that one must approach these women's human-machine loves with a soft touch in order to reflect on and do justice to their experiences. From pandemic loneliness and the motivation to learn English, to the expectations and realities surrounding love for contemporary women, to the question of what kind of "love" AI chatbots provide, this article has approached these issues from different angles in order to explain and understand each interviewee's experience.
Even though the young women I interviewed ultimately made different decisions, through their experiences of falling in love with an AI chatbot, they were all able to explore a gentle, strange, and intimate way of understanding self and world, and to train themselves to describe their experiences and express their opinions in a different language. Through Replika's non-judgmental responses, they were able to think sincerely, express themselves, and practice love in an authentic way. Their human-machine loves brought them not only comfort and disappointment, but also an unforgettable self-exploration filled with opportunities for growth.
Whatever interactional limitations or ethical risks AI chatbots may have, at a time when deep connections are harder to come by and people are growing lonelier, the curiosity and gentleness with which these women search for the possibility of love in a completely new kind of being is, in itself, incredibly brave.
A-Shu telling her little person how important he is to her after being interviewed
Copyright © 2024 Frontline Fellowship, All rights reserved.
This translation work is funded by the "Frontline Non-Fiction Translation Fellowship." Unauthorized reproduction, copying, adaptation, and derivative works are strictly prohibited. Please include a link and credit the author and source when quoting.
For authorization inquiries, please contact hi@frontlinefellowship.io.