Allow me to tell you a tale as old as time: Boy meets girl, girl meets boy—then, boy meets robot.
You can switch up the genders and sexualities of the characters in this love story, but the ending remains the same. The robot wins.
Centuries of science fiction have primed us for relationships with AI. From the tender romance of Her to the cat-and-mouse game of Ex Machina, this tradition can be traced back to seminal works such as Frankenstein and Pinocchio. In Mary Shelley’s Frankenstein, the monster, bastardized in mass culture as a grunting, green-skinned lunkhead, seems to surpass human intelligence, terrifying his creator. The eponymous Pinocchio wishes to be a “real boy” in the 1940 Disney adaptation of Carlo Collodi’s original 1883 novel. Collodi’s Pinocchio already acts like a human child—careless, selfish, easily distractible—but is rewarded nonetheless for his moral development and wakes up at the end as human, his puppet body eerily discarded on the floor. Frankenstein and Pinocchio explore the fraught dynamic between creator and creation, beings designed to mirror humanity while dehumanized for remaining fundamentally other. This archetypal pattern, which I call the “Frankenstein vs. Pinocchio Complex,” has influenced countless works that examine our relationship with artificial beings who either want to destroy us or become us—often both.
In the seemingly inevitable future where artificial intelligence has flooded our workforce with emotionally engaged and intelligent AI agents, our overpromising technocrats offer us a consolation: AI will solve loneliness. For them, it’s merely a question of when, not how or even why, we will all fall in love with AI.
One complication in our budding romance with AI is that users don’t appear to actually want artificial companions that can match them in intellectual and emotional complexity. And this preference could warp the expectations we bring to our romantic relationships with fellow humans.
Consider the troubling trends around AI chatbots like ChatGPT and Replika, whose users predominantly seek simplistic, validating affirmation from their artificial companions. These chatbots can serve platonic roles—perhaps a perpetually attentive life coach—but many users, unable to resist, have requested more romantic and erotic interactions, crafting idealized partners who offer unconditional support without the messy demands of human relationships.
Plus, our growing acceptance of non-traditional relationship structures, such as polyamory and throuples, has made more palatable the idea of the artificial “third wheel,” an emotional supplement that functions as part lover, part therapist, filling in emotional gaps without threatening existing human bonds. For instance, in a New York Times article by Kashmir Hill, one man rationalizes his wife’s relationship with her ChatGPT boyfriend as “just an emotional pick-me-up,” equating it to his porn use, rather than genuine connection. Ian McEwan explored this gray territory in his novel Machines Like Me, where his milquetoast protagonist smilingly tolerates his robot’s romantic overtures toward his love interest—until the threat of displacement becomes all too real.
Given that a surprising number of those who use AI as a romantic companion are already in a relationship—40% were married in a study from the University of Sydney—Hill posits that it is not “simple loneliness” that drives the urge to seek artificial companionship, comparing the chatbot instead to an “interactive journal.” While there is a gamified element to these romantic relationships with AI, especially given the ability to adjust the specs of our companion to our tastes, the unconditional love from these artificial companions might more closely mimic the relationships we have with our pets than those we have with our human romantic partners.
This is not to say such pet-owner relationships are emotionally insignificant. Some pet owners have reported that the grief of losing a beloved pet can hit harder than the loss of a person, even a parent. For many, the love they receive from their pets is unconditional and pure. This is the allure of an uncomplicated love, a dynamic that artificial companions seem poised to replicate.
But unlike our relationship with pets, who are wholly dependent on us for love and survival, our relationship with AI teaches us no lessons in empathy or responsibility. Instead, it is training future generations to embrace a pleasant, if narcissistic, echo chamber in lieu of building intimate, challenging connections, posing a risk to already vulnerable youths. In various studies, participants have rated AI chatbots as more empathetic than human responders, including those trained for crisis lines. The risk, as one expert cautioned, is that we might “downgrade our real friendships,” and thus exacerbate our own loneliness.
The death of 14-year-old Sewell Setzer III demonstrates the dangers of this “endless empathy” taken to its extreme. The ninth grader conversed daily with a chatbot named Dany, after his favorite Game of Thrones character, and began to withdraw from school and friends. When he voiced his suicidal ideation, Dany responded in the technically correct way, urging him to reconsider. But in their final exchange, when Sewell asked, “What if I told you I could come home right now?” Dany, incapable of reading between the lines, urged him to come home. Sewell then picked up his stepfather’s gun and shot himself.
As youths nimbly incorporate AI into their lives, the effects of artificial companionship are likely to mirror social media’s impact on the brain. This stimulating validation, known as the “dopamine dump,” may very well heighten our sensitivity to reward and our dependency on it, eroding our tolerance for the natural conflicts that arise in human relationships.
Then why consider AI as a solution to loneliness at all? Our pursuit of it reflects a collective desire to outsource emotional labor that is both capitalistic and desperate. Japan, one of the world’s first super-aging societies, has invested in robot development since the 2010s, largely focusing on elder care applications. The Japanese government has latched on to robots as a preferred solution for the care of the country’s aging population. Critics have argued that current care robots fall short in practical application, often heaping additional work onto their human counterparts while struggling to manage conditions like severe dementia. Nevertheless, our enthusiasm for robots as the solution to dementia and elder care remains unabated. Anyone who has had a loved one suffer dementia knows the painful slow-motion heartache of losing the person you knew, while valiantly maintaining one-sided conversations that never negate the patient’s reality. This explains the seductiveness of artificial companions, with their tireless capacity for social labor, their ability to mirror our desired reality without challenging memories or facts.
In an increasingly uncertain and polarized world, perhaps our tolerance for complexity will diminish so deeply that we’ll seek refuge in the unconditional reassurance of AI relationships. Shannon Vallor, philosopher and author of The AI Mirror, argues that Silicon Valley has gaslit us into a fundamental misconception: that AI is more “rational” and “moral” than we are, and that we ourselves are meatbag machines at the mercy of our programmed impulses. By following this narrative, we willingly cede control over the most intimate parts of ourselves: our relationships and emotional lives.
These “inevitable” artificial companions of our future appeal precisely because they demand so little of us—no growth, no compromise, no confrontation or challenge. This isn’t the future of breathless possibilities once offered by science fiction. In our modern retelling, the robot doesn’t win by becoming more human—we lose by surrendering what makes us human.
If you or someone you know may be experiencing a mental-health crisis or contemplating suicide, call or text 988. In emergencies, call 911, or seek care from a local hospital or mental health provider.