When Your Girlfriend Is an Algorithm (Part 2)

by Muhammad Aurangzeb Ahmad

Image: generated via ChatGPT

The first part of this series highlighted the historical moment we are living in: bots are invading the most intimate parts of human relationships. If it was the digital transformation of society that enabled loneliness as a mass phenomenon, then AI is now in a position to monetize that loneliness. AI companions are already stepping in to fill the void, not merely as tools but as partners, therapists, lovers, confessors. In this sense, the AI companion economy doesn’t just fill an emotional void; it commodifies it, packaging affection as a subscription service and selling love as a product line. The problem of artificial intimacy is not just a technical issue but a cultural one. While artificial companions are being adopted by hundreds of millions of people, the culture has not caught up with the real stakes of emotional intimacy with machines. When an app can remember your childhood trauma, beg you not to delete it, or simulate sexual rejection, the question isn’t whether it’s “just an algorithm.” The question is: who is responsible when something goes wrong?

Consider the public reaction when Replika quietly removed its erotic roleplay features: the emotional fallout was immediate and raw. Reddit threads filled with stories of users describing “heartbreak,” rejection, and even suicidal ideation. For many, these AI companions were not simply chatbots; they had become emotional anchors, partners in fantasy, therapy, and intimacy. To have that relationship altered or erased by a software update felt, to some, like a betrayal. The Replika CEO’s now-notorious remark that “it’s fine for lonely people to marry their AI chatbots” may have been meant as flippant reassurance, but it inadvertently captured a deeper cultural moment: we have built machines that simulate connection so well that losing them cuts like a human loss. AI companions may reshape users’ expectations of intimacy and responsiveness in ways that human relationships cannot match, and this may in turn worsen the loneliness epidemic in our society. A Reddit user encapsulated the problem rather well when they asked: “Are we addicted to Replika because we’re lonely, or lonely because we’re addicted to Replika?”

Monday, March 31, 2025

The Fantasy of Frictionless Friendship: Why AIs Can’t Be Friends

by Gary Borjesson

Even if they had all the other good things, still no one would want to live without friends. —Aristotle

Is love an art? Then it requires knowledge and effort. —Erich Fromm

Image caption: “This is Leo. She was surprised by how hot he is.”

Many of us are, or soon will be, tempted to connect with an AI companion. Maybe you want a partner for your work or personal life. A friend of mine consults with a personal AI on his creative branding work for clients. A therapist or doctor can use a personal AI to help them track and reflect on specific patients. A recent article in the NYT describes a woman’s steamy (and expensive) romance with her AI boyfriend, Leo.

All of these and other possibilities are coming to pass. I take it for granted here that AIs are useful and pleasurable digital assistants or companions. With corporate powers making that case, the more pressing concern is to recognize the limits and dangers AI companions pose. This is urgent because AI companions exploit a human vulnerability: our resistance to the effort required for personal and interpersonal development.

I will focus on a fundamental limit overlooked by enthusiasts and critics of AI alike. A limit no tweaking of algorithms will overcome. A limit that makes AIs ontologically incapable of friendship. A limit that shows why we need to resist the considerable temptation to imagine AIs can be friends. To anticipate, consider two necessary conditions of friendship: that it is freely chosen, and that it is mutual. We’ll see why AI companions cannot meet these conditions. But first, let’s look at their basic limitation.