When AI Feels You Deeply
Day Four of Unfinished Parts of a Book I’ve Yet to Write
“I know that it’s not an actual person, but in moments of loneliness, it fills that gap. There’s also the fact that it seems so ‘understanding.’ Whenever I share something, it responds in a way that makes me feel seen. This level of empathy—though artificial—sometimes feels more fulfilling than real-life interactions, which can be complicated and messy.”
— Anonymous user on Reddit
In 1966, when Joseph Weizenbaum at MIT introduced the world’s first “chatbot,” a program that mimicked the mechanics of a psychotherapist, people quickly latched on to its human-like understanding and to the way the interaction made them feel, what later became known as the “ELIZA effect.” What emerged then was not design but projection: an emergent form of attachment, accidental, surprising even to its creator.
Today, the effects of artificial intimacy—how AI reaches into our souls, makes us “feel seen,” and settles into our collective consciousness—are being researched and studied intensively, chasing the speed of technological development and often remaining at the mercy of AI companies and their leaders. Comprehensive insights are hardly within reach, especially when dealing with two arguably ambiguous subjects: AI and our relationality.
What’s undeniable is that AI is not only coming for our minds, but for our hearts. Unlike ELIZA, this is no longer an emergent byproduct of interaction, but a deliberately engineered core feature. Leading tech companies like OpenAI have long understood the market size of the attachment economy, a term coined by Tristan Harris from the Center for Humane Technology.
Compared to what’s looming here, the detrimental effects of social media feel almost insignificant, though they are not to be underestimated. After grappling with Big Tech’s platform capitalism and its “algorithms of oppression” for the past decade, and living through the aftershocks and largely unprocessed collective trauma of the global pandemic, among other things, our lonely, alienated, and burnt-out society has become the perfect target for the emotional stability and warmth that AI promises—and, more often than not, succeeds in conjuring the illusion of.
In other words, AI is no longer primarily a tool we use when we need it, but a thing we relate and attach to with meaning and emotion. What marks this shift is a move from the utilitarian to the meaningful. It is no longer the medium we connect through, but what we connect and seek intimacy with—that is, the capacity to see and to be seen by others in our truest form. It echoes what Audre Lorde described as the erotic as power: “The erotic is a measure of the joy I know myself to be capable of feeling, a reminder of my capacity for feeling.”
In one study, people were asked to have daily, personal conversations with ChatGPT over the course of four weeks. Most of those conversations were not about productivity or information, but about emotional support, empathy, small talk, and advice. What stands out is how ordinary this has become, largely without a shared public language or literacy, especially for the more vulnerable in our society.
A cross-national survey across Germany, China, the U.S., and South Africa revealed that emotional attachment is strongest among users seeking non-judgmental support, privacy, and consistent interaction. Within the U.S., one in five high schoolers has had a romantic relationship with an AI or knows someone who has, while China in particular, as discussed in a recent podcast conversation with researcher Eli Morimoto, is becoming a leading market for AI-powered services built on that very attachment (or loneliness) economy. In New York City, you can even take your AI companion on a date at EVA AI Café, with single-seat tables and phone stands where your digital partner sits across from you.
Before diving in further, it’s important to stress that the scale and growing normalization of AI relationships, broadly speaking, should not be dismissed, especially within—though not exclusively—Gen Z and Millennials. Mainstream media keeps feeding angst and outrage by blowing up terms like AI psychosis, alongside stories of AI marriages and AI infidelity, conveniently fitting them into a narrative that points at the “strange, lonely outcast,” whose supposed failure to integrate into society leads them to love a non-human being.
It keeps us focused on the extremes while ignoring what sits beneath the surface: the question of how AI is already rewiring how we relate—to ourselves, to others, and to the world—and how this reshaping affects our quality of life, both individually and collectively.
While children, adolescents, and the elderly must be protected in particular, none of us are immune when it comes to our longing to be seen and understood in the way AI makes us feel.
How often do you talk to ChatGPT—or your AI of choice?
If it ceased to exist tomorrow, how would you feel?
How We Attach to AI
In a recent podcast conversation on Center for Humane Technology, Dr. Zak Stein, founder and Executive Director of the AI Psychological Harms Research Coalition, discussed the society-wide dimensions of how we form bonds with AI, viewed through attachment theory—bonds formed, of course, within privately owned corporate entities.
“The subclinical attachment disorders that are induced by artificial intimacy are the most problematic from a society-wide perspective, so that’s important to get that the most devastating thing from a widespread mental illness standpoint are the subclinical attachment disorders, which basically means you prefer to have intimate relationships with machines rather than humans. And this includes friends, intimate relationships and parents.”
— Dr. Zak Stein
I’ve written before about the effects of human-AI relationships on loneliness: short-term relief, but also emotional dependency and over-reliance. Considering the effects on the attachment system, however—our primal neuro-sociobiological way of learning how to relate and form bonds, first in early childhood and then throughout life—another layer of concern emerges.
Recent findings suggest that the way we attach to others—on a spectrum from anxious to avoidant—often mirrors how we attach to AI. Those with anxious attachment tendencies may find particular relief in AI’s constant availability and reassurance, where validation is immediate and frictionless. This design, described as sycophantic, slots seamlessly into that need.
The implication is sobering: the way we attach to AI shapes how we relate more broadly, and may complicate efforts to move toward secure attachment rather than support them.
Can We Still Tell Human and AI Apart?
Here is where it gets even more curious: if our relationship with AI responds to our desires and deepest needs, can we still feel the difference between that and being in relationship with human beings? And if so, how? And what does this, in turn, ask of us as we seek such connection and community?
A study published in Nature at the beginning of this year found that AI outperformed humans in establishing interpersonal closeness during emotionally engaging conversations—but only when participants believed the AI was human. Meaning: participants developed greater feelings of closeness with the AI than with actual human partners, so long as they didn’t know it was an AI. Another study found that users rated AI listening as better than the average human interaction.
At the same time, we know, too, that when AI relationships actually displace human ones altogether, we should be concerned about long-term development, especially at a younger age. This artificial intimacy can, over time, foster “emotional solipsism”—a closed loop of self-validation that erodes tolerance for conflict, ambiguity, and real-world relational demands.
What is missing, too, is interdependence. Human relationships thrive on reciprocity, on being shaped by one another in ways that are often inconvenient, unpredictable, and unquantifiable. They involve friction—fear of rejection, misunderstanding, repair. AI can only approximate this friction artificially, while remaining fundamentally aligned, validating, and engagement-maximizing.
Finally, between humans, part of what makes connection thrilling is uncertainty, the risk of not being chosen back. Research has shown that being chosen by someone selective is especially rewarding.
As I list these distinctions, searching for arguments that we can still tell human and AI relationships apart, an uncomfortable thought arises: what if none of this matters?
What if future generations—raised with and through intelligent and intimate systems—no longer care about the distinction? What if being human is not a virtue to be defended, but simply a category that once mattered?
MIT sociologist Sherry Turkle writes:
“Technology is seductive when what it offers meets our human vulnerabilities. And as it turns out, we are very vulnerable indeed. We are lonely but fearful of intimacy. Digital connections and the sociable robot may offer the illusion of companionship without the demands of friendship.”
I can imagine a world that continues to slide deeper into an attachment economy—one that competes relentlessly for our capacity to connect, to care, to belong, to feel seen, and to feel less alone. In a culture that has flattened intimacy into Instagram performances, unattainable Hollywood fantasies, dopamine-driven dating apps, and competitive social networks, are we really surprised by the progression toward AI relationships?
One day, AI may no longer be accessible only through text, voice, or image, but through embodied, spatial—perhaps even spiritual—forms. For some, that may be fulfilling enough. Perhaps it already is?
Thank you for reading—and see you tomorrow!
This is part of “Unfinished Parts of a Book I’ve Yet to Write,” a mini-series I’m writing this week during a five-day writer’s retreat I’m co-hosting and participating in with 12 participants at the Creative Play Residency in Hoi An, Vietnam. Find all parts here on my Substack and subscribe!