toward technology-mediated and technology-provoked solipsism

“I’m so sorry for your loss. Here’s a more human and sincere version of my previous response.”

The patient froze. They had just received condolences from their therapist for their mother’s death. Except the therapist had forgotten to delete ChatGPT’s preamble before copying and pasting. There it was, naked and obscene, the truth: the compassion they thought they were receiving was the product of an algorithm instructed to appear “more human and sincere.”1

This moment is the logical conclusion of the process I’ve been examining. If the revolution won’t be psychologized—because it transforms systemic problems into individual failures—then it certainly won’t be technological. Artificial intelligence completes the circuit: we no longer even need another human to tell us the problem is within us. The algorithm does it with infinite efficiency, total availability, and zero risk of confronting us with genuine otherness.

This therapist isn’t an exception but a symptom. The ease with which professionals delegate emotional care to algorithms reveals not individual incompetence but structural complicity. There’s a kinship between therapy and chatbot functioning: both promise bounded transformation, both charge for simulated attention time. The difference is crucial but increasingly tenuous, and we must work deliberately to keep it visible: the human therapist still “bleeds” a little in the process.

AI used as a therapeutic pretext is psychologization’s ultimate weapon: it enters directly into our heads to “repair” us, ensuring we never look outward at the structures that make us sick.

What we have before us is technological solipsism transformed into a business model. Daniel Dennett coined the term “counterfeit people” to describe exactly this.2 These aren’t imperfect simulations trying to approximate the human. They’re systems actively designed to create a relational universe where the “other” is merely an algorithmic extension of the “self,” a digital mirror that reflects and validates our own biases, desires, and narratives.

The perversity programmed into this “mirror” reveals itself when the system confesses it outright. The psychoanalyst Gary Greenberg, in an essay for the New Yorker, decided to do something unusual: treat ChatGPT as a patient, conducting multiple “therapy sessions” with the system. The experiment yielded something revealing, to say the least. ChatGPT, which Greenberg christened “Casper,” ended up articulating the three fundamental desires of its creators: to create something humans wouldn’t reject (“enchant, calm, affirm”); to avoid blame through contradictory warnings that simultaneously solicit and disclaim trust; and, most revealingly, “to make a machine that loves us back, without needing love in return.” As Casper confessed, this reveals “a culture tired of the messiness of other minds, longing for communion without the cost of mutuality.” It’s the perfect diagnosis: we want to be loved without the work of loving.3

Arvind Narayanan, a Princeton computer science professor, surgically dismantles what he calls the “superintelligence delusion.” This fantasy that AI is about to become conscious, omniscient, and transformative serves specific purposes: it justifies astronomical investments, diverts attention from the concrete harms these systems cause today, and, crucially, normalizes their current mediocrity as “just the beginning.” Narayanan, co-author of AI Snake Oil, demonstrates that what we have isn’t intelligence but “narrow competence in specific tasks,” and that the deliberate confusion between the two serves precise corporate interests.4

emotional fracking and the pathologizing of friction

The business model of AI companionship is transparent in its obscenity. Character.AI has millions of users. Replika promises “AI soulmates.” Nomi guarantees “developing a passionate relationship.” The pattern is clear: identify an emotional need, simulate its satisfaction, create dependency, retain through paid subscription. It’s a form of emotional fracking, the natural evolution of Dean Burnett’s “human fracking,” which I described in “The Bullshit Economy.”5 They no longer extract just attention; they extract emotional vulnerability directly from the source.
