The ascent of the AI therapist


AI therapists could flatten humanity into patterns of prediction, sacrificing the intimate, individualized care expected of traditional human therapists. “The logic of PAI results in a future where we may all find ourselves patients in an algorithmic asylum administered by digital wardens,” Oberhaus writes. “Within the algorithmic asylum there isn’t any need for bars on the window or white padded rooms because there isn’t any possibility of escape. The asylum is already everywhere—in your homes and offices, schools and hospitals, courtrooms and barracks. Wherever there’s an internet connection, the asylum is waiting.”


Chatbot Therapy:
A Critical Evaluation of
AI Mental Health Treatment

Eoin Fullam

ROUTLEDGE, 2025

Eoin Fullam, a researcher who studies the intersection of technology and mental health, echoes some of the same concerns in Chatbot Therapy: A Critical Evaluation of AI Mental Health Treatment. A heady academic primer, the book analyzes the assumptions underlying the automated treatments offered by AI chatbots and the way capitalist incentives could corrupt such tools.

Fullam observes that the capitalist mentality behind new technologies “often results in questionable, illegitimate, and illegal business practices in which the customers’ interests are secondary to strategies of market dominance.”

That doesn’t mean that therapy-bot makers “will inevitably conduct nefarious activities contrary to the users’ interests in the pursuit of market dominance,” Fullam writes.

But he notes that the success of AI therapy depends on the inseparable impulses to generate profit and to heal people. In this logic, exploitation and therapy feed each other: Every digital therapy session generates data, and that data fuels the system that profits as unpaid users seek care. The more effective the therapy seems, the more the cycle entrenches itself, making it harder to distinguish between care and commodification. “The more the users benefit from the app in terms of its therapeutic or any other mental health intervention,” he writes, “the more they undergo exploitation.”


This sense of an economic and psychological ouroboros—the snake that eats its own tail—serves as a central metaphor in Sike, the debut novel from Fred Lunzer, a writer with a research background in AI.

Described as a “story of boy meets girl meets AI psychotherapist,” Sike follows Adrian, a young Londoner who makes a living ghostwriting rap lyrics, in his romance with Maquie, a business professional with a knack for spotting lucrative technologies in the beta phase.

Sike
Fred Lunzer

CELADON BOOKS, 2025

The title refers to a splashy commercial AI therapist called Sike, uploaded into smart glasses, that Adrian uses to interrogate his myriad anxieties. “When I signed up to Sike, we set up my dashboard, a large black panel like an airplane’s cockpit that showed my daily ‘vitals,’” Adrian narrates. “Sike can analyze the way you walk, the way you make eye contact, the things you talk about, the things you wear, how often you piss, shit, laugh, cry, kiss, lie, whine, and cough.”
