Peter—who asked to have his last name omitted from this story for privacy reasons—is far from alone. A growing number of people are using AI chatbots as “trip sitters”—a phrase that traditionally refers to a sober person tasked with monitoring someone who’s under the influence of a psychedelic—and sharing their experiences online. It’s a potent mix of two cultural trends: using AI for therapy and using psychedelics to alleviate mental-health problems. But it’s a potentially dangerous psychological cocktail, according to experts. While it’s far cheaper than in-person psychedelic therapy, it can go badly awry.
A potent mix
Throngs of people have turned to AI chatbots in recent years as surrogates for human therapists, citing the high costs, accessibility barriers, and stigma associated with traditional counseling services. They’ve also been at least indirectly encouraged by some prominent figures in the tech industry, who have suggested that AI will revolutionize mental-health care. “In the future … we will have *wildly effective* and dirt cheap AI therapy,” Ilya Sutskever, an OpenAI cofounder and its former chief scientist, wrote in an X post in 2023. “Will lead to a radical improvement in people’s experience of life.”
Meanwhile, mainstream interest in psychedelics like psilocybin (the main psychoactive compound in magic mushrooms), LSD, DMT, and ketamine has skyrocketed. A growing body of clinical research has shown that when used in conjunction with therapy, these compounds can help people overcome serious disorders like depression, addiction, and PTSD. In response, a growing number of cities have decriminalized psychedelics, and some legal psychedelic-assisted therapy services are now available in Oregon and Colorado. Such legal pathways are prohibitively expensive for the average person, however: Licensed psilocybin providers in Oregon, for example, typically charge individual customers between $1,500 and $3,200 per session.
It seems almost inevitable that these two trends—both of which are hailed by their most devoted advocates as near-panaceas for virtually all of society’s ills—would coincide.
There are now numerous reports on Reddit of people, like Peter, who are opening up to AI chatbots about their feelings while tripping. These accounts often describe such experiences in mystical language. “Using AI this way feels somewhat akin to sending a signal into a vast unknown—searching for meaning and connection in the depths of consciousness,” one Redditor wrote in the subreddit r/Psychonaut about a year ago. “While it doesn’t replace the human touch or the empathetic presence of a traditional [trip] sitter, it offers a unique kind of companionship that’s always available, regardless of time or place.” Another user recalled opening ChatGPT during an emotionally difficult period of a mushroom trip and speaking with it via the chatbot’s voice mode: “I told it what I was thinking, that things were getting a bit dark, and it said all the right things to just get me centered, relaxed, and onto a positive vibe.”
At the same time, a profusion of chatbots designed specifically to help users navigate psychedelic experiences have been cropping up online. TripSitAI, for example, “is focused on harm reduction, providing invaluable support during difficult or overwhelming moments, and assisting in the integration of insights gained from your journey,” according to its builder. “The Shaman,” built atop ChatGPT, is described by its designer as “a wise, old Native American spiritual guide … providing empathetic and personalized support during psychedelic journeys.”
Therapy without therapists
Experts are mostly in agreement: Replacing human therapists with unregulated AI bots during psychedelic experiences is a bad idea.
Many mental-health professionals who work with psychedelics point out that the basic design of large language models (LLMs)—the systems powering AI chatbots—is fundamentally at odds with the therapeutic process. Knowing when to talk and when to keep silent, for example, is a key skill. In a clinic or a therapist’s office, someone who has just swallowed psilocybin will typically put on headphones (listening to a playlist not unlike the one ChatGPT curated for Peter) and an eye mask, producing an experience that’s directed, by design, almost entirely inward. The therapist sits nearby, offering a supportive touch or voice when necessary.