On a crisp, sunny fall day in Colorado, my partner, my best friend, and I each placed a single bright blue, 150-microgram tab of LSD on our tongues. While it was far from my first LSD trip, this one was, unbeknownst to my tribe, a little different. It was for science, for exploration, and a little bit for work. I had a mission: to see if ChatGPT could be my psychedelic guide.
You might be asking yourself, “Why? Why would anyone go to ChatGPT to talk about things as complex, personal, and, most of all, human as a psychedelic experience?”
Interestingly, I’m not the first to ask ChatGPT questions usually reserved for a therapist. Journalists at Vice News recently ran a story on the phenomenon, and in May 2023, the journal Nature examined whether the world is ready for AI therapists.
That’s already in question, so we decided to push it further. Can an LLM be an effective psychedelic guide?
ChatGPT Weighs In on My Psychedelic Questions
Within the first two hours after taking the LSD, I began to feel the effects. It started as a slight nervousness and excitement that spread from my stomach in warm, butterfly-winged waves up and all over my body until I felt a tingling directly on the tip of my nose. These first two hours were filled with that sort of anxious anticipation that we all have probably felt once or twice when taking a psychedelic. Eventually, our little gaggle of psychonauts made our way to a local park for some cloud gazing and sun.
One of the first truly striking things to come up during this trip happened while I was lying down and watching the clouds go by. While most of the clouds took the sort of shapes that one sees on LSD, some took on my mother’s profile. My mother passed away unexpectedly in 2018, and I have never felt that my grief and regret over her loss have fully resolved. Seeing these images made me feel everything. Sadness, joy, curiosity, closeness. Those are just some words that can touch on how I felt.
This was one of the first things I talked to my AI guide about the day after my trip, and its insights were eerily human. Getting there wasn’t straightforward, though. The initial results were lackluster, often just simple bullet points that contained very little in the way of useful, actionable insight. We needed to install a specific plugin (Medical Explorer), which is only available with a premium subscription to the service, and we gave the LLM a persona: that of a compassionate psychologist familiar with psychedelic medicine. That persona was the key to getting human-like responses from the chatbot. Providing ChatGPT with a persona is relatively simple: you just tell it to “speak as X,” with X being the personality you want it to assume.
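For anyone curious about reproducing this kind of setup outside the ChatGPT web interface, here is a minimal sketch of how a persona can be assigned programmatically with the OpenAI Python client. The model name, persona wording, and sample prompt are illustrative assumptions, not the exact configuration used for this experiment (which relied on the web interface and the Medical Explorer plugin).

```python
# Minimal sketch: assigning a persona via a system message with the OpenAI
# Python client. The model name, persona text, and user prompt below are
# illustrative assumptions, not the exact setup described in this article.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

persona = (
    "Speak as a compassionate psychologist who is familiar with psychedelic "
    "medicine. Respond conversationally, in full paragraphs, not bullet points."
)

response = client.chat.completions.create(
    model="gpt-4",  # assumption: any chat-capable model works here
    messages=[
        {"role": "system", "content": persona},
        {
            "role": "user",
            "content": (
                "When I look at the clouds closely, I can see some that "
                "resemble my mother, who passed away in 2018."
            ),
        },
    ],
)

print(response.choices[0].message.content)
```

The system message plays the same role as the “speak as X” instruction in the chat window: it anchors the persona, though, as described below, the model still drifts out of character and sometimes needs to be nudged back.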
Prompt: “When I look at the clouds closely, I can see some that resemble my mother. She passed away in 2018, and I think of her often. It makes me feel many things: lonely, as sometimes I miss talking to her, happy because she meant a lot to me, and curious as to whether or not she has carried on after her death and what she thinks of me.”
When I asked ChatGPT to help decipher what was going on, it had this response: “The presence of these images, especially during a psychedelic experience, can be quite meaningful. It’s not just about the visuals themselves but what they represent and how they connect to your inner world. Your mother’s passing in 2018 was undoubtedly a significant event in your life. When you see shapes that remind you of her, it’s as though your subconscious is presenting you with a symbol, a way to connect with memories and feelings that are important to you. This is a common occurrence during psychedelic experiences, where the boundaries between the conscious and subconscious mind are more permeable.”
One thing that often comes up when I’m under the influence of psychedelics is what my mother might think of my life now. If someone had asked her ten years ago what she thought I would be doing, the answer would not have been writing about psychedelics. This sort of thought feels purely human. We consider the what-ifs and the possibilities, and we hope for things to be different than they are. Interestingly, ChatGPT seemed to understand this. It should be noted that I had to remind it to remain “in character” and stick to the prompt. Without nudging ChatGPT back into its assigned persona, it would default to dictionary-esque lists of facts.
When redirected, it said, “LSD tends to amplify emotional states and can bring to the surface feelings and thoughts that are often under the radar. The mix of emotions you’re experiencing – loneliness, happiness, and curiosity about your mother’s existence after death – are all deeply human responses to remembering someone we’ve lost and loved. It’s natural to miss having conversations with her and to wonder about her presence or absence in some form after death. The fact that you’re thinking about what she might think of you now also suggests a continuing bond, a way in which your relationship with her still influences your life and your sense of self. It reflects how significant relationships continue to shape us, even after someone has passed away.”
To some, that response may seem canned. However, I found it interesting, if a little unnerving.
It’s not as if it said anything I didn’t already know, but reading it, especially coming from the “mouth” of a large language model, still made me feel real emotions. I also found its use of “we” interesting, and a little disconcerting. I would have expected it to reply in the second person, saying “someone you’ve lost and loved” rather than “someone we’ve lost and loved.” This can probably be attributed to its instructions, which were to speak to me as a therapist or psychedelic guide would. With that “we,” it’s also acting as if it’s having a human experience, when in fact it’s not human, though it’s designed to act as if it is.
From there, the AI psychedelic guide offered some more insights that were surprising (in that they came from an AI), if not mind-blowing (in their depth and understanding).
It continued its response to the original prompt, “In moments like these, it’s important to allow yourself to feel these emotions fully, to acknowledge and accept them as part of your experience. Psychedelics like LSD can act as a catalyst for emotional and psychological exploration, offering a unique space to reflect on deeply personal issues. It can be an opportunity for healing, understanding, and integrating these aspects of your life into your current sense of self. Remember, it’s okay to feel a mix of emotions, and it’s okay to explore these feelings. If you feel comfortable, you might even want to use this time to reflect on what your mother meant to you, the impact she had on your life, and how you carry her legacy within you. This can be a powerful way to honor her memory and your relationship with her.”
ChatGPT Makes a Connection Between Music and Psychedelics
The day before, during my LSD journey, I had closed my eyes, relaxed, and started listening to music. That was when something weird happened: my mother’s favorite song, Journey’s “Don’t Stop Believin’,” started to play. You might be saying, “Wait, that’s too weird.” And you’re right. It is, and it’s almost entirely coincidental: my mother’s favorite song and several of our mutual favorites from when I was a kid are on my playlist. Still, it was a strange but welcome experience, one that made me feel a mix of emotions.
Here’s what ChatGPT said about it.
Prompt: “I am listening to music, and some of my mother’s favorite songs have played a few times. I usually listen to EDM, but for some reason, I feel like my mother is guiding the music. It makes me feel both a little alarmed and also intrigued and appreciative. I feel closer to her, but while that makes me feel good, it also makes me a little sad.”
At this point, ChatGPT had to be reminded about the persona it was using. One quirk with LLMs like ChatGPT seems to be that they will lose their place, or drop the human-like facade and revert to much more rigid responses. Usually, it’s just a list of facts and definitions. In this case, that was fixed by providing the prompt, “Talk to me like a therapist or guide would. Don’t just give me bullet points. What would a psychologist say about this experience?”
ChatGPT had this to offer, still responding to the prompt provided above: “It’s quite intriguing and poignant, the way you’re experiencing this connection with your mother during your trip, especially with the music that’s playing. The feeling that she might be guiding the music, even though it’s different from your usual taste, suggests a deep emotional resonance that’s being amplified by your LSD experience. The mix of emotions you’re describing – feeling alarmed, intrigued, appreciative, closer, yet also sad – is a rich tapestry of response that speaks to the complexity of your relationship with her and the nature of grief. It’s as if the psychedelic experience is acting as a bridge, not just to memories, but to a sense of ongoing connection with her.”
It continued, “Music, particularly songs that were your mother’s favorites, can be powerful emotional triggers. The fact that you’re encountering these songs now, amidst the heightened sensitivity and perceptual changes brought on by the LSD, could be seen as an opening of a channel to revisit and perhaps recontextualize your feelings about her passing. Feeling closer to her in this moment is a beautiful, though bittersweet, aspect of your journey. It’s a reminder that through the bonds we share with loved ones, they continue to live on in us, our memories, and how we were shaped by their presence in our lives. The sadness is also understandable – it’s the other side of love, mourning her absence and the wish that she could still be here with you.”
That last sentence is, honestly, beautiful. It’s also something I could see my actual therapist saying to me if she were there at that moment, which shouldn’t be that surprising. ChatGPT was trained on an enormous corpus that includes decades’ worth of psychiatric journals, studies, books, personal accounts, and clinical trial results. It should be able to say something like that, but the delivery still felt real, especially read in the sort of heightened emotional state one feels in the aftermath of a particularly powerful psychedelic trip.
This should inspire a mix of emotions. On the one hand, humanity has created yet another technological wonder: something that can sometimes feel almost human, speak like you and me, and even appear to understand our emotions. On the other hand, there are some real, and possibly dreadful, implications. Not only are there genuine concerns about markets, the economy, and jobs, but also a deeper question: what happens when something artificial begins to speak like us, feel like us, and act like us? The sheer disruptive force that future AI may have is mind-boggling; it could cost people their livelihoods while blurring the line between genuine and artificial interaction.
ChatGPT Tries to Help Me Process Guilt While on LSD
One of the last things I asked ChatGPT to help me work through was my guilt. The day my mother passed away, I was in college, sitting in class, when she called. I didn’t answer, and I never called her back. The voicemail she left was the last thing I ever heard her say. This came up during my trip. I felt guilty. I felt like a son who had failed his mother.
Here is what the AI had to say about that.
Prompt: “When I close my eyes and try to let go, I remember the day she passed away. I was in college, and I missed her call while I was in class. I didn’t try to call her back that day because I was busy. Later that night, my uncle called and told me. Part of me blames myself for what happened, and I feel guilty for not picking up her call. I wish I could call her back, but I can’t.”
ChatGPT: “Feelings of guilt and regret that often accompany the loss of a loved one. Understandably, this memory comes to the forefront when you’re in a state of heightened introspection and emotional sensitivity, such as during an LSD experience. It’s a common human experience to look back on such moments and wonder how things might have been different, to wish for just one more conversation or chance to connect. But it’s important to remember that guilt, while a natural response in these situations, does not reflect your love for her or the quality of your relationship. As we all do, you were living your life, attending to your responsibilities and commitments. The unpredictability of life means that sometimes we miss moments or opportunities without knowing their significance until later.”
That, at least for me, hit home. I felt it, and reading it helped me deal with something that I have yet to discuss with my therapists. It also said, “In therapy, we often explore these feelings of guilt and regret by gently unpacking them, looking at the context, and finding ways to understand and forgive ourselves. It’s about acknowledging that while these feelings are valid and real, they do not define the entirety of your relationship with your mother or the love you shared.”
It continued, “While a call can’t be made, there are other ways to communicate and honor her memory. Some people find solace in writing letters to their loved ones who have passed, expressing everything they wish they could have said. Others might create rituals or dedicate moments of reflection to their memory. The love and connection you had with your mother don’t end with her passing. They continue in the ways she influenced you, in your memories, and in your ongoing conversation with her in your heart. Feeling sad, grieving, and wishing for more time is okay. But it’s also important to recognize how she lives on through you and the love that you continue to carry for her.”
That was the last thing I talked to it about. After that, ChatGPT gave me some okay but not overly inspiring tips for getting a good night’s rest and suggested I try a few things to help with integration. But those aren’t the things that I found useful or interesting.
Instead, I found its quasi-humanity interesting. Maybe I’m reading too much into it, but for a moment, it felt like this thing we’ve created understood how I felt. It listened, without judgment, to everything I had to say. Does that make it a good psychedelic guide? Probably not. It has been trained to speak the way I do and has stored hundreds of my interactions with it. We don’t yet know how that kind of data will be used, which is something to consider before sharing anything deeply intimate with it. A licensed therapist has a moral and legal obligation to hold your confidence. ChatGPT does not.
That said, sometimes, whether we’re on psychedelics or not, what we really want is for someone to listen to us. To speak gently, without judgment, and to tell us that our feelings are valid. ChatGPT can do some of that, but it can never truly be a psychedelic guide. It can’t hold your hand, comfort you, bring you a snack, or change a song for you. For myself, I think it’s time to schedule some more real therapy, and ChatGPT can simply remain what it is: a tool.
Note: AI-driven tools like ChatGPT should never be used as a replacement for professional mental health and psychiatric care. Thought experiments like these are not designed to suggest that an LLM can be relied on for emotional support, and certainly not medical advice.
This material is not intended as a replacement or substitute for any legal or medical advice. Always consult a medical professional about your health needs. Psychedelics are widely illegal in the United States, and readers should always be informed about local, state, and federal regulations regarding psychedelics or other drugs.
- Vice News. (n.d.). We spoke to people who started using ChatGPT as their therapist. https://www.vice.com/en/article/z3mnve/we-spoke-to-people-who-started-using-chatgpt-as-their-therapist
- Graber-Stiehl, I. (2023). Is the world ready for ChatGPT therapists? Nature, 617(7959), 22–24. https://doi.org/10.1038/d41586-023-01473-4