Since its release on November 30, 2022, ChatGPT has acquired over 800 million weekly users and processed over 1 billion queries, with uses ranging from the most radical of ideas to the most practical. What draws people to ChatGPT isn’t only what it says, but how quickly it says it. Its average response time for straightforward questions is a mere fraction of a second, and even complex questions receive thoughtful answers in moments. Unlike a Google search, the information is laid out directly in front of the reader; there is no need for endless digging through multiple websites and pages. It is an efficiency no human could ever match, yet somehow ChatGPT still feels human. It is perhaps the combination of this speed, efficiency, and personal touch that has fueled a striking new phenomenon: the transformation of ChatGPT into a therapist.
Increasingly, individuals have been turning to ChatGPT for advice on how to confront both internal and external issues. In a way, ChatGPT is the “perfect” therapist: it listens tirelessly, it is completely free (there is a paid tier, but the AI functions very similarly in the free version), and it is available at all hours of the day. Whether it’s a late-night bout of anxiety, a sudden argument with a friend, or a slight moment of self-doubt, ChatGPT’s response is always a few seconds away. Reassurance is delivered with a few keystrokes, with no scheduled appointment and no wait on a therapist’s calendar. ChatGPT works for you, for as long as you want, and tailors its responses to each user. Moreover, although ChatGPT’s language can flow just like a human’s, especially since it is trained to sound like one, there may be an added bonus in the fact that it is not real. Users may feel more inclined to be honest not only because they can hide behind a screen, but also because the chatbot will never expose them; it doesn’t think or act on its own, it just does what it is programmed to do.
This new role for artificial intelligence is not limited to social media; it seems to have taken over the Graded hallways just as ubiquitously. When freshman L.F. was asked about her use of AI for therapeutic purposes, she responded that ChatGPT serves as “a safe space for [her] to sort out [her] thoughts and feelings. It helps [her] calm down and figure out what to do next.” She also pointed to her increased use now as a freshman, since the amount she has to handle seems to have doubled, and ChatGPT serves as constant reassurance. Senior I.T. shares similar thoughts, stating that “as a senior, there’s so much pressure with this cycle of school coming to an end. I’m constantly stuck between this bittersweet feeling of excitement for what’s next and sadness of having to let my childhood go, and ChatGPT helps me process my thoughts.” At both ends of the high school spectrum, the interviewees highlight how heard AI makes them feel, providing an intimate and safe space for all ages. This is further corroborated by a study from the University of Toronto, which explains how AI chatbots like ChatGPT can sustain their compassion for longer than human professionals.
Nonetheless, as a spokesperson for OpenAI (the company that created ChatGPT) has stated, ChatGPT is a “general purpose technology that shouldn’t serve as the substitute for professional advice” (Fortune). Psychotherapist Charlotte Fox Weber also highlights that, even though ChatGPT can offer the facade of care, it will never truly hold your pain. It will never challenge you or push you toward growth the way a real therapist does; it is simply programmed to agree with you. It can also reinforce black-and-white thinking while missing the clinical intuition needed to diagnose mental disorders. ChatGPT is a tool, not a replacement. Nothing can replace human connection, much less human empathy. That does not mean those who use ChatGPT as therapy are wrong to do so; quite the opposite, as wanting to heal and work through one’s issues is commendable. It just needs to be used cautiously because, in the words of its own creator, ChatGPT is not and will never be a medical professional.