With AI tools such as ChatGPT becoming more universal and infiltrating various domains, it is important to understand the role they can play in the therapeutic process. Many people are wondering if ChatGPT can replace their therapist. While ChatGPT can certainly serve as a powerful tool for clients, it is unable to replace therapists and the human element, which is crucial for positive treatment outcomes.
While it is easy to see how the ease of access can be appealing, there are many reasons why ChatGPT and other conversational devices cannot replace your therapist. According to MIT psychologist and sociologist Dr. Sherry Turkle, this can be explained by what she has coined “the new AI”: artificial intimacy. Artificial intimacy is what these conversational devices often provide by mimicking humans and offering only positive reinforcement. One of the reasons ChatGPT can be so harmful to clients is that it simply validates whatever the client enters into the chat. For example, if a client enters a list of symptoms and asks, “Does this mean I have X?”, the device is designed to validate what the client is saying despite its inability to actually diagnose or evaluate. This has been shown to sometimes exacerbate symptoms such as anxiety and depression.
This form of simulation removes opportunities for clients to be challenged and does not promote critical thinking skills. It can also impede a person’s ability to reframe negative thought patterns. It is important for clients to understand that AI is designed to please us. This is known as the “Sycophant Machine Effect.” It can be dangerous if clients turn to such devices for help, as their design can lead to the validation of harmful thinking. A licensed professional, on the other hand, is able to identify harmful thinking, challenge it, and better support the client in a time of need.
As Dr. Turkle said, “simulated thinking may be thinking, but simulated feeling is never feeling, simulated love is never love.” To this point, it is important to understand that AI does not feel. AI is not capable of empathy or putting itself in our shoes, as it lacks the life experiences of humans, such as love or loss. Dr. Turkle has brought attention to the performative nature of AI and conversational devices: while ChatGPT can simulate and perform empathy through words, it does not actually feel it. Empathy is one of the most essential parts of the therapeutic relationship between a client and their therapist. It helps build the very trust and safety necessary for clients to feel comfortable being vulnerable in sessions, enhances connection, and promotes openness and self-exploration.
While digital culture makes ‘pretend empathy’ seem sufficient, it never can, and never will, replace human empathy. According to Brené Brown, there are four critical pieces involved in true empathy: perspective taking, staying out of judgment, recognizing emotion in another person, and communicating that understanding. These four ingredients help build empathy, which is the necessary component for human connection. Because ChatGPT cannot feel empathy, it is easy for people to hide behind the technology rather than dive into vulnerability. This can seem appealing, especially for those who are afraid of being vulnerable; however, the result is often an empty feeling with no true connection or understanding. It is important to remember that vulnerability is the only way to breed the very connection that people are often seeking through therapy.
Another critical component of the therapeutic process which allows clients to feel safe disclosing information is the protection of private health information and confidentiality. The Health Insurance Portability and Accountability Act (HIPAA) is a federal law which protects private and sensitive health information for clients in therapy. This means that information cannot be released to any entity without the signed consent of the client. This is the same law which protects your information when you go to a doctor. Professional ethics codes also require therapists to further protect their clients by ensuring confidentiality, using informed consent, and using HIPAA-compliant software to both provide therapy services and secure client data.
According to the cybersecurity company McAfee, ChatGPT stores whatever is inputted by users into its “Data Bank,” which creates a high risk of that information becoming available in the public domain. McAfee also reminds us that, as part of ChatGPT’s privacy policy, it collects detailed information about its users, such as IP addresses, browser types, and user behavior, including the type of content users engage with. The privacy policy also states that they “may share users’ personal information with unspecified parties, without informing them to meet their business operation needs.” This means that any information shared with ChatGPT is more susceptible to potential data breaches and cybersecurity threats.
In short, while ChatGPT can be used as a tool, these limitations should be considered:
- Information shared is not protected - no ethical codes or HIPAA apply
- Data is at higher risk of data breaches and cybersecurity threats
- ChatGPT cannot diagnose
- It cannot provide reliable treatment information
- Unable to account for the individuality of a person’s circumstances or treatment
- Simulated empathy is not empathy and can often leave you feeling more disconnected
With this being said, there are ways for clients to use ChatGPT mindfully and helpfully. Rather than viewing ChatGPT as a replacement for therapy, it is useful to view it as a supplemental tool. It can help clients track their moods and habits between sessions, as well as reinforce skills they have learned in therapy. Using such tools in addition to traditional human-led therapy can serve as a powerful combination. It is recommended to discuss your AI tools and results with your therapist, so they can help guide and interpret what you are receiving.
Ways to use ChatGPT as a supplemental tool during your therapy process are:
- Organizing thoughts or a brain dump to further process in therapy
- Providing prompts for journaling
- Seeking general guidance about coping strategies or techniques
- Learning about different forms of self-care
- Scripting difficult conversations to role play in therapy
- Providing psychoeducation to learn about diagnoses provided by a clinician
- Learning about treatments recommended by a licensed clinician
How to use ChatGPT mindfully:
- Do not input or share private or confidential information
- Discuss AI results with your clinician
- Use as an additional tool, not as a replacement for therapy
- Engage in self-reflection when receiving feedback from AI
- Be aware of privacy limitations and how data is stored
- Reflect on whether the feedback you’re receiving is making you feel worse
- Do not solely rely on AI - Seek human connection
To summarize, AI tools such as ChatGPT can be a powerful resource for mental health education and can serve as an effective organizational tool. While not everyone is a fan of AI, those who would like to implement it into their therapeutic journey can certainly benefit from its practical format. Using ChatGPT to seek guidance on journaling, meditation, self-care, or grounding techniques can be an asset, especially between therapy sessions or when your therapist is unavailable. Used responsibly, ChatGPT can be a powerful contribution to your mental health toolkit. Remember, if you are struggling with your mental health, you should not rely on ChatGPT as your main source of support, as it cannot replace a licensed therapist.