Many people find ChatGPT and other LLMs to be helpful, particularly at early stages of therapy. For issues that feel uncomfortable or embarrassing, or for mild symptoms, these tools can be a good place to start getting some support. They can provide a great deal of information about different mental health issues, and they offer the user affirmation and validation. They will never tell someone that they are wrong, and they will almost always sound “understanding.” They will not, however, hear your tone of voice change and your body tense up when you talk about something difficult, or notice that when you talk about your child your whole body relaxes and your eyes focus into the distance. These details matter, because you are a whole person: considerably more complicated than the words that come out of your mouth or the letters you type on a keyboard. A skilled therapist knows when to let there be silence, when to challenge a little, and when to get curious about something you said or did.
Another issue concerns confidentiality and the risk of harm to self or others. LLMs make no promises about your privacy or the security of your interactions, whereas licensed therapists take great pains to ensure that your disclosures remain private. At the same time, licensed therapists provide a much-needed safety function by knowing when confidentiality must be overridden: a therapist can identify when a client shows risk factors for suicide or harm to others, and can address risks to vulnerable populations such as children or seniors. In this way, psychologists, clinical therapists, and other mental health workers provide a safeguard for all of society that LLMs have so far not replicated.