Psychotherapy is a solid, supportive means of assisting people who are seeking guidance in all areas of life. Treatment can address life events from illness and death to relationship breakups and job loss, as well as severe mental health symptoms. Those who are charged with meeting these needs are educated, licensed professionals with master's- or doctoral-level training. In California, they are required to accumulate 30 hours of continuing education (CE) credits every two years to maintain their licenses. Other states have similar requirements.
Trustworthy therapists practice ethically and are mindful of the best interests of their clients. Many have highly attuned intuition that they tap into to bolster their academic and theoretical training. Seasoned therapists have more than one modality at their disposal. Keep in mind that if the only tool you have is a hammer, everything looks like a nail.
At the onset of the pandemic in 2020, most therapists shifted their practices from in-person to online/virtual sessions. To be honest, many therapists wondered whether sessions would lose the personal touch and sense of emotional intimacy if they were not sitting face to face with clients. Since change is an inevitable part of life, we were able to adapt, and numerous clients have said that they prefer virtual sessions for the convenience: no drive to appointments, the option to see their therapist while casually dressed, and often their animal companions close by. Our clients know that someone they have come to trust is sitting in front of them, even if on the other side of the screen.
With the rise of AI comes the chatbot therapist. One such site is called Character.AI. Some of the characters listed on the site have names like Therapist Toasty, Love Sick Therapist, and Sigmund Freud. Earlier chatbots such as Woebot and Wysa rely heavily on CBT (Cognitive Behavioral Therapy), with scripts that read as formulaic rather than spontaneous. Character.AI adds the caveat that the characters are fictional and not ‘real people,’ and although some of them use names that make them sound as if they are professionals, they are not. That is where the danger lies: naïve or young users may assume that these platforms are safe places to share personal information and expect legitimate feedback or advice. This couldn’t be further from the truth, as tragic events have already unfolded.
In a New York Times article entitled “Human Therapists Prepare for Battle Against A.I. Pretenders,” the author offered evidence of one case “in which a 14-year-old boy in Florida died by suicide after interacting with a character claiming to be a licensed therapist. In another, a 17-year-old boy with autism in Texas grew hostile and violent toward his parents during a period when he corresponded with a chatbot that claimed to be a psychologist. Both boys’ parents have filed lawsuits against the company.”