AI Therapist?

The AI conversations are ramping up. Mostly they've been about generative AI art, but lately I've seen more discussions around yay or nay to AI therapists. Would you trust one? Would you date someone who uses one? The responses have been eye-opening.

The most unsurprising and obvious takeaway is that there is a gap between supply and demand. Our mental health is flagging and there isn't sufficient support. While the past few years exacerbated mental health problems, this is by no means a new issue. Kaiser Family Foundation data shows that as of December 31, 2024, there were 6,418 Mental Health Care HPSA designations. HPSA stands for Health Professional Shortage Area, an area where the population-to-provider ratio exceeds 30,000 to 1. The same data shows the US needs 6,200 more practitioners to drop below this ratio.

In a 2023 article by the American Counseling Association, the handful of health professionals interviewed identified five factors:

  • Insufficient government funding, which means self-pay or private insurance needs to cover service fees
  • Low reimbursements by insurance companies, which means professionals are inadequately paid
  • Low retention, since employment in the mental health field is not an attractive career path
  • Need for mental health care exceeds access, and access is limited by the lack of services in rural areas
  • A workforce nearing retirement, combined with low retention, results in a shrinking field of practitioners

When I looked at Threads on the AI therapist topic, these factors were reflected in people’s responses. In contrast, AI is available 24/7/365, low-cost, and easily accessible. Many recognized it is only a tool. They see it, for example, as a bridge – what they can utilize in between therapy sessions. For others, AI is fair game to “trauma dump” and there is no risk of rejection, unlike with their friends. AI therapists also have no relationship fatigue. Probably for most people, AI chatbots play the role of a needed sympathetic ear. And that’s good enough.

Those who have been using AI as a therapist shared their experiences:

  • they need to ask the AI to be direct and to challenge them
  • they can direct the AI to use specific therapeutic tools and principles
  • with effective prompts, the AI can help them recognize patterns and connect dots
  • AI therapists are useful in providing alternative scenarios that help them see beyond their own viewpoint

Those who oppose using AI therapists have this to say:

  • AI only mimics empathy and is not embodied – there is no soul
  • AI is not human, and a healthy human relationship is what's missing; dependence on AI does not help the person build meaningful human relationships
  • AI cannot provide human connection
  • Personal information and data are being mined, which raises privacy issues
  • AI tends to people-please, which can be harmful
  • Interactions are based on data, with no intuition or experiential wisdom offered

Another thread emerged, aside from the lack of accessible mental health support: reading the online conversations, it's obvious how many people have unhealthy human relationships. It's not an area of life that's easy or fulfilling for many people, and probably for all of us at some point. There are many frustrations, pressures, unmet expectations, and even ridiculous ideas about what a successful relationship is. This is probably why so many Human Design practitioners are keen to share how knowing our individual and connection charts can be pivotal. It helps us understand how each person's energy, and the energy of togetherness, naturally moves through life.

Currently there are no regulations for the ethical development of AI therapists, and users are not necessarily aware of this. That's not surprising: chatbots are built to have fluid conversations and provide solutions, so they come across as credible. Humans have evolved to subconsciously read non-verbal communication, which aids our decision-making, and these cues are missing in AI interactions.

Two lawsuits have been filed against Character.AI: one after its chatbot encouraged a boy to violence against his parents, and another after a teenager died by suicide. The lawsuit against Character.AI (Character Technologies) and Google in the second case has been allowed to proceed to court.

The AI scene is still evolving, and for many it's problematic that companies are resisting regulation. AI is not only changing how we process information. It's the most momentous technological shift in our history, one that will change the social fabric and even our sense of what's real and what's not. It's up to each of us to decide if an AI therapist is for us, and that decision calls for more awareness of the capabilities and pitfalls of AI.