The Future of Coaches and Psychologists in a World of AI

Understanding the role of AI in coaching and psychological support.

Key points

  • AI enhances coaching efficiency and scalability but lacks the human touch needed for empathy and connection.

  • Ethical considerations are crucial; relying solely on AI in coaching undermines person-centered practice.

  • A hybrid approach, combining AI strengths with human intuition, offers the best potential for performance.


In a spacious, sunlit office in central London, Claire Mitchell was at her wit's end. As a high-powered executive in a leading tech firm and a former competitive athlete, she had always thrived under pressure. She was used to pushing her limits and performing at high levels. But, lately, the demands of her job and the complexities of her personal life had begun to overwhelm her. Her usual strategies for managing stress and maintaining high performance were failing her. She needed help.

Claire had heard of AI coaching tools and apps. With a background in technology, she was curious about their potential. Could an AI chatbot, armed with algorithms and vast data, offer the insights and guidance she desperately needed? Skeptical but willing to try anything (and this would be cheaper than a human coach), Claire downloaded the latest AI coaching app and scheduled her first session with "Sarah," the virtual coach.

The initial interactions were promising. Sarah's responses were swift and data-driven, providing Claire with practical tips and structured plans. However, something crucial was missing. The advice, though technically accurate, felt hollow and impersonal. There was no warmth, no understanding of the subtleties of Claire's emotions and experiences. The responses lacked the depth that comes from genuine human connection and empathy.

The Human Element in Coaching

To be clear, AI is not a bad option; doing nothing is a bad option when anxiety builds. For some, however, AI coaching can feel like navigating an automated call center, with frustrating twists and turns that lead nowhere. Some people may benefit from AI, but the majority still prefer human interaction: a PwC survey found that 80 percent of people would rather deal with a human than with AI or automated chatbot responses.

AI is great at quickly providing information, but it lacks the ability to truly understand and respond to the complexities of human emotion and behavior. True growth happens in moments of reflection and deeper introspection, areas where human coaches excel. Interestingly, the best coaches often don't offer much direct advice because they recognize that solutions often lie within the individual. Instead, they listen deeply, observe body language, and pick up on subtle nonverbal cues that no algorithm can detect. Coaches and psychologists create a safe space for reflection and incubation of ideas, allowing individuals like Claire to explore their thoughts and feelings without judgment in a format that best suits them.

On the plus side, a study published in PLoS One found that AI can perform similarly to human coaches in helping people achieve specific goals. While AI can offer structured advice based on data, it falls short in providing the meaningful understanding and empathy that human coaches bring, especially when managing complex, challenging goals whose goalposts shift over time.

Psychological Support and Trust

There has been an increase in research investigating the benefits of eye-tracking software and video gesture mapping for detecting emotion (Lim et al., 2020). These technologies, designed to analyze and interpret nonverbal cues, can match facial movement to human emotion reasonably well, but that’s where the sophistication stops. Can AI detect anxiety? Yes. Can it then use that information to shape a conversational plan that helps an individual manage emotion? No. That doesn’t mean AI won’t eventually be able to provide psychological support, but we aren’t there yet. And if we do reach that juncture, we will have to examine another question: Do we trust robots to help coach us through big decisions?

A study published in Government Information Quarterly suggests that public trust in AI supporting complex areas is low. We may trust Alexa, Google, or Siri to turn on our lights and perhaps even order a pizza, but when it comes to an empathetic conversation and a solution-focused strategy, they are nowhere near good enough—well, my Alexa isn’t, anyway!

The creative process, where ideas are unraveled and solutions are born, relies heavily on collaboration and trust, followed by periods of reflection and mental incubation in which ideas can mature. AI, by its very nature, processes and synthesizes existing data but cannot engage in this complex cognitive dance. Human coaches and psychologists, on the other hand, are adept at guiding clients through these phases, facilitating breakthroughs that AI simply cannot replicate.

The Hybrid Coach

Despite these limitations, there is potential for AI to enhance, rather than replace, human coaching and psychological support. A hybrid model in which AI tools assist coaches by handling administrative tasks, analyzing data patterns, and providing supplementary insights offers the most effective route to integration. This allows human practitioners to focus on what they do best: connecting with clients on a deep, personal level and asking challenging questions that foster genuine growth and development.

This hybrid coaching approach aligns with findings from a recent survey by the International Coach Federation, which reported that while AI can augment the coaching process, the majority of clients still prefer the empathetic support of a human coach. Furthermore, many coaches are already incorporating AI into their practices in various ways to enhance their effectiveness, from writing up notes to scheduling appointments.

We Are Asking the Wrong Question

The question isn't whether AI will replace coaches and psychologists. We know that AI offers efficiency and scalability in certain areas, but it is the human touch that fosters imagination, empathy, and innovation. At its core, we should be examining the ethics of tech firms that think it’s acceptable to coach using AI. The question we need to ask is this: Is it ethical to solely use AI to coach?

While time-saving, scaling, and innovation are important, we fail our clients if we are not fully present during each interaction. AI is response-centered, not person-centered. Observing, listening, reflecting, and asking questions that challenge the status quo are essential to evoke motivation. Coaching and all psychological consultancy must remain a deeply personal process, a collaboration between two individuals working together to unpack thinking and repackage motivation.

As we look to the future, we must remember that AI, despite all its capabilities, does not imagine. It merely collates existing opinions. It is through the unique matrix of human insight and technological assistance that we can reach our potential, building a path that honors the best of both worlds. Like an athlete who relies on both training data and the intuition of a seasoned coach, we can achieve high performance by blending the strengths of AI and human expertise.
