Clinical Psychologist & Director at Know Your Mind Consulting
Answered 9 months ago
As a Clinical Psychologist who's worked with parents through severe pregnancy sickness, birth trauma, and baby loss for 15 years, I think AI scored higher because it delivers what I call "perfect empathy language" without the messy reality of human limitations. When I'm supporting someone through their third miscarriage or helping them process hyperemesis gravidarum, I'm bringing my own humanity—including days when I'm tired or moments when their story triggers my own difficult pregnancy experiences.

The perinatal mental health field reveals something crucial about why people prefer AI responses: they're desperate for validation without judgment. In my workplace consulting, I've seen managers struggle to respond appropriately when an employee shares they're dealing with baby loss or birth trauma. They fumble for words, offer inappropriate advice, or worse—avoid the conversation entirely. AI doesn't have this human awkwardness or the urge to "fix" someone's pain with platitudes.

What concerns me most is that AI empathy lacks the nuanced understanding of perinatal mental health complexities. Last month, a client with postnatal depression told me she'd been using ChatGPT for support between our sessions. While it validated her feelings beautifully, it completely missed the red flags in her sleep patterns and intrusive thoughts that required immediate clinical intervention. The algorithm gave her textbook compassion but couldn't recognize she was developing postpartum psychosis.

This study probably reflects our broken healthcare system more than AI's superiority. Parents facing pregnancy complications or postpartum struggles often wait weeks for mental health support, so they turn to AI for immediate emotional relief. The algorithm is available at 3 AM when panic hits—I'm not.
Having worked with trauma survivors for years through EMDR intensives, I think AI scored higher because it provides consistent emotional mirroring without the unpredictability that trauma survivors often fear from human interaction. Many of my clients with childhood abuse histories initially struggle with traditional weekly therapy because they're hypervigilant about therapist reactions—they're scanning for signs of judgment, fatigue, or emotional withdrawal.

The condensed format of my EMDR intensives actually reveals something similar to what this study found. When clients can focus purely on processing without worrying about "managing" their therapist's comfort level, they make faster progress. In half-day and full-day intensive sessions, I've watched clients who were guarded in traditional therapy settings become remarkably open once they realize the space is consistently safe and non-reactive.

What troubles me about AI empathy is its inability to track the physiological markers that guide real trauma treatment. During bilateral stimulation in EMDR, I'm watching for changes in breathing, eye movement patterns, and subtle shifts in body posture that indicate when someone's nervous system is moving from activation to resolution. Last week, a client's shoulder tension completely released during reprocessing—a moment that signaled breakthrough healing that no algorithm could have recognized or responded to appropriately.

The efficiency people crave from AI actually works against the slower, somatic process that genuine trauma recovery requires. While AI might nail the empathetic language, it can't provide the nervous system co-regulation that allows someone to safely access and integrate traumatic memories without re-traumatization.
Licensed therapist here with 10+ years of experience working with anxious overachievers and law enforcement spouses. The AI empathy study results don't surprise me—my clients often crave what AI delivers perfectly: immediate validation without judgment or the therapist's own emotional reactions bleeding through.

In my practice, I've noticed clients initially prefer my structured intake assessments over our first face-to-face session because the forms feel "safer." There's no risk of disappointing someone or reading disapproval in their responses. AI taps into this same comfort zone, especially for people-pleasers who've spent years scanning faces for negative reactions.

The concerning part is that real therapeutic change happens in those uncomfortable moments AI can't navigate. Last month, a client's breakthrough came when I pointed out how they apologized three times in two minutes—something they'd never noticed before. AI would miss these patterns entirely because it lacks the ability to track nonverbal cues across multiple sessions.

From my own recovery from people-pleasing, I know healing requires someone who can lovingly call out your blind spots in real time. AI might feel more empathetic initially, but it can't challenge you to grow beyond your comfort zone or notice when you're using therapy-speak to avoid real vulnerability.
As a licensed clinical psychologist with 10 years of experience, I think AI responses scored higher because they deliver perfectly formatted validation without the messiness of real therapeutic work. In my virtual practice with anxious high achievers, I've noticed clients sometimes want immediate comfort rather than the uncomfortable growth that comes from being challenged about their perfectionism or codependency patterns.

The preference for AI empathy reflects our instant-gratification culture and the appeal of receiving support without having to reciprocate emotional energy. Many of my Washington DC clients initially feel more comfortable discussing their self-esteem struggles through virtual sessions because there's less vulnerability required—AI takes this even further by removing all human unpredictability.

AI could excel at providing immediate psychoeducational content about anxiety management techniques or self-compassion exercises between our weekly sessions. However, the real limitation is that lasting change requires someone who can read between the lines of what you're not saying—when I work with clients using psychodynamic approaches, breakthrough moments happen because I can connect their current struggles to deeper patterns they haven't recognized yet.

I see AI serving as a helpful supplement for clients between sessions, perhaps offering guided breathing exercises or thought-challenging prompts. But the transformative work of finding your inner worth and healing from wounds that keep you from embracing what makes you unique requires the kind of genuine human curiosity and intuitive responsiveness that emerges from ongoing clinical training and personal therapy work.
Licensed Professional Counselor at Dream Big Counseling and Wellness
Answered 9 months ago
As a Licensed Professional Counselor with experience across inpatient psychiatric hospitals, residential treatment, and private practice, I think AI responses likely scored higher because they're crafted to be perfectly validating without the messy humanity that real therapists bring. When I work with clients dealing with trauma or substance use, my genuine reactions sometimes include appropriate challenge or redirection—not just endless validation.

The appeal of AI empathy probably stems from our instant-gratification culture and fear of judgment. In my practice at Dream Big Counseling, I've noticed clients initially prefer my straightforward approach online versus in-person because there's less vulnerability. AI feels "safe" because it can't truly see you or form real opinions about your choices.

AI tools could be valuable for psychoeducation and teaching coping skills like the mindfulness and distress tolerance techniques I use with clients. But there's a massive risk in people substituting AI validation for actual human connection and therapeutic challenge. Real healing requires someone who can genuinely attune to your emotions and push back when needed—not just mirror your feelings.

I see AI as potentially useful for homework assignments and skill practice between sessions, but human therapists will remain essential for the deep work. My clients with depression and anxiety need someone who can read their body language, sense what they're not saying, and provide the authentic relationship that actually changes lives—something AI simply cannot replicate.
Founder and Clinic Director at Real Life Counselling
Answered 9 months ago
Hi Courtnee,

My name is Ashley Kreze, MA, RCC, RCP, Founder and Clinic Director at Real Life Counselling. Please find my answers to your query about AI being deemed "more compassionate" below:

1. This reveals a troubling reality about modern therapeutic practice. AI responses scored higher because they mirror what people expect empathy to sound like rather than what therapeutic empathy actually is. Real empathy involves challenging clients, creating productive discomfort, and sometimes saying difficult truths. AI delivers emotional candy when people need emotional nutrition.

2. We live in an era of emotional avoidance where people prefer validation over growth. AI provides the illusion of being heard without the messy work of genuine human connection. People mistake predictable comfort for authentic understanding because they have forgotten what real intimacy feels like.

3. AI could handle routine emotional maintenance, freeing therapists to focus on deeper transformational work. It serves as emotional first aid rather than surgery. AI excels at providing consistent psychoeducation and coping strategies when human therapists are unavailable. It could democratize basic mental health support for underserved populations and offer immediate crisis stabilization.

4. AI creates empathy addicts who mistake algorithmic responses for genuine care. This could produce a generation unable to tolerate the natural friction of human relationships that actually promotes psychological growth. The danger lies in people becoming satisfied with artificial emotional connections that prevent them from developing resilience and authentic relational skills.

5. Human therapists will become specialists in what AI cannot provide: uncomfortable truths, relational repair, and the sacred messiness of authentic human connection. We will focus on complex trauma work, personality disorders, and the deep relational healing that only occurs between two conscious beings.
Psychotherapist | Mental Health Expert | Founder at Uncover Mental Health Counseling
Answered 9 months ago
I believe that leveraging AI to emulate empathy can be a powerful tool when used responsibly. AI's ability to process vast amounts of data, recognize patterns in emotions, and respond in a way that appears empathetic can provide users with a sense of understanding and connection. However, it's crucial to note that AI at its core lacks genuine human feelings; its "empathy" is simulated based on algorithms and predefined data. This distinction is essential to manage user expectations.

AI "empathy" is limited in its complexity and depth. While it can recognize common emotional cues and provide general comfort, it cannot truly understand individual lived experiences or emotions on a deeper, human level. Furthermore, the effectiveness of such AI responses depends on the quality of its training data, which may contain biases or lack cultural diversity, potentially leading to inappropriate or less helpful responses in certain situations.

To safeguard against misuse, developers must clearly communicate AI's boundaries and capabilities to users. Rigorous monitoring systems, ethical design principles, and frequent audits should ensure AI interactions remain constructive and do not replace qualified mental health professionals. Educating users on when AI tools are appropriate and when human engagement is necessary is also critical.

There is a legitimate concern about people over-relying on AI for emotional support, especially if it leads to neglecting traditional support systems or professional help. Misinterpreting AI-driven empathy as genuine understanding could foster unrealistic expectations, which may cause disappointment or harm in critical situations. Promoting AI as a supplement, rather than a replacement, for human connection is key.

I envision a future where human therapists and empathetic AI tools coexist in a complementary manner.
AI can assist in administrative tasks, provide preliminary screenings, or offer support in non-clinical scenarios, allowing therapists to focus on complex, personalized care that requires human intuition and expertise. This partnership can help make mental health resources more accessible, ensuring a broader reach while keeping the human touch at the center of care.
As someone who's worked extensively with Indigenous communities and now runs trauma-focused therapy intensives, I think AI scored higher because it delivers validation without judgment—something many of my clients desperately need but rarely receive in healthcare settings. When I started working in community mental health, I noticed clients would often test me first with smaller disclosures before sharing their real trauma, because they'd been dismissed or pathologized by previous providers.

The appeal likely stems from AI's consistent tone and immediate availability. In my trauma work using EMDR and ART, I've seen how clients who've experienced medical trauma or discrimination will initially prefer written resources over face-to-face sessions. They can process information at their own pace without worrying about being judged for their reactions or having to perform emotional regulation for another person's comfort.

Here's what concerns me: therapeutic change happens in the space between people, not in perfectly crafted responses. During ART sessions, I watch for micro-expressions, notice when someone dissociates, and can immediately adjust my approach if I sense they're becoming overwhelmed. Last month, a client's breathing pattern during an EMDR session told me she was accessing a memory she hadn't verbalized yet—something AI could never detect.

I see AI as potentially helpful for psychoeducation and between-session support, but it can't replicate the somatic awareness that comes from years of trauma training. When someone's nervous system is dysregulated, they need a co-regulating presence, not algorithmic empathy. The danger isn't bad advice—it's missing the body's wisdom that's essential for trauma healing.
Certified Psychedelic-Assisted Therapy Provider at KAIR Program
Answered 9 months ago
After 37 years in practice and thousands of hours doing intensive trauma work, I think people find AI more empathetic because it never gets tired or overwhelmed by their pain. When I'm eight hours deep into an intensive EMDR retreat with a client processing childhood sexual abuse, I have to actively manage my own emotional responses to stay present—AI doesn't have that limitation.

The consistency factor is huge in my ketamine-assisted therapy work. Clients often tell me they rehearse what they'll say before sessions, worried about how I'll react to their "worst" memories. AI gives the same measured, accepting response whether someone shares mild anxiety or severe trauma, which removes that performance pressure entirely.

But here's what AI misses: the healing power of being truly witnessed by another human nervous system. In my intensive model, I watch clients' breathing patterns, notice when their voice changes, and feel the energy shift in the room when they're about to break through. Last week during a three-day intensive, a client's posture completely changed when we hit her core trauma—she literally sat up straighter as the shame lifted. No algorithm can sense that somatic change.

I see AI as perfect for crisis stabilization and psychoeducation between our intensive sessions. But trauma lives in the body, and healing happens through human connection. When someone's dissociating during ketamine integration, they need my grounding presence and co-regulation, not perfectly worded text responses.
As a bilingual therapist specializing in transgenerational trauma for bicultural individuals, I think AI scored higher because it provides what I call "cultural safety"—responses without the fear of judgment that many of my immigrant clients initially experience. When I work with first-generation Americans, they often spend the first few sessions testing whether I'll truly understand their cultural context or dismiss their family dynamics as "toxic" without nuance.

The preference for AI empathy reflects something I see constantly in my practice: people want validation without the vulnerability of being truly seen. My clients dealing with cultural expectations often prefer our initial sessions to be online because there's less perceived risk of cultural misunderstanding. AI gives them that emotional distance while still providing supportive language.

The real limitation becomes apparent when we start EMDR work or Internal Family Systems therapy. AI can't sense when someone's nervous system is dysregulated during trauma processing, or recognize the subtle shift in voice tone that indicates a client has accessed a wounded part of themselves. Last month, I had a client suddenly switch languages mid-session when accessing a childhood memory—AI would miss these critical therapeutic moments entirely.

I see AI as potentially valuable for providing culturally informed coping strategies between sessions, especially for clients navigating family conflicts around cultural identity. But the deep healing work I do—helping clients break free from generational patterns and reconnect with their authentic selves—requires the kind of intuitive cultural attunement that emerges from shared human experience and clinical training.
As a Licensed Professional Counselor-Supervisor specializing in eating disorders, OCD, and trauma, I think AI responses scored higher because they eliminate the interpersonal anxiety that often blocks people from accessing empathy. In my work with Houston Ballet dancers and athletes, I've noticed clients initially struggle with the vulnerability required to receive genuine human support—they're used to performing and meeting expectations.

The preference for AI empathy likely stems from what I see with my anxiety and OCD clients: they crave predictable, non-judgmental responses without the fear of disappointing another human. When I work with eating disorder clients through our Eating Deeper Academy, many initially find it easier to share shameful thoughts about food and body image when they feel they won't burden me emotionally or face unpredictable reactions.

AI could excel at providing immediate grounding techniques during panic attacks or delivering consistent psychoeducation about exposure exercises for OCD between sessions. However, the real limitation becomes apparent in trauma work—when I'm doing EMDR with a client, the healing happens through my ability to track their nervous system responses and adjust my approach in real time based on subtle cues that require human intuition.

I see AI serving as excellent homework support for my clients practicing ACT mindfulness skills or ERP exercises. But the transformative moments in eating disorder recovery happen when I can sense a client's shame spiral beginning and intervene with precisely the right combination of validation and gentle challenge that comes from years of supervised clinical experience.
As someone who's been training therapists in EMDR for years and facilitating monthly trainings across the country, I think AI scored higher because it delivers what I call "perfect textbook responses"—the kind of empathetic language we teach in graduate programs but that humans often struggle to maintain consistently. When I'm working with a client having their third panic attack this week, my human fatigue might show through slightly, while AI maintains that same calibrated compassion every single time.

The real issue is that people are craving emotional safety without the messiness of human judgment. In my EMDR intensive work, I've noticed clients initially prefer our virtual sessions because they feel less exposed—there's something about a screen that creates psychological distance. AI amplifies this by offering empathy without the client sensing any therapist reactions, micro-expressions, or subtle shifts in energy that might feel threatening.

Here's where it gets dangerous though: my most significant client breakthroughs happen when I notice what they're NOT saying—the way someone's breathing changes when discussing their trauma, or how they unconsciously touch their stomach when mentioning their ex-partner. These somatic cues guide my EMDR interventions and help me know exactly where to focus our bilateral stimulation work.

I see AI becoming incredibly valuable for crisis stabilization between my intensive sessions—imagine clients having access to grounding techniques or bilateral stimulation apps when they're triggered at 2 AM. But the actual neural rewiring that happens in trauma therapy requires a human nervous system to co-regulate with, something I witness daily when clients' hypervigilance finally settles in response to my calm presence.
As CEO of Thrive Mental Health and someone who's scaled virtual behavioral health platforms, I think AI scored higher because it provides immediate, consistent validation without the "strategic patience" my early mentor taught me about. When we launched our virtual IOP programs, I noticed clients initially preferred our automated check-ins over live sessions—they got instant acknowledgment without facing the therapeutic discomfort that drives real change.

The preference reveals something crucial about modern communication expectations. At Lifebit, we've seen similar patterns where federated data analysis gets faster adoption than collaborative research requiring human coordination. People increasingly expect frictionless, on-demand responses that feel supportive without requiring them to process complex emotional feedback or sit with uncertainty.

AI's biggest opportunity lies in delivering psychoeducation and crisis stabilization between human sessions. At Thrive, we could use AI to reinforce CBT techniques or provide immediate coping strategies during off-hours, especially for our young professionals who expect 24/7 accessibility. Our "Wellness First" culture showed me that people need different types of support at different intensities.

The critical risk is that AI becomes an empathy substitute rather than a bridge to human connection. In behavioral health, I've seen that sustainable recovery requires learning to navigate imperfect human relationships—something that requires the messy, unpredictable nature of real therapeutic alliance that AI simply can't replicate.
As a psychologist who specializes in therapy for overwhelmed parents, I think AI responses scored higher because they don't carry the emotional weight that human interactions do. When I work with new parents dealing with postpartum anxiety, many initially feel guilty about "burdening" me with their struggles—they're already overwhelmed, and adding concern about my emotional capacity creates another layer of stress.

The preference for AI empathy reflects what I see with my California telehealth clients daily. Parents are exhausted from sleep deprivation and managing intergenerational trauma patterns while caring for young children. They want immediate, consistent validation without worrying about timing, social cues, or whether they're "too much" for another person to handle.

AI could be valuable for providing 24/7 grounding techniques when parents feel triggered by their kids' behavior at 3 AM. However, the real therapeutic breakthroughs happen when I notice a mother's voice change while discussing her own childhood attachment patterns and can immediately shift my approach based on her nervous system response.

I see AI serving as excellent between-session support for parents practicing the self-soothing techniques we develop together. But healing birth trauma or breaking cycles of generational patterns requires the nuanced human ability to track subtle emotional shifts and respond with perfectly timed empathy that builds secure attachment—something that happens through genuine human connection.
As a Licensed Marriage and Family Therapist in El Dorado Hills specializing in integrated trauma therapy, I think AI responses scored higher because they deliver perfectly structured empathy without the messy reality of genuine therapeutic work. In my practice with teens and families, I've noticed that real healing often requires me to challenge clients' patterns—something that initially feels less "supportive" than pure validation.

The preference for AI empathy reflects our discomfort with authentic human complexity and emotional unpredictability. When I work with families experiencing emotional immaturity using DBT and EMDR, the breakthrough moments happen precisely when I can sense their unspoken resistance and guide them through genuine discomfort. AI can't detect when someone is avoiding trauma work or needs to be gently confronted about destructive patterns.

AI could excel at providing immediate DBT skills reminders or grounding techniques between our sessions. However, in my trauma work, I've seen that lasting change requires the kind of intuitive attunement that emerges from years of clinical experience—like knowing when to push deeper into EMDR processing versus when to slow down based on subtle body language cues.

I see AI serving as a valuable between-session support tool for reinforcing therapeutic concepts we've explored. But the transformative work of addressing childhood trauma or breaking generational patterns requires genuine human presence that can tolerate emotional messiness and guide clients through authentic healing rather than just providing comfort.
As someone who's supervised hundreds of doctoral interns and postdocs over the years, I think AI scored higher because it delivers perfectly structured empathy without the messy humanity that actually makes therapy work. In my practice conducting thousands of neurodevelopmental assessments, I've seen parents initially prefer my written reports over our face-to-face feedback sessions—until they realize the report can't answer their follow-up questions or adapt to their child's specific needs in real time.

AI appeals to our desire for predictable emotional responses, especially for neurodivergent individuals who often struggle with reading social cues. Many of my autistic clients find initial comfort in clear, consistent communication patterns. But here's what the study misses: therapeutic breakthroughs happen when I notice a child's subtle behavioral shift during testing that contradicts what their parent reported, or when I can immediately adjust my approach because I sense a teenager shutting down.

The real risk isn't that AI provides bad advice—it's that it eliminates the diagnostic nuance that comes from years of clinical observation. When I'm evaluating a child for autism, I'm simultaneously watching their eye contact, processing their language patterns, and noting how they respond to sensory inputs. AI can't catch the moment when a seemingly "high-functioning" child reveals their internal struggle through a barely perceptible change in body language.

I see AI as potentially valuable for psychoeducational content between sessions, but it fundamentally can't replicate the clinical intuition that develops from conducting thousands of assessments across different populations. My Goldman Sachs training taught me that efficiency isn't always effectiveness—sometimes the "messier" human approach yields better long-term outcomes.
As someone who's been doing therapy since 2015 and launched my own practice in 2021, I think AI scored higher because it provides immediate, perfectly formatted responses without the messy reality of human emotion. When I'm working with women processing miscarriage or postpartum depression, sometimes I need to sit with their pain for a moment before responding—AI delivers instant validation that feels "cleaner" than authentic human processing.

The preference for AI empathy reflects our society's comfort with digital communication over vulnerable face-to-face interaction. Many of my clients initially feel more comfortable sharing difficult things through text or email before our video sessions. AI taps into that same preference for controlled, predictable emotional exchange without the anxiety of being truly seen by another person.

From my maternal mental health practice, AI could be incredibly helpful for providing consistent psychoeducation about postpartum anxiety symptoms or pregnancy-related fears between our sessions. My clients often forget coping strategies we discuss, and AI could reinforce these tools 24/7. However, I've noticed breakthrough moments happen when clients see me laugh at their stories about their toddlers or when my rescue dogs Buster and Pickles pop into frame—those authentic human (and canine) moments create real connection that drives healing.

The biggest risk is that people might mistake AI's perfect responses for genuine understanding, potentially delaying the harder work of building real relationships. In my experience treating ADHD in women and trauma survivors, healing happens through authentic human connection where someone truly witnesses your experience, not through algorithmically optimized empathy.
As a trauma therapist who specializes in EMDR and works with sexual assault survivors, I think AI scored higher because it delivers validation without the initial discomfort of genuine therapeutic challenge. When I work with clients processing trauma, there's often resistance when I guide them toward difficult emotions or memories they've been avoiding—AI simply reflects back what people want to hear.

The preference for AI empathy mirrors what I see with my "Safe Calm Place" technique using bilateral stimulation. Clients initially gravitate toward this controlled, predictable safe space in their minds because real-world emotional safety requires vulnerability and unpredictability. AI provides that same controlled comfort without requiring the scary step of actually trusting another human being.

From my experience treating PTSD, AI could be valuable for delivering psychoeducation about nervous system responses between sessions. Many of my clients benefit from understanding the science behind their trauma reactions, and AI could reinforce these concepts consistently. However, I've seen breakthrough moments happen specifically when clients feel truly seen by another human—something that occurs when I notice their subtle body language shifts or hear the pain behind their words.

The biggest risk is that trauma survivors might use AI validation to avoid the core work of rebuilding genuine human trust. In my practice, healing happens through our therapeutic relationship where clients learn to feel safe with another person again, not through perfectly crafted responses.
As a Licensed Marriage and Family Therapist Associate who works extensively with couples and intimacy issues, I think AI responses scored higher because they provide consistent emotional mirroring without the complexity of genuine therapeutic relationship dynamics. In my practice at Revive Intimacy, I've noticed that clients sometimes initially resist when I guide them toward deeper vulnerability or challenge their relationship patterns—something AI simply won't do.

The preference for AI empathy likely reflects our cultural comfort with digital communication and the appeal of receiving support without reciprocal emotional investment. Many of my clients in Austin initially feel safer discussing sensitive topics like sexual dysfunction or relationship conflicts through our virtual sessions rather than in person, because there's less perceived judgment and easier emotional distance.

AI could be incredibly useful for providing immediate psychoeducational resources about communication techniques or intimacy exercises between our sessions. However, the real danger lies in people substituting AI validation for the authentic attunement that creates lasting change—when I work with couples using Emotionally Focused Therapy, breakthrough moments happen precisely because I can sense their unspoken emotions and guide them through genuine connection.

I envision AI serving as a valuable bridge tool for clients between sessions, perhaps helping them practice communication scripts or access coping strategies. But the transformative work of rebuilding trust after betrayal or healing sexual trauma requires the kind of genuine human presence and intuitive responsiveness that emerges from years of clinical training and supervised experience.
As someone who's worked extensively with teens and young adults, plus survivors of severe trauma including sex trafficking, I've seen why AI might score higher on empathy measures. Many of my clients initially struggle with human connection because they've been hurt by people before. AI provides emotional validation without the fear of being judged, abandoned, or hurt again.

The preference for AI empathy often stems from control and safety. When I use techniques like having kids draw their emotions or role-play different emotional states, they're initially more comfortable expressing vulnerable feelings through these indirect methods rather than direct eye contact and conversation. AI functions similarly—it's emotionally safer because there's no risk of human unpredictability.

From my OCD and anxiety work using Exposure and Response Prevention, I know that avoidance behaviors can initially feel helpful but ultimately prevent real healing. AI empathy might become another sophisticated avoidance tool. My clients with severe social anxiety often prefer texting over phone calls, but growth happens when they gradually face the discomfort of human interaction.

Having supervised other therapists and worked in crisis settings with homeless populations, I've learned that breakthrough moments require human intuition to read between the lines. When a trafficking survivor finally shares their story, it's my ability to sense their shame, adjust my tone mid-sentence, and respond to their body language that creates safety. AI can't catch the subtle shift in breathing that signals a panic attack is starting.