As a psychiatrist, people are often surprised when I tell them I use generative AI as part of my own mental wellness practice—and in some cases, I even prefer it to traditional therapy sessions. It's not that I don't value human connection; I've built my entire career around it. But with the demands of running a national mental health clinic, supporting patients, and showing up for my family, scheduling regular therapy for myself isn't always practical. That's where AI becomes a powerful ally. When I use AI to process my thoughts, journal, or reframe challenges, it becomes a form of real-time, judgment-free reflection. Sometimes I'm not looking for deep analysis—I just need a space to be seen, heard, and guided back to clarity. Whether it's generating affirmations, helping me unpack emotional patterns, or offering cognitive restructuring prompts, the accessibility of AI allows me to care for my mental health on my schedule, not someone else's calendar. As someone who helps others heal, it's crucial I stay aligned, grounded, and self-aware. AI doesn't replace the depth of a therapeutic relationship—but in moments where I need support now, it's a remarkable supplement. I see it as a new kind of mirror—one that helps me return to myself when I'm overwhelmed, overbooked, or just in need of a safe space to think out loud.
Dr. Sam Zand
Founder & Psychiatrist, Anywhere Clinic
linkedin.com/in/samzandmd
www.anywhereclinic.com
Hi Angelia - I'm a therapist with a private practice based in NYC and would love to offer the perspective I'm hearing from my clients, one that's a bit more optimistic than the dominant narrative among therapists who feel threatened by AI. Surprisingly, many find real value in using AI, especially for conflict resolution and better understanding their emotional responses. AI allows them to pause, reflect, and receive nonjudgmental feedback before reacting. For some, it even provides language and insight they hadn't previously considered. It's helping normalize trauma by reframing reactions as trauma responses rather than personal failings, which can be a massive step toward self-compassion. As a therapist, I find this tool really promising. There's a big difference between the story that shows up in the therapy room and the actual event as it unfolded in the moment. Having something to turn to for support and direct mental health feedback is a net positive overall. That said, AI can't replicate the healing power of being in a regulated, attuned relationship with another human. Therapy works because of safety, presence, and co-regulation, not just insight. AI can support self-awareness, but true healing happens in a relationship with another nervous system, not an algorithm. Happy to chat more if this angle is helpful for your piece.
As a former therapy client turned mental health advocate, I switched to using Woebot for my anxiety management and found it surprisingly more effective than my previous in-person sessions. The AI's consistency in applying CBT techniques and its 24/7 availability has helped me catch negative thought patterns before they spiral, especially during my midnight panic attacks. While I miss the human connection sometimes, I've found more progress with AI therapy because I can be completely honest without worrying about being seen as 'too needy' or 'difficult.'
As a clinical psychologist who's built Bridges of the Mind around human connection, I've observed some clients actually preferring AI therapy in certain contexts. While not a replacement for comprehensive care, AI can offer accessibility when traditional therapy isn't available - particularly for those in remote areas or with scheduling constraints. I've worked with neurodivergent clients who report feeling less judged by AI, eliminating the masking behaviors they sometimes display with human therapists. One teen patient specifically mentioned feeling more comfortable disclosing anxiety symptoms to an AI before bringing them to our in-person sessions. That said, the most effective approach I've seen is using AI as a complement to human therapy. Several of our practitioners encourage clients to use AI tools for journaling or initial exploration of thoughts, which they then process together in session. The combination leverages AI's availability with the irreplaceable human elements of empathy and clinical expertise. The critical distinction is between AI for support versus treatment. While AI can provide accessible emotional support and consistency, it lacks the clinical judgment, regulatory oversight, and therapeutic relationship that's central to effective treatment for complex mental health conditions.
Hi, I work in marketing and I'm really into tech, so I naturally started experimenting with generative AI tools for mental health. I still do therapy with a human (and I totally see the value in that), but I've honestly found myself preferring AI in a lot of situations. The main reason is availability. My therapist isn't around every day, and there are times when I just need to process something in the moment. With AI, I can open a chat or send a voice note, transcribe it, and get feedback based on the therapeutic approach I like. It's immediate, private, and flexible. Another big reason is judgment. I know therapists are trained to be neutral, but I'm still human, so I still worry about being judged. With AI, there's none of that. I can be brutally honest, go deep into uncomfortable topics, and not filter myself. I've also built a kind of system for it. I record daily voice diaries, and the AI helps me reflect on patterns, emotions, and triggers, even over weeks or months. It's like building a searchable emotional archive. No therapist could ever remember all that. And yeah, there's the cost too. Let's be real, therapy is expensive. AI gives me the freedom to "talk" every day, for as long as I want, without breaking the bank. It's not about replacing my therapist. But for day-to-day emotional support, self-reflection, and tracking how I'm feeling over time, AI has honestly been more useful than I expected. Happy to share more if it's helpful.
Best,
Karina
As an EMDR therapist specializing in trauma recovery, I've observed clients who initially sought AI therapy before coming to me. What they consistently report is that while AI offered convenience, it couldn't provide the safe, interpersonal relationship necessary for trauma healing. Trauma resolution fundamentally requires human connection. Through my person-centered approach at True Mind Therapy, I've seen how establishing trust and safety within the therapeutic relationship activates healing pathways that technology simply cannot replicate. The bilateral stimulation techniques I use in EMDR therapy involve subtle human attunement that AI lacks. One client came to me after months of using an AI therapy app for compulsive behaviors. Despite the app's accessibility, her breakthrough only came when experiencing the "Safe Calm Place" protocol we developed together. She needed someone who could recognize her nervous system responses in real-time and adjust accordingly. The brain's response to trauma is inherently relational. When working with sexual trauma survivors, I've found that healing occurs most effectively through what I call "deep interpersonal connection" - something AI cannot provide. The courage required to process trauma needs human witness, not algorithms.
Licensed Professional Counselor-Supervisor at Willow & Sage Counseling
I'm a therapist in the Houston area. Though I don't use AI in any form for therapy, or to generate notes, I would love to speak to you about the dangers of using AI for therapy purposes. I believe that AI has its place in the world, but definitely not in the therapy world. There are already reports of ways in which AI has harmed people who have utilized it through platforms like ChatGPT. Please let me know if I can be of any help in terms of reasons one shouldn't use AI for therapy purposes. Thanks!
Oh, I've bumped into quite a few people who are using generative AI apps for therapy recently. They often mention how it's less intimidating than traditional therapy, mainly because they feel less judged. Plus, these AI apps are ready to chat anytime, which is super handy for those odd hours when you need someone to talk to but it's too late to call up a friend or a therapist. Something interesting I've noticed is that people really value the anonymity that AI therapy apps offer. It gives them that extra layer of comfort to open up about deeply personal stuff without the fear of it getting back to anyone they know. If you're considering trying it out, just remember that while AI can offer support and listen, it doesn't replace human intuition and empathy, so maybe use it as a supplement to regular therapy sessions. Always good to have options, right? Time to see what works best for you!
With my background in AI development, I've been using ChatGPT for daily emotional check-ins and find it more comfortable than opening up to a human therapist about certain issues. I appreciate how it helps me organize my thoughts without the pressure of being judged or feeling rushed like in traditional 50-minute sessions. Last week, it helped me work through a work conflict by asking me questions that made me see things from different angles - something I might have been too embarrassed to discuss with a real therapist.
Working in AI, I was skeptical at first, but after trying Replika for my social anxiety, I discovered it helped me practice difficult conversations without fear of judgment. I appreciate how the AI remembers our previous chats and builds on them, making each session feel more personal and progressive than my past experiences with human therapists.
While I haven't personally replaced human therapy with generative AI, I've worked with clients who have found AI-powered therapy tools surprisingly helpful as a supplement. One client shared that using an AI chatbot for cognitive behavioral exercises gave them immediate access to coping strategies outside of traditional sessions. They appreciated the privacy and convenience of interacting on their own schedule, especially during moments when a human therapist wasn't available. What stood out was how the AI's consistent, nonjudgmental responses helped reduce anxiety in between appointments. From what I've observed, generative AI can't fully replace the empathy and nuanced understanding a human therapist provides, but for some, it offers a valuable, accessible resource that complements their mental health journey. I'm curious to see how this technology evolves and how it might better support those seeking therapy in the future.
Using generative AI for therapy has been a game-changer, especially when access to traditional mental health care feels out of reach. I've heard from folks in remote areas who once felt completely isolated—no nearby therapists, long waitlists, or travel barriers. AI therapy broke through those walls, offering support whenever and wherever needed. That constant, easy access creates a lifeline for people who might otherwise go without help, turning what felt like solitude into a sense of connection and comfort. It's incredible how technology can bring mental health care into the hands of those who need it most, no matter their location.
Generative AI therapy stood out because it offers a level of privacy from the very beginning. There's no need to introduce yourself, explain your background, or worry about being judged. This anonymity makes opening up easier, especially when sharing things that feel difficult to say aloud. Dealing with social anxiety made the absence of eye contact and silent judgment a huge relief. Sharing vulnerable thoughts felt safer, without the fear of being misunderstood or labeled. That comfort allowed working through emotions at a personal pace, free from the pressure of traditional therapy settings.