The new clinical skillset: working with AI without losing yourself


Early lessons on maintaining authenticity, presence, and attunement while integrating digital tools into the therapy process.

Balancing the Benefits and Risks of Using AI in Therapy

Fostering therapeutic change through presence and relational depth is increasingly challenged by administrative burdens. The meticulous process of clinical documentation is consistently linked to clinician burnout and diverts valuable time away from direct patient care and clinical reflection (Leung, Coristine, & Benis, 2025).

A potential solution has emerged in Artificial Intelligence (AI) scribes. These systems listen to synchronous patient-therapist conversations and automatically generate clinical notes for review and approval (Leung et al., 2025; You, Dbouk, Landman, et al., 2025). While this automation offers the promise of improved efficiency, its deployment in psychotherapy presents a complex dilemma.
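The workflow described above hinges on one safeguard: the AI produces only a draft, and nothing enters the record until a clinician reviews it. The sketch below illustrates that gatekeeping step in Python. It is a conceptual illustration only, not the API of any real scribe product; all class and function names here are illustrative assumptions.

```python
from dataclasses import dataclass, field

# Conceptual sketch of the review-and-approval loop: an ambient scribe
# drafts a note, but the clinician's explicit sign-off is the gate
# between the AI draft and the clinical record.

@dataclass
class DraftNote:
    session_id: str
    text: str
    status: str = "draft"                       # draft -> approved | rejected
    clinician_edits: list = field(default_factory=list)

def review(note: DraftNote, approve: bool, edit: str = "") -> DraftNote:
    """Clinician gatekeeping step: every AI draft must be explicitly reviewed."""
    if edit:                                    # clinician refines the AI output first
        note.clinician_edits.append(edit)
        note.text = edit
    note.status = "approved" if approve else "rejected"
    return note

def commit_to_record(note: DraftNote) -> bool:
    """Only clinician-approved notes reach the clinical record."""
    return note.status == "approved"
```

In this sketch an unreviewed draft can never be filed: `commit_to_record` returns `False` until `review` has been called with the clinician's approval, mirroring the "review and approval" requirement the literature describes.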

The Benefits of Automation

The primary goal of adopting AI scribes is to alleviate heavy administrative workloads, allowing healthcare professionals to focus cognitive efforts on clinical decision-making rather than administrative tasks (Leung et al., 2025). Initial evaluations suggest that AI scribe use is associated with significant reductions in burnout and improvements in perceived documentation-related well-being (You et al., 2025).

Clinicians report experiencing a lower cognitive task load and achieving meaningful time savings, particularly in after-hours work (Leung et al., 2025). One consistently reported benefit is improved patient-therapist interaction, as reduced note-taking allows practitioners to feel more present during clinical encounters (Leung et al., 2025).

A study comparing AI-generated notes to therapist-authored notes found comparable overall quality, with AI notes scoring higher on thoroughness and organization (Palm et al., 2025). Another study showed that AI-generated treatment plans across five therapeutic modalities were indistinguishable in quality from those created by licensed professionals (Behnad & Hodjat, 2025). The AI completed the work in under 30 seconds, whereas human teams required extensive time to coordinate and finalize plans (Behnad & Hodjat, 2025). AI's ability to rapidly analyze large volumes of session data also shows promise for transforming clinical supervision, particularly where professional oversight is limited (Behnad & Hodjat, 2025; Kleine, Kokje, Hummelsberger, et al., 2025).

The Risks and Ethical Dilemmas of Automation

The benefits of efficiency and improved presence come with significant concerns.

AI outputs frequently contain errors, omissions, or hallucinations, that is, fabricated information (Leung et al., 2025; Palm et al., 2025). One study detected hallucinations in 31% of AI-generated notes, compared to 20% in human-authored notes, underscoring the need for strict clinician oversight (Palm et al., 2025).

A major long-term concern is the potential for cognitive debt: repeated reliance on AI for complex cognitive tasks may erode essential critical thinking and memory skills (Gerlich, 2025; Kosmyna, Hauptmann, Yuan, et al., 2025; Leung et al., 2025). Delegating cognitive tasks to AI risks diminishing the intellectual effort required for clinical synthesis (Gerlich, 2025; Stadler, Bannert, & Sailer, 2024). Research beyond healthcare demonstrates a significant negative correlation between frequent AI tool usage and critical thinking abilities (Gerlich, 2025). Even if documentation feels tedious, engaging in the process may strengthen clinical reasoning and memory retention (Leung et al., 2025).

AI also raises ethical and practical dilemmas, including privacy and security concerns when handling sensitive client data (Leung et al., 2025; Behnad & Hodjat, 2025). Additional concerns include algorithmic bias, blind trust in opaque systems, and the risk of note bloat—unnecessarily long and redundant documentation that burdens care teams (Leung et al., 2025).


Elevating Clinical Practice and Supervision

The risk of cognitive offloading diminishing critical thinking (Gerlich, 2025; Kosmyna et al., 2025; Zhai, Wibowo, & Li, 2024) requires a shift in how clinicians engage with AI. Instead of allowing AI to replace intellectual labor, clinicians should use AI outputs—such as draft notes or preliminary formulations—as starting points for detailed review. This augmentation mindset demands that clinicians question, verify, and refine AI-generated insights (Behnad & Hodjat, 2025; Gonsalves, 2024).

Delegating clerical tasks to AI frees clinicians to invest more energy into higher-order reasoning, such as case conceptualization, which depends on clear thinking, accurate evaluation of clinical data, and the ability to detect gaps or inconsistencies in reasoning (Gerlich, 2025; Hong et al., 2020; Essien, Bukoye, O'Dea, & Kremantzis, 2024; Stadler et al., 2024). Stronger clinical reasoning is associated with improved and more consistent therapeutic outcomes (Miller, Hubble, & Duncan, 2014).

Conclusion

By delegating necessary but tedious clerical work to AI, therapists can elevate their practice, focusing their intellectual resources where they matter most: on deep clinical thinking, and on ensuring that AI augments, rather than replaces, human expertise.

References

  • Behnad, J., & Hodjat, B. (2025). Artificial intelligence and human therapy supervisors are interchangeable. Athens Journal of Psychology, 1(3), 179–214. DOI

  • Essien, A., Bukoye, O. T., O'Dea, X., & Kremantzis, M. (2024). The influence of AI text generators on critical thinking skills in UK business schools. Studies in Higher Education, 49(5), 865–882. DOI

  • Gerlich, M. (2025). AI tools in society: Impacts on cognitive off-loading and the future of critical thinking. Societies, 15(1), 6. DOI

  • Gonsalves, C. (2024). Generative AI's impact on critical thinking: Revisiting Bloom's taxonomy. Journal of Marketing Education, 1, 1–16. DOI

  • Hong, G., Wilcox, L., Sattler, A., Thomas, S., Gonzalez, N., Smith, M., ... Harrington, R. (2020). Clinicians' experiences with EHR documentation and attitudes toward AI-assisted documentation [White paper]. Stanford University School of Medicine and Google Health. PDF

  • Kleine, A.-K., Kokje, E., Hummelsberger, P., Lermer, E., Schaffernak, I., & Gaube, S. (2025). AI-enabled clinical decision support tools for mental healthcare: A product review. Artificial Intelligence in Medicine, 160, 103052. DOI

  • Kosmyna, N., Hauptmann, E., Yuan, Y. T., Situ, J., Liao, X.-H., Beresnitzky, A., ... Maes, P. (2025). Your brain on ChatGPT: Accumulation of cognitive debt when using an AI assistant for essay writing task. arXiv. DOI

  • Leung, T. I., Coristine, A. J., & Benis, A. (2025). AI scribes in health care: Balancing transformative potential with responsible integration. JMIR Medical Informatics, 13, e80898. DOI

  • Miller, S., Hubble, M., & Duncan, B. (2014). The secrets of supershrinks: Pathways to clinical excellence. Psychotherapy Networker. DOI

  • Palm, E., Manikantan, A., Mahal, H., Belwadi, S. S., & Pepin, M. E. (2025). Assessing the quality of AI-generated clinical notes: Validated evaluation of a large language model ambient scribe. Frontiers in Artificial Intelligence, 8, 1691499. DOI

  • Stadler, M., Bannert, M., & Sailer, M. (2024). Cognitive ease at a cost: LLMs reduce mental effort but compromise depth in student scientific inquiry. Computers in Human Behavior, 160, 108386. DOI

  • You, J. G., Dbouk, R. H., Landman, A., et al. (2025). Ambient documentation technology in clinician experience of documentation burden and burnout. JAMA Network Open, 8(8), e2528056. DOI

  • Zhai, C., Wibowo, S., & Li, L. D. (2024). The effects of over-reliance on AI dialogue systems on students' cognitive abilities: A systematic review. Smart Learning Environments, 11(28). DOI