What the ACA and APA Get Wrong About AI in Therapy
- Ian Felton

- Oct 15
Both the American Counseling Association (ACA) and the American Psychological Association (APA) have issued guidance on using AI to transcribe sessions and help write therapy notes.

Their advice? Use HIPAA-compliant tools. Get informed consent. Review the output. Stay human-centered.
The problem is that these frameworks treat AI as a convenience to be managed, not as a philosophical and clinical rupture to be confronted.
Therapy isn’t a workflow to optimize. It’s a human relationship.
The moment we invite machine surveillance into the room, whether through transcription or auto-drafting notes, we change the clinical frame. Clients know they’re being recorded. That knowledge alters trust, inhibits vulnerability, and introduces a second audience into the room: the algorithm.
AI doesn’t just write notes; it replaces reflection.
In my own research on how psychotherapists become experts, the single most important factor was deliberate, effortful reflection on one’s work: not just reviewing what happened, but struggling to make meaning of it.
Outsourcing that process to a tool undermines our development as clinicians. It erodes skill, replaces depth with ease, and trains us to become passive reviewers instead of active meaning-makers.
The ACA and APA are silent on that.
Where’s the guidance on how AI impacts clinical development? On how it rewires our habits of attention and our tolerance for discomfort?
They seem completely out to lunch when it comes to the psychological cost of convenience.
🧠 If you're curious, here’s my deep dive into the research:
👉 How to Become an Expert Psychotherapist [https://lnkd.in/gzSWJtMB]
This isn’t about rejecting technology (I'm a software engineer and a psychotherapist); it’s about protecting what is sacred, relational, and human in therapy.
AI doesn't just change how we write notes.
It changes what clients are willing to say.
And it changes who we, as therapists, are becoming.
These reflections express my personal opinions on systemic issues in psychotherapy and technology, not on any specific organization or individual.