The Silent Third Party: Ethical Dilemmas of AI-Powered Transcription-Based Documentation and its Potential Threat to the Therapeutic Process

The rapid advancement of artificial intelligence has led to the emergence of documentation software that “listens in” on therapy sessions, automatically transcribes them, and then generates clinical notes, progress reports, treatment plans, and more. While this may seem like an innovative solution to reduce the administrative burden on therapists, I believe it raises profound ethical concerns. This technology risks undermining the integrity of the therapeutic setting, may introduce serious security risks, and could alter the nature of psychotherapy itself. In this post, I will explore why such software, though understandably appealing for the busy clinician, is a problematic development and why we should approach it with caution.

The Erosion of the Therapeutic Space

At the heart of effective psychotherapy lies the creation of a safe, private, and confidential space where clients can express themselves freely. This is a requirement of the psychotherapeutic setting that clinicians are often called upon to protect in the face of potential third-party intrusions (Levin et al., 2003). Even when such a safe space is effectively created, many clients understandably continue to struggle with disclosing intimate details about themselves, their feelings, and their deepest secrets. Drawing on this clinical knowledge and experience, we are called upon as clinicians to think carefully about the potential impact of an AI observer in the therapeutic space. The mere knowledge that a session is being recorded, transcribed, and analyzed by AI software may disrupt the therapeutic environment in several ways:

  • Inhibition of Client Disclosure: Clients may hesitate to discuss deeply personal issues such as sexual fantasies, conduct difficulties, political views, or suicidal ideation if they know their words are being recorded, transcribed, AI-analyzed, and stored.
  • Self-Censorship and Performance: Clients may unconsciously (and consciously) shift how they present their thoughts and emotions if they feel they are being observed by an external system, rather than engaging in an organic, spontaneous therapeutic process.
  • Loss of the “Human-Only” Sanctuary: Traditional therapy relies on the understanding that the therapeutic dialogue is a private, confidential, and deeply human interaction. Given AI’s uncannily human-like qualities, its presence and “participation” during a session may introduce emotional confusion about what was once an exclusively human exchange.

The “Observer Effect”

A fundamental concept in psychology and science is that the presence of an observer changes the behavior of the observed. The idea has an analogue in quantum physics (the observer effect, often associated with Heisenberg’s uncertainty principle); in social psychology it is known as the Hawthorne effect. In the therapeutic context, the knowledge of being recorded and transcribed by an AI observer risks subtly (and perhaps not so subtly) altering the natural flow of dialogue between therapist and client. In addition to the factors noted earlier about the therapeutic setting, we might also consider the following:

  • Impact on Authentic Emotional Expression: Clients may hold back from emotional vulnerability if they know a digital system is “listening.” Instead of processing deep emotions, they may present a more curated version of themselves.
  • Therapist Adaptation: Clinicians, too, may change their behavior. Instead of focusing entirely on their client, they may begin to consciously and unconsciously edit their language, anxious about how the AI-generated documentation might be interpreted later. Anyone who has, usually in the context of training, had supervisors directly observe their sessions will recall the complex ways in which being observed altered their interventions and spontaneity.
  • Shift from Relational to Procedural Therapy: Therapy is an evolving, relational process that relies on trust and spontaneity. AI documentation introduces an intrusive third party into this intimate dynamic, potentially threatening the organic nature of the process – a process that is as yet not fully understood in all of its complexity.

The Security Risks of AI Recording and Transcription

AI documentation software often relies on cloud-based platforms, voice recognition algorithms, and third-party servers to record sessions, process them, and generate notes. This creates multiple layers of potential vulnerability:

  • Non-Secure Transmission of Sensitive Information: If the software transmits data over an unsecured network (unlikely nowadays but not impossible), client information could be intercepted, leading to HIPAA or PIPEDA violations.
  • Potential Data Breaches: Even if a company claims to encrypt and protect information, AI-powered platforms have become prime targets for hackers seeking to exploit sensitive data.
  • Lack of Control Over Data Storage: Little is known about how AI documentation services store recordings and transcripts and how long they retain them, even when clinicians delete their files. This raises concerns about long-term data security and whether clients truly have control over their personal information.
  • Regulatory Uncertainty: Because this technology is so new, there are currently no standardized regulations governing how AI recordings and transcriptions should be handled, secured, or ethically implemented. Clinicians are essentially entrusting sensitive data to an industry without clear regulation or oversight. Typically, these companies are not operated or controlled by individuals with clinical training, and thus may not be sensitive to the clinical and ethical implications of their technologies.

The Loss of Clinical Reflection and Conceptualization

One of the most overlooked consequences of AI-generated documentation software is the potential erosion of deep clinical thinking. Traditionally, therapists review and reflect on their sessions while writing their notes; though this is a difficult task, it allows for deeper understanding, elaboration, and clinical conceptualization.

  • The Act of Writing as a Processing Tool: Writing notes is not just an administrative task—it is a cognitive process that helps clinicians distill insights, track patterns, and develop meaningful treatment plans.
  • AI as a Shortcut to Superficiality: If therapists rely on AI to generate notes, they risk engaging with the session only at the surface level, losing valuable opportunities to contemplate the client’s journey more profoundly.
  • The Danger of “Mindless” Documentation: With AI-generated notes, therapists may become passive receivers of documentation rather than active participants in clinical interpretation, leading to a diminished sense of engagement with their own work.

Conclusion: A Step Too Far?

While AI-driven documentation software promises efficiency, the ethical trade-offs appear far too significant to ignore. I believe that the presence of an AI “listener” in the therapy room fundamentally alters the therapeutic process, inhibits client openness, distracts the clinician, and may pose serious risks to the security of sensitive clinical material. Moreover, it may deprive therapists of the invaluable opportunity to reflect more deeply on their sessions. Until there are clear, enforceable regulations and thorough ethical considerations in place, clinicians should be wary of incorporating this technology into their practice.

Psychotherapy thrives on trust, privacy, and human connection. If we allow AI to intrude upon these sacred spaces, we risk transforming therapy from a deep, relational process into a monitored, mechanized exchange—one where neither the client nor the therapist is fully present. I’m not sure the convenience is worth it.

Over the past few months, we have had a number of inquiries about whether Note Designer offers the recording and transcription of therapy sessions to create AI-generated notes. I usually just answer that we do not – in this post, I’m sharing with you some of the reasons behind that decision.

Though it could be argued that some of what I’ve mentioned (regarding the potential loss of clinical reflection resulting from reliance on therapy documentation software) may apply to Note Designer as well, we have tried to create a system that helps clinicians think more carefully about their clinical work. By offering a structure for thinking about sessions and numerous examples of possible interventions and clinical experiences, our hope is that Note Designer provides a tool to support rather than replace clinical thinking and reflection.

Levin, C., Furlong, A., & O’Neil, M. K. (Eds.). (2003). Confidentiality: Ethical perspectives and clinical dilemmas. Analytic Press.


Patricia C. Baldwin, Ph.D.
Clinical Psychologist
Note Designer Inc.
