APA’s New Ethical Guidance for Using AI in Clinical Practice: What We Need to Know
The American Psychological Association (APA) has released an important and timely resource: Ethical Guidance for AI in the Professional Practice of Health Service Psychology (APA, June 2025). As artificial intelligence becomes more common in clinical work, psychologists and other mental health providers are being asked to balance innovation with ethics. This new guidance helps clarify what responsible AI use should look like in real-world settings, offering a framework that supports both client care and professional integrity.
At Note Designer, we fully support the APA’s leadership in this area. We are encouraged to see thoughtful direction that protects both clients and clinicians as the profession navigates this rapid technological change. The APA’s guidance brings much-needed clarity, and we are proud to say that our own approach is aligned with their ethical recommendations.
APA’s Key Ethical Considerations
The APA’s document is available at:
https://www.apa.org/topics/artificial-intelligence-machine-learning/ethical-guidance-professional-practice.pdf
(Ethical Guidance for AI in the Professional Practice of Health Service Psychology, APA, June 2025)
Here I will review some of the highlights from the document, with a particular focus on what they imply for using AI in clinical documentation.
Transparency and Informed Consent
First off, it’s important to know that if you plan to use an AI scribe to record and transcribe sessions with your patients, or to use AI to guide your treatment decisions, you clearly need your patients’ informed consent. This includes having an AI scribe write your progress notes from a session transcription, or propose a treatment plan based on what it gleaned from the session. The APA guidelines make an important distinction between these uses and the use of AI as a writing tool: “While certain AI use cases may be considered more subtle and/or innocuous (e.g., using predictive text when writing provider notes), others may be considered more substantial and would require greater discussion and disclosure with patients/clients (e.g., a health care system using AI to determine the best mental health treatment approach for an individual).” Note Designer falls into the innocuous writing-tool category, as I will discuss further below.
Mitigating Bias and Promoting Equity
APA highlights that AI systems should be evaluated for bias and for their potential to exacerbate existing healthcare disparities: “Responsible AI development considers the full range of lived experiences to avoid unfair discrimination.”
If we allow an AI system to take over the writing process and compose our treatment documentation from a session transcription, we are at heightened risk of introducing bias and inaccuracy into the patient record, including cultural, racial, gender, and theoretical bias. This is why it is extremely important that the clinician remain in full control of what is entered into the documentation. Note Designer’s AI system has the clinician direct the content of their notes, while the AI simply improves the grammar and flow of the narrative output. This helps mitigate bias and allows the clinician to ensure that treatment equity is reflected in the clinical record.
Data Privacy and Security
When it comes to clinical documentation, it is very clear that data security and privacy need to be front and center. APA emphasizes: “AI systems handling sensitive behavioral data pose risks related to privacy breaches and unethical data use.”
Clinicians should also give careful attention to the ethics of who may access their clinical records and the PHI within them. Note Designer’s software is not only secure and consistent with HIPAA guidelines; we also ensure that no identifying client data is sent, stored, or used to train any AI model. In addition, our AI system is not powered by a third party. Instead, we manage a private instance of an open-source model hosted on our HIPAA-compliant server space. Because we manage the AI model ourselves, no third party has access to any client data, and no information is ever shared externally.
Accuracy and Misinformation Risks
APA is firm about the critical importance of accuracy, stressing that psychologists should “critically evaluate AI-generated content, both at the start of use and in ongoing applications…Upholding the Ethical Principle of Integrity, psychologists take responsibility for the quality of information used in their practice.”
The more authorship we hand to an AI system in writing our progress notes, the greater the risk of entering inaccuracies into the treatment record. With scribe-based AI systems, for instance, this risk is inherently heightened because the AI is given free rein to interpret, and possibly embellish, what it has recorded. Note Designer’s AI is less prone to such hallucinations because it was built to closely follow the inputs selected by the clinician and has been rigorously tested by our clinical team for accuracy.
Human Oversight and Professional Judgment
I could not agree more with APA’s clear statement that “AI should augment, not replace, human decision-making.”
At Note Designer, the clinician actively selects the content that gets entered into their notes, while all AI-generated content remains fully editable; clinicians are encouraged to review, revise, and apply their own clinical judgment to every note. The software does not offer diagnostic or treatment recommendations, nor does it make decisions on behalf of the provider. This structure ensures that professional responsibility and final decision-making remain solely in the hands of the psychologist, in full alignment with the APA’s ethical principles. Quite disturbingly, AI-scribe technology that creates documentation from therapy transcripts essentially eliminates the clinician’s own judgment, clinical know-how, and expertise – something I find deeply concerning as a clinician.
Liability and Ethical Responsibility
APA also alerts clinicians that “psychologists should consider liability risks related to AI tool selection.”
In addition to liability concerns, clinicians have an ethical responsibility to ensure their documentation accurately and sensitively reflects their work with their patients. What gets entered into a patient’s record, and what gets withheld, must remain in the hands of the individual clinician, who is ultimately responsible for the treatment. Clinicians need to be aware not only of the risk of AI inaccuracies but also of the ways in which AI may misrepresent a patient’s struggles and their treatment. The use of AI scribes takes this control out of the clinician’s hands, potentially exposing them to significant liability. To address liability and ethical responsibility, Note Designer offers transparent AI functionality that is optional and clearly explained. No sensitive client data is stored or used to train the AI, and clinicians can choose when and how to use AI tools within their workflow. Because Note Designer does not record sessions or rely on passive data collection, users maintain full awareness of how the tool operates. This approach supports clinicians in meeting both their ethical and legal responsibilities in accordance with APA guidance.
How Note Designer Embodies Ethical AI
As an APA member myself and someone who has worked in the area of professional ethics, I am deeply invested in the ethical use of AI in the mental health field. We built our AI features informed by the same ethical values that the APA has now made explicit: transparency, user control, and security.
Here’s a summary of how we are aligned:
- Clinician Control: AI use is always optional – in fact, you can still create great notes with Note Designer without using AI at all. The clinician decides when and how to apply it within their workflow.
- Privacy by Design: No identifying client information is ever sent to the AI. Patient details from Headings remain entirely within your browser.
- No Recording or Listening: We never record sessions and our AI does not transcribe or eavesdrop on therapy sessions. AI is used only to re-write your notes.
- Privacy Protection: No identifying client data is sent, stored, or used to train any AI model. Our systems are hosted securely and meet HIPAA-compliant standards. Our AI runs on a private instance of an open-source model that we manage ourselves on our HIPAA-compliant server space, so no third party ever has access to client data and no information is shared externally.
- Bias Awareness: We have tested and continue to monitor our model to minimize bias and improve clinical appropriateness.
- Human-Centered Design: Our tools enhance efficiency while keeping the therapist’s voice and judgment at the center of the note-writing process.
- Clear Communication: We are transparent about what our tools can and cannot do, and we welcome user feedback to guide further development.
The APA reminds us that ethical AI begins with a commitment to human care. We believe technology should serve clinicians, not replace them. Note Designer is founded, built, and operated by fellow mental health professionals with these values at the core of every feature we create.
To learn more about my perspective on the use of AI in the mental health field, check out my other blogs:
https://notedesigner.com/what-we-lose-when-we-outsource-our-minds/
https://notedesigner.com/note-designers-ethical-ai/
Source: American Psychological Association. (2025, June). Ethical Guidance for AI in the Professional Practice of Health Service Psychology. https://www.apa.org/topics/artificial-intelligence-machine-learning/ethical-guidance-ai-professional-practice
By Patricia C. Baldwin, Ph.D.
Clinical Psychologist
Co-Founder Note Designer Inc.
Author of
👩🏻‍💻 Note Designer: A simple step-by-step guide to writing your psychotherapy progress notes (2nd Edition – updated and expanded); 2023.