
Why Therapists Need Local AI (And Why Cloud AI Is a Risk)

Mario Simic · 5 min read

Therapists, counsellors, and psychiatrists face a documentation burden that is among the highest of any professional group. Between session notes, treatment plans, progress reports, referral letters, and insurance correspondence, a full-time therapist can spend 2-4 hours per day on paperwork that is legally required but clinically non-productive. AI assistance with documentation could reclaim significant time for direct patient care. But the data involved makes the choice of AI tool a serious ethical and legal question.

Why Confidentiality Is Non-Negotiable

Therapy involves some of the most sensitive information a person will ever disclose: trauma history, relationship patterns, mental health diagnoses, substance use, suicidal ideation, family conflict. The therapeutic relationship depends on the client's absolute trust that this information goes nowhere beyond the clinical relationship. Breaching that trust, even inadvertently, even through a negligent choice of tools, can cause real harm to vulnerable people who may already struggle to trust.

In the United States, HIPAA's Privacy Rule classifies psychotherapy notes separately from other medical records, with even stricter protection requirements. In the EU, mental health data falls under GDPR's special category provisions, the highest tier of data protection. Both frameworks create clear obligations about how this data can be processed and by whom.

The Cloud AI Problem for Therapists

When a therapist pastes session notes into ChatGPT or uses a cloud AI tool to draft a treatment plan, they have transmitted protected health information (PHI) or special category personal data to a third party. For this to be lawful under HIPAA, OpenAI would need to be a Business Associate operating under a signed Business Associate Agreement (BAA). Under GDPR, there would need to be a lawful basis, a data protection impact assessment (DPIA), and likely explicit client consent. In practice, most therapists using cloud AI tools have none of these in place.

This is not a hypothetical legal risk. Several US mental health practices have already received compliance guidance from their professional associations specifically warning against using standard consumer AI tools with patient information. The NASW, APA, and similar bodies have issued ethics guidance touching directly on this issue.

What Local AI Makes Possible

With a local AI agent, session documentation stays on the therapist's device. The AI processes the notes, suggests structured documentation in the required format (SOAP, DAP, BIRP, or the therapist's preferred system), and produces drafts without any patient information leaving the clinical environment.
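To make the architecture concrete, here is a minimal sketch of what such a local call might look like against Ollama's REST API on its default port. The function name, model choice, and prompt wording are illustrative assumptions, not Skales' actual internals:

```python
# A minimal sketch of a local note-formatting call against Ollama's REST API
# (default endpoint http://localhost:11434). The model name, prompt wording,
# and function name are illustrative assumptions, not Skales internals.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"

def draft_soap_note(observations: str, model: str = "llama3.1") -> str:
    """Turn raw session observations into a SOAP-format draft, locally."""
    prompt = (
        "Format the following clinical observations as a SOAP note "
        "(Subjective, Objective, Assessment, Plan). Use only the "
        "information provided; do not invent details.\n\n" + observations
    )
    # stream=False returns the whole completion as a single JSON object
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```

Everything here targets localhost: no request crosses the network boundary, which is the entire point.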

The practical workflow: after a session, the therapist dictates or types key clinical observations. The local AI formats them into a proper clinical note, flags missing required elements (risk assessment documentation, treatment modality notes), and drafts the next treatment plan update based on the session's progress. The therapist reviews, modifies, and signs. What used to take 20-30 minutes now takes 5-10.
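The "flags missing required elements" step need not involve the model at all; a deterministic checklist pass over the draft can do it. A sketch, with hypothetical element names and keyword heuristics rather than any clinical or regulatory standard:

```python
# Hypothetical completeness check: flag required note elements that appear
# to be missing before the therapist reviews. The element names and keyword
# heuristics below are illustrative only.
REQUIRED_ELEMENTS = {
    "risk assessment": ("risk", "safety plan", "ideation"),
    "treatment modality": ("cbt", "dbt", "emdr", "psychodynamic", "modality"),
}

def missing_elements(note: str) -> list[str]:
    """Return the names of required elements with no matching keyword."""
    text = note.lower()
    return [
        name
        for name, keywords in REQUIRED_ELEMENTS.items()
        if not any(keyword in text for keyword in keywords)
    ]
```

A real tool would likely combine a deterministic pass like this with the model's own judgment, but the principle is the same: the check runs where the note lives.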

For insurance correspondence and referral letters (less sensitive than session notes but still confidential), the AI drafts from templates with session-specific information filled in, and the therapist approves before sending.
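A sketch of that template approach using Python's standard string.Template; every field value below is a placeholder, and a real letter would come from the practice's own approved templates:

```python
# Template-based drafting for a referral letter, standard library only.
# The template text and all field values are placeholders.
from string import Template

REFERRAL_TEMPLATE = Template(
    "Dear $recipient,\n\n"
    "I am referring $client_initials for $reason.\n"
    "Current treatment focus: $focus.\n\n"
    "Sincerely,\n$therapist"
)

draft = REFERRAL_TEMPLATE.substitute(
    recipient="Dr. Example",
    client_initials="J.D.",
    reason="a psychiatric medication evaluation",
    focus="CBT for generalized anxiety",
    therapist="A. Clinician, LMFT",
)
print(draft)  # the therapist reviews and approves before anything is sent
```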

Skales runs entirely locally with Ollama. Patient data never leaves the device. Read the full therapist use case or see the privacy architecture.

Try it yourself 🦎

Skales is free for personal use. No Docker. No account.

Download Free →