Therapists cannot safely use cloud AI for session notes: patient data in API calls creates HIPAA exposure
Tags: ai, healthcare, privacy
A solo-practice therapist who wants an LLM to help structure session notes, generate treatment plans, or draft insurance pre-authorization letters cannot send patient details through the OpenAI or Anthropic APIs without a Business Associate Agreement (BAA). Even with a BAA, every API call places a copy of PHI on a third-party server, where it becomes a standing breach liability. If a therapist pastes 'Patient reports suicidal ideation following divorce from [name]' into ChatGPT, that data may persist in provider logs indefinitely, and a single breach can trigger HIPAA penalties of up to $50,000 per violation, enough to bankrupt a solo practice.

An on-device Gemma 4 model, fine-tuned on clinical documentation templates, processes the note entirely on the therapist's phone: PHI never leaves the device, no BAA is needed, and no server logs exist. The therapist gets AI-assisted documentation without creating a regulatory time bomb. The structural requirement is that the model run where the data already lives: on the clinician's own hardware.
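A minimal sketch of the on-device pattern described above, assuming `llama-cpp-python` is installed and a local GGUF export of a Gemma-family model exists at `MODEL_PATH`. The model filename, SOAP prompt template, and sampling parameters are all illustrative assumptions, not anything the source prescribes; the point is only that inference happens against a local file, with no network call and no third-party log.

```python
# Hedged sketch: structuring a session note entirely on local hardware.
# MODEL_PATH is a hypothetical local file; nothing here touches a network.
MODEL_PATH = "gemma-clinical.gguf"

# Illustrative clinical-documentation template (SOAP format is an assumption).
SOAP_TEMPLATE = (
    "Rewrite the raw session note below into SOAP format "
    "(Subjective, Objective, Assessment, Plan). Keep all clinical detail.\n\n"
    "Raw note:\n{note}\n\nSOAP note:\n"
)


def build_prompt(raw_note: str) -> str:
    """Embed the raw note in the structuring prompt (pure string work, no I/O)."""
    return SOAP_TEMPLATE.format(note=raw_note.strip())


def structure_note(raw_note: str, model_path: str = MODEL_PATH) -> str:
    """Run the prompt through a local model; PHI never leaves the device."""
    from llama_cpp import Llama  # local inference only, no API key, no server

    llm = Llama(model_path=model_path, n_ctx=2048, verbose=False)
    out = llm(build_prompt(raw_note), max_tokens=512, temperature=0.2)
    return out["choices"][0]["text"].strip()


if __name__ == "__main__":
    print(structure_note("Pt reports improved sleep; continues weekly CBT."))
```

The design choice worth noting: because the model weights and the note sit on the same machine, there is no data processor in the loop, so the BAA question never arises in the first place.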
Evidence
https://www.hipaajournal.com/when-ai-technology-and-hipaa-collide/