Healthcare generates enormous amounts of documentation: clinical notes, referral letters, discharge summaries, patient communications. Most of it is written by highly trained professionals doing work that is far below their skill level. AI assistance with documentation could reclaim hours per day per clinician. But the data involved is uniquely sensitive, and the compliance requirements are uniquely strict.
Why Cloud AI and Healthcare Don't Mix
In the EU, patient data falls under GDPR's special category provisions, some of the strictest data protection rules in the legislation. Processing patient health information through a cloud AI service without explicit patient consent, a DPIA (data protection impact assessment), and a compliant DPA (data processing agreement) is almost certainly illegal under EU law. In the US, HIPAA imposes strict requirements on Business Associates, and cloud AI providers generally do not qualify without specific healthcare agreements that most have not put in place for individual clinicians.
What Local AI Makes Possible
With local AI, the clinical note stays on the clinician's computer. The model processes it, suggests structured documentation, flags missing information, and formats output for the patient record system, all without the data ever leaving the device. A GP can dictate notes verbally after a consultation and get a draft SOAP note in seconds. A specialist can have AI draft a referral letter from their bullet-point notes. An administrator can have routine patient communications drafted from templates with appointment-specific details filled in.
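To make the "nothing leaves the device" point concrete, here is a minimal sketch of drafting a SOAP note against a locally running Ollama server on its default port. The prompt wording and model name are illustrative assumptions, not Skales internals; the only network traffic is to localhost.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_soap_prompt(dictated_notes: str) -> str:
    """Wrap raw dictated notes in an instruction to produce a SOAP-structured draft."""
    return (
        "Rewrite the following dictated consultation notes as a draft SOAP note "
        "with Subjective, Objective, Assessment, and Plan sections. "
        "Mark any section with missing information as [INCOMPLETE].\n\n"
        f"Notes:\n{dictated_notes}"
    )

def draft_soap_note(dictated_notes: str, model: str = "llama3") -> str:
    """Send the prompt to the local Ollama server; the note never leaves the machine."""
    payload = json.dumps({
        "model": model,
        "prompt": build_soap_prompt(dictated_notes),
        "stream": False,  # return one complete response instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires a running Ollama instance with the model pulled):
#   draft = draft_soap_note("45yo male, 3 days sore throat, no fever, tonsils red")
```

Because the endpoint is bound to localhost, no DPA with a cloud provider is needed for this step: the patient data never crosses the network boundary.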
Skales stores all data at ~/.skales-data. With a local Ollama model connected, nothing leaves the machine, and voice input via Whisper works offline. For healthcare professionals who need AI assistance without compliance compromise, this architecture is uniquely suited. Read about Skales for healthcare workers.
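The fully offline flow described above, Whisper for speech-to-text and local storage of the resulting draft, can be sketched as follows. This is an illustration under stated assumptions (the open-source `whisper` package and the data directory named above), not Skales's actual internals:

```python
from pathlib import Path

# Assumption for illustration: drafts live under the data directory mentioned above.
DATA_DIR = Path.home() / ".skales-data"

def transcribe_dictation(audio_path: str) -> str:
    """Transcribe locally with Whisper; the audio is never uploaded anywhere."""
    import whisper  # openai-whisper; runs fully offline once the model is downloaded
    model = whisper.load_model("base")
    return model.transcribe(audio_path)["text"]

def save_draft(patient_id: str, draft: str) -> Path:
    """Persist the draft note under the local data directory only."""
    DATA_DIR.mkdir(exist_ok=True)
    out = DATA_DIR / f"{patient_id}-draft.txt"
    out.write_text(draft, encoding="utf-8")
    return out
```

Every step in this chain touches only local disk and local processes, which is what keeps the workflow outside the scope of cloud-processing rules in the first place.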