Domestic violence survivors cannot safely use cloud AI because API logs become court-discoverable evidence

When a DV survivor asks ChatGPT or Claude for help drafting a safety plan, finding shelters, or understanding custody law, those queries are logged on company servers and can be subpoenaed in custody or divorce proceedings. An abuser's attorney can compel OpenAI or Anthropic to produce chat logs showing that the survivor was planning to leave, which is then reframed as 'premeditation' or 'parental alienation' in court. The survivor is forced to choose between getting no AI help at all and creating a discoverable paper trail that the abuser's lawyer will weaponize.

On-device models such as Gemma running locally on a phone produce no server logs, require no account creation, and leave no network trace; the conversation exists only in local memory that the survivor can wipe instantly. This is not a 'nice to have' privacy feature. It is the difference between a survivor safely planning an exit and an abuser's attorney presenting Exhibit A.

Evidence

https://nnedv.org/latest_update/new-openai-court-order-raises-serious-concerns-about-ai-privacy-and-safety-for-survivors-of-abuse/
