Federal Ruling · Feb. 2026 · United States v. Heppner, No. 25 CR. 503 (JSR) (S.D.N.Y.): Judge Rakoff held that submitting client information to a consumer AI platform constitutes third-party disclosure, destroying attorney-client privilege. Both ChatGPT and Claude were cited by name.
⚖ Legal AI Infrastructure

Client confidentiality
isn't a setting.
It's architecture.

A federal court has now confirmed what careful practitioners already suspected: submitting client information to a public AI platform is disclosure to a third party. Privilege dies, and confidentiality obligations are implicated.

eRacks builds on-premise AI servers where client data, case files, and work product never leave your firm's infrastructure — not to OpenAI, not to Anthropic, not to anyone.

Trusted by law firms and in-house legal departments across the United States.

U.S. District Court, Southern District of New York · Feb. 17, 2026
United States v. Heppner, No. 25 CR. 503 (JSR)
Court's Holding
"Heppner could have had no reasonable expectation of confidentiality in his communications with Claude. He disclosed it to a third party, in effect AI, which had no obligation of confidentiality."
— Judge Jed S. Rakoff
⚠️ Consumer AI = Third Party. ChatGPT, Claude, Gemini (consumer tiers) are third parties for privilege purposes.
⚠️ Sending to counsel afterward doesn't cure it. Pre-existing unprivileged documents don't become privileged when forwarded to a lawyer.
⚠️ Applies to clients, not just attorneys. Any client using consumer AI on legal matters may waive privilege over underlying communications.
The court noted enterprise/on-premise tools may differ — where confidentiality is architecturally guaranteed, the analysis changes.
Risk comparison

Consumer AI vs. enterprise AI vs.
on-premise — the privilege analysis

Not all AI deployment models carry the same legal risk for your firm and your clients. Here is how they compare under the Heppner framework.

| Factor | Consumer AI (ChatGPT Free/Plus, Claude Pro) | Enterprise AI (with contractual BAA/NDA) | eRacks On-Premise AI |
|---|---|---|---|
| Client data leaves the firm | ✕ Yes — every prompt | ⚠ Yes — encrypted, contractual controls | ✓ Never — local only |
| Third-party disclosure risk (Heppner) | ✕ High — confirmed by court | ⚠ Reduced — depends on contract | ✓ Eliminated architecturally |
| Used for model training | ✕ Yes (default on most tiers) | ⚠ Typically no — verify BAA | ✓ Never — open-source, local model |
| Can be subpoenaed / seized | ✕ Yes — Heppner FBI seizure | ⚠ Possible with legal process | ✓ Only through firm's own infrastructure |
| Reasonable expectation of confidentiality | ✕ None per Judge Rakoff | ⚠ Partial — contractual only | ✓ Full — no external party involved |
| Model Rule 1.1 technology competence | ✕ At risk | ⚠ Satisfiable with documentation | ✓ Satisfies confidentiality obligation |
| Monthly cost | $20–$30/user | $30–$60+/user | $0/month after purchase |
Data flow analysis

What happens to client data
in each scenario

⚠ Cloud AI — Privilege at Risk
Attorney types client information into ChatGPT, Claude, or Copilot
Data transmits to vendor servers — OpenAI, Anthropic, Microsoft
Vendor's privacy policy applies: inputs may be used for training or disclosed to government authorities
No reasonable expectation of confidentiality — per Heppner
FBI can seize AI-generated documents. Opposing counsel can seek them in discovery.
✓ eRacks On-Premise — Privilege Protected
Attorney types client information into firm's private Open WebUI interface
Request stays inside firm's LAN — routed to eRacks server in your server room
Inference runs on firm's own GPU — no third-party custody, no external transmission
No vendor ever sees the data — there is no "third party" to whom disclosure occurs
Confidentiality is preserved at the architectural level, not just contractually
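The "stays inside the firm's LAN" step above can even be enforced in software, not just by network design. Here is a minimal illustrative sketch (the hostnames, port, and API path are hypothetical examples, not an eRacks or Open WebUI API): before a prompt is sent anywhere, the firm's tooling can verify that the inference endpoint resolves to private address space.

```python
import ipaddress
from urllib.parse import urlparse

def endpoint_is_local(url: str) -> bool:
    """Return True only if the inference endpoint is a private (RFC 1918)
    or loopback address, i.e. traffic stays inside the firm's own network."""
    host = urlparse(url).hostname
    try:
        addr = ipaddress.ip_address(host)
    except ValueError:
        # Hostname rather than a literal IP — a production check would
        # resolve it and test each resulting address the same way.
        return False
    return addr.is_private or addr.is_loopback

# Example: a hypothetical on-prem server address vs. a public address.
print(endpoint_is_local("http://192.168.10.5:11434/api/chat"))      # True
print(endpoint_is_local("http://8.8.8.8/v1/chat/completions"))      # False
```

A guard like this makes the architectural claim testable: any misconfiguration that would route a prompt to an external service fails closed before client data leaves the building.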
Legal use cases

What law firms run on their
eRacks AI servers

Litigation
Document Review & Summarization

Summarize discovery documents, depositions, and case files. Identify key facts, dates, and inconsistencies across hundreds of pages in minutes.

Corporate / Transactional
Contract Review & Redlining

Flag non-standard clauses, missing provisions, and deviation from firm templates. Accelerate first-pass review without sending contracts outside the firm.

All Practice Areas
Brief & Motion Drafting

Generate first-draft arguments, structure briefs, and synthesize research into motion sections — with your client facts staying on your server.

M&A / Due Diligence
Diligence Summarization

Process data rooms and due diligence files into structured summaries, issue lists, and risk memos — without sending sensitive M&A information to cloud services.

Client Service
Client Communication Drafts

Draft status updates, advice letters, and client-facing summaries from case notes. Maintain consistent voice and tone while saving attorney time.

Knowledge Management
Firm Knowledge Base

Query your own precedent library, standard forms, and prior work product through a private RAG system — your institutional knowledge, instantly searchable.
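To make the RAG idea concrete, here is a deliberately tiny sketch of the retrieval step. A real deployment would use an embedding model and a vector store running on the firm's own server; the keyword-overlap scoring and document names below are simplified, hypothetical stand-ins.

```python
# Minimal sketch of the retrieval step in a private RAG pipeline.
# Production systems replace score() with embedding similarity and the
# dict with a local vector store; everything here runs in-process.

def score(query: str, doc: str) -> float:
    """Crude relevance score: fraction of query words found in the document."""
    q = set(query.lower().split())
    d = set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0

def retrieve(query: str, library: dict, k: int = 2) -> list:
    """Return the names of the k documents most relevant to the query."""
    ranked = sorted(library, key=lambda name: score(query, library[name]),
                    reverse=True)
    return ranked[:k]

# Hypothetical precedent library held entirely in firm infrastructure.
library = {
    "nda-template.txt": "standard mutual nda with two year confidentiality term",
    "merger-checklist.txt": "due diligence checklist for merger transactions",
    "engagement-letter.txt": "engagement letter template with fee schedule",
}

print(retrieve("confidentiality term in our standard nda", library, k=1))
# → ['nda-template.txt']
```

The retrieved documents are then passed to the local model as context for the answer, so both the query and the firm's work product stay on the server.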

"After the Heppner ruling, we couldn't continue using cloud AI for client matters. eRacks had us running a private server within two weeks. Our attorneys use it every day — client files stay inside our walls."
Managing Partner · Mid-size litigation firm · eRacks customer
Privilege, ethics & technical questions

What attorneys ask before
making a decision

What exactly did the Heppner court hold about AI and privilege?

In United States v. Heppner, No. 25 CR. 503 (JSR) (S.D.N.Y. Feb. 17, 2026), Judge Jed Rakoff held that documents generated using a consumer AI platform were not protected by attorney-client privilege or the work-product doctrine. The court identified two independent grounds: (1) an AI platform is not an attorney and cannot form an attorney-client relationship; (2) there was no reasonable expectation of confidentiality because Anthropic's privacy policy expressly permits data use for model training and disclosure to government authorities. The court noted that enterprise tools with contractual confidentiality protections may present a different analysis — and on-premise deployment, where no third party ever receives the data, resolves the concern entirely at the architectural level.

Does this ruling apply only to criminal cases?

The holding arose in a criminal context, but the underlying privilege analysis — that disclosing information to a consumer AI platform destroys confidentiality — applies equally in civil litigation and regulatory matters. Multiple state bars have issued guidance cautioning lawyers about using consumer AI with client data, citing confidentiality obligations under Model Rule 1.6. The Heppner decision is the first federal court ruling on this issue but will be broadly cited as persuasive authority. Practitioners should assume the privilege risk exists in any context.

What about enterprise versions of ChatGPT or Claude?

Enterprise agreements change the contractual landscape — vendors agree not to train on your data and provide stronger confidentiality terms. The Heppner court explicitly noted these tools may present a different privilege analysis. However, even with enterprise contracts: (1) client data still travels to the vendor's servers; (2) you are dependent on the vendor's security posture and compliance; (3) the vendor can still be served with legal process. On-premise eliminates all three risks simultaneously. For the highest-sensitivity client matters, on-premise is the only architecture where confidentiality is guaranteed by physics, not contract.

How does this satisfy Model Rule 1.1's technology competence requirement?

Model Rule 1.1 Comment 8 requires attorneys to understand "the benefits and risks associated with relevant technology." Using consumer AI with client data — now that Heppner has confirmed the privilege risk — creates a competence issue if attorneys proceed without appropriate safeguards. An on-premise AI server directly satisfies the confidentiality preservation obligation: there is no third-party transmission, no training risk, and no exposure to government process through the AI vendor. Several state bar associations have issued guidance specifically recommending attorney-directed, non-consumer AI tools for sensitive client work.

Can the AI be directed by counsel, as the Heppner court suggested matters?

Yes — and this is an important nuance. The Heppner court noted that if counsel had directed the client to use the AI tool for the research, the work-product analysis might have differed. On an eRacks firm server, all AI use is under the firm's infrastructure and effectively under the direction of the supervising attorneys. This strengthens work-product arguments significantly compared to clients using consumer tools on their own initiative. For formal work-product protection, document that AI use is attorney-directed by building it into your matter intake and retainer documentation.

Ready to protect client confidentiality

Your clients' information.
Your server. Your control.

Tell us your firm size, practice areas, and primary workflows. We'll spec the right AI server configuration and provide a full quote — typically within one business day. eRacks has served legal customers for over 20 years.