Compliance Guide

GDPR & AI: The complete compliance guide

How to use AI on personal data without violating Article 28. What your DPO needs to know about data processing, residency, and hardware-sealed alternatives.

The problem: AI and personal data

67% of employees use AI tools on sensitive data. They paste contracts, medical records, financial reports, and client communications into tools like ChatGPT, Claude, and Gemini every day. Most of these tools are hosted in US datacenters, process data outside the EU, and operate under terms of service that do not meet GDPR Article 28 requirements for data processors.

The gap between how people use AI and what the law requires has grown rapidly. Employees adopt AI tools faster than compliance teams can evaluate them. The result: widespread, uncontrolled processing of personal data by third-party AI providers who were never vetted as data processors under GDPR.

For regulated industries — law firms, accountants, healthcare providers, financial services — this is not an abstract risk. It is an active compliance violation with concrete consequences: fines of up to 4% of annual worldwide turnover, regulatory orders to cease processing, and reputational damage that no insurance policy covers.

GDPR Article 28 explained

Article 28 governs the relationship between data controllers (your organization) and data processors (anyone who processes personal data on your behalf). When your employees use an AI tool to analyze documents containing personal data, the AI provider becomes a data processor. Article 28 requires a binding contract that includes specific obligations:

  • Process personal data only on documented instructions from the controller
  • Ensure all persons processing the data have committed to confidentiality
  • Assist the controller with data subject rights requests (access, erasure, portability)
  • Delete or return all personal data after processing ends
  • Make available all information necessary to demonstrate compliance
  • Submit to audits and inspections by the controller or an appointed auditor

Key requirement: Article 28(1) states the controller shall use only processors providing “sufficient guarantees to implement appropriate technical and organisational measures.” For AI providers, this means the hosting infrastructure itself must enforce data protection — not just a written policy.

Why ChatGPT and similar tools fall short

OpenAI, Anthropic, and Google all publish Data Processing Agreements. That is a necessary first step. But a DPA alone does not satisfy Article 28 if the underlying technical measures are insufficient. Here are the specific gaps:

  • Data processed in US datacenters. Post-Schrems II, transfers to the US require additional safeguards that most standard API agreements do not provide.
  • Operator access is not hardware-restricted. Even with a signed DPA, infrastructure staff can technically access memory contents during processing.
  • Training opt-out is a policy toggle, not hardware-enforced. There is no cryptographic proof that your data was excluded from training.
  • No on-chain or hardware attestation of data handling. You rely on contractual promises, not verifiable technical guarantees.

This does not mean these tools are unusable. For non-personal, non-sensitive data, they may be perfectly appropriate. But when processing personal data subject to GDPR — client files, patient records, employee data, financial information — the technical guarantees fall short of what Article 28 requires.

What a compliant AI setup looks like

A GDPR-compliant AI deployment for personal data processing requires both legal agreements and technical enforcement. The legal layer (DPA, SCCs) is necessary but not sufficient. The technical layer must make it physically impossible for the processor to access the data. Here is what that looks like:

EU-hosted data processing

All computation occurs within EU-governed datacenters. No transatlantic data transfers.

Hardware-sealed execution

Intel TDX or AMD SEV enclaves ensure the hosting provider physically cannot read data in memory.

Zero retention after processing

Data exists in encrypted memory only during inference. Destroyed immediately after the response.

DPA with Art. 28 clauses

A Data Processing Agreement that specifically references GDPR Article 28 obligations, not a generic privacy policy.

Hardware attestation

Cryptographic proof — signed by the CPU — that the enclave was genuine and unmodified. Auditable by the controller.

Audit capability

The processor must allow inspections. Hardware attestation makes remote audits technically verifiable.

Several providers now offer some or all of these capabilities, including Microsoft Azure Confidential Computing, Google Cloud Confidential VMs, and VoltageGPU. The key evaluation criterion: can the provider prove — with hardware evidence, not just a contract — that they cannot access your data?

Hardware-sealed computing explained

Traditional encryption protects data in two states: at rest (stored on disk) and in transit (moving across networks). But during processing, data must be decrypted in memory. Anyone with root access to the host machine — including the cloud provider's staff — can read that memory.

Hardware-sealed computing (also called confidential computing) adds a third layer: encryption in use. Technologies like Intel Trust Domain Extensions (TDX) and AMD Secure Encrypted Virtualization (SEV) create isolated memory regions — called enclaves or trust domains — where data remains encrypted even while the CPU processes it.

How Intel TDX works: The CPU generates and manages encryption keys that are inaccessible to all software — including the hypervisor, BIOS, and operating system. Memory is encrypted with AES-256 at the hardware level. Protected PCIe ensures data moving to and from the GPU is also encrypted. The result: even the datacenter operator with physical access to the machine cannot read the data.
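The trust model can be sketched in a few lines. The stand-in below uses HMAC from Python's standard library in place of the CPU's attestation key (real TDX quotes are ECDSA signatures verified against Intel's certificate chain, so the verifier needs only a public key, not a shared secret); every name here is illustrative.

```python
import hmac
import hashlib

# Stand-in for a signing secret that never leaves the CPU. In real TDX,
# this is an asymmetric key rooted in hardware, not a software constant.
_CPU_SECRET = b"fused-into-silicon"  # illustrative only

def cpu_sign(report: bytes) -> bytes:
    # Stand-in for the hardware signing the enclave's measurement report.
    return hmac.new(_CPU_SECRET, report, hashlib.sha384).digest()

def verify_report(report: bytes, tag: bytes) -> bool:
    # The controller checks the signature before trusting the enclave.
    return hmac.compare_digest(cpu_sign(report), tag)

report = b"TD measurement=... debug=off tcb=up-to-date"
tag = cpu_sign(report)

assert verify_report(report, tag)             # genuine report accepted
assert not verify_report(report + b"X", tag)  # tampered report rejected
```

The point the sketch captures: if the report (and therefore the enclave's configuration) changes in any way, verification fails, so the controller can detect a modified or fake environment before sending it personal data.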

Why this matters for GDPR: Article 28 requires “appropriate technical measures.” Hardware-sealed computing is the strongest technical measure currently available because it removes the need to trust the processor. The CPU enforces isolation — not a policy, not a contract, but physics and silicon.

This is not experimental technology. Intel TDX is supported by the Confidential Computing Consortium (members include Intel, AMD, NVIDIA, Microsoft, Google, and ARM) and is deployed in production by major cloud providers.

Checklist for your DPO

Before approving any AI tool for processing personal data, your Data Protection Officer should verify the following. Each item maps directly to a GDPR requirement.

  • Is the AI provider a registered EU entity or subject to EU jurisdiction?
  • Does the Data Processing Agreement specifically reference GDPR Article 28?
  • Is all data processing performed in EU datacenters?
  • Can the provider prove they physically cannot access the data during processing?
  • Is there zero data retention after processing completes?
  • Is hardware attestation available for independent verification?
  • Can your organization audit the provider, either on-site or via remote attestation?

If the answer to any of these is “no” or “unclear,” the tool should not be approved for processing personal data until the gap is resolved. Document your assessment — GDPR accountability (Art. 5(2)) requires demonstrable due diligence.
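As a sketch, the checklist works as a strict gate: a single "no" or "unclear" blocks approval. The class and field names below are illustrative, not drawn from any regulation or assessment tool.

```python
from dataclasses import dataclass

@dataclass
class AIVendorAssessment:
    # One boolean per checklist question; "unclear" should be recorded as False.
    eu_jurisdiction: bool
    dpa_references_art28: bool
    eu_processing_only: bool
    hardware_sealed: bool
    zero_retention: bool
    attestation_available: bool
    auditable: bool

    def approved_for_personal_data(self) -> bool:
        # Art. 28 due diligence: every answer must be an unambiguous "yes".
        return all(vars(self).values())

vendor = AIVendorAssessment(
    eu_jurisdiction=True, dpa_references_art28=True, eu_processing_only=True,
    hardware_sealed=False,  # operator can still read memory -> gap
    zero_retention=True, attestation_available=False, auditable=True,
)
assert not vendor.approved_for_personal_data()
```

Keeping the assessment in a structured record like this also supports the Art. 5(2) accountability requirement: the answers, and the date they were verified, become part of the documented due diligence.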

Frequently asked questions

Does using ChatGPT Enterprise solve the GDPR problem?

Partially. ChatGPT Enterprise offers a DPA and disables training on your data by default. However, data is still processed in US datacenters, the operator can technically access memory contents, and there is no hardware attestation. For regulated industries handling client personal data, this often falls short of Art. 28 requirements.

What is the difference between software encryption and hardware-sealed computing?

Software encryption protects data at rest and in transit, but during processing the data must be decrypted in memory — where the hosting provider can access it. Hardware-sealed computing (Intel TDX, AMD SEV) encrypts data even while it is being processed. The CPU enforces isolation. Not even root access on the host machine can read the enclave memory.

Is self-hosting an open-source model (Llama, Mistral) GDPR compliant?

Self-hosting can be GDPR compliant if you control the infrastructure, implement appropriate security measures, and document everything. The challenge is operational: you must maintain the hardware, apply security patches promptly, handle attestation yourself, and sign your own DPAs with any infrastructure providers. For most organizations, this is prohibitively expensive.

What fines can result from non-compliant AI data processing?

Under GDPR Article 83, violations of processor obligations (Art. 28) can result in fines up to 10 million euros or 2% of annual worldwide turnover, whichever is higher. Violations of data transfer rules (Chapter V) can attract fines up to 20 million euros or 4% of annual turnover. Beyond fines, regulators can order processing to cease entirely.
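The "whichever is higher" rule means the percentage cap dominates for large companies. A quick illustration (the fixed amounts and percentages are the Art. 83 statutory caps; the turnover figure is hypothetical):

```python
def max_fine(turnover_eur: float, fixed_eur: float, pct: float) -> float:
    # GDPR Art. 83: the cap is the higher of a fixed amount and a
    # percentage of total worldwide annual turnover.
    return max(fixed_eur, pct * turnover_eur)

turnover = 2_000_000_000  # hypothetical: EUR 2bn worldwide annual turnover

# Art. 83(4) tier (processor obligations, incl. Art. 28): EUR 10M or 2%
assert max_fine(turnover, 10_000_000, 0.02) == 40_000_000

# Art. 83(5) tier (incl. Chapter V transfer violations): EUR 20M or 4%
assert max_fine(turnover, 20_000_000, 0.04) == 80_000_000
```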

Does the EU AI Act change anything about GDPR compliance for AI tools?

The EU AI Act adds requirements on top of GDPR. It introduces risk categories for AI systems, transparency obligations, and conformity assessments for high-risk uses. However, it does not replace GDPR. You still need full Art. 28 compliance for any AI system processing personal data, regardless of its AI Act risk classification.

Can a US-based AI provider be GDPR compliant?

In principle, yes — if they use Standard Contractual Clauses, implement supplementary measures (such as encryption the provider cannot break), and undergo transfer impact assessments. In practice, the Schrems II ruling makes this difficult because US surveillance laws (FISA 702, EO 12333) can compel data disclosure. Hardware-sealed processing in EU datacenters sidesteps the problem: the data never leaves EU jurisdiction, and the provider cannot read it even if compelled.

Get the GDPR AI Compliance Checklist

Download the 10-point checklist as a PDF. Share it with your DPO, compliance team, or IT department.


VoltageGPU is one solution built for this: a French company running Intel TDX hardware, with GDPR Article 28 compliance by design, meeting all seven criteria above. Try free.