How to use AI on personal data without violating Article 28. What your DPO needs to know about data processing, residency, and hardware-sealed alternatives.
67% of employees use AI tools on sensitive data. They paste contracts, medical records, financial reports, and client communications into tools like ChatGPT, Claude, and Gemini every day. Most of these tools are hosted in US datacenters, process data outside the EU, and operate under terms of service that do not meet GDPR Article 28 requirements for data processors.
The gap between how people use AI and what the law requires has grown rapidly. Employees adopt AI tools faster than compliance teams can evaluate them. The result: widespread, uncontrolled processing of personal data by third-party AI providers who were never vetted as data processors under GDPR.
For regulated industries — law firms, accountants, healthcare providers, financial services — this is not an abstract risk. It is an active compliance violation with concrete consequences: fines of up to 4% of annual worldwide turnover, regulatory orders to cease processing, and reputational damage that no insurance policy covers.
Article 28 governs the relationship between data controllers (your organization) and data processors (anyone who processes personal data on your behalf). When your employees use an AI tool to analyze documents containing personal data, the AI provider becomes a data processor. Article 28(3) requires a binding contract that includes specific obligations:

- Processing only on the controller's documented instructions
- Confidentiality commitments from everyone authorized to process the data
- Technical and organizational security measures consistent with Article 32
- Controller authorization before engaging any sub-processor
- Assistance with data subject rights and breach obligations
- Deletion or return of all personal data at the end of the engagement
- Audit and inspection rights for the controller
OpenAI, Anthropic, and Google all publish Data Processing Agreements. That is a necessary first step. But a DPA alone does not satisfy Article 28 if the underlying technical measures are insufficient. The specific gaps:

- Data residency: processing typically happens in US datacenters, which triggers Chapter V transfer rules and Schrems II scrutiny.
- Operator access: nothing physically prevents the provider's staff from reading data in memory during processing.
- No attestation: the controller gets no hardware-backed, auditable proof of where and how the data was handled.
This does not mean these tools are unusable. For non-personal, non-sensitive data, they may be perfectly appropriate. But when processing personal data subject to GDPR — client files, patient records, employee data, financial information — the technical guarantees fall short of what Article 28 requires.
A GDPR-compliant AI deployment for personal data processing requires both legal agreements and technical enforcement. The legal layer (DPA, SCCs) is necessary but not sufficient. The technical layer must make it physically impossible for the processor to access the data. Here is what that looks like:
EU data residency: All computation occurs within EU-governed datacenters. No transatlantic data transfers.
Hardware-sealed processing: Intel TDX or AMD SEV enclaves ensure the hosting provider physically cannot read data in memory.
Ephemeral data handling: Data exists in encrypted memory only during inference and is destroyed immediately after the response.
A genuine DPA: A Data Processing Agreement that specifically references GDPR Article 28 obligations, not a generic privacy policy.
Remote attestation: Cryptographic proof, signed by the CPU, that the enclave was genuine and unmodified. Auditable by the controller; a verification sketch follows this list.
Audit rights: The processor must allow inspections. Hardware attestation makes remote audits technically verifiable.
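To make the attestation item concrete, here is a minimal controller-side sketch. The quote layout, the field names, and the locally generated "CPU key" are illustrative assumptions, not a real TDX interface; production verification additionally walks an X.509 certificate chain up to the CPU vendor's root CA (for Intel TDX, via the DCAP quote verification libraries).

```python
import json
import secrets

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Hash of the enclave image the controller audited and approved
# (illustrative placeholder value).
EXPECTED_MEASUREMENT = "sha384:0f3a..."

def verify_attestation(quote: bytes, signature: bytes,
                       cpu_key: ec.EllipticCurvePublicKey,
                       nonce: str) -> bool:
    """Accept an enclave only if the CPU itself vouches for it."""
    # 1. Signature: only genuine hardware holds the signing key.
    #    (Real verification also validates the cert chain to the
    #    vendor root; omitted here for brevity.)
    try:
        cpu_key.verify(signature, quote, ec.ECDSA(hashes.SHA256()))
    except InvalidSignature:
        return False
    body = json.loads(quote)
    # 2. Measurement: the enclave runs exactly the code you approved.
    # 3. Freshness: the quote answers your challenge, not a replay.
    return (body["measurement"] == EXPECTED_MEASUREMENT
            and body["nonce"] == nonce)

# Simulated end-to-end run; the locally generated key stands in for
# the CPU's burned-in attestation key.
cpu_priv = ec.generate_private_key(ec.SECP384R1())
nonce = secrets.token_hex(16)
quote = json.dumps({"measurement": EXPECTED_MEASUREMENT,
                    "nonce": nonce}).encode()
signature = cpu_priv.sign(quote, ec.ECDSA(hashes.SHA256()))
assert verify_attestation(quote, signature, cpu_priv.public_key(), nonce)
```

The three checks mirror what a DPO needs from Article 28: the right hardware (signature), the right code (measurement), and a fresh, non-replayed proof (nonce).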
Several providers now offer some or all of these capabilities, including Microsoft Azure Confidential Computing, Google Cloud Confidential VMs, and VoltageGPU. The key evaluation criterion: can the provider prove — with hardware evidence, not just a contract — that they cannot access your data?
Traditional encryption protects data in two states: at rest (stored on disk) and in transit (moving across networks). But during processing, data must be decrypted in memory. Anyone with root access to the host machine — including the cloud provider's staff — can read that memory.
Hardware-sealed computing (also called confidential computing) adds a third layer: encryption in use. Technologies like Intel Trust Domain Extensions (TDX) and AMD Secure Encrypted Virtualization (SEV) create isolated memory regions — called enclaves or trust domains — where data remains encrypted even while the CPU processes it.
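A short, self-contained illustration of that gap, using Python's widely used `cryptography` package (the record below is made up): the moment encrypted data is actually processed, it becomes plaintext in ordinary RAM.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()
vault = Fernet(key)

record = b"patient: Jane Doe, diagnosis: ..."  # made-up personal data
stored = vault.encrypt(record)                 # encrypted at rest: safe

plaintext = vault.decrypt(stored)              # now we want to *process* it
# `plaintext` sits unencrypted in this process's RAM. Anyone with root
# on the host -- cloud staff, malware, a compelled admin -- can dump
# and read that memory. TDX/SEV keep exactly this region encrypted, so
# a dump taken from outside the enclave yields only ciphertext.
token_count = len(plaintext.split())
```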
Why this matters for GDPR: Article 28 requires “appropriate technical measures.” Hardware-sealed computing is the strongest technical measure currently available because it removes the need to trust the processor. The CPU enforces isolation — not a policy, not a contract, but physics and silicon.
This is not experimental technology. Confidential computing is backed by the Confidential Computing Consortium (members include Intel, AMD, NVIDIA, Microsoft, Google, and Arm), and Intel TDX is deployed in production by major cloud providers.
Before approving any AI tool for processing personal data, your Data Protection Officer should verify the following. Each item maps directly to a GDPR requirement:

- Is there a signed DPA that addresses each Article 28(3) obligation, not just a privacy policy? (Art. 28(3))
- Does all processing, including inference, take place in EU datacenters, or are valid transfer safeguards in place? (Chapter V)
- Can the provider prove with hardware attestation that its own staff cannot read the data in memory? (Art. 32)
- Is personal data deleted immediately after processing rather than retained or used for training? (Art. 28(3)(g))
- Does the contract grant audit and inspection rights, and can they be exercised remotely? (Art. 28(3)(h))
If the answer to any of these is “no” or “unclear,” the tool should not be approved for processing personal data until the gap is resolved. Document your assessment — GDPR accountability (Art. 5(2)) requires demonstrable due diligence.
Is ChatGPT Enterprise GDPR compliant?
Partially. ChatGPT Enterprise offers a DPA and disables training on your data by default. However, data is still processed in US datacenters, the operator can technically access memory contents, and there is no hardware attestation. For regulated industries handling client personal data, this often falls short of Art. 28 requirements.
How is hardware-sealed computing different from ordinary encryption?
Software encryption protects data at rest and in transit, but during processing the data must be decrypted in memory, where the hosting provider can access it. Hardware-sealed computing (Intel TDX, AMD SEV) encrypts data even while it is being processed. The CPU enforces isolation. Not even root access on the host machine can read the enclave memory.
Can we self-host a model instead?
Self-hosting can be GDPR compliant if you control the infrastructure, implement proper security measures, and document everything. The challenge is operational: you need to maintain the hardware, keep it patched, handle attestation yourself, and write your own DPA for any infrastructure providers. For most organizations, this is prohibitively expensive.
What fines are actually at stake?
Under GDPR Article 83, violations of processor obligations (Art. 28) can result in fines of up to 10 million euros or 2% of annual worldwide turnover, whichever is higher. Violations of data transfer rules (Chapter V) can attract fines of up to 20 million euros or 4% of annual worldwide turnover. Beyond fines, regulators can order processing to cease entirely.
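A worked example of the "whichever is higher" rule (the turnover figure is hypothetical): for any sizeable organization, the percentage cap rather than the flat amount is what binds.

```python
# GDPR Art. 83 caps are "whichever is higher"; the turnover figure
# below is a hypothetical example, not a real company's numbers.
def fine_cap(turnover_eur: int, flat_cap: int, pct: float) -> float:
    return max(flat_cap, pct * turnover_eur)

turnover = 800_000_000  # hypothetical annual worldwide turnover (EUR)

art28 = fine_cap(turnover, 10_000_000, 0.02)      # processor obligations
transfers = fine_cap(turnover, 20_000_000, 0.04)  # Chapter V transfers

print(f"Art. 28 cap:   EUR {art28:,.0f}")       # EUR 16,000,000
print(f"Transfer cap:  EUR {transfers:,.0f}")   # EUR 32,000,000
```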
Does the EU AI Act change any of this?
The EU AI Act adds requirements on top of GDPR. It introduces risk categories for AI systems, transparency obligations, and conformity assessments for high-risk uses. However, it does not replace GDPR: you still need full Art. 28 compliance for any AI system processing personal data, regardless of its AI Act risk classification.
Can a US-based provider ever be GDPR compliant?
In principle, yes, if it uses Standard Contractual Clauses, implements supplementary measures (such as encryption the provider cannot break), and undergoes transfer impact assessments. In practice, the Schrems II ruling makes this difficult because US surveillance laws (FISA Section 702, EO 12333) can compel data disclosure. Hardware-sealed EU processing eliminates this transfer risk.
VoltageGPU is one solution built for this. French company, Intel TDX hardware, GDPR Article 28 by design. Try free.