Governance & Compliance

GDPR and AI Communication Analysis:
Enterprise Buyer Guide

The most common reason AI communication analysis deployments stall is not technical — it is GDPR. Here is what compliance actually requires, what questions to ask your vendor, and what responsible deployment looks like.

Jonathan Prescott · Founder & CEO, Cavefish · April 2026 · 11 min read
Quick Reference — EchoDepth GDPR Status
ICO Registration: ZB915623
Lawful Basis: Explicit consent (default)
Biometric data retention: None — vectors discarded post-analysis
Data residency: UK default · EU available
DPA provided: Yes — all deployments
DPIA support: Yes — provided with enterprise onboarding
On-premise deployment: Yes — Docker, air-gap capable
Automated decisions: None — human-in-the-loop required

Why GDPR Matters for AI Communication Analysis

The single most common reason enterprise procurement teams stall or reject AI communication analysis deployments is GDPR. Not because AI communication analysis is inherently non-compliant — it can be fully compliant with the right architecture — but because many vendors have not done the governance work to make compliance straightforward.

The questions are predictable. Does this involve biometric data? What is the lawful basis? Is a DPIA required? Where is data stored? What happens to the raw video after analysis? A vendor who cannot answer these clearly in writing is a vendor whose deployment will create problems later.

This guide covers the key GDPR considerations for enterprise AI communication analysis — and what responsible deployment looks like in practice.

Does AI Communication Analysis Involve Biometric Data?

This is the first question procurement and legal teams ask — and the answer is nuanced.

Biometric data is defined under Article 4(14) UK GDPR as personal data resulting from specific technical processing relating to physical, physiological or behavioural characteristics which allows or confirms unique identification. Facial expression analysis using FACS Action Units involves biometric processing in this sense where it identifies, or is capable of identifying, individuals.

The distinction that matters for GDPR risk is between systems that:
— Retain raw biometric data (facial vectors, voice prints) that could be used to identify individuals, and
— Process biometric signals ephemerally, deriving scored outputs without retaining raw identifiers

EchoDepth operates on the second model. Facial Action Unit vectors are processed during analysis and then discarded. What is retained are the derived outputs (Trust Scores, Credibility Signal timelines, coaching notes), which are communication quality scores, not biometric identifiers.

This architecture significantly reduces Article 9 exposure, but does not eliminate it entirely. If the source video remains identifiable (which it typically does), the processing relationship exists during the analysis window. This should be addressed in the DPA and DPIA.
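To make the ephemeral model concrete, here is a minimal Python sketch of the pattern: biometric signals exist only inside the analysis function, and only derived scores leave it. Every name here (extract_au_vector, AnalysisResult, the placeholder scoring) is a hypothetical illustration, not EchoDepth's actual API.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass(frozen=True)
class AnalysisResult:
    """Derived outputs only: communication quality scores, no biometric identifiers."""
    trust_score: float
    credibility_timeline: list

def extract_au_vector(frame: bytes) -> list:
    """Stand-in for facial Action Unit extraction from one video frame."""
    return [len(frame) % 5 / 4.0]  # placeholder signal, not a real AU vector

def analyse_session(frames: list) -> AnalysisResult:
    # Biometric vectors exist only within this function's scope.
    au_vectors = [extract_au_vector(f) for f in frames]
    timeline = [mean(v) for v in au_vectors]
    result = AnalysisResult(trust_score=mean(timeline), credibility_timeline=timeline)
    del au_vectors  # discarded once scoring completes; nothing biometric is retained
    return result

print(analyse_session([b"frame-1", b"frame-22", b"frame-333"]))
```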

What Lawful Basis Applies?

For enterprise AI communication analysis deployments, three lawful bases are worth considering:

Explicit consent (Article 6(1)(a) + Article 9(2)(a)) is the most robust basis for most deployments. It requires freely given, specific, informed and unambiguous consent, given explicitly by each data subject before processing begins. For video analysis of employees, customers or third parties, this means a clear consent process before recording or analysis occurs.

The practical advantage of consent is clarity — it is the most defensible basis in any ICO investigation, and it builds trust with data subjects. The practical limitation is that it requires a consent architecture to be built into the deployment process.
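Since consent is the hinge, a minimal sketch of that architecture may help: processing is gated on an explicit, purpose-matched, non-withdrawn consent record. The names here (ConsentRecord, may_process) and the fields are assumptions for illustration, not a real vendor API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentRecord:
    subject_id: str
    purpose: str              # the specific, documented purpose consented to
    given_at: datetime
    withdrawn: bool = False

def may_process(record: ConsentRecord | None, purpose: str) -> bool:
    """Processing proceeds only with explicit, purpose-matched, live consent."""
    return record is not None and not record.withdrawn and record.purpose == purpose

consent = ConsentRecord("subject-42", "sales-demo-coaching", datetime.now(timezone.utc))
assert may_process(consent, "sales-demo-coaching")
assert not may_process(consent, "hr-performance-review")   # purpose limitation
assert not may_process(None, "sales-demo-coaching")        # no consent, no analysis
```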

Legitimate interests (Article 6(1)(f)) may be applicable for some internal operational deployments — for example, analysing leadership communications to improve change programme outcomes. However, this requires a Legitimate Interests Assessment (LIA) confirming that the processing is necessary, the interests are legitimate, and the balance test is met. The ICO has indicated scepticism about legitimate interests as a basis for biometric processing without additional safeguards.

Contract (Article 6(1)(b)) is unlikely to apply unless analysis of an individual's communications is directly necessary to fulfil a contract with them — which is rare in enterprise contexts.

For most deployments, explicit consent with a documented consent architecture is the correct starting point.

When Is a DPIA Required?

A Data Protection Impact Assessment is required under Article 35 UK GDPR when processing is likely to result in a high risk to individuals. The ICO publishes nine criteria — if two or more apply, a DPIA is strongly recommended; if the processing is systematic, uses new technology, and involves biometric data, it is almost certainly required.

Enterprise AI communication analysis typically meets at least three of these criteria (a minimal screening sketch follows the list):

Systematic processing: analysis is applied consistently as part of a defined process
New technology: AI-based analysis constitutes new technology under ICO guidance
Biometric data or special category data: facial and vocal analysis can fall into this category, depending on the system architecture
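Applying the two-or-more rule is mechanical once the criteria are named, as this Python sketch shows. The criterion labels paraphrase the ICO list, the set is deliberately incomplete, and none of this is legal advice.

```python
# Criteria names paraphrase the ICO screening list; illustration only.
ICO_CRITERIA = {
    "systematic_processing",
    "new_technology",
    "biometric_or_special_category",
    "large_scale",
    "vulnerable_subjects",
}

def dpia_recommended(criteria_met: set) -> bool:
    """Two or more matching criteria: a DPIA is strongly recommended."""
    return len(criteria_met & ICO_CRITERIA) >= 2

# A typical enterprise AI communication analysis deployment:
met = {"systematic_processing", "new_technology", "biometric_or_special_category"}
print(dpia_recommended(met))  # True: treat a DPIA as effectively required
```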

A DPIA for AI communication analysis should cover: a description of the processing activity, the necessity and proportionality assessment, identification of risks (including risks to individuals' rights and freedoms), mitigation measures (consent architecture, data minimisation, retention limits, access controls), and an assessment of residual risk.

Cavefish provides a structured DPIA support pack for all enterprise deployments, covering the EchoDepth-specific processing activity and the mitigations the system architecture provides. This does not replace your organisation's own DPIA — it provides the vendor-side inputs.

Purpose Limitation and Data Minimisation

Two of the most important GDPR principles for AI communication analysis deployments are purpose limitation and data minimisation.

Purpose limitation requires that data is processed only for the specific purpose for which consent was obtained and documented in the DPA. If consent was obtained for sales demo coaching, the same data cannot be used for HR performance assessment. Each use case requires its own documented purpose and consent.

Data minimisation requires that only the data necessary for the defined purpose is processed. This means not retaining raw video recordings beyond the analysis window, not generating outputs beyond what is needed for the stated purpose, and not sharing outputs with parties not specified in the DPA.

EchoDepth's architecture implements both principles by design: raw biometric data is not retained, processing is scoped to the defined purpose in the DPA, and outputs are structured to the minimum necessary for each deployment context.
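To see what minimisation by design can look like, here is a hedged Python sketch in which each documented purpose maps to the minimum set of output fields it may receive. The field names and purpose labels are invented for illustration, not a real schema.

```python
# Invented field names and purpose labels, illustrating the pattern only.
FULL_OUTPUT = {
    "trust_score": 0.81,
    "credibility_timeline": [0.7, 0.9, 0.8],
    "coaching_notes": "Slow down during the pricing discussion.",
}

# Each documented purpose sees only the minimum fields it needs.
PURPOSE_SCOPES = {
    "sales-demo-coaching": {"trust_score", "coaching_notes"},
    "leadership-comms-review": {"credibility_timeline"},
}

def minimised_output(output: dict, purpose: str) -> dict:
    """Return only the fields the stated purpose requires; drop everything else."""
    allowed = PURPOSE_SCOPES[purpose]
    return {key: value for key, value in output.items() if key in allowed}

print(minimised_output(FULL_OUTPUT, "sales-demo-coaching"))
# {'trust_score': 0.81, 'coaching_notes': 'Slow down during the pricing discussion.'}
```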

What to Ask Your Vendor

Before deploying any AI communication analysis tool, your procurement and legal team should request written answers to these eight questions:

1. Are you ICO registered? Verify the registration number independently at ico.org.uk/ESDWebPages/Search. EchoDepth: ICO ZB915623.

2. Do you provide a signed Data Processing Agreement? A DPA is a legal requirement for any data processor under UK GDPR. Insist on one before any data is shared.

3. Do you retain raw biometric data post-analysis? The answer should be no. EchoDepth discards biometric vectors after the analysis window closes.

4. What is your data residency? UK GDPR requires adequate protection for data transferred outside the UK. EchoDepth uses UK data residency by default; EU residency is available on request.

5. Do you support on-premise deployment? This matters for defence, finance and other regulated environments with zero-egress requirements. EchoDepth supports full Docker on-premise deployment.

6. Do you provide DPIA support documentation? A responsible vendor should provide inputs to your DPIA, not leave you to invent the processing description.

7. Do you have a process for data subject rights requests? Individuals can request access to their personal data under Article 15 UK GDPR and erasure under Article 17. Your vendor must support both; a minimal erasure sketch follows this list.

8. What is your breach notification SLA? Article 33 UK GDPR requires notification to the ICO within 72 hours of a personal data breach. Your DPA should specify the vendor's notification obligation to you.
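On question 7, the erasure side of the obligation reduces to a simple contract: given a subject identifier, remove every retained derived output for that subject. The sketch below uses an in-memory store and invented names purely to show the shape of the obligation, not any vendor's implementation.

```python
# In-memory stand-in for a store of retained derived outputs, keyed by subject.
derived_outputs = {
    "subject-42": {"trust_score": 0.81},
    "subject-43": {"trust_score": 0.64},
}

def handle_erasure_request(subject_id: str) -> bool:
    """Remove every retained derived output for the subject (Article 17)."""
    return derived_outputs.pop(subject_id, None) is not None

assert handle_erasure_request("subject-42")
assert "subject-42" not in derived_outputs
assert not handle_erasure_request("subject-42")  # already erased; nothing to remove
```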

EchoDepth's Governance Architecture

EchoDepth is designed for GDPR compliance from the ground up:

No raw biometric data retention: Facial AU vectors and vocal processing data are discarded after the analysis window closes. Only derived outputs are retained.

Consent-first architecture: All deployments require explicit informed consent from data subjects before any analysis. Consent frameworks are documented and deployment-specific.

UK data residency by default: All processing within UK data centres. EU residency and on-premise deployment available.

Signed DPA for every deployment: Provided before any data is shared or processed. Covers purpose limitation, retention periods, sub-processor obligations and data subject rights.

DPIA support pack: Structured documentation covering the EchoDepth processing activity, architecture mitigations and residual risk assessment — provided as inputs to your organisation's DPIA.

ICO registered: ZB915623. Verifiable at ico.org.uk/ESDWebPages/Search.

Human-in-the-loop required: EchoDepth outputs augment human judgement — they do not make automated decisions about individuals. This is documented in every DPA and is a condition of deployment.
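The human-in-the-loop condition can also be expressed as a gate in code: no output takes effect for an individual until a named reviewer signs off. The following Python sketch is an illustration under assumed names, not EchoDepth's implementation; it shows how outputs remain advisory by construction.

```python
from dataclasses import dataclass

@dataclass
class AdvisoryOutput:
    trust_score: float
    reviewed_by: str | None = None  # no effect on anyone until a human signs off

def apply_decision(output: AdvisoryOutput) -> str:
    if output.reviewed_by is None:
        # Outputs are advisory by construction; nothing happens to an
        # individual without a named human reviewer (avoiding Article 22
        # solely-automated decision-making).
        return "pending human review"
    return f"decision recorded by {output.reviewed_by}"

print(apply_decision(AdvisoryOutput(trust_score=0.81)))
print(apply_decision(AdvisoryOutput(trust_score=0.81, reviewed_by="j.smith")))
```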

Request the EchoDepth GDPR Pack

Includes: signed DPA template, DPIA support documentation, ICO registration certificate, data flow diagram, and consent architecture guidance.

Request Governance Pack →
Related Reading
EchoDepth Governance Framework →
Security & Data Compliance →
FCA Consumer Duty Guide →
Ethics of AI Communication Analysis →