GDPR Article 35 — Data Protection Impact Assessment (DPIA)
The GDPR requirement to conduct a formal risk assessment before deploying processing operations that are likely to result in high risk to individuals' rights and freedoms.
Article 35 of the GDPR requires controllers to conduct a Data Protection Impact Assessment (DPIA) prior to any processing "likely to result in a high risk to the rights and freedoms of natural persons." Article 35(3) specifies three categories that always require a DPIA: systematic and extensive evaluation of personal aspects based on automated processing, including profiling; large-scale processing of special categories of data (Article 9) or criminal conviction data (Article 10); and systematic monitoring of a publicly accessible area on a large scale. The Guidelines on DPIA (WP248 rev.01), adopted by the Article 29 Working Party and later endorsed by the European Data Protection Board (EDPB), provide nine criteria for assessing "high risk": evaluation or scoring; automated decision-making with legal or similarly significant effects; systematic monitoring; sensitive or highly personal data; data processed at large scale; matched or combined datasets; data concerning vulnerable subjects; innovative use or application of new technologies; and processing that prevents data subjects from exercising a right or using a service or contract.
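The WP248 rev.01 screening logic lends itself to automation: the guidelines state that processing meeting two or more of the nine criteria will in most cases require a DPIA. A minimal sketch of such a screening check follows; the criterion identifiers, function name, and the strict two-criteria threshold are illustrative assumptions (a single criterion can still suffice in some cases, so a real implementation should flag borderline results for human review rather than auto-clear them):

```python
# The nine WP248 rev.01 "high risk" criteria, abbreviated as identifiers.
WP248_CRITERIA = [
    "evaluation_or_scoring",
    "automated_decision_with_legal_effect",
    "systematic_monitoring",
    "sensitive_data",
    "large_scale",
    "matched_or_combined_datasets",
    "vulnerable_subjects",
    "innovative_technology",
    "prevents_exercise_of_rights_or_services",
]

def dpia_required(met_criteria: set) -> bool:
    """Return True when the processing meets two or more WP248 criteria.

    WP248 rev.01 guidance: two or more criteria will in most cases
    indicate that a DPIA is required. The >= 2 threshold here is an
    illustrative simplification, not a legal rule.
    """
    unknown = met_criteria - set(WP248_CRITERIA)
    if unknown:
        raise ValueError(f"unknown criteria: {sorted(unknown)}")
    return len(met_criteria) >= 2
```

A check like this is cheap enough to run at design-review time, before any architecture decision locks the processing in.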
Conducting a DPIA is a structured risk assessment process, not a paper exercise. Article 35(7) specifies the minimum DPIA content: a systematic description of the envisaged processing operations and their purposes; an assessment of necessity and proportionality; an assessment of risks to rights and freedoms; and the measures to address those risks. For engineering teams, this translates to: documenting data flows with data-flow-diagram (DFD) precision, mapping every data element to a processing purpose and legal basis, quantifying the volume and sensitivity of data processed, modeling threat scenarios (unauthorized access, data breach, misuse by controller or processor), and documenting the technical and organizational controls that mitigate the identified risks. The DPIA must be conducted before processing begins; retrofitting one onto an already-deployed system may document compliance after the fact, but it forfeits the design influence the assessment exists to provide.
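The element-to-purpose-to-legal-basis mapping described above is easiest to keep honest as structured data rather than prose. A minimal sketch, assuming nothing beyond the GDPR itself: the six Article 6(1) lawful bases are real, but the record fields, names, and helper function are illustrative:

```python
from dataclasses import dataclass

# GDPR Article 6(1) lawful bases, by common shorthand.
LEGAL_BASES = {"consent", "contract", "legal_obligation",
               "vital_interests", "public_task", "legitimate_interests"}

@dataclass(frozen=True)
class DataElement:
    name: str                        # e.g. "email_address"
    purpose: str                     # the processing purpose it serves
    legal_basis: str                 # one of LEGAL_BASES
    special_category: bool = False   # GDPR Article 9 data?

    def __post_init__(self):
        # Reject records that claim a lawful basis the GDPR doesn't define.
        if self.legal_basis not in LEGAL_BASES:
            raise ValueError(
                f"{self.name}: unknown legal basis {self.legal_basis!r}")

def unmapped(elements):
    """Names of elements lacking a documented purpose — gaps that block
    the Article 35(7) 'systematic description' requirement."""
    return [e.name for e in elements if not e.purpose.strip()]
```

Keeping this inventory in version control alongside the system design lets purpose and legal-basis gaps surface in code review, before the processing ships.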
DPIAs have regulatory teeth beyond Article 35 itself. Where a DPIA indicates that the processing would result in high risk absent mitigating measures, Article 36 requires the controller to consult the supervisory authority before proceeding, and supervisory authorities can prohibit processing whose risks are not adequately mitigated. Several DPAs have published lists of processing operations that always require a DPIA in their jurisdiction (e.g., the UK ICO's mandatory DPIA list, the CNIL's list of processing types requiring a DPIA, and the German supervisory authorities' joint "blacklist"). AI systems are increasingly subject to mandatory DPIA requirements: the EU AI Act (Regulation (EU) 2024/1689, in force since August 2024 with obligations phasing in over the following years) requires conformity assessments for high-risk AI systems that overlap substantially with GDPR DPIA requirements for AI-powered profiling and decision-making systems.
We conduct DPIAs as structured engineering risk assessments: data flow documentation using process modeling tools, threat modeling using the LINDDUN (privacy-specific) and STRIDE methodologies, quantitative risk scoring against the ENISA privacy risk assessment methodology, and technical control recommendations mapped to specific risk scenarios. We integrate DPIA triggers into product development workflows (Jira, Linear, Azure DevOps) so that high-risk processing designs are flagged for DPIA before the architecture is finalized.
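A quantitative scoring pass of the kind described above can be as simple as a likelihood × severity product per threat scenario. The sketch below uses the four-point likelihood and severity scales common to the CNIL and ENISA privacy risk methodologies; the class, the numeric threshold, and the function name are illustrative assumptions, not part of any regulation:

```python
from dataclasses import dataclass

@dataclass
class RiskScenario:
    name: str
    likelihood: int   # 1 (negligible) .. 4 (maximum)
    severity: int     # 1 (negligible) .. 4 (maximum)

    @property
    def score(self) -> int:
        # Simple likelihood x severity product, range 1..16.
        return self.likelihood * self.severity

def needs_art36_consultation(scenarios, threshold: int = 9):
    """Names of scenarios whose residual score meets the threshold.

    A residual high-risk score after mitigations is the situation that
    triggers Article 36 prior consultation with the supervisory
    authority. The threshold of 9 is an illustrative assumption.
    """
    return [s.name for s in scenarios if s.score >= threshold]
```

Scores like these are only as good as the scenario inventory behind them, which is why the threat-modeling step (LINDDUN/STRIDE) precedes scoring in the workflow.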
Compliance-Native Architecture Guide
Design principles and a structured checklist for building software that is compliant by default — not compliant by retrofit. Covers data architecture, access controls, audit trails, and vendor due diligence.