Data Privacy

Privacy Impact Assessment Template: How to Run a DPIA or PIA That Satisfies GDPR, CPRA, and 20+ US State Privacy Laws

May 13, 2026 Rebecca Leung

TL;DR

  • A Privacy Impact Assessment (PIA) — called a Data Protection Impact Assessment (DPIA) under GDPR — is legally required before processing that presents “high risk” to individuals under EU law and equivalent requirements in California and 20+ US states.
  • GDPR Article 35 mandates DPIAs for automated decision-making with legal effects, systematic monitoring, sensitive data processing, and novel-technology use cases — most financial services firms hit multiple triggers.
  • California’s CPRA risk assessment regulations took effect January 1, 2026; retrospective assessments for existing processing are due December 31, 2027, with CPPA attestation required by April 1, 2028.
  • A defensible PIA is not a form you fill out once — it’s a living document that follows the data through its lifecycle, updated whenever processing materially changes.

Your engineering team just shipped a new feature that uses customer transaction data to generate personalized financial recommendations. It deploys an ML model. It processes data on EU customers. And nobody ran a privacy impact assessment.

That’s not a hypothetical. It’s the pattern behind some of the largest privacy enforcement actions of the last five years — not because companies didn’t know the rules, but because nobody translated the regulatory requirement into an operational workflow that actually runs before new processing goes live.

A Privacy Impact Assessment (PIA) — known as a Data Protection Impact Assessment (DPIA) under GDPR — answers one question before you process data: Is this worth the privacy risk?

Getting that question embedded in your product development lifecycle, vendor onboarding workflow, and change management process is what separates companies that get regulatory inquiries from companies that navigate them.

What Is a PIA / DPIA?

A Privacy Impact Assessment is a structured evaluation of a proposed data processing activity that:

  1. Describes what data is being processed, by whom, for what purpose, and under what legal basis
  2. Assesses whether the processing is necessary and proportionate to the stated purpose
  3. Identifies risks to individuals’ privacy rights — from data exposure, profiling, discrimination, or other harm
  4. Documents the controls and safeguards in place, and whether residual risk is acceptable

Under GDPR, this is called a Data Protection Impact Assessment (DPIA). Under most US state privacy laws, it’s called a data protection assessment or privacy risk assessment. The term Privacy Impact Assessment (PIA) is commonly used in US federal practice under the E-Government Act. All three terms describe essentially the same exercise.

For operational purposes, build one process and apply it across all jurisdictions. The specific documentation requirements vary at the margins, but a PIA that satisfies GDPR Article 35 will generally satisfy US state law equivalents.

When Is a PIA Required?

Under GDPR (Article 35)

GDPR Article 35 requires a DPIA whenever processing is “likely to result in a high risk to the rights and freedoms of natural persons.” The Article 29 Working Party guidelines, since endorsed by the European Data Protection Board, set out criteria for what constitutes high risk. The mandatory triggers that hit most financial services firms:

| GDPR Trigger | Financial Services Example |
| --- | --- |
| Systematic and extensive profiling with legal or significant effects | Credit scoring, affordability assessment, fraud risk scoring |
| Large-scale processing of special category data | Processing health data for insurance underwriting; criminal conviction data for AML screening |
| Systematic monitoring of publicly accessible areas | Branch or ATM surveillance systems |
| Automated processing with legal or similarly significant effects | Automated loan approval, automated account closure, algorithmic investment advice |
| New technology posing high risk | Biometric authentication, generative AI in customer service, behavioral analytics |
| Large-scale profiling or sensitive data combinations | Transaction-level data combined with geolocation for customer profiling |

Each EU supervisory authority, as well as the UK ICO, publishes a list of processing types that always require a DPIA (Article 35(4)). The ICO’s list includes systematic profiling and innovative use of biometric data. The CNIL (France) and other DPAs maintain equivalent lists. When in doubt, run a brief DPIA screening — a shorter review that determines whether a full DPIA is needed — before any new significant processing activity.

Under US State Privacy Laws

More than twenty US state privacy laws now require data protection assessments for high-risk processing activities. The triggers are broadly consistent across states:

| Processing Activity | States Requiring Assessment |
| --- | --- |
| Targeted advertising using personal data | Virginia, Colorado, Connecticut, Oregon, Texas, Indiana, Kentucky, Rhode Island, and others |
| Selling or sharing personal data | Most state privacy laws with assessment requirements |
| Profiling for consequential decisions | Virginia, Colorado, Connecticut, Oregon, Montana, and others |
| Processing sensitive personal data | Virginia, Colorado, Connecticut, California (CPRA), New Jersey, and others |
| Automated decision-making with significant effects | Colorado, Connecticut, California (ADMT, effective 2027) |

California’s new CPRA risk assessment regulations are the most comprehensive US state requirement. The CPPA finalized rules that took effect January 1, 2026, requiring businesses to complete a risk assessment before initiating any processing activity that presents “significant risk” to consumer privacy — which expressly includes selling or sharing personal data, processing sensitive personal information, and using automated decision-making technology for significant decisions affecting financial services, employment, housing, education, or healthcare.

Key California compliance deadlines:

  • January 1, 2026: Risk assessments required for new significant-risk processing
  • December 31, 2027: Retrospective risk assessments due for pre-2026 significant-risk processing that continues
  • April 1, 2028: Attestation to the CPPA confirming required assessments were completed

If you’re a business with California consumer data and in-scope processing activities, that retrospective deadline is not hypothetical — it requires you to go back and document assessments for processing that may have started years ago.

The Required Elements: What Goes In a DPIA

Under GDPR Article 35(7), a DPIA must contain at minimum:

  1. A systematic description of the processing operations and their purposes, including any legitimate interests pursued
  2. An assessment of the necessity and proportionality of the processing in relation to the purpose
  3. An assessment of the risks to the rights and freedoms of data subjects
  4. The measures envisaged to address the risks — safeguards, security measures, and mechanisms ensuring data protection and demonstrating GDPR compliance

US state law requirements mirror this structure: description of processing, purpose, benefits and risks weighed against each other, and measures to mitigate risk.

The Element Most PIAs Miss: Necessity and Proportionality

Companies regularly skip or rush the second element because it feels abstract. It isn’t.

Necessity asks: Do you actually need this data to achieve this purpose? If you’re collecting date of birth “for verification purposes” but never verify age, the processing isn’t necessary. If you’re retaining five years of transaction data for fraud analysis but your fraud models only look back 90 days, the retention may exceed what’s necessary.

Proportionality asks: Is the privacy intrusion proportionate to the benefit? A model that marginally improves fraud detection accuracy by processing sensitive location data continuously may not pass proportionality if a less privacy-invasive approach could achieve comparable accuracy.

Regulators use these questions to challenge PIAs in enforcement. A DPIA that documents the processing without honestly engaging these questions won’t hold up. The ICO, in reviewing organizations’ DPIAs, expects to see genuine analysis — not boilerplate.

A PIA You Can Actually Use: The 7-Step Process

Step 1: Identify the trigger

Use a screening checklist embedded in your change management and vendor onboarding processes to flag new processing activities before work begins. The screening should ask: What data is involved? Does it include personal data? Sensitive personal data? Does it involve profiling, automated decisions, or new technology? Any “yes” answer above the screening threshold triggers a DPIA.

The key operational requirement: the screening has to happen before work begins, not after it ships. Build it into sprint planning, vendor contracting, and product review gates.
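The screening gate above can be sketched as a small piece of workflow tooling. This is a minimal illustration, not a legal checklist: the question fields, threshold logic, and function names are all hypothetical and should be adapted to your own screening questionnaire.

```python
from dataclasses import dataclass


@dataclass
class ScreeningAnswers:
    """Answers to a hypothetical DPIA screening questionnaire.

    Field names are illustrative -- map them to your own checklist.
    """
    involves_personal_data: bool
    involves_sensitive_data: bool
    involves_profiling: bool
    involves_automated_decisions: bool
    involves_new_technology: bool
    large_scale: bool


def requires_full_dpia(a: ScreeningAnswers) -> bool:
    """Return True when the screening threshold is crossed.

    Mirrors the logic described above: any high-risk trigger
    (sensitive data, profiling, automated decisions, new technology
    at scale) routes the change to a full DPIA before work begins.
    """
    if not a.involves_personal_data:
        return False  # no personal data, no DPIA trigger
    triggers = [
        a.involves_sensitive_data,
        a.involves_profiling,
        a.involves_automated_decisions,
        a.involves_new_technology and a.large_scale,
    ]
    return any(triggers)
```

Embedding a check like this in a sprint-planning or vendor-intake form is what turns the checklist into a decision gate rather than a document.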

Step 2: Describe the processing

Document: what categories of data are collected and from what source; the legal basis for processing; the stated purpose; which systems and teams process the data; any third-party vendors or processors involved; the retention period; and whether automated decision-making or profiling is involved.

Step 3: Assess necessity and proportionality

For each category of data processed, ask: Is this required to achieve the stated purpose? Could the purpose be achieved with anonymized data, aggregate data, or less granular data? Is the retention period proportionate to the purpose? Are there less privacy-invasive means of achieving the same result?

Document your reasoning, not just your conclusion.

Step 4: Identify privacy risks

For each category of data, identify what could go wrong:

  • Unauthorized access or disclosure
  • Use beyond the stated purpose (purpose creep)
  • Profiling leading to discriminatory outcomes
  • Retention longer than necessary
  • Individuals losing meaningful control over their data
  • Data being used for profiling or automated decisions individuals aren’t aware of

Score each risk by likelihood and severity. Use consistent scoring so risks can be compared across assessments and tracked over time.

Step 5: Identify controls and mitigations

For each identified risk, document the controls in place: encryption at rest and in transit, access controls and least-privilege enforcement, data minimization techniques, pseudonymization, contractual data processor obligations, audit logging, and purpose limitation enforcement. Then calculate residual risk after controls.
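Steps 4 and 5 can be captured in one consistent scoring scheme. The sketch below assumes a simple 3×3 likelihood-by-severity matrix and models controls as reducing likelihood; the levels, bands, and cut-offs are illustrative choices, not a prescribed methodology.

```python
from dataclasses import dataclass
from enum import IntEnum


class Level(IntEnum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3


@dataclass
class Risk:
    description: str
    likelihood: Level            # likelihood before controls
    severity: Level              # severity of the worst-case outcome
    mitigated_likelihood: Level  # likelihood after controls are applied

    @property
    def inherent(self) -> int:
        """Likelihood x severity on a 1-9 scale, before controls."""
        return self.likelihood * self.severity

    @property
    def residual(self) -> int:
        # Controls such as encryption and access restrictions usually
        # reduce likelihood rather than worst-case severity.
        return self.mitigated_likelihood * self.severity


def band(score: int) -> str:
    """Map a 1-9 score to a band so risks compare across assessments."""
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"
```

Using the same scale in every assessment is what makes Step 6 possible: a “high” residual band can be wired to mandatory DPO or General Counsel sign-off.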

Step 6: Make the risk decision

If residual risk is high after controls, escalation is required. Under GDPR, when high residual risk cannot be adequately mitigated, the controller must consult the supervisory authority before proceeding — this is the “prior consultation” requirement under Article 36. A supervisory authority consultation takes time and may result in restrictions on the processing.

Under US state privacy laws, the standard is whether “the benefits to the consumer, the controller, other stakeholders, and the public outweigh the risks.” Someone senior — DPO, General Counsel, or equivalent — needs to own that decision and sign off on it in writing.

Step 7: Document, approve, and maintain

The DPIA is a living document. Any material change to the processing activity — new data categories, expanded vendor access, new jurisdictions, algorithm changes, new purposes — requires revisiting the assessment. Build version control and review triggers into your records management process. A DPIA that was accurate at launch but not updated as the product evolved is not a defensible document.

Building the Screening Trigger Into Your Workflows

The most common PIA program failure isn’t the template — it’s the trigger. Companies build excellent DPIA templates and then discover that nobody uses them because the workflow doesn’t create a decision gate before processing begins.

Effective screening points:

Product development: All new features or significant changes involving personal data require a DPIA screening before the sprint begins. The privacy team reviews the screening and escalates to a full DPIA if needed.

Vendor onboarding: Any new vendor that will receive or process personal data triggers a screening before the contract is signed. Vendors that process sensitive data or enable automated decision-making require a full assessment. Your vendor due diligence process should include this step.

Change management: Material changes to existing processing activities — new algorithms, expanded data access, new retention periods — trigger a DPIA review of the existing assessment.

M&A and data asset acquisition: Acquiring a new data set or business that brings new processing activities requires assessing whether those activities require PIAs under your obligations.

Connecting PIAs to Your Broader Privacy Program

A PIA process doesn’t operate in isolation:

Data classification: You need to know what categories of data you’re processing to accurately assess DPIA triggers and risk levels. A classification schema that distinguishes personal, sensitive, and special-category data maps directly to PIA screening questions.

Data retention policies: Retention decisions made during PIAs should feed directly into your retention schedule. A PIA that concludes six months of retention is proportionate should create a retention rule — not just a statement in the assessment document.
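One way to make that connection concrete is to express PIA retention conclusions as machine-checkable rules. The activity names and periods below are hypothetical examples, not a recommended schedule:

```python
from datetime import date, timedelta

# Hypothetical retention schedule derived from PIA conclusions.
# Each entry encodes a proportionality finding as an enforceable rule.
RETENTION_RULES = {
    "fraud_analysis_transactions": timedelta(days=90),
    "personalized_recommendations": timedelta(days=180),  # "six months"
}


def is_expired(activity: str, collected_on: date, today: date) -> bool:
    """True when a record has outlived its PIA-approved retention period."""
    return today - collected_on > RETENTION_RULES[activity]
```

A deletion job that consults rules like these closes the loop between the assessment document and what the systems actually do.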

DSAR response: Under GDPR, controllers must maintain records of processing activities (ROPA). PIA documentation forms part of that records structure. When individuals exercise rights around automated decision-making, your DPIA documentation tells you what assessments are relevant.

State privacy law obligations: Data protection assessment requirements vary by state. Your processing inventory should map each activity to the states where it applies, so you know which assessments are required in which jurisdictions — particularly as new state laws activate.
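A minimal sketch of such an inventory mapping, assuming illustrative activity names and state lists (the sets below are examples, not a determination of which laws actually apply):

```python
# Illustrative inventory: which states' assessment requirements a given
# processing activity triggers. Contents are placeholders, not legal advice.
ASSESSMENT_TRIGGERS = {
    "targeted_advertising": {"VA", "CO", "CT", "OR", "TX"},
    "sensitive_data_processing": {"VA", "CO", "CT", "CA", "NJ"},
}


def assessments_required(activities: dict, operating_states: set) -> dict:
    """For each activity, the operating states where an assessment is due."""
    return {
        name: states & operating_states
        for name, states in activities.items()
        if states & operating_states
    }
```

Re-running the mapping when a new state law activates surfaces the new assessment obligations without re-reviewing the whole inventory by hand.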

What Enforcement Tells Us

The DPIA requirement has appeared in enforcement in two ways. First, the failure to conduct required assessments is itself a violation. Under GDPR, this can trigger fines independent of any underlying breach or harm. Second, inadequate DPIAs — ones that document rather than analyze, or that are completed after the fact — have been cited in cases involving the mishandling of children’s data and novel data uses.

The pattern in these cases: companies had PIA processes on paper but hadn’t embedded them into operational workflows. The assessments that existed were backward-looking documentation exercises, not the pre-processing risk evaluations the regulation requires.

So What?

If you don’t have a PIA process, the highest-leverage starting point is the trigger, not the template. Build the screening checkpoint into your change management and vendor onboarding workflows before investing time in the template. A screening process that flags 90% of high-risk processing and routes it to a basic assessment is more valuable than an elaborate template that nobody uses because there’s no workflow to invoke it.

If you do have a process, pressure-test the trigger and the living-document maintenance:

  1. Can you identify every new significant processing activity from the last 12 months? Did each one go through a screening?
  2. Are your existing DPIAs current? Have any material changes to your processing occurred since they were last reviewed?

The California CPRA retrospective deadline creates a concrete deliverable for 2027: a documented assessment for every significant processing activity currently in operation. Starting that inventory now — before the attestation deadline — is significantly easier than doing it under time pressure in late 2027.

Privacy impact assessments done well are not compliance overhead. They’re the mechanism that forces a conversation about data minimization, vendor obligations, and processing purpose before those decisions become expensive to unwind.

Frequently Asked Questions

What is the difference between a PIA and a DPIA?
A Privacy Impact Assessment (PIA) and a Data Protection Impact Assessment (DPIA) refer to the same type of exercise — a structured evaluation of privacy risks before processing begins. 'DPIA' is the term used in GDPR (Article 35). 'Data protection assessment' is the term used in most US state privacy laws (Virginia, Colorado, Connecticut, Oregon, and others). 'PIA' is the common shorthand used in US federal practice under the E-Government Act. For practical purposes, build one process and apply it across all jurisdictions.
When is a DPIA required under GDPR?
GDPR Article 35 requires a DPIA when processing is likely to result in high risk to individuals. Mandatory triggers include: automated profiling with legal or significant effects, large-scale processing of sensitive data (health, biometric, criminal), systematic monitoring of individuals, and processing using new technologies that pose high risk. Most financial services firms hit at least one of these triggers for significant processing activities.
What do California's new CPRA risk assessment rules require?
The CPPA's risk assessment regulations (effective January 1, 2026) require businesses to complete a risk assessment before initiating any processing activity that presents 'significant risk' to consumer privacy — including selling or sharing personal data, processing sensitive personal information, and using automated decision-making for significant decisions. Retrospective assessments for pre-2026 processing are due December 31, 2027. An attestation to the CPPA is required by April 1, 2028.
Which US state privacy laws require data protection assessments?
Most modern comprehensive state privacy laws include a data protection assessment requirement: Virginia CDPA, Colorado CPA, Connecticut CTDPA, Oregon, Texas, Montana, New Hampshire (effective January 2025), New Jersey (effective January 2025), Indiana (effective January 2026), Kentucky (effective January 2026), Rhode Island (effective January 2026), and California (CPRA risk assessments effective January 2026). The specific triggers and content requirements vary but are broadly consistent.
What are the required elements of a GDPR DPIA?
Under GDPR Article 35(7), a DPIA must contain: (1) a systematic description of the processing and its purposes; (2) an assessment of necessity and proportionality; (3) an assessment of the risks to the rights and freedoms of data subjects; and (4) the measures envisaged to address those risks, including safeguards, security measures, and mechanisms to ensure data protection. Many practitioners also document the legal basis, data minimization rationale, and retention justification.
What happens if you skip a required DPIA?
Under GDPR, failing to conduct a required DPIA is itself a violation, subject to fines of up to €10 million or 2% of global annual turnover, whichever is higher — even if no underlying data breach occurred. Under US state privacy laws, failure to maintain required data protection assessments is an enforcement trigger. The California CPPA can initiate investigations and impose civil penalties. Failure to conduct assessments was cited in regulatory findings against Meta related to its handling of children's data.
Rebecca Leung

Rebecca Leung has 8+ years of risk and compliance experience across first and second line roles at commercial banks, asset managers, and fintechs. Former management consultant advising financial institutions on risk strategy. Founder of RiskTemplates.
