Privacy Impact Assessment Template: How to Run a DPIA or PIA That Satisfies GDPR, CPRA, and 20+ US State Privacy Laws
TL;DR
- A Privacy Impact Assessment (PIA) — called a Data Protection Impact Assessment (DPIA) under GDPR — is legally required before processing that presents “high risk” to individuals under EU law and equivalent requirements in California and 20+ US states.
- GDPR Article 35 mandates DPIAs for automated decision-making with legal effects, systematic monitoring, sensitive data processing, and novel-technology use cases — most financial services firms hit multiple triggers.
- California’s CPRA risk assessment regulations took effect January 1, 2026; retrospective assessments for existing processing are due December 31, 2027, with CPPA attestation required by April 1, 2028.
- A defensible PIA is not a form you fill out once — it’s a living document that follows the data through its lifecycle, updated whenever processing materially changes.
Your engineering team just shipped a new feature that uses customer transaction data to generate personalized financial recommendations. It deploys an ML model. It processes data on EU customers. And nobody ran a privacy impact assessment.
That’s not a hypothetical. It’s the pattern behind some of the largest privacy enforcement actions of the last five years — not because companies didn’t know the rules, but because nobody translated the regulatory requirement into an operational workflow that actually runs before new processing goes live.
A Privacy Impact Assessment (PIA) — known as a Data Protection Impact Assessment (DPIA) under GDPR — answers one question before you process data: Is this worth the privacy risk?
Getting that question embedded in your product development lifecycle, vendor onboarding workflow, and change management process is what separates companies that get regulatory inquiries from companies that navigate them.
What Is a PIA / DPIA?
A Privacy Impact Assessment is a structured evaluation of a proposed data processing activity that:
- Describes what data is being processed, by whom, for what purpose, and under what legal basis
- Assesses whether the processing is necessary and proportionate to the stated purpose
- Identifies risks to individuals’ privacy rights — from data exposure, profiling, discrimination, or other harm
- Documents the controls and safeguards in place, and whether residual risk is acceptable
Under GDPR, this is called a Data Protection Impact Assessment (DPIA). Under most US state privacy laws, it’s called a data protection assessment or privacy risk assessment. The term Privacy Impact Assessment (PIA) is commonly used in US federal practice under the E-Government Act. All three terms describe essentially the same exercise.
For operational purposes, build one process and apply it across all jurisdictions. The specific documentation requirements vary at the margins, but a PIA that satisfies GDPR Article 35 will generally satisfy US state law equivalents.
When Is a PIA Required?
Under GDPR (Article 35)
GDPR Article 35 requires a DPIA whenever processing is “likely to result in a high risk to the rights and freedoms of natural persons.” The Article 29 Working Party (now the European Data Protection Board) has published guidelines on what constitutes high risk. The mandatory triggers that hit most financial services firms:
| GDPR Trigger | Financial Services Example |
|---|---|
| Systematic and extensive profiling with legal or significant effects | Credit scoring, affordability assessment, fraud risk scoring |
| Large-scale processing of special category data | Processing health data for insurance underwriting; criminal conviction data for AML screening |
| Systematic monitoring of publicly accessible areas | Branch or ATM surveillance systems |
| Automated processing with legal or similarly significant effects | Automated loan approval, automated account closure, algorithmic investment advice |
| New technology posing high risk | Biometric authentication, generative AI in customer service, behavioral analytics |
| Large-scale profiling or sensitive data combinations | Transaction-level data combined with geolocation for customer profiling |
Each EU and UK data protection authority also publishes a list of processing types that always require a DPIA. The ICO’s list includes systematic profiling and innovative use of biometric data. The CNIL (France) and other DPAs maintain equivalent lists. When in doubt, run a brief DPIA screening — a shorter review that determines whether a full DPIA is needed — before any new significant processing activity.
Under US State Privacy Laws
More than twenty US state privacy laws now require data protection assessments for high-risk processing activities. The triggers are broadly consistent across states:
| Processing Activity | States Requiring Assessment |
|---|---|
| Targeted advertising using personal data | Virginia, Colorado, Connecticut, Oregon, Texas, Indiana, Kentucky, Rhode Island, and others |
| Selling or sharing personal data | Most state privacy laws with assessment requirements |
| Profiling for consequential decisions | Virginia, Colorado, Connecticut, Oregon, Montana, and others |
| Processing sensitive personal data | Virginia, Colorado, Connecticut, California (CPRA), New Jersey, and others |
| Automated decision-making with significant effects | Colorado, Connecticut, California (ADMT, effective 2027) |
California’s new CPRA risk assessment regulations are the most comprehensive US state requirement. The CPPA finalized rules that took effect January 1, 2026, requiring businesses to complete a risk assessment before initiating any processing activity that presents “significant risk” to consumer privacy — which expressly includes selling or sharing personal data, processing sensitive personal information, and using automated decision-making technology for significant decisions affecting financial services, employment, housing, education, or healthcare.
Key California compliance deadlines:
- January 1, 2026: Risk assessments required for new significant-risk processing
- December 31, 2027: Retrospective risk assessments due for pre-2026 significant-risk processing that continues
- April 1, 2028: Attestation to the CPPA confirming required assessments were completed
If you’re a business with California consumer data and in-scope processing activities, that retrospective deadline is not hypothetical — it requires you to go back and document assessments for processing that may have started years ago.
The Required Elements: What Goes In a DPIA
Under GDPR Article 35(7), a DPIA must contain at minimum:
- A systematic description of the processing operations and their purposes, including any legitimate interests pursued
- An assessment of the necessity and proportionality of the processing in relation to the purpose
- An assessment of the risks to the rights and freedoms of data subjects
- The measures envisaged to address the risks — safeguards, security measures, and mechanisms ensuring data protection and demonstrating GDPR compliance
US state law requirements mirror this structure: description of processing, purpose, benefits and risks weighed against each other, and measures to mitigate risk.
The Element Most PIAs Miss: Necessity and Proportionality
Companies regularly skip or rush the second element because it feels abstract. It isn’t.
Necessity asks: Do you actually need this data to achieve this purpose? If you’re collecting date of birth “for verification purposes” but never verify age, the processing isn’t necessary. If you’re retaining five years of transaction data for fraud analysis but your fraud models only look back 90 days, the retention may exceed what’s necessary.
Proportionality asks: Is the privacy intrusion proportionate to the benefit? A model that marginally improves fraud detection accuracy by processing sensitive location data continuously may not pass proportionality if a less privacy-invasive approach could achieve comparable accuracy.
Regulators use these questions to challenge PIAs in enforcement. A DPIA that documents the processing without honestly engaging these questions won’t hold up. The ICO, in reviewing organizations’ DPIAs, expects to see genuine analysis — not boilerplate.
A PIA You Can Actually Use: The 7-Step Process
Step 1: Identify the trigger
Use a screening checklist embedded in your change management and vendor onboarding processes to flag new processing activities before work begins. The screening should ask: What data is involved? Does it include personal data? Sensitive personal data? Does it involve profiling, automated decisions, or new technology? Any “yes” answer above the screening threshold triggers a DPIA.
The key operational requirement: the screening has to happen before work begins, not after it ships. Build it into sprint planning, vendor contracting, and product review gates.
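The screening gate described above can be expressed as a simple decision function. This is an illustrative sketch only — the field names and the “any yes triggers a DPIA” threshold are assumptions for demonstration, not drawn from any statute or regulator’s methodology:

```python
from dataclasses import dataclass

# Hypothetical screening record; field names are illustrative.
@dataclass
class ScreeningInput:
    involves_personal_data: bool
    involves_sensitive_data: bool
    involves_profiling: bool
    involves_automated_decisions: bool
    involves_new_technology: bool

def needs_full_dpia(s: ScreeningInput) -> bool:
    """Flag a full DPIA when any high-risk screening answer is 'yes'."""
    if not s.involves_personal_data:
        return False  # no personal data in scope, no DPIA needed
    return any([
        s.involves_sensitive_data,
        s.involves_profiling,
        s.involves_automated_decisions,
        s.involves_new_technology,
    ])
```

Wiring a function like this into a ticketing or change-management form makes the gate automatic: the sprint or vendor contract cannot proceed until the screening record exists and has been evaluated.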
Step 2: Describe the processing
Document: what categories of data are collected and from what source; the legal basis for processing; the stated purpose; which systems and teams process the data; any third-party vendors or processors involved; the retention period; and whether automated decision-making or profiling is involved.
Step 3: Assess necessity and proportionality
For each category of data processed, ask: Is this required to achieve the stated purpose? Could the purpose be achieved with anonymized data, aggregate data, or less granular data? Is the retention period proportionate to the purpose? Are there less privacy-invasive means of achieving the same result?
Document your reasoning, not just your conclusion.
Step 4: Identify privacy risks
For each category of data, identify what could go wrong:
- Unauthorized access or disclosure
- Use beyond the stated purpose (purpose creep)
- Profiling leading to discriminatory outcomes
- Retention longer than necessary
- Individuals losing meaningful control over their data
- Data being used for profiling or automated decisions individuals aren’t aware of
Score each risk by likelihood and severity. Use consistent scoring so risks can be compared across assessments and tracked over time.
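A consistent scoring scheme is what makes risks comparable across assessments. As a sketch, here is a likelihood-times-severity score on a 1–5 scale; the scale and the band thresholds are assumptions chosen for illustration, not a regulatory standard:

```python
def score_risk(likelihood: int, severity: int) -> tuple[int, str]:
    """Score a risk on a 1-5 x 1-5 matrix and assign an illustrative band.

    Thresholds (15+ high, 8+ medium, else low) are assumptions."""
    assert 1 <= likelihood <= 5 and 1 <= severity <= 5
    score = likelihood * severity
    if score >= 15:
        band = "high"
    elif score >= 8:
        band = "medium"
    else:
        band = "low"
    return score, band
```

Whatever scale you adopt, the point is to use the same one in every assessment so that a “12” in a vendor DPIA means the same thing as a “12” in a product DPIA.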
Step 5: Identify controls and mitigations
For each identified risk, document the controls in place: encryption at rest and in transit, access controls and least-privilege enforcement, data minimization techniques, pseudonymization, contractual data processor obligations, audit logging, and purpose limitation enforcement. Then calculate residual risk after controls.
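Residual risk is simply the same score re-computed after the controls are applied. A minimal sketch, in which each control reduces likelihood and/or severity by an assumed amount (the control names and reduction values are hypothetical, for illustration only):

```python
# Hypothetical control effects: (likelihood reduction, severity reduction).
CONTROL_EFFECTS = {
    "encryption_at_rest": (1, 0),
    "pseudonymization": (0, 1),
    "least_privilege_access": (1, 0),
}

def residual_risk(likelihood: int, severity: int, controls: list[str]) -> int:
    """Re-score a risk after applying controls; neither factor drops below 1."""
    for control in controls:
        dl, ds = CONTROL_EFFECTS.get(control, (0, 0))
        likelihood = max(1, likelihood - dl)
        severity = max(1, severity - ds)
    return likelihood * severity
```

The design point is that residual risk is derived, not asserted: the assessment should show the inherent score, the controls claimed, and the arithmetic that gets you to the residual number.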
Step 6: Make the risk decision
If residual risk is high after controls, escalation is required. Under GDPR, when high residual risk cannot be adequately mitigated, the controller must consult the supervisory authority before proceeding — this is the “prior consultation” requirement under Article 36. Prior consultation takes time (the supervisory authority has up to eight weeks to respond, extendable by a further six weeks) and may result in restrictions on the processing.
Under US state privacy laws, the standard is whether “the benefits to the consumer, the controller, other stakeholders, and the public outweigh the risks.” Someone senior — DPO, General Counsel, or equivalent — needs to own that decision and sign off on it in writing.
Step 7: Document, approve, and maintain
The DPIA is a living document. Any material change to the processing activity — new data categories, expanded vendor access, new jurisdictions, algorithm changes, new purposes — requires revisiting the assessment. Build version control and review triggers into your records management process. A DPIA that was accurate at launch but not updated as the product evolved is not a defensible document.
Building the Screening Trigger Into Your Workflows
The most common PIA program failure isn’t the template — it’s the trigger. Companies build excellent DPIA templates and then discover that nobody uses them because the workflow doesn’t create a decision gate before processing begins.
Effective screening points:
Product development: All new features or significant changes involving personal data require a DPIA screening before the sprint begins. The privacy team reviews the screening and escalates to a full DPIA if needed.
Vendor onboarding: Any new vendor that will receive or process personal data triggers a screening before the contract is signed. Vendors that process sensitive data or enable automated decision-making require a full assessment. Your vendor due diligence process should include this step.
Change management: Material changes to existing processing activities — new algorithms, expanded data access, new retention periods — trigger a DPIA review of the existing assessment.
M&A and data asset acquisition: Acquiring a new data set or business that brings new processing activities requires assessing whether those activities require PIAs under your obligations.
Connecting PIAs to Your Broader Privacy Program
A PIA process doesn’t operate in isolation:
Data classification: You need to know what categories of data you’re processing to accurately assess DPIA triggers and risk levels. A classification schema that distinguishes personal, sensitive, and special-category data maps directly to PIA screening questions.
Data retention policies: Retention decisions made during PIAs should feed directly into your retention schedule. A PIA that concludes six months of retention is proportionate should create a retention rule — not just a statement in the assessment document.
DSAR response: Under GDPR, controllers must maintain records of processing activities (ROPA). PIA documentation forms part of that records structure. When individuals exercise rights around automated decision-making, your DPIA documentation tells you what assessments are relevant.
State privacy law obligations: Data protection assessment requirements vary by state. Your processing inventory should map each activity to the states where it applies, so you know which assessments are required in which jurisdictions — particularly as new state laws activate.
What Enforcement Tells Us
The DPIA requirement has appeared in enforcement in two ways. First, the failure to conduct required assessments is itself a violation. Under GDPR, this can trigger fines independent of any underlying breach or harm. Second, inadequate DPIAs — ones that document rather than analyze, or that are completed after the fact — have been cited in cases involving the mishandling of children’s data and novel data uses.
The pattern in these cases: companies had PIA processes on paper but hadn’t embedded them into operational workflows. The assessments that existed were backward-looking documentation exercises, not the pre-processing risk evaluations the regulation requires.
So What?
If you don’t have a PIA process, the highest-leverage starting point is the trigger, not the template. Build the screening checkpoint into your change management and vendor onboarding workflows before investing time in the template. A screening process that flags 90% of high-risk processing and routes it to a basic assessment is more valuable than an elaborate template that nobody uses because there’s no workflow to invoke it.
If you do have a process, pressure-test the trigger and the living-document maintenance:
- Can you identify every new significant processing activity from the last 12 months? Did each one go through a screening?
- Are your existing DPIAs current? Have any material changes to your processing occurred since they were last reviewed?
The California CPRA retrospective deadline creates a concrete deliverable for 2027: a documented assessment for every significant processing activity currently in operation. Starting that inventory now — before the attestation deadline — is significantly easier than doing it under time pressure in late 2027.
Privacy impact assessments done well are not compliance overhead. They’re the mechanism that forces a conversation about data minimization, vendor obligations, and processing purpose before those decisions become expensive to unwind.
Related Template
Data Privacy Compliance Kit
Multi-state privacy compliance templates covering 19 state laws plus GLBA and CCPA.
Rebecca Leung
Rebecca Leung has 8+ years of risk and compliance experience across first and second line roles at commercial banks, asset managers, and fintechs. Former management consultant advising financial institutions on risk strategy. Founder of RiskTemplates.