India's DPDP Act and AI: What Every CTO Needs to Know in 2026
The Digital Personal Data Protection Act, 2023 (DPDP Act) is India's comprehensive data privacy law. For companies building AI products that process the personal data of Indian users, understanding its requirements is essential: penalties run up to ₹250 crore per violation, and user trust, once lost, is hard to rebuild.
Key Provisions Affecting AI Products
Consent Requirements
The DPDP Act requires free, specific, informed, and unambiguous consent before processing personal data. For AI applications, this means:
- Users must know their data is being processed by AI
- Purpose limitation applies: data collected for one purpose cannot be used for another
- Consent must be freely given, specific, and withdrawable
Data Principal Rights
Users (data principals, in the Act's terminology) have the right to:
- Access information about how their data is processed
- Correct inaccurate personal data
- Erase their personal data (right to be forgotten)
- Nominate another person to exercise their rights
Data Fiduciary Obligations
Companies processing personal data must:
- Implement appropriate security safeguards
- Notify the Data Protection Board of India and affected users of personal data breaches
- Appoint an India-based Data Protection Officer if designated a Significant Data Fiduciary
- Conduct periodic Data Protection Impact Assessments (also a Significant Data Fiduciary obligation)
How This Affects AI Architecture
Data Collection Layer
- Implement granular consent management
- Track consent states per user per data type
- Ensure consent withdrawal triggers data deletion across all downstream systems, including AI training datasets
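A minimal sketch of what granular, per-user, per-data-type consent tracking can look like. The class and field names here are illustrative assumptions, not part of the Act; the key properties it demonstrates are auditable append-only records, purpose limitation, and withdrawal winning over an earlier grant:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum

class ConsentState(Enum):
    GRANTED = "granted"
    WITHDRAWN = "withdrawn"

@dataclass
class ConsentRecord:
    user_id: str
    data_type: str       # e.g. "chat_history", "email"
    purpose: str         # purpose code the consent was given for
    state: ConsentState
    timestamp: datetime

class ConsentLedger:
    """Append-only ledger: the latest record per (user, data_type, purpose) wins."""

    def __init__(self) -> None:
        self._records: list[ConsentRecord] = []

    def record(self, rec: ConsentRecord) -> None:
        # Never overwrite history; withdrawals are new records, preserving the audit trail.
        self._records.append(rec)

    def is_permitted(self, user_id: str, data_type: str, purpose: str) -> bool:
        latest = None
        for r in self._records:
            if (r.user_id, r.data_type, r.purpose) == (user_id, data_type, purpose):
                if latest is None or r.timestamp > latest.timestamp:
                    latest = r
        # No record at all means no consent: purpose limitation by default.
        return latest is not None and latest.state is ConsentState.GRANTED
```

Because the ledger is append-only, it doubles as the auditable consent record the checklist below calls for; a withdrawal should additionally trigger the downstream deletion workflow.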
Data Processing Layer
- Implement PII redaction before sending data to third-party LLM providers
- Log all data processing activities with purpose codes
- Ensure data minimization: process only what is necessary for the stated purpose
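A regex-based sketch of pre-call PII redaction, run before any prompt leaves for a third-party LLM. The patterns and labels are illustrative assumptions only; production systems typically layer NER-based detection on top of patterns like these, and ordering matters (the 12-digit Aadhaar pattern runs before the 10-digit phone pattern to avoid partial matches):

```python
import re

# Illustrative patterns only; real deployments need broader coverage.
PII_PATTERNS = [
    ("AADHAAR", re.compile(r"\b\d{4}\s\d{4}\s\d{4}\b")),   # space-separated 12-digit ID
    ("PHONE",   re.compile(r"\b\d{10}\b")),                 # bare 10-digit number
    ("EMAIL",   re.compile(r"[\w.+-]+@[\w-]+\.\w{2,}")),
]

def redact(text: str) -> str:
    """Replace detected PII with bracketed type labels before an external API call."""
    for label, pattern in PII_PATTERNS:
        text = pattern.sub(f"[{label}]", text)
    return text
```

Keeping the type labels (rather than deleting the spans outright) preserves enough context for the LLM to produce a coherent response while the raw identifiers never leave your boundary.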
Data Storage Layer
- Implement data retention policies aligned with stated purposes
- Enable data portability exports
- Support deletion requests across all storage systems
Cross-Border Data Transfer
The DPDP Act takes a negative-list approach to cross-border transfers: personal data may be sent to any country except those restricted by central government notification. AI products must:
- Know where their LLM providers process data
- Ensure provider data centers are not in restricted jurisdictions
- Document the legal basis for any cross-border data transfer
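The three requirements above can be combined into a single gate in front of every outbound transfer. This is a sketch under the negative-list assumption; the restricted entry is a placeholder, and in practice the list comes from government notification:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Placeholder entry; populate from the government-notified restricted list.
RESTRICTED_JURISDICTIONS = {"COUNTRY_A"}

@dataclass
class TransferRecord:
    provider: str      # e.g. the LLM vendor
    region: str        # where the provider processes the data
    legal_basis: str   # documented basis for this transfer
    at: datetime

def log_transfer(provider: str, region: str, legal_basis: str,
                 audit_log: list[TransferRecord]) -> None:
    """Block transfers to restricted jurisdictions; record everything else."""
    if region in RESTRICTED_JURISDICTIONS:
        raise PermissionError(f"Transfers to {region} are restricted")
    audit_log.append(TransferRecord(provider, region, legal_basis,
                                    datetime.now(timezone.utc)))
```

Routing every external AI call through a gate like this gives you the documented legal basis per transfer for free, as a side effect of the audit log.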
Implementation Checklist
- Map all personal data flows in your AI application
- Implement consent management with auditable records
- Deploy PII redaction for all external AI API calls
- Set up data subject request handling workflows
- Configure data retention policies per data category
- Document your AI-specific Data Protection Impact Assessment
- Establish breach notification procedures
Common Compliance Gaps in AI Products
Gap 1: Training data provenance. If your AI model was trained on data from Indian users, you need consent records for that training data.
Gap 2: Embedding stores. Vector databases used in RAG applications contain encoded personal data. Deletion requests must cover embeddings, not just source documents.
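A sketch of an erasure workflow that closes this gap by deleting from the document store and the vector store in one pass. The store interfaces here are hypothetical stand-ins; real vector databases expose metadata-filtered deletes that serve the same role:

```python
class VectorStore:
    """Toy stand-in for a vector database with per-vector user metadata."""

    def __init__(self) -> None:
        self.vectors: dict[str, dict] = {}  # vector_id -> {"embedding": [...], "user_id": ...}

    def delete_by_user(self, user_id: str) -> int:
        doomed = [vid for vid, meta in self.vectors.items() if meta["user_id"] == user_id]
        for vid in doomed:
            del self.vectors[vid]
        return len(doomed)

def handle_erasure_request(user_id: str, doc_store: dict,
                           vector_store: VectorStore) -> tuple[int, int]:
    """Erase a user's source documents AND their derived embeddings together."""
    removed_docs = [k for k, v in doc_store.items() if v["user_id"] == user_id]
    for k in removed_docs:
        del doc_store[k]
    removed_vecs = vector_store.delete_by_user(user_id)
    return len(removed_docs), removed_vecs
```

The essential design point is that both deletions happen in the same workflow keyed on the same user identifier, which requires tagging every embedding with its owner at ingestion time.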
Gap 3: Third-party model providers. When user data transits through OpenAI, Anthropic, or other providers, you, as the data fiduciary, remain responsible for DPDP compliance.
Gap 4: Inferred data. AI systems often infer personal attributes (income level, health status, preferences) from aggregate data. These inferences may constitute personal data under the Act.
Conclusion
The DPDP Act fundamentally changes how AI products must handle Indian user data. Compliance is not a one-time checkbox but an ongoing architectural commitment.
Start with data mapping and consent management. Layer in PII redaction for all external AI calls. Build deletion workflows that cover embedded and inferred data. And document everything for the inevitable audit.