
DPDP rules: User rights constrained by state exemptions, loopholes, says activist
DPDP rules introduce breach notifications, correction and erasure rights, but govt exemptions, loopholes, weak oversight threaten user privacy, says Nikhil Pahwa
India’s first digital privacy law—the Digital Personal Data Protection Act—has now taken effect, setting out how companies collect, use and store personal data. But as the government notifies detailed rules, concerns are mounting over broad state exemptions, weak oversight, and gaps around AI.
In this interview, digital rights activist Nikhil Pahwa breaks down what the new framework means for ordinary users and where it falls short.
What is your overall impression of the DPDP rules, especially from a user’s perspective?
The rules are a welcome but limited step. For the first time, users get basic rights: mandatory breach notifications, the ability to correct inaccurate digital data, clearer deletion norms once a service ends, grievance redress channels, and even the option to nominate someone to access your account after death.
But protections apply only to digital data—physical records like building registers remain uncovered. And despite these improvements, the law contains serious gaps. The government can demand user data from companies without informing individuals; AI firms can scrape publicly available personal data without consequence; and retention exemptions across sectors weaken the right to erasure.
Add to this the uncertainty around potential data-localisation orders, and the result is a framework where user rights exist on paper but are constrained by wide state exemptions and practical loopholes.
Was the process of drafting the Act and rules transparent? Was enough public input taken?
There was a consultation process, but it lacked transparency. Under pre-legislative policy norms, MeitY should publish public submissions; it did not. This pattern isn’t new—similar opacity marked the Broadcast Bill consultations. When RTIs were filed, the government even contacted contributors asking if their submissions could be published, weakening the spirit of the RTI Act.
This erosion of transparency and accountability is troubling, especially since the rules include no exemption for journalism, which could affect investigative reporting. Overall, the process was not transparent.
Rule 23 allows the government to seek data from companies on grounds of sovereignty, integrity, or security, without informing the individual. What does this mean for user privacy and accountability?
Rule 23 lets the government obtain your data from private companies without notifying you. You have no right to be informed, no right to object, and no way to know what was taken. There is no oversight, no warrants, and no judicial scrutiny. Companies effectively become extensions of state surveillance.
This undermines the 2017 right-to-privacy judgment, which protected citizens against the state. Ironically, a privacy law is now enabling more surveillance.
Rule 8 says companies and the government can retain personal data for years even after it’s no longer needed. Does this pose a risk to users?
Retention requirements embedded in other laws—payments, taxation, e-commerce—mean companies often must store data for years. Because the DPDP Act carves out exemptions for such obligations, the right to erasure becomes weak in practice. Users cannot always demand deletion even after discontinuing a service.
This increases risks: long-term storage means long-term vulnerability to breaches. While the erasure right looks strong on paper, existing compliance laws limit its effect.
Rule 10 mandates strict age verification and parental consent for children. How do you view this requirement?
It is deeply flawed. Many Indian children are primary digital users in their homes. Requiring parental consent for anyone under 18 is impractical and discourages digital literacy.
It also creates consent fatigue—parents will constantly be asked to approve access. Even accessing a news website could require consent. The Act itself is rigid, and while the rules try to soften some parts, rules cannot override the law. Children may simply lie about their age, leaving platforms in a legally grey space.
Extensive feedback highlighting these issues was ignored. Instead of amending the Act, the government attempted to fix problems through rules—creating more confusion.
The Data Protection Board is expected to protect user rights, but it is fully controlled by the government. Can it function independently?
Independence is doubtful. Unlike most global privacy regimes, India has a Board rather than an independent authority with rule-making power. Members are appointed by the government, and eligibility extends even to officials serving in government bodies. And there are only four members to oversee the privacy of 1.4 billion people.
Users must go to the Board before approaching courts, creating a lengthy process. Enforcement capacity and legitimacy will be major challenges; structurally, the Board is weak.
Finally, what else in the rules stands out to you?
The law took so long that technology has outpaced it. AI-related privacy harms—such as scraping publicly available data—are left unaddressed. AI models trained on personal data cannot easily be “untrained,” raising practical questions about the right to erasure.
The consent-manager framework also reduces friction for data sharing and doesn’t belong in a privacy law. With account aggregators already enabling cross-platform data flows, inferences about users—such as lifestyle habits—can be made without consent.
The DPDP Act eases—not restricts—such data flows. It’s unclear how this protects privacy. Significant challenges remain, and the rules don’t offer easy answers.

