Imagine you post a video on Instagram about a protest in your neighbourhood. You filmed it, you captioned it, you shared it with your followers. Under India's existing rules, you are just a user—a person expressing themselves on a platform. Under the government's new draft rules, you might be something else: a publisher, subject to the same regulatory oversight as a media house.
That is the quiet but consequential shift buried inside the Ministry of Electronics and Information Technology's latest proposed changes to India's Information Technology Rules, published on March 30, with public comments due by April 14.
While the government has described the amendments as “clarificatory and procedural in nature”, the changes could significantly reshape how online content is regulated in India.
The Internet Freedom Foundation has warned that the amendments could have "far-reaching implications for free speech and intermediary governance in India" and demanded their immediate withdrawal.
So, What Is Actually Changing?
The draft touches three distinct areas of how online content is regulated, each with its own implications.
The most striking change concerns what counts as regulated content. Rule 8 of the existing IT Rules sets out an ethics framework for digital news publishers. The draft would now expand that framework to cover user-generated "news and current affairs" content on platforms.
In practice, this means a social media post, a WhatsApp video, a YouTube commentary on current events—content created by ordinary people, not journalists—could be pulled into the same regulatory mechanism that governs news organisations.
The problem, as legal experts note, is that "news and current affairs" is nowhere defined with precision.
Deepro Guha of The Quantum Hub called the provision “vague”. He told Decode, “Anyone could fall under this,” noting that almost anything on social media could qualify as current affairs.
That raises a key question: will everyday users now be expected to follow journalistic codes of ethics?
“That’s not very clear,” he said.
And then there is the chilling effect. According to tech advocate Salman Waris, "If users know that posting about current affairs could trigger regulatory scrutiny, they may choose not to engage at all, thereby shrinking a space that has become central to public discourse and dissent."
That kind of self-censorship, experts say, is often invisible and hard to undo.
Safe Harbour, With Strings Attached
The second and perhaps more structurally significant change is about safe harbour—the legal shield that protects platforms from liability for what their users post.
Currently, that protection holds so long as platforms follow clearly defined legal obligations, including responding to court orders and government notifications issued through formal channels. The draft would introduce a new sub-rule tying safe harbour to something much broader: compliance with government "clarifications, advisories, directions, standard operating procedures, codes of practice, and guidelines".
While the government has issued directions to platforms before, no rule previously tied compliance with them to a platform's safe harbour protection.
Waris explained the likely outcome: platforms could face a heightened risk of losing safe harbour even over directions that may not be clearly defined in law. "This creates an incentive to err on the side of caution," he said.
This means that platforms may take down or restrict content even when the legality of a direction is unclear. "The proposed amendment increases executive control over private platforms," Waris warned, raising concerns under Article 19(1)(a) of the Constitution, which guarantees the freedom of speech and expression.
Advocate-on-Record Utkarsh Kumar pointed to another dimension: the draft risks running counter to the Supreme Court's landmark 2015 ruling in Shreya Singhal v. Union of India. "The ruling read down Section 79 to require that takedown orders come either from a court or through a government notification under Section 69A, which carries strict procedural safeguards," he said. Informal advisories carry no such safeguards.
Guha, a partner at The Quantum Hub, added a practical concern: the government's reasoning may be rooted in speed. In a fast-moving tech landscape, formal lawmaking is slow, while advisories offer quicker responses. "But it raises the question—how can compliance with these interpretations be made effectively mandatory?" he added.
Deletion Isn’t Full Erasure
The third change is smaller but worth understanding. The draft tweaks how platforms handle deletion of content and user data under Rule 3(1) of the IT Rules, adding a line clarifying that any requirement to take down content or delete user data will be “without prejudice” to existing legal obligations to retain information.
So, your post may disappear from public view, but the platform may still be required to keep the underlying data—for tax authorities, law enforcement, or other regulatory purposes.
This creates a quiet tension with the Digital Personal Data Protection Act, 2023, which emphasises purpose limitation and data minimisation. Legal experts have flagged that these two frameworks may now pull in opposite directions.
A System Hard to Challenge
Across all three changes, one theme recurs: how difficult it is for ordinary users to push back.
The IT Rules do not provide a direct mechanism to challenge government-issued takedown orders. Kumar pointed to a deeper gap in accountability: such orders are not required to be made public, which limits transparency. "With the threat of losing safe harbour status, intermediaries are expected to comply promptly, leaving users in an accountability vacuum," he said.
For users, the process is layered. They must first approach the platform to understand why content was removed, and then decide whether to pursue legal action. "It becomes a complex, resource-heavy process," Guha said.
The end result, as Waris described it: "remedy exists, but is not easily accessible."
For changes of this potential consequence, for users, platforms, and independent journalism alike, the consultation window has itself become a point of criticism: the public was given barely two weeks, until April 14, to respond.
Kumar described the trajectory of user rights across these successive amendments as "a gradual dilution with each step". "Opaque mechanisms have increasingly replaced procedural safeguards," he said.
The government's stated goal is a safer, more accountable internet. The question being asked by lawyers, journalists, and rights groups alike is whether these rules achieve that—or whether, in doing so, they squeeze something out of the internet that was never supposed to leave.
A Shift Underway
The draft amendments to the IT Rules build on a shift that has been unfolding over the past few years.
Part of that shift became visible earlier this year, when the government reduced takedown timelines from 36 hours to three hours—prioritising speed in how platforms respond to flagged content. The latest draft goes a step further, tying safe harbour more closely to how platforms respond to government directions overall.
For Guha, this change reflects how the internet itself has evolved. Platforms, he said, are no longer just passive hosts. “Earlier, social media was just a space where people put up content for their friends,” he said. “Now, algorithms decide what you see, and those algorithms are designed by the platform.”
That shift gives platforms greater control over visibility and reach, Guha said, adding that laws are now trying to catch up. The question, he noted, is whether safe harbour can remain the same when platforms are “no longer just intermediaries, but something more”.
The current amendments reflect that thinking.
Waris described it as a structural shift, from an immunity-based model to a compliance-driven one. Platforms, he said, may increasingly have to align with government expectations to avoid losing protection.
That comes with trade-offs. “While this may strengthen the state’s ability to deal with harms like misinformation,” Waris said, “it also raises concerns about platform autonomy and the protection of free speech online.”
On user rights, Kumar reiterated that each amendment dilutes them further, raising the cost and effort for users seeking to protect their free speech rights against arbitrary takedown notices.
Public comments on the draft can be submitted to itrules.consultation@meity.gov.in by April 14.