Why nuanced oversight of social media platforms matters

Journalism is considered the “fourth pillar” of democracy, alongside the legislature, executive, and judiciary. It acts as a watchdog, ensuring accountability, transparency, and informed public opinion. A free press guards against the misuse of power and acts as a vital bridge between the government and citizens. Traditional media, such as print and television, have performed these duties for decades, contributing to India’s vibrant and evolving democracy. Content in these media is relatively static, edited, and validated for factual correctness, with scheduled publication cycles serving a defined geography. Journalists shoulder responsibility for such content.
Social media, by contrast, thrives on participation, turning passive viewers into creators and amplifiers. This fosters discussion but also risks creating echo chambers. Content on social platforms evolves in real time through edits and threads, while print is immutable once published; traditional media allows later revisions but lacks the crowd-sourced dynamism of social platforms. In print and digital media, content passes through mandatory editorial processes and is duly validated before release, whereas social media content carries no such checks and becomes problematic when rooted in falsehood or mala fide intent.
Social media poses significant challenges to ensuring good governance. Platforms, intentionally or otherwise, may enable the rapid dissemination of fake news and propaganda, which can polarise societies and undermine informed decision-making. During events such as the 2020 Delhi riots, false narratives exacerbated tensions, complicating government efforts to maintain order. This viral nature often outpaces fact-checking, leading to public outrage or misguided protests. Algorithms create filter bubbles that reinforce biases and limit exposure to diverse views. During the CAA protests, such echo chambers fuelled division rather than dialogue, making effective communication challenging for authorities.
Governments also face intense public scrutiny online, where even a single misstep can trigger widespread backlash requiring immediate response. Limited resources for monitoring and moderation further strain capacities, especially during coordinated disinformation campaigns. Efforts to counter such threats are often met with concerns related to surveillance and privacy.
Recently, the Ministry of Electronics and Information Technology (MeitY) has proposed amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. These draft amendments seek to revise the framework for regulating online news and current affairs content, as well as to strengthen compliance with advisories and directions issued by the Ministry. While the changes may be clarificatory and procedural in nature, if implemented well, they can help create a safer and more accountable internet, improve legal certainty, and address concerns arising from misinformation and harmful content. An open, safe, trusted, and accountable internet is a citizen’s right.
The expanded role of the Inter-Departmental Committee (IDC) allows it to consider a broader range of matters beyond complaints. This could enable better scrutiny of illegal content and a quicker response to rapidly spreading misinformation. The amendments also clarify that data retention obligations under the IT Rules operate “without prejudice” to other laws, potentially supporting investigative and legal processes.
Critics argue that the amendments represent an expansion of executive power over online speech. The insertion of Rule 3(4), which mandates intermediary compliance with government directions as a condition for retaining safe harbour under Section 79 of the IT Act, is seen as exceeding the rule-making powers conferred by the Act. Faced with the risk of losing safe harbour, intermediaries may over-comply, potentially leading to excessive censorship. Concerns have also been raised about the broad definition of “news and current affairs content”, which could bring a wide range of user-generated content, including social media posts and video commentaries, under regulatory purview. The rules also do not clearly address whether generative AI platforms qualify as intermediaries, which may lead to enforceability issues. While the government aims to create a more accountable and safer online environment, critics caution that such provisions, if applied broadly, could undermine media independence and create a climate of over-censorship.
It is relevant to refer to judicial developments in this context. The Karnataka High Court, in September 2025, dismissed X Corp’s (formerly Twitter) challenge against the Central Government’s “Sahyog” portal, a tool for issuing content takedown notices to social media platforms. This order explicitly validates the government’s regulatory power under the IT Act for public order and security, stating that foreign platforms must comply with Indian laws and cannot claim absolute Article 19 rights. The High Court ruling aligns with Article 19(2) restrictions on free speech for sovereignty and public order. The court described Sahyog as an “instrument of public good”, promoting cooperation between platforms and authorities, and rejected claims of overreach. It did not delve deeply into specific amendment challenges but reinforced that regulation is constitutionally permissible when proportionate. The Karnataka High Court decision provides key precedent affirming government authority despite ongoing debates.
The growing scale of information operations adds another dimension. Algorithmic bots and coordinated networks are increasingly shaping online discourse. Advances in artificial intelligence have enabled the creation of deepfake videos, which are difficult to detect and have a stronger impact than text-based misinformation. With enhanced scale and automation, such technologies pose serious challenges to democratic systems and public trust.
In this emerging scenario, bringing social media platforms and content creators under an appropriate oversight framework becomes essential for sustaining a vibrant democracy and supporting economic growth. As social media continues to evolve, regulatory frameworks must also adapt. The proposed amendments seek to strengthen accountability within the existing legal framework and do not confer any additional powers on the government. Ensuring that such measures are implemented with transparency, proportionality, and respect for constitutional principles will be key to achieving a balanced and effective digital ecosystem.
The writer is a former Group Coordinator (Cyber Law), Ministry of Electronics and Information Technology (MeitY). Views presented are personal.
