Social media should take responsibility for content they publish: Vaishnaw

Update: 2026-01-02 14:29 GMT

New Delhi: Social media firms should take responsibility for content they publish and a standing committee has already recommended a tough law to fix accountability of platforms, Union minister Ashwini Vaishnaw said on Friday.

The Centre earlier this week warned online platforms -- mainly social media firms -- of legal consequences if they fail to act on obscene, vulgar, pornographic, paedophilic and other forms of unlawful content.

"Social media should be responsible for the content they publish. Intervention is required," Vaishnaw said here on the sidelines of a Ministry of Electronics and IT (Meity) event.

He was replying to a question on the AI app Grok generating indecent and vulgar images of women.

Rajya Sabha member Priyanka Chaturvedi has also written to the minister seeking urgent intervention on increasing incidents of AI apps being misused to create vulgar photos of women and post them on social media.

"The standing committee has recommended that there is a need to come up with a tough law to make social media accountable for content they publish," Vaishnaw said.

The Parliamentary Standing Committee on the Ministry of Information and Broadcasting has recommended that the government make social media and intermediary platforms more accountable for the peddling of fake content and news.

The committee has endorsed suggestions from stakeholders such as enforcing transparency in algorithms, introducing stricter fines and penalties for repeat offenders, establishing an independent regulatory body, and using technological tools like AI to curb the spread of misinformation.

On December 29, MeitY asked social media firms to immediately review their compliance frameworks and act against obscene and unlawful content on their platforms, failing which they may face prosecution under the law of the land.

The advisory followed MeitY's observation that social media platforms have not been acting strictly against obscene, vulgar, inappropriate, and unlawful content.

Dhruv Garg, Partner at public policy firm IGAP, said the MeitY advisory to intermediaries does not establish any fresh legal obligations; rather, it reiterates that safe harbour protection hinges entirely on strict adherence to the due diligence requirements laid out in the IT Rules, 2021.

"Significant social media intermediaries are subject to stricter due diligence benchmarks. They must also deploy automated content moderation tools. The advisory signals that in light of widespread obscene content circulation, reactive content takedowns are inadequate, and platforms must actively fulfil their legal responsibilities or they may face criminal prosecution," he said.

Sanjeev Kumar, Senior Partner at Luthra and Luthra Law Offices India, said MeitY's advisory unequivocally states that non-compliance with the IT Act and the IT Rules, 2021 may result in legal consequences, including prosecution under the IT Act, the Bharatiya Nyaya Sanhita, 2023 (BNS), and other applicable criminal laws, and that such consequences may extend to intermediaries, platforms, and their users.

"This operates alongside the potential loss of safe-harbour protection under Section 79, exposing non-compliant entities to direct liability. The cumulative impact of these provisions heightens legal, financial, and reputational risk, making adherence not only a statutory duty but a business imperative," he said.