Amid growing concerns over deepfakes, the government has directed all platforms to comply with the IT rules, mandating that companies inform users in clear terms about prohibited content and cautioning that violations will attract legal consequences.
The IT Ministry will closely observe compliance by intermediaries (social media and digital platforms) in the coming weeks and decide on further amendments to the IT Rules or the law if and when needed, an official release said.
The government has made it clear to platforms that if violations of the IT rules are noted or reported, the consequences under law will follow.
The missive underlines a hardening of the government's stance on the issue, amid growing concerns around AI-powered misinformation in the form of deepfakes.
Earlier, several "deepfake" videos targeting leading actors went viral, sparking public outrage and raising concerns over the misuse of technology and tools for creating doctored content and fake narratives.
The advisory mandates that intermediaries, such as WhatsApp, Facebook and X, clearly and precisely communicate to users the content prohibited under the IT Rules.
"The Ministry of Electronics and Information Technology (MEITY) has issued an advisory to all intermediaries, ensuring compliance with the existing IT rules," the official release said.
The directive specifically targets the growing concerns around AI-powered misinformation, or deepfakes, the release said.
This advisory is the culmination of discussions spearheaded by Minister of State for IT Rajeev Chandrasekhar with intermediaries on the issue.
"The content not permitted under the IT Rules, in particular those listed under Rule 3(1)(b), must be clearly communicated to the users in clear and precise language including through its terms of service and user agreements and the same must be expressly informed to the user at the time of first-registration and also as regular reminders, in particular, at every instance of login and while uploading/sharing information onto the platform," according to the advisory.
The advisory emphasises that digital intermediaries must ensure users are informed about penal provisions, including those in the Indian Penal Code (IPC) and the IT Act, 2000.
In addition, the advisory said the terms of service and user agreements must clearly highlight that intermediaries/platforms are obliged to report legal violations to law enforcement agencies under the relevant Indian laws applicable to the context.
"Rule 3(1)(b) within the due diligence section of the IT rules mandates intermediaries to communicate their rules, regulations, privacy policy, and user agreement in the user's preferred language," it said.
Notably, Rule 3(1)(b)(v) explicitly prohibits the dissemination of misinformation and patently false information.
Digital platforms are obliged to make reasonable efforts to prevent users from "hosting, displaying, uploading, modifying, publishing, transmitting, storing, updating, or sharing any information related to the 11 listed user harms or content prohibited" on digital intermediaries.
The rule aims to ensure platforms identify and promptly remove misinformation, false or misleading content, and material impersonating others, including deepfakes.
Over the last month, in his meetings with industry leaders on the pressing issue of deepfakes, the minister has highlighted the urgency for all platforms and intermediaries to strictly adhere to current laws and regulations, emphasising that the IT rules comprehensively address the menace of deepfakes.
"Misinformation represents a deep threat to the safety and trust of users on the Internet," Chandrasekhar said, adding that deepfake, which is misinformation powered by AI, further amplifies the threat to safety and trust of users.
"On November 17, PM alerted the country to the dangers of deepfakes and post that, the ministry has had two Digital India Dialogues with all the stakeholders of the Indian Internet to alert them about the provisions of the IT Rules notified in October 2022, and amended in April 2023 that lays out 11 specific prohibited types of content on all social media intermediaries and platforms." Consequently, all intermediaries were asked to exercise due diligence in promptly removing such content from their platforms. He also emphasised that platforms have been duly informed about the legal consequences associated with any violations under the IT rules.
"Today, a formal advisory has been issued incorporating the 'agreed to' procedures to ensure that users on these platforms do not violate the prohibited content in Rule 3(1)(b) and if such legal violations are noted or reported then the consequences under law will follow," the minister said.