
Best Media Info


IT Rules 2021: Legitimising platform discretion

Aayush Soni, Head of Communications at Koan Advisory, New Delhi, writes that stakeholders should come together to craft a contemporary set of community standards that are fair, expansive, transparent and do justice to the 21st century’s complex digital world

Aayush Soni

Ever since they were announced by the government on February 25, the IT Rules, 2021 have triggered considerable debate in the media. Experts have pointed out that these guidelines will constrict the creative freedom of over-the-top (OTT) platforms. Critics and legal experts have highlighted how the applicability of certain legal provisions to news media smacks of government interference and is untenable. Still others have pointed out how some rules, applicable to social media intermediaries, will compromise users’ privacy. In the din of this noisy conversation, one of the guidelines has gone unnoticed.

A due diligence clause in the IT Rules requires social media intermediaries not to host, publish or transmit content that “is patently false and untrue, and is written or published in any form, with the intent to mislead or harass a person”. In other words, platforms now have legal sanction to remove content which they think is false – even if it isn’t. It gives legal cover to the power of these intermediaries to arbitrarily remove content that doesn’t meet their “community standards”. Platforms have often used these standards as an instrument to pre-censor or remove content that, in their view, was problematic. Users are seldom given reasons why their posts violated community guidelines, and often the posts that are removed are neither false nor hateful. Conversely, platforms rarely take action against posts that are prima facie inflammatory.


In February this year, a New York Times article pointed out how Facebook rejected an ad posted by Mighty Well, an adaptive clothing company, because the platform claimed it was promoting “medical and health care products and services, including medical devices,” even though Mighty Well wasn’t doing so. On the other hand, the platform seemed reluctant to take down posts that blatantly targeted minority groups and violated its community guidelines, as pointed out in a Wall Street Journal article in August.

In this context, formulating rules that merely reinforce community standards will not help the government achieve its objective of removing untrue content. The need for such standards, which reflect contemporary social mores, was flagged by the Supreme Court of India in the Aveek Sarkar v. State of West Bengal case.


In its judgment, the Court said that “the obscenity has to be judged from the point of view of an average person, by applying contemporary community standards”. By extension, contemporary community standards should also apply to posts which, though not obscene, can be construed as problematic by intermediaries. However, stakeholders in India’s internet ecosystem have not come up with consistent community standards that can be applied to social media posts. As a result, we now have platforms basing their content moderation decisions on opaque principles known only to them. The ultimate loser is the average user, whose right to free expression online gets curbed.

Instead, a more constructive approach would be for stakeholders to come together and craft a contemporary set of community standards that are fair, expansive, transparent and do justice to the 21st century’s complex digital world.

(Disclaimer: The opinions expressed in this article are those of the author. The facts and opinions appearing in the article do not reflect the views of BestMediaInfo.com and we do not assume any responsibility or liability for the same.)

