Hindustan Times (Lucknow)

Reconsider the proposal to regulate online content

Given the inconsistencies with legal precedent, the draft rules may not stand judicial scrutiny

- GURSHABAD GROVER

Gurshabad Grover is a senior policy officer at the Centre for Internet and Society. The views expressed are personal

Flowing from the Information Technology (IT) Act, India’s current intermediary liability regime roughly adheres to the “safe harbour” principle: intermediaries (online platforms and service providers) are not liable for the content they host or transmit as long as they act as mere conduits in the network, do not abet illegal activity, and comply with requests from government bodies and the judiciary. This allows intermediaries that primarily transmit user-generated content to provide their services without constant fear of liability, and the principle can be partly credited for the proliferation of online content.

On December 24, the government published and invited comments on the draft intermediary liability rules. These rules significantly expand the “due diligence” intermediaries must observe to qualify for safe harbour: they mandate enabling the “tracing” of the originator of information, taking down content within 24 hours of a government or court order, and responding to information requests and assisting investigations within 72 hours. Most problematically, the draft rules go much further than the stated intention of battling “fake news”: draft Rule 3(9) mandates that intermediaries deploy automated tools for “proactively identifying and removing [...] unlawful information or content”.

The first glaring problem is that “unlawful information or content” is not defined. A conservative reading of the draft rules would presume that the phrase refers to the restrictions on free speech permissible under Article 19(2) of the Constitution, including those relating to national integrity, “defamation” and “incitement to an offence”. Ambiguity aside, is mandating intermediaries to monitor for “unlawful content” a valid requirement under “due diligence”? If, to qualify as a safe harbour, an intermediary must monitor for all unlawful content, is it then substantively different from an intermediary that has active control over its content and is therefore not a safe harbour? Clearly, the requirement of monitoring for all “unlawful content” is so onerous that it is contrary to the philosophy of safe harbours envisioned by the law.

By mandating automated detection and removal of unlawful content, the proposed rules shift the burden of appraising the legality of content from the state to private entities. The rule may run afoul of the Supreme Court’s reasoning in Shreya Singhal v Union of India, wherein it read down a similar provision because, among other reasons, it required an intermediary to “apply [...] its own mind to whether information should or should not be blocked”. Since then, “actual knowledge” of illegal content has been held to accrue to an intermediary only when it receives a court or government order. Given the inconsistencies with legal precedent, the rules may not stand judicial scrutiny in their current form.

The lack of technical consideration in the proposal is also apparent, since implementing it is infeasible for certain intermediaries. End-to-end encrypted messaging services cannot “identify” unlawful content because they cannot decrypt it. Internet service providers also qualify as safe harbours: how will they identify unlawful content when it passes through their networks encrypted? Intermediaries that can implement the rules, such as social media platforms, will leave the task to algorithms that perform poorly even at narrower tasks, such as detecting copyright infringement. Identifying contextual expression, such as defamation or incitement to an offence, is a much harder problem. Platforms will be happy to avoid liability by taking content down without verifying whether it actually violates the law. The draft rules also do not mandate an appeal mechanism for users whose content is taken down. Given the wide amplitude and ambiguity of India’s restrictions on free speech, online platforms will end up removing swathes of content to avoid liability.

The draft rules follow India’s proclivity to join the ignominious company of authoritarian nations when it comes to disrespecting protections for freedom of expression. To add insult to injury, the draft rules are abstruse, ignore legal precedent, and betray a poor technological understanding. The government should reconsider the proposed regulation and the stance that inspired it, both of which are unsuited to a democratic republic.