Hindustan Times (Bathinda)

With their hypocrisy and irresponsibility, how intermediaries hurt India

While platforms report crimes of abuse and rape in the West, they refuse to do so here. This must change

APARNA BHAT — Aparna Bhat is a Supreme Court lawyer. The views expressed are personal.

Social media — the barrier-free communication enabler — has its dark side. The Chennai case, currently in the Supreme Court (SC), is a case in point. Two observers of jallikattu appointed by the Animal Welfare Board of India were threatened on social media. When their complaint brought no redressal, they filed two public interest litigations (PILs) asking that Aadhaar be made a prerequisite for creating a social media account. Besides being invasive, in my opinion, the Aadhaar/KYC requirement is a myopic approach to addressing cyber crimes, which operate in nebulous spaces beyond sovereign boundaries. Sharing secure data also carries potentially dangerous consequences.

However, harmful content and its arbitrary circulation need regulation. Like any agency that enables illegal activity to permeate through its spaces, intermediaries too are liable for content posted if they do not take corrective action. The high court's genuine intervention of exploring options, common in all PILs with wide ramifications, has suddenly raised questions about invasion of privacy, curtailment of freedom of speech, judicial overreach and over-regulation. Camouflaged under all these arguments is an effort to oppose any external attempt to regulate illegal content.

Since 2015, I have been arguing a PIL (In Re: Prajwala) in the SC pertaining to the arbitrary and rampant circulation of Child Sexual Abuse Material (CSAM) and Rape and Gang Rape (RGR) imagery on social media. Through the case, and as a member of a committee appointed to explore technical solutions to pre-empt the circulation of these materials, I have closely engaged with intermediaries on the question of regulating clearly illegal content. While opposing regulation and steadfastly advocating zero accountability, they also talk about privacy and freedom of speech.

Relying on a judgment of the SC in the Shreya Singhal case, intermediaries claim no liability. This argument is completely misplaced. Section 79 of the IT Act, which grants exemption to intermediaries in some cases, makes a clear exception for illegal content, expecting the intermediary to observe "due diligence while discharging his duties under this Act and also observes such other guidelines as the Central government may prescribe in this behalf". In the Singhal case, the SC, while refusing to strike down intermediary responsibility under Section 79(3)(b), narrowed illegal content to include only the exceptions to freedom of speech under Article 19(2) of the Indian Constitution.

The argument of privacy is ironic. On social media, surveillance is the default and privacy is an option. Disregarding user privacy, platforms invest significant resources in developing algorithms that tap into users' personal preferences so that advertisements appear seamlessly. Purportedly developed to enhance the quality of services, this is actually a ruse to promote their economic interests. It is appalling that they refuse to invest in, or commit to, anything substantial that would assist in identifying the genesis of illegal material.

During the course of research for the Prajwala case, one found several initiatives across the globe to mitigate the damage, at least for CSAM. To name a few: in the US, there is a mandatory reporting process in which intermediaries report all CSAM on their portals to the National Center for Missing & Exploited Children. In Canada, an agency uses crawler technology developed by Microsoft to weed out CSAM. Quite a few intermediaries contribute to the Internet Watch Foundation in the UK, which traces CSAM. However, for reasons one cannot fathom, they refuse to take any tangible action in India. For instance, Microsoft, which licenses its PhotoDNA technology — a technology to detect known images — free of charge elsewhere, refused India the same and is instead selling a similar software. While intermediaries proudly claim to mandatorily report CSAM in the US, they refuse to comply with a similar statutory requirement in India.

In the Prajwala case, almost all suggestions made to mitigate the damage — pop-up warnings, reporting RGR content since it is not reported elsewhere, retaining India-based data — were rejected by intermediaries on specious grounds. For instance, a small change like activating an easily accessible "report" prompt took WhatsApp many months of sittings and a few court hearings. Even today, the consequence of reporting to WhatsApp is not clear. Similarly, to discourage mechanical forwarding, a suggestion that WhatsApp add to its existing automated message a line indicating that the person forwarding a message is liable for its contents was vehemently opposed. Except for making some changes to the internal reporting system after the court compelled them, the intermediaries have taken no effective steps to mitigate the problem.

Cyber cell officers across the country have shared that it is next to impossible to get full cooperation from the intermediaries even in serious cases like human trafficking. We are dealing with a crime scene that offers unimaginable advantage to the perpetrators. Regulation is necessary. Cooperation is key. It is time the intermediaries stopped their doublespeak and took action.


