Hindustan Times (Ranchi)

Changing the status quo for social media companies in India

- Ambika Khanna is a senior researcher at the international law studies programme, Gateway House. The views expressed are personal

The ministry of electronics and information technology (MeitY) is engaged in a tussle with Twitter over its directions to block certain accounts. While the legal framework empowers the government to act, the episode throws up larger questions on policy gaps in content regulation, and areas of ambiguity even where policy exists. Globally, social media firms are protected by the “safe harbour” provision. This shields the intermediary, say Twitter or Google, from being penalised for harmful or unlawful content, if the content is not created or modified by it, or if the platform did not have knowledge of such content posted by a user.

The United States offers similar protection to internet companies through Section 230 of the Communications Decency Act. In Europe, the e-Commerce Directive, 2000, provides protection to internet intermediaries if they act only as a conduit and do not have knowledge of unlawful content. In recent years, the Indian judiciary has tried to clarify ambiguous provisions on the liability of intermediaries to take down unlawful content, while keeping in mind users' fundamental right to freedom of expression.

Europe is leading the effort to effectively regulate intermediaries. In 2020, building on its e-Commerce Directive, it introduced a comprehensive Digital Services Act covering the handling of online content, the liability and diligence requirements of intermediaries, and the protection of the fundamental rights of individuals. Obligations on intermediaries include timely notification of law enforcement agencies in case of illegal content, content takedown, transparency disclosures such as details of account suspensions and content removals, rules on digital advertising, the appointment of compliance officers, and annual audits.

Australia introduced stricter rules after the Christchurch terrorist attack. The Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act, 2019, mandates social media platforms to remove violent content and imposes a large penalty for non-compliance: 10% of the company's annual turnover.

In India, MeitY proposed amendments in 2018 to the extant Intermediary Guidelines of 2011 to include the mandatory use of technology in content moderation and data disclosures to the government. These are still under review as the government seeks to align them with the pending Personal Data Protection Bill.

While existing provisions give the State enough room to act, a change in the status quo, towards more credible and effective interventions, is urgently needed. This can only happen with the participation and deliberation of tech companies, civil society, academia and governments. Together, they can strike the necessary balance between controlling misinformation and unlawful content and protecting citizens' rights, including freedom of speech.

MeitY should consider following the guiding principles of transparency, accountability and grievance redressal. For transparency, each social media intermediary must disclose, in a timely manner, the process followed in moderating content, the technology applied, the categorisation of content as lawful or unlawful, and the taking down of content. For accountability, make the principle of “duty of care” central, ie, hold intermediaries responsible by imposing positive obligations on them to prevent users from harming others. And for grievance redress and dispute resolution, set up an independent quasi-judicial body with provisions for following the due process of law.

Additionally, MeitY may consider emulating the European classification of intermediaries, which places social media platforms under a separate sub-category, “online platforms”, with its own rules. Global rules on intermediary liability and content takedown are largely absent, and social media companies have been self-regulating. Here, the G20 Digital Economy Taskforce can play an important role. As internet giants operate across porous territorial boundaries, it can provide a neutral platform for sharing best practices to create global standards and guidelines for the liability of social media intermediaries.

