Business Standard

Intermediary liability law needs updating

- SUNIL ABRAHAM The writer is executive director, Centre for Internet and Society

There is a less charitable name for intermediary liability regimes like Section 79 of the IT Act — private censorship regimes. Intermediaries get immunity from liability arising from user-generated and third-party content because they have no “actual knowledge” until it is brought to their notice through “take down” requests or orders. Since some of the harm caused is immediate, irreparable and irreversible, notice-and-takedown is the preferred alternative to approaching courts in each case.

When intermediary liability regimes were first enacted, most intermediaries were acting as common carriers — ie they did not curate the experience of users in a substantial fashion. While some intermediaries like Wikipedia continue this common carrier tradition, others driven by advertising revenue no longer treat all parties and all pieces of content neutrally. Facebook, Google and Twitter do everything they can to raise advertising revenues. They make you depressed. And if they like you, they get you to go out and vote. There is an urgent need to update intermediary liability law.

In response to being summoned by multiple governments, Facebook has announced the establishment of an independent oversight board. A global free speech court for the world’s biggest online country. The time has come for India to exert its foreign policy muscle. The amendments to our intermediary liability regime can have global repercussions, and shape the structure and functioning of this and other global courts.

While with one hand Facebook dealt the oversight board, with the other hand it took down APIs that would enable the press and civil society to monitor political advertising in real time. How could it do that with no legal consequences? The answer is simple — those APIs were provided on a voluntary basis. There was no law requiring them.

There are two approaches that could be followed. One, as scholar of regulatory theory Amba Kak puts it, is to “disincentivise the black box”. Most transparency reports produced by intermediaries today are voluntary; there is no requirement for them under law. Our new law could require extensive transparency, with appropriate privacy safeguards, to the government, affected parties and the general public in terms of revenues, content production and consumption, policy development, contracts, service-level agreements, enforcement, adjudication and appeal. User empowerment measures in the user interface and algorithm explainability could also be required. The key word in this approach is transparency.

The alternative is to incentivise the black box. Here faith is placed in technological solutions like artificial intelligence. To be fair, technological solutions may be desirable for battling child pornography, where pre-censorship (or deletion before content is published) is required. Fingerprinting technology is used to determine if the content exists in a global database maintained by organisations like the Internet Watch Foundation. A similar technology called Content ID is used to pre-censor copyright infringement. Unfortunately, this is done by ignoring the flexibilities that exist in Indian copyright law to promote education, protect access to knowledge by the disabled, etc. Even within such narrow applications of these technologies, there have been false positives. Recently, a video of a blogger testing his microphone was identified as a pre-existing copyrighted work.
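The fingerprinting approach described above can be illustrated with a minimal sketch. This is a toy model, not how Content ID or the Internet Watch Foundation's systems actually work: the database contents and function names here are hypothetical, and real systems use perceptual fingerprints that survive re-encoding and cropping, whereas this sketch uses exact cryptographic hashes for simplicity.

```python
import hashlib

# Hypothetical database of fingerprints of known prohibited content,
# standing in for registries such as the IWF's hash lists.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"known infringing clip").hexdigest(),
}

def fingerprint(content: bytes) -> str:
    """Compute an exact SHA-256 fingerprint of the raw bytes.
    Real deployments use perceptual hashing, which tolerates edits."""
    return hashlib.sha256(content).hexdigest()

def should_block(upload: bytes) -> bool:
    """Pre-censorship check: block before publication if the upload's
    fingerprint matches the database. Exact hashing misses any edited
    copy (false negatives); perceptual hashing trades those for false
    positives — like the microphone-test video flagged as copyrighted."""
    return fingerprint(upload) in KNOWN_FINGERPRINTS
```

The design trade-off the column points to lives in that matching step: the looser the match required, the more legitimate content gets swept up, and the flexibilities in copyright law (fair dealing, accessibility) are invisible to the matcher either way.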

The goal of a policy-maker working on this amendment should be to prevent a repeat of the Shreya Singhal judgment, where sections of the IT Act were read down or struck down. To avoid similar constitutional challenges in the future, the rules should not specify any new categories of illegal content, because that would be outside the scope of the parent clause. The fifth ground in the list is sufficient — “violates any law for the time being in force”. An additional ground such as “harms minors in any way” is vague and cannot apply to all categories of intermediaries — for example, a dating site for sexual minorities. The rights of children need to be protected. But that is best done within the ongoing amendment to the POCSO Act.

As an engineer, I vote to eliminate redundancy. If there are specific offences that cannot fit in other parts of the law, those offences can be added as separate sections in the IT Act. For example, even though voyeurism is already criminalised in the IT Act, the non-consensual distribution of intimate content could be criminalised, as has been done in the Philippines.

Provisions that have to do with data retention and government access to that data — for the purposes of national security, law enforcement, and anonymised datasets in the public interest — should be in the upcoming Data Protection law. The intermediary liability rules are not the correct place to deal with this, because data retention may also be required of intermediaries that don’t handle any third-party information or user-generated content. Finally, there have to be clear procedures in place for reinstatement of content that has been taken down.


Disclosure: The Centre for Internet and Society receives grants from Facebook, Google and Wikimedia Foundation
