Intermediary liability law needs updating

Business Standard - OPINION - SUNIL ABRAHAM. The writer is executive director, Centre for Internet and Society

There is a less charitable name for intermediary liability regimes like Section 79 of the IT Act — private censorship regimes. Intermediaries get immunity from liability arising from user-generated and third-party content because they have no “actual knowledge” until it is brought to their notice through “take down” requests or orders. Since some of the harm caused is immediate, irreparable and irreversible, this is the preferred alternative to approaching the courts in each case.

When intermediary liability regimes were first enacted, most intermediaries were acting as common carriers — that is, they did not curate the experience of users in any substantial fashion. While some intermediaries like Wikipedia continue this common carrier tradition, others, driven by advertising revenue, no longer treat all parties and all pieces of content neutrally. Facebook, Google and Twitter do everything they can to raise advertising revenues. They make you depressed. And if they like you, they get you to go out and vote. There is an urgent need to update intermediary liability law.

In response to being summoned by multiple governments, Facebook has announced the establishment of an independent oversight board. A global free speech court for the world’s biggest online country. The time has come for India to exert its foreign policy muscle. The amendments to our intermediary liability regime can have global repercussions, and shape the structure and functioning of this and other global courts.

While with one hand Facebook dealt the oversight board, with the other it took down APIs that would enable the press and civil society to monitor political advertising in real time. How could it do that with no legal consequences? The answer is simple — those APIs were provided on a voluntary basis. There was no law requiring them.

There are two approaches that could be followed. One, as scholar of regulatory theory Amba Kak puts it, is to “disincentivise the black box”. Most transparency reports produced by intermediaries today are published on a voluntary basis; there is no requirement for this under law. Our new law could require extensive transparency, with appropriate privacy safeguards, for the government, affected parties and the general public — covering revenues, content production and consumption, policy development, contracts, service-level agreements, enforcement, adjudication and appeal. User empowerment measures in the user interface and algorithm explainability could also be required. The key word in this approach is transparency.

The alternative is to incentivise the black box. Here faith is placed in technological solutions like artificial intelligence. To be fair, technological solutions may be desirable for battling child pornography, where pre-censorship (or deletion before content is published) is required. Fingerprinting technology is used to determine if the content exists in a global database maintained by organisations like the Internet Watch Foundation. A similar technology called Content ID is used to pre-censor copyright infringement. Unfortunately, this is done by ignoring the flexibilities that exist in Indian copyright law to promote education, protect access to knowledge by the disabled, etc. Even within such narrow applications of these technologies, there have been false positives. Recently, a video of a blogger testing his microphone was identified as a pre-existing copyrighted work.

The goal of a policy-maker working on this amendment should be to prevent repeats of the Shreya Singhal judgment, in which sections of the IT Act were read down or struck down. To avoid similar constitutional challenges in the future, the rules should not specify any new categories of illegal content, because that would be outside the scope of the parent clause. The fifth ground in the list is sufficient — “violates any law for the time being in force”. Additional grounds, such as “harms minors in any way”, are vague and cannot apply to all categories of intermediaries — for example, a dating site for sexual minorities. The rights of children need to be protected. But that is best done within the ongoing amendment to the POCSO Act.

As an engineer, I vote to eliminate redundancy. If there are specific offences that cannot fit in other parts of the law, those offences can be added as separate sections in the IT Act. For example, even though voyeurism is criminalised in the IT Act, the non-consensual distribution of intimate content could also be criminalised, as has been done in the Philippines.

Provisions that have to do with data retention and government access to that data — for the purposes of national security, law enforcement, and anonymised datasets in the public interest — should be in the upcoming Data Protection law. The rules for intermediary liability are not the correct place to deal with this, because data retention may also be required of intermediaries that do not handle any third-party information or user-generated content. Finally, there have to be clear procedures in place for the reinstatement of content that has been taken down.


Disclosure: The Centre for Internet and Society receives grants from Facebook, Google and the Wikimedia Foundation.
