India Today

CRIME WATCH OR SURVEILLANCE TECH?

By Vidushi Marda

India is on the threshold of setting up a gigantic national Automated Facial Recognition System (AFRS), which the National Crime Records Bureau (NCRB), under the home ministry, has been at pains to insist will be used exclusively for “criminal identification, verification and its dissemination among various police organisations and units across the country”. The contract bids, invited in July, were to close earlier this month. Possibly the world’s largest facial recognition project, the proposed AFRS will identify criminals by matching images harvested from diverse sources, such as CCTV cameras, newspapers and raids, against a centralised database, the Crime and Criminal Tracking Network and Systems (CCTNS).

To take the NCRB’s framing at face value is to believe that the AFRS concerns only those who have had a brush with law enforcement, and only those individuals whose faces are stored in the CCTNS. A closer inspection, however, paints a different picture. If implemented, the AFRS will activate a seamless surveillance mechanism, affecting not only those considered ‘criminals’ but every individual who walks past a CCTV camera, who owns a passport, who has ever provided a photograph of herself to the government. It will create a biometric map of her face and store this sensitive data to be used, analysed and matched at any later point in time, violating the principle of consent, with few (if any) limitations on how the data can be shared, accessed or mined. Essentially, the AFRS will enable mass data collection without an underlying legal basis.

Why is a ‘legal basis’ crucial? Use cases of facial recognition around the world have shown that it can be used as a tool to monitor peaceful protests, to profile minorities, to create arbitrary watchlists of supposedly ‘suspicious’ people, and even to infer emotions from facial expressions (a claim that is scientifically unsound at the very outset). It turns the principle of ‘innocent until proven guilty’ on its head: under the AFRS, we are all potential criminals, presumed guilty, until an opaque, imperfect and inscrutable system confirms we are not.

Even so, the justification for State use of facial recognition technology is usually its ability to enhance public safety and security. This justification is, however, both technically and legally flawed.

The AFRS will rely primarily on machine learning, the most popular subset of artificial intelligence (AI) techniques, to identify, biometrically map and recognise faces. These machine learning systems are far from perfect; in fact, there is overwhelming evidence of technical limitations, even in the most advanced applications. Recently, police trials of facial recognition in London revealed such poor accuracy rates and operational shortcomings that the House of Commons Science and Technology Committee called for a moratorium on the use and trial of the technology. Even police departments in the UK are resisting trials.

The technical limitations of these technologies are even more worrying because facial recognition is particularly unreliable and inaccurate in the case of women, children and ethnic minorities. False positives leave misidentified individuals vulnerable to harassment and wrongful arrests, in the absence of any meaningful redress or accountability mechanisms. Facial recognition pilots run by the Delhi police have demonstrated an accuracy rate of less than 1 per cent; in a trial meant to find missing children, the system had a hard time even differentiating between boys and girls.

Technical challenges aside, the legal basis of the AFRS is unclear. Responding to a legal notice sent by the Internet Freedom Foundation (IFF), the home ministry stated that the legality of the AFRS stems from a Cabinet note of 2009. But a Cabinet note does not have the force of law, and thus cannot be the basis on which such a system is rolled out.

In 2017, the Supreme Court of India reaffirmed the Right to Privacy under the Indian Constitution, explicitly stating that this right extends to public spaces. Importantly, the court laid down a four-part proportionality test that any infringement of the right to privacy must satisfy: even for law enforcement-related collection of personal data, the infringing action must be shown to pursue a legitimate aim, to bear a rational connection with that aim, and to be necessary and proportionate. Applying this standard to a case concerning government surveillance, the Bombay High Court held in 2019 that the State cannot simply invoke ‘law and order’ or ‘national security’ to restrict the right to privacy; it must satisfy the four-part test. The procedural and substantive basis of the AFRS does not meet this standard.

The AFRS tender assumes that facial recognition technology is a panacea for complex social problems, without meaningfully engaging with what the technology truly is. The tender award has been delayed five times so far, possibly because Indian firms have complained that the bid requirements shut them out, or because of the IFF’s legal notice, which raises questions about privacy, consent and legality that the government is yet to answer. In either case, the home ministry still has time to reconsider its tender, and the assumptions that underlie it. It is crucial that it does so. ■
