Hindustan Times (Delhi)

Is use of facial recognition by law enforcement agencies a good idea?

Such technologies could be useful but the state must evaluate their impact on liberties and rights

- Amber Sinha is a lawyer based at the Centre for Internet and Society. The views expressed are personal

In the last few years, law enforcement agencies in India have shown considerable interest in employing tools that make use of big data and artificial intelligence technologies. These include social media monitoring, DNA profiling and, most recently, image recognition and analysis technologies. A recent investigative report identified the use of image and facial recognition technologies by different state police departments for a variety of purposes, such as prevention of trafficking of missing children, detection of crime in crowded areas, tracking of persons of interest, and live streaming with detection against a database.

Facial recognition technology is essentially a kind of biometric identification technology, much like fingerprints and iris scans. Using local feature analysis algorithms, the technology analyses photographs and video to measure metrics such as the shape of the chin, the distance between the eyes and other distinctive facial characteristics to create a mathematical sequence, called a face template. This face template, much like the fingerprint biometric sequence, is the unique identifier of a person.
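To make the idea of a face template concrete, here is a minimal sketch in Python. It is a toy illustration only, not the local feature analysis used by real systems: the measurement names and values are made up, and the template is simply a short list of numbers compared by distance.

import math

# Toy sketch: a "face template" here is just a fixed-length list of
# hypothetical facial measurements, not any vendor's actual representation.
def face_template(chin_shape, eye_distance, nose_width, jaw_width):
    """Pack hypothetical facial measurements into a template vector."""
    return [chin_shape, eye_distance, nose_width, jaw_width]

def template_distance(a, b):
    """Euclidean distance between two templates; smaller means more similar."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Two captures of the same (hypothetical) face should yield nearby templates.
capture_1 = face_template(0.42, 0.61, 0.33, 0.55)
capture_2 = face_template(0.44, 0.60, 0.35, 0.54)
print(template_distance(capture_1, capture_2))  # small value -> likely a match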

Broadly speaking, we can do two things using facial recognition technology. The first is to identify an unknown person. This involves recording an image of the person's face, converting it to a face template, and running it against a database to see if one gets a hit. This is typically what law enforcement agencies use facial recognition technology for. The second use is to verify the identity of a known person, where the image needs to be authenticated against one known template; for instance, the facial recognition feature that unlocks phones.
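The distinction between the two uses can be sketched as follows. This is a simplified illustration under assumptions carried over from the sketch above: templates are numeric vectors, template_distance() is the helper defined earlier, and the threshold and database layout are hypothetical.

# Toy sketch of the two uses of facial recognition described above.
MATCH_THRESHOLD = 0.1  # hypothetical cut-off below which we declare a match

def verify(probe, enrolled_template):
    """Verification (1:1): is the probe the known person it claims to be?"""
    return template_distance(probe, enrolled_template) < MATCH_THRESHOLD

def identify(probe, database):
    """Identification (1:N): search a whole database for the closest hit."""
    best_id, best_dist = None, float("inf")
    for person_id, template in database.items():
        dist = template_distance(probe, template)
        if dist < best_dist:
            best_id, best_dist = person_id, dist
    return best_id if best_dist < MATCH_THRESHOLD else None

The law enforcement case is the second function: one captured face is searched against every record in a database, which is why the accuracy and composition of that database matter so much.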

If there is a significant disadvantage that a person faces by being wrongly identified, then the false positive rate (the probability of a wrong match) needs to be minimised as much as possible. Usually, there is a trade-off between the false positive rate and the false negative rate (the probability of failure to match the correct face): minimising one may increase the other. These inherent issues remain with regard to the accuracy of any facial recognition technology. Other factors such as lighting, background, perspective, pose and expressions also play a role and can compromise accuracy.
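The trade-off comes down to where the match threshold is set, as the toy calculation below shows. The distance scores are invented for illustration, not real benchmark data: a stricter (lower) threshold produces fewer false positives but more false negatives, and a looser threshold does the reverse.

# Toy illustration of the threshold trade-off between error rates.
genuine_distances = [0.05, 0.08, 0.12, 0.15]    # same person; lower is better
impostor_distances = [0.09, 0.18, 0.25, 0.40]   # different people

def error_rates(threshold):
    false_negatives = sum(d >= threshold for d in genuine_distances)
    false_positives = sum(d < threshold for d in impostor_distances)
    return (false_positives / len(impostor_distances),
            false_negatives / len(genuine_distances))

for t in (0.07, 0.10, 0.20):
    fpr, fnr = error_rates(t)
    print(f"threshold={t}: false positive rate={fpr:.2f}, false negative rate={fnr:.2f}")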

The biggest issue with a facial recognition system is that it is a covert, remote and mass authentication technology. This means that it works without providing notice of its existence and use, requires no direct interaction with the subject, and its intended deployments are usually not targeted at suspects but designed to surveil everyone. Therefore, significant privacy and free speech concerns exist with the deployment of this technology for law enforcement purposes.

If facial recognition is used to capture images of persons attending a protest or at places deemed suspicious, to authenticate them against a centralised database such as Aadhaar (this is currently not permitted under the Aadhaar Act unless authorised in the interest of national security), and if this information is used to populate suspects' databases, it could have chilling effects on free speech and expression.

Law enforcement agencies in India have had a chequered record of maintaining arrest and history-sheeter databases. Databases of criminal tribes, bad characters and suspects lead to greater surveillance of individuals who fall within these lists, often without sufficient legal basis. This also leads to adverse impacts on individuals in their interactions with the law in cases of arrest, bail and sentencing. Therefore, schemes to introduce technology that may lead to further databases need to be evaluated carefully, and must be accompanied by significant supervision of the collection, use, retention and impact of the image data.

Facial recognition technologies, due to their covert and remote nature, also pose other privacy and financial risks. Earlier this year, the UIDAI also announced the introduction of a facial authentication feature. The use of any biometric technology for authorisation carries the risk that images covertly captured by high-definition cameras could be used to bypass the user's consent and enable identity fraud. Given the limited state capacity in India, the use of technologies to aid law enforcement must be encouraged. However, the adverse impacts such technologies could have on civil liberties and economic rights need to be carefully considered. For high-risk technologies such as facial recognition, which have untested accuracy in geographies like India and pose newer threats, the State would be well advised to exercise restraint.


Photo: REUTERS. Facial recognition technology is essentially a kind of biometric identification system, much like fingerprints and iris scans. A face template, much like a fingerprint biometric sequence, serves as a unique identifier of a person's face
