Who’s Watching?

Facial Recognition in CCTV: the Implications of Surveillance

Le Délit - Sci+tech - Nabeela Jivraj

The air is a little frigid. You’re expecting an important call. You’re on your way home from class when you feel your phone vibrating in your pocket. With your parka, hat, and boots, you fumble to check your phone without pulling off your mittens. Fortunately, you only have to look at the screen to answer the call – your phone can be unlocked with your fingerprint as well as your face.

Facial recognition technologies offer a newer, more personal type of security, through which artificial intelligence and high-precision cameras enable instantaneous identification of users. These technologies are touted for their supposed promise of increased security: national security forces can respond faster to violent crime and conduct investigations more quickly. For individual users, facial recognition technology purportedly offers control over their own personal information, whether as a means to lock access to a phone, a bank account, or other personal affairs.

Though China is the first nation to fully implement surveillance with this technology, Australia, India, and the United Kingdom have joined in trialling the technology over the past year. These national security systems rely on national databases of civilian profiles to identify people. More recently, facial recognition CCTV (closed-circuit television) has been added to the suite of modern video analytics for surveillance. In addition to identifying objects and animals, and logging how fast things are moving, national security systems using facial recognition CCTV can instantly identify who is in the frame. This could mean a decreased reliance on witnesses or in-person investigations. The technology allows investigations to go entirely digital, and enables police to arrive on the scene to carry out arrests minutes after crimes are committed.
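At its core, the identification step is a lookup: the system converts each face in the frame into a numeric “embedding” vector and compares it against the vectors enrolled in a database. A minimal sketch of how such a match might work – the function names, two-dimensional vectors, and 0.8 threshold here are illustrative assumptions, not any deployed system’s actual design:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(probe, database, threshold=0.8):
    """Return the best-matching enrolled identity, or None.

    `database` maps an identity to its enrolled embedding; a probe
    that matches no enrolled face closely enough is reported unknown.
    """
    best_id, best_score = None, threshold
    for identity, enrolled in database.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id
```

The threshold is the policy lever: set it low and more innocent passers-by are flagged (false positives); set it high and more database-listed faces slip through (false negatives).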

Like any other Tuesday night, red and blue lights bounce off the snow at the intersection. You hang up the phone as you turn the corner, and are immediately stopped by an officer. “Are you so-and-so?” they ask. “We saw you on camera.” You shake your head, no. They ask you to show ID. You try to remember if you took your ID to school today.

In the late 18th century, philosopher Jeremy Bentham proposed “the panopticon,” an architectural prison design which offered complete control over those being observed via internalized coercion. Because people in the panopticon are always subject to being watched, they are constantly aware of being observed, and are, therefore, under control.

With little regulation or policy surrounding facial recognition technology, authoritarian surveillance is entirely possible, and already happening. The use of facial recognition technology for surveillance is criticized on many fronts – when it works well, it poses a risk to civilian freedom and privacy, and when it doesn’t, it makes innocent people vulnerable. Big Brother Watch, a non-profit civil liberties organization which campaigns against the rise of state surveillance, produced a report estimating that facial recognition technology has high rates of both false positives and false negatives. False positives occur when the technology identifies someone incorrectly, while false negatives are failures to correctly identify someone who is in a national facial recognition database. A central point of the report is that well-working, or perfected, facial recognition technology would essentially turn civilians into “walking ID cards.” Conversely, when surveillance technology was used to police concerts, festivals, and carnivals in both the UK and China, the suspects it flagged to police were misidentified over 90 per cent of the time.

The report also highlights how facial recognition technology is disproportionately inaccurate for minority groups: it frequently misidentifies women of minority ethnic groups in the United States. This is a major concern, as racial prejudice in police systems already disproportionately affects minorities. If these technologies risk widening that disparity, their “merits” should truly be called into question. The risk of racial prejudice in AI-based technologies is a recurring concern – a piece earlier this semester, titled “Is AI Racist?”, examined the fallibility of AI and its consistent issues with racial bias.
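The two error types can be made concrete with a short sketch. The “over 90 per cent” figure above is a rate of false matches among alerts, which is distinct from the rate of missed suspects; the function names and sample figures below are invented for illustration, not taken from the report:

```python
def false_match_rate(alerts):
    """Fraction of flagged matches that were wrong (false positives).

    `alerts` holds one boolean per flag the system raised:
    True if the flagged person really was the database suspect,
    False if the match was mistaken.
    """
    return alerts.count(False) / len(alerts)

def false_negative_rate(scans):
    """Fraction of database-listed people who went unflagged (misses).

    `scans` holds one boolean per database-listed person scanned:
    True if the system flagged them, False if it missed them.
    """
    return scans.count(False) / len(scans)

# Hypothetical deployment: 100 alerts raised, only 9 correct.
print(false_match_rate([False] * 91 + [True] * 9))  # 0.91
```

Note that a system can post both errors at once: flooding police with wrong matches while still letting listed individuals walk past unrecognized.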

As the price of these technologies continues to decrease, making them more accessible for other nations to follow suit, we have to ask whether we are adequately equipped for the repercussions of institutionalizing this technology, and of giving in to increasingly authoritarian surveillance. With instantaneous identification, advances in surveillance move us closer to a modern, and vividly real, iteration of the panopticon. We constantly have to ask – who is watching us? Should they be?

Nelly Wat | The McGill Daily