National Post

FACE VALUE

FACIAL-RECOGNITION COMPANIES ARE TARGETING SCHOOLS, PROMISING AN END TO SHOOTINGS. BUT IS IT ETHICAL?

- Drew Harwell

The facial-recognition cameras installed near the bounce houses at the Warehouse, an after-school recreation centre in Bloomington, Ind., are aimed low enough to scan the face of every parent, teenager and toddler who walks in.

The centre’s director, David Weil, learned earlier this year of the surveillance system from a church newsletter, and within six weeks he had bought his own, believing it promised a security breakthrough that was both affordable and cutting-edge.

Since last month, the system has logged thousands of visitors’ faces — alongside their names, phone numbers and other personal details — and checked them against a regularly updated blacklist of sex offenders and unwanted guests. The system’s Israeli developer, Face-Six, also promotes it for use in prisons and drones.

“Some parents still think it’s kind of 1984,” said Weil, whose 21-month-old granddaughter is among the scanned. “A lot of people are afraid we’re getting too much information ... But the biggest thing for us is that we protect our kids.”

An expanding web of largely unknown security contractors is marketing face recognition directly to school and community-centre leaders, pitching the technology as an all-seeing shield against school shootings like those at Parkland, Fla., and Santa Fe, Texas.

Although facial recognition remains unproven as a deterrent to school shootings, the spectre of classroom violence and companies’ intensifying marketing to local education officials could cement the more than 130,000 public and private schools nationwide as one of America’s premier testing grounds — both for the technology’s abilities and for public acceptance of a new generation of mass surveillance.

The surveillance firms say little about how they designed, tested or safeguarded their facial-recognition systems because, they argue, it is proprietary information. They also play down privacy concerns, despite worries from parents over the lack of oversight into who controls the children’s facial images and how they can be used in the long term.

“We’ve gotten no answers to all these questions: Under what conditions can a kid’s face be put into the system? Does the district need parental consent? Who can do a facial-recognition search?” asked Jim Shultz, whose 15-year-old daughter goes to a high school in upstate New York that is paying millions to install a surveillance network offering facial recognition. “It’s as if somebody presented them with a cool new car and they didn’t bother to look under the hood.”

It’s unclear how the systems could have thwarted past attacks, many of which involved shooters who were students allowed on campus. But companies have nevertheless built sales pitches around the promise that campus administrators could block or track undesirable guests — wanted fugitives, problematic parents and expelled students, such as the Parkland suspect — before their violence could begin.

“We were all waiting for something like the Parkland school shooting, for better or for worse,” said Jacob Sniff, the chief executive of Suspect Technologies, a facial-recognition startup working with a few unidentified universities. “It’s quite clear that a facial-recognition system could have ... prevented it.”

Parents and privacy experts worry, however, that schools are rushing to adopt untested and invasive artificial-intelligence systems with no proof of success.

SOME PARENTS STILL THINK IT’S KIND OF 1984. A LOT OF PEOPLE ARE AFRAID WE’RE GETTING TOO MUCH INFORMATION ... BUT THE BIGGEST THING FOR US IS THAT WE PROTECT OUR KIDS. — DAVID WEIL

Andrew Ferguson, a law professor at the University of the District of Columbia, said surveillance companies are preying on the dread of community leaders by selling experimental “security theatre” systems that offer only the appearance of safer schools.

“These companies are taking advantage of the genuine fear and almost impotence of parents who want to protect their kids,” he said, “and they’re selling them surveillance technology at a cost that will do very little to protect them.”

No federal law restricts the use of facial-recognition technology, and only Illinois and Texas have passed laws requiring companies to get people’s consent before collecting what the industry calls “faceprints.” That allows local police forces, cities, employers and school boards to largely set their own policies.

Yet the most advanced facial-recognition systems on the market provide imperfect matches that have been shown to be less accurate for women and people of colour, raising concerns that students could be wrongly blocked from campus or misidentified as violent criminals — even from an early age.

The FBI last year said its facial-recognition system, which surveys a far larger database than private companies can offer, has an 85 per cent chance of correctly identifying a person from within a group of 50 choices. A Massachusetts Institute of Technology study this year of three leading private systems found IBM’s software correctly identified the gender of darker-skinned women 65 per cent of the time.

Children present a unique technical challenge, because young faces change quickly and lack the kinds of distinctive features most people develop as they grow up, said Peter Trepp, the chief executive of facial-recognition startup FaceFirst. But that hurdle has done little to tamp down enthusiasm. Trepp said dozens of school districts have expressed interest in his company’s software, which he says can pick a face out of a database of 25 million in less than one second.

Public officials have for years relied on phalanxes of cameras to help keep watch over school grounds. But rapid advances and decreasing prices for facial-recognition technology, fuelled by an arms race of surveillance firms eager to dominate the market, have made the systems faster, cheaper and more available than ever.

For schools with high-resolution digital cameras, the companies say, activating face recognition can be as easy as installing new software. One startup pursuing schools sells a facial-recognition camera for less than $1,000.

Trevor Matz, the chief executive of AI (artificial intelligence) video system BriefCam, said there’s been “a seismic shift” in interest in cutting-edge surveillance technology, including from schools. His company, which was recently bought by camera giant Canon, makes software that can recognize faces and filter video with search terms like “girl in pink” or “man with moustache,” shrinking hours of footage into seconds.

“Everybody we demo the product to immediately goes, ‘Wow’ and says, ‘I want it.’ There’s not a lot of selling that needs to be done.” The city of Springfield, Mass., for example, is beefing up its school security with an additional 1,000 cameras at its roughly 60 public schools in the coming months, all of which will work with BriefCam.

Of privacy issues, Matz added: “I don’t hear them raised. Safety and security trumps those concerns.”

Some school officials say the AI-powered cameras have expanded their crime-fighting abilities. At the 30,000-student University of Calgary in Alberta, security staff members said they used BriefCam to find an arsonist who had set a fire in a campus bathroom with only a few pieces of information, including the direction he was walking and the fact that he was wearing a blue jacket.

Some companies also suggest the possibility of an imminent campus massacre should serve as a call for urgency. In its presentation to companies working with schools, the Israel-based AI firm AnyVision includes pictures and body counts from the Parkland and Sandy Hook shootings, and says its mission is “making sure your children get home safe.”

AnyVision, which says its “tactical surveillance system” can recognize faces and detect guns, offers schools what it calls “anomaly detection” — compiling a historical record of a student’s face, body shape and appearance, and pledging to alert security if the student shows up wearing something unusual.

AnyVision chief executive Eylon Etshtein says it’s obvious why schools would want the technology: If a kid arrives one day “wearing all black and carrying a big bag, you’re probably going to want to know what the kid is doing and what’s inside the bag.”

But some experts were skeptical of how well it would work. “Teenagers are anomalies,” said Ferguson, the professor. “Is it suddenly going to be suspicious that a teenager dyed their hair or looks depressed?”

Not all facial-recognition companies are interested in the school surveillance business, including Kairos, whose clients include Capital One, Ikea and PepsiCo. Chief executive Brian Brackeen said the technology’s imperfections and racial bias would be “hugely problematic” in trying to stop school shooters and could put innocent kids at risk. “For us, it’s too dangerous,” he said.

Companies able to land a school contract can often reap millions of dollars in public funds. The Lockport City School District in upstate New York recently secured about US$2.7 million — or about US$597 for each of the district’s 4,600 students — in funding for facial-recognition cameras and other video-surveillance upgrades through the state’s “Smart Schools” bond program, district records show.

Shortly after the shooting at Sandy Hook Elementary School in Connecticut, a New York security consultant named Tony Olivo called Lockport’s superintendent to offer a free surveillance audit, which found that the small district’s cameras were subpar.

On Olivo’s recommendation, the district bought a facial-recognition system made by SN Technologies, an Ontario firm that lists Olivo’s company as a corporate partner and whose top executive said he was “instrumental in the development of our products.” Neither Olivo nor the company would answer whether he had been paid after his Lockport recommendation.

SN Technologies president Cameron Uhren, a former gambling industry consultant who previously worked on casino surveillance, said the company’s AI systems were trained to recognize faces and identify the top 10 guns used in school shootings, including shotguns and AR-15-style rifles.

School district leaders declined to say whether they had sought proposals from other companies before sealing the deal, or how many times they had seen the system in action. Michelle Bradley, the superintendent, said the spending is part of a broad-scale security plan. With schools as top “targets,” she said, “people are on heightened alert all the time.”

It’s unclear how effective the Lockport system would be. The suspects in most of the 221 school shootings since 1999 were enrolled students, a Washington Post database found. Former or expelled students were suspects in about 5 per cent of the attacks.

Mark Cuban, the billionaire investor and owner of the Dallas Mavericks, who is backing Suspect Technologies, believes early systems are improving quickly and deserve a place in campus protection. He’s also testing the technology himself: The Mavericks’ locker room features a facial-recognition system that displays calendars and trainer messages for players when they walk inside.

“The concept of school safety will change dramatically,” he said. “While there will be uncomfortable moments with (face recognition) ... based on what we know now, it’s a necessary step.”

Some of the earliest adopters are already looking to expand, including St. Mary’s High School in St. Louis, which in 2014 became one of the first schools in the country to deploy facial-recognition technology on campus.

Cameras in the doorway of the Catholic boys’ school scan each visitor’s face against a blacklist, including parents in custody battles and expelled students. The school now has about 40 cameras, or about one for every eight students, and intends to upgrade its systems this summer to cameras that see better in sunlight and darkness.

“When Parkland happened, I was watching it on the TV going, ‘Boy, I’m glad we have what we have,’” Mike England, the school’s president, said. “Some people were saying we would be violating privacy laws, and my answer to all of them is: That’s really not my biggest concern right now ... I’m going to do whatever I need to do to keep my kids safe.”

THESE COMPANIES ARE TAKING ADVANTAGE OF THE GENUINE FEAR AND ALMOST IMPOTENCE OF PARENTS WHO WANT TO PROTECT THEIR KIDS. AND THEY’RE SELLING THEM SURVEILLANCE TECHNOLOGY AT A COST THAT WILL DO VERY LITTLE TO PROTECT THEM. — ANDREW FERGUSON

PHOTOS: LUKE SHARRETT / FOR THE WASHINGTON POST — Companies that produce facial-recognition programs have targeted schools, saying their technology could make the grounds safer and even prevent shootings. But some parents worry that using untested, unproven devices with surveillance networks could actually bring more harm than good to their children.
Parents and privacy experts worry that there’s no proof of success when it comes to experimental facial-recognition systems in schools.
PHOTOS: LUKE SHARRETT / FOR THE WASHINGTON POST — Some experts are skeptical about the effectiveness of using facial-recognition technology to stop violence.
The Warehouse, a U.S. recreation centre, uses scanning technology to log the faces of every person who enters.
