Los Angeles Times

Unapproved facial scans? That’s not OK

Consumers need to be vigilant about privacy as firms pursue advances in absence of clear laws

- DAVID LAZARUS David Lazarus’ column runs Tuesdays and Fridays. He also can be seen daily on KTLA-TV Channel 5 and on Twitter @davidlaz. Send your tips to david.lazarus@latimes.com.

The powers that be at UCLA thought it was a good idea at the time — using state-of-the-art technology to scan students’ faces to grant access to campus buildings. Students thought otherwise.

“The implementation of facial recognition technology would present a major breach of students’ privacy and make students feel unsafe on a campus they are supposed to call home,” the Daily Bruin said in an editorial last year.

UCLA dropped the facial recognition plan a few weeks later. “We have determined that the potential benefits are limited and are vastly outweighed by the concerns of our campus community,” officials declared.

I recalled that fracas after the Federal Trade Commission announced the other day that it had reached a settlement with a San Francisco company called Everalbum, which offered online storage of photos and videos.

The company, via its Ever app, scanned millions of facial images without customers’ knowledge and used the data to develop facial recognition software for corporate clients, the FTC said.

Everalbum also promised users it would delete their photos and videos from its cloud servers if they closed their account. However, the company “retained them indefinitely,” the agency said.

“Using facial recognition, companies can turn photos of your loved ones into sensitive biometric data,” said Andrew Smith, director of the FTC’s Bureau of Consumer Protection.

“Ensuring that companies keep their promises to customers about how they use and handle biometric data will continue to be a high priority for the FTC,” he said.

Be that as it may, there’s a lot of money to be made with such cutting-edge technology. Experts tell me consumers need to be vigilant about privacy violations as some of the biggest names in the tech world — including Google, Amazon, Facebook and Apple — pursue advances in the field.

“Since there aren’t federal laws on facial recognition, it seems pretty likely that there are other companies using this invasive technology without users’ knowledge or consent,” said Caitlin Seeley George, campaign director for the digital rights group Fight for the Future.

She called Everalbum’s alleged practices “yet another example of how corporations are abusing facial recognition, posing as much harm to people’s privacy as government and law enforcement use.”

Facial recognition technology took center stage after the Jan. 6 riot at the Capitol. Law enforcement agencies nationwide have been using facial recognition systems to identify participants from photos and videos posted by the rioters.

That’s creepy, to be sure, but it strikes me as a legitimate use of such technology. Every rioter in the building was breaking the law — and many were foolishly bragging about it on social media. These people deserve their comeuppance.

In the absence of clear rules, however, some of the big dogs in the tech world have adopted go-slow approaches to facial recognition, at least as far as law enforcement is concerned.

Microsoft said last year that it wouldn’t sell its facial recognition software to police departments until the federal government regulates such systems. Amazon announced a one-year moratorium on allowing police forces to use its facial recognition technology.

But law enforcement is just one part of the equation. There’s also the growing trend of businesses using facial recognition to identify consumers.

“Consumers need to know that while facial recognition technology seems benign, it is slowly normalizing surveillance and eroding our privacy,” said Shobita Parthasarathy, a professor of public policy at the University of Michigan.

Not least among the potential issues, researchers at MIT and the University of Toronto found that Amazon’s facial recognition tends to misidentify women with darker skin, illustrating a troubling racial and gender bias.

Then there’s the matter of whether people are being identified and sorted by businesses without their permission.

Facebook agreed to pay $550 million last year to settle a class-action lawsuit alleging the company violated an Illinois privacy law with its facial recognition activities.

The Everalbum case illustrates how facial recognition is spreading like poison ivy in the business world, with at least some companies quietly exploiting the technology for questionable purposes.

“Between September 2017 and August 2019, Everalbum combined millions of facial images that it extracted from Ever users’ photos with facial images that Everalbum obtained from publicly available datasets,” the FTC said in its complaint.

This vast store of images was then used by the company to develop sweeping facial recognition capabilities that could be sold to other companies, it said.

Everalbum shut down its Ever app last August and rebranded the company as Paravision AI. The company’s website says it continues to sell “a wide range of face recognition applications.”

Paravision “has no plans to run a consumer business moving forward,” a company spokesman told me, asking that his name be withheld even though he’s, you know, a spokesman.

He said Paravision’s current facial recognition technology “does not use any Ever users’ data.”

Emily Hand, a professor of computer science and engineering at the University of Nevada, Reno, said facial recognition data “is a highly sought-after resource” for many businesses. It’s one more way of knowing who you are and how you behave.

Hand said that “for every company that gets in trouble, there’s 10 or more that didn’t get caught.”

Seeley George at Fight for the Future said, “Congress needs to act now to ban facial recognition, and should absolutely stay away from industry-friendly regulations that could speed up adoption of the technology and make it even more pervasive.”

She’s not alone in that sentiment. Amnesty International similarly called this week for a global ban on facial recognition systems.

I doubt that will happen. With the biggest names in Silicon Valley heavily invested in this technology, it’s not going away. What’s needed are clear rules for how such data can be collected and used, especially by the private sector.

Any company employing facial recognition technology needs to prominently disclose its practices and give consumers the ability to easily opt out. Better still, companies should have to ask our permission before scanning and storing our faces.

“Today’s facial recognition technology is fundamentally flawed and reinforces harmful biases,” Rohit Chopra, then an FTC commissioner, said after the Everalbum settlement was announced.

“With the tsunami of data being collected on individuals, we need all hands on deck to keep these companies in check,” he said.

Chopra has since been appointed by President Biden to serve as director of the Consumer Financial Protection Bureau.

We can all recognize that as a positive step.

Photo: UCLA DROPPED its campus facial recognition plan last year after a backlash from students. But many other entities don’t publicize their biometric plans. (Genaro Molina / Los Angeles Times)
