New York Daily News

Our tech dystopia, continued

- HARRY SIEGEL harrysiegel@gmail.com

Oops! Facebook accidentally sent the personal details of one thousand of the human workers who vet its content directly to the people they’d barred from the social network. At least six of those outed screeners were exposed to suspected terrorists. One of the six, an Iraqi-born Irish citizen, received “friend requests” from accounts associated with an Egypt-based terror group. Fearing for his life, the screener quit his low-paying “community operations analyst” job watching beheading videos and the like for the global outsourcing company onto which Facebook has foisted that work, and fled his new homeland to go into hiding (“exile,” he called it) somewhere in Eastern Europe. Five months later, he returned to Ireland not because he felt safe but because he’d run out of money.

Facebook assured The Guardian, which broke the news about this mess Friday, that it’s made technical changes to “better detect and prevent these types of issues from occurring.”

That was the day after the “director of global policy management” and “counterterrorism policy manager” for the vast social network that increasingly functions like a borderless virtual government revealed its plans to use “artificial intelligence to keep terrorist content off of Facebook.”

Their big post is a triumph of corporatespeak — “Our stance is simple: There’s no place on Facebook for terrorism” — that fails to define what “terrorism” is and strives to pass the buck by stressing that radicalization “primarily occurs offline” and offloading “counterspeech training,” whatever that means, to partner NGOs and community groups “to empower the voices that matter most.” Why those voices matter most, and how the network will empower them, they don’t say.

The post does note that Facebook is “currently focusing our most cutting edge techniques to combat terrorist content about ISIS, Al Qaeda and their affiliates, and we expect to expand to other terrorist organizations in due course,” so at some point they’ll presumably have to go publicly beyond Justice Potter Stewart’s famous definition of pornography: “I know it when I see it.”

The company is adding more human moderators — presumably subcontractors — to see it, since “(t)o understand more nuanced cases, we need human expertise.” For now, at least.

“Ideally, one day our technology will address everything,” the global policy management director said Thursday, elaborating on her post.

“It’s in development right now,” with the new people checking horrific content against Facebook’s secret definitions until the bots get up to speed.

As Silicon Valley lives up to its unofficial motto of “move fast and break things,” those things, it turns out, are often us.

No, Facebook isn’t to blame for the human condition. But the tech giants reaping vast fortunes from our private information have hardly lived up to their promises of producing a better tomorrow in the process.

As impressive a job as they’ve done of selling disruption as progress, it turns out that across the globe voters still believe their lying eyes and young people have lost faith in democracy as a system of government.

Something to chew on while sitting down over the Whole Foods dinner an Amazon drone delivered, to watch “Black Mirror” on Netflix before coming across any new live-streamed “Faces of Death” the algorithm slipped into your newsfeed.

There’s no sign that the tech giants will hit the pause button on profits and actually solve the life-or-death issues their products have created before offering new products that create new life-or-death issues.

The last one-two punch of a very bad thing happening around Facebook and the company vowing to do more to stop that very bad thing was just one month (or roughly a million news cycles) ago, with the “Easter day slaughter.”

That was the title a madman gave to the video he posted of himself murdering a stranger in the middle of the afternoon, which remained up on the site for hours. Two days later, CEO Mark Zuckerberg, speaking at Facebook’s big annual conference, offered his condolences to the victim, 74-year-old Robert Godwin Sr., and declared that “we will keep doing all we can to prevent tragedies like this from happening.”

Then he pivoted to why he was there — the company’s big push into “augmented reality,” with cameras and glasses to overlay it on old-fashioned reality. Maybe Facebook’s artificial intelligence will be ready by the time that’s here to write blog posts explaining the company’s very sincere efforts to avert the dystopian murders and rapes and other horrors it allows for.

