Albuquerque Journal

Tech giants stumble in the social world they created

Platform operators are caught in a bind when trying to control content

BY BARBARA ORTUTAY

NEW YORK — Who knew connecting the world could get so complicated? Perhaps some of technology’s brightest minds should have seen that coming.

Social media bans of conspiracy theorist Alex Jones have thrust Facebook, YouTube, Twitter and others into a role they never wanted — as gatekeepers of discourse on their platforms, deciding what should and shouldn’t be allowed and often angering almost everyone in the process. Jones, a right-wing provocateur, suddenly found himself banned from most major social platforms this week, after years in which he was free to use them to promulgate a variety of false claims.

Twitter, which one of its executives once called the “free speech wing of the free speech party,” remains a lonely holdout on Jones. The resulting backlash suggests that no matter what the tech companies do, “there is no way they can please everyone,” as Scott Shackelford, a business law and ethics professor at Indiana University, observed.

Facebook’s Mark Zuckerberg, Twitter’s Jack Dorsey and crew, and Google’s stewards of YouTube gave little thought to such consequences as they built their empires with lofty goals to connect the world and democratize discourse. At the time, they were the rebels aiming to bypass the stodgy old gatekeepers — newspaper editors, television programmers and other establishment types — and let people talk directly to one another.

“If you go back a decade or so, the whole idea of speech on social media was seen in a highly positive light,” said Tim Cigelske, who teaches social media at Marquette University in Wisconsin. There was the Arab Spring. There were stories of gay, lesbian and transgender teens from small towns finding support online.

At the same time, of course, the companies were racing to build the largest audiences possible, slice and dice their user data and make big profits by turning that information into lucrative targeted advertisements.

The dark side of untrammeled discourse, the thinking went, would sort itself out as online communities moderated themselves, aided by fast-evolving computer algorithms and, eventually, artificial intelligence.

“They scaled, they built, they wanted to drive revenue as well as user base,” said technology analyst Tim Bajarin, president of consultancy Creative Strategies. “That was priority one and controlling content was priority two. It should have been the other way around.”

While the platforms may not have anticipated the influx of hate speech and meddling from foreign powers, Bajarin said, they should have acted more quickly once they found it. “The fact is we’re dealing with a brave new world that they’ve allowed to happen, and they need to take more control to keep it from spreading,” he said.

That’s easier said than done, of course. But it’s particularly difficult for huge tech companies to balance public goods such as free speech with the need to protect their users from harassment, abuse, fake news and manipulation. Especially given that their business models require them to alienate as few of their users as possible, lest they put the flood of advertising money at risk.

“Trying to piece together a framework for speech that works for everyone — and making sure we effectively enforce that framework — is challenging,” wrote Richard Allan, Facebook’s vice president of policy, in a blog post Thursday. “Every policy we have is grounded in three core principles: giving people a voice, keeping people safe, and treating people equitably. The frustrations we hear about our policies — outside and internally as well — come from the inevitable tension between these three principles.”

Such tensions force some of the largest corporations in the world to decide, for instance, if banning Nazis also means banning white nationalists — and to figure out how to tell them apart if not. Or whether kicking off Jones means they need to ban all purveyors of false conspiracy theories. Or whether racist comments should be allowed if they are posted, to make a point, by the people who received them.

“I don’t think the platforms in their heart of hearts would like to keep Alex Jones on,” said Nathaniel Persily, a professor at Stanford Law School. “But it’s difficult to come up with a principle to say why Alex Jones and not others would be removed.”

Google, Twitter, Facebook, Spotify and other social media operators face uncomfortable tradeoffs between allowing unfettered speech and enabling hate speech and harassment. (ASSOCIATED PRESS)
