Orlando Sentinel

The real-world consequences of online hate


Antisemitism, the world’s oldest bigotry, festers around the globe and at home, where it’s at its worst since incident records have been kept.

In recent days, a brawl attributed to insults broke out between soccer teams of Jewish and Catholic schools in Miami, a bicyclist was attacked and beaten in Broward County by someone who heard him speaking Hebrew on a cellphone, and two Jewish worshippers were shot outside synagogues in Los Angeles. U.S. Rep. Jared Moskowitz, D-Parkland, reported receiving more than 200 hate-filled responses to a Twitter post.

The Anti-Defamation League reported a record 2,717 incidents in 2021, up 34% from the year before. A survey last year found more than three-fourths of Americans believe at least one common antisemitic trope and 20% hold to six or more of them.

It can’t happen here? It is happening here.

The role of social media

Social media behemoths inflame bigotry, but efforts to hold them responsible clash with a federal law that shields them as if they were phone carriers. It’s Section 230 of the Communications Decency Act.

Two cases challenging that were heard by the U.S. Supreme Court this week. The parents of Nohemi Gonzalez, an American college student slain during terrorist attacks in Paris in November 2015, are appealing for the right to sue Google over ISIS posts on YouTube that they say led to the killings. ISIS claimed responsibility for the attacks, which killed 130 people. The family of Nawras Alassaf, a Jordanian citizen, is seeking damages from Twitter for his death in a 2017 ISIS attack on a nightclub in Istanbul. However, they don’t allege that the terrorists in that attack ever used Twitter. Both families sued under the 1996 Antiterrorism and Effective Death Penalty Act.

These cases won’t be easy calls because of the First Amendment. Whatever the court decides may force Congress to act. Either way, the issue is too grave to ignore.

Twitter, Facebook and similar platforms are commercial ventures that depend on advertising. They tailor their feeds to users’ consumer tastes, based on what their algorithms say. The algorithms, the Gonzalezes argue, make social media the publisher rather than merely the distributor of the hate speech that the algorithms select, especially when ad revenue is shared with authors of the posts.

An antisemite fired up by social media murdered 11 Jews at a Pittsburgh synagogue in 2018, one of six fatal attacks since 2016. Last January, another terrorist took hostages at a synagogue in Texas. He recited the antisemitic trope that “Jews control the world” and demanded the release of a woman imprisoned for aiding terrorists in Afghanistan.

Social media has figured in virtually every racially motivated major incident in recent years, whether the victims were Black, Hispanic or Jewish. Former President Trump’s perceived coziness with some antisemites is part of the problem.

Antisemitism on the rise

Since October, the ADL has documented some 30 incidents quoting or referring to Ye, the antisemitic entertainer and designer formerly known as Kanye West, whom Trump invited to Mar-a-Lago. Those include “vandalism, banner drops, targeted harassment, and campus propaganda distributions,” including at Florida Atlantic University. Ye has been banned again from Twitter, and should stay banned.

Media like Twitter and Facebook have been unable or unwilling to adequately monitor what is posted on their own pages. They can’t reasonably see everything before it goes out, but they should be responsible for taking down what is intended to incite hateful acts.

The lethal influence of antisocial media has been felt in tragic incidents around the world, from New York to New Zealand.

“I believed what I read online and acted out of hate,” the 19-year-old who killed 10 Black people at a Buffalo supermarket said at his sentencing hearing last week.

New York Attorney General Letitia James made the same point in a detailed report on the shooting.

“Several online platforms played an undeniable role in this racist attack,” she said, “first by radicalizing the shooter as he consumed voluminous amounts of racist and violent content, helping him prepare for the attack, and finally allowing him to broadcast it.”

Social media, as private companies, are under misguided pressure from some right-wing politicians to allow even more latitude than they do.

The James report suggested a balance between free speech and responsibility.

Since buying Twitter, Elon Musk has decimated the platform’s content monitors. One of those let go was an algorithmic expert named Rumman Chowdhury. Writing in The Atlantic, she said she was hired to “help protect users, particularly people who already face broader discrimination, from algorithmic harm.

“But months into Musk’s takeover,” she wrote, “it seems no one is keeping watch.”

The Orlando Sentinel Editorial Board includes Editor-in-Chief Julie Anderson, Opinion Page Editor Krys Fluker and Viewpoints Editor Jay Reddick. The Sun Sentinel Editorial Board consists of Editorial Page Editor Steve Bousquet, Deputy Editorial Page Editor Dan Sweeney, and Anderson. Send letters to insight@orlandosentinel.com.

There is a lot of antisemitic hate speech on social media — and algorithms are partly to blame.
