The Southland Times

Peddling extreme HATE

Facebook decided which users are interested in Nazis and let advertisers target them directly, writes Sam Dean.


Facebook makes money by charging advertisers to reach just the right audience for their message – even when that audience is made up of people interested in the perpetrators of the Holocaust or explicitly neo-Nazi music.

Despite promises of greater oversight following past advertising scandals, a Los Angeles Times review shows that Facebook has continued to allow advertisers to target hundreds of thousands of users the social media firm believes are curious about topics such as ‘‘Joseph Goebbels’’, ‘‘Josef Mengele’’, ‘‘Heinrich Himmler’’, the neo-Nazi punk band Skrewdriver and Benito Mussolini’s long-defunct National Fascist Party.

Experts say that this practice runs counter to the company’s stated principles and can help fuel radicalisation online.

‘‘What you’re describing, where a clear hateful idea or narrative can be amplified to reach more people, is exactly what they said they don’t want to do and what they need to be held accountable for,’’ said Oren Segal, director of the Anti-Defamation League’s Center on Extremism.

After being contacted by The Times, Facebook said that it would remove many of the audience groupings from its ad platform.

‘‘Most of these targeting options are against our policies and should have been caught and removed sooner,’’ said Facebook spokesman Joe Osborne. ‘‘While we have an ongoing review of our targeting options, we clearly need to do more, so we’re taking a broader look at our policies and detection methods.’’

Approved by Facebook

Facebook’s broad reach and sophisticated advertising tools brought in a record US$55 billion in ad revenue in 2018. Profit margins stayed above 40 per cent, thanks to a high degree of automation, with algorithms sorting users into marketable subsets based on their behaviour – then choosing which ads to show them.

But the lack of human oversight has also brought the company controversy. In 2017, ProPublica found that the company sold ads based on any user-generated phrase, including ‘‘Jew hater’’ and ‘‘Hitler did nothing wrong’’.

Following the murder of 11 congregants at a synagogue in Pittsburgh in 2018, The Intercept found that Facebook gave advertisers the ability to target users interested in the anti-Semitic ‘‘white genocide conspiracy theory’’, which the suspected killer cited as inspiration before the attack.

This month, the Guardian highlighted the ways that YouTube and Facebook boost anti-vaccine conspiracy theories, leading US Representative Adam Schiff to question whether the company was promoting misinformation.

Facebook has promised since 2017 that humans review every ad targeting category. Last autumn, it announced the removal of 5000 audience categories that risked enabling abuse or discrimination.

The Times decided to test the effectiveness of the company’s efforts by seeing if Facebook would allow the sale of ads directed to certain segments of users.

Facebook allowed The Times to target ads to users Facebook has determined are interested in Goebbels, the Third Reich’s chief propagandist; Himmler, the architect of the Holocaust and leader of the SS; and Mengele, the infamous concentration camp doctor who performed human experiments on prisoners.

Each category included hundreds of thousands of users.

The company also approved an ad targeted to fans of Skrewdriver, a notorious white supremacist punk band – and automatically suggested a series of topics related to European far-right movements to bolster the ad’s reach.

Collectively, the ads were seen by 4153 users in 24 hours, with The Times paying only US$25 to fuel the push.

Facebook admits its human moderators should have removed the Nazi-affiliated demographic categories.

But it says the ‘‘ads’’ themselves – which consisted of the word ‘‘test’’ or The Times’ logo and linked back to the newspaper’s homepage – would not have raised red flags for the separate team that looks over ad content.

Upon review, the company said the ad categories were seldom used.

The few ads purchased linked to historical content, Facebook said, but the company would not provide more detail on their origin.

Why is it my job to police their platform?

The Times was tipped off by a Los Angeles musician who asked to remain anonymous for fear of retaliation from hate groups.

Earlier this year, he tried to promote a concert featuring his hardcore punk group and a black metal band on Facebook.

When he typed ‘‘black metal’’ into Facebook’s ad portal, he said he was disturbed to discover that the company suggested he also pay to target users interested in ‘‘National Socialist black metal’’ – a potential audience numbering in the hundreds of thousands.

The punk and metal music scenes, and black metal in particular, have long grappled with white supremacist undercurrents.

Black metal grew out of the early Norwegian metal scene, which saw prominent members convicted of burning down churches, murdering fellow musicians and plotting bombings.

Some bands and their fans have since combined anti-Semitism, neo-paganism and the promotion of violence into the distinct subgenre of National Socialist black metal, which the Southern Poverty Law Center described as a dangerous white supremacist recruiting tool nearly 20 years ago.

But punk and metal fans have long pushed back against hate.

In 1981, the Dead Kennedys released Nazi Punks F... Off; last month 15 metal bands played at an anti-fascist festival in Brooklyn, New York.

The musician saw himself as a part of that same tradition. ‘‘I grew up in a punk scene in Miami where there were Nazis, they would kind of invade the concerts as a place where they knew they could get away with violence,’’ he said.

So he saw it as his duty, he said, to contact Facebook and express his disgust.

Facebook subsequently removed the grouping from the platform, but the musician remains incredulous that ‘‘National Socialist black metal’’ was a category in the first place – let alone one the company specifically prompted him to pursue.

‘‘Why is it my job to police their platform?’’ he said.

A rabbit hole of hate

After reviewing screenshots verifying the musician’s story, The Times investigated whether Facebook would allow advertisers to target explicitly neo-Nazi bands or other terms associated with hate groups.

We started with Skrewdrive­r, a British band with a song called White Power and an album named after a Hitler Youth motto.

Since the band only had 2120 users identified as fans, Facebook informed us that we would need to add more target demographics to publish the ad.

The prompt led us down a rabbit hole of terms it thought were related to white supremacis­t ideology.

First, it recommended ‘‘Thor Steinar’’, a clothing brand that has been banned in the German parliament because of its association with neo-Nazism.

Then, it recommended ‘‘NPD Group’’, the name of both a prominent American market research firm and a far-right German political party associated with neo-Nazism.

Among the next recommended terms were ‘‘Flüchtlinge’’, the German word for ‘‘refugees’’, and ‘‘Nationalism’’.

Facebook said the categories ‘‘Flüchtlinge’’, ‘‘Nationalism’’ and ‘‘NPD Group’’ are in line with its policies and will not be removed, despite appearing as autosuggestions following neo-Nazi terms.

(Facebook said it had found that the users interested in NPD Group were actually interested in the American market research firm.)

In the wake of past controversies, Facebook has blocked ads aimed at those interested in the most obvious terms affiliated with hate groups.

‘‘Nazi’’, ‘‘Hitler’’, ‘‘white supremacy’’ and ‘‘Holocaust’’ all yield nothing in the ad platform. But advertisers could target more than a million users with interest in Goebbels or the National Fascist Party, which dissolved in 1943.

Himmler’s category had nearly 95,000 interested users. Mengele had 117,150 – a number that increased over the duration of our reporting, to 127,010.

Facebook said these categories were automatically generated based on user activity – liking or commenting on ads, or joining certain groups. But it would not provide specific details about how it determined a user’s interest in topics linked to Nazis.

Expanding the orbit

The ads ended up being served within Instant Articles – which are hosted within Facebook, rather than linking out to a publisher’s own website – published by the Facebook pages of a wide swath of media outlets.

These included articles by the Daily Wire, CNN, HuffPost, Mother Jones, Breitbart, the BBC and ABC News. They also included articles by viral pages with names like Pupper Doggo, I Love Movies and Right Health Today – a seemingly defunct media company whose only Facebook post was a link to a now-deleted article: What Is The Benefits Of Eating Apple Everyday.

Segal, the ADL director, said Facebook might wind up fuelling the recruitment of new extremists by serving up such ads on the types of pages an ordinary news reader might visit.

‘‘Being able to reach so many people with extremist content, existing literally in the same space as legitimate news or non-hateful content, is the biggest danger,’’ he said. ‘‘What you’re doing is expanding the orbit.’’

Some critics contend that the potential for exploitation is built into the fundamental workings of ad platforms like Facebook’s, regardless of whether the target demographics are explicitly extremist.

‘‘Finely targeted digital advertising allows anonymous advertisers with who knows what political agenda to test messages that try to tap into some vulnerability and channel a grievance in some particular direction,’’ said Anthony Nadler, a professor at Ursinus College in Pennsylvania who researches how social networks and ad platforms can assist radicalisation and spread disinformation.

‘‘I imagine that the more sophisticated white supremacists out there are trying to figure out how to expand their base.’’

– Los Angeles Times

Photo: AP – Facebook admits its human moderators should have removed the Nazi-affiliated demographic categories.
