Brand extremism

The argument that YouTube is not a content creator, simply a platform, does not absolve it of responsibility for its output. The ethical problems this ad-funded ecosystem poses to advertisers and society as a whole are increasingly difficult to ignore.

Campaign UK

The ethical issues surrounding ad-funded platforms such as YouTube cannot be ignored

In the annals of idiotic internet “bros”, Logan Paul (pictured, left) is destined to be little more than a footnote; a virtual speck on the largest ethical iceberg the ad industry has ever faced. The 22-year-old vlogger, who has 16 million subscribers, was on the receiving end of a global backlash after he uploaded a video detailing his trek into the Aokigahara “suicide woods” in Japan, where he found a man who had apparently recently hanged himself. He showed the body and filmed his own reaction. In the 24 hours before Paul removed the content, it received six million views and was featured on YouTube’s list of trending videos. Paul was among the “gold star” influencers and part of the Google Preferred network, an advertising programme that connects channels with brand advertisers directly. The only reason advertising didn’t appear next to the above-mentioned video was that Paul had “demonetised” it himself. In the context of the marketing industry’s conversations surrounding “brand safety”, Paul would have been pre-approved and whitelisted – the very definition of safe, and the ultimate poster boy for YouTube’s leaky ad platform.

The age of extremism

From The Times headline on “brands funding terrorism” to fears surrounding fake news, the sorry story of Paul is far from an isolated incident. One need look no further than a recent study by ex-Google engineer Guillaume Chaslot. He wrote a computer program to bring transparency to YouTube’s recommendation system, which he said showed a trend toward extreme content. This was disputed by YouTube, which said the videos evaluated did not “paint an accurate picture of what videos were recommended on YouTube… in the run-up to the US presidential election”. In the battle for attention, algorithms and editors alike have hit upon outrage as key to attracting eyeballs and therefore ad revenues. In this environment, advertisers and media agencies face a difficult question: as they seek the highest number of consumers at the lowest possible cost, have brands inadvertently been underpinning extreme content?

Ronan Harris, managing director of Google UK and Ireland, rejects the suggestion that brands are facing an “age of extremism”. He argues that tech companies are co-operating to prevent extremist content. He points to Google’s work with the Global Internet Forum to Counter Terrorism – a shared industry database that allows companies to create digital fingerprints for terrorist content – as an example of such endeavours. Harris also emphasises the role of open platforms such as YouTube as a force for creativity and learning. “When we look at the major uses of technology from individuals, we see them using it to share positive and inspiring messages of all kinds,” he says. “Barack Obama’s ‘It gets better’ message aimed at LGBT+ individuals was viewed over one million times on YouTube, and his tweet on diversity was liked over four million times.
“What this means for brands and advertisers is that there is an opportunity to engage with their customers around the positive messages and values that they represent and believe in.”

A question of context

However, several brands and marketers are reappraising their approach to online ads. Procter & Gamble marketing chief Marc Pritchard’s much-lauded drive to clean up the digital ecosystem has been galvanising for the industry, while Keith Weed, his counterpart at Unilever, has warned that a crisis of consumer confidence poses a threat to the fabric of the ad-funded web. At a grass-roots level, various brands and SMEs are deciding to press the pause button on aspects of their social-media marketing activity. Jamie Inman, head of planning at BMB, says that some of the agency’s clients do not advertise on YouTube because of brand-safety fears and the reputational risk. On the flipside, other clients, such as Rowse Honey, have put YouTube at the centre of their marketing. “The lens of brand reputation is important, but it is just a subplot to the larger existential crisis as to what these platforms are doing to society as a whole,” he says. The interesting shift for brands is the sheer scale and influence of YouTube and the seeming uncontrollability of its algorithm. As Inman puts it, succinctly: “It’s a fucking mess.” He adds: “Seventy per cent of YouTube views come from its recommendation algorithm. It is hard to grasp from the outside the scale of the problem and its ultimate impact on society.”

‘There’s a financial incentive to bend the truth – fake news, for example – to generate more interest’ Dale Lovell, Adyoulike

The beginning of the solution may start with getting to grips with this scale, but Inman argues that, as a society, we are bad at dealing with complex problems, whether that be climate change or the YouTube algorithm. “Today, we don’t have the answers in the face of large scale and unintended consequences,” he says. These “unintended consequences” form the bedrock of digital culture. Ash Bendelow, managing director at Brave, points out that these platforms were built on democratised access, a lack of censorship and overall lawlessness – the beauty of the internet. In the main, they take the easy way out with auto-moderation, relying fairly heavily on the community and its users to “flag” content that violates “terms of use”. Bendelow explains: “The problem is the average brand advertiser, and probably the average media buyer, doesn’t truly know how it all works. It’s an enigmatic, constantly changing black box – part science and data, part ‘dark arts’ – where the eyeballs and cost per transaction are impressive, but people just don’t dare ask too many questions about how it’s achieved.”

Extreme algorithms

The argument that a platform like Facebook is essentially a civic space, with its own rules and regulations, is running out of steam. The “exhaust fumes” of social platforms, pervading almost every aspect of public life, are coming under the microscope. Despite this mounting attention, Dan Hagen, chief strategy officer at Carat UK, argues that accurately policing platforms of this scale is not currently viable. “Brands should probably understand that they are taking a risk that could put them in a bad light,” he says. Yet he does not believe algorithms are tacitly encouraging more extreme content. “There are always people who want to push an agenda, or desire certain content; they always have and always will find a way to get what they want or publish their agenda. Algorithms make it faster to publish, but also to find and intervene,” he explains. “They accelerate the arms race, but I don’t think they create it – people do that.” However, Dale Lovell, co-founder and chief digital officer of native advertising platform Adyoulike, says that the financial motivation for content creators is heavily skewed toward generating a “hit”. If something is shared or viewed a lot, algorithms traditionally start to “surface” this content more, creating more shares and views. It is a system that Lovell believes encourages content creators to push the boundaries: “There’s a financial incentive to bend the truth – fake news, for example – to generate more interest, or for video creators to go to more extremes.” Whether the subject is extremist content or cute cat videos, for critics of YouTube this race is one to the bottom. It reflects an environment in which the power of context has either been dramatically downplayed or completely forgotten.
Heather Andrew, chief executive of market research agency Neuroinsight, says that the rising concern over brand safety in digital media is symptomatic of a long-term tendency among marketers to underrate context versus content and audience size. She points to research measuring subconscious brain response, which consistently demonstrates that context and content are intrinsically linked. “The brain and memory, in particular, work by association,” she says. “When we store information in our memory, we file it away alongside emotional and contextual associations that accompany it. Brand advertising tends to exploit the link between memory and emotion – it’s the basis for creating emotional brand advertising, rather than just communicating an information message. Yet the corresponding role of context in media placement is often ignored.” According to Andrew, cases such as that of Logan Paul demonstrate a strong argument for quality over quantity in digital advertising and, specifically, illustrate the vital importance of context.

The influencer paradox

When it comes to context, many brands have lost their way. Like “content” before it, the term “influencer marketing” has quickly lost its meaning. All too many brands are guilty of having an influencer strategy that pays little attention to who the influencers are or who they are actually influencing. Emilie Tabor, founding partner and chief marketer at influencer agency IMA, says: “Brands have learned that in this age of extremism they need to be extra careful with their ad placement and selection of influencers.” Paul isn’t the first influencer to go down in a blaze of infamy, she adds. Before him there was PewDiePie, who took a hit after posting videos with anti-Semitic messages, while influencer marketing as a whole suffered from association with last year’s Fyre Festival, the doomed luxury music event endorsed by celebrities. In light of this, Tabor advises brands to undertake “intensive research to ensure that their values are in line with that of a certain influencer, draw up clear contracts, monitor their activity, or hire a specialist influencer marketing agency to protect their brand name”. In many ways, the problems highlighted by influencers who have come under fire for poor judgement and offensive content represent an ongoing challenge in marketing that predates social media. From Tiger Woods to Lance Armstrong, brand partnerships are fraught with human frailty, but there is no question that, on social networks, brands are grappling with this frailty at an unprecedented scale. Dr Jürgen Galler, Google’s former EMEA product director and chief executive of predictive data management company 1plusx, says the topic of content insecurity has been around since the web began, but programmatic advertising is not considering context.
“Technology is not yet precise enough and certain cases are still slipping through the net,” he adds. However, he believes that, rather than trying to fight “niche situations”, brands need to focus on appearing in a positive environment.

Time well spent

Concern over the impact of social media on our collective well-being is no longer confined to the Silicon Valley set. Jem Fawcus, group chief executive of Firefish, says that brands are facing up to a realisation that there is a significant downside to social media, and people are questioning their own use of the platforms. “Brands need to adopt the analytics and AI that can make the consumer experience seamless,” he adds. However, he believes that this must not come at the expense of humanity, which has become vital not just to brand experience but also to addressing the culture of aggression and distortion online. There is no question that technology platforms are ultimately responsible for policing the content on their sites. Nonetheless, the opportunity for marketing and media leaders to drive change by becoming answerable for where their advertising appears has never been greater.
