Facebook's secret

More and more, it seems to know you better than you even know yourself

SundayXtra - FRONT PAGE - By Will Oremus, Slate

MENLO PARK, Calif. — We all know by now Facebook isn't cool. Yet somehow it's more popular than ever. This week, the company announced its growth continues to surge — not only in terms of the sheer number of Facebook users, but in terms of how much they use the site. On any given day, Mark Zuckerberg said, 63 per cent of Facebook's 1.28 billion users log into the site. The proportion of users who log in at least six days a week has now surpassed 50 per cent.

How is it possible Facebook keeps getting more addictive over time, rather than less?

It's possible because Facebook knows what you like — and it's getting better at understanding you all the time.

As much work and data — your data — as Facebook feeds into its targeted advertising, it works at least as hard at figuring out which of your friends' posts you're most likely to want to see each time you open the app. Advertisers may butter Facebook's bread, but its most pressing interest of all is in keeping its users coming back for more. If it ever fails, its advertising business will implode.

So how does Facebook know what we like? On a recent visit to the company's headquarters in Menlo Park, Calif., I talked about that with Will Cathcart, who oversees the product management teams that work on the company's news feed. The answer holds lessons for the future of machine learning, the media and the Internet at large.

Facebook launched the news feed in 2006, but it didn't introduce the "like" button until 2009. Only then did the site have a way to figure out which posts you were actually interested in — and which new posts you might be interested in, based on what your friends and others were liking. In the years since, the news feed has gone from being a simple chronological list to a machine-learning product, with posts ranked in your timeline according to the likelihood that you would find them interesting. The goal is to ensure, for example, the first picture of your best friend's new baby takes precedence over a remote acquaintance's most recent Mafia Wars score.
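The ranking idea can be sketched in a few lines. This is a toy illustration, not Facebook's actual model; the signals and weights here (closeness to the author, like counts, recency decay) are hypothetical stand-ins for whatever the real system predicts with.

```python
# Toy sketch of relevance-ranked feed ordering: each candidate post
# gets a predicted-interest score, and the feed is the candidates
# sorted by that score instead of by time alone.

def predicted_interest(post, viewer):
    """Hypothetical scoring model combining a few engagement signals."""
    score = 0.0
    score += 3.0 * post["closeness"]       # how close the viewer is to the author
    score += 1.5 * post["likes"] / 100.0   # social proof from likes
    score -= 0.5 * post["age_hours"]       # older posts decay
    return score

def rank_feed(candidate_posts, viewer):
    return sorted(candidate_posts,
                  key=lambda p: predicted_interest(p, viewer),
                  reverse=True)

posts = [
    {"id": "baby_photo", "closeness": 0.9, "likes": 40, "age_hours": 2},
    {"id": "game_score", "closeness": 0.1, "likes": 5, "age_hours": 1},
]
feed = rank_feed(posts, viewer="you")
# the close friend's baby photo outranks the acquaintance's game score
```

In a chronological feed the newer game score would come first; the scored feed puts the baby photo on top.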

For a while, Facebook likes — coupled with a few other metrics, such as shares, comments and clicks — served as a pretty decent proxy for engagement. But they were far from perfect, Cathcart concedes. A funny photo meme might get thousands of quick likes, while a thoughtful news story analyzing the conflict in Ukraine would be punished by Facebook's algorithms because it didn't lend itself to a simple thumbs-up. The result was people's news feeds became littered with the social-media equivalent of junk food. Facebook had become optimized for stories people Facebook-liked, rather than stories that people actually liked.

Worse, many of the same stories thousands of people Facebook-liked turned out to be ones thousands of other people genuinely hated. They included posts with clickbait headlines designed to score cheap likes and clicks, which actually led to pages filled with spammy ads rather than the content the headline promised. But in the absence of a "dislike" button, Facebook's algorithms had no way of knowing which posts were turning users off. Eventually, about a year ago, Facebook acknowledged it had a "quality content" problem.

This is not a problem specific to Facebook. It's a problem that confronts every company or product that harnesses data analytics to drive decision-making. So how do you solve it? For some, the answer might be to temper data-driven insights with a healthy dose of human intuition. But Facebook's news feed operates on a scale and a level of personalization that makes direct human intervention infeasible. So for Facebook, the answer was to begin collecting new forms of data designed to generate insights the old forms of data — such as shares, comments and clicks — couldn't.

Three sources of data in particular are helping Facebook to refashion its news feed algorithms to show users the kinds of posts that will keep them coming back: surveys, A/B tests and data on the time users spend away from Facebook once they click on a given post — and what they do when they come back.

Surveys can get at questions that other metrics can't, while A/B tests offer Facebook a way to put its hunches under a microscope. Every time its developers make a tweak to the algorithms, Facebook tests it by showing it to a small percentage of users. At any given moment, Cathcart says, there might be 1,000 different versions of Facebook running for different groups of users. Facebook is gathering data on all of them, to see which changes are generating positive reactions and which ones are falling flat.
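Running 1,000 variants at once requires assigning each user to a variant deterministically, so the same person sees the same version on every visit. One common way to do that — a sketch, and an assumption about the general technique rather than Facebook's actual infrastructure — is to hash the user's ID into a fixed number of buckets:

```python
# Sketch of deterministic experiment bucketing: hashing a stable
# user ID means each user lands in the same variant every session,
# without storing any per-user assignment.

import hashlib

NUM_BUCKETS = 1000  # e.g. 1,000 concurrent feed variants

def variant_for(user_id: str) -> int:
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_BUCKETS

# the same user always gets the same variant across sessions
v1 = variant_for("user_12345")
v2 = variant_for("user_12345")
```

Engagement data collected per bucket can then be compared to see which variants "generate positive reactions" and which fall flat.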

For instance, Facebook recently tested a series of changes designed to correct for the proliferation of "like-bait" — stories or posts that explicitly ask users to hit the "like" button in order to boost their ranking in your news feed. Some in the media worried Facebook was making unjustified assumptions about its users' preferences. In fact, Facebook had already tested the changes on a small group of users before it publicly announced them. "We actually very quickly saw that the people we launched that improvement to were clicking on more articles in their news feed," Cathcart explains.

When users click on a link in their news feed, Cathcart says, Facebook looks very carefully at what happens next. "If you're someone who, every time you see an article from the New York Times, you not only click on it, but go offsite and stay offsite for a while before you come back, we can probably infer that you in particular find articles from the New York Times more relevant" — even if you don't actually hit "like" on them.
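That dwell-time inference can be illustrated with a toy example. The threshold and the per-source scoring below are hypothetical, not Facebook's: the point is only that a quick bounce back to the feed and a long stay offsite carry opposite signals.

```python
# Toy sketch of the dwell-time signal: a click followed by a long
# stay offsite counts as implicit relevance for that source, while
# a quick bounce back suggests the click was a dud.

SHORT_VISIT_SECONDS = 10  # assumed threshold for a "bounce"

def implicit_relevance(click_events):
    """click_events: list of (source, seconds_offsite) tuples.
    Returns, per source, the fraction of clicks that led to a real visit."""
    stats = {}
    for source, dwell in click_events:
        counts = stats.setdefault(source, {"clicks": 0, "engaged": 0})
        counts["clicks"] += 1
        if dwell >= SHORT_VISIT_SECONDS:
            counts["engaged"] += 1
    return {s: c["engaged"] / c["clicks"] for s, c in stats.items()}

clicks = [
    ("nytimes.com", 120),          # read the article
    ("nytimes.com", 90),           # read another one
    ("spam-site.example", 3),      # bounced straight back
]
relevance = implicit_relevance(clicks)
# nytimes.com scores 1.0; the spam site scores 0.0
```

A ranking system could then boost sources with high engaged-click fractions for that particular reader, with no "like" ever pressed.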

At the same time, Facebook has begun more carefully differentiating between the likes a post gets before users click on it and the ones it gets after they've clicked. A lot of people might be quick to hit the like button on a post based solely on a headline or teaser that panders to their political sensibilities. But if very few of them go on to like or share the article after they've read it, that might indicate to Facebook that the story didn't deliver.
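A minimal sketch of that pre-click versus post-click distinction, with made-up numbers (the metric and thresholds here are illustrative assumptions, not Facebook's actual formula):

```python
# Toy sketch: separate likes given before a click (a reaction to the
# headline) from likes given after (a reaction to the article itself).
# A low post-click share suggests the story under-delivered.

def post_click_like_share(pre_click_likes: int, post_click_likes: int) -> float:
    total = pre_click_likes + post_click_likes
    return post_click_likes / total if total else 0.0

# a pandering headline racks up likes that evaporate after reading
clickbait = post_click_like_share(pre_click_likes=950, post_click_likes=50)    # 0.05

# a story that delivers keeps earning likes after the click
solid_story = post_click_like_share(pre_click_likes=200, post_click_likes=300)  # 0.6
```

The clickbait post looks popular by raw like count, but its post-click share exposes it.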

Some have speculated Facebook's news feed changes were specifically targeting certain sites for demotion while elevating the ranking of others. That's not the case, Cathcart insists. Facebook defines high-quality content not by any objective ranking system, but according to the tastes of its users. If you love Upworthy and find the Times snooze-worthy, then Facebook's goal is to show you more of the former and less of the latter.

Each time you log in to Facebook, the site's algorithms have to choose from among an average of 1,500 possible posts to place at the top of your news feed. "The perfect test for us," Cathcart says, "would be if we sat you down and gave you all 1,500 stories and asked you to rearrange them from 1 to 1,500 in the order of what was most relevant for you. That would be the gold standard." But that's a little too much testing, even for Facebook.

For a lot of people, the knowledge Facebook's computers are deciding what stories to show them — and which ones to hide — remains galling. Avid Twitter users swear by that platform's more straightforward chronological timeline, which relies on users to carefully curate their own list of people to follow. There's a reason Facebook's engagement metrics keep growing while Twitter's are stagnant. As much as we'd like to think we could do a better job than the algorithms, the fact is most of us don't have time to sift through 1,500 posts on a daily basis. And so, even as we resent Facebook's paternalism, we keep coming back to it.

Just maybe, if Facebook keeps getting better at figuring out what we actually like as opposed to what we just Facebook-like, we'll start to actually like Facebook itself a little more than we do today.
