THUMBS-UP TO THE WISDOM OF CROWDS

Winnipeg Free Press - Section D - FRONT PAGE

cent of the time; when prices suggest a probability of 80 per cent, the events happen 80 per cent of the time, and so forth.

An appreciation of crowd wisdom suggests social networks hold special potential, because they can aggregate diverse views with astonishing speed. But recent research raises cautionary notes. It turns out crowds may have much less wisdom when their members are listening to one another. In such cases, we can end up with forms of herding, or social cascades, that reflect serious biases.

Researchers have long known crowds can be misled if their members influence one another. But the new research goes far beyond this simple point. Lev Muchnik, a professor at Hebrew University of Jerusalem, and his colleagues used a website that aggregates stories and allows people to post comments, which can in turn be voted “up” or “down.” An aggregate score comes from subtracting the number of “down” votes from the number of “up” votes.

The researchers created three conditions: “up-treated,” in which a comment, when it appeared, was automatically and artificially given an immediate “up” vote; “down-treated,” in which a comment, when it appeared, was automatically and artificially given an immediate “down” vote; and “control,” in which comments did not receive an artificial initial signal. Millions of site visitors were randomly assigned to one of the three conditions.

You might think that after so many visitors (and hundreds of thousands of ratings), the single initial vote could not possibly matter. If so, you would be wrong. After seeing an initial “up” vote, the first viewer became 32 per cent more likely to give an “up” vote himself. What’s more, this effect persisted over time. After a period of five months, a single positive initial vote artificially increased the mean rating of comments by 25 per cent.

With respect to negative votes, the picture was not symmetrical. The initial “down” vote did increase the likelihood the first viewer would also give a “down” vote. But the effect was rapidly corrected, and after a period of five months, the artificial “down” vote had no effect on median ratings. Muchnik and his colleagues conclude that “whereas positive social influence accumulates, creating a tendency toward ratings bubbles, negative social influence is neutralized by crowd correction.” They believe their findings have implications for product recommendations, stock-market predictions and electoral polling.
