Social engineering

Hyper – FEATURE

Have you ever wondered why Facebook spent $2 billion USD to buy Oculus VR, an unproven tech startup which, at that point, had been in operation for less than two years? In a blog post announcing the deal, Mark Zuckerberg talks about how VR is “a new communication platform” destined to become “a part of daily life for billions of people” and how this naturally aligns with Facebook’s mission to make the world “more open and connected” for everyone. He doesn’t specify to whom or what we’re supposed to be more open and connected, but given what we know about Facebook’s data and advertising-driven revenue model, it’s not hard to fill in the blanks.

This is a worry. We already know that human behaviour can be strongly and subconsciously influenced by the social and physical environment. Some of the most (in)famous psychology experiments of the 20th century – including the Stanford prison experiment and the Milgram obedience experiments – testify to this fact, as does the design of everything from grocery stores to public parks and train stations. The question is: what happens when the environment is virtual and entirely beholden to the whims of a multinational corporation?

Virtual environments open up new and powerful possibilities for so-called neuromarketing: an intense kind of targeted advertising in which micro-gestures like eye, head, and hand movements are tracked and used to infer details about the person making them.

“When we have VR in the hands of neuromarketers, two things change,” says Madary. “First, [developers] have control over the entire environment. In the real world, the space in which advertising appears is fixed, but in VR the advertisement can be anywhere in that space. Second, because the technology works by tracking bodily movements, the neuromarketers will have much more information about how we react to advertisements. When marketers have that kind of information they can really take advantage of it because they can track exactly how people react.”

But neuromarketing is just the tip of the iceberg. VR’s ability to subconsciously influence people’s behaviour may also make it a potent vector for indoctrination. We’ve seen how VR might be used to enhance empathy – now imagine the opposite: a simulator designed to decrease empathy, to dehumanise the other from the inside out. Or imagine a kind of exposure therapy for extreme violence: soldiers deadened to acts of real brutality because they’ve done it all in VR so many times before.

“If we use our imagination, we can think of some really unpleasant uses for VR,” says Madary.
