Universities need a Big Brother for Big Data, says John Holmwood

The Cambridge Analytica controversy flags up the ethical perils of commercial potential, says John Holmwood

John Holmwood is professor of sociology at the University of Nottingham.

The scandal around Cambridge Analytica’s use of Facebook data raises a number of ethical issues. The most important one concerns the oversight of academic research involving platforms such as Facebook.

Research using human subjects requires informed consent, and it is not clear that this was fully obtained in this case.

Much has been reported about the business ethics dispute within the University of Cambridge’s Psychometrics Centre between David Stillwell and Michal Kosinski, developers of the myPersonality app, and Aleksandr Kogan, who sought to use it in his work for Cambridge Analytica. The dispute turns on the value of the software and data. Kosinski and Stillwell apparently wanted to plough payments from Cambridge Analytica back into research, although they also hold personal IP rights in the app’s commercial uses. In the end, Kogan developed a separate app, This Is Your Digital Life, to generate similar data.

While Facebook has been criticised for facilitating access to data, few comments have been made about the wider ethics of academics gathering data via apps, or about the issues associated with monetising academic research.

UK research councils require all research involving human subjects to be reviewed by an ethics committee, to ensure that proper safeguards are in place to prevent deception and to secure the informed consent of participants. In the case of social media research, it would not be sufficient to claim that Facebook users have voluntarily made their data public, either by the act of posting it or as a consequence of the platform’s terms and conditions.

It is not clear whether the myPersonality project was submitted to an ethics committee. The app’s wiki page describes how consent was sought from participants interested in learning about their personality characteristics according to a common psychometric test. This was presented in standard terms, letting participants know that they could withdraw from the study at any time. They were not told that the data could be made available to other researchers, or be used commercially.

It appears that the true purpose of the app was revealed only at the end of the test, when participants were asked to give permission to share their Facebook data. Consent was sought on the basis that the researchers wanted to explore questions such as “do people who have conservative political views have a particular type of personality?” Respondents were assured that their “data will not be published individually, but only as part of aggregate information”.

Despite the language of consent, these methods are dubious because they did not seek fully informed consent. While covert methods can be justified ethically, that depends upon there being a higher purpose; the commercial value of the data being gathered would not constitute such a purpose.

More serious is that respondents were not told that giving their agreement at this point would also provide the researchers with access to the data of others. According to the Psychometrics Centre’s description of the product services provided by the database, around 40 per cent of respondents agreed to “give access to their Facebook profile data and social network”. This appears to imply that the wider dataset includes data on individuals who did not give consent, since their data was provided by others. Yet it is these data that constituted the commercial value of the project; this is evident in the description of the data as including 3.5 million records of “friendship triads” and other records of couples and “potential” family members identified by matching family names and home towns.

No university research ethics committee should allow this form of data harvesting. Whatever the precise details of the motivation and data gathering in the myPersonality research, the case highlights the potential of Big Data to undermine academic sensibilities. Research funders increasingly urge academics to conduct research with “impact”, and it is clear that many academics involved in Big Data are aware of its commercial possibilities. Universities and funders need to ask whether they are doing enough to maintain ethical standards in the face of potential incentives to circumvent them.
