
THE MURKY WORLD OF FACEBOOK’S FRIENDSHIPS WITH ACADEMICS

Researchers love getting their hands on data generated by the social media giant’s 2.2 billion users, but the question of who gets access and what they’re allowed to say remains a thorny one


The professor was incredulous.

David Craig had been studying the rise of entertainment on social media for several years when a Facebook employee he didn’t know emailed him last December, asking about his research. “I thought I was being pumped,” Prof Craig says.

The company flew him to Menlo Park and offered him $25,000 (Dh91,830) to fund his ongoing projects, with no obligation to do anything in return. This was definitely not normal, but after checking with his school, the University of Southern California, Prof Craig took the gift. “Hell, yes, it was generous to get an out-of-the-blue offer to support our work, with no strings,” he says. “It’s not all so black and white that they are villains.”

Other academics got these gifts, too. One, who says she recently had $25,000 deposited in her research account without signing a single document, spoke to a reporter hoping the journalist could help explain it. Another professor says one of his former students got an unsolicited monetary offer from Facebook, and he had to assure the recipient it wasn’t a scam.

The professor surmised that Facebook uses the gifts as a low-cost way to build connections that could lead to closer collaboration later. He also thinks Facebook “happily lives in the ambiguity” of the unusual arrangement. If researchers truly understood that the funding has no strings, “people would feel less obligated to interact with them”, he says.

The gifts are just one of the little-known and complicated ways Facebook works with academic researchers. For scholars, the scale of Facebook’s 2.2 billion users provides an irresistible way to investigate how human nature may play out on, and be shaped by, the social network. For Facebook, the motivations to work with outside academics are far thornier, and it’s Facebook that decides who gets access to its data to examine its impact on society.

“Just from a business standpoint, people won’t want to be on Facebook if Facebook is not positive for them in their lives,” says Rob Sherman, Facebook’s deputy chief privacy officer.

“We also have a broader responsibility to make sure that we’re having the right impact on society.”

The company’s long been conflicted about how to work with social scientists, and now runs several programmes, each reflecting the contorted relationship Facebook has with external scrutiny. The collaborations have become even more complicated in the aftermath of the Cambridge Analytica scandal, which was set off by revelations that a professor who once collaborated with Facebook’s in-house researchers used data collected separately to influence elections.

“Historically the focus of our research has been on product development, on doing things that help us understand how people are using Facebook and build improvements to Facebook,” Mr Sherman says. Facebook’s heard more recently from academics and non-profits who say “because of the expertise that we have, and the data that Facebook stores, we have an opportunity to contribute to generalisable knowledge and to answer some of these broader social questions”, he says. “So you’ve seen us begin to invest more heavily in social science research and in answering some of these questions.”

Facebook has a corporate culture that reveres research. The company builds its product based on internal data on user behaviour, surveys and focus groups. More than 100 PhD-level researchers work on Facebook’s in-house core data science team, and employees say the information that points to growth has had more of an impact on the company’s direction than chief executive Mark Zuckerberg’s ideas.

Facebook is far more hesitant to work with outsiders – it risks unflattering findings, leaks of proprietary information and privacy breaches. But Facebook likes it when external research proves that Facebook is great. And in the fierce talent wars of Silicon Valley, working with professors can make it easier to recruit their students.

It can also improve the bottom line. In 2016, when Facebook changed the “like” button into a set of emojis that better captured user expression – and feelings for advertisers – it did so with the help of Dacher Keltner, a psychology professor at the University of California, Berkeley, who’s an expert in compassion and emotions. Prof Keltner’s Greater Good Science Centre continues to work closely with the company.

And this January, Facebook made research the centrepiece of a major change to its news feed algorithm. In studies published with academics at several universities, Facebook found that people who used social media actively – commenting on friends’ posts, setting up events – were likely to see a positive impact on mental health, while those who used it passively were more likely to feel depressed. In reaction, Facebook declared it would spend more time encouraging “meaningful interaction”. Of course, the more people engage with Facebook, the more data it collects for advertisers.

The company has stopped short of pursuing deeper research on the potentially negative fallout of its power. According to its public database of published research, Facebook’s written more than 180 public papers about artificial intelligence but just one study about elections, based on an experiment Facebook ran on 61 million users to mobilise voters in the Congressional midterms back in 2010.

“We’ve certainly been doing a lot of work over the past couple of months, particularly to expand the areas where we’re looking,” Mr Sherman says.

Facebook’s first peer-reviewed papers with outside scholars were published in 2009, and almost a decade into producing academic work, it still wavers over how to structure the arrangements. It’s given out the smaller unrestricted gifts. But those gifts don’t come with access to Facebook’s data, at least initially. The company is more restrictive about who can mine or survey its users. It looks for research projects that dovetail with its business goals. Some academics cycle through one-year fellowships while pursuing doctorates, and others get paid for consulting projects that never get published.

When Facebook does provide data to researchers, it retains the right to veto or edit the paper before publication. None of the professors Bloomberg spoke with knew of cases when Facebook prohibited a publication, although many said the arrangement inevitably leads academics to propose investigations less likely to be challenged.

“Researchers focus on things that don’t create a moral hazard,” says Dean Eckles, a former Facebook data scientist now at the MIT Sloan School of Management. Without a guaranteed right to publish, Mr Eckles says, researchers inevitably shy away from potentially critical work. That means some of the most burning societal questions may go unprobed. Facebook also almost always pairs outsiders with in-house researchers.

This ensures scholars have a partner who’s intimately familiar with Facebook’s vast data, but some who have worked with Facebook say it also creates a selection bias in what gets studied. “Stuff still comes out, but only the immensely positive, happy stories – the goody-goody research that they could show off,” says one social scientist who worked as a researcher at Facebook. For example, he points out that the company’s published widely on issues related to well-being, or what makes people feel good and fulfilled, which is positive for Facebook’s public image and product.

“The question is, what’s not coming out?” he says. Facebook argues its body of work on well-being does have broad importance. “Because we are a social product that has large distribution within society, it is both about societal issues as well as the product,” says David Ginsberg, Facebook’s director of research.

Other social networks have smaller research ambitions, but have tried more open approaches. This spring, Twitter asked for proposals to measure the health of conversations on its platform, and Microsoft’s LinkedIn is running a multi-year programme to have researchers use its data to understand how to improve the economic opportunities of workers.

Facebook has issued public calls for technical research, but until the past few months it had not done so for the social sciences. Yet it has solicited in that area, albeit quietly: last summer, one scholarly association begged discretion when sharing information on a Facebook pilot project to study tech’s impact in developing economies. Its email read: “Facebook is not widely publicising the programme.”

In 2014, the prestigious Proceedings of the National Academy of Sciences published a huge study, co-authored by two Facebook researchers and an outside academic, which found that emotions were “contagious” online: people who saw sad posts were more likely to make sad posts. The catch: the results came from an experiment run on 689,003 Facebook users, in which researchers secretly tweaked the algorithm of Facebook’s news feed to show some users cheerier content than others.

People were angry, protesting that they hadn’t given Facebook permission to manipulate their emotions.

The company first said people allowed such studies by agreeing to its terms of service, and then eventually apologised. While the academic journal didn’t retract the paper, it issued an “Editorial Expression of Concern”.

To get federal research funding, universities must run testing on humans through what’s known as an institutional review board (IRB), which includes at least one outside expert, approves the ethics of the study and ensures subjects provide informed consent. Companies don’t have to run research through IRBs, and the emotional-contagion study fell through the cracks.

The outcry profoundly changed Facebook’s research operations, creating a review process that was more formal and cautious. The company set up a pseudo-IRB of its own, which doesn’t include an outside expert but does have policy and PR staff. Facebook also created a new public database of its published research, which lists more than 470 papers. But that database now has a notable omission – a December 2015 paper two Facebook employees co-wrote with Aleksandr Kogan, the professor at the heart of the Cambridge Analytica scandal.

Facebook says it believes the study was inadvertently never posted and is working to ensure other papers aren’t left off in the future.

In March, Gary King, a Harvard University political science professor, met some Facebook executives to try to persuade the company to share more data with academics. It wasn’t the first time he’d made his case, but he left the meeting with no commitment.

A few days later, the Cambridge Analytica scandal broke, and soon Facebook was on the phone with Prof King. Maybe it was time to co-operate, at least to understand what happens in elections. Since then, Prof King and a Stanford University law professor have developed a complicated new structure to give more researchers access to Facebook’s data on the elections and let scholars publish whatever they find.

The resulting structure is baroque, involving a new “commission” of scholars Facebook will help pick, an outside academic council that will award research projects, and seven independent US foundations to fund the work.

The new effort, which has yet to propose its first research project, is the most open approach Facebook has taken. “We hope that will be a model that replicates not just within Facebook but across the industry,” Mr Ginsberg says. “It’s a way to make data available for social science research in a way that means that it’s both independent and maintains privacy.”

But the new approach will also face an uphill battle to prove its credibility.

The new Facebook research project came together under the company’s public relations and policy team, not its research group of PhDs trained in ethics and research design. More than 200 scholars from the Association of Internet Researchers, a global group of interdisciplinary academics, have signed a letter saying the effort is too limited in the questions it’s asking, and that it risks replicating what sociologists call the “Matthew effect”, where only scholars from elite universities – like Harvard and Stanford – get an inside track. “Facebook’s new initiative is set up in such a way that it will select projects that address known problems in an area known to be problematic,” the academics wrote. The research effort, the letter says, also won’t let the world – or Facebook, for that matter – get ahead of the next big problem.

Facebook chief executive Mark Zuckerberg. When his company provides data to researchers, it retains the right to veto or edit the resulting paper. Reuters
