The Morning Journal (Lorain, OH)

Do companies have will to change?

By Matthew Crain (Miami University) and Anthony M. Nadler (Ursinus College). The Conversation is an independent and nonprofit source of news, analysis and commentary from academic experts.

Facebook is in crisis mode, but the company can take major steps to fix itself – and the global community it says it wants to promote. Facebook founder, CEO and majority shareholder Mark Zuckerberg need not wait for governments to impose regulations. If he and other industry leaders wanted to, they could make meaningful changes fairly quickly.

It wouldn’t be painless, but Facebook is in a world of hurt already, facing criticism for contributing to civil unrest and sectarian turmoil around the world, delayed responses to disinformation campaigns, misleading users about data-handling policies, and efforts to discredit critics – not to mention a budding employee revolt.

Facebook, Twitter, Google and other social media companies are causing society-wide damage. But they tend to describe the problems as much smaller, resulting from rogue individuals and groups hijacking their systems for nefarious purposes. Our research into how social media can be exploited by manipulative political operatives, conducted with Joan Donovan at the Data & Society research institute, suggests the problem is larger than these companies admit.

Facebook, Google, Twitter and other social media companies have built an enormous influence machine powered by user tracking, targeting, testing and automated decision-making to make advertising more effective and efficient. While building this supercharged surveillance system, companies have promised users and regulators that targeted advertising benefits consumers and advertisers alike.

In this bargain, users are supposed to receive more relevant ads. Facebook, for instance, explains that its “interest-based advertising” serves users who “want to see ads that relate to things they care about.” It’s true that these methods can identify ads that connect with users’ actual interests. But the very same data-driven techniques that tell a surfer about a new board design can also identify strategic points where people are most vulnerable to influence.

In particular, the leading social media advertising systems let political operatives experiment with different ads to see which are the most effective. They can use these tools not only to see if certain issues resonate with particular targets but also to test for fears or prejudices that can be invoked to influence political behavior.

One key way to do this is to make people feel that someone else represents an emotionally charged threat to their identity. In 2016, for instance, Russia-linked operatives bought thousands of Facebook ads targeted to specific audiences suggesting Hillary Clinton had insulted their group’s dignity or threatened their safety.

Targeting political ads is not unique to online advertising, but the tools of digital ad systems are vastly more powerful than those of traditional mass media.

Members of Congress and some key Silicon Valley figures have begun discussing the need for tighter government oversight and greater accountability in digital advertising. Change need not wait for politics.

Based on our analysis, here are some steps companies could take right away – on their own. These moves may hurt the firms’ finances, but would demonstrate serious and lasting commitment to limiting their platforms’ usefulness in political manipulation campaigns.

As their first move, social media companies could stop allowing their ad services to be used as freewheeling experimental laboratories for examining their users’ psyches. Just as marketers and academic researchers must obtain permission from their test subjects, political advertisers that run online ad experiments could get informed consent in advance from every user who is involved. Companies should ask for users’ consent in specific notifications about ad experiments and not penalize users for opting out by limiting their access to services.

To increase transparency and limit the ability of special interests to secretly influence politics, social media companies could refuse to work with so-called dark money groups. All political advertisers should be required to disclose their major donors.

A new policy banning dark money ads would respond to evidence that political operatives have used impersonation and manipulative ad tactics to stir in-fighting or sow division among coalitions of their adversaries.

A more significant change companies could make would be to introduce democratic oversight of how they collect and use people’s data.

Facebook’s Zuckerberg recently took an initial step in this direction, announcing that he will create independent review panels to handle users’ appeals against the company’s removal of content it judges inappropriate. He explained that he wanted to ensure “these decisions are made in the best interests of our community and not for commercial reasons.”

Whatever you think about this plan – and it has been greeted with plenty of skepticism – Zuckerberg’s reasoning acknowledges that because social platforms have become so central to democratic life, their own policies and design decisions require democratic accountability.

A more ambitious vision would let independent ethics panels representing diverse communities of users set enforceable policies for ethical political advertising. Similar sorts of groups are common in medicine and are emerging in artificial intelligence, among other fields. The details of how such committees operate will be critical to their success. If these committees are set up in partnership with nonprofit organizations with proven records of advocating for democratic communication and campaign finance transparency, perhaps they could help social media companies earn greater public trust by prioritizing democracy over maximizing their profits.
