Santa Fe New Mexican

Social media secrecy aids fake news


The indictment of 13 Russians filed earlier this month by Robert Mueller, the special counsel investigating Russian efforts to influence the 2016 presidential election, details the secret workings of the Internet Research Agency, an organization in St. Petersburg, Russia, that disseminates false information online. According to American intelligence officials, the Kremlin oversaw this shadowy operation, which made extensive use of social media accounts to foster conflict in the United States and erode public faith in its democracy.

But the Kremlin’s operation relied on more than just its own secrecy. It also benefited from the secrecy of social media platforms like Facebook and Twitter. Their algorithms for systematically targeting users to receive certain content are off-limits to the public, and the output of these algorithms is almost impossible to monitor. The algorithms make millions of what amount to editorial decisions, pumping out content without anyone fully understanding what is happening.

The editorial decisions of a newspaper or television news program are immediately apparent (articles published, segments aired) and so can be readily analyzed for bias and effect. By contrast, the editorial decisions of social media algorithms are opaque and slow to be discovered, even to those who run the platforms. It can take days or weeks before anyone finds out what has been disseminated by social media software.

The Mueller investigation is shining a welcome light on the Kremlin’s covert activity, but there is no similar effort to shine a light on the social media algorithms that helped the Russians spread their messages. There needs to be. This effort should begin by “opening up” the results of the algorithms.

In computer-speak, this “opening up” would involve something called an open application programming interface. This is a common software technique that allows different programs to work with one another. For instance, Uber uses the open application programming interface of Google Maps to get information about a rider’s pickup point and destination. It is not Uber’s own mapping algorithm, but rather Google’s open application programming interface, that makes it possible for Uber to build its own algorithms for its distinctive functions.
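As a rough illustration of the idea (not Uber’s actual integration), a third-party program simply sends a request to a published endpoint and works with the structured reply. The Python sketch below calls Google’s public Geocoding interface; the API key and address are placeholder values, and error handling is kept minimal.

```python
# Minimal sketch of calling an open application programming interface:
# a third-party program requests data from a published endpoint and reads
# the structured response. The key and address are placeholders.
import requests

API_KEY = "YOUR_API_KEY"  # credential issued by the platform (placeholder)

def geocode(address: str) -> dict:
    """Ask the open interface to turn a street address into coordinates."""
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/geocode/json",
        params={"address": address, "key": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # structured JSON the caller can build on

if __name__ == "__main__":
    result = geocode("1600 Pennsylvania Ave NW, Washington, DC")
    print(result["results"][0]["geometry"]["location"])  # latitude/longitude
```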

The government should require social media platforms like Facebook and Twitter to use a similar open application programming interface. This would make it possible for third parties to build software to monitor and report on the effects of social media algorithms. (This idea has been proposed by Wael Ghonim, the Egyptian Google employee who helped organize the Tahrir Square uprising in 2011.)

To be clear, the proposal is not to force companies to open up their algorithms — just the results of the algorithms. The goal is to make it possible to understand what content is fed into the algorithms and how the algorithms distribute that content. Who created the information or advertisement? And to what groups of users was it directed? An open application programming interface would therefore threaten neither a social media platform’s intellectual property nor the privacy of its individual users.
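What such a “results only” interface might return is sketched below. The record layout is hypothetical (no platform publishes this today): each entry describes a piece of content, who created or paid for it, and the audience segments it was delivered to, without exposing the ranking algorithm itself or any individual user.

```python
# Hypothetical record a "results only" open interface might expose: what went
# into the algorithm and where it was distributed, not how it was ranked.
# The DistributionRecord type and field names are illustrative, not any
# platform's real schema.
from dataclasses import dataclass, field

@dataclass
class DistributionRecord:
    content_id: str          # identifier of the post or advertisement
    creator: str             # account or advertiser that created the content
    paid_promotion: bool     # whether distribution was purchased
    audience_segments: list[str] = field(default_factory=list)  # targeted groups
    impressions: int = 0     # how many times the algorithm delivered it

example = DistributionRecord(
    content_id="post-123",
    creator="example-advertiser",
    paid_promotion=True,
    audience_segments=["swing-state residents", "ages 45-65"],
    impressions=250_000,
)
```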

Media watchdog groups have long been able to assess the results of the editorial decisions of newspapers and television. Whether those stories express the left, right or center of the political spectrum, they are openly available to independent organizations that want to understand what is being communicated.

Extending this practice to social media would mean that a watchdog group could create software to analyze and make public whatever information from the platforms it might consider important: the demographics of the readership of a certain article, for instance, or whether a fake story continued to be widely disseminated even after being debunked.
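A watchdog tool built on that kind of interface could be quite simple. The sketch below assumes records like the hypothetical ones above, each carrying a delivery date and an impression count, and tallies how widely a story kept spreading after the date fact-checkers flagged it as false.

```python
# Sketch of watchdog software using an assumed open interface: tally how
# widely a story kept spreading after it was debunked. The record fields
# ("delivered_on", "impressions") are illustrative, not a real platform API.
from datetime import date

def impressions_after_debunk(records, debunk_date: date) -> int:
    """Sum impressions delivered on or after the day the story was debunked."""
    return sum(r["impressions"] for r in records if r["delivered_on"] >= debunk_date)

# Example with stand-in data a real client would fetch from the platform:
records = [
    {"delivered_on": date(2018, 2, 10), "impressions": 120_000},
    {"delivered_on": date(2018, 2, 20), "impressions": 80_000},   # after debunking
]
print(impressions_after_debunk(records, debunk_date=date(2018, 2, 15)))  # 80000
```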

After the Mueller indictment, Twitter issued a statement noting that technology companies “cannot defeat this novel, shared threat alone” — referring to efforts like the Russian disinformation campaign. “The best approach,” the statement continued, “is to share information and ideas to increase our collective knowledge, with the full weight of government and law enforcement leading the charge against threats to our democracy.”

This is true. And one effective form of information sharing would be legally mandated open application programming interfaces for social media platforms. They would help the public identify what is being delivered by social media algorithms, and thus help protect our democracy.

Tom Wheeler, the chairman of the Federal Communications Commission from 2013-17, is a visiting fellow at the Brookings Institution and a fellow at Harvard Kennedy School. He wrote this commentary for The New York Times.
