Miami Herald (Sunday)

We know how Venezuela’s elections will end. Will observers endorse the farce?

BY ANDRES OPPENHEIMER aoppenheimer@miamiherald.com

The official campaign for Venezuela's Nov. 21 regional and local elections has just begun, but I can save you the suspense: Barring an unlikely last-minute deal with the opposition on fair electoral rules, Venezuela's dictatorship will claim a landslide victory on election night.

The question is whether Venezuelan dictator Nicolás Maduro's latest effort to legitimize his regime will be validated by international election missions.

The 27-country European Union, the United Nations and the Carter Center, among others, have announced that they will send missions to observe the elections, in which more than 3,000 state and local candidates are running. Maduro is seeking to portray the elections as fair.

But there are big fears in pro-democracy circles that some of these missions may hand Maduro a huge propaganda victory if they put out a statement shortly after the election that focuses only on the vote counting, instead of evaluating the monthslong electoral process.

That's because the fraud may not be in the vote counting, but in the lack of a level playing field during the race.

When I asked Venezuela's opposition leader Juan Guaidó in a telephone interview whether there are conditions for a free election on Nov. 21, he responded, "Today, there are no conditions."

Guaidó, who is recognized by the United States and several other countries as interim president after Maduro's fraudulent electoral victory in 2018, reminded me that Maduro still controls Venezuela's electoral tribunal; the largest opposition parties have been declared illegal; hundreds of opposition candidates have been banned from running for office; there are political prisoners; and the opposition has little access to government-controlled media.

"Nov. 21 can be an option for Venezuelans to mobilize and organize," Guaidó told me. "We are living in a dictatorship, and we must seek secure spaces to express ourselves."

He added, "Eighty-five percent of Venezuelans reject Maduro, but in recent years we have protested, and they killed us; we have created opposition parties, and they declared them illegal. We have to take advantage of any space that allows us to mobilize."

Asked whether he fears that the electoral missions from the European Union, the U.N. and the Carter Center may help legitimize a fraudulent election, Guaidó said he hopes that they will focus on the entire election process, rather than on what happens on election day. Otherwise, it will be "electoral tourism," and the foreign electoral missions will delegitimize themselves, he said.

There are fears in opposition circles that the EU's High Representative for Foreign Affairs and Security, Spanish Socialist Party politician Josep Borrell, may validate a fraudulent Maduro-regime victory.

On Oct. 12, the Financial Times reported that Borrell overruled an internal report from his own staff that recommended not sending observers to Venezuela. The newspaper quoted an internal document from Borrell's office saying that deploying an electoral mission in Venezuela "is likely to have an adverse impact on the reputation and credibility of EU (observers) and indirectly legitimize Venezuela's electoral process."

The more-than-70-member EU mission is headed by Isabel Santos, a Portuguese Socialist Party politician. She said recently that her mission won't deploy any observers in Venezuela's Amazonas state "because of security reasons." Several other parts of the country are also expected to be off limits for foreign observers.

Jennie Lincoln, head of the Carter Center's six-person mission to Venezuela, told me that her group will focus on the pre-election and post-election conditions.

"We will analyze the entire electoral process, including the government's disqualifications of candidates and parties, and the restrictions to opposition candidates' access to media," Lincoln told me.

If that's what the international electoral missions do, it would call Maduro's bluff and put pressure on Venezuela's dictatorship to allow more evenhanded conditions for the 2024 presidential elections.

But if the international monitors say anything that Maduro can use as a validation of his fraud, they will be shameless accomplices of an electoral farce.

Don't miss the "Oppenheimer Presenta" TV show at 8 p.m. Sundays on CNN en Español. Twitter: @oppenheimera

BY JEREMY B. MERRILL AND WILL OREMUS The Washington Post

Five years ago, Facebook gave its users five new ways to react to a post in their news feed beyond the iconic "like" thumbs-up: "love," "haha," "wow," "sad" and "angry."

Behind the scenes, Facebook programmed the algorithm that decides what people see in their news feeds to use the reaction emoji as signals to push more emotional and provocative content, including content likely to make them angry.

Starting in 2017, Facebook's ranking algorithm treated emoji reactions as five times more valuable than "likes," internal documents reveal. The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged, and keeping users engaged was the key to Facebook's business.

Facebook's own researchers were quick to suspect a critical flaw. Favoring "controversial" posts, including those that make users angry, could open "the door to more spam/abuse/clickbait inadvertently," a staffer, whose name was redacted, wrote in one of the internal documents. A colleague responded, "It's possible."

The warning proved prescient. The company's data scientists confirmed in 2019 that posts that sparked angry reaction emoji were disproportionately likely to include misinformation, toxicity and low-quality news.

That means that for three years Facebook systematically amped up some of the worst of its platform, making it more prominent in users' feeds and spreading it to a much wider audience. The power of the algorithmic promotion undermined the efforts of Facebook's content moderators and integrity teams, who were fighting an uphill battle against toxic and harmful content.

The internal debate over the "angry" emoji and the findings about its effects shed light on the highly subjective human judgments that underlie Facebook's news feed algorithm: the byzantine machine-learning software that decides for billions of people what kinds of posts they'll see each time they open the app. The deliberations were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by the legal counsel of whistleblower Frances Haugen. The redacted versions were reviewed by a consortium of news organizations, including The Washington Post.

"Anger and hate is the easiest way to grow on Facebook," Haugen told the British Parliament on Monday.

In several cases, the documents show Facebook employees on its "integrity" teams raising flags about the human costs of specific elements of the ranking system, warnings that executives sometimes heeded and other times seemingly brushed aside. Employees evaluated and debated the importance of anger in society: Anger is a "core human emotion," one staffer wrote, while another pointed out that anger-generating posts might be essential to protest movements against corrupt regimes.

An algorithm such as Facebook's, which relies on sophisticated, opaque machine-learning techniques to generate its engagement predictions, "can sound mysterious and menacing," said Noah Giansiracusa, a math professor at Bentley University in Massachusetts and author of the book "How Algorithms Create and Prevent Fake News." "But at the end of the day, there's one number that gets predicted — one output. And a human is deciding what that number is."

Facebook spokesperson Dani Lever said, "We continue to work to understand what content creates negative experiences, so we can reduce its distribution. This includes content that has a disproportionate amount of angry reactions, for example."

The weight of the angry reaction is just one of the many levers that Facebook engineers manipulate to shape the flow of information and conversation on the world's largest social network, one that has been shown to influence everything from users' emotions to political campaigns to atrocities.

Facebook takes into account numerous factors, some weighted to count a lot, some to count a little and some to count as negative, that add up to a single score that the news feed algorithm generates for each post in each user's feed, each time they refresh it. That score is in turn used to sort the posts, deciding which ones appear at the top and which appear so far down that you'll probably never see them. That single all-encompassing scoring system is used to categorize and sort vast swaths of human interaction in nearly every country of the world and in more than 100 languages.

Facebook doesn't publish the values its algorithm puts on different kinds of engagement, let alone the more than 10,000 "signals" that it has said its software can take into account in predicting each post's likelihood of producing those forms of engagement. It often cites a fear of giving people with bad intentions a playbook to explain why it keeps the inner workings under wraps.
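The reported weighting scheme — a reaction emoji worth five times a "like," with the weighted sum deciding a post's rank — can be illustrated with a toy scoring function. Everything here (the weight values, post structure and field names) is a hypothetical sketch, not Facebook's actual code, which combines thousands of undisclosed signals.

```python
# Toy model of a weighted engagement score: each reaction type carries a
# weight, and the weighted sum of a post's reaction counts decides its rank.
WEIGHTS = {"like": 1, "love": 5, "haha": 5, "wow": 5, "sad": 5, "angry": 5}

def score(post):
    """Weighted sum of a post's reaction counts; higher ranks higher in the feed."""
    return sum(WEIGHTS.get(kind, 0) * n for kind, n in post["reactions"].items())

posts = [
    {"id": "calm",    "reactions": {"like": 100}},             # score 100
    {"id": "outrage", "reactions": {"like": 10, "angry": 30}}, # score 10 + 150 = 160
]
feed = sorted(posts, key=score, reverse=True)
print([p["id"] for p in feed])  # ['outrage', 'calm']
```

With anger worth five points a piece, 30 angry reactions outweigh 100 likes, which is the dynamic the internal researchers flagged: the provocative post wins the top slot.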

Facebook’s levers rely on signals most users wouldn’t notice, like how many long comments a post generates, or whether a video is live or recorded, or whether comments were made in plain text or with cartoon avatars, the documents show. It even accounts for the computing load that each post requires and the strength of the user’s Internet signal. Depending on the lever, the effects of even a tiny tweak can ripple across the network, shaping whether the news sources in your feed are reputable or sketchy, political or not, whether you saw more of your real friends or more posts from groups Facebook wanted you to join, or if what you saw would be likely to anger, bore or inspire you.
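To see how small a "lever" adjustment can be and still ripple through what users see, consider re-ranking the same two hypothetical posts under two different weights for the angry reaction (again an illustrative sketch, not the real system):

```python
# Hypothetical demonstration: changing one weight (a single "lever")
# can reorder the entire feed.
def rank(posts, weights):
    def score(post):
        return sum(weights.get(kind, 0) * n for kind, n in post["reactions"].items())
    return [p["id"] for p in sorted(posts, key=score, reverse=True)]

posts = [
    {"id": "calm",    "reactions": {"like": 100}},
    {"id": "outrage", "reactions": {"like": 10, "angry": 25}},
]
# Angry worth 5 likes: the outrage post scores 135 and tops the feed.
print(rank(posts, {"like": 1, "angry": 5}))    # ['outrage', 'calm']
# Drop the angry weight to 1.5: the outrage post scores 47.5 and falls below.
print(rank(posts, {"like": 1, "angry": 1.5}))  # ['calm', 'outrage']
```

A one-number change flips which post billions of users would see first, which is why internal debates over these weights carried so much consequence.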

Beyond the debate over the angry emoji, the documents show Facebook employees wrestling with tough questions about the company's values, performing cleverly constructed analyses. When they found that the algorithm was exacerbating harms, they advocated for tweaks they thought might help. But those proposals were sometimes overruled.

When boosts, like those for emoji, collided with "deboosts" or "demotions" meant to limit potentially harmful content, all that complicated math added up to a problem in protecting users. The average post got a score of a few hundred, according to the documents. But in 2019, a Facebook data scientist discovered there was no limit to how high the ranking scores could go.

If Facebook's algorithms thought a post was bad, Facebook could cut its score in half, pushing most instances of the post way down in users' feeds. But a few posts could get scores as high as a billion, according to the documents. Cutting an astronomical score in half to "demote" it would still leave it with a score high enough to appear at the top of the user's feed.

“Scary thought: civic demotions not working,” one Facebook employee noted.
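The failure mode that data scientist found — halving an unbounded score — takes only a few lines to sketch. The "few hundred" average and "as high as a billion" figures come from the documents; the cap value and function names are hypothetical.

```python
# Sketch of why a multiplicative demotion fails when scores are unbounded.
def demote(score, factor=0.5):
    """Cut a post's ranking score in half, the "demotion" the documents describe."""
    return score * factor

typical = 300              # the average post scored "a few hundred"
viral_bad = 1_000_000_000  # a few posts scored "as high as a billion"

# Halving the viral post leaves 500,000,000, still far above any typical post,
# so the demotion never actually changes what users see first.
assert demote(viral_bad) > typical

# One possible remedy: clamp scores to a bounded range before demoting, so a
# 50% penalty is guaranteed to move a post below ordinary content.
MAX_SCORE = 1_000  # hypothetical cap
assert demote(min(viral_bad, MAX_SCORE)) <= typical + 200
```

A multiplicative penalty only works when all scores live on a comparable scale; against an unbounded outlier, it is arithmetic theater.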

Demonstrators wearing masks of the 'angry' emoji and of Facebook founder Mark Zuckerberg protest in London in 2018. A Facebook algorithm that analyzed users' responses to posts had the effect of methodically amplifying the inflammatory content they received. (Alastair Grant, AP)
Venezuela's Nicolás Maduro is hoping to have regional elections legitimized in the eyes of the world. (Getty Images)
