Los Angeles Times

Facebook’s political ad policy shifts

Company says it is taking more steps to encourage voting and delete misinformation.

- ASSOCIATED PRESS

The social media giant says it will ban new ads in the week before Election Day in an attempt to minimize misinformation.

With just two months left until the U.S. presidential election, Facebook says it is taking more steps to encourage voting, minimize misinformation and reduce the likelihood of post-election “civil unrest.”

The company said Thursday it will restrict new political ads in the week before the election and remove posts that convey misinformation about COVID-19 and voting. It also will attach links to official results to posts from candidates and campaigns declaring premature victories.

“This election is not going to be business as usual. We all have a responsibility to protect our democracy,” Facebook Chief Executive Mark Zuckerberg said in a post Thursday. “That means helping people register and vote, clearing up confusion about how this election will work, and taking steps to reduce the chances of violence and unrest.”

Activists hailed the new policies but said the onus will be on Facebook to enforce them. And some experts were skeptical that they’ll really make a difference.

Siva Vaidhyanathan, a Facebook expert at the University of Virginia, said the company proved once again its incapacity to effectively snuff out dangerous misinformation last week when it failed to remove postings by right-wing militia organizers urging supporters with rifles to converge on Kenosha, Wis.

“Facebook’s biggest problem has always been enforcement,” he said. “Even when it creates reasonable policies that seem well-meaning, it gets defeated by its own scale. So I am not optimistic that this will be terribly effective.”

Facebook and other social media companies are being scrutinized over how they handle misinformation, given issues with President Trump and other candidates posting false information and Russia’s interference in the 2016 election and its ongoing attempts to interfere in U.S. politics.

Facebook has long been criticized for not fact-checking political ads or limiting how they can be targeted at small groups of people.

With the nation heavily divided and election results potentially taking days or weeks to be finalized, there could be an “increased risk of civil unrest across the country,” Zuckerberg said.

Civil rights groups said they directly pitched Zuckerberg and other Facebook executives to make many of the changes announced Thursday.

“These are really significant steps, but everything is going to depend on the enforcement,” said Vanita Gupta, who was head of the Obama Justice Department’s civil rights division and now leads the Leadership Conference on Civil and Human Rights. “I think they’re going to be tested on it pretty soon.”

In July, Trump refused to publicly commit to accepting the results of the upcoming election, scoffing at polls that showed him lagging behind Democratic candidate Joe Biden. That has raised concern over the willingness of Trump and his supporters to abide by election results.

Under the new measures, Facebook says it will prohibit politicians and campaigns from running new election ads in the week before the election. However, they can still run existing ads and change how they are targeted.

Trump campaign spokeswoman Samantha Zager criticized the ban on new political ads, saying it would prevent Trump from defending himself on the platform in the last seven days of the presidential campaign.

Posts with obvious misinformation on voting policies and the COVID-19 pandemic will also be removed. Users can forward articles to a maximum of five others on Messenger, Facebook’s messaging app.

The company also will work with the Reuters news agency to provide official election results and make the information available both on its platform and with push notifications.

After getting caught off guard by Russia’s efforts to interfere in the 2016 election, Facebook, Google, Twitter and other companies installed safeguards to prevent it from happening again.

That includes taking down posts, groups and accounts that engage in “coordinated inauthentic behavior” and strengthening verification procedures for political ads. Last year, Twitter banned political ads altogether.

Zuckerberg said Facebook had removed more than 100 networks worldwide engaging in such interference over the last few years.

“Just this week, we took down a network of 13 accounts and two pages that were trying to mislead Americans and amplify division,” he said.

But experts and Facebook’s own employees say the measures are not enough to stop the spread of misinformation, including from politicians and in the form of edited videos.

That internal dissent among Facebook employees might have helped influence Zuckerberg’s decision to do something, said Joan Donovan, a disinformation researcher at Harvard University.

“This is a huge about-face for Facebook in this moment because for so long they said they were unwilling to moderate political speech and now at this stage they are drawing very sharp lines and I think that’s because their company cannot survive another four-year scandal,” she said.

Facebook had previously drawn criticism for its ads policy that cited freedom of expression as the reason for letting politicians like Trump post false information about voting.

POLITICAL ADS that appeared on Facebook are displayed on a computer screen in New York in 2019. (Richard Drew / Associated Press)
