EuroNews (English)

Meta's manipulated media policy is 'incoherent and confusing', oversight board says


An oversight board is criticising Facebook owner Meta's policies regarding manipulated media as “incoherent and confusing to users,” and insufficient to address the flood of online disinformation that already has begun to target elections across the globe this year.

The quasi-independent board on Monday said its review of an altered video of President Joe Biden that spread on Facebook exposed gaps in the policy.

The board said Meta should expand the policy to focus not only on videos generated with artificial intelligence (AI) but on media regardless of how it was created. That includes fake audio recordings, which already have convincingly impersonated political candidates in the US and elsewhere.

The company also should clarify the harms it is trying to prevent and should label images, videos and audio clips as manipulated instead of removing the posts altogether, the Meta Oversight Board said.

The board’s feedback reflects the intense scrutiny facing many tech companies over their handling of election falsehoods in a year when voters in more than 50 countries will go to the polls.

As both generative artificial intelligence deepfakes and lower-quality “cheap fakes” on social media threaten to mislead voters, the platforms are trying to catch up and respond to false posts while protecting users’ rights to free speech.


“As it stands, the policy makes little sense,” Oversight Board co-chair Michael McConnell said of Meta’s policy in a statement on Monday. He said the company should close gaps in the policy while ensuring political speech is “unwaveringly protected.”

Meta said it is reviewing the Oversight Board's guidance and will respond publicly to the recommendations within 60 days.

Spokesperson Corey Chambliss said while audio deepfakes aren't mentioned in the company's manipulated media policy, they are eligible to be fact-checked and will be labelled or down-ranked if fact-checkers rate them as false or altered. The company also takes action against any type of content if it violates Facebook's Community Standards, he said.

Facebook, which turned 20 this week, remains the most popular social media site for Americans to get their news, according to Pew. But other social media sites, among them Meta’s Instagram, WhatsApp and Threads, as well as X, YouTube and TikTok, also are potential hubs where deceptive media can spread and fool voters.

Meta created its oversight board in 2020 to serve as a referee for content on its platforms. Its current recommendations come after it reviewed an altered clip of President Biden and his adult granddaughter that was misleading but didn’t violate the company’s specific policies.

The original footage showed Biden placing an “I Voted” sticker high on his granddaughter’s chest, at her instruction, then kissing her on the cheek. The version that appeared on Facebook was altered to remove the important context, making it seem as if he touched her inappropriately.


The board’s ruling on Monday upheld Meta’s 2023 decision to leave the seven-second clip up on Facebook since it didn’t violate the company’s existing manipulate­d media policy.

Meta's current policy says it will remove videos created using AI tools that misrepresent someone's speech.

“Since the video in this post was not altered using AI and it shows President Biden doing something he did not do (not something he didn’t say), it does not violate the existing policy,” the ruling read.

The board advised the company to update the policy and label similar videos as manipulated in the future. It argued that to protect users’ rights to freedom of expression, Meta should label content as manipulated rather than removing it from the platform if it doesn’t violate any other policies.

The board also noted that some forms of manipulated media are made for humour, parody or satire and should be protected. Instead of focusing on how a distorted image, video or audio clip was created, the company’s policy should focus on the harm manipulated posts can cause, such as disrupting the election process, the ruling said.

Meta said on its website that it welcomes the Oversight Board’s ruling on the Biden post and will update the post after reviewing the board’s recommendations.

Meta is required to heed the Oversight Board’s rulings on specific content decisions, though it’s under no obligation to follow the board’s broader recommenda­tions. Still, the board has gotten the company to make some changes over the years, including making messages to users who violate its policies more specific to explain to them what they did wrong.

Jen Golbeck, a professor at the University of Maryland's College of Information Studies, said Meta is big enough to be a leader in labelling manipulated content, but follow-through is just as important as changing policy.

“Will they implement those changes and then enforce them in the face of political pressure from the people who want to do bad things? That’s the real question,” she said.

“If they do make those changes and don’t enforce them, it kind of further contributes to this destruction of trust that comes with misinformation.”

Facebook CEO Mark Zuckerberg testifies before a House Financial Services Committee hearing on Capitol Hill in Washington
