The Guardian (USA)

YouTube viewers to help uncover how users are sent to harmful videos

- Alex Hern

YouTube viewers are being asked to become “watchdogs” and record their use of the site to help uncover the ways in which its recommendation algorithm can lead to online radicalisation.

Mozilla, the non-profit behind the Firefox web browser, has produced a new browser extension, called RegretsReporter, which will allow YouTube users to record and upload information about harmful videos recommended by the site, as well as the route they took to get there.

“For years, people have raised the alarm about YouTube recommending conspiracy theories, misinformation, and other harmful content,” said Ashley Boyd, Mozilla’s head of engagement and advocacy. “One of YouTube’s most consistent responses is to say that they are making progress on this and have reduced harmful recommendations by 70%. But there is no way to verify those claims or understand where YouTube still has work to do.

“That’s why we’re recruiting YouTube users to become YouTube watchdogs. People can donate their own recommendation data to help us understand what YouTube is recommending, and help us gain insight into how to make recommendation engines at large more trustworthy.”

More than two years ago, the Guardian revealed how YouTube’s recommendation algorithm, the workings of which are not fully understood outside the company, gave “dangerously skewed” video suggestions.

In the years since, users have repeatedly expressed surprise and dismay at the harmful content they are encouraged to view. After watching a YouTube video about Vikings, one user said they were recommended content about white supremacy; another found that You’ve Been Framed-style footage led to grisly clips from real-life fatal accidents.

“More than 70% of all videos viewed on YouTube are suggested by the site’s recommendation engine,” Mozilla said. “But even the basics of how it works are poorly detailed.” The organisation says it wants to research questions such as what types of recommended videos lead to racist, violent, or conspiratorial content, and whether there are specific YouTube usage patterns that lead to harmful content being recommended. It says it will share findings from the research in an open-source fashion.

The campaign is a high-stakes move for Mozilla. The organisation gains most of its revenue from a deal with Google under which Google Search is set as the default search engine on the Firefox browser, and though it has been trying to diversify its income streams, a revenue squeeze in January this year led to 70 staff members being laid off.

In 2018 the Guardian revealed how YouTube’s recommendation algorithm offered ‘dangerously skewed’ video suggestions. Photograph: Nicolas Asfouri/AFP/Getty Images
