How YouTube decides what is banned and what is not

Antelope Valley Press - BUSINESS

SAN FRANCISCO (AP) — Matt Halprin, the global head of trust and safety for YouTube, has a tough job: He oversees the teams that decide what is allowed and what should be prohibited on YouTube.

The Google-owned site has come under fire recently for allowing videos that feature what many find offensive or violent, and for not doing enough to protect kids online. Halprin has to make difficult decisions to craft policies that keep the site as safe as YouTube wants it to be, while balancing what the company considers one of its core tenets: people's free speech.

The Associated Press spoke recently with Halprin about how his team works. Questions and answers have been edited for length and clarity.

Q: How does your team operate?

A: They're separated into policy development and policy incubation. Policy development starts with the highest level of principles: We are an open platform. We do have a bias to allow freedom of expression on our platform and only remove content that we think is egregious and could cause real harm. We want to be a place where a variety of perspectives can be heard, and sometimes that even means things that people disagree with or are even offended by.

We kicked off a process a couple of years ago to essentially re-review all of our policies. We look at which policies seem most out of kilter with what our enforcement teams are telling us (the gray-area cases), or which policies regulators are talking about or the press is asking about. As an example, in Q2 (June) we relaunched our hate speech policy.

Q: What does the process look like to make or change a policy?

A: The team first does the research and puts together the framework and essentially a proposal. Once it gets through me, we bring in our cross-functional partners: people in public policy, public relations, product and legal. We often get sent back to the drawing board on a few issues. Then we go to an executive steering review, which is chaired by our chief product officer. The fourth and final step is the top executives. We have these meetings every single week.

As we go through this process, these guys are watching a ton of video examples.

Q: How do you think about balancing free expression with safety?

A: That is probably the toughest thing that we do. There is not a right answer. Not all of us agree. One person will think that, "Hey, we should have more civility. We shouldn't let something like this come up." And another person will say, "Yeah, but if you get rid of that uncivil comment, you lose some really valuable, you know, free expression or political discourse."

And so we have seriously huge debates about this. Sometimes we think that if we are not criticized by all sides for the policy, we've probably done something wrong. If you're only upsetting one side, then you probably haven't gotten it right.

Q: How do you ensure that things aren't slipping through the cracks when it comes to enforcement?

A: We've always had community guidelines, and that's what defines our rules. We measure how much exposure occurs on content that we think goes against those guidelines. And that's going down. For every workflow, for every policy, I regularly get a measure of how accurate our reviewers have been.
