The Mercury News Weekend

Lifting the curtain on Facebook news feeds

- COLUMNIST

Poor Facebook. Always caught between the proverbial rock and a hard place over whether something on its site is unacceptably graphic or truly newsworthy.

Should it censor and accept that it will be routinely ridiculed for its missteps, such as blocking breast cancer ads, taking down live video of a death or suspending accounts that post iconic images from history such as the searing Napalm Girl photo?

Or does it abandon efforts to control content and become a free-for-all that’s NSFW (not safe for work), awash in graphic images, fake stories and a river of hate and abuse you can still find on Twitter, despite that company’s efforts to clamp down?

Until recently, Facebook has struck a strict PG-13 balance, arguing that it has to take into account widely different community standards all over the world. Like Elastigirl, the mother in “The Incredibles,” Facebook tries to be everything to everyone, stretching its arms so that a safe space for people in Ulan Bator works in San Francisco as well.

But last week, the company said it was breaking from that modus operandi, announcing that it would begin showing “items that people find newsworthy, significant or important to the public interest — even if they might otherwise violate our standards.”

It’s about time Facebook loosens up and grows up. For too long, the social networking firm has held that it’s not a media firm but a tech company. It’s acted as if its 1.7 billion users don’t really need to know what goes on behind the curtain and how editorial decisions are made.

Since its latest announcement, I’ve braced myself to be offended and outraged by my Facebook feed. But so far, it’s the same blend of upbeat personal updates and same batch of news postings. The company responded to my request for more information about its plan by pointing to its announcement, which says “We will work with our community and partners to explore exactly how to do this.”

So we still don’t really know what’s behind the curtain.

One news decision I’d like to know more about: Mark Zuckerberg, the firm’s CEO, allowed specific postings from Donald Trump, the GOP presidential nominee, that called for Muslims to be blocked from entering the country.

Facebook employees reportedly complained that the same postings by a non-VIP would be blocked under the company’s current rules. But Zuckerberg, wearing his new editor-in-chief hat, made the right call and allowed the very newsworthy Trump posts to stay up.

It doesn’t really matter what Facebook calls itself — media firm, tech company or advertising mega operation. It’s also into messaging, virtual reality, food delivery and so much more. What business doesn’t Facebook want to be in? Corporate communication? That’s part of its plan, too, in the recently unveiled Workplace by Facebook, a social network for business.

But at this stage, the company has a special responsibility as a media conduit, said Elaine Monaghan, a professor of practice at the Media School at Indiana University Bloomington.

“It should be a transparent decision-making process driven by news values and not the bottom line,” she said.

Daniel Castro, vice president of the Information Technology and Innovation Foundation, a think tank, pointed to Facebook executives meeting over the summer with conservative media sites as a way to “create more trust in what they are doing.”

Here are some other ideas for Facebook to act responsibly in its media role:

News ombudsperson

Facebook should appoint a person from the media world to regularly look into issues that crop up on the social media site and report back to users. How are decisions made? What are the various concerns the company is balancing? Did it get the decision right or wrong? Many users, I’m sure, would be sympathetic to how hard it is for Facebook to manage all it does. But it also would be a huge help if there were someone truly independent reporting back to them.

An ombudsperson might work, said Craig Aaron, president and CEO of Free Press, the consumer advocacy group, but it depends how Facebook structures the role. “Would they have access to the folks at the top?” he said. “Could you find someone who could both talk to the engineers and organize the users? You need a public editor, a public advocate, and a lot of watchdogs out in the Facebook-user community.”

Offensive content

Facebook could drop a visual flag — “Graphic but newsworthy” — in front of content, including live video, that its algorithms think might upset or offend. TechCrunch suggests this might work on live streaming video that Facebook’s algorithms somehow identify as potentially offensive — until someone at the company can review it.

User control

One very Facebook-y solution would be to give users even more control of the content they want to see. If you want Facebook to be a PG-13 experience, that’s fine. But there should be a way to self-identify. Likewise children (yes, they shouldn’t be on Facebook, but they are) could be given a level of protection from some of the most graphic images out there.

Like I said, Facebook’s position isn’t easy. But it’s time for this information-sharing behemoth to really step up, open the kimono and talk about its decisions in a credible, concrete way. The more we understand its decisions — even if we don’t agree — the better off we will be.

MICHELLE QUINN
