Los Angeles Times

Death, live online

The sharing of video of the mass shooting demonstrates how tech platforms can be exploited for hate.

- By Sam Dean

The killing of 49 people at two mosques in Christchurch, New Zealand, was engineered to be viewed and shared on the world’s largest technology platforms, taking full advantage of Silicon Valley’s laissez-faire approach to content moderation.

It began with a racist manifesto uploaded to the document-sharing service Scribd and linked on Twitter for anyone to read.

There was a helmet-mounted camera synced to a Facebook Live account that broadcast the attack, and a link to the stream shared among a hate-filled online community.

There was footage, in real time, of the killings served up to an audience that had found the link.

Facebook deleted the user’s account only after local police alerted the company to the active shooting documented on the stream.

But by then, others had already posted the video to YouTube. That site, owned by Google parent company Alphabet, has scrambled to delete new uploads of the video. Twitter has said it’s doing the same.

Soon, the clips circulated on Reddit, a sprawling online message board service owned by Conde Nast parent company Advance Publications. Reddit removed message boards titled WatchPeopleDie and Gore, which showcased the video along with other clips of people being killed or injured. Those message boards had been operating on the site for the last seven and nine years, respectively.

Hours after the attack, users were posting links to the original livestream in the YouTube comments below mainstream news organizations’ coverage of the shooting.

On one account, a user who self-identified as a 15-year-old spoke over a black screen, saying that the platform wouldn’t allow him to post the footage directly to the site, but a link to the video was in the description.

The link led to the full 17-minute livestream of the mass shooting. It was hosted on Google Drive.

The unfiltering of the world was long hailed as a utopian goal of the internet, a way to dismantle the gates guarded by the bureaucracies of print and broadcast media. Blogs covering niche news and catering to underserved communities could proliferate. Amateur brilliance that would never be allowed to air on even the smallest cable channels could be seen by millions. Dissidents could share information that would otherwise be censored.

But that vision overlooked the toxic spores that the gatekeeper­s had kept at bay.

The United Nations has implicated Facebook in fanning the flames of hate against Rohingya Muslims in Myanmar, who were subject to an ethnic cleansing campaign by the country’s military. YouTube has allowed child pornography and exploitation videos to reach millions, and its recommendation algorithms have been singled out as promoting violent white supremacy by suggesting increasingly radical channels to viewers.

Twitter is infamous for its coordinated harassment campaigns, often inspired by virulent misogyny and bigotry.

“There are so few incentives for these platforms to act in a way that’s responsible,” said Mary Anne Franks, a law professor at the University of Miami and president of the Cyber Civil Rights Initiative, which advocates for legislation to address online abuse. “We’ve allowed companies like Facebook to escape categorization, saying ‘we’re not a media company or entertainment company,’ and allowed them to escape regulation.”

In response, the tech giants have called it impossible to vet the billions of hours of content that pass through their platforms, even with the efforts of employees and contractors hired to sift out the worst posts flagged by users or automatic detection systems.

People who share the footage of the Christchurch shootings “are likely to be committing an offense” since “the video is likely to be objectionable content under New Zealand law,” the New Zealand Department of Internal Affairs said in a statement Friday.

“The content of the video is disturbing and will be harmful for people to see. This is a very real tragedy with real victims and we strongly encourage people to not share or view the video.”

But the tech companies that host the footage are largely shielded from legal liability in the U.S. by a 1996 telecommunications law that absolves them of responsibility for content posted on their platforms. The law has empowered the companies, which generate billions in profit each year, by placing the onus of moderation on their users.

“If you have a product that’s potentially dangerous, then it’s your responsibility as an industry to make the appropriate judgment calls before putting it out in the world,” Franks said. “Social media companies have avoided any real confrontation with the fact that their product is toxic and out of control.”

The risks of live broadcasting have been present since the invention of radio, and the media has developed safeguards in response. It was illegal for radio shows to allow live callers to be broadcast on air until 1952, when an Allentown, Pa., station got around the law by inventing a tape delay system that allowed for some editorial control.

In 1998, a Long Beach man exploited the live broadcast model to get his message out by parking at the interchange of the 110 and 105 freeways and pointing a shotgun at passing cars.

Alarmed drivers called the police — soon, L.A.’s car chase choppers were on the scene, reporting live.

He fired shots to keep the police at a distance and unfurled a banner on the pavement: “HMO’s are in it for the money!! Live free, love safe or die.”

Then, to conclude what The Times called “one of the most graphic and bizarre events ever to unfold on a Los Angeles freeway,” he detonated a Molotov cocktail in the cab of his truck and shot himself in the head on live TV.

In response to public outcry over the grisly footage, which in some cases had interrupted afternoon cartoons, TV networks began instituting tape delays more broadly in live coverage and approaching situations involving visibly disturbed subjects with more caution.

The tape delay system isn’t perfect. In 2012, a glitch in the system led Fox News to accidentally broadcast live the suicide of a man fleeing from the police. “That didn’t belong on TV,” anchor Shepard Smith said in an apology to viewers. “We took every precaution we knew how to take to keep that from being on TV, and I personally apologize to you that that happened.... I’m sorry.”

When Facebook Live streaming launched in 2016, Facebook Chief Executive Mark Zuckerberg laid out a different mindset behind the feature to reporters at BuzzFeed News.

“Because it’s live, there is no way it can be curated,” he said. “And because of that, it frees people up to be themselves. It’s live; it can’t possibly be perfectly planned out ahead of time. Somewhat counterintuitively, it’s a great medium for sharing raw and visceral content.”

MOURNERS in Christchurch, New Zealand, lay flowers at a memorial to the 49 people killed at two mosques. Video of the shootings was spread on social media. (Fiona Goodall / Getty Images)
