The Mercury News

Facebook says shooting video watched before being reported

Firm says it didn’t learn of stream until after 4,000 views

By Levi Sumagaysay, lsumagaysay@bayareanewsgroup.com

As tech giants face growing criticism for their part in the dissemination of the video of the deadly Christchurch terrorist attack in New Zealand last week, Facebook said nobody reported the livestream of the shooting as it was happening.

Although Facebook said the video was viewed live fewer than 200 times, it had been viewed more than 4,000 times in total before the company removed it, General Counsel Chris Sonderby said in an update this week.

Facebook removed the video within an hour of the shootings, minutes after being alerted to it by New Zealand authorities, the company’s vice president for global policy, Monika Bickert, told the New Zealand Herald. She noted that Facebook’s much-touted artificial intelligence technology did not detect the video; the company says the system does not have enough data about previous shootings to flag such footage automatically.

“People may well watch something in horror and still not report it,” said Irina Raicu, director of the Internet Ethics Program at the Markkula Center for Applied Ethics at Santa Clara University, on Tuesday. “Reports suggest that there was a preplanned, coordinated effort to modify the video so that it would evade automatic detection; the people who posted such versions are much more of a concern than those who just watched but didn’t report it.”

A link to a copy of the video was posted on the message board 8chan before Facebook could take it down, Sonderby wrote in a blog post. The video also spread to other platforms, such as YouTube, the world’s largest video-sharing site.

Fifty people were killed after a gunman opened fire at two mosques in New Zealand on Friday. The suspect in custody is a 28-year-old Australian man who is believed to have posted a 74-page right-wing manifesto online in which he railed against Muslims and immigrants and talked about wanting to preserve the white race.

Some other numbers Facebook provided:

• The first user reported the original video 29 minutes after it started, and 12 minutes after the live broadcast ended.

• In the first 24 hours, the company removed about 1.5 million videos of the attack. Facebook blocked more than 1.2 million of those videos at upload, before anyone could see them.

• Facebook and other companies have added more than 800 “visually distinct” videos related to the attack to a collective database maintained by the Global Internet Forum to Counter Terrorism.

Meanwhile, YouTube also has been struggling to take down the videos, which at one point in the hours after the attacks reportedly were being uploaded to its site as quickly as one per second. Other sites such as Twitter and Reddit said last week they were removing the videos and related content.

As tech giants are urged to take further action amid reports that the suspect was steeped in online culture and designed his attacks for maximum viral impact, they are facing boycotts in New Zealand, and some advertisers have pulled their ads, according to media reports. In addition, some internet service providers in that country are blocking traffic to sites, such as 8chan and 4chan, that are failing to remove uploads of the shooter’s original video.

Eric Goldman, a Santa Clara University School of Law professor and director of the school’s High Tech Law Institute, said Tuesday that the tech companies are being unfairly singled out.

He pointed out that some news outlets aired parts of the video because the attacks were newsworthy.

“There’s no question that the shooting was news,” he said. “If the media could’ve had cameras there, they might have (kept them rolling).”

“It’s so tempting to point to (tech companies) and say they should be doing more,” Goldman continued. “But I don’t think they can win that battle. It’s a combination of too many uploads and too many variations.”
