Pittsburgh Post-Gazette

Serving the public good

Do social media companies have responsibilities?


Untold amounts of money have been poured into social networking technology as companies like Facebook and Google attempt to leverage their platforms to spread information and, in the process, make enormous profits. Making content “go viral” has become the aim of companies and creators throughout the world.

But the recent massacre at two mosques in Christchurch, New Zealand, which left 50 people dead, exposed a serious flaw in this mindset, one that will not be easily rectified.

The killer in New Zealand live-streamed a first-person video of his rampage. It was not the first time a murder had been broadcast in this manner, but it was certainly the most high-profile and deadly incident of its kind.

Facebook, where the video originated, and YouTube, which is owned by Google, scrambled to scrub the video from their services. Facebook said it erased 1.5 million copies of the video within 24 hours. YouTube said in a statement that the rate at which the video was being uploaded to its platform “was unprecedented both in scale and speed, at times as fast as one video per second.”

But even so, the video was still disseminated to millions of people. Features designed to surface content being consumed en masse brought countless eyeballs to the violent footage. Some analysts fear it could be a catalyst for copycat attacks.

So what went wrong? The algorithms social networking companies use to moderate content are well equipped to flag certain kinds of prohibited material, such as copyrighted images, nudity or hateful language. But these systems are evidently far less capable of recognizing violent imagery. What’s more, users looking to spread the video could tweak simple elements, such as the image’s orientation or color, and the automated content filters would be fooled.
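Why do small tweaks defeat automated filters? Many takedown systems rely, at least in part, on exact fingerprints of known files: hash the bytes, block anything that matches. The sketch below is a toy illustration of that weakness, not a depiction of any platform’s actual system; the “image,” the `fingerprint` helper and the pixel values are all hypothetical.

```python
import hashlib

# A toy 2x3 "image": each row is a tuple of pixel values.
# (A hypothetical stand-in for real video frames.)
image = [(10, 20, 30), (40, 50, 60)]

def fingerprint(img):
    """Exact-match fingerprint: hash the raw pixel bytes."""
    data = bytes(value for row in img for value in row)
    return hashlib.sha256(data).hexdigest()

# Mirror the image horizontally -- a trivial edit for an uploader.
mirrored = [tuple(reversed(row)) for row in image]

# The mirrored copy looks identical to a human viewer,
# but its fingerprint no longer matches the blocklist entry.
print(fingerprint(image) == fingerprint(mirrored))  # prints False
```

Defeating such evasion requires perceptual matching, which scores visual similarity rather than byte equality, and that problem is far harder to solve at the scale and speed the platforms described.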

The spread of the video has caught the attention of Congress. The House Homeland Security Committee has asked Facebook, Microsoft, YouTube and Twitter to explain the matter at a briefing on March 27.

New Zealand Prime Minister Jacinda Ardern also criticized the social media companies. “We cannot simply sit back and accept that these platforms just exist and what is said is not the responsibility of the place where they are published,” Ms. Ardern said. “They are the publisher, not just the postman. There cannot be a case of all profit, no responsibility.”

As social media and other digital platforms grow to a role of outsize influence in our lives, questions about their responsibility to serve the public good are sure to grow. Perhaps no event has put that reality into focus as sharply as the Christchurch shooting. The same programs that divert people’s attention to quizzes and cat videos were leveraged to broadcast the murder of 50 people.

When representatives of the major social networks appear before Congress this week, they must explain how they understand their responsibilities and how they will keep their platforms from being exploited by incidents like the Christchurch massacre. They have a lot to answer for, and Congress should not let them dodge the issue.

At the very least, social media companies must devote more resources to the careful monitoring of graphic content and to stronger controls on material that dehumanizes and degrades.

