Facebook's high-stakes dilemma over suicide videos

Suicide prevention experts say there are best practices social networks should follow to minimise harm.

The Star Malaysia - Star2 - Health - By QUEENIE WONG

AMANDA Hebert felt powerless as she watched a Facebook video of her 32-year-old friend taking her own life.

In the 12-minute video, a police officer begged Hebert's friend to think of her two daughters and to let him help. Hebert called her friend, who streamed the suicide online, only to see her phone calls ignored in the video.

For Hebert, the pain from her friend's death didn't end there.

Despite reports to Menlo Park-based Facebook by friends and the Anne Arundel County police in Maryland, she said the tech firm took at least six hours to pull down the video.

It made its way to another website, where it now has hundreds of thousands of views.

"It's literally still out there haunting her friends and family," Hebert said.

Social media companies such as Facebook and Twitter's Periscope have made videos simpler for people to share online, but now these companies are in a race against time to respond quickly to posts depicting self-harm – before they go viral.

Balancing the risks of suicide contagion with free speech, newsworthiness and other factors, these companies' complex decisions to leave a video up or pull it down can mean the difference between life and death for people attempting suicide.

Sometimes, leaving a video up can allow family and friends to reach out to the person or call law enforcement for help.

"It's a hard place for these companies to be, to make decisions about what they're going to allow and what they're not going to allow, because it becomes a slippery slope quickly," said Daniel Reidenberg, executive director of Suicide Awareness Voices of Education.

Suicide is the 10th leading cause of death in the United States, according to the Centers for Disease Control and Prevention (CDC), and as more people share their lives – and in some cases, their deaths – online, tech firms are playing a larger role in efforts to prevent self-harm.

Facebook, the world's largest social network with nearly two billion users, rolled out suicide-prevention tools to provide help-line information and resources to those in distress.

Occasionally, Facebook and its users have successfully intervened.

In May, a Georgia teenager who attempted suicide on Facebook Live survived after a friend and Facebook itself reported the video to law enforcement.

Sheriff's deputies scrambled to find the right address for the teen and to confirm that the video, which could only be viewed by the teenager's Facebook friends, wasn't a prank.

"The paramedics were able to render aid to her, and she was transported to the hospital," said Bibb County sheriff's Sgt. Linda Howard. "It happened so fast, but it took us 42 minutes to find out where she was and get to her location."

Yet when someone dies in a suicide streamed on Facebook, some say the company needs to pull down those videos faster.

After her friend's suicide, Hebert said internet trolls posted hurtful comments on the dead woman's Facebook page, which is no longer online.

And at least two users posted the video on another website, where it now has more than 202,100 views and has been shared more than 650 times.

Helen Alexander, an acquaintance of the victim, said she reported the video to Facebook, but it already had 10,000 views on the site by the time the tech firm removed the footage.

Facebook said it removed a video of the woman's suicide posted by another user from another source, but has no record of the woman livestreaming her own death.

"You can't stop people from doing whatever they're going to do with livestreaming, but as a platform, you can govern what happens to that stream or video once it's been reported to you," Alexander said.

She and Hebert shared an online petition that called on the White House to make it illegal to share live videos of suicides.

While Facebook may leave up videos if they provide a lifeline to the person in distress, the company often takes them down afterward amid concerns about the impact on survivors and copycat suicides.

Facebook has online rules against promoting or encouraging suicide or self-injury, but the tech firm also started allowing more content that people find newsworthy even if it violates the company's standards.

For example, the social media giant said it left up a video of an Egyptian man who set himself on fire to protest rising government prices because it believed his act was newsworthy.

"It's hard to judge the intent behind one post, or the risk implied in another," wrote Monika Bickert, Facebook's head of global policy management, in a recent blog post. "Someone posts a graphic video of a terrorist attack. Will it inspire people to emulate the violence, or speak out against it? Someone posts a joke about suicide. Are they just being themselves, or is it a cry for help?"

But suicide-prevention experts say there are best practices social networks should follow to minimise harm.

"There are things that you don't want to have happen, and those include glorifying suicide, graphically showing how people take their lives and providing people with a blueprint for how they might take their life," said Vic Ojakian, the National Alliance on Mental Illness Santa Clara County board president, whose son died by suicide.

Facebook and other tech firms haven't said how many suicides or attempts have been broadcast through their live-video tools, but experts believe it's a small fraction of suicides worldwide.

Nearly 800,000 people worldwide take their own lives every year, and suicide was the second leading cause of death among 15- to 29-year-olds in 2015, according to the World Health Organization (WHO).

Facebook recently announced it was hiring 3,000 more workers to help review posts that are flagged for violating its online rules, including videos that promote suicide and violence.

In this increasingly digital world, experts say that anyone – not just mental health officials and crisis hotlines – can help save someone's life.

"Ultimately in my mind, suicide happens when pain outweighs hope," said Stan Collins, a suicide-prevention specialist for Know the Signs, a suicide-prevention marketing campaign by the California Mental Health Services Authority. "So the solution is how can we continue to keep hope and help convince people of the reasons for living?" – The Mercury News (San Jose, California)/Tribune News Service


Jiranuch Trirat (left), 22, is comforted by friends as she looks at a photograph of her 11-month-old daughter Natalie during the last funeral rites at a temple in Phuket on April 29, 2017. Tearful relatives gathered outside a Thai temple to bury the 11-month-old girl murdered by her father in a harrowing video he broadcast live on Facebook before committing suicide. — AFP
