Facebook tool that spots suicide risk blocked by EU data rules
EUROPEAN data laws have prevented Facebook from introducing a tool designed to spot users at risk of suicide.
The social media giant has announced that it will use artificial intelligence to spot posts and video comments suggesting that someone is having suicidal thoughts.
However, data protection laws across the EU, which ban processing of an individual’s sensitive personal data without their explicit permission, mean that the update will not make it to member countries.
Facebook already allows users to report posts when they think the person is at risk, at which point a moderator is alerted and can choose to send the person information about helplines or the option to talk to a friend. However, the company says many red flags go unreported.
Mark Zuckerberg, Facebook’s boss, said the company has now started to introduce “proactive detection”, which automatically looks for trigger phrases in posts or comments on videos such as “are you OK?” and “can I help?”.
After testing the tool in the US, Facebook said it would introduce it worldwide except in the EU, where, a spokesman said, it was “a sensitive issue”.
Sensitive data, such as political opinions, health information and religious beliefs, are subject to strict rules under the EU’s Data Protection Directive and the forthcoming General Data Protection Regulation.
Britain’s data protection laws are expected to remain in line with European rules after Brexit.
Ashley Winton, a data protection lawyer at McDermott Will & Emery, said Facebook would have to secure the consent of users in the EU before it could introduce the tool. “Data protection law is tougher in relation to sensitive data,” he said.
Facebook has been forced to change its service in the EU before because of scrutiny from regulators over data privacy concerns. In 2011, it was threatened with legal action in Germany over the facial recognition software used to automatically tag friends in photos, which critics said violated privacy and data protection laws.
Facebook has been criticised following several high-profile cases where acts of violence, including shootings, suicide attempts and self-harming, were posted on the website.
In 2016, Nakia Venant, 14, was found dead in her Miami bedroom after a friend alerted police to a troubling Facebook Live broadcast. Two months ago, Ayhan Uzun, 54, from Turkey, shot himself on the Live feature after his daughter told him she was engaged.