Google searches ‘reveal names of women in rape and sex trials’
GOOGLE could be helping users expose the identity of rape and sex crime victims whose anonymity is protected by law.
The web giant is facing fury after it emerged that its ‘related search’ and ‘autocomplete’ features were flagging up the names of female complainants in recent high-profile trials.
The evidence has been described as ‘beyond shocking’ amid fears that online bullies could exploit the information.
The features are designed to help users quickly find the content they are looking for and to suggest related web searches which may be of interest.
But typing in a defendant’s name plus a place name or other common term would reveal victims’ names in a number of rape cases. This is because Google’s algorithms log popular searches and keep a record of names.
The search engine is then able to suggest victims’ names to users looking up details of a case, often after the women have already been illegally identified on social media, The Times reported.
Lifelong anonymity is given to complainants and victims of sexual offences, even if the accused is acquitted. Breaching the law is a criminal offence and can lead to a fine of up to £5,000.
Fay Maxted, chief executive of The Survivors Trust, said it was ‘beyond shocking that Google is facilitating access to the names of victims’.
Maria Miller MP, chairman of the Commons Women and Equalities Committee, added: ‘Google has to operate within the law of the UK... if that means they have to change how their search engine operates, then so be it.’
Professor Alan Woodward, a computing scientist at the University of Surrey, said: ‘Convenience can sometimes be the enemy of security and privacy. This is a case of unintended consequences.’
Google’s policy documents on the autocomplete function state it ‘removes predictions that contain sexually explicit or vulgar language’. The web giant also claims to prohibit ‘dangerous and harmful activity’ and ‘hateful predictions’ as well as ‘sexually explicit predictions’.
Users can also report illegal content and Google removes terms ‘in response to valid legal requests’.
It insisted offending results are taken down quickly if something illegal is highlighted. Google communications manager Hannah Glenny said: ‘We don’t allow these kinds of autocomplete predictions or related searches that violate laws or our own policies and we have removed the examples we’ve been made aware of in this case.
‘We recently expanded our removals policy to cover predictions which disparage victims of violence and atrocities.
‘We encourage people to send us feedback about any sensitive or bad predictions.’
In 2012, nine people who illegally named a woman who alleged she had been raped by footballer Ched Evans were fined hundreds of pounds each.
The group, which included a female teacher, posted messages on Twitter and Facebook accusing her of being a ‘money-grabbing slut’ who had made up the attack. Mr Evans was acquitted after a retrial in 2016.
In 2013, Peaches Geldof posted the names of two mothers convicted of allowing their children to be abused by singer Ian Watkins, but deleted the post after four hours.
Google has previously been accused of acting irresponsibly, particularly over its YouTube video-streaming service.
Last month, a Mail investigation revealed several videos of gangs promoting stabbing alongside adverts for brands such as BT.
Following a spate of attacks, Google said last year it would put more resources into identifying videos which spread hate.
This week, it received court notices in India for disclosing the identity of an eight-year-old girl who was raped and murdered.