- The identities of vulnerable abuse victims can be revealed in Google searches
- Police and courts have been urged to tell Google when anonymity is breached
- A Tory MP said Google had to obey UK law even if that meant changing searches
Google has been accused of letting users look up the identity of rape victims.
Searching online for details about attackers in prominent cases can return the names of their victims or accusers.
The identities of vulnerable defendants who have been granted anonymity by a court may also be revealed.
The problem is caused by Google’s ‘related search’ and ‘autocomplete’ functions, which suggest the names of victims because popular searches for information around those names have been logged.
Maria Miller, who chairs the Commons Women and Equalities Committee, told The Times: ‘Google has to operate within the law of the UK – if that means they have to change how their search engine operates, then so be it.’
Labour MP Jess Phillips said the technology was turning rape and abuse victims into ‘click-bait’ and a rape charity said that fear of exposure could stop victims coming forward. Fay Maxted of the Survivors Trust said it was ‘beyond shocking that Google is facilitating access to the names of victims’.
Police and the courts have been urged to inform the technology giant in cases where a victim’s anonymity is breached.
Lifelong anonymity is granted to complainants and victims of sexual offences, even if the accused is acquitted.
Breaching these orders is a criminal offence punishable by a fine of up to £5,000. Nine people have been convicted in recent years for posting victims’ names on social media.
A Google spokesman said: ‘We don’t allow these kinds of autocomplete predictions or related searches that violate laws or our own policies and we have removed the examples we’ve been made aware of in this case.
‘We recently expanded our removals policy to cover predictions which disparage victims of violence and atrocities, and we encourage people to send us feedback about any sensitive or bad prediction.’