Microsoft made ‘significant changes’ after exposé revealed its Bing search engine auto-filled suggestions guiding users towards child abuse websites, inquiry is told
- A report carried out by AntiToxin led to Microsoft’s ‘significant change’
- Seemingly innocent searches would be auto-completed by Bing search engine
- The search tool added terms bringing up extensive amounts of illegal images
An exposé showing indecent images of children on Microsoft’s Bing search engine prompted the software giant to make a ‘significant change’, an inquiry has heard.
‘The ability of campaigning organisations, NGOs, journalists, to hold us to account over this kind of thing is incredibly important’, said Hugh Milward, Microsoft UK director of corporate, external and legal affairs.
He was giving evidence to the Independent Inquiry Into Child Sexual Abuse (IICSA) on Wednesday.
Jacqueline Carey, counsel to the inquiry, referenced a report from the AntiToxin group in January which found Bing, the world’s second biggest search engine, was auto-completing suggestions that led to indecent pictures of children.
The Bing search engine was found to auto-complete searches with terms that led to results containing extensive amounts of illegal images
The AntiToxin report, commissioned by TechCrunch, found that when researchers carried out seemingly innocent searches for specific social media sites, the Bing search engine would auto-complete the search, adding terms that brought up extensive amounts of illegal images.
Mr Milward told the inquiry there was ‘some debate how these particular images might classify’ but went on: ‘What was really clear was the results that came up when you searched for them were not good enough, so we needed to act.
‘What this investigation prompted us to do was to fundamentally sit down and rethink the way in which we were devoting engineering attention to the challenge we face here.
‘It’s largely as a result of AntiToxin and the IWF (Internet Watch Foundation) that we’ve worked to make quite a significant change in the way that we’re thinking about the way that search responds to these kinds of queries.’
The IICSA is conducting its second investigation phase, into how the internet is used to facilitate child sexual abuse in England and Wales through acts like grooming, sharing indecent images and live-streaming abuse.
Hugh Milward, Microsoft UK director of corporate, external and legal affairs, called the ability of campaigning organisations, NGOs and journalists to hold the company to account ‘incredibly important’
Tech giants have been accused of inadequately protecting children and failing to effectively stop the spread of child abuse material, the inquiry has heard.
Ms Carey said: ‘Why does it take publicity like this to cause a rethink by Microsoft in terms of child safety?’
Mr Milward replied: ‘I do think this points to the value of having other organisations working with us on this kind of thing.
‘The ability of campaigning organisations, NGOs, journalists, to hold us to account over this kind of thing is incredibly important.
‘It’s that collective action that we’re going to see incremental improvements.
‘The abusers and those who are determined to share and view these kind of images are constantly finding new ways they can mask their tracks or to effectively fool the algorithm when it comes to search.’
Some hearings involving the technology firms are closed sessions and not accessible to the public or press owing to ‘sensitive matters relating to investigatory tactics which could help offenders evade detection if publicly disclosed’, said the IICSA.