Artificial intelligence will detect child abuse images

Artificial intelligence could soon take over from police officers the traumatic task of trawling through images of child abuse.

A pilot scheme will see a machine-learning system taught to grade the severity of the disturbing photos and footage, sparing detectives the distressing task.

If successful, the trial could go into full service ‘within two to three years’, according to the force behind its development.



The system is being created by the Metropolitan Police’s digital forensics department.

Last year, officers from the Met searched through 53,000 gadgets for evidence of indecent images.

It already employs image-recognition software, according to reports in The Telegraph, but this is not yet advanced enough to spot indecent images and video.

The new approach will rely on cloud computing, with data moved to online storage from a big-name provider such as Amazon Web Services, Google or Microsoft.

The Met currently uses a London-based server storage centre, but this is under increasing strain from the volume of images uncovered, as well as from rising image resolutions.

Using cloud storage would expand the force’s data capacity and let it make use of the advanced analytics services provided by the Silicon Valley firms.

Speaking to The Telegraph, Mark Stokes, the Met’s head of digital and electronics forensics, said: ‘We have to grade indecent images for different sentencing, and that has to be done by human beings right now, but machine learning takes that away from humans.

‘You can imagine that doing that for year-on-year is very disturbing.’
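The grading Mr Stokes describes is, in machine-learning terms, a multi-class image-classification problem. The Met’s actual system is not public, so the sketch below is purely illustrative: a toy nearest-centroid classifier over stand-in feature vectors, with the A/B/C labels borrowed from the UK indecent-image sentencing categories. Every function name, feature and number here is an assumption, not a description of the real system.

```python
# Purely illustrative sketch: severity grading as multi-class
# classification. A real system would train a deep network on vetted,
# lawfully held data; here hand-made 2-D vectors stand in for image
# features, and a nearest-centroid rule stands in for the model.
from statistics import fmean

CATEGORIES = ["A", "B", "C"]  # UK sentencing categories, A = most severe

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    return [fmean(col) for col in zip(*vectors)]

def train(labelled):
    """labelled: {category: [feature_vector, ...]} -> one centroid per category."""
    return {cat: centroid(vecs) for cat, vecs in labelled.items()}

def grade(model, features):
    """Assign the category whose centroid is nearest (squared distance)."""
    def dist(cat):
        return sum((a - b) ** 2 for a, b in zip(features, model[cat]))
    return min(model, key=dist)

# Toy training data: two example "images" per category.
model = train({
    "A": [[9.0, 1.0], [8.5, 1.2]],
    "B": [[5.0, 5.0], [5.2, 4.8]],
    "C": [[1.0, 9.0], [1.1, 8.8]],
})
```

In a deployed system the point of automating this step is exactly what the quote describes: the model, not a human, assigns the sentencing grade, with officers reviewing only contested or borderline outputs.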

With the help of cloud storage, AI could be trained to detect abusive images ‘within two to three years’, he added.

The approach is not without its drawbacks, however, including the legal ramifications of uploading such sensitive information.


Police are granted legal permission from the courts to store criminal images, and this protection would not apply to any cloud storage service providers.

This could present a legal risk to any company that agrees to work with the Met on the project.

There is also the potential for material to be leaked into the public domain.

Cloud storage services have been plagued by a number of high-profile hacks.

That includes the Apple data breach in 2014 that saw nude images of a number of celebrities stolen from their personal accounts and posted online.

Mr Stokes said cloud providers have offered some solutions to safeguard any data uploaded. 



Read more at DailyMail.co.uk