Apple reveals it scans iCloud photos to check for child sexual abuse images 

  • Apple’s privacy director confirmed the company uses software to scan images
  • She did not go into detail about how the software works or what tools Apple uses
  • Privacy director Jane Horvath defended Apple’s use of end-to-end encryption 

Apple is scanning photos uploaded to its cloud service for child abuse imagery, according to a senior privacy executive at the tech giant.

Speaking at a panel at CES in Las Vegas, Jane Horvath said Apple uses specialist software to automatically screen iPhone images backed up to iCloud.

She didn’t go into any detail about the type of software used, but said it was being used to ‘help screen for child sexual abuse material’.

The confirmation came during a roundtable debate on privacy issues and whether legislation is required to protect users’ personal information.

Ms Horvath was defending Apple’s use of encryption on all of its devices to protect users’ private information – including health and financial data.

She said other solutions, such as software to detect signs of child abuse, were needed rather than opening ‘back doors’ into encryption as suggested by some law enforcement organisations and governments. 

‘Our phones are small and they are going to get lost and stolen’, said Ms Horvath.

‘If we are going to be able to rely on having health and finance data on devices then we need to make sure that if you misplace the device you are not losing sensitive information.’

She said that while encryption is vital to people’s security and privacy, child abuse and terrorist material was ‘abhorrent’. 

‘We are very dedicated and none of us want that kind of material on our platforms but building a back door into encryption is not the way we are going to solve those problems’, she said.

Erin Egan from Facebook said the company agrees on the vital importance of end-to-end encryption – noting that all WhatsApp messages are encrypted – but that it is equally important to get it right on abusive content.

‘How do we balance that? One of the things we are working on is reporting mechanisms, detection ahead of time and looking for signals in encrypted content.’

The panel discussion was the first time an Apple executive had appeared at CES in years, with Horvath, its senior director of global privacy, representing the firm.

Pictured, L-R: the all-female privacy panel at CES in Las Vegas – Facebook’s chief privacy officer for policy, Erin Egan; Apple’s senior director for global privacy, Jane Horvath; Procter & Gamble’s global privacy officer, Susan Shook; and Federal Trade Commissioner Rebecca Slaughter

Her counterpart from Facebook, Erin Egan, claimed the firm’s Privacy Checkup tool is a good example of how the company is making advances in protecting its users.

An Apple spokesman told the Telegraph the company has a privacy statement on its website addressing the issue of scanning images.

‘Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space’, they said.

‘As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation.’

While Apple didn’t go into detail on the software it uses, many companies use a technology developed by Microsoft called PhotoDNA. 

The software checks images against a database of previously identified images using ‘hashing’: it never sees the image itself, only a digital fingerprint derived from the image’s data.

It is used by Facebook, Google and Twitter. 
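Neither Apple nor Microsoft publishes the internals of these systems, but the underlying idea of hash-based matching can be sketched in a few lines. The Python example below is a minimal illustration using a simple ‘average hash’ as a stand-in for PhotoDNA’s proprietary fingerprint; the known-hash database, function names and matching threshold are assumptions made for illustration, not Apple’s or Microsoft’s actual code.

```python
# Minimal sketch of hash-based image matching (NOT PhotoDNA itself).
# Assumes the Pillow imaging library; the known-hash database and the
# threshold below are hypothetical illustrations.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to an 8x8 greyscale grid and encode each pixel
    as one bit: 1 if brighter than the grid's mean, else 0."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")


def matches_known(path: str, known_hashes: set[int], threshold: int = 5) -> bool:
    """Flag an image whose fingerprint is within `threshold` bits of any
    previously identified hash -- the comparison never needs the original
    image, only its fingerprint."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)


# Hypothetical usage: build a database from known images, then screen an upload.
# known = {average_hash("known_flagged.jpg")}
# print(matches_known("upload.jpg", known))
```

PhotoDNA’s real fingerprint is far more robust to resizing, cropping and recolouring than this toy hash, but the matching step works on the same principle: fingerprints are compared, never the photos themselves.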

APPLE’S SIRI PRIVACY SCANDAL

Apple CEO Tim Cook has repeatedly claimed privacy is a fundamental human right. 

As a result, the firm takes steps to protect users that other firms do not, and has experienced fewer privacy scandals than peers Google and Facebook.

But despite its best efforts, there have been some questionable events. 

In August 2019 it was revealed Apple had human contractors listen to Siri recordings to help improve its service.

Apple later formally apologised, saying in a statement: ‘We realise we haven’t been fully living up to our high ideals, and for that we apologise.’

Google, Amazon, Facebook and Microsoft were all found to be doing similar things.

It was revealed Apple employees could be listening to upwards of 1,000 Siri recordings a day. 


