With the average age at which children in the UK get their first mobile phone now just seven, many parents will understandably be concerned about the content their children can access.
But Apple’s latest tool could put many worried minds at ease.
The tech giant has launched its Communication Safety tool in the UK, four months after rolling it out in the US.
The tool – which parents can choose to opt in or out of – will scan images sent and received by children in Messages for nudity and automatically blur them.
Children will then be able to decide whether to view the photo themselves, message an adult they trust, or block the contact.
The Communication Safety tool, which is off by default, can be toggled on by parents within the Screen Time settings.
If Messages detects that a child has received or is attempting to send a photo containing nudity, it will blur it out, before displaying a warning that the photo may be sensitive, and offering ways to get help.
‘Messages offers the child several ways to get help — including leaving the conversation, blocking the contact, leaving a group message, and accessing online safety resources — and reassures the child that it’s okay if they don’t want to view the photo or continue the conversation,’ Apple explained.
One of the options offered to the child is to message a trusted adult about the photo.
And if the child is under 13, Messages will automatically prompt them to start a conversation with a parent or guardian.
However, if the child does choose to view or send the photo, Messages will ask them to confirm their decision before unblurring the image.
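The flow described above — on-device detection, blurring, a warning with help options, and an explicit confirmation before unblurring — can be sketched as follows. This is purely illustrative: the function names, option strings, and the placeholder classifier are assumptions for demonstration, not Apple's actual implementation.

```python
# Illustrative sketch of the Communication Safety flow (not Apple's code).
# The classifier and all names below are hypothetical.

UNDER_13_PROMPT = "Would you like to message your parent or guardian?"


def classify_nudity(image_bytes: bytes) -> bool:
    """Stand-in for on-device ML analysis; nothing leaves the device."""
    # A real implementation would run a local image classifier here.
    return b"nudity" in image_bytes  # placeholder heuristic


def handle_photo(image_bytes: bytes, child_age: int) -> dict:
    """Return the UI state Messages would present for a sent/received photo."""
    if not classify_nudity(image_bytes):
        return {"blurred": False, "options": []}

    state = {
        "blurred": True,
        "warning": "This photo may be sensitive.",
        "options": [
            "leave the conversation",
            "block the contact",
            "leave the group message",
            "access online safety resources",
            "message a trusted adult",
        ],
    }
    if child_age < 13:
        # Under-13s are additionally prompted to talk to a parent or guardian.
        state["prompt"] = UNDER_13_PROMPT
    return state


def confirm_view(state: dict, confirmed: bool) -> dict:
    """The image is only unblurred after an explicit confirmation."""
    if state["blurred"] and confirmed:
        state["blurred"] = False
    return state
```

Because the classification runs entirely in `classify_nudity` on the device, no image data or decision needs to be sent to a server, which is the property Apple emphasises.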
While the tool raised privacy concerns when it was announced last year, Apple has reassured users that it does not have access to the photos or messages.
‘Messages uses on-device machine learning to analyse image attachments and determine if a photo appears to contain nudity,’ it explained.
‘The feature is designed so that Apple doesn’t get access to the photos.’
The feature will also not break end-to-end encryption in Messages.
Apple added: ‘Any user of Messages, including those with communication safety enabled, retains control over what is sent and to whom.
‘None of the communications, image evaluation, interventions, or notifications are available to Apple.’
The Communication Safety feature is available now on all devices running iOS 15.2 or later, iPadOS 15.2 or later, or macOS Monterey 12.1 or later.
Emma Hardy, Communications Director at the Internet Watch Foundation (IWF), said: ‘We’re pleased to see Apple expanding the communication safety in Messages feature to the UK.
‘At IWF, we’re concerned about preventing the creation of “self-generated” child sexual abuse images and videos.
‘Research shows that the best way to do this is to empower families, and particularly children and young people, to make good decisions. This, plus regularly talking about how to use technology as a family are good strategies for keeping safe online.’
Alongside Communication Safety, Apple also announced a controversial plan to scan iPhones for child abuse images to flag to the police – although this has now been delayed indefinitely.
The contentious plans were revealed by the tech giant on August 5, with the initial aim of rolling them out with software updates at the end of last year.
But in September, Apple said it would take more time to collect feedback and improve the proposed feature, after criticism of the system on privacy and other grounds both inside and outside the company.
Apple explained: ‘Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of child sexual abuse material [CSAM].
‘Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.’