Facebook Messenger and Instagram could become ‘superplatforms’ for paedophiles

End-to-end encryption would make Facebook a ‘honeypot’ and a ‘superplatform’ for paedophiles, according to one official at the National Crime Agency (NCA).

Rob Jones, director of threat leadership at the NCA, criticised the social network for its plans to introduce the security standard to both Facebook Messenger and Instagram, which Facebook also owns.

End-to-end encryption ensures only the two participants of a chat stream can read messages, and no one in between – not even the company that owns the service. 

Jones called end-to-end encryption – which is already used on Facebook-owned WhatsApp – a ‘high-risk experiment’ and a ‘disaster for child safety and law enforcement’. 

He said the communications system puts the ‘pursuit of profit above the safety of the people on their platform, particularly children’. 

There is even a ‘very real risk’ that more child sex offenders could move to Facebook if end-to-end encryption is installed on the platform, according to Jones. 

He also revealed that the NCA had received just under 24,000 child abuse tip-offs from Facebook and Instagram last year, but only 308 from WhatsApp. 

The figures suggest that more criminals go undetected on WhatsApp because it features end-to-end encryption. 

End-to-end encryption ‘poses an existential threat to child protection’, a senior NCA official has claimed 

WHAT IS END-TO-END ENCRYPTION?

End-to-end encryption ensures only the two participants of a chat can read messages, and no one in between – not even the company that owns the service.

End-to-end encryption is intended to prevent data being read or secretly modified when it is in transit between the two parties.

The cryptographic keys needed to decrypt the messages are automatically provided only to the two people in each conversation. 

Without end-to-end encryption, messages in transit can be read by third parties – which makes them interceptable by governments for law enforcement purposes.
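The idea in the box above can be illustrated with a toy sketch. This is illustrative only: real services such as WhatsApp use the Signal protocol with proper authenticated encryption, not the simplistic hash-based XOR cipher below. The point is that the two participants hold a shared key the relaying server never sees, so the server carries only ciphertext.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from the key by chained hashing.
    out = b""
    block = key
    while len(out) < length:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the message with the keystream; XOR is its own inverse,
    # so the same function also decrypts.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt

# The two chat participants share a key (in real systems it is agreed
# via a key exchange); the relay server never holds it.
shared_key = hashlib.sha256(b"secret agreed between the two participants").digest()

message = b"hello"
ciphertext = encrypt(shared_key, message)        # all the server ever relays
assert ciphertext != message                     # the server sees only gibberish
assert decrypt(shared_key, ciphertext) == message
```

Because only the endpoints hold `shared_key`, anyone in the middle – including the company running the server – can store and forward the ciphertext but cannot read it.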

Facebook-owned WhatsApp is already encrypted, and now Mark Zuckerberg is looking to do the same with Facebook Messenger and Instagram Direct. 

Facebook’s founder Mark Zuckerberg announced plans for end-to-end encryption on Messenger and its wider family of apps (including Instagram) in spring 2019 – but the change has been heavily delayed and is still yet to be implemented. 

Facebook wouldn’t comment on Friday regarding when exactly it plans to bring end-to-end encryption to Messenger and Instagram, describing it as a ‘long-term project’. 

Its plan has since attracted criticism from the likes of Home Secretary Priti Patel and Detective Chief Superintendent Kevin Southworth, head of Britain’s anti-terrorism squad, as well as children’s charity the NSPCC. 

The NCA has also long been highly critical of the proposed move.

‘There is a stark difference between a platform that refers content and a platform that can’t,’ Jones told the Times. 

‘If the end-to-end model is based on WhatsApp, I’m worried. It feels like a high-risk experiment.

‘The lights go out and you’re effectively guessing what’s going on in that platform.’ 

Up to now, Facebook has been ‘a huge help’ to lawful investigations and child safeguarding, Jones said. 

In 2019, Facebook made 15.8 million global referrals of child sex abuse material. 

‘But their plans will create a haven for child sex offenders to congregate to target children,’ Jones added. 

‘It’s not too late for Facebook to change their mind.’ 

Facebook built end-to-end encryption into WhatsApp after the company acquired it for about $19 billion in 2014. 

WhatsApp has consistently and proudly reiterated its commitment to end-to-end encryption. 

‘Strong encryption is a necessity in modern life,’ WhatsApp says. ‘We will not compromise on security because that would make people less safe.’

The ‘unbreakable digital lock’ keeps the contents of messages secure and viewable by no one except the sender and the recipient. 

Despite being much touted by WhatsApp as a leading security standard for online messaging, it means company staff cannot identify child sex offences in messages and videos sent between abuser and victim.  

Facebook CEO Mark Zuckerberg (pictured). Facebook-owned WhatsApp already has end-to-end encryption, but Zuckerberg wants to introduce it to its wider family of apps, which also includes Messenger and Instagram


WHEN WILL END-TO-END ENCRYPTION ARRIVE ON FACEBOOK?

In spring 2019, Mark Zuckerberg announced plans to introduce end-to-end encryption to Facebook and its other platforms (apart from WhatsApp, which already has it). 

In a blog post, Zuckerberg called the feature ‘an important tool in developing a privacy-focused social network’.

‘Encryption is decentralizing – it limits services like ours from seeing the content flowing through them and makes it much harder for anyone else to access your information,’ he said.  

Two years later, however, there is no sign of the change being implemented. 

Facebook later revealed that encrypting Messenger by default will take years.

The last word was from Jon Millican, Facebook’s software engineer for Messenger privacy, in January 2020. 

‘I’ll be honest right now and say we’re still in a place of having more questions than answers,’ Millican said.

‘While we have made progress in the planning, it turns out that adding end-to-end encryption to an existing system is incredibly challenging and involves fundamentally rethinking almost everything.’

Opposition from Facebook shareholder activists may account for the long delay. 

On WhatsApp, whether an abuser is caught relies on tip-offs, often stemming from the child being abused. 

Facebook, meanwhile, relies on a combination of algorithms and human moderators to detect illegal activity – such as sexual abuse of minors.  

But introducing end-to-end encryption to Facebook Messenger and Instagram, which also has a built-in messaging feature, would shut out these detection methods. 
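One common detection method is server-side hash matching: comparing each uploaded file against a database of fingerprints of known abuse imagery. The sketch below is a simplification with hypothetical data – real systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, not exact SHA-256 matches – but it shows why the technique needs plaintext: under end-to-end encryption the server only ever sees ciphertext, so the comparison can never match.

```python
import hashlib

# Hypothetical blocklist of fingerprints of known illegal imagery
# (the byte strings here are placeholders, not real data).
known_bad_hashes = {
    hashlib.sha256(b"bytes-of-a-known-illegal-image").hexdigest(),
}

def server_side_scan(attachment: bytes) -> bool:
    """Return True if the attachment matches a known-bad fingerprint.

    This only works while the server can read the plaintext bytes;
    given ciphertext, the hash would never match the blocklist.
    """
    return hashlib.sha256(attachment).hexdigest() in known_bad_hashes

# An unencrypted platform can flag the known file and miss benign ones.
assert server_side_scan(b"bytes-of-a-known-illegal-image") is True
assert server_side_scan(b"an ordinary holiday photo") is False
```

This is the ‘tooling’ Jones describes further down: it lets the platform refer matching content to law enforcement, and it is exactly the capability that end-to-end encryption removes from the server side.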

Speaking at a virtual press briefing earlier this month, Jones pointed out that the proposed move ‘poses an existential threat to child protection’ and appears to put profits ahead of the safety of its users. 

Jones said that Facebook’s ‘balance between business objectives and public protection is wrong’.

‘They appear to be putting profit, and the pursuit of profit, above the safety of the people on their platform, particularly children.’       

His comments followed the sentencing of prolific paedophile David Wilson, 36, at Ipswich Crown Court for 96 child sex abuse offences on February 10.

Wilson used fake social media profiles to pose as girls and get young boys to send him indecent images, and he approached up to 5,000 boys online.

Andy Burrows, NSPCC head of child safety online policy, said: ‘Wilson’s prolonged campaign of sexual abuse was exposed following a large-scale investigation by the NCA.

‘If it wasn’t for the evidence provided by Facebook this would not have been possible and many more children could have been exploited.

‘Despite this case highlighting the importance of tech companies being able to detect and disrupt abuse on their sites, Facebook still wants to proceed with end-to-end encryption which could prevent its moderators from uncovering prolific abuse.’

In 2020, Facebook also sent 12 million CyberTips to the US National Center for Missing and Exploited Children, which receives industry referrals before disseminating them to law enforcement agencies to investigate.

Jones said Facebook has ‘invoked tooling on their network which detects these images’ and refers them to law enforcement, which he called ‘great’.

But end-to-end encryption ‘effectively locks them out of their own network and locks them out of their own product and the material that’s on that network’, he added.

WhatsApp has consistently and proudly reiterated its commitment to end-to-end encryption


‘What it creates is a private space where people like Wilson can masquerade as children, engage with children, groom them and potentially develop either coercive control of that individual and get them to abuse themselves and send images to them, or to meet them in the real world and abuse them directly themselves.’

The NCA is asking to ‘maintain a position where Facebook can access their own material and report unlawful abuse of children online to the NCA and to international law enforcement’. 

A Facebook company spokesperson told MailOnline: ‘Child exploitation and grooming have no place on our platforms.

‘Facebook has led the industry in developing new ways to prevent, detect, and respond to abuse and we will continue to work with law enforcement to combat criminal activity.

‘End-to-end encryption is already the leading technology used by many services to keep people safe online and we will continue to invest in finding more ways to detect and fight these heinous crimes.’ 

‘WhatsApp already bans around 300,000 accounts each month suspected of sharing child exploitative imagery, and has increased the amount of reports provided to child safety authorities with further technology developments.’

In his 2019 blog post announcing plans to bring end-to-end encryption to Facebook, Zuckerberg said there was a balance to be found between privacy from end-to-end encryption and protecting people.

He said he’d spoken with dissidents who told him that encryption is ‘the reason they are free, or even alive’. 

‘On balance, I believe working towards implementing end-to-end encryption for all private communications is the right thing to do,’ Zuckerberg wrote.

Child abuse content on Facebook may have been missed due to technical issue 

Facebook is working to catch child sexual exploitation content it may have missed as a result of a technical issue after changes the firm made late last year, it revealed in February 2021. 

Between July and September 2020, the tech giant adjusted its detection technology, which allowed it to find and remove a large amount of old violating content, resulting in a spike in enforcement.

In mid-November the company made changes to its media-matching tools and later discovered a technical issue in the implementation.

‘When that error was discovered, we fixed it and are in the process of going back to retrospectively remove all of that content that was missed,’ said Guy Rosen, Facebook’s vice president of integrity.

Targeted content decreased from 12.4 million pieces in the third quarter, when changes were first introduced, to 5.4 million in the final three months of 2020.

The latest data for child nudity and sexual exploitation is the lowest count since Facebook began publishing progress in mid-2018 in regular Community Standards Enforcement Reports.

The vast majority of offending material is removed by the company’s AI systems before users report it.

Fewer than 0.05 per cent of views are estimated to be of content that violated standards against child nudity and sexual exploitation.

Facebook also revealed that it addressed 200,000 fewer pieces of child sexual content on Instagram compared with the previous quarter, saying ‘fluctuations in manual review capacity’ due to the pandemic were to blame.

The social network has previously said its ability to review content has been impacted by Covid-19 and it is prioritising the most harmful content.

However, Facebook reported improvements in other enforcement areas, particularly bullying and harassment, having reviewed 6.3 million pieces between October and December, up from 3.5 million in the previous three months, due in part to updates in technology used to detect comments.

On Instagram, the number the firm looked at almost doubled from 2.6 million to five million.

Prevalence of hate speech on Facebook fell to about seven or eight for every 10,000 views of content, while violent and graphic content dropped from 0.07 per cent to 0.05 per cent and adult nudity from 0.05-0.06 per cent to 0.03-0.04 per cent. 

And 6.4 million pieces of organised hate content were inspected, up from four million.
