Facial recognition technology used by the Met Police is wrong in four out of five cases

Facial recognition technology identifies innocent people as wanted suspects on as many as four out of five occasions, according to a new report.

The controversial system that is currently being trialled by the Met Police is inaccurate 81 per cent of the time, the independent report stated.

The authors of the report also concluded that it is ‘highly possible’ the Met’s use of Live Facial Recognition (LFR) to date would be found unlawful if challenged in court. 

However, previous figures suggest the technology is even more inaccurate, with only two out of every 100 ‘matches’ pinpointing the correct person. 

Figures from Freedom of Information requests by Big Brother Watch last year showed that, for the Metropolitan Police, 98 per cent of ‘matches’ found by the technology were wrong, and for South Wales Police the figure was 91 per cent.

The software has been used to detect people on a watch list, including wanted criminals, at major events and locations such as the Notting Hill Carnival in 2017, Remembrance Day 2017, sporting fixtures, train stations and concerts. 

Facial recognition software being used by the Metropolitan Police (pictured). The latest report found the force’s system was inaccurate 81 per cent of the time 

The controversial technology has been used in Leicester Square, Westfield shopping centre in Stratford, sporting events and concerts

South Wales Police used the software at various events including the Uefa Champions League 2017 final in Cardiff, international rugby matches, and Liam Gallagher and Kasabian gigs.

Pedestrians in Leicester Square were filmed by cameras over the Christmas period and shoppers at Westfield Stratford in east London have also been cross-checked with the criminal database. 

The Neoface system uses special cameras to scan the structure of faces in a crowd of people. 
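The article does not detail how the system works internally, but live facial recognition tools of this kind are generally understood to convert each detected face into a numerical template and compare it against templates of people on a watch list, flagging a match when the similarity exceeds a threshold. The sketch below is purely illustrative of that general approach, not of NEC’s proprietary Neoface software; the 128-number templates, the cosine-similarity measure and the 0.6 threshold are all assumptions made for the example.

```python
# Purely illustrative sketch of watch-list matching in a generic live facial
# recognition pipeline. This is NOT the Neoface implementation: the template
# size, the similarity measure and the 0.6 threshold are assumptions.
import numpy as np


def cosine_similarity(a, b):
    """Similarity between two face templates (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_watchlist(face_template, watchlist, threshold=0.6):
    """Return the watch-list identity whose stored template best matches the
    face captured from the camera feed, or None if no score clears the threshold."""
    best_name, best_score = None, threshold
    for name, stored_template in watchlist.items():
        score = cosine_similarity(face_template, stored_template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name


# Example: a noisy camera capture is compared against a two-person watch list.
rng = np.random.default_rng(0)
watchlist = {"suspect_A": rng.normal(size=128), "suspect_B": rng.normal(size=128)}
live_face = watchlist["suspect_A"] + rng.normal(scale=0.1, size=128)
print(check_against_watchlist(live_face, watchlist))  # expected: suspect_A
```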

The latest independent report on the Met Police trial of facial recognition technology, by the University of Essex, states there are ‘significant concerns’ and calls for the trial to be stopped.

These concerns focused on the legal basis for the tests and the police’s ‘mixing of trials’ between research and operational work to catch suspects.

Across the six trials that were evaluated, the LFR technology made 42 matches, and in only eight of those could the report be ‘absolutely confident’ the technology got it right. 

Four of the 42 ‘suspects’ were people who were never found because they were absorbed into the crowd, so a match could not be verified, the report found. 
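That appears to be where the 81 per cent figure comes from: if every match that could not be verified as correct is counted as an error, 34 of the 42 matches were wrong, or roughly 81 per cent. A short check of that arithmetic, assuming the figure was derived this way:

```python
# Quick check of the headline figure, assuming the 81 per cent rate counts
# every match that was not verified as correct (34 of 42) as an error.
total_matches = 42
verified_correct = 8
error_rate = (total_matches - verified_correct) / total_matches
print(f"{error_rate:.0%}")  # prints 81%
```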

An unmarked police van with a camera on top scans the faces of pedestrians before officers swoop on anyone identified as a wanted criminal

China has used the controversial technology in shopping centres and airports

The report’s authors, Professor Pete Fussey and Dr Daragh Murray, said this created problems with ‘consent, public legitimacy and trust’ in the force’s use of the technology. 

They were granted unprecedented access to the final six of the ten trials run by the Metropolitan Police, which took place between June 2018 and February this year. 

Dr Murray, a specialist in international human rights law, said: ‘This report raises significant concerns regarding the human rights law compliance of the trials.

‘The legal basis for the trials was unclear and is unlikely to satisfy the ‘in accordance with the law’ test established by human rights law.

‘It does not appear that an effective effort was made to identify human rights harms or to establish the necessity of LFR.

‘Ultimately, the impression is that human rights compliance was not built into the Metropolitan Police’s systems from the outset, and was not an integral part of the process.’

Use of the controversial technology also appeared not to comply fully with human rights law, which states it should only be used when strictly ‘necessary in a democratic society’. 

Ladbroke Grove on the last day of the 2017 Notting Hill Carnival in west London, where police used facial recognition software. The force dropped plans to use it again at the 2018 event and at this year’s carnival

It is therefore highly possible that police deployment of LFR technology would be held unlawful if challenged before the courts, because there is no explicit legal basis for its use, the report found.

Criteria for including people on the ‘watchlist’ were not clearly defined and there were ‘issues’ with the accuracy of the lists, the authors said. 

This meant suspects were stopped by officers despite their cases already having been dealt with by the courts.

Professor Fussey, a leading criminologist specialising in surveillance, said of the findings: ‘This report was based on detailed engagement with the Metropolitan Police’s processes and practices surrounding the use of live facial recognition technology.

‘It is appropriate that issues such as those relating to the use of LFR are subject to scrutiny, and the results of that scrutiny made public.

‘The Metropolitan Police’s willingness to support this research is welcomed. The report demonstrates a need to reform how certain issues regarding the trialling or incorporation of new technology and policing practices are approached, and underlines the need to effectively incorporate human rights considerations into all stages of the Metropolitan Police’s decision making processes.

‘It also highlights a need for meaningful leadership on these issues at a national level.’ 

Britain’s first legal challenge against the use of facial recognition was launched in May by Ed Bridges, who claims it breached his human rights.

Across the six trials evaluated by the report, the LFR technology made 42 matches and in only eight of those could the report be ‘absolutely confident’ the technology was right

Deputy Assistant Commissioner Duncan Ball said the Met was ‘extremely disappointed with the negative and unbalanced tone’ of the research and insisted the pilot had been successful. 

Campaigners have called for the technology to be banned as it is ‘almost entirely inaccurate’. 

Figures released last year by the Met Police showed there had been 102 false positives – cases where someone was incorrectly matched to a photo – and only two that were correct.

Neither of those was arrested – one was no longer wanted by police, and the other was classed as a ‘fixated individual’ who attended a Remembrance Day event.

For South Wales, 2,451 out of 2,685 matches were found to be incorrect – 91 per cent. Of the remaining 234, there were 110 interventions and 15 arrests.   
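Both sets of earlier figures rest on the same simple arithmetic: the error rate is the number of incorrect matches divided by the total number of matches flagged by the system. A quick check of the quoted percentages, assuming that is how they were calculated:

```python
# Error rates implied by the figures quoted above.
# Met Police: 102 incorrect matches against 2 correct ones.
# South Wales Police: 2,451 incorrect matches out of 2,685 in total.
met_rate = 102 / (102 + 2)
swp_rate = 2451 / 2685
print(f"Met Police: {met_rate:.0%}")          # prints 98%
print(f"South Wales Police: {swp_rate:.0%}")  # prints 91%
```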

South Wales Police had stored pictures from both false positive and true positive matches for 12 months, potentially meaning images of more than 2,000 innocent people were stored by the force without the subjects’ knowledge. 

The force said the images were only stored as part of an academic evaluation for UCL, and not for any policing purpose.

The software used by SWP and the Met has not been tested for demographic accuracy, but in the United States concerns have been raised that facial recognition is less reliable for women and black people. 
