The Met Police Commissioner has defended the use of facial recognition technology to catch criminals, claiming that privacy concerns are ‘much smaller’ than protecting the public from ‘a knife through the chest’.
Dame Cressida Dick told a conference in Whitehall that ‘ill-informed’ critics would need to justify the use of the controversial cameras to victims of violent crime who have been caused ‘real harm’.
The technology – which has been criticised by civil liberties and privacy campaigners – scans members of the public against a ‘watchlist’, said to contain suspects wanted by police.
The Met claims that the technology has a very low failure rate, with the system only creating a false alert one in every 1,000 times.
But the new police cameras have already been branded a failure after spotting no suspects despite scanning 4,600 shoppers in an east London mall.
Dame Cressida today said it was not the force’s job to decide ‘where the boundary lies between security and privacy’, and argued that expectations of privacy have likely been eroded in the age of social media.
She told delegates at the Royal United Services Institute: ‘It is for critics to justify to the victims of those crimes why police should not use tech lawfully and proportionally to catch criminals who caused the victims real harm.
‘It is not for me and the police to decide where the boundary lies between security and privacy, though I do think it is right for us to contribute to the debate.
‘But speaking as a member of public, I will be frank. In an age of Twitter and Instagram and Facebook, concern about my image and that of my fellow law-abiding citizens passing through LFR (live facial recognition) and not being stored, feels much, much, much smaller than my and the public’s vital expectation to be kept safe from a knife through the chest.’
The police chief said that if artificial intelligence could help identify potential terrorists, rapists or killers, most members of the public would want the force to use it.
Dame Cressida said: ‘If, as seems likely, algorithms can assist in identifying patterns of behaviour by those under authorised surveillance, that would otherwise have been missed, patterns that indicate they are radicalising others or are likely to mount a terrorist attack.
‘If an algorithm can help identify in our criminal systems material a potential serial rapist or killer that we could not have found by human endeavour alone.
‘If a machine can improve our ability to disclose fairly then I think almost all citizens would want us to use it.
‘The only people who benefit from us not using lawfully and proportionately tech are the criminals, the rapists, the terrorists and all those who want to harm you, your family and friends.’
Recent Metropolitan Police use of facial recognition led to the arrest of eight criminals who would not otherwise have been caught, delegates heard.
But use of the technology has been criticised as a violation of privacy.
Human rights groups say it leads to ‘false positives’, which can mean innocent members of the public being stopped, searched and even arrested.
Silkie Carlo, from civil liberties group Big Brother Watch, said the cameras ‘erode the trust between the police and the public’.
She said: ‘It is purely magical thinking to suggest facial recognition can solve London’s problem with knife crime.
‘It’s a highly controversial mass surveillance tool. It seriously risks eroding trust between the police and the public.
‘The commissioner is right that the loudest voices in this debate are the critics, it’s just that she’s not willing to listen to them.
‘Her attempt to dismiss serious human rights concerns with life or death equations and to depict critics as ill-informed without basis only cheapens the debate.’
The Met claims that the technology has a very low failure rate.
However, using a different metric, research published last year by the University of Essex found the technology achieved only eight correct matches out of 42 across the six trials it evaluated.
The latest algorithm used by the Met is said to show no bias on the basis of ethnicity, although it is less accurate for women than for men.
Hannah Couchman, policy and campaigns officer at Liberty, said personal data is captured ‘without consent’ and that the technology ‘undermines our rights to privacy’.
She said: ‘Anyone can be included on a facial recognition watch list – you do not have to be suspected of any wrongdoing, but could be put on a list for the ludicrously broad purpose of police ‘intelligence interest’.
‘Even if you’re not on a watch list, your personal data is still being captured and processed without your consent – and often without you knowing it’s happening.
‘Facial recognition is a mass surveillance tool that undermines our rights to privacy and free expression. Any one of us might wish to go about our business anonymously and maintaining the right to do so does not make you worthy of police suspicion.
‘The Met started using facial recognition after ignoring its own review of a two-year trial which found that its use of this technology had failed to meet human-rights requirements.
‘By scaremongering and deriding criticisms rather than engaging with these concerns, Cressida Dick reveals how flimsy the basis for the Met’s use of this oppressive tool really is.’