Hundreds of law enforcement agencies using new facial recognition app with 3 billion image database

More than 600 police departments are reportedly using a new facial recognition app capable of comparing uploaded photos with three billion images in its database, culled from social media and other websites. 

Clearview AI allows users to take a photo of a person and upload it to the app, which then matches it to publicly available photos of that person, displaying those images along with links to where they appeared online.  

The publicly available photos are said to come from a database that Clearview compiled from sites such as Facebook, Instagram and Twitter, but also Venmo, YouTube, employment and educational websites and reportedly millions of other online sources.  

Traditionally, law enforcement agencies use facial recognition software that primarily searches government images like mugshots and driver’s license pictures. 

In a new report by the New York Times, Clearview’s system is said to go ‘far beyond anything ever constructed by the United States government or Silicon Valley giants.’   

More than 600 police departments are said to be using the Clearview AI app to help identify suspects. This file image demonstrates the capabilities of facial recognition technology

Clearview AI says it has a database of 3 billion photos culled from across the internet. This chart compares Clearview's database to law enforcement photo databases

The report said that the app’s computer code allows it to be used in conjunction with augmented-reality glasses, potentially allowing anyone with the app to identify anybody they see. 

Clearview could be used to identify activists at rallies, but also random, attractive passersby, providing not just names and addresses, but also what they do and who they know, according to the newspaper. 

The app’s website states that the technology is a ‘new research tool used by law enforcement agencies to identify perpetrators and victims of crimes’ and that it has helped those agencies capture hundreds of criminals, while also exonerating innocent people and helping to identify the victims of crimes.  

At the moment, it appears that only law enforcement organizations are invited to request access to the app. 

Clearview told the New York Times that more than 600 law enforcement agencies have started using the app within the past year, but that it has also been licensed to some companies for security reasons.    

Among the law enforcement agencies said to be using Clearview AI are local police departments in states including Florida, Georgia and New Jersey, while federal organizations such as the FBI and the Department of Homeland Security are said to be testing the app out.    

The Indiana State Police said that they were able to solve a case within 20 minutes of using the Clearview app, finding a suspected gunman after uploading a photo from the fight that had led to the shooting. The app matched the photo to a social media video in which he appeared, with his name in the video’s caption. 

The alleged gunman ‘did not have a driver’s license and hadn’t been arrested as an adult, so he wasn’t in government databases,’ Chuck Cohen, an Indiana State Police captain at the time, told the New York Times. 

Cohen said the man wouldn’t have been arrested and charged without the Clearview search, because police wouldn’t otherwise have been able to search social media.

Clearview AI is currently available only to law enforcement agencies and is said to be used by local police departments and is being tested by the FBI and Department of Homeland Security

Clearview says that it is able to produce matches about 75 per cent of the time, but that surveillance images are often shot from an angle poorly suited to matching. A file image of a security camera's high angle is shown here

Police in Clifton, New Jersey, used the app to identify shoplifters, a thief at an Apple Store and a good Samaritan who knocked out a man who’d been using a knife to threaten people.    

In Gainesville, Florida, police were able to identify more than 30 suspects in previously dead-end cases using Clearview, in part because the app was able to produce matches from profile images or partial views of faces, as opposed to the head-on images that government facial recognition tools rely on. 

The app is not always a sure bet, though – many of the images police have to work with are taken from surveillance cameras that have overhead views, which are not ideal for facial recognition purposes, especially since the images in Clearview’s database tend to be taken at eye level.  

Still, Clearview told the New York Times that the app is able to find matches up to 75 per cent of the time. 

It’s not clear how many false matches it finds, however, as the app has not been tested by an independent party.  

‘The larger the database, the larger the risk of misidentification because of the doppelgänger effect,’ Clare Garvie, a researcher at Georgetown University’s Center on Privacy and Technology, told the newspaper. ‘They’re talking about a massive database of random people they’ve found on the internet.’

Garvie also noted that there’s ‘no data to suggest this tool is accurate.’ 

Because Clearview has not yet been vetted by independent experts, law enforcement agencies are said to be uploading potentially sensitive photos to servers whose data security has not been tested. 

Some law enforcement officials interviewed by the newspaper said that they were unaware that the photos they were running through the app were being sent to and stored on Clearview’s servers, however. 

Clearview said that its customer support employees don’t look at the pictures police submit. 

However, as more and more law enforcement agencies start using the app, Clearview is able to grow its database of people who are being searched for by police. 

Privacy experts warn against the use of facial recognition AI databases like the one that Clearview offers. 

‘The weaponization possibilities of this are endless,’ High Tech Law Institute at Santa Clara University co-director Eric Goldman told the newspaper. ‘Imagine a rogue law enforcement officer who wants to stalk potential romantic partners, or a foreign government using this to dig up secrets about people to blackmail them or throw them in jail.’ 

Clearview told the newspaper that the app ‘flags possible anomalous search behavior’ as a way to stop users from engaging in ‘inappropriate searches.’ 

Several cities and states have now banned the use of facial recognition technology by police and the government, according to Muckrock, which first reported on Clearview AI’s use by police.  

Read more at DailyMail.co.uk