Is Amazon’s facial recognition system RACIST? Expert says it ‘can’t distinguish between black faces’

Amazon’s facial recognition tool is being called a ‘recipe for authoritarianism and disaster’ after an investigation revealed it is being used by law enforcement officials.

Now experts say it raises even greater concerns, as the artificial intelligence used to power the technology could exhibit racial bias.

Many are calling on Amazon to release data showing it has trained the software to reduce bias, but the company has yet to do so.

 

A controversial facial recognition tool dubbed Rekognition, marketed to police, has been defended by its creator, online retailer Amazon. Privacy concerns over the powerful technology emerged after an investigation revealed it is being employed by law enforcement

The controversy was spurred by a report from the American Civil Liberties Union (ACLU), which found that Amazon’s facial recognition tool, dubbed ‘Rekognition’, is being used by law enforcement agencies in Oregon and Florida. 

The ACLU voiced concern that Rekognition could be misused to identify and track innocent people in real-time. 

But Amazon said ‘quality of life would be much worse’ if technologies such as this were blocked because of fears they may be misused.

It has pointed out that its tool has helped find lost children in the past, and claims it has great potential for fighting crime in future.

But the ACLU and other civil rights groups argue that it could result in more police stops and arrests of marginalized groups.

‘We have been shocked at Amazon’s apparent failure to understand the implications of its own product on real people,’ Matt Cagle, a co-author of the ACLU report, told the Verge. 

‘Face recognition is a biased technology. It doesn’t make communities safer. It just powers even greater discriminatory surveillance and policing,’ he added. 

Various studies have shown that AI has a tendency to exhibit racial bias. 

A recent study from MIT Media Lab discovered that popular facial recognition services from Microsoft, IBM and Face++ vary in accuracy based on gender and race. 

Researcher Joy Buolamwini found that the facial recognition software was more likely to misidentify subjects who were people of color, with the highest error rates for darker-skinned women.

She explained that biased AI can have ‘serious consequences’, including illegal discrimination in hiring, firing and housing, as well as other impacts on social and economic situations.

When it comes to law enforcement, it could lead to the wrongful accusation and arrest of non-white people, the Verge noted.  

Companies like Microsoft and IBM have said they’re taking steps to improve their facial recognition systems by making them more accurate and using more diverse data to train the software. 

Many firms are running their algorithms against a government-developed benchmark called the Facial Recognition Vendor Test (FRVT).

The National Institute of Standards and Technology runs the algorithms through the FRVT to measure racial and gender bias, according to the Verge.

Many firms are running their algorithms against a system called the Facial Recognition Vendor Test (pictured). The higher the red line, the greater the racial bias

The wider the gap between the lines on a chart, the greater the racial or gender bias.
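To illustrate what such a gap measures, here is a toy sketch in plain Python. The numbers are entirely hypothetical and this is not FRVT’s actual methodology; it simply shows how a per-group error rate, and the gap between groups, can be computed.

```python
# Toy illustration of a demographic accuracy gap (hypothetical data,
# not FRVT's actual methodology or any real benchmark result).

def error_rate(results):
    """Fraction of face-matching attempts that were wrong."""
    return sum(1 for correct in results if not correct) / len(results)

# Hypothetical match outcomes (True = correctly identified) for two groups.
outcomes = {
    "group_a": [True] * 95 + [False] * 5,    # 5% error rate
    "group_b": [True] * 80 + [False] * 20,   # 20% error rate
}

rates = {group: r and error_rate(r) for group, r in outcomes.items()}
gap = max(rates.values()) - min(rates.values())

for group, rate in rates.items():
    print(f"{group}: {rate:.0%} error rate")
print(f"gap: {gap:.0%}")  # the wider this gap, the more biased the system
```

A system with equal accuracy across demographic groups would show a gap near zero; the FRVT charts described above make that gap visible as the distance between the plotted lines.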

But Amazon hasn’t taken part in the FRVT project and hasn’t indicated that it’s working on the issue of racial bias.

Until it does, it’s likely that civil rights and policy advocates will continue to hold Amazon’s feet to the fire. 

‘We already know that facial recognition algorithms discriminate against Black faces, and are being used to violate the human rights of immigrants,’ Malkia Cyril, executive director of the Center for Media Justice, said in a press release.

‘We know that putting this technology into the hands of already brutal and unaccountable law enforcement agencies places both democracy and dissidence at great risk.’

HOW DOES AMAZON’S CONTROVERSIAL REKOGNITION TECHNOLOGY WORK?

Amazon Rekognition gives software applications the power to detect objects, scenes and faces within images.

It was built with computer vision, which lets AI programs analyse still and video images.

AI systems rely on artificial neural networks, which try to simulate the way the brain works in order to learn.

They can be trained to recognise patterns in information – including speech, text data, or visual images.

Rekognition uses deep learning neural network models to analyse billions of images daily.

Updates since it was created even allow the technology to guess a person’s age.

In November 2017, its creators announced that Rekognition can now detect and recognise text in images, perform real-time face recognition across tens of millions of faces and detect up to 100 faces in challenging crowded photos.
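As a toy illustration of the ‘trained to recognise patterns’ idea described in the box above, here is a minimal single-neuron sketch in plain Python. It is an illustrative example only; deep-learning systems such as Rekognition use networks with millions of learned weights rather than one neuron.

```python
# Minimal single-neuron "pattern recogniser" (illustrative only --
# real deep-learning models are vastly larger and more complex).

def train_perceptron(samples, epochs=20, lr=0.1):
    """Nudge the weights after each mistake until the neuron
    reproduces the labelled pattern."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Toy "pattern": output 1 whenever either input feature is present.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)
predictions = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
               for (x1, x2), _ in data]
print(predictions)  # the neuron has learned the pattern: [0, 1, 1, 1]
```

The same learn-from-mistakes loop, scaled up to millions of parameters and billions of images, is what lets systems like Rekognition match faces; it is also why the training data matters so much, since the network can only learn the patterns it is shown.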

‘Amazon should never be in the business of aiding and abetting racial discrimination and xenophobia — but that’s exactly what Amazon CEO Jeff Bezos is doing when he sells these loosely regulated facial recognition tools to local police departments,’ she added. 

In the hours after the ACLU report was released, Amazon defended giving its facial recognition tool to police.       

Amazon Web Services, a subsidiary of the Seattle retail firm that deals largely with cloud computing, made the comments in a written statement. 

Speaking to the BBC, a spokesman said: ‘Our quality of life would be much worse today if we outlawed new technology because some people could choose to abuse the technology. 

‘Imagine if customers couldn’t buy a computer because it was possible to use that computer for illegal purposes?’

Facial recognition has been shown to exhibit bias in the past. Here, a graphic from an MIT Media Lab study shows how popular technologies tend to be inaccurate for people of color

There are two types of facial recognition technology: One that seeks to identify/verify users and another that seeks to classify a user. Apple FaceID technology, for example, works to identify/verify a user, while image recognition services may try to classify a user

Amazon Rekognition has been used for a number of positive purposes already, the company claims. 

This includes using the program to find children lost in amusement parks and identifying people who have been abducted.

Sky News reportedly used the software to identify celebrities at the recent Royal Wedding.

However, Amazon is drawing the ire of the American Civil Liberties Union (ACLU) and other privacy advocates over the tool.

First released in 2016, the tool has since been sold cheaply to several police departments around the US, with the Washington County Sheriff’s Office in Oregon and the city of Orlando, Florida, listed among its customers.

Police appear to be using Rekognition to check photographs of unidentified suspects against a database of mug shots from the county jail.  

Amazon offers the technology to law enforcement for just $6 (£4.50) to $12 (£9) a month.

The tech giant’s entry into the market could vastly accelerate the development of facial recognition systems, privacy advocates fear.

Amazon offers the technology to law enforcement for just $6 (£4.50) to $12 (£9) a month. So far, it counts the Washington County Sheriff’s Office in Oregon and the city of Orlando as customers

WHAT HAS AMAZON SAID ABOUT ITS REKOGNITION AI FACIAL RECOGNITION TOOL?

Amazon has defended giving its Big Brother-style facial recognition tool to police following an outcry from civil rights groups.

Amazon’s facial recognition tool, dubbed ‘Rekognition’, is currently being used by law enforcement agencies in Oregon and Florida.

In an emailed statement, however, the firm said the tool has ‘many useful applications in the real world’, such as locating lost children at amusement parks.

It also noted that the company ‘requires that customers comply with the law and be responsible when they use’ its software products.

Speaking to the BBC, a spokesman said: ‘Our quality of life would be much worse today if we outlawed new technology because some people could choose to abuse the technology. 

‘Imagine if customers couldn’t buy a computer because it was possible to use that computer for illegal purposes?’

And it could have potentially dire consequences for minorities who are already arrested at disproportionate rates, immigrants who may be in the country illegally or political protesters.

‘People should be free to walk down the street without being watched by the government,’ the groups wrote in a letter to Amazon on Tuesday. 

‘Facial recognition in American communities threatens this freedom.’

Deputies in Oregon had been using Rekognition about 20 times per day – for example, to identify burglary suspects in store surveillance footage.

Last month, the agency adopted policies governing its use, noting that officers in the field can use real-time face recognition to identify suspects who are unwilling or unable to provide their own ID, or if someone’s life is in danger.

‘We are not mass-collecting. We are not putting a camera out on a street corner,’ said Deputy Jeff Talbot, a spokesman for the sheriff’s office. 

Facial recognition is used by many technology companies, but activists say Amazon’s system could lead to dangerous surveillance powers for law enforcement

‘We want our local community to be aware of what we’re doing, how we’re using it to solve crimes – what it is and, just as importantly, what it is not.’  

It cost the sheriff’s office just $400 (£300) to load 305,000 booking photos into the system and $6 (£4.50) per month in fees to continue the service, according to an email obtained by the ACLU under a public records request. 

Last year, the Orlando, Florida, Police Department announced it would begin a pilot program relying on Amazon’s technology to ‘use existing City resources to provide real-time detection and notification of persons-of-interest, further increasing public safety.’

Orlando has a network of public safety cameras, and in a presentation posted to YouTube this month, Ranju Das, who leads Amazon Rekognition, said Amazon would receive feeds from the cameras, search them against photos of people being sought by law enforcement and notify police of any hits.

‘It’s about recognizing people, it’s about tracking people, and then it’s about doing this in real time, so that the law enforcement officers … can be then alerted in real time to events that are happening,’ he said.

‘The purpose of a pilot program such as this is to address any concerns that arise as the new technology is tested,’ the statement said.

‘Any use of the system will be in accordance with current and applicable law. We are always looking for new solutions to further our ability to keep the residents and visitors of Orlando safe.’  



Read more at DailyMail.co.uk