Facebook still lets housing advertisers exclude by race

Last year, a ProPublica investigation revealed that Facebook allowed housing advertisers to discriminate by race, letting them filter out certain ethnic groups from seeing their ads.

Although Facebook said in February that it had corrected this flaw, a follow-up investigation revealed that advertisers could still discriminate by race, as well as against other groups such as mothers of high school children, people who require wheelchair access ramps, and even expats from Argentina.

These groups are protected under the federal Fair Housing Act, which makes it a federal offense to publish ads that indicate a preference, limitation or discrimination based on race, color, religion, gender, handicap, family status or national origin.

ProPublica purchased a series of Facebook housing ads that discriminated by race. Almost all of the ads were approved within 22 minutes; one ad that excluded African American, Asian American and Spanish-speaking Hispanic people was approved in under one minute.

CATEGORIES ADVERTISERS CAN DISCRIMINATE AGAINST ON FACEBOOK

Although Facebook said it corrected this flaw, a follow-up investigation revealed that advertisers could still discriminate by:

  • Race (e.g. African Americans) 
  • Religious affiliation (e.g. Jews)
  • Family status (e.g. mothers of high school children)
  • Disability (e.g. people who require wheelchair ramps)
  • National origin (e.g. expats from Argentina) 
  • Language (e.g. Spanish speakers)

Facebook claims that the flaw was corrected last year, that this latest error was due to a ‘technical failure,’ and that it will continue to improve its policies and tools to detect violations, extending these rules not just to housing, employment and credit ads, but to all ads.

For the investigation, ProPublica bought a range of different housing ads on Facebook that excluded certain categories of people, including the ethnic groupings Facebook refers to as ‘multicultural affinities.’

ProPublica says the ads it bought were approved within 22 minutes, and only one type of ad took longer to approve. 

That ad attempted to exclude renters who were interested in Islam, Sunni Islam and Shia Islam. 

Because Facebook does not ask its users to indicate their racial identity, it instead gathers data on each user and assigns a preference based on what it believes matches up with a specific ethnic group.

Although Facebook’s policies should have resulted in the ads being flagged and prevented from posting, this didn’t happen. 

The investigation seems to indicate that Facebook has not changed its policies since ProPublica’s investigation last year.

However, in an e-mail statement to ProPublica, Facebook claims the flaw was due to a ‘technical failure.’ 

The Facebook statement on the discriminatory rental ads bought by ProPublica read:

‘This was a failure in our enforcement and we’re disappointed that we fell short of our commitments. 

‘Earlier this year, we added additional safeguards to protect against the abuse of our multicultural affinity tools to facilitate discrimination in housing, credit and employment. 

‘The rental housing ads purchased by ProPublica should have but did not trigger the extra review and certifications we put in place due to a technical failure.

‘Our safeguards, including additional human reviewers and machine learning systems have successfully flagged millions of ads and their effectiveness has improved over time. 

‘Tens of thousands of advertisers have confirmed compliance with our tighter restrictions, including that they follow all applicable laws.

‘We don’t want Facebook to be used for discrimination and will continue to strengthen our policies, hire more ad reviewers, and refine machine learning tools to help detect violations. 

‘Our systems continue to improve but we can do better.  

‘While we currently require compliance notifications of advertisers that seek to place ads for housing, employment, and credit opportunities, we will extend this requirement to ALL advertisers who choose to exclude some users from seeing their ads on Facebook to also confirm their compliance with our anti-discrimination policies – and the law.’

According to ProPublica, the US Department of Housing and Urban Development – which is responsible for enforcing fair housing regulations – previously investigated Facebook over its advertising policies, but has since closed that investigation.

Read more at DailyMail.co.uk