Facebook ‘ignored racial bias research’: Former employees claim the tech giant is ‘failing’ and ‘hurting people at scale’
- Whistleblowers claim they told superiors their new system is racially biased
- Research found black people’s accounts were 50% more likely to be disabled
- They were allegedly told to keep the findings quiet and to stop any further research
Facebook whistleblowers allege the tech giant ignored research, dating to mid-2019, showing racial bias in Instagram’s automatic account-removal system, and told employees to keep the findings secret.
In mid-2019, researchers for Facebook-owned Instagram found that users who said they were black were 50 per cent more likely to have their accounts disabled by the new moderation system than users who said they were white.
The new system, designed to remove problematic and bullying accounts, was introduced to the photo-sharing app in 2019.
The revelation, made to NBC News, came from two current staffers and one former staffer.
Facebook CEO Mark Zuckerberg testifies before the Senate judiciary and commerce committees on Capitol Hill on April 10, 2018, in Washington, D.C.
After taking their complaints to superiors, the employees were ordered to stop researching racial bias in the system and to avoid discussing its shortcomings with colleagues.
Adding to the allegations, more than six current and former employees confirmed the company had ignored the emerging evidence of racial bias.
Facebook’s vice president of growth and analytics Alex Schultz told NBC the company had not ‘ignored research.’
‘In this specific case we have put additional standards to ensure we approach the work of analyzing bias in a rigorous and ethical way,’ Schultz told The Hill in a statement.
‘There will be people who are upset with the speed we are taking action,’ Schultz said.
According to The Hill, Schultz added that the company had increased investment in understanding hate speech and any bias in its algorithms, but he did not say by how much.
Facebook spokeswoman Carolyn Glanville told NBC: ‘We are actively investigating how to measure and analyze internet products along race and ethnic lines responsibly and in partnership with other companies.’
CEO Mark Zuckerberg has often been targeted over the amount of hate speech on the Facebook platform.
Yesterday, BuzzFeed News reported that former employees felt the company was ‘failing’ and ‘hurting people at scale’.
At the start of this month, Boston-based software engineer Max Wang left the company, but not before uploading a video to Facebook’s internal messaging system to condemn the practices he’d seen during his seven-year stint.