Facebook quietly changed its algorithm in 2018 to prioritise reshared material, only for it to backfire and cause misinformation, toxicity and violent content to become ‘inordinately prevalent’ on the platform, leaked internal documents have revealed.
The company’s CEO Mark Zuckerberg said the change was made in an attempt to strengthen bonds between users — particularly family and friends — and to improve their wellbeing.
But what happened was the opposite, the documents show, with Facebook becoming an angrier place because the tweaked algorithm was rewarding outrage and sensationalism.
Researchers for the company discovered that publishers and political parties were deliberately posting negative or divisive content because it racked up likes and shares and was spread to more users’ news feeds, according to the Wall Street Journal.
The newspaper has seen a series of internal documents revealing that Zuckerberg was warned about the problem in April 2020 but kept the change in place regardless.
Encouraging more ‘meaningful social interactions’ (MSI) was the stated aim of the 2018 algorithm change, which was made because those inside the company were concerned about a decline in user engagement in the form of commenting on or sharing posts.
This is important to Facebook because many inside the tech firm view it as a key barometer for the platform’s health — if engagement is down the fear is that people might eventually stop using it.
Comments, likes and reshares declined throughout 2017, but by August 2018, following the algorithm change, the free fall had been halted and the metric of ‘daily active people’ using Facebook had largely improved.
The problem, however, was that when the tech firm’s data scientists surveyed users they found that many thought the quality of their feeds had decreased.
Not only that, but in Poland the changes made political debate on the platform more spiteful, the documents show.
One Polish political party, which isn’t named, is said to have told the company that its social media management team had shifted its posts from an even split of positive and negative to 80 per cent negative because of the algorithm change.
‘Many parties, including those that have shifted to the negative, worry about the long term effects on democracy,’ according to one internal Facebook report, which didn’t name those parties.
It affected online publishers, too.
BuzzFeed chief executive Jonah Peretti emailed a top Facebook official to say that the most divisive content produced by publishers was going viral on the platform.
This, he said, was creating an incentive to produce more of it, according to the documents.
Mr Peretti’s complaints were highlighted by a team of Facebook data scientists who wrote: ‘Our approach has had unhealthy side effects on important slices of public content, such as politics and news.’
One of them added in a later memo: ‘This is an increasing liability.’
They surmised that the new algorithm was leading to an increase in angry voices because it was giving more weight to reshared and often divisive material.
‘Misinformation, toxicity, and violent content are inordinately prevalent among reshares,’ the researchers wrote in internal memos.
Facebook had wanted its users to interact more with family and friends rather than passively consuming professionally produced content, because research suggested that kind of passive consumption was harmful to their mental health.
To encourage engagement and original posting the company decided its algorithm would reward posts with more comments and emotion emojis, which were viewed as more meaningful than likes, according to the documents.
An internal point system was used to measure its success, with a ‘like’ worth one point; a reaction worth five points; and a significant comment, reshare or RSVP worth 30 points. Multipliers were also added depending on whether the interaction was between friends or strangers.
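To illustrate how such a weighting scheme works in practice, here is a minimal sketch based only on the point values reported in the documents. The function name, data structures and the specific multiplier values for friends versus strangers are illustrative assumptions for this example, not Facebook’s actual code.

```python
# Illustrative sketch only: Facebook's real MSI implementation is not public.
# The point values for likes, reactions and significant comments/reshares/RSVPs
# are those reported in the leaked documents; the relationship multipliers and
# all names here are assumptions made for this example.

MSI_POINTS = {
    "like": 1,
    "reaction": 5,            # emotion emojis, viewed as more meaningful than likes
    "significant_comment": 30,
    "reshare": 30,
    "rsvp": 30,
}

# Hypothetical multipliers: interactions between friends weighted more heavily
RELATIONSHIP_MULTIPLIER = {
    "friend": 2.0,    # assumed value
    "stranger": 1.0,  # assumed value
}

def msi_score(interactions):
    """Sum weighted interaction points for a single post.

    `interactions` is an iterable of (interaction_type, relationship) pairs,
    e.g. [("like", "friend"), ("reshare", "stranger")].
    """
    return sum(
        MSI_POINTS[kind] * RELATIONSHIP_MULTIPLIER[relationship]
        for kind, relationship in interactions
    )

# Example: one like from a friend plus a reshare from a stranger
print(msi_score([("like", "friend"), ("reshare", "stranger")]))  # 1*2.0 + 30*1.0 = 32.0
```

Under a scheme like this, a post that attracts reshares and long comment threads quickly outscores one that only collects likes, which is consistent with the researchers’ finding that divisive, heavily reshared material was pushed into more users’ feeds.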
But after concerns were raised about potential issues with the algorithm, Zuckerberg was presented with a number of proposed alterations that would counteract the spread of false and divisive content on the platform, an internal memo from April 2020 shows.
One of the suggestions was to remove the boost the algorithm gave to content reshared by long chains of users, but Zuckerberg was allegedly cool on the idea.
‘Mark doesn’t think we could go broad with the change’, an employee wrote to colleagues after the meeting.
Zuckerberg said he was open to testing it, she said, but ‘we wouldn’t launch if there was a material tradeoff with MSI impact.’
Last month, almost 18 months on, Facebook announced it was ‘gradually expanding some tests to put less emphasis on signals such as how likely someone is to comment or share political content.’
A Facebook spokesperson told MailOnline: ‘The goal of the Meaningful Social Interactions ranking change is in the name: improve people’s experience by prioritizing posts that inspire interactions, particularly conversations, between family and friends.
‘Is a ranking change the source of the world’s divisions? No.
‘Research shows certain partisan divisions in our society have been growing for many decades, long before platforms like Facebook even existed.
‘It also shows that meaningful engagement with friends and family on our platform is better for people’s well-being than the alternative.
‘We’re continuing to make changes consistent with this goal, like new tests to reduce political content on Facebook based on research and feedback.’