
Online game teaches players to sniff out ‘fake news’ by encouraging them ‘to be evil’

A game built by psychologists has been designed to teach players to sniff out ‘fake news’ by encouraging them to sabotage elections in a fictional town. 

Cambridge University researchers created ‘Breaking Harmony Square’ in partnership with the US Department of State and the Department of Homeland Security. 

The idea is to teach players to sniff out ‘fake news’ by letting them be as evil as possible and play the part of Chief Disinformation Officer in a fictional small town.

Using tactics such as bots, trolling and conspiracy theories, the game uses the fictional small town to show how disinformation campaigns work in real life.

In a controlled trial involving 681 people – half playing this game and half playing Tetris – the perceived reliability of misinformation dropped by 16 per cent among Harmony Square players, and their willingness to share it in future dropped by 11 per cent. 

The goal of the free game, played in a web browser, is to sow as much discord as possible in the fictional town by polarizing audiences and spreading fear and anger.  


The aim of the game is to use a combination of techniques including trolling and fake news to sow discord within a small town community named Harmony Square

Over four chapters, the neighbourhood descends into chaos as players spread falsehoods, for example by setting up a disreputable news site to attack a TV anchor.

Users learn five manipulation techniques as part of the gameplay, with the aim that they will be able to spot those tricks in the real world.

They include: trolling to provoke outrage; exploiting emotional language to create anger and fear; artificially amplifying reach through bots and fake followers; creating and spreading conspiracy theories; and polarizing audiences.

Study author Sander van der Linden said the game worked like a form of inoculation intervention against the worst excesses of misinformation and fake news. 

There are four chapters to the game with each taking you deeper into the pit that is the role of Chief Disinformation Officer within the town of Harmony Square

So far about 1.5 million people have played a game built by the team from Cambridge University, a figure that ‘for an academic concept is quite large’.

‘We estimate that if it were to be rolled out on a national level, about 63 per cent of people who go through the treatment would receive a boost in their psychological immunity’ to misinformation, said van der Linden.

PLAYING HARMONY SQUARE INVOLVES ‘TRYING TO BE EVIL’ 

Researchers created the game as a way to ‘inoculate’ users against the threat of misinformation.

You start the game by being recruited as a Chief Disinformation Officer.

You select a fake user profile then set out to build a reputation as a troll.

It starts innocently by ‘winding up’ art lovers or pineapple on pizza fans.

Gradually, over the course of the ten-minute, free-to-play browser game, you step up your efforts.

You can create a fake news website, bring down local TV anchors and generally sow discord and mistrust in politics and officials in the town.

Researchers say the approach is designed to help people appreciate the techniques used in spreading fake news and misinformation online. 

‘But at the same time, elections are often decided on small margins, so inoculating even a million people could potentially make a practical difference.’

‘Games are interactive and require active cognitive involvement on the part of the player,’ co-author Jon Roozenbeek told MailOnline. 

‘This kind of “active” inoculation against fake news is, in theory at least, a good way to retain the lessons learned during the game.’

Other methods of ‘retraining’ people include more passive interventions, such as reading an article or watching a video, which can be harder to remember over the longer term.

‘Aside from this, you can be quite creative when creating games, and engage in world-building and use humour to make the game more attractive and entertaining,’ Roozenbeek explained.

During a controlled, randomised trial, researchers discovered that players were better at spotting fake news after playing the 10-minute online game.

For the trial, 681 people were asked to rate the reliability of a series of news and social media posts – some of which were real and some of which contained misinformation.

Half of the players were given Tetris, while the other half were given the fake news game and had to rate the sources before and after playing.

‘We wanted to include a gamified control condition, to make the amount of cognitive load comparable between conditions,’ Roozenbeek told MailOnline.

‘Tetris is a good fit for this, because it’s been used in previous studies, it’s in the public domain, and the learning curve is quite flat so that most participants already know how to play it.’

Among those who played Harmony Square, the perceived reliability of misinformation dropped by an average of 16 per cent and willingness to share fake news with others dropped by 11 per cent, the researchers found.

The finding was also politically independent: party affiliation or leaning made no difference to the change in behaviour after playing the game.

Of the Harmony Square group, 63 per cent said they would go on to be more discerning about fake news, compared with just 37 per cent of the Tetris group.

The gameplay is based on inoculation theory, the idea that exposing people to a weak ‘dose’ of common techniques used to spread fake news allows them to better identify and disregard misinformation when they encounter it in future.

Dr Sander van der Linden said trying to debunk misinformation after it has spread is like shutting the barn door after the horse has bolted.

‘By pre-bunking, we aim to stop the spread of fake news in the first place,’ said the researcher from the Cambridge Social Decision-Making Lab. 

You start the game by picking a pseudonym that will act as your ‘troll profile’ on social media as you spread lies, anger and mistrust among other users

‘The aftermath of this week’s election day is likely to see an explosion of dangerous online falsehoods as tensions reach fever pitch.

‘Fake news and online conspiracies will continue to chip away at the democratic process until we take seriously the need to improve digital media literacy across populations,’ he explained. 

‘The effectiveness of interventions such as Harmony Square is a promising start.’

The same team that built Harmony Square also developed a game in partnership with the UK Cabinet Office called Go Viral! to tackle Covid-19 misinformation.

‘For the Go Viral! game, we focused on three techniques: fearmongering, using fake experts, and spreading conspiracies,’ Roozenbeek told MailOnline.

‘We chose these three not necessarily because these are the only three techniques used to spread misinformation about COVID-19, but because they’re very common and take relatively little time to explain. 

‘For the Harmony Square game, we focused on manipulation techniques that are more directly relevant to political disinformation campaigns.

Researchers say that by teaching users what it is like to actively sow misinformation, they get better at spotting signs of it in the real world, which helps slow its spread

‘These techniques overlap to some degree, of course, and we encourage players to look up more information about other manipulation techniques, if they’re interested.’

Van der Linden said that during the recent US elections, wider use of prebunking and inoculation early in the campaign would have limited people’s susceptibility to disinformation beyond what Twitter’s measures already achieved.

‘Twitter recently implemented a prebunking strategy during the last week of the election and disseminated it to all of its US users I believe to prebunk election misinformation. This was innovative,’ he said.  

‘What we show in our research is that inoculation can also have “therapeutic” benefits, i.e., even when people have already been exposed to falsehoods to some degree, it can still boost immune response.’

This mirrors the development of therapeutic vaccines in medicine, and could perhaps reduce people’s willingness to spread the misinformation further.

The findings of the controlled trial of Harmony Square can be found in the Harvard Kennedy School Misinformation Review. 