This game makes players better at spotting disinformation after just 15 minutes, study finds


An online game that tasks players with crafting and distributing propaganda helps them better detect disinformation after 15 minutes of play, a new study found. 

Bad News, a browser-based game developed last year by researchers at the University of Cambridge, lets players take the role of the “bad guy” in creating and sharing misleading information online, from impersonating elected officials to peddling conspiracy theories. 

For three months after the game launched, researchers gave players the option of also participating in a study, which would prompt them to rate how reliable they found a series of headlines, some real news and some disinformation. 

After playing the game for 15 minutes, players ranked the disinformation 21 per cent less reliable, on average, than before they played the game, according to the study published last week in Palgrave Communications, a peer-reviewed academic journal. Their ranking of the real news didn’t change.

“We’ve shown some moderate effects — I wouldn’t characterize them as huge — but they’re measured and they were very robust and persistent and statistically significant,” said Sander van der Linden, co-author of the study and director of the Cambridge Social Decision-Making Lab. “I was actually quite surprised by that and totally encouraged.”

In the game, which was developed in partnership with Dutch media collective DROG and design agency Gusmanson, players build up a propaganda empire online. They use different tactics, such as stoking fears or playing on polarization, to build credibility and attract more followers, which earns them badges. The researchers chose these tactics based on real-world examples of strategies used by disinformation networks. 

To build credibility and followers, players can choose what kind of content to share, such as this conspiracy theory meme. (Screengrab/Bad News)

Creating content that preys on readers’ emotions, for example, is one tactic that can earn a badge in the game. It’s a method that Jestin Coler, who previously ran disinformation sites for profit, has said he used.

“Stories aim to create an emotional response to get readers to share content,” Coler wrote last year. “That emotional response can be one of trust, inspiration, anger, fear, etc., but the end goal is the share. While reaching a single reader is nice, reaching that reader and their hundred(s) of contacts is far nicer.”

By exposing players to these tactics in a game setting, the goal was to inoculate them against disinformation in the real world, and the study results suggest it works. 

There were some limitations to the study. For one, the participants were self-selecting: people who had an internet connection and happened to come across the game, either through the university press release or a news article. 

They were also aware of the purpose of the game, which likely made them a bit more alert.

“If people understand that they’re meant to be looking out for instances of deception, they’re going to be paying much more attention in an environment like this game than they would ordinarily,” said Jon Roozenbeek, co-author of the study and a Cambridge researcher.

Even so, Roozenbeek said, the measured effect shows the game works, which could translate beyond the experiment and help people more easily detect disinformation in the real world.

The researchers previously likened it to a magic show: once a magician reveals his or her tricks, you won’t be fooled by them again.
