
Sunday, December 20, 2020

First Study of Its Kind Reveals How We Can Slow Down Propaganda Memes Online

As the US Presidential election draws closer, incendiary memes on social media are emerging as a way to further drive a wedge between left- and right-wing voters.

The online disinformation campaign is so well disguised, the US Senate Intelligence Committee is worried Facebook users might unwittingly share foreign propaganda.

A new study by the non-profit research organisation RAND Corporation, sponsored by the California Governor's Office of Emergency Services, has found a possible way to mitigate that threat going forward.

The randomised controlled trial included over 1,500 Facebook users from both sides of the political aisle. While not published in a peer-reviewed journal, it's one of the first studies to test audience reactions to actual foreign propaganda.

In the trial, participants were shown and asked to rate real news headlines, false news headlines, and Russian propaganda memes, both when the source was hidden and when the source was labelled. They were then asked whether they would 'like' or share the post.

The bad news is that oftentimes, these memes hit their target, causing a powerful emotional and partisan response, especially among hard-left and hard-right American voters, who tended to rely on information from The New York Times or Fox News respectively.

But there's also more positive news. When participants were shown social media content with labels revealing Russia as the source, they were less likely to 'like' or share the post.

"It is difficult to assess the degree to which revealing the source may be a feasible intervention," the authors admit, but that said, their findings suggest there can be "immense value in developing a third-party plug-in which will unmask the source of state-sponsored content."

In the meantime, providing a more generalised warning about Russian propaganda may be an easier and less costly step to mitigate its spread.

When far-right participants in the study were shown a short video on how to assess the veracity of information online, the study found they were less likely to hit 'like' on the memes.

This suggests media literacy could be one way to inoculate some Facebook users against disinformation, although it didn't work for all participants. The media literacy video had no such effect on left-leaning users, for example.

Even so, researchers at RAND think there could be some utility in warning people to be highly suspicious of online memes, their sources, and their intent.

"It also may be possible to inoculate audiences against Russian propaganda by pairing the warning with a weakened example of a Russian propaganda meme and providing directions on the way to refute the meme," the authors suggest. 

This idea needs to be tested further, but it lines up with what Cambridge University researchers have found with an online 'Bad News' game that teaches people how to think like a propagandist. After playing, people are on average 21 per cent better at determining the reliability of news.

Of course, this problem could be avoided entirely if social media firms quickly detected and removed propaganda and fake news. But right now, that's something they're failing to do.

Fake news about COVID-19 is rampant right now, and the overwhelming majority of these misleading posts have escaped regulation and lack appropriate warnings.

The current study is only a glimpse of possible solutions, but it suggests that teaching far-left and far-right users to recognise propaganda might keep them from spreading it further.

"Left- and right-wing audiences are particular targets of Russian propaganda efforts, in order that they naturally have a giant reaction because the propaganda speaks to them," says behavioural scientist Todd Helmus who works at RAND, the non-profit policy company that conducted the research.

"A big reaction also means plenty of room for improvement in terms of learning a way to think critically about the source of a selected point, and also the consequences of passing it along to others."

