Political Bias and Impulsive Behavior Open Door to Misinformation

Americans are three times as likely to follow strangers on Twitter if they share the same political views, according to new research that sheds more light on the spread of online misinformation and social media “echo chambers.”

The peer-reviewed study from researchers at MIT and the UK’s Exeter University began by identifying 842 randomly selected Twitter users who displayed partisan bias towards the Republican or Democratic Party.

The researchers then created eight bot accounts displaying varying degrees of partisanship and used them to follow members of the group.

The report concluded that, regardless of the strength of partisanship the bots displayed, and regardless of whether they appeared Democrat or Republican, Twitter users were almost three times more likely to follow back an account sharing their own political affiliation.

More partisan users were also more likely to follow back the bots that displayed stronger partisanship, it noted.

The authors claimed that previous research had shown conservatives to be more likely than liberals to create social ties with those of their own political persuasion, a finding seemingly contradicted by this study.

“Partisans are much more likely to connect to complete strangers simply because they share the same political views,” explained report co-author Mohsen Mosleh of Exeter University Business School.

“This suggests that if one seeks to reduce partisan assortment on social media networks, it may be necessary for algorithms to actively counteract pre-existing psychological biases – biases that are part of the political sectarianism in which America is currently embroiled.”

The findings may help to explain why deliberate misinformation campaigns, such as those sponsored by the Kremlin before the 2016 US presidential election, represent such a danger to Western democracies. Bots spouting fake news designed to appeal to one side or the other would seem to have a captive audience, especially in the highly partisan climate of the US.

"What we mainly show here is that  being mostly connected to like-minded others may make people more prone to false beliefs by insulating them from corrective information,"  Mosleh told Infosecurity by email.

A link was also made between misinformation and the social media echo chamber in a 2018 University of Amsterdam study, which likened echo chambers to dry kindling in a forest and misinformation to a wildfire.

“There are certain signs that point to a link between these two phenomena – echo chambers and the spread of misinformation – since homogeneous clusters of users with a preference for self-confirmation seem to provide capable green-houses for the seedling of rumors and misinformation,” it warned.

A separate study by MIT, University of Regina and Exeter University researchers released today claimed, perhaps unsurprisingly, that more intuitive and less reflective thinkers were more likely to tweet about hyper-partisan content and fake news.

These users were also more likely to fall for “get rich quick” schemes and sales promotions, it said.

This would lend weight to calls for more critical thinking and media literacy classes to be taught in schools.

“These results reinforce the claim that the human capacity to reason is hugely consequential and something that should be cultivated and improved rather than ignored,” argued Mosleh.
