Our best tactic for dispelling conspiracy theories is to improve people's overall wellbeing. Also, zapping them with AI helps

Useful piece from the leading science journal Nature, mapping the research, methods and policies that are proven to reduce the attraction of conspiracy theories. Excerpts below:

…I was part of a network of more than 100 academics that this year produced the Routledge Handbook of Conspiracy Theories. Of its 48 chapters, just one directly explores how to counter conspiracy theories. It concludes that it is easier to spread them than to refute them. Correcting entrenched beliefs is very difficult.

So it is better to prevent falsehoods taking root than to try to weed them out. That means looking beyond their content and the platforms and algorithms that fuel their spread. We need to examine what makes people susceptible.

I study how psychological traits and motives affect beliefs. Ideological convictions are a product of top-down cues from politicians and the media, and bottom-up psychological mechanisms. Hundreds of studies have applied this model to conspiracy beliefs, collecting both experimental and correlational data.

My collaborators and I suggest that three broad psychological needs underlie conspiracy beliefs: the need to understand the world; to feel safe; and to belong and feel good about oneself and one’s social groups.

Those who feel defensive about themselves are more likely than others to embrace conspiracy theories, perhaps to deflect blame for their shortcomings. Conspiracy beliefs have also been linked to feelings of powerlessness, anxiety, isolation and alienation.

Those who feel that they are insignificant cogs in the political machinery tend to assume that there are nefarious influences at play.

Politicians who feel threatened fan these fears. Amid this year’s presidential campaign, US President Donald Trump talked about “dark shadows” and planes filled with thugs. Similarly, Jarosław Kaczyński, leader of Poland’s Law and Justice party, insinuated last month that protests against an abortion ban were organized by forces aimed at destroying the nation, and that they bore the marks of specialized training.

The COVID-19 pandemic created a perfect storm for vulnerability to conspiracy narratives. Uncertainty and anxiety are high. Lockdown and social distancing bring isolation. People struggling to understand this unprecedented time might reach for extraordinary explanations.

Will recovering from the pandemic, then, mean recovering from the ‘infodemic’? I fear not.

First, being able to mix more freely might ease some social needs — but feelings of grief, uncertainty, powerlessness and marginalization will continue for those who have lost health, loved ones, jobs, education and so on.

Recovery plans should look beyond economic upturns and physical health. Neglecting the mental-health crisis risks perpetuating an information one.

Second, we know too little about how individuals’ vulnerability to conspiracy theories changes over time. Even daily psychological fluctuations might have a role: people are more likely to entertain conspiracy theories in anxiety-inducing moments.

And understanding the long-term effects of major life or world events is important, too. An analysis of letters to the editors of The New York Times and the Chicago Tribune between 1890 and 2010 observed peaks in conspiratorial content in the early 1950s, in the aftermath of the Second World War (J. E. Uscinski and J. M. Parent, American Conspiracy Theories, 2014).

Yet longitudinal research in the field, particularly on within-person changes, is difficult and scarce. The surge of studies tracking psychological responses to the pandemic could yield insights to guide interventions.

Meanwhile, we should not abandon other methods of correcting misinformation and stemming its spread. Debunking is extremely difficult but can work. Debunkers must explain why something is false, drawing attention to the strategies used to deceive and providing facts, rather than simply labelling information false or misleading.

‘Prebunking’ is more effective. Like a misinformation vaccine, this technique warns people that they might encounter misinformation before they buy into it. Online games such as Bad News and Go Viral! show how fake news is spread, and seem to make people more sceptical. Nudging people to consider accuracy discourages them from sharing fake news.

These effects might be amplified by addressing people’s psychological needs. This could make conspiracy theories and other misinformation less tempting, and also improve well-being.

Education counters conspiracy beliefs because it develops analytical thinking and because it empowers people. Other interventions could promote a sense of common identity, to boost feelings of belonging and meaning.

What happened in New Zealand during the pandemic is encouraging. Prime Minister Jacinda Ardern stressed solidarity and transparent decision-making, and offered people a sense of purpose. Early data suggest that despite an increase in distress during lockdown, New Zealanders showed no increase in conspiracy thinking, and more trust in science. We should expand this approach globally.

More here.

Here’s another piece (from the US edition of The Conversation) which reports on how AI can help us distinguish between a conspiracy theory and a true conspiracy. It comes down to how easily the story falls apart:

Conspiracy theories, which have the potential to cause significant harm, have found a welcome home on social media, where forums free from moderation allow like-minded individuals to converse. There they can develop their theories and propose actions to counteract the threats they “uncover.”

But how can you tell if an emerging narrative on social media is an unfounded conspiracy theory? It turns out that it’s possible to distinguish between conspiracy theories and true conspiracies by using machine learning tools to graph the elements and connections of a narrative.

These tools could form the basis of an early warning system to alert authorities to online narratives that pose a threat in the real world.

…While the popular image of the conspiracy theorist is of a lone wolf piecing together puzzling connections with photographs and red string, that image no longer applies in the age of social media.

Conspiracy theorizing has moved online and is now the end-product of collective storytelling. The participants work out the parameters of a narrative framework: the people, places and things of a story and their relationships.

The online nature of conspiracy theorizing provides an opportunity for researchers to trace the development of these theories—from their origins as a series of often disjointed rumours and story pieces, to a comprehensive narrative.

The article goes on to show how it can distinguish a true conspiracy — say, the Bridgegate crimes, a political payback operation launched by staff members of Republican Gov. Chris Christie’s administration against the Democratic mayor of Fort Lee, New Jersey — from a conspiracy theory, like Pizzagate, the bizarre notion that an elite pedophile ring was operating out of a Washington pizza parlour.

The layers of the Pizzagate conspiracy theory combine to form a narrative, top right. Remove one layer, the fanciful interpretations of emails released by WikiLeaks, and the whole story falls apart, bottom right. Tangherlini et al., CC BY

What the AI is able to show is that the true conspiracy rests on longer-lived, deeper connections between the story’s elements: Bridgegate came together over seven years, with consistent institutional players in its roles.

The Pizzagate story came together in a month. The graphic to the side shows how flimsily it hangs together: remove one element (such as the fanciful interpretations of the WikiLeaks email disclosures) and the story falls apart.
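The core idea — a narrative as a graph of people, places and things, whose robustness you probe by deleting a layer of nodes and checking whether the story stays connected — can be sketched in a few lines. This is a toy illustration, not the researchers’ actual pipeline (which extracts relationships from social-media text with NLP); the node labels below are hypothetical stand-ins for the kind of elements in the Pizzagate graphic.

```python
from collections import defaultdict, deque

def components(edges, removed=frozenset()):
    """Count connected components of the narrative graph,
    after deleting the nodes in `removed` (a 'layer' of the story)."""
    graph = defaultdict(set)
    nodes = set()
    for a, b in edges:
        if a in removed or b in removed:
            continue  # an edge dies with either endpoint
        graph[a].add(b)
        graph[b].add(a)
        nodes.update((a, b))
    seen, count = set(), 0
    for start in nodes:          # breadth-first search from each unseen node
        if start in seen:
            continue
        count += 1
        queue = deque([start])
        while queue:
            n = queue.popleft()
            if n in seen:
                continue
            seen.add(n)
            queue.extend(graph[n] - seen)
    return count

# Hypothetical relationships: the email-interpretation layer is the only
# glue between the pizza-parlour domain and the politics domain.
edges = [
    ("WikiLeaks emails", "Podesta"),
    ("WikiLeaks emails", "code words"),
    ("code words", "Comet pizzeria"),
    ("Comet pizzeria", "basement"),
    ("Podesta", "Clinton campaign"),
]

print(components(edges))  # → 1: the full story hangs together
print(components(edges, removed={"WikiLeaks emails", "code words"}))  # → 2: it fragments
```

A true conspiracy’s graph, by contrast, would stay connected under this kind of deletion, because its elements are linked by many independent, documented relationships rather than by a single interpretive layer.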

The researchers realise this may be a tool to strengthen conspiracy theories as much as analyse them. As they conclude:

There are clear ethical challenges that our work raises. Our methods, for instance, could be used to generate additional posts to a conspiracy theory discussion that fit the narrative framework at the root of the discussion. Similarly, given any set of domains, someone could use the tool to develop an entirely new conspiracy theory.

However, this weaponization of storytelling is already occurring without automatic methods, as our study of social media forums makes clear. There is a role for the research community to help others understand how that weaponization occurs and to develop tools for people and organizations who protect public safety and democratic institutions. 

Developing an early warning system that tracks the emergence and alignment of conspiracy theory narratives could alert researchers – and authorities – to real-world actions people might take based on these narratives. 

More here.