Fake News: existential danger for democracies and the planet?

28 September 2022

Alexandra Theben
Tenure Track Professor
Professor at the Full Time MBA

We live in a world in which modern technologies and the Internet provide immediate access to information, wherever you are in the world. The International Day for Universal Access to Information, introduced by UNESCO in 2015, is celebrated annually on 28 September to promote public access to information and the protection of fundamental freedoms.

Indeed, although the internet was supposed to liberate us, we may instead find ourselves confined by it. The spread of disinformation is considered a global threat to freedom and democracy. The Club of Rome, a collective of leading thinkers, even considers misinformation a more dangerous threat than climate change. Its latest report, "Earth for All", addresses nothing less than the most important measures by which a future worth living would still be possible for humankind.

"The most important challenge of our time is not climate change, loss of biodiversity or pandemics, but our collective inability to distinguish between fact and fiction"

Fifty years after their influential report on the limits to growth, researchers from the Club of Rome have once again simulated the future of humanity. According to the experts, it is still possible to reverse the negative trend: the most important problem today is not the climate or pandemics. Humanity's future depends above all on "five extraordinary turnarounds to create wellbeing for all on a (relatively) stable planet":

  1. Ending poverty
  2. Eliminating inequality
  3. Empowering women
  4. Building a food system that is healthy for people and ecosystems
  5. Transitioning to clean energy

Another factor that the experts consider very important is "education that teaches critical thinking and complex systems thinking, for girls and boys alike".

Emotions play a key role in the spread of disinformation

In 2013, the World Economic Forum listed online "misinformation" as one of the ten trends to watch in 2014. It proved prescient, given the non-negligible role that information manipulation played when Moscow annexed Crimea. Its importance has grown ever since, in particular in view of the political "surprises" of 2016, such as the fact that almost no one seems to have anticipated Brexit or Trump's election.

Prior to 2017, disinformation was rarely a primary topic of academic research. But the events of 2016 and their aftermath prompted a rush of interest from a wide range of disciplines, including communication, political science and information science. The spread of disinformation surged during the pandemic and Russia's recent aggression, prompting a range of stakeholders, including tech platforms and civil society organisations, to sign a strengthened Code of Practice on Disinformation to create a more transparent, safe and trustworthy online environment.

"The Internet has made it easier to publish fake stories, and social media have made it easier to spread false stories"
Ben Nimmo (2018)

It is when false information is created and disseminated with the intent to deceive the public or to cause public harm that it becomes dangerous disinformation. That is when we must react, at all levels of society and together, to tackle the issue. The exponential development of digital platforms has considerably increased the risk of information manipulation in several ways. To increase the time that users spend online, platforms have developed technologies that, for example, match us with the sponsored content most likely to make us react and click, so that we keep browsing (the so-called matching technique).
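As a purely illustrative sketch, and not any platform's actual system, the toy Python snippet below shows the logic behind such a matching technique: candidate items are ranked by a predicted engagement score, so the most reaction-provoking content, rather than the most accurate, is shown first. All names, fields and scores are hypothetical.

```python
# Toy illustration of an engagement-driven "matching technique".
# All data and scores are hypothetical; real platforms use far more
# complex models trained on behavioural signals.

from dataclasses import dataclass

@dataclass
class Item:
    title: str
    predicted_click_rate: float   # how likely the user is to click (0-1)
    emotional_intensity: float    # how strongly the item provokes a reaction (0-1)

def engagement_score(item: Item) -> float:
    """Score items by expected engagement: reaction-provoking content wins."""
    return item.predicted_click_rate * (1.0 + item.emotional_intensity)

def rank_feed(items: list[Item]) -> list[Item]:
    """Order items so the most engaging, not the most accurate, come first."""
    return sorted(items, key=engagement_score, reverse=True)

if __name__ == "__main__":
    candidates = [
        Item("Calm, factual policy analysis", 0.05, 0.1),
        Item("Alarmist claim designed to outrage", 0.12, 0.9),
        Item("Sponsored post matched to past clicks", 0.10, 0.6),
    ]
    for item in rank_feed(candidates):
        print(f"{engagement_score(item):.3f}  {item.title}")
```

The only point of the sketch is that optimising for reactions rather than accuracy systematically pushes emotionally charged content to the top of the feed.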

Young people in particular get most of their news from social media platforms. The problem is that, for most users, these platforms are the "gatekeepers" of the web, the access routes to the rest of the internet. Social media have become de facto "news curators". Furthermore, disinformation is specifically tailored to go viral. It is written in a spectacular, emotional and often alarmist style, playing on fear and anxiety, elements that real news generally does not prioritise. It is therefore our cognitive biases that contribute in large part to the spread of fake news.

Young people get most of their news from social media platforms. Furthermore, disinformation is specifically tailored to go viral, written in a spectacular, emotional and often alarmist style

Disinformation exploits a natural intellectual laziness: the failure to systematically exercise critical thinking and the choice to relay information naively without looking for evidence to support it. We tend to favour information that confirms our pre-existing assumptions, supports our positions and does not offend our sensibilities. This phenomenon is commonly referred to as "confirmation bias", a behavioural weakness widely exploited in advertising. There are elements of sociology, anthropology, psychology, neuroscience, media literacy and more in that one impulse to share information with friends and family. Research has explored various techniques to combat disinformation, including experiments testing the effect of protective measures.

An experiment to test the critical evaluation of news messages in social media

The main objective of this study was to gain insight into the extent to which source information and a protective message (a warning about fake news) affect critical evaluation by Dutch Facebook users. The study showed that source information influences the extent to which someone critically evaluates (news) messages on Facebook. In line with our expectations, participants evaluated a news message more critically when it came from an unreliable source than from a reliable one.

The study shows that we use cognitive heuristics to assess information, evaluating it more or less critically depending on cues such as the source

However, we also expected that a protective message preceding disinformation on Facebook would have a positive effect on participants' critical evaluation, which was not the case. This could mean that the current solutions offered by the European Commission and the large social media platforms, which promise to include protective messages to improve critical evaluation and tackle the spread of disinformation, may be of only limited effectiveness: most participants still believed the message was valid and credible. The study showed that protective measures of this type might, in fact, not affect critical evaluation at all.

Despite the positive effects of protective messages reported by other researchers, the study did not find any effect of a protective message on the critical processing of news information. A possible explanation is that participants' flow while reading the Facebook news had not been interrupted and, therefore, they had not processed the protective message with sufficient attention.

A protective message does not seem to contribute to the battle against online disinformation

The results of this research serve as advice for Facebook to show the source more prominently, so that users can quickly and easily see where a (news) message comes from. Strikingly, participants regarded disinformation as more accurate when it came from a reliable news source, which entails that reliable news sources should never engage in these practices. More research on source acknowledgement as an intervention is needed if we aim to stimulate critical processing of news messages.

Ways forward to combat disinformation

Unfortunately, disinformation is here to stay. False and misleading information can be created about almost anything. Fighting information manipulation effectively requires first and foremost identifying the roots of the problem. The EU has already bolstered its efforts to counter Russian disinformation campaigns with its "EU vs disinformation" website. In addition, the strengthened Code of Practice signed in June 2022 brought together a range of actors who made voluntary commitments to counter disinformation.

Fighting information manipulation effectively requires first and foremost identifying the roots of the problem

However, we can also do our part: think before we share, question the source of claims, fact-check, and practise information hygiene. Information hygiene can slow down the spread of harmful misleading information, especially on social media. Coordination among three key players will remain essential in the fight against disinformation: technology companies, civil society and fact-checkers, and academic institutions. The latter, in particular, need to teach the importance of critical thinking and seeing the big picture.
