An investigation into Team Jorge’s activities has shown the sinister influence of misinformation and fake news on politics, society, and the economy. Misinformation and fake news became a global phenomenon with the 2016 U.S. presidential election and the Brexit referendum, particularly because more and more people use social media as a source of news without reflection. The use of artificial intelligence (e.g., ChatGPT) in the generation and dissemination of misinformation and fake news will strengthen their influence in the future. The spread of misinformation and fake news on the internet and its consequences are being intensively discussed in the European Parliament. Nevertheless, there is so far no clear agreement on how to reduce their influence.
“The problem with misinformation and fake news is that even if it is flawlessly identified as such, something still ‘sticks’ – the misinformation and fake news continue to influence our opinion,” explains Prof. Johannes Siebert, who researches and teaches at MCI | The Entrepreneurial School®. This phenomenon is called the “belief perseverance bias” and explains the great influence of misinformation and fake news on the formation of opinion and the decision-making behavior of many people. “There are numerous newsrooms and nonprofit organizations that identify misinformation and fake news. This very elaborate work helps reduce the influence of misinformation and fake news. However, these fact checks can only be a first step,” adds Dr. Jana Siebert.
The two researchers have been working on methodologically reducing the belief perseverance bias in the context of misinformation and fake news in the project “PerFake”, funded by the European Union and the Czech Ministry of Education, Youth and Sports. The aim of the PerFake project was to contribute to reducing the negative influence of misinformation and fake news. Prof. Johannes Siebert and Dr. Jana Siebert developed two methods to reduce the belief perseverance bias and tested and optimized them in two experiments with numerous participants. The first results have been published in the prestigious journal PLoS ONE.
Both tested debiasing methods showed promising results in reducing the belief perseverance bias. The debiasing method “counter-speech” focuses on refuting the misinformation and fake news with clear counter-arguments. The debiasing method “awareness training” generally informs participants about the existence of the belief perseverance bias and how it works. Such awareness training could help increase society’s resilience to misinformation and fake news. Prof. Johannes Siebert explains how this can work in practice: “Let us assume you have received a piece of information; for example, you have heard a speech by a politician or read a post on social media. A fact check shows that it is fake news. Being aware of the belief perseverance bias should then help you realize that your original opinion may still be negatively influenced by the fake news, and subsequently correct this bias.” Dr. Jana Siebert adds: “It would, therefore, be desirable to educate the public about the belief perseverance bias and the way it works. For example, fact-checking organizations could complement their fact checks with a note informing readers about the belief perseverance bias. Such a note could significantly increase the effectiveness of fact-checking and society’s resilience to misinformation and fake news.”
Source:
Siebert, J., & Siebert, J. U. (2023). Effective mitigation of the belief perseverance bias after the retraction of misinformation: Awareness training and counter-speech. PLoS ONE 18(3): e0282202. https://doi.org/10.1371/journal.pone.0282202
Contact:
FH-Prof. PD Dr. habil. Johannes Siebert
MCI | The Entrepreneurial School®
Email: Johannes.Siebert@mci.edu
Reducing the impact of fake news using debiasing methods. Photo: © Pixabay