UKRI funding to expose 'semi-fake news' and build public's 'fake news immunity'


A University of Liverpool led project will attempt to combat the online coronavirus “infodemic” by helping the media and public identify ‘semi-fake news’ and build ‘fake news immunity’, after securing more than £200,000 UKRI funding.

The proposal, led by Dr Elena Musi in the University’s Department of Communication and Media, defines semi-fake news as information that does not contain outright mistruths, but instead selectively uses existing facts – such as partial scientific results or single anecdotes – to reach false conclusions.

This puts it beyond the reach of most disinformation-fighting efforts, which largely seek out verifiably false content or direct consumers to more reliable sources.

Even the latest algorithmic fact-checkers fail to recognise and flag misleading reasoning based on cherry-picked information, hasty generalisations or false analogies.

Dr Musi said: “The virality of misinformation is having a massive impact on our lives, shaping social behaviours which play a crucial role in the prevention and spread of Covid-19.

“This project aims to reverse-engineer the spread of misinformation through the novel strategy of developing ‘fake news immunity’ in the general public.”

To do this, Dr Musi and her team aim to develop a ‘semi-fake news digital chatbot’ to empower citizens by teaching them how to identify semi-fake news.

They will also provide publicly accessible recommendations for journalists and other news media to help them avoid framing information in a way that could lead to hasty generalisations, and to handle uncertainty without causing panic.

The public will be provided with a similar resource to help people critically analyse not just the content of news, but its veracity, how it is framed and, crucially, whether it should be shared.

Finally, a Fake News Immunity platform will be created to allow the public to share verified high quality information, flag instances of semi-fake news and discuss news trustworthiness in a forum moderated by experts.

Dr Musi said: “More than half of the fake news in circulation contains neither fabricated nor imposter content, but rather reconfigured misinformation, such as false context or misleading or manipulated content.

“Even usually reliable news sources can draw conclusions which later prove to be false, by cherry-picking scientific results or sensationalising single anecdotes.

“This problematic framing is highly dangerous because it creates misleading content – semi-fake news – that cannot easily be identified through manual investigation by humans or by automated fact-checkers.”

The Centre for Argument Technology at the University of Dundee, a partner on the project, will contribute to work on AI tools that help recognise semi-fake news and then communicate about it to users through an intuitive chatbot interface.

Before creating any of the resources, the team will monitor news sources to qualitatively analyse the flawed methods of reasoning that characterise semi-fake news and use these results to deliver the four main project outcomes by the middle of 2021.
