Intervene or Mitigate? A Decision Problem for Combating Fake News

BERNARDI, ALBERTO
2021/2022

Abstract

The abnormal diffusion of misinformation, fake news and conspiracy theories has left people stuck in a limbo in which no truth is unquestionable and even scientific theories are denied through vicious cycles of lies. Though the detection of fake news has been widely addressed in the literature, counteractions for combating the spread of misinformation have been less investigated. Previous work on the theme mainly proposes the application of either interventions, which try to directly limit the spread of fake news, or mitigations, counter-campaigns of truth meant to contain the spread and damage of misinformation. However, the two strategies are applied independently. In this work we try to fill this gap and formulate a multi-round decision problem for alternating interventions and mitigations on online social networks, aiming at reducing the spread of misinformed content while increasing that of truthful content. Moreover, we introduce proneness, a score used as a proxy for the likelihood that a user propagates truthful or false content. The decision problem is framed as a Multi-Armed Bandit, while the diffusion of content is modelled by solving problems related to Influence Maximization. Finally, we experimentally evaluate the quality of the model and give directions for further work.
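The entry above only summarizes the approach, so the following is a minimal, purely illustrative sketch of the multi-round "intervene or mitigate" framing as a two-armed bandit solved with UCB1. It is not the author's actual model: the arm names, the reward stub standing in for the Influence-Maximization-based diffusion simulation, and all numeric values are assumptions made for illustration only.

import math
import random

# Hypothetical two-armed bandit: at each round the platform either intervenes
# (limits the spread of fake news) or mitigates (runs a counter-campaign of truth).
ARMS = ["intervene", "mitigate"]

def simulate_diffusion(action: str) -> float:
    # Placeholder for the diffusion simulation; in the thesis this step would be
    # driven by Influence Maximization and the users' proneness scores.
    # Returns a reward in [0, 1]; the base values below are arbitrary assumptions.
    base = {"intervene": 0.55, "mitigate": 0.45}[action]
    return min(1.0, max(0.0, random.gauss(base, 0.1)))

def ucb1(n_rounds: int = 1000) -> dict:
    counts = {a: 0 for a in ARMS}    # how many times each arm was played
    values = {a: 0.0 for a in ARMS}  # running mean reward per arm
    for t in range(1, n_rounds + 1):
        # Play every arm once before applying the UCB1 index.
        untried = [a for a in ARMS if counts[a] == 0]
        if untried:
            arm = untried[0]
        else:
            arm = max(ARMS, key=lambda a: values[a] + math.sqrt(2 * math.log(t) / counts[a]))
        reward = simulate_diffusion(arm)
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean update
    return {"counts": counts, "estimated_values": values}

if __name__ == "__main__":
    print(ucb1())

In the thesis the reward would instead be derived from the simulated spread of truthful versus false content on the social network; that coupling is not reproduced in this sketch.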
Keywords: Fake News, Social Networks, Influence Maximization (IM), Multi-Armed Bandits
Files in this item:
File: Bernardi_Alberto.pdf (restricted access)
Size: 6.12 MB
Format: Adobe PDF

The text of this website © Università degli studi di Padova. Full texts are published under a non-exclusive license. Metadata are released under a CC0 license.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12608/29702