
Automatic evaluation of online news outlets’ reliability / Bianchi, John. - 15576:(2025), pp. 197-203. (ECIR 2025 - 47th European Conference on Information Retrieval, Lucca, Italy, 6-10/04/2025) [10.1007/978-3-031-88720-8_32].

Automatic evaluation of online news outlets’ reliability

Bianchi John
2025

Abstract

Over the past two decades, news consumption has shifted dramatically from print to digital formats. This transition gave rise to participatory journalism, which often lacks editorial oversight and focuses more on maximizing audience reach than on delivering high-quality information. As a result, clickbait headlines have become increasingly common. Moreover, the anonymity of the internet, together with advances in automatic writing tools provided by modern LLMs, has filled the web, and social media in particular, with low-quality content. In this context, for-profit organizations such as NewsGuard and non-profit ones such as Media Bias Fact Check assess the reliability of online news outlets using structured criteria and expert reviewers. These ratings serve a variety of purposes, including directing sponsored advertising toward more reputable outlets and raising readers’ awareness. However, the sheer number of news sites makes it impossible for these organizations to evaluate them all while maintaining consistent quality. This Ph.D. project aims to develop a software toolkit that automates as much of this evaluation process as possible, focusing on the automatic and independent evaluation of each structured criterion to produce explainable results. The underlying idea is that many criteria evaluations can be partially automated by leveraging recent advances in natural language processing. This would significantly speed up the evaluation of news media reliability, helping to reduce the spread of mis- and disinformation and enabling readers to identify reliable sources more easily.
ISBN: 9783031887192; 9783031887208
Keywords: Online news; Transparency and reputability of online news sources; News reliability assessment
Files in this item:
File: Automatic Evaluation of Online News Outlets’ Reliability.pdf (not available)
Description: Automatic Evaluation of Online News Outlets’ Reliability
Type: Publisher's version (PDF)
License: Publisher's copyright
Size: 137.42 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11771/40038
Citations:
  • Scopus: 0