Workshop Proceedings of the 16th International AAAI Conference on Web and Social Media
Workshop: Workshop on News Media and Computational Journalism (MEDIATE)
DOI: 10.36190/2022.53

Recent research suggests that not all fact-checking efforts are equal: when and what is fact-checked plays a pivotal role in effectively correcting misconceptions. In that context, signals capturing how much attention specific topics receive on the Internet can be used to study (and possibly support) fact-checking efforts. This paper proposes a framework for studying fact-checking with online attention signals. The framework consists of: 1) extracting claims from fact-checking efforts; 2) linking those claims to knowledge graph entities; and 3) estimating the online attention these entities receive. We use this framework to conduct a preliminary study of a dataset of 879 COVID-19-related fact-checks carried out in 2020 by 81 international organizations. Our findings suggest that there is often a disconnect between online attention and fact-checking efforts. For example, in around 40% of countries that fact-checked ten or more claims, at least half of the ten most popular claims were not fact-checked. Our analysis also shows that claims are first fact-checked after receiving, on average, 35% of the total online attention they would eventually receive in 2020. Yet, there is considerable variation among claims: some were fact-checked before receiving a surge of misinformation-induced online attention; others were fact-checked much later. Overall, our work suggests that incorporating online attention signals may help organizations assess their fact-checking efforts and choose what and when to fact-check claims or stories. Also, in the context of international collaboration, where claims are fact-checked multiple times across different countries, online attention could help organizations keep track of which claims are "migrating" between countries.
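To make the three-step framework concrete, the sketch below shows one plausible instantiation in Python. It assumes (as an illustration only, not the paper's actual pipeline) that claims are linked to Wikipedia articles and that online attention is proxied by Wikipedia pageviews from the public Wikimedia REST API; the example claims, the toy link_entities() lexicon, and the date range are hypothetical placeholders.

```python
# Minimal sketch of the framework: extract claims, link them to entities,
# and estimate the online attention those entities receive.
# Assumptions: Wikipedia articles stand in for knowledge graph entities,
# and daily pageviews stand in for online attention signals.
import requests

PAGEVIEWS_API = (
    "https://wikimedia.org/api/rest_v1/metrics/pageviews/"
    "per-article/{project}/all-access/all-agents/{article}/daily/{start}/{end}"
)


def link_entities(claim: str) -> list[str]:
    """Hypothetical entity linker: map a claim to Wikipedia article titles.

    A real pipeline would use a proper entity-linking tool; here a tiny
    hand-made lexicon is matched for illustration only.
    """
    lexicon = {
        "5g": "5G",
        "vaccine": "Vaccine",
        "hydroxychloroquine": "Hydroxychloroquine",
    }
    return [title for key, title in lexicon.items() if key in claim.lower()]


def attention_series(article: str, start: str, end: str) -> list[int]:
    """Fetch daily pageview counts for one article from the Wikimedia REST API."""
    url = PAGEVIEWS_API.format(
        project="en.wikipedia.org", article=article, start=start, end=end
    )
    resp = requests.get(url, headers={"User-Agent": "fact-check-attention-demo"})
    resp.raise_for_status()
    return [item["views"] for item in resp.json()["items"]]


# Step 1: claims extracted from fact-checks (placeholder examples).
claims = [
    "5G towers spread the coronavirus",
    "Hydroxychloroquine cures COVID-19",
]

# Steps 2-3: link each claim to entities and estimate their 2020 attention.
for claim in claims:
    for article in link_entities(claim):
        views = attention_series(article, "20200101", "20201231")
        print(f"{claim!r} -> {article}: {sum(views)} total pageviews in 2020")
```

With such a per-claim attention series in hand, one can compare the date a claim was first fact-checked against the cumulative attention curve, which is the kind of comparison underlying the 35% figure reported above.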