Researchers have developed an artificial intelligence system focused on enhancing the reliability of Wikipedia references by training algorithms to identify dubious citations on the web.
Wikipedia is one of the most visited platforms online, with annual page views reaching half a trillion, making it one of the most significant sources of knowledge available today.
Not everything on a Wikipedia page can be trusted, which is why it is essential to check the original sources cited in the footnotes; yet those sources themselves can sometimes be misleading.
The AI program, called SIDE, checks whether a cited primary source actually supports a claim and suggests new sources where needed. SIDE assumes the Wikipedia claim itself is correct, meaning it can verify the source but not the claim made in the input.
In one study, participants preferred the AI's suggested citations to the originals in 70% of cases. In about 50% of cases, SIDE proposed the very source that Wikipedia was already using as its reference.
In 21% of cases, SIDE went a step further, producing recommendations that the study's human assessors judged appropriate.
Artificial intelligence has proven its capability to assist in fact-checking Wikipedia claims, though researchers acknowledge that alternative programs may outperform SIDE in terms of quality and speed.
SIDE also has limitations: it only considers citations that point to web pages. Wikipedia additionally cites books, scholarly articles, and information presented in non-textual media such as images and videos, none of which the program covers.
The researchers also point out that Wikipedia itself is an imperfect corpus for study. Beyond its technical constraints, the site's premise is that any writer, anywhere, can attach a reference to a topic, and the people who add those citations may be biased depending on the subject matter. Models trained on that data, as SIDE was, can inherit those biases.
Nevertheless, using artificial intelligence to streamline fact-checking, or as a supporting tool for human editors, could have applications well beyond Wikipedia.
Both Wikipedia and social media platforms need to deal with malicious entities and bot accounts flooding the internet with false information.