Wikipedia could be flawless. Do we really want it?
Artificial intelligence can clean up Wikipedia. A language model trained for exactly this purpose proved... that we don't want that at all.
Oct 22, 2023 | updated: 4:56 AM EDT, October 23, 2023
Wikipedia is a remarkably up-to-date, but by no means error-free, online encyclopedia. Any visitor can create and edit the content published there, which is how astronomy ends up mixed with astrology and flat-earthers end up arguing against physics.
Artificial intelligence handling Wikipedia sources
Citations are supposed to keep this flaw in check, but their reliability varies. The links meant to confirm the information presented on the site sometimes point to solid content; just as often, however, they lead to dead or untrustworthy pages. It turns out these can be cleaned up, even in real time, provided artificial intelligence is put to work.
However, this requires a different language model than the one behind ChatGPT. Sources are not the OpenAI software's strong suit, and the popular AI chatbot's tendency to confabulate is legendary. But it all comes down to training.
SIDE, software described in the pages of Nature Machine Intelligence, uses an artificial intelligence model that stands out for its specialized training: scientists taught it specifically to handle Wikipedia sources. It is not a chatbot capable of telling you about the climate in Hawaii; what SIDE can do is reliably point to the sources of such information.
How the SIDE software works on Wikipedia
SIDE was trained to recognize credible sources by drawing on Wikipedia's positive experiences. Its language model was fed batches of articles that attract strong interest from editors and moderators: frequently updated content from which errors and inaccuracies are removed quickly.
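Conceptually, this is a retrieve-and-rank problem: given a claim in an article, gather candidate sources and score each against the existing citation, flagging cases where an alternative scores higher. The Python sketch below illustrates only that general idea; it is not SIDE's implementation, and the word-overlap score, function names, and example data are all hypothetical stand-ins for a trained model.

```python
import re

# Toy sketch of the claim-verification idea behind tools like SIDE.
# NOT the actual SIDE system: in practice a learned neural scorer
# would replace the simple word-overlap score used here.

def tokens(text: str) -> set[str]:
    """Lowercase word tokens, ignoring punctuation."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def score(claim: str, passage: str) -> float:
    """Fraction of the claim's words that also appear in the passage
    (a crude stand-in for a trained relevance model)."""
    claim_words = tokens(claim)
    return len(claim_words & tokens(passage)) / len(claim_words)

def suggest_citation(claim: str, current_source: str, candidates: list[str]) -> str:
    """Keep the existing citation if it scores best; otherwise
    propose the strongest alternative from the candidate pool."""
    return max(candidates + [current_source], key=lambda p: score(claim, p))

# Hypothetical example data
claim = "Honolulu has a tropical climate with warm temperatures year round"
current = "Travel blog: my weekend trip to Hawaii was warm and fun"
candidates = [
    "Climate report: Honolulu has a tropical climate, warm year round",
    "Astrology site: your horoscope predicts warm temperatures",
]

best = suggest_citation(claim, current, candidates)
if best != current:
    print("Better source found:", best)
else:
    print("Existing citation is already the best match")
```

In a production system, a model trained on data like the heavily edited articles described above would take the place of this toy score; the retrieve-and-rank structure, however, stays the same.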
In this way, SIDE learned to identify low-quality links and point to more reliable alternatives among internet resources. The prospect of Wikipedia using SIDE therefore seems tempting: the software appears capable of ridding the online encyclopedia of errors and inaccuracies. According to specialists, not necessarily.
People do not want a reliable Wikipedia
During the tests, SIDE was tasked with assessing the reliability of citations in randomly selected Wikipedia articles it had not previously seen. It found that in almost 50% of cases the articles already used the best sources; in the remaining ones, it found alternative references.
The problem is that the proficiency tests also suggest SIDE's usefulness may fall short of expectations. When its work was put to human verification, only 21% of Wikipedia users preferred the sources found by the artificial intelligence; 10% of respondents preferred the existing citations, and 39% had no opinion.
As many as two out of three SIDE testers paid no attention to the sources indicated on Wikipedia at all: they either assumed that the mere presence of content on the site made it sufficiently reliable, or they intended to verify it themselves.